CROSS REFERENCE TO RELATED APPLICATIONS
This application is a divisional application of U.S. patent application Ser. No. 13/744,936, accorded a filing date of Jan. 18, 2013, which is a continuation of International Patent Application No. PCT/JP2011/004174, accorded a filing date of Jul. 25, 2011, which claims priority to JP Patent Application Nos. 2010-172621 and 2010-172622, each accorded a filing date of Jul. 30, 2010, the entire disclosures of which are hereby incorporated by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an electronic device, and particularly to an electronic device provided with an information processing function.
2. Description of the Related Art
Conventionally, electronic devices such as portable game devices, PDAs (Personal Digital Assistants), and the like have been widely used. In recent years, multifunctional electronic devices such as smartphones have been introduced in which the functions of portable phones, PDAs, and the like are put together into one. Such electronic devices are provided with a large-capacity memory and a high-speed processor, and the user can enjoy various applications by downloading content such as game software, music, movies, and the like.
Electronic devices having touch panels provide an excellent user interface that allows the user to perform an intuitive operation. For example, user interfaces and the like are already in practical use that allow a displayed content image (icon) to be tapped using a finger so as to select the icon or that allow a display image to be scrolled by tracing the surface of a panel using a finger.
An electronic device having a touch panel usually provides one type of user interface for a single process. With regard to scrolling, a user interface is already in practical use that allows the surface of a panel to be traced with a finger in the direction in which the user wishes to move a display image, and an electronic device equipped with such a user interface provides it to the user. However, when many icons are to be displayed, the scrolling process must be repeated until a target icon is displayed, which can take time. In particular, the number of icons to be displayed is growing, since large-capacity memories allow electronic devices to store many items of content.
The inventors of the present invention have conceived of the possibility of achieving an efficient scrolling operation by providing several different types of user interfaces with different amounts of scrolling. In that case, the same scrolling display may be provided for all the types of user interfaces. However, in order for the user to efficiently search for a target icon, it is preferable to provide, for each type of user interface, a scrolling display whose visibility is tailored to the difference in the amount of scrolling.
An electronic device of recent years has a wireless communication function and is provided with a large-capacity memory that allows various items of content to be downloaded to the memory after accessing an external content server. Therefore, an electronic device is preferably capable of efficiently searching for content stored in a server or content downloaded to a memory. An electronic device is also preferably capable of effectively presenting search results to the user.
SUMMARY OF THE INVENTION
Accordingly, a purpose of the present invention is to provide an electronic device capable of efficiently displaying content images in a process of moving (scrolling) them. Another purpose of the present invention is to provide an electronic device capable of performing a search process efficiently.
An electronic device according to one embodiment of the present invention comprises: an acquisition unit configured to acquire data of a display item corresponding to content; and a display control unit configured to generate an image to be displayed on a display. The display control unit has: a first display unit configured to arrange a plurality of display items side by side; a second display unit configured to display information related to a display item arranged by the first display unit; a first reception unit configured to acquire a first moving instruction for the display items arranged side by side; and a second reception unit configured to acquire a second moving instruction for the display items arranged side by side. The first display unit moves the display items on the display according to a moving instruction acquired by the first reception unit or the second reception unit, and the second display unit displays different related information for the same display item arranged by the first display unit when the first reception unit acquires the first moving instruction and when the second reception unit acquires the second moving instruction.
Another embodiment of the present invention relates to a method of displaying a display item. The method comprises: acquiring data of a display item corresponding to content; and generating an image to be displayed on a display. The generation of the image has: arranging a plurality of display items side by side; displaying information related to an arranged display item; acquiring a first moving instruction for the display items arranged side by side; and acquiring a second moving instruction for the display items arranged side by side. The arrangement of the plurality of display items includes moving the display items on the display according to an acquired moving instruction. In the display of the related information, different related information is displayed for the same display item when the first moving instruction is acquired and when the second moving instruction is acquired.
Yet another embodiment of the present invention relates to an electronic device having a communication function comprising: a memory unit configured to store content data; a communication unit configured to connect to a server; a reception unit configured to receive a content search instruction; a search processing unit configured to search the memory unit and allow the server to perform a search via the communication unit, in accordance with the search instruction received by the reception unit; and a display control unit configured to display a search result of the memory unit and a search result of the server in different display areas.
Still another embodiment of the present invention relates to a search processing method. The method comprises: connecting to a server; receiving a content search instruction; searching a memory unit for storing content data and allowing the connected server to perform a search, in accordance with the received search instruction; and displaying a search result of the memory unit and a search result of the server in different display areas.
Optional combinations of the aforementioned constituting elements and implementations of the invention in the form of methods, apparatuses, systems, recording mediums, and computer programs may also be practiced as additional modes of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments will now be described, by way of example only, with reference to the accompanying drawings which are meant to be exemplary, not limiting, and wherein like elements are numbered alike in several Figures, in which:
FIG. 1 is a diagram illustrating the configuration of a content providing system according to an exemplary embodiment;
FIGS. 2A and 2B are diagrams illustrating the exterior configuration of an electronic device;
FIG. 3 is a diagram illustrating the entire configuration of functional blocks of the electronic device;
FIG. 4 is a diagram illustrating a menu screen image provided by a content delivery server;
FIG. 5 is a diagram illustrating a list screen image of game categories;
FIG. 6 is a diagram illustrating a list screen image of game software;
FIG. 7 is a diagram illustrating a content purchase screen image;
FIG. 8 is a diagram illustrating functional blocks of a control unit that performs a process of displaying a content image;
FIG. 9 is a diagram illustrating a display screen image shown at the time of starting a content management application;
FIG. 10 is a diagram illustrating a display screen image shown when an access destination is changed;
FIG. 11 is a diagram illustrating a content table;
FIG. 12 is a diagram illustrating a display screen image shown at the time of a first scrolling process;
FIG. 13 is a diagram illustrating a display screen image shown at the time of a second scrolling process;
FIGS. 14A through 14C are diagrams illustrating an example of adding information related to a display order of content images;
FIG. 15 is a diagram illustrating a play selection screen image;
FIG. 16 is a diagram illustrating a flowchart of a scrolling process;
FIG. 17 is a diagram illustrating functional blocks of a control unit that performs a content search process;
FIG. 18 is a diagram illustrating a search screen image;
FIG. 19 is a diagram illustrating a search result screen image; and
FIG. 20 is a diagram illustrating a flowchart of a search process.
DETAILED DESCRIPTION OF THE INVENTION
The invention will now be described by reference to the preferred embodiments. This is not intended to limit the scope of the present invention, but to exemplify the invention.
FIG. 1 illustrates the configuration of a content providing system 1 according to an exemplary embodiment. In the content providing system 1, an electronic device 10 and a content delivery server 4 are connected via a network 3 such as the Internet in a manner such that the electronic device 10 and the content delivery server 4 can communicate with each other. The electronic device 10 transmits to the content delivery server 4 a content search instruction or a content acquisition request. The content delivery server 4 transmits to the electronic device 10 content search results, content data, a content list, a content image, or the like. The content delivery server 4 may be composed of a plurality of servers. A content image is an image that corresponds to content and may be, for example, a package image of the content. A content image may be included in content data. The electronic device 10 has a wireless communication function and connects to the network 3 via an access point 2.
The electronic device 10 according to the present exemplary embodiment has a communication function in a wireless LAN (Local Area Network) method. Alternatively, the electronic device 10 may have a communication function in another wireless communication method or may be configured such that the electronic device 10 connects to an external apparatus via a wired cable such as a USB cable so as to communicate with the external apparatus.
The access point 2 connects the electronic device 10 to another access point via a wireless LAN or functions as a relay apparatus that connects the electronic device 10 to the network 3 such as the Internet or a wired LAN. When the electronic device 10 has a wired communication function, the electronic device 10 can connect to the content delivery server 4 by using, e.g., a PC (personal computer) or the like connected to the network 3 as a relay apparatus.
FIGS. 2A and 2B illustrate the exterior configuration of a portable electronic device 10 according to the present embodiment. The electronic device 10 is provided with an upper housing 20 and a lower housing 30 that are slidably connected. FIG. 2A is a front view when the electronic device 10 is in a closed state, and FIG. 2B is a front view when the electronic device 10 is in an open state. In the state where the electronic device 10 is closed, the upper housing 20 almost completely overlaps the lower housing 30 above the lower housing 30, and operation keys provided on the front surface of the lower housing 30 are not exposed to the outside. An L button 37 and an R button 38 are provided on the upper surface of the lower housing 30. When the lower housing 30 is slid against the upper housing 20 from the closed state, the electronic device 10 becomes open, exposing the operation keys provided on the front surface of the lower housing 30 to the outside. Regardless of whether the electronic device 10 is in the open state or in the closed state, the user is capable of operating the L button 37 and the R button 38.
On the front surface of the upper housing 20, a left speaker 21a, a right speaker 21b, an operation button 22, and a touch panel 23 are provided. A slide mechanism (not shown) that slidably connects the upper housing 20 and the lower housing 30 is provided between the back surface of the upper housing 20 and the front surface of the lower housing 30.
In the open state shown in FIG. 2B, directional keys 31a, 31b, 31c, and 31d (hereinafter, generically referred to as “directional keys 31”), an analog pad 32, a microphone 33, a START button 34, a SELECT button 35, and operation buttons 36a, 36b, 36c, and 36d (hereinafter, generically referred to as “operation buttons 36”) that are provided on the front face of the lower housing 30 are exposed to the outside. The operation keys such as the directional keys 31, the analog pad 32, the START button 34, the SELECT button 35, and the operation buttons 36 become operable when the electronic device 10 becomes open.
The electronic device 10 may be a mobile phone provided with a PDA function. In addition to a call function, the electronic device 10 is configured to have a function of executing game software and/or a function of reproducing music, movies, etc., by installing a predetermined application program. Programs used to realize these functions may be already installed by the time the electronic device 10 is shipped from the factory.
FIG. 3 illustrates the entire configuration of functional blocks of the electronic device 10. An opening/closing detection unit 80 detects a transition of the state of the electronic device 10 from an open state to a closed state or from the closed state to the open state. The opening/closing detection unit 80 transmits to a control unit 50 transition information with a signal value of “0” when the electronic device 10 transitions from the open state to the closed state and transmits to the control unit 50 transition information with a signal value of “1” when the electronic device 10 transitions from the closed state to the open state.
The touch panel 23 is configured with a position input apparatus 24 and a display 25, which are connected to the control unit 50. The display 25 is capable of displaying various types of information based on a signal transmitted from the control unit 50 and displays a content search screen image, a content icon (hereinafter, also referred to as a “content image”), or the like based on an instruction from the user. The position input apparatus 24 is, for example, a touchpad and transmits to the control unit 50 position information regarding a touched part on the touch panel 23 based on a touch operation by a finger or a stylus pen. For the position input apparatus 24, various input detection methods such as a resistance film method and an electrostatic capacitance method can be employed. The control unit 50 performs a search process, a process of generating a display screen image, and the like and writes data to and/or reads data from a memory unit 60 as necessary. The memory unit 60 may be a hard disk drive (HDD), a random access memory (RAM), or the like. A communication unit 40 realizes a communication function and connects to the content delivery server 4 via the network 3.
The content delivery server 4 stores fee-based or charge-free content data. If the content is application software, the content data is configured to include a program for executing the application, a content image and content information that correspond to the application, and the like. For example, if the content is a movie, the content data is configured to include compressed moving image data, a content image and content information that correspond to the movie, and the like. An example of a procedure of downloading content data is shown in the following.
When the electronic device 10 accesses the content delivery server 4, a menu screen image is displayed on the touch panel 23. FIG. 4 illustrates a menu screen image provided by the content delivery server 4. A plurality of tabs 72, 73, and 74 are displayed on the menu screen image. When the user presses (taps) the tab 72 with his/her finger, the content delivery server 4 provides to the electronic device 10 information regarding an application (other than game software) that can be provided. When the user taps the tab 73, the content delivery server 4 provides to the electronic device 10 information regarding game software that can be provided.
When the user taps the tab 74, the content delivery server 4 provides to the electronic device 10 a list of applications and game software that have already been downloaded by the user. A search button 71 is provided to search for a desired application or desired game software.
When the user taps the tab 73, a list screen image of game categories shown in FIG. 5 is displayed on the touch panel 23. This list screen image displays a list of game categories, and the user taps the display area of the category of game software the user wishes to download. If the title of a game is already known, the user may open a search screen image by tapping the search button 71 on the menu screen image shown in FIG. 4 or on the list screen image shown in FIG. 5 so as to enter the game title.
When the user taps the display area of “Action game”, a list screen image of game software items shown in FIG. 6 is displayed on the touch panel 23. On this list screen image, a list of game software items that belong to the category of “Action game” is displayed. As information for identifying games, package images 75a through 75d, game titles 76a through 76d, and prices 77a through 77d are displayed in respective rows. A picture of stars is added for each item of game software. The number of stars represents an average rating made by users who have already downloaded the game software. The larger the number of stars, the more popular the game software.
When the user taps the display area of a game title “BBB action”, a purchase screen image shown in FIG. 7 is displayed on the touch panel 23. On the purchase screen image, a purchase button 78 and a cancel button 79, which can be operated by the user, are displayed along with a package image 81 and detailed information 82. When the user taps the purchase button 78, the content delivery server 4 performs a billing process and transmits game software data of the title “BBB action” to the electronic device 10. The communication unit 40 receives the game software data, and the control unit 50 stores the game software data in the memory unit 60.
In the content providing system 1 according to the present exemplary embodiment, the electronic device 10 is provided with a content management application for performing a process of displaying a content image and a search process. The content management application performs a process of displaying a content image and a process of searching for content.
As explained in association with FIGS. 4-7, the electronic device 10 can download content data by connecting to the content delivery server 4. The content management application plays a role of supporting content download by performing a process of displaying a content image and a process of searching for content. An explanation is given in the following on the assumption that the content management application performs a process of displaying a package image of game software and a process of searching for game software. Content to be managed is not limited to games and may be music, movies, and the like. Content to be managed may even be a name or phone number in a phone book, a picture that has been captured, a document file that has been generated, a product in a shopping site, etc. In the present exemplary embodiment, content needs to be such that a corresponding display item can be arranged on a display, and a display item may be an image or text. An example is shown in the following where a display item is a package image of game software. If a phone book is to be displayed, the display item is a name described in text, a phone number, or an image such as a picture or an avatar. If a document file is to be displayed, the display item is a file name described in text. If the content to be displayed is music, the display item is an image of its jacket. If the content is a movie, the display item is an image of its package. If the content is a book, the display item is an image of its cover.
Upon receiving, from the opening/closing detection unit 80, transition information with a signal value of “1” indicating that the electronic device 10 has transitioned from the closed state to the open state, system software installed in the control unit 50 starts the content management application.
<Process of Displaying Content Image>
FIG. 8 illustrates functional blocks of the control unit 50 that performs a process of displaying a content image. The control unit 50 is provided with an input detection unit 90, a display control unit 100, and an acquisition unit 120. The input detection unit 90 detects screen position information transmitted from the position input apparatus 24 and provides the screen position information to the display control unit 100. The display control unit 100 has a function of controlling screen display of the display 25 and is provided with a content screen display unit 116 and a scroll instruction reception unit 110. The content screen display unit 116 has a template screen display unit 102, a content image display unit 104, a related information display unit 106, and an operator display unit 108, and the scroll instruction reception unit 110 has a first reception unit 112 and a second reception unit 114.
The functions of the control unit 50 are implemented by a CPU, a memory, a content management application program loaded into the memory, or the like. FIG. 8 depicts functional blocks implemented by the cooperation of these components. Thus, a person skilled in the art should appreciate that these functional blocks can be accomplished in a variety of forms by hardware only, software only, or a combination of both. The control unit 50 has a multi-task processing function and is capable of performing a plurality of tasks at the same time.
The acquisition unit 120 acquires a content image 130 and content information 132 transmitted from the content delivery server 4 and stores the content image 130 and the content information 132 in the memory unit 60. The acquisition unit 120 also acquires the content image 130 and the content information 132 stored in the memory unit 60 and provides the content image 130 and the content information 132 to the display control unit 100. As described, the acquisition unit 120 has both a function of acquiring a content image 130 and content information 132 from the content delivery server 4 and a function of acquiring a content image 130 and content information 132 from the memory unit 60. The latter acquisition function is realized by the acquisition unit 120 reading data from the memory unit 60.
A content image 130 and content information 132 transmitted from the content delivery server 4 may be treated as a group of data sets for each item of content. The acquisition unit 120 may extract and acquire the content image 130 and the content information 132 from content data downloaded in the memory unit 60 and provide the content image 130 and the content information 132 to the display control unit 100. If the content is a game, the content data is configured to include a game program for executing the game and a content image 130 and content information 132 for identifying the game.
FIG. 9 illustrates a display screen image 170 shown at the time of starting the content management application. In the display screen image 170, an access destination selection area 140, an index display area 142, a content image display area 144, and an operator display area 146 are formed.
In the access destination selection area 140, two access destinations, “Game Store Channel” and “Libraries”, are displayed. In the present exemplary embodiment, the access destination is the content delivery server 4 if “Game Store Channel” is selected, and the access destination is the memory unit 60 if “Libraries” is selected. In the content image display area 144, a content image of the access destination selected in the access destination selection area 140 is displayed. A selection frame 150 is set to specify the access destination selected by the user. In the display screen image 170, the access destination “Game Store Channel” is being selected. Other display modes may be employed in order to indicate which access destination has been selected. For example, “Game Store Channel” may be shown in bold text or displayed in an eye-catching color. The user selects an access destination by tapping the display area of “Game Store Channel” or “Libraries”.
In the index display area 142, an index tab for sorting content images to be displayed is formed. In the index display area 142, “Featured” is an index for a group of games recommended by a content distributor, “Just in” is an index for a group of new games, and “Top download” is an index for a group of popular games. These are intended to be illustrative only, and other types of index tabs may be formed. By tapping an index tab, the user selects the type (group) of content images to be displayed.
In the content image display area 144, content images 160a, 160b, 160c, 160d, and 160e and content information 162 that belong to a type selected through an index tab are displayed. In the content image display area 144, the plurality of content images 160a, 160b, 160c, 160d, and 160e (hereinafter, generically referred to as “content images 160”) are arranged horizontally side by side, and the content image 160c arranged at the center position is displayed such that the content image 160c is larger than the other content images 160a, 160b, 160d, and 160e. An information display area 156 for displaying the content information 162 is formed below the content image 160c. The content information 162 is information related to the content image 160c. In this example, a game title and a price are displayed as the content information 162. Information display areas 156 may also be formed below the other content images 160a, 160b, 160d, and 160e, and respective sets of content information 162 may be displayed.
When the user places his/her finger on the content image display area 144 and moves (traces) his/her finger to the right or left, the content images 160 move to the right or left in accordance with the movement of the finger. The content image display area 144 includes a plurality of areas for arranging the content images 160 and may be formed as a rectangular area as shown in the figure. When the content images 160 move, the content information 162 displayed in the information display area 156 also changes in accordance with the content image 160c displayed at the center position. A process of scrolling based on input information in the content image display area 144 is referred to as a “first scrolling process”.
In the operator display area 146, an operator 154 for moving the content images 160 in a transverse direction is formed. The operator 154 is a bar used to scroll the content images 160. By moving the bar to the right or left with a finger from the center position shown in the figure, the content images 160 are moved at a speed according to the amount of motion. In this example, as the amount of shift of the operator 154 from the center position becomes larger, the speed of movement of the content images 160 becomes faster. An operator 154a shows a condition where the operator 154 is moved to the right from the center position. The content images 160 move to the left at a speed according to the amount of displacement from the center position at this time. A process of scrolling based on input information for the operator 154 is referred to as a “second scrolling process”. The second scrolling process may be configured using a scroll bar formed to include a knob such that the amount of motion of the operator 154 corresponds to the amount of movement of the content images 160.
As described, two types of scrolling processes are prepared in the electronic device 10. The first scrolling process is used when scrolling the content images 160 at low speed, and the second scrolling process is used when scrolling the content images 160 at high speed.
When the user taps the content image 160c arranged at the center position while “Game Store Channel” is being selected as the access destination, for example, a purchase screen image shown in FIG. 7 is displayed on the touch panel 23. As described, the content management application has a role of supporting the download of content data. When the user taps a link area 152, the menu screen image of the content delivery server 4 shown in FIG. 4 is displayed on the touch panel 23.
Referring back to FIG. 8, when the content management application is started, the acquisition unit 120 acquires a content image 130 and content information 132 from the content delivery server 4 via the communication unit 40 and stores the content image 130 and the content information 132 in the memory unit 60. At this time, preferably, the acquisition unit 120 acquires the content lists included in the indexes, i.e., “Featured”, “Just in”, and “Top download”, respectively, formed in the index display area 142 and acquires both the content images 130 and the content information 132 included in all the indexes all at once. The content images 130 and the content information 132 are stored in a predetermined area in the memory unit 60. The content lists of the respective indexes include link information for the content images 130 and the content information 132 that are stored in the memory unit 60. By storing the content images 130 and the content information 132 included in all the indexes in advance in the memory unit 60, the content images 160 and the content information 162 can be promptly arranged in the content image display area 144 using the content lists without accessing the content delivery server 4 again even when index tabs are switched.
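Purely as an illustration of the prefetching behavior described above, the following sketch shows one way such caching could work; the class, method, and server-interface names (ContentCache, prefetch_indexes, list_content, and so on) are hypothetical and are not part of the disclosed embodiment.

```python
# Illustrative sketch (not the disclosed implementation): prefetching the content
# images and content information for every index tab at startup so that switching
# tabs never requires another round trip to the server.

class ContentCache:
    def __init__(self, server, memory_unit):
        self.server = server            # stands in for the content delivery server 4
        self.memory_unit = memory_unit  # stands in for the memory unit 60 (a dict here)
        self.content_lists = {}         # index name -> list of link keys

    def prefetch_indexes(self, index_names):
        """Fetch content images/information for all indexes at once."""
        for index in index_names:
            links = []
            for entry in self.server.list_content(index):
                key = entry["id"]
                # Store the image and the related information in the memory unit.
                self.memory_unit[key] = {
                    "image": self.server.fetch_image(key),
                    "info": self.server.fetch_info(key),
                }
                links.append(key)
            # Each content list keeps only link information into the memory unit.
            self.content_lists[index] = links

    def items_for_index(self, index):
        """Resolve a tab switch locally, without contacting the server again."""
        return [self.memory_unit[key] for key in self.content_lists.get(index, [])]
```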
In the display control unit 100, the template screen display unit 102 displays a template screen image. The template screen image is a screen image obtained by excluding the content images 160, the content information 162, and the operator 154 from the display screen image 170. The acquisition unit 120 reads from the memory unit 60 content images 160 and content information 162 that correspond to the index selected in the index display area 142 and provides the content images 160 and the content information 162 to the content image display unit 104 and the related information display unit 106, respectively. The acquisition unit 120 reads, in reference to the content list of the index, the content images 160 and the content information 162 from the memory unit 60 using link information included in the list at this time.
The content image display unit 104 arranges the plurality of content images 160 side by side in the content image display area 144. In the display screen image 170, the plurality of content images 160 are lined up in a transverse direction on the touch panel 23. Alternatively, the content images 160 may be lined up in a longitudinal direction or in an oblique direction. The content image display unit 104 orders the respective game titles in alphabetical order and lines up the game titles from left to right in that order. If the game titles are in Japanese, the content image display unit 104 orders the game titles in the order of the Japanese syllabary and lines up the game titles from left to right in that order. As shown in the content image display area 144, the content image 160c in the center is displayed such that the content image 160c is larger than the other content images. In a content list acquired from the content delivery server 4, the order in which game titles are displayed may be pre-designated to be alphabetical order or the order of the Japanese syllabary, and the content image display unit 104 may arrange the plurality of content images 160 in the content image display area 144 in accordance with the order in which the game titles are displayed.
The related information display unit 106 displays content information 162 in the information display area 156 in conjunction with the content image 160c displayed by the content image display unit 104. A game title and a price are shown in this case. Alternatively, the name of a game maker, stars indicating an evaluation result, and the like may be displayed.
The operator display unit 108 displays the operator 154 at the center position of the operator display area 146. As described, when the operator 154 is moved by the user, the second scrolling process will be performed. The scroll instruction reception unit 110 receives a moving instruction, i.e., a scroll instruction, for the content images 160 arranged side by side.
When “Libraries” is selected in the access destination selection area 140, the access destination is changed, and content images 160 for the content stored in the memory unit 60 are displayed.
FIG. 10 illustrates a display screen image 172 shown when the access destination is changed. The display screen image 172 is basically the same as the display screen image 170 shown in FIG. 9. Thus, an explanation is given regarding the differences therebetween.
The selection frame 150 moves to the display area of “Libraries”. This allows the user to confirm that the access destination is the memory unit 60, and content images 130 and content information 132 for games that have already been downloaded and stored in the memory unit 60 are to be displayed in the content image display area 144. As shown in FIG. 4, the content delivery server 4 manages, as “My downloads” for each user, the content downloaded by the electronic device 10. In other words, the content delivery server 4 keeps track of the content stored in the memory unit 60. In addition to acquiring the content images 130 and the content information 132 that are stored in the content delivery server 4, the acquisition unit 120 may acquire the content images 130 and the content information 132 for the content stored in the memory unit 60 at the same time when generating the display screen image 170 shown in FIG. 9.
In the index display area 142, an index tab for sorting content images 160 to be displayed is formed. In the index display area 142, “Titles” is an index for a game group downloaded in the memory unit 60, “Recently played” is an index for a recently played game group, and “Recently added” is an index for a recently downloaded game group. These are intended to be illustrative only, and other types of indexes may be formed. By tapping an index tab, the user selects the type of content images 160 to be displayed.
The display control unit 100 creates in advance a table for the content stored in the memory unit 60.
FIG. 11 illustrates the created content table. In the content table, a game title, link information for identifying an area in which a content image 130 or the like is stored, the date and time of the downloading of content data, the date and time of the last play, and the initial letter of the title are associated with one another. The display control unit 100 records information regarding content in the content table at the time of downloading the content data. The initial letter of the title is the initial letter of the game title. The display control unit 100 extracts the initial letter from the game title and includes the initial letter in the table. Every time the game is played, the display control unit 100 updates the date and time of the last play. The content table is stored in the memory unit 60.
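By way of illustration only, one row of the content table of FIG. 11 could be modeled roughly as follows; the field and function names are assumptions made for this sketch and do not appear in the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative model of one row of the content table of FIG. 11.
@dataclass
class ContentTableEntry:
    game_title: str           # title of the game
    link_info: str            # identifies where the content image etc. are stored
    downloaded_at: datetime   # date and time the content data was downloaded
    last_played_at: datetime  # date and time of the last play (updated on every play)
    initial_letter: str       # initial letter extracted from the game title

def make_entry(game_title: str, link_info: str, downloaded_at: datetime) -> ContentTableEntry:
    """Create a row at download time; the initial letter is derived from the title."""
    return ContentTableEntry(
        game_title=game_title,
        link_info=link_info,
        downloaded_at=downloaded_at,
        last_played_at=downloaded_at,  # no play recorded yet; assumption for the sketch
        initial_letter=game_title[:1].upper(),
    )
```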
Based on an index selected by the user, the acquisition unit 120 reads, from the memory unit 60, and acquires content images 130 and content information 132 that correspond to the index, in reference to the content table.
For example, when the “Titles” index is selected, the acquisition unit 120 acquires content images 130 and content information 132 from the memory unit 60 using link information of all game titles included in the content table. For example, when the “Recently played” index is selected, the acquisition unit 120 identifies game titles last played within a predetermined number of days from the current date and time and acquires content images 130 and content information 132 from the memory unit 60 using link information of the identified game titles, in reference to the dates and times of the last play included in the content table. The content image display unit 104 arranges a plurality of content images 160 side by side in the content image display area 144. The related information display unit 106 displays the corresponding content information 162 in the information display area 156 in conjunction with the content image 160c displayed by the content image display unit 104.
The display control unit 100 may create a content list included in each index in advance in accordance with the content table shown in FIG. 11. The display control unit 100 updates the content list every time there is an update to the information of the content table. For example, the content list of “Titles” includes all game titles downloaded onto the memory unit 60. In reference to the respective initial letters of the game titles, the display control unit 100 sorts the game titles into classes corresponding to letters of the alphabet. The classes specify a display order at the time of the second scrolling process, which is described later. More specifically, the display control unit 100 sorts the game titles into classes “A”, “B”, . . . , and “Z”. If the initial letter of a game title is A, the game title is sorted into the class “A”. If the initial letter of a game title is Z, the game title is sorted into the class “Z”. The display control unit 100 sets a display order according to the classes. In this example, the display control unit 100 sets the content images to be displayed in alphabetical order and creates a content list. The display order within each class is set in alphabetical order of the second and subsequent letters of each game title. This allows a class and a display order to be set for each game title in the content list. The content list includes display information corresponding to each class and, more specifically, includes letters such as A, B, . . . , and Z. This content list is updated every time a new game title is downloaded.
A content list for “Recently played” includes game titles last played within a predetermined number of days (e.g., 7 days) from the current date. For example, setting the present day as the current day, the display control unit 100 sorts game titles included in the content table into classes “current day”, “one day before”, “two days before”, “three days before”, and “one week before” based on the date and time of the last play of each game title. More specifically, a game title is sorted into “current day” if the present day and the date of the last play thereof are the same and is sorted into “one day before” if the date of the last play thereof is one day before the present day. Similarly, a game title is sorted into “one week before” if the date of the last play thereof is four to seven days before the present day. A game title is not included in the content list for “Recently played” if the date of the last play thereof is eight or more days before. The display control unit 100 sets a display order according to the classes. In this example, the display control unit 100 sets the game titles to be displayed in order of date and time closest to the present date and time, i.e., in the order of “current day”, “one day before”, “two days before”, “three days before”, and “one week before”, and creates a content list. The display order within each class is set based on time information regarding the date and time of the last play. As described, the display control unit 100 creates a content list for “Recently played” in which a game title, a class, and a display order are associated. The content list also includes display information corresponding to each class and, more specifically, includes display information such as “current day”, “one day before”, “two days before”, “three days before”, “one week before”, and the like. This content list is updated every time a game is played. The display control unit 100 creates a content list for “Recently added” in a similar manner. The content list is stored in the memory unit 60.
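The construction of classes and display orders described above could be sketched as follows, reusing the hypothetical ContentTableEntry record from the previous sketch; the bucket labels mirror the description, while the helper names and the exact ordering keys are assumptions.

```python
from datetime import date

# Illustrative sketch of building content lists with classes and a display order.

def build_titles_list(entries):
    """Class = initial letter; order = alphabetical by the whole title."""
    ordered = sorted(entries, key=lambda e: e.game_title.upper())
    return [(e.game_title, e.initial_letter) for e in ordered]

def recency_class(last_played: date, today: date):
    """Map a last-played date to one of the recency classes, or None if too old."""
    days = (today - last_played).days
    if days == 0:
        return "current day"
    if days == 1:
        return "one day before"
    if days == 2:
        return "two days before"
    if days == 3:
        return "three days before"
    if days <= 7:
        return "one week before"
    return None  # not included in the "Recently played" list

def build_recently_played_list(entries, today):
    """Order = most recently played first, keeping only titles that have a class."""
    classified = [(e, recency_class(e.last_played_at.date(), today)) for e in entries]
    classified = [(e, c) for e, c in classified if c is not None]
    classified.sort(key=lambda ec: ec[0].last_played_at, reverse=True)
    return [(e.game_title, c) for e, c in classified]
```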
When the display control unit 100 creates a content list for each index in advance as described above, the acquisition unit 120 can acquire content images 130 and content information 132 from the memory unit 60 in reference to the content list. With this, compared to acquiring the content images 130 and the content information 132 from the content table, the acquisition unit 120 can reduce the time for acquisition.
Referring to FIG. 10, a thumbnail image 164 is a content image of a game being paused. The electronic device 10 has a mechanism of pausing the progress of the game and displaying the display screen image 172 when the user presses a PAUSE button while playing the game. Since the electronic device 10 is capable of performing a multi-task process, the user can select another game in the display screen image 172 and play the game while pausing the progress of the game the user has been playing. By tapping the thumbnail image 164, the user can go back to the game being paused. The thumbnail image 164 may be, for example, a content image of the last played game. In this case, by tapping the thumbnail image 164, the user can execute a game that was previously played.
An explanation is given regarding two types of scrolling processes in the following.
Based on a touch by a finger or the like, the position input apparatus 24 transmits to the control unit 50 position information regarding a touched part on the touch panel 23. The input detection unit 90 detects the position information transmitted from the position input apparatus 24 and provides the position information to the scroll instruction reception unit 110.
(First Scrolling Process)
When the position information indicates a touched position in the content image display area 144, the first reception unit 112 receives the position information as a moving instruction for content images. More specifically, the first reception unit 112 derives the direction of movement, the amount of movement (distance), and the speed of movement of the content images 160 based on the position information and acquires the derived direction of movement, the derived amount of movement, and the derived speed of movement as a moving instruction. In the present exemplary embodiment, the first reception unit 112 detects the motion of a finger within a range of a predetermined angle from the horizontal direction of the screen as a movement in the rightward direction or the leftward direction of the screen. The first reception unit 112 derives the amount of movement and the speed of movement of the content images 160 based on the distance from a point (touch start point) at which a finger comes into contact with the touch panel 23 to a point (touch end point) at which the finger is removed and on the speed at that time. As described, the first reception unit 112 acquires a moving instruction based on input to the display area of the content images 160 displayed on the display 25. The first reception unit 112 transmits the moving instruction to the content image display unit 104 and the related information display unit 106. The content image display unit 104 moves the content images 160 in a predetermined order on the display 25 based on the moving instruction. Referring to FIG. 10, five content images 160 are set to be displayed in the content image display area 144, and the content image display unit 104 moves the content images 160 by the indicated amount of movement in the indicated direction and at the indicated speed.
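A minimal sketch of how such a moving instruction might be derived from a touch gesture is given below; the tolerance angle, the returned fields, and the function name are assumptions and not values taken from the disclosure.

```python
import math

# Illustrative sketch of deriving a first-scroll moving instruction from a touch
# gesture: direction, amount of movement, and speed are computed from the touch
# start point, touch end point, and gesture duration.

HORIZONTAL_TOLERANCE_DEG = 30.0  # "predetermined angle" from the horizontal (assumed value)

def derive_first_scroll(start_xy, end_xy, duration_s):
    """Return (direction, distance, speed) or None if the gesture is not horizontal enough."""
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    if dx == 0 and dy == 0:
        return None
    angle = abs(math.degrees(math.atan2(dy, dx)))
    # Accept motions close to the screen's horizontal axis (in either direction).
    if not (angle <= HORIZONTAL_TOLERANCE_DEG or angle >= 180.0 - HORIZONTAL_TOLERANCE_DEG):
        return None
    direction = "right" if dx > 0 else "left"
    distance = abs(dx)                                    # amount of movement
    speed = distance / duration_s if duration_s > 0 else 0.0
    return direction, distance, speed

# Example: a quick leftward trace across the content image display area.
print(derive_first_scroll((300, 120), (120, 128), 0.25))  # ('left', 180, 720.0)
```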
The related information display unit 106 controls the content information 162 displayed in the information display area 156 based on the moving instruction such that the content information 162 matches the content image 160c displayed at the center position of the content image display area 144 at this time. In other words, when the content images displayed in the content image display area 144 are scrolled, the related information display unit 106 sequentially displays, in the information display area 156, a game title that matches the content image 160c displayed at the center position.
FIG. 12 illustrates a display screen image 174 shown at the time of a first scrolling process. In addition to displaying the game title of the content image 160c displayed in the center position, the information display area 156 displays an indicator 166. The indicator 166 is used to indicate the relative position of the content images 160 being displayed among the plurality of content images 160 for which a display order is set. The entire length of the indicator 166 represents the total number of content images 160, and the length of a display indicator 168 included in the indicator 166 represents the ratio of the number of content images 160 being displayed (five in this case) to the total number of content images 160. The position of the display indicator 168 indicates a relative position in the display order. The content information 162 displayed in the information display area 156 at the time of a first scrolling process is a game title and is the same as the content information 162 (see FIG. 10) displayed in the information display area 156 when no scrolling is performed (during rest).
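As a rough illustration of the indicator geometry just described, the length and position of the display indicator 168 within the indicator 166 could be computed as simple ratios; the pixel values and the function name below are hypothetical.

```python
# Illustrative sketch of the indicator geometry described for FIG. 12: the bar
# length reflects the share of items currently visible, and its position reflects
# where those items sit in the overall display order.

def indicator_geometry(total_items, visible_items, first_visible_index, bar_width_px):
    """Return (indicator_length_px, indicator_offset_px) within the full-width bar."""
    if total_items == 0:
        return 0, 0
    length = bar_width_px * visible_items / total_items
    offset = bar_width_px * first_visible_index / total_items
    return round(length), round(offset)

# Example: 5 of 40 titles visible, starting at the 12th title, in a 200 px indicator.
print(indicator_geometry(40, 5, 12, 200))  # (25, 60)
```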
(Second Scrolling Process)
When the position information indicates a touched position on the operator 154, the second reception unit 114 receives the position information as a moving instruction for content images. More specifically, the second reception unit 114 obtains the amount of displacement of the operator 154 from a reference position (the center position in the operator display area 146) based on the position information, derives the direction of movement and the speed of movement of the content images 160, and acquires the derived direction of movement and the derived speed of movement as a moving instruction. As described, the second reception unit 114 acquires a moving instruction based on input to the operator 154 displayed in an operable manner on the display 25. The second reception unit 114 transmits the moving instruction to the content image display unit 104 and the related information display unit 106. The content image display unit 104 moves the content images 160 in a predetermined order on the display 25 in the indicated direction and at the indicated speed based on the moving instruction.
FIG. 13 illustrates a display screen image 176 shown at the time of a second scrolling process and illustrates a state where the operator 154 is moved in the rightward direction. When the user drags the operator 154 from the reference position in the rightward direction, a moving instruction for movement in the leftward direction is generated. When the user removes his/her finger from the touch panel 23, the operator 154 automatically returns to the reference position. The second reception unit 114 continues to acquire a moving instruction until the operator 154 returns to the reference position. In the present exemplary embodiment, a mechanism is employed where the content images 160 move to the left when the operator 154 is moved to the right. Alternatively, a design may be employed such that the content images 160 move to the right when the operator 154 is moved to the right.
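A minimal sketch of the second scrolling process, assuming a linear gain between operator displacement and scroll speed, is shown below; the gain constant, the sign convention, and the function name are assumptions.

```python
# Illustrative sketch of the second scrolling process: the operator's displacement
# from its reference (center) position is mapped to a scroll direction and speed.

SPEED_PER_PIXEL = 4.0  # assumed gain: a larger displacement yields faster scrolling

def derive_second_scroll(operator_x, reference_x):
    """Return (direction, speed); dragging the operator right scrolls the content left."""
    displacement = operator_x - reference_x
    if displacement == 0:
        return None  # operator back at the reference position: stop scrolling
    direction = "left" if displacement > 0 else "right"
    speed = abs(displacement) * SPEED_PER_PIXEL
    return direction, speed

# Example: operator dragged 30 px to the right of center.
print(derive_second_scroll(130, 100))  # ('left', 120.0)
```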
As the amount of displacement of the operator 154 becomes larger, the second reception unit 114 receives a moving instruction for a faster speed of movement. Therefore, this second scrolling process is suitable for high-speed scrolling. It is considered that this second scrolling process is actively used by the user, for example, when the number of items of content to be displayed is large. When high-speed scrolling is performed, it is difficult for the user to visually check the content information 162 even when the related information display unit 106 displays the content information 162. As information related to the content images 160 arranged in the display screen image 176, the related information display unit 106 displays, instead of displaying the content information 162, information related to the display order of the content images 160 in association with the content images 160. In this example, the content images 160 are ordered in alphabetical order. Thus, the respective initial letters of the game titles are used as indicators 178a, 178b, 178c, and 178d related to the display order. The initial letters of the game titles are mapped to the respective game titles and the respective content images 160 by the related information display unit 106 and are acquired from the content table shown in FIG. 11 or from a content list generated from the content table.
In this example, the content images 160 are moved to the left by moving the operator 154 to the right. The initial letter of the content image 160c is “J” and is shown as the indicator 178c. If the initial letter of the content image 160d is also “J”, an indicator 178 is not added to the content image 160d. This is because the user can recognize that the game title thereof has the same initial letter as that of the previous content image 160c if an indicator is not added to the subsequent content image 160d. On the other hand, the indicator 178d, which is represented by “K”, is added to the content image 160e. The user can recognize that a game title starting with “K” has appeared at the right edge after a game title starting with “J”. By displaying the initial letter of the game title of the content image 160e before the content image 160e moves to the center position, the user can learn when to remove his/her finger from the operator 154 so as to stop scrolling. As described, a plurality of content images 160 are displayed in the content image display area 144. Thus, indicators 178a, 178b, and 178d are preferably added to content images 160 other than the content image 160c located at the center position in order to show the switching of initial letters so as to assist the user's task of finding a target game title.
FIGS. 14A through 14C illustrate examples of adding information related to a display order of content images. In FIGS. 14A through 14C, a game title is described in each frame expressing a content image 160 in order to facilitate understanding. FIG. 14B illustrates a state where the content images have been shifted to the left by one content image from the state shown in the display screen image in FIG. 14A. FIG. 14C illustrates a state where the content images have been shifted to the left by one content image from the state shown in the display screen image in FIG. 14B.
In FIG. 14A, the related information display unit 106 displays “J” below the content image 160c located at the center position. The related information display unit 106 displays an indicator 178 below the content image 160c at all times. In a display screen image in which “Libraries” is set to be the access destination, by tapping the content image 160c located at the center position, the user can execute the game thereof. Therefore, an indicator 178 is set to be displayed below the content image 160c at all times. FIG. 15 illustrates a play selection screen image 180. When the user touches a Play button, the game software “JKL fishing” is started.
Referring back to FIG. 14A, the indicator 178 is displayed below the content image 160c at all times. Thus, even when a game title having the same initial letter exists next to the content image 160c, an indicator 178 is not displayed below the content image thereof (the content image 160d in this case). In other words, the same indicator 178 is not displayed twice. “D”, “I”, and “K” are displayed below the content image 160a, the content image 160b, and the content image 160e, respectively. By seeing “K”, the user can recognize that a game title whose initial letter is K is approaching while being scrolled. For example, when the user wishes to play a game title whose initial letter is K, the user can learn that it is time to remove his/her finger from the operator 154.
In FIG. 14B, the related information display unit 106 displays “J” below the content image 160c located at the center position. “J” is also the initial letter of “JKL fishing”, which has been moved to the left from the center position. However, since the initial letter of the game title “JZZ fight” located at the center position is “J”, the related information display unit 106 displays “J” below the content image 160c instead of displaying “J” below the content image 160b. As described, while displaying the initial letter of the game title below the content image 160c located at the center position, the related information display unit 106 does not display the same initial letter below the content image 160b. The related information display unit 106 displays “K” below the content image 160d. However, since the initial letter of the game title of the content image 160e is also “K”, the related information display unit 106 does not display an indicator 178 below the content image 160e. As described, when displaying, before the center position (i.e., at the right side) in the direction of scrolling, an initial letter different from the initial letter of the game title located at the center position, the related information display unit 106 displays the initial letter “K” of a game title below the content image 160d located closer to the center position. The related information display unit 106 displays “I” below the content image 160a.
In FIG. 14C, the related information display unit 106 displays “K” below the content image 160c located at the center position. The related information display unit 106 also displays “P” below the newly-displayed content image 160e. By seeing “P”, the user can recognize that a game title whose initial letter is P is approaching while being scrolled. An initial letter “J” that has passed the center position is displayed below the content image 160b, which is the last content image with that initial letter. As described, when displaying, after the center position (i.e., at the left side) in the direction of scrolling, an initial letter different from the initial letter of the game title located at the center position, the related information display unit 106 displays the initial letter “J” of a game title below the content image 160b located closer to the center position.
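One way to read FIGS. 14A through 14C is that each run of identical initial letters in the visible row is labeled exactly once, on the image of that run closest to the center (or on the center image itself when the run contains it). The following sketch implements that reading; the rule as stated here is an interpretation of the figures, and the function name is hypothetical.

```python
# Illustrative sketch of the indicator-placement rule suggested by FIGS. 14A-14C.

def place_indicators(initial_letters, center_index):
    """Return a list where each slot holds the letter to display, or None."""
    labels = [None] * len(initial_letters)
    i = 0
    while i < len(initial_letters):
        j = i
        while j + 1 < len(initial_letters) and initial_letters[j + 1] == initial_letters[i]:
            j += 1
        # The run spans positions i..j; pick the position closest to the center.
        if i <= center_index <= j:
            pos = center_index
        elif j < center_index:
            pos = j  # run is to the left of the center: label its rightmost image
        else:
            pos = i  # run is to the right of the center: label its leftmost image
        labels[pos] = initial_letters[i]
        i = j + 1
    return labels

# Example corresponding to FIG. 14B: letters I, J, J, K, K with the center at index 2.
print(place_indicators(["I", "J", "J", "K", "K"], 2))  # ['I', None, 'J', 'K', None]
```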
As described above, in the electronic device 10 according to the present exemplary embodiment, two types of scrolling processes are prepared, and the related information display unit 106 displays different related information for the same content images 160 arranged by the content image display unit 104 in each case, as shown in FIGS. 12 and 13. More specifically, the related information display unit 106 displays different types of related information when the first reception unit 112 acquires a moving instruction and when the second reception unit 114 acquires a moving instruction. By displaying different types of related information, a user interface that takes advantage of the features of each scrolling process can be realized. As described above, a content list includes display information that corresponds to a class. During a second scrolling process, the acquisition unit 120 may read out the display information that corresponds to a class from the content list and provide the display information to the related information display unit 106, and the related information display unit 106 may display, as related information, the display information that corresponds to a class.
FIG. 16 illustrates a flowchart of a scrolling process. The flowchart shown in FIG. 16 indicates the processing procedure of the components by a combination of the letter “S” (the initial of the word “Step”), which represents a step, and a number. When some sort of determination process is performed by a process indicated by the combination of the letter “S” and a number, the processing sequence is indicated while adding the letter “Y” (the initial of the word “Yes”) when the determination result is positive (e.g., Y in S10) and is indicated while adding the letter “N” (the initial of the word “No”) when the determination result is negative (e.g., N in S10). The meaning of this notation is the same in the flowchart shown in the other figure.
When the first reception unit 112 acquires a scroll instruction for a content image, i.e., a scroll instruction input to the content image display area 144, from the input detection unit 90 (Y in S10), the content image display unit 104 moves (scrolls) the content images 160 in accordance with the scroll instruction (S12), and the related information display unit 106 displays content information for identifying the content of the content image 160c located at the center position in association with the content image 160c (S14).
On the other hand, in a case where the first reception unit 112 does not acquire a scroll instruction (N in S10), when the second reception unit 114 acquires a scroll instruction for the operator 154 from the input detection unit 90 (Y in S16), the content image display unit 104 moves the content images 160 in accordance with the scroll instruction (S18), and the related information display unit 106 displays the respective initial letters of the titles of the content images 160 displayed in the content image display area 144 (S20). If the second reception unit 114 does not acquire a scroll instruction (N in S16), this flow is ended.
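The branching of FIG. 16 could be mirrored in code roughly as follows; the object interfaces (display, related_info) and their method names are assumptions made for this sketch.

```python
# Illustrative sketch mirroring the flow of FIG. 16: the first reception unit is
# checked first (S10), then the second (S16); the related information shown
# depends on which kind of scroll instruction arrived.

def handle_scroll(first_instruction, second_instruction, display, related_info):
    if first_instruction is not None:                  # Y in S10
        display.scroll(first_instruction)              # S12
        related_info.show_content_information()        # S14: title etc. for the center image
    elif second_instruction is not None:               # N in S10, Y in S16
        display.scroll(second_instruction)             # S18
        related_info.show_initial_letters()            # S20: initial letters as indicators
    # N in S16: no scroll instruction, so the flow simply ends.
```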
An example has been shown where the content image display unit 104 orders a plurality of content images in alphabetical order of the respective game titles. Alternatively, for example, in the display screen image 172 shown in FIG. 10, content images included in an index group such as "Recently played" or "Recently added" may be ordered according to date and time. In this case, the content image display unit 104 acquires the display order of the content images in accordance with the content table shown in FIG. 11 or in accordance with a content list generated based on the content table. As described previously, a last-played date or a purchase date is classified into a separate class according to several chronological stages such as "current day", "one day before", "two days before", "three days before", and "one week before".
The related information display unit 106 displays the date and time of the last play or the date and time of purchase as related information during rest and during the first scrolling process. On the other hand, the related information display unit 106 displays "current day", "one day before", and "two days before" as indicators 178 related to the display order in accordance with the content table or the content list during the second scrolling process. Adding the indicators 178 to the content image display area 144 as described above allows the user to recognize related information even during high-speed scrolling.
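By way of illustration, the chronological classification described above can be pictured as a simple mapping from a last-played or purchase date to a class label. The helper below, with the hypothetical name classify_date, is a sketch that assumes the stages are exactly "current day" through "one week before"; the actual class boundaries are defined by the content table of FIG. 11.

    from datetime import date

    # Sketch: map a last-played or purchase date to a chronological class label
    # (hypothetical stage boundaries; the real ones come from the content table).
    def classify_date(target: date, today: date) -> str:
        elapsed = (today - target).days
        if elapsed <= 0:
            return "current day"
        if elapsed == 1:
            return "one day before"
        if elapsed == 2:
            return "two days before"
        if elapsed == 3:
            return "three days before"
        if elapsed <= 7:
            return "one week before"
        return "older"

During the second scrolling process, the label obtained for each content item would be shown as an indicator 178, whereas during rest and during the first scrolling process the full date and time would be shown.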
When the user removes his/her finger from the operator 154 such that the second reception unit 114 finishes receiving a moving instruction, the related information display unit 106 displays content information 162 in the information display area 156 as shown in FIG. 10. As shown in FIG. 12, the content information 162 is also displayed during the first scrolling process. Therefore, when the second scrolling process is completed, the related information display unit 106 displays, in the information display area 156, the content information 162 displayed during the first scrolling process.
<Content Search Process>
FIG. 17 illustrates functional blocks of the control unit 50 that performs a process of searching for content. The control unit 50 is provided with an input detection unit 90, a search instruction reception unit 200, a search processing unit 202, an acquisition unit 204, and a display control unit 100. The input detection unit 90 detects screen position information transmitted from the position input apparatus 24. The display control unit 100 has a function of controlling screen display of the display 25 and has a content screen display unit 116, a search screen display unit 210, a display area determination unit 212, and a search result display unit 214. The content screen display unit 116 is the same as the content screen display unit shown in FIG. 8 and generates a display screen image 172 for displaying content stored in the memory unit 60 (see FIG. 10) or a display screen image 170 for displaying content stored in the content delivery server 4 (see FIG. 9).
The functions of the control unit 50 are implemented by a CPU, a memory, a content management application program loaded into the memory, and the like. FIG. 17 depicts functional blocks implemented by the cooperation of these components. Thus, a person skilled in the art should appreciate that these functional blocks may be accomplished in a variety of forms by hardware only, by software only, or by a combination of both.
In the display screen image 170 shown in FIG. 9 or the display screen image 172 shown in FIG. 10, a search processing function is activated when the user taps a search button 71. The input detection unit 90 detects position information transmitted from the position input apparatus 24 and provides the position information to the search instruction reception unit 200. When the search instruction reception unit 200 detects that the provided position information indicates the display area of a search button 71, the search instruction reception unit 200 recognizes that the position information is a search instruction, receives the position information, and then transmits the position information to the display control unit 100. In the display control unit 100, the search screen display unit 210 receives the search instruction and displays a search screen image.
FIG. 18 illustrates a search screen image 230. A software keyboard 222 is displayed in the search screen image 230. The user inputs a desired character string to an input window 220 by pressing the software keyboard 222. The input detection unit 90 transmits the input character string to the search instruction reception unit 200. The search instruction reception unit 200 receives the input character string as a search query. The search instruction reception unit 200 transmits the search query to the search processing unit 202.
In accordance with the search instruction and the search query received by the search instruction reception unit 200, the search processing unit 202 searches the memory unit 60 and also allows the content delivery server 4 to perform a search via the communication unit 40. For example, if the character string that has been input is "action", the search processing unit 202 searches the content data stored in the memory unit 60 for content data that includes "action" in a character string thereof. In the content data, the category of a game is incorporated as attribute information. The search processing unit 202 therefore searches for content data that includes "action" in its attribute information or in its game title. The search processing unit 202 also instructs the content delivery server 4 to perform the same search via the communication unit 40.
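A simple way to picture the local half of this search is a substring match over the title and attribute information of each item of content data. The sketch below assumes hypothetical field names title and attributes; the request to the content delivery server 4 is only hinted at in a comment.

    # Sketch of the local search performed by the search processing unit 202
    # (hypothetical field names; not the actual data format of the memory unit 60).
    def search_memory_unit(content_data_list, query):
        query = query.lower()
        hits = []
        for item in content_data_list:
            in_title = query in item["title"].lower()
            in_attributes = any(query in attr.lower() for attr in item["attributes"])
            if in_title or in_attributes:
                hits.append(item)
        return hits

    # The same query would also be forwarded to the content delivery server 4
    # via the communication unit 40, e.g. request_server_search(query).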
The acquisition unit 204 acquires search results from the content delivery server 4. The acquisition unit 204 also acquires search results of the memory unit 60 from the search processing unit 202. The acquisition unit 204 provides the acquired search results to the display control unit 100. The display control unit 100 displays the search results of the memory unit 60 and the search results of the content delivery server 4 in different display areas. In other words, the display control unit 100 divides the display 25 into two display areas and displays the search results of the memory unit 60 and the search results of the content delivery server 4 each in a separate display area. By displaying the search results separately, the user can distinguish between the respective sets of search results at a glance.
The display area determination unit 212 determines a display area for the search results of the memory unit 60 and a display area for the search results of the content delivery server 4. The display area determination unit 212 determines these display areas according to the screen image shown when the search instruction reception unit 200 received the search instruction. More specifically, the display area determination unit 212 determines a display area for a search result based on whether the search instruction reception unit 200 received the search instruction while a display screen image 170 in which the access destination was set to "Game Store Channel" was being displayed or while a display screen image 172 in which the access destination was set to "Libraries" was being displayed. Information identifying the display screen image shown when the search instruction was received is transmitted to the display area determination unit 212 by the search instruction reception unit 200.
FIG. 19 illustrates a search result screen image 236. The search result screen image 236 is divided into a first display area 232 located at the upper part of the screen and a second display area 234 located at the lower part of the screen. In the search result screen image 236, the first display area 232 is located above the second display area 234. Thus, it is easier for the user to visually recognize a search result displayed in the first display area 232.
The display area determination unit 212 determines the display area for the search results of the memory unit 60 and the display area for the search results of the content delivery server 4 according to the screen image shown when the search instruction reception unit 200 received the search instruction. More specifically, when the search instruction reception unit 200 receives the search instruction while the display screen image 172 (see FIG. 10) displaying content stored in the memory unit 60 is being displayed, the display area determination unit 212 determines the display area for the search results of the memory unit 60 to be the first display area 232 located on a side that allows for easier visual recognition and determines the display area for the search results of the content delivery server 4 to be the second display area 234. On the other hand, when the search instruction reception unit 200 receives the search instruction while the display screen image 170 (see FIG. 9) displaying content stored in the content delivery server 4 is being displayed, the display area determination unit 212 determines the display area for the search results of the content delivery server 4 to be the first display area 232 located on a side that allows for easier visual recognition and determines the display area for the search results of the memory unit 60 to be the second display area 234. For example, in a case where the display areas of the respective search results are set to be on the left side and on the right side of the screen, the display area on the left side is treated as the first display area 232, and the display area on the right side is treated as the second display area 234, since the display area on the left side corresponds to an area that allows for easier visual recognition. FIG. 19 illustrates the search result screen image 236 shown when the search instruction is received while the display screen image 170 is being displayed.
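The determination made by the display area determination unit 212 reduces to a single branch on which display screen image was shown when the search instruction was received. The following sketch uses hypothetical identifiers LIBRARY_SCREEN and STORE_SCREEN for the display screen images 172 and 170, respectively, and is not the actual implementation.

    # Sketch: decide which search results occupy the first (upper) display area 232.
    LIBRARY_SCREEN = "display_screen_image_172"   # content stored in the memory unit 60
    STORE_SCREEN = "display_screen_image_170"     # content stored in the content delivery server 4

    def determine_display_areas(screen_at_instruction):
        if screen_at_instruction == LIBRARY_SCREEN:
            return {"first_area": "memory_unit_results",
                    "second_area": "server_results"}
        else:  # STORE_SCREEN
            return {"first_area": "server_results",
                    "second_area": "memory_unit_results"}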
It can be said that the user is interested in the search results of the memory unit 60 when the user taps a search button 71 in the display screen image 172, while it can be said that the user is interested in the search results of the content delivery server 4 when the user taps a search button 71 in the display screen image 170. Thus, by displaying the search results in which the user has higher interest in the first display area 232, a search result screen image 236 that can be easily viewed by the user can be provided. The first display area 232 may be set to be larger than the second display area 234. With this, the number of displayed search results in which the user has higher interest can be relatively increased.
The search result display unit 214 displays the search results of the memory unit 60 and the search results of the content delivery server 4 side by side in the respective determined display areas. The search result display unit 214 lines up the search results of the memory unit 60 in a first order and the search results of the content delivery server 4 in a second order different from the first order. More specifically, the search result display unit 214 lines up the search results of the memory unit 60, for example, from the top in order of most recently played, while the search result display unit 214 lines up the search results of the content delivery server 4, for example, in order of hits in the search performed by the content delivery server 4. The search results of the content delivery server 4 may also be lined up in alphabetical order.
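The two orderings can be pictured as two independent sort keys, one per display area. The sketch below assumes hypothetical fields last_played for the memory unit 60 results and hit_rank for the content delivery server 4 results; any other key (e.g., alphabetical order of titles) could be substituted for the second set.

    # Sketch: line up the two sets of search results in different orders
    # (hypothetical fields last_played and hit_rank).
    def order_results(memory_hits, server_hits):
        first = sorted(memory_hits, key=lambda x: x["last_played"], reverse=True)  # most recently played first
        second = sorted(server_hits, key=lambda x: x["hit_rank"])                  # order of hits in the server search
        return first, second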
In the first display area 232 shown in FIG. 19, a mark "installed" is added to a game title "BBB action". This means that the game has already been downloaded. The content delivery server 4 manages the already-downloaded content of the electronic device 10. Thus, when returning search results, the content delivery server 4 may add a predetermined mark such as "installed" to content that has already been downloaded. This process may instead be performed by the electronic device 10. When the search results of the content delivery server 4 include already-downloaded content, the content may also be deleted from the search result screen image 236.
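Marking or removing already-downloaded content from the server results amounts to a membership test against the locally held content. This is a sketch assuming a hypothetical installed_ids set of identifiers; whether the mark is added by the content delivery server 4 or by the electronic device 10 is the design choice noted above.

    # Sketch: annotate or drop already-downloaded content in the server results
    # (hypothetical installed_ids set of locally stored content identifiers).
    def mark_installed(server_hits, installed_ids, drop_installed=False):
        annotated = []
        for item in server_hits:
            if item["id"] in installed_ids:
                if drop_installed:
                    continue                       # omit from the search result screen image 236
                item = dict(item, mark="installed")
            annotated.append(item)
        return annotated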
FIG. 20 illustrates a flowchart of a search process. When the user taps a search button 71, the search instruction reception unit 200 receives a search instruction (Y in S30). If the user does not tap the search button 71 (N in S30), the search process is not started.
The search screen display unit 210 receives the search instruction and displays a search screen image (S32). Upon receiving a search query input to the search screen image (S34), the search instruction reception unit 200 transmits the search query to the search processing unit 202. In accordance with the search instruction and the search query, the search processing unit 202 searches the memory unit 60 and also allows the content delivery server 4 to perform a search via the communication unit 40 (S36). The acquisition unit 204 acquires search results from the content delivery server 4 and also acquires search results of the memory unit 60 from the search processing unit 202 (S38). The acquisition unit 204 provides the acquired search results to the display control unit 100.
The display area determination unit 212 determines the display area for the search results of the memory unit 60 and the display area for the search results of the content delivery server 4 according to the screen image shown when the search instruction reception unit 200 received the search instruction (S40). When content of the memory unit 60 is being displayed (Y in S40), the display area determination unit 212 determines the first display area 232 located at the upper side to be the display area for the search results of the memory unit 60 (S42), and the search result display unit 214 displays the search results (S46). On the other hand, when content of the content delivery server 4 is being displayed (N in S40), the display area determination unit 212 determines the first display area 232 located at the upper side to be the display area for the search results of the content delivery server 4 (S44), and the search result display unit 214 displays the search results (S46).
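The steps S30 through S46 can likewise be summarized in a short sketch. The names below (ui, search_processing_unit, acquisition_unit, display_area_determination_unit, search_result_display_unit) are hypothetical stand-ins for the functional blocks of FIG. 17 and do not represent the actual implementation of the electronic device 10.

    # Minimal sketch of the search flow of FIG. 20 (hypothetical names).
    def handle_search(ui, search_processing_unit, acquisition_unit,
                      display_area_determination_unit, search_result_display_unit):
        if not ui.search_button_tapped():                  # N in S30: search process not started
            return
        ui.show_search_screen()                            # S32
        query = ui.wait_for_query()                        # S34
        local_hits = search_processing_unit.search_memory(query)        # S36 (memory unit 60)
        search_processing_unit.request_server_search(query)             # S36 (via communication unit 40)
        server_hits = acquisition_unit.server_results()                  # S38
        areas = display_area_determination_unit.determine(
            ui.screen_at_instruction())                                  # S40, S42/S44
        search_result_display_unit.show(local_hits, server_hits, areas)  # S46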
Described above is an explanation based on the exemplary embodiment of the present invention. The exemplary embodiment is intended to be illustrative only, and it will be obvious to those skilled in the art that various modifications to constituting elements and processes could be developed and that such modifications are also within the scope of the present invention.
In the exemplary embodiment, the user inputs an instruction through the touch panel 23. Alternatively, the user may input an instruction through an operation key provided on the electronic device 10. For example, if the electronic device 10 does not have a touch panel 23, a configuration may be employed, with reference to FIG. 2, in which a moving instruction for the first scrolling process is input through the operation of an L button 37 or an R button 38 and a moving instruction for the second scrolling process is input through a directional key 31b or 31d. Even when the touch panel 23 is provided, the input of a scroll instruction may be assigned to an operation key.
In the exemplary embodiment, a situation is illustrated where content is a game. Even when, for example, the content is music, a movie, a book, or the like, the content to be displayed is classified into a separate class in advance according to titles, reproduction dates and times, and the like. Thus, the titles are displayed at the time of a first scrolling process, and sets of display information corresponding to respective classes are displayed in association with display items at the time of a second scrolling process.