CROSS-REFERENCE TO RELATED APPLICATION

This application claims the priority benefit of Korean Patent Application Nos. 10-2013-0011281 and 10-2013-0011282, filed on Jan. 31, 2013 in the Korean Intellectual Property Office, the disclosures of which are incorporated herein by reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention
The present invention relates to an image display apparatus and a method for operating the same, and more particularly, to an image display apparatus and a method for operating the same, which are capable of increasing user convenience.
2. Description of the Related Art
An image display apparatus functions to display images to a user. Using an image display apparatus, a user can view a broadcast program selected from among the broadcast programs transmitted by broadcasting stations. The recent trend in broadcasting is a worldwide transition from analog broadcasting to digital broadcasting.
Digital broadcasting transmits digital audio and video signals. Digital broadcasting offers many advantages over analog broadcasting, such as robustness against noise, less data loss, ease of error correction, and the ability to provide clear, high-definition images. Digital broadcasting also enables interactive services for viewers, unlike analog broadcasting.
SUMMARY OF THE INVENTION

Therefore, the present invention has been made in view of the above problems, and it is an object of the present invention to provide an image display apparatus and a method for operating the same, which are capable of increasing user convenience.
In accordance with an aspect of the present invention, the above and other objects can be accomplished by the provision of a method for operating an image display apparatus including displaying a home screen including at least one card object including a content list, displaying a card object generation screen if card object generation input is received, and, if at least one content item displayed on the card object generation screen is selected, adding the selected content item to a card object to be generated.
In accordance with another aspect of the present invention, there is provided an image display apparatus including a network interface configured to exchange data with a server, a display configured to display a home screen including at least one card object including a content list and to display a card object generation screen if card object generation input is received, and a controller configured to, if at least one content item displayed on the card object generation screen is selected, add the selected content item to a card object to be generated.
BRIEF DESCRIPTION OF THE DRAWINGS

The above and other objects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1A is a diagram showing an image display apparatus according to an embodiment of the present invention;
FIG. 1B is a diagram showing a first surface of a remote controller of FIG. 1A;
FIG. 1C is a diagram showing a second surface of the remote controller of FIG. 1A;
FIG. 2 is a block diagram showing the internal configuration of the image display apparatus of FIG. 1A;
FIG. 3 is a block diagram showing the internal configuration of a controller of FIG. 2;
FIGS. 4 to 5 are diagrams showing various examples of a smart system platform structure in the image display apparatus of FIG. 2;
FIG. 6A is a diagram showing an operating method using the first surface of the remote controller of FIG. 1B;
FIG. 6B is a diagram showing an operating method using the second surface of the remote controller of FIG. 1C;
FIG. 7 is a block diagram showing the internal configuration of the remote controller of FIG. 1A;
FIG. 8 is a flowchart illustrating a method for operating an image display apparatus according to an embodiment of the present invention;
FIGS. 9A to 14B are views referred to for describing various examples of the method for operating the image display apparatus of FIG. 8;
FIG. 15 is a flowchart illustrating a method for operating an image display apparatus according to an embodiment of the present invention; and
FIGS. 16A to 19B are views referred to for describing various examples of the method for operating the image display apparatus of FIG. 15.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Exemplary embodiments of the present invention will be described with reference to the attached drawings.
The terms “module” and “unit” appended to the names of components are used herein merely to facilitate understanding of the components, and thus they should not be considered as having specific meanings or roles. Accordingly, the terms “module” and “unit” may be used interchangeably.
An image display apparatus described in the present specification is, for example, an intelligent image display apparatus that provides not only a broadcast reception function but also computer support functions. By adding an Internet function while faithfully performing the broadcast reception function, it can offer more convenient interfaces, such as a handwriting input device, a touchscreen or a 3D pointing device. In addition, the image display apparatus may be connected to the Internet or a computer through a wired or wireless Internet module so as to perform functions such as email, web browsing, banking or games. For these various functions, a standardized general-purpose operating system (OS) may be used.
Accordingly, the image display apparatus of the present invention may perform various user-friendly functions because various applications may be freely added to or deleted from a general-purpose OS kernel. For example, the image display apparatus of the present invention may be a smart TV.
FIG. 1A is a diagram showing an image display apparatus according to an embodiment of the present invention.
Referring to FIG. 1A, the image display apparatus 100 according to the embodiment of the present invention displays an image and includes a display (180 of FIG. 2). The image display apparatus 100 may include a camera 195 for capturing an image of a user.
Although the camera 195 is placed on an upper side of the image display apparatus 100 in FIG. 1A, the camera may be placed at various locations. Unlike FIG. 1A, the image display apparatus 100 and the camera may also be mounted as separate devices.
The image display apparatus 100 may exchange data with an external device over a network.
The image display apparatus 100 may exchange data with adjacent external devices, such as a home appliance 670, a home server 650, a mobile terminal 600, etc., or may share predetermined content data with the adjacent external devices. The home appliance 670 may include a set-top box, audio equipment, a refrigerator, a cleaner, an air conditioner, a washing machine, a cooker, etc.
The image display apparatus 100 may exchange data with external servers 600a, 600b, 600c, . . . over a network 690. The external servers 600a, 600b, 600c, . . . may be content providers for providing a variety of content.
Unlike the figure, the image display apparatus 100 may exchange data with the mobile terminal 600 over the network 690.
The image display apparatus 100 may operate in correspondence with a remote control signal from a remote controller 200. The image display apparatus 100 and the remote controller 200 may exchange data through a pairing operation.
In particular, through such data exchange, the image display apparatus 100 according to the embodiment of the present invention may display a pointer corresponding to movement of the remote controller 200, or display letters input by pressing a letter key of the remote controller 200.
The image display apparatus 100 described in the present specification may include a TV receiver, a monitor, a projector, a laptop computer, a digital broadcast terminal, etc.
The image display apparatus 100 according to the embodiment of the present invention displays an image and may be a fixed image display apparatus or a mobile image display apparatus.
FIG. 1B is a diagram showing a first surface of a remote controller of FIG. 1A, and FIG. 1C is a diagram showing a second surface of the remote controller of FIG. 1A.
First, referring to FIG. 1B, operation keys such as a power key 202 may be placed on the first surface (front surface) 201 of the remote controller.
The various operation keys will now be described. The power key 202 is used to turn the image display apparatus 100 on/off. A home key 204 is used to display a home screen if the home screen of the image display apparatus 100 is set. A search key 206 may be used to display a search window on the image display apparatus 100 or to search by keyword.
Four-direction keys 210 are used to move a pointer or a cursor up, down, left and right, and an up key 210c, a down key 210d, a left key 210b and a right key 210a may be integrally formed. A wheel key 220 may be placed in the center of the four-direction keys 210.
The wheel key 220 is used to move a screen or an item displayed on the image display apparatus 100. The wheel key 220 may move up and down, and thus the screen or the item of the image display apparatus 100 may move up and down.
A back key 222 is used to return from a screen or an item displayed on the image display apparatus 100 to a previous screen or a previous item. A menu key 224 is used to display a set menu of the image display apparatus 100. A pointer key 225 is used to display a pointer on the image display apparatus 100.
A volume key 230 is used to change the volume, and a channel key 240 is used to switch channels.
A 3D key 235 may be used to switch a two-dimensional (2D) image displayed on the image display apparatus 100 to a three-dimensional (3D) image, or may be used to display a list of 3D images that can be displayed on the image display apparatus 100.
A PIP key 241 is used to display a plurality of images on the image display apparatus 100. By manipulating the PIP key 241, a plurality of images may be displayed on the display 180 in a picture-in-picture (PIP) manner. Alternatively, a plurality of images may be arranged in parallel.
Any one of the plurality of images may float such that its location can be changed. In this case, a PIP image may be referred to as a dynamic screen image.
In the figure, a pointer key for displaying a pointer, a guide key for displaying a guide, a mute key, color keys, etc. are also shown.
Next, referring to FIG. 1C, a second surface (back surface) 251 of the remote controller 200 may be opposite to the first surface (front surface) 201 of the remote controller 200. A letter key 260 and a display 270 may be placed on the second surface (back surface) 251 of the remote controller 200.
The letter key 260 may include a numeric key 262 and an alphabetic key 264. The letter key 260 may further include an enter key, a function key, a spacebar key, etc.
The display 270 may display letters input through the letter key 260.
If the letter key 260 is manipulated, the remote controller 200 transmits letter key information to the image display apparatus 100.
The remote controller 200 may transmit coordinate information corresponding to movement of the remote controller 200 to the image display apparatus 100. Thus, a pointer corresponding to the movement of the remote controller 200 may be displayed on the display of the image display apparatus. Since the pointer is moved according to the movement of the remote controller in 3D space, the remote controller may be referred to as a 3D pointing device.
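For illustration only, the mapping from remote controller movement to pointer coordinates described above could be sketched as follows. This is not part of the claimed embodiment: the gain constant, screen size, and the (yaw, pitch)-to-(x, y) mapping are assumptions for the example, not the actual pairing protocol.

```python
# Hypothetical sketch: integrating a 3D pointing device's rotation deltas
# into on-screen pointer coordinates, clamped to the display bounds.

SCREEN_W, SCREEN_H = 1920, 1080
GAIN = 400.0  # pixels per radian of remote rotation (assumed value)

def update_pointer(x, y, d_yaw, d_pitch):
    """Move the pointer by the remote's rotation delta, clamped to the screen."""
    x = min(max(x + d_yaw * GAIN, 0), SCREEN_W - 1)
    y = min(max(y - d_pitch * GAIN, 0), SCREEN_H - 1)  # pitch up moves pointer up
    return x, y
```

In this sketch, turning the remote 0.1 rad to the right from screen center moves the pointer 40 pixels right; the clamp keeps the pointer on screen at the edges.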
FIG. 2 is a block diagram showing the internal configuration of the image display apparatus of FIG. 1A.
Referring to FIG. 2, the image display apparatus 100 according to the embodiment of the present invention includes a broadcast reception unit 105, a network interface 130, an external device interface 135, a memory 140, a user input interface 150, a controller 170, a display 180, an audio output unit 185, a power supply 190 and a camera 195. The broadcast reception unit 105 may include a tuner unit 110 and a demodulator 120. Alternatively, the broadcast reception unit 105 may further include the network interface 130.
The tuner unit 110 tunes to a Radio Frequency (RF) broadcast signal corresponding to a channel selected by a user, from among RF broadcast signals received through an antenna, or to RF broadcast signals corresponding to all channels previously stored in the image display apparatus. The tuned RF broadcast signal is converted into an Intermediate Frequency (IF) signal or a baseband Audio/Video (AV) signal.
For example, the tuned RF broadcast signal is converted into a digital IF signal DIF if it is a digital broadcast signal, and is converted into an analog baseband AV signal (Composite Video Blanking and Sync/Sound Intermediate Frequency (CVBS/SIF)) if it is an analog broadcast signal.
The tuner unit 110 may sequentially select RF broadcast signals corresponding to all broadcast channels previously stored in the image display apparatus by a channel storage function, from among the RF signals received through the antenna, and may convert the selected RF broadcast signals into IF signals or baseband A/V signals.
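The channel storage function referred to above can be illustrated with a minimal sketch: scan candidate RF frequencies, keep those on which the front end locks, and record them in a channel map. The frequency list and the lock test are placeholders standing in for the real tuner and demodulator, not actual hardware interfaces.

```python
# Illustrative sketch of a channel storage (auto-scan) function.
# has_lock is a stand-in for tuning to a frequency and checking demodulator lock.

def scan_channels(frequencies_mhz, has_lock):
    """Return a channel map {channel_number: frequency} for lockable frequencies."""
    channel_map = {}
    for freq in frequencies_mhz:
        if has_lock(freq):  # real tuner: tune, wait for settling, check lock flag
            channel_map[len(channel_map) + 1] = freq
    return channel_map
```

The stored map is what later lets the tuner unit sequentially select the previously stored channels without rescanning.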
The demodulator 120 receives the digital IF signal DIF from the tuner unit 110 and demodulates it.
The demodulator 120 may perform demodulation and channel decoding, thereby obtaining a stream signal TS. The stream signal may be a signal in which a video signal, an audio signal and a data signal are multiplexed.
The stream signal output from the demodulator 120 may be input to the controller 170 and subjected to demultiplexing and A/V signal processing. The processed video and audio signals are output to the display 180 and the audio output unit 185, respectively.
The external device interface 135 may connect an external device to the image display apparatus 100. For connection, the external device interface 135 may include an A/V Input/Output (I/O) unit (not shown).
The external device interface 135 may be connected to an external device such as a Digital Versatile Disc (DVD) player, a Blu-ray player, a game console, a camera, a camcorder, or a computer (e.g., a laptop computer), wirelessly or by wire, so as to perform input/output operations with the external device.
The A/V I/O unit may include a universal serial bus (USB) port, a composite video blanking and sync (CVBS) port, a component port, an S-video port (analog), a Digital Visual Interface (DVI) port, a High Definition Multimedia Interface (HDMI) port, a Red, Green, Blue (RGB) port, a D-SUB port, etc. in order to provide audio and video signals received from the external device to the image display apparatus 100.
The external device interface 135 may be connected to various set-top boxes via at least one of the above-described ports so as to transmit and receive data to and from the set-top boxes.
The network interface 130 serves as an interface between the image display apparatus 100 and a wired/wireless network such as the Internet. The network interface 130 may receive content or data provided by an Internet provider, a content provider or a network operator over a network.
The network interface 130 may access a predetermined web page over a connected network or another network linked to the connected network. That is, the network interface 130 may access a predetermined web page to transmit or receive data to or from a corresponding server. In addition, the network interface 130 may receive content or data provided by a content provider or a network operator.
The network interface 130 may select and receive a desired application, among applications open to the public, over a network.
The network interface 130 may include a wired communication unit (not shown) or a wireless communication unit (not shown).
The wireless communication unit may perform short-range wireless communication with another electronic apparatus. The image display apparatus 100 may be connected to another electronic apparatus over a network according to a communication standard such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), etc.
The memory 140 may store various programs necessary for the controller 170 to process and control signals, and may also store processed video, audio and data signals.
The memory 140 may temporarily store a video, audio and/or data signal received from the network interface 130 or the external device interface 135. The memory 140 may store information about predetermined broadcast channels through the channel storage function of a channel map.
In addition, the memory 140 may store an application or an application list received from the network interface 130 or the external device interface 135.
The image display apparatus 100 may play back a content file (a moving image file, a still image file, a music file, a text file, an application file, etc.) stored in the memory 140 and provide it to the user.
While the memory 140 is shown in FIG. 2 as being configured separately from the controller 170, the present invention is not limited thereto; the memory 140 may be incorporated into the controller 170.
The user input interface 150 transmits a signal input by the user to the controller 170, or transmits a signal received from the controller 170 to the user.
For example, the user input interface 150 may transmit/receive various user input signals such as a power-on/off signal, a channel selection signal, and a screen setting signal from the remote controller 200; may provide the controller 170 with user input signals received from local keys (not shown), such as inputs of a power key, a channel key and a volume key, and setting values; may provide the controller 170 with a user input signal received from a sensor unit (not shown) for sensing a user gesture; or may transmit a signal received from the controller 170 to the sensor unit (not shown).
The controller 170 may demultiplex the stream signal received from the tuner unit 110, the demodulator 120, or the external device interface 135 into a number of signals, process the demultiplexed signals into audio and video data, and output the audio and video data.
The video signal processed by the controller 170 may be displayed as an image on the display 180. The video signal processed by the controller 170 may also be transmitted to an external output device through the external device interface 135.
The audio signal processed by the controller 170 may be output to the audio output unit 185. Also, the audio signal processed by the controller 170 may be transmitted to an external output device through the external device interface 135.
While not shown in FIG. 2, the controller 170 may include a DEMUX, a video processor, etc., which will be described in detail later with reference to FIG. 3.
The controller 170 may control the overall operation of the image display apparatus 100. For example, the controller 170 controls the tuner unit 110 to tune to an RF signal corresponding to a channel selected by the user or a previously stored channel.
The controller 170 may control the image display apparatus 100 according to a user command input through the user input interface 150 or an internal program. In particular, the controller 170 may connect to the network to download an application or application list desired by the user into the image display apparatus.
For example, the controller 170 controls the tuner unit 110 such that a signal of a channel selected according to a predetermined channel selection command received through the user input interface 150 is received, and processes the video, audio or data signal of the selected channel. The controller 170 may output the processed video or audio signal, together with information about the channel selected by the user, through the display 180 or the audio output unit 185.
As another example, the controller 170 may output, through the display 180 or the audio output unit 185, a video signal or an audio signal from an external device, such as a camera or a camcorder, received through the external device interface 135, according to an external device image playback command received through the user input interface 150.
The controller 170 may control the display 180 to display images. The image displayed on the display 180 may be a two-dimensional (2D) or three-dimensional (3D) still or moving image.
The controller 170 may generate and display a predetermined object of an image displayed on the display 180 as a 3D object. For example, the object may be at least one of a screen of an accessed web site (newspaper, magazine, etc.), an electronic program guide (EPG), various menus, a widget, an icon, a still image, a moving image, and text.
The controller 170 may recognize the position of the user based on an image captured by a camera unit (not shown). For example, a distance (z-axis coordinate) between the user and the image display apparatus 100 may be detected. In addition, an x-axis coordinate and a y-axis coordinate on the display 180 corresponding to the position of the user may be detected.
If an application view menu item is selected, the controller 170 may control display of applications or a list of applications that are available in the image display apparatus or downloadable from an external network.
The controller 170 may control installation and execution of an application downloaded from the external network, along with various user interfaces. Also, the controller 170 may control display of an image related to the executed application on the display 180, upon user selection.
The controller 170 may receive a user image captured by the camera 195. The controller 170 may recognize the user based on the captured user image and allow the recognized user to log in to the image display apparatus 100. The controller 170 may provide a service to each user who logs in to the image display apparatus.
Alternatively, the controller 170 may recognize a user gesture from a user image captured by the camera 195. In particular, the controller 170 may recognize the face and hand of the user from the captured image and recognize a specific gesture.
The display 180 converts a video signal, a data signal or an OSD signal processed by the controller 170, or a video signal and a data signal received through the external device interface 135, into RGB signals and generates a drive signal.
The display 180 may be any of various types of displays such as a Plasma Display Panel (PDP), a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, a flexible display, and a 3D display.
The display 180 may also be a touchscreen that can be used not only as an output device but also as an input device.
The audio output unit 185 may receive a processed audio signal from the controller 170 and output the received audio signal as sound.
The power supply 190 supplies power to the image display apparatus 100. Particularly, the power supply 190 may supply power to the controller 170, which may be implemented as a System On Chip (SOC), the display 180 for displaying an image, and the audio output unit 185 for outputting the audio signal.
For supplying power, the power supply 190 may include a converter (not shown) for converting Alternating Current (AC) into Direct Current (DC). If the display 180 is implemented as, for example, a liquid crystal panel having a plurality of backlight lamps, the power supply 190 may further include an inverter (not shown) capable of performing Pulse Width Modulation (PWM) for luminance change or dimming driving.
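The PWM dimming mentioned above amounts to mapping a brightness setting to an on-time fraction of each PWM period. The following sketch is illustrative only; the 8-bit brightness scale and the timer resolution are assumptions, not parameters of the described inverter.

```python
# Illustrative sketch: deriving a PWM on-time for backlight dimming.

PWM_RESOLUTION = 1000  # timer ticks per PWM period (assumed)

def duty_ticks(brightness, max_brightness=255):
    """Map a user brightness setting to PWM on-time in timer ticks."""
    brightness = max(0, min(brightness, max_brightness))  # clamp to valid range
    return round(brightness / max_brightness * PWM_RESOLUTION)
```

Full brightness keeps the backlight on for the whole period; half brightness gates it on for roughly half the ticks, lowering perceived luminance.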
The camera 195 may capture an image of a user and transmit the captured image to the controller 170 of the image display apparatus 100. Although the number of cameras 195 is one in FIG. 1A, a plurality of cameras may be included. The camera 195 may be a 2D camera or a 3D camera.
The remote controller 200 transmits user input to the user input interface 150. For transmission of user input, the remote controller 200 may use various communication techniques such as RF communication, IR communication, Bluetooth, Ultra Wideband (UWB), and ZigBee.
In addition, the remote controller 200 may receive a video signal, an audio signal or a data signal from the user input interface 150, and may output the received signal visually, audibly or through vibrations.
The block diagram of the image display apparatus 100 illustrated in FIG. 2 is only exemplary. Depending upon the specifications of the image display apparatus 100 in actual implementation, the components of the image display apparatus 100 may be combined or omitted, or new components may be added. That is, two or more components may be incorporated into one component, or one component may be configured as separate components, as needed. In addition, the function of each block is described for the purpose of describing the embodiment of the present invention, and specific operations or devices should not be construed as limiting the scope and spirit of the present invention.
Unlike FIG. 2, the image display apparatus 100 may not include the tuner unit 110 and the demodulator 120 shown in FIG. 2, and may instead receive broadcast content via the network interface 130 or the external device interface 135 and play it back.
FIG. 3 is a block diagram showing the internal configuration of the controller of FIG. 2.
Referring to FIG. 3, the controller 170 according to the embodiment of the present invention may include a DEMUX 310, a video processor 320, a processor 330, an OSD generator 340, a mixer 350, a Frame Rate Converter (FRC) 355, and a formatter 360. The controller 170 may further include an audio processor (not shown) and a data processor (not shown).
The DEMUX 310 demultiplexes an input stream. For example, the DEMUX 310 may demultiplex an MPEG-2 TS into a video signal, an audio signal, and a data signal. The stream signal input to the DEMUX 310 may be received from the tuner unit 110, the demodulator 120 or the external device interface 135.
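Demultiplexing an MPEG-2 TS, as performed by the DEMUX 310, can be sketched in simplified form: the stream is a sequence of fixed-size packets, each carrying a packet identifier (PID) that routes its payload to an elementary stream. The packet layout (188 bytes, sync byte 0x47, 13-bit PID) follows ISO/IEC 13818-1, but the PID-to-stream mapping below is an assumption for illustration, not a real channel map.

```python
# Illustrative sketch: splitting an MPEG-2 transport stream by PID.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

# Example PID assignments for one program (assumed for illustration).
PID_MAP = {0x0100: "video", 0x0101: "audio", 0x0102: "data"}

def demux(ts_bytes):
    """Split a transport stream into per-elementary-stream payload lists."""
    streams = {name: [] for name in PID_MAP.values()}
    for offset in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_bytes[offset:offset + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            continue  # lost sync; a real demux would resynchronize
        pid = ((packet[1] & 0x1F) << 8) | packet[2]  # 13-bit PID
        name = PID_MAP.get(pid)
        if name is not None:
            streams[name].append(packet[4:])  # payload after the 4-byte header
    return streams
```

A real demultiplexer would additionally parse the PAT/PMT tables to discover the PID map and handle adaptation fields; the sketch only shows the routing principle.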
The video processor 320 may process the demultiplexed video signal. For video signal processing, the video processor 320 may include a video decoder 325 and a scaler 335.
The video decoder 325 decodes the demultiplexed video signal, and the scaler 335 scales the resolution of the decoded video signal so that the video signal can be displayed on the display 180.
The video decoder 325 may be provided with decoders that operate based on various standards.
The video signal decoded by the video processor 320 is input to the mixer 350.
The processor 330 may control the overall operation of the image display apparatus 100 or the controller 170. For example, the processor 330 controls the tuner unit 110 to tune to an RF signal corresponding to a channel selected by the user or a previously stored channel.
The processor 330 may control the image display apparatus 100 according to a user command received through the user input interface 150 or an internal program.
The processor 330 may control data transmission of the network interface 130 or the external device interface 135.
The processor 330 may control the operation of the DEMUX 310, the video processor 320 and the OSD generator 340 of the controller 170.
The OSD generator 340 generates an OSD signal autonomously or according to user input. For example, the OSD generator 340 may generate signals by which a variety of information is displayed as graphics or text on the display 180, according to user input signals or control signals. The OSD signal may include various data such as a User Interface (UI), a variety of menus, widgets, icons, etc.
The OSD generator 340 may generate a signal for displaying broadcast information based on a caption or the EPG of a broadcast image.
Since the OSD generator 340 generates an OSD signal or a graphic signal, it may also be referred to as a graphics processing unit.
The mixer 350 may mix the decoded video signal processed by the video processor 320 with the OSD signal generated by the OSD generator 340. The mixed signal is provided to the formatter 360. By mixing the decoded broadcast image signal or the external input signal with the OSD signal, the OSD may be overlaid and displayed on the broadcast image or the external input image.
The FRC 355 may change the frame rate of an input image. Alternatively, the FRC 355 may maintain the frame rate of the input image without frame rate conversion.
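The frame rate conversion performed by the FRC 355 can be illustrated with the simplest possible scheme: repeating source frames on a cadence so the output plays at the target rate (e.g., 24 fps to 60 fps). This is a sketch only; real FRC blocks typically use motion-compensated interpolation rather than plain repetition.

```python
# Illustrative sketch: frame rate conversion by frame repetition.

def convert_frame_rate(frames, src_fps, dst_fps):
    """Repeat source frames so the output sequence plays at dst_fps."""
    out = []
    for i in range(round(len(frames) * dst_fps / src_fps)):
        # pick the source frame covering this output timestamp
        src_index = min(int(i * src_fps / dst_fps), len(frames) - 1)
        out.append(frames[src_index])
    return out
```

For 24 to 60 fps this produces the familiar 3:2-style cadence, each source frame appearing two or three times in the output.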
The formatter 360 changes the signal output from the FRC 355 into a form suitable for the display 180. For example, the formatter 360 may convert a received signal into an RGB data signal and output the RGB data signal. The RGB data signal may be output in the form of a Low Voltage Differential Signal (LVDS) or mini-LVDS.
The formatter 360 may change the format of a 3D video signal or convert a 2D video signal into a 3D video signal.
The audio processor (not shown) of the controller 170 may process the demultiplexed audio signal. For audio signal processing, the audio processor may include various decoders.
The audio processor (not shown) of the controller 170 may also adjust the bass, treble or volume of the audio signal.
The data processor (not shown) of the controller 170 may process the demultiplexed data signal. For example, if the demultiplexed data signal was encoded, the data processor may decode it. The encoded data signal may be Electronic Program Guide (EPG) information including broadcast information such as the start time and end time of the broadcast programs of each channel.
The block diagram of the controller 170 shown in FIG. 3 is exemplary. The components of the block diagram may be integrated or omitted, or a new component may be added, according to the specifications of the controller 170.
In particular, the FRC 355 and the formatter 360 may be provided separately from the controller 170.
FIGS. 4 to 5 are diagrams showing various examples of a platform structure in the image display apparatus of FIG. 2.
The platform of the image display apparatus 100 according to the embodiment of the present invention may include OS-based software in order to perform the above-described various operations.
First, referring to FIG. 4, the platform of the image display apparatus 100 according to an exemplary embodiment of the present invention is of a separate type. The platform may be designed separately as a legacy system platform 400 and a smart system platform 405. An OS kernel 410 may be shared between the legacy system platform 400 and the smart system platform 405.
The legacy system platform 400 may include a driver 420, middleware 430, and an application layer 450 on the OS kernel 410. The smart system platform 405 may include a library 435, a framework 440, and an application layer 455 on the OS kernel 410.
The OS kernel 410 is the core of an operating system. When the image display apparatus 100 is driven, the OS kernel 410 may be responsible for at least one of driving hardware drivers, security protection for hardware and processors in the image display apparatus 100, efficient management of system resources, memory management, hardware interfacing by hardware abstraction, multi-processing, and scheduling associated with the multi-processing. Meanwhile, the OS kernel 410 may further perform power management.
The hardware drivers of the OS kernel 410 may include, for example, at least one of a display driver, a Wi-Fi driver, a Bluetooth driver, a USB driver, an audio driver, a power manager, a binder driver, and a memory driver.
The hardware drivers of the OS kernel 410 are drivers for hardware devices within the OS kernel 410 and may include a character device driver, a block device driver, and a network device driver. Because data is transmitted on a block basis, the block device driver may require a buffer for buffering data block by block. The character device driver may not require a buffer, since data is transmitted on a basic data unit basis, that is, character by character.
The OS kernel 410 may be implemented based on any of various OSs such as Unix (Linux), Windows, etc. The OS kernel 410 may be a general-purpose open OS kernel that can also be implemented in other electronic devices.
The driver 420 is installed between the OS kernel 410 and the middleware 430. Along with the middleware 430, the driver 420 drives devices for operations of the application layer 450. For example, the driver 420 may include a driver(s) for a microcomputer, a display module, a graphics processing unit (GPU), the FRC, a General-Purpose Input/Output (GPIO) pin, a High-Definition Multimedia Interface (HDMI), a System Decoder (SDEC) or DEMUX, a Video Decoder (VDEC), an Audio Decoder (ADEC), a Personal Video Recorder (PVR), and/or an Inter-Integrated Circuit (I2C). These drivers operate in interaction with the hardware drivers of the OS kernel 410.
In addition, the driver 420 may further include a driver for the remote controller 200, especially a below-described 3D pointing device. The driver for the 3D pointing device may reside in the OS kernel 410 or the middleware 430, instead of the driver 420.
The middleware 430 is located between the OS kernel 410 and the application layer 450. The middleware 430 may mediate between different hardware devices or different software programs, for data transmission and reception between the hardware devices or software programs. Therefore, the middleware 430 can provide standard interfaces, support various environments, and enable interaction between tasks conforming to heterogeneous communication protocols.
Examples of the middleware 430 in the legacy system platform 400 may include Multimedia and Hypermedia information coding Experts Group (MHEG) and Advanced Common Application Platform (ACAP) as data broadcasting-related middleware, PSIP or SI middleware as broadcast information-related middleware, and Digital Living Network Alliance (DLNA) middleware as peripheral device communication-related middleware.
The application layer 450 that resides on the middleware 430 in the legacy system platform 400 may include, for example, UI applications associated with various menus in the image display apparatus 100. The application layer 450 may allow editing and updating over a network by user selection. Through the application layer 450, the user may enter a desired menu among various UIs by manipulating the remote controller 200 during viewing of a broadcast program.
The application layer 450 in the legacy system platform 400 may further include at least one of a TV guide application, a Bluetooth application, a reservation application, a Digital Video Recorder (DVR) application, and a hotkey application.
In the smart system platform 405, the library 435 is located between the OS kernel 410 and the framework 440, forming the basis of the framework 440. For example, the library 435 may include a Secure Socket Layer (SSL) as a security-related library, WebKit as a Web engine-related library, libc (the C library), and Media Framework as a media-related library specifying, for example, a video format and an audio format. The library 435 may be written in C or C++. Additionally, the library 435 may be accessible to a developer through the framework 440.
The library 435 may include a runtime 437 with a core Java library and a Virtual Machine (VM). The runtime 437 and the library 435 form the basis of the framework 440.
The VM may be a virtual machine that enables concurrent execution of a plurality of instances, that is, multi-tasking. A VM may be allocated and executed for each application of the application layer 455. For scheduling or interconnection between instances, the binder driver (not shown) of the OS kernel 410 may operate.
The binder driver and the runtime 437 may connect Java applications to C-based libraries.
The library 435 and the runtime 437 may correspond to the middleware 430 of the legacy system platform 400.
In the smart system platform 405, the framework 440 includes programs on which applications of the application layer 455 are based. The framework 440 is compatible with any application and may allow component reuse, movement, or exchange. The framework 440 may include support programs and programs for interconnecting different software components. For example, the framework 440 may include a resource manager, an activity manager related to activities of applications, a notification manager, and a content provider (CP) for abstracting common information between applications. This framework 440 may be written in Java.
The application layer 455 on top of the framework 440 includes a variety of programs that are executed and displayed in the image display apparatus 100. The application layer 455 may include, for example, a core application providing at least one function among e-mail, Short Message Service (SMS), calendar, map, and browser functions. The application layer 455 may be written in Java.
In the application layer 455, applications may be categorized into user-undeletable applications 465 stored in the image display apparatus 100 and user-installable or user-deletable applications 475 that are downloaded from an external device or a network and stored in the image display apparatus 100.
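The two application categories described above can be sketched as follows. This is a purely illustrative model; the data structures and function names are assumptions of this sketch, not part of the disclosed apparatus.

```python
# Illustrative sketch: pre-stored, user-undeletable applications (465) versus
# downloaded, user-installable/user-deletable applications (475).
from dataclasses import dataclass

@dataclass
class App:
    name: str
    user_deletable: bool

apps = [
    App("TV Guide", user_deletable=False),   # pre-stored, cannot be deleted
    App("Web Album", user_deletable=True),   # downloaded, may be deleted
]

def delete_app(app_list, name):
    """Remove an application only if the user is allowed to delete it."""
    return [a for a in app_list
            if not (a.name == name and a.user_deletable)]

apps = delete_app(apps, "TV Guide")      # refused: app is not user-deletable
print([a.name for a in apps])            # ['TV Guide', 'Web Album']
```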
With the applications of the application layer 455, a variety of functions may be performed through network access, including Internet telephony, VoD, Web album, Social Networking Service (SNS), Location-Based Service (LBS), map service, Web browsing, and application search. In addition, the user may enjoy games and manage schedules using applications.
Referring to FIG. 5, a platform for the image display apparatus 100 according to another embodiment of the present invention is an integrated-type platform. The integrated-type platform may include an OS kernel 510, a driver 520, middleware 530, a framework 540, and an application layer 550.
Compared to the separate-type platform illustrated in FIG. 4, the integrated-type platform of FIG. 5 is characterized by the absence of the library 435 and by the application layer 550 being an integrated layer. The driver 520 and the framework 540 correspond to the driver 420 and the framework 440 of FIG. 4, respectively.
The library 435 of FIG. 4 may be incorporated into the middleware 530. That is, the middleware 530 may include both the legacy system middleware and the image display system middleware. As described above, the legacy system middleware includes MHEG or ACAP as data broadcasting-related middleware, PSIP or SI middleware as broadcast information-related middleware, and DLNA middleware as peripheral device communication-related middleware, whereas the image display system middleware includes SSL as a security-related library, WebKit as a Web engine-related library, libc, and Media Framework as a media-related library. The middleware 530 may further include the above-described runtime.
The application layer 550 may include menu-related applications, a TV guide application, a reservation application, etc. as legacy system applications, and e-mail, SMS, calendar, map, and browser applications as image display system applications.
In the application layer 550, applications may be categorized into user-undeletable applications 565 that are stored in the image display apparatus 100 and user-installable or user-deletable applications 575 that are downloaded from an external device or a network and stored in the image display apparatus 100.
The platforms illustrated in FIGS. 4 and 5 may be general-purpose platforms that can be implemented in many other electronic devices as well as in image display apparatuses.
The platforms illustrated in FIGS. 4 and 5 may be loaded on the memory 140, the controller 170, or any other processor (not shown).
FIG. 6A is a diagram showing an operating method using the first surface of the remote controller of FIG.
FIG. 6A shows the case in which the pointer 205 is displayed in correspondence with movement of the remote controller 200 in a state in which the first surface 201 of the remote controller 200 is upturned.
First, FIG. 6A(a) shows the case in which the pointer 205 is displayed at a predetermined position of the display 180 in correspondence with the remote controller 200.
The user may move the remote controller 200 up and down, side to side (FIG. 6A(b)), and back and forth (FIG. 6A(c)). The pointer 205 displayed on the display 180 of the image display apparatus moves in accordance with the movement of the remote controller 200. In this context, the remote controller 200 may be referred to as a pointing device or a 3D pointing device.
Referring to FIG. 6A(b), if the user moves the remote controller 200 to the left, the pointer 205 displayed on the display 180 of the image display apparatus moves to the left accordingly.
Information about the movement of the remote controller 200 sensed by a sensor of the remote controller 200 is transmitted to the image display apparatus. The image display apparatus 100 may calculate the coordinates of the pointer 205 from the information about the movement of the remote controller 200. Then, the image display apparatus 100 may display the pointer 205 at the calculated coordinates.
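The mapping from sensed movement to on-screen pointer coordinates can be sketched as below. The gain constant, function names, and the assumption that movement arrives as yaw/pitch angles are illustrative, not part of the disclosure.

```python
# Minimal sketch: map angular movement reported by the remote controller to
# pointer coordinates on the display, clamped to the screen bounds.
WIDTH, HEIGHT = 1920, 1080   # display resolution in pixels (assumed)
GAIN = 40.0                  # pixels per degree of rotation (assumed)

def update_pointer(pointer, yaw_deg, pitch_deg):
    """Return new (x, y) pointer coordinates, clamped to the display."""
    x, y = pointer
    x += yaw_deg * GAIN      # left/right rotation moves the pointer horizontally
    y -= pitch_deg * GAIN    # up/down rotation moves the pointer vertically
    x = max(0, min(WIDTH - 1, x))
    y = max(0, min(HEIGHT - 1, y))
    return (x, y)

pointer = (960, 540)                           # start at screen center
pointer = update_pointer(pointer, -2.0, 0.0)   # remote moved to the left
print(pointer)                                 # (880.0, 540.0)
```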
Referring to FIG. 6A(c), while pressing a predetermined button of the remote controller 200, the user moves the remote controller 200 away from the display 180. Then, a selection area corresponding to the pointer 205 may be zoomed in upon and enlarged on the display 180. On the contrary, if the user moves the remote controller 200 toward the display 180, the selection area corresponding to the pointer 205 is zoomed out and thus contracted on the display 180. Alternatively, when the remote controller 200 moves away from the display 180, the selection area may be zoomed out, and when the remote controller 200 approaches the display 180, the selection area may be zoomed in on.
With the predetermined button of the remote controller 200 pressed, the up, down, left, and right movements of the remote controller 200 may be ignored. That is, when the remote controller 200 moves away from or approaches the display 180, only the back and forth movements of the remote controller 200 are sensed, while the up, down, left, and right movements are ignored. Unless the predetermined button is pressed on the remote controller 200, the pointer 205 moves in accordance with the up, down, left, or right movement of the remote controller 200.
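The button-gated behavior described above can be sketched as follows: while the button is held, lateral movement is ignored and only the change in distance drives zoom. The zoom law, limits, and names are assumptions of this sketch.

```python
# Sketch of the zoom behaviour: with the designated button held, only the
# change in remote-to-display distance matters; otherwise the pointer follows
# lateral movement. Constants are illustrative assumptions.
def handle_motion(state, dx, dy, d_dist, button_held):
    """Update pointer position or zoom factor depending on the button state."""
    if button_held:
        # Only back-and-forth movement is sensed; per one described variant,
        # moving away zooms in on the selection area.
        state["zoom"] *= 1.0 + 0.01 * d_dist
        state["zoom"] = max(0.5, min(4.0, state["zoom"]))
    else:
        # Normal pointing: lateral movement moves the pointer.
        state["x"] += dx
        state["y"] += dy
    return state

s = {"x": 100, "y": 100, "zoom": 1.0}
handle_motion(s, 5, 0, 0, button_held=False)   # pointer moves right
handle_motion(s, 5, 0, 20, button_held=True)   # lateral part ignored; zoom changes
print(s["x"], round(s["zoom"], 2))             # 105 1.2
```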
The speed and direction of the pointer 205 may correspond to the speed and direction of the remote controller 200.
FIG. 6B is a diagram showing an operating method using the second surface of the remote controller of FIG. 10.
FIG. 6B shows the case in which a letter key of the remote controller 200 is manipulated in a state in which the second surface 251 of the remote controller 200 is upturned and the first surface 201 is downturned.
For example, if a first alphabetic key 281, a second alphabetic key 282, and a third alphabetic key 283 of the letter key are sequentially manipulated, the remote controller 200 transmits the corresponding key information to the image display apparatus 100. Then, the image display apparatus may display the corresponding characters "abc" 715 in a display window 710.
The corresponding characters "abc" may also be displayed on the display 270 of the remote controller 200.
FIG. 7 is a block diagram of the remote controller illustrated in FIG. 1.
Referring to FIG. 7, the remote controller 200 may include a radio transceiver 820, a user input portion 830, a sensor portion 840, an output portion 850, a power supply 860, a memory 870, and a controller 880.
The radio transceiver 820 transmits and receives signals to and from any one of the image display apparatuses according to the embodiments of the present invention. Among these image display apparatuses, one image display apparatus 100 will be described by way of example.
In accordance with the exemplary embodiment of the present invention, the radio transceiver 820 may be provided with an RF module 821 for transmitting and receiving signals to and from the image display apparatus 100 according to an RF communication standard. Additionally, the radio transceiver 820 may include an IR module 823 for transmitting and receiving signals to and from the image display apparatus 100 according to an IR communication standard.
In the present embodiment, the remote controller 200 may transmit information about the movement of the remote controller 200 to the image display apparatus 100 via the RF module 821.
The remote controller 200 may also receive signals from the image display apparatus 100 via the RF module 821. The remote controller 200 may transmit commands associated with power on/off, channel switching, volume change, etc. to the image display apparatus 100 through the IR module 823.
In the present embodiment, the user input portion 830 may include an operation key input portion 832 for performing operation key input and a letter key input portion 834 for performing letter key input.
The operation key input portion 832 may include various operation keys placed on the front surface 201 of the remote controller 200 as described with reference to FIG. 1B, for example, the power key 202, the home key 204, the search key 206, the four-direction key 210, the wheel key 222, the back key 222, the menu key 224, the volume key 230, the 3D key 235, the channel key 240, etc.
The letter key input portion 834 may include various letter keys placed on the back surface 251 of the remote controller 200 as described with reference to FIG. 1B, for example, the numeric key 262, the alphabetic key 264, etc.
The user may enter a command for remotely controlling the image display apparatus 100 through the remote controller 200 by manipulating the user input portion 830. If the user input portion 830 includes hard keys, the user may enter commands related to the image display apparatus 100 by pushing the hard keys. If the user input portion 830 is provided with a touchscreen, the user may enter commands related to the image display apparatus 100 by touching soft keys on the touchscreen. Additionally, the user input portion 830 may have a variety of input means that may be manipulated by the user, such as a scroll key, a jog key, etc., to which the present invention is not limited.
The sensor portion 840 may sense and output motion information of the remote controller. The sensor portion 840 may include a gyro sensor 841 or an acceleration sensor 843.
The gyro sensor 841 may sense information about the movement of the remote controller 200. For example, the gyro sensor 841 may sense information about the movement of the remote controller 200 along the x, y, and z axes.
The acceleration sensor 843 may sense information about the velocity of the remote controller 200. For example, the acceleration sensor 843 may sense information about the speed of the remote controller 200 along the x, y, and z axes.
The sensor portion 840 may further include a distance measurement sensor for sensing the distance from the display 180.
The motion information output from the sensor portion 840 may include the information about the movement of the remote controller 200 from the gyro sensor 841 and the information about the speed of the remote controller 200 from the acceleration sensor 843, and may further include the distance information.
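The motion information just described might be bundled for transmission as sketched below. The field names and JSON encoding are assumptions for illustration; the patent does not specify a wire format.

```python
# Illustrative sketch of the motion information the sensor portion 840 could
# report: gyro movement, acceleration-derived speed, and an optional distance
# reading, bundled for transmission to the image display apparatus.
import json

def build_motion_report(gyro_xyz, accel_xyz, distance_cm=None):
    report = {
        "movement": dict(zip("xyz", gyro_xyz)),  # from the gyro sensor 841
        "speed": dict(zip("xyz", accel_xyz)),    # from the acceleration sensor 843
    }
    if distance_cm is not None:                  # from the distance measurement sensor
        report["distance_cm"] = distance_cm
    return json.dumps(report)

print(build_motion_report((0.1, -0.2, 0.0), (5.0, 0.0, 0.0), 250))
```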
The output portion 850 may output a video or audio signal corresponding to manipulation of the user input portion 830 or to a signal transmitted by the image display apparatus 100. Through the output portion 850, the user may be aware of whether the user input portion 830 has been manipulated or the image display apparatus 100 has been controlled.
For example, the output portion 850 may include a Light Emitting Diode (LED) module 851 that is illuminated when the user input portion 830 is manipulated or a signal is transmitted to or received from the image display apparatus 100 through the radio transceiver 820, a vibration module 853 for generating vibrations, an audio output module 855 for outputting audio, or a display module 857 for outputting video.
The power supply 860 supplies power to the remote controller 200. When the remote controller 200 is kept stationary for a predetermined time, the power supply 860 blocks power from the remote controller 200, thereby preventing waste of power. When a predetermined key of the remote controller 200 is manipulated, the power supply 860 may resume power supply.
The memory 870 may store a plurality of types of programs required for control or operation of the remote controller 200, or application data. When the remote controller 200 transmits and receives signals to and from the image display apparatus 100 wirelessly through the RF module 821, the remote controller 200 and the image display apparatus 100 perform signal transmission and reception in a predetermined frequency band. The controller 880 of the remote controller 200 may store, in the memory 870, information about the frequency band in which signals are wirelessly transmitted to and received from the image display apparatus 100 paired with the remote controller 200, and may refer to this information.
The controller 880 provides overall control to the remote controller 200.
The controller 880 may transmit a signal corresponding to predetermined key manipulation on the user input portion 830, or a signal corresponding to a movement of the remote controller 200 sensed by the sensor portion 840, to the image display apparatus 100 through the radio transceiver 820.
The user input interface 150 of the image display apparatus 100 receives the key manipulation information or motion information. The user input interface 150 may have a radio transceiver 811.
The radio transceiver 811 may include an RF module 812 for performing RF communication with the remote controller 200 and an IR module 813 for performing IR communication with the remote controller 200.
The user input interface 150 may further include a coordinate calculator 815 for calculating the coordinates of the pointer using the information corresponding to the movement of the remote controller 200.
The coordinates of the pointer may instead be calculated by the controller 170 rather than by the coordinate calculator 815. In that case, the user input interface 150 may send the information corresponding to the movement of the remote controller 200 to the controller 170.
FIG. 8 is a flowchart illustrating a method for operating an image display apparatus according to an embodiment of the present invention, and FIGS. 9A to 14B are views referred to for describing various examples of the method for operating the image display apparatus of FIG. 8.
Referring to FIG. 8, the image display apparatus 100 displays a home screen (S820).
If a home screen is set to be displayed when the image display apparatus 100 is powered on, the home screen may be automatically displayed upon power-on. Alternatively, if home screen display input is received via the remote controller 200, a local key (not shown), user voice, a user gesture, etc., the controller 170 may display the home screen on the display 180.
FIG. 9A shows an example of a home screen 900 of an image display apparatus.
The home screen 900 may include a dashboard area 902 including card objects, each including content, and a launcher bar area 903 including frequently used application items.
Movable, replaceable, switchable card objects may be placed in the dashboard area 902. An object (not shown) indicating a logged-in user and time information may also be displayed in the dashboard area 902.
FIG. 9A shows a live TV card object 920, a premium apps card object 940, and a 3D world card object 960 as examples of the card objects.
FIG. 9B shows another example of card objects. In FIG. 9A, if a screen switching object 935 is selected to switch the screen, a home screen 901 displaying a tube card object 950, a smart share card object 970, and a my interest card object 990 may be displayed in the dashboard area 902.
A live broadcast image 910 may be displayed in the live TV card object 920. In addition, live broadcast information 912, an external input object 914 for selecting external input, a settings object 916 for setting the image display apparatus, and an advertisement image 915 may be further displayed.
The live broadcast image 910 may be a broadcast image extracted from a live broadcast signal received by the above-described broadcast reception unit 105. In particular, the live broadcast signal may be received through the tuner unit 110 of the broadcast reception unit 105 or through the network interface 130.
The broadcast information 912 related to a live broadcast program may be program guide information of a broadcast signal or broadcast information separately received from a broadcast station server or another server over a network.
If the external input object 914 for selecting external input is selected, a screen for selecting an external input image received through an HDMI port, a VOD port, a component port, an RGB port, or an antenna port among the various input ports of the image display apparatus 100 may be displayed.
If the settings object 916 for setting the image display apparatus is selected, a settings screen for setting the image display apparatus may be displayed.
The advertisement image 915 may be based on an advertisement included in a broadcast signal, an advertisement provided by a network provider that provides a network to the image display apparatus 100, or an advertisement provided by a broadcast station that provides a broadcast image to the image display apparatus 100.
The premium apps card object 940 may include a card object name 942 and a content list 945. In particular, the content list 945 may include content items provided by a manufacturer of the image display apparatus 100. In the figure, application items installed in the image display apparatus 100 in advance are shown as the content items. If any one application item of the content list 945 is selected, an application corresponding to the application item may be executed, and a screen on which the application is executed may be displayed on the display 180.
Unlike the figure, content items other than application items may be displayed.
The content items and, more particularly, the application items of the premium apps card object 940 may be displayed so as to be distinguished from other application items when the overall application list is displayed.
The 3D world card object 960 may include a card object name 962 and a content list 965. In particular, the content list 965 may include 3D content items. The 3D content items may be shortcut items for 3D content execution. If any one 3D content item of the content list 965 is selected, 3D content corresponding to the 3D content item is played back, and a screen on which the 3D content is played back may be displayed on the display 180.
Although three card objects 920, 940, and 960 are shown in FIG. 9A, if an additional card object is present, a portion of the additional card object may be displayed on the display 180 as shown in FIG. 9A. Therefore, the user can be intuitively aware that an additional card object to be searched for is present.
The tube card object 950 of FIG. 9B may include a card object name 952, an object 954 indicating content list switching, and a content list 955a corresponding to any one subcategory.
In particular, the content list 955a may include content items stored in a server managed by a video provider.
In order to display predetermined content items among a plurality of content items stored in the server managed by the video provider in the tube card object 950, subcategories associated with genre, date, region, age, and gender may be set. Content lists may be separately configured according to such subcategories.
The content list of the tube card object 950 shown in FIG. 9B may be a content list 955a including recently featured items. That is, a first content list corresponding to a first subcategory may be displayed.
If the object 954 indicating content list switching is selected, the content list of the tube card object 950 may become a content list including most viewed items, a content list including trending video items, or a content list including 3D content items.
The smart share card object 970 of FIG. 9B may include a card object name 972, an object 974 indicating content list switching, and a content list 975a corresponding to any one subcategory.
In particular, the content list 975a may be a content list including content items stored in an adjacent electronic apparatus.
In order to display predetermined content items among a plurality of content items stored in the adjacent electronic apparatus in the smart share card object 970, subcategories associated with genre, date, etc. may be set. Content lists may be separately configured according to such subcategories.
The content list of the smart share card object 970 shown in FIG. 9B is a content list 975a including new content items.
If the object 974 indicating content list switching is selected, the content list of the smart share card object 970 may become a content list including movie content items, a content list including photo content items, or a content list including music content items.
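The content-list switching just described can be sketched as a card object that advances to the next subcategory's list on each selection. The class shape and the cycling behavior are assumptions for illustration; the subcategory names follow the examples in the text.

```python
# Sketch: each selection of the content-list-switching object advances the
# card object to the next subcategory's content list.
from itertools import cycle

class CardObject:
    def __init__(self, name, subcategories):
        self.name = name
        self._cycle = cycle(subcategories)
        self.current = next(self._cycle)   # first subcategory shown initially

    def switch_content_list(self):
        """Invoked when the content-list-switching object is selected."""
        self.current = next(self._cycle)
        return self.current

share = CardObject("Smart Share", ["New", "Movie", "Photo", "Music"])
print(share.switch_content_list())  # Movie
print(share.switch_content_list())  # Photo
```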
The my interest card object 990 of FIG. 9B may include a card object name 992, an object 994 for setting content of interest, and a content list 995a corresponding to any one item of interest.
In particular, the content list 995a may be a content list including content items provided by a specific server.
In order to display predetermined content items among a plurality of content items provided by the specific server in the my interest card object 990, subcategories associated with genre, date, etc. may be set. Content lists may be separately configured according to such subcategories.
The content list of the my interest card object 990 shown in FIG. 9B is a content list 995a including world news items.
If top stories and business items are further set in addition to the world news items by selection of the object 994 for setting content of interest, the content list displayed in the my interest card object 990 may be a content list including world news items, a content list including top stories news items, or a content list including business news items.
The launcher bar area 903 may include an application list 980 including frequently used application items.
The application list 980 of the launcher bar area 903 may be displayed without change when the home screen is switched, for example, when the home screen 900 of FIG. 9A is switched to the home screen of FIG. 9B.
FIG. 9C shows the case in which, as application items displayed in the launcher bar area 903 of the home screen 901, a widget item 981, a notification item 982, an application full view item 983, a TV item, a 3D item, a primetime (TV & movie) item, a movie playback item, a video item, an application store item, a web item, a search item, etc. are sequentially included from left to right.
Some items may be edited or deleted, while other items may not be edited or deleted.
For example, the video item, the application store item, the web item, the search item, etc. may not be deleted, whereas the TV item, the 3D item, etc. may be edited or deleted.
FIG. 9C shows the case in which the application full view item 983 in the application list displayed in the launcher bar area 903 of the home screen 900 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
An application list screen 986 including a plurality of application items may then be displayed on the image display apparatus 100 as a full screen, as shown in FIG. 9D.
At this time, if an additional application item to be displayed is present, a portion of the additional application item may be displayed on the display 180. In FIG. 9D, the portion of the additional application item is displayed in a lower region of the application list screen 986. Therefore, the user can be intuitively aware that an additional application item to be searched for is present.
Unlike the figure, the portion of the additional application item may be displayed in a right region of the application list screen 986.
Selection, focusing, etc. of an application item may be changed by direction key input of the remote controller 200 in addition to the pointer based on movement of the remote controller 200.
In the figure, a cursor 988 is located on a predetermined application item according to direction key input of the remote controller 200. If direction key input of the remote controller 200 is received, the pointer based on movement of the remote controller 200 may not be displayed.
For screen movement, a scroll bar (not shown) may be displayed in one region of the display.
Next, FIG. 9E shows an application list screen 987 according to screen movement. If screen downward-movement input is received on the application list screen 986 of FIG. 9D, the application list screen 987 of FIG. 9E may be displayed. The screen movement input may be pointer input, direction key input, scroll bar input, etc. In the figure, the cursor 989 is located on a predetermined application item according to the direction key input of the remote controller 200.
Next, FIGS. 9F to 9H show detailed embodiments of the smart share card object 970 on the home screen 901 shown in FIG. 9B.
FIG. 9F shows the case in which the smart share card object 970 further includes a settings object 981. At this time, if the card object name 972 is selected using the pointer 205, a smart share screen 973a shown in FIG. 9G may be displayed on the image display apparatus.
The shared content according to the embodiment of the present invention may include content stored in an electronic apparatus connected to the image display apparatus, and may also include other content, for example, content stored in an external server over a network. More specifically, if a first user logs in to the image display apparatus 100, the first user may share the content stored in the electronic apparatus connected to the image display apparatus or the content stored in the external server in the smart share card object 970 or on the smart share screen 973a. For such sharing, the first user needs to log in to the external server. Alternatively, login to the external server may be performed automatically when the user logs in to the image display apparatus 100. Therefore, it is possible to further activate the content sharing function.
The smart share screen 973a of FIG. 9G includes a home network content list 971a and an external network content list 971b. FIG. 9G shows the case in which the user logs in to the image display apparatus through the camera 195, such that an icon 1114 indicating the user 1104 is displayed on the smart share screen 973a.
The settings object 981 may be displayed on the smart share screen 973a. Various settings may be performed through the settings object 981. For example, content sharing may be set to be restricted to the home network or to extend to the external network. As another example, shared content may be set to be displayed on a per-user basis, or all shared content may be set to be displayed regardless of the user. Alternatively, shared content may be set to be displayed on a per-age basis. Alternatively, a lock function may be set to be activated on a per-content basis. Alternatively, subcategories other than new, movie, photo, and music may be further set. Alternatively, only newly added content may be set to be displayed by setting new shared content. A synchronization period with the home network may also be set. A time at which a content list is displayed upon shared content update may be set as well. For example, if a new content list is received through the home network, the new content list may be displayed immediately. Alternatively, the content list to be displayed may be updated at a predetermined time interval and a new content list may be displayed upon update.
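The settings enumerated above can be sketched as a simple configuration mapping. The keys, allowed values, and defaults below are illustrative assumptions; the disclosure only names the settings, not their representation.

```python
# Illustrative sketch of the settings reachable through the settings object 981.
smart_share_settings = {
    "sharing_scope": "home_network",   # or "external_network"
    "display_per_user": True,          # show shared content per logged-in user
    "age_filtering": False,            # display shared content on a per-age basis
    "per_content_lock": False,         # lock function on a per-content basis
    "subcategories": ["new", "movie", "photo", "music"],
    "show_only_new": False,            # display only newly added content
    "sync_period_min": 30,             # home-network synchronization period
    "update_display": "immediate",     # or "interval": show new lists on update
}

def set_scope(settings, scope):
    """Restrict sharing to the home network or extend it to the external network."""
    assert scope in ("home_network", "external_network")
    settings["sharing_scope"] = scope
    return settings

set_scope(smart_share_settings, "external_network")
print(smart_share_settings["sharing_scope"])   # external_network
```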
FIG. 9H shows a smart share screen 973b different from that of FIG. 9G. The smart share screen 973b of FIG. 9H includes a list 971c of electronic apparatuses connected to the image display apparatus 100 and a list 971d of electronic apparatuses, among the electronic apparatuses included in the home network, which are not connected to the image display apparatus 100. Unlike the figure, the lists 971c and 971d may be separately displayed on the screen.
In the figure, a refrigerator item, a laptop item, and a home server item are included in the list 971c of electronic apparatuses connected to the image display apparatus 100, and a tablet item, a computer item, and a washing machine item are included in the list 971d of electronic apparatuses which are not connected to the image display apparatus 100.
If the computer item is selected from the list 971d of electronic apparatuses that are not connected to the image display apparatus 100, a separate menu may be displayed, through which a remote power-on operation of the computer may be performed.
A card object corresponding to the smart share screen 973b shown in FIG. 9H may be displayed on the home screen. Therefore, it is possible to easily confirm which electronic apparatuses are connected to the image display apparatus 100 and which are not.
After step 820 (S820) of FIG. 8, that is, after home screen display, the following steps may be performed.
The image display apparatus 100 determines whether card object generation input is received (S825) and displays a card object generation screen (S830) if it is determined that card object generation input is received.
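The S825/S830 decision above may be sketched as a simple screen-stack transition. The event name and screen identifiers below are hypothetical labels used only for illustration.

```python
def handle_input(event, screen_stack):
    """If card object generation input is received (S825), display the
    card object generation screen (S830) over the home screen."""
    if event == "card_object_generation":
        screen_stack = screen_stack + ["card_object_generation_screen"]
    return screen_stack

# starting from the home screen, the generation input opens the new screen
stack = handle_input("card_object_generation", ["home_screen"])
```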
In order to generate a card object in a state of displaying the home screen, the settings object 916 for setting the image display apparatus may be selected on the home screen.
FIG. 10A shows the case in which the settings object 916 of the live TV card object 920 on the home screen 900 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
A settings screen 1010 related to the image display apparatus may be displayed as shown in FIG. 10B.
The settings screen 1010 may include a home dashboard edit item 1012, a customize home item 1014, a home and all apps settings item 1016 and a system settings item 1018.
FIG. 10B shows the case in which the home dashboard edit item 1012 of the settings screen 1010 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, as shown in FIG. 10C, a home dashboard edit screen 1001 for editing the dashboard area 902 of the home screen may be displayed. At this time, the launcher bar area 903 may be empty.
The home dashboard edit screen 1001 is similar to the home screen 900 of FIG. 10A but is different therefrom in that card objects 1020, 1040 and 1060 may be edited. Unlike FIG. 10A, the home dashboard edit screen 1001 may further include a card object generation item 1052 and an edit completion item 1054.
On the home dashboard edit screen 1001, content list settings of a specific card object may be changed, a card object name may be changed, or the position of a card object may be changed.
For display of the home dashboard edit screen 1001 of FIG. 10C, the home dashboard edit screen 1001 may be displayed using other methods, instead of selection of the settings object 916 of the home screen 900 and selection of the home dashboard edit item 1012 of the settings screen 1010.
For example, if the pointer 205 is located in any one of the card objects 920, 940 and 960 in a state of displaying the home screen 900 of FIG. 10A and then long tap or long press input is received, as shown in FIG. 10C, the home dashboard edit screen 1001 for editing the dashboard area 902 may be displayed on the home screen.
As another example, if the menu key 224 of the remote controller 200 is manipulated in a state of displaying the home screen 900 of FIG. 10A, the settings screen 1010 related to the image display apparatus shown in FIG. 10B may be displayed. If the home dashboard edit item 1012 is selected, the home dashboard edit screen 1001 may be displayed.
As another example, if the menu key 224 of the remote controller 200 is manipulated in a state in which the pointer 205 of the remote controller 200 is located in the dashboard area 902 of the home screen 900 of FIG. 10A, the home dashboard edit screen 1001 for editing the dashboard may be immediately displayed.
FIG. 10C shows the case in which the card object generation item 1052 of the home dashboard edit screen 1001 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
As shown in FIG. 10D, a card object generation screen 1003 for generating a card object may be displayed.
The card object generation screen 1003 may include a first region 1032 including a card object 1030 to be generated and a second region 1034 for displaying an application list 1035.
The card object 1030 to be generated may include an icon 1033 associated with a user of the generated card object, a card object name input window 1037, an application area 1036 in which application items are placed, a generation completion item 1031, etc.
The icon 1033 associated with the user of the generated card object may indicate a user who generates the card object or a user who will use the generated card object. The icon 1033 may be set by selecting any one of an icon list or may be set to include an image of the user captured using the camera 195.
As a method of adding an application to the application area 1036, a predetermined application item of the application list 1035 may be added to the application area 1036 by dragging and dropping of the pointer 205.
Alternatively, as shown in FIG. 10E, in the case in which each of the application items displayed in the application list 1035 includes a selection window, when a selection window is selected, an application item corresponding thereto may be automatically added to the application area 1036.
FIG. 10E shows the case in which four application items 1021, 1023, 1025 and 1027 are selected by the pointer 205. At this time, cursors 1028 may be displayed on recently selected application items.
In the application area 1036, application items 1011, 1013, 1015 and 1017 corresponding to the selected application items 1021, 1023, 1025 and 1027 may be displayed.
FIG. 10F shows the case in which one application item 1029 is further selected as compared to FIG. 10E. Accordingly, in the application area 1036, a newly added application item 1019 is further displayed in addition to the application items 1011, 1013, 1015 and 1017.
FIGS. 10E and 10F show the case in which, when the items of the application list are selected, the items are immediately added to the application area 1036.
Alternatively, the items may be added to the application area 1036 when selection of the items of the application list has been completed. In this case, an object (not shown) indicating that application selection has been completed may be further displayed.
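The deferred addition described above, in which selected items are placed into the application area only once selection is completed, may be sketched as follows. The list contents and function name are illustrative assumptions.

```python
def complete_selection(application_list, selected_indices):
    """Return the application area contents for a completed selection:
    each selected index in the application list yields one area item."""
    return [application_list[i] for i in selected_indices]

# three of five listed applications were selected before completion
area = complete_selection(["browser", "mail", "photos", "music", "games"],
                          [0, 2, 3])
```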
FIG. 10G shows the case in which a card object name is input to the card object name input window 1037.
For example, a card object name may be entered using the letter key 260 placed on the back surface 251 of the remote controller 200. As another example, if the pointer is located on the card object name input window 1037, a screen keyboard may be displayed on the display 180 and thus the card object name may be entered. Alternatively, the card object name may be entered based on voice input using a microphone (not shown) included in the remote controller 200.
FIG. 10G shows the case in which the generation completion item 1031 is selected using the pointer 205 after card object name input has been completed.
In this case, as shown in FIG. 10H, a card object generation completion message 1009 may be displayed. Therefore, the user may be aware that card object generation has been normally completed.
Next, after card object generation has been completed, the home screen may be displayed. In particular, the home screen including the generated card object may be displayed.
FIG. 10I shows the case in which the home screen 904 including the generated card object 1007 is displayed. Therefore, the user may generate and display a desired card object.
When the card object is generated as shown in FIGS. 10A to 10I, the icon 1033 associated with the user of the generated card object may be based on a captured image of the user. For example, the image of the user may be captured using the camera 195, a user face area may be extracted from the captured image, and the icon 1033 associated with the user of the generated card object may be automatically generated based on the extracted face image.
As another example, the icon 1033 associated with the user of the generated card object may be set by selecting any one icon from a separate icon list.
When a card object is generated, rights to application items added to the card object 1030 to be generated may be differently set. For example, all users may have the right to confirm a first application item 1017 shown in FIG. 10F, and only the user who generates the card object may have the right to confirm a second application item 1019 shown in FIG. 10F. Alternatively, when the second application item 1019 is executed, a lock function such as password input or user face capture may be set. Therefore, since personal content items may be added to the generated card object, it is possible to increase user convenience.
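The per-item rights described above may be sketched with a small lookup table. The right labels ("all_users", "owner_only") and item names are assumptions for the sketch only.

```python
# hypothetical rights assigned when the card object was generated
ITEM_RIGHTS = {"first_application": "all_users",
               "second_application": "owner_only"}

def can_view(item, user, owner):
    """All users may confirm unrestricted items; owner-only items require
    that the viewer is the user who generated the card object."""
    return ITEM_RIGHTS.get(item, "all_users") == "all_users" or user == owner
```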
When a card object is generated, an application item added to the card object 1030 to be generated may be deleted.
For example, as shown in FIG. 10J, the application item 1019 added to the card object 1030 to be generated may be selected using the pointer 205 indicating movement of the remote controller. In the figure, a selection window included in the application item 1019 is activated.
As shown in FIG. 10K, the selected application item 1019 is removed. In FIG. 10K, the selected application item 1019 in the card object 1030 to be generated is removed and only the pointer 205 is displayed.
The application item in the card object 1030 to be generated may also be deleted by dragging and dropping of the pointer. That is, if the selected application item 1019 is dragged to the outside of the card object 1030 to be generated, the application item may be deleted.
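The drag-and-drop deletion above may be sketched as a geometric test: if the drop point falls outside the card object's rectangle, the dragged item is removed. The coordinate layout is an assumption for illustration.

```python
def drop_item(card_items, item, drop_point, card_rect):
    """card_rect is (left, top, right, bottom); drop_point is (x, y).
    Dropping outside the card object deletes the dragged item."""
    x, y = drop_point
    left, top, right, bottom = card_rect
    inside = left <= x <= right and top <= y <= bottom
    if not inside:
        return [i for i in card_items if i != item]
    return card_items
```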
Although FIGS. 9A to 10K show the case in which the object, etc. is selected using the pointer 205 indicating movement of the remote controller 200, the present invention is not limited thereto and various modifications are possible.
For example, the object, etc. may be selected using the direction key and the OK key of the remote controller 200. As another example, the object, etc. may be selected according to a user gesture based on the image of the user captured using the camera 195. As another example, user voice may be recognized and the object, etc. may be selected based on the recognized voice.
For example, the image display apparatus 100 may display a hand-shaped pointer 506 corresponding to the hand 505 of a user 1104 based on the image of the user captured using the camera 195.
In a state of displaying the home screen shown in FIG. 10L, if hand movement of the user 1104 corresponds to a tap gesture for selecting a specific application item in the application list, it may be determined that the application item is selected. A plurality of application items may be sequentially selected. In the figure, five application items are selected by the user tap gesture.
Next, as shown in FIG. 10M, if the user makes a grip gesture and then makes a leftward movement gesture, the image display apparatus 100 may group the selected five application items by the grip gesture and move the selected five application items to the left by the leftward movement gesture, that is, add the selected five application items to the card object 1030 to be generated, based on the image of the user captured using the camera 195. Therefore, it is possible to easily move a plurality of items.
As another example, the remote controller 200 including a microphone (not shown) may receive and send user voice to the image display apparatus 100.
FIG. 10N shows the case in which a plurality of application items is added to the card object 1030 to be generated using multiple input means.
First, after the five application items are selected using the direction key, etc. of the remote controller, if the user utters voice 508 "Please add the selected content to the card object", the remote controller 200 may collect and send data of such voice 508 to the image display apparatus 100. The image display apparatus 100 may analyze the user voice using a voice recognition function and recognize a command for moving the five application items.
Therefore, as shown in FIG. 10N, the five application items may be moved to and displayed in the card object 1030 to be generated.
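The command recognition step above may be sketched as a rough keyword match on the recognized text; an actual implementation would rely on a full speech recognition engine, and the phrasing matched here is only the example utterance from the description.

```python
def parse_voice_command(text, selected_items):
    """Recognize an 'add ... to the card object' request and return the
    move action for the currently selected items."""
    lowered = text.lower()
    if "add" in lowered and "card object" in lowered:
        return {"action": "add_to_card", "items": list(selected_items)}
    return {"action": "none", "items": []}
```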
A command for the image display apparatus may be input through the direction key and the OK key, a user gesture, user voice, etc. in addition to the pointer of the remote controller. Hereinafter, although the pointer indicating movement of the remote controller is focused upon, the direction key, the OK key, the user gesture, user voice, etc. may be used as described above.
Although generation of the card object is described in FIGS. 10A to 10N, various kinds of card objects may be generated.
For example, as shown in FIG. 9H, a home network card object indicating a connection state of adjacent electronic apparatuses for providing shared content may be generated. The home network card object may include the list 971c of electronic apparatuses connected to the image display apparatus 100 and the list 971d of electronic apparatuses which are not connected to the image display apparatus 100 as shown in FIG. 9H.
As another example, an integrated electronic apparatus control card object capable of simultaneously monitoring and remotely controlling a plurality of electronic apparatuses may be generated. In such a card object, information for monitoring and remotely controlling home appliances such as a refrigerator, a washing machine, an air conditioner, a cooker, a TV, etc. may be displayed.
The generated card object is a per-user card object and may be viewed by a corresponding user, which will be described with reference to FIGS. 11A to 11E.
FIG. 11A shows the case in which a first user 1103 uses the image display apparatus 100 on which the home screen 900 is displayed. The camera 190 of the image display apparatus 100 captures an image of the user and sends the captured image 1106 of the user to the controller 170.
The controller 170 compares an image previously stored in the memory 140 with the captured image 1106 of the user and performs user recognition. Then, login of the recognized user is performed.
Login may be performed based on at least one of the captured image of the user, password input or user voice recognition. User voice may be acquired using a microphone included in the remote controller 200.
If login is performed, the image display apparatus 100 may display an icon indicating the logged-in user.
FIG. 11B shows an icon 1113 indicating the logged-in first user 1103 on the home screen 906.
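The image-based recognition step above may be sketched by comparing a captured face feature vector with stored user profiles and logging in the closest match under a threshold. The feature vectors, distance measure and threshold are illustrative assumptions, not the disclosed algorithm.

```python
def recognize_user(captured, profiles, threshold=0.2):
    """Return the stored user whose profile vector is closest to the
    captured vector, or None when no profile is within the threshold."""
    best_user, best_dist = None, threshold
    for user, stored in profiles.items():
        # mean absolute difference as a toy similarity measure
        dist = sum(abs(a - b) for a, b in zip(captured, stored)) / len(stored)
        if dist < best_dist:
            best_user, best_dist = user, dist
    return best_user
```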
The controller 170 may provide an individual home screen according to the logged-in user. That is, as shown in FIG. 11B, a card object 1150 generated by the first user may be displayed on the home screen.
The card object 1150 generated by the first user may include an icon 1152 associated with the user of the generated card object, a card object name 1153, an object 1154 for setting content, and a content list 1155 including generated application items.
The controller 170 may control display of a recent card object, in which content recently used by the logged-in user is collected, to the user. That is, the recent card object indicating the recently used content may be automatically generated and displayed without generating a separate card object.
FIG. 11B shows the recent card object 1160. The recent card object 1160 may include an icon 1162 associated with the user, a card object name 1163, an object 1164 for setting content, and a content list 1165 including recent content items.
Although the card object 1150 generated by the first user and the recent card object 1160 shown in FIG. 11B are provided after login of the first user, modifications thereof are possible.
For example, the card object 1150 generated by the first user may be provided after login of the first user, but the recent card object 1160 may be displayed regardless of login of the first user.
That is, the recent card object may not include the content items recently executed by the first user but may include content items recently executed by all users. In this case, the recent card object may be displayed on the home screen regardless of login, when the user uses the image display apparatus.
Next, FIG. 11C shows the case in which a second user 1104 uses the image display apparatus to which the first user has logged in.
In this case, the camera 190 of the image display apparatus 100 captures an image of the second user and sends the captured image 1107 of the second user to the controller 170. The controller 170 compares an image previously stored in the memory 140 with the captured image 1107 of the user and performs user recognition.
At this time, since the first user has already logged in to the image display apparatus, the controller 170 informs the second user that another user has logged in to the image display apparatus and displays a message 1162 indicating whether the second user will log in to the image display apparatus again. If a new login input item 1164 is selected, the controller 170 performs login of the recognized second user.
Login may be performed based on at least one of the captured image of the user, password input or user voice recognition. User voice may be acquired using a microphone included in the remote controller 200.
If login is performed, the image display apparatus 100 may display an icon indicating the logged-in second user.
FIG. 11D shows an icon 1114 indicating the logged-in second user 1104 on the home screen 907.
The controller 170 may provide an individual home screen according to the logged-in user. That is, the home screen 906 of FIG. 11B may be provided to the first user and the home screen 907 of FIG. 11D may be provided to the second user.
The home screen 907 of FIG. 11D may include a card object 1170 generated by the second user.
The card object 1170 generated by the second user may include an icon 1172 associated with the user of the generated card object, a card object name 1173, an object 1174 for setting content, and a content list 1175 including generated application items.
Although not shown, the controller 170 may control display of a recent card object in which content recently used by the logged-in user is collected.
Next, FIG. 11E shows multi-login of a plurality of users.
For example, if the second user 1104 uses the image display apparatus 100 to which the first user 1103 has logged in, the second user may immediately log in to the image display apparatus 100 without new user login. That is, multi-login is possible.
As shown in FIG. 11E, the card object 1150 generated by the first user and the card object 1170 generated by the second user may be displayed together on the home screen.
FIG. 11E shows the case in which an icon 1113 indicating the logged-in first user 1103 and an icon 1114 indicating the logged-in second user 1104 are displayed.
If the card objects are displayed together according to multi-login and common content is present in the card objects, the common content item may be highlighted and displayed. Such a highlight function corresponds to content recommendation. By recommending the common content of the two users, it is possible to increase user convenience.
Upon multi-login, unlike FIG. 11E, one common card object may be displayed. At this time, the common card object may include the content items in the card object 1150 generated by the first user and the content items in the card object 1170 generated by the second user. At this time, the common content item may be highlighted and displayed. Alternatively, the common card object may include only the common content item.
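The common card object described above may be sketched as a merge of the two users' card contents with the shared items flagged for highlighting. The item names are illustrative only.

```python
def merge_cards(card_a, card_b):
    """Merge two users' card contents into one common card, marking
    items present in both cards as highlighted (recommended)."""
    common = set(card_a) & set(card_b)
    merged = card_a + [i for i in card_b if i not in card_a]
    return [{"item": i, "highlight": i in common} for i in merged]
```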
On the home screen 900 displayed on the image display apparatus 100, various settings may be performed in addition to card object generation. Hereinafter, various settings will be described in detail.
FIG. 12A shows the case in which the settings object 916 for setting the image display apparatus is selected on the home screen 900 based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
As shown in FIG. 12B, the settings screen 1010 related to the image display apparatus may be displayed. FIG. 12B shows the case in which the customize home item 1014 of the settings screen 1010 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
As shown in FIG. 12C, an application list edit screen 1180 for editing the launcher bar area 903 may be displayed on the home screen. At this time, the dashboard area 902 may be empty.
The application list edit screen 1180 is similar to the application list 980 of FIG. 12B and is different therefrom in that the application items are switched to an editable state. Unlike FIG. 12B, the application list edit screen 1180 may further include a widget addition item 1182 and an addition completion item 1184.
On the application list edit screen 1180, content list settings of a specific card object may be changed, a card object name may be changed, or the position of a card object may be changed.
For display of the application list edit screen 1180 of FIG. 12C, the application list edit screen 1180 may be displayed using other methods, instead of selection of the settings object 916 of the home screen 900 and selection of the customize home item 1014 of the settings screen 1010.
For example, if the pointer 205 is located in the application list 980 in a state of displaying the home screen 900 of FIG. 12A and then long tap or long press input is received, as shown in FIG. 12C, the application list edit screen 1180 for editing the launcher bar area 903 may be displayed on the home screen.
As another example, if the menu key 224 of the remote controller 200 is manipulated in a state of displaying the home screen 900 of FIG. 12A, the settings screen 1010 related to the image display apparatus shown in FIG. 12B may be displayed. If the customize home item 1014 is selected, the application list edit screen 1180 may be displayed.
As another example, if the menu key 224 of the remote controller 200 is manipulated in a state in which the pointer 205 of the remote controller 200 is located in the launcher bar area 903 of the home screen 900 of FIG. 12A, the application list edit screen 1180 for editing the launcher bar area may be immediately displayed.
FIG. 12C shows the case in which the widget addition item 1182 of the application list edit screen 1180 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
As shown in FIG. 12D, a widget screen 1185 may be displayed. If various widgets which may be installed in the image display apparatus are present, screen switching objects 1186 and 1187 for switching the screen may be further displayed. Accordingly, widget items desired by the user may be installed in the image display apparatus.
The installed widgets may be displayed on the image display apparatus when the widget item 981 of FIG. 9C is selected. If the widget item 981 of FIG. 9C is selected once more, the home screen may be displayed on the image display apparatus again. That is, the installed widget screen and the home screen may be alternately displayed according to widget screen selection.
Next, FIG. 13A shows the case in which, in the state of displaying the settings screen 1010 related to the image display apparatus, the home and all apps settings item 1016 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, as shown in FIG. 13B, a home and all apps settings screen 1210 for setting the home screen may be displayed. At this time, the dashboard area 902 may be empty.
The home and all apps settings screen 1210 may include a plurality of sub-items.
In the figure, a startup application item 1212 for setting applications to be displayed when the image display apparatus is powered on, a show/hide application item 1214 for setting show/hide of application items installed in the image display apparatus, a wallpaper item 1216 and a restore item for restoring an originally installed item are shown as the plurality of sub-items of the home and all apps settings screen 1210.
If the startup application item 1212 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200, as shown in FIG. 13C, a startup settings screen 1220 for setting applications to be displayed when the image display apparatus is powered on is displayed.
Since the home screen is not immediately displayed when the image display apparatus 100 is powered on, the startup settings screen 1220 may include items for setting the applications to be displayed in a period from a time when power is turned on to a time when the home screen is displayed.
In the figure, a default item 1222 for displaying a live broadcast, a none item 1224 for non-display, a 3D item 1226 for displaying a 3D image, and an application item for executing a specific application are shown as the items of the startup settings screen 1220.
In the figure, the default item 1222 for displaying the live broadcast is set among the items of the startup settings screen 1220. In this case, since the home screen is not immediately displayed when the image display apparatus 100 is powered on, a live broadcast program received through the broadcast reception unit 105 may be displayed as the full screen in the period from the time when power is turned on to the time when the home screen is displayed.
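The startup setting above may be sketched as a lookup from the chosen mode to the screen shown between power-on and home screen display. The mode labels are assumptions mirroring the items of the startup settings screen.

```python
# hypothetical mapping of startup settings screen items to startup screens
STARTUP_MODES = {"default": "live_broadcast",   # default item 1222
                 "none": "blank",               # none item 1224
                 "3d": "3d_image"}              # 3D item 1226

def startup_screen(mode):
    """Any mode outside the table is treated as a specific application
    chosen through the application item."""
    return STARTUP_MODES.get(mode, "application")
```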
FIG. 13D shows the case in which, in the state of displaying the home and all apps settings screen 1210, the show/hide application item 1214 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, an application grid screen 1230 shown in FIG. 13E may be displayed. The application grid screen 1230 may include a plurality of application items and, more particularly, application items displayed in the launcher bar area 903.
In the figure, all application items on the application grid screen 1230 are selected for display.
FIG. 13F shows the case in which, in the state of displaying the home and all apps settings screen 1210, the wallpaper item 1216 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, a wallpaper settings screen 1240 shown in FIG. 13G may be displayed. The wallpaper settings screen 1240 may include a plurality of wallpaper items.
FIG. 13G shows the case in which a specific wallpaper item 1242 is selected from among the plurality of wallpaper items based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, a wallpaper screen 1250 shown in FIG. 13H may be displayed. As a result, it is possible to easily set the wallpaper displayed when the home screen is displayed.
Next, FIG. 14A shows the case in which, in the state of displaying the settings screen 1010 related to the image display apparatus, the system settings item 1018 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, a system settings screen 1310 shown in FIG. 14B may be displayed.
The system settings screen 1310 is used to set all items of the image display apparatus 100 and may include a recent menu item 1314, a network item 1315, a video input item 1316, a picture and sound item 1317, a channel settings item 1318, a 3D settings item 1319, etc.
When the recent menu item 1314 is selected, a list of recently set menu items may be displayed. When the network item 1315 is selected, a list for setting information (zip code, address information, etc.) about a region in which the image display apparatus is installed, information about a network service provider corresponding to the region, etc. may be displayed. When the video input item 1316 is selected, a list for setting resolution, coding rate and codec of displayed video may be displayed.
In addition, if the picture and sound item 1317 is selected, a list for setting resolution, coding rate and codec of a displayed picture or output sound may be displayed. If the channel settings item 1318 is selected, a list for setting an automatic channel search range, a manual channel search range, etc. may be displayed. If the 3D settings item 1319 is selected, a list for setting a 3D image output format, depth of a 3D image, a frame frequency, etc. may be displayed.
The system settings screen 1310 may include a search window 1312 for easily searching numerous setting items with various depths. In addition, a voice recognition icon 1313 may be displayed in the vicinity of the search window 1312 such that search is performed through voice recognition.
Predetermined letters may be entered in the search window using the letter key 260 of the remote controller 200 or the letter key of the screen keyboard, and the setting items corresponding to the entered letters may be immediately searched for and displayed. Accordingly, the user can easily search for a desired setting item.
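The incremental search above may be sketched as a case-insensitive substring match of the entered letters against setting item names. The item names are illustrative.

```python
def search_settings(query, setting_items):
    """Return setting items whose names contain the typed letters,
    so matches can be displayed immediately as the user types."""
    q = query.lower()
    return [item for item in setting_items if q in item.lower()]
```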
FIG. 15 is a flowchart illustrating a method for operating an image display apparatus according to an embodiment of the present invention, and FIGS. 16A to 19B are views referred to for describing various examples of the method for operating the image display apparatus of FIG. 15.
Referring to the figure, the image display apparatus 100 displays the home screen (S1820).
Step 1820 (S1820) of displaying the home screen is the same as step 820 (S820) of displaying the home screen and thus a description thereof will be omitted.
After step 1820 (S1820) of FIG. 15, that is, after the home screen is displayed, the following steps may be performed.
The image display apparatus 100 determines whether a broadcast information-related application on the home screen is selected (S1830) and displays a screen for executing the application (S1840) if the application is selected.
As shown in FIG. 16A, in the state of displaying the home screen 900, if the primetime (TV & movie) item 1306 of the application list is selected, the image display apparatus 100 may display a primetime application execution screen 1300 as shown in FIG. 16B.
The primetime application executed by selection of the primetime item 1306 may provide a current live broadcast program, a VOD such as a drama or animation which is capable of being streamed at a user request, and a movie which is capable of being streamed at a user request.
In order to provide such a service, the network of the image display apparatus 100 is preferably set in advance. For example, region information and information about a broadcast service provider and a network service provider according to the region information are preferably set in advance. Network settings are described with reference to FIGS. 17A to 17C.
In the primetime application, the live broadcast program, the VOD such as a drama or animation, or the movie provided to the image display apparatus 100 may be changed according to the set region and network service provider.
FIG. 16B shows the primetime application screen. The primetime application screen 1300 may include a live broadcast program item 1310, a VOD item 1320 such as drama or animation, and a movie item 1330 according to the provided service kind. The image display apparatus 100 may display a content list corresponding to an item selected from among the items.
FIG. 16B shows the case in which the livebroadcast program item1310 is selected to display a live broadcast program list.
The live broadcast program list may include respectivelive broadcast images1350,1360,1370 and1380 of channels.
Among the respective live broadcast images 1350, 1360, 1370 and 1380 of the channels, the live broadcast image 1350 of the channel displayed on the home screen 900 of FIG. 16A may be highlighted. Alternatively, the live broadcast image of the channel having the highest real-time ratings among the respective live broadcast images 1350, 1360, 1370 and 1380 of the channels may be highlighted. In addition, the live broadcast image most viewed by the user among the respective live broadcast images 1350, 1360, 1370 and 1380 of the channels may be highlighted.
The highlighted live broadcast image may have the largest size, may be arranged at the foremost side, or may display the largest amount of information. That is, the highlighted live broadcast image may be highlighted with a priority higher than that of the live broadcast images of the other channels.
In the figure, the highlighted live broadcast image 1350 includes a channel name 1352, a program name 1354 of the program broadcast on the channel, brief broadcast information 1356 of the program and a detailed information view object 1358 of the program.
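For example, the highlight-selection alternatives described above may be sketched as follows. The data fields, function name and criterion keys are illustrative assumptions for demonstration, not part of the claimed embodiment:

```python
# Illustrative sketch: choose the live broadcast image to highlight by
# home-screen channel, highest real-time ratings, or the user's most
# viewed channel. All field names and metrics are assumptions.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LiveImage:
    channel: str
    rating: float          # real-time ratings share (assumed metric)
    user_view_count: int   # times this user watched the channel (assumed metric)

def pick_highlight(images: List[LiveImage],
                   home_channel: Optional[str] = None,
                   criterion: str = "home") -> LiveImage:
    """Select the image to highlight according to one of the alternatives."""
    if criterion == "home" and home_channel is not None:
        for img in images:
            if img.channel == home_channel:
                return img
    if criterion == "ratings":
        return max(images, key=lambda i: i.rating)
    # fallback alternative: the channel most viewed by this user
    return max(images, key=lambda i: i.user_view_count)
```

The selected image would then be rendered with the larger size, foremost position, or additional information described above.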
The live broadcast image of each channel may be a broadcast program image provided by a broadcast service provider for providing a broadcast service.
The displayed broadcast information may be broadcast information provided by a broadcast service provider for providing a broadcast service. Upon primetime application execution, the image display apparatus 100 may access a broadcast service provider server for providing the broadcast service. Here, the broadcast service provider may include a local cable operator.
In FIG. 16B, the live broadcast image 1360 other than the highlighted live broadcast image 1350 includes only a channel name and a program name of the program broadcast on the corresponding channel. That is, the live broadcast image 1360 and the highlighted live broadcast image differ in terms of the information provided.
By highlighting any one live broadcast program of the live broadcast program list, user interest in the highlighted live broadcast program may be assumed to be high.
The primetime application screen 1300 may display a setting item 1340 and region information 1345 of the image display apparatus 100.
According to the region information of the image display apparatus 100, the number of broadcast channels and the broadcast channel numbers provided to the image display apparatus may be changed. Settings related to the primetime application may be performed through the setting item 1340. This will be described below with reference to FIGS. 16I to 16J.
Unlike FIG. 16B, the live broadcast images of the live broadcast program list may have the same size, and the arrangement order thereof may be changed according to real-time ratings. In this case, if any one live broadcast content of the live broadcast program list is focused upon by movement of the cursor or the pointer, the broadcast information of the live broadcast content may be displayed in the form of a pull-down menu. Accordingly, the user can easily confirm desired information.
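The alternative arrangement just described may be sketched, for example, as follows; the dictionary keys, function names and pull-down format are hypothetical:

```python
# Equally sized tiles arranged by real-time ratings, with a pull-down
# information string produced when a tile is focused. Keys are assumptions.
def arrange_by_ratings(programs):
    """Return live broadcast entries sorted by real-time ratings, descending."""
    return sorted(programs, key=lambda p: p["rating"], reverse=True)

def pull_down_info(program):
    """Return the broadcast information shown when a tile is focused."""
    return "{channel}: {title} ({rating}%)".format(**program)
```

A similar ordering rule could apply to the VOD and movie lists described later, using popularity or download count as the sort key.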
Next, FIG. 16C shows the case in which the live broadcast program item 1310 is selected and thus a broadcast information screen 1303a is displayed. The broadcast information screen 1303a may be an EPG screen.
The broadcast information screen 1303a may include broadcast information centered on the live broadcast program displayed on the home screen of FIG. 16A. Since the live broadcast program of Channel 5 is displayed on the home screen of FIG. 16A, the broadcast information screen 1303a of FIG. 16C is displayed to be centered on Channel 5 along with broadcast information of Channels 6 to 8. In particular, the item "The Following" broadcast by SBC on Channel 5 displayed in FIG. 16A may be highlighted. Therefore, the user can easily confirm the broadcast information of the channel.
In the case of a program provided by a major broadcast station, such broadcast information may be included in a broadcast signal transmitted by the major broadcast station. The image display apparatus 100 may extract the broadcast information from the broadcast signal and generate and display the OSD shown in the figure.
As another example, in the case of a program provided by a cable broadcast station, such broadcast information may be included in a broadcast signal transmitted by the cable broadcast station or may be received from a server of the cable broadcast station over a network. The image display apparatus 100 may extract the broadcast information from the broadcast signal or receive the broadcast information over the network and generate and display the OSD shown in the figure.
If the broadcast information is received via a plurality of routes, for example, from the major broadcast station and from the cable broadcast station over the network, the image display apparatus may simultaneously display all broadcast information on the broadcast information screen or may display a selection menu and display only one piece of broadcast information.
Next, FIG. 16D shows the case in which the live broadcast program item 1310 is selected and thus the broadcast information screen 1303b is displayed. Unlike FIG. 16C, the broadcast information screen 1303b may be displayed on a per genre basis.
For example, if the genre of "The Following" broadcast by SBC on Channel 5 displayed in FIG. 16A is thriller, the broadcast information screen 1303b may display broadcast information aligned on a per genre basis.
In the figure, "District 13" broadcast by YTC on Channel 23, "Shooter" broadcast by CCC on Channel 130 and "Deja-vu" broadcast by VAN on Channel 527, which are equal or similar to "The Following" broadcast by SBC on Channel 5 in terms of genre, are displayed. Therefore, the user can easily confirm broadcast information on a per genre basis.
At this time, information 1349 about the number of viewers of each channel may be displayed. The number of viewers may be the number of viewers in the same region. In the figure, the region information 1345 and the information 1349 about the number of viewers of each channel are displayed.
Although the broadcast information on the broadcast information screen is aligned on a per genre basis in FIG. 16D, such settings may be changed. If the settings object 1348 is selected to change the broadcast information alignment criterion, the broadcast information may be aligned according to the changed criterion. The broadcast information alignment criterion may be "channel", "genre", "ratings", "age", "person" and "place" related to the viewed program, etc.
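For example, re-alignment by a changed criterion may be sketched as a simple grouping operation; the entry keys and grouping rule are assumptions for illustration:

```python
# Group EPG entries by the currently selected alignment criterion
# ("channel", "genre", "ratings", etc.). Entry keys are illustrative.
from itertools import groupby

def align_broadcast_info(entries, criterion="genre"):
    """Return broadcast information grouped by the selected criterion."""
    key = lambda e: str(e[criterion])
    ordered = sorted(entries, key=key)  # groupby requires sorted input
    return {k: list(g) for k, g in groupby(ordered, key=key)}
```

Switching the criterion then amounts to calling the same routine with a different key, so the screen can be rebuilt immediately after the settings object is selected.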
If the user has logged in to the image display apparatus, the broadcast information screen may be provided on a per user, preference or genre basis. If a child has logged in to the image display apparatus, a broadcast information screen related to a children's program may be provided. If a female adult has logged in to the image display apparatus, a broadcast information screen related to a cooking program or drama may be provided.
FIG. 16E shows the case in which the VOD item 1320 is selected and thus a VOD list is displayed.
The VOD list 1410 may include a plurality of VOD moving images 1351, 1353, 1361 and 1363. The VOD moving images of the VOD list 1410 may be divided according to servers or genres. A popular image or a most downloaded image among the plurality of VOD moving images 1351, 1353, 1361 and 1363 may be highlighted. In the figure, the first VOD moving image 1351 is highlighted. The first VOD moving image 1351 may include a VOD moving image, a VOD name and the number of views. In particular, the first VOD moving image 1351 may further include the number of views, as compared to the other VOD moving images.
The VOD moving image may be provided by a content provider for providing a VOD moving image. The image display apparatus 100 may access a server of a content provider to receive a VOD moving image.
Upon display of the VOD moving image list 1410, the setting item 1340 and the region information 1345 of the image display apparatus 100 may be continuously displayed.
Unlike FIG. 16E, the VOD moving images of the VOD moving image list 1410 may have the same size, and the arrangement order thereof may be changed according to popularity or download count. In this case, if any one VOD moving image content of the VOD moving image list is focused upon by movement of the cursor or the pointer, the information of the VOD moving image content may be displayed in the form of a pull-down menu. Accordingly, the user can easily confirm desired information.
Next, FIG. 16F shows the case in which the movie item 1330 is selected and thus a movie list is displayed.
The movie list 1420 may include a plurality of movies. The movies of the movie list 1420 may be divided according to servers or genres. The arrangement order of the plurality of movies may be changed according to popularity or download count.
If any one movie on the movie list 1420 is focused upon by movement of the cursor or the pointer, information about the movie may be displayed in the form of a pull-down menu. Accordingly, the user can easily confirm desired information.
The movie content may be provided by a content provider for providing movie content. The image display apparatus 100 may access a server of a content provider to receive movie content.
Upon displaying the movie list 1420, the setting item 1340 and the region information 1345 of the image display apparatus 100 may be continuously displayed.
Although an object is selected using the pointer 205 indicating movement of the remote controller 200 in FIGS. 16A to 16F, the present invention is not limited thereto and various modifications are possible.
For example, the object may be selected using the direction key and the OK key of the remote controller 200. As another example, the object, etc. may be selected in correspondence with a user gesture based on the image of the user captured using the camera 195. As another example, user voice may be recognized and the object, etc. may be selected based on the user voice.
For example, the image display apparatus 100 may display a hand-shaped pointer 506 corresponding to the hand 505 of a user 1104 based on the image of the user captured using the camera 195.
In the state of displaying the home screen 900 shown in FIG. 16G, if hand movement of the user 1104 corresponds to a tap gesture for selecting the primetime item (TV & movie item) 1306, it may be determined that the corresponding item is selected. Any one of the screens of FIGS. 16B to 16D may be displayed. That is, the live broadcast program list may be displayed as shown in FIG. 16B, or the broadcast information screen 1303a or 1303b of FIG. 16C or 16D may be displayed.
As another example, the remote controller 200 including a microphone (not shown) may receive user voice and send the user voice to the image display apparatus 100.
In the state of displaying the home screen 900 shown in FIG. 16G, if the user utters the voice 509 "please execute primetime", the remote controller 200 may collect data of the voice 509 and send the data to the image display apparatus 100. The image display apparatus 100 may analyze the user voice through the voice recognition function and recognize a primetime application execution command. Then, the live broadcast program list may be displayed as shown in FIG. 16B, or the broadcast information screen 1303a or 1303b of FIG. 16C or 16D may be displayed.
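A minimal sketch of mapping a recognized utterance to an application command follows; the phrase table and command names are hypothetical placeholders, not the actual command set of the apparatus:

```python
# Map a recognized utterance to an internal command of the image display
# apparatus. The vocabulary below is an illustrative placeholder.
from typing import Optional

VOICE_COMMANDS = {
    "please execute primetime": "LAUNCH_PRIMETIME_APP",
}

def recognize_command(utterance: str) -> Optional[str]:
    """Return the command for a recognized utterance, or None if unknown."""
    return VOICE_COMMANDS.get(utterance.strip().lower())
```

In practice the recognizer would tolerate variations in phrasing; the exact-match table above only illustrates the utterance-to-command step.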
A selection command of the image display apparatus may be performed using the direction key and the OK key, a user gesture, user voice, etc., in addition to the pointer of the remote controller. Although the following description focuses on the pointer indicating movement of the remote controller, the direction key and the OK key, a user gesture, user voice, etc. may also be used.
FIG. 16I shows the case in which the setting item 1340 of the primetime application screen 1300 is selected and a settings screen 1430 is displayed.
The settings screen 1430 may include a personalization item, a "review in app store" item and an "about" item.
If the personalization item 1432 is selected on the settings screen 1430 of FIG. 16I based on the pointer 205 displayed in correspondence with movement of the remote controller 200, a personalization related screen 1440 shown in FIG. 16J may be displayed.
The personalization related screen 1440 may include an email account related item 1442, a content rating related item 1444 and an application store related item 1446.
FIG. 17A shows the case in which a settings object 916 of the live broadcast card object 920 of the home screen 900 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, as shown in FIG. 17B, a settings screen 1010 related to the image display apparatus may be displayed.
The settings screen 1010 may include a home dashboard edit item 1012, a customize home item 1014, a home and all apps settings item 1016 and a system settings item 1018.
FIG. 17B shows the case in which, in the state of displaying the settings screen 1010, the system settings item 1018 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, a system settings screen 1310 shown in FIG. 17C may be displayed.
The system settings screen 1310 is used to set all items of the image display apparatus 100 and may include a recent menu item 1314, a network item 1315, a video input item 1316, a picture and sound item 1317, a channel settings item 1318, a 3D settings item 1319, etc.
When the recent menu item 1314 is selected, a list of recently set menu items may be displayed. When the network item 1315 is selected, a list for setting information (zip code, address information, etc.) about the region in which the image display apparatus is mounted, information about a network service provider (an Internet service provider, etc.) corresponding to the region, etc. may be displayed. When the video input item 1316 is selected, a list for setting the resolution, coding rate and codec of displayed video may be displayed.
In addition, if the picture and sound item 1317 is selected, a list for setting the resolution (bit rate), coding rate and codec of the displayed picture or output sound may be displayed. If the channel settings item 1318 is selected, a list for setting an automatic channel search range, a manual channel search range, etc. may be displayed. If the 3D settings item 1319 is selected, a list for setting a 3D image output format, the depth of a 3D image, a frame frequency, etc. may be displayed.
The system settings screen 1310 may include a search window 1312 for easily searching for numerous setting items with various depths. In addition, a voice recognition icon 1313 may be displayed in the vicinity of the search window 1312 such that a search may be performed through voice recognition.
Predetermined letters may be entered in the search window using the letter key 260 of the remote controller 200 or the letter key of the screen keyboard, and the setting items corresponding to the entered letters may be immediately searched for and displayed. Accordingly, the user can easily search for a desired setting item.
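The incremental search may be sketched, for example, as a simple filter over setting item names; the case-insensitive substring match is an illustrative assumption:

```python
# Filter setting items as letters are entered in the search window.
# The matching rule (case-insensitive substring) is an assumed choice.
def search_settings(items, query):
    """Return the setting items matching the letters entered so far."""
    q = query.strip().lower()
    return [item for item in items if q in item.lower()]
```

Re-running the filter on each keystroke yields the immediate search-and-display behavior described above.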
FIG. 17C shows the case in which, in the state of displaying the system settings screen 1310, the network item 1315 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
For network setting of the image display apparatus 100, a list for setting information (zip code, address information, etc.) about the region in which the image display apparatus is mounted, information about a network service provider (an Internet service provider, etc.) corresponding to the region, etc. may be displayed. Thus, region information input and network service provider input or selection may be performed.
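For example, the association between region information and the corresponding providers may be held in a simple lookup table; the zip code and provider names below are placeholders, not real providers:

```python
# Look up the broadcast and network service providers for the region in
# which the apparatus is mounted. The table contents are hypothetical.
from typing import Dict, Optional

REGION_PROVIDERS: Dict[str, Dict[str, str]] = {
    "10001": {"broadcast": "LocalCableCo", "network": "ExampleISP"},
}

def providers_for_region(zip_code: str) -> Optional[Dict[str, str]]:
    """Return the providers configured for a zip code, or None if unset."""
    return REGION_PROVIDERS.get(zip_code)
```

The primetime application can then consult this setting to decide which live broadcast, VOD and movie services to present for the set region.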
FIG. 18A shows the case in which the app store item 1308 of the application list 980 of the home screen 900 is selected based on the pointer 205 displayed in correspondence with movement of the remote controller 200.
Then, as shown in FIG. 18B, an app store screen 1500 related to the image display apparatus may be displayed.
The app store screen 1500 related to the image display apparatus may include a search window 1502 for searching for an application, a first application list 1510, and a second application list 1520. The first application list 1510 may include popular or new application items, and the sizes of the application items of the first application list 1510 may be greater than those of the application items of the second application list 1520.
If a web item 1309 of the application list 980 of the home screen 900 is selected in FIG. 18A, a web screen 1530 shown in FIG. 18C may be displayed. Then, it is possible to immediately use the Internet.
FIG. 19A shows the case in which a predetermined content image 1600 is displayed on the image display apparatus 100. The content image 1600 may be a broadcast program image, a VOD image, etc.
If the pointer 205 displayed in correspondence with movement of the remote controller 200 is moved to a lower side of the content region, that is, to a position corresponding to the launcher bar area 903 of the home screen, the application list 980 may be displayed as shown in FIG. 19B. That is, the application list 980 may be displayed even when the home screen is not displayed. Therefore, it is possible to increase user convenience.
When a card object is generated according to the method for operating the image display apparatus described with reference to FIGS. 8 to 14B, the method for operating the image display apparatus of FIG. 15 is applicable. That is, if an application item related to broadcast information of an application list of a home screen is selected after a card object is generated, a broadcast information-related application screen including a live broadcast image item for viewing a live broadcast image list and a moving-image item for viewing a moving-image list may be displayed.
On the contrary, after a broadcast information-related application screen according to the method for operating the image display apparatus described with reference to FIGS. 15 to 19B is displayed, a card object may be generated in the state of displaying a home screen.
In the image display apparatus 100 according to the embodiment of the present invention, when an object or a menu is selected or focused upon, the object or the menu may be highlighted. Although not shown in FIGS. 9A to 13B or FIGS. 16A to 19B, the contour of the selected or focused object or menu may be made thick, or at least one of the size, color or luminance of the selected or focused object or menu may be changed. Therefore, the user can intuitively perceive selection or focusing of the object or menu.
According to one embodiment of the present invention, an image display apparatus displays a screen for generating a card object to be displayed on a home screen according to card object generation input and adds a predetermined content item to the card object to be generated if the predetermined content item is selected. Therefore, the user can easily generate a desired card object. As a result, it is possible to increase user convenience.
In particular, the card object to be generated is displayed in a first region of the screen for generating the card object and content items to be added are displayed in a second region, such that desired content items are easily added to the card object to be generated on one screen.
The card object may be generated on a per user basis and, when a user logs in, a card object corresponding to the user may be displayed. Thus, the user can view the card object including content items desired by the user.
If system settings input is received, a system settings screen including network settings, video settings, channel settings and 3D image settings is displayed, and a search window for searching for a settings item is displayed, such that a desired settings item among a plurality of settings items can be immediately searched for.
According to another embodiment of the present invention, an image display apparatus displays a broadcast information-related application screen including a live broadcast image item for viewing a live broadcast image list and a moving-image item for viewing a moving-image list, if a broadcast information-related application item in an application list on a home screen is selected. Therefore, the user can easily view desired content.
If a live broadcast image list is displayed on the broadcast information-related application screen, at least one of the size or arrangement order of broadcast images in the live broadcast image list or the amount of display information related to the broadcast images is changed according to real-time ratings. Therefore, it is possible to increase user convenience.
If a moving-image list is displayed on the broadcast information-related application screen, at least one of the size or arrangement order of moving images in the moving-image list or the amount of display information related to the moving images is changed according to download counts of the moving images. Therefore, it is possible to increase user convenience.
The image display apparatus and the method for operating the same according to the foregoing embodiments are not restricted to the embodiments set forth herein. Therefore, variations and combinations of the exemplary embodiments set forth herein may fall within the scope of the present invention.
The method for operating an image display apparatus according to the foregoing embodiments may be implemented as code that can be written to a computer-readable recording medium and can thus be read by a processor. The computer-readable recording medium may be any type of recording device in which data can be stored in a computer-readable manner. Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage, and a carrier wave (e.g., data transmission over the Internet). The computer-readable recording medium can be distributed over a plurality of computer systems connected to a network so that computer-readable code is written thereto and executed therefrom in a decentralized manner. Functional programs, code, and code segments to realize the embodiments herein can be construed by one of ordinary skill in the art.
Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.