
Menu display method and display equipment

Info

Publication number
CN111614995A
CN111614995A
Authority
CN
China
Prior art keywords
menu
display
user
current channel
menu item
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010468548.4A
Other languages
Chinese (zh)
Inventor
高雯雯
董杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202010468548.4A
Publication of CN111614995A
Legal status: Pending

Abstract

The application discloses a menu display method and a display device. The method comprises: in response to an instruction input by a user to display a menu, determining the type of the current channel and displaying menu information on a display, wherein the menu information comprises menu items of functions matched with the type of the current channel and a menu item entry used for indicating system general functions; and displaying a plurality of menu items corresponding to the system general functions on the display when the user controls the selector, based on user input, to select the menu item entry.

Description

Menu display method and display equipment
Technical Field
The present application relates to the field of display technologies, and in particular, to a menu display method and a display device.
Background
Because display devices can play video from a wide range of sources, a traditional menu that only presents the general functions of the display device allows the user to set only those general functions and cannot meet the user's actual needs. Therefore, there is a need in the art for a menu display method that improves the user experience.
Disclosure of Invention
The embodiments of the application provide a menu display method and a display device, which are used to improve the user experience.
In a first aspect, there is provided a display device comprising:
a display;
a user interface for receiving instructions input by a user;
a tuner demodulator for tuning and demodulating a broadcast program in a digital broadcast signal;
a controller for performing:
determining the type of the current channel in response to an instruction input by a user to display a menu, and displaying menu information on a display;
wherein the menu information comprises a plurality of menu items corresponding to functions matched with the type of the current channel and a menu item entry used for indicating system general functions; and displaying, on the display, a plurality of menu items corresponding to the system general functions when the selector is controlled, based on user input, to select the menu item entry.
In a second aspect, there is provided a display device comprising:
a display;
a user interface for receiving instructions input by a user;
a tuner demodulator for tuning and demodulating a broadcast program in a digital broadcast signal;
a controller for performing:
determining the type of the current channel in response to an instruction input by a user to display a menu, and displaying menu information on a display;
wherein the menu information comprises a plurality of menu items corresponding to system general functions and a menu item entry for functions matched with the type of the current channel; and displaying, on the display, a plurality of menu items corresponding to the functions matched with the type of the current channel when the selector is controlled, based on user input, to select the menu item entry.
In a third aspect, a menu display method is provided, including:
determining the type of the current channel in response to an instruction input by a user to display a menu, and displaying menu information on a display;
wherein the menu information comprises a plurality of menu items corresponding to functions matched with the type of the current channel and a menu item entry used for indicating system general functions; and displaying, on the display, a plurality of menu items corresponding to the system general functions when the selector is controlled, based on user input, to select the menu item entry.
In a fourth aspect, a menu display method is provided, including:
determining the type of the current channel in response to an instruction input by a user to display a menu, and displaying menu information on a display;
wherein the menu information comprises a plurality of menu items corresponding to system general functions and a menu item entry for functions matched with the type of the current channel; and displaying, on the display, a plurality of menu items corresponding to the functions matched with the type of the current channel when the selector is controlled, based on user input, to select the menu item entry.
Drawings
Fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100;
Fig. 1B is a block diagram schematically illustrating a configuration of the control apparatus 100 in fig. 1A;
Fig. 1C is a block diagram schematically illustrating a configuration of the display device 200 in fig. 1A;
Fig. 1D is a block diagram illustrating the architectural configuration of the operating system in the memory of the display device 200;
Fig. 2 is a schematic diagram illustrating an example GUI provided by the display device 200;
Fig. 3 is a schematic diagram illustrating another GUI provided by the display device 200;
Fig. 4 is a schematic diagram illustrating another GUI provided by the display device 200;
Fig. 5 is a schematic diagram illustrating another GUI provided by the display device 200;
Fig. 6 is a flow chart illustrating a menu display method;
Fig. 7 is a schematic diagram illustrating the display of a menu;
Fig. 8 is a flow chart illustrating another menu display method;
Fig. 9 is a schematic diagram illustrating the display of another menu;
Fig. 10 is a schematic diagram illustrating the display of yet another menu.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort, shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of one or more exemplary embodiments, it is to be understood that individual aspects of the disclosure can be utilized independently and separately from the other aspects and still constitute a complete solution.
The terms "comprises" and "comprising," and any variations thereof, as used herein, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
Among them, the control apparatus 100 is configured to control the display device 200; it may receive an operation instruction input by a user and convert the operation instruction into an instruction recognizable and executable by the display device 200, serving as an intermediary for interaction between the user and the display device 200. For example: the user operates a channel up/down key on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
The control apparatus 100 may be a remote controller 100A, which uses infrared protocol communication, bluetooth protocol communication, or other short-distance communication methods to control the display device 200 in a wireless or other wired manner. The user may input a user instruction through a key on the remote controller, voice input, control panel input, etc., to control the display device 200. For example: the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right moving keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
The control apparatus 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. Through configuration, the application program may provide various controls to the user on an intuitive user interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 100B may install a software application matching the display device 200 to implement connection communication through a network communication protocol, for the purpose of one-to-one control operation and data communication. For example: the mobile terminal 100B may establish a control instruction protocol with the display device 200 to implement the functions of the physical keys arranged on the remote controller 100A by operating various function keys or virtual buttons of the user interface provided on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200 to implement a synchronous display function.
The display device 200 may be implemented as a television, and may provide a broadcast receiving function together with intelligent network television functions supported by computer capabilities. Examples of the display device include a digital television, a web television, a smart television, an internet protocol television (IPTV), and the like.
The display device 200 may be a liquid crystal display, an organic light-emitting display, or a projection display device. The specific display device type, size, resolution, etc. are not limited.
The display device 200 also performs data communication with the server 300 through various communication means. Here, the display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display device 200. By way of example, the display device 200 may send and receive information, such as: receiving electronic program guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The server 300 may be one group or multiple groups of servers, and may be one or more types of servers. Other web service contents such as video on demand and advertisement services are provided through the server 300.
Fig. 1B is a block diagram illustrating the configuration of the control apparatus 100. As shown in fig. 1B, the control apparatus 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a random access memory (RAM) 111, a read only memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the running of the control apparatus 100, the communication and cooperation among internal components, and external and internal data processing functions.
Illustratively, when an interaction of the user pressing a key disposed on the remote controller 100A, or an interaction of touching a touch panel disposed on the remote controller 100A, is detected, the controller 110 may generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
The memory 120 stores various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by the user.
The communicator 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. For example: the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive a signal transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example: when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. For another example: when the radio frequency signal interface is used, a user input command needs to be converted into a digital signal, which is then modulated according to the radio frequency control signal modulation protocol and transmitted to the display device 200 through the radio frequency transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that the user can input user instructions for controlling the display device 200 to the control apparatus 100 through voice, touch, gesture, pressing, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display device 200, or outputs an image or voice signal received from the display device 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150, and present the output signal in the form of an image on the display 154, in the form of audio on the sound output interface 153, or in the form of vibration on the vibration interface 152.
The power supply 160 provides operating power support for the elements of the control apparatus 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
Fig. 1C exemplarily illustrates a hardware configuration block diagram of the display device 200. As shown in fig. 1C, the display device 200 may include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio output interface 285, and a power supply 290.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, the audio/video signal carried in the frequency of the television channel selected by the user, as well as additional information (e.g., EPG data).
The tuner demodulator 210 is responsive to the frequency of the television channel selected by the user and the television signal carried by that frequency, as selected by the user and controlled by the controller 250.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and it can demodulate analog signals and digital signals according to the kind of the received television signal.
In other exemplary embodiments, the tuner demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display device 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from an external device connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a bluetooth communication protocol module 222, and a wired ethernet communication protocol module 223, so that the communicator 220 may receive a control signal of the control apparatus 100 under the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, and the like.
The detector 230 is a component of the display device 200 for collecting signals of the external environment or interactions with the outside. The detector 230 may include an image collector 231, such as a camera or a video camera, which may be used to collect the external environment scene so as to adaptively change the display parameters of the display device 200, and to acquire user attributes or interaction gestures with the user, so as to realize the interaction between the display device and the user. A light receiver 232 may also be included to collect the ambient light intensity so as to adapt the display parameters of the display device 200, and so on.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor; for example, by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, when the temperature is higher, the display device 200 may be adjusted to display a cooler image color temperature; when the temperature is lower, the display device 200 may be adjusted to display a warmer image color temperature.
In some other exemplary embodiments, the detector 230 may further include a sound collector, such as a microphone, which may be configured to receive the user's sound, for example a voice signal of a control instruction for controlling the display device 200; alternatively, ambient sounds may be collected to identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
The external device interface 240 is a component that allows the controller 250 to control data transmission between the display device 200 and external devices. The external device interface 240 may be connected to external devices such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving images), an audio signal (e.g., music), additional information (e.g., EPG), etc. from the external devices.
The external device interface 240 may include: a high definition multimedia interface (HDMI) terminal 241, a composite video blanking sync (CVBS) terminal 242, an analog or digital component terminal 243, a universal serial bus (USB) terminal 244, a component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to user operations by running various software control programs (such as an operating system and various application programs) stored in the memory 260.
As shown in fig. 1C, the controller 250 includes a random access memory (RAM) 251, a read only memory (ROM) 252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM 251, the ROM 252, the graphics processor 253, and the CPU processor 254 are connected to each other through the communication bus 256 and the communication interface 255.
The ROM 252 stores various system boot instructions. When the display device 200 is powered on upon receiving the power-on signal, the CPU processor 254 executes the system boot instructions in the ROM 252, copies the operating system stored in the memory 260 to the RAM 251, and starts running the operating system. After the start of the operating system is completed, the CPU processor 254 copies the various application programs in the memory 260 to the RAM 251 and then starts running the various application programs.
The graphics processor 253 generates screen images of various graphic objects such as icons, images, and operation menus. The graphics processor 253 may include an operator, which performs operations by receiving various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which generates the various objects based on the operator and displays the rendered result on the display 275.
The CPU processor 254 executes the operating system and application program instructions stored in the memory 260 and, according to the received user input instructions, executes the processing of various application programs, data and contents so as to finally display and play various audio-video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor performs some initialization operations of the display device 200 in the display device preload mode and/or operations of displaying a screen in the normal mode. The one or more sub-processors perform operations in states such as the standby mode of the display device.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display device 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
The object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying the hyperlinked page, document, image, or the like, or an operation of launching the program corresponding to an icon. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch panel, etc.) connected to the display device 200, or a voice command corresponding to speech uttered by the user.
The memory 260 stores various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory, and the term "memory" includes the memory 260, the RAM 251 and the ROM 252 of the controller 250, and a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing the operating program that drives the controller 250 of the display device 200; storing various application programs built into the display device 200 or downloaded by the user from external devices; and storing data used to configure the various GUIs provided by the display 275, such as visual effect images, various objects related to the GUIs, and the selector for selecting GUI objects.
In some embodiments, the memory 260 is specifically configured to store the drivers and related data for the tuner demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, and the like, as well as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, and the like) received from the user interface.
In some embodiments, the memory 260 specifically stores software and/or programs representing an operating system (OS), which may include, for example: a kernel, middleware, an application programming interface (API), and/or application programs. Illustratively, the kernel may control or manage system resources and the functions implemented by other programs (such as the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow the middleware, APIs, or applications to access the controller so as to control or manage system resources.
Fig. 1D illustrates a block diagram of the architectural configuration of the operating system in the memory of the display device 200. The operating system architecture comprises, from top to bottom, an application layer, a middleware layer and a kernel layer.
The application layer: the application programs built into the system and the non-system-level application programs belong to the application layer and are responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a NETFLIX application, a settings application, a media center application, and the like. These applications may be implemented as Web applications that execute based on a WebKit engine, and in particular may be developed and executed based on HTML, cascading style sheets (CSS), and JavaScript.
Here, HTML (HyperText Markup Language) is a standard markup language for creating web pages; it describes web pages by markup tags, where the HTML tags are used to describe characters, graphics, animation, sound, tables, links, etc. A browser reads an HTML document, interprets the content of the tags in the document, and displays it in the form of a web page.
CSS (Cascading Style Sheets) is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. A CSS style can be stored directly in the HTML web page or in a separate style file, so that the styles in the web page can be controlled.
JavaScript is a language applied to web page programming; it can be inserted into an HTML page and is interpreted and executed by the browser. The interaction logic of a Web application is implemented by JavaScript. Through interfaces encapsulated by the browser, JavaScript can also communicate with the kernel layer.
The middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware related to communication with external devices, middleware providing the browser environment in which each application program in the display device runs, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, such as: a display driver for the display, a camera driver for the camera, a key driver for the remote controller, a WiFi driver for the WIFI module, an audio driver for the audio output interface, a power management driver for the power management (PM) module, and so on.
The user interface 265 receives various user interactions. Specifically, it is used to transmit input signals from the user to the controller 250, or to transmit output signals from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user interface 265, and the input signal is then forwarded to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data processed by the controller 250 and output through the user interface 265, and display the received output signal or output it in audio or vibration form.
In some embodiments, the user may enter user commands on a graphical user interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of the selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by making a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through a sensor.
The video processor 270 is configured to receive an external video signal and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a video signal that can be directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream; for example, for an input MPEG-2 stream (a compression standard for digital storage media moving images and sound), the demultiplexing module separates it into a video signal and an audio signal.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting an input 60 Hz frame rate into 120 Hz or 240 Hz; a common approach is frame interpolation.
The display formatting module converts the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example converting the format of the signal output by the frame rate conversion module so as to output an RGB data signal.
The display 275 receives the image signal output by the video processor 270 and displays video, images and the menu manipulation interface. For example, the display may display video from the broadcast signal received by the tuner demodulator 210, may display video input from the communicator 220 or the external device interface 240, and may display images stored in the memory 260. The display 275 also displays the user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
The display 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image. Alternatively, provided the display 275 is a projection display, it may include a projection device and a projection screen.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to the standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification to obtain an audio signal that can be played by the speaker 286.
Illustratively, the audio processor 280 may support various audio formats, such as MPEG-2, MPEG-4, advanced audio coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 receives the audio signal output by the audio processor 280. For example, the audio output interface may output audio in the broadcast signal received via the tuner demodulator 210, may output audio input via the communicator 220 or the external device interface 240, and may output audio stored in the memory 260. The audio output interface 285 may include a speaker 286, or an external audio output terminal 287, such as an earphone output terminal, that outputs to a sound-producing device of an external apparatus.
In other exemplary embodiments, the video processor 270 may comprise one or more chips, and the audio processor 280 may also comprise one or more chips.
In other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips, or may be integrated with the controller 250 in one or more chips.
The power supply 290 supplies power to the display device 200 from the power input from the external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display device 200 or may be a power supply installed outside the display device 200.
Note that an item refers to a visual object displayed in the GUI of the display device 200 to represent corresponding content, such as an icon, a thumbnail, a video clip, a link, and the like.
The presentation forms of items are diverse. For example, an item may include text content and/or an image for displaying a thumbnail related to the text content, or a video clip related to the text. As another example, an item may be the text and/or icon of an application.
It is further noted that the selector is used to indicate that an item has been selected, for example a focus object. In one manner, the movement of the focus object displayed in the display device 200 may be controlled, according to the user's input through the control apparatus 100, to select or control an item. For example: the user may select and control items by using the direction keys on the control apparatus 100 to move the focus object between items. In another manner, the movement of the items displayed in the display device 200 may be controlled, according to the user's input through the control apparatus 100, so that the focus object selects or controls an item. For example: the user can use the direction keys on the control apparatus 100 to make the items move left and right together while the position of the focus object remains unchanged, so that the focus object selects and controls different items. The forms in which the selector is identified are also diverse.
For example, as in fig. 2, the position of the focus object 42 may be identified by changing the border line, size, color, transparency, outline and/or font of the text or image of the focused item, and the position of the focus object 42 may also be identified by setting the background color of the item.
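To make the two focus-movement manners above concrete, the following sketch shows, in Java, how a direction-key event from the control apparatus might either move the focus object or move the items under a fixed focus. This is only an illustration of the concept; the class and method names (FocusModel, onDirectionKey) are hypothetical and are not taken from the patent.

```java
// Illustrative sketch only; class and method names are assumptions.
import java.util.Collections;
import java.util.List;

public class FocusModel {

    public static class Item {
        final String title;
        boolean highlighted;
        Item(String title) { this.title = title; }
    }

    private final List<Item> items;
    private int focusIndex = 0;            // index of the item the selector (focus object) points at
    private final boolean moveFocusMode;   // true: move the focus object; false: move the items under a fixed focus

    public FocusModel(List<Item> items, boolean moveFocusMode) {
        this.items = items;
        this.moveFocusMode = moveFocusMode;
    }

    /** Handle a direction key from the control apparatus: -1 for up/left, +1 for down/right. */
    public void onDirectionKey(int direction) {
        if (moveFocusMode) {
            // First manner: the focus object itself moves between the items.
            focusIndex = Math.max(0, Math.min(items.size() - 1, focusIndex + direction));
        } else {
            // Second manner: the items move together while the focus position stays unchanged.
            Collections.rotate(items, -direction);
        }
        // Identify the focused item, e.g. by border or background color (cf. focus object 42 in fig. 2).
        for (int i = 0; i < items.size(); i++) {
            items.get(i).highlighted = (i == focusIndex);
        }
    }
}
```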
Figs. 2-5 illustrate schematic views of GUIs provided by the display device 200.
When a user wishes to view the menu (i.e., the menu information) of the display device, the user may press the "menu" button on the control apparatus. In response to this user-entered instruction indicating that the menu should be displayed, as shown in fig. 2, the display device may provide a GUI 400 to the display, the GUI 400 comprising a menu in which one or more different items are arranged.
For example, the current channel is local media. The menu 41 includes items 411-419, where items 411-418 are menu items of functions matching the type of the current channel; for example, these menu items may be an image mode, a sound mode, a repeat playing mode, video source information, and the like. Item 419 is a menu item entry indicating the system general functions; for example, the menu item entry may be "settings". The GUI also includes a selector 42 indicating that an item is selected. The position of the selector in the GUI may be moved by the user's input through the control apparatus. For example, the selector 42 indicates that item 411 in the menu 41 is selected.
In order for the user to understand more clearly the contents of the menu item entry indicating the system general functions: in fig. 2, when the user moves the selector to item 419 through the control apparatus and presses the confirmation key on the control apparatus (or, alternatively, simply when the user moves the selector to item 419 without pressing the confirmation key), then, as shown in fig. 3, the display device displays the GUI 500 associated with the menu item entry in response to the input instruction to display the menu items of the system general functions. The GUI 500 includes the items under the menu item entry, for example items 4191-4197; specifically, the items may be channels, networks, accounts, help, and the like. The GUI 500 also includes the selector 42 indicating that an item is selected; for example, the selector 42 indicates that item 4191 is selected.
Illustratively, the menu items of functions matched with the type of the current channel and the menu item entry for indicating the system general functions are arranged on the display interface from top to bottom and are positioned at the right side of the whole display interface. Of course, the embodiments of the present application do not fix the positions of the menu items of functions matched with the type of the current channel or of the menu item entry for indicating the system general functions.
In other embodiments, when the user presses the "menu" button on the control apparatus, the display device may display a menu in response to the user-entered instruction indicating that the menu is to be displayed. As shown in fig. 4, the display device may provide a GUI to the display, the GUI including a menu in which one or more different items are arranged.
For example, items 411-417, where items 411-416 are menu items for indicating the system general functions, which may be channels, networks, accounts, help, etc.; item 417 is a menu item entry for the functions matching the type of the current channel, for example the menu item entry may be "options". The GUI also includes the selector 42 indicating that an item is selected. The position of the selector in the GUI may be moved by the user's input through the control apparatus; for example, the selector 42 indicates that item 411 in the menu 41 is selected.
To see the specific contents of the menu items of functions matching the type of the current channel: in fig. 4, when the user moves the selector to item 417 through the control apparatus, then, as shown in fig. 5, the display device displays the GUI associated with the menu items of functions matching the type of the current channel, in response to the input instruction to display those menu items. Here the type of the current channel is DTV; as shown in fig. 5, the GUI includes the menu items of functions matching DTV, such as items 4171-4175; specifically, the items may be channel, parental lock, audio language, subtitle, etc.
The embodiments of the application provide two implementation schemes of the menu display method.
The contents of the first implementation are specifically as follows:
Pressing a preset key, such as the menu key, directly invokes the menu. The several items displayed at the front of the menu are the commonly used functions under the current channel and the functions specific to the current channel, and the last item displayed in the menu is a menu item entry for indicating the system general functions. When the user selects this menu item entry, the settings menu of the whole machine can be invoked. With this scheme, the user can quickly locate a designated function, which to a certain extent reduces the steps of operating the control apparatus; the control apparatus may, for example, be a remote controller.
The flow chart of this scheme is shown in fig. 6:
The menu is implemented using a List, which contains several sets of display data: a set of independent data specific to each channel, and the common menu item entry data. The independent data under each channel is configured in a configuration file; the configuration file can be an xml file, and loading is performed according to the title in the xml file by parsing the content of the xml file level by level. Here, the List is one of the many containers provided by the class library. In the embodiments of the present application, besides a List, the menu may also be implemented in other ways, for example with a grid control, SharedPreferences, and the like.
When the user presses the menu key through the control apparatus, the current channel is determined;
if the current channel is the DTV channel, the independent data and the menu entry data of the DTV channel are loaded;
if the current channel is the local media channel, the independent data and the menu entry data of the local media channel are loaded;
if the current channel is a third-party application channel, the independent data and the menu entry data of the third-party application channel are loaded;
according to the loaded data, the corresponding items are displayed; an exemplary user interface display is shown in fig. 7.
By selecting the menu item entry, the user can call up the general system function menu of the whole machine.
In the embodiments of the application, the display device is provided with a plurality of channels, and the user can watch videos from different sources through different channels. In some embodiments, the type of channel comprises HDMI (high definition multimedia interface), DTV, local media, or a third-party application. The video source of the local media channel can be obtained, for example, by locally inserting a USB flash disk, and the video source of a third-party application channel is obtained by installing the third-party application on the display device. The embodiments of the present application are not limited to the above-mentioned channels, and the method can also be applied to other channels. A sketch of this channel check follows.
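Building on the MenuLoader sketch above, a minimal sketch of the channel check performed when the menu key is pressed might look as follows. The channel-type names, the configuration path and the onMenuKeyPressed/showMenu methods are hypothetical and only illustrate the dispatch described in the flow above.

```java
// Illustrative sketch only; method names, constants and the config path are assumptions.
import java.util.List;

public class MenuController {

    public enum ChannelType { DTV, LOCAL_MEDIA, THIRD_PARTY_APP, HDMI }

    private final String configPath = "/system/etc/menu_config.xml"; // hypothetical path

    /** Called when the user presses the menu key on the control apparatus. */
    public void onMenuKeyPressed(ChannelType currentChannel) throws Exception {
        List<MenuLoader.MenuItem> menu;
        switch (currentChannel) {
            case DTV:
                menu = MenuLoader.loadMenu("dtv", configPath);          // DTV-specific items + common entry
                break;
            case LOCAL_MEDIA:
                menu = MenuLoader.loadMenu("local_media", configPath);  // local-media items + common entry
                break;
            case THIRD_PARTY_APP:
                menu = MenuLoader.loadMenu("third_party", configPath);  // third-party-app items + common entry
                break;
            default:
                menu = MenuLoader.loadMenu("common", configPath);       // other channels: common data only
        }
        showMenu(menu); // display the loaded menu items, e.g. at the right side of the screen
    }

    private void showMenu(List<MenuLoader.MenuItem> menu) {
        menu.forEach(item -> System.out.println(item.title)); // placeholder for the actual GUI rendering
    }
}
```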
The second implementation is specifically as follows:
The user presses a preset key, for example the menu key, to call up the menu. The menu is displayed with the same layout in every channel, but the data shown in it differs from channel to channel. The first part is the common data part, which is displayed identically in every channel; the last item is set as a menu item entry for the special functions. The items under this entry are the functions specific to each channel, and clicking the menu item entry of the special functions displays these special functions.
The advantage of this scheme is that the menu user interfaces under all channels of the whole machine are displayed consistently and are not scattered, functional redundancy is avoided, memory occupation can be reduced, and stuttering can be reduced, especially when playing high-definition video.
The flow chart of this implementation is shown in fig. 8:
The menu is implemented using a List, which contains several sets of display data: a default set of common data, and the data under each channel, which is placed under the menu item entry item for the special functions. The data format of the common data and of the special-function menu item entries under each channel is similar to that of the first scheme: they are configured in an xml file, and the corresponding channel data is loaded for each channel. A sketch of this data model follows.
When the user presses the menu key through the control apparatus, it is determined which channel the display device is currently under.
If the current channel is the DTV channel, it is determined whether a signal currently exists. If there is no signal, only the common data part is displayed, as shown in fig. 10; if a signal exists, the information corresponding to the current signal is acquired and filled into the menu item entry of the special functions, and the common part together with the menu item entry of the special functions is displayed, as shown in fig. 9.
If the current channel is the local media playing channel, it is determined whether the current video source can be played. If the current video source cannot be played normally, only the common data part is displayed; if it can be played, the source information of the current video source is acquired and filled into the menu item entry of the special functions, and the common part together with the menu item entry of the special functions is displayed.
If the current channel is a third-party application, the common part and the menu item entry for the special functions of the third-party application are displayed directly.
When the user selects the menu item entry of the special functions, the menu items of the special functions can be entered. A sketch of this per-channel check follows.
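The per-channel check described above could be sketched as follows, reusing the hypothetical CommonFirstMenu, MenuController.ChannelType and MenuLoader names from the earlier sketches. The hasDtvSignal and canPlayCurrentSource helpers are placeholders standing in for whatever the device actually exposes; none of these names come from the patent.

```java
// Illustrative sketch only; the helper methods and names are assumptions.
public class CommonFirstMenuController {

    private final CommonFirstMenu menu;

    public CommonFirstMenuController(CommonFirstMenu menu) { this.menu = menu; }

    /** Called when the menu key is pressed; decides whether the special-function entry is shown. */
    public void onMenuKeyPressed(MenuController.ChannelType channel) throws Exception {
        String configPath = "/system/etc/menu_config.xml"; // hypothetical path
        switch (channel) {
            case DTV:
                if (hasDtvSignal()) {
                    // Fill the special-function entry with information about the current signal (cf. fig. 9).
                    menu.fillSpecialFunctions(MenuLoader.loadMenu("dtv", configPath));
                } else {
                    menu.fillSpecialFunctions(java.util.List.of()); // no signal: common part only (cf. fig. 10)
                }
                break;
            case LOCAL_MEDIA:
                if (canPlayCurrentSource()) {
                    menu.fillSpecialFunctions(MenuLoader.loadMenu("local_media", configPath));
                } else {
                    menu.fillSpecialFunctions(java.util.List.of()); // unplayable source: common part only
                }
                break;
            case THIRD_PARTY_APP:
                menu.fillSpecialFunctions(MenuLoader.loadMenu("third_party", configPath));
                break;
            default:
                menu.fillSpecialFunctions(java.util.List.of());
        }
        render(menu.displayedItems());
    }

    private boolean hasDtvSignal() { return true; }          // placeholder for the real signal check
    private boolean canPlayCurrentSource() { return true; }  // placeholder for the real playability check
    private void render(java.util.List<MenuLoader.MenuItem> items) {
        items.forEach(i -> System.out.println(i.title));     // placeholder for the actual GUI rendering
    }
}
```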
In the above embodiments, in response to an instruction input by a user to display a menu, the type of the current channel is determined and menu information is displayed on the display; the menu information comprises menu items of functions matched with the type of the current channel and a menu item entry used for indicating the system general functions; and when the selector is controlled, based on user input, to select the menu item entry, the menu items of the system general functions are displayed on the display.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (9)

CN202010468548.4A; Priority Date: 2020-05-28; Filing Date: 2020-05-28; Title: Menu display method and display equipment; Status: Pending; Publication: CN111614995A (en)

Priority Applications (1)

Application Number: CN202010468548.4A; Priority Date: 2020-05-28; Filing Date: 2020-05-28; Title: Menu display method and display equipment

Applications Claiming Priority (1)

Application Number: CN202010468548.4A; Priority Date: 2020-05-28; Filing Date: 2020-05-28; Title: Menu display method and display equipment

Publications (1)

Publication Number: CN111614995A (en); Publication Date: 2020-09-01

Family

ID=72202274

Family Applications (1)

Application Number: CN202010468548.4A; Title: Menu display method and display equipment; Priority Date: 2020-05-28; Filing Date: 2020-05-28; Status: Pending; Publication: CN111614995A (en)

Country Status (1)

Country: CN; Publication: CN111614995A (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication Number; Priority Date; Publication Date; Assignee; Title
US6266098B1 (en)*; 1997-10-22; 2001-07-24; Matsushita Electric Corporation of America; Function presentation and selection using a rotatable function menu
CN101304497A (en)*; 2008-06-06; 2008-11-12; Shenzhen Skyworth-RGB Electronic Co Ltd; Display method of TV menu interface under different signal source channels
CN101763268A (en)*; 2010-01-22; 2010-06-30; Huizhou TCL Mobile Communication Co Ltd; Method for dynamic adjustment of menu structure of electric equipment
US20120060115A1 (en)*; 2010-09-03; 2012-03-08; Samsung Electronics Co Ltd; Method for setting function and display apparatus applying the same
CN103778765A (en)*; 2012-10-25; 2014-05-07; Huawei Device Co Ltd; Remote control method, remote control equipment and main equipment
CN105792017A (en)*; 2016-03-04; 2016-07-20; Qingdao Hisense Electric Co Ltd; Terminal system parameter setting method and terminal system parameter setting device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication Number; Priority Date; Publication Date; Assignee; Title
CN113434240A (en)*; 2021-07-21; 2021-09-24; Hisense Visual Technology Co Ltd; Display method and display device of image mode
CN113434240B (en)*; 2021-07-21; 2022-09-09; Hisense Visual Technology Co Ltd; Display method and display device of image mode
CN116347180A (en)*; 2021-12-23; 2023-06-27; Qingdao Hisense Media Network Technology Co Ltd; A display device and a display method of a media file display list
CN115914766A (en)*; 2022-11-02; 2023-04-04; Vidaa International Holding (Netherlands) Co; Display device and method for displaying menu on game picture
CN115914766B (en)*; 2022-11-02; 2025-07-11; Vidaa International Holding (Netherlands) Co; Display device and method for displaying menu on game screen


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2020-09-01)
