CN111541929A - Multimedia data display method and display equipment - Google Patents

Multimedia data display method and display equipment

Info

Publication number
CN111541929A
Authority
CN
China
Prior art keywords: asset data, media asset, page, display, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010319306.9A
Other languages
Chinese (zh)
Inventor
Zhang Xin (张欣)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vidaa Netherlands International Holdings BV
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd
Priority to CN202010319306.9A
Priority to PCT/CN2020/101153 (WO2021212667A1)
Publication of CN111541929A
Legal status: Pending

Abstract

Embodiments of the invention relate to the field of display technology, and in particular to a multimedia asset data display method and a display device, which reduce how often a user must scroll a page back and forth to select the media asset data to watch when a category contains too many items, making it easier for the user to find the desired content and improving the viewing experience. The method comprises: in response to a control instruction input by a user indicating display of an application home page, displaying the application home page on a display, wherein the application home page comprises a plurality of media asset data display areas divided by category, the media asset data are distributed by category, and any display area whose media asset data exceed a quantity threshold further comprises secondary page entry data; in response to a control instruction in which the user indicates selection of the secondary page entry data, jumping from the application home page to a category-specific page, wherein the category-specific page comprises all the media asset data corresponding to the display area where the secondary page entry data is located.

Description

Multimedia data display method and display equipment
Technical Field
The present invention relates to the field of display technologies, and in particular, to a multimedia data display method and a display device.
Background
Currently, a user can obtain various media resources (referred to as media assets for short) from the applications of a smart television.
When an existing smart television displays media assets, it groups them into display regions by category. To view assets of a given category, the user browses within that category's region. When a category region holds many items, the user must repeatedly move the focus and scroll the page in different directions to find the desired asset. Because the display area of the display device is limited, a category with too many media assets forces the user to scroll the page back and forth repeatedly before selecting what to watch, which degrades the user experience.
Disclosure of Invention
Embodiments of the invention provide a multimedia asset data display method and a display device, which reduce how often a user must scroll a page back and forth to select the media asset data to watch when a category contains too many items, making it easier for the user to find the desired media asset data and improving the viewing experience.
In a first aspect, a multimedia asset data display method applied to a display device is provided, comprising: in response to a control instruction input by a user indicating display of an application home page, displaying the application home page on a display, wherein the application home page comprises a plurality of media asset data display areas divided by category, the media asset data are distributed into the display areas according to their categories, and any display area whose media asset data exceed a quantity threshold further comprises secondary page entry data;
in response to a control instruction in which the user indicates selection of the secondary page entry data, jumping from the application home page to a category-specific page, wherein the category-specific page comprises all the media asset data corresponding to the display area where the secondary page entry data is located.
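As a concrete illustration of the first aspect, the home-page construction and the jump logic can be sketched as follows. This is a minimal sketch under assumed names (`buildHomePage`, `onTileSelected`, `QUANTITY_THRESHOLD`, the tile/page shapes); none of these identifiers come from the patent itself.

```javascript
// Each category row shows at most QUANTITY_THRESHOLD assets; larger
// categories additionally get a "secondary page entry" tile whose
// selection jumps to a category-specific page holding the full list.
const QUANTITY_THRESHOLD = 8; // illustrative value, not from the patent

function buildHomePage(categories) {
  // categories: array of { name: string, assets: object[] }
  return categories.map(({ name, assets }) => {
    const row = { category: name, tiles: assets.slice(0, QUANTITY_THRESHOLD) };
    if (assets.length > QUANTITY_THRESHOLD) {
      // Secondary page entry data, appended as the last tile of the row.
      row.tiles.push({ type: "SECONDARY_ENTRY", category: name });
    }
    return row;
  });
}

function onTileSelected(tile, allCategories) {
  if (tile.type === "SECONDARY_ENTRY") {
    const { assets } = allCategories.find(c => c.name === tile.category);
    // Jump from the home page to the category-specific page,
    // which lists *all* media assets of that category.
    return { page: "categorySpecial", category: tile.category, assets };
  }
  return { page: "playback", asset: tile };
}
```

A category of ten assets thus renders as eight tiles plus one entry tile on the home page, and selecting the entry tile yields all ten on the category-specific page.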
In a second aspect, a display device is provided, comprising: a display for displaying a page;
a user interface for receiving instructions input by a user;
a controller configured to perform:
in response to a control instruction input by a user indicating display of an application home page, displaying the application home page on the display, wherein the application home page comprises a plurality of media asset data display areas divided by category, the media asset data are distributed into the display areas according to their categories, and any display area whose media asset data exceed a quantity threshold further comprises secondary page entry data;
in response to a control instruction in which the user indicates selection of the secondary page entry data, jumping from the application home page to a category-specific page, wherein the category-specific page comprises all the media asset data corresponding to the display area where the secondary page entry data is located.
In the above embodiment, the application home page is displayed on the display in response to a control instruction input by the user indicating display of the application home page. The application home page comprises a plurality of media asset data display areas divided by category, and media asset data of different categories are distributed into the corresponding display areas. Each display area is provided with a preset quantity threshold, and any display area whose media asset data exceed the threshold further comprises secondary page entry data. In response to a control instruction in which the user indicates selection of the secondary page entry data, a jump is made from the application home page to the category-specific page for that category, which contains all the media asset data of the category. In this way, even when a category contains a large number of media assets, the number of times the user must scroll the page back and forth to select the content to watch is reduced, making it easier to find the desired media asset data and improving the viewing experience.
Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present invention; a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1A is a schematic diagram illustrating an operation scenario between a display device and a control apparatus;
Fig. 1B is a block diagram schematically illustrating a configuration of the control apparatus 100 in Fig. 1A;
Fig. 1C is a block diagram schematically illustrating a configuration of the display device 200 in Fig. 1A;
Fig. 1D is a block diagram illustrating an architectural configuration of an operating system in the memory of the display device 200;
Fig. 1E schematically shows a control device 100 implemented as a remote control;
Figs. 2-8 are schematic diagrams illustrating a GUI 400 provided by the display apparatus 200 through operation of the control device 100;
Fig. 9 is a flowchart illustrating a multimedia asset data display method in the display apparatus;
Fig. 10 is a schematic diagram illustrating the implementation logic of a multimedia asset data display method in a display device.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Fig. 1A is a schematic diagram illustrating an operation scenario between a display device and a control apparatus. As shown in Fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives operation instructions input by a user and converts them into instructions that the display device 200 can recognize and respond to, serving as an intermediary between the user and the display device 200. For example, when the user operates the channel up/down keys on the control apparatus 100, the display device 200 responds with a channel up/down operation.

The control apparatus 100 may be a control apparatus 100A, which communicates via infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods, and controls the display device 200 wirelessly or by wire. The user may input user instructions through keys on the control apparatus, voice input, control panel input, and the like, to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power key, and so on, to control the display apparatus 200.

The control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, or a notebook computer. For example, the display device 200 may be controlled by an application program running on the smart device, which can be configured to present various controls to the user through an intuitive user interface (UI) on the smart device's screen.

For example, the mobile terminal 100B may install a software application matched with the display device 200 to establish connection and communication through a network communication protocol, for one-to-one control operation and data communication. For instance, the mobile terminal 100B may establish a control instruction protocol with the display device 200, so that operating the various function keys or virtual buttons of the user interface provided on the mobile terminal 100B performs the functions of the physical keys arranged on the control apparatus 100A. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200 to implement a synchronized display function.
The display apparatus 200 may provide a network television function combining a broadcast receiving function with computer support functions. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.

The display device 200 may be a liquid crystal display, an organic light-emitting display, or a projection device; the specific display device type, size, and resolution are not limited.

The display apparatus 200 also performs data communication with the server 300 through various communication means, and may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 300 may provide various contents and interactions to the display apparatus 200. For example, the display device 200 may send and receive information: receiving electronic program guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The server 300 may be one or more groups of servers, of one or more types; it also provides other web service contents such as video on demand and advertisement services.
Fig. 1B is a block diagram illustrating the configuration of the control device 100. As shown in Fig. 1B, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.

The controller 110 includes a random access memory (RAM) 111, a read-only memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 controls the operation of the control device 100, the cooperative communication among its internal components, and external and internal data processing functions.

For example, when an interaction is detected in which the user presses a key disposed on the control apparatus 100A, or touches a touch panel disposed on the control apparatus 100A, the controller 110 may generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.

The memory 120 stores various operation programs, data, and applications for driving and controlling the control apparatus 100 under the control of the controller 110, and may store various control signal commands input by the user.

The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. When the infrared signal interface is used, a user input instruction is converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared transmitting module. As another example, when the radio frequency signal interface is used, a user input command is converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.

The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and the like, so that the user can input instructions for controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, pressing, and so on.

The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received from the display apparatus 200. The output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the control device 100A may receive an output signal such as audio, video, or data from the output interface 150 and present it as an image on the display 154, as audio through the sound output interface 153, or as vibration through the vibration interface 152.

The power supply 160 provides operating power for the elements of the control device 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is illustrated in Fig. 1C. As shown in Fig. 1C, the display apparatus 200 may include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio output interface 285, and a power supply 290.

The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulates, from the plurality of wireless or wired broadcast television signals, the audio/video signal carried on the frequency of the television channel selected by the user, together with additional information (e.g., EPG data).

The tuner demodulator 210 responds to the frequency of the television channel selected by the user and the television signal carried on that frequency, under the control of the controller 250.

The tuner demodulator 210 can receive television signals in various ways according to the broadcasting system, such as terrestrial broadcasting, cable broadcasting, wireless network broadcasting, or internet broadcasting; it can adopt a digital or analog modulation mode according to the modulation type; and it can demodulate analog or digital signals according to the kind of television signal received.

In other exemplary embodiments, the tuner demodulator 210 may also be located in an external device, such as an external set-top box. In this case, the set-top box outputs a television signal after modulation and demodulation, which is input into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various types of communication protocols. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from such a device. The communicator 220 may include network communication protocol modules or near-field communication protocol modules, such as a WiFi module 221, a Bluetooth communication protocol module 222, and a wired Ethernet communication protocol module 223, so that the communicator 220 can, under the control of the controller 250, receive control signals of the control device 100 implemented as WiFi signals, Bluetooth signals, radio frequency signals, and the like.
The detector 230 is the component of the display apparatus 200 that collects signals from the external environment or from interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may receive the user's sound, for example a voice signal carrying a control instruction for the display device 200; alternatively, it may collect ambient sound to identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.

In some other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or video camera, which may collect the external environment scene so as to adaptively change the display parameters of the display device 200, and may acquire user attributes or interactive gestures to enable interaction between the display device and the user.

In some other exemplary embodiments, the detector 230 may further include a light receiver for collecting the ambient light intensity so that the display parameters of the display device 200 can adapt to it.

In some other exemplary embodiments, the detector 230 may further include a temperature sensor. By sensing the ambient temperature, the display device 200 can adaptively adjust the display color temperature of the image: when the temperature is higher, the display apparatus 200 may be adjusted to display a cooler image color temperature; when the temperature is lower, it may be adjusted to display a warmer one.
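The temperature-adaptive behaviour above can be expressed as a simple mapping. The thresholds and Kelvin values below are invented for illustration and are not specified by the patent.

```javascript
// Map ambient temperature to a display colour temperature, following the
// adaptive adjustment described above. Breakpoints and Kelvin values are
// illustrative assumptions, not values from the patent.
function colorTemperatureFor(ambientCelsius) {
  if (ambientCelsius >= 28) return 7500; // warm room: cooler (bluer) image
  if (ambientCelsius <= 15) return 5000; // cold room: warmer image
  return 6500;                           // neutral default
}
```

A production implementation would likely smooth the transition (e.g. hysteresis or interpolation) rather than switch at hard breakpoints.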
The external device interface 240 is a component through which the controller 250 controls data transmission between the display apparatus 200 and external devices. The external device interface 240 may be connected in a wired/wireless manner to external apparatuses such as a set-top box, a game device, or a notebook computer, and may receive data such as video signals (e.g., moving images), audio signals (e.g., music), and additional information (e.g., EPG) from the external apparatus.

The external device interface 240 may include: a High-Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to user operations by running the various software control programs (such as the operating system and various application programs) stored in the memory 260.

As shown in Fig. 1C, the controller 250 includes a random access memory (RAM) 251, a read-only memory (ROM) 252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM 251, the ROM 252, the graphics processor 253, and the CPU processor 254 are connected to one another via the communication interface 255 over the communication bus 256.

The ROM 252 stores various system boot instructions. When the display apparatus 200 receives the power-on signal and starts up, the CPU processor 254 executes the system boot instructions in the ROM 252, copies the operating system stored in the memory 260 into the RAM 251, and starts running the operating system. After the operating system has started, the CPU processor 254 copies the various application programs in the memory 260 into the RAM 251 and then starts them.

The graphics processor 253 generates various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It may include an operator that performs operations on the various interactive instructions input by the user and displays objects according to their display attributes, and a renderer that generates the various objects produced by the operator and displays the rendered result on the display 275.

The CPU processor 254 executes the operating system and application program instructions stored in the memory 260, and processes the various application programs, data, and contents according to the received user input instructions, so as to finally display and play the various audio-video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors: one main processor and one or more sub-processors. The main processor performs some initialization operations of the display apparatus 200 in the preload mode and/or displays the screen in the normal mode; the sub-processor(s) perform operations in states such as the standby mode of the display apparatus.

The communication interface 255 may include a first interface through an n-th interface. These interfaces may be network interfaces connected to external devices via a network.

The controller 250 may control the overall operation of the display apparatus 200. For example, in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform the operation related to the object selected by that command.

The object may be any selectable object, such as a hyperlink or an icon. The operation related to the selected object may be, for example, displaying the page, document, or image linked to by a hyperlink, or executing the program corresponding to the object. The user input command for selecting the GUI object may be entered through various input means connected to the display apparatus 200 (e.g., a mouse, keyboard, or touch panel), or may be a voice command corresponding to speech uttered by the user.
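The dispatch on the selected object's type might look like the following sketch; the object shape (`{ type, href, program }`) and the return values are assumptions for illustration, not part of the patent.

```javascript
// Perform the operation related to a selected GUI object, as described
// above: a hyperlink leads to the linked page/document/image, an icon
// executes its corresponding program. The object representation is
// hypothetical.
function performSelection(obj) {
  switch (obj.type) {
    case "hyperlink":
      return { action: "navigate", target: obj.href };   // display the linked content
    case "icon":
      return { action: "launch", program: obj.program }; // run the corresponding program
    default:
      return { action: "none" };                         // not a selectable object type
  }
}
```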
The memory 260 stores the various types of data, software programs, and applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory, and the term "memory" here covers the memory 260, the RAM 251 and ROM 252 of the controller 250, and any memory card in the display device 200.

In some embodiments, the memory 260 specifically stores the operating program for driving the controller 250 of the display device 200; the various application programs built into the display apparatus 200 or downloaded by the user from external devices; and data used to configure the various GUIs provided by the display 275, such as visual-effect images, various GUI-related objects, and selectors for selecting GUI objects.

In some embodiments, the memory 260 specifically stores drivers for the tuner demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, and so on, together with related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information) received through the user interface.

In some embodiments, the memory 260 specifically stores the software and/or programs constituting an operating system (OS), which may include, for example, a kernel, middleware, an application programming interface (API), and/or application programs. Illustratively, the kernel may control or manage system resources and the functions implemented by other programs (such as the middleware, APIs, or applications); at the same time, the kernel may provide an interface allowing the middleware, APIs, or applications to access the controller to control or manage system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in Fig. 1D. The operating system architecture comprises, from top to bottom, an application layer, a middleware layer, and a kernel layer.

The application layer contains the applications built into the system as well as non-system-level applications, and is responsible for direct interaction with the user. It may include a plurality of applications, such as a settings application, a live TV application, an on-demand application, and a media center application. These applications may be implemented as Web applications executing on a WebKit engine, and in particular may be developed and run based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
HTML, the HyperText Markup Language, is the standard markup language for creating web pages. It describes web pages by means of markup tags, which describe text, graphics, animation, sound, tables, links, and so on; a browser reads an HTML document, interprets the tags in the document, and displays the content as a web page.

CSS, Cascading Style Sheets, is a computer language for expressing the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. CSS styles can be stored directly in the HTML page or in a separate style file, enabling control over the styles in the page.

JavaScript is a language for web-page programming that can be inserted into an HTML page and interpreted and executed by the browser. The interaction logic of a Web application is implemented in JavaScript, and JavaScript can, through the browser, wrap a JavaScript extension interface to communicate with the kernel layer.
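The "JavaScript extension interface" mentioned above could be wrapped along these lines. The bridge object and its `dispatchKey` method are hypothetical (a real TV browser might expose such a bridge as, e.g., a `window`-level object); only the pattern of guarding and forwarding is being illustrated.

```javascript
// Sketch of a Web application calling into lower layers through a
// JavaScript extension interface packaged by the browser, as described
// above. The bridge object and its dispatchKey method are hypothetical.
function sendKeyToNative(bridge, keyCode) {
  if (bridge && typeof bridge.dispatchKey === "function") {
    bridge.dispatchKey(keyCode); // forward the key event toward the kernel layer
    return true;
  }
  return false; // no native bridge available (e.g. plain desktop browser)
}
```

The guard lets the same application code run both on the TV (bridge present) and in an ordinary development browser (bridge absent).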
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, such as: provide display driver for the display, provide camera driver for the camera, provide button driver for controlling means, provide wiFi driver for the WIFI module, provide audio driver for audio output interface, provide power management drive for Power Management (PM) module etc..
Auser interface 265 receives various user interactions. Specifically, it is used to transmit an input signal of a user to thecontroller 250 or transmit an output signal from thecontroller 250 to the user. For example, thecontrol device 100A may send an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to theuser interface 265, and then the input signal is forwarded to thecontroller 250 through theuser interface 265; alternatively, thecontrol device 100A may receive an output signal such as audio, video, or data output from theuser interface 265 via thecontroller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a graphical user page (GUI) displayed on thedisplay 275, and theuser interface 265 receives the user input commands through the GUI. Specifically, theuser interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command through a specific sound or gesture, and the user interface 265 recognizes the sound or gesture through a sensor in order to receive the command.
The video processor 270 is configured to receive an external video signal and perform video data processing, such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, according to the standard codec protocol of the input signal, so as to obtain a video signal that can be directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream. For example, for an input MPEG-2 stream (a compression standard for digital storage media moving images and audio), the demultiplexing module demultiplexes it into a video signal and an audio signal.
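As a rough illustration of the demultiplexing step (not the actual MPEG-2 demultiplexer, which parses packetized elementary stream headers; the "kind" tag below is a stand-in for that), a toy demultiplexer might split an interleaved packet stream by stream type:

```python
def demultiplex(packets):
    """Toy sketch: split an interleaved packet stream into separate
    video and audio streams. Each packet is a (kind, payload) pair."""
    streams = {"video": [], "audio": []}
    for kind, payload in packets:
        streams[kind].append(payload)
    return streams["video"], streams["audio"]

# Interleaved input, as a multiplexed stream would arrive.
video, audio = demultiplex([("video", "v0"), ("audio", "a0"), ("video", "v1")])
```

The separated video and audio signals would then be handed to the video decoding module and the audio processor respectively.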
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module superposes and mixes the GUI signal, input by the user or generated by a graphic generator, with the scaled video image to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting an input 60Hz video to a frame rate of 120Hz or 240Hz, commonly implemented using a frame interpolation method.
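The frame-rate doubling described above can be sketched with a simple blended-frame interpolation (a stand-in for the motion-compensated methods real displays use); a frame here is just a list of pixel values:

```python
def interpolate_frames(frames):
    """Double the frame rate (e.g. 60Hz -> 120Hz) by inserting a
    blended frame between each pair of consecutive input frames."""
    out = []
    for prev, nxt in zip(frames, frames[1:]):
        out.append(prev)
        # Inserted frame: per-pixel average of the two neighbours.
        out.append([(a + b) / 2 for a, b in zip(prev, nxt)])
    out.append(frames[-1])
    return out

# 3 input frames -> 5 output frames (an inserted frame between each pair).
doubled = interpolate_frames([[0, 0], [10, 20], [20, 40]])
```

A real module would operate on full image buffers and apply motion estimation rather than plain blending; the structure of the conversion is the same.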
The display formatting module converts the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example converting that signal into an output RGB data signal.
The display 275 receives the image signal output by the video processor 270 and displays video content, images, and the menu manipulation page. The displayed video content may come from the broadcast signal received by the tuner-demodulator 210, or from content input through the communicator 220 or the external device interface 240. The display 275 also displays the user manipulation page (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
The display 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image. Alternatively, if the display 275 is a projection display, it may include a projection device and a projection screen.
The audio processor 280 is configured to receive an external audio signal, decompress and decode it according to the standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification to obtain an audio signal that can be played by the speaker 286.
Illustratively, the audio processor 280 may support various audio formats, such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), and High-Efficiency AAC (HE-AAC).
The audio output interface 285 receives the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286, or an external sound output terminal 287, such as an earphone output terminal, for outputting to a sound-generating device of an external apparatus.
In other exemplary embodiments, the video processor 270 may comprise one or more chips. The audio processor 280 may also comprise one or more chips.
In other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips, or may be integrated with the controller 250 in one or more chips.
A power supply 290 supplies power to the display apparatus 200 from an external power input under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200.
It should be noted that, in order to select and execute the relevant functions (e.g., image setting, sound setting, etc.) required by the image content service provided by the display device 200, the display device 200 provides a plurality of menu items for selecting those functions. Meanwhile, as shown in fig. 1E, the control apparatus 100 configured to control the display device 200 is implemented as a remote controller including a plurality of number keys, channel +/- keys, volume +/- keys, an OK key, direction keys, color keys, shortcut application keys, and the like for selecting the menu items, so that by operating the keys on the control apparatus 100, the user can make the display device 200 execute the menu item function adapted to the corresponding key.
Here, the function information corresponding to each menu item has a mapping relationship with the key information on the control device 100. Specifically, when the display device 200 receives a key event value, input by the user, corresponding to a key on the control apparatus 100, the display device 200 may perform the function operation corresponding to that key event value based on the mapping relationship, so that the user can easily and visually match the keys on the control apparatus 100 with the functions of each option.
In this way, when menu item contents are provided on the display, the display device may receive the key event value produced when the user presses a key on the control apparatus 100 whose function the user wishes to execute, look up and execute the function corresponding to the received key event value from the mapping information, and thus let the user intuitively perform the desired function based on the correspondence between the keys on the control apparatus 100 and the option contents.
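The key-to-function mapping described above can be sketched as a simple dispatch table. The key event values and handler names below are illustrative assumptions, not the actual codes of any particular remote controller:

```python
# Hypothetical key event values; a real remote defines its own codes.
KEY_VOLUME_UP, KEY_VOLUME_DOWN = 0x18, 0x19

class MenuDispatcher:
    """Maps received key event values to the function operations
    the display device should perform (the mapping relationship)."""

    def __init__(self):
        self.volume = 10
        self.handlers = {
            KEY_VOLUME_UP: self.volume_up,
            KEY_VOLUME_DOWN: self.volume_down,
        }

    def volume_up(self):
        self.volume += 1

    def volume_down(self):
        self.volume -= 1

    def on_key_event(self, key_value):
        handler = self.handlers.get(key_value)
        if handler:  # keys with no mapped function are ignored
            handler()

d = MenuDispatcher()
d.on_key_event(KEY_VOLUME_UP)  # volume: 10 -> 11
```

The same table-lookup pattern extends naturally to the OK key, direction keys, and shortcut application keys described above.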
It should be noted that an option or item refers to a visual object displayed in the GUI provided by the display device to identify corresponding content, such as an icon, a thumbnail, or a link. The presentation forms of options are generally diversified. For example, an option may include text content and/or an image for displaying a thumbnail related to the text content. As another example, an option may be the text and/or icon of an application.
As shown in fig. 5, the application main page includes a plurality of pictures of the asset data and a "more" icon, which are selectable options.
It should be noted that a selector indicating that any one of the options is selected is also provided in the GUI displayed on the display, and the position of the selector in the GUI may be moved by the user operating the control device, in order to change the selection of different options.
The display form of the selector may be a focus object. The movement of the focus object displayed in the display apparatus 200 may be controlled, according to input from the user through the control device 100, to select or control items. For example, the user may select and control items by moving the focus object between items with the direction keys on the control device 100. The identification form of the focus object is not limited. For example, as shown in fig. 2, the position of the focus object may be realized or identified by setting the edge color of the option: the edge of an unfocused icon is gray, while the edge of the focused icon is light blue, and a light-blue aperture flashing at a certain frequency may further be provided at the edge of the icon. The position of the focus object may also be identified by changing the border line, size, transparency, outline, and/or font of the text or image of the focused item.
Fig. 2 to 8 are schematic views exemplarily showing a GUI 400 provided by the display apparatus 200 through operation of the control device 100.
As shown in fig. 2, in response to a control instruction input by the user indicating that the application home page should be displayed, the application home page 41 is displayed on the display. The instruction may be the user pressing a shortcut application key on the control device; the display device responds to the key input instruction by displaying the application home page corresponding to that shortcut key, for example a video application providing a plurality of media asset data for the user to select and view. The application main page comprises a plurality of asset data display areas divided by category, for example, as shown in fig. 2, two asset data display areas of different categories, 41-1 and 41-2. The application home page also includes other asset data display areas, which are not shown in fig. 2 due to the limited display area of the display device. The present embodiment is described with respect to only the asset data display areas 41-1 and 41-2.
The asset data display area 41-1 includes the asset classification name "Architecture and buildings", which indicates that the asset data included in the asset data display area 41-1 are all related to buildings. The asset data display area 41-1 further includes a plurality of pictures of asset data; fig. 2 shows only pictures A1–A4, with A4 shown only as a half picture. The user can set the maximum number of asset data displayed in the asset data display area, i.e., the quantity threshold of asset data. The embodiment shown in fig. 2 sets the quantity threshold to 3.5, i.e., complete pictures of the three asset data A1–A3 and a half picture of A4 are displayed.
The media asset data display area also includes quantity details of the current media asset data, displayed as a fraction. The numerator indicates the position, within the display area, of the media asset data on which the focus is located; the denominator indicates the total number of media asset data shown on the application main page in the display area where the focus is located. For example, as shown in fig. 2, the asset data display area 41-1 shows the quantity detail "1/5". Here, the numerator 1 indicates that the asset data A1, where the current focus is located, is in the first position among all asset data of the area 41-1. The denominator 5 indicates that 5 pieces of asset data are shown in the asset data display area 41-1 where the current focus is located.
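The numerator/denominator indicator described above is easy to express directly (a minimal sketch; the real UI presumably formats it as part of the display area's template):

```python
def quantity_detail(focus_position, total_shown):
    """Format the quantity detail of an asset data display area:
    numerator   = 1-based position of the focused item in the area,
    denominator = total number of items shown in the area."""
    return f"{focus_position}/{total_shown}"

# Focus on the first of five items in area 41-1 -> "1/5"
label = quantity_detail(1, 5)
```

When the focus later moves to the "more" icon at the end of the list, the same function yields the "5/5" label mentioned below.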
In the present embodiment, the total amount of asset data with the classification name "Architecture and buildings" is 10. With the display method of the prior art, as shown in fig. 3, all 10 asset data would be displayed in the asset data display area 41-1. However, since the display can show at most 3.5 asset data pictures at a time in the area, the user would need to repeatedly stretch the page back and forth when viewing and selecting asset data.
In the embodiment of the application, when at most 3.5 pieces of media asset data are displayed simultaneously in the media asset data display area, secondary page entry data is displayed at the last position of the asset data picture list: as shown in fig. 4, a "more" icon is displayed. The "more" icon indicates that the asset data display area 41-1 contains further asset data; the user need only stretch the page so that the focus falls on the "more" icon. As shown in fig. 5, the user stretches the page while moving the focus from the picture of asset data A1 to the "more" icon. Meanwhile, the quantity detail in the area 41-1 is updated to "5/5", indicating that the current focus is at the last position of the asset data picture list.
In response to a control instruction indicating that the user has selected the secondary page entry data, the display jumps from the application main page to the classification topic page. For example, with the focus on the "more" icon, the user presses the OK button on the control device, and the display device responds by jumping from the application home page shown in fig. 5 to the category topic page 42 of fig. 6. The classification topic page includes all the media asset data in the display area 41-1 and all the media asset data of the corresponding category that were not displayed.
The category topic page shown in fig. 6 includes the media asset data of the entire building category: the asset data A1–A4 already shown in the display area 41-1 and the asset data A5–A10 not shown there. All the media asset data are displayed in the classification topic page with a fixed number per row and vertical pull-down scrolling. The asset data A1–A10 shown in fig. 6 are displayed in a fixed vertical pull-down form, 3 per row.
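Laying out the topic page at a fixed 3 items per row, as described, amounts to chunking the category's full asset list (a minimal sketch under that assumption):

```python
def layout_rows(items, per_row=3):
    """Split the full asset list of a category topic page into rows
    of a fixed size, for vertical pull-down display."""
    return [items[i:i + per_row] for i in range(0, len(items), per_row)]

# A1 .. A10 at 3 per row -> 4 rows, the last holding only A10.
rows = layout_rows([f"A{i}" for i in range(1, 11)])
```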
In some embodiments, the application main page also includes an item detail area 41-3 indicating a detailed description of the asset data where the focus is located. The item detail area specifically comprises an indication name and indication details. If the focus is on media asset data, the indication name displays the name of that asset data, and the indication details display its details. If the focus is on the secondary page entry data, the indication name displays an entry indication name, and the indication details display entry indication details. For example, as shown in fig. 2, if the focus is located on asset data A1, the name "Waterower" of asset data A1 is displayed on the application page, together with its details "Architecture and building of asset data A1: Waterower Manual of the m have …". The user can thus learn about the media asset data not only from the pictures of the media content but also from the names and details, making viewing selection more convenient.
As shown in fig. 5, if the focus is located on the "more" icon, i.e., the secondary page entry data, the indication name displays the entry indication name "collectionEnter", and the indication detail displays the entry indication detail "this is the entry for this category, you can watch all videos for this …", prompting the user that more media asset data of this category can be viewed by selecting the icon.
The asset data display area 41-2 includes the asset classification name "Technology and science", indicating that the asset data included in the area 41-2 are related to science and technology. The area 41-2 further includes a plurality of pictures of asset data. Unlike the area 41-1, the asset data display area 41-2 includes only 3 asset data, and therefore does not include secondary page entry data.
In some embodiments, the classification topic page further includes a video detail area 42-1 for presenting introduction text of the media asset data at the focus. The video detail area specifically includes: the asset classification name of the display area where the secondary page entry data is located, the total amount of media asset data contained in the classification topic page, and the introduction text of the asset data where the focus is located. As shown in the category topic page of fig. 6, the video detail area includes the asset classification name "Architecture and buildings" and the total amount, 10, of all the media asset data contained in the page. The current focus is on asset data A2, and the introduction text of A2, "Architecture and building, video-sales-snow-in-city …", is displayed below the classification name "Architecture and buildings".
In some embodiments, after entering the classification topic page through the secondary page entry data, only a portion of the asset data can be displayed because the page area is limited, but all of the asset data contained in the classification topic page are in fact rendered completely. Only the asset data A1–A9 are displayed in the category topic page shown in fig. 7, but the undisplayed asset data A10 is rendered as well.
In fig. 7, if the user stretches the page downwards, the focus moves from the asset data A2 in the first row to the asset data A7 in the third row. Since the asset data A7 has no associated introduction text, no content is displayed in the area that previously showed the introduction text of A2, and all asset data move upwards as a whole. As shown in fig. 8, after this upward movement, the asset data A10, which was previously not displayed on the page, is also displayed. After the classification topic page is entered, the asset data that fit on the page are rendered, and the asset data outside the page are rendered at the same time, so that when the focus moves downwards, the asset data at the bottom of the page can be displayed more quickly, further improving the user experience.
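The render-everything-but-show-a-window behaviour described above can be sketched as follows (class and method names are illustrative; the point is that rendering and visibility are decoupled, so scrolling only changes which slice is shown):

```python
class TopicPage:
    """Classification topic page: all assets are rendered up front,
    even off-screen ones, so scrolling merely slides a visible window."""

    def __init__(self, assets, visible_rows=3, per_row=3):
        self.per_row = per_row
        self.visible_rows = visible_rows
        self.rendered = [f"rendered({a})" for a in assets]  # render all
        self.first_row = 0

    def visible(self):
        start = self.first_row * self.per_row
        return self.rendered[start:start + self.visible_rows * self.per_row]

    def scroll_down(self):
        last_row = (len(self.rendered) - 1) // self.per_row
        max_first = max(0, last_row - self.visible_rows + 1)
        self.first_row = min(self.first_row + 1, max_first)

# 10 assets, 3 rows of 3 visible: A10 is rendered but initially hidden.
page = TopicPage([f"A{i}" for i in range(1, 11)])
```

Because A10 was rendered in advance, one `scroll_down()` brings it on screen without any rendering delay, matching the behaviour of figs. 7 and 8.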
In the above-described embodiment, the application home page is displayed on the display in response to a control instruction, input by the user, indicating that the application home page should be displayed. The application main page comprises a plurality of media asset data display areas divided by category, and media asset data of different categories are distributed into the display areas by their corresponding categories. Each display area has a displayable quantity threshold, and a display area whose media asset data exceed the threshold also includes secondary page entry data. In response to a control instruction indicating that the user has selected the secondary page entry data, the display jumps from the application home page to the classification topic page of that category, which includes all the media content of the category. If a certain category has an excess of media asset data, this reduces how often the user must repeatedly stretch the page back and forth to select the media asset data finally viewed. The user can therefore find the media asset data to watch more conveniently, improving the viewing experience.
Fig. 9 is a flowchart illustrating a multimedia asset data display method in the display apparatus.
With reference to fig. 9, the method includes the following steps S51–S52:
Step S51: in response to a control instruction input by the user indicating that the application main page should be displayed, display the application main page on the display, wherein the application main page comprises a plurality of media asset data display areas divided by category, the media asset data are distributed into the display areas by their corresponding categories, and a display area whose media asset data exceed the quantity threshold further comprises secondary page entry data. As shown in fig. 2, the application home page 41 is displayed on the display in response to this control instruction, which may be the user pressing a menu key on the control device; the display apparatus responds to the key input instruction by displaying the application home page on the display. The application main page comprises a plurality of asset data display areas divided by category, for example, as shown in fig. 2, two asset data display areas of different categories, 41-1 and 41-2.
The asset data display area 41-1 includes the asset classification name "Architecture and buildings", which indicates that the asset data included in the area 41-1 are all related to buildings. The area 41-1 further includes a plurality of pictures of asset data; fig. 2 shows only pictures A1–A4, with A4 shown only as a half picture. The user can set the maximum number of asset data displayed in the asset data display area, i.e., the quantity threshold. The embodiment shown in fig. 2 sets the quantity threshold to 3.5, i.e., complete pictures of the three asset data A1–A3 and a half picture of A4 are displayed.
In the embodiment of the application, when at most 3.5 pieces of media asset data are displayed simultaneously in the media asset data display area, secondary page entry data is displayed at the last position of the asset data picture list: as shown in fig. 4, a "more" icon is displayed. The "more" icon indicates that the asset data display area 41-1 contains further asset data; the user need only stretch the page so that the focus falls on the "more" icon. As shown in fig. 5, the user stretches the page while moving the focus from the picture of asset data A1 to the "more" icon. Meanwhile, the quantity detail in the area 41-1 is updated to "5/5", indicating that the current focus is at the last position of the asset data picture list.
The asset data display area 41-2 includes the asset classification name "Technology and science", indicating that the asset data included in the area 41-2 are related to science and technology. The area 41-2 further includes a plurality of pictures of asset data. Unlike the area 41-1, the asset data display area 41-2 includes only 3 asset data, and therefore does not include secondary page entry data.
Step S52: in response to a control instruction indicating that the user has selected the secondary page entry data, jump from the application main page to the classification topic page; the classification topic page comprises all the media asset data corresponding to the display area where the secondary page entry data is located. For example, with the focus on the "more" icon, the user presses a confirmation key on the control device, and the display device responds by jumping from the application home page shown in fig. 5 to the category topic page 42 of fig. 6. The classification topic page includes all the media asset data in the display area 41-1 and all the media asset data of the corresponding category that were not displayed.
The category topic page shown in fig. 6 includes the media asset data of the entire building category: the asset data A1–A4 already shown in the display area 41-1 and the asset data A5–A10 not shown there. In the classification topic page the user can see the media asset data under the building category more clearly and intuitively, and can more conveniently find the media asset data to watch, improving the viewing experience.
Fig. 10 is a schematic diagram of the implementation logic of the method. Specifically, after the user starts the application, the media asset data are acquired according to the application ID, for example according to a plurality of classification names: media asset data of different classifications are acquired, the corresponding media asset data are displayed in the different display areas by classification, and all the media asset data of each display area are shown in a classified row. It is then judged whether the media asset data contained in each display area exceed the quantity threshold.
For example, with the page layout of fig. 2, each row of the page can display at most 4 pieces of asset data, so the quantity threshold is 4. For a display area whose media asset data exceed 4, the first 4 media asset data are retained, and a classified-topic "more" entry, i.e., the secondary page entry data, is added and displayed. At the same time, all the media asset data of the corresponding classification are stored in an independent array whose identification (ID) is set to the classification name, as the display data of the secondary page entered through the "more" entry. If the user selects the secondary page entry data, a topic classification page containing all the media asset data of the corresponding classification is entered. If the media asset data of a display area do not exceed 4, the area is displayed normally and the data are not processed; specifically, no secondary page entry data is added.
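Under the assumptions above (a quantity threshold of 4 per area), the implementation logic of fig. 10 might look roughly like the following sketch; the function and field names are illustrative, not taken from the patent:

```python
QUANTITY_THRESHOLD = 4  # at most 4 assets displayed per area row

def build_home_page(assets_by_category):
    """Build the display model for the application main page.

    An area whose category holds more assets than the threshold is
    truncated to the first 4 and given a secondary page entry
    ("more"); the full list is stored in an independent array keyed
    by the classification name, as the display data of the
    classification topic page."""
    areas = []
    topic_pages = {}  # array ID = asset classification name
    for category, assets in assets_by_category.items():
        if len(assets) > QUANTITY_THRESHOLD:
            shown = assets[:QUANTITY_THRESHOLD] + ["more"]
            topic_pages[category] = assets  # all assets of the category
        else:
            shown = assets  # normal display, no secondary entry
        areas.append({"category": category, "items": shown})
    return areas, topic_pages

areas, topics = build_home_page({
    "Architecture and buildings": [f"A{i}" for i in range(1, 11)],
    "Technology and science": ["B1", "B2", "B3"],
})
```

Selecting the "more" entry would then open the topic page backed by `topics["Architecture and buildings"]`, while the three-item science area gets no secondary entry at all.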
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (12)

CN202010319306.9A — filed 2020-04-21 — Multimedia data display method and display equipment — published as CN111541929A on 2020-08-14 (status: pending); international application PCT/CN2020/101153, filed 2020-07-09, published as WO2021212667A1.


Legal Events

DateCodeTitleDescription
PB01Publication
PB01Publication
SE01Entry into force of request for substantive examination
SE01Entry into force of request for substantive examination
TA01Transfer of patent application right

Effective date of registration:20221020

Address after:83 Intekte Street, Devon, Netherlands

Applicant after:VIDAA (Netherlands) International Holdings Ltd.

Address before:266061 room 131, 248 Hong Kong East Road, Laoshan District, Qingdao City, Shandong Province

Applicant before:QINGDAO HISENSE MEDIA NETWORKS Ltd.

TA01Transfer of patent application right
RJ01Rejection of invention patent application after publication

Application publication date:20200814

RJ01Rejection of invention patent application after publication
