CN113747216A - Display device and touch menu interaction method - Google Patents

Display device and touch menu interaction method

Info

Publication number
CN113747216A
CN113747216A (application CN202010474023.1A)
Authority
CN
China
Prior art keywords
touch
display
menu
control
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010474023.1A
Other languages
Chinese (zh)
Other versions
CN113747216B (en)
Inventor
王学磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202010474023.1A (granted as CN113747216B)
Publication of CN113747216A
Application granted
Publication of CN113747216B
Legal status: Active
Anticipated expiration


Abstract

The application provides a display device and a touch menu interaction method. The method can be applied to a display device to implement different functions of a touch menu. During interaction, the user can input a multi-finger touch instruction on the touch screen to call up the touch menu. Once the menu is displayed, the user can input a multi-finger rotating touch instruction to scale the menu's diameter, or a single-finger sliding touch instruction to adjust the positions of the control options on the menu, so that any control option can be opened with a single-finger tap. For an opened control option, an adjustment control can be displayed directly in the middle area of the touch menu, allowing the user to adjust the corresponding control parameter with a single-finger slide. The interaction method is simple to operate and can effectively improve the user experience.

Description

Display device and touch menu interaction method
Technical Field
The application relates to the technical field of touch televisions, in particular to a display device and a touch menu interaction method.
Background
A touch television is a smart television device equipped with a touch screen and capable of touch interaction. By tapping, sliding, and similar operations on the touch television, the user can interact with its operation interface: playing the media assets displayed in the interface, and completing auxiliary actions such as page turning and switching. The menu is one of the most frequently used functions during touch interaction.
A plurality of control options, such as "back", "volume", and "home", may be set in the touch menu. When the user clicks a control option, the corresponding control action is executed. For example, if the user clicks the "volume" option icon in the touch menu, the television displays a volume adjustment interface. To support interactive operations on the touch menu, a typical touch television provides a dedicated menu interface in which all control options are displayed sequentially in a vertical arrangement; for options that cannot all be displayed at once, the user turns pages with a sliding operation on the touch screen.
However, since the display screen of a touch television is large, the user must slide a finger over a long distance on the touch screen to complete a page turn, which makes this interaction method inconvenient. Moreover, if the menu interface contains many options, the user has to slide and turn pages repeatedly, performing frequent long-distance sliding operations on the screen; the interaction is therefore cumbersome and seriously degrades the user experience.
Disclosure of Invention
The application provides a display device and a touch menu interaction method, aiming to solve the problem that touch menu interaction on conventional display devices is cumbersome.
In a first aspect, the present application provides a display device, which includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following program:
receiving a multi-finger touch instruction which is input by a user and used for calling the touch menu;
and, in response to the multi-finger touch instruction, controlling the display to display the touch menu, where the display position of the touch menu is calculated from the positions of the touch points in the multi-finger touch instruction.
Based on the display device, the application also provides a touch menu interaction method, which includes:
receiving a multi-finger touch instruction which is input by a user and used for calling a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
and, in response to the multi-finger touch instruction, controlling a display to display the touch menu, where the display position of the touch menu is calculated from the positions of the touch points in the multi-finger touch instruction.
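The patent only says the menu position is calculated from the positions of the touch points. A minimal sketch of one plausible rule, assuming the menu is centered on the centroid of the touch points (the exact formula is not given in the text above):

```python
def menu_center(touch_points):
    """Place the ring menu at the centroid of the multi-finger touch
    points. touch_points is a list of (x, y) screen coordinates."""
    n = len(touch_points)
    cx = sum(x for x, _ in touch_points) / n
    cy = sum(y for _, y in touch_points) / n
    return cx, cy
```

For a three-finger touch, this centers the ring between the fingers, which keeps every finger near the ring band when the menu appears.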
In a second aspect, the present application provides a display device, which includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following program:
receiving a single-finger sliding touch instruction which is input by a user and used for calling the touch menu;
responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
and, if the action track matches a preset judgment track, controlling the display to display the touch menu, where the display position of the touch menu is determined from the action track of the touch point in the single-finger sliding touch instruction.
Based on the display device, the application also provides a touch menu interaction method, which includes:
receiving a single-finger sliding touch instruction which is input by a user and used for calling a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
and, if the action track matches a preset judgment track, controlling a display to display the touch menu, where the display position of the touch menu is determined from the action track of the touch point in the single-finger sliding touch instruction.
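The shape of the "preset judgment track" is not specified here. A sketch of one plausible matcher, assuming the judgment track is a circle: every sample of the slide must lie within a relative tolerance of the mean radius around the track's centroid (the tolerance value is illustrative):

```python
import math

def is_circular_track(track, tolerance=0.25):
    """Heuristic match of a single-finger slide against a circular
    judgment track: each sample (x, y) must lie within `tolerance`
    (relative) of the mean radius around the track's centroid."""
    cx = sum(x for x, _ in track) / len(track)
    cy = sum(y for _, y in track) / len(track)
    radii = [math.hypot(x - cx, y - cy) for x, y in track]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    return all(abs(r - mean_r) / mean_r <= tolerance for r in radii)
```

A roughly circular slide passes, while a straight swipe (whose samples scatter widely around the centroid) is rejected, so ordinary page-turning swipes do not call up the menu.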
In a third aspect, the present application provides a display device, which includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following program:
receiving a multi-finger rotating touch instruction which is input by a user and used for zooming the touch menu;
responding to the multi-finger rotating touch instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch instruction;
and zooming the diameter of the touch menu according to the rotation direction, and controlling the display to display the touch menu in real time.
Based on the display device, the application also provides a touch menu interaction method, which includes:
receiving a multi-finger rotating touch instruction which is input by a user and used for zooming a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
responding to the multi-finger rotating touch instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch instruction;
and zooming the diameter of the touch menu according to the rotation direction, and controlling a display to display the touch menu in real time.
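The text above does not give the exact mapping from rotation to diameter. The sketch below assumes the signed angle change of the touch points around their centroid drives a linear, clamped scaling, and that the largest per-finger change is used (all names, the gain, and the clamp values are illustrative assumptions):

```python
import math

def rotation_delta(prev_points, cur_points):
    """Signed angle change (radians) of the touch points around their
    centroid; positive means counter-clockwise. The largest per-finger
    change is taken as the instruction's rotation (an assumption)."""
    cx = sum(x for x, _ in prev_points) / len(prev_points)
    cy = sum(y for _, y in prev_points) / len(prev_points)
    deltas = []
    for (x0, y0), (x1, y1) in zip(prev_points, cur_points):
        d = (math.atan2(y1 - cy, x1 - cx)
             - math.atan2(y0 - cy, x0 - cx))
        d = (d + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
        deltas.append(d)
    return max(deltas, key=abs)

def zoom_diameter(diameter, delta, gain=100.0, d_min=200.0, d_max=800.0):
    """Scale the ring diameter linearly with the rotation and clamp it;
    counter-clockwise enlarges, clockwise shrinks."""
    return min(d_max, max(d_min, diameter + gain * delta))
```

Calling these per touch-move event and redrawing gives the real-time display the controller is configured to provide.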
In a fourth aspect, the present application provides a display device, which includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following program:
receiving a single-finger sliding touch instruction which is input by a user and used for switching control options in the touch menu;
responding to the single-finger sliding touch instruction, and acquiring a touch point moving track of the single-finger sliding touch instruction;
and adjusting the display position of the control option in an annular menu control according to the touch point moving track, and controlling the display to display the touch menu in real time.
Based on the display device, the application also provides a touch menu interaction method, which includes:
receiving a single-finger sliding touch instruction which is input by a user and used for switching control options in a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
responding to the single-finger sliding touch instruction, and acquiring a touch point moving track of the single-finger sliding touch instruction;
and adjusting the display position of the control option in an annular menu control according to the touch point moving track, and controlling the display to display the touch menu in real time.
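The steps above move control options around the ring as the finger slides. A sketch under the assumption that the options are evenly spaced and that the slide advances one shared angular offset by the angle the touch point sweeps around the menu center (function names and the spacing rule are illustrative):

```python
import math

def option_angles(n, offset=0.0):
    """Angular positions (radians) of n control options evenly spaced
    on the ring, shifted by the current slide offset."""
    return [(offset + 2 * math.pi * i / n) % (2 * math.pi) for i in range(n)]

def slide_to_offset(track, center, offset):
    """Advance the shared ring offset by the angle the touch point
    sweeps around the menu center from the first to the last sample."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    a0 = math.atan2(y0 - center[1], x0 - center[0])
    a1 = math.atan2(y1 - center[1], x1 - center[0])
    return offset + ((a1 - a0 + math.pi) % (2 * math.pi) - math.pi)
```

Redrawing each option at its new angle on every touch-move event produces the real-time rotation of the ring.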
In a fifth aspect, the present application provides a display device, which includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following program:
receiving a single-finger click touch instruction which is input by a user and used for opening the control option;
responding to the single-finger click touch instruction, and adding an adjusting control corresponding to the opened control option in the middle area of the annular menu control;
and controlling the display to display the touch menu and the adjusting control.
Based on the display device, the application also provides a touch menu interaction method, which includes:
receiving a single-finger click touch instruction which is input by a user and used for opening a control option; the touch menu is an annular menu control formed by a plurality of control options;
responding to the single-finger click touch instruction, and adding an adjusting control corresponding to the opened control option in the middle area of the annular menu control;
and controlling a display to display the touch menu and the adjusting control.
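Opening an option with a single-finger tap implies hit-testing the tap against the ring. A sketch of one plausible test (the ring-band width and the equal-sector rule are assumptions, not taken from the patent):

```python
import math

def hit_option(point, center, diameter, n_options, ring_width=80.0):
    """Return the index of the control option under a single-finger tap,
    or None if the tap falls outside the ring band."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    r = math.hypot(dx, dy)
    if abs(r - diameter / 2) > ring_width / 2:
        return None
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(angle // (2 * math.pi / n_options))
```

A tap inside the middle area returns None, leaving that region free for the adjustment control described above.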
In a sixth aspect, the present application further provides a display device, where the display device includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is a circular menu control formed by at least one control option; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following program:
receiving a first direction rotation touch instruction which is input by a user and used for displaying hidden options in the touch menu;
and responding to the first direction rotation touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
Based on the display device, the application also provides a touch menu interaction method, which includes:
receiving a first-direction rotating touch instruction which is input by a user and used for displaying hidden options in a touch menu, wherein the touch menu is a circular menu control formed by at least one control option;
and responding to the first direction rotation touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
In a seventh aspect, the present application further provides a display device, where the display device includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is a circular menu control formed by at least one control option; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following program:
acquiring a multi-finger touch instruction which is input by a user and used for calling the touch menu;
responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
Based on the display device, the application also provides a touch menu interaction method, which includes:
acquiring a multi-finger touch instruction which is input by a user and used for calling a touch menu; the touch menu is a circular menu control formed by at least one control option;
responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
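Setting the display diameter from the option count can be sketched with simple circle geometry, assuming each option icon occupies a fixed arc length of the circumference and the result is clamped to a screen-friendly range (the pitch and clamp values are illustrative):

```python
import math

def display_diameter(n_options, icon_pitch=120.0, d_min=300.0, d_max=900.0):
    """Ring diameter sized so n option icons, each occupying about
    `icon_pitch` pixels of circumference, fit without overlap; the
    result is clamped to [d_min, d_max]."""
    return min(d_max, max(d_min, n_options * icon_pitch / math.pi))
```

With this rule a menu with few options stays compact at the minimum diameter, while a crowded menu grows until it reaches the maximum, after which extra options would have to be hidden (as in the sixth aspect).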
According to the above technical solutions, the application provides a display device and a touch menu interaction method. The method can be applied to a display device to implement different functions of the touch menu. During interaction, the user can input a multi-finger touch instruction through the touch screen to call up the touch menu. Once the menu is displayed, the user can input a multi-finger rotating touch instruction to scale the menu's diameter, or a single-finger sliding touch instruction to adjust the positions of the control options on the menu, so that any control option can be opened with a single-finger tap. For an opened control option, an adjustment control can be displayed directly in the middle area of the touch menu, allowing the user to adjust the corresponding control parameter with a single-finger slide.
With this touch menu interaction method, the menu functions are displayed on a ring-shaped interface, and the operation is simplified to single-finger or multi-finger tapping, sliding, and rotating touch interactions, which solves the problem that touch menu interaction on conventional display devices is cumbersome. In addition, the ring menu can be freely scaled according to how many menu functions the user needs, effectively improving the user experience.
Drawings
To explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; obviously, other drawings can be derived from these drawings by those skilled in the art without creative effort.
Fig. 1 is a block diagram of a configuration of a display device in an embodiment of the present application;
FIG. 2 is a block diagram of an architectural configuration of an operating system in a memory of a display device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an operation scenario between a display device and a control apparatus in an embodiment of the present application;
fig. 4 is a block diagram showing a configuration of a control device in the embodiment of the present application;
FIG. 5 is a schematic diagram of a touch menu in the embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a touch menu beginning to zoom in and out according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram illustrating a touch menu being zoomed to any state in the embodiment of the present application;
FIG. 8 is a schematic diagram of an adjustment control option in an embodiment of the present application;
FIG. 9 is a diagram illustrating a circular display control option in an embodiment of the present application;
fig. 10 is a schematic flowchart illustrating a process of calling a touch menu through a multi-touch command in an embodiment of the present application;
fig. 11 is a schematic flow chart illustrating displaying a touch menu according to coordinates of a center point in the embodiment of the present application;
FIG. 12 is a schematic flow chart illustrating displaying a touch menu at an edge of a screen according to an embodiment of the present disclosure;
FIG. 13 is a schematic flow chart illustrating an embodiment of the present application for adjusting control options based on an initial diameter;
FIG. 14 is a schematic flow chart illustrating the determination of the initial diameter according to the number of touch points in the embodiment of the present application;
fig. 15 is a schematic flowchart illustrating a process of calling a touch menu by a single-finger sliding touch instruction in the embodiment of the present application;
fig. 16 is a schematic flow chart illustrating displaying a touch menu according to a center position of an action track in the embodiment of the present application;
FIG. 17 is a flowchart illustrating zooming a touch menu according to an embodiment of the present application;
FIG. 18 is a schematic flow chart illustrating scaling calculation in an embodiment of the present application;
FIG. 19 is a flowchart illustrating a process of determining a maximum angle change amount according to an embodiment of the present application;
fig. 20 is a schematic flowchart illustrating switching of control options by a single-finger sliding touch instruction according to an embodiment of the present application;
FIG. 21 is a flowchart illustrating switching control options according to sliding speed in the embodiment of the present application;
FIG. 22 is a flowchart illustrating a display adjustment control according to an embodiment of the present application;
FIG. 23 is a schematic flow chart illustrating adjustment of control parameters by an adjustment control according to an embodiment of the present application;
FIG. 24 is a schematic view of a volume adjustment control according to an embodiment of the present application;
FIG. 25 is a flowchart illustrating a process of determining whether a control option supports displaying an adjustment control according to an embodiment of the present application;
FIG. 26 is a flowchart illustrating an embodiment of the present application showing hidden options;
fig. 27 is a schematic flow chart illustrating setting of a display diameter according to the number of control options in the embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present application; they are merely examples of systems and methods consistent with certain aspects of the application, as recited in the claims.
The application provides a display device and a touch menu interaction method. The touch menu interaction method can be applied to a display device 200, where the display device 200 refers to a device with a display screen and a touch interaction function, such as a touch television. Obviously, the display device is not limited to a touch television; it may be another device with a touch function, such as a mobile phone, tablet computer, notebook computer, or touch display.
The display apparatus 200 may provide a network television function combining a broadcast receiving function and a computer support function. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light-emitting display, or a projection device. The specific display device type, size, resolution, etc. are not limited.
A hardware configuration block diagram of the display device 200 is exemplarily shown in Fig. 1. As shown in Fig. 1, the display apparatus 200 may include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, a touch sensing module 277, an audio processor 280, an audio output interface 285, and a power supply 290.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, the audio/video signal carried in the frequency of the television channel selected by the user, as well as additional information (e.g., EPG data).
The tuner demodulator 210 responds to the frequency of the television channel selected by the user and the television signal carried by that frequency, under the control of the controller 250.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or internet broadcasting; according to the modulation type, a digital or analog modulation mode may be adopted; and it can demodulate both analog and digital signals according to the kind of television signal received.
In other exemplary embodiments, the tuner demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, which is input into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a Bluetooth communication protocol module 222, and a wired Ethernet communication protocol module 223, so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250, implemented as a WIFI signal, a Bluetooth signal, a radio frequency signal, or the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of the external environment or of interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive the user's sound, such as a voice signal of a control instruction for controlling the display device 200; alternatively, ambient sounds may be collected to identify the type of the ambient scene, enabling the display device 200 to adapt to ambient noise.
In some other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or video camera, which may be configured to collect the external environment scene to adaptively change the display parameters of the display device 200, and to acquire user attributes or interact with the user through gestures, so as to realize interaction between the display device and the user.
In some other exemplary embodiments, the detector 230 may further include a light receiver for collecting the intensity of the ambient light, so as to adapt the display parameters of the display device 200.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor; by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a cooler image color temperature; when the temperature is lower, the display device 200 may be adjusted to display a warmer image color temperature.
The external device interface 240 is a component that enables the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to external apparatuses such as a set-top box, a game device, or a notebook computer in a wired/wireless manner, and may receive data such as a video signal (e.g., moving images), an audio signal (e.g., music), and additional information (e.g., EPG) from the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to user operations by running the various software control programs (such as the operating system and various application programs) stored in the memory 260. For example, the controller may be implemented as a System on Chip (SOC).
As shown in Fig. 1, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM 251, the ROM 252, the graphics processor 253, and the CPU processor 254 are connected to each other via the communication interface 255 over the communication bus 256.
The ROM 252 stores various system boot instructions. When the display apparatus 200 is powered on upon receiving the power-on signal, the CPU processor 254 executes the system boot instructions in the ROM 252, copies the operating system stored in the memory 260 to the RAM 251, and starts running the operating system. After the operating system has started, the CPU processor 254 copies the various application programs in the memory 260 to the RAM 251 and then starts running them.
The graphics processor 253 generates various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor 253 may include an operator that performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer that generates the various objects based on the operator and displays the rendered result on the display 275.
The CPU processor 254 executes the operating system and application program instructions stored in the memory 260 and, according to the received user input instructions, processes the various application programs, data, and content so as to finally display and play various audio-video content.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor performs some initialization operations of the display apparatus 200 in the preload mode and/or the operations of displaying a screen in the normal mode. The sub-processor(s) perform operations in the standby mode and similar states.
The communication interface 255 may include a first interface through an n-th interface. These interfaces may be network interfaces connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example, in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform the operation related to the object selected by the command. For example, the controller may be implemented as an SOC (System on Chip) or an MCU (Micro Control Unit).
Here, the object may be any selectable object, such as a hyperlink or an icon. The operation related to the selected object is, for example, displaying the linked hyperlink page, document, or image, or executing the program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, keyboard, or touch panel) connected to the display apparatus 200, or a voice command corresponding to speech spoken by the user.
Amemory 260 for storing various types of data, software programs, or applications for driving and controlling the operation of thedisplay device 200. Thememory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes thememory 260, the RAM251 and the ROM252 of thecontroller 250, or a memory card in thedisplay device 200.
In some embodiments, thememory 260 is specifically used for storing an operating program for driving thecontroller 250 of thedisplay device 200; storing various application programs built in thedisplay apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by thedisplay 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments,memory 260 is specifically configured to store drivers fortuner demodulator 210,communicator 220,detector 230,external device interface 240,video processor 270,display 275,audio processor 280, etc., and related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received by the user interface.
In some embodiments,memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of thedisplay device 200 is illustrated in fig. 2. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
The application layer contains both the applications built into the system and non-system-level applications, and is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a setup application, a post application, a media center application, and the like. These applications may be implemented as Web applications that execute based on a WebKit engine, and in particular may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML (HyperText Markup Language) is the standard markup language for creating web pages. It describes web pages by markup tags, where HTML tags are used to describe text, graphics, animation, sound, tables, links, etc.; a browser reads an HTML document, interprets the content of the tags in the document, and displays the content in the form of a web page.
CSS (Cascading Style Sheets) is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. A CSS style can be stored directly in the HTML web page or in a separate style file, so that the styles in the web page can be controlled.
JavaScript is a language applied to Web page programming that can be inserted into an HTML page and interpreted and executed by a browser. The interaction logic of a Web application is implemented in JavaScript. JavaScript can further encapsulate a JavaScript extension interface through the browser to communicate with the kernel layer.
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, providing device driver services for various hardware, such as: a display driver for the display, a camera driver for the camera, a key driver for the remote controller, a WiFi driver for the WiFi module, an audio driver for the audio output interface, a power management driver for the power management (PM) module, and the like.
A user interface 265 receives various user interactions. Specifically, it is used to transmit an input signal from the user to the controller 250, or to transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may send an input signal entered by the user, such as a power switch signal, a channel selection signal, or a volume adjustment signal, to the user interface 265, which then forwards it to the controller 250; alternatively, the remote controller 100A may receive an output signal, such as audio, video, or data, that the controller 250 outputs through the user interface 265, and display the received output signal or output it in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on thedisplay 275, and theuser interface 265 receives the user input commands through the GUI. Specifically, theuser interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and theuser interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
Thevideo processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on thedisplay 275.
Illustratively, thevideo processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream, where, for example, an input MPEG-2 stream (based on a compression standard of a digital storage media moving image and voice), the demultiplexing module demultiplexes the input audio/video data stream into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example, converting a 60 Hz input video to a frame rate of 120 Hz or 240 Hz; such conversion is commonly implemented using frame interpolation.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
Adisplay 275 for receiving the image signal from thevideo processor 270 and displaying the video content, the image and the menu manipulation interface. The display video content may be from the video content in the broadcast signal received by the tuner-demodulator 210, or from the video content input by thecommunicator 220 or theexternal device interface 240. Thedisplay 275, while displaying a user manipulation interface UI generated in thedisplay apparatus 200 and used to control thedisplay apparatus 200. And, thedisplay 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image.
Thetouch sensing module 277 is configured to detect a touch action performed by a user on thedisplay 275, so as to implement a touch interactive operation.
In an exemplary embodiment of the present application, to implement touch interaction, the display screen of thedisplay 275 may be a touch screen through which a user may perform a series of touch interaction actions, such as: click, long press, slide, etc. The touch action performed by the user on the touch screen can be detected by thetouch sensing module 277, and the corresponding interaction action is performed according to the pre-configured interaction rule.
In some embodiments, the implementation of the touch screen function may be to add a layer of touch sensing elements on the display screen of thedisplay 275, and the specific touch sensing principle may be determined according to the actual interaction requirement. For example, a capacitive touch screen, a resistive touch screen, an infrared touch screen, a surface acoustic wave touch screen, or the like may be used according to the actual application scenario of thedisplay device 200.
Thetouch sensing module 277 may include a sensing unit provided on thedisplay 275 and a signal processing unit provided in the display device. The sensing unit can be used for sensing touch operation of a user and converting the touch operation into an electric signal; the signal processing unit may perform processing on the generated electrical signal, including feature extraction, noise reduction, amplification, and the like.
Taking a capacitive touch screen as an example, the sensing unit may be a layer of transparent special metal conductive substance attached to the surface of the display screen glass of thedisplay 275. When a finger or a palm of a user touches the conductive substance layer, the capacitance value of the touch point is changed, so that a touch signal is generated. The signal processing unit can receive the touch signal and process the touch signal to convert the touch signal into a digital signal readable by thecontroller 250.
Generally, the interactive actions performed by the user on the touch screen may include clicking, long-pressing, sliding, and the like. In order to support more interactive modes, the touch screen can also support multi-touch. The more touch points the touch screen supports, the more interactive actions can be implemented accordingly. For example, a multi-finger click, a multi-finger long press, a multi-finger swipe, etc. may be implemented.
For different interaction actions, characteristics of the generated touch signal, such as touch point position, number of touch points, and touch area, may be obtained. The type of the touch signal is judged according to these signal characteristics, and a touch instruction is generated accordingly. From the touch point positions, the touch location of the user, that is, the position where the touch operation is performed, can be detected; the number of fingers used in the touch interaction can be determined from the number of touch points; whether the user performed a click or a long press can be determined from the duration of the touch signal; and a sliding operation performed by the user can be recognized from the change in the touch point positions.
For example, thetouch sensing module 277 may extract features of the touch signal after detecting the touch signal, and if the number of touch points in the touch signal is equal to 1, the duration of the touch signal is less than 0.5s, and the position of the touch point in the touch signal is not changed, determine that the current interaction action input by the user is a single-finger click action, and accordingly may generate a single-finger click touch instruction.
For another example, if the number of touch points in the touch signal is equal to 5, the duration of the touch signal is greater than 1s, the amount of change in the position of the touch point in the touch signal is small, and the change in the angle of the touch point exceeds the preset determination threshold, it is determined that the current interaction input by the user is a five-finger rotation motion, and accordingly a five-finger rotation touch instruction may be generated.
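The rules in the two examples above can be sketched as a simple gesture classifier. The thresholds of 0.5 s and 1 s follow the text; the angle threshold, the movement tolerance, and all names are illustrative assumptions, not part of the original disclosure.

```python
from dataclasses import dataclass

@dataclass
class TouchSignal:
    num_points: int          # number of simultaneous touch points
    duration: float          # seconds the contact lasted
    max_displacement: float  # largest positional change of any point (px)
    angle_change: float      # rotation of the point set around its center (deg)

# Illustrative thresholds; the 0.5 s and 1 s values come from the text,
# the angle and movement limits are assumed.
CLICK_MAX_DURATION = 0.5       # s
ROTATE_MIN_DURATION = 1.0      # s
ROTATE_ANGLE_THRESHOLD = 15.0  # degrees (assumed)
MOVE_TOLERANCE = 10.0          # px (assumed)

def classify(sig: TouchSignal) -> str:
    """Map raw touch-signal features to a touch instruction name."""
    if (sig.num_points == 1 and sig.duration < CLICK_MAX_DURATION
            and sig.max_displacement < MOVE_TOLERANCE):
        return "single-finger-click"
    if (sig.num_points == 5 and sig.duration > ROTATE_MIN_DURATION
            and sig.max_displacement < MOVE_TOLERANCE * 5
            and abs(sig.angle_change) > ROTATE_ANGLE_THRESHOLD):
        return "five-finger-rotate"
    return "unrecognized"
```

In a real implementation the signal processing unit would feed these features continuously, and further gesture types (long press, slide, page turn) would be added as extra branches.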
Thetouch sensing module 277 may be connected to thecontroller 250 to transmit the generated touch command to thecontroller 250. Since the interaction process is a continuous process, thetouch sensing module 277 continuously sends a touch command to thecontroller 250 to form a data stream. In order to distinguish different touch commands, thetouch sensing module 277 may generate touch commands according to the rule of one or more touch actions, so that thecontroller 250 may receive a complete and recognizable touch command.
It should be noted that the signal processing operation performed on the touch signal may be performed by thecontroller 250, that is, in some embodiments, the sensing unit may directly send the detected electric signal to thecontroller 250, and thecontroller 250 processes the touch signal by calling a preset signal processing program, so as to generate the touch command.
Theaudio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by thespeaker 286.
Illustratively,audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 is used for receiving the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286 or an external sound output terminal 287, such as an earphone output terminal, for output to the sound-producing device of an external apparatus.
In other exemplary embodiments,video processor 270 may comprise one or more chips.Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, thevideo processor 270 and theaudio processor 280 may be separate chips or may be integrated with thecontroller 250 in one or more chips.
And a power supply 290 for supplying power to the display apparatus 200 from an external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200.
In addition to the above touch interaction, the user may perform interaction with thedisplay apparatus 200 through thecontrol device 100. Fig. 3 is a schematic diagram illustrating an operation scenario between the display device and the control apparatus. As shown in fig. 3, thecontrol apparatus 100 and thedisplay device 200 may communicate with each other in a wired or wireless manner.
Among them, thecontrol apparatus 100 is configured to control thedisplay device 200, which may receive an operation instruction input by a user and convert the operation instruction into an instruction recognizable and responsive by thedisplay device 200, serving as an intermediary for interaction between the user and thedisplay device 200. Such as: the user operates the channel up/down key on thecontrol device 100, and thedisplay device 200 responds to the channel up/down operation.
Thecontrol device 100 may be aremote controller 100A, which includes infrared protocol communication or bluetooth protocol communication, and other short-distance communication methods, etc. to control thedisplay apparatus 200 in a wireless or other wired manner. The user may input a user instruction through a key on a remote controller, voice input, control panel input, etc., to control thedisplay apparatus 200. Such as: the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right moving keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement the function of controlling thedisplay device 200.
Thecontrol device 100 may also be an intelligent device, such as amobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, thedisplay device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, themobile terminal 100B may install a software application with thedisplay device 200 to implement connection communication through a network communication protocol for the purpose of one-to-one control operation and data communication. Such as: themobile terminal 100B may be caused to establish a control instruction protocol with thedisplay device 200 to implement the functions of the physical keys as arranged in theremote control 100A by operating various function keys or virtual buttons of the user interface provided on themobile terminal 100B. The audio and video content displayed on themobile terminal 100B may also be transmitted to thedisplay device 200, so as to implement a synchronous display function.
Fig. 4 is a block diagram illustrating the configuration of thecontrol device 100. As shown in fig. 4, thecontrol device 100 includes acontroller 110, a memory 120, acommunicator 130, auser input interface 140, an output interface 150, and a power supply 160.
Thecontroller 110 includes a Random Access Memory (RAM)111, a Read Only Memory (ROM)112, aprocessor 113, a communication interface, and a communication bus. Thecontroller 110 is used to control the operation of thecontrol device 100, as well as the internal components of the communication cooperation, external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on theremote controller 100A or an interaction of touching a touch panel disposed on theremote controller 100A is detected, thecontroller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to thedisplay device 200.
Thedisplay apparatus 200 also performs data communication with theserver 300 through various communication means. Here, thedisplay apparatus 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. Theserver 300 may provide various contents and interactions to thedisplay apparatus 200. By way of example, thedisplay device 200 may send and receive information such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. Theservers 300 may be a group or groups of servers, and may be one or more types of servers. Other web service contents such as a video on demand and an advertisement service are provided through theserver 300.
In some embodiments of the present application, a touch menu may be displayed on thedisplay 275, and a touch screen may be built in thedisplay 275, and a user may perform an interactive action on the touch menu (or on the entire screen) through the touch screen.
The touch menu is a display control including a plurality of control options, and may be given different shapes according to the system UI style of the display device 200. As shown in fig. 5, the touch menu may be annular, containing a plurality of control options arranged in a ring, with each control option corresponding to one function.
For example, control options such as "set, volume, timer, screen capture, label, smart whiteboard, live classroom" are sequentially set in the touch menu. If the user selects the 'setting' option through interactive operation, the user can directly jump from the current interface to the setting interface; if the user selects the 'volume' option, a control corresponding to the volume adjusting function can be displayed on the upper layer of the current interface or the volume adjusting interface is jumped to.
The touch menu may be called in any interface of thedisplay device 200, and thus the function of the control option in the touch menu may be a basic setting function performed on thedisplay device 200, or another more common function, or another function that can be implemented only by using a touch interaction manner. For example, control options related to basic play functions, such as "image, sound, signal source" and the like, are set in the touch menu. The touch menu may further include a function option more suitable for touch operation, for example, a "brightness" adjustment function set in the touch menu, so as to implement stepless brightness adjustment.
In some embodiments, the touch menu may also be invoked only in a partial scenario, for example, in an application page and a play page, and may not be invoked in the setup interface to avoid repetitive operations.
In some embodiments, the control options included in the touch menu may also support user customization, that is, a user may add or delete a control option to the touch menu through the setting page. For example, when a user uses the display device, the user often transmits resources through the bluetooth module, and then the bluetooth connection function can be added to the touch menu through the setting page, so that the user can quickly jump to the bluetooth connection setting page through the touch menu.
In some embodiments, the touch menu may also automatically adjust the control options included in the application according to the usage scenario. The adjusted content may include not only the specific control options but also the number and arrangement order of the control options. For example, when the play page calls the touch menu, the touch menu may include relatively more play-related content, such as: control options such as "play/pause, volume, picture quality, signal source, fast forward/fast reverse, screen cast", and the like, and according to the actual frequency of use, more common control options may be preferentially displayed.
In some embodiments, each control option in the touch menu may represent its function in graphical and/or textual form for the user to open/enable. For example, the volume control option may adopt an icon shaped like a trumpet with sound waves, and a prompt word such as "volume" may be set under the icon. The control options of different functions may also have different icon forms in different states; for example, the volume control option may display fewer sound-wave arcs in front of the trumpet icon when the volume of the current display device 200 is low, and more sound-wave arcs when the volume is high.
For the touch menu with the annular structure, the minimum and/or maximum display diameter can be set. Generally, the minimum diameter may be adapted to the size of the palm of the user's hand to facilitate direct performance of the operation after invocation, while the maximum diameter generally enables the operating area of the touch menu to extend beyond the edge of the display screen of thedisplay 275 to facilitate the user's performance of the touch operation.
It should be noted that, since the touch menu in the embodiment of the present application is in an annular shape, the touch menu includes a plurality of diameters, for example, an inner diameter and an outer diameter, and the corresponding display diameter may refer to any one of the inner diameter or the outer diameter. The control options are distributed on the annular band of the touch menu in an arrayed manner, and the corresponding effective operation area is also on the annular band, so that the display diameter can also be a central circle corresponding to the annular band, namely the display diameter can be an average value of the inner diameter and the outer diameter. Unless otherwise stated, the diameter or radius described hereinafter refers to the diameter or radius of the center circle corresponding to the annular band, that is, the display diameter is the average of the inner diameter and the outer diameter of the annular control, and the display radius is 1/2 of the display diameter.
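In the convention above, the display diameter is simply the mean of the inner and outer diameters of the annular band; a one-line sketch (function names are illustrative):

```python
def display_diameter(inner: float, outer: float) -> float:
    """Diameter of the center circle of the annular band."""
    return (inner + outer) / 2

def display_radius(inner: float, outer: float) -> float:
    """Half of the display diameter, per the convention in the text."""
    return display_diameter(inner, outer) / 2
```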
In some embodiments, the touch menu supports scaling the diameter of the display ring. The touch menu can be zoomed through a specific touch action, for example, in order to obtain better interaction experience, after the touch menu is called out, multi-finger touch and clockwise rotation may be adopted, at this time, the display diameter of the touch menu may be gradually increased, so as to achieve the effect of enlarging the display area, as shown in fig. 6 and 7.
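A minimal sketch of how the rotation gesture could drive the zoom, clamping between the minimum and maximum display diameters mentioned earlier. The linear gain and the concrete pixel limits are assumptions for illustration, not values from the disclosure.

```python
MIN_DIAMETER = 300.0   # px, roughly palm-sized (assumed)
MAX_DIAMETER = 1200.0  # px, may extend past the screen edge (assumed)
GAIN = 2.0             # px of diameter per degree of rotation (assumed)

def zoom_diameter(current: float, rotation_deg: float) -> float:
    """Positive (clockwise) rotation enlarges the menu; negative shrinks it.

    The result is clamped so the ring never gets smaller than a palm
    or larger than the configured maximum."""
    target = current + GAIN * rotation_deg
    return max(MIN_DIAMETER, min(MAX_DIAMETER, target))
```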
Obviously, when the display diameter of the touch menu is enlarged, the annular band of the touch menu has a larger display area; therefore, in some embodiments, the touch menu can show more control options after being enlarged. As shown in fig. 5, when the touch menu is not enlarged, the seven control options "setup, volume, timer, screen capture, label, smart whiteboard, live classroom" are displayed in the touch menu. As shown in fig. 6, after the touch menu is enlarged, the three control options "menu, edit, and image search" may be added to the touch menu. The display radius can be increased further as needed, and in the same way the menu can grow to N sub-function entries.
The number of control options accommodated in the circular touch menu is also limited due to the limited area of the circular band region of the touch menu. Therefore, in some implementations, when the number of the control options included in the touch menu is large, part of the control options may be hidden, and after the touch menu is called up, the hidden control options are displayed in a sliding or page-turning manner.
For example, the touch menu shown in fig. 6 includes the ten control options "setup, volume, timer, screen capture, label, smart whiteboard, live classroom, menu, edit, and image search". On the basis of this display, if the user inputs a page-turning touch instruction, the "HDMI" option may be added after the "image search" option; meanwhile, in order to maintain the circular arrangement, the "setup" option may be hidden, as shown in fig. 7.
In order to maintain the continuity of the operation, when the control options displayed in the touch menu are switched, all the control options can be joined into a ring for display: after switching past the last control option, the first control option is displayed again after it. As shown in fig. 8 and 9, as the "HDMI" option comes into view during the sliding operation, if there are no further function options, the "setup" option is displayed after the "HDMI" option, so that the whole touch menu remains a ring and the sense of discontinuity during interaction is reduced.
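The wrap-around behavior described above can be modeled as a sliding window over a circular list using modular indexing; the option names follow the figures, while the window size and function name are illustrative assumptions.

```python
OPTIONS = ["setup", "volume", "timer", "screen capture", "label",
           "smart whiteboard", "live classroom", "menu", "edit",
           "image search", "HDMI"]

def visible_options(offset: int, window: int = 10) -> list[str]:
    """Return the options shown on the ring after sliding by `offset` steps.

    Modular indexing makes the first option reappear after the last one,
    so the ring always stays fully populated with no visible gap."""
    n = len(OPTIONS)
    return [OPTIONS[(offset + i) % n] for i in range(window)]
```

Sliding by one step brings "HDMI" into view while hiding "setup"; one more step wraps "setup" back in after "HDMI", matching figs. 8 and 9.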
Based on the touch menu, some embodiments of the present application provide adisplay device 200 and a touch menu interaction method. As shown in fig. 10, the touch menu interaction method includes the following steps:
s11: and receiving a multi-finger touch instruction which is input by a user and used for calling the touch menu.
In order to present a touch menu in a display screen, thecontroller 250 of thedisplay apparatus 200 may monitor an interaction of a user on the touch screen in real time through thetouch sensing module 277. After the user inputs an interactive action for invoking a touch menu through a touch gesture on the touch screen, thetouch sensing module 277 generates a touch signal, generates a touch command, and sends the touch command to thecontroller 250, so that thecontroller 250 can control thedisplay 275 to present the touch menu after receiving the touch command.
The specific touch instruction for calling the touch menu may be set according to the application scenario of the actual display device 200. In order to distinguish it effectively from commonly used touch instructions, such as a single-finger click, a single-finger slide, or a single-finger long press, a multi-finger touch instruction may be used in some implementations. The multi-finger touch instruction can be generated upon detection of a multi-finger click action and/or a multi-finger slide action on the touch screen.
For example, after the software system of thedisplay device 200 is normally started, the user may touch the touch screen of thedisplay 275 with a plurality of fingers (five fingers, or four fingers, or three fingers, or two fingers) in any scene, that is, may call a touch menu function.
S12: and responding to the multi-finger touch instruction, and controlling the display to display the touch menu.
After the user inputs the multi-touch command, thecontroller 250 may present the touch menu in the display screen by executing a corresponding interactive program. In order to facilitate user operation, the display position of the touch menu is determined according to the position of each touch point in the multi-finger touch instruction.
In addition, a series of guide screens can be displayed in the display interface while the touch menu is displayed, for example instructing the user to zoom the touch menu through a multi-finger rotation action, and the like. The guide screen can combine graphics with text and supports being closed manually. For example, the guide screen may be a translucent pattern with a close button in the upper right corner area of the screen. The guide screen may be displayed only during the first several times the touch menu is called on the display device 200, and is no longer displayed once the number of calls of the touch menu or the usage time of the display device 200 reaches a set threshold.
Therefore, in this embodiment, a user can invoke a touch menu on any interface through multi-finger touch operation. Because a multi-finger click or multi-finger sliding touch mode and the like are used, compared with the traditional menu calling mode, misoperation of a user can be avoided. Meanwhile, the display position of the touch menu can be determined according to the touch position, so that the operation of a user can be facilitated, and the user experience is improved.
In some embodiments, as shown in fig. 11, the display positions of the touch menu and the guidance screen may depend on the position where the system determines that the user touches with the finger, so that the touch menu can appear close to the position where the finger is located. That is, the step of controlling the display to display the touch menu further includes:
s121: acquiring the position coordinates of each touch point in the multi-finger touch instruction;
s122: calculating the coordinates of a central point based on a plurality of touch points according to the position coordinates;
s123: and controlling the display to display the touch menu by taking the coordinate of the central point as a reference.
In practical applications, thecontroller 250 may first determine a touch position point (hereinafter referred to as a touch point) of five fingers (or four fingers, or three fingers, or two fingers), and then form a reference point based on a plurality of touch points as a center, so that the touch menu forms an annular touch menu display screen by taking the reference point as the center.
Because the palm of the hand is usually opened in the process of touch interaction by a user, when a multi-finger touch instruction is input, a plurality of touch points can be approximately positioned on the same arc line, namely, a convex polygon shape can be formed among the plurality of touch points, and therefore the center point coordinate determined by the positions of the touch points can be the gravity center of the formed convex polygon. For example, when a user touches with five fingers, a convex pentagon can be formed corresponding to the positions of the tips of the five fingers, the position coordinates of the five vertexes are extracted, the position coordinates of the center of gravity of the pentagon can be obtained, and finally, the touch menu is displayed by taking the coordinates of the center point as a reference.
In the above embodiment, the central point may be determined by a plurality of touch points, so that the touch menu may be displayed at a position relatively attached to the palm. Thus, the user can realize subsequent interactive operation on the touch menu without moving the finger to a longer distance.
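As a rough sketch of steps S121 to S123, the center of the fingertip polygon can be approximated by averaging the touch-point coordinates. This is a vertex average rather than a true area centroid, which is close enough for roughly arc-shaped touches; the function name and sample coordinates are illustrative, not from the original.

```python
from typing import List, Tuple

def menu_center(touch_points: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Approximate the center of gravity of the convex polygon formed
    by the fingertips with the average of the touch-point coordinates."""
    n = len(touch_points)
    cx = sum(x for x, _ in touch_points) / n
    cy = sum(y for _, y in touch_points) / n
    return cx, cy

# Five fingertips lying roughly on one arc (illustrative coordinates):
print(menu_center([(100, 300), (160, 220), (240, 200), (320, 230), (370, 310)]))
# → (238.0, 252.0)
```

The returned point would then be used as the display origin of the annular menu.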
In an exemplary embodiment, any one touch point may instead be selected directly from the touch points of the multi-finger touch instruction to serve as the display reference point of the touch menu. That is, the step of controlling the display to display the touch menu further includes: acquiring the position coordinates of each touch point in the multi-finger touch instruction; and controlling the display to display the touch menu with the position coordinate of a designated one of the touch points as the reference.
For example, when the user performs a five-finger touch input with the right hand, the touch point corresponding to the index finger, i.e. the second of the five touch points from left to right, may be designated as the reference point for displaying the touch menu.
Since the touch position is uncertain in practice, the user may operate in the middle of the screen of the display 275 or in an area near its edge; if the display position of the touch menu is determined purely from the touch points of the multi-finger touch instruction, the menu may end up too close to a screen edge to be displayed completely. In an exemplary embodiment, as shown in fig. 12, in order to display the touch menu in full, the controller 250 is further configured to execute the following program steps:
s1231: acquiring the spacing distance between the center point coordinate and the edge of the touch screen of the display;
s1232: if the spacing distances are all larger than or equal to the initial radius of the touch menu, setting the coordinates of the central point as the display origin of the touch menu;
s1233: if any one of the spacing distances is smaller than the initial radius of the touch menu, taking the center point coordinate as a reference, and translating the display origin of the touch menu in a direction away from the corresponding side edge of the touch screen;
s1234: and controlling a display to display the touch menu by taking the origin as a center.
In general, the display screen of the display 275 is rectangular, so four spacing distances between the center point and the screen edges are determined: the distance DL to the left edge, DR to the right edge, DT to the top edge and DB to the bottom edge. The distances may be expressed in a standard length unit, such as 10 cm, or directly as the number of pixels between the center point and the screen edge, such as 720 px.
After the distances between the center point and the screen edges are obtained, the four distances can each be compared with the initial radius of the touch menu to determine whether the current touch position allows the touch menu to be displayed completely. Specifically, if every spacing distance is greater than or equal to the initial radius, the current touch position meets the full-display requirement, and the center point coordinate is set as the display origin of the touch menu. For example, if the initial radius of the touch menu is 7 cm and the acquired distances are DL = 87 cm, DR = 57 cm, DT = 40 cm and DB = 41 cm, comparison shows that every distance is greater than or equal to the initial radius, so the center point coordinate can be used directly as the display origin.
If any spacing distance is smaller than the initial radius, the display origin is translated, starting from the center point coordinate, in the direction away from the corresponding screen edge so that the touch menu can be displayed completely. For example, if the initial radius is 7 cm and the distances are DL = 139 cm, DR = 5 cm, DT = 40 cm and DB = 41 cm, comparison shows DR = 5 cm < 7 cm, i.e. the touch position is too far to the right; the display origin is then shifted to the left by at least 2 cm so that the whole menu fits on the screen.
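The edge check and translation of S1231 to S1234 can be sketched as clamping the origin at least one radius away from every edge. Pixel coordinates and the `clamp_origin` helper name are assumptions for illustration:

```python
def clamp_origin(center, radius, screen_w, screen_h):
    """S1231-S1234: if any spacing distance between the center point and
    a screen edge is smaller than the menu radius, translate the display
    origin away from that edge; otherwise keep the center point as-is."""
    x, y = center
    x = min(max(x, radius), screen_w - radius)
    y = min(max(y, radius), screen_h - radius)
    return (x, y)

# Touch too close to the right edge of a 1920x1080 screen (radius 200 px):
print(clamp_origin((1820, 500), 200, 1920, 1080))  # → (1720, 500)
# Touch well inside the screen is left unchanged:
print(clamp_origin((960, 540), 200, 1920, 1080))   # → (960, 540)
```

Clamping both axes in one pass handles the case where the touch is near a corner and two edges are violated at once.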
After the display position of the touch menu is determined, the content displayed in the touch menu can be further determined. In order to provide a better user experience, as shown in fig. 13, the step of controlling the display 275 to display the touch menu further includes:
s124: acquiring an initial diameter of the touch menu;
s125: setting the number of control options in the touch menu and the icon size of each control option according to the initial diameter;
s126: and controlling the display to display the touch menu according to the set control option quantity and icon size.
In an exemplary embodiment, in order to display the control options in the touch menu, an initial diameter of the touch menu may be obtained first, then the number of displayable control options and the size of the icon corresponding to each control option are set according to the initial diameter, and finally the touch menu is displayed according to the set number and the size of the icon.
In the above embodiment, the initial diameter (or initial radius) may be a default value set according to the number of control options in the touch menu, or may be calculated from the positions of the touch points in the multi-finger touch instruction. In the latter case, the initial diameter may be set to the maximum distance between the touch points. As shown in fig. 14, the step of obtaining the initial diameter of the touch menu then further includes:
s1241: acquiring the number of touch points and the position coordinate of each touch point in the multi-finger touch instruction;
s1242: calculating the distance between any two touch points;
s1243: if the number of the touch points is 2, setting the initial diameter of the touch menu to be equal to the distance between the two touch points;
s1244: and if the number of the touch points is more than or equal to 3, setting the initial diameter of the touch menu to be equal to the distance between two touch points which are farthest away in all touch points.
The maximum distance between the touch points is usually the span of the user's opened fingers; for example, in five-finger operation it is the distance between the thumb and the little finger (or between the thumb and the middle finger). Setting the diameter from this distance directly yields a touch menu matched to the size of the user's palm, which facilitates operation.
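Steps S1241 to S1244 can be sketched as follows; the helper is illustrative, and `math.dist` is the standard-library Euclidean distance:

```python
import math
from itertools import combinations

def initial_diameter(points):
    """S1241-S1244: with exactly two touch points the diameter is their
    distance; with three or more it is the largest pairwise distance."""
    if len(points) < 2:
        raise ValueError("need at least two touch points")
    return max(math.dist(a, b) for a, b in combinations(points, 2))

print(initial_diameter([(0, 0), (3, 4)]))          # → 5.0
print(initial_diameter([(0, 0), (3, 4), (6, 0)]))  # → 6.0
```

The two-point case in S1243 falls out naturally, since the only pairwise distance is then the maximum.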
Based on the touch menu interaction method, the present application further provides a display device 200, which includes a display 275, a touch sensing module 277 and a controller 250. The display 275 is configured to display a touch menu, the touch menu being an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect touch commands input by a user.
Thecontroller 250 is configured to perform the following program steps:
s11: receiving a multi-finger touch instruction which is input by a user and used for calling the touch menu;
s12: and responding to the multi-finger touch instruction, and controlling the display to display the touch menu.
And the display position of the touch menu is determined by calculation according to the position of each touch point in the multi-finger touch instruction.
As can be seen from the foregoing technical solutions, some embodiments of the present application provide a display device 200 and a touch menu interaction method, where the controller 250 of the display device 200, after receiving a multi-finger touch instruction input by a user for invoking the touch menu, controls the display 275 to display the touch menu in response to that instruction. With this configuration of the controller 250, the user can invoke the touch menu with a multi-finger touch on any interface, which avoids misoperation, and the touch menu is displayed at the position where the multi-finger touch occurred. The interaction method is simple and convenient, does not require the user to memorize a complex invocation gesture, offers good interaction guidance, and can effectively improve the user experience.
In the above embodiment, the touch menu is called by a multi-finger touch manner, and other manners may be used to call the touch menu in practical applications.
As shown in fig. 15, in some embodiments of the present application, a touch menu interaction method is further provided, including:
s21: receiving a single-finger sliding touch instruction which is input by a user and used for calling the touch menu;
In an exemplary embodiment, the touch menu may also be invoked by sliding a specific track with a single finger. For example, the user can input a single-finger sliding touch instruction for invoking the touch menu by sliding a "C"-shaped track on the touch screen with a single finger.
It should be noted that this track is only one optional single-finger sliding touch instruction track; other motion tracks may also be used in practical applications, such as a "W"-shaped, "V"-shaped, "Z"-shaped or "S"-shaped track. The track input by single-finger sliding can be set according to actual needs, and user customization can also be supported.
S22: responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
After receiving a single-finger sliding touch instruction input by the user, the motion track of the touch point is detected in response to that instruction. In practical applications, trajectory detection may start as soon as the user touches the touch screen. Specifically, when the touch generates a touch signal, the touch sensing module 277 or the controller 250 may first detect the duration of the touch signal; if the duration is short, for example less than 0.5 s, the interaction corresponding to the touch signal is determined to be a click operation.
If the duration exceeds this, the position change of the touch point is examined. If the position of the touch point does not change, or the change distance is smaller than a preset threshold, the interaction is determined to be a long-press operation; a long press usually needs to last more than 2-3 s before the corresponding control program is executed, which effectively distinguishes clicks from long presses. If the position of the touch point does change, the interaction corresponding to the current touch signal is determined to be a sliding action, and the action track formed during the slide is recorded.
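The tap / long-press / swipe discrimination described above can be sketched as a small classifier. The 0.5 s and 2 s thresholds follow the values suggested in the text; the 10 px jitter allowance and the function itself are assumptions for illustration:

```python
def classify_gesture(duration_s, move_px,
                     tap_max_s=0.5, move_thresh_px=10, long_press_min_s=2.0):
    """Split a touch signal into tap / long-press / swipe from its
    duration and the movement of its touch point."""
    if move_px >= move_thresh_px:
        return "swipe"          # touch point moved: record its track
    if duration_s < tap_max_s:
        return "tap"
    if duration_s >= long_press_min_s:
        return "long-press"
    return "undecided"          # held, but not yet long enough

print(classify_gesture(0.2, 3))   # → tap
print(classify_gesture(2.5, 4))   # → long-press
print(classify_gesture(0.8, 50))  # → swipe
```

Real firmware would run this incrementally as the signal evolves rather than on a finished gesture.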
After the action track of the touch point is detected, the detected action track can be compared with a preset judgment track to determine whether the action track is the same as the preset judgment track.
S23: and if the action track is the same as a preset judgment track, controlling the display to display the touch menu.
By comparing the motion trajectory with the preset judgment trajectory, if the two are the same, the display 275 is controlled to display the touch menu. For example, if the preset judgment trajectory is a "C"-shaped track and the interaction input by the user through the touch screen also forms a "C"-shaped track, the action track is determined to be the same as the preset judgment trajectory, so the control program for invoking the touch menu can be executed directly and the touch menu is displayed on the display 275.
In order to facilitate the subsequent interactive operation on the touch menu, the display position of the touch menu is determined by calculation according to the action track of the touch point in the single-finger sliding touch instruction. For example, the touch menu may be displayed at the end or the middle of the motion trajectory. Therefore, after the touch menu is displayed, the user can execute the interactive action on the touch menu without moving a long distance, and the convenience of operation is improved.
It should be noted that "the same" here does not require the trajectories to be identical; it is sufficient that the motion track input by the user is similar to the preset judgment trajectory. For example, if the preset judgment trajectory is a "C"-shaped track, a larger or smaller "C"-shaped motion track input on the touch screen is still judged to be the same as the preset judgment trajectory. In the actual judgment, the shape of the motion track can be determined from features such as the slope variation along the track and the angular difference between its start and end points.
In order to display the touch menu, in an exemplary embodiment, as shown in fig. 16, the step of controlling the display to display the touch menu further includes:
s231: acquiring an action track graph of a touch point in the single-finger sliding touch instruction;
s232: positioning a graph center point in the action track graph;
s233: and controlling the display to display the touch menu by taking the graphic central point as a reference.
After acquiring the single-finger sliding touch command, the controller 250 may acquire the action track graph of the touch point and locate the graph's center point to determine a base point for displaying the touch menu, then display the touch menu with that base point as the reference. The center point may be the center of gravity of the closed figure formed by the track, the center of a closed figure formed by specific points on the track, or the center of a circle circumscribing some of the track's points. The base point or reference point mentioned in the above embodiments may be the center point of the annular touch menu.
For example, if the preset judgment trajectory is a "C"-shaped track, the controller 250 may, after receiving the single-finger sliding touch command of the "C"-shaped track, obtain the position coordinates of the starting point, the ending point and the farthest point of the track, determine the circumscribed circle of the triangle formed by these three points, and thereby obtain the position coordinate of the center of that circle, which is used as the display reference point of the touch menu.
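For the "C"-shaped case, the center of the circumscribed circle through the three sampled points can be computed with the standard circumcenter formula. A sketch, with the point sampling itself omitted:

```python
def circumcenter(a, b, c):
    """Center of the circle through three sampled points of the "C"
    stroke (start, end and farthest point), used here as the menu's
    display reference point."""
    ax, ay = a
    bx, by = b
    cx, cy = c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        raise ValueError("points are collinear; no circumscribed circle")
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy)

print(circumcenter((0, 0), (0, 2), (2, 0)))  # → (1.0, 1.0)
```

The collinear guard matters in practice, since a nearly straight swipe would otherwise divide by a value close to zero.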
Based on the touch menu interaction method, some embodiments of the present application further provide a display device 200 for implementing the touch menu interaction method. The display device 200 includes a display 275, a touch sensing module 277, and a controller 250. The display 275 is configured to display a touch menu, the touch menu being an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect touch commands input by a user.
Thecontroller 250 is configured to perform the following program steps:
s21: receiving a single-finger sliding touch instruction which is input by a user and used for calling the touch menu;
s22: responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
s23: and if the action track is the same as a preset judgment track, controlling the display to display the touch menu.
And calculating and determining the display position of the touch menu according to the action track of the touch point in the single-finger sliding touch instruction.
As can be seen from the foregoing technical solutions, with the touch menu interaction method and the display device 200 provided in the above embodiments, a user may input on the touch screen a single-finger sliding touch instruction whose touch-point action track has the same shape as the preset judgment trajectory, thereby triggering the invocation condition of the touch menu, so that the display 275 is controlled to display the touch menu according to the shape of the action track. Because a preset action track is required, this interaction method can be distinguished effectively from conventional touch operations, reducing misoperation and improving the user experience.
After invoking the touch menu, the user may perform further operations on the touch screen. Since the display 275 has a large screen, a large area can be devoted to displaying the touch menu. To reduce how often the user must switch pages to open a given control option, more of the display area can be used to show additional control options; that is, the touch menu can be zoomed as needed.
As shown in fig. 17, in an exemplary embodiment of the present application, in order to zoom a touch menu, a touch menu interaction method is provided, which includes the following steps:
s31: receiving a multi-finger rotating touch instruction which is input by a user and used for zooming the touch menu;
After the display 275 displays the touch menu, the user may input further touch interactions on the touch screen. When the user wants to zoom the touch menu, a multi-finger rotating motion can be input on the touch screen, generating a multi-finger rotating touch instruction.
S32: responding to the multi-finger rotating touch instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch instruction;
The controller of the display device 200 continues to receive the various touch instructions input by the user after the touch menu is displayed. If a received touch instruction is a multi-finger rotating touch instruction, the rotation directions of the touch points are further acquired.
Typically, a rotating touch command is one with obvious rotation characteristics. In practical applications, if the touch points in an input touch instruction all move a relatively short distance, their motion tracks are all arc-shaped, and the arcs share the same curvature-change pattern, it can be determined that the touch instruction input by the current user is a multi-finger rotating touch instruction.
Therefore, the controller 250 may simultaneously detect parameters such as the moving distance, track shape and curvature-change pattern of the touch points to determine whether the touch command is a multi-finger rotating touch command. If it is, the variation of the action track can be further extracted to determine the rotation direction of the touch points. For example, a start point, a middle point and an end point may be extracted, and the rotation direction determined from the positional relationship among the three.
S33: and zooming the diameter of the touch menu according to the rotation direction, and controlling the display to display the touch menu in real time.
After the rotation direction is determined, the diameter of the touch menu is scaled according to it; whether the menu is enlarged or reduced depends on the rotation direction of the input touch instruction. For example, if the rotation direction is clockwise, the diameter of the touch menu is enlarged; if it is counterclockwise, the diameter is reduced.
Obviously, the zooming-in or zooming-out strategy can be adjusted according to different regions or different use habits set by users. For example, for a left-handed user, the diameter of the touch menu may be reduced when the rotation direction is determined to be clockwise; and the diameter of the touch menu is enlarged in a counterclockwise direction.
In order to obtain better interactive experience, in practical application, the touch menu can be enlarged or reduced in real time along with the rotating process of the multi-finger rotating touch instruction, that is, the diameter of the touch menu can be continuously enlarged or reduced according to the real-time change of the rotating angle.
Therefore, as shown in fig. 18, in an exemplary embodiment, the step of controlling the display to display the touch menu in real time further includes:
s331: obtaining rotation angles of a plurality of touch points in the multi-finger rotation touch instruction;
s332: calculating a scaling ratio according to the proportion of the rotation angle in the maximum rotation angle;
s333: and zooming the diameter of the current touch menu according to the zooming proportion.
In this embodiment, the rotation angles of the multiple touch points in the multi-finger rotation touch instruction may be obtained first, and then the scaling ratio may be calculated according to the ratio of the rotation angle to the maximum rotation angle, so as to scale the diameter of the current touch menu according to the scaling ratio. For example, by analyzing the motion trajectory of the touch point, it may be determined that the current rotation angle is clockwise rotated by 10 degrees, and the maximum rotation angle is 90 degrees, and then the scaling may be calculated as 10/90, and thus, the diameter of the touch menu may be increased by 1/9 according to the calculated scaling.
In order to obtain a continuous zooming effect, in practical applications, the zooming ratio may be calculated once when the rotation angle is changed by 1 degree in a multi-finger rotating touch instruction. If the rotation angle is changed from 10 degrees to 11 degrees, the scaling calculation is carried out once, namely the scaling is 11/90, and the diameter is increased 11/90 correspondingly; similarly, when the rotation is further 12 degrees, the scaling is calculated to 12/90, and the diameter is increased by 12/90 accordingly.
Obviously, the higher the calculation frequency of the scaling, the smoother the zoom effect during adjustment, so the calculation frequency can be raised as far as the hardware configuration of the display device 200 can support. For example, the calculation frequency may equal the screen refresh rate of the display 275 for the best fluency.
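Steps S331 to S333 amount to scaling the diameter by the ratio of the current rotation angle to the maximum angle, recomputed on every angle update. A sketch, assuming clockwise enlarges and a 90-degree maximum sweep as in the example above:

```python
def scaled_diameter(base_diameter, angle_deg, max_angle_deg=90.0, clockwise=True):
    """S331-S333: scale the menu diameter by rotation angle / max angle.
    Clockwise enlarges, counter-clockwise shrinks."""
    ratio = min(abs(angle_deg), max_angle_deg) / max_angle_deg
    factor = 1 + ratio if clockwise else 1 - ratio
    return base_diameter * factor

print(scaled_diameter(90, 45))                   # → 135.0 (half the sweep)
print(scaled_diameter(90, 45, clockwise=False))  # → 45.0
```

Calling this with, say, 10 then 11 then 12 degrees reproduces the incremental growth described in the text.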
Because each touch point in a multi-finger rotating touch instruction corresponds to a different finger, and the fingers differ in length and in users' rotation habits, the achievable rotation-angle variation differs considerably from one touch point to another. For example, some users habitually rotate about the center of the palm, so the rotation angles of the thumb and the other four fingers differ little; other users rotate about the thumb, so the rotation angles of the thumb and the other four fingers differ greatly, which affects the detection of the rotation angle.
Therefore, to determine the rotation angle, a designated one of the touch points may be used as the criterion, for example the rotation angle of the index finger; alternatively, the detected rotation angles may be compared and the largest one used as the basis for calculating the scaling. That is, in an exemplary embodiment of the present application, as shown in fig. 19, the rotation angle is the maximum angle change of the current position from the initial position among the plurality of touch points, and the step of obtaining the rotation angles of the touch points in the multi-finger rotating touch instruction further includes:
s3311: acquiring an initial position coordinate and a current position coordinate of each touch point in the multi-finger rotating touch instruction;
s3312: calculating the angle variation of the current position coordinate relative to the initial position coordinate;
s3313: and comparing the angle variation of the plurality of touch points to determine the maximum angle variation.
As can be seen, the controller 250 may determine the maximum angle change by acquiring the initial and current position coordinates of each touch point, calculating each angle change, and finally comparing the angle changes of the touch points. The maximum angle change gives more accurate rotation data for the subsequent scaling calculation.
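Steps S3311 to S3313 can be sketched as follows. The pivot point (for example the menu center) is an assumption of this sketch, since the text only specifies comparing per-finger angle changes:

```python
import math

def max_rotation_deg(initial_pts, current_pts, pivot):
    """S3311-S3313: the rotation angle is the largest angular change of
    any touch point's current position from its initial position,
    measured about an assumed pivot."""
    px, py = pivot
    best = 0.0
    for (x0, y0), (x1, y1) in zip(initial_pts, current_pts):
        a0 = math.atan2(y0 - py, x0 - px)
        a1 = math.atan2(y1 - py, x1 - px)
        delta = math.degrees(a1 - a0)
        delta = (delta + 180) % 360 - 180   # wrap to (-180, 180]
        if abs(delta) > abs(best):
            best = delta
    return best

# One finger sweeps 90 degrees, another only 45: the larger change wins.
print(round(max_rotation_deg([(1, 0), (0, 1)], [(0, 1), (-1, 1)], (0, 0)), 6))
# → 90.0
```

Wrapping each delta into (-180, 180] prevents a small rotation across the atan2 branch cut from being misread as a near-full turn.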
In practical application, after the diameter of the touch menu is adjusted, the area of the annular band region in the touch menu is also changed correspondingly, so that the number of the control options and/or the icon sizes of the control options contained in the touch menu can be modified according to the current diameter of the touch menu, and thus, a user can select more control options on the touch menu and execute more functional operations.
Based on the above touch menu interaction method, some embodiments of the present application further provide a display device 200 for implementing the touch menu interaction method. The display device 200 includes a display 275, a touch sensing module 277, and a controller 250. The display 275 is configured to display a touch menu, the touch menu being an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect touch commands input by a user.
Thecontroller 250 is configured to perform the following program steps:
s31: receiving a multi-finger rotating touch instruction which is input by a user and used for zooming the touch menu;
s32: responding to the multi-finger rotating touch instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch instruction;
s33: and zooming the diameter of the touch menu according to the rotation direction, and controlling the display to display the touch menu in real time.
As can be seen from the foregoing technical solutions, the above embodiments provide a touch menu interaction method and a display device 200 in which, after the touch menu is called out, a multi-finger (for example five-finger) clockwise (or counterclockwise) rotation gradually enlarges the diameter of the touch menu so that more control-option functions can be displayed. When the diameter changes, the interaction method can modify the number of control options and/or the icon sizes of the control options contained in the touch menu so as to present more functions. With this method, the annular menu can be zoomed freely according to how many control-option functions the user needs, effectively improving the user experience of the product.
For the touch menu, the maximum diameter can be set according to the actual screen size so as to display a greater number of control options. However, if the touch menu contains many control options, even the menu at its maximum diameter may not display them all. Therefore, as shown in fig. 20, in an exemplary embodiment, the touch operation performed on the touch menu may further include the following steps:
s41: receiving a single-finger sliding touch instruction which is input by a user and used for switching control options in the touch menu;
s42: responding to the single-finger sliding touch instruction, and acquiring a touch point moving track of the single-finger sliding touch instruction;
s43: and adjusting the display position of the control option in an annular menu control according to the touch point moving track, and controlling the display to display the touch menu in real time.
In this embodiment, a user can slide and drag the control options in the annular menu with a single finger, so as to display the hidden control options, and achieve the purpose of switching the control option pages in the touch menu, so that more control options can be displayed.
For better interaction, a single-finger swipe touch command may be input in a ring-shaped band region (or an associated region, a nearby region) of the touch menu. The moving track of the touch point can be formed along with the sliding of the finger of the user, so that the display position of the control option graph in the annular menu control can be adjusted according to the moving track of the touch point.
In practical applications, the control option icons should move at the same speed as the sliding finger, so that the icons follow the finger. When the sliding speed exceeds a certain value, the control options continue to move within the annular band after the slide ends, producing an inertia effect; the faster the slide, the longer the continued movement, giving a more realistic touch experience.
For the sliding speed of the touch finger, other related controls may be further set, for example, different switching effects may be obtained for different sliding speeds. That is, as shown in fig. 21, in one exemplary embodiment, the method further comprises:
s431: acquiring the sliding speed of the touch point in the single-finger sliding touch instruction;
s432: if the sliding speed is less than or equal to a preset speed threshold value, controlling the control option to move in the annular menu control along with the touch point;
s433: and if the sliding speed is greater than a preset speed threshold value, controlling the touch menu to switch display pages.
The sliding speed of the touch point can be judged through a preset speed threshold, and when the sliding speed is lower than the preset speed threshold, the control option icons in the annular menu can be dragged according to the mode, so that the control options are controlled to move in the annular menu control along with the touch point. And when the sliding speed is higher than a preset speed threshold, it can be judged that the user may want to quickly find the hidden control option, so that the touch menu can be controlled to switch display pages, wherein each display page comprises a plurality of control options.
It should be noted that a display page is a relative concept, and the control options in different display pages may in practice be partially or completely different. For example, if each display page contains at most 11 control option icons and the touch menu contains 23 control options in total, the touch menu may include three display pages: the first display page includes the 1st to 11th control options, the second includes the 12th to 22nd, and the third is backfilled with the 13th to 23rd. Thus, when the user quickly slides the first display page with a single finger, the menu switches to the second display page, and quickly sliding again switches to the third.
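The page split in the example above (1-11, 12-22, 13-23 for 23 options) follows from backfilling the last page so that it also shows a full page of icons. A possible sketch of that rule, with the page size as a parameter:

```python
import math

def page_ranges(total_options, per_page=11):
    """1-based (first, last) option indices for each display page.

    Each page shows at most `per_page` options; the last page is backfilled
    so it also shows a full page whenever enough options exist, which is why
    23 options split into pages 1-11, 12-22 and 13-23 in the example.
    """
    pages = math.ceil(total_options / per_page)
    ranges = []
    for i in range(pages):
        first = i * per_page + 1
        last = min(first + per_page - 1, total_options)
        if i == pages - 1 and total_options >= per_page:
            first = total_options - per_page + 1  # backfill the final page
        ranges.append((first, last))
    return ranges
```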
Based on the touch menu interaction method, some embodiments of the present application further provide a display device 200, which includes a display 275, a touch sensing module 277, and a controller 250. The display 275 is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect a touch command input by a user.
The controller 250 is configured to perform the following program steps:
s41: receiving a single-finger sliding touch instruction which is input by a user and used for switching control options in the touch menu;
s42: responding to the single-finger sliding touch instruction, and acquiring a touch point moving track of the single-finger sliding touch instruction;
s43: and adjusting the display position of the control option in an annular menu control according to the touch point moving track, and controlling the display to display the touch menu in real time.
As can be seen from the foregoing technical solutions, the touch menu interaction method and the display device 200 provided in the above embodiments receive the single-finger sliding touch instruction after the touch menu is displayed, acquire its touch point moving track, adjust the display positions of the control options in the annular menu control according to that track, and display the result on the display 275 in real time. The annular touch menu therefore rotates as the user slides a finger over the menu area, exposing more menu function entries, and accelerated rotation is supported.
For the touch menu, clicking a control option contained in it starts the corresponding setting function. Different control options have different functions: some require detailed configuration in a dedicated control interface, while others can achieve the setting purpose through a simple control, so different pictures are displayed depending on which option is started.
In an exemplary embodiment, as shown in fig. 22, after the touch menu is called, the following interaction methods may be further included:
s51: receiving a single-finger click touch instruction which is input by a user and used for opening the control option;
s52: responding to the single-finger click touch instruction, and adding an adjusting control corresponding to the opened control option in the middle area of the annular menu control;
s53: and controlling the display to display the touch menu and the adjusting control.
In practical applications, when the user does not activate any control option, some common function controls may be displayed in the middle area of the touch menu. For example, a return home page control is displayed in the middle area, which when clicked by the user, may switch to the operating system's control home page.
When a user starts a control option by clicking a control option icon, an adjusting control corresponding to the opened control option can be added in the middle area of the annular menu control by responding to a single-finger click touch instruction. For example, when the user clicks the volume control option, the home page return control in the middle area may be replaced with a volume setting control for the user to complete the volume setting function through further interaction.
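The swap between the home page control and the per-option adjusting control in the middle area can be sketched as a small state holder. The class name and the control names here are illustrative assumptions:

```python
class TouchMenuCenter:
    """Sketch of the center-area content of the ring menu.

    With no option active, the center shows a "home" control; opening an
    option such as "volume" replaces it with that option's adjusting
    control, and closing the option restores the home control.
    """

    def __init__(self):
        self.active_option = None

    def open_option(self, option):
        self.active_option = option

    def close_option(self):
        self.active_option = None

    def center_control(self):
        if self.active_option is None:
            return "home"               # default: return-to-home-page control
        return self.active_option + "_adjuster"  # e.g. "volume_adjuster"
```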
Therefore, the adjusting control displayed in the middle of the annular touch menu supports settings that require only simple interactive actions. For example, a draggable slider may be included in the control to facilitate parameter setting. Thus, as shown in fig. 23, in one exemplary embodiment, the method further comprises:
s54: receiving a single-finger sliding instruction which is input in the adjusting control area by a user and is used for adjusting the control parameters;
s55: responding to the single-finger sliding instruction, and adjusting the control parameters of the opened control options;
s56: and adjusting the display shape of the adjusting control according to the adjusted control parameter, and controlling the display to display the adjusting control.
For example, as shown in fig. 24, the volume adjustment control may take the form of an annular drag line combined with a slider. After the user clicks the volume control option, the volume adjustment control is displayed in the middle area of the touch menu and receives a single-finger sliding instruction input by the user, which drags the slider along the line to adjust the volume control parameter: different slider positions correspond to different volume values, achieving the purpose of volume adjustment.
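A sketch of mapping the slider position on the annular drag line to a volume value. The 270-degree sweep of the drag line and the 0-100 volume range are assumptions for illustration:

```python
def slider_volume(angle_degrees, sweep=270.0, max_volume=100):
    """Map the slider position on an annular drag line to a volume value.

    angle_degrees: slider angle measured from the start of the drag line;
    sweep: total arc covered by the drag line (270 degrees is an assumption).
    Positions are clamped to the drag line, so over-dragging saturates.
    """
    angle = max(0.0, min(angle_degrees, sweep))  # clamp to the drag line
    return round(angle / sweep * max_volume)
```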
Since some control options cannot display an adjustment control in the middle area of the touch menu, as shown in fig. 25, in an exemplary embodiment, the step of adding the adjusting control corresponding to the opened control option in the middle area of the annular menu control further includes:
s521: judging whether the opened control options support displaying the adjusting control;
s522: if the opened control option does not support the display of the adjustment control, controlling a display to display a setting interface corresponding to the opened control option;
s523: and if the opened control options support the display of the adjusting control, adding the adjusting control corresponding to the opened control options in the middle area of the annular menu control.
After a user clicks any control option, it is judged whether the setting program corresponding to the clicked control option supports displaying an adjusting control. A specific way to judge is to preset a list of supported options and match the opened control option against that list. If the opened control option is in the list, it is determined to support displaying the adjusting control, so the display is directly controlled to display the corresponding adjusting control; if it is not in the list, it is determined not to support the adjusting control, and the interface jumps directly to the setting interface corresponding to the opened control option.
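Steps S521-S523 thus reduce to a lookup in the preset list of supported options. A minimal sketch with a hypothetical allow-list (the option names are illustrative, not from this disclosure):

```python
# Hypothetical allow-list of options whose settings fit in the center area.
INLINE_ADJUSTABLE = {"volume", "brightness", "backlight"}

def on_option_opened(option):
    """Steps S521-S523: decide where the opened option's settings appear."""
    if option in INLINE_ADJUSTABLE:
        return "show_center_adjuster"   # S523: adjuster in the ring's center
    return "open_settings_interface"    # S522: jump to a full settings page
```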
In this embodiment, a home page key (Home/Launcher) function may be defined in the middle of the touch menu, and the user may enter the Home/Launcher by clicking the central circle-center control area. When the user clicks the volume function entry area, a volume adjustment page is displayed in the circle-center control area, and the user slides the circular control to adjust the volume value. Clicking the volume function entry area again, or a timeout, exits the volume adjustment page, and the circle-center control area displays the home page key (Home/Launcher) function again.
In an exemplary embodiment, when the user no longer uses the touch menu, an exit instruction may be input, and correspondingly the display device 200 may receive the exit instruction input by the user for exiting the touch menu; and, in response to the exit instruction, control the display to cancel displaying the touch menu and the adjusting control.
The exit instruction is a single-finger click touch instruction input by the user in an area outside the touch menu; or a single-finger click touch instruction input by the user on a control in the touch menu that indicates the exit function; or an instruction referring to the exit function input by the user through the control apparatus 100.
In practical applications, after a user calls up the main entry function of the touch menu and completes related operations, the user who wishes to exit can click a blank area outside the touch menu with a single finger, or press a return key on the control device 100 of the display device 200, to exit the main entry function of the touch menu. In addition, if the user does not input any other interaction instruction within a preset time after calling up the touch menu, the touch menu exits automatically when the preset time is reached.
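The exit conditions above (tap outside the menu, tap an exit control, return key on the control device, idle timeout) can be sketched as one predicate. The event names and the timeout value are assumptions:

```python
def should_exit_menu(event, idle_seconds, idle_timeout=10.0):
    """Decide whether the touch menu should be dismissed.

    event: "tap_outside", "tap_exit_control", "remote_return", or None when
    no input has arrived; the menu also exits automatically after
    `idle_timeout` seconds without any interaction instruction.
    """
    if event in ("tap_outside", "tap_exit_control", "remote_return"):
        return True
    return event is None and idle_seconds >= idle_timeout
```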
Based on the above interaction method of the touch menu, some embodiments of the present application further provide a display device 200, which includes a display 275, a touch sensing module 277, and a controller 250. The display 275 is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect a touch command input by a user.
The controller 250 is configured to perform the following program steps:
s51: receiving a single-finger click touch instruction which is input by a user and used for opening the control option;
s52: responding to the single-finger click touch instruction, and adding an adjusting control corresponding to the opened control option in the middle area of the annular menu control;
s53: and controlling the display to display the touch menu and the adjusting control.
In some embodiments, the touch menu may also be a circular menu including at least one control option. After the touch menu is called, at least one control option, such as a "home page option," may be displayed in the circular touch menu. And other control options in the touch menu can be set as hidden options, and the hidden options can be displayed according to further interactive operation of the user.
As shown in fig. 26, in order to display hidden options, an interaction method of a touch menu provided in some embodiments of the present application includes the following steps:
s61: receiving a first direction rotation touch instruction which is input by a user and used for displaying hidden options in the touch menu;
s62: and responding to the first direction rotation touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
After the touch menu is called, the user can input a first direction rotation touch instruction through the touch screen. The first direction may be set according to actual interaction needs; for example, it may be the clockwise direction. To distinguish this gesture reliably from other interaction modes, the first direction rotation may also be given additional constraints, for example clockwise rotation at a specific rotation speed, or clockwise rotation performed simultaneously by a specific number of touch points.
After the user inputs the first direction rotation touch instruction, the display device 200 may detect it through the touch sensing module 277, and determine by detecting the rotation touch operation in the first direction that the current user wants to display the hidden options, so that the controller 250 may control the display 275 to display the hidden options.
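Detecting a rotation touch operation and its direction can be sketched by accumulating the signed change of the touch point's angle around the menu center. This is an illustrative approach, not the disclosed implementation:

```python
import math

def rotation_direction(points, center=(0.0, 0.0)):
    """Classify a rotating touch track as clockwise or counterclockwise.

    points: successive (x, y) touch samples around the menu center, in screen
    coordinates (y grows downward, so an increasing angle means a visually
    clockwise motion).
    """
    angles = [math.atan2(y - center[1], x - center[0]) for x, y in points]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        delta = a1 - a0
        # unwrap jumps across the -pi/pi boundary
        if delta > math.pi:
            delta -= 2 * math.pi
        elif delta < -math.pi:
            delta += 2 * math.pi
        total += delta
    return "clockwise" if total > 0 else "counterclockwise"
```

The same accumulated angle could also be thresholded to require a minimum rotation speed, as the previous paragraph suggests.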
The hidden options may be displayed at an outer circumferential position of the circular menu, and for better interaction, the hidden options may also be sequentially displayed in the interface in the same direction as the first direction. For example, when the first direction is clockwise, the hidden options may be sequentially displayed at the outer circumferential position of the "home" control in the clockwise direction, forming a ring-shaped control option layout.
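Laying the hidden options out clockwise around the "home page" control can be sketched as evenly spaced points on a circle. The radius and the top start angle are assumptions:

```python
import math

def hidden_option_positions(count, center=(0.0, 0.0), radius=100.0, start_deg=-90.0):
    """(x, y) screen positions placing `count` hidden options clockwise
    around the home control, starting from the top (assumed layout).

    Screen y grows downward, so increasing the angle moves clockwise.
    """
    step = 360.0 / count
    positions = []
    for i in range(count):
        a = math.radians(start_deg + i * step)
        positions.append((center[0] + radius * math.cos(a),
                          center[1] + radius * math.sin(a)))
    return positions
```

Revealing the options one by one in this order gives the sequential clockwise appearance described above; traversing the list in reverse gives the counterclockwise dismissal used later.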
In some embodiments, the displayed hidden options may be hidden again through the reverse of the first direction rotation touch instruction. That is, the interaction method of the touch menu further comprises:
s63: receiving a second direction rotation touch instruction which is input by a user and used for canceling the display of the hidden option in the touch menu;
s64: and responding to the second direction rotation touch instruction, and controlling the display to sequentially cancel the display of the hidden options according to the second direction.
Wherein the second direction is the rotational direction opposite to the first direction; for example, when the first direction is clockwise rotation, the second direction is counterclockwise rotation. After the display device 200 displays the hidden options, the user may input a second direction rotation touch instruction through the touch screen. The controller 250 may detect the touch command through the touch sensing module 277, switch the hidden options back to the hidden state, and no longer display them in the interface.
Also, to obtain a better interaction effect, the hidden options may be hidden in sequence along the second direction; for example, a plurality of control options may be dismissed from the interface one by one in the counterclockwise direction.
Based on the above interaction method of the touch menu, some embodiments of the present application further provide a display device 200, which includes a display 275, a touch sensing module 277, and a controller 250. The display 275 is configured to display a touch menu; the touch menu is a circular menu control formed by at least one control option; the touch sensing module 277 is configured to detect a touch command input by a user.
The controller 250 is configured to perform the following program steps:
s61: receiving a first direction rotation touch instruction which is input by a user and used for displaying hidden options in the touch menu;
s62: and responding to the first direction rotation touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
As shown in fig. 27, in some embodiments, if the touch menu is a circular or annular menu including at least one control option, the display radius (or diameter) of the touch menu may also be set according to the number of control options it contains. That is, some embodiments of the present application further provide a touch menu interaction method, including:
s71: acquiring a multi-finger touch instruction which is input by a user and used for calling a touch menu;
s72: responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
s73: and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
In practical applications, the larger the display diameter of the touch menu, the larger the display space it provides and the more control options it can accommodate. Therefore, after the user inputs the multi-finger touch instruction to call up the touch menu, the controller 250 may further extract the number of control options contained in the touch menu and determine the initial display diameter according to that number, so as to display the touch menu at the corresponding size.
For example, if 10 control options can be displayed simultaneously in the touch menu at the preset initial display diameter, then when the touch menu contains more than 10 control options, the display diameter may be increased to obtain a larger display space and show them all. Similarly, when the touch menu contains fewer than 10 control options, the display diameter may be reduced to obtain a more varied display effect and improve the user experience.
It should be noted that upper and lower limits of the display diameter may be set according to the size and resolution of the display 275 of the display device 200, with corresponding upper and lower limits on the number of control options. When the number of control options exceeds the upper limit, the display diameter is no longer increased, and all control options are presented by switching display pages. When the number of control options is below the lower limit, the display diameter is no longer reduced, and a better display layout is obtained by adjusting the spacing between the control options.
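Steps S71-S73 with the clamping just described can be sketched as follows. The icon spacing and the diameter limits are placeholder values that would in practice derive from the size and resolution of the display 275:

```python
import math

def menu_diameter(option_count, icon_spacing=60.0,
                  min_diameter=300.0, max_diameter=900.0):
    """Initial display diameter for the ring menu (steps S71-S73).

    The ring's circumference must hold one icon slot of `icon_spacing`
    pixels per option, so the unclamped diameter is circumference / pi.
    The result is clamped to display-derived limits; beyond the upper
    limit, extra options go onto additional display pages instead.
    """
    needed = option_count * icon_spacing / math.pi
    return max(min_diameter, min(needed, max_diameter))
```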
Based on the touch menu interaction method, some embodiments of the present application further provide a display device 200, which includes a display 275, a touch sensing module 277, and a controller 250. The display 275 is configured to display a touch menu; the touch menu is a circular menu control formed by at least one control option; the touch sensing module 277 is configured to detect a touch command input by a user.
The controller 250 is configured to perform the following program steps:
s71: acquiring a multi-finger touch instruction which is input by a user and used for calling a touch menu;
s72: and responding to the multi-finger touch instruction, and controlling the display to display the touch menu.
According to the above technical solutions, the present application provides a display device 200 and a touch menu interaction method. The interaction method can be applied to the display device 200 to realize different functions on the touch menu. In the interaction process, a user can input a multi-finger touch instruction through the touch screen to call up a touch menu. After the touch menu is called, the user can also input a multi-finger rotating touch instruction to zoom the diameter of the touch menu, and input a single-finger sliding touch instruction to adjust the positions of the control options on the touch menu, so that any control option can be opened by a single-finger click touch instruction. For an opened control option, an adjusting control is displayed in the middle area of the touch menu, so that the user can input a single-finger sliding touch instruction to adjust the control parameters.
The embodiments provided in the present application are only a few examples of the general concept of the present application, and do not limit the scope of the present application. Any other embodiments extended according to the scheme of the present application without inventive efforts will be within the scope of protection of the present application for a person skilled in the art.

Claims (31)

CN202010474023.1A · Priority 2020-05-29 · Filed 2020-05-29 · Display device and touch menu interaction method · Active · Granted as CN113747216B (en)

Priority Applications (1)

Application Number: CN202010474023.1A · Priority Date: 2020-05-29 · Filing Date: 2020-05-29 · Title: Display device and touch menu interaction method


Publications (2)

CN113747216A (en) · Publication Date: 2021-12-03
CN113747216B (en) · Publication Date: 2023-09-08

Family ID: 78724555

Family Applications (1)

Application Number: CN202010474023.1A · Priority Date: 2020-05-29 · Filing Date: 2020-05-29 · Status: Active · Granted Publication: CN113747216B (en)

Country Status (1): CN · CN113747216B (en)

Cited By (3) (* cited by examiner, † cited by third party)

CN114706515A (en)* · 2022-04-26 · 2022-07-05 · 长沙朗源电子科技有限公司 · Graphic three-finger rotation method and device based on electronic whiteboard
CN116737019A (en)* · 2023-08-15 · 2023-09-12 · 山东泰克信息科技有限公司 · Intelligent display screen induction identification control management system
CN120406791A (en)* · 2025-07-02 · 2025-08-01 · 九维算经(浙江)科技软件有限公司 · Rotatable human-computer interaction menu interaction method, electronic device and computer-readable storage medium

Citations (7) (* cited by examiner, † cited by third party)

CN1713123A (en)* · 2004-06-26 · 2005-12-28 · 鸿富锦精密工业(深圳)有限公司 · Ring screen display menu selection method and display device
JP2006139615A (en) · 2004-11-12 · 2006-06-01 · Access Co Ltd · Display device, menu display program, and tab display program
US20120131503A1 (en)* · 2010-11-22 · 2012-05-24 · Shao-Chieh Lin · Application displaying method for touch-controlled device and touch-controlled device thereof
CN103677558A (en)* · 2012-08-29 · 2014-03-26 · 三星电子株式会社 · Method and apparatus for controlling zoom function in electronic device
CN103713809A (en)* · 2012-09-29 · 2014-04-09 · 中国移动通信集团公司 · Dynamic generating method and dynamic generating device for annular menu of touch screen
CN106033300A (en)* · 2015-03-10 · 2016-10-19 · 联想(北京)有限公司 · A control method and an electronic apparatus
KR20190114348A (en)* · 2018-03-29 · 2019-10-10 · 주식회사 네틱스 · Apparatus and method for multi-touch recognition





Legal Events

PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
