Disclosure of Invention
Embodiments of the present application provide a touch recognition method and a display device, which are used to solve the problem that existing schemes cannot perform touch-area recognition according to the actual touch type.
In a first aspect, an embodiment of the present application provides a touch recognition method, which is applied to a touch screen, where the touch screen includes an emitting edge and a receiving edge, the emitting edge is configured to emit light along a plurality of scanning directions, and the receiving edge is configured to receive the emitted light along the plurality of scanning directions, and the method includes:
acquiring the number of touch areas detected in each of a plurality of scanning directions;
determining at least one target scanning direction according to the number of touch areas corresponding to each scanning direction;
and determining the position information of a touch point in the touch screen according to the at least one target scanning direction.
In one possible implementation, determining at least one target scan direction according to the number of touch areas corresponding to each scan direction includes:
determining a reference direction among the plurality of scanning directions, wherein, among the plurality of scanning directions, the number of touch areas detected in the reference direction is the largest;
determining a first number according to the number of touch areas detected in the reference direction;
and determining a first number of target scanning directions among the plurality of scanning directions.
In one possible embodiment, determining the first number according to the number of touch areas detected in the reference direction includes:
judging whether the number of touch areas detected in the reference direction is greater than a first threshold;
if yes, determining the first number as M, wherein M is an integer greater than 1;
if not, determining the first number as N, wherein N is an integer greater than 1, and M is smaller than N.
In one possible implementation, determining a first number of target scan directions among the plurality of scan directions includes:
grouping the plurality of scanning directions to obtain a first number of scanning direction groups;
and determining one target scanning direction in each scanning direction group respectively, to obtain the first number of target scanning directions.
In one possible embodiment, in one scan direction group, the corresponding target scan direction is the scan direction in which the number of detected touch areas is the largest.
In one possible embodiment, grouping the plurality of scan directions to obtain a first number of scan direction groups includes:
acquiring an initial identifier of each scanning direction, wherein the initial identifiers of all scanning directions are the same;
and grouping the plurality of scanning directions according to the initial identification of each scanning direction and the first number to obtain a first number of scanning direction groups.
In one possible embodiment, the method further comprises:
updating the identification of each target scanning direction, wherein the updated identification is different from the initial identification.
In one possible embodiment, the method further comprises:
and deleting the touch areas corresponding to the scanning directions whose identifiers are still the initial identifier.
In one possible implementation, determining the location information of the touch point in the touch screen according to the at least one target scanning direction includes:
determining a position and an area of a touch area detected in the at least one target scanning direction;
and determining the position information of the touch point in the touch screen according to the position and the area of the touch area detected in the at least one target scanning direction.
In a second aspect, an embodiment of the present application provides a touch recognition device including an emitting edge for emitting light in a plurality of scanning directions and a receiving edge for receiving the emitted light in the plurality of scanning directions, the device including:
an acquisition module for acquiring the number of touch areas detected in each of a plurality of scanning directions;
the first processing module is used for determining at least one target scanning direction according to the number of the touch areas corresponding to each scanning direction;
and the second processing module is used for determining the position information of the touch point in the touch screen according to the at least one target scanning direction.
In one possible implementation manner, the first processing module is specifically configured to:
determining a reference direction among the plurality of scanning directions, wherein, among the plurality of scanning directions, the number of touch areas detected in the reference direction is the largest;
determining a first number according to the number of touch areas detected in the reference direction;
and determining a first number of target scanning directions among the plurality of scanning directions.
In one possible implementation manner, the first processing module is specifically configured to:
judging whether the number of touch areas detected in the reference direction is greater than a first threshold;
if yes, determining the first number as M, wherein M is an integer greater than 1;
if not, determining the first number as N, wherein N is an integer greater than 1, and M is smaller than N.
In one possible implementation manner, the first processing module is specifically configured to:
grouping the plurality of scanning directions to obtain a first number of scanning direction groups;
and determining one target scanning direction in each scanning direction group respectively, to obtain the first number of target scanning directions.
In one possible embodiment, in one scan direction group, the corresponding target scan direction is the scan direction in which the number of detected touch areas is the largest.
In one possible implementation manner, the first processing module is specifically configured to:
acquiring an initial identifier of each scanning direction;
and grouping the plurality of scanning directions according to the initial identification of each scanning direction and the first number to obtain a first number of scanning direction groups.
In one possible embodiment, the first processing module is further configured to:
updating the identification of each target scanning direction, wherein the updated identification is different from the initial identification.
In one possible embodiment, the first processing module is further configured to:
and deleting the touch areas corresponding to the scanning directions whose identifiers are still the initial identifier.
In a possible implementation manner, the second processing module is specifically configured to:
determining a position and an area of a touch area detected in the at least one target scanning direction;
and determining the position information of the touch point in the touch screen according to the position and the area of the touch area detected in the at least one target scanning direction.
In a third aspect, an embodiment of the present application provides a display apparatus, including:
a memory for storing a program;
a processor for executing the program stored in the memory, the processor being configured to perform the touch recognition method according to any one of the first aspects when the program is executed.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium comprising instructions that, when run on a computer, cause the computer to perform the touch recognition method according to any one of the first aspects.
The touch recognition method and display device provided by the embodiments of the present application are applied to a touch screen comprising an emitting edge and a receiving edge, where the emitting edge emits light along a plurality of scanning directions and the receiving edge receives the emitted light along the plurality of scanning directions. First, the number of touch areas detected in each of the plurality of scanning directions is acquired; at least one target scanning direction is determined according to the number of touch areas corresponding to each scanning direction; and then the position information of touch points in the touch screen is determined according to the at least one target scanning direction. In this scheme, whether the touch is multi-point or few-point is determined according to the number of touch areas detected in each scanning direction, the target scanning directions are determined accordingly, and the positions of the touch points are then determined based on the target scanning directions, so that touch points can be recognized under different numbers of touch areas with high recognition efficiency.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
For ease of understanding, first, the concepts related to the present application will be described.
Touch screen: a touch screen is an electronic system that can detect the presence and position of a touch within a display area, and it greatly simplifies human-computer interaction. Among current touch technologies, infrared touch technology is widely applied because of its strong environmental adaptability, long service life, and large number of identifiable touch points.
Fig. 1 is a schematic structural diagram of an infrared touch screen according to an embodiment of the present application, as shown in fig. 1, the infrared touch screen is composed of a long-side transmitting side 12, a long-side receiving side 11, a short-side transmitting side 14 and a short-side receiving side 13. The transmitting side is provided with a plurality of transmitting lamps, and the corresponding receiving side is provided with a plurality of receiving lamps.
As shown in fig. 1, a 1-to-many scanning manner is generally adopted: one transmitting lamp emits light, and a plurality of opposite receiving lamps receive it, forming an optical network. Touch behavior is judged from the different forms the optical network takes with and without a touch.
Scanning direction: in the 1-to-n (n > =1) scanning mode, each optical path has a different angle for a particular lamp. For a particular emission lamp, each angle of the n light paths corresponding to it is referred to as a scan direction. Thus, a 1-to-n scanning scheme has n scanning directions, each consisting of a set of parallel light paths with the same slope.
Fig. 2 is a schematic diagram of a long-side scanning direction provided by an embodiment of the present application, and fig. 3 is a schematic diagram of a short-side scanning direction provided by an embodiment of the present application, where, as shown in fig. 2 and fig. 3, the long side includes a long transmitting side and a long receiving side, and the short side also includes a short transmitting side and a short receiving side. The long emitting side emits light, and the long receiving side receives the light emitted by the long emitting side; the short transmitting side transmits light, and the short receiving side receives the light transmitted by the short transmitting side.
Touch area: a particular scan direction corresponds to a set of parallel scan light paths. When a touch action occurs, each contact point can cover a plurality of continuous parallel light paths, and the continuous shielded light paths are a touch area in the scanning direction.
Fig. 4 is a schematic view of light path shielding provided in an embodiment of the present application; as shown in fig. 4, the dashed lines represent blocked light paths. Among the consecutively blocked light paths, the first is called the start boundary of the touch area and the last is called the end boundary. These boundaries merely delimit the blocked light paths and carry no temporal order; their ordering can be defined as desired.
In existing infrared multi-point recognition schemes, the same scanning-direction light paths and the same algorithm are used in every scenario to determine the position of a touch point, so high precision for few-point touch and high speed for multi-point touch cannot both be achieved. The embodiment of the present application therefore provides a touch screen control scheme that solves this problem.
Fig. 5 is a schematic flow chart of a touch recognition method according to an embodiment of the present application, where the method is applied to a touch screen, and the touch screen includes an emitting edge and a receiving edge, where the emitting edge is used for emitting light along a plurality of scanning directions, and the receiving edge is used for receiving the emitted light along the plurality of scanning directions, and as shown in fig. 5, the method may include:
s51, the number of touch areas detected in each of a plurality of scanning directions is acquired.
In the present application, the touch screen has a plurality of scanning directions, each being a direction in which light is emitted from the emitting edge toward the receiving edge. One emitting edge may have multiple scanning directions: any emitting lamp on that edge can emit light along each of them. Conversely, for any one scanning direction, all emitting lamps on the corresponding emitting edge can emit light in that direction, and the rays emitted by different lamps in the same scanning direction are parallel to one another. The rays of all scanning directions together form the optical network.
When a user performs touch operation on the touch screen, light rays on the touch screen are shielded. Meanwhile, for each scanning direction, a touch operation of a user on the touch screen may block one or more light rays in the scanning direction. Therefore, the number of touch areas detected in each scanning direction can be obtained from the light blocked in each scanning direction.
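As a minimal illustration of this counting step (the function name and data representation are hypothetical, not part of the claimed method), the number of touch areas in one scanning direction is simply the number of maximal runs of consecutively blocked light paths:

```python
def count_touch_areas(blocked):
    """Count maximal runs of consecutively blocked parallel light paths.

    `blocked` holds one boolean per light path in a single scanning
    direction; each run of True values is one touch area, per the
    definition of a touch area given earlier.
    """
    count = 0
    previous = False
    for is_blocked in blocked:
        if is_blocked and not previous:
            count += 1  # a new run of blocked paths begins here
        previous = is_blocked
    return count
```

For instance, a direction whose light paths are blocked in two separate stretches yields a count of 2, matching two covered contact points.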
S52, determining at least one target scanning direction according to the number of touch areas corresponding to each scanning direction.
When a user performs a touch operation on the touch screen, blocked light may go undetected in some scanning directions; that is, some touch areas are not detected in those directions, so the number of touch areas may differ between scanning directions. In the embodiment of the present application, at least one target scanning direction is therefore determined according to the number of touch areas corresponding to each scanning direction.
S53, determining the position information of the touch point in the touch screen according to the at least one target scanning direction.
After the at least one target scanning direction is determined, the position information of a touch point in the touch screen may be determined based on it. When the number of touch areas is large, the touch is multi-point, and the touch screen requires a fast touch response; fewer target scanning directions are therefore determined, and the touch point positions are computed from them. When the number of touch areas is small, the touch is few-point, and the touch screen requires a more accurate response; more target scanning directions are therefore determined and used to compute the touch point positions.
The touch recognition method provided by the embodiment of the present application is applied to a touch screen comprising an emitting edge and a receiving edge, where the emitting edge emits light along a plurality of scanning directions and the receiving edge receives the emitted light along the plurality of scanning directions. First, the number of touch areas detected in each of the plurality of scanning directions is acquired; at least one target scanning direction is determined according to the number of touch areas corresponding to each scanning direction; and then the position information of touch points in the touch screen is determined according to the at least one target scanning direction. In this scheme, whether the touch is multi-point or few-point is determined according to the number of touch areas detected in each scanning direction, the target scanning directions are determined accordingly, and the positions of the touch points are then determined based on the target scanning directions, so that touch points can be recognized under different numbers of touch areas with high recognition efficiency.
The following describes the scheme of the present application in detail with reference to the drawings.
After the user performs the touch operation on the touch screen, the number of touch areas detected in each of the plurality of scanning directions can be obtained, so that the target scanning direction is determined according to the number of touch areas corresponding to each scanning direction.
Specifically, a reference direction may be determined among the plurality of scanning directions, namely the direction in which the number of detected touch areas is the largest. If several directions tie for the largest number of detected touch areas, one of them may be selected at random as the reference direction.
After the reference direction is determined, the first number may be determined according to the number of touch areas detected by the reference direction.
Ideally, when the user leaves i touch points on the touch screen, every scanning direction detects i touch areas. In practice, however, some touch areas in some scanning directions may go undetected, for example when a touch point is too small. In the embodiment of the present application, the scanning direction with the largest number of detected touch areas is therefore taken as the reference direction, and the touch areas detected in the reference direction are taken as the touch points to be identified.
According to the number of touch areas detected in the reference direction, it can be determined whether the touch operation is multi-point or few-point, and the first number is determined from that result.
Specifically, it is judged whether the number of touch areas detected in the reference direction is greater than a first threshold. If yes, the touch operation is recognized as multi-point touch and the first number is determined as M, where M is an integer greater than 1; if not, the touch operation is recognized as few-point touch and the first number is determined as N, where N is an integer greater than 1 and M is smaller than N.
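The threshold decision just described can be sketched as a small helper (the function name is hypothetical; k, t, M and N are the reference-direction count, the first threshold, and the two mode sizes described above, with M < N):

```python
def determine_first_number(k, t, M, N):
    """Return how many target scanning directions to use.

    k is the number of touch areas detected in the reference direction;
    if k exceeds the threshold t the touch is treated as multi-point and
    the smaller count M is used, otherwise the larger count N is used.
    """
    assert 1 < M < N, "M and N must be integers greater than 1 with M < N"
    return M if k > t else N
```

Note that equality with the threshold falls on the few-point side, since the condition is strictly "greater than".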
After the first number is determined, a first number of target scan directions may be determined from the plurality of scan directions.
Specifically, a plurality of scan directions may be grouped to obtain a first number of scan direction groups.
For example, an initial identifier of each scanning direction may be acquired, and the plurality of scanning directions may then be grouped according to the initial identifier of each scanning direction and the first number, resulting in a first number of scanning direction groups. The initial identifier of every scanning direction is the same; for example, it may be set to 0. In subsequent steps, the identifiers of some scanning directions may be updated.
After the first number of scan direction groups is obtained, a target scan direction is determined in each scan direction group, respectively, so that a first number of target scan directions is obtained.
Optionally, after each target scanning direction is determined, its identifier may be updated so that the updated identifier differs from the initial identifier; for example, the identifier of a target scanning direction may be set to 1. The target scanning directions then carry a different identifier from the non-target scanning directions and can be effectively distinguished from them.
After the identifiers are updated, the touch areas corresponding to the scanning directions still carrying the initial identifier may be deleted. Since the identifiers of the target scanning directions have been updated while those of the non-target scanning directions remain the initial identifier, deleting the touch areas of directions with the initial identifier deletes exactly the touch areas of the non-target scanning directions, so that the touch points are identified only from the touch areas of the target scanning directions.
When multi-point touch is determined, the plurality of scanning directions are divided into M groups and one target scanning direction is determined in each group, yielding M target scanning directions; when few-point touch is determined, the plurality of scanning directions are divided into N groups and one target scanning direction is determined in each group, yielding N target scanning directions.
For each scanning direction group, the scanning direction with the largest number of detected touch areas may be determined as that group's target scanning direction.
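The grouping and per-group selection can be sketched as follows (a hypothetical helper; the list-of-pairs representation and the tie-breaking choice of the earlier direction are assumptions for illustration):

```python
def select_target_directions(counts, first_number):
    """Split the scanning directions into `first_number` groups and pick,
    in each group, the direction with the most detected touch areas.

    `counts` is a list of (direction, touch_area_count) pairs in scan
    order; directions left over after equal division fall into the last
    group, and ties resolve to the earlier direction.
    """
    group_size = len(counts) // first_number
    targets = []
    for i in range(first_number):
        start = i * group_size
        # the last group absorbs any remainder
        end = start + group_size if i < first_number - 1 else len(counts)
        group = counts[start:end]
        targets.append(max(group, key=lambda pair: pair[1])[0])
    return targets
```

Because M < N, the multi-point path produces fewer, coarser groups (faster), while the few-point path produces more groups and hence more target directions (more precise).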
After the first number of target scan directions is determined, the position and area of the touch area detected in the target scan directions may be determined, and the position information of the touch point in the touch screen may be determined according to the position and area of the touch area detected in the target scan directions, and the position and area size of the touch point may be output.
The number of target scanning directions is M for multi-point touch and N for few-point touch. Because M is smaller than N, during multi-point touch the position information of the touch points is determined from fewer target scanning directions, so it can be acquired quickly and the response speed is high. During few-point touch, the position information is determined from more target scanning directions, so the determined positions are more accurate.
Fig. 6 is a flow chart of a touch recognition method according to an embodiment of the present application, as shown in fig. 6, including:
S601, setting various parameters.
The parameters comprise the number n of scanning directions, the angle of each lamp in every scanning direction, a touch-area-count threshold t, the number N of scanning directions used in the few-point high-precision mode, and the number M of scanning directions used in the multi-point high-speed response mode. In addition, an identifier flg is set for each scanning direction and initialized to 0; this value is the initial identifier of each scanning direction.
S602, scanning in one period is executed, and the number of touch areas in each scanning direction is counted.
Scanning in the n scanning directions is executed in one period, and the touch areas in each scanning direction, as well as their number, are counted and recorded. Fig. 7 is a schematic view of light shielding by touch areas according to an embodiment of the present application; as shown in fig. 7, there are two touch points, i.e., the black circular areas.
The touch screen in fig. 7 includes 4 scanning directions, i.e., two scanning directions on the long side and two on the short side. The blocked light rays are also shown, indicated by the dashed lines. These rays pass through the areas where the two touch points are located: when the user touches the screen, they are blocked, and the touch points can be determined from the blocked rays.
Fig. 8 is a schematic view of short-side light shielding provided in an embodiment of the present application; as shown in fig. 8, the short side includes two scanning directions, one vertical and one oblique. Fig. 9 is a schematic view of long-side light shielding provided in an embodiment of the present application; as shown in fig. 9, the long side likewise includes two scanning directions, one horizontal and one oblique.
S603, selecting a reference direction and recording the number of touch areas detected in the reference direction.
And traversing each scanning direction, and selecting the scanning direction with the largest number of touch areas as a reference direction Dir.
Fig. 7 to 9 illustrate a case where all touch areas are detected, and in practice, some touch areas cannot be detected in some scanning directions.
For example, assume that a touch screen includes 12 scanning directions in total, of which 6 are long-side scanning directions and the other 6 are short-side scanning directions. The numbers of touch areas detected in the 12 scanning directions are as follows:
Dir1 (scanning direction 1): 2 touch areas; Dir2 (scanning direction 2): 1 touch area;
Dir3 (scanning direction 3): 1 touch area; Dir4 (scanning direction 4): 2 touch areas;
Dir5 (scanning direction 5): 3 touch areas; Dir6 (scanning direction 6): 2 touch areas;
Dir7 (scanning direction 7): 1 touch area; Dir8 (scanning direction 8): 2 touch areas;
Dir9 (scanning direction 9): 2 touch areas; Dir10 (scanning direction 10): 1 touch area;
Dir11 (scanning direction 11): 2 touch areas; Dir12 (scanning direction 12): 1 touch area.
Since the scanning direction in which the number of detected touch areas is the largest is Dir5, Dir5 is determined as the reference direction.
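The selection of the reference direction can be reproduced in a few lines; the counts below are taken directly from the example above:

```python
# Touch-area counts from the 12-direction example above.
counts = {
    "Dir1": 2, "Dir2": 1, "Dir3": 1, "Dir4": 2,
    "Dir5": 3, "Dir6": 2, "Dir7": 1, "Dir8": 2,
    "Dir9": 2, "Dir10": 1, "Dir11": 2, "Dir12": 1,
}
# The reference direction is the one with the most detected touch areas.
reference_direction = max(counts, key=counts.get)
```

Here only Dir5 reaches 3 touch areas, so it is picked unambiguously; with a tie, the description above allows any of the tied directions to serve as the reference.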
S604, judging whether the number k of the touch areas in the reference direction Dir is larger than t, if so, executing S605, otherwise, executing S606.
S605, entering a multi-point high-speed response mode.
Multi-point high-speed response mode: the n scanning directions are divided equally into M groups, with any directions that cannot be divided equally placed into the last group. In each group, the direction with the largest number of touch areas is selected and its flg is set to 1; that direction is the target scanning direction of the group, and setting flg to 1 updates its identifier. After all groups are processed, S607 is executed.
Assuming the touch area threshold t = 2, the number of target scanning directions is M = 3; every 4 scanning directions form a group, and in each group the scanning direction with the largest number of detected touch areas is selected as the target scanning direction.
the first group, Dir1-Dir4, selects Dir1;
the second group, Dir5-Dir8, selects Dir5;
the third group, Dir9-Dir12, selects Dir9.
S606, entering a few-point high-precision mode.
Few-point high-precision mode: the n scanning directions are divided equally into N groups, with any directions that cannot be divided equally placed into the last group. In each group, the direction with the largest number of touch areas is selected and its flg is set to 1; that direction is the target scanning direction of the group, and setting flg to 1 updates its identifier. After all groups are processed, S607 is executed.
Assuming the touch area threshold t = 3, the number of target scanning directions is N = 5; every 2 scanning directions form a group, and in each group the scanning direction with the largest number of detected touch areas is selected as the target scanning direction.
the first group, Dir1-Dir2, selects Dir1;
the second group, Dir3-Dir4, selects Dir4;
the third group, Dir5-Dir6, selects Dir5;
the fourth group, Dir7-Dir8, selects Dir8 (Dir8 has 2 touch areas versus 1 for Dir7);
the fifth group, Dir9-Dir12, selects Dir9.
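Under the grouping rule stated above (remainder into the last group, largest touch-area count wins, earlier direction on a tie), the few-point selection for this example can be checked mechanically; note that with these counts the fourth group's maximum is Dir8, since Dir8 has 2 touch areas against Dir7's 1:

```python
order = [f"Dir{i}" for i in range(1, 13)]
counts = {"Dir1": 2, "Dir2": 1, "Dir3": 1, "Dir4": 2, "Dir5": 3, "Dir6": 2,
          "Dir7": 1, "Dir8": 2, "Dir9": 2, "Dir10": 1, "Dir11": 2, "Dir12": 1}
N = 5                       # few-point high-precision mode
size = len(order) // N      # 2 directions per group, remainder to the last
groups = [order[i * size:(i + 1) * size] for i in range(N - 1)]
groups.append(order[(N - 1) * size:])  # the last group gets Dir9..Dir12
targets = [max(g, key=lambda d: counts[d]) for g in groups]
```

The fifth group contains a tie between Dir9 and Dir11 (2 touch areas each); taking the earlier direction yields Dir9, consistent with the text.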
S607, traversing the scanning directions and deleting the touch areas in the unselected scanning directions.
Each scanning direction is traversed, and all touch areas in directions whose flg is 0 are deleted; that is, the touch areas corresponding to scanning directions still carrying the initial identifier are deleted, and the touch-area count of those directions becomes 0.
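Taken together, the flagging in S605/S606 and the deletion in S607 amount to marking the selected directions and discarding the rest; a minimal sketch (function name and data shapes hypothetical) might look like this:

```python
def retain_target_areas(areas_by_direction, target_directions):
    """Drop every touch area whose scanning direction still carries the
    initial identifier flg = 0, keeping only the target directions."""
    flg = {d: 0 for d in areas_by_direction}   # initial identifier
    for d in target_directions:
        flg[d] = 1                              # updated identifier
    return {d: (areas if flg[d] == 1 else [])
            for d, areas in areas_by_direction.items()}
```

Downstream steps (point position and area calculation, ghost-point removal) then operate only on the surviving touch areas.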
S608, the point position and the point area are calculated using the remaining touch area.
S609, performing ghost-point removal, followed by trajectory tracking and smoothing processing.
S610, converting and outputting coordinates of all touch points.
After the touch area in the target scanning direction is determined, the position information of the touch point is determined according to the touch area in the target scanning direction, and the coordinates of the touch point are output.
The touch recognition method provided by the embodiment of the present application is applied to a touch screen comprising an emitting edge and a receiving edge, where the emitting edge emits light along a plurality of scanning directions and the receiving edge receives the emitted light along the plurality of scanning directions. First, the number of touch areas detected in each of the plurality of scanning directions is acquired; at least one target scanning direction is determined according to the number of touch areas corresponding to each scanning direction; and then the position information of touch points in the touch screen is determined according to the at least one target scanning direction. In this scheme, whether the touch is multi-point or few-point is determined according to the number of touch areas detected in each scanning direction, the target scanning directions are determined accordingly, and the positions of the touch points are then determined based on the target scanning directions, so that touch points can be recognized under different numbers of touch areas with high recognition efficiency.
Fig. 10 is a schematic structural diagram of a touch recognition device according to an embodiment of the present application, including an emitting edge and a receiving edge, where the emitting edge is configured to emit light in a plurality of scanning directions, and the receiving edge is configured to receive the emitted light in the plurality of scanning directions, and as shown in fig. 10, the device includes:
an acquisition module 101 for acquiring the number of touch areas detected in each of a plurality of scanning directions;
A first processing module 102, configured to determine at least one target scanning direction according to the number of touch areas corresponding to each scanning direction;
And a second processing module 103, configured to determine location information of a touch point in the touch screen according to the at least one target scanning direction.
In one possible implementation, the first processing module 102 is specifically configured to:
Determining a reference direction among the plurality of scanning directions; wherein, among the plurality of scanning directions, the touch area detected in the reference direction is the largest;
determining a first number according to the number of touch areas detected in the reference direction;
A first number of target scan directions is determined among the plurality of scan directions.
In one possible implementation, the first processing module 102 is specifically configured to:
determining whether the number of touch areas detected in the reference direction is greater than a first threshold;
if so, determining the first number to be M, where M is an integer greater than 1;
if not, determining the first number to be N, where N is an integer greater than 1 and M is smaller than N.
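The determination of the first number can be sketched as follows. This is a minimal illustration only: the threshold and the concrete values of M and N are hypothetical placeholders, since the embodiment only requires that M and N are integers greater than 1 with M smaller than N.

```python
def determine_first_number(num_areas_in_reference: int,
                           first_threshold: int = 5,
                           m: int = 2, n: int = 4) -> int:
    """Decide how many target scanning directions to use.

    The threshold (5) and the values M=2, N=4 are illustrative
    placeholders, not values fixed by the embodiment.
    """
    # A large number of touch areas in the reference direction suggests
    # multi-point touch, in which case the smaller number M of target
    # scanning directions is used; otherwise the larger number N is used.
    if num_areas_in_reference > first_threshold:
        return m
    return n
```

With the placeholder values above, a reference direction reporting many touch areas yields M target directions, and a sparse one yields N.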
In one possible implementation, the first processing module 102 is specifically configured to:
grouping the plurality of scanning directions to obtain a first number of scanning direction groups;
determining one target scanning direction in each scanning direction group, thereby obtaining the first number of target scanning directions.
In one possible embodiment, in one scan direction group, the corresponding target scan direction is the scan direction in which the number of detected touch areas is the largest.
In one possible implementation, the first processing module 102 is specifically configured to:
acquiring an initial identifier of each scanning direction;
and grouping the plurality of scanning directions according to the initial identifier of each scanning direction and the first number to obtain a first number of scanning direction groups.
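The grouping and target selection steps can be sketched together. Grouping by identifier modulo the first number is only one plausible reading of "grouping according to the initial identifier and the first number"; within each group, the target is the direction with the most detected touch areas, as stated above.

```python
def select_target_directions(area_counts: dict[int, int],
                             first_number: int) -> list[int]:
    """Group scanning directions and pick one target per group.

    `area_counts` maps the initial identifier of each scanning
    direction to the number of touch areas detected in that direction.
    Partitioning by identifier modulo `first_number` is an assumed,
    illustrative grouping rule; any partition into `first_number`
    groups would be processed the same way.
    """
    groups: dict[int, list[int]] = {}
    for direction_id in area_counts:
        groups.setdefault(direction_id % first_number, []).append(direction_id)
    # In each group, the target scanning direction is the one in which
    # the number of detected touch areas is the largest.
    return [max(members, key=lambda d: area_counts[d])
            for members in groups.values()]
```

For example, with four scanning directions whose detected touch-area counts are {0: 1, 1: 3, 2: 2, 3: 5} and a first number of 2, directions 2 and 3 are selected as the two target scanning directions.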
In one possible implementation, the first processing module 102 is further configured to:
updating the identifier of each target scanning direction, wherein the updated identifier is different from the initial identifier.
In one possible implementation, the first processing module 102 is further configured to:
and deleting the touch areas detected in the scanning directions that retain the initial identifier.
In one possible implementation, the second processing module 103 is specifically configured to:
determining a position and an area of a touch area detected in the at least one target scanning direction;
And determining the position information of the touch point in the touch screen according to the position and the area of the touch area detected in the at least one target scanning direction.
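One way the position and area of the detected touch areas might be fused into a touch-point coordinate is an area-weighted centroid. This fusion rule is an assumption for illustration; the embodiment only requires that both the position and the area of each touch area detected in the target scanning directions are used.

```python
def locate_touch_point(target_areas: list[tuple[float, float, float]]) -> tuple[float, float]:
    """Fuse touch areas from the target scanning directions into one point.

    Each entry is (x, y, area) for a touch area detected in one target
    scanning direction.  The area-weighted centroid used here is an
    illustrative choice, not the rule fixed by the embodiment.
    """
    total_area = sum(a for _, _, a in target_areas)
    # Larger touch areas contribute more strongly to the fused position.
    x = sum(x * a for x, _, a in target_areas) / total_area
    y = sum(y * a for _, y, a in target_areas) / total_area
    return x, y
```

Two equal-area detections symmetric about a point fuse to that point's coordinates.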
The device provided by the embodiment of the application can be used for executing the technical scheme of the embodiment of the method, and the implementation principle and the technical effect are similar, and are not repeated here.
Fig. 11 is a schematic hardware structure of a display device according to an embodiment of the present application, as shown in fig. 11, where the display device includes: at least one processor 111 and a memory 112. Wherein the processor 111 and the memory 112 are connected by a bus 113.
Optionally, the display device further comprises a communication component. For example, the communication component may include a receiver and/or a transmitter.
In a specific implementation, the at least one processor 111 executes computer-executable instructions stored in the memory 112, so that the at least one processor 111 performs the touch recognition method as described above.
The specific implementation process of the processor 111 may refer to the above-mentioned method embodiment, and its implementation principle and technical effects are similar, and this embodiment will not be described herein again.
In the embodiment shown in fig. 11, it should be understood that the processor may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the present application may be embodied as being executed directly by a hardware processor, or executed by a combination of hardware and software modules in a processor.
The memory may comprise high speed RAM memory or may further comprise non-volatile storage NVM, such as at least one disk memory.
The bus may be an industry standard architecture (Industry Standard Architecture, ISA) bus, a peripheral component interconnect (Peripheral Component Interconnect, PCI) bus, an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, or the like. Buses may be divided into address buses, data buses, control buses, etc. For ease of illustration, the bus in the drawings of the present application is represented by a single line, but this does not mean that there is only one bus or only one type of bus.
The present application also provides a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the touch recognition method as described above.
The computer-readable storage medium described above may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk. A readable storage medium can be any available medium that can be accessed by a general-purpose or special-purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Alternatively, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC), or may reside as discrete components in a device.
The division of the units is merely a logical function division, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed between components may be an indirect coupling or communication connection via some interfaces, devices, or units, and may be in electrical, mechanical, or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, an optical disk, or other various media capable of storing program code.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present application, not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of the technical features thereof can be replaced by equivalents, and such modifications and substitutions do not depart from the spirit of the application.