Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein only to illustrate and explain the present disclosure, and not to limit the present disclosure.
Fig. 1 shows an embodiment of a method for detecting a gesture provided by the present disclosure, which includes the following steps:
in step S101, when a touch screen operation is detected, a sub-gesture with a constant number of touch points and a single movement direction is recognized. The method provided by this embodiment is applied to a terminal device.
In step S102, a gesture composed of two or more recognized consecutive sub-gestures is determined as the gesture detected for the touch screen operation.
According to this method for detecting a gesture, a gesture consisting of two or more sub-gestures can be detected, which increases the complexity of available gestures, reduces misoperation, and improves gesture diversity.
Fig. 2 shows another embodiment of the method for detecting a gesture provided in the present disclosure, in which the process of how to recognize a sub-gesture is described in detail. Specifically, the method comprises the following steps:
in step S201, it is determined whether a touch screen operation is detected; if yes, go to step S202; if not, the process ends. The method provided by this embodiment is applied to a terminal device.
In step S202, the number of touch points and the moving direction of the current gesture of the touch screen operation are obtained and recorded.
It will be appreciated by those skilled in the art that when the touch surface is non-planar, it is preferable to map the moving direction of the gesture of the touch screen operation onto a two-dimensional xy-plane with the center of the touch surface as the origin, so that accurate recognition and detection can be performed.
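As a non-limiting sketch, such a mapping might look as follows; the function name `map_to_plane`, the simple drop-z projection, and the coordinate conventions are illustrative assumptions rather than details taken from the disclosure (a real device would apply its own surface geometry):

```python
def map_to_plane(point3d, surface_center):
    """Project a touch sample from a non-planar surface onto a 2D
    xy-plane whose origin is the center of the touch surface.
    Here the projection simply drops the z component (an assumption
    for illustration only)."""
    x, y, _z = point3d
    cx, cy, _cz = surface_center
    return (x - cx, y - cy)
```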
In step S203, it is determined whether the number of contacts has not changed and the moving direction has not changed, and if yes, the step is repeatedly performed; if not, go to step S204.
For example, when the gesture changes from an initial single-finger horizontal leftward slide to a single-finger vertical downward slide, the number of contacts is unchanged but the moving direction changes; when the gesture changes from an initial single-finger horizontal leftward slide to a double-finger horizontal leftward slide, the number of contacts changes; and when the gesture changes from a single-finger horizontal leftward slide to a double-finger vertical downward slide, both the number of contacts and the moving direction change.
In other embodiments of the present disclosure, the moving distance of the contacts of a gesture may also be constrained: for example, the moving distance along the single moving direction must exceed a first preset threshold, while the moving distance in the other direction must not exceed a second preset threshold. For a horizontal left-to-right sub-gesture, for instance, the x value must increase by more than the first preset threshold; otherwise, the gesture is considered to be a possible misoperation by the user. Meanwhile, the maximum change of the vertical y value must not exceed the second preset threshold, which allows a small error to be tolerated so as to accommodate touch screen errors and human physiological errors; when the maximum change of the y value exceeds the second preset threshold, the moving direction is considered to be no longer single, i.e., to have changed.
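The single-direction test described above can be sketched as follows for a left-to-right sub-gesture. The threshold values and the function name are illustrative assumptions; the disclosure does not specify concrete numbers:

```python
FIRST_THRESHOLD = 50.0   # illustrative: minimum travel along the dominant axis
SECOND_THRESHOLD = 10.0  # illustrative: maximum drift allowed on the other axis

def is_single_direction(points):
    """Return True if a list of (x, y) samples of a left-to-right slide
    moves far enough along x while staying within tolerance on y."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    dx = xs[-1] - xs[0]          # net travel in the moving direction
    dy = max(ys) - min(ys)       # maximum drift perpendicular to it
    return dx > FIRST_THRESHOLD and dy <= SECOND_THRESHOLD
```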
In step S204, recognizing the gesture before the change as sub-gesture N, where N indicates that the sub-gesture is the Nth consecutive sub-gesture recognized for the touch screen operation, counting for example from 1.
For example, as shown in fig. 3, if the gesture starts with a single finger sliding laterally to the left and then, as shown in fig. 4, changes to a single finger sliding vertically downward, the initial single-finger lateral leftward slide is recognized as sub-gesture 1, and the subsequent single-finger vertical downward slide is recognized as sub-gesture 2.
For example, as shown in fig. 5, if the gesture changes from an initial single-finger horizontal rightward slide to a single-finger vertical downward slide, and then changes to a double-finger horizontal leftward slide, the initial single-finger horizontal rightward slide is recognized as sub-gesture 1, the single-finger vertical downward slide is recognized as sub-gesture 2, and the final double-finger horizontal leftward slide is recognized as sub-gesture 3.
For example, as shown in fig. 6, if the gesture changes from a single-finger horizontal rightward slide to a single-finger downward-right slide, the single-finger horizontal rightward slide is recognized as sub-gesture 1, and the single-finger downward-right slide is recognized as sub-gesture 2.
It should be noted that the multi-finger sub-gesture requires that the touch gestures of the respective fingers are substantially parallel, so that the moving direction of the sub-gesture is ensured to be single and recognizable.
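The "substantially parallel" requirement for multi-finger sub-gestures can be sketched as an angle test between the movement vectors of two fingers. The tolerance value and function name are illustrative assumptions, not details from the disclosure:

```python
import math

ANGLE_TOLERANCE_DEG = 15.0  # illustrative tolerance between finger directions

def fingers_parallel(dir_a, dir_b):
    """Check that two finger movement vectors (dx, dy) are
    substantially parallel, i.e. within a small angle of each other."""
    ax, ay = dir_a
    bx, by = dir_b
    dot = ax * bx + ay * by
    mag = math.hypot(ax, ay) * math.hypot(bx, by)
    cos = max(-1.0, min(1.0, dot / mag))  # clamp for numerical safety
    return math.degrees(math.acos(cos)) <= ANGLE_TOLERANCE_DEG
```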
It should be noted that the direction of the gesture is only schematically shown in the figure, and for consecutive sub-gestures, there is at least one uninterrupted contact point between every two consecutive sub-gestures, and if there is no uninterrupted contact point between two consecutive sub-gestures, the touch screen operation is considered to be finished after the previous sub-gesture in the two sub-gestures.
In step S205, it is determined whether the number of contacts becomes zero, if yes, it indicates that the touch screen operation is finished, and step S206 is executed; if not, the process returns to step S202.
In step S206, a gesture composed of the recognized consecutive sub-gestures is determined as the gesture detected for the touch screen operation.
Taking fig. 4 as an example, the gesture composed of the recognized sub-gesture 1 and sub-gesture 2 is determined as the gesture detected for the touch screen operation. Taking fig. 5 as an example, the gesture composed of the recognized sub-gesture 1, sub-gesture 2, and sub-gesture 3 is determined as the gesture detected for the touch screen operation.
In step S207, corresponding operations are performed according to the detected gesture and the preset corresponding relationship between the gesture and the operation.
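The recognition loop of steps S201 to S206 can be sketched as follows. Here `samples` is assumed to be a sequence of (contact_count, direction) readings, with `direction` a coarse label such as "left" or "down"; these representations are illustrative assumptions, not details from the disclosure:

```python
def detect_gesture(samples):
    """Minimal sketch of the Fig. 2 recognition loop."""
    sub_gestures = []   # sub-gestures N, numbered from 1 in order
    current = None      # the (contact_count, direction) state being tracked
    for contacts, direction in samples:
        if contacts == 0:                    # S205: touch ended
            if current is not None:
                sub_gestures.append(current)  # close the final sub-gesture
            break
        state = (contacts, direction)
        if current is None:
            current = state                  # S202: start recording
        elif state != current:               # S203: count or direction changed
            sub_gestures.append(current)     # S204: recognize sub-gesture N
            current = state
    # S206: two or more consecutive sub-gestures form the detected gesture
    return sub_gestures if len(sub_gestures) >= 2 else None
```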
Fig. 7 illustrates another embodiment of the method for detecting a gesture provided by the present disclosure, in which a limit is placed on the moving time of a sub-gesture. Specifically, the method comprises the following steps:
in step S701, it is determined whether a touch screen operation is detected; if yes, go to step S702; if not, the process ends. The method provided by this embodiment is applied to a terminal device.
In step S702, the number of touch points, the moving direction, and the moving time of the current gesture for performing the touch screen operation are acquired and recorded.
In step S703, it is determined whether the number of contacts has not changed and the moving direction has not changed, and if yes, the step is repeated; if not, go to step S704.
In step S704, it is determined whether the moving time of the gesture before the change is within a first preset time period; if so, step S705 is executed; if not, the gesture before the change is considered invalid, the process returns to step S702, and the counting of N restarts.
In step S705, recognizing the gesture before the change as a sub-gesture N; wherein N represents that the sub-gesture is the nth sub-gesture recognized for the touch screen operation, for example, counting from 1, and then adding 1 to N each time the step is executed.
In step S706, determining whether the number of contacts becomes zero, if yes, indicating that the touch screen operation is ended, and executing step S707; if not, the process returns to step S702.
In step S707, a gesture composed of all the recognized consecutive sub-gestures (consecutive numbers) is determined as the gesture detected for the touch screen operation.
In step S708, corresponding operations are performed according to the detected gesture and the preset corresponding relationship between the gesture and the operation.
In this embodiment, the moving time of a single sub-gesture is limited to improve the efficiency and accuracy of gestures and gesture recognition.
Fig. 8 shows another embodiment of the method for detecting a gesture provided by the present disclosure, in which the moving time of a sub-gesture is also limited. It differs from the previous embodiment in that, when the moving time of the gesture before a change is not within the first preset time period, only that gesture is ignored, and subsequent sub-gestures continue to be recognized. Specifically, the method comprises the following steps:
in step S801, it is determined whether a touch screen operation is detected; if yes, go to step S802; if not, the process ends. The method provided by this embodiment is applied to a terminal device.
In step S802, the number of touch points, the moving direction, and the moving time of the current gesture for performing the touch screen operation are acquired and recorded.
In step S803, it is determined whether the number of contacts has not changed and the moving direction has not changed, and if so, the step is repeated; if not, go to step S804.
In step S804, it is determined whether the movement time of the gesture before the change is within a first preset time period, and if so, step S805 is executed; if not, the gesture before the change is considered invalid, keeping N unchanged, and returning to the step S802.
In step S805, the gesture before the change is recognized as a sub-gesture N; wherein N represents that the sub-gesture is the nth sub-gesture recognized for the touch screen operation, for example, counting from 1, and then adding 1 to N each time the step is executed.
In step S806, it is determined whether the number of contacts becomes zero, and if so, it indicates that the touch screen operation is finished, and step S807 is executed; if not, the process returns to step S802.
In step S807, a gesture composed of all the recognized consecutive (N-numbered consecutive) sub-gestures is determined as the gesture detected for the touch screen operation.
In step S808, corresponding operations are performed according to the detected gesture and the preset corresponding relationship between the gesture and the operation.
In this embodiment, when the gesture following a recognized sub-gesture is invalid, it is ignored and sub-gesture recognition continues; the subsequently recognized sub-gesture and the earlier sub-gesture are treated as consecutive sub-gestures. This embodiment allows the user to make an error while executing a gesture: the user does not need to restart the gesture after the error occurs, and only needs to perform the valid sub-gestures while maintaining continuous touch.
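The difference between the Fig. 7 and Fig. 8 embodiments can be sketched in the per-sub-gesture time check. The duration limit and function name are illustrative assumptions; the disclosure does not specify values:

```python
FIRST_PRESET_DURATION = 1.0  # illustrative per-sub-gesture time limit (seconds)

def accept_sub_gesture(sub_gestures, candidate, duration, skip_invalid):
    """Apply the per-sub-gesture moving-time check.

    skip_invalid=False models the Fig. 7 embodiment: an overlong
    sub-gesture discards everything recognized so far (N restarts).
    skip_invalid=True models the Fig. 8 embodiment: only the invalid
    sub-gesture is ignored and recognition continues."""
    if duration <= FIRST_PRESET_DURATION:
        sub_gestures.append(candidate)
    elif not skip_invalid:
        sub_gestures.clear()  # Fig. 7: restart the count of N
    return sub_gestures
```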
Fig. 9 illustrates another embodiment of the method for detecting a gesture provided by the present disclosure, in which a limit is placed on the sum of the moving times of all sub-gestures recognized in one touch screen operation. Specifically, the method comprises the following steps:
in step S901, it is determined whether a touch screen operation is detected; if yes, go to step S902; if not, the process ends. The method provided by this embodiment is applied to a terminal device.
In step S902, the number of touch points, the moving direction, and the moving time of the current gesture for performing the touch screen operation are acquired and recorded.
In step S903, it is determined whether the number of contacts has not changed and the moving direction has not changed, and if yes, the step is repeatedly executed; if not, go to step S904.
In step S904, it is determined whether the movement time of the gesture before the change is within a first preset time period, and if so, step S905 is executed; if not, the process returns to step S902, and N is restarted to count.
In step S905, the gesture before the change is recognized as sub-gesture N, where N indicates that the sub-gesture is the Nth sub-gesture recognized for the touch screen operation.
In step S906, it is determined whether the number of contacts becomes zero; if so, step S907 is executed; if not, the process returns to step S902.
In step S907, it is determined whether the sum of the moving times of all the recognized consecutive sub-gestures is within a second preset time period; if so, step S908 is executed; if not, the process ends.
In step S908, a gesture composed of all the recognized continuous sub-gestures is determined as the gesture detected for the touch screen operation.
In step S909, a corresponding operation is performed according to the detected gesture and the preset corresponding relationship between the gesture and the operation.
In the embodiment, the total time of the gestures formed by the sub-gestures is limited, so that the efficiency and the accuracy of the gestures and the gesture recognition can be improved.
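The total-time check of step S907 can be sketched as follows. Each recognized sub-gesture is assumed to carry a (label, duration) pair, and the limit value is illustrative, not taken from the disclosure:

```python
SECOND_PRESET_DURATION = 3.0  # illustrative total gesture time limit (seconds)

def gesture_if_within_total_time(sub_gestures):
    """Fig. 9 embodiment (S907): accept the gesture only if the sum of
    all sub-gesture moving times fits within the second preset duration
    and at least two consecutive sub-gestures were recognized."""
    total = sum(duration for _, duration in sub_gestures)
    if total <= SECOND_PRESET_DURATION and len(sub_gestures) >= 2:
        return [label for label, _ in sub_gestures]
    return None
```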
In the present disclosure, the detected gesture is composed of recognized sub-gestures. The sub-gestures are entirely independent of absolute position on the touch surface and have a very large tolerance for relative position, so the requirements of blind operation can be fully met. Secondly, a large number of gestures can be created by combining different sub-gestures, and because the definition of a sub-gesture is very clear, sub-gestures are very simple to recognize and the false recognition rate is very low. In recognizing sub-gestures, the requirements on moving distance, moving time, and so on reduce the possibility of false touches. The complexity of a gesture is determined by the number of its sub-gestures; if misoperation is to be reduced, the complexity of blind-operation gestures can be required to exceed a certain threshold. Requiring a minimum gesture complexity reduces the occurrence of false touches, because under natural conditions the probability of accidentally triggering a more complex gesture is very low.
Accordingly, the present disclosure also provides an apparatus for detecting a gesture, as shown in fig. 10, the apparatus including:
the sub-gesture recognition module 1001 is configured to, when a touch screen operation is detected, recognize a sub-gesture in which the number of contacts is unchanged and the moving direction is single;
and a gesture recognition module 1002, configured to determine a gesture composed of two or more recognized consecutive sub-gestures as a gesture detected for the touch screen operation.
Wherein, the sub-gesture recognition module 1001 is specifically configured to: recognize a sub-gesture in which the number of contacts is unchanged, the moving direction is single, and the moving distance exceeds a first preset threshold.
Wherein, a single moving direction means that the moving distance along the moving direction exceeds a first preset threshold, while the moving distance in other directions does not exceed a second preset threshold.
Wherein, the sub-gesture recognition module 1001 is further configured to: recognize a sub-gesture in which the number of contacts is unchanged, the moving direction is single, and the moving time is within a first preset time period.
The gesture recognition module 1002 is configured to: when the sum of the moving times of two or more recognized consecutive sub-gestures is within a second preset time period, determine the gesture composed of those consecutive sub-gestures as the gesture detected for the touch screen operation.
As shown in fig. 11, the apparatus further includes:
the executing module 1003 is configured to execute a corresponding operation according to the determined gesture detected for the touch screen operation and a preset corresponding relationship between the gesture and the operation.
As shown in fig. 11, the apparatus further includes:
the mapping module 1004 is configured to map a moving direction of a gesture of the touch screen operation onto a two-dimensional plane with a center of the touch surface as an origin when the touch surface is a non-plane and the touch screen operation is detected.
It should be noted that: in the above embodiment, when the device for detecting a gesture achieves the above functions, only the division of the above function modules is used for illustration, and in practical applications, the function distribution may be completed by different function modules according to needs, that is, the internal structure of the device is divided into different function modules, so as to complete all or part of the above described functions. In addition, the device for detecting a gesture and the method for detecting a gesture provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
The present disclosure also provides a terminal device. Referring to fig. 12, the terminal device may be used to implement the method for detecting a gesture provided in the above embodiments. The terminal device may be a mobile phone, a tablet, a wearable mobile device (such as a smart watch), or the like. Preferably:
the terminal device 700 may include components such as a communication unit 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a Wi-Fi (wireless fidelity) module 170, a processor 180 including one or more processing cores, and a power supply 190. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 10 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the communication unit 110 may be used for receiving and transmitting information or signals during a call, and the communication unit 110 may be an RF (Radio Frequency) circuit, a router, a modem, or other network communication devices. In particular, when the communication unit 110 is an RF circuit, downlink information of the base station is received and then processed by the one or more processors 180; in addition, uplink data is transmitted to the base station. Generally, the RF circuit as a communication unit includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the communication unit 110 may also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), and the like.

The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal device 700, and the like.
Further, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Preferably, the input unit 130 may include a touch-sensitive surface 131 and other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 131 (e.g., operations by a user on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 131 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 180, and can receive and execute commands sent by the processor 180. Additionally, the touch-sensitive surface 131 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132. Preferably, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to a user and various graphic user interfaces of the terminal device 700, which may be configured by graphics, text, icons, video, and any combination thereof. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-emitting diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141, and when a touch operation is detected on or near the touch-sensitive surface 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 12, touch-sensitive surface 131 and display panel 141 are shown as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 131 may be integrated with display panel 141 to implement input and output functions.
The terminal device 700 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Preferably, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 141 and/or the backlight when the terminal device 700 is moved to the ear. As one type of motion sensor, a gravity acceleration sensor may detect the magnitude of acceleration in each direction (generally three axes), and may detect the magnitude and direction of gravity when at rest; it may be used in applications that recognize the attitude of the mobile phone (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition-related functions (such as a pedometer and tapping). As for other sensors that may further be configured on the terminal device 700, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, details are not described herein again.
The audio circuit 160, the speaker 161, and the microphone 162 may provide an audio interface between the user and the terminal device 700. The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts the electrical signal into a sound signal for output. Conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data; the audio data is output to the processor 180 for processing and then transmitted to, for example, another terminal device via the communication unit 110, or output to the memory 120 for further processing. The audio circuit 160 may also include an earbud jack to provide communication between peripheral headphones and the terminal device 700.
To implement wireless communication, a wireless communication unit 170 may be configured on the terminal device, and the wireless communication unit 170 may be a Wi-Fi module. Wi-Fi belongs to a short-range wireless transmission technology, and the terminal device 700 can help a user to send and receive e-mail, browse a web page, access streaming media, and the like through the wireless communication unit 170, which provides the user with wireless broadband internet access. Although fig. 12 shows the wireless communication unit 170, it is understood that it does not belong to the essential constitution of the terminal device 700 and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 180 is a control center of the terminal device 700, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the terminal device 700 and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the mobile phone. Optionally, processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal device 700 further includes a power supply 190 (e.g., a battery) for supplying power to the various components, which may preferably be logically connected to the processor 180 via a power management system, so as to manage charging, discharging, and power consumption via the power management system. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the terminal device 700 may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the display unit of the terminal device is a touch screen display, the terminal device further includes a memory, and one or more instructions (instructions or programs), where the one or more instructions are stored in the memory and configured to be executed by the one or more processors, and the one or more instructions include instructions for:
when the touch screen operation is detected, identifying sub-gestures with unchanged number of contacts and single moving direction;
and determining a gesture formed by two or more recognized continuous sub-gestures as the gesture detected for the touch screen operation.
Furthermore, the terminal devices described in this disclosure may typically be various handheld terminal devices, such as cell phones, Personal Digital Assistants (PDAs), etc., and thus the scope of protection of this disclosure should not be limited to a particular type of mobile terminal.
Furthermore, the method according to the present disclosure may also be implemented as a computer program executed by a CPU. The computer program, when executed by the CPU, performs the above-described functions defined in the method of the present disclosure.
Further, the above method steps and system elements may also be implemented using a controller and a computer readable storage device for storing a computer program for causing the controller to implement the functions of the above steps or elements.
Further, it should be appreciated that the computer-readable storage devices (e.g., memories) described herein can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. By way of example, and not limitation, nonvolatile memory can include Read Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which can act as external cache memory. By way of example and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The storage devices of the disclosed aspects are intended to comprise, without being limited to, these and other suitable types of memory.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as software or hardware depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein may be implemented or performed with the following components designed to perform the functions described herein: a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the disclosure herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary designs, the functions may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
While the foregoing disclosure shows illustrative embodiments of the disclosure, it should be noted that various changes and modifications could be made herein without departing from the scope of the disclosure as defined by the appended claims. The functions, steps and/or actions of the method claims in accordance with the disclosed embodiments described herein need not be performed in any particular order. Furthermore, although elements of the disclosure may be described or claimed in the singular, the plural is contemplated unless limitation to the singular is explicitly stated.
The above-mentioned embodiments, objects, technical solutions and advantages of the present disclosure are described in further detail, it should be understood that the above-mentioned embodiments are merely illustrative of the present disclosure and are not intended to limit the scope of the present disclosure, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.