PRIORITY

This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Aug. 28, 2012 in the Korean Intellectual Property Office and assigned Serial No. 10-2012-0094575, the entire disclosure of which is hereby incorporated by reference.
BACKGROUND OF THE INVENTION

1. Field of the Invention
The present invention relates to an apparatus for measuring coordinates of input on a coordinate display device. More particularly, the present invention relates to an apparatus and method for measuring coordinates and for determining a time point of completion of input based on whether a hover event is completed.
2. Description of the Related Art
Recently, the market for smart phones and touch screens has grown quickly, and research on smart phones and touch screens is accordingly being actively conducted. In order to input a particular command to a smart phone or a touch screen, a user can input the command or designate a particular icon by designating a particular position on a display with a part of the user's body or an ElectroMagnetic Resonance (EMR) pen.
A scheme in which a part of the user's body touches the surface of a touch screen may be implemented by using a touch screen of a capacitive type. Typically, a touch screen of the capacitive type includes transparent electrodes and a capacitive element between the transparent electrodes. When the user touches the surface of the touch screen with a part of the body, the capacitance of the capacitive element changes, and the touch may be sensed based on that change in capacitance.
Meanwhile, the touch screen of the capacitive type, which the user touches with a part of the body, has a drawback in that its relatively large touch area makes precise input difficult. In contrast, a touch screen of an EMR type has an advantage in that it can operate even with a small touch area.
In the touch screen of the EMR type, a loop coil is disposed on a circuit board, and a control operation is performed so as to apply a voltage to the loop coil and generate an electromagnetic field. Then, a control operation is performed so that the generated electromagnetic field may be delivered to an EMR pen. The EMR pen may include a capacitor and a loop, and can again emit the electromagnetic field delivered thereto as an electromagnetic field having a predetermined frequency component.
The electromagnetic field emitted by the EMR pen can again be delivered to the loop coil of the circuit board. Accordingly, a determination can be made as to a position to which the EMR pen is close in relation to the surface of the touch screen. An apparatus which measures coordinates in the EMR scheme as described above may identify the existence of the pen even when the pen does not directly touch the touch screen.
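By way of illustration only, and not as a description of any particular product, the following Java sketch shows one way such coil signals could be turned into a coordinate: the pen position along one axis is estimated as the amplitude-weighted centroid of the signals received by the loop coils on that axis. The class name, the coil pitch parameter and the centroid rule are all assumptions made for the example.

```java
// Illustrative sketch only: estimating a one-dimensional pen coordinate from
// per-coil signal amplitudes. A real digitizer would add filtering and
// calibration and would repeat the computation for both axes.
public final class EmrPositionEstimator {

    /**
     * Estimates the pen position along one axis as the amplitude-weighted
     * centroid of the signals received by the loop coils on that axis.
     *
     * @param amplitudes  signal amplitude measured on each loop coil
     * @param coilPitchMm physical spacing between adjacent coils, in millimetres
     * @return estimated position in millimetres, or -1 if no signal is seen
     */
    static double estimatePosition(double[] amplitudes, double coilPitchMm) {
        double weightedSum = 0.0;
        double total = 0.0;
        for (int i = 0; i < amplitudes.length; i++) {
            weightedSum += amplitudes[i] * i * coilPitchMm;
            total += amplitudes[i];
        }
        return total > 0.0 ? weightedSum / total : -1.0;
    }

    public static void main(String[] args) {
        // The pen hovers between the second and third coils, closer to the third.
        double[] xAmplitudes = {0.05, 0.40, 0.90, 0.30, 0.02};
        System.out.printf("Estimated X: %.2f mm%n", estimatePosition(xAmplitudes, 5.0));
    }
}
```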
Meanwhile, an apparatus according to the related art for measuring coordinates adopts a configuration of performing text conversion based on the input of the pen. For example, the apparatus according to the related art for measuring coordinates adopts a configuration of recognizing input using the pen of the user as characters and interpreting the input as character data. Accordingly, the apparatus according to the related art for measuring coordinates can convert handwriting, which the user inputs, into character data.
When the pen does not touch the apparatus according to the related art for measuring coordinates for a predetermined time period or more, the apparatus according to the related art for measuring coordinates performs text recognition on the content that the user has input. Accordingly, the user is inconvenienced in that the user must continuously perform input using the pen in order to input desired content. Namely, the user is inconvenienced in that the user must continuously perform input without hesitation in order to prevent text recognition from being performed before completion of the input of content desired by the user.
Therefore, a need exists for an apparatus and method for determining a time point of completion of input based on whether a hover event is completed.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present invention.
SUMMARY OF THE INVENTION

Aspects of the present invention are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide an apparatus for measuring coordinates and a control method thereof, which can determine a time point of completion of input based on whether a hover event is completed.
In accordance with an aspect of the present invention, a control method of an apparatus for measuring coordinates of an input by an input means is provided. The control method includes displaying a user input interface and receiving a user input from the input means, detecting an increase in a distance between the input means and the apparatus for measuring the coordinates, determining whether a hover event is completed based on the distance between the input means and the apparatus for measuring the coordinates, and displaying a result of recognizing the user input as text when the hover event is completed.
In accordance with another aspect of the present invention, an apparatus for measuring coordinates of an input by an input means is provided. The apparatus includes a touch screen for displaying a user input interface and receiving a user input from the input means, and a controller for performing a control operation so as to detect an increase in a distance between the input means and the apparatus for measuring the coordinates, for determining whether a hover event is completed based on the distance between the input means and the apparatus for measuring the coordinates, and for performing a control operation so as to display a result of recognizing the user input as text when the hover event is completed.
Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS

The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram schematically showing a configuration of a mobile device according to an exemplary embodiment of the present invention;
FIG. 2 is a front perspective view of a mobile device according to an exemplary embodiment of the present invention;
FIG. 3 is a rear perspective view of a mobile device according to an exemplary embodiment of the present invention;
FIG. 4 is a block diagram showing a specific configuration of a controller of a mobile device, to which a method for controlling scrolling is applied, according to an exemplary embodiment of the present invention;
FIG. 5 is a flowchart showing a control method of an apparatus for measuring coordinates according to an exemplary embodiment of the present invention;
FIG. 6 is a conceptual view for explaining a completion of a hover event according to an exemplary embodiment of the present invention; and
FIG. 7 is a flowchart showing a control method of an apparatus for measuring coordinates according to an exemplary embodiment of the present invention.
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS

The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
Although terms including ordinal numbers such as first and second may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of the present invention. The terminology used herein is for the purpose of describing particular exemplary embodiments only, and is not intended to limit the present invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
FIG. 1 is a block diagram schematically showing a configuration of a mobile device according to an exemplary embodiment of the present invention.
Referring to FIG. 1, the mobile device 100 includes a controller 110, a mobile communication module 120, a sub-communication module 130, a multimedia module 140, a camera module 150, a Global Positioning System (GPS) module 155, an Input/Output (I/O) module 160, a sensor module 170, a storage unit 175, a power supply unit 180, a touch screen 190, and a touch screen controller 195.
According to exemplary embodiments of the present invention, the mobile device 100 may be connected to an external device (not shown) by using an external device connection unit, such as the sub-communication module 130, a connector 165, an earphone connection jack 167, and the like. The external devices may include various devices, such as an earphone, an external speaker, a Universal Serial Bus (USB) memory, a charger, a cradle/dock, a Digital Multimedia Broadcasting (DMB) antenna, a device related to mobile payment, a health care device (e.g., a blood glucose meter or the like), a video game console, and a car navigation device, each of which may be attached to or detached from the mobile device 100 and may be connected to the mobile device 100 by a wire. Also, the external devices may include short-range communication devices, such as a Bluetooth communication device and a Near Field Communication (NFC) device, each of which may be wirelessly connected to the mobile device 100 through short-range communication, a Wi-Fi direct communication device, a wireless Access Point (AP), and the like. Also, the external devices may include another device, a mobile phone, a smart phone, a tablet Personal Computer (PC), a desktop PC, a server, and the like.
According to an exemplary embodiment of the present invention, the sub-communication module 130 includes at least one of a wireless Local Area Network (LAN) module 131 and a short-range communication module 132 (e.g., an NFC communication module). For example, the sub-communication module 130 may include only the wireless LAN module 131, may include only the short-range communication module 132, or may include both the wireless LAN module 131 and the short-range communication module 132.
According to an exemplary embodiment of the present invention, the multimedia module 140 includes at least one of a broadcasting communication module 141, an audio reproduction module 142, and a moving image reproduction module 143.
According to an exemplary embodiment of the present invention, the camera module 150 includes at least one of a first camera 151 and a second camera 152.
According to an exemplary embodiment of the present invention, the input/output module 160 includes at least one of buttons 161, a microphone 162, a speaker 163, a vibration motor 164, the connector 165 and a keypad 166. The input/output module 160 may also include an earphone connection jack 167, a stylus pen 168, a pen removal/attachment recognition switch 169, and the like.
Hereinafter, a case will be described as an example in which the display unit 190 and the display controller 195 are a touch screen and a touch screen controller, respectively.
The controller 110 may include a Central Processing Unit (CPU) 111, a Read Only Memory (ROM) 112 which stores a control program for controlling the mobile device 100, and a Random Access Memory (RAM) 113 which stores a signal or data received from the outside of the mobile device 100, or which is used as a memory area for a task performed by the mobile device 100. The CPU 111 may include a number of processors. For example, the CPU 111 may include a single-core processor, a dual-core processor, a triple-core processor, a quad-core processor, or the like. The CPU 111, the ROM 112 and the RAM 113 may be interconnected by an internal bus.
The controller 110 may control the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supply unit 180, the touch screen 190, and the touch screen controller 195.
According to the control of the controller 110, the mobile communication module 120 allows the mobile device 100 to be connected to an external device through mobile communication by using at least one antenna or multiple antennas (not shown). The mobile communication module 120 transmits and receives a wireless signal for a voice call, a video call, a Short Message Service (SMS), a Multimedia Messaging Service (MMS), or the like to/from a mobile phone (not shown), a smart phone (not shown), a tablet PC or another device (not shown), which has a telephone number which is input to the mobile device 100.
According to the control of the controller 110, the wireless LAN module 131 may be connected to the Internet at a place where a wireless AP (not shown) is installed. The wireless LAN module 131 supports a wireless LAN standard (e.g., IEEE 802.11x of the Institute of Electrical and Electronics Engineers (IEEE)). According to the control of the controller 110, the short-range communication module 132 enables the mobile device 100 to perform wireless short-range communication with an image forming device (not shown). Short-range communication schemes may include Bluetooth, Infrared Data Association (IrDA), Wi-Fi direct communication, NFC, and the like.
According to performance, the mobile device 100 may include at least one of the mobile communication module 120, the wireless LAN module 131 and the short-range communication module 132. For example, according to performance, the mobile device 100 may include a combination of the mobile communication module 120, the wireless LAN module 131 and the short-range communication module 132.
The multimedia module 140 may include the broadcasting communication module 141, the audio reproduction module 142, or the moving image reproduction module 143. According to the control of the controller 110, the broadcasting communication module 141 may receive a broadcast signal (e.g., a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like) and broadcast additional information (e.g., an Electronic Program Guide (EPG) or an Electronic Service Guide (ESG)), which is transmitted by a broadcast station through a broadcast communication antenna (not shown). According to the control of the controller 110, the audio reproduction module 142 may reproduce a stored or received digital audio file (e.g., a file having a file extension of mp3, wma, ogg, wav, and the like). According to the control of the controller 110, the moving image reproduction module 143 may reproduce a stored or received digital moving image file (e.g., a file having a file extension of mpeg, mpg, mp4, avi, mov, mkv, and the like). The moving image reproduction module 143 may also reproduce a digital audio file.
The multimedia module 140 may exclude the broadcasting communication module 141, and may include the audio reproduction module 142 and the moving image reproduction module 143. Also, the audio reproduction module 142 or the moving image reproduction module 143 of the multimedia module 140 may be included in the controller 110.
The camera module 150 may include at least one of the first camera 151 and the second camera 152, each for capturing a still image or a moving image according to the control of the controller 110. Also, the first camera 151 or the second camera 152 may include an auxiliary light source, such as a flash (not shown), which provides the amount of light required to capture an image. The first camera 151 may be mounted on a front surface of the mobile device 100, and the second camera 152 may be mounted on a rear surface of the mobile device 100. Otherwise, the first camera 151 and the second camera 152 may be disposed in such a manner as to be adjacent to each other (e.g., a distance between the first camera 151 and the second camera 152 is greater than 1 cm and is less than 8 cm), and may capture a three-dimensional still image or a three-dimensional moving image.
The GPS module 155 receives a signal (e.g., a radio wave) from each of multiple GPS satellites (not shown) in the Earth's orbit, and the GPS module 155 may calculate a location of the mobile device 100 by using a Time of Arrival (TOA) from each of the GPS satellites (not shown) to the mobile device 100.
The input/output module 160 may include at least one of the multiple buttons 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165 and the keypad 166.
The buttons 161 may be formed on a front surface, a lateral surface or a rear surface of a housing of the mobile device 100, and may include at least one of a power/lock button (not shown), a volume button (not shown), a menu button, a home button, a back button and a search button.
According to the control of the controller 110, the microphone 162 receives a voice or sound as input, and generates an electrical signal.
According to the control of the controller 110, the speaker 163 may output sounds matched to various signals (e.g., a wireless signal, a broadcast signal, a digital audio file, a digital moving image file, and photographing) from the mobile communication module 120, the sub-communication module 130, the multimedia module 140 and the camera module 150, to the outside of the mobile device 100. The speaker 163 may output a sound (e.g., a button operation sound or a ring back tone matched to a telephone call) matched to a function that the mobile device 100 performs. The mobile device 100 may include multiple speakers. The speaker 163 or multiple speakers may be disposed at an appropriate position or appropriate positions of the housing of the mobile device 100.
According to the control of the controller 110, the vibration motor 164 may convert an electrical signal into a mechanical vibration. For example, when the mobile device 100 in a vibration mode receives a voice call from another device (not shown), the vibration motor 164 of the mobile device 100 operates. The mobile device 100 may include multiple vibration motors. The one vibration motor 164 or the multiple vibration motors may be mounted within the housing of the mobile device 100. The vibration motor 164 may operate in response to a touch action of a user who touches the touch screen 190 and a continuous movement of a touch on the touch screen 190.
The connector 165 may be used as an interface for connecting the mobile device 100 to an external device (not shown) or a power source (not shown). According to the control of the controller 110, through a wired cable connected to the connector 165, the mobile device 100 may transmit data stored in the storage unit 175 of the mobile device 100 to an external device (not shown) or may receive data from the external device (not shown). Also, through the wired cable connected to the connector 165, the mobile device 100 may receive power from the power source (not shown) or may charge a battery (not shown) by using the power source.
The keypad 166 may receive key input from the user in order to control the mobile device 100. The keypad 166 includes a physical keypad (not shown) installed on the front surface of the mobile device 100 or a virtual keypad (not shown) displayed on the touch screen 190. The physical keypad (not shown) installed on the front surface of the mobile device 100 may be excluded according to the performance or structure of the mobile device 100.
A plug of an earphone (not shown) may be inserted into the earphone connection jack 167, and the earphone may be connected to the mobile device 100.
The sensor module 170 includes at least one sensor for detecting the state of the mobile device 100. For example, the sensor module 170 may include a proximity sensor for detecting whether the user is close to the mobile device 100, an illuminance sensor (not shown) for detecting the amount of light around the mobile device 100, a motion sensor (not shown) for detecting the motion of the mobile device 100 (e.g., the rotation of the mobile device 100, or acceleration or vibration applied to the mobile device 100), a geomagnetic sensor (not shown) for detecting a point of the compass by using the Earth's magnetic field, a gravity sensor for detecting the working direction of gravity, an altimeter for measuring atmospheric pressure and detecting an altitude, and the like. At least one sensor may detect the state of the mobile device 100, may generate a signal matched to the detection, and may transmit the generated signal to the controller 110. According to the performance of the mobile device 100, sensors may be added to or removed from the sensor module 170.
According to the control of the controller 110, the storage unit 175 may store a signal or data which is input/output in response to an operation of each of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, and the touch screen 190. The storage unit 175 may store a control program for controlling the mobile device 100 or a control program for the controller 110, and applications.
The term “storage unit” includes the storage unit 175, the ROM 112 and the RAM 113 within the controller 110, or a memory card (not shown), such as a Secure Digital (SD) card or a memory stick, which is mounted on the mobile device 100. The storage unit may include a non-volatile memory, a volatile memory, a Hard Disk Drive (HDD), a Solid State Drive (SSD), and the like.
According to the control of the controller 110, the power supply unit 180 may supply power to one battery or multiple batteries (not shown) disposed in the housing of the mobile device 100. The one battery or the multiple batteries (not shown) supply power to the mobile device 100. Also, the power supply unit 180 may supply power provided by an external power source (not shown) to the mobile device 100 through a wired cable connected to the connector 165. Also, the power supply unit 180 may supply power, which is wirelessly provided by an external power source, to the mobile device 100 by using a wireless charging technology.
The touch screen 190 may provide the user with a user interface matched to various services (e.g., telephone call, data transmission, broadcasting, and photography). The touch screen 190 may transmit an analog signal matched to at least one touch, which is input to the user interface, to the touch screen controller 195. The touch screen 190 may receive at least one touch as input from the user's body (e.g., fingers, thumbs, and the like) or an input means (e.g., a stylus pen, and the like) enabling a touch. Also, the touch screen 190 may receive, as input, a continuous movement of one touch with respect to at least one touch. The touch screen 190 may transmit an analog signal matched to a continuous movement of an input touch, to the touch screen controller 195.
Further, according to exemplary embodiments of the present invention, a touch is not limited to the touch of the user's body or the input means enabling a touch on the touch screen 190, but may include a non-contact touch (e.g., a case in which a detectable distance between the touch screen 190 and the user's body or the input means enabling a touch is equal to or less than 1 mm). In the touch screen 190, the detectable distance may change depending on the performance or structure of the mobile device 100. Particularly, in order to enable the detection of both a touch event due to the touch of the user's body or the input means enabling a touch on the touch screen 190 and an event of input in a non-contact state (e.g., hovering), in such a manner as to distinguish the touch event from the hovering event, the touch screen 190 is configured in such a manner that the touch screen 190 may output different values (e.g., current values) detected during the touch event and detected during the hovering event. Further, it is desirable that the touch screen 190 output different detected values (e.g., current values) according to the distance between the space in which the hovering event occurs and the touch screen 190.
The touch screen 190, for example, may be implemented as a resistive touch screen, a capacitive touch screen, an infrared touch screen, an acoustic wave touch screen, and/or the like.
Meanwhile, the touch screen controller 195 converts an analog signal received from the touch screen 190 into a digital signal (e.g., X and Y coordinates), and provides the digital signal to the controller 110. The controller 110 may control the touch screen 190 by using the digital signal received from the touch screen controller 195. For example, in response to a touch event or a hovering event, the controller 110 enables a shortcut icon (not shown) displayed on the touch screen 190 to be selected, or enables the shortcut icon (not shown) to be executed. Also, the touch screen controller 195 may be included in the controller 110.
Further, the touch screen controller 195 may detect a value (e.g., a current value) which is output from the touch screen 190, and may identify a distance between the space in which the hovering event occurs and the touch screen 190. Also, the touch screen controller 195 may convert the value of the identified distance into a digital signal (e.g., a Z coordinate), and may provide the digital signal to the controller 110.
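As a minimal sketch of this idea, assuming a simple linear relationship between the sensed value and the hover height (a real controller would use a calibrated, device-specific mapping), the raw value could be converted into a Z coordinate as follows; all identifiers here are hypothetical.

```java
// Illustrative sketch only: converting a raw sensed value into a Z coordinate.
// The linear mapping and the names (maxRaw, mmPerCount) are assumptions.
public final class HoverDepthConverter {

    private final double maxRaw;      // raw value measured at direct contact (Z = 0)
    private final double mmPerCount;  // calibrated millimetres per raw count

    HoverDepthConverter(double maxRaw, double mmPerCount) {
        this.maxRaw = maxRaw;
        this.mmPerCount = mmPerCount;
    }

    /** Converts a raw sensed value into an approximate hover height in millimetres. */
    double rawToDistanceMm(double raw) {
        // A weaker sensed value corresponds to a larger distance; clamp at zero for contact.
        return Math.max(0.0, (maxRaw - raw) * mmPerCount);
    }

    public static void main(String[] args) {
        HoverDepthConverter converter = new HoverDepthConverter(1000.0, 0.05);
        System.out.println("Z at raw 1000: " + converter.rawToDistanceMm(1000.0) + " mm");
        System.out.println("Z at raw 700 : " + converter.rawToDistanceMm(700.0) + " mm");
    }
}
```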
Also, in order to enable the touch screen 190 to simultaneously receive input from the user's body and input from the input means enabling a touch, the touch screen 190 may include at least two touch screen panels which may sense the touch or proximity of the user's body and the input means enabling a touch, respectively. The at least two touch screen panels provide different output values to the touch screen controller 195, and the touch screen controller 195 recognizes the values received from the at least two touch screen panels as different values. Accordingly, the touch screen controller 195 may determine whether input from the touch screen is input from the user's body, or whether the input from the touch screen is input from the input means enabling a touch.
The touch screen 190, for example, may include a coil electrode unit for measuring a position, which includes at least one loop coil capable of receiving an Electro Magnetic Resonance (EMR) signal as input. During a first time period, the coil electrode unit for measuring a position transmits a transmission signal (Tx signal) to an EMR pen. The transmitted Tx signal may be absorbed by the EMR pen. During a second time period, the EMR pen transmits a reception signal (Rx signal) to the mobile device 100 based on the absorbed transmission signal. The mobile device 100 may recognize coordinates of input from the EMR pen based on the Rx signal received from the EMR pen. Particularly, the mobile device 100 may recognize the placement of the EMR pen even when the EMR pen does not directly touch the touch screen. Alternatively, the mobile device 100 may recognize the placement of a part of the user's body.
Accordingly, the controller 110 may detect the placement of the EMR pen and the like near the mobile device 100. Also, the controller 110 may detect the withdrawal of the EMR pen from the mobile device 100. For example, when the strength of an Rx signal received from the EMR pen is less than a preset threshold, the controller 110 may determine that the EMR pen is withdrawn from the mobile device 100. Conversely, when the strength of the Rx signal received from the EMR pen is greater than the preset threshold, the controller 110 may determine that the EMR pen is placed near the mobile device 100.
The controller 110 may determine that the placement of the EMR pen near the mobile device 100 corresponds to a case in which a hovering event is occurring. Also, the controller 110 may determine that taking the EMR pen off the mobile device 100 (e.g., moving the EMR pen away from the mobile device 100) corresponds to the completion of the hovering event.
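A minimal sketch of the threshold test described above follows; the concrete threshold value and the method names are assumptions for the example, not the patent's implementation.

```java
// Minimal sketch of deciding pen proximity from Rx signal strength.
public final class PenProximityDetector {

    /** Rx signal strengths below this (assumed) value are treated as "pen withdrawn". */
    private static final double RX_THRESHOLD = 0.25;

    /** Returns true while the pen is considered to be hovering near the device. */
    static boolean isHoveringNearDevice(double rxSignalStrength) {
        return rxSignalStrength >= RX_THRESHOLD;
    }

    /** Returns true when a previously hovering pen has been taken off the device. */
    static boolean isHoverEventCompleted(boolean wasHovering, double rxSignalStrength) {
        return wasHovering && rxSignalStrength < RX_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(isHoveringNearDevice(0.8));          // true: pen near the screen
        System.out.println(isHoverEventCompleted(true, 0.1));   // true: pen moved away
    }
}
```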
FIG. 2 is a front perspective view of a mobile device according to an exemplary embodiment of the present invention. FIG. 3 is a rear perspective view of a mobile device according to an exemplary embodiment of the present invention.
Referring to FIGS. 2 and 3, the touch screen 190 is disposed in the center of a front surface 100a of the mobile device 100. The touch screen 190 is largely formed so as to occupy most of the front surface 100a of the mobile device 100. FIG. 2 shows an example of displaying a main home screen on the touch screen 190. The main home screen is the first screen displayed on the touch screen 190 when the mobile device 100 is turned on. Also, when the mobile device 100 has different home screens having multiple pages, the main home screen may be the first home screen among the multi-page home screens. On the home screen, shortcut icons 191-1, 191-2 and 191-3 for executing frequently-used applications, a main menu shift key 191-4, time, weather and the like may be displayed. The main menu shift key 191-4 displays a menu screen on the touch screen 190. Also, a status bar 192 which displays the states of the mobile device 100, such as the state of battery charging, the strength of a received signal, current time, and the like, may be formed in an upper end part of the touch screen 190.
A home button 161a, a menu button 161b and a back button 161c may be formed in a lower part of the touch screen 190.
The home button 161a is used to display the main home screen on the touch screen 190. For example, the main home screen may be displayed on the touch screen 190 when the home button 161a is touched in a state of displaying a home screen (any home screen) different from the main home screen or the menu screen on the touch screen 190. Also, when the home button 161a is touched on the touch screen 190 during execution of applications, the main home screen shown in FIG. 2 may be displayed on the touch screen 190. Also, the home button 161a may be used to display recently-used applications on the touch screen 190 or may be used to display a task manager.
The menu button 161b provides a connection menu which may be used on the touch screen 190. The connection menu may include a widget addition menu, a background screen change menu, a search menu, an edit menu, an environment setting menu, and the like.
The back button 161c may be used to display a screen executed just before a currently-executed screen, or may be used to terminate the most recently used application.
The first camera 151, an illuminance sensor 170a and a proximity sensor 170b may be disposed at the edge of the front surface 100a of the mobile device 100. A second camera 152, a flash 153 and a speaker 163 may be disposed on a rear surface 100c of the mobile device 100.
On a lateral surface 100b of the mobile device 100, for example, a power/reset button 160a, a speaker phone button 161d, a terrestrial DMB antenna 141a for receiving broadcast signals, one or multiple microphones 162 and the like may be disposed. The DMB antenna 141a may be formed so as to be fixed to the mobile device 100, or so as to be detachable from it.
A volume control 161f may be disposed on another lateral surface of the mobile device 100. The volume control 161f may include a volume increase button 161e and a volume decrease button 161g.
Also, the connector 165 is formed on a lateral surface of a lower end of the mobile device 100. The connector 165 includes multiple electrodes, and may be used to connect the mobile device 100 to an external device by a wire. The earphone connection jack 167 may be formed on a lateral surface of an upper end of the mobile device 100. A plug of an earphone may be inserted into the earphone connection jack 167.
Further, the controller 110 included in the mobile device according to an exemplary embodiment of the present invention as described above is configured so that the controller 110 may perform a method for controlling scrolling according to an exemplary embodiment of the present invention. To this end, the controller 110 according to exemplary embodiments of the present invention may collect (e.g., detect) a hovering event, as illustrated in FIG. 4.
FIG. 4 is a block diagram showing a specific configuration of a controller of a mobile device, to which a method for controlling scrolling is applied, according to an exemplary embodiment of the present invention.
Referring to FIG. 4, the controller 110 includes a hovering event collection unit 110-1 which may communicate with the touch screen controller 195.
The hovering event collection unit 110-1 may identify whether a hovering event has occurred on the touch screen, by using a digital signal (e.g., X, Y and Z coordinates) provided by the touch screen controller 195; may detect an area over which the hovering event has occurred, based on the X and Y coordinates; and may also detect a distance between the touch screen 190 and the user's body or the input means enabling a touch, based on the Z coordinate. Further, the hovering event collection unit 110-1 may count a time period, during which the hovering event lasts, in the area over which the hovering event has occurred.
Also, the hovering event collection unit 110-1 may control whether text recognition is performed on input content by using information on the area where the collected hovering event has occurred. Namely, the hovering event collection unit 110-1 may determine a time point of occurrence of a hovering event and a time point of completion of a hovering event. While the hovering event lasts (e.g., is maintained), the controller 110 may perform a control operation so as to prevent text recognition from being performed. In contrast, when the hovering event is completed, the controller 110 may perform a control operation so as to perform the text recognition.
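The following sketch illustrates, under assumed names and units, the kind of bookkeeping such a hovering event collection unit could perform: recording where hovering occurs, how long it has lasted, and whether text recognition may proceed.

```java
// Illustrative sketch only; not the patent's implementation.
public final class HoverEventTracker {

    private boolean hovering;
    private long hoverStartMs;
    private float lastX, lastY;

    /** Called for every (X, Y, Z) sample reported by the touch screen controller. */
    void onSample(float x, float y, float z, float hoverHeightMm, long nowMs) {
        boolean nowHovering = z <= hoverHeightMm;   // pen close enough to count as hovering
        if (nowHovering && !hovering) {
            hoverStartMs = nowMs;                   // a hover event begins
        }
        hovering = nowHovering;
        lastX = x;
        lastY = y;
    }

    /** Point over which the most recent hover sample occurred. */
    float[] lastHoverPoint() {
        return new float[] {lastX, lastY};
    }

    /** How long the current hover event has lasted, in milliseconds. */
    long hoverDurationMs(long nowMs) {
        return hovering ? nowMs - hoverStartMs : 0L;
    }

    /** Text recognition is held back while a hover event is still ongoing. */
    boolean mayRunTextRecognition() {
        return !hovering;
    }

    public static void main(String[] args) {
        HoverEventTracker tracker = new HoverEventTracker();
        tracker.onSample(120f, 340f, 2.0f, 10.0f, 0L);       // pen hovering 2 mm above the screen
        System.out.println(tracker.mayRunTextRecognition()); // false: keep waiting for input
        tracker.onSample(120f, 340f, 25.0f, 10.0f, 500L);    // pen lifted well above the hover height
        System.out.println(tracker.mayRunTextRecognition()); // true: input is considered finished
    }
}
```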
FIG. 5 is a flowchart showing a control method of an apparatus for measuring coordinates according to an exemplary embodiment of the present invention.
Referring to FIG. 5, the apparatus for measuring coordinates may provide a user input interface, and may receive user input from an input means, such as an EMR pen or a finger, matched to the user input interface in step S501. For example, a user may touch a point on the touch screen of the apparatus for measuring coordinates, and the apparatus for measuring coordinates may determine the point that the input means touches on the touch screen.
Meanwhile, the apparatus for measuring coordinates may detect an increase in a distance between the input means and the apparatus for measuring coordinates in step S503. For example, the apparatus for measuring coordinates may detect an increase in the distance between the input means and the apparatus for measuring coordinates from a reduction in the strength of an Rx signal received from the input means. Alternatively, the apparatus for measuring coordinates may detect an increase in the distance between the input means and the apparatus for measuring coordinates based on a Z coordinate. As another example, the apparatus may detect a change in the distance between the input means and the apparatus for measuring coordinates, or may continuously monitor that distance.
Then, the apparatus for measuring coordinates may determine whether a hover event is completed in step S505. For example, the apparatus for measuring coordinates may determine whether the distance between the input means and the apparatus for measuring coordinates exceeds a preset length. The preset length can be referred to as, for example, a “hover event height.” The apparatus for measuring coordinates may determine that the hover event is completed when the distance between the apparatus for measuring coordinates and the input means exceeds the hover event height.
When the apparatus determines that the hover event is completed in step S505, the apparatus proceeds to step S507, in which the apparatus for measuring coordinates may recognize the contents of the input writing as text and may convert the contents of the input writing into text.
In contrast, when the apparatus determines that the hover event is not completed in step S505, the apparatus for measuring coordinates does not perform text recognition on the contents of the input writing. When the result of the determination in step S505 shows that the hover event is not completed, the apparatus for measuring coordinates may continuously receive user input. Accordingly, the user may complete the writing at a time point desired by the user. As an example, the apparatus may proceed to step S501.
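As an illustration of the flow of FIG. 5, the following sketch collects pen samples while the reported distance stays within the hover event height and runs a placeholder recognition step once the distance exceeds it. The sample format, the recognize() stub and the class name are assumptions for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Compact sketch of the FIG. 5 flow under stated assumptions.
public final class Fig5Flow {

    /** Each sample is {x, y, distanceMm}; returns recognized text or null if input continues. */
    static String runOnce(List<float[]> penSamples, float hoverEventHeightMm) {
        List<float[]> strokes = new ArrayList<>();
        for (float[] sample : penSamples) {
            if (sample[2] > hoverEventHeightMm) {
                return recognize(strokes);            // S505 -> S507: hover event completed
            }
            strokes.add(sample);                      // S501: keep accepting user input
        }
        return null;                                  // hover event never completed
    }

    private static String recognize(List<float[]> strokes) {
        // Placeholder: a real device would run handwriting recognition over the strokes here.
        return "recognized " + strokes.size() + " samples";
    }

    public static void main(String[] args) {
        List<float[]> samples = List.of(
                new float[] {10f, 10f, 0f},    // touching the screen
                new float[] {12f, 11f, 3f},    // small hover while writing
                new float[] {12f, 11f, 25f});  // pen lifted past the hover event height
        System.out.println(runOnce(samples, 10f));
    }
}
```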
FIG. 6 is a conceptual view for explaining a completion of a hover event according to an exemplary embodiment of the present invention.
Referring to FIG. 6, the mobile device 100 recognizes a hover event height 600 having a preset length h. When the input means is placed at a position below the hover event height 600, as shown by reference numeral 1-a, the mobile device 100 recognizes that a hovering event is occurring. Accordingly, the mobile device 100 may avoid performing text recognition on the contents of input writing. Alternatively, while the mobile device 100 performs text recognition on the contents of the input writing, the mobile device 100 may perform a control operation so as to prevent performing of the text recognition from being displayed. When the hover event has been completed, the mobile device 100 may display a result of performing the text recognition, all at once.
When the input means is placed at a position above the hover event height 600, as shown by reference numeral 1-b, the mobile device 100 recognizes that the hover event is completed. Accordingly, the mobile device 100 performs text recognition on the contents of the input writing. Alternatively, the mobile device 100 may display a result of performing the text recognition, all at once.
However, even when the input means is placed at a position below the hover event height 600, as shown by reference numeral 1-a, if no input occurs for more than a preset time period, the mobile device 100 may determine that the hover event is completed.
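A small sketch of this additional rule, with an assumed idle period of two seconds, could look as follows.

```java
// Sketch of the idle-timeout rule: input is treated as complete when no new
// input has arrived for longer than a preset period. The value is an assumption.
public final class IdleTimeoutRule {

    private static final long IDLE_TIMEOUT_MS = 2000L;

    static boolean hoverCompletedByIdle(long lastInputTimeMs, long nowMs) {
        return nowMs - lastInputTimeMs > IDLE_TIMEOUT_MS;
    }

    public static void main(String[] args) {
        System.out.println(hoverCompletedByIdle(0L, 500L));   // false: user is still writing
        System.out.println(hoverCompletedByIdle(0L, 2500L));  // true: treated as input complete
    }
}
```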
FIG. 7 is a flowchart showing a control method of an apparatus for measuring coordinates according to an exemplary embodiment of the present invention.
Referring to FIG. 7, the apparatus for measuring coordinates may receive input from an input means, such as an EMR pen or a finger, in step S701. For example, a user may touch a point on the touch screen of the apparatus for measuring coordinates, and the apparatus for measuring coordinates may determine the point that the input means touches on the touch screen.
Meanwhile, the apparatus for measuring coordinates may detect an increase in a distance between the input means and the apparatus for measuring coordinates in step S703. For example, the apparatus for measuring coordinates may detect an increase in the distance between the input means and the apparatus for measuring coordinates from a reduction in the strength of an Rx signal received from the input means. Alternatively, the apparatus for measuring coordinates may detect an increase in the distance between the input means and the apparatus for measuring coordinates based on a Z coordinate.
The apparatus for measuring coordinates may determine whether the distance between the input means and the apparatus for measuring coordinates exceeds a hover event height in step S705. For example, when the strength of an Rx signal received from the input means is less than a preset magnitude, the apparatus for measuring coordinates may determine that the distance between the input means and the apparatus for measuring coordinates exceeds the hover event height.
When the apparatus determines that the distance between the input means and the apparatus for measuring coordinates is equal to or less than the hover event height in step S705, the apparatus for measuring coordinates may avoid performing text recognition on the contents of input writing. Alternatively, while the apparatus for measuring coordinates performs the text recognition on the contents of the input writing in real time, the apparatus may prevent a result of performing the text recognition from being displayed. In this case, the apparatus for measuring coordinates may continuously receive the user input while it does not display the result of performing the text recognition. Namely, the apparatus for measuring coordinates may provide a user input interface. For example, the apparatus may proceed to step S701.
When the apparatus determines that the distance between the input means and the apparatus for measuring coordinates is greater than the hover event height in step S705, the apparatus proceeds to step S707 in which the apparatus for measuring coordinates may determine that a hover event is completed.
The apparatus for measuring coordinates determines whether a time period of maintaining the completion of the hover event exceeds a preset time period in step S707. When the apparatus for measuring coordinates determines that the time period of maintaining the completion of the hover event is equal to or less than the preset time period in step S707, the apparatus for measuring coordinates determines that the hover event is not completed. For example, the apparatus for measuring coordinates may return to step S705.
In contrast, when the apparatus for measuring coordinates determines that the time period of maintaining the completion of the hover event exceeds the preset time period in step S707, the apparatus for measuring coordinates may determine that the hover event is completed.
Accordingly, the apparatus for measuring coordinates may perform the text recognition on the contents of the input writing in step S709. Alternatively, the apparatus for measuring coordinates may display a result of performing the text recognition, all at once.
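As a sketch of the flow of FIG. 7, the following small state machine treats the hover event as completed only after the reported distance has stayed above the hover event height for longer than a preset delay; the delay value and all identifiers are assumptions for the example.

```java
// Sketch of the FIG. 7 flow, assuming periodic (distanceMm, timestampMs) updates.
public final class Fig7Flow {

    private final float hoverEventHeightMm;
    private final long confirmDelayMs;   // S707: how long the pen must stay lifted
    private long liftedSinceMs = -1L;    // -1 means the pen is at or below the height

    Fig7Flow(float hoverEventHeightMm, long confirmDelayMs) {
        this.hoverEventHeightMm = hoverEventHeightMm;
        this.confirmDelayMs = confirmDelayMs;
    }

    /**
     * Processes one distance sample and returns true once the hover event is
     * confirmed as completed, i.e. the pen has stayed above the hover event
     * height for longer than the preset delay (steps S705 and S707).
     */
    boolean onDistanceSample(float distanceMm, long nowMs) {
        if (distanceMm <= hoverEventHeightMm) {
            liftedSinceMs = -1L;                       // back to S701: keep accepting input
            return false;
        }
        if (liftedSinceMs < 0L) {
            liftedSinceMs = nowMs;                     // pen just crossed the hover event height
        }
        return nowMs - liftedSinceMs > confirmDelayMs; // proceed to S709 once the delay elapses
    }

    public static void main(String[] args) {
        Fig7Flow flow = new Fig7Flow(10f, 300L);
        System.out.println(flow.onDistanceSample(2f, 0L));     // false: writing continues
        System.out.println(flow.onDistanceSample(15f, 100L));  // false: lifted, waiting
        System.out.println(flow.onDistanceSample(15f, 500L));  // true: run text recognition
    }
}
```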
As described above, when the input means is not placed at a position above the hover event height, the apparatus for measuring coordinates continuously provides a user input interface. The user may therefore keep writing with the pen while maintaining the pen at a height equal to or less than the hover event height, and may continue to be provided with the user input interface.
It will be appreciated that the exemplary embodiments of the present invention may be implemented in the form of hardware, software, or a combination of hardware and software. Any such software may be stored in a volatile or non-volatile storage device such as a ROM, or in a memory such as a RAM, a memory chip, a memory device or a memory integrated circuit, or in a storage medium, such as a Compact Disc (CD), a Digital Versatile Disc (DVD), a magnetic disk or a magnetic tape, which is optically or magnetically recordable and simultaneously, is readable by a machine (e.g., a computer), regardless of whether the software can be deleted or rewritten. It will be appreciated that the method for controlling the apparatus for measuring coordinates of input from an input means according to exemplary embodiments of the present invention may be implemented by a computer or a portable terminal including a controller and a memory, and that the memory is an example of a non-transient machine-readable storage medium suitable for storing a program or programs including instructions for implementing the exemplary embodiments of the present invention. Accordingly, exemplary embodiments of the present invention include a program including codes for implementing an apparatus or a method which is claimed in any claim of this specification, and a storage medium which stores this program and is readable by a machine (a computer or the like).
Also, the apparatus for measuring coordinates may receive and store the program from a device for providing a program, which is connected to the mobile device including the apparatus for measuring coordinates by a wire or wirelessly. The device for providing a program may include: a memory for storing a program including instructions which cause the mobile device to perform a previously-set method for controlling the apparatus for measuring coordinates, information required for the method for controlling the apparatus for measuring coordinates, and the like; a communication unit for performing wired or wireless communication with the mobile device; and a controller for performing a control operation so as to transmit the relevant program to the mobile device, at a request from the apparatus for measuring coordinates or automatically.
According to various exemplary embodiments of the present invention, an apparatus for measuring coordinates and a control method thereof, which can determine a time point of completion of input based on whether a hover event is completed, are provided. When a user inputs handwriting by using a pen, the user does not have to perform handwriting as in the scheme according to the related art in which the user must continuously perform handwriting without hesitation. When the pen is located within a predetermined distance from the apparatus for measuring coordinates, the user can operate the mobile device in such a manner as to prevent the initiation of text recognition. Accordingly, the user can determine a time point of completion of handwriting at a desired time point.
While the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims and their equivalents.