AU2007360500A1 - Passive pointing/touching interface method - Google Patents

Passive pointing/touching interface method

Info

Publication number
AU2007360500A1
Authority
AU
Australia
Prior art keywords
predetermined
screen
selectable
determining
representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2007360500A
Inventor
Leonid Minutin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Application filed by Individual
Publication of AU2007360500A1
Current legal status: Abandoned

Description

PASSIVE POINTING/TOUCHING INTERFACE METHOD

FIELD OF THE INVENTION

The present invention relates to pointing and/or touching based interfaces. More particularly, the invention relates to pointing/touching interfaces based on a finger and/or a passive pen-like object.

BACKGROUND OF THE INVENTION

Various types of interfaces exist for inputting commands to a computer or other information system, usually one having a display. Such interface devices may, for example, take the form of a computer mouse, joystick or trackball. As computer systems spread, other types of interfaces were designed. Touch-based systems, for example, are well known. They typically include a touch screen having a touch surface on which contacts are made using a pointer or other pointing object. Pointer contact with the touch surface is detected and used to generate output pointer position data representing the area of the touch surface where the contact was made. There are basically two general types of touch systems, which can be broadly classified as "active" and "passive". Active touch systems allow the user to generate input by contacting the touch surface with a special pointer that usually requires some form of on-board power source (typically batteries) and a receiving and/or transmitting circuit. The special pointer emits signals such as infrared light, ultrasonic frequencies, electromagnetic frequencies, etc., that activate the touch surface. Passive touch systems allow the user to generate pointer position data by contacting the touch surface with a passive pointer and do not require a special pointer to activate the touch surface. The passive pointer can be a finger, a pen-like object or any other suitable object that can be used to contact a predetermined area of interest on the touch surface.
In US Patent No. 6,803,906, entitled "Passive Touch System and Method of Detecting User Input", a touch system is disclosed that includes a touch screen coupled to a master controller and a computer coupled to the master controller. The computer executes one or more application programs and provides display output that is presented on the touch screen. The touch screen includes a touch surface in the form of a rectangular planar sheet of material bordered by a rectangular bezel or frame. A digital camera is mounted adjacent to each corner of the touch screen and is aimed so that its field of view encompasses a designated edge of the touch surface. In this way the entire touch surface lies within the fields of view of the digital cameras, and the fields of view overlap so that a pointer appears within the fields of view of at least two of the cameras. This allows the position of the pointer relative to the touch surface to be calculated using triangulation. During the calculation a fixed error-correcting calibration angle is subtracted from the calculated angles to take the angular offsets of the digital cameras into account. This calibration assumes that the angular offsets of the cameras are known and equal. Unfortunately, the angular offset of each camera usually differs, and it may change during shipping, installation, etc. of the touch system. As a result, the angular offsets of the cameras are typically not properly compensated for by the fixed calibration angle. Unless the actual angular offsets are known, the pointer position calculated by triangulation from the measured angles may differ significantly from the actual position. To complicate matters, the calculated position may vary significantly depending on which pair of digital cameras supplies the image data used for triangulation. This makes accurate calculation of the pointer position relative to the touch screen slow and difficult.

US Patent No. 6,927,386, entitled "Optical position detecting device and recording medium including an operational defect judgment", discloses a system based on light retro-reflectors. In this invention at least two optical scanners emit an angularly scanned, parallel laser light across the display screen. Reflecting means located around the screen return this light to at least two light-receiving elements. During a touching operation the user places a light-blocking member, e.g. a finger or pen-like object, into the aperture of the emitting scanners. If the light-blocking member is present in the aperture, i.e. in any path of the scanning light, the scanning light along that path is blocked and the corresponding reflected light does not reach the light-receiving element, so a blocked range appears in the signals received from the light-receiving elements. From the location and size of this blocked range in the received signal the position of the light-blocking member may be detected. The calculations are based on triangulation and convert the position and size of the blocked range (i.e. the angles of the non-reflected light paths) into orthogonal screen coordinates.
This approach requires precise, stabilized and rather complicated optical scanners with a significant scan angle, together with a set of very precisely positioned retro-reflecting elements, and it is difficult to give them accurate and stable parameters. The optical system is sensitive to dust, and its cleanliness is hard to maintain in an ordinary environment. The invention is also sensitive to vibration and other mechanical factors. The calculation of orthogonal screen coordinates is rather complicated, and the accuracy of the coordinate determination differs considerably between different screen areas.

US Patent No. 6,953,926, titled "Method and devices for opto-electronically determining the position of an object", describes a method and devices for optoelectronic object position determination using a measurement assembly that is separated from the object by a medium permeable to at least one type of specified radiation. The measurement assembly consists of at least one active sensor zone in the medium comprising at least two measuring paths (x, y), which are maintained by at least two radiation sources emitting the specified radiation and by at least one radiation receiver allocated to the radiation sources. The receiver detects the reflection returned by the object and generates a detection signal corresponding to the received radiation. The specified radiation is emitted by the radiation sources and the reflection returned by the object is detected in order to allocate initial values (U(t), U_R(d)) to the individual measuring paths. An evaluation unit determines the position and/or displacement of the object by determining a specified angular curve of the object in relation to the radiation sources from the initial values (U_R(t), U_R(d)), for a known spatial relationship between the radiation sources.

As in the prior art noted above, US Patent No. 6,953,926 determines the coordinates of the pointing object relative to the screen. This determination uses analog parameters of the superposition of at least two radiations (i.e. radiations from at least two radiation sources) and the amplitudes of the signals produced by the analog elements of the measurement assembly, and is therefore sensitive to analog electronic noise. The amplitudes of the superposed radiation vary significantly across different screen areas, which impairs the accuracy of the subsequent calculation. The aperture of the radiation receiver is directed upwards into the environment and therefore receives, in addition to the desired reflections, all the "noisy" radiation of the environment, which significantly decreases the accuracy of the coordinate determination. Deriving the angular curves, and the coordinates corresponding to them, from the superposition signal is a rather complicated process requiring significant computing resources, which is undesirable for the multi-threaded computer processes typical of recent applications. Furthermore, there are at least two symmetrical points having the same superposition amplitudes, so it is a significant problem to determine which point has actually been touched. The radiation from two (or more) sources arrives at the touching object from different angles at different points, and therefore the reflecting conditions also differ significantly.
This reduces the accuracy of the coordinate determination and, as a result, the effectiveness of the discussed method. The present invention has been made with the aim of solving the problems noted above, as well as others, and it is an object of the present invention to provide, in a simple and economical manner, a passive interface method covering not only touching but also pointing manipulations by the user, for all screen types (for example a common, conventional non-touch LCD screen), whose accuracy is constant over all screen areas. Another object of the present invention is to provide a compact, immediate and reliable passive pointing and/or touching interface method that does not use coordinate data and, accordingly, does not determine the coordinates of the pointing object or the value of any other parameter that analog noise could influence to a significant degree. Still another object of the present invention is to provide stable, reliable interface performance without placing high demands on the user's pointing and/or touching skill in hitting the desired graphical user interface member, even when that member is small (i.e. smaller than, for example, the tip of the user's finger).
SUMMARY OF THE INVENTION

The present invention relates to pointing and/or touching based interfaces, and more particularly to pointing/touching interfaces based on a finger and/or a passive pen-like object. Hereinafter the user's finger will usually be used as the example pointing object. The method according to the present invention is a passive (i.e. based on a passive pointing object) close-pointing and/or touching selection method in an information system, for example a computer, that has a conventional (non-touch) screen presenting a graphical user interface. The passive pointing object has no power source (batteries), no electronic circuit, and no active receiving and/or transmitting circuits or elements based on magnetic, acoustic or any other physical effect. To provide the interface according to the present invention, the information system has, besides a screen presenting a graphical user interface, at least one receiving optical sensor (e.g. a photodiode), usually located outside the screen frame, whose aperture covers only the close neighbourhood of the screen surface. The optical sensor forms a sensor signal representative of the optical beams within its aperture. The information system also has a means and/or a program for processing the signal(s) and/or digital code combination(s).

In practice most of the component parts (members) of the screen graphical user interface are selectable members intended for interaction with the user. By analogy with Windows, the selectable members are the elements of the graphical user interface on which the user may click, whereupon the system (computer) reacts by starting a program, procedure or other action attached to the clicked selectable member. Selectable members are, for example, screen buttons, icons, active links, menu items, and so on. In the present invention the information system, using a drawing program, draws each selectable member (and/or the screen surroundings of that member's view) with a special periodic graphics mode; the period of this graphics mode will hereinafter usually be called the period of representation, and it is discussed later. Notwithstanding the use of this special graphics mode, the graphical user interface is visible to the user just as an image drawn by a customary conventional screen. Within the period of representation the special graphics mode comprises two different drawing modes: in one part of the period, conventional screen drawing according to the content of the screen graphical user interface; and in the second part of the period (very short and not visible to the user), drawing of a special sequence consisting of short, quick changes of the brightness and/or colour of the screen representation of the selectable member (and/or its surroundings). These changes are short and quick enough not to be visible to the user. For the selectable members (and/or their surroundings) the brightness/colour changes may be executed in parallel and/or sequentially. In the parallel mode the brightness/colour changes of several selectable members (and/or their surroundings) are executed in the same time interval. In the sequential mode the brightness/colour changes are executed at any given moment usually for only one selectable member (and/or its surroundings).
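As an illustration of this two-part period of representation, the sketch below shows how a drawing program might decide, frame by frame, which emitting zones receive their brief light-sequence impulse while everything else is drawn conventionally. It is only a minimal sketch: the names (`SelectableMember`, `zones_to_flash`), the 30-frame period and the encoding of a sequence as a set of impulse frame positions are assumptions made for illustration, not details fixed by the disclosure.

```python
from dataclasses import dataclass
from typing import List

PERIOD_FRAMES = 30  # assumed period of representation, in screen frames


@dataclass
class SelectableMember:
    name: str
    impulse_frames: List[int]  # frame positions (0..PERIOD_FRAMES-1) of its individual light sequence


def zones_to_flash(members: List[SelectableMember], frame_index: int) -> List[str]:
    """Return the names of the emitting zones whose light-sequence impulse falls on
    this frame; every other zone is drawn conventionally on this frame."""
    phase = frame_index % PERIOD_FRAMES
    return [m.name for m in members if phase in m.impulse_frames]


# Example: two members whose single-impulse sequences are radiated in parallel
members = [SelectableMember("icon-A", [0]), SelectableMember("icon-B", [3])]
for f in range(6):
    print(f, zones_to_flash(members, f))   # frame 0 -> icon-A, frame 3 -> icon-B, others -> none
```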
Of course, the parallel and sequential modes may be used simultaneously for different groups of selectable members (and/or their surroundings). In the present invention each selectable member (and/or its surroundings) has an individual, unique sequence of changes in the brightness and/or colour of its screen representation. This sequence will hereinafter also be called the "predetermined individual light sequence", or simply "light sequence" or "individual sequence". Within the period of representation the light sequences, as noted above, typically occupy a short part. If the view of a given selectable member is not small relative to the dimensions of the pointing object (e.g. the finger tip), the quick and very short impulses of the light sequence may be radiated by the screen view of that member itself (an icon view, for example). Usually, however, and especially when the dimensions of the member's view are small, the member view's surroundings on the screen are also used to present the light sequence (or are even used instead of the member's view). As a rule this is done for the small views of graphical interface members on a compact screen, and/or when the graphical interface contains a large enough number of selectable members whose views are consequently small owing to the shortage of screen area. A screen zone comprising the selectable member view and/or its surroundings, and taking part in presenting the individual light sequence, will hereinafter be called the "emitting zone" or "light sequence emitting zone". Different selectable members on one screen may have emitting zones of all types: zones comprising only the member view (an icon view, for example), zones comprising the member view plus its surroundings (the icon view plus, for example, 5-15 mm around it), or zones comprising only the member's surroundings. The emitting zone may have any form, but a rectangular form is the most convenient.
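For the rectangular case, one simple way to derive an emitting zone is to pad the bounding box of the member's view with a surroundings margin (the 5-15 mm mentioned above) and clip the result to the screen. The helper below is a hypothetical sketch; the pixel-per-millimetre conversion and the screen dimensions are assumed values, not part of the disclosure.

```python
def emitting_zone(view_box, margin_mm=10, px_per_mm=5, screen_w=1920, screen_h=1080):
    """view_box = (left, top, right, bottom) of the selectable member's view, in pixels.
    Returns the emitting zone rectangle: the view plus its screen surroundings,
    clipped to the screen bounds."""
    pad = margin_mm * px_per_mm
    l, t, r, b = view_box
    return (max(0, l - pad), max(0, t - pad),
            min(screen_w, r + pad), min(screen_h, b + pad))


print(emitting_zone((600, 400, 680, 480)))  # -> (550, 350, 730, 530)
```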
The brightness/colour changes in the screen representation (i.e. the presence of the light sequence on the screen) will not be visible to the user if their frequency is above the human flicker threshold; this threshold also depends on the duty ratio of the light impulses, more particularly on the ratio of impulse duration to period (the pulse/period ratio). The threshold also depends on the power of the light impulses, but for conventional screens the powers used lie within a narrow interval, since the human eye is highly sensitive. The light-emitting power of the screen (the "brightness" setting) is typically constant during the interface session, so the influence of impulse power on the flicker threshold is not discussed here. One point only should be mentioned: if the light impulses of any light sequence become visible to the user, it is always possible to increase the impulse frequency (for example by re-allocating the parallel/sequential modes of light sequence representation among the selectable members). For a conventional screen the human flicker threshold is about 50-60 Hz for impulses with a duty ratio of 0.5 and about 30-35 Hz for a duty ratio of 0.1. It is of course very desirable for the user's comfort that the changes in the screen representation of all selectable members be invisible (i.e. free of any flicker); the conditions under which the brightness/colour changes in the representation of the selectable members are not visible to the user are discussed later.

At the beginning of an interface session with the information system, the user places a passive pointing object (e.g. his/her finger) opposite to, but not close to, the screen plane. The finger begins to reflect the screen light arriving at the finger tip. While it remains at some distance from the screen surface, however, the finger is not within the aperture (i.e. the field of view) of the receiving optical sensor, since the aperture covers only the close neighbourhood of the screen surface. The light reflected by the finger tip therefore does not reach the optical sensor input and, accordingly, the signal formed by the optical sensor contains no impulses, i.e. their magnitude is zero. The environment is likewise outside the optical sensor's aperture; its external light beams do not reach the optical sensor either and do not produce any impulses in the sensor signal. In this case the present invention does nothing. As the user moves his/her finger closer to the screen plane to hit the desired selectable member, the light reflected by the finger tip increases. When the finger tip enters the boundary of the optical sensor's aperture, i.e. comes very close to or touches the screen surface, the reflected light naturally reaches its maximal value. In this finger position, i.e. within the aperture of the optical sensor, the reflected light arrives at the optical sensor input. Corresponding to this near-maximal or maximal light reflected by the finger tip, the optical sensor forms a signal whose impulses have near-maximal or maximal magnitude. If the finger is located opposite the screen view of a certain selectable member (i.e.
opposite the emitting zone of that selectable member), the light reflected by the finger tip, and accordingly the signal formed by the optical sensor, correspond exactly to the light sequence radiated by that emitting zone. To determine that the finger tip is close to the screen (or touching the screen surface) opposite an emitting zone, the magnitude of the impulses in the sensor signal is compared with a predetermined threshold. To determine which selectable member is being pointed at or touched by the user, the present invention determines which of the predetermined individual sequences attached to the selectable members corresponds to the sequence of impulses in the sensor signal. The selectable member being pointed at or touched is the one whose predetermined individual light sequence corresponds to the sequence of impulses in the sensor signal. The predetermined threshold must be set slightly below the maximal possible impulse magnitude in the sensor signal; with such a threshold it is possible to determine whether an impulse's magnitude is close to the maximal value or not. Using SETTINGS, analogously to Windows, the user may adjust the predetermined threshold level according to his/her experience and wishes. By running, for example, two instances of the present method in parallel with different predetermined thresholds, it is possible to distinguish two finger positions relative to the screen plane, for example in order to differentiate pointing from touching actions.

In a first embodiment of the present invention each selectable member is assigned a unique predetermined individual light sequence comprising, typically, one or two short light impulses located at different positions within the period of representation. The first embodiment determines the magnitude of the impulses in the optical sensor signal and compares it with the predetermined threshold. This comparison is carried out over an interval of not less than a predetermined number of frames, so that every impulse is compared at least once. If, within this predetermined number of frames (i.e. for at least the predetermined number of impulses), the compared impulses have a magnitude exceeding the predetermined threshold, the first embodiment then correlates the optical sensor signal with the predetermined individual light sequences corresponding to the selectable members present on the screen. If among these predetermined individual light sequences there is a certain individual sequence with which the signal correlates, this means that the user is pointing at (or touching) the selectable member to which that individual sequence belongs, and in this case the first embodiment selects that member. The correlation is performed over a time interval of not less than the control frame quantity. The control frame quantity is not less than the frame position of the last brightness/colour change (within the period of representation) among all the predetermined individual light sequences; it is most useful to make the control frame quantity equal to the number of frames in the period of representation.
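A minimal sketch of this first-embodiment logic is given below, with assumed names and parameter values. The threshold test and the matching of above-threshold impulse positions against each member's predetermined individual light sequence stand in for the threshold comparison and the correlation described above; a real implementation would of course operate on the sampled sensor signal over the full control frame quantity.

```python
PERIOD_FRAMES = 30
THRESHOLD = 0.8          # assumed predetermined threshold, slightly below the maximal magnitude
MIN_IMPULSES = 1         # assumed predetermined quantity of above-threshold impulses

# predetermined individual light sequences: member name -> impulse frame positions (assumed values)
LIGHT_SEQUENCES = {"icon-A": {0}, "icon-B": {3}, "icon-C": {7, 15}}


def select_member(samples):
    """samples: sensor impulse magnitudes over one period of representation
    (PERIOD_FRAMES values, one per screen frame). Returns the selected member or None."""
    hits = {i for i, s in enumerate(samples[:PERIOD_FRAMES]) if s > THRESHOLD}
    if len(hits) < MIN_IMPULSES:
        return None                      # finger far away, barely inside the aperture, or noise
    for name, seq in LIGHT_SEQUENCES.items():
        if hits == seq:                  # binary "correlation": impulse positions coincide
            return name
    return None
```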
In a second embodiment of the present invention each selectable member is assigned a predetermined individual number, which is likewise unique for each selectable member. The individual numbers are coded in serial digital form into the light sequences of the selectable members. In this case all light sequences are usually radiated in parallel, since for a significant number of selectable members the individual numbers comprise a significant number of bits and therefore have a significant individual duration. If the sequential mode were used, the total duration of these light sequences (i.e. the sum of the individual durations) would take too much time and, accordingly, a significant part of the period of representation; as a result the changes of brightness and/or colour of the emitting zone representation could become visible to the user. As in the first embodiment, the second embodiment determines the magnitude of the impulses in the optical sensor signal and compares it with the predetermined threshold. This comparison is carried out over an interval of not less than a predetermined number of frames. If it is determined that, within at least the predetermined number of frames, at least the predetermined number of impulses have a magnitude exceeding the predetermined threshold, the second embodiment forms a representing code combination from the sensor signal and detects the value coded in it. The second embodiment then determines whether this value is equal to any of the individual numbers in use and, if so, selects the selectable member to which that individual number was assigned.
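The decoding step of the second embodiment can be sketched as follows, again with assumed names and values: the individual number is taken to occupy a fixed window of bit frames within the period (one bit per frame, most significant bit first), the sensor samples are binarized against the predetermined threshold, and the recovered value is looked up among the numbers assigned to the selectable members. The bit width, bit-window position and bit order are assumptions made for illustration, not details fixed by the disclosure.

```python
THRESHOLD = 0.8
BITS = 8                       # assumed width of the individual numbers
BIT_START = 0                  # assumed frame position where the serial bits begin

# individual numbers attached to the selectable members (assumed values)
MEMBERS_BY_NUMBER = {0b00000101: "icon-A", 0b00001001: "icon-B"}


def decode_member(samples):
    """samples: one period of sensor magnitudes; returns the selected member or None."""
    bits = [1 if samples[BIT_START + k] > THRESHOLD else 0 for k in range(BITS)]
    if sum(bits) == 0:
        return None                            # no impulses: finger not inside the aperture
    value = int("".join(map(str, bits)), 2)    # MSB first, an assumed convention
    return MEMBERS_BY_NUMBER.get(value)


# Example: a period whose bit window reads 00000101 -> the member numbered 5
samples = [0.9 if k in (5, 7) else 0.1 for k in range(30)]
print(decode_member(samples))   # -> "icon-A"
```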
A third embodiment of the present invention is intended for interaction with small screens and/or with compactly placed selectable members having small, tightly packed views (the view of a calculator with its set of button views, for example). In these cases, because of the shortage of screen area, the light sequence emitting zones of the selectable members may partly overlap one another (while the views of the selectable members themselves usually do not overlap). The third embodiment uses at least two optical sensors located opposite one another. When the user's finger points at or touches the view of a certain desired selectable member on the screen, the finger tip also covers the surrounding selectable members and, accordingly, their emitting zones. As a result the finger tip reflects not only the "desired" light sequence from that member but also the "foreign" light sequences from the surrounding members. To determine the valid light sequence the third embodiment forms a representing generic code combination from at least two signals of the opposite optical sensors; this generic code combination carries the valid light sequence. This allows the third embodiment to determine and select the desired selectable member in spite of its small dimensions and the nearness of other selectable members (one possible combination rule is sketched after the next paragraph).

Other embodiments of the present invention determine a valid single touch (or single point) by detecting the time interval(s) characterizing the pointing/touching manipulation, and some provide double-touch (or double-point) selecting actions. Monitoring the duration of the finger tip's presence close to (or, more typically, touching) the screen surface, and of the finger's presence outside the optical sensor aperture, makes it possible to provide a first-mode selecting function and a second-mode selecting function. These may be analogous, for example, to the left- and right-button click functions, or to the single- and double-click functions, of conventional mouse-based interfaces. Moreover, monitoring the duration of touching manipulations allows other embodiments to recognize short accidental screen touches (or short accidental presences of the finger close to the screen plane), which often occur during an interface session, and not to react to them. Since such accidental touches cause no reaction of the present interface, its availability, reliability and convenience for the user are increased.
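The disclosure does not fix how the representing generic code combination of the third embodiment is formed from the two opposite sensor signals. One plausible rule, sketched below purely as an assumption, is to keep only the impulse positions that both sensors see above the predetermined threshold, on the reasoning that only the zone directly under the finger tip illuminates it strongly enough to be reflected toward both sensors.

```python
THRESHOLD = 0.8
LIGHT_SEQUENCES = {"key-1": {2}, "key-2": {5}, "key-3": {9}}   # assumed single-impulse sequences


def generic_code(lower_samples, upper_samples):
    """Combine the binarized signals of two opposite sensors into one code combination.
    The intersection rule used here is an assumption standing in for the patent's
    'representing generic code combination'."""
    lower = {i for i, s in enumerate(lower_samples) if s > THRESHOLD}
    upper = {i for i, s in enumerate(upper_samples) if s > THRESHOLD}
    return lower & upper


def select_member_two_sensors(lower_samples, upper_samples):
    code = generic_code(lower_samples, upper_samples)
    for name, seq in LIGHT_SEQUENCES.items():
        if code == seq:
            return name
    return None
```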
In the description of the present invention the term "magnitude" is used as the parameter of the light and of the signal, instead of the term "amplitude". This is purely for convenience, for the following reasons. As noted above, the emitting zone radiates the light sequence as changes in the colour and/or brightness of its screen representation. These changes are short impulses in the amplitude of one, two or three fundamental light colour components. In most cases the brightness of the screen representation of the selectable members (more particularly, of the emitting zones) changes, so a change of brightness amplitude takes place. But colour changes may also take place, and sometimes a colour change does not cause any quantitative change of brightness, i.e. the brightness amplitude is unchanged; in such cases the term "magnitude" is more convenient. Physically, the magnitude here characterizes the amplitude of the change of any one (or two) of the colour components (for such cases the optical sensor must be colour-sensitive and comprise at least two colour-sensitive optical receiving elements). The term "magnitude" is therefore used to abstract from the type of light change (brightness and/or colour), from which colour components take part in the change, from the logic of the change (negative/positive impulses), and from the presence or absence of a constant component in the sensor signals.

The present interface method does not determine the coordinates of the pointing/touching object (for example, the finger); coordinates are entirely absent from the present method. In the present invention the magnitude of the impulses in the signal formed by the optical sensor is used only to determine whether the predetermined threshold is exceeded (i.e. for "event determining") and is not used to determine coordinates or any other parameter (i.e. for "value determining"). Such event determining has a discrete, digital nature and accordingly has all the advantages of digital processes: exactness, noise stability, simplicity. The locations of the selectable members, and the spatial relationships between them, are of no importance. A selectable member may therefore have arbitrary coordinates, including changing ones, i.e. it may be moving on the screen; none of these circumstances affects the present method. At the same time it is important that the user has no difficulty hitting a small and/or moving selectable member on the screen, since it is enough to hit the emitting zone, which comprises not only the member's view but also its screen surroundings. The emitting zones may overlap one another without affecting the interface performance, i.e. there is no obstacle when the selectable member views are small (smaller than the pointing/touching object, for example the finger tip) and tightly packed. The individual "name" of the selectable member being pointed at or touched by the user (i.e. its individual sequence or individual number) is coded into the light reflected by the finger tip and carried, in digital form, in the optical sensor signal; it "points" at that selectable member directly, without any coordinates or other intermediate parameters or operations. The absence of operations based on analog values makes the present method independent of the sensor signal noise always present in real circuits and of light noise from the environment. The digital nature of the event determining (exceeding the predetermined threshold), the digital nature of the information directly "naming" the pointed-at selectable member, the absence of any determination of coordinates or other parameters from the analog signal, the very narrow aperture of the optical sensor excluding noisy light beams from the environment, the inclusion in the emitting zone not only of the selectable member's view but also of its surroundings, the reliable hitting of small or moving selectable members, the provision of two selecting functions, the possibility of using not only touching but also pointing manipulations, and the insensitivity to accidental touches or shifts during the interface session together give an interface based on the present invention additional advantages and high stability, reliability and convenience.
BRIEF DESCRIPTION OF THE DRAWINGS

FIGs. 1A through 1D illustrate probable locations of the receiving optical sensor(s) around the screen of the information system (a laptop, for example). FIG. 2 illustrates a probable layout of the selectable member views (for example, customary book-like icons) of the graphical user interface on the screen and of their screen surroundings used as the light sequence emitting zones. FIG. 3 illustrates the pointing object (for example, a finger) located far from the screen plane (more particularly, outside the optical sensor aperture); it depicts a section through the laptop with the selectable members (and, accordingly, their light sequence emitting zones) located on the screen as in FIG. 2, light beams from the environment, and the traces of the light beams from the screen to the finger (initial light) and from the finger (reflected light). FIGs. 4A through 4C illustrate the finger entering the aperture boundary while being moved to select a certain selectable member. FIG. 4A depicts the laptop section, analogously to FIG. 3, with the finger position when its tip slightly enters the aperture boundary. FIG. 4B shows the same finger tip position relative to the aperture boundary at a larger scale.
FIG. 4C shows time charts of: two probable light sequences from the two emitting zones; the light reflected downward from the finger tip for the situation of FIGs. 4A and 4B; the optical sensor signal; and the impulse magnitude relative to the predetermined threshold. FIG. 4C also shows the representing code combination relative to the period of representation. FIGs. 5A through 5C illustrate the approach of the finger close to the screen plane, or even touching it. FIG. 5A depicts the laptop section and the finger position when its tip has closely approached, or even touched, the screen plane. FIG. 5B shows the same finger tip position relative to the aperture boundary and the screen surface at a larger scale. FIG. 5C shows time charts of: two probable light sequences from the two emitting zones; the light reflected downward from the finger tip for the situation of FIGs. 5A and 5B; the optical sensor signal; and the impulse magnitude relative to the predetermined threshold. FIG. 5C also shows the representing code combination relative to the period of representation. FIGs. 6A through 6D illustrate the performance of the embodiment of the present invention having optical sensors located opposite one another, for the case of emitting zones overlapping one another. FIG. 6A depicts a probable layout of the selectable members (for example, book-like icons) of the graphical user interface on the screen and of the surroundings of their screen views used as the light sequence emitting zones; the emitting zones overlap one another. FIG. 6B depicts the laptop section with the selectable members and their overlapping emitting zones as in FIG. 6A. FIG. 6C shows the finger tip position relative to the aperture boundary, the screen plane and the emitting zones of FIG. 6B at a larger scale. FIG. 6D shows time charts of: three probable light sequences from the three overlapping emitting zones; the two lights reflected from the finger, directed downward and upward, for the situation of FIG. 6C; the two signals from the lower and upper optical sensors; and the impulse magnitudes of the two sensor signals (lower and upper) relative to the predetermined threshold. FIG. 6D also shows the representing generic code combination relative to the period of representation.

Some of the figures contain time charts. The charts show sequences, signals and codes over the period of representation, which has a predetermined number of frames equal, for example, to 30 frames. The sequences and signals comprise impulses usually having the minimal possible duration of one frame, and this duration is used in the charts. For clarity the impulse duration shown is not exactly proportional to the period of representation, i.e. on the charts it is not exactly 1/30 of the period (this does not affect the correctness of the description).

DETAILED DESCRIPTION OF THE INVENTION

To provide the interface according to the present invention, the information system has a screen presenting a graphical user interface and at least one (for some embodiments at least two) receiving optical sensor(s) located outside the screen frame, with aperture(s) covering the close neighbourhood of the screen surface.
The selectable members of the graphical user interface (more particularly, their emitting zones) emit, during one part of the period of representation, the conventional light corresponding to their view in the graphical user interface content and, during a usually very short second part of the period, the periodic sequences of brightness/colour changes of the emitting zone representation. Each such periodic sequence is unique and predetermined for each selectable member and is named, as noted above, the "predetermined individual light sequence", or simply "light sequence" or "individual sequence". Under particular conditions the optical sensor(s) form sensor signal(s) representative of the light sequence of the pointed-at/touched selectable member. The information system has a means and/or a program for processing the signal(s) or code combinations; using this means it is possible to detect which of the predetermined individual light sequences caused the digital sequence present in the optical sensor signal(s). The aperture of the optical sensor may cover part or all of the screen area. If necessary, several optical sensors, possibly colour-sensitive, may be used.

FIGs. 1A through 1D illustrate probable locations of the receiving optical sensor(s) around the screen of the information system (a laptop, for example). They depict the laptop 1, having screen 2, with probable positions of a single receiving optical sensor (3-1 in FIG. 1A) or of several sensors (3-1 through 3-6 in FIGs. 1B through 1D). For each optical sensor (3-1 through 3-6) its aperture (4-1 through 4-6 respectively) is shown as an angle in the plane of the screen 2. FIG. 1C depicts the laptop 1 with a probable placement of several optical sensors 3-1 through 3-4, where the group of sensors may be divided into pairs of sensors located opposite one another. For the placement shown in FIG. 1C each point on the screen surface is observed by at least one pair of opposite sensors. As will be described below, this opposite sensor arrangement may be used for interacting with an information system whose screen area is too small for a complicated graphical user interface, and/or when the graphical user interface contains several tightly grouped selectable members with small views (relative to the dimensions of the finger tip; for example the button assembly of a calculator's view). In these cases the light sequence emitting zones often overlap one another. FIG. 1D depicts the laptop 1 with two optical sensors 3-5 and 3-6 located opposite one another on the lateral edges of the screen 2. This arrangement of the sensors 3-5 and 3-6 may be used when the user has long fingernails (as is typical for ladies): a long fingernail usually hinders the passage of the reflected light in the upward direction, so instead of the downward/upward reflected light directions the right/left reflected lights may be used with the same results. This arrangement also, of course, allows the present invention to perform in cases where the graphical user interface has small and/or tightly grouped selectable member views. In a first embodiment of the present invention (see claim 1) all the sensor locations presented above may be used (and others are of course possible).
Take, for example, a scheme with two optical sensors located as shown in FIG. 2. FIG. 2 shows the laptop 1, its screen 2 and the two optical sensors 3-1 and 3-2. It also illustrates a probable layout of the selectable members, for example book-like icons, on the screen 2. These selectable members have light sequence emitting zones 5-1 and 5-2, comprising the icon view together with its screen surroundings; their boundaries are marked in FIG. 2 with dotted lines (these lines are of course virtual boundaries and are usually not drawn in the screen content). Since the icon views here are large enough (relative to the size of the finger tip), the icon view alone could also be used as the emitting zone; the use of the icon's surroundings as a component of the emitting zone is shown here for illustration. The boundaries of the emitting zones 5-1 and 5-2 need not be rectangular; they may have an intricate form following the contours of the icon views, or any other form - triangular, circular and so on. Typically, however, the emitting zones are rectangular (as shown in FIG. 2 with dotted lines), this being the simplest form.

While interacting with the information system the user brings the pointing object (a finger, for example) nearer to the screen plane to select a certain selectable member. This approach starts from some point at a distance from the screen. Such a starting point of the selection process is shown in FIG. 3. It depicts a section through the laptop 1 with its screen 2, the cutting plane passing through the emitting zones 5-1 and 5-2 and the receiving optical sensor 3-1. The emitting zones 5-1 and 5-2, marked with arrows and braces, correspond to the subject of FIG. 2, in particular to the book-like selectable members noted above. FIG. 3 also shows a pointing object (finger) 10 at a considerable distance from the plane of the screen 2. The optical sensor 3-1 has the receiving optical aperture 4-1 with boundary 9. The aperture 4-1 covers only the close neighbourhood of the surface of the screen 2, for example 3-8 mm from the screen plane. Such an aperture 4-1 may be provided by special optical lenses or by the opaque bar 8; the latter variant is shown in FIG. 3. The emitting zones 5-1 and 5-2 emit the light radiations (light sequences) 6-1 and 6-2 respectively. The finger 10 tip reflects the light in all directions; in particular, FIG. 3 marks the downward reflected light part 7-1 and the upward reflected light part 7-2. Light from the environment is also shown in FIG. 3 (without a reference number). Obviously the reflected lights 7-1 and 7-2 do not reach the input of the optical sensor 3-1 (since the finger 10 tip is not within the aperture 4-1). The light from the environment also cannot reach the input of the optical sensor 3-1, since all light sources of the environment are likewise outside the receiving optical aperture 4-1. Therefore the signal of the optical sensor 3-1 contains no impulses, and in this case the present invention (all embodiments) does not react. The finger 10 position discussed here may be used for starting/ending/interrupting the interface session, or for changing the finger 10 position relative to the screen 2 in order to choose and hit the next selectable member.
All light sequences in the present invention (and of course the light sequences 6-1 and 6-2 of the first embodiment under discussion) comprise two parts. One part is the brightness and colour representation (drawing) of the view of the selectable member (for example, the icon) and of the view's surroundings according to the content of the graphical user interface. Such drawing is the ordinary drawing used for screens in conventional systems (e.g. computers with conventional or touch screens). The second part is the sequence of predetermined brightness and/or colour changes, usually not visible to the user. These changes typically occupy one or two screen frames, i.e. a very short time interval, and they are periodic. For the user's comfort the period of these changes is usually set from two considerations. First, it is very desirable that the changes of colour and/or brightness of the light sequences from the emitting zones be invisible to the user, i.e. that the screen graphical user interface be seen without any flicker. Second, the delay of the present interface's reaction to the user's actions must be minimal, for example not more than 0.3 s (300 ms). Modern LCD displays can, for example, present the screen content at 100 Hz (100 frames per second); this parameter is assumed in all of the following discussion. In the above time interval of 300 ms such an LCD display presents 30 frames. If the maximal delay of the first embodiment's reaction to the user's selecting manipulation is to be not more than 300 ms, then within these 300 ms all the light sequences of all selectable members present on the screen must be fully represented. The delay will lie in the interval 10-300 ms, on average 150 ms (0.15 s); the user will of course not feel such delays. In the next 300 ms all light sequences are repeated, and so on. These 30 frames (300 ms) thus constitute a period of light sequence representation that provides a maximal delay of 0.3 s, which the user practically does not feel. Moreover, such a delay would appear only for one or two selectable members out of several thousand present on the screen simultaneously (this aspect is discussed below). In most graphical user interfaces the number of selectable members simultaneously present on the screen is not more than 100-200. For such quantities, with a period of representation of 30 frames, the maximal delay between the user's manipulation and the information system's reaction is around 100 ms (0.1 s), i.e. far below the human perception threshold.
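These timing figures follow from a single relation: the period of representation, in frames, equals the display frame rate multiplied by the maximum tolerated reaction delay. A quick check with the values assumed in the text (100 Hz, 300 ms):

```python
frame_rate_hz = 100          # frames per second of the assumed LCD display
max_delay_s = 0.3            # maximum tolerated reaction delay

period_frames = int(frame_rate_hz * max_delay_s)   # 30 frames per period of representation
average_delay_s = max_delay_s / 2                  # roughly 0.15 s on average
print(period_frames, average_delay_s)              # -> 30 0.15
```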
A few words about flicker. During the 30-frame period there are the two parts of the selectable member view's representation noted above: one conventional (i.e. with constant parameters) and one carrying the short impulses (i.e. carrying the light sequences). If, within these 30 frames, 2 frames are used for the second part of the representation (i.e. for the light sequence), then 28 frames remain for the first part. For a ratio of 2 second-part frames to 30 period frames (i.e. a low pulse/period duty ratio of about 0.066) the user will not see the impulses of the light sequence; he/she will see only the static view, without any flicker, corresponding to the customary graphical user interface content. In other words, the screen view will appear to the user exactly as a view formed by a conventional computer.

Since the brightness/colour impulses of the second part of the representation usually have a duration of one frame, there are 30 positions for these impulses within the 30 frames of the period of representation. Let us estimate the possible number of selectable members that may be represented simultaneously with a two-position light sequence (i.e. a light sequence coded with one or two impulses) over the thirty-position period of representation - in other words, how many selectable members may be present on the screen at the same time. The first group of light sequences, and accordingly of selectable members, is the group whose light sequences are coded with only one impulse. Within a period of 30 frames the single impulse may be placed at 30 different positions (in 30 different frames), i.e. there are 30 possible light sequences (and accordingly 30 selectable members) that may be coded with a single impulse using brightness changes. If the three colour light components - RGB (Red-Green-Blue) - are used (i.e. colour-change representation), there are six possible combinations (R, G, B, R+G, R+B, B+G), giving 30x6 = 180 variants of different light sequences, and accordingly 180 selectable members, in the first group. The second group of light sequences, and accordingly of selectable members, is the group whose light sequences are coded with two impulses. Two impulses have many more possible placements within the 30-frame period. If the first impulse is placed in the first position (the first frame), the second impulse may occupy any of the remaining 29 positions; if the first impulse is placed in the second position, the second impulse may occupy any of the remaining 28 positions; and so on. If, for example, the first impulse is placed in the twelfth position, 18 positions remain for the second impulse. Summing all such placements gives 435 different position pairs (29 + 28 + ... + 1), i.e. about 435 different individual light sequences may be coded with two impulses using brightness changes. With the three colour light components this gives roughly 2,600 possible combinations. In total, therefore, close to 3,000 selectable members may be represented on the screen, which is far more than the usual number of selectable members in any information system (at most 100-200; to represent such a quantity only part of the period of representation is needed, which allows very short delays in the system's reactions to the user's manipulations). Moreover, it is possible to use three frames for the light sequence representation, in which case the possible number of individual sequences exceeds 30,000. There are therefore practically no restrictions on the number of selectable members present on the screen.
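The counting argument can be reproduced mechanically with the binomial coefficient. The small calculation below, using the 30-frame period and the six colour variants assumed in the text, gives 435 two-impulse placements and about 2,800 sequences in total, of the same order as the figure of close to 3,000 quoted above.

```python
from math import comb

PERIOD_FRAMES = 30
COLOUR_VARIANTS = 6              # R, G, B, R+G, R+B, G+B

one_impulse = comb(PERIOD_FRAMES, 1)      # 30 positions for a single impulse
two_impulse = comb(PERIOD_FRAMES, 2)      # 435 placements of two impulses
total = COLOUR_VARIANTS * (one_impulse + two_impulse)

print(one_impulse, two_impulse, total)    # -> 30 435 2790
```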
These conclusions apply not only to the first embodiment discussed here but to all embodiments of the present invention.
When the user has decided to select a certain selectable member of the graphical user interface, he/she begins to move his/her finger 10 from the position shown in FIG. 3 nearer to the surface of the screen 2, opposite the chosen member's view. During this movement the finger 10 tip enters the boundary 9 of the optical sensor aperture 4-1 (i.e. approaches close to the screen 2). FIGs. 4A through 4C illustrate the moment when the finger 10 tip enters the aperture boundary 9. FIG. 4A depicts the section through the laptop 1 and its screen 2. On the screen 2 of the laptop 1 is the same subject as shown above (see FIGs. 2 and 3): in particular, in accordance with FIGs. 2 and 3, FIG. 4A shows the two light sequence emitting zones 5-1 and 5-2 corresponding to the two book-like selectable members, and the optical sensor 3-1 shown in FIG. 2. The pointing object (finger) 10 (see FIG. 4A) slightly enters the boundary 9 of the aperture 4-1 of the receiving optical sensor 3-1. The opaque bar 8 ensures that the aperture 4-1 covers only the close neighbourhood of the plane of the screen 2. The boundary 9 of the aperture 4-1 is marked with a dotted line. The emitting zones 5-1 and 5-2 emit the light radiations (light sequences) 6-1 and 6-2 respectively. The finger 10 tip reflects the light (the reflected light is not shown in FIG. 4A). It is evident from FIG. 4A that practically only the light 6-1 from the emitting zone 5-1 reaches the finger 10 tip. The light 6-2 (like all the other lights) is mostly radiated by the screen 2 perpendicularly to its plane, and only a negligible part of the beams of light 6-2 travels in the direction of the finger 10. Thus practically only the light 6-1 from the emitting zone 5-1 is reflected by the finger 10 tip. It is also important that, in the situation of FIG. 4A, the source of light 6-1, i.e. the emitting zone 5-1, is located close to the finger 10 tip, which provides a significant (though of course not maximal) magnitude of the light 6-1 arriving at the finger 10 tip. The light 6-1 comprises only the light sequence belonging to the emitting zone 5-1; accordingly, the light reflected by the finger 10 tip also comprises practically only that same light sequence (with attenuated magnitudes, of course). FIG. 4B shows the discussed finger 10 tip position relative to the boundary 9 of the aperture 4-1 (marked with a dotted line) at a larger scale. Only a small part of the finger 10 tip is within the aperture 4-1, and only the very small surface marked in FIG. 4B reflects the arriving light 6-1; shown here is the downward part 7-1 of the reflected light, which can reach the lower optical sensor (not shown here; it is shown in FIG. 4A and numbered 3-1). FIG. 4C shows the time charts of the probable light sequences 6-1 and 6-2 from the two emitting zones 5-1 and 5-2 respectively, the chart of the light reflected from the finger 10 tip (see also FIG. 4B), in particular its downward part 7-1, the chart of the sensor signal 12-1 and the chart of the impulse magnitude 13-1 relative to the predetermined threshold 14. The representing code combination 15 is also shown relative to the period of representation 11. The first and second charts show the light sequences 6-1 and 6-2, which are, for example, one-impulse sequences with different impulse positions. In the light sequence 6-1 the single impulse lies in the first frame position of the period of representation 11.
In the light sequence 6-2 the single impulse lies in the fourth frame position of the period of representation 11. These impulses are the second part of the screen representation of the emitting zone (i.e. of the selectable member) noted above, comprising the changes of brightness and/or colour of the emitting zone and carrying the information identifying the emitting zone (i.e. the selectable member). Between the impulses in the light sequences 6-1 and 6-2 the chart line lies at a certain level corresponding to the light from the drawn view(s) of the emitting zone (and/or of the selectable member), drawn, as noted earlier, according to the content of the graphical user interface (usually in constant mode; such drawing is used for screens in conventional systems, e.g. computers with conventional or touch screens). As noted above, the radiation from the emitting zone 5-1 (see FIGs. 4A, 4B and 4C), i.e. the light sequence 6-1, reaches the finger 10 tip to a much greater degree than the radiation from the emitting zone 5-2 (i.e. the light 6-2). Accordingly, in the reflected light (its downward part 7-1 in particular; see the third chart in FIG. 4C) the reflected impulse corresponding to the initial impulse of light sequence 6-1 is much larger than the reflected impulse corresponding to the initial impulse of light sequence 6-2. The third chart of the reflected light 7-1 shows these two impulses with different magnitudes: the first impulse, of significant magnitude, is caused by the initial impulse of the light sequence 6-1, and the second impulse, of negligible magnitude, is caused by the initial impulse of the light sequence 6-2. Receiving the reflected light 7-1, the optical sensor 3-1 (shown in FIG. 4A) produces the signal 12-1 shown in the fourth chart of FIG. 4C; the form of the signal 12-1 is analogous to the form of the reflected light 7-1. It too has two impulses within the period of representation 11: one in the first frame position of the period 11, with significant magnitude, corresponding to the initial first-frame impulse of the light sequence 6-1, and another in the fourth frame position of the period 11, with negligible magnitude, corresponding to the initial fourth-frame impulse of the light sequence 6-2. The fifth chart shows the magnitude 13-1 of the impulses of the signal 12-1 relative to the predetermined threshold 14. The first-position impulse (see the fifth chart in FIG. 4C), though of significant magnitude, is not at the maximal magnitude, since the reflecting finger surface (see FIG. 4B), whose size determines the power and hence the magnitude of the reflected light 7-1, is quite small. The level of the predetermined threshold 14 must be chosen close to (slightly below) the maximal possible impulse magnitude (the level of the threshold 14 is discussed in more detail later), so this significant but not maximal first-frame impulse of 13-1 does not exceed the predetermined threshold 14. The negligible second impulse of 13-1, in the fourth frame position, naturally does not exceed the predetermined threshold 14 either. Since the magnitude 13-1 of the signal 12-1 does not exceed the predetermined threshold 14, the first embodiment (and likewise all other embodiments) performs no action, as follows from claim 1.
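To make this FIG. 4 situation concrete, the select_member sketch given above for the first embodiment can be fed hypothetical magnitudes of this kind: the partially reflected first-frame impulse reaches, say, 0.4 of the maximal value (an assumed figure) and therefore stays below the 0.8 threshold, so no member is selected.

```python
# Hypothetical sensor samples for the FIG. 4 case: finger only slightly inside the aperture.
samples = [0.05] * 30
samples[0] = 0.4    # attenuated reflection of the first-frame impulse of light sequence 6-1
samples[3] = 0.1    # negligible reflection of the fourth-frame impulse of light sequence 6-2
print(select_member(samples))   # -> None: nothing exceeds the threshold, no action is taken
```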
This feature of the present invention provides additional reliability: if the user accidentally and only slightly enters the bound of the optical sensor aperture, the present invention does not react to this casual event. The user may therefore move the finger freely while choosing or hovering over selectable members of the screen graphical user interface, even quite close to the screen surface, which is convenient and quick and meets the requirements of experienced users.

The case considered above, in which the user's finger only slightly enters the bound of the optical sensor aperture, thus causes no reaction of the information system. To select the desired selectable member of the screen graphical user interface, the user continues to move the finger close to the screen surface and may even touch it. FIGs. 5A through 5C illustrate such a finger position close to, or touching, the screen surface.

FIG. 5A illustrates the position in which the finger 10 tip is close to the screen 2 surface or touching it. FIG. 5A shows a cut of the Lap-Top 1, its screen 2 and the optical sensor 3-1, together with the aperture 4-1 of the optical sensor 3-1. The aperture 4-1, provided by the opaque bar 8, has the bound 9 shown with a dotted line. The screen 2 of the Lap-Top 1 displays the same content as in FIGs. 2, 3 and 4A; in particular, FIG. 5A shows the two light-sequence emitting zones 5-1 and 5-2 (see the plot in FIG. 2). The pointing object (finger) 10 in FIG. 5A is close to the screen 2 surface or touching it: the finger 10 tip has penetrated the aperture bound 9 as far as possible, and a significant part of the finger 10 tip is inside the optical sensor aperture 4-1. The emitting zones 5-1 and 5-2 emit the light radiations (light sequences) 6-1 and 6-2, respectively, and the opaque bar 8 restricts the aperture 4-1 to the close neighbourhood of the screen 2 surface. As described above, practically only the light radiation 6-1 from the emitting zone 5-1 reaches the finger 10 tip, so the light reflected by the finger 10 tip (not shown in FIG. 5A) contains practically only the light caused by this radiation 6-1; reflected components caused by the radiation of other emitting zones are practically absent. For illustration purposes, however, we will suppose that a negligible part of the light sequence 6-2 also reaches the finger 10 tip.

FIG. 5B shows this finger 10 tip position relative to the aperture 4-1 and the screen 2 at a larger scale. A significant part of the finger 10 tip is inside the aperture 4-1, and the significant surface marked in FIG. 5B reflects the radiation, in particular its downward reflected part 7-1, which reaches the lower optical sensor (not shown here; shown in FIG. 5A as 3-1). Compared with FIG. 4B, the reflecting surface of the finger in FIG. 5B is clearly much larger. Given this, and given that the reflecting surface in the present case (see FIG. 5B) is as near as possible to the light source (i.e. to the emitting zone 5-1) in comparison with FIG. 4B, the magnitude of the reflected light 7-1 in the present case (illustrated in FIGs. 5A and 5B) is much larger than the magnitude of the reflected light 7-1 in the previous case (illustrated in FIGs. 4A and 4B); indeed, the reflected light 7-1 now has the maximal possible magnitude.

FIG. 5C shows the time charts of the light sequences 6-1 and 6-2 from the two emitting zones 5-1 and 5-2 (see also FIG. 5A), the chart of the light reflected from the finger 10 tip, in particular its downward part 7-1, the chart of the optical sensor signal 12-1, and the chart of the impulse magnitude 13-1 relative to the predetermined threshold 14. The representing code combination 15 is also shown relative to the period of representation 11. The first and second charts show the same light sequences 6-1 and 6-2 discussed before (see FIG. 4C): one-impulse sequences with different impulse positions, the light sequence 6-1 having its single impulse in the first frame position and the light sequence 6-2 having its single impulse in the fourth frame position. These impulses carry the information identifying the corresponding emitting zones (i.e. selectable members); the level of the chart line between the impulses represents the views of the emitting zones according to the content of the graphical user interface.

As noted above, the radiation from the emitting zone 5-1 (see FIG. 5A), i.e. the light 6-1, reaches the finger 10 tip to a significantly greater degree than in the previous finger position discussed with reference to FIGs. 4A through 4C. The magnitude of the light 7-1 (see FIG. 5C) reflected by the finger 10 tip (see FIG. 5B) is therefore much larger than in the previous case. Since the distance between the reflecting object (the finger 10) and the radiating source (the screen surface with the emitting zone 5-1; see FIG. 5A) is close to zero, the light 6-1 reaches the finger 10 tip to the maximal possible degree. The light sequence 6-2 from the emitting zone 5-2 practically does not reach the finger 10 tip, but, as in the previous cases, suppose that a negligible amount of this light 6-2 does arrive. The reflecting surface (see FIG. 5B) is also the maximal possible. The light 7-1 reflected by the finger 10 tip therefore has the maximal possible magnitude. The reflected light 7-1 contains two impulses (see the third chart in FIG. 5C): one corresponding to the impulse of the light sequence 6-1 (in the first frame position of the period 11), which for the reasons given above has the maximal possible magnitude, and a second corresponding to the impulse of the light sequence 6-2 (in the fourth frame position), which has a negligible magnitude.

The reflected light 7-1, shown in the third chart of FIG. 5C, arrives at the input of the optical sensor 3-1 (see FIG. 5A), which accordingly forms the signal 12-1 (see the fourth chart in FIG. 5C). The first embodiment determines the impulse magnitude 13-1 of the signal 12-1; it is shown in the fifth chart together with the predetermined threshold 14. The level of the predetermined threshold 14 must be chosen a little lower than the maximal possible impulse magnitude 13-1 of the sensor signal 12-1.

The first-frame-position impulse magnitude 13-1 therefore exceeds the predetermined threshold 14, while the fourth-frame-position impulse magnitude 13-1, having a negligible value, does not; this exceeding is seen in the fifth chart of FIG. 5C. The first embodiment thus determines, within the control frame quantity (here four frame positions), that the impulse magnitude 13-1 exceeds the predetermined threshold 14. Next, the first embodiment determines with which of the light sequences the signal 12-1 correlates. The signal 12-1 in the fourth chart clearly has a form very close to that of the first chart, i.e. the signal 12-1 correlates with the light sequence 6-1. The first embodiment determines that the signal 12-1 correlates with the light sequence 6-1; this light sequence 6-1 belongs to the emitting zone 5-1 (see the plot in FIG. 2), so the first embodiment selects the selectable member related to the emitting zone 5-1. On the other hand, FIG. 5A shows that the finger 10 points at (or touches) the emitting zone 5-1; the selectable member pointed at (touched) by the user and the selectable member selected by the first embodiment are therefore the same.

The check for exceeding the predetermined threshold 14 is executed during a control frame quantity that is not less than the frame position number of the last change among all the changes in the light sequences. In the discussed case the last impulse is located in the fourth frame position, so the check is performed during at least four frames. The correlation operation is likewise executed within a certain time interval. From FIG. 5C it is evident that the minimal possible such interval is four frames: from the first frame position, holding the impulse of the light sequence 6-1, through the fourth frame position, holding the impulse of the light sequence 6-2. This interval comprises all the individual changes of all the light sequences, and within it each of those sequences can be tested, for example with the correlation operation. To perform the correlation over only these four frames, however, prior information must be available (for example, about the position of the last change); without such additional information this is effectively equivalent to performing the correlation over the full period of representation (30 frames in our case).

FIG. 5C also shows the representing code combination 15 relative to the period of representation 11. The code combination 15 comprises at most a number of bits equal to the number of frames in the period of representation 11. Each bit in a given position is set to logical "0" if the magnitude of the impulse 13-1 in the same position is less than the predetermined threshold 14, and to logical "1" if the magnitude of the impulse 13-1 in the same position exceeds the predetermined threshold 14. The representing code combination 15 thus carries, in digital form, the pattern of the sensor signal's exceeding of the predetermined threshold 14. The representing code combination 15 may therefore be used in the correlation determination (or in determining whether it is equal to a certain one of the predetermined individual sequences) instead of the raw sensor signal 12-1, as is done, for example, in one variant of the first embodiment (and in some other embodiments).
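As an illustration of how the representing code combination 15 can drive the selection in the first embodiment, the following minimal sketch (in Python, with illustrative member names, frame count and threshold, and with exact equality standing in for the correlation operation) builds the code combination from the per-frame impulse magnitudes and matches it against the predetermined individual sequences:

```python
# Minimal sketch, an assumption-laden illustration rather than the patent's implementation.

PERIOD_FRAMES = 30
THRESHOLD = 0.9

# One-impulse-per-period sequences, as in the example of FIG. 4C / 5C:
# member A pulses in frame 0, member B in frame 3 (names are illustrative).
individual_sequences = {
    "member_A": [1 if f == 0 else 0 for f in range(PERIOD_FRAMES)],
    "member_B": [1 if f == 3 else 0 for f in range(PERIOD_FRAMES)],
}

def representing_code(impulse_magnitudes, threshold=THRESHOLD):
    """Bit per frame: 1 where the impulse magnitude exceeds the predetermined threshold."""
    return [1 if m > threshold else 0 for m in impulse_magnitudes]

def select_member(impulse_magnitudes):
    """Return the selectable member whose individual sequence the code combination matches."""
    code = representing_code(impulse_magnitudes)
    if not any(code):
        return None                     # threshold never exceeded: do nothing
    for member, sequence in individual_sequences.items():
        if code == sequence:            # equality used here in place of a correlation measure
            return member
    return None

# Finger close to (or touching) the screen opposite member A: the frame-0 impulse
# exceeds the threshold and member A is selected.
magnitudes = [0.0] * PERIOD_FRAMES
magnitudes[0] = 0.95
magnitudes[3] = 0.05
assert select_member(magnitudes) == "member_A"
```

Replacing the equality test with a proper correlation measure, or restricting the comparison to the control frame quantity as discussed above, would follow the same structure.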
A second embodiment (see claim 2) of the present invention forms for each selectable member a predetermined individual light sequence that carries, in digital serial data form, a predetermined individual number. These individual numbers are attached to the selectable members, and each selectable member's number is unique. The light sequences carrying the predetermined individual numbers may be radiated simultaneously for all selectable members.

As in the first embodiment of the present invention, in the second embodiment the optical sensor(s) receives the light beams from the close neighbourhood of the screen surface and forms the signal(s) representative of those received light beams. If the pointing object (a finger, for example) is close to the screen surface or touching it, the quantity of reflected light is significant; accordingly, the magnitude of the impulses in the signal from the optical sensor is also significant and may exceed the predetermined threshold. The second embodiment determines whether this exceeding takes place and, if so, detects the frame positions in which it occurs. The detected frame positions reproduce the frame impulse positions in the light reflected by the finger, and the reflected light in turn reproduces the light sequence from the pointed/touched emitting zone. The value coded in digital serial data form in that light sequence is therefore present in the reflected light and, accordingly, in the signal produced by the optical sensor. The second embodiment detects that value from the sensor signal and then determines whether the detected value is equal to any of the individual numbers of the selectable members. If the detected value is equal to the individual number of a certain selectable member, the second embodiment selects that selectable member.

A third embodiment (see claim 3) of the present invention is intended for interaction with rather small screens or/and with compactly located selectable members having small views (a calculator view with a number of button views, for example). In these cases, because of the shortage of screen area, the light-sequence emitting zones of the selectable members may partly overlap one another (while the views of the selectable members themselves usually do not overlap). A probable location of such emitting zones is depicted in FIG. 6A.

FIG. 6A illustrates a probable location of the selectable members, for example book-like icons of the screen graphical user interface, with their screen-view surroundings used as partly overlapping light-sequence emitting zones. In particular, FIG. 6A shows the Lap-Top 1 and its screen 2 with four (for example) receiving optical sensors 3-1 through 3-4. These sensors 3-1 through 3-4 constitute two pairs of oppositely located optical sensors: the sensors 3-1 and 3-3 form one pair, and the sensors 3-2 and 3-4 form the other. The screen 2 of the Lap-Top 1 shows three selectable members (book-like icons); these icons and their surroundings are used as the light-sequence emitting zones 5-1, 5-2 and 5-3. The emitting zones 5-1 and 5-3 are drawn with dotted lines and, for clarity, the emitting zone 5-2 is drawn with a thin line.
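Returning for a moment to the second embodiment described above, the following minimal sketch illustrates its serial-number idea; the bit width, the most-significant-bit-first ordering, the member names and the numbers are assumptions made only for this illustration:

```python
# Minimal sketch of the second embodiment's idea under illustrative assumptions:
# each selectable member's light sequence carries its unique number as serial
# binary data, and the decoded value is looked up in a table of member numbers.

PERIOD_FRAMES = 8           # example: 8-bit numbers, one bit per frame
THRESHOLD = 0.9

member_numbers = {          # predetermined individual numbers (illustrative)
    0b00000101: "calculator_button_5",
    0b00001001: "calculator_button_9",
}

def decode_serial_value(impulse_magnitudes, threshold=THRESHOLD):
    """Interpret the thresholded frames as serial binary data, most significant bit first."""
    bits = [1 if m > threshold else 0 for m in impulse_magnitudes[:PERIOD_FRAMES]]
    value = 0
    for bit in bits:
        value = (value << 1) | bit
    return value

def select_by_number(impulse_magnitudes):
    value = decode_serial_value(impulse_magnitudes)
    return member_numbers.get(value)    # None if the value matches no member

# A reflection strong enough to exceed the threshold in frames 5 and 7 decodes
# to 0b00000101 = 5 and selects the corresponding member.
magnitudes = [0.1, 0.1, 0.1, 0.1, 0.1, 0.95, 0.1, 0.95]
assert select_by_number(magnitudes) == "calculator_button_5"
```

The table look-up replaces the correlation step of the first embodiment: once the serial value is decoded, selection is a direct access by number.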
FIG. 6B shows a cut of the Lap-Top 1 and its screen 2 in which the cutting plane passes through the emitting zones 5-1, 5-2 and 5-3 (see also FIG. 6A) and through the receiving optical sensors 3-1 and 3-3. The two optical sensors, the lower sensor 3-1 and the upper sensor 3-3 (see FIG. 6B), are located opposite one another. These sensors 3-1 and 3-3 have the apertures 4-1 and 4-3, respectively, which usually coincide with one another. The apertures 4-1 and 4-3 have the bound 9 drawn with a dotted line, and they comprise only the close neighbourhood of the screen 2 surface; this is provided, for example, by the opaque lower bar 8-1 and upper bar 8-3. FIG. 6B also shows the pointing object, the finger 10. The screen locations of the emitting zones 5-1, 5-2 and 5-3 are marked in the Lap-Top 1 cut in FIG. 6B with arrows and braces. The emitting zones 5-1, 5-2 and 5-3 radiate the light sequences 6-1, 6-2 and 6-3, respectively, which are not shown in FIG. 6B because there are no gaps between the emitting zones 5-1, 5-2 and 5-3. The emitting zone 5-1 and the emitting zone 5-3 each overlap the emitting zone 5-2 once, while the emitting zone 5-2 has two overlaps, with the two emitting zones 5-1 and 5-3. The emitting zone 5-2 is thus the most "complicated" one, and for illustration we suppose that the user points at or touches this most complicated emitting zone 5-2 (accordingly, FIG. 6B shows the finger 10 pointing at/touching the emitting zone 5-2).

FIG. 6C shows, at a larger scale, the finger 10 tip position relative to the screen 2, the apertures 4-1 and 4-3, and the emitting zone 5-2. As in FIG. 6B, FIG. 6C shows the finger 10 close to the screen 2 surface or touching it, the locations of the emitting zones 5-1, 5-2 and 5-3 marked with arrows and braces, and the optical sensor apertures 4-1 and 4-3 with their bound 9 drawn with a dotted line. The finger 10 tip reflects the light coming from the screen 2, and FIG. 6C shows the downward part 7-1 and the upward part 7-2 of the reflected radiation. The downward reflected radiation 7-1 and the upward reflected radiation 7-2 travel, respectively, to the lower optical sensor 3-1 and to the upper optical sensor 3-3 (see FIG. 6B; the sensors are not shown in FIG. 6C).

As shown in FIG. 6C, the lower reflecting surface of the finger 10 tip receives light emitted by the emitting zone 5-2 and by the emitting zone 5-3. Since the area of the part of the emitting zone 5-2 that radiates light towards the finger 10 tip is smaller than the area of the corresponding part of the emitting zone 5-3, the light component from the emitting zone 5-3 (i.e. the light sequence 6-3 from the selectable member that is not pointed at; this light sequence 6-3 is not shown in FIG. 6C) arriving at the finger 10 tip may even be larger than the light component from the emitting zone 5-2 (i.e. the light sequence 6-2 from the pointed selectable member; also not shown in FIG. 6C). Accordingly, in the light reflected by the finger 10 tip, particularly in the downward reflected light 7-1, the component caused by the light sequence 6-3 may be larger than the component caused by the light sequence 6-2. Furthermore, since the finger 10 is close to the screen 2 surface or even touching it, the reflecting surfaces of the finger 10 tip here are much larger than those shown in FIG. 4B. As a result, the magnitude of the light reflected by the finger 10 tip in the discussed case is the maximal possible, and it includes impulses caused by the selectable member that is not pointed at.

FIG. 6D depicts time charts illustrating the operation of the third embodiment. The first, second and third charts represent the light sequences 6-1, 6-2 and 6-3 radiated by the emitting zones 5-1, 5-2 and 5-3, respectively. These sequences 6-1, 6-2 and 6-3 are analogous to the light sequences considered for the first embodiment: they are one-impulse sequences with different positions. Here the impulses of the light sequences occupy the first, fourth and seventh frame positions of the period of representation 11: the impulse of the light sequence 6-1 is in the first frame position, the impulse of the light sequence 6-2 is in the fourth frame position and the impulse of the light sequence 6-3 is in the seventh frame position. The levels of the chart lines between these impulses represent the views of the emitting zones according to the content of the graphical user interface.

The fourth chart represents the downward part 7-1 of the radiation reflected by the finger 10 tip. Within the period of representation 11 there are three impulses caused by the impulses of the light sequences 6-1, 6-2 and 6-3. The impulse in the first frame position is caused by the first-frame-position impulse of the light 6-1; as shown in FIG. 6C, the light sequence 6-1 from the emitting zone 5-1 can practically not reach the downward reflecting surface of the finger 10 tip, so the reflected light 7-1 will contain practically no component caused by the light from the emitting zone 5-1. For illustrative purposes, however, suppose that the light sequence 6-1 from the emitting zone 5-1 does reach the downward reflecting surface of the finger 10 tip to a negligible degree and accordingly causes a negligible impulse in the first frame position; such an impulse of negligible magnitude is shown in the fourth chart of FIG. 6D. In the fourth frame position of the downward reflected light 7-1 there is the impulse caused by the light sequence 6-2 radiated by the emitting zone 5-2; this impulse has a magnitude close to the maximal possible. In the seventh frame position of the downward reflected light 7-1 there is the impulse caused by the impulse of the light sequence 6-3 radiated by the emitting zone 5-3 (belonging to the selectable member that is not pointed at). As noted above, the light arriving at the finger 10 tip from the emitting zone 5-3 is the largest of the light components; accordingly, the corresponding reflected component in the downward reflected light 7-1 is the largest relative to the other components, i.e. the impulse in the seventh frame position caused by the impulse of the light sequence 6-3 radiated by the emitting zone 5-3 has the maximal possible magnitude (and is the largest of all the impulses in the fourth chart of FIG. 6D).

The fifth chart in FIG. 6D represents the upward part 7-2 of the radiation reflected by the finger 10 tip. The impulse in the first frame position of the upward reflected light 7-2 has the maximal magnitude, since it is caused by the impulse of the light sequence 6-1 from the emitting zone 5-1: as is evident from FIG. 6C, among the emitting zones 5-1, 5-2 and 5-3 the emitting zone 5-1 has the largest area radiating light onto the upward part of the reflecting surface of the finger 10 tip. The impulse in the fourth frame position of the upward reflected light 7-2 has a magnitude close to maximal, since the area of the part of the emitting zone 5-2 radiating light onto the upward part of the reflecting surface of the finger 10 tip is also significant. The impulse in the seventh frame position is shown in the fifth chart for illustrative purposes only; it has a negligible magnitude, since only theoretically (and of course to a negligible degree) can the light from the emitting zone 5-3 reach the upward reflecting surface of the finger 10 tip.

The sixth chart in FIG. 6D represents the signal 12-1 from the lower optical sensor 3-1 (see also FIG. 6B). The signal 12-1 is produced by the lower optical sensor 3-1 from the downward reflected light 7-1, so its chart (the sixth chart) is analogous to the chart of the downward reflected light 7-1 (the fourth chart). Accordingly, the signal 12-3, produced by the upper optical sensor 3-3 (see FIG. 6B) from the upward reflected light 7-2, is analogous to the upward reflected light 7-2; the seventh chart represents this upper optical sensor signal 12-3. The eighth chart in FIG. 6D represents the impulse magnitude 13-1 of the signal 12-1 from the lower sensor 3-1 (see FIG. 6B), and the ninth chart represents the impulse magnitude 13-3 of the signal 12-3 from the upper sensor 3-3 (see FIG. 6B); the predetermined threshold 14 is shown in both of these charts (the eighth and the ninth). FIG. 6D also shows the representing generic code combination 16 relative to the period of representation 11.

The third embodiment of the present invention operates in the following way. The emitting zones 5-1, 5-2 and 5-3 (see FIGs. 6A and 6B), representing the respective selectable members, emit the individual light sequences 6-1, 6-2 and 6-3 attached to each selectable member. The lights 6-1, 6-2 and 6-3 arrive at the finger 10 tip and are reflected by it; the downward reflected light 7-1 and the upward reflected light 7-2 are shown in FIG. 6C and in the fourth and fifth charts of FIG. 6D. The reflected lights 7-1 and 7-2 arrive at the receiving optical sensors, the lower sensor 3-1 and the upper sensor 3-3 (see FIGs. 6A and 6B), which produce the sensor signals 12-1 and 12-3, respectively, corresponding to the reflected lights 7-1 and 7-2 (see the sixth and seventh charts in FIG. 6D). Next, the third embodiment determines the magnitude of the impulses in the signals 12-1 and 12-3; these magnitudes 13-1 and 13-3 are shown in the eighth and ninth charts of FIG. 6D. As previously noted, in each signal one of the impulses has a negligible magnitude, while the two remaining impulses have magnitudes very close to, or equal to, the maximal possible (the impulses in the fourth and seventh frame positions in the eighth chart, and in the first and fourth frame positions in the ninth chart). After determining the magnitudes, the third embodiment compares the determined impulse magnitudes 13-1 and 13-3 with the predetermined threshold 14. The negligible impulses of course do not exceed the predetermined threshold 14, while the maximal or near-maximal magnitudes of the remaining impulses confidently exceed it. As previously noted, the level of the predetermined threshold 14 must be chosen not far from, and a little below, the maximal possible impulse magnitude.

More particularly, the level of the predetermined threshold 14 determines the geometrical level, relative to the screen surface, from which the present invention detects a pointing/touching event. This geometrical level corresponds to some intermediate finger position between the positions shown in FIG. 4B and FIG. 5B. (Using the method of the present invention twice, in parallel, with different predetermined thresholds 14, it is possible to distinguish two levels from which a pointing/touching event is detected; for example, it is possible to distinguish between pointing and touching manipulations.) Increasing the predetermined threshold 14 moves this geometrical level closer to the screen surface; decreasing the predetermined threshold 14 moves it towards the bound of the optical sensor aperture. Analogously to SETTINGS in Windows, the user may set the predetermined threshold 14 to obtain the most convenient level position according to his/her experience or/and wish.

As noted above, it is possible to use two (or even more) predetermined thresholds with different levels (which is equivalent, as previously noted, to using the present method twice or more simultaneously with different predetermined thresholds). In this case the selection determined according to one predetermined threshold may be used, for example, as one selection mode initiated by the user's pointing action, when the finger tip only slightly enters the aperture bound, while the selection determined according to the second predetermined threshold may be used, for example, as another selection mode initiated by the user's touching action, when the finger tip validly touches the screen surface. Providing two (or even more) additional selection functions based on two (or even more) predetermined thresholds (equivalent to the said use of the present invention twice or more in parallel) increases the functional capabilities of the present invention. This "twice mode" of use is possible for every embodiment of the present invention.

If the impulse magnitude does not exceed the predetermined threshold 14, the third embodiment does nothing. If the signals 12-1 or/and 12-3 contain impulses whose magnitude exceeds the predetermined threshold 14, this means that the finger 10 tip is close to the screen 2 surface or touching it, i.e. the user has pointed at or/and touched a certain selectable member on the screen 2; such exceeding of the predetermined threshold 14 by the impulse magnitude is shown in the eighth and ninth charts of FIG. 6D. In this case the third embodiment forms the representing generic code combination 16 corresponding to the positions within the period of representation 11 in which the impulse magnitudes 13-1 and 13-3 exceed the predetermined threshold 14 simultaneously (i.e. simultaneously in both signals 12-1 and 12-3 from both oppositely located optical sensors 3-1 and 3-3; see FIGs. 6A, 6B and 6D). The representing generic code combination 16 (see FIG. 6D) comprises a number of bits equal to the number of frame positions in the period of representation 11. Each bit of the representing generic code combination 16 in a given position is set to logical "1" if both sensor signals 12-1 and 12-3 (i.e. from both oppositely located optical sensors 3-1 and 3-3, see FIG. 6A) have in that position impulse magnitudes 13-1 and 13-3 (see FIG. 6D) exceeding the predetermined threshold 14; otherwise the bit is set to logical "0".

It is evident that the fourth-frame-position impulse magnitudes 13-1 and 13-3 simultaneously exceed the predetermined threshold 14, so the bit in the fourth frame position of the representing generic code combination 16 is set to logical "1"; the other bits are set to logical "0". The exceeding of the predetermined threshold 14 by the first-frame-position impulse magnitude 13-3 of the signal 12-3 from the upper sensor 3-3 (see the fifth, seventh and ninth charts in FIG. 6D) does not set the first-position bit of the representing generic code combination 16 to logical "1", since this exceeding is not supported by the same exceeding in the first frame position of the signal 12-1; i.e. a "single", one-sensor exceeding does not influence the content of the representing generic code combination 16. Analogously, the exceeding of the predetermined threshold 14 by the seventh-frame-position impulse magnitude 13-1 of the signal 12-1 from the lower sensor 3-1 does not set the seventh-position bit of the representing generic code combination 16 to logical "1", since this exceeding is not supported by the same exceeding in the seventh frame position of the signal 12-3.

After forming the representing generic code combination 16, the third embodiment of the present invention determines whether the code combination 16 correlates with, or is equal to, any of the predetermined individual sequences. From FIG. 6D it is evident that the representing generic code combination 16 corresponds exactly to the light sequence 6-2, i.e. correlates with that light sequence 6-2 to a very high degree. The third embodiment finds that the representing generic code combination 16 correlates with the light sequence 6-2 and selects the selectable member to which the light sequence 6-2 belongs. The light sequence 6-2 belongs to the emitting zone 5-2 comprising the middle icon (see FIG. 6A); the third embodiment therefore selects the "middle" book-like selectable member of the graphical user interface. On the other hand, FIGs. 6A and 6B show that the finger 10 points at the emitting zone 5-2 comprising that "middle" book-like icon. Thus the selectable member selected by the third embodiment and the selectable member pointed at by the user are the same: the third embodiment has correctly selected the icon pointed at by the user.
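A minimal sketch of this two-sensor logic is given below, with illustrative frame positions, names and threshold value; the key point is that a frame contributes a logical "1" to the generic code combination only when both opposing sensors exceed the threshold in that frame:

```python
# Minimal sketch of the third embodiment's generic code combination, under
# illustrative assumptions; not the patent's implementation.

PERIOD_FRAMES = 30
THRESHOLD = 0.9

individual_sequences = {
    "icon_left":   [1 if f == 0 else 0 for f in range(PERIOD_FRAMES)],   # impulse in frame 0
    "icon_middle": [1 if f == 3 else 0 for f in range(PERIOD_FRAMES)],   # impulse in frame 3
    "icon_right":  [1 if f == 6 else 0 for f in range(PERIOD_FRAMES)],   # impulse in frame 6
}

def generic_code(lower_magnitudes, upper_magnitudes, threshold=THRESHOLD):
    """AND of the two thresholded sensor signals: single-sensor exceedances are ignored."""
    return [1 if lo > threshold and up > threshold else 0
            for lo, up in zip(lower_magnitudes, upper_magnitudes)]

def select_member(lower_magnitudes, upper_magnitudes):
    code = generic_code(lower_magnitudes, upper_magnitudes)
    for member, sequence in individual_sequences.items():
        if code == sequence:            # equality standing in for the correlation operation
            return member
    return None

# Pointing the middle icon (as in FIGs. 6A-6D): the lower sensor also sees a strong
# frame-6 impulse from a neighbouring zone and the upper sensor a strong frame-0
# impulse, but only frame 3 exceeds the threshold in both signals.
lower = [0.0] * PERIOD_FRAMES
upper = [0.0] * PERIOD_FRAMES
lower[0], lower[3], lower[6] = 0.05, 0.95, 0.97
upper[0], upper[3], upper[6] = 0.97, 0.93, 0.05
assert select_member(lower, upper) == "icon_middle"
```

The strong single-sensor impulses caused by the neighbouring, non-pointed zones are discarded by the AND condition, which is what allows overlapping emitting zones to be disambiguated.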
In addition to the embodiments discussed above, other embodiments of the present invention control the time parameters of the presence and, possibly (for some embodiments), absence of the pointing object (e.g. the finger) inside and, possibly, outside the aperture of the optical sensor(s). Based on these time parameters, these other embodiments detect single and, possibly (for some embodiments), double-touch user manipulations and accordingly produce first- and, possibly (for some embodiments), second-mode selection functions. The two kinds of selection functions may be analogous to the left- and right-button mouse clicks in Windows, or may carry any other functions.

The said single and double touch events are treated as valid (and accordingly cause the first- and, possibly, second-mode selection functions to be produced) only if their time intervals of presence and, possibly, absence relative to the aperture of the optical sensor(s) correspond to predetermined first and, possibly, second time intervals. As a result, these other embodiments of the present invention do not react to short accidental passes of the finger through the sensor aperture or to its short accidental touches of the screen surface. Such accidental pointings/touches are ordinary during user interaction with touch-screen systems, so the insensitivity of the discussed embodiments to accidental screen touches (and, of course, to accidental movements of the finger close to the screen surface without touching) significantly increases the reliability, stability and convenience of the present invention and creates a friendly interaction mode for the user over the course of the interface session.

The present invention provides point/touch screen based interfaces for all screen types (for example, conventional non-touch LCD screens); it is able to provide the point/touch interface for very big screens and for very small screens. The present invention also provides the point/touch interface for very small selectable member views (even smaller than the finger's dimensions). The present invention provides "single-touch" and "double-touch" selection functions. If the present invention is used twice (with different predetermined thresholds), it provides two additional selection functions differing in point/touch mode: for example, one additional selection function for the user's pointing action and a second for the user's touching action; this applies to all embodiments of the present invention. It is thus possible to provide three different selection functions in total: a pointing (without touching) action, a single-touching action and a double-touching action.

The present method is a compact, immediate and reliable interface method. It does not use coordinate data, and accordingly it neither determines coordinates for the pointing/touching object nor analyses the coordinates of the selectable members in order to find the one to select. The digital nature of the method increases its reliability and stability against analog and light noise. The present invention is not sensitive to short accidental movements of the finger into the aperture or to short accidental touches of the screen during the interface session. Integration of the present invention into any information system is simple and has a low implementation cost. All user manipulations in the present invention are executed in a natural and convenient manner and require no training or previous experience.
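As a rough illustration of the time-interval checks described above (and formalised in the later claims), the following sketch classifies a measured presence/absence pair; the threshold values, the function name and the exact decision structure are assumptions for illustration only and are not taken from the patent:

```python
# Minimal sketch, with made-up threshold values, of the timing checks added by the
# later embodiments: a member is first-mode selected only if it stays pointed/touched
# long enough, while a second-mode ("double-touch") selection requires a short first
# touch followed by a bounded gap.

FIRST_TIME_THRESHOLD  = 0.15   # seconds: minimum presence for an ordinary (single-touch) selection
SECOND_TIME_THRESHOLD = 0.05   # seconds: minimum presence of the first touch of a pair
THIRD_TIME_THRESHOLD  = 0.02   # seconds: minimum gap between the two touches
FOURTH_TIME_THRESHOLD = 0.30   # seconds: maximum gap between the two touches

def classify_touch(presence_s, absence_s=None):
    """Return 'first_mode', 'second_mode' or None for measured presence/absence intervals."""
    if presence_s > FIRST_TIME_THRESHOLD:
        return "first_mode"                     # long enough: ordinary selection
    if (presence_s > SECOND_TIME_THRESHOLD
            and absence_s is not None
            and THIRD_TIME_THRESHOLD < absence_s < FOURTH_TIME_THRESHOLD):
        return "second_mode"                    # short touch, valid gap: double-touch selection
    return None                                 # too short / accidental: ignored

assert classify_touch(0.40) == "first_mode"     # deliberate touch
assert classify_touch(0.08, absence_s=0.10) == "second_mode"
assert classify_touch(0.01) is None             # accidental pass through the aperture
```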

Claims (15)

1. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface where the said graphical user interface includes some of selectable members where an arbitrary from the said selectable members may be selected by the user during his/her interacting with the said information system by a pointing or/and touching manipulation with a pointing object, a program which draws the said graphical user interface through sequence of frames on the said screen, at least one optical sensor having at least one receiving optical element where the said optical sensor forms a sensor signal representative to optical beams from an aperture of the said optical sensor where the said aperture contains the close neighbourhood of at least part of the said screen's surface, a mean or/and a program for determining a magnitude of impulses into any signal and for determining a correlation between any signal and any sequence, a method of selecting any certain from the said selectable members which screen view or/and the said view's screen surroundings is pointed or/and touched with the said pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of: a1) for each the said selectable member into every period of representation comprising a predetermined number of frames forming a predetermined individual sequence of changes in brightness or/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings where each the said change has a duration of at least one the said frame and where the said predetermined individual sequence is unique for each from the said selectable members; b1) determining the said magnitude of the impulses into the said sensor signal during a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold; c1) if into at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then determining if the said sensor signal correlates into a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, with any from the said predetermined individual sequences; and d1) if in step c1 it has been determined, that the said sensor signal correlates with a certain from the said predetermined individual sequences, then selecting the said selectable member to which the said certain from the said predetermined individual sequences belongs.
2. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface where the said graphical user interface includes some of selectable members where an arbitrary from the said selectable members may be selected by the user during his/her interacting with the said information system by a pointing or/and touching manipulation with a pointing object, a program which draws the said graphical user interface through sequence of frames on the said screen, at least one optical sensor having at least one receiving optical element where the said optical sensor forms a sensor signal representative to optical beams from an aperture of the said optical sensor where the said aperture contains the close neighbourhood of at least part of the said screen's surface, a mean or/and a program for determining a magnitude of impulses into any signal and for detecting a value coded in digital serial data form into any code combination, a method of selecting any certain from the said selectable members which screen view or/and the said view's screen surroundings is pointed or/and touched with the said pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of: WO 2009/053956 PCT/IL2007/001259 36 a2) for each the said selectable member into every period of representation comprising a predetermined number of frames forming a predetermined individual sequence of changes in brightness of/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings where each the said change has a duration of at least one the said frame and where the said predetermined individual sequence includes in digital serial data form a predetermined individual number which is unique for each from the said selectable members; b2) determining the said magnitude of the impulses into the said sensor signal during a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold; c2) if into at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then forming a representing code combination having a length equal to the said predetermined number of frames and comprising a content corresponding to the said period of representation frame's positions into which the said magnitude exceeds the said predetermined threshold, detecting the value coded in digital serial data form into the said representing code combination, and determining if the said value is equal to any from the said predetermined individual numbers; and d2) if in step c2 it has been determined, that the said value is equal to a certain from the said predetermined individual numbers, then selecting the said selectable member, to which the said certain from the said predetermined individual numbers belongs.
3. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface where the said graphical user interface includes some of selectable members where an arbitrary from the said selectable members may be selected by the user during his/her interacting with the said information system by a pointing or/and touching manipulation with a pointing object, a program which draws the said graphical user interface through sequence of frames on the said WO 2009/053956 PCT/IL2007/001259 37 screen, at least two opposite one to another located optical sensors where each from the said optical sensors has at least one receiving optical element and where each from the said optical sensors representative to optical beams from its aperture forms an individual sensor signal where the said aperture contains the close neighbourhood of at least part of the said screen's surface, a mean or/and a program for determining a magnitude of impulses into any signal, for forming a generic code combination from at least two signals and for determining a correlation between any code combination and any sequence, a method of selecting any certain from the said selectable members which screen view or/and the said view's screen surroundings is pointed or/and touched with the said pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of: a3) for each the said selectable member into every period of representation comprising a predetermined number of frames forming a predetermined individual sequence of changes in brightness or/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings where each the said change has a duration of at least one the said frame and where the said predetermined individual sequence is unique for each from the said selectable members; b3) determining the said magnitude of the impulses into at least two the said individual sensor signals from the said opposite one to another located optical sensors during a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold; c3) if into at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then forming a representing generic code combination having a length equal to the said predetermined number of frames and comprising a content corresponding to the said period of representation frame's positions into which the said impulses of at least two the said individual sensor signals have the said magnitude exceeding the said predetermined threshold, and determining if the said representing generic code combination correlates with or is equal to any from the said predetermined individual sequences; and WO 2009/053956 PCT/IL2007/001259 38 d3) if in step c3 it has been determined, that the said representing generic code combination correlates with or is equal to a certain from the said predetermined individual sequences, then selecting the said selectable member to which the said certain from the said predetermined individual sequences belongs.
4. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface where the said graphical user interface includes some of selectable members where an arbitrary from the said selectable members may be selected by the user during his/her interacting with the said information system by a pointing or/and touching manipulation with a pointing object, a program which draws the said graphical user interface through sequence of frames on the said screen, at least one optical sensor having at least one receiving optical element where the said optical sensor forms a sensor signal representative to optical beams from an aperture of the said optical sensor where the said aperture contains the close neighbourhood of at least part of the said screen's surface, a mean or/and a program for determining a magnitude of impulses into any signal and for determining a correlation between any signal and any sequence, a method of selecting any certain from the said selectable members which screen view or/and the said view's screen surroundings is pointed or/and touched with the said pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of: a4) for each the said selectable member into every period of representation comprising a predetermined number of frames forming a predetermined individual sequence of changes in brightness or/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings where each the said change has a duration of at least one the said frame and where the said predetermined individual sequence is unique for each from the said selectable members; b4) determining the said magnitude of the impulses into the said sensor signal during a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold; WO 2009/053956 PCT/IL2007/001259 39 c4) if into at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then determining if the said sensor signal correlates into a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, with any from the said predetermined individual sequences; d4) if in step c4 it has been determined, that the said sensor signal correlates with a certain from the said predetermined individual sequences, then determining a first time interval while the said sensor signal correlates with the said certain from the said predetermined individual sequences; and e4) if the said first time interval is more than a predetermined first time threshold, then selecting the said selectable member to which the said certain from the said predetermined individual sequences belongs.
5. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface where the said graphical user interface includes some of selectable members where an arbitrary from the said selectable members may be selected by the user during his/her interacting with the said information system by a pointing or/and touching manipulation with a pointing object, a program which draws the said graphical user interface through sequence of frames on the said screen, at least one optical sensor having at least one receiving optical element where the said optical sensor forms a sensor signal representative to optical beams from an aperture of the said optical sensor where the said aperture contains the close neighbourhood of at least part of the said screen's surface, a mean or/and a program for determining a magnitude of impulses into any signal and for detecting a value coded in digital serial data form into any code combination, a method of selecting any certain from the said selectable members which screen view or/and the said view's screen surroundings is pointed or/and touched with the said pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of: WO 2009/053956 PCT/IL2007/001259 40 a5) for each the said selectable members into every period of representation comprising a predetermined number of frames forming a predetermined individual sequence of changes in brightness of/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings where each the said change has a duration of at least one the said frame and where the said predetermined individual sequence includes in digital serial data form a predetermined individual number which is unique for each from the said selectable members; b5) determining the said magnitude of the impulses into the said sensor signal during a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold; c5) if into at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then forming a representing code combination having a length equal to the said predetermined number of frames and comprising a content corresponding to the said period of representation frame's positions into which the said magnitude exceeds the said predetermined threshold, detecting the value coded in digital serial data form into the said representing code combination, and determining if the said value is equal to any from the said predetermined individual numbers; d5) if in step c5 it has been determined, that the said value is equal to a certain from the said predetermined individual numbers, then determining a first time interval while the said value is equal to the said certain from the said predetermined individual numbers; e5) if the said first time interval is more than a predetermined first time threshold, then selecting the said selectable member, to which the said certain from the said predetermined individual numbers belongs. WO 2009/053956 PCT/IL2007/001259 41
6. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface where the said graphical user interface includes some of selectable members where an arbitrary from the said selectable members may be selected by the user during his/her interacting with the said information system by a pointing or/and touching manipulation with a pointing object, a program which draws the said graphical user interface through sequence of frames on the said screen, at least two opposite one to another located optical sensors where each from the said optical sensors has at least one receiving optical element and where each from the said optical sensors representative to optical beams from its aperture forms an individual sensor signal where the said aperture contains the close neighbourhood of at least part of the said screen's surface, a mean or/and a program for determining a magnitude of impulses into any signal, for forming a generic code combination from at least two signals and for determining a correlation between any code combination and any sequence, a method of selecting any certain from the said selectable members which screen view or/and the said view's screen surroundings is pointed or/and touched with the said pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of: a6) for each the said selectable member into every period of representation comprising a predetermined number of frames forming a predetermined individual sequence of changes in brightness or/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings where each the said change has a duration of at least one the said frame and where the said predetermined individual sequence is unique for each from the said selectable members; b6) determining the said magnitude of the impulses into at least two the said individual sensor signals from the said opposite one to another located optical sensors during a control frame quantity, where the said control frame quantity is not less than a frame position number of the last through the said period of representation the said change among all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold; c6) if into at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then forming a representing generic code combination having a length equal to the said predetermined number of frames and comprising a content corresponding to the WO 2009/053956 PCT/IL2007/001259 42 said period of representation frame's positions into which the said impulses of at least two the said individual sensor signals have the said magnitude exceeding the said predetermined threshold, and determining if the said representing generic code combination correlates with or is equal to any from the said predetermined individual sequences; d6) if in step c6 it has been determined, that the said representing generic code combination correlates with or is equal to a certain from the said predetermined individual sequences, then determining a first time interval while the said representing generic code combination correlates with or is equal to the said certain from the said predetermined individual sequences; and f6) if the said first time interval is more than a predetermined 
first time threshold, then selecting the said selectable member to which the said certain from the said predetermined individual sequences belongs.
7. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface, where the said graphical user interface includes a number of selectable members and where an arbitrary one of the said selectable members may be first- or/and second-mode selected by the user, during his/her interaction with the said information system, by a first- or second-mode pointing or/and touching manipulation with a pointing object; a program which draws the said graphical user interface through a sequence of frames on the said screen; at least one optical sensor having at least one receiving optical element, where the said optical sensor forms a sensor signal representative of optical beams from an aperture of the said optical sensor and where the said aperture contains the close neighbourhood of at least part of the said screen's surface; and a means or/and a program for determining a magnitude of impulses in any signal and for determining a correlation between any signal and any sequence; a method of two-mode selecting any certain one of the said selectable members whose screen view or/and the said view's screen surroundings is pointed or/and touched with the said first- or second-mode pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of:
a7) for each of the said selectable members, in every period of representation comprising a predetermined number of frames, forming a predetermined individual sequence of changes in brightness or/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings, where each said change has a duration of at least one said frame and where the said predetermined individual sequence is unique for each of the said selectable members;
b7) determining the said magnitude of the impulses in the said sensor signal during a control frame quantity, where the said control frame quantity is not less than the frame position number, within the said period of representation, of the last of all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold;
c7) if in at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then determining if the said sensor signal correlates, within a control frame quantity, where the said control frame quantity is not less than the frame position number, within the said period of representation, of the last of all the said changes of the said predetermined individual sequences, with any of the said predetermined individual sequences;
d7) if in step c7 it has been determined that the said sensor signal correlates with a certain one of the said predetermined individual sequences, then determining a first time interval while the said sensor signal correlates with the said certain one of the said predetermined individual sequences;
e7) if the said first time interval is more than a predetermined first time threshold, then first-mode selecting the said selectable member to which the said certain one of the said predetermined individual sequences belongs, else if the said first time interval is more than a predetermined second time threshold, then determining a second time interval while the said sensor signal does not correlate with the said certain one of the said predetermined individual sequences; and
f7) if the said second time interval is more than a predetermined third time threshold and is less than a predetermined fourth time threshold, then second-mode selecting the said selectable member to which the said certain one of the said predetermined individual sequences belongs.
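The timing decisions of steps d7-f7 can be illustrated with a minimal sketch (not part of the claims). The class name, the frame-based counters and the expression of the four time thresholds in frames are assumptions made only for clarity; the claim also leaves open exactly when the second time interval is evaluated, so the sketch evaluates it when correlation with the same member resumes, which is one plausible reading.

# Illustrative sketch of the first-/second-mode decision of steps d7-f7.
# ModeSelector and its counters are hypothetical names; thresholds are in frames.
class ModeSelector:
    def __init__(self, t1, t2, t3, t4):
        # t1: first time threshold, t2: second, t3: third, t4: fourth
        self.t1, self.t2, self.t3, self.t4 = t1, t2, t3, t4
        self.hold = 0       # first time interval: frames of continued correlation
        self.gap = 0        # second time interval: frames without correlation
        self.armed = False  # set once the hold interval exceeded the second threshold

    def update(self, correlates):
        # Call once per control frame quantity with the outcome of step c7/d7;
        # returns "first", "second" or None.
        if correlates:
            if self.armed and self.t3 < self.gap < self.t4:
                self._reset()
                return "second"      # short withdrawal, then return: second mode
            self.gap = 0
            self.hold += 1
            if self.hold > self.t1:
                self._reset()
                return "first"       # long steady pointing: first mode
            if self.hold > self.t2:
                self.armed = True
        elif self.armed:
            self.gap += 1
            if self.gap >= self.t4:
                self._reset()        # withdrawal too long: abandon the gesture
        else:
            self._reset()
        return None

    def _reset(self):
        self.hold, self.gap, self.armed = 0, 0, False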
8. The method according to claim 1 or 4 or 7, where the step c1 or c4 or c7, respectively, of determining if the said sensor signal correlates, within a control frame quantity, where the said control frame quantity is not less than the frame position number, within the said period of representation, of the last of all the said changes of the said predetermined individual sequences, with any of the said predetermined individual sequences, further comprises the steps of:
a8) forming a representing code combination having a length equal to the said predetermined number of frames and comprising a content corresponding to the frame positions, within the said period of representation, in which the said magnitude exceeds the said predetermined threshold; and
b8) determining if the said representing code combination correlates, within the said period of representation, with or is equal to any of the said predetermined individual sequences, and if yes, then the said sensor signal correlates with a certain one of the said predetermined individual sequences.
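Steps a8-b8 amount to binarising the per-frame detections and matching the resulting bit pattern against each member's sequence. The sketch below is an assumption-laden illustration, not the claimed implementation: the per-frame magnitude list, the member-to-pattern mapping and the minimum-agreement criterion used as a loose "correlates with" test are all illustrative choices.

# Sketch of steps a8-b8: build a representing code combination and match it.
def representing_code(magnitudes, threshold):
    # One magnitude value per frame of the period of representation.
    return tuple(1 if m > threshold else 0 for m in magnitudes)

def matching_member(code, member_sequences, min_agreement):
    # member_sequences: assumed mapping member -> 0/1 tuple of the same length.
    for member, pattern in member_sequences.items():
        if code == pattern:
            return member                               # "is equal to"
        agreement = sum(c == p for c, p in zip(code, pattern))
        if agreement >= min_agreement:                  # loose "correlates with"
            return member
    return None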
9. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface, where the said graphical user interface includes a number of selectable members and where an arbitrary one of the said selectable members may be first- or/and second-mode selected by the user, during his/her interaction with the said information system, by a first- or second-mode pointing or/and touching manipulation with a pointing object; a program which draws the said graphical user interface through a sequence of frames on the said screen; at least one optical sensor having at least one receiving optical element, where the said optical sensor forms a sensor signal representative of optical beams from an aperture of the said optical sensor and where the said aperture contains the close neighbourhood of at least part of the said screen's surface; and a means or/and a program for determining a magnitude of impulses in any signal and for detecting a value coded in digital serial data form in any code combination; a method of two-mode selecting any certain one of the said selectable members whose screen view or/and the said view's screen surroundings is pointed or/and touched with the said first- or second-mode pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of:
a9) for each of the said selectable members, in every period of representation comprising a predetermined number of frames, forming a predetermined individual sequence of changes in brightness or/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings, where each said change has a duration of at least one said frame and where the said predetermined individual sequence includes, in digital serial data form, a predetermined individual number which is unique for each of the said selectable members;
b9) determining the said magnitude of the impulses in the said sensor signal during a control frame quantity, where the said control frame quantity is not less than the frame position number, within the said period of representation, of the last of all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold;
c9) if in at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then forming a representing code combination having a length equal to the said predetermined number of frames and comprising a content corresponding to the frame positions, within the said period of representation, in which the said magnitude exceeds the said predetermined threshold, detecting the value coded in digital serial data form in the said representing code combination, and determining if the said value is equal to any of the said predetermined individual numbers;
d9) if in step c9 it has been determined that the said value is equal to a certain one of the said predetermined individual numbers, then determining a first time interval while the said value is equal to the said certain one of the said predetermined individual numbers;
e9) if the said first time interval is more than a predetermined first time threshold, then first-mode selecting the said selectable member to which the said certain one of the said predetermined individual numbers belongs, else if the said first time interval is more than a predetermined second time threshold, then determining a second time interval while the said value is not equal to the said certain one of the said predetermined individual numbers; and
f9) if the said second time interval is more than a predetermined third time threshold and is less than a predetermined fourth time threshold, then second-mode selecting the said selectable member to which the said certain one of the said predetermined individual numbers belongs.
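In claim 9 the per-member sequence carries the member's number as serial digital data, so matching reduces to decoding an integer from the detected bit pattern. The sketch below assumes a plain MSB-first encoding with no framing or error-correction bits; the claim does not fix a particular serial format, so this is an assumption.

# Sketch of step c9: binarise the frame detections and decode a member number.
def decode_member_number(magnitudes, threshold):
    value = 0
    for m in magnitudes:                   # one sample per frame of the period
        bit = 1 if m > threshold else 0
        value = (value << 1) | bit         # assumed MSB-first serial data
    return value

def candidate_member(value, members_by_number):
    # members_by_number: assumed mapping of predetermined individual numbers
    # to selectable members.
    return members_by_number.get(value)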
10. In an information system which represents a graphical user interface, where the said information system includes: at least one screen representing the said graphical user interface, where the said graphical user interface includes a number of selectable members and where an arbitrary one of the said selectable members may be first- or/and second-mode selected by the user, during his/her interaction with the said information system, by a first- or second-mode pointing or/and touching manipulation with a pointing object; a program which draws the said graphical user interface through a sequence of frames on the said screen; at least two optical sensors located opposite one another, where each of the said optical sensors has at least one receiving optical element, where each of the said optical sensors forms an individual sensor signal representative of optical beams from its aperture, and where the said aperture contains the close neighbourhood of at least part of the said screen's surface; and a means or/and a program for determining a magnitude of impulses in any signal, for forming a generic code combination from at least two signals, and for determining a correlation between any code combination and any sequence; a method of two-mode selecting any certain one of the said selectable members whose screen view or/and the said view's screen surroundings is pointed or/and touched with the said first- or second-mode pointing or/and touching manipulation with the said pointing object, the said method comprising the steps of:
a10) for each of the said selectable members, in every period of representation comprising a predetermined number of frames, forming a predetermined individual sequence of changes in brightness or/and colour screen representation of the said screen view of the said selectable member or/and the said view's screen surroundings, where each said change has a duration of at least one said frame and where the said predetermined individual sequence is unique for each of the said selectable members;
b10) determining the said magnitude of the impulses in at least two of the said individual sensor signals from the said optical sensors located opposite one another during a control frame quantity, where the said control frame quantity is not less than the frame position number, within the said period of representation, of the last of all the said changes of the said predetermined individual sequences, and comparing the said magnitude with a predetermined threshold;
c10) if in at least a predetermined quantity of frames the said magnitude exceeds the said predetermined threshold, then forming a representing generic code combination having a length equal to the said predetermined number of frames and comprising a content corresponding to the frame positions, within the said period of representation, in which the said impulses of at least two of the said individual sensor signals have the said magnitude exceeding the said predetermined threshold, and determining if the said representing generic code combination correlates with or is equal to any of the said predetermined individual sequences;
d10) if in step c10 it has been determined that the said representing generic code combination correlates with or is equal to a certain one of the said predetermined individual sequences, then determining a first time interval while the said representing generic code combination correlates with or is equal to the said certain one of the said predetermined individual sequences;
e10) if the said first time interval is more than a predetermined first time threshold, then first-mode selecting the said selectable member to which the said certain one of the said predetermined individual sequences belongs, else if the said first time interval is more than a predetermined second time threshold, then determining a second time interval while the said representing generic code combination neither correlates with nor is equal to the said certain one of the said predetermined individual sequences; and
f10) if the said second time interval is more than a predetermined third time threshold and is less than a predetermined fourth time threshold, then second-mode selecting the said selectable member to which the said certain one of the said predetermined individual sequences belongs.
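With two facing sensors (claim 10), one plausible way to form the representing generic code combination is to mark a frame only when both sensors report an above-threshold impulse, which suppresses detections produced by ambient changes seen by a single sensor. The AND-combination below is an assumption; the claim only requires that the combination reflect frames in which at least two individual sensor signals exceed the threshold.

# Sketch of steps b10-c10: combine per-frame detections of two opposite sensors.
def generic_code(magnitudes_a, magnitudes_b, threshold):
    # magnitudes_a / magnitudes_b: one impulse magnitude per frame, per sensor.
    return tuple(
        1 if (ma > threshold and mb > threshold) else 0
        for ma, mb in zip(magnitudes_a, magnitudes_b)
    )
# The resulting combination is then matched against the predetermined individual
# sequences exactly as in the single-sensor sketch given after claim 8.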
11. The method according to claim 1 or 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9 or 10, where the said information system comprises at least one said optical sensor having at least two of the said receiving optical elements, the said method further comprising, before the step a1, a2, a3, a4, a5, a6, a7, a8, a9 or a10, the steps of:
dividing the said optical beams coming from the said aperture to the input of the said optical sensor into at least two fundamental colour components;
transferring each of the said fundamental colour components to a certain one among the said receiving optical elements, where each of the said receiving optical elements forms its colour component signal representative of the received said fundamental colour component; and
forming the said sensor signal, where the said optical sensor forms the said sensor signal representative of the maximal of the said colour component signals or representative of any of the said colour component signals exceeding a predetermined component threshold.
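The two alternatives of claim 11 for forming the sensor signal from separately received colour components can be sketched as follows; the use of exactly three components (R, G, B) sampled in the same frame is an assumption made only for the example.

# Sketch of claim 11: derive the sensor signal from colour component signals.
def sensor_signal_max(r, g, b):
    # First alternative: follow the maximal colour component signal.
    return max(r, g, b)

def sensor_signal_any_above(r, g, b, component_threshold):
    # Second alternative: respond to any component exceeding the threshold.
    above = [c for c in (r, g, b) if c > component_threshold]
    return max(above) if above else 0.0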
12. The method according to claim 1 or 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9 or 10, where the said optical sensor further comprises at least one said receiving optical element, where the said optical element is colour sensitive and forms its colour component signal, the said method further comprising the step of: forming the said sensor signal representative of the said colour component signal before the step a1, a2, a3, a4, a5, a6, a7, a8, a9 or a10.
13. The method according to claim 1 or 4 or 7, where the step c1 or c4 or c7, respectively, of determining if the said sensor signal correlates, within a control frame quantity, where the said control frame quantity is not less than the frame position number, within the said period of representation, of the last of all the said changes of the said predetermined individual sequences, with any of the said predetermined individual sequences, further comprises the steps of:
a13) determining a coefficient of correlation for the said sensor signal and any of the said predetermined individual sequences within the said control frame quantity; and
b13) if the said coefficient of correlation is more than a predetermined correlative threshold, then the said sensor signal correlates with a certain one of the said predetermined individual sequences.
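Claims 13-15 decide correlation by comparing a coefficient of correlation against a predetermined correlative threshold. The normalised (Pearson-style) coefficient below is one common choice; the claims do not prescribe a particular correlation measure, so this choice, and the list-based inputs, are assumptions.

# Sketch of steps a13-b13: normalised correlation of the per-frame samples
# against one predetermined individual sequence, then threshold comparison.
import math

def correlation_coefficient(samples, sequence):
    n = min(len(samples), len(sequence))
    if n == 0:
        return 0.0
    xs, ys = samples[:n], sequence[:n]
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    if var_x == 0.0 or var_y == 0.0:
        return 0.0
    return cov / math.sqrt(var_x * var_y)

def correlates(samples, sequence, correlative_threshold):
    return correlation_coefficient(samples, sequence) > correlative_threshold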
14. The method according to claim 3 or 6 or 10, where, in the step c3 or c6 or c10, respectively, determining if the said representing generic code combination correlates with any of the said predetermined individual sequences further comprises the steps of:
a14) determining a coefficient of correlation for the said representing generic code combination and any of the said predetermined individual sequences; and
b14) if the said coefficient of correlation is more than a predetermined correlative threshold, then the said representing generic code combination correlates with a certain one of the said predetermined individual sequences.
15. The method according to claim 8, where the step b8 of determining if the said representing code combination correlates, within the said period of representation, with any of the said predetermined individual sequences further comprises the steps of:
a15) determining a coefficient of correlation for the said representing code combination and any of the said predetermined individual sequences; and
b15) if the said coefficient of correlation is more than a predetermined correlative threshold, then the said representing code combination correlates with a certain one of the said predetermined individual sequences.