US20140195969A1 - Zeroclick - Google Patents

Zeroclick

Info

Publication number
US20140195969A1
Authority
US
United States
Prior art keywords
touch
screen
user
sensitive
sensitive screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/896,280
Other versions
US20160246451A9 (en)
Inventor
Nes Stewart Irvine
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=27579341&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20140195969(A1). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Priority claimed from GB0011321A (external priority patent GB0011321D0)
Priority claimed from GB0011370A (external priority patent GB0011370D0)
Priority claimed from GB0011441A (external priority patent GB0011441D0)
Priority claimed from GB0012582A (external priority patent GB0012582D0)
Priority claimed from GB0026891A (external priority patent GB0026891D0)
Priority claimed from GB0028097A (external priority patent GB0028097D0)
Priority claimed from GB0028693A (external priority patent GB0028693D0)
Priority claimed from GB0029148A (external priority patent GB0029148D0)
Priority claimed from GB0031164A (external priority patent GB0031164D0)
Priority claimed from GB0031680A (external priority patent GB0031680D0)
Application filed by Individual
Priority to US13/896,280 (US20160246451A9)
Publication of US20140195969A1
Publication of US20160246451A9
Legal status: Abandoned (current)

Abstract

A GUI interface, a method of programming a GUI interface, and an apparatus which enable functions of controls in the GUI to be activated by a movement to a control and then another subsequent movement related to that control. Defined more precisely: a GUI in which, when a pointer (0) is immediately adjacent to or passes over a control area (1), a procedure is initiated whereby subsequent movement of the pointer over a predetermined path area (3) generates a ‘click’ event which simulates direct clicking of the control (1), and moving outside the predetermined path area (3) prior to completion of the path (3) resets the control as if the pointer had never started along the predetermined path area (3).
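For illustration only, the following TypeScript sketch (not taken from the patent; all type and function names are hypothetical) shows one way the path-based activation described in the abstract could behave: a pointer entering the control area arms a predetermined path, completing the path fires a simulated click, and leaving the path area before completion resets the control.

```typescript
// Hypothetical sketch of the "zeroclick" activation described in the abstract.
// Names (Rect, ZeroClickControl, etc.) are illustrative, not from the patent.

interface Point { x: number; y: number; }
interface Rect { left: number; top: number; right: number; bottom: number; }

const contains = (r: Rect, p: Point): boolean =>
  p.x >= r.left && p.x <= r.right && p.y >= r.top && p.y <= r.bottom;

class ZeroClickControl {
  private armed = false;          // pointer has entered the control area (1)
  private progress = 0;           // distance covered along the path area (3)

  constructor(
    private controlArea: Rect,    // control area (1)
    private pathArea: Rect,       // predetermined path area (3)
    private pathLength: number,   // distance that must be covered to "click"
    private onClick: () => void,  // simulated click handler
  ) {}

  // Feed successive pointer (0) positions; no button press or pressure is used.
  pointerMoved(prev: Point, curr: Point): void {
    if (!this.armed) {
      // Arm the control once the pointer is over it.
      if (contains(this.controlArea, curr)) this.armed = true;
      return;
    }
    if (!contains(this.pathArea, curr) && !contains(this.controlArea, curr)) {
      // Leaving the path area before completion resets the control entirely.
      this.armed = false;
      this.progress = 0;
      return;
    }
    if (contains(this.pathArea, curr)) {
      this.progress += Math.hypot(curr.x - prev.x, curr.y - prev.y);
      if (this.progress >= this.pathLength) {
        this.onClick();           // simulate direct clicking of the control
        this.armed = false;
        this.progress = 0;
      }
    }
  }
}

// Example: a control at (0,0)-(40,40) with a path strip to its right.
const btn = new ZeroClickControl(
  { left: 0, top: 0, right: 40, bottom: 40 },
  { left: 40, top: 0, right: 140, bottom: 40 },
  80,
  () => console.log("clicked without any button press"),
);
btn.pointerMoved({ x: 50, y: 20 }, { x: 20, y: 20 });   // arms the control
btn.pointerMoved({ x: 20, y: 20 }, { x: 60, y: 20 });   // moves onto the path
btn.pointerMoved({ x: 60, y: 20 }, { x: 140, y: 20 });  // completes it -> "click"
```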

Description

Claims (20)

What is claimed:
1. A device capable of executing software comprising:
a touch-sensitive screen configured to detect being touched by a user's finger without requiring the touch-sensitive screen to be sensitive to a pressure of the finger contact on the screen;
a processor connected to the touch-sensitive screen and configured to receive from the screen information regarding locations touched by the user's finger;
executable user interface code stored in a memory connected to the processor, the user interface code executable by the processor;
the user interface code being configured to detect one or more locations touched by a movement of the user's finger on the screen without requiring the touch-sensitive screen to be sensitive to the pressure of the finger contact on the screen and determine therefrom a selected operation.
2. The device according to claim 1 which is further configured by one or more of the following:
a) the device is operable by the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive including any of a mobile phone, or a touch-sensitive pad, or another computer device with a touch-sensitive screen;
b) the device activates the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, by a start sequence of locations touched on the touch-sensitive screen by a movement of the user's finger to start operating one or more further functions of the user interface selectable by one or more subsequent finger movements of the user on the touch-sensitive screen to control the device by one or more subsequent selected operations of the user;
c) the device can browse the internet;
d) the device can play video files;
e) the device can play audio files;
f) the device can display text;
g) the device can display a multimedia file;
h) the device can edit text;
i) the device can search the internet or text by entering characters;
j) the device has a start sequence of locations to be touched on the touch-sensitive screen by one or more finger movements of the user on the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive to start operating one or more further functions of the user interface to control the device, and the start sequence does not require the sequence of locations to be touched to be deducible by another user from the appearance of the touch-sensitive screen displaying the start sequence of locations to be touched;
k) the device is operated by touching the touch-sensitive screen in two or more areas sequentially without requiring the touch-sensitive screen to be pressure sensitive;
l) the device has one or more functions to activate the touch-sensitive screen display from a very low power mode;
m) the device requires a sequence of locations on the touch-sensitive screen being touched by one or more finger movements on the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, thereby to activate the selected operation of the user to control the device;
n) the touch-sensitive screen being touched by one or more finger movements without requiring the touch-sensitive screen to be pressure sensitive can generate one or more functions by any of placing a finger at a location on the screen, moving a finger in contact with the screen, or not touching the screen at a location of the screen;
o) the device can operate a graphic program including the ability to draw a line on the touch-sensitive screen;
p) the device further includes detecting on the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, a movement of a pen at one or more locations of the touch-sensitive screen as a movement of the user's finger at said one or more locations on the touch-sensitive screen;
q) the device in which the user interface is further configured to respond to a pointer speed at which the user's finger touching the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive moves along the screen;
r) the device in which the user interface is further configured to cause the user selected operation to move a displayed desktop in a selected direction;
s) the device wherein the user interface is further configured to execute a selected operation by the touch-sensitive screen detecting coordinates of each of at least two of the user's fingers touching different areas of the touch-sensitive screen sequentially without requiring the touch-sensitive screen to be pressure sensitive;
t) the device wherein said user interface is further configured to make a triggering of the user selected operation by a sequence of locations touched on the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive less probable to occur by accident than the user pressing a physical button or requiring pressing on a touch screen to trigger said user selected operation;
u) the device in which the user interface is further configured to cause the selected operation to control the device by one or more functions in addition to an appearance on the touch-sensitive screen by detecting one or more coordinate positions of one or more finger movements touching the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive;
v) the device can operate an application program within the user interface;
w) the device can deactivate the selected operation;
x) the device wherein the touch-sensitive screen is capable of detecting the finger in close proximity over the screen and the user interface is further configured to execute a selected operation by the finger being in close proximity but not touching the screen;
y) a method of operating the device;
z) the user interface code is further configured to cause one or more selected operations, which includes one or more functions available to the user interface code of the device, to deactivate while the user's finger is touching one or more locations on the screen;
1) the user interface code is further configured to activate another selected operation by the selected operation used in conjunction with an additional input method available in a prior art GUI;
2) the user interface code is further configured to execute the selected operation by the user's finger on the screen being along a specified direction and within a designated area of the screen over a specified distance;
3) the user interface code is further configured to execute the selected operation by the user's finger on the screen being along a specified direction and within a designated area of the screen over a specified distance then is removed from the screen;
4) a part of a surface of the device as the touch-sensitive screen can be configured to detect being touched by the user's finger without requiring the touch-sensitive screen to be sensitive to the pressure of the finger contact on the surface to operate as a button without requiring pressure;
5) a surface of the device as the touch-sensitive screen may include in whole or in part visual feedback of an LCD screen;
6) a sequence of one or more locations touched on a touch-sensitive surface of the device as the touch-sensitive screen can include a sequence of contact on the surface by a user's finger movement that may execute the selected operation which would not be obvious to the user by a visual feedback from the surface;
7) a touch-sensitive surface of the device as the touch-sensitive screen can act as a pressure-less switch or button on the device; and
8) the user's finger movement detected by the touch-sensitive screen can be used interchangeably as pointer movement.
3. The device of claim 1 in which said device comprises either a touch-sensitive pad or mobile phone that is not required to be pressure-sensitive,
wherein the user interface code is configured to detect more than one selected operation,
which has the capability to emulate a visible or invisible pointer movement and/or a “click” event for one or more of the following:
a control, menu, desktop, internet browser, multimedia player, settings menu, icon, button, phone dialer, multimedia recorder, word processor, email program, graphical program, graphical user interface, other application program, or pixel of the touch-sensitive screen, and
with the capability to provide visual, and auditory feedback responsive to one or more locations being touched by the movement of the user's finger,
without requiring the exertion of pressure on the touch-sensitive screen.
4. A mobile device according to claim 1 wherein the mobile device is either a mobile phone or a touch-sensitive pad.
5. A method of operating a computer apparatus capable of operating software by a graphical user interface GUI characterized by the GUI detecting a coordinate input of pointer movement alone from a pointing device and thereby activate one or more functions available to the GUI.
6. A device capable of executing software according to the method of claim 5 wherein, a touch-sensitive screen is the pointing device, and the coordinate input of pointer movement alone means the information detected by the user interface code of the GUI of the location of the one or more coordinates being touched by contact of a user's finger on the touch-sensitive screen and thereby activate one or more said functions by a selected operation of the user, comprising:
the touch-sensitive screen configured to detect being touched by the user's finger without requiring the touch-sensitive screen to be sensitive to a pressure of the finger contact on the screen;
a processor connected to the touch-sensitive screen and configured to receive from the screen information regarding locations touched by the user's finger;
executable user interface code stored in a memory connected to the processor, the user interface code executable by the processor;
the user interface code being configured to detect one or more locations touched by a movement of the user's finger on the screen without requiring the touch-sensitive screen to be sensitive to the pressure of the finger contact on the screen and determine therefrom the selected operation.
7. A method according to claim 5 wherein, said one or more functions were previously activated by a prior art GUI detecting a simultaneous coordinate input and an additional data input from the pointing device wherein the additional data input is
an input of sensitivity from the pointing device of a touch screen to pressure of a finger in contact on the screen, or
mouse button data from a pointing device or equivalent;
instead of operating the GUI according to the method of claim 5, in which the GUI is characterised by detecting said coordinate input of pointer movement alone, that is the input of X and Y coordinate information, from the pointing device and thereby activate said one or more functions available to the GUI.
8. The device of claim 6 in which, said device comprises either a touch-sensitive pad or mobile phone that is not required to be pressure sensitive,
wherein the user interface code is configured to detect more than one selected operation,
which has the capability to emulate a visible or invisible pointer movement and/or a “click” event for one or more of the following:
a control, menu, desktop, internet browser, multimedia player, settings menu, icon, button, phone dialer, multimedia recorder, word processor, email program, graphical program, graphical user interface, other application program, or pixel of the touch-sensitive screen, and
with the capability to provide visual, and auditory feedback responsive to one or more locations being touched by the movement of the user's finger,
without requiring the exertion of pressure on the touch-sensitive screen.
9. A method of operating a GUI as claimed in claim 5 wherein,
the pointing device is a touch-sensitive screen, and the GUI is displayed on the touch-sensitive screen, which does not require to be pressure sensitive to a pressure of finger contact on the screen for the GUI to detect the coordinate input of one or more coordinate locations of one or more finger movements of a user on the touch-sensitive screen and thereby activate one or more said functions of the GUI to control the apparatus of a device with the touch-sensitive screen.
10. A method of operating the device according to claim 6 which is further configured by one or more of the following:
a) the device is operable by the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive including any of a mobile phone, or a touch-sensitive pad, or another computer device with a touch-sensitive screen;
b) the device activates the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, by a start sequence of locations touched on the touch-sensitive screen by a movement of the user's finger to start operating one or more further functions of the user interface selectable by one or more subsequent finger movements of the user on the touch-sensitive screen to control the device by one or more subsequent selected operations of the user;
c) the device can browse the internet;
d) the device can play video files;
e) the device can play audio files;
f) the device can display text;
g) the device can display a multimedia file;
h) the device can edit text;
i) the device can search the internet or text by entering characters;
j) the device has a start sequence of locations to be touched on the touch-sensitive screen by one or more finger movements of the user on the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive to start operating one or more further functions of the user interface to control the device, and the start sequence does not require the sequence of locations to be touched to be deducible by another user from the appearance of the touch-sensitive screen displaying the start sequence of locations to be touched;
k) the device is operated by touching the touch-sensitive screen in two or more areas sequentially without requiring the touch-sensitive screen to be pressure sensitive;
l) the device has one or more functions to activate the touch-sensitive screen display from a very low power mode;
m) the device requires a sequence of locations on the touch-sensitive screen being touched by one or more finger movements on the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, thereby to activate the selected operation of the user to control the device;
n) the touch-sensitive screen being touched by one or more finger movements without requiring the touch-sensitive screen to be pressure sensitive can generate one or more functions by any of placing a finger at a location on the screen, moving a finger in contact with the screen, or not touching the screen at a location of the screen;
o) the device can operate a graphic program including the ability to draw a line on the touch-sensitive screen;
p) the device further includes detecting on the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, a movement of a pen at one or more locations of the touch-sensitive screen as a movement of the user's finger at said one or more locations on the touch-sensitive screen;
q) the device in which the user interface is further configured to respond to a pointer speed at which the user's finger touching the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive moves along the screen;
r) the device in which the user interface is further configured to cause the user selected operation to move a displayed desktop in a selected direction;
s) the device wherein the user interface is further configured to execute a selected operation by the touch-sensitive screen detecting coordinates of each of at least two of the user's fingers touching different areas of the touch-sensitive screen sequentially without requiring the touch-sensitive screen to be pressure sensitive;
t) the device wherein said user interface is further configured to make a triggering of the user selected operation by a sequence of locations touched on the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive less probable to occur by accident than the user pressing a physical button or requiring pressing on a touch screen to trigger said user selected operation;
u) the device in which the user interface is further configured to cause the selected operation to control the device by one or more functions in addition to an appearance on the touch-sensitive screen by detecting one or more coordinate positions of one or more finger movements touching the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive;
v) the device can operate an application program within the user interface;
w) the device can deactivate the selected operation;
x) the device wherein the touch-sensitive screen is capable of detecting a finger in close proximity over the screen and the user interface is further configured to execute a selected operation by the finger being in close proximity but not touching the screen;
y) the user interface code is further configured to activate another selected operation by the selected operation used in conjunction with an additional input method of a prior art GUI;
z) the user interface code is further configured to cause one or more selected operations, which includes one or more functions available to the user interface code of the device, to deactivate while the user's finger is touching one or more locations on the screen;
1) the user interface code is further configured to execute the selected operation by the user's finger on the screen being along a specified direction and within a designated area of the screen over a specified distance;
2) the user interface code is further configured to execute the selected operation by the user's finger on the screen being along a specified direction and within a designated area of the screen over a specified distance then is removed from the screen;
3) a part of a surface of the device as the touch-sensitive screen can be configured to detect being touched by the user's finger without requiring the touch-sensitive screen to be sensitive to the pressure of the finger contact on the surface to operate as a button without requiring pressure;
4) a surface of the device as the touch-sensitive screen may include in whole or in part visual feedback of an LCD screen;
5) a sequence of one or more locations touched on a touch-sensitive surface of the device as the touch-sensitive screen can include a sequence of contact on the surface by a user's finger movement that may execute the selected operation which would not be obvious to the user by a visual feedback from the surface;
6) a touch-sensitive surface of the device as the touch-sensitive screen can act as a pressure-less switch or button on the device; and
7) the user's finger movement detected by the touch-sensitive screen can be used interchangeably as pointer movement.
11. A method of operating the computer apparatus of a mobile device according to claim 5 wherein the pointing device of the mobile device is a touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive to contact of a user's finger on the screen, and the coordinate input of pointer movement alone from the touch-sensitive screen is the coordinate input detected by the GUI of one or more coordinate locations related to the screen of one or more finger movements over or on the screen detected by the touch-sensitive screen and the mobile device is either a mobile phone or a touch-sensitive pad.
12. A method of operating the graphical user interface GUI of claim 5 in which by the coordinate input being detected by the GUI,
when a pointer is immediately adjacent to or passes over a control area,
a procedure is initiated whereby subsequent movement of the pointer over a predetermined path generates a ‘click’ event, activating said one or more functions of the GUI which simulates direct clicking of a control.
13. A method of operating the graphical user interface GUI according to claim 5 in which by the coordinate input being detected by the GUI,
a function related to a control area of said functions is triggered by a pointer movement over the control area, then by further movement over an additional area comprising the steps of
a. moving the pointer into contact with the control area
b. initiating activating the one or more functions associated with the control area by moving the pointer to an additional area related to the control area
c. moving the pointer within the additional area defined in b. and completing a specified movement within the additional area to complete activation of the function associated with the control area.
14. A method of operating a GUI in which by pointer movement alone may activate functions, which were previously activated in existing programs by other methods.
15. A method according to claim 14 in which said method of claim 14 is limited to activate a function of said functions and while said method of claim 14 activates the function, said claim language of claim 14 of “a method of operating a GUI in which by pointer movement alone may activate functions, which were previously activated in existing programs by other methods” expressly means “a method of operating a computer apparatus capable of operating software by a graphical user interface GUI characterized by the GUI detecting a coordinate input of pointer movement alone from a pointing device and thereby activate a function available to the GUI, which may be previously activated in existing programs by other methods”.
16. A method of operating a GUI as defined in claim 15 wherein, said other method of a prior art GUI is a standard click method whereby said function was activated by the prior art GUI detecting a simultaneous coordinate input and an additional data input from the pointing device wherein the additional data input is:
an input of sensitivity from the pointing device of a touch screen to pressure of a finger in contact on the screen, or
mouse button data from a pointing device or equivalent;
instead of operating the GUI according to the method of claim 15, in which the GUI is characterised by detecting said coordinate input of pointer movement alone, that is the input of X and Y coordinate information, from the pointing device and thereby activate said function available to the GUI.
17. A device capable of executing software according to the method of claim 14 wherein, “by pointer movement alone may activate functions, which were previously activated in existing programs by other methods” of the claim language of claim 14 means “by the GUI detecting a coordinate input of pointer movement alone from a pointing device and thereby activate one or more functions of said functions available to the GUI, which may be previously activated in existing programs by other methods”, and a touch-sensitive screen is the pointing device, and the coordinate input means the information detected by the user interface code of the GUI of the location of the one or more coordinates being touched by contact of a user's finger on the touch-sensitive screen activates one or more said functions by a selected operation of the user, comprising:
the touch-sensitive screen configured to detect being touched by the user's finger without requiring the touch-sensitive screen to be sensitive to a pressure of the finger contact on the screen;
a processor connected to the touch-sensitive screen and configured to receive from the screen information regarding locations touched by the user's finger;
executable user interface code stored in a memory connected to the processor, the user interface code executable by the processor;
the user interface code being configured to detect one or more locations touched by a movement of the user's finger on the screen without requiring the touch-sensitive screen to be sensitive to the pressure of the finger contact on the screen and determine therefrom the selected operation.
18. The device of claim 17 in which, said device comprises either a touch-sensitive pad or mobile phone that is not required to be pressure-sensitive,
wherein the user interface code is configured to detect more than one selected operation,
which has the capability to emulate a visible or invisible pointer movement and/or a “click” event for one or more of the following:
a control, menu, desktop, internet browser, multimedia player, settings menu, icon, button, phone dialer, multimedia recorder, word processor, email program, graphical program, graphical user interface, other application program, or pixel of the touch-sensitive screen, and
with the capability to provide visual, and auditory feedback responsive to one or more locations being touched by the movement of the user's finger,
without requiring the exertion of pressure on the touch-sensitive screen.
19. A method of operating a GUI as defined in claim 15 whereby said other method of a prior art GUI existing program operating a mobile device by a keyboard is the prior art GUI detecting an input sensitive to pressure of a finger in contact with one or more physical buttons, or virtual buttons on a touch screen, of the mobile device as an additional data input in conjunction with a coordinate input of pointer movement, that is the input of X and Y coordinate information, to activate said function;
instead of operating said mobile device with a touch sensitive screen as the pointing device according to the method of claim 15, in which the GUI detects said coordinate input of pointer movement alone, that is the input of X and Y coordinate information, from the touch sensitive screen by one or more locations touched by a user's finger on the screen, without requiring the touch sensitive screen to be sensitive to the pressure of finger contact, and thereby activate said function.
20. A method of operating the device according to claim 17 which is further configured by one or more of the following:
a) the device is operable by the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive including any of a mobile phone, or a touch-sensitive pad, or another computer device with a touch-sensitive screen;
b) the device activates the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, by a start sequence of locations touched on the touch-sensitive screen by a movement of the user's finger to start operating one or more further functions of the user interface selectable by one or more subsequent finger movements of the user on the touch-sensitive screen to control the device by one or more subsequent selected operations of the user;
c) the device can browse the internet;
d) the device can play video files;
e) the device can play audio files;
f) the device can display text;
g) the device can display a multimedia file;
h) the device can edit text;
i) the device can search the internet or text by entering characters;
j) the device has a start sequence of locations to be touched on the touch-sensitive screen by one or more finger movements of the user on the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive to start operating one or more further functions of the user interface to control the device, and the start sequence does not require the sequence of locations to be touched to be deducible by another user from the appearance of the touch-sensitive screen displaying the start sequence of locations to be touched;
k) the device is operated by touching the touch-sensitive screen in two or more areas sequentially without requiring the touch-sensitive screen to be pressure sensitive;
l) the device has one or more functions to activate the touch-sensitive screen display from a very low power mode;
m) the device requires a sequence of locations on the touch-sensitive screen being touched by one or more finger movements on the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, thereby to activate the selected operation of the user to control the device;
n) the touch-sensitive screen being touched by one or more finger movements without requiring the touch-sensitive screen to be pressure sensitive can generate one or more functions by any of placing a finger at a location on the screen, moving a finger in contact with the screen, or not touching the screen at a location of the screen;
o) the device can operate a graphic program including the ability to draw a line on the touch-sensitive screen;
p) the device further includes detecting on the touch-sensitive screen, without requiring the touch-sensitive screen to be pressure sensitive, a movement of a pen at one or more locations of the touch-sensitive screen as a movement of the user's finger at said one or more locations on the touch-sensitive screen;
q) the device in which the user interface is further configured to respond to a pointer speed at which the user's finger touching the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive moves along the screen;
r) the device in which the user interface is further configured to cause the user selected operation to move a displayed desktop in a selected direction;
s) the device wherein the user interface is further configured to execute a selected operation by the touch-sensitive screen detecting coordinates of each of at least two of the user's fingers touching different areas of the touch-sensitive screen sequentially without requiring the touch-sensitive screen to be pressure sensitive;
t) the device wherein said user interface is further configured to make a triggering of the user selected operation by a sequence of locations touched on the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive less probable to occur by accident than the user pressing a physical button or requiring pressing on a touch screen to trigger said user selected operation;
u) the device in which the user interface is further configured to cause the selected operation to control the device by one or more functions in addition to an appearance on the touch-sensitive screen by detecting one or more coordinate positions of one or more finger movements touching the touch-sensitive screen without requiring the touch-sensitive screen to be pressure sensitive;
v) the device can operate an application program within the user interface;
w) the device can deactivate the selected operation;
x) the device wherein the touch-sensitive screen is capable of detecting the finger in close proximity over the screen and the user interface is further configured to execute a selected operation by the finger being in close proximity but not touching the screen;
y) the user interface code is further configured to activate another selected operation by the selected operation used in conjunction with an additional input method in a prior art GUI;
z) the user interface code is further configured to cause one or more selected operations, which includes one or more functions available to the user interface code of the device, to deactivate while the user's finger is touching one or more locations on the screen;
1) the user interface code is further configured to execute the selected operation by the user's finger on the screen being along a specified direction and within a designated area of the screen over a specified distance;
2) the user interface code is further configured to execute the selected operation by the user's finger on the screen being along a specified direction and within a designated area of the screen over a specified distance then is removed from the screen;
3) a part of a surface of the device as the touch-sensitive screen can be configured to detect being touched by the user's finger without requiring the touch-sensitive screen to be sensitive to the pressure of the finger contact on the surface to operate as a button without requiring pressure;
4) a surface of the device as the touch-sensitive screen may include in whole or in part visual feedback of an LCD screen;
5) a sequence of one or more locations touched on a touch-sensitive surface of the device as the touch-sensitive screen can include a sequence of contact on the surface by a user's finger movement that may execute the selected operation which would not be obvious to the user by a visual feedback from the surface;
6) a touch-sensitive surface of the device as the touch-sensitive screen can act as a pressure-less switch or button on the device; and
7) the user's finger movement detected by the touch-sensitive screen can be used interchangeably as pointer movement.
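As a rough illustration of the touch-screen variant in claims 1, 2 and 6 (a selected operation determined from finger movement alone, with a specified direction, a designated area, and a specified distance, and with no pressure sensing), the following TypeScript sketch is hypothetical and not taken from the patent; its names and thresholds are illustrative assumptions.

```typescript
// Hypothetical sketch: resolving a selected operation from touched coordinates
// alone, in the spirit of claims 1, 2(2)-(3) and 6. Illustrative names only.

interface Touch { x: number; y: number; }
interface Rect { left: number; top: number; right: number; bottom: number; }

type Direction = "left" | "right" | "up" | "down";

interface GestureBinding {
  area: Rect;            // designated area of the screen
  direction: Direction;  // specified direction of finger movement
  minDistance: number;   // specified distance, in pixels
  operation: () => void; // selected operation to execute
}

const inRect = (r: Rect, t: Touch): boolean =>
  t.x >= r.left && t.x <= r.right && t.y >= r.top && t.y <= r.bottom;

// Called when the finger is removed from the screen, with the touched locations.
// Only coordinate information is used; no pressure data is required.
function resolveOperation(path: Touch[], bindings: GestureBinding[]): void {
  if (path.length < 2) return;
  const first = path[0];
  const last = path[path.length - 1];
  const dx = last.x - first.x;
  const dy = last.y - first.y;

  for (const b of bindings) {
    if (!path.every((t) => inRect(b.area, t))) continue; // stay within the area
    const along =
      b.direction === "right" ? dx :
      b.direction === "left"  ? -dx :
      b.direction === "down"  ? dy : -dy;
    const across = (b.direction === "left" || b.direction === "right")
      ? Math.abs(dy) : Math.abs(dx);
    if (along >= b.minDistance && across <= along / 2) {
      b.operation();   // execute the user's selected operation
      return;
    }
  }
}

// Example: a rightward finger movement across a bottom strip selects "unlock".
const bindings: GestureBinding[] = [{
  area: { left: 0, top: 400, right: 320, bottom: 480 },
  direction: "right",
  minDistance: 150,
  operation: () => console.log("unlock"),
}];
resolveOperation(
  [{ x: 10, y: 440 }, { x: 90, y: 442 }, { x: 200, y: 445 }],
  bindings,
);  // logs "unlock"
```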
US13/896,280 | 2000-05-11 | 2013-05-16 | Zeroclick | Abandoned | US20160246451A9 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US13/896,280 US20160246451A9 (en) | 2000-05-11 | 2013-05-16 | Zeroclick

Applications Claiming Priority (24)

Application Number | Priority Date | Filing Date | Title
GB0011321A GB0011321D0 (en) | 2000-05-11 | 2000-05-11 | Zeroclick
GB0011321.7 | 2000-05-11
GB0011370A GB0011370D0 (en) | 2000-05-12 | 2000-05-12 | Zeroclick
GB0011370.4 | 2000-05-12
GB0011441A GB0011441D0 (en) | 2000-05-12 | 2000-05-12 | The zeroclick invention
GB0011441.3 | 2000-05-12
GB0012582.3 | 2000-05-24
GB0012582A GB0012582D0 (en) | 2000-05-24 | 2000-05-24 | The zeroclick invention
GB0026891A GB0026891D0 (en) | 2000-11-01 | 2000-11-01 | Zeroclick
GB0026891.2 | 2000-11-01
GB0028097A GB0028097D0 (en) | 2000-11-20 | 2000-11-20 | Zeroclick
GB0028097.4 | 2000-11-20
GB0028693A GB0028693D0 (en) | 2000-11-27 | 2000-11-27 | ZeroClick
GB0028693.0 | 2000-11-27
GB0029148.4 | 2000-11-30
GB0029148A GB0029148D0 (en) | 2000-11-30 | 2000-11-30 | Zeroclick
GB0031164A GB0031164D0 (en) | 2000-12-21 | 2000-12-21 | ZeroClick
GB0031164.7 | 2000-12-21
GB0031680.2 | 2000-12-27
GB0031680A GB0031680D0 (en) | 2000-12-27 | 2000-12-27 | Zeroclick
US10/275,863 US7818691B2 (en) | 2000-05-11 | 2001-05-03 | Zeroclick
PCT/GB2001/001978 WO2002005081A1 (en) | 2000-05-11 | 2001-05-03 | Zeroclick
US12/877,994 US8549443B2 (en) | 2000-05-11 | 2010-09-08 | Zeroclick
US13/896,280 US20160246451A9 (en) | 2000-05-11 | 2013-05-16 | Zeroclick

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
US12/877,994 | Continuation | US8549443B2 (en) | 2000-05-11 | 2010-09-08 | Zeroclick

Publications (2)

Publication Number | Publication Date
US20140195969A1 (en) | 2014-07-10
US20160246451A9 (en) | 2016-08-25

Family

ID=27579341

Family Applications (3)

Application Number | Title | Priority Date | Filing Date
US10/275,863 | Expired - Fee Related | US7818691B2 (en) | 2000-05-11 | 2001-05-03 | Zeroclick
US12/877,994 | Expired - Fee Related | US8549443B2 (en) | 2000-05-11 | 2010-09-08 | Zeroclick
US13/896,280 | Abandoned | US20160246451A9 (en) | 2000-05-11 | 2013-05-16 | Zeroclick

Family Applications Before (2)

Application Number | Title | Priority Date | Filing Date
US10/275,863 | Expired - Fee Related | US7818691B2 (en) | 2000-05-11 | 2001-05-03 | Zeroclick
US12/877,994 | Expired - Fee Related | US8549443B2 (en) | 2000-05-11 | 2010-09-08 | Zeroclick

Country Status (8)

Country | Link
US (3) | US7818691B2 (en)
EP (1) | EP1285330B1 (en)
AT (1) | ATE338300T1 (en)
CA (1) | CA2412578A1 (en)
DE (1) | DE60122708D1 (en)
GB (1) | GB2380918C3 (en)
NZ (1) | NZ523065A (en)
WO (1) | WO2002005081A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9203951B1 (en) * | 2014-07-03 | 2015-12-01 | International Business Machines Corporation | Mobile telephone adapted for use with one hand
USD784379S1 (en) * | 2015-11-25 | 2017-04-18 | General Electric Company | Display panel or portion thereof with transitional graphical user interface
USD784381S1 (en) * | 2015-11-25 | 2017-04-18 | General Electric Company | Display panel or portion thereof with transitional graphical user interface
USD785657S1 (en) * | 2015-11-25 | 2017-05-02 | General Electric Company | Display screen or portion thereof with transitional graphical user interface
US10423293B2 (en) * | 2015-11-25 | 2019-09-24 | International Business Machines Corporation | Controlling cursor motion

Families Citing this family (249)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US7760187B2 (en)2004-07-302010-07-20Apple Inc.Visual expander
US8645137B2 (en)2000-03-162014-02-04Apple Inc.Fast, language-independent method for user authentication by voice
KR100433396B1 (en)*2001-10-292004-06-02삼성전자주식회사On/off shifting appratus and method for electronic equipment
US8095879B2 (en)2002-12-102012-01-10Neonode Inc.User interface for mobile handheld computer unit
GB0222094D0 (en)*2002-09-242002-10-30Koninkl Philips Electronics NvGraphical user interface navigation method and apparatus
CN100409157C (en)*2002-12-232008-08-06皇家飞利浦电子股份有限公司Non-contact inputting devices
US7434175B2 (en)*2003-05-192008-10-07Jambo Acquisition, LlcDisplaying telephone numbers as active objects
US7266780B2 (en)*2003-06-302007-09-04Motorola, Inc.Method for combining deterministic and non-deterministic user interaction data input models
US7757173B2 (en)*2003-07-182010-07-13Apple Inc.Voice menu system
US7568161B2 (en)*2003-08-132009-07-28Melia Technologies, LtdOvercoming double-click constraints in a mark-up language environment
US7472350B2 (en)*2003-10-022008-12-30International Business Machines CorporationDisplaying and managing inherited values
WO2005052780A2 (en)*2003-11-202005-06-09Nes Stewart IrvineGraphical user interface
DE10360657A1 (en)*2003-12-232005-07-21Daimlerchrysler Ag Operating system for a motor vehicle
GB2410662A (en)*2004-01-292005-08-03Siemens PlcActivation of an operation by cursor movement
US7694233B1 (en)2004-04-302010-04-06Apple Inc.User interface presentation of information in reconfigured or overlapping containers
US20050283727A1 (en)*2004-06-212005-12-22Large William TNon-resident methods and systems for providing clickless user actuation of a webpage
WO2006014629A2 (en)*2004-07-202006-02-09Hillcrest Laboratories, Inc.Graphical cursor navigation methods
JP4113902B2 (en)*2004-08-272008-07-09富士通株式会社 Operation screen generation method, display control device, operation screen generation program, and computer-readable recording medium recording the program
US7519923B2 (en)*2004-10-202009-04-14International Business Machines CorporationMethod for generating a tree view of elements in a graphical user interface (GUI)
US8169410B2 (en)*2004-10-202012-05-01Nintendo Co., Ltd.Gesture inputs for a portable display device
US20060129928A1 (en)*2004-12-022006-06-15Weigen QiuUse of pointing device to identify ideographic characters
US20060156135A1 (en)*2004-12-162006-07-13Marek SikoraTabbed form with error indicators
JP4719494B2 (en)*2005-04-062011-07-06任天堂株式会社 Input coordinate processing program and input coordinate processing apparatus
US7750893B2 (en)*2005-04-062010-07-06Nintendo Co., Ltd.Storage medium storing input position processing program, and input position processing device
JP4832826B2 (en)*2005-07-262011-12-07任天堂株式会社 Object control program and information processing apparatus
US8677377B2 (en)2005-09-082014-03-18Apple Inc.Method and apparatus for building an intelligent automated assistant
JP4807999B2 (en)*2005-09-162011-11-02株式会社リコー Image display apparatus, image display method, image forming apparatus, image forming method, and program causing computer to execute these methods
US8629885B2 (en)*2005-12-012014-01-14Exent Technologies, Ltd.System, method and computer program product for dynamically identifying, selecting and extracting graphical and media objects in frames or scenes rendered by a software application
US20070296718A1 (en)*2005-12-012007-12-27Exent Technologies, Ltd.Dynamic resizing of graphics content rendered by an application to facilitate rendering of additional graphics content
US20070168309A1 (en)*2005-12-012007-07-19Exent Technologies, Ltd.System, method and computer program product for dynamically extracting and sharing event information from an executing software application
US7596536B2 (en)*2005-12-012009-09-29Exent Technologies, Ltd.System, method and computer program product for dynamically measuring properties of objects rendered and/or referenced by an application executing on a computing device
US8099683B2 (en)*2005-12-082012-01-17International Business Machines CorporationMovement-based dynamic filtering of search results in a graphical user interface
US20070171196A1 (en)*2006-01-232007-07-26Thomas Robert PfingstenController user interface and method
US8312372B2 (en)*2006-02-102012-11-13Microsoft CorporationMethod for confirming touch input
US7523418B2 (en)2006-03-152009-04-21International Business Machines CorporationTechniques for choosing a position on a display having a cursor
CN100507818C (en)*2006-04-302009-07-01国际商业机器公司 Method and device for enabling user to select multiple objects in one document
US9318108B2 (en)2010-01-182016-04-19Apple Inc.Intelligent automated assistant
US8570278B2 (en)2006-10-262013-10-29Apple Inc.Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US7856605B2 (en)2006-10-262010-12-21Apple Inc.Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US8381130B2 (en)*2006-11-292013-02-19Yahoo! Inc.Method and user interface tool for navigation through grid of scheduled items
KR100869950B1 (en)2006-12-012008-11-24삼성전자주식회사 Expanded Standby Screen Layout Structure and Display Method for Mobile Devices
US10437459B2 (en)2007-01-072019-10-08Apple Inc.Multitouch data fusion
US20080168478A1 (en)2007-01-072008-07-10Andrew PlatzerApplication Programming Interfaces for Scrolling
US20080168402A1 (en)2007-01-072008-07-10Christopher BlumenbergApplication Programming Interfaces for Gesture Operations
US9703889B2 (en)*2007-02-142017-07-11Google Inc.Providing auto-focus for a search field in a user interface
KR101358767B1 (en)2007-04-022014-02-07삼성전자주식회사Method for executing user command according to spatial movement of user input device and video apparatus thereof
US8977255B2 (en)2007-04-032015-03-10Apple Inc.Method and system for operating a multi-function portable electronic device using voice-activation
US20080256454A1 (en)*2007-04-132008-10-16Sap AgSelection of list item using invariant focus location
US8099681B2 (en)*2007-09-242012-01-17The Boeing CompanySystems and methods for propagating alerts via a hierarchy of grids
US20090158152A1 (en)*2007-12-122009-06-18Kodimer Marianne LSystem and method for generating context sensitive help for a graphical user interface
US20090158190A1 (en)*2007-12-132009-06-18Yuvee, Inc.Computing apparatus including a personal web and application assistant
WO2009076702A1 (en)*2007-12-142009-06-25Doubleiq Pty LtdA method and apparatus for the display and/or processing of information, such as data
US9330720B2 (en)2008-01-032016-05-03Apple Inc.Methods and apparatus for altering audio output signals
US8201109B2 (en)*2008-03-042012-06-12Apple Inc.Methods and graphical user interfaces for editing on a portable multifunction device
US8645827B2 (en)2008-03-042014-02-04Apple Inc.Touch event model
US8717305B2 (en)2008-03-042014-05-06Apple Inc.Touch event model for web pages
US8416196B2 (en)2008-03-042013-04-09Apple Inc.Touch event model programming interface
US8650507B2 (en)2008-03-042014-02-11Apple Inc.Selecting of text using gestures
US8996376B2 (en)2008-04-052015-03-31Apple Inc.Intelligent text-to-speech conversion
US10496753B2 (en)2010-01-182019-12-03Apple Inc.Automatically adapting user interfaces for hands-free interaction
US20090327886A1 (en)*2008-06-272009-12-31Microsoft CorporationUse of secondary factors to analyze user intention in gui element activation
US20100030549A1 (en)2008-07-312010-02-04Lee Michael MMobile device having human language translation capability with positional feedback
JP4600548B2 (en)*2008-08-272010-12-15ソニー株式会社 REPRODUCTION DEVICE, REPRODUCTION METHOD, AND PROGRAM
US8898568B2 (en)2008-09-092014-11-25Apple Inc.Audio user interface
US9491316B2 (en)2008-09-092016-11-08Applied Systems, Inc.Methods and apparatus for delivering documents
US8810522B2 (en)*2008-09-292014-08-19Smart Technologies UlcMethod for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method
JP5225820B2 (en)*2008-11-252013-07-03アイシン精機株式会社 Input device, vehicle periphery monitoring device, icon switch selection method and program
WO2010067118A1 (en)2008-12-112010-06-17Novauris Technologies LimitedSpeech recognition involving a mobile device
JP5173870B2 (en)*2009-01-282013-04-03京セラ株式会社 Input device
JP2010204870A (en)*2009-03-032010-09-16Funai Electric Co LtdInput device
US8285499B2 (en)2009-03-162012-10-09Apple Inc.Event recognition
US8566045B2 (en)2009-03-162013-10-22Apple Inc.Event recognition
US9311112B2 (en)2009-03-162016-04-12Apple Inc.Event recognition
US9684521B2 (en)2010-01-262017-06-20Apple Inc.Systems having discrete and continuous gesture recognizers
US9875013B2 (en)2009-03-162018-01-23Apple Inc.Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
GB2468884A (en)*2009-03-252010-09-29Nec CorpUser defined paths for control on a touch screen
US9858925B2 (en)2009-06-052018-01-02Apple Inc.Using context information to facilitate processing of commands in a virtual assistant
US20120309363A1 (en)2011-06-032012-12-06Apple Inc.Triggering notifications associated with tasks items that represent tasks to perform
US10241752B2 (en)2011-09-302019-03-26Apple Inc.Interface for a virtual digital assistant
US10241644B2 (en)2011-06-032019-03-26Apple Inc.Actionable reminder entries
US9431006B2 (en)2009-07-022016-08-30Apple Inc.Methods and apparatuses for automatic speech recognition
US20110010656A1 (en)*2009-07-132011-01-13Ta Keo LtdApparatus and method for improved user interface
JP2011022842A (en)*2009-07-162011-02-03Sony CorpDisplay apparatus, display method, and program
US8291313B1 (en)*2009-08-262012-10-16Adobe Systems IncorporatedGeneration of a container hierarchy from a document design
US9176962B2 (en)2009-09-072015-11-03Apple Inc.Digital media asset browsing with audio cues
US10276170B2 (en)2010-01-182019-04-30Apple Inc.Intelligent automated assistant
US10679605B2 (en)2010-01-182020-06-09Apple Inc.Hands-free list-reading by intelligent automated assistant
US10705794B2 (en)2010-01-182020-07-07Apple Inc.Automatically adapting user interfaces for hands-free interaction
US10553209B2 (en)2010-01-182020-02-04Apple Inc.Systems and methods for hands-free notification summaries
US10397639B1 (en)2010-01-292019-08-27Sitting Man, LlcHot key systems and methods
US8682667B2 (en)2010-02-252014-03-25Apple Inc.User profiling for selecting user specific voice input processing information
JP2011177203A (en)*2010-02-262011-09-15Nintendo Co LtdObject controlling program and object controlling apparatus
US10216408B2 (en)*2010-06-142019-02-26Apple Inc.Devices and methods for identifying user interface objects based on view hierarchy
KR20120021056A (en)*2010-08-312012-03-08삼성전자주식회사Method for providing search service to store search result temporarily and display apparatus applying the same
KR101719268B1 (en)2010-09-022017-03-23삼성전자주식회사Method for providing search service interconvertable search window and image display window and display apparatus applying the same
US20120151397A1 (en)*2010-12-082012-06-14Tavendo GmbhAccess to an electronic object collection via a plurality of views
US10762293B2 (en)2010-12-222020-09-01Apple Inc.Using parts-of-speech tagging and named entity recognition for spelling correction
US9262612B2 (en)2011-03-212016-02-16Apple Inc.Device access using voice authentication
US9298363B2 (en)2011-04-112016-03-29Apple Inc.Region activation for touch sensitive surface
US9239672B2 (en)*2011-04-202016-01-19Mellmo Inc.User interface for data comparison
KR101788006B1 (en)*2011-07-182017-10-19엘지전자 주식회사Remote Controller and Image Display Device Controllable by Remote Controller
US9317625B2 (en)*2011-05-112016-04-19Mitel Networks CorporationQuick directory search system on a touch screen device and methods thereof
US8843346B2 (en)*2011-05-132014-09-23Amazon Technologies, Inc.Using spatial information with device interaction
US8316319B1 (en)*2011-05-162012-11-20Google Inc.Efficient selection of characters and commands based on movement-inputs at a user-inerface
US8677232B2 (en)2011-05-312014-03-18Apple Inc.Devices, methods, and graphical user interfaces for document manipulation
USD678311S1 (en)*2011-05-312013-03-19Fanhattan, LLCPortion of a display with a graphical user interface
US10057736B2 (en)2011-06-032018-08-21Apple Inc.Active transport based notifications
US9146656B1 (en)*2011-06-272015-09-29Google Inc.Notifications user interface
US8994660B2 (en)2011-08-292015-03-31Apple Inc.Text correction processing
KR101844903B1 (en)*2011-08-312018-04-04삼성전자 주식회사Providing Method for Data Complex Recording And Portable Device thereof
US8682881B1 (en)*2011-09-072014-03-25Google Inc.System and method for extracting structured data from classified websites
US8990726B2 (en)2011-09-122015-03-24Microsoft Technology Licensing, LlcText box clearing selector
US9588595B2 (en)2011-09-122017-03-07Microsoft Technology Licensing, LlcPassword reveal selector
GB2496378B (en)2011-11-032016-12-21IbmSmart window creation in a graphical user interface
EP2788860A4 (en)*2011-12-062016-07-06Autograph IncConsumer self-profiling gui, analysis and rapid information presentation tools
CN104160361A (en)2012-02-062014-11-19迈克尔·K·科尔比 string completion
US10134385B2 (en)2012-03-022018-11-20Apple Inc.Systems and methods for name pronunciation
US9483461B2 (en)2012-03-062016-11-01Apple Inc.Handling speech synthesis of content for multiple languages
US9280610B2 (en)2012-05-142016-03-08Apple Inc.Crowd sourcing information to fulfill user requests
US20130311954A1 (en)*2012-05-182013-11-21Geegui CorporationEfficient user interface
US9721563B2 (en)2012-06-082017-08-01Apple Inc.Name recognition system
US9495129B2 (en)2012-06-292016-11-15Apple Inc.Device, method, and user interface for voice-activated navigation and browsing of a document
ITTV20120138A1 (en)*2012-07-252014-01-26Isis S R L METHOD FOR THE CONTROL AND ACTIVATION OF A USER INTERFACE AND DEVICE AND PLANT WITH THIS METHOD AND INTERFACE
US20150169170A1 (en)*2012-08-302015-06-18Google Inc.Detecting a hover event using a sequence based on cursor movement
US9576574B2 (en)2012-09-102017-02-21Apple Inc.Context-sensitive handling of interruptions by intelligent digital assistant
US9547647B2 (en)2012-09-192017-01-17Apple Inc.Voice-based media searching
US20140118310A1 (en)*2012-10-262014-05-01Livescribe Inc.Digital Cursor Display Linked to a Smart Pen
KR101416749B1 (en)*2012-12-132014-07-08주식회사 케이티Tv representing apparatus and method for controlling access of user
US20140173524A1 (en)*2012-12-142014-06-19Microsoft CorporationTarget and press natural user input
CN103914466B (en)*2012-12-312017-08-08阿里巴巴集团控股有限公司A kind of method and system of label button management
DE212014000045U1 (en)2013-02-072015-09-24Apple Inc. Voice trigger for a digital assistant
TW201435720A (en)*2013-03-012014-09-16Hon Hai Prec Ind Co LtdCursor of mouse control method
US9760187B2 (en)*2013-03-112017-09-12Barnes & Noble College Booksellers, LlcStylus with active color display/select for touch sensitive devices
US9368114B2 (en)2013-03-142016-06-14Apple Inc.Context-sensitive handling of interruptions
US10652394B2 (en)2013-03-142020-05-12Apple Inc.System and method for processing voicemail
AU2014233517B2 (en)2013-03-152017-05-25Apple Inc.Training an at least partial voice command system
WO2014144579A1 (en)2013-03-152014-09-18Apple Inc.System and method for updating an adaptive speech recognition model
WO2014158101A1 (en)*2013-03-282014-10-02Sun VasanMethods, systems and devices for interacting with a computing device
WO2014197336A1 (en)2013-06-072014-12-11Apple Inc.System and method for detecting errors in interactions with a voice-based digital assistant
US9582608B2 (en)2013-06-072017-02-28Apple Inc.Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
WO2014197334A2 (en)2013-06-072014-12-11Apple Inc.System and method for user-specified pronunciation of words for speech synthesis and recognition
WO2014197335A1 (en)2013-06-082014-12-11Apple Inc.Interpreting and acting upon commands that involve sharing information with remote devices
US10176167B2 (en)2013-06-092019-01-08Apple Inc.System and method for inferring user intent from speech inputs
DE112014002747T5 (en)2013-06-092016-03-03Apple Inc. Apparatus, method and graphical user interface for enabling conversation persistence over two or more instances of a digital assistant
US9733716B2 (en)2013-06-092017-08-15Apple Inc.Proxy gesture recognizer
AU2014278595B2 (en)2013-06-132017-04-06Apple Inc.System and method for emergency calls initiated by voice command
JP2015011689A (en)*2013-07-022015-01-19船井電機株式会社Information processing device, information processing method, and system
DE112014003653B4 (en)2013-08-062024-04-18Apple Inc. Automatically activate intelligent responses based on activities from remote devices
KR102184269B1 (en)*2013-09-022020-11-30삼성전자 주식회사Display apparatus, portable apparatus and method for displaying a screen thereof
US20150081502A1 (en)*2013-09-192015-03-19Trading Technologies International, Inc.Methods and apparatus to implement two-step trade action execution
US11435895B2 (en)2013-12-282022-09-06Trading Technologies International, Inc.Methods and apparatus to enable a trading device to accept a user input
US9971801B2 (en)*2014-03-262018-05-15Interject Data Systems, Inc.Grid cell data requests
US10599332B2 (en)*2014-03-312020-03-24Bombardier Inc.Cursor control for aircraft display device
KR101617216B1 (en)*2014-05-072016-05-02삼성전자 주식회사A display device and method for displaying a object highlight of a image
US9620105B2 (en)2014-05-152017-04-11Apple Inc.Analyzing audio input for efficient speech and music recognition
US10592095B2 (en)2014-05-232020-03-17Apple Inc.Instantaneous speaking of content on touch devices
US9502031B2 (en)2014-05-272016-11-22Apple Inc.Method for supporting dynamic grammars in WFST-based ASR
US9715875B2 (en)2014-05-302017-07-25Apple Inc.Reducing the need for manual start/end-pointing and trigger phrases
US10170123B2 (en)2014-05-302019-01-01Apple Inc.Intelligent assistant for home automation
US9785630B2 (en)2014-05-302017-10-10Apple Inc.Text prediction using combined word N-gram and unigram language models
US9760559B2 (en)2014-05-302017-09-12Apple Inc.Predictive text input
US10078631B2 (en)2014-05-302018-09-18Apple Inc.Entropy-guided text prediction using combined word and character n-gram language models
US9633004B2 (en)2014-05-302017-04-25Apple Inc.Better resolution when referencing to concepts
US9842101B2 (en)2014-05-302017-12-12Apple Inc.Predictive conversion of language input
US10289433B2 (en)2014-05-302019-05-14Apple Inc.Domain specific language for encoding assistant dialog
CN110797019B (en)2014-05-302023-08-29苹果公司Multi-command single speech input method
US9734193B2 (en)2014-05-302017-08-15Apple Inc.Determining domain salience ranking from ambiguous words in natural speech
US9430463B2 (en)2014-05-302016-08-30Apple Inc.Exemplar-based natural language processing
US9338493B2 (en)2014-06-302016-05-10Apple Inc.Intelligent automated assistant for TV user interactions
US10659851B2 (en)2014-06-302020-05-19Apple Inc.Real-time digital assistant knowledge updates
CN105335136B (en)*2014-07-162019-08-09阿里巴巴集团控股有限公司The control method and device of smart machine
US10446141B2 (en)2014-08-282019-10-15Apple Inc.Automatic speech recognition based on user feedback
US9818400B2 (en)2014-09-112017-11-14Apple Inc.Method and apparatus for discovering trending terms in speech requests
US10789041B2 (en)2014-09-122020-09-29Apple Inc.Dynamic thresholds for always listening speech trigger
US9646609B2 (en)2014-09-302017-05-09Apple Inc.Caching apparatus for serving phonetic pronunciations
US10127911B2 (en)2014-09-302018-11-13Apple Inc.Speaker identification and unsupervised speaker adaptation techniques
US9668121B2 (en)2014-09-302017-05-30Apple Inc.Social reminders
US10074360B2 (en)2014-09-302018-09-11Apple Inc.Providing an indication of the suitability of speech recognition
US9886432B2 (en)2014-09-302018-02-06Apple Inc.Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
KR20160053144A (en)*2014-10-312016-05-13삼성전자주식회사Display apparatus, system and controlling method thereof
US10552013B2 (en)2014-12-022020-02-04Apple Inc.Data detection
US9711141B2 (en)2014-12-092017-07-18Apple Inc.Disambiguating heteronyms in speech synthesis
US10963126B2 (en)*2014-12-102021-03-30D2L CorporationMethod and system for element navigation
US9865280B2 (en)2015-03-062018-01-09Apple Inc.Structured dictation using intelligent automated assistants
US10567477B2 (en)2015-03-082020-02-18Apple Inc.Virtual assistant continuity
US9721566B2 (en)2015-03-082017-08-01Apple Inc.Competing devices responding to voice triggers
US9886953B2 (en)2015-03-082018-02-06Apple Inc.Virtual assistant activation
US9899019B2 (en)2015-03-182018-02-20Apple Inc.Systems and methods for structured stem and suffix language models
US9842105B2 (en)2015-04-162017-12-12Apple Inc.Parsimonious continuous-space phrase representations for natural language processing
US9870755B2 (en)*2015-05-222018-01-16Google LlcPrioritized display of visual content in computer presentations
US10083688B2 (en)2015-05-272018-09-25Apple Inc.Device voice control for selecting a displayed affordance
US10127220B2 (en)2015-06-042018-11-13Apple Inc.Language identification from short strings
US9578173B2 (en)2015-06-052017-02-21Apple Inc.Virtual assistant aided communication with 3rd party service in a communication session
US10101822B2 (en)2015-06-052018-10-16Apple Inc.Language input correction
US10255907B2 (en)2015-06-072019-04-09Apple Inc.Automatic accent detection using acoustic models
US11025565B2 (en)2015-06-072021-06-01Apple Inc.Personalized prediction of responses for instant messaging
US10186254B2 (en)2015-06-072019-01-22Apple Inc.Context-based endpoint detection
US10452231B2 (en)*2015-06-262019-10-22International Business Machines CorporationUsability improvements for visual interfaces
US10394421B2 (en)2015-06-262019-08-27International Business Machines CorporationScreen reader improvements
US10817127B1 (en)*2015-07-112020-10-27Allscripts Software, LlcMethodologies involving use of avatar for clinical documentation
US10747498B2 (en)2015-09-082020-08-18Apple Inc.Zero latency digital assistant
US10671428B2 (en)2015-09-082020-06-02Apple Inc.Distributed personal assistant
US9697820B2 (en)2015-09-242017-07-04Apple Inc.Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US10366158B2 (en)2015-09-292019-07-30Apple Inc.Efficient word encoding for recurrent neural network language models
US11010550B2 (en)2015-09-292021-05-18Apple Inc.Unified language modeling framework for word prediction, auto-completion and auto-correction
US11587559B2 (en)2015-09-302023-02-21Apple Inc.Intelligent device identification
US10881713B2 (en)2015-10-282021-01-05Atheer, Inc.Method and apparatus for interface control with prompt and feedback
US10691473B2 (en)2015-11-062020-06-23Apple Inc.Intelligent automated assistant in a messaging environment
US10248284B2 (en)*2015-11-162019-04-02Atheer, Inc.Method and apparatus for interface control with prompt and feedback
US10049668B2 (en)2015-12-022018-08-14Apple Inc.Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en)2015-12-232019-03-05Apple Inc.Proactive assistance based on dialog communication between devices
JP6624972B2 (en)*2016-02-262019-12-25キヤノン株式会社 Method, apparatus, and program for controlling display
US10446143B2 (en)2016-03-142019-10-15Apple Inc.Identification of voice inputs providing credentials
US9934775B2 (en)2016-05-262018-04-03Apple Inc.Unit-selection text-to-speech synthesis based on predicted concatenation parameters
CN105930089B (en)*2016-05-312019-04-09维沃移动通信有限公司 Method for switching display interface of mobile terminal and mobile terminal
US9972304B2 (en)2016-06-032018-05-15Apple Inc.Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en)2016-06-062019-04-02Apple Inc.Intelligent list reading
US10049663B2 (en)2016-06-082018-08-14Apple, Inc.Intelligent automated assistant for media exploration
DK179309B1 (en)2016-06-092018-04-23Apple IncIntelligent automated assistant in a home environment
US10490187B2 (en)2016-06-102019-11-26Apple Inc.Digital assistant providing automated status report
US10192552B2 (en)2016-06-102019-01-29Apple Inc.Digital assistant providing whispered speech
US10586535B2 (en)2016-06-102020-03-10Apple Inc.Intelligent digital assistant in a multi-tasking environment
US10509862B2 (en)2016-06-102019-12-17Apple Inc.Dynamic phrase expansion of language input
US10067938B2 (en)2016-06-102018-09-04Apple Inc.Multilingual word prediction
DK179343B1 (en)2016-06-112018-05-14Apple IncIntelligent task discovery
DK179049B1 (en)2016-06-112017-09-18Apple IncData driven natural language event detection and classification
DK201670540A1 (en)2016-06-112018-01-08Apple IncApplication integration with a digital assistant
DK179415B1 (en)2016-06-112018-06-14Apple IncIntelligent device arbitration and control
US11182853B2 (en)2016-06-272021-11-23Trading Technologies International, Inc.User action for continued participation in markets
US10043516B2 (en)2016-09-232018-08-07Apple Inc.Intelligent automated assistant
US10593346B2 (en)2016-12-222020-03-17Apple Inc.Rank-reduced token representation for automatic speech recognition
US10936872B2 (en)2016-12-232021-03-02Realwear, Inc.Hands-free contextually aware object interaction for wearable display
US11099716B2 (en)2016-12-232021-08-24Realwear, Inc.Context based content navigation for wearable display
US11507216B2 (en)*2016-12-232022-11-22Realwear, Inc.Customizing user interfaces of binary applications
US10620910B2 (en)2016-12-232020-04-14Realwear, Inc.Hands-free navigation of touch-based operating systems
US10530872B1 (en)*2017-01-112020-01-07Facebook, Inc.Methods and systems for determining screen-reader use
DK201770439A1 (en)2017-05-112018-12-13Apple Inc.Offline personal assistant
DK179496B1 (en)2017-05-122019-01-15Apple Inc. USER-SPECIFIC Acoustic Models
DK179745B1 (en)2017-05-122019-05-01Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
DK201770432A1 (en)2017-05-152018-12-21Apple Inc.Hierarchical belief states for digital assistants
DK201770431A1 (en)2017-05-152018-12-20Apple Inc.Optimizing dialogue policy decisions for digital assistants using implicit feedback
DK179549B1 (en)2017-05-162019-02-12Apple Inc.Far-field extension for digital assistant services
USD862511S1 (en)*2017-11-022019-10-08Google LlcDisplay screen or portion thereof with transitional graphical user interface
USD851674S1 (en)*2017-11-172019-06-18Outbrain Inc.Electronic device display or portion thereof with animated graphical user interface
US11379113B2 (en)2019-06-012022-07-05Apple Inc.Techniques for selecting text
WO2022072331A1 (en)2020-09-302022-04-07Neonode Inc.Optical touch sensor
US11061553B1 (en)*2020-12-282021-07-13Dropbox, Inc.Drag and drop quick actions
CN113476822B (en)*2021-06-112022-06-10荣耀终端有限公司Touch method and device
USD1096762S1 (en)*2021-09-152025-10-07Hong Sun YangElectronic device display screen with animated graphical user interface

Citations (24)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4712191A (en)*1982-08-111987-12-08U.S. Philips CorporationDisplay system with nested information display
US4745543A (en)*1981-08-201988-05-17Fischer & Porter Co.Front panel for a process controller
US4821029A (en)*1984-04-261989-04-11Microtouch Systems, Inc.Touch screen computer-operated video display process and apparatus
US5053758A (en)*1988-02-011991-10-01Sperry Marine Inc.Touchscreen control panel with sliding touch control
US5327161A (en)*1989-08-091994-07-05Microtouch Systems, Inc.System and method for emulating a mouse input device with a touchpad input device
US5627567A (en)*1993-04-271997-05-06Hewlett-Packard CompanyMethod and apparatus for adaptive touch recognition in a touch sensitive user interface
US5644628A (en)*1994-03-151997-07-01Alcatel N.V.telecommunications terminal interface for control by predetermined gestures
US5668929A (en)*1993-01-211997-09-16Hirsch Electronics CorporationSpeech activated security systems and methods
US5719936A (en)*1995-03-071998-02-17Siemens AktiengesellschaftCommunication device for mobile operation having a telephone and notebook with display
US5757368A (en)*1995-03-271998-05-26Cirque CorporationSystem and method for extending the drag function of a computer pointing device
US5799068A (en)*1992-06-291998-08-25Elonex I.P. Holdings Ltd.Smart phone integration with computer systems
US5812118A (en)*1996-06-251998-09-22International Business Machines CorporationMethod, apparatus, and memory for creating at least two virtual pointing devices
US5856822A (en)*1995-10-271999-01-0502 Micro, Inc.Touch-pad digital computer pointing-device
US6061050A (en)*1995-10-272000-05-09Hewlett-Packard CompanyUser interface device
US6256020B1 (en)*1997-03-312001-07-03G & R Associates IncorporatedComputer-telephony integration employing an intelligent keyboard and method for same
US20010012769A1 (en)*1997-11-272001-08-09Jukka SirolaWireless communication device and a method of manufacturing a wireless communication device
US6295052B1 (en)*1996-02-192001-09-25Misawa Homes Co., Ltd.Screen display key input unit
US6310613B1 (en)*1998-05-262001-10-30Yamatake CorporationMethod and apparatus for changing numeric values on a display device
US20010040587A1 (en)*1993-11-152001-11-15E. J. ScheckTouch control of cursor position
US6346935B1 (en)*1998-09-142002-02-12Matsushita Electric Industrial Co., Ltd.Touch-sensitive tablet
US6445383B1 (en)*1998-02-092002-09-03Koninklijke Philips Electronics N.V.System to detect a power management system resume event from a stylus and touch screen
US6480964B1 (en)*1998-08-202002-11-12Samsung Electronics Co., Ltd.User interface power management control technique for a computer system
US6609146B1 (en)*1997-11-122003-08-19Benjamin SlotznickSystem for automatically switching between two executable programs at a user's computer interface during processing by one of the executable programs
US6757002B1 (en)*1999-11-042004-06-29Hewlett-Packard Development Company, L.P.Track pad pointing device with areas of specialized function

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4733222A (en)*1983-12-271988-03-22Integrated Touch Arrays, Inc.Capacitance-variation-sensitive touch sensing array system
JPS6370326A (en)*1986-09-121988-03-30Wacom Co LtdPosition detector
DE4138815A1 (en)*1991-11-261993-05-27Ego Elektro Blanc & Fischer BASE FOR AN ELECTROMECHANICAL FUNCTIONAL UNIT
US5543588A (en)*1992-06-081996-08-06Synaptics, IncorporatedTouch pad driven handheld computing device
US6028271A (en)1992-06-082000-02-22Synaptics, Inc.Object position detector with edge motion feature and gesture recognition
JP2525546B2 (en)*1992-09-081996-08-21インターナショナル・ビジネス・マシーンズ・コーポレイション Graphic resource editor
JP2813728B2 (en)*1993-11-011998-10-22インターナショナル・ビジネス・マシーンズ・コーポレイション Personal communication device with zoom / pan function
JP3546337B2 (en)*1993-12-212004-07-28ゼロックス コーポレイション User interface device for computing system and method of using graphic keyboard
US5500935A (en)*1993-12-301996-03-19Xerox CorporationApparatus and method for translating graphic objects and commands with direct touch input in a touch based input system
WO1996009579A1 (en)*1994-09-221996-03-28Izak Van CruyningenPopup menus with directional gestures
US5721853A (en)1995-04-281998-02-24Ast Research, Inc.Spot graphic display element with open locking and periodic animation
US5689667A (en)*1995-06-061997-11-18Silicon Graphics, Inc.Methods and system of controlling menus with radial and linear portions
US5790115A (en)*1995-09-191998-08-04Microsoft CorporationSystem for character entry on a display screen
US5745116A (en)*1996-09-091998-04-28Motorola, Inc.Intuitive gesture-based graphical user interface
US6057844A (en)*1997-04-282000-05-02Adobe Systems IncorporatedDrag operation gesture controller
JPH11110480A (en)*1997-07-251999-04-23Kuraritec CorpMethod and device for displaying text
US6433801B1 (en)*1997-09-262002-08-13Ericsson Inc.Method and apparatus for using a touch screen display on a portable intelligent communications device
US6101498A (en)*1997-11-172000-08-08International Business Machines Corp.System for displaying a computer managed network layout with a first transient display of a user selected primary attribute of an object and a supplementary transient display of secondary attributes
US6104400A (en)*1997-12-302000-08-15International Business Machines CorporationLarge tree structure visualization and display system
US6819345B1 (en)*1998-02-172004-11-16Microsoft CorporationManaging position and size for a desktop component
US6429846B2 (en)*1998-06-232002-08-06Immersion CorporationHaptic feedback for touchpads and other touch controls
US6707443B2 (en)*1998-06-232004-03-16Immersion CorporationHaptic trackball device
KR100553671B1 (en)*1998-06-272006-05-10삼성전자주식회사Method for driving pointing device of computer system
US6496206B1 (en)*1998-06-292002-12-17Scansoft, Inc.Displaying thumbnail images of document pages in an electronic folder
US20020018051A1 (en)*1998-09-152002-02-14Mona SinghApparatus and method for moving objects on a touchscreen display
US6337698B1 (en)*1998-11-202002-01-08Microsoft CorporationPen-based interface for a notepad computer
US6239803B1 (en)*1999-04-142001-05-29Stanley W. DriskellMethod to achieve least effort selection from an item list of arbitrary length
US6580442B1 (en)*1999-12-012003-06-17Ericsson Inc.Touch-based information processing device and method
US7003734B1 (en)*2000-05-052006-02-21Point Roll, Inc.Method and system for creating and displaying images including pop-up images on a visual display
US9171851B2 (en)*2000-08-082015-10-27The Directv Group, Inc.One click web records
US6915489B2 (en)*2001-03-282005-07-05Hewlett-Packard Development Company, L.P.Image browsing using cursor positioning

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4745543A (en)*1981-08-201988-05-17Fischer & Porter Co.Front panel for a process controller
US4712191A (en)*1982-08-111987-12-08U.S. Philips CorporationDisplay system with nested information display
US4821029A (en)*1984-04-261989-04-11Microtouch Systems, Inc.Touch screen computer-operated video display process and apparatus
US5053758A (en)*1988-02-011991-10-01Sperry Marine Inc.Touchscreen control panel with sliding touch control
US5327161A (en)*1989-08-091994-07-05Microtouch Systems, Inc.System and method for emulating a mouse input device with a touchpad input device
US5799068A (en)*1992-06-291998-08-25Elonex I.P. Holdings Ltd.Smart phone integration with computer systems
US5668929A (en)*1993-01-211997-09-16Hirsch Electronics CorporationSpeech activated security systems and methods
US5627567A (en)*1993-04-271997-05-06Hewlett-Packard CompanyMethod and apparatus for adaptive touch recognition in a touch sensitive user interface
US20010040587A1 (en)*1993-11-152001-11-15E. J. ScheckTouch control of cursor position
US5644628A (en)*1994-03-151997-07-01Alcatel N.V.telecommunications terminal interface for control by predetermined gestures
US5719936A (en)*1995-03-071998-02-17Siemens AktiengesellschaftCommunication device for mobile operation having a telephone and notebook with display
US5757368A (en)*1995-03-271998-05-26Cirque CorporationSystem and method for extending the drag function of a computer pointing device
US5856822A (en)*1995-10-271999-01-0502 Micro, Inc.Touch-pad digital computer pointing-device
US6061050A (en)*1995-10-272000-05-09Hewlett-Packard CompanyUser interface device
US6295052B1 (en)*1996-02-192001-09-25Misawa Homes Co., Ltd.Screen display key input unit
US5812118A (en)*1996-06-251998-09-22International Business Machines CorporationMethod, apparatus, and memory for creating at least two virtual pointing devices
US6256020B1 (en)*1997-03-312001-07-03G & R Associates IncorporatedComputer-telephony integration employing an intelligent keyboard and method for same
US6609146B1 (en)*1997-11-122003-08-19Benjamin SlotznickSystem for automatically switching between two executable programs at a user's computer interface during processing by one of the executable programs
US20010012769A1 (en)*1997-11-272001-08-09Jukka SirolaWireless communication device and a method of manufacturing a wireless communication device
US6445383B1 (en)*1998-02-092002-09-03Koninklijke Philips Electronics N.V.System to detect a power management system resume event from a stylus and touch screen
US6310613B1 (en)*1998-05-262001-10-30Yamatake CorporationMethod and apparatus for changing numeric values on a display device
US6480964B1 (en)*1998-08-202002-11-12Samsung Electronics Co., Ltd.User interface power management control technique for a computer system
US6346935B1 (en)*1998-09-142002-02-12Matsushita Electric Industrial Co., Ltd.Touch-sensitive tablet
US6757002B1 (en)*1999-11-042004-06-29Hewlett-Packard Development Company, L.P.Track pad pointing device with areas of specialized function

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9203951B1 (en)*2014-07-032015-12-01International Business Machines CorporationMobile telephone adapted for use with one hand
USD784379S1 (en)*2015-11-252017-04-18General Electric CompanyDisplay panel or portion thereof with transitional graphical user interface
USD784381S1 (en)*2015-11-252017-04-18General Electric CompanyDisplay panel or portion thereof with transitional graphical user interface
USD785657S1 (en)*2015-11-252017-05-02General Electric CompanyDisplay screen or portion thereof with transitional graphical user interface
USD831055S1 (en)2015-11-252018-10-16General Electric CompanyDisplay screen or portion thereof with graphical user interface
USD839883S1 (en)2015-11-252019-02-05General Electric CompanyDisplay screen or portion thereof with graphical user interface
USD849766S1 (en)2015-11-252019-05-28General Electric CompanyDisplay screen or portion thereof with graphical user interface
US10423293B2 (en)*2015-11-252019-09-24International Business Machines CorporationControlling cursor motion

Also Published As

Publication number | Publication date
GB2380918B (en)2003-06-18
GB2380918C3 (en)2016-03-30
GB2380918C (en)2011-12-28
AU785203B2 (en)2006-11-02
US20030197744A1 (en)2003-10-23
NZ523065A (en)2004-11-26
US8549443B2 (en)2013-10-01
ATE338300T1 (en)2006-09-15
EP1285330A1 (en)2003-02-26
GB0228342D0 (en)2003-01-08
WO2002005081A1 (en)2002-01-17
GB2380918A (en)2003-04-16
AU5646701A (en)2002-01-21
US20160246451A9 (en)2016-08-25
DE60122708D1 (en)2006-10-12
US20110093819A1 (en)2011-04-21
GB2380918C2 (en)2013-09-25
HK1057106A1 (en)2004-03-12
CA2412578A1 (en)2002-01-17
EP1285330B1 (en)2006-08-30
US7818691B2 (en)2010-10-19

Similar Documents

Publication | Publication Date | Title
US8549443B2 (en)Zeroclick
Kinnear et al.SPSS for Windows made simple
US5745717A (en)Graphical menu providing simultaneous multiple command selection
US9836192B2 (en)Identifying and displaying overlay markers for voice command user interface
US5798760A (en)Radial graphical menuing system with concentric region menuing
US5790820A (en)Radial graphical menuing system
US20230385523A1 (en)Manipulation of handwritten content on an electronic device
WO2022197459A1 (en)Converting text to digital ink
US20150026609A1 (en)Idea wheel-based data creating apparatus
Costagliola et al.A technique for improving text editing on touchscreen devices
EP4309071A1 (en)Duplicating and aggregating digital ink instances
EP4309148A1 (en)Submitting questions using digital ink
EP4309069A1 (en)Linking digital ink instances using connecting lines
UddinImproving Multi-Touch Interactions Using Hands as Landmarks
WO2022197443A1 (en)Setting digital pen input mode using tilt angle
JP3937682B2 (en) Information processing device
ZA200209992B (en)Zeroclick.
Lewis et al.Handheld electronic devices
Costagliola et al.The design and evaluation of a text editing technique for stylus-based tablets
MalacriaWhy interaction methods should be exposed and recognizable
WebbPhrasing Bimanual Interaction for Visual Design
CordwellImproving access to computers for students with disabilities: Features available in the Windows 7 operating system
WO2022197436A1 (en)Ink grouping reveal and select
JP2001290883A (en)Medical examination assisting device

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:INTERAD TECHNOLOGIES, LLC, TEXAS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IRVINE, NES STEWART, DR;REEL/FRAME:036639/0428

Effective date:20150514

Owner name:ZEROCLICK, LLC, TEXAS

Free format text:CHANGE OF NAME;ASSIGNOR:INTERAD TECHNOLOGIES, LLC;REEL/FRAME:036674/0399

Effective date:20150519

STPP | Information on status: patent application and granting procedure in general

Free format text:DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general

Free format text:NON FINAL ACTION MAILED

STCB | Information on status: application discontinuation

Free format text:ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

