
Method and apparatus for providing translucent images on a computer display

Info

Publication number
USRE41922E1
Authority
US
United States
Prior art keywords
image
translucent
screen
window
overlay
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US10/163,748
Inventor
Michael L. Gough
Joseph J. MacDougald
Gina D. Venolia
Thomas S. Gilley
Greg M. Robbins
Daniel J. Hansen, Jr.
Abhay Oswal
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License. https://patents.darts-ip.com/?family=22030359&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=USRE41922(E1)
Priority claimed from US08/130,079 (US6072489A)
Application filed by Apple Inc
Priority to US10/163,748 (USRE41922E1)
Assigned to APPLE INC. (change of name, see document for details); assignors: APPLE COMPUTER, INC.
Priority to US12/437,500 (USRE44241E1)
Application granted
Publication of USRE41922E1
Priority to US13/874,286 (USRE45630E1)
Anticipated expiration
Legal status: Expired - Lifetime (current)

Abstract

A method and apparatus is described for producing a translucent image over a base image created on the display screen of a computer system by a selected first application program, and conducting image operations either on the base image created by the selected application program with reference to the translucent image produced, or conducting image operations on the translucent image with reference to the base image of the first application program. The first application program runs on a central processing unit (CPU) of a computer system to produce a base image, and another application program referred to as the overlay program is run to produce the translucent image such that portions of the base image which are overlapped by the overlay image are at least partially visible through the translucent image. There is also a mechanism for blending the first video data and the second video data to produce a blended image on the screen assembly.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a broadening reissue of U.S. Pat. No. 6,072,489, issued on Jun. 6, 2000. U.S. Pat. No. 6,072,489 is a continuation-in-part of patent application Ser. No. 08/060,572, filed May 10, 1993 under the title “Method and Apparatus for Displaying an Overlay Image,” now U.S. Pat. No. 5,638,501, on behalf of Gough et al. and assigned to the same assignee as herein, the disclosure of which is hereby incorporated herein by reference in its entirety. Priority rights and claims of benefit based upon this earlier-filed patent application are claimed. More than one reissue application has been filed for the reissue of U.S. Pat. No. 6,072,489. The reissue applications are application Ser. Nos. 10/163,748 (the present application) and 12/437,500, a continuation reissue of U.S. Pat. No. 6,072,489.
BACKGROUND OF THE INVENTION
This invention relates generally to computer systems, and more particularly to computer systems utilizing graphical user interfaces.
Graphical user interfaces, or GUIs, are becoming increasingly popular with computer users. It is generally accepted that computers having graphical user interfaces are easier to use, and that it is quicker to learn an application program in a GUI environment than in a non-GUI environment.
A relatively new type of computer which is well suited for graphical user environments is the pen-based or pen-aware computer system, hereinafter generically referred to as a “pen computer system,” “pen computer,” or the like. A pen-based computer system is typically a small, hand-held computer where the primary method for inputting data includes a “pen” or stylus. A pen-aware computer system is one which has been modified to accept pen inputs in addition to traditional input methods.
A pen computer system is often housed in a relatively flat enclosure, and has a dual-function display assembly which serves as both an input device and an output device. When operating as an input device, the display assembly senses the position of the tip of a stylus on the viewing screen and provides this positional information to the computer's central processing unit (CPU). Some display assemblies can also sense the pressure of the stylus on the screen to provide further information to the CPU. When operating as an output device, the display assembly presents computer-generated images on the screen.
Typically, graphical images can be input into the pen computer systems by merely moving the stylus across the surface of the screen, i.e. making a “stroke” on the screen. A stroke can be defined as the engagement of the screen with a stylus, the movement of the stylus across the screen (if any), and its subsequent disengagement from the screen. As the CPU senses the position and movement of the stylus, it can generate a corresponding image on the screen to create the illusion that the stylus is drawing the image directly upon the screen, i.e., that the stylus is “inking” an image on the screen. With suitable recognition software, text and numeric information can also be entered into the pen-based computer system in a similar fashion. Methods for recognizing the meaning of “ink” are well known to those skilled in the art.
Pen computer systems tend to discourage the use of a keyboard as an input device. Most of the software written for pen computers is designed to function well with pen strokes and by “tapping” the stylus against the computer screen in defined areas. A “tap” is a stroke which does not move substantially across the screen. In addition, a primary feature of many pen computer systems is their portability, which a keyboard, if included with the pen system, would seriously degrade.
In some instances, however, the need arises on a pen-based computer for data entry in a keyboard-like fashion. For example, the pen-based computer might be running a non-pen aware program that normally accepts characters from a keyboard. Also, in some cases, the only way to enter data efficiently might be to use a keyboard-like input device.
In particular, a need might arise on a pen computer to enter a command or character that is normally or most efficiently executed with keystrokes on a keyboard-based system. In some pen computer systems, such keyboard-like entry of commands can be accomplished using a keyboard image displayed on the screen of the pen computer. The keyboard image resembles a standard keyboard, and keys are selected using a stylus. Most keyboard commands and characters can be entered in this fashion. Another alternative is to provide a recognition window for inputting handwritten data which is then recognized and sent to an application program as if it were typed from a keyboard. A problem with all such input approaches is that they occupy valuable screen space, which is often very limited on pen computer systems.
The efficient use of the available display screen space for observation of images and windows containing images, while particularly pronounced for pen computer systems, is common to all computer systems which display information or images to the user. No matter how large a particular display may be, a particular user will be tempted to attempt to display more information on the screen than can effectively be handled.
Images or information presented on a display screen are typically presented as opaque images, i.e., images “behind” a displayed image are obscured. This is the case with display windows which are layered on a particular screen, with the uppermost window image partially or completely blocking the view of the lower windows. For two windows to be capable of interaction, it is preferable that the user be able to observe both images at the same time, or at close to the same time.
SUMMARY OF THE INVENTION
The present invention provides for the selective creation, establishment, and processing of opaque and translucent images and opaque and translucent windows independently or in connection with other translucent images or a base opaque image provided on a display screen of a computer system. The provision of the translucent image of the present invention makes it possible to optimize space usage of the computer screen itself. Further, the invention also advantageously allows a translucent image to be formed proximate to and with specific reference to particular elements of opaque application images beneath it.
The invention further includes a method for providing a translucent image on the screen of a computer system including the steps of: 1) displaying a translucent image on the screen such that at least one opaque image can be seen through the translucent image, and 2) conducting operations with respect to either the translucent image or upon opaque images on the screen of the computer system. Both translucent and opaque image fields can be employed, which can each be completely blank without any features or elements. Particular operations upon images are considered to be image operations in regions or domains which are defined to be either translucent or opaque regions. Further, the translucent image involved may be a so-called “overlay” image produced by a computer implemented process of the present invention referred to herein as the “overlay utility.”
The present invention additionally provides a transparent overlay image over a base image provided on a screen of a pen computer system. The overlay image can serve as an input device for application programs without obscuring images made on the screen by the application programs. The provision of the transparent overlay image of the present invention makes it possible to use much or all of the screen of the pen computer system for input. It also advantageously allows controls in the overlay image to be formed proximate to specific elements of application images beneath it.
A method for providing an overlay image on the screen of a computer system in accordance with the present invention includes the steps of: 1) displaying a base image on the screen of the computer system; and 2) displaying an overlay image on the screen such that overlapped portions of the application image can be seen through the overlay image. Preferably, the base image is produced by an unmodified application program running on the computer system, and the overlay image is produced by a computer implemented process of the present invention referred to herein as the “overlay utility”.
A method for displaying images on a screen of a selected computer system in accordance with the present invention includes the steps of: 1) running an application program on a central processing unit (CPU) of a computer system to produce a base opaque image on a screen coupled to the CPU; and 2) running an overlay program on the CPU to produce a translucent image on the screen such that portions of an opaque base image which are overlapped by the overlay image are at least partially visible through the overlay image. Preferably, the step of running the overlay program includes the steps of: 1) displaying a translucent image on the screen; 2) intercepting screen inputs which contact the overlay image; 3) processing the intercepted screen inputs in the CPU; and 4) updating the application program based upon the processed screen inputs. The step of displaying a translucent image preferably involves the blending of a translucent image with the base image. In one embodiment of the present invention, the blending is accomplished within the CPU, and in another embodiment of the present invention, the blending is accomplished externally to the CPU in specialized video driver circuitry.
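The four overlay-program steps just listed can be pictured as a simple control loop. The following C sketch is illustrative only: the toy keyboard-like overlay, the hit test, and the "application" state are hypothetical stand-ins used to show the display-intercept-process-update flow, not the patented implementation.

    /* Toy sketch of the overlay steps: display, intercept, process, update.
     * Everything here is a stand-in used only to show the control flow. */
    #include <stdbool.h>
    #include <stdio.h>

    typedef struct { int x, y; } Point;
    typedef struct { int left, top, right, bottom; char key; } OverlayCell;

    static char application_text[32];   /* state of the "application program" */
    static int  app_len;

    static bool hit(Point p, OverlayCell c) {
        return p.x >= c.left && p.x < c.right && p.y >= c.top && p.y < c.bottom;
    }

    static void run_overlay(const OverlayCell *cells, int n, const Point *taps, int n_taps) {
        /* step 1: the translucent image would be blended over the base image here */
        for (int t = 0; t < n_taps; t++) {
            for (int i = 0; i < n; i++) {
                if (hit(taps[t], cells[i])) {             /* step 2: intercept input */
                    char ch = cells[i].key;               /* step 3: process input   */
                    application_text[app_len++] = ch;     /* step 4: update the app  */
                }
            }
        }
    }

    int main(void) {
        OverlayCell kbd[] = { {0, 0, 10, 10, 'h'}, {10, 0, 20, 10, 'i'} };
        Point taps[] = { {5, 5}, {15, 5} };
        run_overlay(kbd, 2, taps, 2);
        printf("application received: %.*s\n", app_len, application_text);
        return 0;
    }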
A computer system in accordance with the present invention includes a central processing unit (CPU), a screen assembly coupled to the CPU, a mechanism coupled to the screen assembly for displaying a base image on the screen assembly, and a mechanism coupled to the screen assembly for displaying a translucent image on the screen assembly such that portions of the base image which are overlapped by the overlay image are at least partially visible through the overlay image. Preferably, the screen assembly includes an LCD matrix display provided with input from a stylus, a pen, a trackball, a mouse, or a keyboard, as the case may be.
In the computer system of the present invention, the mechanism for displaying the opaque base image preferably includes a first computer implemented process running on the CPU to produce first video data, and video driver circuitry coupled between the CPU and the screen assembly, which is receptive to the first video data. Also preferably, the mechanism for displaying the translucent image includes a second computer implemented process running on the CPU producing second video data, wherein the video driver circuitry is also receptive to the second video data. The computer system blends the first video data and the second video data to produce a blended image on the screen assembly. In one embodiment of the present invention, the blending is part of the second computer implemented process running on the CPU. In another embodiment of the present invention, the blending is accomplished within the hardware of the video driver circuitry.
The computer system according to the invention includes a central processing unit (CPU), a screen for displaying images, the screen being coupled to said CPU, a display coupled to the screen for displaying a translucent image, and an arrangement for conducting image operations beneath the level of a translucent image produced by the display. The computer system may, for example, according to one embodiment, be effective to perform image operations with reference to a translucent image on the screen. The computer system according to the invention may further include a screen coupled to the CPU, a display coupled to the screen for displaying a translucent image on the screen, and an arrangement for conducting image operations with reference to a translucent image or an opaque image on the display. The computer system may further include an arrangement effective for conducting selectable image operations with reference to a translucent image or an opaque image on a display screen.
An advantage of the present invention is that a translucent overlay can be provided which permits a user to input data into an active application program without obscuring the user's view of the program's display window. The overlay image of the present invention is therefore well suited for computer systems having limited display areas, including for example pen computer systems.
Another advantage of the overlay image of the present invention is that it works with both pen-aware and non-pen-aware application programs. Therefore, the overlay image of the present invention can be used with the many thousands of application programs which are not designed to be used in pen computer systems.
These and other advantages of the present invention will become apparent upon reading the following detailed descriptions and studying the various figures of the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a pen computer system in accordance with the present invention;
FIG. 2 is a flow diagram illustrating the process for launching an application program and the steps of handling opaque and translucent images and cursor operations;
FIG. 3a illustrates an Apple Computer display screen with a single non-translucent overlay window shown on one portion of the screen, and a gadget bar including a wand icon for transforming the overlay window between opaque and translucent states;
FIG. 3b illustrates an Apple Computer display screen with a pair of overlapping non-translucent windows shown on one portion of the screen, and a gadget bar including a wand icon for transforming the overlay window between opaque and translucent states;
FIG. 3c illustrates an Apple Computer display screen with a pair of overlapping windows shown on one portion of the screen, the overlaying window having been rendered translucent, the opaque window portion within the overlapping region of the two windows having the image of a circle displayed, and a gadget bar including a wand icon for transforming the overlay window between opaque and translucent states;
FIG. 3d illustrates an Apple Computer display screen with a pair of overlapping windows shown on one portion of the screen, the overlaying window having been rendered translucent, the opaque window portion within the overlapping region of the two windows having the image of a circle displayed, there being an additional circle image traced over the underlying circle in the opaque window, that additional circle being traced as an image in the translucent window which is translucently superimposed over the opaque window in the overlap region of the two windows, and a gadget bar including a wand icon for transforming the overlay window between opaque and translucent states;
FIG. 3e illustrates an Apple Computer display screen with a pair of overlapping opaque windows shown on one portion of the screen, the overlaying opaque window displaying the traced circle made during the window's translucent phase, and a gadget bar including a wand icon for transforming the overlay window between opaque and translucent states;
FIG. 3f illustrates an Apple Computer display screen with a single non-translucent window shown on one portion of the screen, and a gadget bar including a wand icon for transforming the overlay window between opaque and translucent states;
FIG. 3g illustrates an Apple Computer display screen with a pair of overlapping windows shown on one portion of the screen, the overlay window of the pair being translucent and having a circle image in the overlapping region of the two windows, and a gadget bar including a wand icon for transforming the overlay window between opaque and translucent states;
FIG. 3h illustrates an Apple Computer display screen with a single non-translucent window shown on one portion of the screen, the non-translucent window including the image of a circle which was created by tracing under the translucent circle image shown in FIG. 3g, and a gadget bar including a wand icon;
FIG. 3i illustrates the display screen of the prior figures including an opaque window having an overlay translucent window superimposed thereover with a predetermined translucent image, in this case the legend “TOP SECRET;”
FIG. 4 illustrates the coordinate space on which images are expressed for loading onto a video random access memory (VRAM) for presentation on a display screen;
FIG. 5a is a flow diagram showing the basic steps to accomplish presentation of a translucent image according to the invention herein;
FIG. 5b is a diagram illustrating the process of displaying a translucent image, according to the invention herein;
FIG. 5c is a diagram illustrating the process of performing an overlay shield cursor patch operation as discussed herein;
FIG. 6a is a flow diagram showing the implementation of translucent overlay image operations;
FIG. 6b is a flow diagram illustrating the “Overlay Shield Cursor Patch” step of FIG. 5b;
FIG. 7 is a flow diagram illustrating the “Overlay Show Cursor Patch” step of FIG. 5b;
FIG. 8 is a flow diagram illustrating the “Blending Engine” of FIG. 5b;
FIGS. 8a-8f illustrate a computer-implemented blending process;
FIG. 9 illustrates an alternate embodiment of the “Display an Overlay Image” step of FIG. 6b;
FIG. 10 illustrates the operation of the “Blending Engine” of FIG. 9;
FIG. 11 is a flow diagram illustrating the “Overlay Shield Cursor Patch” step of FIG. 9;
FIG. 12a illustrates a known memory management unit (MMU) data structure;
FIG. 12b illustrates a modification to the MMU data structures used to implement the “Redirect Drawing to RAM” step 226 of FIG. 11;
FIG. 13 is a flow diagram illustrating the operation of the “Blending Engine” 190 of FIG. 9;
FIG. 14 is a flow diagram illustrating the “Overlay System Task Patch” step of FIG. 9;
FIGS. 15a and 15b illustrate a RAM memory pool format used in the present invention;
FIG. 16 is a flow diagram illustrating the process of moving an image from the overlay screen to the system screen, as well as the use of VRAM memory after a blending operation to produce a blended image on the display screen; and
FIG. 17 is a flow diagram showing the process of handling cursor setting between system and overlay modes of operation.
FIG. 18 is a view of a Macintosh computer screen showing a desktop, a window produced by an application program called “AppleShare” and a utility program known as “PenBoard”;
FIG. 19 illustrates a non-transparent overlay which mostly obscures the desktop and window of the AppleShare application program;
FIG. 20 illustrates the overlay keyboard after it has been made transparent by the method and apparatus of the present invention;
FIGS.21a-21c illustrate the entry of data to the active window of the AppleShare program;
FIG. 22 is a diagram illustrating the “Display an Overlay Image” step 138 of FIG. 6b;
FIG. 23 illustrates an alternate embodiment of the “Display an Overlay Image” step 138 of FIG. 6b;
FIG. 24 illustrates the operation of the “Blending Engine” 1190 of FIG. 23;
FIG. 25 illustrates video driver circuitry of a prior art Macintosh computer system produced by Apple Computer, Inc. of Cupertino, Calif.; and
FIG. 26 illustrates video driver circuitry in accordance with the present invention which provides overlay VRAM and blending capabilities.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
As shown in FIG. 1, a computer system 10 in accordance with the present invention includes a central processing unit (CPU) 12, read only memory (ROM) 14, random access memory (RAM) 16, expansion RAM 17, input/output (I/O) circuitry 18, display assembly 20, and expansion bus 22. The computer system 10 may also optionally include a mass storage unit 24 such as a disk drive unit or nonvolatile memory such as flash memory and a real-time clock 26.
The CPU 12 is preferably a commercially available, single chip microprocessor, and is preferably a complex instruction set computer (CISC) chip such as the 68040 microprocessor available from Motorola, Inc. CPU 12 is coupled to ROM 14 by a data bus 28, control bus 29, and address bus 31. ROM 14 contains the basic operating system for the computer system 10. CPU 12 is also connected to RAM 16 by busses 28, 29, and 31 to permit the use of RAM 16 as scratch pad memory. Expansion RAM 17 is optionally coupled to RAM 16 for use by CPU 12. CPU 12 is also coupled to the I/O circuitry 18 by data bus 28, control bus 29, and address bus 31 to permit data transfers with peripheral devices.
I/O circuitry 18 typically includes a number of latches, registers and direct memory access (DMA) controllers. The purpose of I/O circuitry 18 is to provide an interface between CPU 12 and such peripheral devices as display screen assembly 20 and mass storage 24.
Display assembly 20 of computer system 10 is both an input and an output device. Accordingly, it is coupled to I/O circuitry 18 by a bi-directional data bus 36. When operating as an output device, the display assembly 20 receives data from I/O circuitry 18 via bus 36 and displays that data on a suitable screen. The screen for display assembly 20 can be a liquid crystal display (LCD) of the type commercially available from a variety of manufacturers. The input device (“tablet”) of a preferred display assembly 20 in accordance with the invention can be a thin, clear membrane which covers the LCD display and which is sensitive to the position of a stylus 38 on its surface. Alternatively, the tablet can be an embedded RF digitizer activated by an “active” RF stylus. Combination display assemblies are available from a variety of vendors.
Other types of user inputs can also be used in conjunction with the present invention. While the method of the present invention is described in the context of a pen system, other pointing devices such as a computer mouse, a track ball, or a tablet can be used to manipulate a pointer or a cursor 39 on a screen of a general purpose computer. Therefore, as used herein, the terms “pointer,” “pointing device,” “pointer inputs” and the like will refer to any mechanism or device for pointing to a particular location on a screen of a computer display.
Some type of mass storage 24 is generally considered desirable. However, the mass storage 24 can be eliminated by providing a sufficient amount of RAM 16 and expansion RAM 17 to store user application programs and data. In that case, RAMs 16 and 17 can be provided with a backup battery to prevent the loss of data even when the computer system 10 is turned off. However, it is generally desirable to have some type of long term storage 24 such as a commercially available miniature hard disk drive, nonvolatile memory such as flash memory, battery-backed RAM, PC-data cards, or the like.
In operation, information is input into the computer system 10 by “writing” on the screen of display assembly 20 with stylus 38. Information concerning the location of the stylus 38 on the screen of the display assembly 20 is input into the CPU 12 via I/O circuitry 18. Typically, this information comprises the Cartesian (i.e., x & y) coordinates of a pixel of the screen of display assembly 20 over which the tip of the stylus is positioned. Commercially available combination display assemblies include appropriate circuitry to provide the stylus location information as digitally encoded data to the I/O circuitry of the present invention. The CPU 12 then processes the data under control of an operating system and possibly an application program stored in ROM 14 and/or RAM 16. The CPU 12 then produces data which is output to the display assembly 20 to produce appropriate images on its screen.
Expansion bus 22 is coupled to the data bus 28, the control bus 29 and the address bus 31, similar to the other components in system 10. Expansion bus 22 provides extra ports to couple devices such as modems, display switches, microphone, speaker, etc., to the CPU 12.
FIG. 2 is a flow diagram illustrating the process for launching a selected application program on computer system 10 and the steps of handling opaque and translucent images and cursor operations in accordance with the invention herein. By launching, it is meant to begin execution and perform a range of activities typically considered ancillary to beginning execution, including for example conducting appropriate memory allocation activities. The application program can be any of a number of application programs effective for producing images or windows on display screen 20. Typically, the images or windows produced will be opaque or translucent and full tone, but half-tone and partial tone images are workable with the invention herein as well, irrespective of the particular color or whether a black and white image system is employed. The process of the selected application program begins at start step 40, and the application program launches operation at step 42. According to a preferred version of the invention, the selected application program then displays a desired image on display screen 20, or even the lack of any image, i.e., a blank image, according to step 44. The image launched is preferably opaque, but could be translucent, according to another version of the invention.
Next, a “process cursor” operation is undertaken, according to step 46. According to this step, as will be noted in greater detail below, particularly with reference to FIG. 17, it is determined whether the cursor 39 is in a region of a coordinate system of the computer associated with a translucent image domain or whether cursor 39 is operating within a monitor coordinate space. FIG. 4 and the corresponding text discuss the coordinate system in greater detail. It is typically considered true that computer operations are conducted with regard to the particular region in which the cursor is operative. While this is generally true, it is considered to be within the scope of this invention for the cursor to act upon images or windows that are either above or below the actual cursor 39.
Next, according to step 48, it is determined whether or not an overlay task is requested. If not, process control returns to point A preceding the process cursor step 46, and system operation cycles through the process cursor step 46 and decision step 48 repeatedly until an overlay task is requested in step 48.
Two overlay tasks in accordance with the present invention include a “translucent request” and an “opaque request.” If there is a translucent request, then step 50 undertakes the operation of rendering a desired image translucent. Similarly, if there is an opaque request, then step 52 is undertaken to render a desired image opaque. After completing either step 50 or 52, control returns to point A with a subsequent process cursor operation being conducted according to step 46. The essential functions of the process cursor operation are as expressed with reference to FIG. 17 below. In particular, as will be seen, these include making a determination as to whether to enter the reactive mode. If the reactive mode is in fact indicated, a determination is made as to whether the cursor 39 is within the bounds of an overlay or translucent image. If the cursor is within the bounds of an overlay image or a translucent image, the cursor 39 is set to be on the overlay or translucent image. If the cursor is not within the bounds of a translucent image, the cursor 39 is set to be on the system monitor. Once the correct situs of the cursor 39 has been established, the process is considered to be complete.
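The cursor-situs decision just described reduces to a bounds test against the translucent images when the reactive mode is in effect. A minimal C sketch follows; the types, function names, and single-rectangle overlay are assumptions for illustration rather than the actual FIG. 17 routine.

    /* Decide whether the cursor operates on the overlay or on the system monitor. */
    #include <stdbool.h>
    #include <stdio.h>

    typedef struct { int x, y; } Point;
    typedef struct { int left, top, right, bottom; } Rect;

    typedef enum { CURSOR_ON_SYSTEM, CURSOR_ON_OVERLAY } CursorSitus;

    static bool point_in_rect(Point p, Rect r) {
        return p.x >= r.left && p.x < r.right && p.y >= r.top && p.y < r.bottom;
    }

    static CursorSitus process_cursor(Point cursor, bool reactive_mode,
                                      const Rect *overlay_bounds, int n_overlays) {
        if (reactive_mode) {
            for (int i = 0; i < n_overlays; i++) {
                if (point_in_rect(cursor, overlay_bounds[i]))
                    return CURSOR_ON_OVERLAY;   /* operate on the translucent image */
            }
        }
        return CURSOR_ON_SYSTEM;                /* operate on the opaque base image */
    }

    int main(void) {
        Rect overlay = { 100, 100, 300, 250 };  /* one translucent window */
        Point pen    = { 150, 180 };
        CursorSitus s = process_cursor(pen, true, &overlay, 1);
        printf("cursor is on %s\n", s == CURSOR_ON_OVERLAY ? "overlay" : "system");
        return 0;
    }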
To indicate the implementation of the invention in greater detail, FIG. 3a illustrates an Apple Computer Macintosh display screen 60 with a single non-translucent window 62 shown on one portion of screen 60, and a gadget bar 64 including a wand icon 66 for transforming selected image windows between opaque and translucent states. Window 62 encloses an image, in this case a circle 68, for example. This circle 68 is considered to represent an arbitrary image of interest to the user. Window 62 can be considered to be an image produced by a first application or “APP#1” program selected by the user. This image production is described in detail in co-pending patent application Ser. No. 08/060,438, filed May 10, 1993 under the title “Interfacing with a Computer System” on behalf of Gough et al. and assigned to the same assignee as herein, the disclosure of which is hereby incorporated herein by reference in its entirety.
FIG. 3b illustrates an Apple Computer Macintosh display screen 60 with a pair of overlapping non-translucent, i.e., opaque windows, respectively, 62 and 70, shown on one portion of screen 60. Window 62 is produced by a first application program “APP#1,” and window 70 is produced by a second application program “APP#2.” Gadget bar 64 is shown including wand icon 66 as in FIG. 3a. Wand icon 66 is effective for transforming either of windows 62 or 70, or the images which may reside in the respective windows, between opaque and translucent states. The topmost or “active” window 70 is shown superimposed over a portion of lower window 62. Typically, the window selected for translucency is the uppermost or “overlay” window 70, as this permits selected images in the overlapped region of the two windows to be seen by virtue of the translucency of the uppermost window 70. As it is, prior to window 70 being changed to a translucent state, circle image 68 shown in FIG. 3a is obscured by the overlap between the two windows 62, 70. By clicking on wand icon 66 of FIG. 3b, the user effectively renders the top-most or overlay window 70 partially or completely translucent. By “translucent” it is meant herein that the overlay image or window can be seen, but it can also be seen through. It is understood that this creates the impression that light can travel through the particular image. By translucent, it is further meant that the lines of a particular image can be seen, but that the spaces between the lines and the spaces around the lines can be seen through.
FIG. 3c illustrates a display screen 60 with the pair of overlapping windows 62 and 70 shown on one portion of screen 60. In this Figure, the overlaying window 70 has been rendered translucent. Further, opaque window 62 has the image of circle 68 displayed within the overlapping region of the two windows 62, 70. Finally, gadget bar 64, including wand icon 66 for transforming a selected one of windows 62, 70, is shown. This permits image operations to be conducted in translucent overlaying window 70 with reference to the image of circle 68 in window 62. Image operations can be any kind of operation conducted on an image or window. Drawing an image, placing an image, or for that matter modifying, moving, expanding, or changing an image or a window, are considered to be image operations. Alternatively, according to a preferred version of the invention, another image in opaque window 62 or elsewhere could be the subject of image operations with overlay window 70. An example of one image operation which could be implemented is simply the operation of copying or tracing the image of circle 68 from the opaque window 62 onto translucent window 70.
It should be noted that, in this preferred embodiment, wand icon 66 is used to designate the overlay task which is tested in step 48 of FIG. 2. If the active window is opaque, a selection of wand icon 66 will indicate a “translucent request,” and if the active window is translucent, a selection of wand icon 66 will indicate an “opaque request.” Wand icon 66 is preferably selected by a tap of stylus 38 over the icon 66.
FIG. 3d illustrates display screen 60 with overlapping windows 62 and 70 shown on a portion of the screen 60. The image operation suggested above has been accomplished and the circle 68 has been traced onto the translucent window 70 as a circle 78, based upon or with reference to the images established in window 62. This new circle 78 on translucent window 70 may be of the same size, larger, or smaller than circle 68. Further, it may be offset from the corresponding location of opaque window 62. The object is simply to provide the user with ideas, choices or alternatives in connection with a secondary image or window which is created by reference to information contained in a primary image or window. Additional circle 78 is conveniently created by the user either by tracing directly upon the display screen over the underlying image on the screen based upon circle 68 with stylus 38, or by moving cursor 39, active at the window 70, to define the circle or other image subject to image operations, with a mouse, track ball, stylus, or the like. By thus acting and tracing an image, the user implements a selected computer implemented process, and the process receives screen inputs which contact or are otherwise associated with a particular window as the computer implemented process is effective for processing the screen inputs. According to one version of the invention, it is a second computer implemented process which receives screen inputs which contact or are otherwise associated with a translucent window, and the second computer implemented process effectively processes the screen inputs.
FIG. 3e illustrates display screen 60 with overlapping opaque windows 62, 70, made by respective application programs APP#1 and APP#2. In particular, FIG. 3e shows window 70 after it has once again been made opaque, displaying traced circle image 78 made during the window's translucent phase, and gadget bar 64 which includes wand icon 66.
Accordingly, by following the steps of FIGS. 3a-3e, the user has been able to conduct image operations and to make traces or reference images based upon the underlying circle image 68 onto overlay window 70. Window 70 has been created by its own application program, i.e., APP#2, as an opaque window in the first instance (FIG. 3b), which has then been converted into a translucent window (FIG. 3c) to enable desired image operations to be conducted between the two windows. In particular, the image of interest was the circle image on the opaque, underlying window 62. The tracing operation was illustrated in FIG. 3d and, in FIG. 3e, the window 70 was made opaque again.
Translucency and opaqueness can be selected in a variety of manners, such as by express keyboard commands. Furthermore, a user may perform a number of image activities in the translucent window with reference to underlying opaque window 62. In this case, the user has selected a simple tracing operation to duplicate the image of underlying circle 68, albeit with a slightly smaller radius. The process of the invention accordingly permits the accomplishment of any of a range of desired tasks. For example, if instead of circle 68, a complex image of a photograph of a house were displayed in opaque window 62, according to the process of the invention, a translucent overlay window could be suitably positioned thereover, permitting the user to make a sketch of selected features of the house on the overlying translucent window.
An alternate version of the invention is shown with reference to FIGS. 3f-3h. According to this version, in the figure sequence which follows, the active screen (or “reactive screen,” as it might be called, because it is responsive to external influences) or window will be considered to be the opaque window underneath at a selected lower level, while the overlying translucent window 62 carries a selected image of interest with reference to which image operations are to be performed in the underlying opaque window 70. Toward this end, FIG. 3f illustrates display screen 60 with a single non-translucent window 71 shown on one portion of screen 60. Gadget bar 64 is omitted for simplicity, but may be present for providing the functionality described previously. The non-translucent, opaque window 71 is initially completely blank, in this example. Thus, while in the sequence of figures starting with FIG. 3a the image operations conducted were performed on the active overlying window 71 (which was translucent), the image operations in the figure sequence starting with FIG. 3f entail image operations on the active underlying window 71, while a translucent window 73 (see FIG. 3g) is passive and is employed for reference with regard to operations conducted on the underlying window 71 below. It is considered typical that cursor operations are treated as happening on the active screen, whether it is the underlying opaque screen or the overlying translucent screen on which the activity is taking place.
FIG. 3g illustrates this in display screen 60 with a pair of overlapping windows 73, 71. The Figure shows overlay window 73 on one portion of screen 60. Overlay window 73 is translucent and has a circle image 75 in the overlapping region of the two windows 73, 71. In this case, cursor 39 is non-reactive as to the overlay window 73. However, cursor 39 is operative in the underlying, opaque window 71, below translucent overlay window 73. Accordingly, since the cursor is active on the underlying window 71 and the desired, or selected, image 75 is to be established in the translucent window 73, tracing along its image can be accomplished by sketching underneath image 75.
FIG. 3h illustrates display screen 60 with single non-translucent window 71 shown on one portion of the screen. Non-translucent window 71 includes the image of a circle 78 which was created by “tracing” under translucent circle image 75 shown in FIG. 3g. At this stage, the desired image 78 sought to be created has been made, and translucent window 73 has been “closed,” i.e., removed from view on screen 60.
FIG. 3i illustrates display screen 60 of the prior figures including an opaque window 77 selected for image operations. Opaque window 77 is “overlain” with an overlay translucent window 79 which, in this case, is larger than the display screen 60. Formed within window 79 is a translucent image including the legend “TOP SECRET.” Overlay window 79 is non-reactive, and thus no image operations within overlay window 79 are permitted. Image operations below overlay translucent window 79 are considered generally independent of and not with reference to the particular translucent image on overlay translucent window 79. Preferably, cursor 39 operates “under” the overlay window 79 to perform operations at a lower level or at one or more lower levels underneath overlay window 79, such as within opaque window 77, according to an embodiment of the invention. The object of having the translucent overlay in this case is simply to warn of the security status of the underlying information as “Top Secret.” The user can accordingly work with the underlying opaque window 77 with the image operations and cursor movements desired, and as though the overlay translucency did not even exist except visually to the user. In the case of this embodiment, the translucent overlay is completely passive, and the information on the translucency is generally, though not necessarily, external information and not typically specific information relevant to the image operations being conducted on any underlying active window or image.
With respect to the question of precisely how the image operations outlined in FIGS. 3a-3i may be conducted in accordance with a preferred embodiment of the present invention, reference is made to FIG. 4. In particular, FIG. 4 illustrates a “coordinate space” 80 on which selected images are expressed, which is standard on all Macintosh brand computers from Apple Computer, Inc. of Cupertino, Calif. In this case, operating system screen 81 for a selected monitor being employed by the user is shown. Further, there is shown a portion 80′ of coordinate space 80 reserved for non-physical monitor representations, on which, in turn, a region is reserved for expression of the translucent overlay screen 82. The images on the respective operating system and translucent or overlay screens, respectively 81 and 82, are combined, or “blended” as will be discussed below, for loading into a video random access memory (VRAM) 85 and subsequent presentation on display screen 60 of display assembly 20.
The coordinate space 80 defined for the particular computer system 10 ranges from coordinates (−32,767; −32,767) to (+32,767; +32,767), thereby defining the space in terms of a selected pair of diagonal corner points. The top left corner coordinate points of the respective operating system and translucent or overlay screens, respectively 81 and 82, are respectively, for example, (0,0) and (0′,0′). The blending process to be discussed below essentially blends the domains of the respective coordinate image screens 81 and 82 together for display on screen 60. According to a preferred version of the invention, the blended or overlapping regions are displayed on screen 60 as 50% half-tone images, whether in color or otherwise.
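For the 50% half-tone case just described, blending the operating system screen 81 with the overlay screen 82 amounts to averaging the two source pixels at every location in the overlapped region. A minimal C sketch follows, assuming a flat 8-bit-per-channel buffer layout; this layout and the function name are illustrative assumptions, not the actual Macintosh VRAM format.

    /* Blend two 32-bit RGBA buffers 50/50 into a destination buffer (e.g. VRAM). */
    #include <stdint.h>
    #include <stddef.h>

    void blend_half_tone(const uint8_t *base, const uint8_t *overlay,
                         uint8_t *dest, size_t n_pixels) {
        for (size_t i = 0; i < n_pixels * 4; i++) {
            /* each channel of the result is the average of base and overlay */
            dest[i] = (uint8_t)(((unsigned)base[i] + (unsigned)overlay[i]) / 2);
        }
    }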
FIG. 5a is a flow diagram showing the basic steps to accomplish presentation of translucent or overlay images according to the invention herein, and within the scope of process step 50 shown in FIG. 2, calling for the creation of a translucent image. The general process begins at step 91. At a next process step 93, the operating system records entry of a particular window or image into a reactive or non-reactive state of operation. By way of reference, a non-reactive state of operation for a translucent window is generally considered to be a mode of operation in which cursor operations and activities are performed on another window or image. Similarly, when image operations are to be performed on a translucent window or image, the translucent window is considered to be reactive. In either case, whether or not operations as to a particular window or image are in the reactive or non-reactive state, operation is conducted at step 95 to create an overlay screen image which is represented on coordinate space 80′ in its overlay screen 82. Next, according to step 96, the separate images in screens 81 and 82 are combined or “blended” according to operations to be discussed below. After the blending operation has been completed, the results of blending are loaded into VRAM 85 to create the combined image established on display screen 60, according to step 98. At this point, operations are considered to be completed, according to step 99.
FIG. 5b is a diagram illustrating the process of displaying a translucent or overlay image in connection with an associated underlying opaque image or window within the scope of the invention herein. In particular, FIG. 5b shows the operating system, application program, overlay utility, system routines, etc., in hierarchical fashion. At the highest level is operating system 100 of computer system 10 of FIG. 1. Running under the operating system 100 is an application program 101, such as the AppleShare application program. Application program 101, when it wants to open a window such as window 62 of FIG. 3a, calls a set of routines 102 provided by the operating system 100. The window opened is automatically active, as the newest window created or activated. Another window or image can be activated merely by user selection in positioning the cursor over the window or image and clicking on the mouse, trackball or another applicable interface device. More specifically, in the Macintosh operating system, application program 101 calls a “New Window” routine 103 which, in turn, calls a “Frame Rect” routine 104. The Frame Rect routine uses a pointer table 106 to call a “Shield Cursor” routine 107 and a “Show Cursor” routine 108. If the application program 101 were running on system 100 without the process 133 (see FIG. 6a) of the present invention, this would be the entirety of the calls to open up the window 79 of FIG. 3b. This process is extensively documented in the multi-volume reference set, Inside Macintosh, by C. Rose et al., Addison-Wesley Publishing Company, Inc., July 1988, and is well known to those skilled in the art of programming on the Macintosh operating system.
FIG. 5c illustrates the “Overlay Shield Cursor Patch” process 110 of FIG. 5b in greater detail. The process 110 begins at 122 and, in a first step 123, the call from the Frame Rect routine 104 to the Shield Cursor Routine 107 (see FIG. 5b) is intercepted. This is accomplished by modifying the pointer table 106 such that the process control jumps to the Overlay Shield Cursor Patch address area rather than the Shield Cursor Routine area 107 upon a call from the Frame Rect routine 104. The Overlay Shield Cursor Patch routine 110 must, however, remember the proper address for the Shield Cursor Routine so that the process control can be passed to the Shield Cursor Routine 107 at the appropriate time. Next, in a step 124, the coordinates of the shield rectangle are stored for future blending operations. The shield rectangle is essentially the rectangle of the window to be developed by the application program, such as the window 116. The coordinates of the shield rectangle can therefore be fully described with two corner coordinates, as is well known to those skilled in the art of programming on the Macintosh computer system. Next, in a step 125, it is determined whether this is the first time that the application program 101 is drawing to the screen 60 after an overlay image has been produced. If it is, a step 126 creates an overlay buffer, and the image of the screen that is stored in the video RAM (VRAM) is copied from the system's VRAM to a RAM screen buffer provided in general system RAM, according to step 127. Next, in a step 128, the system is set such that future drawing output which is intended, by the operating system, to go to VRAM is sent to the RAM screen buffer of the present invention instead. Finally, the call made by the Frame Rect routine 104 is passed to the Shield Cursor Routine 107 in a step 129, and the process is completed as indicated at step 130.
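The sequence of FIG. 5c can be summarized in a few lines of C. The sketch below mirrors steps 124 through 129; the buffer sizes, the redirect flag, and the routine names are illustrative assumptions, not the actual Macintosh system calls.

    /* Sketch of the Overlay Shield Cursor Patch bookkeeping (steps 124-129). */
    #include <stdbool.h>
    #include <stdlib.h>
    #include <string.h>

    typedef struct { int left, top, right, bottom; } Rect;

    #define VRAM_BYTES (640 * 480)
    static unsigned char vram[VRAM_BYTES];

    static Rect  saved_shield_rect;            /* reused later by the Blending Engine  */
    static unsigned char *overlay_buffer;      /* holds the translucent overlay image  */
    static unsigned char *ram_screen_buffer;   /* copy of the base image in system RAM */
    static bool  redirect_drawing_to_ram;      /* future drawing output goes to RAM    */

    static void shield_cursor(const Rect *r) { (void)r; /* the real routine would hide the cursor */ }

    static void overlay_shield_cursor_patch(const Rect *shield) {
        saved_shield_rect = *shield;                       /* step 124: remember shield rect      */
        if (overlay_buffer == NULL) {                      /* step 125: first draw after overlay? */
            overlay_buffer    = calloc(1, VRAM_BYTES);     /* step 126: create the overlay buffer */
            ram_screen_buffer = malloc(VRAM_BYTES);
            memcpy(ram_screen_buffer, vram, VRAM_BYTES);   /* step 127: copy VRAM to RAM buffer   */
            redirect_drawing_to_ram = true;                /* step 128: redirect future drawing   */
        }
        shield_cursor(shield);                             /* step 129: chain to the real routine */
    }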
The implementation of computer process 133, as will be seen with reference to FIG. 6a, is effective to implement an overlay utility application process that modifies the normal flow of routine calls implemented by a particular application program 101 as follows. First, application program 101 calls New Window routine 103, which in turn calls Frame Rect routine 104. Frame Rect routine 104 next attempts to call the Shield Cursor Routine. However, according to the invention, Frame Rect routine 104 instead calls a portion of the process of step 138 of FIG. 6b known as the Overlay Shield Cursor Patch 110, which will be discussed below. This is accomplished by having process 138 modify the pointer table 106 such that when the Frame Rect routine 104 is trying to call the Shield Cursor Routine 107, it instead calls the Overlay Shield Cursor Patch 110. After Overlay Shield Cursor Patch 110 completes its process, Shield Cursor Routine 107 is called. As far as the Frame Rect routine 104 is concerned, it does not know of the diversion of process control to the Overlay Shield Cursor Patch process 110, and instead believes that it directly called the Shield Cursor Routine 107.
When the Frame Rect routine 104 goes to pointer table 106 in an attempt to call Show Cursor Routine 108, process control is instead diverted to a process 112 known as the “Overlay Show Cursor Patch.” The Overlay Show Cursor Patch process 112 interacts with a Blending Engine process 114 to blend a first screen image 116 (see FIG. 5b), generated by the Macintosh operating system and the application program, with a second, “overlay” image 118 to form the blended image 120. After the completion of the blending process of step 114, Overlay Show Cursor Patch process 112 turns over process control to the “Show Cursor Routine” process 108. Again, as far as the Frame Rect routine 104 is concerned, it made a direct call to the “Show Cursor Routine” 108 and was ignorant of the diversion of the process control to the Overlay Show Cursor Patch 112 and the Blending Engine 114.
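The diversion described in the two preceding paragraphs is ordinary pointer-table patching: the table entry is replaced with a patch routine that does the overlay work and then chains to the saved original, so the caller never observes the detour. Below is a self-contained C sketch using an ordinary array of function pointers as a stand-in for the Macintosh dispatch table; the names and routines are illustrative only.

    /* Sketch of pointer-table patching: divert Show Cursor through a patch. */
    #include <stdio.h>

    typedef void (*Routine)(void);

    enum { SHIELD_CURSOR = 0, SHOW_CURSOR = 1, N_ROUTINES = 2 };

    static Routine pointer_table[N_ROUTINES];   /* dispatch table used by Frame Rect */
    static Routine saved_show_cursor;           /* remembered original entry         */

    static void shield_cursor(void)   { puts("Shield Cursor"); }
    static void show_cursor(void)     { puts("Show Cursor"); }
    static void blending_engine(void) { puts("Blending Engine: blend overlay with base image"); }

    /* The patch: do the overlay work, then pass control to the real routine. */
    static void overlay_show_cursor_patch(void) {
        blending_engine();
        saved_show_cursor();
    }

    static void install_patch(void) {
        saved_show_cursor = pointer_table[SHOW_CURSOR];
        pointer_table[SHOW_CURSOR] = overlay_show_cursor_patch;
    }

    /* Frame Rect's point of view: it simply calls through the table. */
    static void frame_rect(void) {
        pointer_table[SHIELD_CURSOR]();
        /* ... draw the window frame ... */
        pointer_table[SHOW_CURSOR]();   /* unknowingly enters the patch first */
    }

    int main(void) {
        pointer_table[SHIELD_CURSOR] = shield_cursor;
        pointer_table[SHOW_CURSOR]   = show_cursor;
        install_patch();
        frame_rect();
        return 0;
    }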
In FIG. 6a, the process in accordance with the present invention for implementing translucent overlay image operations is shown beginning at process step 131. At step 132, a selected application program is started, loaded, or “executed” on computer system 10 to produce a particular image or window desired for image operations, either within its own right or with reference to another image or window. The application program could, for example, be the AppleShare application program which produced window 62 on screen 60. Next, in step 133, the “overlay utility” is started or “executed” on computer system 10. This “overlay utility” is an application program (often referred to as a “utility” or “routine”) which implements the computer process of the present invention. Step 133 may include, for example, activating the wand icon 66 of gadget bar 64 shown in FIG. 3a. After performance of a range of other selected activities, the process is completed as indicated at step 134.
In FIG. 6b, process 133 of FIG. 6a is illustrated in greater detail. Process 133 begins at step 135, and in a step 136, it is determined, as a threshold question, whether process 133 is already to be treated as completed. In this instance, process 133 is considered to be completed when a particular “button” of the translucent selected image is tapped. If the process is in fact completed, overlay utility 133 is terminated as indicated at 137. If the process is not completed, step 138 displays a translucent or “overlay” image on the screen such that images on the screen that it overlaps can be seen through the overlay image. Of course, other overlay images besides selected images can be provided by the present invention, e.g., handwriting “recognition” windows, etc. Alternatively, translucent windows or images can overlie other translucent windows or images. Next, in a step 139, the overlay utility intercepts screen inputs which contact the overlay image, and these screen inputs are processed. Finally, in a step 140, the active application program which is executing in step 132 of FIG. 6a is updated according to the processed screen inputs. Process control is then turned over to step 136, which again determines whether the process 133 is completed.
By way of additional detail, process step 138 of FIG. 6b is effective to implement its process when Frame Rect routine 104 calls the Show Cursor Routine 108 of FIG. 5b. In that instance, when the Frame Rect routine 104 goes to pointer table 106 in an attempt to call Show Cursor Routine 108, process control is instead diverted to a process 112 known as the “Overlay Show Cursor Patch.”
In FIG. 7, process step 112 of FIG. 5b is described in greater detail. The process 112 begins at 140 and, in a step 142, the Show Cursor Routine call made by the Frame Rect routine 104 is intercepted. This step 142 is, again, preferably implemented by modifying a pointer table to cause process control to jump to the Overlay Show Cursor Patch 112 instead of the Show Cursor Routine 108. The starting address of the Show Cursor Routine 108 is stored by the Overlay Show Cursor Patch 112 for later use. Next, in a step 144, the shield rectangle coordinates of the window being opened by the application program 101 are recalled. These coordinates were stored by step 124 of the Overlay Shield Cursor Patch process 110. Next, in a step 146, the Blending Engine 114 of FIG. 7 is called. After the Blending Engine 146 has completed its process, a step 148 passes the process control back to the Show Cursor Routine 108 such that the Frame Rect routine 104 had no knowledge of the intervening steps 112 and 114. The process is then completed as indicated at 150. The “Blending Engine” process 114 begins at 152 and, in a step 154, the shield rectangle is divided into individually blended units. For example, these blendable units can be anywhere in the range of 1 to 32 pixels, where a pixel is the smallest display unit provided on the screen 60. Next, in a step 156, the RAM screen buffer data within the shield rectangle is retrieved for one blendable unit. In a step 158, the RAM overlay image buffer data from within the shield rectangle is retrieved for the one blendable unit. The data retrieved from steps 156 and 158 is blended to form blended data in the step 160. Next, in a step 162, the blended data is written to VRAM to be displayed on the screen 20. Next, in a step 164, it is determined whether all of the blendable units created by step 154 have been blended by the process steps of 156-162. If not, the loop comprising steps 156-164 is repeated. If step 164 determines that all blendable units have been blended, the call that was initially made by the Frame Rect routine 104 is passed to the Show Cursor Routine 108 in a step 166, and the process is completed at 168. Again, the Frame Rect routine 104 is unaware of the activities of process 114 and, instead, believes that its call was passed directly to the Show Cursor Routine 108 for processing.
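The Blending Engine loop of steps 154-166 can be sketched directly: walk the shield rectangle one blendable unit at a time, read the corresponding data from the RAM screen buffer and the RAM overlay buffer, blend, and write the result to VRAM. In the C sketch below the buffers are flat 8-bit arrays and the blend is a simple average; these are illustrative assumptions, not the actual buffer formats.

    /* Sketch of the Blending Engine: blend the shield rectangle unit by unit into VRAM. */
    #include <stdint.h>

    #define SCREEN_W 640
    #define SCREEN_H 480
    #define UNIT     16            /* pixels per blendable unit (1 to 32 in the text) */

    static uint8_t ram_screen[SCREEN_W * SCREEN_H];   /* base image redirected to RAM */
    static uint8_t ram_overlay[SCREEN_W * SCREEN_H];  /* translucent overlay image    */
    static uint8_t vram[SCREEN_W * SCREEN_H];         /* what the display shows       */

    typedef struct { int left, top, right, bottom; } Rect;

    void blend_shield_rect(Rect shield) {
        for (int y = shield.top; y < shield.bottom; y++) {
            for (int x0 = shield.left; x0 < shield.right; x0 += UNIT) {
                int x1 = (x0 + UNIT < shield.right) ? x0 + UNIT : shield.right;
                for (int x = x0; x < x1; x++) {        /* blend one blendable unit */
                    int i = y * SCREEN_W + x;
                    vram[i] = (uint8_t)(((unsigned)ram_screen[i] + ram_overlay[i]) / 2);
                }
            }
        }
    }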
FIGS. 8a-8f are used, as an example, to further explain the process 114 of FIG. 8. FIG. 8a represents the RAM shield buffer within the shield rectangle, and has been divided into 16 individually-blendable units. These units are arranged in a four-by-four matrix, where the rows have been numbered 1, 2, 3, and 4. FIG. 8b illustrates the RAM screen overlay buffer in the shield rectangle, and again has 16 individually-blendable units formed in a four-by-four array, with the rows numbered 1, 2, 3, and 4. In FIG. 8c, the row 1 from FIG. 8a and the row 1 from FIG. 8b are blended together to form a blended row 170c. In FIG. 8d, rows 2 from FIGS. 8a and 8b are blended together to form a blended row 170d. In FIG. 8e, rows 3 from FIGS. 8a and 8b are blended together to form a blended row 170e, and in FIG. 8f, rows 4 from FIGS. 8a and 8b are blended together to form a blended row 170f. This “blending” process allows a base image (opaque or translucent) on the screen 60 to be seen through a translucent overlay image produced by the process of the present invention.
FIG. 9 illustrates an alternate embodiment of the present invention which has been optimized for screen-writing speed. While the process of FIG. 5b works very well, it requires that the entirety of the base screen 116 be rewritten whenever the blended image 120 is to be refreshed. The alternative process of FIG. 9 only refreshes the portions of the blended image that need to be refreshed, thereby greatly increasing the writing speed to the screen 60.
Much of the operation of the process illustrated in FIG. 9 is similar to that described in FIG. 5b. An operating system 172 supports an application program 174 which, when it wants to open a window, calls a set of routines 176 including a "New Window" routine 178 and a Frame Rect routine 180. The Frame Rect routine 180 then, as before, attempts to call the Shield Cursor Routine 182 first and then the Show Cursor Routine 184. Again, as before, the pointer table is modified such that when the Frame Rect routine tries to call the Shield Cursor Routine 182, it instead calls the Overlay Shield Cursor Patch 186 of the present invention, and when the Frame Rect routine 180 attempts to call the Show Cursor Routine 184, it instead calls the Overlay Show Cursor Patch 188. The Overlay Show Cursor Patch calls a Blending Engine 190 which blends a selected first application image 192 with a translucent image 194 to create a blended image 196.
The operating system 172, as part of its functioning, will make periodic calls to various system task processes. The system task 198 performs such functions as executing "Device Driver Code" and "Desk Accessory Code." The process of the present invention opportunistically takes advantage of these periodic system task calls by modifying a pointer table 200 to turn over process control to an Overlay System Task Patch 202. This Overlay System Task Patch, along with the Overlay Shield Cursor Patch 186, the Overlay Show Cursor Patch 188, and the Blending Engine 190, comprises the overlay utility 133 of FIGS. 6a and 6b in this second preferred embodiment.
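The pointer-table modification can be sketched in C with a toy dispatch table; the table, the routine names, and the installation function below are hypothetical stand-ins, not the actual Macintosh trap mechanism.

#include <stdio.h>

typedef void (*routine_t)(void);

static routine_t pointer_table[2];          /* [0] = Shield Cursor, [1] = Show Cursor (toy table) */
static routine_t original_show_cursor;      /* saved so the patch can chain to the real routine   */

static void show_cursor_routine(void) { puts("Show Cursor Routine"); }

static void overlay_show_cursor_patch(void)
{
    puts("Overlay Show Cursor Patch: blend overlay with base image");
    original_show_cursor();                 /* pass the call on; the caller never notices */
}

static void install_patch(void)
{
    original_show_cursor = pointer_table[1];        /* remember the real routine's address */
    pointer_table[1] = overlay_show_cursor_patch;   /* redirect future calls to the patch  */
}

int main(void)
{
    pointer_table[1] = show_cursor_routine;
    install_patch();
    pointer_table[1]();    /* the caller "thinks" it called Show Cursor directly */
    return 0;
}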
FIG. 10 is used to illustrate the operation of the Blending Engine 190 of FIG. 9 in greater detail. The process 138 of FIG. 6b remaps certain pages of VRAM to the RAM screen buffer when a translucent image contains objects that overlap these pages. The RAM overlay screen buffer 194 is then merged with changes 192' in the RAM screen buffer 192 in the Blending Engine 190 by a process similar to that previously described, and the blended image is inserted into a "hole" 196' of the VRAM screen buffer 196. Accordingly, only the overlapped portions of RAM screen buffer 192 and RAM overlay screen buffer 194 need to be blended to accomplish changes in VRAM screen buffer 196. VRAM screen buffer 196 is much faster memory for video purposes than the RAM screen buffer 192. These factors substantially increase the blending speed of the VRAM screen buffer and therefore of the display on screen 60.
FIG. 11 illustrates the Overlay Shield Cursor Patch process 186 of FIG. 9 in greater detail. Process 186 of FIG. 9 begins at step 210 of FIG. 12a and then, according to step 212, process 186 intercepts a call to the Shield Cursor Routine 182. This interception is preferably accomplished in a manner analogous to that previously described with reference to FIG. 5b. The coordinates of the shield rectangle are then stored in a step 214 of FIG. 11 for future blending operations. This is similar to the step 133 of FIG. 6a. Next, in step 216, it is determined whether there is a drawing to the overlay image of the present invention. If there is, a step 218 determines whether this is the first time that there has been a drawing to the overlay image. If it is, a step 220 creates the overlay buffer 194 of FIG. 10. If not, a step 222 determines which pages of VRAM screen buffer 196 are "touched" by the overlay drawing operation. Next, in a step 224, data is copied from VRAM 196 to the RAM screen buffer 192 for each "touched" page. Next, in a step 228, the buffer overflow error (if any) is recorded. Next, a step 230 passes the original Frame Rect routine call to the Shield Cursor Routine 182. This step 230 is also performed directly after step 216 if there was no drawing to the overlay image. The process 186 is then completed at step 232.
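A rough C sketch of that decision flow is given below. The state variables, the page-copy loop, and the error handling are simplified assumptions; only the overall shape (store the shield rectangle, create the overlay buffer on first use, copy each touched VRAM page to RAM, then fall through to the real routine) follows the steps above.

#include <stdbool.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define PAGE_SIZE 4096

typedef struct { int left, top, right, bottom; } rect_t;

static rect_t   shield_rect;                 /* step 214: remembered shield rectangle       */
static uint8_t *overlay_buffer = NULL;       /* created on the first drawing to the overlay */
static bool     overflow_error = false;      /* step 228: recorded buffer overflow error    */

static void overlay_shield_cursor_patch(rect_t r, bool drawing_to_overlay,
                                        uint8_t *vram, uint8_t *ram_screen,
                                        const int *touched_pages, int n_touched)
{
    shield_rect = r;
    if (drawing_to_overlay) {
        if (overlay_buffer == NULL) {                       /* steps 218/220 */
            overlay_buffer = calloc(16, PAGE_SIZE);
            if (overlay_buffer == NULL) overflow_error = true;
        }
        for (int i = 0; i < n_touched; i++) {               /* steps 222/224 */
            size_t off = (size_t)touched_pages[i] * PAGE_SIZE;
            memcpy(ram_screen + off, vram + off, PAGE_SIZE);
        }
    }
    /* step 230: pass the original call on to the real Shield Cursor Routine (not shown) */
}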
FIG. 12a illustrates a prior art memory management unit (MMU) data structure for a Macintosh computer system from Apple Computer, Inc. of Cupertino, Calif. The Macintosh computer system uses a tree-type MMU data structure in which a root pointer 234 points to a stack 236 of 32 megabyte (MB) pointers, each of which points to a stack 238 of 256 kilobyte (KB) pointers, each of which in turn points to a stack 240 of 4 KB pointers, each of which points to a 4 KB physical memory page 242. Some of these 4 KB physical memory pages reside in general system RAM, and some of these 4 KB physical memory pages reside in VRAM. This MMU data structure is well known to those skilled in the art of programming Macintosh computer systems.
FIG. 12b illustrates modifications that the present invention has made to the MMU data structures to accomplish step 226 of FIG. 11. Essentially, step 226 selectively modifies some of the pointers in the 4 KB pointer stacks 240 to "trick" the system into writing images that are intended for VRAM into RAM and vice versa. For example, process 226 can redirect a pointer from the 4 KB physical memory page 242a of the VRAM to the 4 KB physical memory page 242b of the RAM as indicated by arrow 244a. Also, a 4 KB pointer of a stack 240 can be modified as indicated by the arrow 244b such that data which was to be written into 4 KB physical memory page 242b is, instead, redirected to the 4 KB physical memory page 242a of the VRAM. This modification of the MMU data structure, therefore, effectively "swaps" pages 242a and 242b, thus causing a portion of the screen (as stored in the VRAM memory page 242a) to be drawn "off screen" in RAM memory page 242b.
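The page "swap" itself amounts to exchanging two leaf entries of the MMU tree. The structure below is a simplified stand-in for the 4 KB pointer stacks of FIG. 12a; the fan-out and field names are assumptions.

#include <stdint.h>
#include <stdio.h>

#define PAGE_SIZE          4096
#define ENTRIES_PER_TABLE  1024      /* assumed leaf-table fan-out, for illustration only */

/* Leaf level of the tree-type MMU structure: each entry points at one 4 KB page. */
typedef struct {
    uint8_t *page[ENTRIES_PER_TABLE];
} leaf_table_t;

/* Exchange two leaf entries so that writes aimed at the VRAM page land in the RAM
   page and vice versa, effectively "swapping" pages 242a and 242b. */
static void swap_pages(leaf_table_t *t, int vram_index, int ram_index)
{
    uint8_t *tmp = t->page[vram_index];
    t->page[vram_index] = t->page[ram_index];
    t->page[ram_index]  = tmp;
}

int main(void)
{
    static uint8_t vram_page[PAGE_SIZE];   /* stand-in for 4 KB VRAM page 242a */
    static uint8_t ram_page[PAGE_SIZE];    /* stand-in for 4 KB RAM page 242b  */
    static leaf_table_t table;

    table.page[0] = vram_page;
    table.page[1] = ram_page;
    swap_pages(&table, 0, 1);
    printf("entry 0 now maps the %s page\n", table.page[0] == ram_page ? "RAM" : "VRAM");
    return 0;
}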
The MMU modification of FIG. 12b takes advantage of the fact that the Macintosh operating system supports multiple monitors. These monitors exist in the aforementioned single coordinate plane, in which the upper-left corner of the main screen is the origin (the point with coordinate value (0,0)). The overlay screen exists in the same coordinate space, but it is off in an area not normally occupied by monitors. The upper-left corner of the overlay screen, for example, can be at coordinate (−10,000, −10,000). It is very unlikely that using this remote area of coordinate space will affect existing monitor setups. In consequence, a "pseudo" screen is recognized by the operating system where the overlay image 194 resides. The blending operation, then, blends the images of the actual screen 60 and this "pseudo" screen which includes the overlay image.
FIG. 13 illustrates the process 190 of FIG. 9 in greater detail. The "Blending Engine" process 190 begins at 246 and, in a step 248, the shield rectangle is divided into component rectangles that intersect redirected pages of the display memory. The redirected page concept was explained with reference to FIG. 12b. Next, in a step 250, the component rectangles are blended. This is accomplished as previously described with reference to FIG. 8 and FIGS. 8a-8f. Next, in a step 252, it is determined whether all component rectangles have been completed. If not, steps 250 and 252 are continued in a loop until all component rectangles are done, at which time the process is completed as indicated at 254.
In FIG. 14, the Overlay System Task Patch process 202 of FIG. 9 is described in greater detail. As mentioned previously, process 202 is an additional portion of the overlay utility of this second embodiment of the present invention. The process 202 needs to be implemented periodically and, since the system periodically makes calls to various system tasks, the process 202 uses these periodic system task calls to activate its processes. Alternatively, other activation methods could be used to periodically start the process 202. The process 202 starts at 256 and, in a step 258, process 202 intercepts a system task call made by system 172. Next, in a step 260, redirected pages are moved back to VRAM when the overlay image for those pages is clear, i.e., when all pixels of the overlay contain a value of zero for a given screen page. In a decision step 262, it is determined whether a buffer overflow error flag has been set. If it has, a step 264 uses the newly cleared RAM pages to reconcile the error. Then, in a step 266, it is determined whether there is sufficient memory available to complete the task. If not, additional memory is allocated in a step 268. Next, a step 270 determines whether there was an allocation error made during the allocation step 268. If not, the newly allocated pages are used to reconcile the error in a step 272 and the error is cleared in step 274. This step 274 is also executed if there was determined to be sufficient memory in step 266. Next, the process 202 calls the system task 198 in a step 276 and the process is completed as indicated at 278. The call system task step 276 is also executed if step 270 indicates that there is an allocation error in the additional memory. The system 172 is unaware of the modification of the pointer table 200 and of the process of the Overlay System Task Patch 202, and simply believes that the system task 198 has been called directly as indicated by arrow 199 on FIG. 9.
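The page-reclaiming portion of that patch can be sketched as follows. The bookkeeping arrays, the clear-page test, and the omitted error reconciliation are assumptions; the sketch only shows the idea of moving a redirected page back to VRAM once every overlay pixel on that page is zero.

#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define PAGE_SIZE 4096

/* True when every overlay pixel on the page is zero, i.e. the overlay is clear
   for that screen page and the redirected page may move back to VRAM. */
static bool overlay_page_clear(const uint8_t *overlay_page)
{
    for (size_t i = 0; i < PAGE_SIZE; i++)
        if (overlay_page[i] != 0)
            return false;
    return true;
}

/* Periodic work of the Overlay System Task Patch (simplified): copy each clear,
   redirected page back into VRAM and release its RAM page. */
static void overlay_system_task_patch(uint8_t **redirected, uint8_t *const *overlay,
                                      uint8_t *vram, int n_pages)
{
    for (int i = 0; i < n_pages; i++) {
        if (redirected[i] != NULL && overlay_page_clear(overlay[i])) {
            memcpy(vram + (size_t)i * PAGE_SIZE, redirected[i], PAGE_SIZE);
            redirected[i] = NULL;   /* page is back in VRAM; the RAM page is free again */
        }
    }
    /* buffer-overflow reconciliation and the chained call to the real system task omitted */
}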
In FIGS. 15a and 15b, the preferred RAM memory pool format for the present invention is disclosed. Referring to FIG. 15b, the RAM memory pool 280 comprises a number of blocks 282a, 282b, 282c, etc. Each block preferably contains 16 pages of memory which are used to remap portions of the display monitor memory using the MMU as previously described. The blocks are chained together by pointers as represented by arrows 284. With additional reference to FIGS. 15a and 15b, each block 282 includes a header portion 286, a data portion 288, and a trailer portion 290. The header portion 286 includes two pointer portions 292 and 294 and an allocation portion 296. The header portion 286 also includes a padding portion 298. The pointer portion 292 points to the first page in the current block 282, and is preferably 32 bits in length. The pointer 294 is also preferably 32 bits in length, and points to the next block in the RAM memory pool. In this example, the pointer 294 of block 282a points to the pointer 292 of the block 282b as indicated by the arrow 284a. Similarly, the pointer 294 of block 282b points to the pointer 292 of block 282c as indicated by the arrow 284b.
The blocks are chained together as indicated in FIG. 15b. When a free page is needed, the page allocator traverses the chain searching for a block which contains a free page. When it finds one, it sets the corresponding allocation flag 296 to indicate that the page is now in use. If no free pages are found, a new block 282 is allocated and is connected to the end of the chain. A page is then allocated from the new block.
The purpose of the header "padding" is page alignment. The pages 288 are aligned in memory so that the MMU can properly map onto them. The number of bytes in the header padding 298 depends on where the header happens to be allocated in memory. If it is only a few bytes from a page boundary, then the header padding is only a few bytes in length. In some cases, the header padding may approach a full page in size (4K in this instance). The trailer "padding" 290 contains the remaining bytes in the block, which is allocated at a fixed size. Again, this fixed size in the preferred embodiment is 4K.
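The block layout and the first-fit page search can be sketched in C. The struct packing, the use of aligned_alloc for page alignment, and the allocator below are assumptions; the text above only fixes the two 32-bit header pointers, the allocation flags, the 16 pages per block, and the append-to-chain behavior.

#include <stdint.h>
#include <stdlib.h>

#define PAGE_SIZE        4096
#define PAGES_PER_BLOCK    16

/* Illustrative layout of one RAM-pool block; header and trailer padding are implied
   by the page-aligned allocation rather than modeled explicitly. */
typedef struct block {
    uint8_t      *first_page;                  /* points at the first page of this block */
    struct block *next;                        /* points at the next block in the chain  */
    uint8_t       allocated[PAGES_PER_BLOCK];  /* one in-use flag per page               */
} block_t;

/* First-fit page allocator: walk the chain for a free page, or append a new block. */
static uint8_t *alloc_page(block_t **chain)
{
    block_t *tail = NULL;
    for (block_t *b = *chain; b != NULL; b = b->next) {
        for (int i = 0; i < PAGES_PER_BLOCK; i++) {
            if (!b->allocated[i]) {
                b->allocated[i] = 1;
                return b->first_page + (size_t)i * PAGE_SIZE;
            }
        }
        tail = b;
    }
    block_t *b = calloc(1, sizeof *b);         /* no free page found: append a new block */
    if (b == NULL) return NULL;
    b->first_page = aligned_alloc(PAGE_SIZE, (size_t)PAGES_PER_BLOCK * PAGE_SIZE);
    if (b->first_page == NULL) { free(b); return NULL; }
    if (tail != NULL) tail->next = b; else *chain = b;
    b->allocated[0] = 1;
    return b->first_page;
}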
FIG. 16 is a flow diagram illustrating the process of moving images from overlay screen 82 and system screen 81 for processing, as well as the use of VRAM memory after the blending operation to produce a blended image on the display screen 20. In particular, the process begins with a start step 401. According to step 402, images are moved from the overlay or translucent screen 82 and the system screen 81. Next, blending is accomplished between the system screen and the overlay or translucent screen, with the results being stored in VRAM 85, according to step 404. Finally, the contents of the VRAM 85 are displayed on the display monitor 20, as per step 406. The operation is completed at step 410.
FIG. 17 is a flow diagram showing the process of handling cursor setting between system and overlay modes of operation. The process starts with step 420. Next, a determination is made as to whether to enter the reactive mode, according to step 422. If not, the operation is considered completed, according to step 430. If the reactive mode is in fact indicated, as per step 422, a determination is made as to whether the cursor 39 is within the bounds of an overlay or translucent image, as indicated with process step 424. If the cursor is within the bounds of an overlay image or a translucent image, the cursor 39 is set to be on the overlay or translucent image. If the cursor is not within the bounds of a translucent image, the cursor 39 is set to be on the system monitor, according to process step 426. Once the correct situs of the cursor 39 has been established, the process is considered to be complete.
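The reactive-mode decision of FIG. 17 reduces to a point-in-rectangle test against the translucent image's bounds. The sketch below uses hypothetical coordinates and type names.

#include <stdbool.h>
#include <stdio.h>

typedef struct { int x, y; } point_t;
typedef struct { int left, top, right, bottom; } rect_t;

/* True when the cursor position lies within the bounds of the overlay image. */
static bool cursor_in_overlay(point_t cursor, rect_t overlay_bounds)
{
    return cursor.x >= overlay_bounds.left && cursor.x < overlay_bounds.right &&
           cursor.y >= overlay_bounds.top  && cursor.y < overlay_bounds.bottom;
}

int main(void)
{
    rect_t  keyboard = { 100, 300, 540, 480 };   /* hypothetical translucent keyboard bounds */
    point_t cursor   = { 200, 350 };

    if (cursor_in_overlay(cursor, keyboard))
        puts("cursor set on the overlay (translucent) image");
    else
        puts("cursor set on the system monitor");
    return 0;
}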
In FIG. 18, a screen 1040 of a Macintosh computer system made by Apple Computer, Inc., of Cupertino, Calif., includes a desktop image 1042 produced by a Macintosh operating system, a window 1044 produced by an "AppleShare" application program made by Apple Computer, Inc., and a palette 1046 produced by a small application program or "utility" known as "PenBoard" made by Apple Computer, Inc. The desktop 1042, which includes a menu bar 1048 and a desk area 1050, often displays a number of icons 1052, 1054 and 1056, which represent different objects or functions. For example, the icon 1052 represents a hard disk drive; icon 1054 represents the "trash can" in which files can be deleted; and icon 1056 represents a folder which can contain applications and documents of various types. The menu bar 1048 preferably includes a number of labels 1058, 1060, and 1062 for pull-down menus, as is well known to Macintosh users.
As mentioned previously, the desktop 1042 is created by the operating system (sometimes referred to as the "Finder"). The Finder can be considered to be a specialized form of application program which displays an image on the entirety of the screen 1040. In other words, the "window" size of the desktop 1042 is the same size as the screen 1040. The application program AppleShare, which creates the window 1044, typically does not take over the entire screen 1040. Similarly, the palette 1046 (which is just a specialized form of window) is produced by the PenBoard application, and does not occupy the entire space of the screen 1040.
As is apparent by studying FIG. 18, the screen 1040 can quickly become occupied with icons, windows and palettes. This is not a major problem in traditional computer systems wherein the primary forms of input comprise keyboards and pointer devices, such as mice. However, in pen computer systems where these more traditional forms of input devices are not always available, the limitations of screen size become readily apparent.
In FIG. 19, a keyboard image 1064 has been provided on screen 1040 to aid in the input of data to the AppleShare application program described previously. Preferably, this keyboard image 1064 is provided by dragging a keyboard icon 1066 off of the PenBoard palette 1046 in a fashion more fully described in copending U.S. patent application Ser. No. 08/060,458, filed May 10, 1993, on behalf of Gough et al., entitled "Method and Apparatus for Interfacing With a Computer System", and assigned to the assignee of the present application, the disclosure of which is hereby incorporated herein by reference in its entirety. As can be seen in this FIG. 19, the keyboard image 1064 completely obscures the icons 1052, 1054 and 1056 of FIG. 18, and almost totally obscures the window 1044 of the AppleShare application program. Information can be entered into the window 1044 of the application program from the keyboard image 1064 by "tapping" on a "key" with the stylus 38. For example, arrow 1068 on the keyboard image 1064 represents the "tapping" on the key "R" with the stylus 38. This tapping action will send an "R" to be displayed in the window 1044 of the AppleShare application just as if an "R" had been typed on a physical keyboard. Again, the functioning of the keyboard image 1064 is discussed in the aforementioned copending U.S. patent application of Gough et al.
While the keyboard image 1064 can be used to input data into a currently active application program (such as AppleShare), the keyboard image prevents the user from seeing feedback on the information being entered into application windows obscured by the keyboard image. Therefore, it is difficult for the user to determine whether data has been properly entered into the application program. This, in turn, slows down the data entry process, and greatly increases the chances for errors.
The present invention solves this problem, as illustrated in FIG. 20. A user taps on a "transparency" icon 1069 on the keyboard image 1064 of FIG. 19 with the stylus 38 to cause the keyboard 1064 to become partially transparent or "translucent." By "translucent" it is meant herein that the overlay image can be seen, but it can also be seen through. Tapping on the transparency icon 1069 of the keyboard image 1064' of FIG. 20 would cause the "solid" keyboard image 1064 of FIG. 19 to reappear.
As can be seen, the translucent keyboard image 1064' allows the window 1044 and icons 1052, 1054, and 1056 to be seen through the translucent keyboard image 1064'. In other words, portions of base images which are overlapped by the keyboard image 1064' can still be seen (with some loss in resolution) through the translucent keyboard image 1064'.
The functioning of the keyboard image 1064' will be explained in greater detail with reference to FIGS. 21a-21c. In FIG. 21a, the stylus 38 is used to "tap" on the "r" key as indicated by the arrow 1068 and the shading of the "r" key. The keyboard image 1064' "intercepts" the tap 1068 which would otherwise fall on the window 1044 and, instead, causes an "r" to be sent to the AppleShare program and be displayed in a password field of the window 1044. (Actually, AppleShare would display a "bullet" instead of the "r" to maintain the security of the password, but it will be assumed in this example that the typed password will remain visible.) The "r" within the password field of window 1044 can be seen through the translucent window 1064' in this figure. In FIG. 21b, a second tap 1068 on the "i" key will cause the keyboard image 1064' to "intercept" the tap which would otherwise fall on the window 1044, and to send an "i" character to the AppleShare application program which then displays an "i" after the "r" in the password field of window 1044. Next, as seen in FIG. 21c, the "p" key is tapped at 1068, causing the keyboard 1064' to intercept the tap which would otherwise fall on the window 1044 and to send the "p" character to the AppleShare program which displays the character in the password field after the characters "r" and "i." Other characters and control characters (such as the "return" button 1070) can be sent to the application program controlling window 1044 in a similar fashion.
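The tap interception can be sketched as a hit test against the keyboard's key rectangles followed by forwarding the matched character to the active application. The key map, the forwarding stub, and the return convention below are hypothetical; they are not the PenBoard implementation.

#include <stdio.h>

typedef struct { int x, y; } point_t;
typedef struct { int left, top, right, bottom; char ch; } kbd_key_t;   /* hypothetical key map entry */

/* Forward one character to the active application; reduced here to a print stub. */
static void send_key_to_application(char ch)
{
    printf("application receives '%c'\n", ch);
}

/* Intercept a stylus tap: if it lands on a key of the translucent keyboard, swallow the
   tap and forward the character instead of letting it fall on the window below. */
static int keyboard_intercept_tap(point_t tap, const kbd_key_t *keys, int n_keys)
{
    for (int i = 0; i < n_keys; i++) {
        if (tap.x >= keys[i].left && tap.x < keys[i].right &&
            tap.y >= keys[i].top  && tap.y < keys[i].bottom) {
            send_key_to_application(keys[i].ch);
            return 1;     /* tap handled by the overlay */
        }
    }
    return 0;             /* tap passes through to the base window */
}

int main(void)
{
    kbd_key_t keys[] = { { 10, 10, 40, 40, 'r' }, { 40, 10, 70, 40, 'i' }, { 70, 10, 100, 40, 'p' } };
    point_t tap = { 20, 20 };
    keyboard_intercept_tap(tap, keys, 3);   /* prints: application receives 'r' */
    return 0;
}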
It will be apparent with a study of FIGS. 20 and 21a-21c that the translucent keyboard image 1064' is a distinctly superior user interface for situations in which screen area is at a premium. Since images "beneath" the translucent keyboard image 1064' can be seen through the keyboard image, the user has immediate feedback as to the accuracy of his or her input to the active application program. For example, if a key were "tapped" in error, the backspace key 1072 can be tapped on the translucent keyboard 1064' so that the correct character can be reentered. The translucent keyboard 1064' therefore effectively expands the useful area of screen 1040 by providing multiple, usable, overlapped images.
A preferred method in accordance with the present invention for implementing the process 133 on a Macintosh computer system is illustrated with reference to FIG. 22. The illustrated method of FIG. 22 is fairly specific to the Macintosh computer system. It will therefore be apparent to those skilled in the art that, when the process 133 is implemented on other computer systems, such as MS-DOS compatible computer systems and UNIX computer systems, the methodology of FIG. 22 will have to be modified. However, such modifications will become readily apparent to those skilled in the art after studying the following descriptions of how the process 133 is implemented on the Macintosh computer system.
In FIG. 22, the operating system, application program, overlay utility, system routines, etc., are shown in a somewhat hierarchical fashion. At the highest level is the operating system 1096 of the computer system 10 of FIG. 1. Running under the operating system 1096 is an application program 1098, such as the aforementioned AppleShare application program. Application program 1098, when it wants to open a window such as window 1044 of FIG. 18, calls a set of routines 1100 provided by the operating system 1096. More specifically, in the Macintosh operating system, application program 1098 calls a "New Window" routine 1102 which, in turn, calls a "Frame Rect" routine 1104. The Frame Rect routine uses a pointer table 1106 to call a "Shield Cursor" routine 1107 and a "Show Cursor" routine 1108. If the application program 1098 were running on system 1096 without the process 133 of the present invention, this would be the entirety of the calls to open up the window 1044 of FIG. 18. This process is extensively documented in the multi-volume reference set, Inside Macintosh, by C. Rose et al., Addison-Wesley Publishing Company, Inc., July 1988, and is well known to those skilled in the art of programming on the Macintosh operating system.
The implementation of computer implemented process 133 modifies this normal flow of routine calls in the following way. When the application program 1098 calls the New Window routine 1102, which calls the Frame Rect routine 1104, which attempts to call the Shield Cursor Routine, the Frame Rect routine 1104 instead calls a portion of the process of step 138 of FIG. 6B known as the Overlay Shield Cursor Patch 1110. This is accomplished by having the process 138 modify the pointer table 1106 such that when the Frame Rect routine 1104 is trying to call the Shield Cursor Routine 1107 it, instead, calls the Overlay Shield Cursor Patch 1110. After the Overlay Shield Cursor Patch 1110 completes its process, the Shield Cursor Routine 1107 is then called. As far as the Frame Rect routine 1104 is concerned, it does not know of the diversion of process control to the Overlay Shield Cursor Patch process 1110, and instead believes that it directly called the Shield Cursor Routine 1107.
The process step 138 of FIG. 6B similarly "tricks" the Frame Rect routine 1104 when it attempts to call the Show Cursor Routine 1108. In that instance, when the Frame Rect routine 1104 goes to the pointer table 1106 in an attempt to call the Show Cursor Routine 1108, process control is instead diverted to a process 1112 known as the "Overlay Show Cursor Patch". The Overlay Show Cursor Patch process 1112 interacts with a Blending Engine process 1114 to blend a first screen image 1116, generated by the Macintosh operating system and the application program, with a second image 1118 (in this case, the keyboard image) to form the blended image 1120. The operation of the Blending Engine will be discussed in greater detail subsequently. After the completion of the blending process of 1114, the Overlay Show Cursor Patch process 1112 turns over process control to the "Show Cursor Routine" process 1108. Again, as far as the Frame Rect routine 1104 is concerned, it made a direct call to the "Show Cursor Routine" 1108 and was ignorant of the diversion of the process control to the Overlay Show Cursor Patch 1112 and the Blending Engine 1114.
FIG. 23 illustrates an alternate embodiment of the present invention which has been optimized for screen-writing speed. While the process of FIG. 22 works very well, it requires that the entirety of the base screen 1116 be rewritten whenever the blended image 1120 is to be refreshed. The alternative process of FIG. 23 only refreshes the portions of the blended image that need to be refreshed, thereby greatly increasing the writing speed to the screen 1040.
Much of the operation of the process illustrated in FIG. 23 is similar to that described in FIG. 22. An operating system 1172 supports an application program 1174 which, when it wants to open a window, calls a set of routines 1176 including a "New Window" routine 1178 and a Frame Rect routine 1180. The Frame Rect routine 1180 then, as before, attempts to call the Shield Cursor Routine 1182 first and then the Show Cursor Routine 1184. Again, as before, the pointer table is modified such that when the Frame Rect routine tries to call the Shield Cursor Routine 1182, it instead calls the Overlay Shield Cursor Patch 1186 of the present invention, and when the Frame Rect routine 1180 attempts to call the Show Cursor Routine 1184, it instead calls the Overlay Show Cursor Patch 1188. The Overlay Show Cursor Patch calls a Blending Engine 1190 which blends a partial base image 1192 with an overlay image 1194 to create a blended image 1196.
The system 1172, as part of its functioning, will make periodic calls to various system task processes 1198. The system task 1198 performs such functions as executing "Device Driver Code" and "Desk Accessory Code." The process of the present invention opportunistically takes advantage of these periodic system task calls by modifying a pointer table 1200 to turn over process control to an Overlay System Task Patch 1202. This Overlay System Task Patch, along with the Overlay Shield Cursor Patch 1186, the Overlay Show Cursor Patch 1188, and the Blending Engine 1190, comprises the overlay utility 133 of FIGS. 6A and 6B in this second preferred embodiment.
FIG. 24 is used to illustrate the operation of the Blending Engine 1190 of FIG. 23 in greater detail. The process 138 of FIG. 6B remaps certain pages of VRAM to the RAM screen buffer when an overlay image contains objects that overlap these pages. The RAM overlay screen buffer 1194 is then merged with the RAM screen buffer 1192 in the Blending Engine 1190 by a process similar to that previously described, and the blended image is inserted into a "hole" 1204 of the VRAM screen buffer 1196. The portions 1206 and 1208 of the VRAM screen buffer remain in the VRAM since the overlay image of the present invention does not overlap pages comprising these portions of the screen.
Since portions 1206 and 1208 are pages of VRAM screen buffer memory which are not overlapped, at least in part, by an overlay image of the present invention, these portions 1206 and 1208 can remain in the VRAM screen buffer. The VRAM screen buffer is much faster memory for video purposes than the RAM screen buffer 1192. Also, changes made to the RAM screen buffer 1192 or to the RAM overlay screen buffer 1194 that do not cause a change in portions 1206 and 1208 do not require that the system blend the portions 1206 and 1208. The combination of these factors substantially increases the blending speed of the VRAM screen buffer and therefore of the display on screen 1040.
FIGS. 25 and 26 are used to illustrate an alternate embodiment of the present invention wherein the blending of the base image and the overlay image is performed in the video driver hardware rather than within a computer implemented process on the CPU. In FIG. 25, a prior art video driver system of a Macintosh computer system is illustrated. In this prior art example, the video driver circuit 1302 is coupled to an address bus 1304 and a data bus 1306 connected to a Motorola 68030 microprocessor. The video driver circuit 1302 includes a color screen controller CSC 1307, and two banks of VRAM 1308 and 1310. The CSC 1307 produces LCD control and data on a bus 1312 which control a black and white or color liquid crystal display (LCD). For example, the video driver circuit 1302 can drive an Esher LCD circuit for a 640 by 400 bit display, with eight bits of information per pixel.
In FIG. 26, a modified video driver circuit 1302' is coupled to the same Motorola 68030 address bus 1304 and data bus 1306, and includes the same CSC 1307, VRAM 1308, and VRAM 1310. However, the data and address connections have been modified as indicated. In this implementation, data from the screen buffer and the overlay screen buffer are input into the VRAM of the modified video driver circuit 1302', and combined therein to provide LCD control and blended data on the bus 1312. Again, the video driver circuit 1302' can control a black and white or color LCD, except this time, instead of having eight bits per pixel, there are four bits allocated to the base image and four bits allocated to the overlay image. A color look-up table (CLUT), not shown, of the CSC 1307 is loaded with 256 entries which detail each possible combination of bits from the 4-bit screen and the 4-bit overlay, and what the resultant blended value is. The color capability of the CSC 1307 is therefore no longer used for color look-up, and is instead used for the blending values. This technique makes it possible to use off-the-shelf integrated circuits, such as the CSC 1307 which is available from Chips & Technologies, Inc. of San Jose, Calif., to perform an entirely new operation.
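Building such a table can be sketched in a few lines of C. The index packs the four base bits and four overlay bits of one VRAM byte; the 50/50 weighting and the 4-bit gray interpretation are assumptions, since the patent requires only that every combination map to some blended value.

#include <stdint.h>

/* Fill a 256-entry blend table indexed by (base << 4) | overlay, where base and
   overlay are 4-bit values taken from one 8-bit VRAM byte. */
static void build_blend_clut(uint8_t clut[256])
{
    for (int base = 0; base < 16; base++) {
        for (int overlay = 0; overlay < 16; overlay++) {
            clut[(base << 4) | overlay] = (uint8_t)((base + overlay) / 2);   /* assumed 50/50 blend */
        }
    }
}

/* With the table loaded into the controller, each displayed pixel is then produced
   by a single lookup: displayed = clut[vram_byte]. */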
In summary, the method of the invention includes establishing translucent images on a display screen, including displaying a translucent image and conducting image operations enabled by the translucent image. Image operations can be any kind of operation conducted on an image or window. Drawing an image, placing an image, or for that matter modifying, moving, expanding, or changing an image or a window, are considered to be image operations. A reference image could be provided by a selected first application program. The translucent image could be produced by a selected second application program. The user is thus enabled to make sketches on the translucent image or window based upon what he or she sees on the base image produced by the first application program. This is made possible without any direct intervention in the operations of the first application program. In short, the features of the first application program are advantageously employed without any modification of the first application program itself. The technical enablement of this cooperative screen is found in a feature of the invention according to which the second application program intercepts certain screen inputs of the first application program and uses them to supply the screen input needed by the second application program.
The image operations enabled by the concurrent interoperability of the two applications can be implemented by user selected intervention at any of a number of screen operational levels. The base image or window is considered to operate at a lower level, or below the level of the translucent image or window. Thus, the translucent image or window is known as the "overlay" image or window. Typically, the cursor is active at the particular level at which the user can operate. In any case, according to the invention, it may be useful to operate at either the base level, i.e., the level of the base image or window, or at the translucent overlay level. In other words, user input is permitted at either the base image or the translucent image. By a particular user input with respect to an image, the user invokes a selected computer implemented process, and that process receives the screen inputs which contact or are otherwise associated with a particular window and is effective for processing those screen inputs. These various inputs are controllable selectively by the user, in that users can take specific actions to determine which of the levels will be active for them. This can, for example, be accomplished by clicking, by activating a pen or stylus, or by another well known action users are capable of actuating. A particular window just opened is automatically active, as the newest window created or activated. Another window or image can be activated merely by user selection, that is, by positioning the cursor over the window or image and clicking the mouse, trackball, or another applicable interface device.
While this invention has been described in terms of several preferred embodiments, it is contemplated that many alterations, permutations, and equivalents will be apparent to those skilled in the art. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.

Families Citing this family (212)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US6357047B1 (en)1997-06-302002-03-12Avid Technology, Inc.Media pipeline with multichannel video processing and playback
JPH07175458A (en)*1993-10-121995-07-14Internatl Business Mach Corp <Ibm>Method and system for reduction of looking-on of data on screen
US5502504A (en)*1994-04-281996-03-26Prevue Networks, Inc.Video mix program guide
US5831615A (en)*1994-09-301998-11-03Intel CorporationMethod and apparatus for redrawing transparent windows
US5546518A (en)*1995-01-061996-08-13Microsoft CorporationSystem and method for composing a display frame of multiple layered graphic sprites
USD416241S (en)1995-05-051999-11-09Apple Computer, Inc.Computer display screen with scroll bars
USD423485S (en)*1995-05-052000-04-25Apple Computer, Inc.Computer display screen with a computer generated menu design
JP3355596B2 (en)*1995-06-062002-12-09インターナショナル・ビジネス・マシーンズ・コーポレーション Graphics device and display method
US5778404A (en)*1995-08-071998-07-07Apple Computer, Inc.String inserter for pen-based computer systems and method for providing same
JP3141737B2 (en)*1995-08-102001-03-05株式会社セガ Virtual image generation apparatus and method
JP2817687B2 (en)*1995-12-281998-10-30富士ゼロックス株式会社 Image forming device
IL126142A0 (en)*1996-03-151999-05-09Zapa Digital Art LtdProgrammable computer graphic objects
US6317128B1 (en)*1996-04-182001-11-13Silicon Graphics, Inc.Graphical user interface with anti-interference outlines for enhanced variably-transparent applications
JP2768350B2 (en)*1996-05-131998-06-25日本電気株式会社 Bitmap data bit building method and graphics controller
US6252595B1 (en)*1996-06-162001-06-26Ati Technologies Inc.Method and apparatus for a multi-state window
US5883670A (en)1996-08-021999-03-16Avid Technology, Inc.Motion video processing circuit for capture playback and manipulation of digital motion video information on a computer
US6604242B1 (en)*1998-05-182003-08-05Liberate TechnologiesCombining television broadcast and personalized/interactive information
US6275236B1 (en)*1997-01-242001-08-14Compaq Computer CorporationSystem and method for displaying tracked objects on a display device
US6246407B1 (en)*1997-06-162001-06-12Ati Technologies, Inc.Method and apparatus for overlaying a window with a multi-state window
US6105083A (en)1997-06-202000-08-15Avid Technology, Inc.Apparatus and method for controlling transfer of data between and processing of data by interconnected data processing elements
US5973734A (en)1997-07-091999-10-26Flashpoint Technology, Inc.Method and apparatus for correcting aspect ratio in a camera graphical user interface
JP2001508986A (en)*1997-09-302001-07-03コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Image mixing method and display device
US6002397A (en)*1997-09-301999-12-14International Business Machines CorporationWindow hatches in graphical user interface
US6269196B1 (en)1998-01-162001-07-31Adobe Systems IncorporatedImage blending with interpolated transfer modes including a normal transfer mode
US6504575B1 (en)*1998-02-272003-01-07Flashpoint Technology, Inc.Method and system for displaying overlay bars in a digital imaging device
US6130665A (en)*1998-04-012000-10-10Telefonaktiebolaget Lm EricssonTouch screen handling
JP3300280B2 (en)*1998-04-232002-07-08インターナショナル・ビジネス・マシーンズ・コーポレーション Image synthesis processing apparatus and method
USD427576S (en)*1998-05-012000-07-04Apple Computer, Inc.Menu design for a computer display screen
USD431038S (en)*1998-05-042000-09-19Apple Computer, Inc.Window for a computer display screen
USD426207S (en)*1998-05-072000-06-06Apple Computer, Inc.Window for a computer display screen
USD428398S (en)*1998-05-072000-07-18Apple Computer, Inc.Menu design for a computer display screen
US6493575B1 (en)1998-06-042002-12-10Randy J. KestenFluoroscopic tracking enhanced intraventricular catheter system
US6912311B2 (en)*1998-06-302005-06-28Flashpoint Technology, Inc.Creation and use of complex image templates
US6731295B1 (en)1998-11-092004-05-04Broadcom CorporationGraphics display system with window descriptors
US7446774B1 (en)*1998-11-092008-11-04Broadcom CorporationVideo and graphics system with an integrated system bridge controller
US6636222B1 (en)1999-11-092003-10-21Broadcom CorporationVideo and graphics system with an MPEG video decoder for concurrent multi-row decoding
US6661422B1 (en)1998-11-092003-12-09Broadcom CorporationVideo and graphics system with MPEG specific data transfer commands
US6573905B1 (en)1999-11-092003-06-03Broadcom CorporationVideo and graphics system with parallel processing of graphics windows
US6798420B1 (en)1998-11-092004-09-28Broadcom CorporationVideo and graphics system with a single-port RAM
US6853385B1 (en)1999-11-092005-02-08Broadcom CorporationVideo, audio and graphics decode, composite and display system
US7982740B2 (en)1998-11-092011-07-19Broadcom CorporationLow resolution graphics mode support using window descriptors
US6768774B1 (en)1998-11-092004-07-27Broadcom CorporationVideo and graphics system with video scaling
US6317141B1 (en)1998-12-312001-11-13Flashpoint Technology, Inc.Method and apparatus for editing heterogeneous media objects in a digital imaging device
JP2000207092A (en)*1999-01-192000-07-28Internatl Business Mach Corp <Ibm>Method and device for preventing misoperation and storage medium with software product for misoperation prevention stored therein
US6433798B1 (en)*1999-04-302002-08-13Sun Microsystems, Inc.Stationary scrolling for user interfaces
WO2000077974A1 (en)1999-06-112000-12-21Liberate TechnologiesHierarchical open security information delegation and acquisition
CA2419624A1 (en)1999-08-012001-02-08Deep Video Imaging LimitedInteractive three dimensional display with layered screens
US7882426B1 (en)*1999-08-092011-02-01Cognex CorporationConditional cell execution in electronic spreadsheets
EP1212745B1 (en)*1999-08-192006-02-08PureDepth LimitedControl of depth movement for visual display with layered screens
US7624339B1 (en)1999-08-192009-11-24Puredepth LimitedData display for multiple layered screens
AU769103B2 (en)*1999-08-192004-01-15Pure Depth LimitedDisplay method for multiple layered screens
US6975324B1 (en)1999-11-092005-12-13Broadcom CorporationVideo and graphics system with a video transport processor
US8913667B2 (en)*1999-11-092014-12-16Broadcom CorporationVideo decoding system having a programmable variable-length decoder
US6538656B1 (en)1999-11-092003-03-25Broadcom CorporationVideo and graphics system with a data transport processor
US9668011B2 (en)*2001-02-052017-05-30Avago Technologies General Ip (Singapore) Pte. Ltd.Single chip set-top box system
US6670970B1 (en)1999-12-202003-12-30Apple Computer, Inc.Graduated visual and manipulative translucency for windows
US6806892B1 (en)1999-12-202004-10-19International Business Machines CorporationLayer viewport for enhanced viewing in layered drawings
JP2001184842A (en)*1999-12-282001-07-06Hitachi Ltd Information playback device
USD757052S1 (en)2000-01-042016-05-24Apple Inc.Computer display screen with graphical user interface
US6466226B1 (en)*2000-01-102002-10-15Intel CorporationMethod and apparatus for pixel filtering using shared filter resource between overlay and texture mapping engines
US7148898B1 (en)*2000-03-292006-12-12Sourceprose CorporationSystem and method for synchronizing raster and vector map images
US7190377B2 (en)*2000-03-292007-03-13Sourceprose CorporationSystem and method for georeferencing digital raster maps with resistance to potential errors
TWI282956B (en)*2000-05-092007-06-21Sharp KkData signal line drive circuit, and image display device incorporating the same
US6992990B2 (en)*2000-07-172006-01-31Sony CorporationRadio communication apparatus
JP4543513B2 (en)*2000-07-172010-09-15ソニー株式会社 Bidirectional communication system, display device, base device, and bidirectional communication method
JP4501243B2 (en)*2000-07-242010-07-14ソニー株式会社 Television receiver and program execution method
US6954615B2 (en)*2000-07-252005-10-11Sony CorporationDisplay terminal
JP2002064398A (en)*2000-08-212002-02-28Sony CorpWireless transmitter
US6714218B1 (en)*2000-09-052004-03-30Intel CorporationScaling images
JP4881503B2 (en)2000-09-192012-02-22ソニー株式会社 Command processing method and wireless communication device
JP2002111686A (en)*2000-10-042002-04-12Sony CorpCommunication method and communication device
JP4572461B2 (en)2000-10-102010-11-04ソニー株式会社 Terminal device setting method
US6501464B1 (en)*2000-10-312002-12-31Intel CorporationOn-screen transparent keyboard interface
US20020073123A1 (en)*2000-12-082002-06-13Wen-Sung TsaiMethod for displaying overlapping documents in a computer environment
DE20101768U1 (en)*2001-01-312002-03-14Siemens Ag Display and operating device, in particular touch panel
US7030861B1 (en)2001-02-102006-04-18Wayne Carl WestermanSystem and method for packing multi-touch gestures onto a hand
US7343415B2 (en)*2001-03-292008-03-113M Innovative Properties CompanyDisplay of software notes indicating that content from a content provider site is available for display
US20020143900A1 (en)*2001-03-292002-10-03Kenner Martin A.Content recipient access to software notes posted at content provider site
NZ511444A (en)2001-05-012004-01-30Deep Video Imaging LtdInformation display
JP3812368B2 (en)*2001-06-062006-08-23豊田合成株式会社 Group III nitride compound semiconductor device and method for manufacturing the same
US20030001899A1 (en)*2001-06-292003-01-02Nokia CorporationSemi-transparent handwriting recognition UI
JP4250884B2 (en)*2001-09-052009-04-08パナソニック株式会社 Electronic blackboard system
GB2379549A (en)*2001-09-062003-03-12Sharp KkActive matrix display
NZ514119A (en)*2001-09-112004-06-25Deep Video Imaging LtdImprovement to instrumentation
JP3901484B2 (en)*2001-10-052007-04-04株式会社ジェイテクト Electric power steering device
FR2831978B1 (en)*2001-11-072004-08-20Neopost Ind POSTAL PRODUCT STATISTICAL MONITORING SYSTEM
US7921284B1 (en)2001-12-122011-04-05Gary Mark KinghornMethod and system for protecting electronic data in enterprise environment
US7565683B1 (en)2001-12-122009-07-21Weiqing HuangMethod and system for implementing changes to security policies in a distributed security system
US7178033B1 (en)2001-12-122007-02-13Pss Systems, Inc.Method and apparatus for securing digital assets
US10033700B2 (en)2001-12-122018-07-24Intellectual Ventures I LlcDynamic evaluation of access rights
US7921450B1 (en)2001-12-122011-04-05Klimenty VainsteinSecurity system using indirect key generation from access rules and methods therefor
US8006280B1 (en)2001-12-122011-08-23Hildebrand Hal SSecurity system for generating keys from access rules in a decentralized manner and methods therefor
US8065713B1 (en)2001-12-122011-11-22Klimenty VainsteinSystem and method for providing multi-location access management to secured items
US10360545B2 (en)2001-12-122019-07-23Guardian Data Storage, LlcMethod and apparatus for accessing secured electronic data off-line
US7380120B1 (en)2001-12-122008-05-27Guardian Data Storage, LlcSecured data format for access control
US7921288B1 (en)2001-12-122011-04-05Hildebrand Hal SSystem and method for providing different levels of key security for controlling access to secured items
US7930756B1 (en)2001-12-122011-04-19Crocker Steven ToyeMulti-level cryptographic transformations for securing digital assets
US7260555B2 (en)2001-12-122007-08-21Guardian Data Storage, LlcMethod and architecture for providing pervasive security to digital assets
US7950066B1 (en)2001-12-212011-05-24Guardian Data Storage, LlcMethod and system for restricting use of a clipboard application
US6784905B2 (en)2002-01-222004-08-31International Business Machines CorporationApplying translucent filters according to visual disability needs
US6876369B2 (en)*2002-01-222005-04-05International Business Machines Corp.Applying translucent filters according to visual disability needs in a network environment
US8176334B2 (en)2002-09-302012-05-08Guardian Data Storage, LlcDocument security system that permits external users to gain access to secured files
US7487444B2 (en)2002-03-192009-02-03Aol LlcReformatting columns of content for display
US7096432B2 (en)*2002-05-142006-08-22Microsoft CorporationWrite anywhere tool
US20040001101A1 (en)*2002-06-272004-01-01Koninklijke Philips Electronics N.V.Active window switcher
US7355609B1 (en)2002-08-062008-04-08Apple Inc.Computing visible regions for a hierarchical view
NZ521505A (en)*2002-09-202005-05-27Deep Video Imaging LtdMulti-view display
US7667710B2 (en)*2003-04-252010-02-23Broadcom CorporationGraphics display system with line buffer control scheme
NZ525956A (en)*2003-05-162005-10-28Deep Video Imaging LtdDisplay control system for use with multi-layer displays
US7681112B1 (en)2003-05-302010-03-16Adobe Systems IncorporatedEmbedded reuse meta information
US8707034B1 (en)2003-05-302014-04-22Intellectual Ventures I LlcMethod and system for using remote headers to secure electronic files
US20040261039A1 (en)*2003-06-192004-12-23International Business Machines CorporationMethod and system for ordering on-screen windows for display
US7092693B2 (en)2003-08-292006-08-15Sony CorporationUltra-wide band wireless / power-line communication system for delivering audio/video content
US9024884B2 (en)2003-09-022015-05-05Apple Inc.Touch-sensitive electronic apparatus for media applications, and methods therefor
JP4306390B2 (en)*2003-09-292009-07-29日本電気株式会社 Password authentication apparatus, method and program
JP2005107780A (en)*2003-09-302005-04-21Sony CorpImage blending method and blended image data generation device
US8127366B2 (en)2003-09-302012-02-28Guardian Data Storage, LlcMethod and apparatus for transitioning between states of security policies used to secure electronic documents
US7703140B2 (en)2003-09-302010-04-20Guardian Data Storage, LlcMethod and system for securing digital assets using process-driven security policies
US8063916B2 (en)2003-10-222011-11-22Broadcom CorporationGraphics layer reduction for video composition
WO2005041048A1 (en)*2003-10-292005-05-06Matsushita Electric Industrial Co.,Ltd.Electronic document reading system
US7675528B2 (en)*2003-11-142010-03-09Vistaprint Technologies LimitedImage cropping system and method
KR101007798B1 (en)*2003-12-082011-01-14LG Electronics Inc. Scaling Method for Partial Area of Main Image of Digital Broadcast Receiver
KR100617810B1 (en)*2004-03-032006-08-28Samsung Electronics Co., Ltd. Data display device and method
US7644369B2 (en)*2004-03-192010-01-05Rocket Software, Inc.Controlling display screen legibility
US20050210400A1 (en)*2004-03-192005-09-22Peter Hoe-RichardsonControlling display screen legibility
US7834819B2 (en)2004-04-012010-11-16Polyvision CorporationVirtual flip chart method and apparatus
US7948448B2 (en)2004-04-012011-05-24Polyvision CorporationPortable presentation system and methods for use therewith
US7627757B2 (en)*2004-04-302009-12-01Research In Motion LimitedMessage service indication system and method
JP3949674B2 (en)*2004-05-112007-07-25Konami Digital Entertainment Co., Ltd. Display device, display method, and program
US7515135B2 (en)*2004-06-152009-04-07Research In Motion LimitedVirtual keypad for touchscreen display
US7546543B2 (en)2004-06-252009-06-09Apple Inc.Widget authoring and editing environment
US8302020B2 (en)2004-06-252012-10-30Apple Inc.Widget authoring and editing environment
US8566732B2 (en)2004-06-252013-10-22Apple Inc.Synchronization of widgets and dashboards
US8239749B2 (en)2004-06-252012-08-07Apple Inc.Procedurally expressing graphic objects for web pages
US7761800B2 (en)*2004-06-252010-07-20Apple Inc.Unified interest layer for user interface
US20060007178A1 (en)*2004-07-072006-01-12Scott DavisElectronic device having an improved user interface
US7429993B2 (en)*2004-09-172008-09-30Microsoft CorporationMethod and system for presenting functionally-transparent, unobtrusive on-screen windows
US7701460B2 (en)*2004-11-152010-04-20Hewlett-Packard Development Company, L.P.Graphics systems and methods
KR100580264B1 (en)*2004-12-092006-05-16Samsung Electronics Co., Ltd. Automatic image processing method and apparatus
US20060125846A1 (en)*2004-12-102006-06-15Springer Gregory TVirtual overlay for computer device displays
US20060150104A1 (en)*2004-12-312006-07-06Luigi LiraDisplay of user selected digital artworks as embellishments of a graphical user interface
US8140975B2 (en)*2005-01-072012-03-20Apple Inc.Slide show navigation
US8341541B2 (en)*2005-01-182012-12-25Microsoft CorporationSystem and method for visually browsing of open windows
US7747965B2 (en)*2005-01-182010-06-29Microsoft CorporationSystem and method for controlling the opacity of multiple windows while browsing
US7426697B2 (en)*2005-01-182008-09-16Microsoft CorporationMulti-application tabbing system
US8219907B2 (en)*2005-03-082012-07-10Microsoft CorporationResource authoring with re-usability score and suggested re-usable data
US20060256090A1 (en)*2005-05-122006-11-16Apple Computer, Inc.Mechanical overlay
US7489320B2 (en)*2005-05-132009-02-10Seiko Epson CorporationSystem and method for conserving memory bandwidth while supporting multiple sprites
US8543931B2 (en)2005-06-072013-09-24Apple Inc.Preview including theme based installation of user interface elements in a display environment
US7890881B1 (en)*2005-07-292011-02-15Adobe Systems IncorporatedSystems and methods for a fold preview
US7647335B1 (en)2005-08-302010-01-12ATA SpA - Advanced Technology AssessmentComputing system and methods for distributed generation and storage of complex relational data
EP1938177A4 (en)*2005-10-152013-01-09Nokia Corp ENHANCED TEXT ENTRY IN ELECTRONIC DEVICES
US7954064B2 (en)2005-10-272011-05-31Apple Inc.Multiple dashboards
US7822596B2 (en)*2005-12-052010-10-26Microsoft CorporationFlexible display translation
US20070139430A1 (en)*2005-12-212007-06-21Microsoft CorporationRendering "gadgets" with a browser
US7876333B2 (en)*2006-03-302011-01-25Smart Technologies UlcMethod and graphical interface for embedding animated content into a computer application
US7620905B2 (en)*2006-04-142009-11-17International Business Machines CorporationSystem and method of windows management
US8015245B2 (en)*2006-04-242011-09-06Microsoft CorporationPersonalized information communications
US8155682B2 (en)*2006-05-052012-04-10Research In Motion LimitedHandheld electronic device including automatic mobile phone number management, and associated method
US9224145B1 (en)2006-08-302015-12-29Qurio Holdings, Inc.Venue based digital rights using capture device with digital watermarking capability
KR100764652B1 (en)*2006-10-252007-10-08Samsung Electronics Co., Ltd. Key input device and method for a terminal having a touch screen
US20080168367A1 (en)*2007-01-072008-07-10Chaudhri Imran ADashboards, Widgets and Devices
KR100881952B1 (en)*2007-01-202009-02-06LG Electronics Inc. Mobile communication terminal having a touch screen and its operation control method
US8667415B2 (en)2007-08-062014-03-04Apple Inc.Web widgets
US8156467B2 (en)*2007-08-272012-04-10Adobe Systems IncorporatedReusing components in a running application
US8176466B2 (en)2007-10-012012-05-08Adobe Systems IncorporatedSystem and method for generating an application fragment
US20090128581A1 (en)*2007-11-202009-05-21Microsoft CorporationCustom transition framework for application state transitions
US9445772B2 (en)*2007-12-312016-09-20St. Jude Medical, Atrial Fibrillation Division, Inc.Reduced radiation fluoroscopic system
US8495487B2 (en)*2009-01-042013-07-23Sandra Lee JeromeWeb-based dealership management system
US20090177538A1 (en)*2008-01-082009-07-09Microsoft CorporationZoomable advertisements with targeted content
CN101493749A (en)*2008-01-212009-07-29Lenovo (Beijing) Co., Ltd.Windows display status regulation method and apparatus
US9619304B2 (en)2008-02-052017-04-11Adobe Systems IncorporatedAutomatic connections between application components
US20100014825A1 (en)*2008-07-182010-01-21Porto Technology, LlcUse of a secondary device to overlay disassociated media elements onto video content
US8656293B1 (en)2008-07-292014-02-18Adobe Systems IncorporatedConfiguring mobile devices
US8384738B2 (en)*2008-09-022013-02-26Hewlett-Packard Development Company, L.P.Compositing windowing system
US20100275126A1 (en)*2009-04-272010-10-28Scott David LinckeAutomatic On-Screen Keyboard
US9703411B2 (en)*2009-04-302017-07-11Synaptics IncorporatedReduction in latency between user input and visual feedback
US8416262B2 (en)2009-09-162013-04-09Research In Motion LimitedMethods and devices for displaying an overlay on a device display screen
US8957920B2 (en)2010-06-252015-02-17Microsoft CorporationAlternative semantics for zoom operations in a zoomable scene
US20120011460A1 (en)*2010-07-122012-01-12Action Star Enterprise Co., Ltd.System capable of simulating variety of output/input devices
US8977984B2 (en)2010-07-282015-03-10Kyocera CorporationMobile electronic device, screen control method and additional display program
KR101044320B1 (en)*2010-10-142011-06-29Neopad Co., Ltd. Method and system for providing background content of virtual key input means
US9710730B2 (en)*2011-02-112017-07-18Microsoft Technology Licensing, LlcImage registration
JP5325248B2 (en)*2011-03-182013-10-23Square Enix Co., Ltd. Video game processing apparatus and video game processing program
KR20120117578A (en)*2011-04-152012-10-24Samsung Electronics Co., Ltd.Displaying method and display apparatus for applying the same
US9472018B2 (en)*2011-05-192016-10-18Arm LimitedGraphics processing systems
US9043715B2 (en)*2011-06-022015-05-26International Business Machines CorporationAlert event notification
JP2012253543A (en)*2011-06-022012-12-20Seiko Epson CorpDisplay device, control method of display device, and program
US20130298071A1 (en)*2012-05-022013-11-07Jonathan WINEFinger text-entry overlay
USD727342S1 (en)*2012-06-052015-04-21P&W Solutions Co., Ltd.Display screen with graphical user interface
GB201212878D0 (en)*2012-07-202012-09-05Pike JustinAuthentication method and system
CN105938430B (en)*2012-07-312019-08-23Beijing Qihoo Technology Co., Ltd.A kind of device for displaying information and method
JP5522755B2 (en)*2012-09-142014-06-18NEC System Technologies, Ltd. INPUT DISPLAY CONTROL DEVICE, THIN CLIENT SYSTEM, INPUT DISPLAY CONTROL METHOD, AND PROGRAM
EP2747071A1 (en)2012-12-212014-06-25Deutsche Telekom AGDisplay of a tamper-resistant identity indicator
JP6271125B2 (en)*2012-12-272018-01-31Toshiba Corporation Electronic device, display method, and program
US20140195943A1 (en)*2013-01-042014-07-10Patent Category Corp.User interface controls for portable devices
US9881592B2 (en)*2013-10-082018-01-30Nvidia CorporationHardware overlay assignment
CN103713832B (en)*2013-12-232017-06-27Lenovo (Beijing) Co., Ltd.A kind of display processing method and electronic equipment
US9986225B2 (en)*2014-02-142018-05-29Autodesk, Inc.Techniques for cut-away stereo content in a stereoscopic display
US10534517B2 (en)*2014-04-252020-01-14Rohde & Schwarz Gmbh & Co. KgMeasuring device and method for operating a measuring device using transparent display content
WO2016018274A1 (en)*2014-07-302016-02-04Hewlett-Packard Development Company, L.P.Transparent whiteboard display
US10061509B2 (en)*2014-10-092018-08-28Lenovo (Singapore) Pte. Ltd.Keypad control
US20160246466A1 (en)*2015-02-232016-08-25Nuance Communications, Inc.Transparent full-screen text entry interface
KR20170005602A (en)*2015-07-062017-01-16Samsung Electronics Co., Ltd.Method for providing an integrated Augmented Reality and Virtual Reality and Electronic device using the same
CN106020638A (en)*2016-05-062016-10-12LeTV Holdings (Beijing) Co., Ltd.A display interface switching method, a display interface switching apparatus and a mobile apparatus
US10264213B1 (en)2016-12-152019-04-16Steelcase Inc.Content amplification system and method
US11294530B2 (en)*2017-08-072022-04-05Microsoft Technology Licensing, LlcDisplaying a translucent version of a user interface element
BE1025593B1 (en)*2017-09-292019-04-29Inventrans Bvba METHOD AND DEVICE AND SYSTEM FOR PROVIDING DOUBLE MOUSE SUPPORT
CN114924651A (en)2017-09-292022-08-19Apple Inc. Gaze-Based User Interaction
US10965929B1 (en)*2019-01-042021-03-30Rockwell Collins, Inc.Depth mapping and parallel distortion correction for mixed reality
US11907605B2 (en)2021-05-152024-02-20Apple Inc.Shared-content session user interfaces
US11449188B1 (en)2021-05-152022-09-20Apple Inc.Shared-content session user interfaces

Citations (42)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US4555775A (en)*1982-10-071985-11-26At&T Bell LaboratoriesDynamic generation and overlaying of graphic windows for multiple active program storage areas
US4686522A (en)1985-02-191987-08-11International Business Machines CorporationMethod of editing graphic objects in an interactive draw graphic system using implicit editing actions
EP0280582A2 (en)*1987-02-271988-08-31Axiom Innovation LimitedImprovements in computer graphics systems
US4783648A (en)*1985-07-011988-11-08Hitachi, Ltd.Display control system for multiwindow
US4823281A (en)1985-04-301989-04-18Ibm CorporationColor graphic processor for performing logical operations
US4827253A (en)1987-05-181989-05-02Dubner Computer Systems, Inc.Video compositing using a software linear keyer
US4868765A (en)1986-01-021989-09-19Texas Instruments IncorporatedPorthole window system for computer displays
US4914607A (en)*1986-04-091990-04-03Hitachi, Ltd.Multi-screen display control system and its method
US4954970A (en)1988-04-081990-09-04Walker James TVideo overlay image processing apparatus
US4959803A (en)*1987-06-261990-09-25Sharp Kabushiki KaishaDisplay control system
US4974196A (en)*1987-09-211990-11-27Hitachi, Ltd.Method of processing commands for cataloged procedure in multi-window system
US4992781A (en)1987-07-171991-02-12Sharp Kabushiki KaishaImage synthesizer
US5119476A (en)*1988-06-221992-06-02Bull S.A.Method for generating dialogue-windows visually displayed on a computer-screen and equipment for implementing this method
US5124691A (en)*1988-07-151992-06-23Sharp Kabushiki KaishaPicture information display device
US5185808A (en)1991-06-061993-02-09Eastman Kodak CompanyMethod for merging images
US5265202A (en)1992-08-281993-11-23International Business Machines CorporationMethod and system for accessing visually obscured data in a data processing system
US5283867A (en)1989-06-161994-02-01International Business MachinesDigital image overlay system and method
US5283560A (en)*1991-06-251994-02-01Digital Equipment CorporationComputer system and method for displaying images with superimposed partially transparent menus
US5307452A (en)1990-09-211994-04-26PixarMethod and apparatus for creating, manipulating and displaying images
US5313571A (en)1991-10-171994-05-17Fuji Xerox Co., Ltd.Apparatus for storing and displaying graphs
US5313227A (en)1988-04-151994-05-17International Business Machines CorporationGraphic display system capable of cutting out partial images
US5351067A (en)*1991-07-221994-09-27International Business Machines CorporationMulti-source image real time mixing and anti-aliasing
EP0635779A1 (en)1993-07-211995-01-25Xerox CorporationUser interface having movable sheet with click-through tools
EP0635780A1 (en)1993-07-211995-01-25Xerox CorporationUser interface having click-through tools that can be composed with other tools
US5425141A (en)1991-12-181995-06-13Sun Microsystems, Inc.Managing display windows of inter-related applications using hollowed windows
US5425137A (en)1993-01-261995-06-13Us Jvc CorporationSystem and method for processing images using computer-implemented software objects representing lenses
US5463728A (en)*1993-03-101995-10-31At&T Corp.Electronic circuits for the graphical display of overlapping windows with transparency
US5463726A (en)1990-11-201995-10-31International Business Machines CorporationMethod and apparatus for graphic accessing of multiple software applications
US5467443A (en)1991-09-251995-11-14Macromedia, Inc.System and method for automatically generating derived graphic elements
US5467441A (en)1993-07-211995-11-14Xerox CorporationMethod for operating on objects in a first image using an object-based model data structure to produce a second contextual image having added, replaced or deleted objects
US5469541A (en)1990-05-101995-11-21International Business Machines CorporationWindow specific control of overlay planes in a graphics display system
US5469540A (en)1993-01-271995-11-21Apple Computer, Inc.Method and apparatus for generating and displaying multiple simultaneously-active windows
US5475812A (en)1992-09-111995-12-12International Business Machines CorporationMethod and system for independent control of multiple windows in a graphics display system
US5590265A (en)1992-07-271996-12-31Canon Kabushiki KaishaSystem which can display multiwindows and its window display method
US5596690A (en)1993-07-211997-01-21Xerox CorporationMethod and apparatus for operating on an object-based model data structure to produce a second image in the spatial context of a first image
US5638501A (en)*1993-05-101997-06-10Apple Computer, Inc.Method and apparatus for displaying an overlay image
US5651107A (en)*1992-12-151997-07-22Sun Microsystems, Inc.Method and apparatus for presenting information in a display system using transparent windows
US5652851A (en)1993-07-211997-07-29Xerox CorporationUser interface technique for producing a second image in the spatial context of a first image using a model-based operation
US5729704A (en)1993-07-211998-03-17Xerox CorporationUser-directed method for operating on an object-based model data structure through a second contextual image
US5798752A (en)1993-07-211998-08-25Xerox CorporationUser interface having simultaneously movable tools and cursor
US5818455A (en)1993-07-211998-10-06Xerox CorporationMethod and apparatus for operating on the model data structure of an image to produce human perceptible output using a viewing operation region having explicit multiple regions
US5831615A (en)*1994-09-301998-11-03Intel CorporationMethod and apparatus for redrawing transparent windows

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
JPH02114319A (en)1988-10-251990-04-26Fujitsu Ltd Window system window display method
US5157384A (en)*1989-04-281992-10-20International Business Machines CorporationAdvanced user interface
US5252951A (en)*1989-04-281993-10-12International Business Machines CorporationGraphical user interface with gesture recognition in a multiapplication environment
JPH03288891A (en)1990-04-051991-12-19Fujitsu LtdWindow display control system for multiwindow system
US5581243A (en)*1990-06-041996-12-03Microslate Inc.Method and apparatus for displaying simulated keyboards on touch-sensitive displays
US5260697A (en)1990-11-131993-11-09Wang Laboratories, Inc.Computer with separate display plane and user interface processor
US5491495A (en)1990-11-131996-02-13Wang Laboratories, Inc.User interface having simulated devices
US5333255A (en)1991-01-031994-07-26Xerox CorporationApparatus for displaying a plurality of two dimensional display regions on a display
US5233686A (en)1991-09-241993-08-03Ceridian CorporationOpen systems software backplane architecture for federated execution of independent application programs
TW241196B (en)*1993-01-151995-02-21Du Pont
US6072489A (en)1993-05-102000-06-06Apple Computer, Inc.Method and apparatus for providing translucent images on a computer display
US5398309A (en)*1993-05-171995-03-14Intel CorporationMethod and apparatus for generating composite images using multiple local masks
US5684939A (en)*1993-07-091997-11-04Silicon Graphics, Inc.Antialiased imaging with improved pixel supersampling
US5367453A (en)*1993-08-021994-11-22Apple Computer, Inc.Method and apparatus for correcting words
US5528738A (en)*1993-10-061996-06-18Silicon Graphics, Inc.Method and apparatus for antialiasing raster scanned, polygonal shaped images
US7505046B1 (en)*2000-05-022009-03-17Adobe Systems IncorporatedPreserving opaque-like rendering in transparent 2D graphics using knockout groups

Patent Citations (46)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US4555775A (en)*1982-10-071985-11-26At&T Bell LaboratoriesDynamic generation and overlaying of graphic windows for multiple active program storage areas
US4555775B1 (en)*1982-10-071995-12-05Bell Telephone Labor IncDynamic generation and overlaying of graphic windows for multiple active program storage areas
US4686522A (en)1985-02-191987-08-11International Business Machines CorporationMethod of editing graphic objects in an interactive draw graphic system using implicit editing actions
US4823281A (en)1985-04-301989-04-18Ibm CorporationColor graphic processor for performing logical operations
US4783648A (en)*1985-07-011988-11-08Hitachi, Ltd.Display control system for multiwindow
US4868765A (en)1986-01-021989-09-19Texas Instruments IncorporatedPorthole window system for computer displays
US4914607A (en)*1986-04-091990-04-03Hitachi, Ltd.Multi-screen display control system and its method
EP0280582A2 (en)*1987-02-271988-08-31Axiom Innovation LimitedImprovements in computer graphics systems
US4827253A (en)1987-05-181989-05-02Dubner Computer Systems, Inc.Video compositing using a software linear keyer
US4959803A (en)*1987-06-261990-09-25Sharp Kabushiki KaishaDisplay control system
US4992781A (en)1987-07-171991-02-12Sharp Kabushiki KaishaImage synthesizer
US4974196A (en)*1987-09-211990-11-27Hitachi, Ltd.Method of processing commands for cataloged procedure in multi-window system
US4954970A (en)1988-04-081990-09-04Walker James TVideo overlay image processing apparatus
US5313227A (en)1988-04-151994-05-17International Business Machines CorporationGraphic display system capable of cutting out partial images
US5119476A (en)*1988-06-221992-06-02Bull S.A.Method for generating dialogue-windows visually displayed on a computer-screen and equipment for implementing this method
US5124691A (en)*1988-07-151992-06-23Sharp Kabushiki KaishaPicture information display device
US5283867A (en)1989-06-161994-02-01International Business MachinesDigital image overlay system and method
US5469541A (en)1990-05-101995-11-21International Business Machines CorporationWindow specific control of overlay planes in a graphics display system
US5307452A (en)1990-09-211994-04-26PixarMethod and apparatus for creating, manipulating and displaying images
US5463726A (en)1990-11-201995-10-31International Business Machines CorporationMethod and apparatus for graphic accessing of multiple software applications
US5185808A (en)1991-06-061993-02-09Eastman Kodak CompanyMethod for merging images
US5283560A (en)*1991-06-251994-02-01Digital Equipment CorporationComputer system and method for displaying images with superimposed partially transparent menus
US5351067A (en)*1991-07-221994-09-27International Business Machines CorporationMulti-source image real time mixing and anti-aliasing
US5467443A (en)1991-09-251995-11-14Macromedia, Inc.System and method for automatically generating derived graphic elements
US5313571A (en)1991-10-171994-05-17Fuji Xerox Co., Ltd.Apparatus for storing and displaying graphs
US5425141A (en)1991-12-181995-06-13Sun Microsystems, Inc.Managing display windows of inter-related applications using hollowed windows
US5590265A (en)1992-07-271996-12-31Canon Kabushiki KaishaSystem which can display multiwindows and its window display method
US5265202A (en)1992-08-281993-11-23International Business Machines CorporationMethod and system for accessing visually obscured data in a data processing system
US5475812A (en)1992-09-111995-12-12International Business Machines CorporationMethod and system for independent control of multiple windows in a graphics display system
US5651107A (en)*1992-12-151997-07-22Sun Microsystems, Inc.Method and apparatus for presenting information in a display system using transparent windows
US5425137A (en)1993-01-261995-06-13Us Jvc CorporationSystem and method for processing images using computer-implemented software objects representing lenses
US5469540A (en)1993-01-271995-11-21Apple Computer, Inc.Method and apparatus for generating and displaying multiple simultaneously-active windows
US5463728A (en)*1993-03-101995-10-31At&T Corp.Electronic circuits for the graphical display of overlapping windows with transparency
US5638501A (en)*1993-05-101997-06-10Apple Computer, Inc.Method and apparatus for displaying an overlay image
US5949432A (en)1993-05-101999-09-07Apple Computer, Inc.Method and apparatus for providing translucent images on a computer display
US5617114A (en)1993-07-211997-04-01Xerox CorporationUser interface having click-through tools that can be composed with other tools
US5596690A (en)1993-07-211997-01-21Xerox CorporationMethod and apparatus for operating on an object-based model data structure to produce a second image in the spatial context of a first image
US5467441A (en)1993-07-211995-11-14Xerox CorporationMethod for operating on objects in a first image using an object-based model data structure to produce a second contextual image having added, replaced or deleted objects
EP0635779A1 (en)1993-07-211995-01-25Xerox CorporationUser interface having movable sheet with click-through tools
US5581670A (en)1993-07-211996-12-03Xerox CorporationUser interface having movable sheet with click-through tools
US5652851A (en)1993-07-211997-07-29Xerox CorporationUser interface technique for producing a second image in the spatial context of a first image using a model-based operation
US5729704A (en)1993-07-211998-03-17Xerox CorporationUser-directed method for operating on an object-based model data structure through a second contextual image
US5798752A (en)1993-07-211998-08-25Xerox CorporationUser interface having simultaneously movable tools and cursor
US5818455A (en)1993-07-211998-10-06Xerox CorporationMethod and apparatus for operating on the model data structure of an image to produce human perceptible output using a viewing operation region having explicit multiple regions
EP0635780A1 (en)1993-07-211995-01-25Xerox CorporationUser interface having click-through tools that can be composed with other tools
US5831615A (en)*1994-09-301998-11-03Intel CorporationMethod and apparatus for redrawing transparent windows

Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
Angel, Edward, Interactive Computer Graphics: A Top-Down Approach with OpenGL, 1997, pp. 57-58, 214-215, 412-414, Addison-Wesley Longman, Inc., Reading, Massachusetts.
Anonymous, Method to Allow Users to Select Transparent Color for Windows, Mar. 1993, Research Disclosure, pp. 1-3.
Bier et al., "Toolglass and Magic Lenses: The See-Through Interface," 1993, Computer Graphics Proceedings, Annual Conference Series, pp. 73-80.
Douglas C. Engelbart and William K. English, "A Research Center for Augmenting Human Intellect," AFIPS Conference Proceedings of the 1968 Fall Joint Computer Conference, Dec. 1968, pp. 395-410, vol. 33, San Francisco, California. Reprinted by Thompson Book Company, Washington D.C.
Foley, J.D., Van Dam, A., Feiner, S.K., Hughes, J.F., Computer Graphics: Principles and Practice, 1990, pp. 754-758, 909-910, Second Edition, Addison-Wesley Publishing Company, Reading, Massachusetts.
Glassner, Andrew S., Editor, Graphics Gems, 1990, pp. 397-399, Academic Press, Inc., San Diego, California.
Hearn, Donald and Baker, M. Pauline, Computer Graphics, 1994, pp. 508-511, Second Edition, Prentice Hall, Inc., Englewood Cliffs, New Jersey.
Hiroshi Ishii and Kazuho Arita, "ClearFace: Translucent Multiuser Interface for TeamWorkStation," in ACM Sigchi Bulletin, Oct. 1991, pp. 67-68, vol. 23, No. 4, ACM, New York, New York.
Hiroshi Ishii and Kazuho Arita, "ClearFace: Translucent Multiuser Interface for TeamWorkStation," Proceedings of ECSCW-91, Sep. 1991, pp. 163-174, Amsterdam, The Netherlands, Editors L. Bannon, M. Robinson and K. Schmidt.
IBM Technical Disclosure Bulletin, "Transparent Window Selection", vol. 30, No. 11, Apr. 1988, pp. 268-270.*
Vince, John, Computer Animation, 1992, pp. 134, 314, Addison-Wesley Publishing Company, Reading, Massachusetts.

Cited By (125)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
USRE45630E1 (en)1993-05-102015-07-28Apple Inc.Method and apparatus for providing translucent images on a computer display
USRE44241E1 (en)1993-05-102013-05-28Apple Inc.Method and apparatus for providing translucent images on a computer display
US9189467B1 (en)2001-11-072015-11-17Apple Inc.Method and apparatus for annotating an electronic document
US9552131B2 (en)2002-07-102017-01-24Apple Inc.Method and apparatus for displaying a window for a user interface
US20070083825A1 (en)*2002-07-102007-04-12Imran ChaudhriMethod and apparatus for displaying a window for a user interface
US20070089066A1 (en)*2002-07-102007-04-19Imran ChaudhriMethod and apparatus for displaying a window for a user interface
US8601384B2 (en)2002-07-102013-12-03Apple Inc.Method and apparatus for displaying a window for a user interface
US8533624B2 (en)2002-07-102013-09-10Apple Inc.Method and apparatus for displaying a window for a user interface
US10365782B2 (en)2002-07-102019-07-30Apple Inc.Method and apparatus for displaying a window for a user interface
US20050216856A1 (en)*2004-03-232005-09-29Matti Michael CSystem and method for displaying information on an interface device
US9753627B2 (en)2004-06-252017-09-05Apple Inc.Visual characteristics of user interface elements in a unified interest layer
US8453065B2 (en)2004-06-252013-05-28Apple Inc.Preview and installation of user interface elements in a display environment
US10489040B2 (en)2004-06-252019-11-26Apple Inc.Visual characteristics of user interface elements in a unified interest layer
US9507503B2 (en)2004-06-252016-11-29Apple Inc.Remote access to layer and user interface elements
US20070106952A1 (en)*2005-06-032007-05-10Apple Computer, Inc.Presenting and managing clipped content
US9098597B2 (en)2005-06-032015-08-04Apple Inc.Presenting and managing clipped content
US20060277460A1 (en)*2005-06-032006-12-07Scott ForstallWebview applications
US9032318B2 (en)2005-10-272015-05-12Apple Inc.Widget security
US9104294B2 (en)*2005-10-272015-08-11Apple Inc.Linked widgets
US20070101291A1 (en)*2005-10-272007-05-03Scott ForstallLinked widgets
US11150781B2 (en)2005-10-272021-10-19Apple Inc.Workflow widgets
US8543824B2 (en)2005-10-272013-09-24Apple Inc.Safe distribution and use of content
US9513930B2 (en)2005-10-272016-12-06Apple Inc.Workflow widgets
US9417888B2 (en)2005-11-182016-08-16Apple Inc.Management of user interface elements in a display environment
US8869027B2 (en)2006-08-042014-10-21Apple Inc.Management and generation of dashboards
US20080163082A1 (en)*2006-12-292008-07-03Nokia CorporationTransparent layer application
US9575655B2 (en)*2006-12-292017-02-21Nokia Technologies OyTransparent layer application
USD700193S1 (en)*2007-03-222014-02-25Fujifilm CorporationElectronic camera
USD681652S1 (en)*2007-03-222013-05-07Fujifilm CorporationElectronic camera
US8839142B2 (en)2007-06-082014-09-16Apple Inc.Desktop system object removal
US9483164B2 (en)2007-07-182016-11-01Apple Inc.User-centric widgets and dashboards
US8954871B2 (en)2007-07-182015-02-10Apple Inc.User-centric widgets and dashboards
US20090228987A1 (en)*2008-03-042009-09-10Microsoft CorporationShield for user interface testing
US8261238B2 (en)*2008-03-042012-09-04Microsoft CorporationShield for user interface testing
US9064104B2 (en)2009-06-182015-06-23Blackberry LimitedGraphical authentication
US10176315B2 (en)2009-06-182019-01-08Blackberry LimitedGraphical authentication
US10325086B2 (en)2009-06-182019-06-18Blackberry LimitedComputing device with graphical authentication interface
US20100322485A1 (en)*2009-06-182010-12-23Research In Motion LimitedGraphical authentication
US9092128B2 (en)2010-05-212015-07-28Apple Inc.Method and apparatus for managing visual information
US9489062B2 (en)2010-09-142016-11-08Google Inc.User interfaces for remote management and control of network-connected thermostats
US9810590B2 (en)2010-09-142017-11-07Google Inc.System and method for integrating sensors in thermostats
US9223323B2 (en)2010-09-142015-12-29Google Inc.User friendly interface for control unit
US9612032B2 (en)2010-09-142017-04-04Google Inc.User friendly interface for control unit
US10124623B2 (en)*2010-09-212018-11-13Harris Research, IncFlexible translucent color matching apparatus
US20120067503A1 (en)*2010-09-212012-03-22Harris Research, Inc.Flexible translucent color matching apparatus
US9658732B2 (en)2010-10-192017-05-23Apple Inc.Changing a virtual workspace based on user interaction with an application window in a user interface
US9542202B2 (en)2010-10-192017-01-10Apple Inc.Displaying and updating workspaces in a user interface
US12182377B2 (en)2010-10-192024-12-31Apple Inc.Updating display of workspaces in a user interface for managing workspaces in response to user input
US9292196B2 (en)2010-10-192016-03-22Apple Inc.Modifying the presentation of clustered application windows in a user interface
US11150780B2 (en)2010-10-192021-10-19Apple Inc.Updating display of workspaces in a user interface for managing workspaces in response to user input
US10740117B2 (en)2010-10-192020-08-11Apple Inc.Grouping windows into clusters in one or more workspaces in a user interface
US9459018B2 (en)2010-11-192016-10-04Google Inc.Systems and methods for energy-efficient control of an energy-consuming system
US9552002B2 (en)2010-11-192017-01-24Google Inc.Graphical user interface for setpoint creation and modification
US10241482B2 (en)2010-11-192019-03-26Google LlcThermostat user interface
US11372433B2 (en)2010-11-192022-06-28Google LlcThermostat user interface
US9104211B2 (en)2010-11-192015-08-11Google Inc.Temperature controller with model-based time to target calculation and display
US10175668B2 (en)2010-11-192019-01-08Google LlcSystems and methods for energy-efficient control of an energy-consuming system
US9127853B2 (en)2010-11-192015-09-08Google Inc.Thermostat with ring-shaped control member
US10082306B2 (en)2010-11-192018-09-25Google LlcTemperature controller with model-based time to target calculation and display
US10078319B2 (en)2010-11-192018-09-18Google LlcHVAC schedule establishment in an intelligent, network-connected thermostat
US9995499B2 (en)2010-11-192018-06-12Google LlcElectronic device controller with user-friendly installation features
US9952573B2 (en)2010-11-192018-04-24Google LlcSystems and methods for a graphical user interface of a controller for an energy-consuming system having spatially related discrete display elements
US10346275B2 (en)2010-11-192019-07-09Google LlcAttributing causation for energy usage and setpoint changes with a network-connected thermostat
US10627791B2 (en)2010-11-192020-04-21Google LlcThermostat user interface
US8706270B2 (en)2010-11-192014-04-22Nest Labs, Inc.Thermostat user interface
US9026232B2 (en)2010-11-192015-05-05Google Inc.Thermostat user interface
US9766606B2 (en)2010-11-192017-09-19Google Inc.Thermostat user interface
US9298196B2 (en)2010-11-192016-03-29Google Inc.Energy efficiency promoting schedule learning algorithms for intelligent thermostat
US10747242B2 (en)2010-11-192020-08-18Google LlcThermostat user interface
US10606724B2 (en)2010-11-192020-03-31Google LlcAttributing causation for energy usage and setpoint changes with a network-connected thermostat
US8727611B2 (en)2010-11-192014-05-20Nest Labs, Inc.System and method for integrating sensors in thermostats
US8918219B2 (en)2010-11-192014-12-23Google Inc.User friendly interface for control unit
US9575496B2 (en)2010-11-192017-02-21Google Inc.HVAC controller with user-friendly installation features with wire insertion detection
US9092039B2 (en)2010-11-192015-07-28Google Inc.HVAC controller with user-friendly installation features with wire insertion detection
US11334034B2 (en)2010-11-192022-05-17Google LlcEnergy efficiency promoting schedule learning algorithms for intelligent thermostat
US9135426B2 (en)2010-12-162015-09-15Blackberry LimitedPassword entry using moving images
US9258123B2 (en)2010-12-162016-02-09Blackberry LimitedMulti-layered color-sensitive passwords
US8661530B2 (en)2010-12-162014-02-25Blackberry LimitedMulti-layer orientation-changing password
US8650624B2 (en)2010-12-162014-02-11Blackberry LimitedObscuring visual login
US10621328B2 (en)2010-12-162020-04-14Blackberry LimitedPassword entry using 3D image with spatial alignment
US8635676B2 (en)2010-12-162014-01-21Blackberry LimitedVisual or touchscreen password entry
US8931083B2 (en)2010-12-162015-01-06Blackberry LimitedMulti-layer multi-point or randomized passwords
US8863271B2 (en)2010-12-162014-10-14Blackberry LimitedPassword entry using 3D image with spatial alignment
US8745694B2 (en)2010-12-162014-06-03Research In Motion LimitedAdjusting the position of an endpoint reference for increasing security during device log-on
US20120159616A1 (en)*2010-12-162012-06-21Research In Motion LimitedPressure sensitive multi-layer passwords
US8769641B2 (en)2010-12-162014-07-01Blackberry LimitedMulti-layer multi-point or pathway-based passwords
US8631487B2 (en)2010-12-162014-01-14Research In Motion LimitedSimple algebraic and multi-layer passwords
US8650635B2 (en)*2010-12-162014-02-11Blackberry LimitedPressure sensitive multi-layer passwords
US9476606B2 (en)2010-12-312016-10-25Google Inc.Dynamic device-associated feedback indicative of responsible device usage
US9732979B2 (en)2010-12-312017-08-15Google Inc.HVAC control system encouraging energy efficient user behaviors in plural interactive contexts
US10443879B2 (en)2010-12-312019-10-15Google LlcHVAC control system encouraging energy efficient user behaviors in plural interactive contexts
US8850348B2 (en)2010-12-312014-09-30Google Inc.Dynamic device-associated feedback indicative of responsible device usage
US10152192B2 (en)2011-02-212018-12-11Apple Inc.Scaling application windows in one or more workspaces in a user interface
US8769668B2 (en)2011-05-092014-07-01Blackberry LimitedTouchscreen password entry
USD665395S1 (en)2011-09-122012-08-14Microsoft CorporationDisplay screen with animated graphical user interface
USD675224S1 (en)2011-09-122013-01-29Microsoft CorporationDisplay screen with animated graphical user interface
USD664967S1 (en)2011-09-122012-08-07Microsoft CorporationDisplay screen with animated graphical user interface
USD664550S1 (en)2011-09-122012-07-31Microsoft CorporationDisplay screen with animated graphical user interface
USD664968S1 (en)2011-09-122012-08-07Microsoft CorporationDisplay screen with animated graphical user interface
US9920946B2 (en)2011-10-072018-03-20Google LlcRemote control of a smart home device
US9453655B2 (en)2011-10-072016-09-27Google Inc.Methods and graphical user interfaces for reporting performance information for an HVAC system controlled by a self-programming network-connected thermostat
US9175871B2 (en)2011-10-072015-11-03Google Inc.Thermostat user interface
US10295974B2 (en)2011-10-072019-05-21Google LlcMethods and graphical user interfaces for reporting performance information for an HVAC system controlled by a self-programming network-connected thermostat
USD701515S1 (en)*2011-10-142014-03-25Nest Labs, Inc.Display screen or portion thereof with a graphical user interface
USD697930S1 (en)*2011-10-142014-01-21Nest Labs, Inc.Display screen or portion thereof with a graphical user interface
USD701869S1 (en)*2011-10-142014-04-01Nest Labs, Inc.Display screen or portion thereof with a graphical user interface
US9291359B2 (en)2011-10-212016-03-22Google Inc.Thermostat user interface
US8998102B2 (en)2011-10-212015-04-07Google Inc.Round thermostat with flanged rotatable user input member and wall-facing optical sensor that senses rotation
US9740385B2 (en)2011-10-212017-08-22Google Inc.User-friendly, network-connected, smart-home controller and related systems and methods
US9720585B2 (en)2011-10-212017-08-01Google Inc.User friendly interface
US10678416B2 (en)2011-10-212020-06-09Google LlcOccupancy-based operating state determinations for sensing or control systems
US9223948B2 (en)2011-11-012015-12-29Blackberry LimitedCombined passcode and activity launch modifier
US20130215088A1 (en)*2012-02-172013-08-22Howon SONElectronic device including flexible display
US9672796B2 (en)*2012-02-172017-06-06Lg Electronics Inc.Electronic device including flexible display
US10145577B2 (en)2012-03-292018-12-04Google LlcUser interfaces for HVAC schedule display and modification on smartphone or other space-limited touchscreen device
US11781770B2 (en)2012-03-292023-10-10Google LlcUser interfaces for schedule display and modification on smartphone or other space-limited touchscreen device
US10443877B2 (en)2012-03-292019-10-15Google LlcProcessing and reporting usage information for an HVAC system controlled by a network-connected thermostat
US9890970B2 (en)2012-03-292018-02-13Google Inc.Processing and reporting usage information for an HVAC system controlled by a network-connected thermostat
US20140085331A1 (en)*2012-09-182014-03-27Harris Research, Inc.Apparatus, method, and program product for selecting a finish
US8949735B2 (en)2012-11-022015-02-03Google Inc.Determining scroll direction intent
US9222693B2 (en)2013-04-262015-12-29Google Inc.Touchscreen device user interface for remote control of a thermostat
US11054165B2 (en)2015-10-122021-07-06Ikorongo Technology, LLCMulti zone, multi dwelling, multi user climate systems
US10288308B2 (en)2015-10-122019-05-14Ikorongo Technology, LLCMethod and system for presenting comparative usage information at a thermostat device
US10288309B2 (en)2015-10-122019-05-14Ikorongo Technology, LLCMethod and system for determining comparative usage information at a server device
US9702582B2 (en)2015-10-122017-07-11Ikorongo Technology, LLCConnected thermostat for controlling a climate system based on a desired usage profile in comparison to other connected thermostats controlling other climate systems

Also Published As

Publication numberPublication date
USRE44241E1 (en)2013-05-28
USRE45630E1 (en)2015-07-28
US5638501A (en)1997-06-10
US5949432A (en)1999-09-07

Similar Documents

PublicationPublication DateTitle
USRE41922E1 (en)Method and apparatus for providing translucent images on a computer display
US6072489A (en)Method and apparatus for providing translucent images on a computer display
US5677710A (en)Recognition keypad
US5559942A (en)Method and apparatus for providing a note for an application program
US9996176B2 (en)Multi-touch uses, gestures, and implementation
US5559948A (en)Apparatus and method for manipulating an object in a computer system graphical user interface
US5603053A (en)System for entering data into an active application currently running in the foreground by selecting an input icon in a palette representing input utility
US5760773A (en)Methods and apparatus for interacting with data objects using action handles
US4868765A (en)Porthole window system for computer displays
US5917493A (en)Method and apparatus for randomly generating information for subsequent correlating
US6903730B2 (en)In-air gestures for electromagnetic coordinate digitizers
US5461710A (en)Method for providing a readily distinguishable template and means of duplication thereof in a computer system graphical user interface
US7319454B2 (en)Two-button mouse input using a stylus
US6160555A (en)Method for providing a cue in a computer system
US7581194B2 (en)Enhanced on-object context menus
JP3486459B2 (en) Electronic information equipment and control method thereof
US6909439B1 (en)Method and apparatus for maximizing efficiency of small display in a data processing system
US20050015731A1 (en)Handling data across different portions or regions of a desktop
US20030217336A1 (en)Overlaying electronic ink
US6141008A (en)Method and system for providing size adjustment for a maximized window in a computer system graphical user interface
US20030128241A1 (en)Information terminal device
JP3388451B2 (en) Handwriting input device
JPH06175775A (en)Information processor
US6304276B1 (en)Data processing device and data processing method
JPS61267128A (en) Display erasing method

Legal Events

DateCodeTitleDescription
ASAssignment

Owner name:APPLE INC., CALIFORNIA

Free format text:CHANGE OF NAME;ASSIGNOR:APPLE COMPUTER, INC.;REEL/FRAME:019265/0961

Effective date:20070109

FEPPFee payment procedure

Free format text:PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text:PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAYFee payment

Year of fee payment:12

RRRequest for reexamination filed

Effective date:20121218

B1Reexamination certificate first reexamination

Free format text:THE PATENTABILITY OF CLAIMS 29 AND 30 IS CONFIRMED. CLAIMS 1-28 WERE PREVIOUSLY CANCELLED. CLAIM 33 IS DETERMINED TO BE PATENTABLE AS AMENDED. CLAIMS 34 AND 35, DEPENDENT ON AN AMENDED CLAIM, ARE DETERMINED TO BE PATENTABLE. NEW CLAIMS 36 AND 37 ARE ADDED AND DETERMINED TO BE PATENTABLE. CLAIMS 31 AND 32 WERE NOT EXAMINED.

RRRequest for reexamination filed

Effective date:20140205

FPB2Reexamination decision cancelled all claims (2nd reexamination)

Kind code of ref document:C2

Free format text:REEXAMINATION CERTIFICATE

Filing date:20140205

Effective date:20190724


