
Designing compound force sensations for computer applications

Info

Publication number
US6292170B1
Authority
US
United States
Prior art keywords
force
sensation
force sensation
compound
user
Prior art date
Legal status
Expired - Lifetime
Application number
US09/270,223
Inventor
Dean C. Chang
Louis B. Rosenberg
Jeffrey R. Mallett
Current Assignee
Immersion Corp
Original Assignee
Immersion Corp
Priority date
Filing date
Publication date
Priority claimed from US08/846,011 (now U.S. Pat. No. 6,147,674)
Priority claimed from US08/877,114 (now U.S. Pat. No. 6,169,540)
Priority claimed from US09/243,209 (now U.S. Pat. No. 6,285,351)
Priority to US09/270,223 (this patent)
Application filed by Immersion Corp
Assigned to IMMERSION CORPORATION; assignors: ROSENBERG, LOUIS B.; CHANG, DEAN C.; MALLETT, JEFFREY R.
Priority to AU38799/00A
Priority to PCT/US2000/006562
Priority to US09/947,213
Publication of US6292170B1
Application granted
Assigned to IMMERSION CORPORATION (DELAWARE CORPORATION) by merger from IMMERSION CORPORATION (CALIFORNIA CORPORATION)
Priority to US11/455,944
Priority to US12/762,791
Anticipated expiration
Legal status: Expired - Lifetime (current)


Abstract

A design interface tool for designing force sensations for use with a host computer and force feedback interface device. A force feedback device is connected to a host computer that displays the interface tool. The user selects a type of force sensation and designs and defines physical characteristics of the selected force sensation using the interface tool. A graphical representation of the characterized force sensation is displayed. The user can include a plurality of force sensations in a compound force sensation, where the compound sensation is graphically displayed to indicate the relative start times and duration of each of the force sensations. The user can also easily adjust the start times and durations of the force sensations using the graphical representation. The force sensations are output to a user manipulandum of a force feedback device to be felt by the user, where the graphical representation is updated in conjunction with the output of the force sensation. The user can iteratively modify force sensation characteristics and feel the results. Sounds can also be graphically represented and synchronized with the designed compound forces.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This application is a continuation-in-part of co-pending parent patent applications Ser. No. 08/846,011, now U.S. Pat. No. 6,147,674, filed Apr. 25, 1997, entitled “Method and Apparatus for Designing and Controlling Force Sensations in Force Feedback Computer Applications”; Ser. No. 08/877,114, now U.S. Pat. No. 6,169,540, filed Jun. 17, 1997, entitled “Method and Apparatus for Designing Force Sensations in Force Feedback Computer Applications”; and Ser. No. 09/243,209, filed Feb. 2, 1999, entitled “Designing Force Sensations for Computer Applications Including Sounds,” all of which are incorporated by reference herein in their entirety.
BACKGROUND OF THE INVENTION
The present invention relates generally to interface devices for allowing humans to interface with computer systems, and more particularly to computer interface devices that provide input from the user to computer systems and implement force feedback to the user.
Using an interface device, a user can interact with an environment displayed by a computer system to perform functions and tasks on the computer, such as playing a game, experiencing a simulation or virtual reality environment, using a computer aided design system, operating a graphical user interface (GUI), or otherwise influencing events or images depicted on the screen. Common human-computer interface devices used for such interaction include a joystick, mouse, trackball, stylus, tablet, pressure-sensitive ball, or the like, that is connected to the computer system controlling the displayed environment. Typically, the computer updates the environment in response to the user's manipulation of a user-manipulatable physical object such as a joystick handle or mouse, and provides visual and audio feedback to the user utilizing the display screen and audio speakers. The computer senses the user's manipulation of the user object through sensors provided on the interface device that send locative signals to the computer.
In some interface devices, haptic feedback, also known as “force feedback,” is also provided to the user. These types of interface devices can provide physical sensations which are felt by the user manipulating a user manipulable object of the interface device. For example, the Force-FX joystick controller from CH Products, Inc. or the Wingman Force joystick from Logitech, Inc. can be connected to a computer and provide forces to a user of the controller. Other systems might use a force feedback mouse controller. One or more motors or other actuators are used in the device and are connected to the controlling computer system. The computer system controls forces on the joystick in conjunction and coordination with displayed events and interactions by sending control signals or commands to the actuators. The computer system can thus convey physical force sensations to the user in conjunction with other supplied feedback as the user is grasping or contacting the joystick or other object of the interface device. For example, when the user moves the manipulatable object and causes a displayed cursor to interact with a different displayed graphical object, the computer can issue a command that causes the actuator to output a force on the user object, conveying a feel sensation to the user.
A problem with the prior art development of force feedback sensations in software is that the programmer of force feedback applications does not have an intuitive sense as to how forces will feel when adjusted in certain ways, and thus must go to great effort to develop characteristics of forces that are desired for a specific application. For example, a programmer may wish to create a specific spring and damping force sensation between two graphical objects, where the force sensation has a particular stiffness, play, offset, etc. In current force feedback systems, the programmer must determine the parameters and characteristics of the desired force by a brute force method, by simply setting parameters, testing the force, and adjusting the parameters in an iterative fashion. This method can be cumbersome because it is often not intuitive how a parameter will affect the feel of a force as it is actually output on the user object; the programmer often may not even be close to the desired force sensation with initial parameter settings. Other types of forces may not be intuitive at all, such as a spring having a negative stiffness, and thus force sensation designers may have a difficult time integrating these types of sensations into software applications.
Furthermore, designers may have a difficult time synchronizing force sensations with each other, especially compound force sensations which include multiple individual force sensations. For example, a particular force sensation such as a collision may be accompanied by another force sensation such as an “earthquake” vibration. It can often be difficult to coordinate multiple simultaneous force sensations, especially when each of the force sensations starts and stops at different times and has a different duration.
SUMMARY OF THE INVENTION
The present invention is directed to designing force sensations output by a force feedback interface device. A controlling host computer provides a design interface tool that allows intuitive and simple design of a variety of force sensations and also allows multiple force sensations to be included in a compound force sensation.
More particularly, a design interface for designing force sensations for use with a force feedback interface device is described. The force sensation design interface is displayed on a display device of a host computer. Input from a user selects one or more force sensations to be commanded by a host computer and output by a force feedback interface device. Each force sensation preferably is provided with its own design parameters and design window, and the user can design and define physical characteristics of the selected force sensations. A graphical representation of the characterized force sensation is displayed on a display device of the host computer. Multiple force sensations can be included in a compound force sensation based on input received from the user, and a time-based graphical representation of the compound force sensation is displayed. For example, the time-based graphical representation can include a bar graph for each of the individual force sensations in the compound sensation indicating a start time and duration of each of the individual force sensations relative to each other. Multiple such compound force sensations can be designed and displayed simultaneously.
Preferably, the compound force sensation is output by the force feedback interface device coupled to the host computer to the manipulandum in conjunction with updating the graphical representation of the compound force sensation. The graphical representation provides the user with a visual demonstration of the individual force sensations included in the compound force sensation. Changes to at least one of the individual force sensations can be made by the user after the compound force sensation is output, and the changes are displayed in the compound graphical representation. The user can also easily adjust the start times and durations of the individual force sensations using the graphical representation. Thus, in an iterative process, the user can design effective compound force sensations through actual experience of those sensations. The design interface can be implemented using program instructions stored on a computer-readable medium. In addition, one or more sounds can be assigned to individual force sensations in the compound force effect or can be assigned to the compound force effect directly. The sounds can preferably be displayed alongside the individual forces in the compound graphical representation.
The present invention advantageously provides a simple, easy-to-use design interface tool for designing force feedback sensations. Given the large variety of possible force sensations and the often unexpected results when modifying the several parameters of a force sensation, the design interface tool of the present invention meets the needs of force sensation designers who wish to create force sensations matched as closely as possible to their requirements. The graphical design interface of the present invention allows a force sensation designer to easily and intuitively design and modify individual force sensations, build compound force sensations that include multiple different individual force sensations, and arrange those individual force sensations to be output at desired times with respect to each other, allowing more effective force feedback to be implemented in games, simulations, graphical interfaces, and other applications.
These and other advantages of the present invention will become apparent to those skilled in the art upon a reading of the following specification of the invention and a study of the several figures of the drawing.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a system for controlling a force feedback interface device suitable for use with the present invention;
FIG. 2 is a perspective view of an embodiment of a mechanism for interfacing a user manipulatable object with the force feedback device of FIG. 1;
FIG. 3 is a diagram of a displayed interface of the present invention for designing force sensations;
FIG. 4 is a diagram of the interface of FIG. 3 in which a design window for a spring condition is displayed;
FIG. 5 is a diagram of the interface of FIG. 3 in which an alternative design window for a spring condition is displayed;
FIG. 6 is a diagram of the interface of FIG. 3 in which a design window for a periodic wave is displayed; and
FIG. 7 is a diagram of an embodiment of an interface of the present invention allowing the design of compound force sensations.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
FIG. 1 is a block diagram illustrating a force feedback interface system 10 for use with the present invention controlled by a host computer system. Interface system 10 includes a host computer system 12 and an interface device 14.
Host computer system 12 is preferably a personal computer, such as an IBM-compatible or Macintosh personal computer, or a workstation, such as a SUN or Silicon Graphics workstation. Alternatively, host computer system 12 can be one of a variety of home video game systems, such as systems available from Nintendo, Sega, or Sony, a television “set top box” or a “network computer”, etc. Host computer system 12 preferably implements a host application program with which a user 22 is interacting via peripherals and interface device 14. For example, the host application program can be a video game, medical simulation, scientific analysis program, operating system, graphical user interface, or other application program that utilizes force feedback. Typically, the host application provides images to be displayed on a display output device, as described below, and/or other feedback, such as auditory signals.
Host computer system 12 preferably includes a host microprocessor 16, random access memory (RAM) 17, read-only memory (ROM) 19, input/output (I/O) electronics 21, a clock 18, a display screen 20, and an audio output device 21. Display screen 20 can be used to display images generated by host computer system 12 or other computer systems, and can be a standard display screen, CRT, flat-panel display, 3-D goggles, or any other visual interface. Audio output device 21, such as speakers, is preferably coupled to host microprocessor 16 via amplifiers, filters, and other circuitry well known to those skilled in the art (e.g. in a sound card) and provides sound output to user 22 from the host computer 12. Other types of peripherals can also be coupled to host processor 16, such as storage devices (hard disk drive, CD-ROM/DVD-ROM drive, floppy disk drive, etc.), printers, and other input and output devices. Data for implementing the interfaces of the present invention can be stored on computer readable media such as memory (RAM or ROM), a hard disk, a CD-ROM or DVD-ROM, etc.
An interface device 14 is coupled to host computer system 12 by a bi-directional bus 24. The bi-directional bus sends signals in either direction between host computer system 12 and the interface device. An interface port of host computer system 12, such as an RS-232 or Universal Serial Bus (USB) serial interface port, parallel port, game port, etc., connects bus 24 to host computer system 12.
Interface device 14 includes a local microprocessor 26, sensors 28, actuators 30, a user object 34, an optional sensor interface 36, an optional actuator interface 38, and other optional input devices 39. Local microprocessor 26 is coupled to bus 24, is considered local to interface device 14, and is dedicated to force feedback and sensor I/O of interface device 14. Microprocessor 26 can be provided with software instructions to wait for commands or requests from host computer 12, decode the command or request, and handle/control input and output signals according to the command or request. In addition, processor 26 preferably operates independently of host computer 12 by reading sensor signals and calculating appropriate forces from those sensor signals, time signals, and stored or relayed instructions selected in accordance with a host command. Suitable microprocessors for use as local microprocessor 26 include the MC68HC711E9 by Motorola, the PIC16C74 by Microchip, and the 82930AX by Intel Corp., for example. Microprocessor 26 can include one microprocessor chip, or multiple processors and/or co-processor chips, and/or digital signal processor (DSP) capability.
Microprocessor 26 can receive signals from sensors 28 and provide signals to actuators 30 of the interface device 14 in accordance with instructions provided by host computer 12 over bus 24. For example, in a preferred local control embodiment, host computer system 12 provides high-level supervisory commands to microprocessor 26 over bus 24, and microprocessor 26 manages low-level force control loops to sensors and actuators in accordance with the high-level commands and independently of the host computer 12. The force feedback system thus provides a host control loop of information and a local control loop of information in a distributed control system. This operation is described in greater detail in U.S. Pat. No. 5,739,811 and patent application Ser. Nos. 08/877,114 and 08/050,665 (which is a continuation of U.S. Pat. No. 5,734,373), all incorporated by reference herein. Microprocessor 26 can also receive commands from any other input devices 39 included on interface apparatus 14, such as buttons, and provides appropriate signals to host computer 12 to indicate that the input has been received, along with any data included in that input. Local memory 27, such as RAM and/or ROM, is preferably coupled to microprocessor 26 in interface device 14 to store instructions for microprocessor 26 and to store temporary and other data. In addition, a local clock 29 can be coupled to the microprocessor 26 to provide timing data.
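For illustration only, the following sketch shows the division of labor the distributed control system implies: the host sends a high-level command once, and the local loop computes forces from sensor readings on its own. The sketch is not from the patent; the command encoding, names, and force laws are all assumptions.

```cpp
#include <cstdint>

// Hypothetical high-level supervisory command sent once by the host.
struct HostCommand {
    uint8_t effectType;   // e.g. 0 = spring, 1 = damper
    int16_t gain;         // overall strength, 0-100
    int16_t param[4];     // type-specific parameters (stiffness, damping, ...)
};

// State maintained locally from sensors 28 and clock 29.
struct DeviceState {
    int32_t position;     // read from sensors 28
    int32_t velocity;     // derived from position history and local clock 29
};

// One iteration of the low-level force-control loop run by microprocessor 26,
// executed repeatedly without further involvement of the host.
int32_t localLoopStep(const HostCommand& cmd, const DeviceState& s) {
    int32_t force = 0;
    switch (cmd.effectType) {
        case 0: force = -cmd.param[0] * s.position; break;  // spring: F = -k*x
        case 1: force = -cmd.param[0] * s.velocity; break;  // damper: F = -b*v
        default: break;
    }
    return (force * cmd.gain) / 100;  // scaled result sent to actuators 30
}
```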
Sensors 28 sense the position, motion, and/or other characteristics of a user object 34 of the interface device 14 along one or more degrees of freedom and provide signals to microprocessor 26 including information representative of those characteristics. Rotary or linear optical encoders, potentiometers, optical sensors, velocity sensors, acceleration sensors, strain gauges, or other types of sensors can be used. Sensors 28 provide an electrical signal to an optional sensor interface 36, which can be used to convert sensor signals to signals that can be interpreted by the microprocessor 26 and/or host computer system 12.
Actuators 30 transmit forces to user object 34 of the interface device 14 in one or more directions along one or more degrees of freedom in response to signals received from microprocessor 26. Actuators 30 can include two types: active actuators and passive actuators. Active actuators include linear current control motors, stepper motors, pneumatic/hydraulic active actuators, torquers (motors with limited angular range), voice coil actuators, and other types of actuators that transmit a force to move an object. Passive actuators can also be used for actuators 30, such as magnetic particle brakes, friction brakes, or pneumatic/hydraulic passive actuators. Actuator interface 38 can optionally be connected between actuators 30 and microprocessor 26 to convert signals from microprocessor 26 into signals appropriate to drive actuators 30.
Other input devices 39 can optionally be included in interface device 14 and send input signals to microprocessor 26 or to host processor 16. Such input devices can include buttons, dials, switches, levers, or other mechanisms. For example, in embodiments where user object 34 is a joystick, other input devices can include one or more buttons provided, for example, on the joystick handle or base. Power supply 40 can optionally be coupled to actuator interface 38 and/or actuators 30 to provide electrical power. A safety switch 41 is optionally included in interface device 14 to provide a mechanism to deactivate actuators 30 for safety reasons.
User manipulable object 34 (“user object” or “manipulandum”) is a physical object, device, or article that may be grasped or otherwise contacted or controlled by a user and which is coupled to interface device 14. By “grasp”, it is meant that users may releasably engage a grip portion of the object in some fashion, such as by hand, with their fingertips, or even orally in the case of handicapped persons. The user 22 can manipulate and move the object along provided degrees of freedom to interface with the host application program the user is viewing on display screen 20. Object 34 can be a joystick handle, mouse, trackball, stylus (e.g. at the end of a linkage), steering wheel, sphere, medical instrument (laparoscope, catheter, etc.), pool cue (e.g. moving the cue through actuated rollers), hand grip, knob, button, or other article.
FIG. 2 is a perspective view of one embodiment of a mechanical apparatus 100 suitable for providing mechanical input and output to host computer system 12. Apparatus 100 is appropriate for a joystick or similar user object 34. Apparatus 100 includes gimbal mechanism 140, sensors 28, and actuators 30. User object 34 is shown in this embodiment as a joystick having a grip portion 162.
Gimbal mechanism 140 provides two rotary degrees of freedom to object 34. A gimbal device as shown in FIG. 2 is described in greater detail in U.S. Pat. No. 5,767,839, incorporated herein by reference in its entirety. Gimbal mechanism 140 provides support for apparatus 100 on grounded surface 142, such as a table top or similar surface. Gimbal mechanism 140 is a five-member linkage that includes a ground member 144, extension members 146a and 146b, and central members 148a and 148b. Gimbal mechanism 140 also includes capstan drive mechanisms 164.
Ground member 144 includes a base member 166 and vertical support members 168. Base member 166 is coupled to grounded surface 142. The members of gimbal mechanism 140 are rotatably coupled to one another through the use of bearings or pivots. Extension member 146a is rigidly coupled to a capstan drum 170 and is rotated about axis A as capstan drum 170 is rotated. Likewise, extension member 146b is rigidly coupled to the other capstan drum 170 and can be rotated about axis B. Central drive member 148a is rotatably coupled to extension member 146a and can rotate about floating axis D, and central link member 148b is rotatably coupled to an end of extension member 146b at a center point P and can rotate about floating axis E. Central drive member 148a and central link member 148b are rotatably coupled to each other at the center of rotation of the gimbal mechanism, which is the point of intersection P of axes A and B. Bearing 172 connects the two central members 148a and 148b together at the intersection point P. Sensors 28 and actuators 30 are coupled to the ground member 144 and to the members 146 through the capstan drum 170.
Gimbal mechanism 140 provides two degrees of freedom to an object 34 positioned at or near the center point P of rotation, where object 34 can be rotated about axis A and/or B. In alternate embodiments, object 34 can also be rotated or translated in other degrees of freedom, such as a linear degree of freedom along axis C or a rotary “spin” degree of freedom about axis C, and these additional degrees of freedom can be sensed and/or actuated. In addition, a capstan drive mechanism 164 can be coupled to each vertical member 168 to provide mechanical advantage without introducing friction and backlash to the system.
Other types of mechanisms can be used in other embodiments, e.g. for a joystick, mouse, steering wheel, trackball, pool cue, or other force feedback interface device; some examples of mechanisms and devices are disclosed in patent application Ser. Nos. 08/877,114; 08/961,790 (now U.S. Pat. No. 6,020,875); 08/965,720; and 09/058,259 (now U.S. Pat. No. 6,140,382), all incorporated by reference herein.
Design of Force Feedback Sensations
Because force feedback devices can produce such a wide variety of feel sensations, each with its own unique parameters, constraints, and implementation issues, the overall spectrum of force sensations has been divided herein into subsets. Herein, three classes of feel sensations are provided: spatial conditions (“conditions”), temporal effects (“effects” or “waves”), and dynamic sensations (“dynamics”). Conditions are force sensations that are a function of user motion of the manipulatable object 34, effects are force sensations that are played back over time independently of user object position or motion, and dynamics are force sensations that are based on an interactive dynamic model of motion and time. These three types of force sensations are described in greater detail in parent application Ser. No. 08/846,011, incorporated by reference herein. Preferred standard types of conditions include springs, dampers, inertia, friction, texture, and walls. Three basic types of effects include periodic, constant force (vector force), and ramp. Dynamic sensations involve real-time physical interactions based on 1) user motion and 2) a physical system wherein user motion during the interaction affects the behavior of the physical system. Each dynamic sensation, for example, can be based on a simulated basic physical system including a dynamic mass that is connected to the user object 34 by a simulated spring and a simulated damper.
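As a concrete, purely illustrative reading of this taxonomy, the parameters named above can be grouped into one record per class. The field names below are assumptions for the sketch, not definitions taken from the patent:

```cpp
// Spatial condition: force is a function of user motion (e.g. a spring).
struct Condition {
    float posStiffness, negStiffness;    // spring coefficients per direction
    float posSaturation, negSaturation;  // force clipping limits
    float offset, deadband;              // origin shift and dead region
};

// Temporal effect: played back over time, independent of user motion.
struct Effect {
    enum Wave { Sine, Square, Triangle, SawtoothUp, SawtoothDown } wave;
    float magnitude, frequency, duration;  // basic wave shape
    float attackTime, attackLevel;         // envelope: impulse
    float fadeTime, fadeLevel;             // envelope: fade
};

// Dynamic sensation: interactive model of a mass on a spring and damper.
struct Dynamic {
    float mass, stiffness, damping;        // simulated physical system
};
```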
FIG. 3 illustrates a display device 20 displaying an interactive force design interface 300 that enables developers and programmers of force feedback (“users” of the interface) to design and implement force sensations rapidly and efficiently. The graphical environment allows conditions, effects (“waves”), and dynamics to be defined through intuitive graphical metaphors that convey the physical meaning of each parameter involved. As the parameters are manipulated, sensations can be felt in real time, allowing for an iterative design process that fine-tunes the feel to the designer's exact need. Once the appropriate sensation is achieved, the interface can save the parameters as a resource and automatically generate optimized code in a desired format that can be used directly within an application program. Thus, interface 300 handles most of the force feedback development process, from force sensation design to coding. With these tools, force feedback programming becomes a fast and simple process.
The challenge of programming for force feedback is not the act of coding. Force models to provide force sensations are available, and, once the desired force sensation is known and characterized, it is straightforward to implement the force sensation using software instructions. However, the act of designing force sensations to provide a desired feel that appropriately matches gaming or other application events is not so straightforward. Designing force sensations and a particular feel requires a creative and interactive process where parameters are defined, their effect experienced, and the parameters are modified until the sensations are at the desired characterization. For example, when designing conditions, this interactive process might involve setting the stiffness of springs, sizing the deadband, manipulating the offset, and tuning the saturation values. When designing effects, this might involve selecting a wave source (sine, square, triangle, etc.), setting the magnitude, frequency, and duration of the signal, and then tuning the envelope parameters. For a dynamic sensation, this might involve setting the dynamic mass, and then tuning resonance and decay parameters. With so many parameters to choose from, each applicable to a different type of force sensation, there needs to be a fast, simple, and interactive means for sensation design. To meet this need, the graphical interface 300 of the present invention allows a user to rapidly set physical parameters and feel sensations, after which the interface automatically generates the appropriate code for use in a host computer application program.
Interface 300 enables interactive real-time sensation design of conditions, effects, and dynamics, where parameters can be defined and experienced through a rapid iterative process. Thus, it is preferred that a force feedback interface device 14 be connected to the computer implementing interface 300 and be operative to output commanded force sensations. Intuitive graphical metaphors that enhance a programmer's understanding of the physical parameters related to each sensation type are provided in interface 300, thereby speeding the iterative design process. File-management tools are also preferably provided in interface 300 so that designed force sensations can be saved, copied, modified, and combined, thereby allowing a user to establish a library of force sensations. Once sensations are defined, the interface 300 preferably stores the parameters as “resources” which can be used by an application program. For example, by linking a force sensation resource into an application program, the resources can be converted into optimized DirectX code for use in an application in the Windows environment. Other code formats or languages can be provided in other embodiments. Interface 300 can be implemented by program instructions or code stored on a computer readable medium, where the computer readable medium can be either a portable or immobile item and may be semiconductor or other memory of the executing computer (such as computer 12), magnetic hard disk or tape, portable disk, optical media such as CD-ROM, PCMCIA card, or other medium.
As shown in FIG. 3, the interface 300 has three primary work areas: the sensation palette 302, the button trigger palette 304, and the design space 306. Force sensations are created in the design space 306 and can be saved and loaded into that space using standard file handling features.
To create a new force sensation, a sensation type is chosen from the sensation palette 302. Palette 302 is shown in an expandable tree format. The root of the tree includes the three classes 310 of force feedback sensations described herein: conditions, waves (effects), and dynamics. Preferably, users can also define their own headings; for example, a “Favorites” group can be added, where force sensations with desirable previously-designed parameters are stored.
In interface 300, the conditions, waves, and dynamics classes are shown in expanded view. These classes may also be “compressed” so as to only display the class heading, if desired. When a class is displayed in expanded view, the interface 300 displays a listing of all the sensation types that are supported by the hardware connected to the host computer 12 for that class. For example, when programming for more recent or expensive hardware supporting a large number of force sensation types, a list including many or all available sensation types is displayed. When programming for older or less expensive interface device hardware that may not implement all the sensations, some sensation types can be omitted or unavailable to be selected in the expanded view. Preferably, interface 300 can determine exactly what force sensations are supported by a given interface device 14 connected to the host computer by using an effect enumeration process, i.e., the host computer can request information from the interface device, such as a version number, date of manufacture, list of implemented features, etc.
Once a sensation type is chosen from the sensation palette 302, the sensation type is added to the design space 306. For example, in FIG. 3, an icon 308 for the selected force sensation “Damper” is displayed within the design space 306 window. Icon 308 can now be selected/opened by the user in order to set the parameters for the given sensation type using graphical development tools (described below). Multiple icons can similarly be dragged to the design space to create a more complex force sensation. Once the parameters are specified for the given sensation, the sensation can be saved as a resource file. Using this process, a user can create a diverse library of feel sensations as resource files. Also, predefined libraries of sample resources from third party sources might also be available.
Options displayed in the trigger button palette 304 can also be selected by the user. Trigger palette 304 is used for testing force sensations that are going to be defined as button reflexes. For example, a force sensation might be designed as a combination of a square wave and a sine wave that triggers when Button #2 of the interface device is pressed. The square wave would be created by choosing the periodic type 312 from the sensation palette 302 and defining parameters appropriate for the square wave. A sine wave would then be created by choosing another periodic type 312 from the sensation palette 302 and defining the parameters appropriate for the sine wave. At this point, two periodic icons 308 would be displayed in the design space window 306. To test the simultaneous output of the square wave and sine wave, the user can just drag and drop these icons 308 into the Button 2 icon 314. Button 2 on the interface device 14 has thus been designed to trigger the reflex sensation when pressed. This process is fast, simple, and versatile. When the user achieves a sensation exactly as desired, the sensation can be saved as a resource file and optimized software code for use in the application program is generated. The Button 2 selection might be provided in other ways in different embodiments. For example, the user might select or highlight the designed force icons in design space 306 and then select the Button 2 icon in palette 304 to indicate that the highlighted forces will be triggered by Button 2. Furthermore, sounds can also preferably be associated with the button icons in palette 304 so that a sound is output when the button is pressed, synchronized with the force sensation(s) also associated with that button. Such a feature is described in greater detail in copending patent application Ser. No. 09/243,209, entitled “Designing Force Sensations for Computer Applications Including Sounds”, filed Feb. 2, 1999, and incorporated herein by reference.
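A minimal sketch of this button-reflex idea, under the assumption of a hypothetical playEffect() routine: dropping icons onto the Button 2 icon amounts to mapping a button number to the list of force sensations triggered when that button is pressed.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Hypothetical stand-in for commanding the force feedback device.
void playEffect(const std::string& name) {
    std::cout << "playing effect: " << name << "\n";
}

// Button number -> force sensations assigned by drag-and-drop (e.g. icon 314).
std::map<int, std::vector<std::string>> buttonReflexes;

void assignToButton(int button, const std::string& effectName) {
    buttonReflexes[button].push_back(effectName);
}

void onButtonPressed(int button) {
    for (const auto& effect : buttonReflexes[button])
        playEffect(effect);            // all assigned effects play together
}

int main() {
    assignToButton(2, "square wave");  // drag square wave icon onto Button 2
    assignToButton(2, "sine wave");    // drag sine wave icon onto Button 2
    onButtonPressed(2);                // both effects triggered simultaneously
}
```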
FIG. 4 illustrates interface 300 where a force sensation is characterized in the design space 306. When an icon 308 in design space 306 is selected by the user, the icon 308 expands into a force sensation window and graphical environment for setting and testing the physical parameters associated with the selected sensation. For example, in FIG. 4, a spring sensation type 320 has been selected from the condition list 322 and provided as icon 324 in the design space 306. A spring window 326 is displayed in design space 306 when icon 324 is selected. Within spring window 326 are fields 328 characterizing the force, including the axis 330 (and/or direction, degree of freedom, etc.) in which the force is to be applied, the gain 332 (or magnitude) of the force, and the parameters 334 associated with the force sensation. For example, for the spring sensation, the positive stiffness (“coefficient”), negative stiffness (“coefficient”), positive saturation, negative saturation, offset, and deadband of the spring sensation are displayed as parameters. The user can input desired data into the fields 328 to characterize the force. For example, the user has specified that the force is to be applied along the x-axis (in both directions, since no single direction is specified), has specified a gain of 100, and has specified saturation values of 10,000 in the positive and negative directions. The user can also preferably specify all or some of the parameters in graphical fashion by adjusting the size or shape of the envelope, the height or frequency of the waveform, the width of the deadband or springs, the location of a wall on an axis, etc. by using a cursor or other controlled graphical object.
As the user inputs values into fields 328, the resulting additions and changes to the force sensation are displayed in an intuitive graphical format in the force sensation window. For example, in the spring sensation window 326, graphical representation 336 is displayed. Representation 336 includes an image 338 of the user object 34 (shown as a joystick, but which can also be shown as other types of user objects), an image 340 of ground, an image 342 of a spring on the right of the joystick 34, and an image 344 of a spring on the left of the joystick 34. Representation 336 models a single axis or degree of freedom of the interface device.
Representation 336 represents a physical, graphical model with which the user can visually understand the functioning of the force sensation. The user object image 338 is displayed preferably having a shape similar to the actual user object of the desired interface device (a joystick in this example). Along the displayed axis, in both directions, there are spring images 342 and 344 as defined by a positive stiffness parameter (k) and a negative stiffness parameter (k). Graphically, the larger stiffness of the spring to the right (coefficient of 80) is represented as a larger spring image 342. The origin of the spring condition is shown at a center position 346, since the offset parameter 348 is zero. If the offset had a positive or negative magnitude, the origin would be displayed accordingly toward the left or right. The deadband region is shown graphically as the gap between the user object image 338 and the spring images 342 and 344.
In the preferred embodiment, the graphical representation further helps the user visualize the designed force sensation by being updated in real time in accordance with the movement of the user object 34 of the connected interface device 14. User object image 338 will move in a direction corresponding to the movement of user object 34 as caused by the user. The user object is free to be moved in either the positive or negative direction along the given axis and encounter either a positive or negative stiffness from the spring sensation. Thus, if the user object is freely moved to the left from origin 346, the joystick image 338 is moved left in the deadband region and no force is output. When the user object 34 encounters the spring resistance, the joystick image 338 is displayed contacting the spring image 344. If there is no deadband defined, the spring images 342 and 344 are displayed as contacting the joystick image 338 at the center position. The edge stop images 350 define the limits to the degree of freedom; for example, when the user object 34 is moved to a physical limit of the interface device along an axis, the joystick image 338 is displayed as contacting an appropriate edge stop image 350.
When the joystick image 338 contacts a spring image 342 or 344, that spring image 342 is displayed compressed an appropriate amount. Once a spring stiffness is encountered, the resistance force increases linearly with compression of the spring (as is true of a real spring). The amount of compression felt by the user is correlated with the amount of compression shown by spring image 342. If the programmer has defined a saturation value for force opposing movement in the positive direction, the force output would cease increasing with compression once the saturation limit in the positive direction was exceeded.
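The spring behavior just described reduces to a simple rule per axis. The function below is an illustrative sketch (parameter names and units are assumptions, not the patent's code): zero force inside the deadband, force rising linearly with spring compression, and clipping at the saturation values.

```cpp
#include <algorithm>
#include <cmath>

// Spring condition rule illustrated by representation 336. x is the user
// object position; forces oppose displacement beyond the deadband.
float springForce(float x, float offset, float deadband,
                  float kPos, float kNeg,
                  float satPos, float satNeg) {
    float d = x - offset;                       // displacement from spring origin
    if (std::fabs(d) <= deadband) return 0.0f;  // inside the gap: no force
    if (d > 0) {                                // compressing the right spring
        float f = -kPos * (d - deadband);       // linear with compression
        return std::max(f, -satPos);            // clip at positive saturation
    }
    float f = -kNeg * (d + deadband);           // compressing the left spring
    return std::min(f, satNeg);                 // clip at negative saturation
}
```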
Once the user has tested the input parameters and settings, he or she may change any of the existing information or add new information by inputting data into fields 328. Any such changes will instantly be displayed in window 326. For example, if the user changes the coefficient (stiffness) of the spring on the right, the spring image 342 will immediately be changed in size to correlate with the new value. The user thus gains an intuitive sense of how the sensation will feel by simply viewing the representation 336. The user can then determine how the sensation will feel with more accuracy (fine-tuning) by moving the user object and feeling the sensation. Thus, the graphical representation 336 as displayed clearly demonstrates to the user the various effects of parameters on the force sensation and additionally allows the user to experience the forces coordinated with the graphical representation.
Other graphical representations can be displayed in interface 300 for spatial texture conditions, wall conditions, damping conditions, inertia conditions, friction conditions, etc., as described in application Ser. No. 08/877,114. In other embodiments, a 2-dimensional force sensation (i.e. two degrees of freedom) can be displayed in the window 326 by showing an overhead representation of the user object. For example, a circular user object image can be displayed in the middle of two sets of spring images in a cross formation, each set of springs for a different degree of freedom.
FIG. 5 illustrates interface 300 displaying an alternative graphical representation of a spring condition. FIG. 5 also shows the variety of conditions 400 available to be selected from the condition list. The representation used in FIG. 4 can be used for a spring condition as well. In FIG. 5, the user has selected spring icon 401 in the design space 306. A spring condition window 402 is displayed in the design space 306 when icon 401 is selected. The spring window 402 includes parameters 404 for characterizing the spring force, as well as gain 406 and axis control 408. A window is displayed for each axis in which the force is to be applied. A greyed-out window for the second axis condition indicates that no force is presently assigned to that axis.
In first axis window 410, a simple mode and an advanced mode are available; in FIG. 5, simple mode has been selected by the user. Spring images 412 are displayed from each edge of window 410, where spring image 412a is for the negative direction and spring image 412b is for the positive direction. When the user moves the user object along the displayed axis (the x-axis), line 414 moves in the corresponding direction. When the line 414 moves into a spring image 412, the microprocessor outputs the specified spring force on the user object so the user can feel the characterized force sensation. As the user object continues to be moved into the spring, the spring image compresses as a real spring would. The empty space between spring images 412 indicates the deadband region where no forces are output. In the preferred embodiment, the user may adjust the stiffness (k) of the spring force by selecting control points 422 at the edges of the front of the spring images 412 with a cursor. The user can drag the control points to adjust the widths of the spring images, which in turn adjusts the stiffness parameter. A thicker spring image indicates a larger stiffness parameter and a stronger spring force. The user may also move the front ends of the spring images closer together or further apart, thus adjusting the deadband and offset parameters. As parameters are adjusted, they are sent to the local microprocessor, which then implements the newly characterized force on the user object (if appropriate).
Icons are also preferably provided to help the user with the design of force sensations from previously stored force sensations. For example, clip objects icon 424, when selected, provides the user with a list or library of predefined, common force sensations that the user can use as a base or starting point, i.e., the user can modify a common spring condition configuration to quickly achieve a desired force. This library can be provided, for example, from commercial force providers or other sources, or can be a custom-made library.
FIG. 6 illustrates interface 300 with a graphical representation for a periodic wave (effect) force sensation. Periodic window 440 is displayed in response to the user selecting (e.g., double-clicking on) periodic effect icon 442 that has been dragged into design space 306. In window 440, waveform source field 444 allows a user to select from multiple available types of signal wave sources for the effect. The user is allowed to select the duration of the periodic wave using sliders 446, and may also select an infinite duration with box 448. The gain and offset may be selected using sliders 450, and other parameters are provided in fields 452. A graphical representation of the periodic waveform is displayed in window 454 having a shape based on the wave source chosen and based on the other selected parameters (or default parameters if no parameters are chosen). Envelope parameters in fields 452 can be graphically adjusted by the user by dragging control points 456 of the waveform. A frequency of the waveform can be adjusted by dragging a displayed wave to widen or narrow the displayed oscillations of the wave, or by specifying the period in field 458. Trigger buttons for the periodic wave can be determined in fields 460 to assign physical button(s) or controls to the designed effect, and the direction of the periodic wave in the user object workspace is determined using dial 462 and field 464. The repeat interval field 460 allows a user to specify the amount of time before the effect is repeated if the designated button is held down. These parameters and characteristics can be entered as numbers in the displayed input fields or prompts, or can be input by dragging the graphical representation of the waveform in window 454 with a cursor to the desired shape or level.
The parameters, when specified, cause the graphical representation to change according to the parameters. Thus, if the user specifies a particular envelope, that envelope is immediately displayed in the window 454. The user can thus quickly and visually determine how the specified parameters affect the periodic waveform.
To test the specified periodic wave, the user preferably selects start button 456, which instructs the microprocessor to output the specified force sensation over time to the user object so the user can feel the force sensation when grasping the user object. In the preferred embodiment, a graphical marker, such as a vertical line or pointer, scrolls across the display window 454 from left to right, indicating the present portion or point on the waveform currently being output. Or, the waveform can be animated; for example, if an impulse and fade is specified, the wave is animated so that the impulse portion of the waveform is displayed when the impulse force is output on the user object, and the fade is displayed when the output force fades down to a steady-state level. Since graphical display is handled by the host computer and force wave generation is (in one embodiment) handled by a local microprocessor, the host display of the marker needs to be synchronized with the microprocessor force generation at the start of the force output. The user can stop the output of the periodic sensation by selecting the stop button 458. Other features of a periodic force design interface are described in patent application Ser. No. 08/877,114.
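For illustration, a sample-by-sample evaluation of a sine-source periodic effect with an impulse-and-fade envelope might look like the sketch below. All names, units, and the particular envelope shape are assumptions; the patent does not prescribe this formula.

```cpp
#include <cmath>

// Envelope gain at time t: ramp from attackLevel to 1.0 over attackTime,
// hold at 1.0, then slide toward fadeLevel after fadeStart.
float envelopeGain(float t, float attackTime, float attackLevel,
                   float fadeStart, float fadeTime, float fadeLevel) {
    if (t < attackTime)                       // impulse portion
        return attackLevel + (1.0f - attackLevel) * (t / attackTime);
    if (t > fadeStart) {                      // fade portion
        float s = std::min((t - fadeStart) / fadeTime, 1.0f);
        return 1.0f + (fadeLevel - 1.0f) * s;
    }
    return 1.0f;                              // steady-state level
}

// One output sample of a sine-source periodic effect at time t (seconds).
float periodicSample(float t, float magnitude, float frequencyHz,
                     float offset, float duration) {
    if (t < 0.0f || t > duration) return 0.0f;            // outside the effect
    float wave = std::sin(6.2831853f * frequencyHz * t);  // sine wave source
    // impulse: start at 150% and settle over 0.5 s; fade to zero over
    // the final second of the duration (arbitrary example values)
    float env = envelopeGain(t, 0.5f, 1.5f, duration - 1.0f, 1.0f, 0.0f);
    return offset + magnitude * env * wave;
}
```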
As described above, the normal procedure for a force designer in using interface 300 is to input parameters for a selected type of force sensation, test the way the force feels by manipulating the user object, adjust the parameters based on how the force feels, and iteratively repeat the steps of testing the way the force feels and adjusting the parameters until the desired force sensation is characterized. Normally, the user would then save the resulting parameter set describing this force sensation to a storage medium, such as a hard disk, CD-ROM, non-volatile memory, PCMCIA card, tape, or other storage space that is accessible to a computer that is to control the force feedback. The user also preferably assigns an identifier to the stored parameter set, such as a filename, so that this force sensation can be later accessed. Thus, other application programs running on a host computer can access the parameter set by this identifier and use the designed force sensation in an application, such as in a game, in a graphical user interface, in a simulation, etc.
Once a force sensation has been designed using the graphical tools as described above, the definition can be saved as a resource of parameters. By accessing the interface resource from an application program, the resource is converted automatically from a parameter set to code in the desired language or format, e.g., DirectX by Microsoft® Corporation for use in the Windows™ operating system. For example, the force feedback resource can be provided as or in a DLL (Dynamic Link Library) that is linked to an application program. In one embodiment, the DLL can provide the application program with effects defined as completed DirectX structs (DI_Struct), where the application programmer can then create effects by using the CreateEffect call within DirectX (or equivalent calls in other languages/formats). Or, the DLL can perform the entire process and create the effect for the application program, providing the programmer with a pointer to the sensation. One advantage of the first option, having the programmer call CreateEffect, is that it gives the programmer the opportunity to access the parameters before creating the effect so that the parameters can be modified, if desired.
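As a hedged example of the CreateEffect path, a saved spring resource could be converted roughly as follows. This is a sketch under the assumption of a modern DirectInput 8 device already acquired and prepared for force feedback, not the patent's generated code; the numeric values stand in for parameters read from the resource, and error handling is omitted.

```cpp
#define DIRECTINPUT_VERSION 0x0800
#include <dinput.h>   // link against dinput8.lib and dxguid.lib

// Build a DirectInput spring condition from stored design parameters.
IDirectInputEffect* createSpringEffect(IDirectInputDevice8* device) {
    DICONDITION condition = {};
    condition.lPositiveCoefficient = 8000;    // positive stiffness
    condition.lNegativeCoefficient = 8000;    // negative stiffness
    condition.dwPositiveSaturation = 10000;   // saturation values
    condition.dwNegativeSaturation = 10000;
    condition.lDeadBand = 1000;               // deadband width
    condition.lOffset = 0;                    // spring origin

    DWORD axes[1] = { DIJOFS_X };             // apply along the x-axis
    LONG direction[1] = { 0 };

    DIEFFECT effect = {};
    effect.dwSize = sizeof(DIEFFECT);
    effect.dwFlags = DIEFF_CARTESIAN | DIEFF_OBJECTOFFSETS;
    effect.dwDuration = INFINITE;             // conditions persist until stopped
    effect.dwGain = DI_FFNOMINALMAX;          // full gain
    effect.dwTriggerButton = DIEB_NOTRIGGER;  // not a button reflex
    effect.cAxes = 1;
    effect.rgdwAxes = axes;
    effect.rglDirection = direction;
    effect.cbTypeSpecificParams = sizeof(condition);
    effect.lpvTypeSpecificParams = &condition;

    IDirectInputEffect* created = nullptr;
    device->CreateEffect(GUID_Spring, &effect, &created, nullptr);
    return created;
}
```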
Compound Force Sensations
FIG. 7 illustrates a force design interface 500 including a compound force design feature of the present invention. This feature allows a user to easily design and test sequences of multiple force sensations instead of having to separately test each individual force sensation. This allows faster creation of compound force sensations as well as greater flexibility when designing force sensations that are intended to be output simultaneously for at least part of their durations. Furthermore, another feature of the present invention allows sounds to be designed and their start times adjusted alongside compound force sensations to correlate with desired force sensations.
Interface 500 includes a sensation palette 502, a button trigger palette 504, and a design space 506, similar to these features in the interface 300 described above. The design space can include a number of icons 508 representing different force sensations which can be stored in a particular “force resource” file that is written out by the interface 500. Each force sensation in the design space 506 can be modified by calling up an associated force design window 510, which includes a graphical representation of the particular force sensation and several controls to allow modification of the force sensation, as described in detail above. For example, FIG. 7 shows a design window 510a for an “aftershock” periodic force sensation, a design window 510b for an “angle wall” force sensation, and a force design window 510c for a “bell ringing” force sensation. The parameters of each force sensation can be adjusted individually, similarly as described with reference to the above embodiments.
Interface 500 also includes a compound force sensation design feature, which is accessed by the user through the use of compound force sensation controls. Preferably, a compound force palette icon 514 is available in the sensation palette 502, like any other force sensation in the palette 502. The user can create a compound force “container” by dragging the compound icon 514 into the design space 506, similar to any other force sensation as described in the embodiments above. A compound force sensation container icon 516 shows the created compound container in the design space with the other created force sensations. Preferably, the user provides the created compound force container with a name, which will be the identifier for the “compound force sensation,” which is the collection of individual force sensations in the compound container. For example, the identifier for container 516 is “Sequence.”
As with any of the force sensations, the user preferably accesses the details of the compound force container by selecting (e.g. double-clicking) the icon 516. This causes a compound window 518 to be displayed (or other screen area that displays data related to compound force sensations). Upon creation, no force sensations are included in the compound container, and thus no force sensation icons are displayed in the compound window 518. The user can then select individual force sensations 508 to be included in the compound container, e.g., force sensation icons 508 can be dragged and “dropped” into the window 518 (or the force sensations can be dragged into the compound icon 516). Preferably, the force sensations are displayed in design space 506 and thus were previously created by the user in interface 500 or loaded into the interface 500 from an external source such as a storage device (hard drive, CD-ROM, etc.) or another computer/device networked to the host computer running the interface 500. Alternatively, individual force sensations not displayed in design space 506 can be imported into the compound window. A compound container such as icon 516 can also preferably be associated with a button icon in button palette 504, similarly to individual force sensations.
Window 518 in FIG. 7 shows an example of compound window 518 where four individual force sensations 508 have been dragged into the window 518, including “aftershock” force sensation 520, “angle wall” force sensation 522, “bell ringing” force sensation 524, and “earthquake” force sensation 526. To the right of these displayed icons, a time scale 519 is displayed indicating a range of seconds numbered from zero to 60, where each mark on the time scale represents 5 seconds of time. Below the time scale, a horizontal bar graph is displayed for each of the force sensation icons in the window 518. Thus, bar graphs 530, 532, 534, and 536 are associated with force sensations 520, 522, 524, and 526, respectively, and represent the start times and durations of each of the force sensations. Preferably, each bar graph is displayed in a different color or with another distinguishing visual characteristic. Bar graph 530 shows a duration for the “aftershock” force sensation 520 starting at about the 11th second and lasting about 1-2 seconds, which indicates that the jolts from the aftershock are intended to be very short. Bar graph 532 shows an infinite duration for the angle wall sensation 522, which indicates that the angle wall need not have a duration as normal periodics do, but can be always present (and such is the default duration); the angle wall, for example, has an output dependent on the position of the user manipulandum, such as when a user-controlled cursor moves into the displayed wall. Bar graph 534 indicates that the bell ringing sensation 524 begins at about second 35 and lasts to about second 40. Finally, the earthquake sensation 526 is shown to begin at about the 3rd second and last until about the 9th second. Thus, by examining the window 518, the user can easily determine that the earthquake sensation is intended to last for some time, followed by a small pause and then the aftershock sensation.
A great amount of flexibility is provided for the user of the compound window 518 in setting the start times and durations of the individual force sensations displayed in the window 518. The user can preferably adjust the start time of a force sensation by selecting a particular bar graph and moving it horizontally using an input device such as a keyboard, mouse, or joystick. For example, the user can select the bar graph 536 with a cursor and drag the entire bar graph to the right, so that the force sensation has a new start and end time but the duration remains constant. Alternatively, the user can also adjust the duration of each force sensation by changing the start time and/or end time of a bar graph. For example, the cursor can be moved over the left edge of a bar graph, so that only the left edge is selected and dragged when the user moves the input device horizontally. The user can adjust the right side of a bar graph in a similar fashion. In other embodiments, the user can adjust the duration, start time, and end time of a bar graph by inputting numbers into entry fields; this can allow more precision in adjusting the bar graphs, e.g., a start time of 3.3 seconds can be entered if such precision is desired. However, such entry fields can alternatively (or additionally) be included in the individual force sensation design windows 510, as described below.
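An illustrative data model for the compound container and its bar graphs (all names here are assumptions, not the patent's): each member records the start time and duration drawn in window 518, dragging a whole bar simply rewrites its start time, and a playback marker can query which members are active at a given moment.

```cpp
#include <string>
#include <vector>

// One member of the compound container, mirroring its horizontal bar graph.
struct CompoundMember {
    std::string name;   // e.g. "aftershock", "angle wall"
    double startTime;   // seconds; left edge of the bar graph
    double duration;    // seconds; bar width. A negative value means infinite,
                        // as with the angle wall's default duration.
};

struct CompoundSensation {
    std::string identifier;               // e.g. "Sequence"
    std::vector<CompoundMember> members;

    // Dragging a whole bar moves the start time; the duration is unchanged.
    void moveBar(size_t i, double newStart) { members[i].startTime = newStart; }

    // Dragging the left or right edge redefines start and/or duration.
    void resizeBar(size_t i, double newStart, double newEnd) {
        members[i].startTime = newStart;
        members[i].duration = newEnd - newStart;
    }

    // Which members are active at playback time t? (drives the moving marker)
    std::vector<std::string> activeAt(double t) const {
        std::vector<std::string> out;
        for (const auto& m : members)
            if (t >= m.startTime &&
                (m.duration < 0 || t < m.startTime + m.duration))
                out.push_back(m.name);
        return out;
    }
};
```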
When a change is made by the user (or by a program, script, imported file, data, etc.) to a bar graph in compound window 518, that change is preferably immediately made to that force sensation and reflected in the individual force sensation design windows 510 if those windows 510 are currently displayed. For example, if the angle wall bar graph 532 is modified, the bar graph 544 displayed in the angle wall design window 510b is modified in the same way. Furthermore, the displayed delay (start time) field 546 and duration field 548 for the design window 510b are preferably updated in accordance with the bar graphs 544 and 532. Preferably, the other force sensation design windows 510 include a bar graph similar to bar graph 544 which corresponds to a bar graph in compound window 518. Likewise, changes made to any parameters or characteristics of the individual force sensations as displayed in windows 510 are preferably immediately updated in a corresponding fashion to those force sensations in the compound force sensation, and the changes are displayed in compound window 518 if window 518 is displayed.
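This two-way updating is essentially a shared timing model with change notification: the compound window's bar graph and the individual design window's fields are both views onto one record, and any write notifies every open view. A minimal sketch of that pattern follows; the listener mechanism and names are assumptions, since the patent does not specify how interface 500 wires its windows together.

```cpp
#include <functional>
#include <vector>

// Hypothetical shared timing record for one force sensation; both the
// compound window's bar graph and the individual design window's fields
// edit this one object.
struct TimingModel {
    double start_s = 0.0;
    double duration_s = 0.0;
    std::vector<std::function<void()>> listeners;  // one per open view

    void set(double start, double duration) {
        start_s = start;
        duration_s = duration;
        for (const auto& notify : listeners)
            notify();  // e.g. redraw bar graphs 532/544, refresh fields 546/548
    }
};

int main() {
    TimingModel angleWall;
    angleWall.listeners.push_back([] { /* redraw compound bar graph */ });
    angleWall.listeners.push_back([] { /* refresh delay/duration fields */ });
    angleWall.set(0.0, 10.0);  // both views update immediately
    return 0;
}
```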
A start button 540 is also preferably provided to allow the user to output or "play" the force sensations in the compound window using a force feedback device connected to the host computer running the interface 500. When the start button 540 is selected, the user can immediately feel the force sensations and their respective start times and durations with respect to each other. Preferably, a pointer or marker (not shown) is displayed on the time scale 519 and/or bar graphs which moves from the left to the right and indicates the current point in time and the sections of each bar graph currently being output as forces (if any). If an individual force sensation does not have the desired duration or start time, the user can iteratively adjust those parameters in the compound window 518 and play the forces until the desired effect is achieved. A stop button 541 can also be included to stop the output of the compound force sensation when selected. Furthermore, a user is preferably able to start the output of any portion of the compound force sensation by selecting a particular point in time with reference to the time scale 519. For example, the user can select a point at the 30-second mark and the force sensations will be output beginning from that selected point. Alternatively, the user can select a start point at the 30-second mark and an end point at the 40-second mark, for example, to output the compound force sensations within that range of time.
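Playback against the time scale amounts to stepping a marker through the selected range and, at each instant, outputting every sensation whose bar covers the current time. The sketch below illustrates that test under the same hypothetical types as the earlier examples; actually commanding forces to the connected device is assumed and not shown.

```cpp
#include <cstdio>
#include <optional>
#include <string>
#include <vector>

struct Entry {
    std::string name;
    double start_s;
    std::optional<double> duration_s;  // empty = always active once started
};

// True if the sensation should be output at time t on the time scale 519.
bool activeAt(const Entry& e, double t) {
    if (t < e.start_s) return false;
    return !e.duration_s || t < e.start_s + *e.duration_s;
}

// Step a marker through a user-selected range and report what is playing;
// commanding the actual forces to the device is left out of this sketch.
void play(const std::vector<Entry>& entries, double from_s, double to_s) {
    for (double t = from_s; t <= to_s; t += 1.0) {  // coarse 1 s steps
        std::printf("t=%4.1f s:", t);
        for (const auto& e : entries)
            if (activeAt(e, t)) std::printf(" [%s]", e.name.c_str());
        std::printf("\n");
    }
}

int main() {
    const std::vector<Entry> entries = {
        {"earthquake", 3.0, 6.0},    {"aftershock", 11.0, 1.5},
        {"bell ringing", 35.0, 5.0}, {"angle wall", 0.0, std::nullopt}};
    play(entries, 30.0, 40.0);  // e.g. start point at 30 s, end point at 40 s
    return 0;
}
```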
The compound container can preferably be stored as a separate compound file that includes references or pointers to the included force sensations. Alternatively, the force sensation data can be stored in the compound file. In some embodiments, a compound force sensation can include one or more “sub” compound force sensations, where each sub compound sensation includes one or more individual force sensations; force sensations in each “sub” compound container can be displayed or hidden from view as desired. Furthermore, the sub compound containers can include lower-level compound force sensations, and so on. The information for each sub compound container can be stored in the highest-level compound sensation file, or each sub compound sensation can be stored separately and referenced by data in higher-level compound sensation files.
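Such nesting makes the stored compound a small tree: the leaves are individual sensations (stored inline or referenced by file path) and the interior nodes are sub compound containers. The following hypothetical sketch shows one such structure and a walk over every nesting level; the Node layout is an illustrative assumption, not the patent's file format.

```cpp
#include <memory>
#include <string>
#include <vector>

// A compound file entry is either an individual sensation stored inline, a
// reference to a separately stored file, or a nested sub compound container.
struct Node {
    enum Kind { Individual, FileReference, SubCompound } kind;
    std::string nameOrPath;  // sensation name, or path for references
    std::vector<std::unique_ptr<Node>> children;  // used only for SubCompound
};

// Walk every nesting level and count the individual sensations reachable
// from the top-level compound (file references would be resolved first).
int countSensations(const Node& n) {
    if (n.kind != Node::SubCompound) return 1;
    int total = 0;
    for (const auto& child : n.children)
        total += countSensations(*child);
    return total;
}

int main() {
    Node top{Node::SubCompound, "top-level compound", {}};
    top.children.push_back(std::make_unique<Node>(
        Node{Node::Individual, "earthquake", {}}));
    auto sub = std::make_unique<Node>(
        Node{Node::SubCompound, "aftershocks", {}});
    sub->children.push_back(std::make_unique<Node>(
        Node{Node::Individual, "aftershock", {}}));
    top.children.push_back(std::move(sub));
    return countSensations(top) == 2 ? 0 : 1;
}
```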
In other embodiments, a compound force sensation can be selected and represented in other ways. For example, a menu option can be used to create a compound container, and the individual force sensations within the container can be represented using vertical bar graphs, circular graphs, or other graphical representations. In addition, additional information concerning each included force sensation can be displayed in the compound window 518, such as the periodic graphs, force magnitude information, trigger events, etc.
In a different embodiment, sounds can also be designed and displayed or indicated in compound window 518. For example, one or more sound files, such as a "wav" file, mp3 file, or other standard, can be associated with a button icon in palette 504. As described in co-pending patent application 09/243,209, entitled "Designing Force Sensations for Computer Applications Including Sounds," filed Feb. 2, 1999, when one or more sound files have been attached to a button icon, the sound files are played on speakers attached to the computer when the associated button on the force feedback interface device is pressed by the user. In addition, any force sensations assigned to that button icon are played at the same time as the sound when the button is pressed. Preferably, the start of the force sensation is synchronized with the start of the sound effect played. The force sensation and sound can also be played when a control in interface 500 is activated by the user, such as a graphical button or icon in interface 500. When one or more sounds are associated with a button icon, an indicator is displayed that indicates the button icon has associated sounds. Furthermore, one or more compound force sensations can also be associated with a button icon in button palette 504 so that all the force sensations in the compound container(s) will begin at the press of the associated physical button and in conjunction with any sounds associated with that button icon.
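The synchronization described above reduces to starting every sound and every force bound to a button icon from a single press handler, so that all of them share one start instant. In the sketch below, playSound and startForce are stand-ins for whatever audio and force feedback APIs are actually in use, and "quake.wav" is a made-up file name.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Stand-ins for the audio and force feedback back ends (assumptions; the
// real interface would call into the sound and device APIs in use).
void playSound(const std::string& file)  { std::printf("sound: %s\n", file.c_str()); }
void startForce(const std::string& name) { std::printf("force: %s\n", name.c_str()); }

// Everything attached to one button icon in the button palette.
struct ButtonBinding {
    std::vector<std::string> soundFiles;       // e.g. {"quake.wav"}
    std::vector<std::string> forceSensations;  // individual or compound names
};

// Pressing the physical button starts every attached sound and force from
// the same handler, so their starts are synchronized.
void onButtonPress(const ButtonBinding& binding) {
    for (const auto& s : binding.soundFiles) playSound(s);
    for (const auto& f : binding.forceSensations) startForce(f);
}

int main() {
    ButtonBinding button{{"quake.wav"}, {"earthquake", "aftershock"}};
    onButtonPress(button);
    return 0;
}
```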
Alternatively, one or more sounds or sound files can be associated directly with an individual force sensation 508. An indicator closely associated with the icons of the force sensations, such as icons 508, preferably indicates that sounds are associated with that force sensation; for example, a musical note image can be provided on force sensations having associated sounds. In some embodiments, sounds can also be directly associated with a compound force sensation (e.g., container 516 and window 518) rather than associated with an individual force sensation or with a button icon.
In some embodiments, if compound window 518 is used, the sounds associated with a force sensation, associated with the compound container, or associated with a button with which the compound container is also associated, can preferably be displayed as bar graphs within the time scale 519 similar to the bar graphs for the force sensations. For example, if an individual force sensation has a directly-associated sound, a note image can be provided on the force sensation icon 520, 522, 524, or 526, or can be displayed alongside the icon. Alternatively, a sound bar graph can be displayed alongside or within the bar graph 530, etc., for the force sensation. For example, a smaller, thinner bar graph representing the duration of the associated sound can be displayed within the bar graph 536 to indicate the starting time and duration of the associated sound. If a sound is associated with the entire compound force sensation, then separate sound bar graphs can be displayed after or before the bar graphs for the individual force sensations to be referenced by the same time scale 519.
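One way to model a sound bar drawn inside a force sensation's bar is to attach an optional second time span to the same timeline entry, so both bars are laid out against the same time scale. A sketch, again using the hypothetical TimeSpan record from the earlier examples:

```cpp
#include <optional>
#include <string>
#include <vector>

// Hypothetical timing record, as in the previous sketches.
struct TimeSpan {
    double start_s = 0.0;
    std::optional<double> duration_s;
};

// A force sensation's bar can carry a second, thinner bar for a directly
// associated sound; both are drawn against the same time scale 519.
struct TimelineEntry {
    std::string forceName;
    TimeSpan forceSpan;
    std::optional<TimeSpan> soundSpan;  // empty if no sound is attached
};

int main() {
    TimelineEntry bell{"bell ringing", {35.0, 5.0},
                       TimeSpan{36.0, 3.0}};  // sound starts 1 s into the force
    (void)bell;
    return 0;
}
```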
The user can preferably adjust the start time (delay) and/or duration of the sounds similarly to adjusting the force sensations in the window 518 to allow flexibility in determining when sounds will start and stop with respect to force sensations. In yet other embodiments, sounds can be displayed as waveforms similarly to the waveforms for force sensations shown in design windows 510a and 510c, and can be adjusted in magnitude, duration, start time, or even more fundamentally if advanced sound editing features are included in the interface.
While this invention has been described in terms of several preferred embodiments, it is contemplated that alterations, permutations and equivalents thereof will become apparent to those skilled in the art upon a reading of the specification and study of the drawings. For example, many different parameters can be associated with dynamic sensations, conditions, and effects to allow ease of specifying a particular force sensation. These parameters can be presented in the graphical interface of the present invention, including in the compound force sensation windows. Many types of different visual metaphors can be displayed in the interface tool of the present invention to allow a programmer to easily visualize changes to a force sensation and to enhance the characterization of the force sensation. Furthermore, certain terminology has been used for the purposes of descriptive clarity, and not to limit the present invention. It is therefore intended that the following appended claims include all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.

Claims (30)

What is claimed is:
1. A method for implementing a force sensation design interface, said method comprising:
causing a display of said force sensation design interface on a display device of a host computer;
receiving input from a user to said force sensation design interface, said input selecting a plurality of individual force sensations to be commanded by said host computer and output by a force feedback interface device, said force feedback interface device including a manipulandum physically engageable by a user and moveable in at least one degree of freedom, wherein each of said individual force sensations is provided with an edit display area displayed in said force sensation design interface;
including said plurality of selected force sensations in a compound force sensation based on input received from said user; and
causing a display of a time-based graphical representation of said compound force sensation, said representation displaying each of said individual force sensations included in said compound force sensation.
2. A method as recited in claim 1 further comprising commanding said compound force sensation to be output by said force feedback interface device coupled to said host computer such that said individual force sensations are output to said manipulandum in conjunction with updating said graphical demonstration of said compound force sensation, such that said graphical representation provides said user with a visual demonstration of said individual force sensations included in said compound force sensation.
3. A method as recited in claim 1 further comprising:
receiving changes from said user to at least one of said individual force sensations in said compound force sensation, said changes received in said edit display area of said changed individual force sensation, said changes provided after said compound force sensation is output; and
causing a display of said changes in said graphical representation of said compound force sensation.
4. A method as recited in claim 1 further comprising:
receiving changes from said user to said compound force sensation, said changes received in a display area of said compound force sensation, said changes provided after said compound force sensation is output; and
effecting said changes in said display area of said compound force sensation receiving said changes.
5. A method as recited in claim 1 further comprising storing said compound force sensation to a storage medium accessible to said host computer.
6. A method as recited in claim 1 further comprising accessing said stored compound force sensation from an application program different than said force sensation design interface.
7. A method as recited in claim 1 wherein said time-based graphical representation includes a bar graph for each of said individual force sensations in said compound force sensation indicating a start time and duration of each of said individual force sensations relative to each other.
8. A method as recited in claim 2 wherein said updating of said graphical demonstration includes moving a marker across a plurality of bar graphs, each indicating a time period during which one of said individual force sensations is output.
9. A method as recited in claim 2 wherein a sound is associated with said compound force sensation, wherein a start of a sound is synchronized with a start of said compound force sensation.
10. A method as recited in claim 9 wherein said sound is selected by said user in said force sensation design interface and said user associates said sound with said compound force sensation.
11. A method as recited in claim 1 wherein said compound force sensation also includes a lower-level compound force sensation, said lower-level compound force sensation including at least one individual force sensation.
12. A method as recited in claim 1 wherein said individual force sensations include conditions, effects, and dynamics.
13. An apparatus for implementing a force sensation design interface, said apparatus comprising:
means for causing a display of said force sensation design interface on a display device of a host computer, said means for displaying providing an edit window for each individual force sensation to be created or edited by said user in said force design interface;
means for receiving input from a user to said force sensation design interface, said input selecting a plurality of said individual force sensations to be commanded by said host computer and output by a force feedback interface device, said force feedback interface device including a manipulandum graspable by a user and moveable in a degree of freedom; and
means for including said selected force sensation with at least one other individual force sensation in a compound force sensation based on input received from said user, wherein said means for causing a display displays a time-based graphical representation of said compound force sensation.
14. An apparatus as recited in claim 13 wherein said time-based graphical representation includes a bar graph for each of said individual force sensations indicating a start time and duration of each of said individual force sensations relative to each other.
15. An apparatus as recited in claim 13 further comprising means for commanding said compound force sensation to be output by said force feedback interface device coupled to said host computer such that said compound force sensation is output to said manipulandum in conjunction with updating said graphical demonstration of said compound force sensation.
16. An apparatus as recited in claim 13 further comprising means for receiving changes to at least one of said individual force sensations from said user after said force sensation is output and causing a display of said changes in said graphical representation of said compound force sensation.
17. A computer readable medium including program instructions executable on a computer for providing a force sensation design interface implemented by said computer, said program instructions performing steps comprising:
causing a display of said force sensation design interface on a display device of a host computer;
receiving input from a user to said force sensation design interface, said input selecting an individual force sensation to be commanded by said host computer and output by a force feedback interface device, said force feedback interface device including a manipulandum graspable by a user and moveable in a degree of freedom;
including said selected force sensation with at least one other individual force sensation in a compound force sensation based on input received from said user; and
causing a display of a compound force sensation window in said design interface, said window including a time-based graphical representation of said individual force sensations; and
outputting said compound force sensation to said force feedback interface device to be felt by said user, wherein a marker moves across said time-based graphical representation to indicate a current point in time with respect to said individual force sensations.
18. A computer readable medium as recited in claim 17 including program instructions for receiving additional input from said user to change said characteristics of said individual force sensations and adjust the time period during which at least one of said individual force sensations is to be output.
19. A computer readable medium as recited in claim 17 including program instructions for writing said compound force sensation to a storage medium, said compound force sensation being accessible to application programs implemented on said computer and controlling force feedback.
20. A method for implementing a force sensation design interface, said method comprising:
enabling a display of said force sensation design interface on a display device of a host computer;
enabling a reception of input from a user to said force sensation design interface, said input selecting a plurality of individual force sensations to be commanded by said host computer and output by a force feedback interface device, said force feedback interface device including a manipulandum physically engageable by a user and moveable in at least one degree of freedom, wherein each of said individual force sensations is provided with an edit display area displayed in said force sensation design interface;
enabling an inclusion of said plurality of selected force sensations in a compound force sensation based on input received from said user; and
enabling a display of a time-based graphical representation of said compound force sensation, said representation displaying each of said individual force sensations included in said compound force sensation.
21. A method as recited in claim 20 further comprising enabling commanding of said compound force sensation to be output by said force feedback interface device coupled to said host computer such that said individual force sensations are output to said manipulandum in conjunction with updating said graphical demonstration of said compound force sensation, such that said graphical representation provides said user with a visual demonstration of said individual force sensations included in said compound force sensation.
22. A method as recited in claim 20 including program instructions for receiving additional input from said user to change said characteristics of said individual force sensations and adjust the time period during which at least one of said individual force sensations is to be output.
23. A method as recited in claim 20 further comprising:
enabling a reception of changes from said user to at least one of said individual force sensations in said compound force sensation, said changes received in said edit display area of said changed individual force sensation, said changes provided after said compound force sensation is output; and
enabling a display of said changes in said graphical representation of said compound force sensation.
24. A method as recited in claim 20 further comprising:
enabling a reception of changes from said user to said compound force sensation, said changes received in a display area of said compound force sensation, said changes provided after said compound force sensation is output; and
enabling said changes to take effect in said display area of said compound force sensation receiving said changes.
25. A method as recited in claim 20 wherein said time-based graphical representation includes a bar graph for each of said individual force sensations in said compound force sensation indicating a start time and duration of each of said individual force sensations relative to each other.
26. A method as recited in claim 21 wherein said updating of said graphical demonstration includes moving a marker across a plurality of bar graphs, each indicating a time period during which one of said individual force sensations is output.
27. A method as recited in claim 21 wherein a sound is associated with said compound force sensation, wherein a start of a sound is synchronized with a start of said compound force sensation.
28. A method as recited in claim 20 wherein said compound force sensation also includes a lower-level compound force sensation, said lower-level compound force sensation including at least one individual force sensation.
29. A method as recited in claim 24 further comprising enabling a storage of a plurality of parameters characterizing said force sensation to a storage medium accessible to said host computer, and further comprising enabling an access of said stored plurality of parameters by an application program different than said design interface, said application program using said plurality of parameters to output said characterized force sensation during execution of said application program.
30. A method as recited in claim 20 wherein at least one of said selected individual force sensations is a periodic force sensation, and wherein said graphical representation of said periodic force sensation is an image of a periodic waveform.
US09/270,2231997-04-251999-03-15Designing compound force sensations for computer applicationsExpired - LifetimeUS6292170B1 (en)

Priority Applications (6)

Application NumberPriority DateFiling DateTitle
US09/270,223US6292170B1 (en)1997-04-251999-03-15Designing compound force sensations for computer applications
AU38799/00AAU3879900A (en)1999-03-152000-03-13Designing compound force sensations for computer applications
PCT/US2000/006562WO2000055839A1 (en)1999-03-152000-03-13Designing compound force sensations for computer applications
US09/947,213US7091948B2 (en)1997-04-252001-09-04Design of force sensations for haptic feedback computer interfaces
US11/455,944US7701438B2 (en)1997-04-252006-06-20Design of force sensations for haptic feedback computer interfaces
US12/762,791US8717287B2 (en)1997-04-252010-04-19Force sensations for haptic feedback computer interfaces

Applications Claiming Priority (4)

Application NumberPriority DateFiling DateTitle
US08/846,011US6147674A (en)1995-12-011997-04-25Method and apparatus for designing force sensations in force feedback computer applications
US08/877,114US6169540B1 (en)1995-12-011997-06-17Method and apparatus for designing force sensations in force feedback applications
US09/243,209US6285351B1 (en)1997-04-251999-02-02Designing force sensations for computer applications including sounds
US09/270,223US6292170B1 (en)1997-04-251999-03-15Designing compound force sensations for computer applications

Related Parent Applications (4)

Application NumberTitlePriority DateFiling Date
US08/846,011Continuation-In-PartUS6147674A (en)1995-12-011997-04-25Method and apparatus for designing force sensations in force feedback computer applications
US08/877,114ContinuationUS6169540B1 (en)1995-12-011997-06-17Method and apparatus for designing force sensations in force feedback applications
US09/243,209Continuation-In-PartUS6285351B1 (en)1997-04-251999-02-02Designing force sensations for computer applications including sounds
US09/734,630Continuation-In-PartUS6697086B2 (en)1995-12-012000-12-11Designing force sensations for force feedback computer applications

Related Child Applications (3)

Application NumberTitlePriority DateFiling Date
US09/243,209Continuation-In-PartUS6285351B1 (en)1997-04-251999-02-02Designing force sensations for computer applications including sounds
US09/734,630Continuation-In-PartUS6697086B2 (en)1995-12-012000-12-11Designing force sensations for force feedback computer applications
US09/947,213Continuation-In-PartUS7091948B2 (en)1997-04-252001-09-04Design of force sensations for haptic feedback computer interfaces

Publications (1)

Publication NumberPublication Date
US6292170B1true US6292170B1 (en)2001-09-18

Family

ID=23030424

Family Applications (1)

Application NumberTitlePriority DateFiling Date
US09/270,223Expired - LifetimeUS6292170B1 (en)1997-04-251999-03-15Designing compound force sensations for computer applications

Country Status (3)

CountryLink
US (1)US6292170B1 (en)
AU (1)AU3879900A (en)
WO (1)WO2000055839A1 (en)

Cited By (64)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20020080150A1 (en)*2000-12-212002-06-27Rintaro NakataniGraphical display adjusting system
US6424356B2 (en)1999-05-052002-07-23Immersion CorporationCommand of force sensations in a forceback system using force effect suites
US6496200B1 (en)*1999-11-022002-12-17Interval Research Corp.Flexible variation of haptic interface resolution
US20030025723A1 (en)*2001-07-162003-02-06Immersion CorporationPivotable computer interface
US20030067440A1 (en)*2001-10-092003-04-10Rank Stephen D.Haptic feedback sensations based on audio output from computer devices
US6563523B1 (en)*1999-10-282003-05-13Midway Amusement Games LlcGraphical control of a time-based set-up feature for a video game
US6697086B2 (en)1995-12-012004-02-24Immersion CorporationDesigning force sensations for force feedback computer applications
US6703550B2 (en)2001-10-102004-03-09Immersion CorporationSound data output and manipulation using haptic feedback
US20040236541A1 (en)*1997-05-122004-11-25Kramer James F.System and method for constraining a graphical hand from penetrating simulated graphical objects
US20050004987A1 (en)*2003-07-032005-01-06Sbc, Inc.Graphical user interface for uploading files
US20050004944A1 (en)*1999-12-222005-01-06Cossins Robert N.Geographic management system
US6863536B1 (en)1998-01-262005-03-08Simbionix Ltd.Endoscopic tutorial system with a bleeding complication
US6885898B1 (en)2001-05-182005-04-26Roy-G-Biv CorporationEvent driven motion systems
US20050134561A1 (en)*2003-12-222005-06-23Tierling Kollin M.System and method for mapping instructions associated with haptic feedback
US20050184696A1 (en)*2003-12-192005-08-25Anastas George V.Haptic profiling system and method
US6941543B1 (en)1995-05-302005-09-06Roy-G-Biv CorporationMotion control system and method
US7024666B1 (en)2002-01-282006-04-04Roy-G-Biv CorporationMotion control systems and methods
US7027032B2 (en)1995-12-012006-04-11Immersion CorporationDesigning force sensations for force feedback computer applications
US7031798B2 (en)2001-02-092006-04-18Roy-G-Biv CorporationEvent management systems and methods for the distribution of motion control commands
US7039866B1 (en)1995-12-012006-05-02Immersion CorporationMethod and apparatus for providing dynamic force sensations for force feedback computer applications
US7084867B1 (en)*1999-04-022006-08-01Massachusetts Institute Of TechnologyHaptic interface system for collision detection and applications therefore
US7091948B2 (en)*1997-04-252006-08-15Immersion CorporationDesign of force sensations for haptic feedback computer interfaces
US7133033B1 (en)*1999-12-022006-11-07Advanced Input Devices Uk LimitedActuator for a switch
US7137107B1 (en)2003-04-292006-11-14Roy-G-Biv CorporationMotion control systems and methods
US7168042B2 (en)1997-11-142007-01-23Immersion CorporationForce effects for object types in a graphical user interface
US7209028B2 (en)2001-06-272007-04-24Immersion CorporationPosition sensor with resistive element
US7307619B2 (en)2001-05-042007-12-11Immersion Medical, Inc.Haptic interface for palpation simulation
US20080166115A1 (en)*2007-01-052008-07-10David SachsMethod and apparatus for producing a sharp image from a handheld device containing a gyroscope
US7404716B2 (en)2001-07-162008-07-29Immersion CorporationInterface apparatus with cable-driven force feedback and four grounded actuators
US20090007661A1 (en)*2007-07-062009-01-08Invensense Inc.Integrated Motion Processing Unit (MPU) With MEMS Inertial Sensing And Embedded Digital Electronics
US20090145225A1 (en)*2007-12-102009-06-11Invensense Inc.Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US20090184849A1 (en)*2008-01-182009-07-23Invensense, Inc.Interfacing application programs and motion sensors of a device
US20090193892A1 (en)*2008-02-052009-08-06Invensense Inc.Dual mode sensing for vibratory gyroscope
US20090289779A1 (en)*1997-11-142009-11-26Immersion CorporationForce feedback system including multi-tasking graphical host environment
US20090303204A1 (en)*2007-01-052009-12-10Invensense Inc.Controlling and accessing content using motion processing on mobile devices
US20100033426A1 (en)*2008-08-112010-02-11Immersion Corporation, A Delaware CorporationHaptic Enabled Gaming Peripheral for a Musical Game
US20100064805A1 (en)*2008-09-122010-03-18InvenSense,. Inc.Low inertia frame for detecting coriolis acceleration
US20100071467A1 (en)*2008-09-242010-03-25InvensenseIntegrated multiaxis motion sensor
US7742036B2 (en)2003-12-222010-06-22Immersion CorporationSystem and method for controlling haptic devices having multiple operational modes
US20100167820A1 (en)*2008-12-292010-07-01Houssam BarakatHuman interface device
US20100231539A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Interfaces Featuring Surface-Based Haptic Effects
US20100231540A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods For A Texture Engine
US20100231508A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Using Multiple Actuators to Realize Textures
US20100231550A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Friction Displays and Additional Haptic Effects
US20100231367A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Providing Features in a Friction Display
US7853645B2 (en)1997-10-072010-12-14Roy-G-Biv CorporationRemote generation and distribution of command programs for programmable devices
US7904194B2 (en)2001-02-092011-03-08Roy-G-Biv CorporationEvent management systems and methods for motion control systems
US7978186B2 (en)1998-10-262011-07-12Immersion CorporationMechanisms for control knobs and other interface devices
US8027349B2 (en)2003-09-252011-09-27Roy-G-Biv CorporationDatabase event driven motion systems
US8032605B2 (en)1999-10-272011-10-04Roy-G-Biv CorporationGeneration and distribution of motion commands over a distributed network
US8047075B2 (en)2007-06-212011-11-01Invensense, Inc.Vertically integrated 3-axis MEMS accelerometer with electronics
US8102869B2 (en)2003-09-252012-01-24Roy-G-Biv CorporationData routing systems and methods
US8271105B2 (en)1995-05-302012-09-18Roy-G-Biv CorporationMotion control systems
USD679728S1 (en)*2009-02-112013-04-09Ricoh Company, Ltd.Display screen with icon
US8508039B1 (en)2008-05-082013-08-13Invensense, Inc.Wafer scale chip scale packaging of vertically integrated MEMS sensors with electronics
US9056244B2 (en)2012-09-122015-06-16Wms Gaming Inc.Gaming apparatus incorporating targeted haptic feedback
US20160054156A1 (en)*2013-03-292016-02-25Atlas Copco Blm S.R.L.Electronic control device for controlling sensors
US20170090577A1 (en)*2015-09-252017-03-30Immersion CorporationHaptic effects design system
US9927873B2 (en)2009-03-122018-03-27Immersion CorporationSystems and methods for using textures in graphical user interface widgets
US20180126276A1 (en)*2016-11-082018-05-10CodeSpark, Inc.Level editor with word-free coding system
WO2019048847A1 (en)*2017-09-062019-03-14Simworx LtdModelling systems and methods for entertainment rides
US10564924B1 (en)*2015-09-302020-02-18Amazon Technologies, Inc.Navigating metadata in long form content
US10613629B2 (en)2015-03-272020-04-07Chad LaurendeauSystem and method for force feedback interface devices
US11103787B1 (en)2010-06-242021-08-31Gregory S. RabinSystem and method for generating a synthetic video stream

Patent Citations (95)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US3919691A (en)1971-05-261975-11-11Bell Telephone Labor IncTactile man-machine communication system
US4477043A (en)1982-12-151984-10-16The United States Of America As Represented By The Secretary Of The Air ForceBiodynamic resistant control stick
US4935728A (en)1985-01-021990-06-19Altra CorporationComputer control
US5103404A (en)1985-12-061992-04-07Tensor Development, Inc.Feedback for a manipulator
EP0265011A1 (en)1986-10-201988-04-27Océ-Nederland B.V.Inputting device with tactile feedback
US4800721A (en)1987-02-131989-01-31Caterpillar Inc.Force feedback lever
US4868549A (en)1987-05-181989-09-19International Business Machines CorporationFeedback mouse
US4896554A (en)1987-11-031990-01-30Culver Craig FMultifunction tactile manipulatable control
US5116180A (en)1988-07-181992-05-26Spar Aerospace LimitedHuman-in-the-loop machine control loop
US4907973A (en)1988-11-141990-03-13Hon David CExpert system simulator for modeling realistic internal environments and performance
US5044956A (en)1989-01-121991-09-03Atari Games CorporationControl device such as a steering wheel for video vehicle simulator with realistic feedback forces
US5076517A (en)1989-08-141991-12-31United Technologies CorporationProgrammable, linear collective control system for a helicopter
US5182557A (en)1989-09-201993-01-26Semborg Recrob, Corp.Motorized joystick
US5193963A (en)1990-10-311993-03-16The United States Of America As Represented By The Administrator Of The National Aeronautics And Space AdministrationForce reflecting hand controller
US5781172A (en)1990-12-051998-07-14U.S. Philips CorporationData input device for use with a data processing apparatus and a data processing apparatus provided with such a device
US5223776A (en)1990-12-311993-06-29Honeywell Inc.Six-degree virtual pivot controller
US5354162A (en)1991-02-261994-10-11Rutgers UniversityActuator system for providing force feedback to portable master support
US5341459A (en)1991-05-091994-08-23The United States Of America As Represented By The Administrator Of The National Aeronautics And Space AdministrationGeneralized compliant motion primitive
US5146566A (en)1991-05-291992-09-08Ibm CorporationInput/output system for computer user interface using magnetic levitation
US5386507A (en)1991-07-181995-01-31Teig; Steven L.Computer graphics system for selectively modelling molecules and investigating the chemical and physical properties thereof
US5185561A (en)1991-07-231993-02-09Digital Equipment CorporationTorque motor as a tactile feedback device in a computer system
US5186629A (en)1991-08-221993-02-16International Business Machines CorporationVirtual graphics display capable of presenting icons and windows to the blind computer user and method
US5235868A (en)1991-10-021993-08-17Culver Craig FMechanism for generating control signals
US5414337A (en)1991-10-241995-05-09Lex Computer And Management CorporationActuator having electronically controllable tactile responsiveness
US5889670A (en)1991-10-241999-03-30Immersion CorporationMethod and apparatus for tactilely responsive user interface
US5220260A (en)1991-10-241993-06-15Lex Computer And Management CorporationActuator having electronically controllable tactile responsiveness
US5402499A (en)1992-08-071995-03-28Lsi Logic CorporationMultimedia controller
US5666473A (en)1992-10-081997-09-09Science & Technology Corporation & UnmTactile computer aided sculpting device
US5790108A (en)1992-10-231998-08-04University Of British ColumbiaController
US5831408A (en)1992-12-021998-11-03Cybernet Systems CorporationForce feedback system
US5844392A (en)1992-12-021998-12-01Cybernet Systems CorporationHaptic browsing
US6131097A (en)1992-12-022000-10-10Immersion CorporationHaptic authoring
US5629594A (en)1992-12-021997-05-13Cybernet Systems CorporationForce feedback system
US5769640A (en)1992-12-021998-06-23Cybernet Systems CorporationMethod and system for simulating medical procedures including virtual reality and control method and system for use therein
US5389865A (en)1992-12-021995-02-14Cybernet Systems CorporationMethod and system for providing a tactile virtual reality and manipulator defining an interface device therefor
US5526480A (en)1992-12-281996-06-11International Business Machines CorporationTime domain scroll bar for multimedia presentations in a data processing system
US5550562A (en)1993-01-121996-08-27Fujitsu LimitedData processing device that enables mouse-operated application programs to be operated from an operation pad, and an operation pad for use with the same
US5451924A (en)1993-01-141995-09-19Massachusetts Institute Of TechnologyApparatus for providing sensory substitution of force feedback
EP0626634A2 (en)1993-05-111994-11-30Matsushita Electric Industrial Co., Ltd.Force-feedback data input device
US5405152A (en)1993-06-081995-04-11The Walt Disney CompanyMethod and apparatus for an interactive video game with physical feedback
US5513100A (en)1993-06-101996-04-30The University Of British ColumbiaVelocity controller with force feedback stiffness control
US5805140A (en)1993-07-161998-09-08Immersion CorporationHigh bandwidth force feedback interface using voice coils and flexures
WO1995002801A1 (en)1993-07-161995-01-26Immersion Human InterfaceThree-dimensional mechanical mouse
US5576727A (en)1993-07-161996-11-19Immersion Human Interface CorporationElectromechanical human-computer interface with force feedback
US5739811A (en)1993-07-161998-04-14Immersion Human Interface CorporationMethod and apparatus for controlling human-computer interface systems providing force feedback
US5734373A (en)1993-07-161998-03-31Immersion Human Interface CorporationMethod and apparatus for controlling force feedback interface systems utilizing a host computer
US5625576A (en)1993-10-011997-04-29Massachusetts Institute Of TechnologyForce reflecting haptic interface
US5461711A (en)1993-12-221995-10-24Interval Research CorporationMethod and system for spatial accessing of time-based information
US5596347A (en)1994-01-271997-01-21Microsoft CorporationSystem and method for computer cursor control
US5742278A (en)*1994-01-271998-04-21Microsoft CorporationForce feedback joystick with digital signal processor controlled by host processor
WO1995020788A1 (en)1994-01-271995-08-03Exos, Inc.Intelligent remote multimode sense and display system utilizing haptic information compression
US5709219A (en)1994-01-271998-01-20Microsoft CorporationMethod and apparatus to create a complex tactile sensation
US5482051A (en)1994-03-101996-01-09The University Of AkronElectromyographic virtual reality system
WO1995032459A1 (en)1994-05-191995-11-30Exos, Inc.Interactive simulation system including force feedback input device
US6004134A (en)1994-05-191999-12-21Exos, Inc.Interactive simulation including force feedback
US5643087A (en)1994-05-191997-07-01Microsoft CorporationInput device including digital force feedback apparatus
US5684722A (en)1994-09-211997-11-04Thorner; CraigApparatus and method for generating a control signal for a tactile sensation generator
US5565840A (en)1994-09-211996-10-15Thorner; CraigTactile sensation generator
US5642469A (en)1994-11-031997-06-24University Of WashingtonDirect-drive manipulator for pen-based force display
US5666138A (en)1994-11-221997-09-09Culver; Craig F.Interface control
US5714978A (en)1994-12-051998-02-03Nec CorporationAdjacent cursor system with tactile feedback for the blind
US5715412A (en)1994-12-161998-02-03Hitachi, Ltd.Method of acoustically expressing image information
US5721566A (en)1995-01-181998-02-24Immersion Human Interface Corp.Method and apparatus for providing damping force feedback
US5767839A (en)1995-01-181998-06-16Immersion Human Interface CorporationMethod and apparatus for providing passive force feedback to human-computer interface systems
US5784052A (en)1995-03-131998-07-21U.S. Philips CorporationVertical translation of mouse or trackball enables truly 3D input
US5755577A (en)1995-03-291998-05-26Gillio; Robert G.Apparatus and method for recording data of a surgical procedure
US5736978A (en)*1995-05-261998-04-07The United States Of America As Represented By The Secretary Of The Air ForceTactile graphics display
US5589854A (en)1995-06-221996-12-31Tsai; Ming-ChangTouching feedback device
US5760788A (en)1995-07-281998-06-02Microsoft CorporationGraphical programming system and method for enabling a person to learn text-based programming
US5808601A (en)1995-09-121998-09-15International Business Machines CorporationInteractive object selection pointer method and apparatus
US5691898A (en)*1995-09-271997-11-25Immersion Human Interface Corp.Safe and low cost computer peripherals with force feedback for consumer applications
US5754023A (en)1995-10-261998-05-19Cybernet Systems CorporationGyro-stabilized platforms for force-feedback applications
US5717869A (en)*1995-11-031998-02-10Xerox CorporationComputer controlled display system using a timeline to control playback of temporal data representing collaborative activities
US6088017A (en)1995-11-302000-07-11Virtual Technologies, Inc.Tactile feedback man-machine interface device
WO1997021160A2 (en)1995-12-011997-06-12Immersion Human Interface CorporationMethod and apparatus for providing force feedback for a graphical user interface
US6169540B1 (en)1995-12-012001-01-02Immersion CorporationMethod and apparatus for designing force sensations in force feedback applications
US6147674A (en)1995-12-012000-11-14Immersion CorporationMethod and apparatus for designing force sensations in force feedback computer applications
US5959613A (en)1995-12-011999-09-28Immersion CorporationMethod and apparatus for shaping force signals for a force feedback device
US6028593A (en)1995-12-012000-02-22Immersion CorporationMethod and apparatus for providing simulated physical interactions within computer generated environments
US6078308A (en)1995-12-132000-06-20Immersion CorporationGraphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
US5760764A (en)1995-12-131998-06-02AltraComputer display cursor controller with serial interface
US5956484A (en)1995-12-131999-09-21Immersion CorporationMethod and apparatus for providing force feedback over a computer network
WO1997031333A1 (en)1996-02-231997-08-28Shalit TomerDisplay arrangement and method
US6111577A (en)1996-04-042000-08-29Massachusetts Institute Of TechnologyMethod and apparatus for determining forces to be applied to a user through a haptic interface
US5857986A (en)1996-05-241999-01-12Moriyasu; HiroInteractive vibrator for multimedia
US5802353A (en)1996-06-121998-09-01General Electric CompanyHaptic computer modeling system
US6125385A (en)1996-08-012000-09-26Immersion CorporationForce feedback implementation in web pages
US6001014A (en)1996-10-011999-12-14Sony Computer Entertainment Inc.Game machine control module and game machine
US5973689A (en)1996-10-301999-10-26U.S. Philips CorporationCursor control with user feedback mechanism
US5825308A (en)1996-11-261998-10-20Immersion Human Interface CorporationForce feedback interface having isotonic and isometric functionality
US6020876A (en)1997-04-142000-02-01Immersion CorporationForce feedback interface with selective disturbance filter
US6005551A (en)1997-04-251999-12-21Microsoft CorporationOffline force effect rendering
WO1998049614A1 (en)1997-04-251998-11-05Immersion CorporationMethod and apparatus for designing and controlling force sensations in force feedback computer applications
EP0875819A1 (en)1997-04-251998-11-04Microsoft CorporationOffline force effect rendering
WO1998058308A1 (en)1997-06-171998-12-23Immersion CorporationMethod and apparatus for designing force sensations in force feedback computer applications

Non-Patent Citations (32)

* Cited by examiner, † Cited by third party
Title
Adelstein, B. et al., "Design & Implementation of a Force Reflecting Manipulandum for Manual Control Research," NASA Ames Research, 1992, pp. 1-24.
Akamatsu et al., "Multimodal Mouse: A Mouse-Type Device with Tactile & Force Display," Presence, vol. 3, No. 1, 1994, pp. 73-80.
Atkinson, W. et al., "Computing with Feeling," Comp. & Graphics, vol. 2, 1976, pp. 97-103.
Bejczy et al., "The Phantom Robot: Predictive Displays for Teleoperation with Time Delay," CA Institute of Technology, IEEE 1990, pp. 546-550.
Brooks Jr., Frederick et al., "Project GROPE-Haptic Displays for Scientific Visualization," Computer Graphics, vol. 24, No. 4, 1990, pp. 177-185.
Colgate, J. Edward et al., "Implementation of Stiff Virtual Walls in Force-Reflecting Interfaces," Northwestern University, 1993.
Hannaford et al., "Force Feedback Cursor Control," NASA Tech Brief, vol. 13, No. 11, Item #21, 1989, pp. 1-4.
Hiroo Iwata, "Artificial Reality with Force-Feedback: Development of Desktop Virtual Space with Compact Master Manipulator," Computer Graphics, vol. 24, No. 4, 1990, pp. 165-170.
Hirota et al., "Development Of Surface Display," University of Tokyo, IEEE, 1993, pp. 256-262.
Jones, L.A. et al., "A Perceptual Analysis of Stiffness," Experimental Brain Research (1990) 79:150-156.
Kelley et al., "On the Development of a Force-Feedback Mouse & It's Integration into a Graphical User Interface," Int'l Mechanical Engineering Congress and Exhibition, 1994, pp. 1-8.
Kelley et al., "Magic Mouse: Tactile & Kinesthetic Feedback in the Human-Computer Interface using an Electromagnetically Actuated Input/Output Device," 1993, University of British Columbia.
Kilpatrick et al., "The Use of Kinesthetic Supplement in an Interactive Graphics System," University of North Carolina, 1976, pp. 1-174.
L. Rosenberg, "A Force Feedback Programming Primer-For PC Gaming Peripherals Supporting I-Force 2.0 and Direct-X 5.0," Immersion Corporation 1997.
Minsky, Margaret et al., "Feeling & Seeing: Issues in Force Display," ACM 1990, pp. 235-242, 270.
Munch et al., "Intelligent Control for Haptic Displays," Eurographics '96, Blackwell Publishers, vol. 15, No. 3, 1996, pp. C-217-226.
Ouh-Young et al., "Using A Manipulator for Force Display in Molecular Docking," University of North Carolina, IEEE 1988, pp. 1824-1829.
Ouh-young, Ming et al., "Creating an Illusion of Feel: Control Issues in Force Display," University of N. Carolina, 1989, pp. 1-14.
Payette et al., "Evaluation Of Force Feedback Computer Pointing device in Zero Gravity," DSC-vol. 58, Proc. of ASME Dynamics Systems and Control Division, 1996, pp. 547-553.
Rosenberg et al., "Commercially Viable Force Feedback Controller for Individuals with Neuromotor Disabilities," Crew Systems Directorate, AL/CF-TR-1997-0016, 1996, pp. 1-33.
Rosenberg et al., "Perceptual Decomposition of Virtual Haptic Surfaces," Proc. IEEE Symposium on Research Frontiers in Virtual Reality, 1993.
Rosenberg et al., "The use of force feedback to enhance graphical user interfaces," Stereoscopic Displays and Virtual Reality Systems, Proc. SPIE, 1996, pp. 243-248.
Rosenberg, "Virtual Haptic Overlays Enhance Performance in Telepresence Tasks," Stanford University, 1994.
Rosenberg, L., "Perceptual Design of a Virtual Rigid Surface Contact," Air Force Materiel Command, AL/CF-TR-1995-0029, 1993, pp. 1-39.
Rosenberg, L., "The Use of Fixtures to Enhance Operator Performance in Time Delayed Teleoperation," Air Force Materiel Command, AL/CF-TR-1994-0139, 1993, pp. 1-45.
Russo, Massimo, "The Design & Implementation of a Three Degree of Freedom Force Output Joystick," Dept. of Mech. Engineering, 1990, pp. 1-40.
Schmult, B. et al., "Application Areas for a Force-Feedback Joystick," DSC-vol. 49, Advances in Robotics, Mechatronics, and Haptic Interfaces, ASME 1993, pp. 47-54.
SensAble Technologies, Inc., GHOST SDK Programmer's Guide, Vers. 3.0, Rev. 1.2, Jan. 3, 1999, pp. 1-1 to F-1.
Su, S. Augustine et al., "The Virtual Panel Architecture: A 3D Gesture Framework," IEEE 1993, pp. 387-393.
Tan, Hong et al., "Human Factors for the Design of Force-Reflecting Haptic Interfaces," MIT, 1994.
Winey III et al., "Computer Simulated Visual & Tactile Feedback as an aid to Manipulator & Vehicle Control," MIT, 1981, pp. 1-79.
Yokokohji, et al., "What You Can See is What You Can Feel-Development of a Visual/Haptic Interface to Virtual Environment," 0-8186-7295 IEEE, Jan. 1996.

Cited By (129)

* Cited by examiner, † Cited by third party
Publication number / Priority date / Publication date / Assignee / Title
US6941543B1 (en)1995-05-302005-09-06Roy-G-Biv CorporationMotion control system and method
US8271105B2 (en)1995-05-302012-09-18Roy-G-Biv CorporationMotion control systems
US7027032B2 (en)1995-12-012006-04-11Immersion CorporationDesigning force sensations for force feedback computer applications
US7039866B1 (en)1995-12-012006-05-02Immersion CorporationMethod and apparatus for providing dynamic force sensations for force feedback computer applications
US6697086B2 (en)1995-12-012004-02-24Immersion CorporationDesigning force sensations for force feedback computer applications
US7843424B2 (en)1995-12-012010-11-30Immersion CorporationMethod and apparatus for designing force sensations in force feedback computer applications
US7091948B2 (en)*1997-04-252006-08-15Immersion CorporationDesign of force sensations for haptic feedback computer interfaces
US20100201502A1 (en)*1997-04-252010-08-12Immersion CorporationDesign of Force Sensations For Haptic Feedback Computer Interfaces
US7701438B2 (en)*1997-04-252010-04-20Immersion CorporationDesign of force sensations for haptic feedback computer interfaces
US8717287B2 (en)*1997-04-252014-05-06Immersion CorporationForce sensations for haptic feedback computer interfaces
US20060279538A1 (en)*1997-04-252006-12-14Chang Dean CDesign of force sensations for haptic feedback computer interfaces
US7472047B2 (en)1997-05-122008-12-30Immersion CorporationSystem and method for constraining a graphical hand from penetrating simulated graphical objects
US20040236541A1 (en)*1997-05-122004-11-25Kramer James F.System and method for constraining a graphical hand from penetrating simulated graphical objects
US7853645B2 (en)1997-10-072010-12-14Roy-G-Biv CorporationRemote generation and distribution of command programs for programmable devices
US9740287B2 (en)1997-11-142017-08-22Immersion CorporationForce feedback system including multi-tasking graphical host environment and interface device
US8928581B2 (en)1997-11-142015-01-06Immersion CorporationForce feedback system including multi-tasking graphical host environment
US20090289779A1 (en)*1997-11-142009-11-26Immersion CorporationForce feedback system including multi-tasking graphical host environment
US20100271295A1 (en)*1997-11-142010-10-28Immersion CorporationForce feedback system including multi-tasking graphical host environment and interface device
US8020095B2 (en)1997-11-142011-09-13Immersion CorporationForce feedback system including multi-tasking graphical host environment
US9778745B2 (en)1997-11-142017-10-03Immersion CorporationForce feedback system including multi-tasking graphical host environment and interface device
US8527873B2 (en)1997-11-142013-09-03Immersion CorporationForce feedback system including multi-tasking graphical host environment and interface device
US9323332B2 (en)1997-11-142016-04-26Immersion CorporationForce feedback system including multi-tasking graphical host environment
US10234944B2 (en)1997-11-142019-03-19Immersion CorporationForce feedback system including multi-tasking graphical host environment
US7168042B2 (en)1997-11-142007-01-23Immersion CorporationForce effects for object types in a graphical user interface
US6863536B1 (en)1998-01-262005-03-08Simbionix Ltd.Endoscopic tutorial system with a bleeding complication
US7978186B2 (en)1998-10-262011-07-12Immersion CorporationMechanisms for control knobs and other interface devices
US7084867B1 (en)*1999-04-022006-08-01Massachusetts Institute Of TechnologyHaptic interface system for collision detection and applications therefore
US6424356B2 (en)1999-05-052002-07-23Immersion CorporationCommand of force sensations in a forceback system using force effect suites
US8032605B2 (en)1999-10-272011-10-04Roy-G-Biv CorporationGeneration and distribution of motion commands over a distributed network
US6563523B1 (en)*1999-10-282003-05-13Midway Amusement Games LlcGraphical control of a time-based set-up feature for a video game
US6496200B1 (en)*1999-11-022002-12-17Interval Research Corp.Flexible variation of haptic interface resolution
US6803924B1 (en)1999-11-022004-10-12Interval Research CorporationFlexible variation of haptic interface resolution
US7133033B1 (en)*1999-12-022006-11-07Advanced Input Devices Uk LimitedActuator for a switch
US9363146B2 (en)1999-12-222016-06-07Celeritasworks, LlcGeographic management system for determining and displaying network data and geospatial data
US20050004944A1 (en)*1999-12-222005-01-06Cossins Robert N.Geographic management system
US10187268B2 (en)*1999-12-222019-01-22Celeritasworks, LlcGeographic management system
US20020080150A1 (en)*2000-12-212002-06-27Rintaro NakataniGraphical display adjusting system
US7904194B2 (en)2001-02-092011-03-08Roy-G-Biv CorporationEvent management systems and methods for motion control systems
US7031798B2 (en)2001-02-092006-04-18Roy-G-Biv CorporationEvent management systems and methods for the distribution of motion control commands
US7307619B2 (en)2001-05-042007-12-11Immersion Medical, Inc.Haptic interface for palpation simulation
US8638308B2 (en)2001-05-042014-01-28Immersion Medical, Inc.Haptic interface for palpation simulation
US7024255B1 (en)2001-05-182006-04-04Roy-G-Biv CorporationEvent driven motion systems
US6885898B1 (en)2001-05-182005-04-26Roy-G-Biv CorporationEvent driven motion systems
US7209028B2 (en)2001-06-272007-04-24Immersion CorporationPosition sensor with resistive element
US7404716B2 (en)2001-07-162008-07-29Immersion CorporationInterface apparatus with cable-driven force feedback and four grounded actuators
US20030025723A1 (en)*2001-07-162003-02-06Immersion CorporationPivotable computer interface
US8007282B2 (en)2001-07-162011-08-30Immersion CorporationMedical simulation interface apparatus and method
US7877243B2 (en)2001-07-162011-01-25Immersion CorporationPivotable computer interface
US7623114B2 (en)2001-10-092009-11-24Immersion CorporationHaptic feedback sensations based on audio output from computer devices
US8441437B2 (en)2001-10-092013-05-14Immersion CorporationHaptic feedback sensations based on audio output from computer devices
US8686941B2 (en)2001-10-092014-04-01Immersion CorporationHaptic feedback sensations based on audio output from computer devices
US20030067440A1 (en)*2001-10-092003-04-10Rank Stephen D.Haptic feedback sensations based on audio output from computer devices
US7208671B2 (en)2001-10-102007-04-24Immersion CorporationSound data output and manipulation using haptic feedback
US6703550B2 (en)2001-10-102004-03-09Immersion CorporationSound data output and manipulation using haptic feedback
US20040161118A1 (en)*2001-10-102004-08-19Chu Lonny L.Sound data output and manipulation using haptic feedback
US7024666B1 (en)2002-01-282006-04-04Roy-G-Biv CorporationMotion control systems and methods
US7137107B1 (en)2003-04-292006-11-14Roy-G-Biv CorporationMotion control systems and methods
US20050004987A1 (en)*2003-07-032005-01-06Sbc, Inc.Graphical user interface for uploading files
US8102869B2 (en)2003-09-252012-01-24Roy-G-Biv CorporationData routing systems and methods
US8027349B2 (en)2003-09-252011-09-27Roy-G-Biv CorporationDatabase event driven motion systems
US7982711B2 (en)2003-12-192011-07-19Immersion CorporationHaptic profiling system and method
US20050184696A1 (en)*2003-12-192005-08-25Anastas George V.Haptic profiling system and method
US20050134561A1 (en)*2003-12-222005-06-23Tierling Kollin M.System and method for mapping instructions associated with haptic feedback
US7742036B2 (en)2003-12-222010-06-22Immersion CorporationSystem and method for controlling haptic devices having multiple operational modes
US7791588B2 (en)2003-12-222010-09-07Immersion CorporationSystem and method for mapping instructions associated with haptic feedback
US7907838B2 (en)2007-01-052011-03-15Invensense, Inc.Motion sensing and processing on mobile devices
US8462109B2 (en)2007-01-052013-06-11Invensense, Inc.Controlling and accessing content using motion processing on mobile devices
US20110163955A1 (en)*2007-01-052011-07-07Invensense, Inc.Motion sensing and processing on mobile devices
US20100214216A1 (en)*2007-01-052010-08-26Invensense, Inc.Motion sensing and processing on mobile devices
US7796872B2 (en)2007-01-052010-09-14Invensense, Inc.Method and apparatus for producing a sharp image from a handheld device containing a gyroscope
US9292102B2 (en)2007-01-052016-03-22Invensense, Inc.Controlling and accessing content using motion processing on mobile devices
US8351773B2 (en)2007-01-052013-01-08Invensense, Inc.Motion sensing and processing on mobile devices
US20080166115A1 (en)*2007-01-052008-07-10David SachsMethod and apparatus for producing a sharp image from a handheld device containing a gyroscope
US20090303204A1 (en)*2007-01-052009-12-10Invensense Inc.Controlling and accessing content using motion processing on mobile devices
US8047075B2 (en)2007-06-212011-11-01Invensense, Inc.Vertically integrated 3-axis MEMS accelerometer with electronics
US10288427B2 (en)2007-07-062019-05-14Invensense, Inc.Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US8250921B2 (en)2007-07-062012-08-28Invensense, Inc.Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US20090007661A1 (en)*2007-07-062009-01-08Invensense Inc.Integrated Motion Processing Unit (MPU) With MEMS Inertial Sensing And Embedded Digital Electronics
US8997564B2 (en)2007-07-062015-04-07Invensense, Inc.Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US20110197677A1 (en)*2007-12-102011-08-18Invensense, Inc.Vertically integrated 3-axis mems angular accelerometer with integrated electronics
US9846175B2 (en)2007-12-102017-12-19Invensense, Inc.MEMS rotation sensor with integrated electronics
US20090145225A1 (en)*2007-12-102009-06-11Invensense Inc.Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US7934423B2 (en)2007-12-102011-05-03Invensense, Inc.Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US8960002B2 (en)2007-12-102015-02-24Invensense, Inc.Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US9811174B2 (en)2008-01-182017-11-07Invensense, Inc.Interfacing application programs and motion sensors of a device
US9342154B2 (en)2008-01-182016-05-17Invensense, Inc.Interfacing application programs and motion sensors of a device
US20090184849A1 (en)*2008-01-182009-07-23Invensense, Inc.Interfacing application programs and motion sensors of a device
US8952832B2 (en)2008-01-182015-02-10Invensense, Inc.Interfacing application programs and motion sensors of a device
US8020441B2 (en)2008-02-052011-09-20Invensense, Inc.Dual mode sensing for vibratory gyroscope
US20090193892A1 (en)*2008-02-052009-08-06Invensense Inc.Dual mode sensing for vibratory gyroscope
US8508039B1 (en)2008-05-082013-08-13Invensense, Inc.Wafer scale chip scale packaging of vertically integrated MEMS sensors with electronics
US20100033426A1 (en)*2008-08-112010-02-11Immersion Corporation, A Delaware CorporationHaptic Enabled Gaming Peripheral for a Musical Game
US8539835B2 (en)2008-09-122013-09-24Invensense, Inc.Low inertia frame for detecting coriolis acceleration
US20100064805A1 (en)*2008-09-122010-03-18InvenSense,. Inc.Low inertia frame for detecting coriolis acceleration
US8141424B2 (en)2008-09-122012-03-27Invensense, Inc.Low inertia frame for detecting coriolis acceleration
US20100071467A1 (en)*2008-09-242010-03-25InvensenseIntegrated multiaxis motion sensor
US20100167820A1 (en)*2008-12-292010-07-01Houssam BarakatHuman interface device
USD679728S1 (en)*2009-02-112013-04-09Ricoh Company, Ltd.Display screen with icon
US10073527B2 (en)2009-03-122018-09-11Immersion CorporationSystems and methods for providing features in a friction display including a haptic effect based on a color and a degree of shading
US20100231539A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Interfaces Featuring Surface-Based Haptic Effects
US20100231508A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Using Multiple Actuators to Realize Textures
US9746923B2 (en)2009-03-122017-08-29Immersion CorporationSystems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction
US20100231540A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods For A Texture Engine
US20100231367A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Providing Features in a Friction Display
US10620707B2 (en)2009-03-122020-04-14Immersion CorporationSystems and methods for interfaces featuring surface-based haptic effects
US10073526B2 (en)2009-03-122018-09-11Immersion CorporationSystems and methods for friction displays and additional haptic effects
US20100231550A1 (en)*2009-03-122010-09-16Immersion CorporationSystems and Methods for Friction Displays and Additional Haptic Effects
US10379618B2 (en)2009-03-122019-08-13Immersion CorporationSystems and methods for using textures in graphical user interface widgets
US9696803B2 (en)2009-03-122017-07-04Immersion CorporationSystems and methods for friction displays and additional haptic effects
US9927873B2 (en)2009-03-122018-03-27Immersion CorporationSystems and methods for using textures in graphical user interface widgets
US9874935B2 (en)2009-03-122018-01-23Immersion CorporationSystems and methods for a texture engine
US10466792B2 (en)2009-03-122019-11-05Immersion CorporationSystems and methods for friction displays and additional haptic effects
US10198077B2 (en)2009-03-122019-02-05Immersion CorporationSystems and methods for a texture engine
US10747322B2 (en)2009-03-122020-08-18Immersion CorporationSystems and methods for providing features in a friction display
US10007340B2 (en)2009-03-122018-06-26Immersion CorporationSystems and methods for interfaces featuring surface-based haptic effects
US10248213B2 (en)2009-03-122019-04-02Immersion CorporationSystems and methods for interfaces featuring surface-based haptic effects
US10564721B2 (en)2009-03-122020-02-18Immersion CorporationSystems and methods for using multiple actuators to realize textures
US11103787B1 (en)2010-06-242021-08-31Gregory S. RabinSystem and method for generating a synthetic video stream
US9056244B2 (en)2012-09-122015-06-16Wms Gaming Inc.Gaming apparatus incorporating targeted haptic feedback
US11525713B2 (en)*2013-03-292022-12-13Atlas Copco Blm S.R.L.Electronic control device for controlling sensors
US20160054156A1 (en)*2013-03-292016-02-25Atlas Copco Blm S.R.L.Electronic control device for controlling sensors
US10613629B2 (en)2015-03-272020-04-07Chad LaurendeauSystem and method for force feedback interface devices
US20170090577A1 (en)*2015-09-252017-03-30Immersion CorporationHaptic effects design system
US10564924B1 (en)*2015-09-302020-02-18Amazon Technologies, Inc.Navigating metadata in long form content
US10786737B2 (en)*2016-11-082020-09-29CodeSpark, Inc.Level editor with word-free coding system
US20180126276A1 (en)*2016-11-082018-05-10CodeSpark, Inc.Level editor with word-free coding system
GB2578850A (en)*2017-09-062020-05-27Simworx LtdModelling systems and methods for entertainment rides
CN111372663A (en)*2017-09-062020-07-03森沃克斯有限公司Modeling system and method for amusement ride installation
WO2019048847A1 (en)*2017-09-062019-03-14Simworx LtdModelling systems and methods for entertainment rides

Also Published As

Publication number / Publication date
AU3879900A (en)2000-10-04
WO2000055839A1 (en)2000-09-21

Similar Documents

Publication / Publication Date / Title
US6292170B1 (en)Designing compound force sensations for computer applications
US6285351B1 (en)Designing force sensations for computer applications including sounds
US7091948B2 (en)Design of force sensations for haptic feedback computer interfaces
US7843424B2 (en)Method and apparatus for designing force sensations in force feedback computer applications
US6169540B1 (en)Method and apparatus for designing force sensations in force feedback applications
US6750877B2 (en)Controlling haptic feedback for enhancing navigation in a graphical environment
US9582077B2 (en)Providing force feedback to a user of an interface device based on interactions of a user-controlled compliant paddle with a simulated object in a graphical environment
US9323332B2 (en)Force feedback system including multi-tasking graphical host environment
EP1012697B1 (en)Method and apparatus for designing and controlling force sensations in force feedback computer applications
US6424356B2 (en)Command of force sensations in a forceback system using force effect suites
US6433771B1 (en)Haptic device attribute control
WO2002057885A2 (en)Controlling haptic feedback for enhancing navigation in a graphical environment
US20210149545A1 (en)Systems for navigating a three-dimensional model with a handheld controller
Ramstein et al. Software Tools for Programming High-Quality Haptic Interfaces

Legal Events

Date / Code / Title / Description
AS Assignment

Owner name: IMMERSION CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHANG, DEAN C.;ROSENBERG, LOUIS B.;MALLETT, JEFFREY R.;REEL/FRAME:010014/0716;SIGNING DATES FROM 19990601 TO 19990602

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: IMMERSION CORPORATION (DELAWARE CORPORATION), CALIFORNIA

Free format text: MERGER;ASSIGNOR:IMMERSION CORPORATION (CALIFORNIA CORPORATION);REEL/FRAME:012607/0368

Effective date: 19991102

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

