US20210397260A1 - Methods and systems for decoding and rendering a haptic effect associated with a 3D environment - Google Patents

Methods and systems for decoding and rendering a haptic effect associated with a 3d environment

Info

Publication number
US20210397260A1
Authority
US
United States
Prior art keywords
haptic
user
environment
data
virtual
Prior art date: 2020-06-23
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/343,811
Other versions
US11698680B2 (en)
Inventor
David Birnbaum
Yeshwant Muthusamy
Jamal Saboune
Christopher Ullrich
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2020-06-23
Filing date: 2021-06-10
Publication date: 2021-12-23
Application filed by Immersion Corp
Priority to US17/343,811 (US11698680B2)
Assigned to IMMERSION CORPORATION. Assignment of assignors interest (see document for details). Assignors: ULLRICH, CHRISTOPHER; BIRNBAUM, DAVID; MUTHUSAMY, YESHWANT; SABOUNE, JAMAL
Priority to PCT/US2021/036911 (WO2021262453A1)
Priority to EP21829297.7A (EP4168884A4)
Publication of US20210397260A1
Application granted
Publication of US11698680B2
Status: Active
Anticipated expiration


Abstract

In aspects, methods and apparatus are provided for generating a haptic effect for a three-dimensional (3D) environment that is experienced virtually by a user. The methods may be performed by a processor, and include receiving media data that describes the 3D environment, wherein the media data includes haptic data which describes a haptic characteristic associated with at least one object, structure, or event in the 3D environment. The methods further include performing a haptic decoding operation and a haptic rendering operation. The decoding operation may include extracting the haptic data from the media data. The haptic rendering operation may include generating a drive signal and communicating the drive signal to a haptic output device to cause the haptic output device to generate a haptic effect at a user peripheral device. Numerous other aspects are provided.
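As a concrete illustration of this pipeline, the following Python sketch decodes haptic data out of a media payload and renders it as drive signals. The payload layout, the HapticCharacteristic fields, and the device.send() transport are assumptions made for the example, not a format defined by this application.

```python
from dataclasses import dataclass

@dataclass
class HapticCharacteristic:
    """One decoded haptic characteristic (fields are assumptions for this sketch)."""
    target_id: int     # object, structure, or event the characteristic is associated with
    effect_type: str   # e.g. "vibration", "texture", "thermal", "firmness"
    intensity: float   # normalized 0.0 .. 1.0

def decode_haptics(media_data: dict) -> list[HapticCharacteristic]:
    """Haptic decoding operation: extract the haptic data from the media data.
    Assumes the media data carries a 'haptics' payload portion next to visual/audio."""
    return [HapticCharacteristic(**record) for record in media_data.get("haptics", [])]

def render_haptics(characteristics: list[HapticCharacteristic], device) -> None:
    """Haptic rendering operation: generate a drive signal per characteristic and
    communicate it to the haptic output device in the user peripheral device."""
    for c in characteristics:
        amplitude = max(0.0, min(1.0, c.intensity))  # clamp to the actuator's range
        drive_signal = {"target": c.target_id, "type": c.effect_type, "amplitude": amplitude}
        device.send(drive_signal)  # hypothetical peripheral transport
```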


Claims (20)

What is claimed is:
1. A method of providing haptic effects for a three-dimensional (3D) environment that is experienced virtually by a user, the method comprising:
receiving, by at least one processor, media data that describes the 3D environment, wherein the media data includes: haptic data which describes a haptic characteristic associated with at least one object, structure, or event in the 3D environment;
performing, by the at least one processor, a haptic decoding operation that includes extracting the haptic data from the media data; and
performing, by the at least one processor, a haptic rendering operation that includes:
(i) generating a drive signal based on the haptic characteristic and based on at least one of a virtual viewpoint location of a user in the 3D environment or a virtual field of view of the user in the 3D environment, wherein the virtual viewpoint location is a location at which the user is virtually located in the 3D environment, or is a location at which a 3D representation of the user is located in the 3D environment, and
(ii) communicating the drive signal to a haptic output device in a user peripheral device in communication with the at least one processor, to cause the haptic output device to generate a haptic effect at the user peripheral device.
2. The method of claim 1, wherein the media data further includes visual data that describes an appearance of the 3D environment in an omnidirectional manner relative to at least one location in the 3D environment.
3. The method of claim 1, wherein the media data that is received includes a payload which has a first payload portion that includes the visual data, a second payload portion that includes the haptic data, and a third payload portion that includes audio data.
4. The method of claim 1, wherein the media data further includes point cloud data that describes the object or structure in the 3D environment, and
wherein the haptic rendering operation includes generating the drive signal based on a spatial relationship between the viewpoint location and the object or structure described by the point cloud data.
5. The method of claim 4, wherein the haptic rendering operation includes generating the drive signal based on user interaction with the object or structure in the 3D environment, and wherein the method further comprises detecting a collision between at least a portion of the 3D representation of the user and the object or structure, wherein the haptic rendering operation is performed in response to detecting the collision.
6. The method of claim 1, wherein the haptic data includes a texture map or texture atlas that describes a virtual surface texture or virtual surface material associated with the object or structure in the 3D environment, and wherein the haptic rendering operation includes generating the drive signal based on the virtual surface texture or virtual surface material.
7. The method of claim 1, wherein the haptic data describes a virtual thermal characteristic associated with the object or structure in the 3D environment, and wherein the haptic rendering operation includes generating the drive signal based on the virtual thermal characteristic.
8. The method of claim 1, wherein the haptic data describes a virtual firmness characteristic associated with the object or structure in the 3D environment, and wherein the haptic rendering operation includes generating the drive signal based on the virtual firmness characteristic.
9. The method of claim 1, wherein the haptic data describes a virtual vibration characteristic associated with the object or structure in the 3D environment, and wherein the haptic rendering operation includes generating the drive signal based on the virtual vibration characteristic.
10. The method of claim 1, wherein the haptic data includes content which is pre-rendered at a network entity, and wherein the content is received by the at least one processor from the network entity.
11. The method of claim 1, wherein the visual data includes video data for a 360-degree video that describes an appearance of the 3D environment in a manner that allows the virtual field of view within the 3D environment to be controlled with six degrees of freedom (6 DoF).
12. The method of claim 11, wherein the haptic characteristic associated with the object, structure, or event in the 3D environment is a first haptic track,
wherein the haptic data describes a second haptic track associated with the 360-degree video, and
wherein the haptic rendering operation includes generating the drive signal based on the first haptic track and the second haptic track.
13. The method of claim 1, wherein the media data includes audio data, wherein the haptic data includes a haptic characteristic which is associated with the audio data,
wherein the audio data describes crowd noise at a venue represented by the 3D environment, and wherein the haptic characteristic describes tactile essence associated with the crowd noise.
14. The method of claim 1, wherein the haptic rendering operation includes generating the drive signal based on at least one of a device type or device capability of the haptic output device in the user peripheral device.
15. The method of claim 1, wherein the haptic output device is a wearable device among a plurality of wearable devices in communication with the at least one processor, wherein the haptic rendering operation includes selecting the wearable device, from among the plurality of wearable devices, to output the haptic effect.
16. The method of claim 1, wherein the haptic rendering operation includes generating the drive signal based on a geometry or geometry type of the object or structure in the 3D environment.
17. The method of claim 1, wherein the haptic data includes multiple haptic tracks associated with multiple imaginary 3D wedges that emanate from a common location and divide the object or structure into multiple 3D portions, respectively, and
wherein the haptic rendering operation includes generating the drive signal based on which 3D wedge or 3D portion of the object or structure is receiving user interaction or is within the user field of view.
18. The method of claim 1, wherein, when the haptic data includes multiple haptic tracks associated with multiple nested shapes that divide the object or structure into multiple layers, the haptic rendering operation includes generating the drive signal based on which nested shape of the multiple nested shapes is receiving user interaction.
19. A method of providing haptic effects for a chat application, the method comprising:
displaying, during a chat session between a first user and a second user, a video of the first user or an animation representing the first user on an end user device of the second user;
receiving user input on the end user device of the second user, wherein the user input indicates selection of the first user, or selection of an avatar representing the first user; and
generating, by the end user device of the second user, a message for transmission to an end user device of the first user, wherein the message indicates that a haptic effect is to be generated on the end user device of the first user, wherein the message is based on the user input received on the end user device of the second user.
20. A method of providing a chat session among at least a first user and a second user for a chat application, the method comprising:
determining, by an end user device of the second user, a device capability of an end user device of the first user; and
based on the determined device capability of the end user device of the first user, displaying a video of the first user on the end user device of the second user, or instead displaying an avatar of the first user on the end user device of the second user.
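To make the rendering of claims 1 and 4 concrete, the sketch below derives a drive-signal strength from the virtual viewpoint location, the virtual field of view, and point cloud data for an object. The inverse-distance falloff and the out-of-view damping factor are illustrative assumptions, not the claimed method itself.

```python
import math

def generate_drive_signal(base_intensity: float,
                          viewpoint: tuple[float, float, float],
                          object_points: list[tuple[float, float, float]],
                          in_field_of_view: bool) -> float:
    """Scale a haptic characteristic's base intensity by the spatial relationship
    between the virtual viewpoint location and an object described by point cloud
    data, then gate on the virtual field of view."""
    nearest = min(math.dist(viewpoint, p) for p in object_points)  # closest cloud point
    attenuated = base_intensity / (1.0 + nearest)  # assumed inverse-distance falloff
    # Effects outside the field of view are damped rather than dropped (an assumption;
    # a renderer could equally mute them entirely).
    return attenuated if in_field_of_view else 0.25 * attenuated

# Example: a vibration of strength 0.8 for an object two units away, in view.
print(generate_drive_signal(0.8, (0.0, 0.0, 0.0), [(2.0, 0.0, 0.0)], True))  # ~0.267
```

A fuller renderer would also fold in the device type or capability of the haptic output device, as in claim 14.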
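Claims 17 and 18 partition an object into imaginary wedges sharing a common apex, or into nested layers, with one haptic track per region. A toy selector under those assumptions (2D geometry, string track IDs) could look like this:

```python
import math

def select_wedge_track(tracks: list[str],
                       interaction_point: tuple[float, float],
                       apex: tuple[float, float]) -> str:
    """Claim 17 sketch: N tracks map to N equal wedges emanating from a common apex;
    return the track for the wedge containing the interaction point (2D for brevity)."""
    angle = math.atan2(interaction_point[1] - apex[1], interaction_point[0] - apex[0])
    wedge = int((angle + math.pi) / (2 * math.pi) * len(tracks)) % len(tracks)
    return tracks[wedge]

def select_nested_track(tracks: list[str],
                        depth: float,
                        layer_radii: list[float]) -> str:
    """Claim 18 sketch: tracks map to nested shells of increasing radius; return the
    track of the innermost layer that contains the interaction depth."""
    for track, radius in zip(tracks, layer_radii):
        if depth <= radius:
            return track
    return tracks[-1]  # outside every listed layer: fall back to the outermost track
```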
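Claims 19 and 20 apply haptics to a chat application: selecting the first user's video or avatar on the second user's device yields a message that triggers a haptic effect on the first user's device, and the video-versus-avatar choice follows the peer device's determined capability. A hypothetical sketch, with invented field names and capability flags:

```python
import json

def build_haptic_chat_message(sender_id: str, selected_user_id: str) -> str:
    """Claim 19 sketch: the second user's device builds a message, from the input that
    selected the first user (or their avatar), telling the first user's device to
    generate a haptic effect. The JSON field names are invented for this example."""
    return json.dumps({
        "to": selected_user_id,     # the first user, whose device renders the effect
        "from": sender_id,          # the second user, who made the selection
        "action": "haptic_effect",  # instructs the receiving device to render haptics
    })

def choose_representation(peer_capabilities: dict) -> str:
    """Claim 20 sketch: show video of the first user if their device capability allows
    it, otherwise fall back to an avatar (capability flag is an assumed name)."""
    return "video" if peer_capabilities.get("supports_video") else "avatar"
```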

Priority Applications (3)

Application Number, Priority Date, Filing Date, Title
US17/343,811 (priority 2020-06-23, filed 2021-06-10): US11698680B2, Methods and systems for decoding and rendering a haptic effect associated with a 3D environment
PCT/US2021/036911 (priority 2020-06-23, filed 2021-06-11): WO2021262453A1, Methods and systems for decoding and rendering a haptic effect associated with a 3D environment
EP21829297.7A (priority 2020-06-23, filed 2021-06-11): EP4168884A4, Methods and systems for decoding and rendering a haptic effect associated with a 3D environment

Applications Claiming Priority (2)

Application Number, Priority Date, Filing Date, Title
US202063042800P (priority 2020-06-23, filed 2020-06-23)
US17/343,811 (priority 2020-06-23, filed 2021-06-10): US11698680B2, Methods and systems for decoding and rendering a haptic effect associated with a 3D environment

Publications (2)

Publication Number, Publication Date
US20210397260A1 (en): 2021-12-23
US11698680B2 (en): 2023-07-11

Family

ID=79023481

Family Applications (1)

Application Number, Title, Priority Date, Filing Date
US17/343,811 (priority 2020-06-23, filed 2021-06-10): US11698680B2, Active, Methods and systems for decoding and rendering a haptic effect associated with a 3D environment

Country Status (3)

Country, Link
US (1): US11698680B2 (en)
EP (1): EP4168884A4 (en)
WO (1): WO2021262453A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number, Priority date, Publication date, Assignee, Title
US12411553B2 (en)* 2022-12-20, 2025-09-09, Tencent America LLC: Methods for signaling random access in haptics interchange file format


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US7990374B2 (en)2004-06-292011-08-02Sensable Technologies, Inc.Apparatus and methods for haptic rendering using data in a graphics pipeline
US8054289B2 (en)*2006-12-012011-11-08Mimic Technologies, Inc.Methods, apparatus, and article for force feedback based on tension control and tracking through cables
US9588586B2 (en)2014-06-092017-03-07Immersion CorporationProgrammable haptic devices and methods for modifying haptic strength based on perspective and/or proximity
EP3118723A1 (en)*2015-07-132017-01-18Thomson LicensingMethod and apparatus for providing haptic feedback and interactivity based on user haptic space (hapspace)
EP3179336A1 (en)*2015-12-072017-06-14Thomson LicensingMethod and device for rendering haptic effects in an immersive content
US10147460B2 (en)*2016-12-282018-12-04Immersion CorporationHaptic effect generation for space-dependent content
WO2018200424A1 (en)*2017-04-242018-11-01Ultrahaptics Ip LtdAlgorithm enhancements for haptic-based phased-array systems
US10775894B2 (en)*2018-11-022020-09-15Immersion CorporationSystems and methods for providing customizable haptic playback

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number, Priority date, Publication date, Assignee, Title
US20140204002A1 (en)* 2013-01-21, 2014-07-24, Rotem Bennet: Virtual interaction with image projection
US20140267076A1 (en)* 2013-03-15, 2014-09-18, Immersion Corporation: Systems and Methods for Parameter Modification of Haptic Effects
US20140270681A1 (en)* 2013-03-15, 2014-09-18, Immersion Corporation: Method and apparatus for encoding and decoding haptic information in multi-media files
US20150355713A1 (en)* 2013-05-17, 2015-12-10, Immersion Corporation: Low-frequency effects haptic conversion system
US20190057583A1 (en)* 2013-09-06, 2019-02-21, Immersion Corporation: Method and system for providing haptic effects based on information complementary to multimedia content
US20190094981A1 (en)* 2014-06-14, 2019-03-28, Magic Leap, Inc.: Methods and systems for creating virtual and augmented reality
US10254838B2 (en)* 2014-12-23, 2019-04-09, Immersion Corporation: Architecture and communication protocol for haptic output devices
US20170237789A1 (en)* 2016-02-17, 2017-08-17, Meta Company: Apparatuses, methods and systems for sharing virtual elements
US20200368616A1 (en)* 2017-06-09, 2020-11-26, Dean Lindsay DELAMONT: Mixed reality gaming system
US20190369836A1 (en)* 2018-05-30, 2019-12-05, Microsoft Technology Licensing, LLC: Human-computer interface for computationally efficient placement and sizing of virtual objects in a three-dimensional representation of a real-world environment
US20220270509A1 (en)* 2019-06-14, 2022-08-25, Quantum Interface, LLC: Predictive virtual training systems, apparatuses, interfaces, and methods for implementing same
US20210096726A1 (en)* 2019-09-27, 2021-04-01, Apple Inc.: Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
US20210382544A1 (en)* 2020-06-08, 2021-12-09, Apple Inc.: Presenting avatars in three-dimensional environments

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number, Priority date, Publication date, Assignee, Title
US11712628B2 (en)* 2018-09-20, 2023-08-01, Apple Inc.: Method and device for attenuation of co-user interactions
US20240184371A1 (en)* 2020-12-31, 2024-06-06, Snap Inc.: Electronic communication interface with haptic feedback response
US20230097257A1 (en)* 2020-12-31, 2023-03-30, Snap Inc.: Electronic communication interface with haptic feedback response
US12254132B2, 2020-12-31, 2025-03-18, Snap Inc.: Communication interface with haptic feedback response
US12216823B2, 2020-12-31, 2025-02-04, Snap Inc.: Communication interface with haptic feedback response
US12216827B2 (en)* 2020-12-31, 2025-02-04, Snap Inc.: Electronic communication interface with haptic feedback response
US12200399B2, 2020-12-31, 2025-01-14, Snap Inc.: Real-time video communication interface with haptic feedback response
US12164689B2, 2021-03-31, 2024-12-10, Snap Inc.: Virtual reality communication interface with haptic feedback response
US12314472B2, 2021-03-31, 2025-05-27, Snap Inc.: Real-time communication interface with haptic and audio feedback response
US12353628B2, 2021-03-31, 2025-07-08, Snap Inc.: Virtual reality communication interface with haptic feedback response
US20230039530A1 (en)* 2021-08-09, 2023-02-09, Disney Enterprises, Inc.: Automated generation of haptic effects based on haptics data
US12347194B2 (en)* 2021-08-09, 2025-07-01, Disney Enterprises, Inc.: Automated generation of haptic effects based on haptics data
KR102654176B1* 2022-01-10, 2024-04-04, Ulsan National Institute of Science and Technology (울산과학기술원): Computer device for visual-based tactile output using machine learning model, and method of the same
KR20230108150A, 2022-01-10, 2023-07-18, Ulsan National Institute of Science and Technology (울산과학기술원): Computer device for visual-based tactile output using machine learning model, and method of the same
WO2024086141A1 (en)* 2022-10-18, 2024-04-25, Tencent America LLC: Method and apparatus for timed referenced access unit packetization of haptics elementary streams
WO2024086230A1 (en)* 2022-10-19, 2024-04-25, Interdigital Vc Holdings, Inc.: Carriage of coded haptics data in media containers
US12445675B2 (en)* 2023-10-16, 2025-10-14, Tencent America LLC: Method and apparatus for defining frames and timed referenced network abstraction layer (NALS) structure in haptics signals

Also Published As

Publication number, Publication date
EP4168884A1 (en): 2023-04-26
US11698680B2 (en): 2023-07-11
WO2021262453A1 (en): 2021-12-30
EP4168884A4 (en): 2024-08-07

Similar Documents

Publication, Title
US11698680B2 (en): Methods and systems for decoding and rendering a haptic effect associated with a 3D environment
JP7498209B2 (en): Information processing device, information processing method, and computer program
CN110636324B (en): Interface display method and device, computer equipment and storage medium
CN112256127B (en): Spherical video editing
JP6321150B2 (en): 3D gameplay sharing
CN107852573B (en): Mixed reality social interactions
US10078917B1 (en): Augmented reality simulation
CN114327700B (en): Virtual reality device and screenshot picture playing method
RU2621644C2 (en): World of mass simultaneous remote digital presence
CN108700936A (en): Through camera user interface elements for virtual reality
US11496587B2 (en): Methods and systems for specification file based delivery of an immersive virtual reality experience
JP2009252240A (en): System, method and program for incorporating reflection
CN113194329B (en): Live interaction method, device, terminal and storage medium
CN119383447A (en): Image processing device, image processing method, system, computer program product, storage medium and computer-implemented method
CN114830636A (en): Parameters for overlay processing of immersive teleconferencing and telepresence of remote terminals
JP7496558B2 (en): Computer program, server device, terminal device, and method
US20130332829A1 (en): Dynamic 2D and 3D gestural interfaces for audio video players capable of uninterrupted continuity of fruition of audio video feeds
CN115068929B (en): Game information acquisition method, device, electronic device and storage medium
US20230043683A1 (en): Determining a change in position of displayed digital content in subsequent frames via graphics processing circuitry
US20230334790A1 (en): Interactive reality computing experience using optical lenticular multi-perspective simulation
WO2024060959A1 (en): Method and apparatus for adjusting viewing picture in virtual environment, and storage medium and device
KR20210056414A (en): System for controlling audio-enabled connected devices in mixed reality environments
US11381805B2 (en): Audio and video stream rendering modification based on device rotation metric
US11816785B2 (en): Image processing device and image processing method
Wu et al.: Interaction with a virtual character through performance based animation

Legal Events

Date, Code, Title, Description

AS: Assignment

Owner name: IMMERSION CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: BIRNBAUM, DAVID; MUTHUSAMY, YESHWANT; SABOUNE, JAMAL; AND OTHERS; SIGNING DATES FROM 20210531 TO 20210604; REEL/FRAME: 056496/0262

FEPP: Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP: Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP: Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCF: Information on status: patent grant

Free format text: PATENTED CASE

