
Dynamic three-dimensional surface sketching

Info

Publication number
US20220005273A1
Authority
US
United States
Prior art keywords
input device
rendering
display
data representative
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/918,845
Inventor
Oluwaseyi SOSANYA
Daniela Paredes-Fuentes
Daniel Thomas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wacom Co Ltd
Original Assignee
Wacom Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2020-07-01
Filing date: 2020-07-01
Publication date: 2022-01-06
Application filed by Wacom Co., Ltd.
Priority to US16/918,845 (US20220005273A1)
Assigned to WACOM CO., LTD. Assignment of assignors interest (see document for details). Assignors: THOMAS, DANIEL; PAREDES-FUENTES, Daniela; SOSANYA, OLUWASEYI
Priority to EP21831515.8A (EP4176337A4)
Priority to CN202180046391.9A (CN115735176A)
Priority to JP2022580857A (JP7682217B2)
Priority to PCT/IB2021/055651 (WO2022003513A1)
Publication of US20220005273A1
Priority to US18/148,340 (US12307601B2)
Priority to US19/192,164 (US20250259397A1)
Priority to JP2025080308A (JP2025118835A)
Legal status: Abandoned


Abstract

A method and system for three-dimensional (3D) surface sketching are provided. A 3D scanner scans an outer surface of a physical object and outputs data representative of the outer surface. A processor generates, based on the received data, a 3D model of the object and outputs a 3D rendering of the object. A display displays the 3D rendering of the object. An input device physically traces over a portion of the outer surface of the object and a tracking device tracks a positioning of the input device as the input device physically traces over the portion of the outer surface of the object. The processor receives data representative of at least one spatial position of the input device, augments the 3D rendering of the object based at least in part on the data and outputs the augmented 3D rendering to the display.
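
The abstract describes a closed loop: scan the physical object, build and render a 3D model, track the input device as it traces the real surface, and fold the traced positions back into the rendering. The sketch below is only an illustrative reading of that loop; the class names (Stroke, SketchSession) and the scanner, tracker, display, and builder interfaces are hypothetical stand-ins, not APIs disclosed in the patent.

    # Illustrative sketch only: scanner, tracker, display and builder are
    # hypothetical interfaces standing in for the 3D scanner, tracking device,
    # display and processor described in the abstract.
    from dataclasses import dataclass, field

    @dataclass
    class Stroke:
        points: list            # tracked (x, y, z) stylus positions
        width: float = 1.0

    @dataclass
    class SketchSession:
        mesh: object            # 3D model reconstructed from the scan
        strokes: list = field(default_factory=list)

    def run_session(scanner, tracker, display, builder):
        # 1. Scan the outer surface and build a 3D model of the object.
        surface_points = scanner.scan()
        mesh = builder.build_mesh(surface_points)
        session = SketchSession(mesh=mesh)
        display.show(mesh, session.strokes)          # initial 3D rendering

        # 2. While the stylus traces the physical surface, record its tracked
        #    positions and redisplay the rendering augmented with the strokes.
        while tracker.stylus_active():
            positions = tracker.poll_positions()     # spatial positions of the stylus
            if positions:
                session.strokes.append(Stroke(points=positions))
                display.show(mesh, session.strokes)  # augmented 3D rendering
        return session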


Claims (20)

What is claimed is:
1. A system, comprising:
a three-dimensional (3D) scanner configured to scan an outer surface of a physical object, and output data representative of the outer surface of the object;
a processor configured to receive the data representative of the outer surface of the object, and generate, based on the received data, a 3D model of the object, and output a 3D rendering of the object based on the generated 3D model;
a display configured to receive the 3D rendering of the object, and display the 3D rendering of the object;
an input device operable to physically trace over at least one portion of the outer surface of the object; and
a tracking device configured to track a positioning of the input device as the input device physically traces over the at least one portion of the outer surface of the object, and output data representative of at least one spatial position of the input device as the input device traces over the object, wherein:
the processor is configured to receive the data representative of the at least one spatial position of the input device, augment the 3D rendering of the object based at least in part on the data representative of the at least one spatial position of the input device, and in response to augmenting the 3D rendering of the object, output the augmented 3D rendering of the object to the display, and
the display is configured to display the augmented 3D rendering of the object.
2. The system of claim 1, wherein the processor is configured to augment the 3D rendering of the object by at least:
identifying, based on the data representative of the at least one spatial position of the input device, one or more curves having one or more respective positions in space relative to the outer surface of the object; and
superposing the one or more curves on the 3D rendering of the object at one or more rendering positions corresponding to the one or more positions in space relative to the outer surface of the object, respectively.
3. The system of claim 2, wherein the input device is pressure-sensitive and configured to sense a pressure applied to the input device as the input device physically traces over the at least one portion of the outer surface of the object, and output data representative of the pressure, and
wherein the processor is configured to:
determine respective one or more widths of the one or more curves based at least in part on the pressure applied to the input device as the input device physically traces over the at least one portion of the outer surface of the object to form the one or more curves; and
superpose, on the 3D rendering of the object, the one or more curves having the respective one or more widths.
4. The system of claim 3, wherein the input device includes a pressure-sensitive tip operable to sense the pressure applied to the input device as the input device physically traces over the at least one portion of the outer surface of the object.
5. The system of claim 2, wherein the input device includes a first control input operative to receive one or more respective width indications of the one or more curves, and wherein the input device is configured to output data representative of one or more respective width indications to the processor, and the processor is configured to:
receive the data representative of one or more respective width indications;
determine respective one or more widths of the one or more curves based on the data representative of one or more respective width indications; and
superpose, on the 3D rendering of the object, the one or more curves having the respective one or more widths.
6. The system of claim 1, wherein the display is a head-mounted display configured to display the 3D rendering of the object superposed on the physical object that otherwise is visually visible through the head-mounted display.
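
Claims 2 through 5 above superpose traced curves on the rendering, with curve width taken either from a pressure-sensitive tip or from a width control on the input device. A minimal sketch of that width selection follows, assuming pressure and control values normalized to 0..1 and an arbitrary linear mapping; neither the normalization nor the mapping is specified in the claims, and the rendering method shown is hypothetical.

    # Sketch of the width selection in claims 3-5. Pressure and width_control
    # are assumed normalized to 0..1; min/max widths and the linear mapping
    # are illustrative choices, not values from the claims.
    def stroke_width(pressure=None, width_control=None,
                     min_width=0.5, max_width=5.0):
        if width_control is not None:
            # Claim 5: width indicated directly via a control input on the device.
            return min_width + width_control * (max_width - min_width)
        if pressure is not None:
            # Claims 3-4: width derived from pressure sensed at the tip.
            return min_width + pressure * (max_width - min_width)
        return min_width

    def superpose_curve(rendering, traced_positions, width):
        # Claim 2: place the curve on the rendering at positions corresponding
        # to where the stylus traced the physical surface. 'rendering.add_curve'
        # is a hypothetical method of whatever rendering layer is used.
        rendering.add_curve(points=traced_positions, width=width)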
7. A system, comprising:
a three-dimensional (3D) scanner configured to scan an outer surface of a physical object, and output data representative of the outer surface of the object;
a processor configured to receive the data representative of the outer surface of the object, generate, based on the received data, a 3D model of the object, and output a 3D rendering of the object based on the generated 3D model;
a display configured to receive the 3D rendering of the object, and display the 3D rendering of the object;
an input device operable to physically trace over at least one portion of the outer surface of the object; and
a tracking device configured to track a positioning of the input device as the input device traces over the at least one portion of the outer surface of the object, and output data representative of at least one position of the input device in 3D space as the input device traces over the outer surface of the object, wherein:
the processor is configured to receive the data representative of the at least one position of the input device, modify the 3D model of the object based at least in part on the data representative of the at least one position of the input device, generate an updated 3D rendering of the object based on the modified 3D model, and in response to generating the updated 3D rendering of the object, output the updated 3D rendering of the object to the display, and
the display is configured to display the updated 3D rendering of the object.
8. The system of claim 7, wherein the processor is configured to generate the 3D model of the object by generating a polygon mesh that includes a plurality of vertices and a plurality of edges.
9. The system of claim 8, wherein the processor is configured to modify the 3D model of the object by at least:
changing a position of a vertex of the plurality of vertices or an edge of the plurality of edges to correspond to the at least one position of the input device in 3D space.
10. The system of claim 8, wherein the processor is configured to modify the 3D model of the object by at least:
adding, to the plurality of vertices, a first vertex having a position in space that corresponds to the at least one position of the input device in 3D space.
11. The system of claim 10, wherein the processor is configured to modify the 3D model of the object by at least:
removing, from the plurality of vertices, a second vertex having a position that is closest in 3D space to the position of the first vertex.
12. The system of claim 7, wherein the display is a head-mounted display configured to display the 3D rendering of the object superposed on the physical object that otherwise is visually visible through the head-mounted display, and further configured to display the updated 3D rendering of the object superposed on the physical object that otherwise is visually visible through the head-mounted display.
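
Claims 8 through 11 above modify a polygon-mesh model by moving a vertex to the tracked stylus position, adding a vertex there, or removing the vertex nearest the newly added one. The sketch below is a hedged reading of those edits, assuming a flat list of (x, y, z) vertex tuples and a brute-force nearest-neighbour search; the claims do not prescribe a particular mesh representation or search strategy.

    # Sketch of the mesh edits in claims 8-11, assuming vertices is a plain
    # list of (x, y, z) tuples and using a brute-force nearest-neighbour search.
    import math

    def _nearest_index(vertices, point):
        return min(range(len(vertices)),
                   key=lambda i: math.dist(vertices[i], point))

    def move_nearest_vertex(vertices, stylus_pos):
        # Claim 9: change the position of an existing vertex so it matches
        # the tracked position of the input device.
        vertices[_nearest_index(vertices, stylus_pos)] = tuple(stylus_pos)

    def add_vertex(vertices, stylus_pos, replace_nearest=False):
        # Claim 10: add a vertex at the tracked stylus position.
        # Claim 11: optionally remove the existing vertex closest to it.
        if replace_nearest and vertices:
            del vertices[_nearest_index(vertices, stylus_pos)]
        vertices.append(tuple(stylus_pos))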
13. A system, comprising:
a three-dimensional (3D) scanner configured to scan an outer surface of a physical object, and output data representative of the outer surface of the object;
a processor configured to receive the data representative of the outer surface of the object, and generate, based on the received data, a 3D model of the object, and output a 3D rendering of the object based on the generated 3D model;
a display configured to receive the 3D rendering of the object, and display the 3D rendering of the object;
an input device operable to physically trace over at least one portion of the outer surface of the object; and
a tracking device configured to track a positioning of the input device as the input device traces over the at least one portion of the outer surface of the object, and output data representative of at least two positions of the input device as the input device traces over the object, wherein:
the processor is configured to receive the data representative of the at least two positions, determine a distance between the at least two positions, and output data representative of the distance.
14. The system of claim 13, wherein the processor is configured to identify a curve based on data representative of positions of the input device between the at least two positions, and determine the distance between the at least two positions along the identified curve.
15. The system of claim 13, wherein the display is configured to:
receive the data representative of the distance, and
display the distance on the display.
16. The system of claim 13, wherein the input device includes a control input operative to receive a selection of a first mode of operation of a plurality of modes of operation of the input device and output data indicative of the first mode of operation.
17. The system of claim 16, wherein the processor is configured to:
receive the data indicative of the first mode of operation, and
in response to receiving the data indicative of the first mode of operation, determine the distance between the at least two positions, and output the data representative of the distance.
18. The system of claim 16, wherein the input device receives, via the control input, a selection of a second mode of operation of the plurality of modes of operation of the input device and outputs data indicative of the second mode of operation.
19. The system of claim 18, wherein the processor is configured to:
receive the data indicative of the second mode of operation, and
in response to receiving the data indicative of the second mode of operation, augment the 3D rendering of the object based on positioning information received from the tracking device tracking the input device as the input device traces over at least one portion of the outer surface of the object.
20. The system of claim 18, wherein the processor is configured to:
receive the data indicative of the second mode of operation, and
in response to receiving the data indicative of the second mode of operation, modify the 3D model of the object based on positioning information received from the tracking device tracking the input device as the input device traces over at least one portion of the outer surface of the object, and generate an updated 3D rendering of the object based on the modified 3D model.
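
Claims 13 and 14 above turn the traced positions into a measurement: the distance between two tracked positions, taken either as a straight line or accumulated along the curve the stylus actually traced. The sketch below illustrates both readings; the function names are chosen for illustration and are not from the patent.

    # Sketch of the measurement in claims 13-14: distance between two traced
    # positions, either straight-line or accumulated along the traced curve.
    import math

    def straight_line_distance(p1, p2):
        # Claim 13: distance between the two endpoint positions.
        return math.dist(p1, p2)

    def distance_along_curve(traced_positions):
        # Claim 14: sum of segment lengths along the curve the stylus traced
        # between the first and last sampled positions.
        return sum(math.dist(a, b)
                   for a, b in zip(traced_positions, traced_positions[1:]))

Claims 16 through 20 then gate this measuring behaviour, the curve-superposing behaviour, and the model-editing behaviour behind modes of operation selected through a control input on the device.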
US16/918,845 | Priority: 2020-07-01 | Filed: 2020-07-01 | Dynamic three-dimensional surface sketching | Abandoned | US20220005273A1 (en)

Priority Applications (8)

Application Number | Publication | Priority Date | Filing Date | Title
US16/918,845 | US20220005273A1 (en) | 2020-07-01 | 2020-07-01 | Dynamic three-dimensional surface sketching
EP21831515.8A | EP4176337A4 (en) | 2020-07-01 | 2021-06-25 | DYNAMIC THREE-DIMENSIONAL SURFACE SKETCHING
CN202180046391.9A | CN115735176A (en) | 2020-07-01 | 2021-06-25 | Dynamic three-dimensional surface shorthand
JP2022580857A | JP7682217B2 (en) | 2020-07-01 | 2021-06-25 | Dynamic 3D Surface Sketching
PCT/IB2021/055651 | WO2022003513A1 (en) | 2020-07-01 | 2021-06-25 | Dynamic three-dimensional surface sketching
US18/148,340 | US12307601B2 (en) | 2020-07-01 | 2022-12-29 | Dynamic three-dimensional surface sketching
US19/192,164 | US20250259397A1 (en) | 2020-07-01 | 2025-04-28 | Dynamic three-dimensional surface sketching
JP2025080308A | JP2025118835A (en) | 2020-07-01 | 2025-05-13 | Systems and methods

Applications Claiming Priority (1)

Application Number | Publication | Priority Date | Filing Date | Title
US16/918,845 | US20220005273A1 (en) | 2020-07-01 | 2020-07-01 | Dynamic three-dimensional surface sketching

Related Child Applications (2)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
PCT/IB2021/055651 | A-371-Of-International | WO2022003513A1 (en) | 2020-07-01 | 2021-06-25 | Dynamic three-dimensional surface sketching
US18/148,340 | Continuation | US12307601B2 (en) | 2020-07-01 | 2022-12-29 | Dynamic three-dimensional surface sketching

Publications (1)

Publication Number | Publication Date
US20220005273A1 (en) | 2022-01-06

Family

ID=79167090

Family Applications (3)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US16/918,845 | Abandoned | US20220005273A1 (en) | 2020-07-01 | 2020-07-01 | Dynamic three-dimensional surface sketching
US18/148,340 | Active | US12307601B2 (en) | 2020-07-01 | 2022-12-29 | Dynamic three-dimensional surface sketching
US19/192,164 | Pending | US20250259397A1 (en) | 2020-07-01 | 2025-04-28 | Dynamic three-dimensional surface sketching

Family Applications After (2)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US18/148,340 | Active | US12307601B2 (en) | 2020-07-01 | 2022-12-29 | Dynamic three-dimensional surface sketching
US19/192,164 | Pending | US20250259397A1 (en) | 2020-07-01 | 2025-04-28 | Dynamic three-dimensional surface sketching

Country Status (5)

Country | Link
US (3) | US20220005273A1 (en)
EP (1) | EP4176337A4 (en)
JP (2) | JP7682217B2 (en)
CN (1) | CN115735176A (en)
WO (1) | WO2022003513A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN115063518A (en)* | 2022-06-08 | 2022-09-16 | Oppo广东移动通信有限公司 | Trajectory rendering method, device, electronic device and storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20030214490A1 (en) | 2002-05-20 | 2003-11-20 | Gateway, Inc. | Stylus providing variable line width as a function of pressure
JP4215549B2 (en)* | 2003-04-02 | 2009-01-28 | 富士通株式会社 | Information processing device that operates in touch panel mode and pointing device mode
DE102009058802B4 (en) | 2009-12-18 | 2018-03-29 | Airbus Operations GmbH | Arrangement for the combined representation of a real and a virtual model
US9984501B2 (en) | 2012-05-14 | 2018-05-29 | Autodesk, Inc. | Adaptively merging intersecting meshes
US10262462B2 (en)* | 2014-04-18 | 2019-04-16 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality
US9530250B2 (en)* | 2013-12-10 | 2016-12-27 | Dassault Systemes | Augmented reality updating of 3D CAD models
US20160147408A1 (en) | 2014-11-25 | 2016-05-26 | Johnathan Bevis | Virtual measurement tool for a wearable visualization device
CN107621893B (en)* | 2016-07-15 | 2020-11-20 | 苹果公司 | Content creation using electronic input devices on non-electronic surfaces
US10739870B2 (en)* | 2016-09-21 | 2020-08-11 | Apple Inc. | Stylus for coordinate measuring
US20190369752A1 (en) | 2018-05-30 | 2019-12-05 | Oculus VR, LLC | Styluses, head-mounted display systems, and related methods
US10854006B2 (en)* | 2018-11-15 | 2020-12-01 | Palo Alto Research Center Incorporated | AR-enabled labeling using aligned CAD models
US10825217B2 (en)* | 2019-01-02 | 2020-11-03 | Microsoft Technology Licensing, LLC | Image bounding shape using 3D environment representation


Also Published As

Publication number | Publication date
EP4176337A1 (en) | 2023-05-10
JP2023532090A (en) | 2023-07-26
JP2025118835A (en) | 2025-08-13
EP4176337A4 (en) | 2023-12-06
WO2022003513A1 (en) | 2022-01-06
JP7682217B2 (en) | 2025-05-23
US12307601B2 (en) | 2025-05-20
US20250259397A1 (en) | 2025-08-14
US20230138623A1 (en) | 2023-05-04
CN115735176A (en) | 2023-03-03

Similar Documents

Publication | Title
US20250259397A1 (en) | Dynamic three-dimensional surface sketching
KR102487918B1 | Shape-segmentation of a triangular 3D mesh using a modified shape from shading (SFS) approach
JP6171079B1 | Inconsistency detection system, mixed reality system, program, and inconsistency detection method
JP6465789B2 | Program, apparatus and method for calculating internal parameters of depth camera
KR102114496B1 | Method, terminal unit and server for providing task assistance information in mixed reality
Baillot et al. | Authoring of physical models using mobile computers
US9041717B2 | Techniques for processing image data generated from three-dimensional graphic models
CN105006021B | A kind of Color Mapping Approach and device being applicable to quickly put cloud three-dimensional reconstruction
US10983661B2 | Interface for positioning an object in three-dimensional graphical space
CN110956695B | Information processing apparatus, information processing method, and storage medium
TWI607814B | Flying Laser Marking System with Real-time 3D Modeling and Method Thereof
CN107330978 | The augmented reality modeling experiencing system and method mapped based on position
JP2018106661A | Inconsistency detection system, mixed reality system, program, and inconsistency detection method
CN109085603 | Optical 3-dimensional imaging system and color three dimensional image imaging method
WO2019193859A1 | Camera calibration method, camera calibration device, camera calibration system and camera calibration program
JP2018132319A | Information processing apparatus, control method of information processing apparatus, computer program, and memory medium
US20200137289A1 | Method and system for head mounted display infrared emitter brightness optimization based on image saturation
JP2022128087A | Measurement system and measurement program
CN114373016A | Method for positioning implementation point in augmented reality technical scene
JP7465133B2 | Information processing device and information processing method
US20230260076A1 | System, information processing apparatus, and method
Kernbauer et al. | Spatial Augmented Reality for Heavy Machinery Using Laser Projections
US20250292519A1 | Augmented reality feedback in user interfaces for dimensioning objects
JP2011039646A | Image processing apparatus and method, and program
WO2022044151A1 | Marker drawing device, system, and method

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name: WACOM CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: SOSANYA, OLUWASEYI; PAREDES-FUENTES, DANIELA; THOMAS, DANIEL; SIGNING DATES FROM 20200720 TO 20200728; REEL/FRAME: 053369/0525

STPP | Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB | Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

