CN110147770A - A kind of gaze data restoring method and system - Google Patents

Gaze data restoration method and system

Info

Publication number
CN110147770A
Authority
CN
China
Prior art keywords
user
virtual scene
gaze data
data
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910433988.3A
Other languages
Chinese (zh)
Inventor
秦林婵
王云飞
黄通兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 7Invensun Technology Co Ltd
Beijing Qixin Yiwei Information Technology Co Ltd
Original Assignee
Beijing Qixin Yiwei Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Qixin Yiwei Information Technology Co Ltd
Priority to CN201910433988.3A
Publication of CN110147770A
Legal status: Pending

Abstract

The invention discloses a gaze data restoration method, comprising: obtaining a 3D virtual scene and an eye image of a user; analyzing the eye image to obtain the user's gaze data; and restoring the gaze data to the 3D virtual scene. With the technical solution of the present invention, a user's gaze data can be restored to a 3D virtual scene, overcoming the drawback of prior-art approaches that restore gaze data to video data. Because the gaze data restored into the 3D virtual scene better matches the user's field of view during restoration, the restoration process is simpler and processing speed is improved. Correspondingly, the invention also discloses a gaze data restoration system.

Description

Gaze data restoration method and system
Technical field
The present invention relates to the field of eye-tracking technology, and more particularly to a gaze data restoration method and system.
Background technique
To study a user's observation data in a given scene, the user's gaze data usually needs to be restored so that it can be analyzed in that scene.
In existing approaches, gaze data is usually restored to video data corresponding to the scene. Because a user's eye-movement data consists of gaze coordinates or vector information, and each person's field-of-view angle differs, video data cannot reflect each user's gaze angle. This makes the restoration process difficult and slow, and makes subsequent analysis and statistics based on the restored gaze data very inconvenient.
Summary of the invention
In view of the above problems, the present invention provides a gaze data restoration method and system that restore a user's gaze data into a 3D virtual scene, so that the restoration process is simpler and processing speed is improved.
To achieve the above objectives, the present invention provides the following technical solution:
A gaze data restoration method, comprising:
obtaining a 3D virtual scene and an eye image of a user;
analyzing the eye image to obtain the user's gaze data;
restoring the gaze data to the 3D virtual scene.
Optionally, the method further includes:
obtaining the user's gaze frames from a live-action recorded video;
performing image stitching on the user's gaze frames to build the 3D virtual scene.
Optionally, the 3D virtual scene includes a panorama, wherein performing image stitching on the user's gaze frames comprises:
projecting each user gaze frame onto a target spatial surface;
comparing adjacent gaze frames on the target spatial surface to determine their overlapping regions;
fusing the overlapping regions and stitching the fused gaze frames together to obtain the panorama.
Optionally, the method further includes:
obtaining, by shooting with a target device, a 360-degree panorama matching the user's field of view, the 3D virtual scene including the 360-degree panorama.
Optionally, the method further includes:
obtaining scene information from the user's gaze frames;
building a 3D virtual scene from the scene information using a preset 3D scene model.
Optionally, the method further includes:
collecting position data of the user, wherein analyzing the eye image to obtain the user's gaze data comprises:
analyzing the user's eye image based on the position data to obtain the user's gaze data.
Optionally, the position data includes at least one of the user's position and angle information, travel path data, and behavioral data.
Optionally, the method further includes:
restoring the gaze data of multiple users to the 3D virtual scene, enabling joint analysis of the multiple users' gaze data.
Optionally, restoring the gaze data of multiple users to the 3D virtual scene comprises:
restoring the multiple users' gaze data to the 3D virtual scene according to the spatial matching relationship between their gaze data and the 3D virtual scene.
Optionally, restoring the gaze data of multiple users to the 3D virtual scene comprises:
generating a target time axis and matching the temporal information of the multiple users' gaze data onto the target time axis;
restoring the gaze data on the target time axis to the 3D virtual scene.
Optionally, the method further includes:
generating a visualization from the restored data obtained by restoring the multiple users' gaze data to the 3D virtual scene, so that the visualization can be used for joint analysis of the multiple users' gaze data.
A gaze data restoration system, comprising:
an acquiring unit, configured to obtain a 3D virtual scene and an eye image of a user;
an analysis unit, configured to analyze the eye image to obtain the user's gaze data;
a restoration unit, configured to restore the gaze data to the 3D virtual scene.
Optionally, the system comprises:
a frame acquiring unit, configured to obtain the user's gaze frames from a live-action recorded video;
an image processing unit, configured to perform image stitching on the user's gaze frames to build the 3D virtual scene;
wherein the image processing unit comprises:
a projection subunit, configured to project each user gaze frame onto a target spatial surface;
a comparison subunit, configured to compare adjacent gaze frames on the target spatial surface and determine their overlapping regions;
a stitching subunit, configured to fuse the overlapping regions and stitch the fused gaze frames together to obtain a panorama, wherein the 3D virtual scene includes the panorama.
Optionally, the system further comprises:
a position data collection unit, configured to collect position data of the user, the position data including at least one of the user's position and angle information, travel path data, and behavioral data;
wherein the analysis unit is specifically configured to:
analyze the user's eye image based on the position data to obtain the user's gaze data.
Optionally, the system further comprises:
a joint analysis unit, configured to restore the gaze data of multiple users to the 3D virtual scene for joint analysis of the multiple users' gaze data;
wherein the joint analysis unit comprises:
a first restoration subunit, configured to restore the multiple users' gaze data to the 3D virtual scene according to the spatial matching relationship between their gaze data and the 3D virtual scene;
and/or
a second restoration subunit, configured to generate a target time axis, match the temporal information of the multiple users' gaze data onto the target time axis, and restore the gaze data on the target time axis to the 3D virtual scene.
Optionally, the system further comprises:
an image generation unit, configured to generate a visualization from the restored data obtained by restoring the multiple users' gaze data to the 3D virtual scene, so that the visualization can be used for joint analysis of the multiple users' gaze data.
Compared with the prior art, the present invention provides a gaze data restoration method and system: a 3D virtual scene and an eye image of a user are obtained; the eye image is analyzed to obtain the user's gaze data; and the gaze data is restored to the 3D virtual scene. With the technical solution of the present invention, a user's gaze data can be restored to the 3D virtual scene, overcoming the drawback of prior-art approaches that restore gaze data to video data. Because the gaze data better matches the user's field of view during restoration, the restoration process is simpler and processing speed is improved.
Detailed description of the invention
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow diagram of a gaze data restoration method provided by an embodiment of the present invention;
Fig. 2 is a structural schematic diagram of a glasses-type eye tracker provided by an embodiment of the present invention;
Fig. 3 is a structural schematic diagram of a gaze data restoration system provided by an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The terms "first" and "second" in the specification, claims, and drawings are used to distinguish different objects rather than to describe a particular order. Moreover, the terms "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion: a process, method, system, product, or device that contains a series of steps or units is not limited to the listed steps or units, and may include steps or units that are not listed.
An embodiment of the present invention provides a gaze data restoration method. Referring to Fig. 1, the method may include the following steps:
S101: obtain a 3D virtual scene and an eye image of a user.
When restoring gaze data to a 3D virtual scene in an embodiment of the present invention, the corresponding 3D virtual scene must first be obtained. There are two application scenarios. The first is a virtual-reality application, in which the 3D virtual scene already exists and can be obtained directly. The second is a real-world scenario, in which gaze data is restored to a 3D virtual scene matching a live-action recorded video; in this case a 3D virtual scene must first be created from the recorded video. Note that in an embodiment of the present invention the 3D virtual scene may be built from the scene information in the user's gaze frames, or it may be embodied as a panorama or a 360-degree panorama that offers analysis benefits similar to those of a 3D virtual scene. For this scenario, another embodiment of the present invention further provides a method of creating a 3D virtual scene, comprising the following steps:
S201: obtain the user's gaze frames from the live-action recorded video;
S202: perform image stitching on the user's gaze frames to build the 3D virtual scene.
Because a live-action recorded video contains several user gaze frames, creating the 3D virtual scene requires image processing of those gaze frames, which may specifically include steps such as image stitching, image fusion, or image conversion. The result is a 3D virtual scene that represents the scene the user actually gazed at.
Note that when obtaining the user's gaze frames from a live-action recorded video, "user" refers to at least one user, i.e., either a single individual or a group of users. Image processing techniques then convert the gaze frames into a 3D virtual scene. For example, the scene information in the user's gaze frames can be obtained, and a 3D virtual scene can be built from that scene information using a preset 3D scene model, where the preset 3D scene model is a neural network model trained on scene information and capable of constructing a 3D virtual scene.
When a 3D virtual scene has already been constructed, the eye image of the user gazing at the 3D virtual scene can be obtained directly, i.e., by having the user wear a virtual-reality device equipped with eye-tracking technology.
When the 3D virtual scene includes a panorama, the panorama may be an ordinary panorama or a 360-degree panorama.
When generating a panorama or building a 3D virtual scene from gaze frames, the eye images of the user gazing during the live-action video recording can be obtained at the same time as the gaze frames, i.e., the user wears a corresponding eye tracker to obtain the basic data, which includes the user's gaze frames and the corresponding eye images. Referring to Fig. 2, a glasses-type eye tracker provided by an embodiment of the present invention includes a foreground camera 1 and an eye-movement sensor 2. The foreground camera shoots the scene image or video the user is facing, i.e., it obtains the user's gaze frames; the eye-movement sensor tracks the user's eye movement, i.e., it obtains the user's eye images. Note that gaze frames and eye images are stored in matched pairs: when the foreground camera captures a first frame, the eye image recorded at that moment is the first eye image corresponding to that frame; as the user's head moves and the first frame changes into a second frame, the eye image stored with the second frame is the current second eye image.
Based on this data-acquisition scheme, the user's eye image can be visually associated with the foreground camera, i.e., when the captured gaze frames are stored, the object or direction the user is watching at each moment is indicated in a visual form.
Eyeball tracking, also called gaze tracking, is a technology that estimates the eyes' line of sight and/or point of gaze by measuring eye movement. The point-of-gaze data obtained during eyeball tracking may be called gaze data, and it can be derived from the user's eye images, where an eye image contains eye features; eye images may be obtained by the pupil-corneal-reflection method.
S102: analyze the eye image to obtain the user's gaze data.
The eye image collected by the eye-movement sensor contains data characterizing eye features, including but not limited to pupil position, pupil shape, iris position, iris shape, eyelid position, eye-corner position, and glint position. The user's gaze angle and point of gaze are then analyzed from the eye image to obtain the user's gaze data, which characterizes the coordinates or vector information of the user's line of sight or point of gaze.
S103: restore the gaze data to the 3D virtual scene.
The gaze data is obtained by analyzing, or extracting from, the eye images. For a 3D virtual scene created from the user's frames in a live-action recorded video, gaze frames and eye images are stored in matched pairs, i.e., each gaze frame matches an eye image. The matching relationship between the gaze data and the 3D virtual scene can therefore be determined from the matching relationship between the eye images and the frames or space of the 3D virtual scene (the 3D virtual scene may be created from the gaze frames or may already have been constructed; frames correspond to gaze frames, and space corresponds to the space of the constructed 3D virtual scene), and the gaze data can thereby be restored into the 3D virtual scene. Specifically, for a 3D virtual scene created from gaze frames, the gaze data can be coordinate-transformed: the gaze data's coordinates on the gaze frame are mapped onto the 3D virtual scene.
Because the 3D virtual scene can accommodate each user's gaze angle, restoring the gaze data no longer requires complicated angle transformations or excessive vector calculations, so the restoration process is simpler and processing efficiency is improved.
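The coordinate transform in S103 can be sketched for the panorama case: a gaze point on one gaze frame is converted to yaw/pitch angles using the foreground camera's field of view and orientation at capture time, then mapped to equirectangular panorama pixels. The parameterization (camera yaw/pitch per frame, equirectangular layout) is an illustrative assumption:

```python
def frame_gaze_to_panorama(gx, gy, frame_w, frame_h, fov_h_deg, fov_v_deg,
                           cam_yaw_deg, cam_pitch_deg, pano_w, pano_h):
    """Map a gaze point (gx, gy) on one gaze frame to equirectangular
    panorama pixel coordinates (hypothetical pinhole-style model)."""
    # angular offset of the gaze point from the frame center
    yaw = cam_yaw_deg + (gx / frame_w - 0.5) * fov_h_deg
    pitch = cam_pitch_deg + (0.5 - gy / frame_h) * fov_v_deg
    yaw %= 360.0                          # wrap horizontally
    u = yaw / 360.0 * pano_w              # panorama column
    v = (0.5 - pitch / 180.0) * pano_h    # panorama row (+90 deg at top)
    return u, v

# A gaze at the frame center with an unrotated camera lands on the
# panorama's horizon at yaw 0:
print(frame_gaze_to_panorama(320, 240, 640, 480, 90, 60, 0, 0, 3600, 1800))
# prints (0.0, 900.0)
```

A true per-pixel projection would account for lens distortion and the spherical geometry near the poles; the linear angle model above is only a small-field approximation.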
The present invention thus provides a gaze data restoration method: obtain a 3D virtual scene and an eye image of a user, analyze the eye image to obtain the user's gaze data, and restore the gaze data to the 3D virtual scene. With this technical solution, the user's gaze data can be restored to a 3D virtual scene, overcoming the drawback of prior-art approaches that restore gaze data to video data; because the gaze data better matches the user's field of view during restoration, the restoration process is simpler and processing speed is improved.
Building on the above embodiments, since a 3D virtual scene can be embodied as a panorama, another embodiment of the present invention further provides a method of generating a panorama from a set of gaze frames, comprising:
S301: project each user gaze frame onto a target spatial surface;
S302: compare adjacent gaze frames on the target spatial surface to determine their overlapping regions;
S303: fuse the overlapping regions and stitch the fused gaze frames together to obtain the panorama.
The user's gaze frames may be collected by the foreground camera of the eye tracker worn by the user during live-action video recording, and since they are individual frames they need to be stitched together to obtain a panorama. Specifically, panoramic-image stitching can be used: from the samples of gaze frames collected by the foreground camera, a scene-drawing method generates a large omnidirectional image that can even reach 360 degrees. Given a set of partial images of a real scene, the images are stitched to generate a new view containing them all. In the concrete process, the frames shot by the foreground camera are projected in some manner onto a target spatial surface, i.e., a unified surface such as a cube, cylinder, or sphere, so that all gaze frames share a unified parameter-space coordinate system. Adjacent gaze frames in this unified space are compared to determine the matching regions; the overlapping regions are fused, and the frames are stitched into a panorama. The stitching relies on the similarity of the overlapping regions of two adjacent images in the sequence, i.e., the user's gaze frames at two adjacent moments.
Therefore, in embodiments of the present invention the user's gaze frames are stitched based on image-stitching technology so that all of them share a unified parameter-space coordinate system. Whatever the user's viewing angle, the collected gaze frames can be unified into the same panorama, which facilitates the subsequent restoration of gaze data.
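The overlap-comparison and fusion of S302/S303 can be sketched as a one-dimensional toy on tiny grayscale arrays: try every candidate overlap width, pick the one where the right edge of one frame best matches the left edge of the next (minimum sum of squared differences), then average-blend the overlap. This is a deliberately simplified illustration, not the patent's algorithm; real stitchers (e.g. OpenCV's) use feature matching, warping, and multi-band blending.

```python
def overlap_score(strip_a, strip_b):
    """Sum of squared differences between two equally sized grayscale strips."""
    return sum((a - b) ** 2
               for ra, rb in zip(strip_a, strip_b)
               for a, b in zip(ra, rb))

def find_overlap(img_left, img_right, min_w=1):
    """Find the overlap width (in columns) that best aligns img_left's
    right edge with img_right's left edge, then blend-stitch the images.
    Images are lists of equal-length rows of grayscale values."""
    h = len(img_left)
    w_l, w_r = len(img_left[0]), len(img_right[0])
    best_w, best_score = min_w, float('inf')
    for ow in range(min_w, min(w_l, w_r) + 1):
        a = [row[-ow:] for row in img_left]   # right strip of left image
        b = [row[:ow] for row in img_right]   # left strip of right image
        s = overlap_score(a, b)
        if s < best_score:
            best_score, best_w = s, ow
    # average-blend the overlapping columns, concatenate the rest
    stitched = []
    for y in range(h):
        left = img_left[y][:w_l - best_w]
        blend = [(img_left[y][w_l - best_w + i] + img_right[y][i]) / 2
                 for i in range(best_w)]
        stitched.append(left + blend + img_right[y][best_w:])
    return best_w, stitched

w, pano = find_overlap([[1, 2, 3, 4]], [[3, 4, 5, 6]])
print(w, pano)  # prints 2 [[1, 2, 3.0, 4.0, 5, 6]]
```

The SSD comparison here plays the role of the "similarity of the overlapping regions" mentioned above; the averaging stands in for the fusion step.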
Note that the above panorama-generation method is based on the gaze images collected by the foreground camera of the eye tracker worn by the user. In another embodiment of the present invention, a target device can instead be used to shoot a 360-degree panorama matching the user's field of view, where the target device may be a 360-degree camera or a 3D camera. That is, if a 360-degree camera or 3D camera is used to collect the gaze frames, the images can be used directly without adaptive stitching, and a 3D virtual scene can also be generated. The embodiment of the present invention does not specifically limit this, as long as a 3D virtual scene can be generated from the user's gaze frames.
Another embodiment of the present invention further provides a method of obtaining gaze data. On the basis of the above embodiments, the method further includes:
collecting position data of the user, wherein analyzing the eye image to obtain the user's gaze data comprises:
analyzing the user's eye image based on the position data to obtain the user's gaze data.
In a virtual scene, for example a user playing a virtual game, the user wears a virtual-reality device with eye-tracking technology, and the device collects the user's eye-movement data. At the same time, the user's location information can be obtained through a position-locating module, which may use a gyroscope: for example, a gyroscope mounted on the user's head obtains the user's position and angle information, and the user's travel path data or behavioral data can further be obtained, where behavioral data may include the user's pick-and-place actions on particular articles. All of this data can characterize the user's position information.
Once the user's position data is obtained, the user's point of gaze can be restored based on image recognition of the eye images or on the located position information, yielding the user's gaze data; the user's gaze data is then automatically mapped into the 3D virtual scene, with the position data mapped into the 3D virtual scene correspondingly.
On the basis of the above embodiments, another embodiment of the present invention further includes:
restoring the gaze data of multiple users to the 3D virtual scene, enabling joint analysis of the multiple users' gaze data.
When restoring the gaze data of multiple users, the data can be fused along the spatial or temporal dimension. Specifically, this includes:
restoring the multiple users' gaze data to the 3D virtual scene according to the spatial matching relationship between their gaze data and the 3D virtual scene.
When restoring multiple users' gaze data based on spatial matching, the restoration is performed according to the spatial matching relationship between the point-of-gaze locations reflected in each user's gaze data and the locations of the scene information characterizing those points of gaze in the 3D virtual scene.
Alternatively, multiple users' gaze data can be restored based on temporal matching, which specifically includes:
generating a target time axis and matching the temporal information of the multiple users' gaze data onto the target time axis;
restoring the gaze data on the target time axis to the 3D virtual scene.
Restoring multiple users' gaze data in chronological order suits scenes in which the users' gaze data is produced at different time nodes, and ensures that the gaze data restored into the 3D virtual scene follows the order in which it was generated.
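The temporal-matching step can be sketched as merging per-user gaze streams onto one shared time axis. The `(timestamp, gaze_point)` sample representation is a hypothetical stand-in for whatever form the gaze data actually takes:

```python
def merge_on_timeline(streams):
    """Merge per-user gaze streams onto one target time axis.

    streams: dict mapping a user id to a list of (timestamp, gaze_point)
    samples (a hypothetical representation of each user's gaze data).
    Returns one chronologically ordered list of
    (timestamp, user_id, gaze_point) entries.
    """
    merged = []
    for user_id, samples in streams.items():
        merged.extend((t, user_id, pt) for t, pt in samples)
    merged.sort(key=lambda entry: entry[0])  # order by the shared time axis
    return merged

timeline = merge_on_timeline({"u1": [(0.0, (1, 1)), (2.0, (2, 2))],
                              "u2": [(1.0, (3, 3))]})
print([t for t, _, _ in timeline])  # prints [0.0, 1.0, 2.0]
```

Restoring the merged list entry by entry then replays all users' gaze data in the order it was generated, as the paragraph above describes.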
In the gaze data restoration method provided by an embodiment of the present invention, a visualization can be generated from the restored data obtained by restoring multiple users' gaze data to the 3D virtual scene, so that the visualization can be used for joint analysis of the multiple users' gaze data. The embodiment of the present invention does not limit the form of the visualization used in subsequent joint analysis; for example, multi-user gaze heat maps, gaze trajectory maps, multi-user head-position avatars, or multi-user travel-path maps can be generated.
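A gaze heat map of the kind mentioned above can be sketched by binning restored gaze points into a coarse grid of fixation counts; the grid size and coordinate convention are illustrative assumptions:

```python
def gaze_heatmap(points, scene_w, scene_h, bins_x, bins_y):
    """Accumulate restored gaze points into a coarse fixation-count grid,
    a minimal stand-in for a multi-user gaze heat map.

    points: iterable of (x, y) gaze coordinates in scene pixels.
    Returns a bins_y x bins_x grid of counts.
    """
    grid = [[0] * bins_x for _ in range(bins_y)]
    for x, y in points:
        bx = min(int(x / scene_w * bins_x), bins_x - 1)  # clamp right edge
        by = min(int(y / scene_h * bins_y), bins_y - 1)  # clamp bottom edge
        grid[by][bx] += 1
    return grid

# Three gaze points from (possibly different) users in a 100x100 scene:
print(gaze_heatmap([(10, 10), (90, 90), (95, 85)], 100, 100, 2, 2))
# prints [[1, 0], [0, 2]]
```

A production heat map would smooth the counts with a Gaussian kernel and render them as a color overlay on the panorama; the raw grid is the underlying data either way.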
For example, with the gaze data restoration method of an embodiment of the present invention, the gaze frames and corresponding eye images of multiple users in a clothing store can be used: each user's gaze frames are stitched to obtain a panorama of the store; each user's gaze data is obtained by analyzing the eye-movement data; and the gaze data is mapped onto the panorama, i.e., annotated on the 3D virtual scene, enabling analysis of the purchasing-behavior data of the store's consumers.
For an already constructed 3D virtual scene, for example a user in a virtual game, the user data at certain game levels can be analyzed by restoring the user's gaze data, so that the level design can be optimized based on the users' gaze data.
In embodiments of the present invention, the user's location information can be obtained with an Inertial Measurement Unit (IMU), a device that measures an object's three-axis attitude angles (or angular rates) and acceleration. Gyroscopes and accelerometers are the main components of an IMU. In general, an IMU contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the object's acceleration signals along three independent axes of the carrier coordinate system, while the gyroscopes detect the carrier's angular-rate signals relative to the navigation coordinate system. The angular rate and acceleration of the object in three-dimensional space are measured, and the object's attitude is computed from them.
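The attitude computation an IMU performs can be sketched with a complementary filter for a single axis: the gyroscope's integrated rate (smooth but drifting) is blended with the accelerometer's gravity-referenced angle (noisy but drift-free). The blend weight, units, and sign convention below are illustrative assumptions:

```python
import math

def complementary_pitch(prev_pitch_deg, gyro_rate_deg_s, accel_g, dt, alpha=0.98):
    """Fuse one gyroscope axis and the accelerometer into a pitch estimate.

    accel_g: (ax, ay, az) accelerometer reading in units of g.
    alpha:   weight given to the gyro path (hypothetical tuning value).
    """
    # gravity-referenced pitch from the accelerometer (noisy, drift-free)
    ax, ay, az = accel_g
    accel_pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    # integrated pitch from the gyroscope (smooth, but drifts over time)
    gyro_pitch = prev_pitch_deg + gyro_rate_deg_s * dt
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# At rest and level (gravity straight down the z axis), the estimate stays 0:
print(complementary_pitch(0.0, 0.0, (0.0, 0.0, 1.0), 0.01))  # prints 0.0
```

Full three-axis attitude estimation typically uses quaternions with a Madgwick/Mahony-style or Kalman filter; the single-axis blend above only shows the fusion idea.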
In embodiments of the present invention, the user's gaze data can be restored into a 3D scene. The data may include the user's eye-movement data, position data, and other behavioral, motion, and physiological data, enabling data analysis and multi-user data overlay analysis.
Referring to Fig. 3, an embodiment of the present invention further provides a gaze data restoration system, comprising:
an acquiring unit 10, configured to obtain a 3D virtual scene and an eye image of a user;
an analysis unit 20, configured to analyze the eye image to obtain the user's gaze data;
a restoration unit 30, configured to restore the gaze data to the 3D virtual scene.
The present invention provides a gaze data restoration method and system: the acquiring unit obtains a 3D virtual scene and an eye image of a user; the analysis unit analyzes the eye image to obtain the user's gaze data; and the restoration unit restores the gaze data to the 3D virtual scene. With the technical solution of the present invention, the user's gaze data can be restored to a 3D virtual scene, overcoming the drawback of prior-art approaches that restore gaze data to video data; because the gaze data better matches the user's field of view during restoration, the restoration process is simpler and processing speed is improved.
On the basis of the above embodiments, the system comprises:
a frame acquiring unit, configured to obtain the user's gaze frames from a live-action recorded video;
an image processing unit, configured to perform image stitching on the user's gaze frames to build the 3D virtual scene;
wherein the image processing unit comprises:
a projection subunit, configured to project each user gaze frame onto a target spatial surface;
a comparison subunit, configured to compare adjacent gaze frames on the target spatial surface and determine their overlapping regions;
a stitching subunit, configured to fuse the overlapping regions and stitch the fused gaze frames together to obtain a panorama, wherein the 3D virtual scene includes the panorama.
On the basis of the above embodiments, the system further comprises:
a shooting unit, configured to obtain, by shooting with a target device, a 360-degree panorama matching the user's field of view, the 3D virtual scene including the 360-degree panorama.
On the basis of the above embodiments, the system further comprises:
a position data collection unit, configured to collect position data of the user, the position data including at least one of the user's position and angle information, travel path data, and behavioral data;
wherein the analysis unit is specifically configured to:
analyze the user's eye image based on the position data to obtain the user's gaze data.
On the basis of the above embodiments, the system further include:
Conjoint Analysis unit is realized for the gaze data of multiple users to be restored to the 3D virtual scene to multipleThe Conjoint Analysis of the gaze data of user;
Wherein, Conjoint Analysis unit includes:
First goes back atomic unit, the space for gaze data and the 3D virtual scene according to the multiple userWith relationship, the gaze data of the multiple user is restored to the 3D virtual scene;
And/or
a second restoring subunit, for generating a target time axis, matching time information of the multiple users' gaze data onto the target time axis, and restoring the gaze data on the target time axis to the 3D virtual scene.
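The second restoring subunit's time matching can be sketched as a sorted merge: each user's gaze samples carry their own timestamps, and the target time axis interleaves them in order. The stream layout and names below are made up for illustration.

```python
# Hypothetical sketch of matching multiple users' gaze timestamps
# onto one target time axis.

def merge_on_timeline(streams):
    """Merge per-user (timestamp, gaze) streams onto a single target
    time axis, returning (timestamp, user_id, gaze) records in order."""
    merged = []
    for user_id, samples in streams.items():
        merged.extend((t, user_id, gaze) for t, gaze in samples)
    merged.sort(key=lambda record: record[0])
    return merged
```

The merged records can then be restored to the 3D virtual scene in timeline order, enabling joint replay of several users' gaze behavior.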
On the basis of the above embodiments, the system also includes:
an image generation unit, for generating a visualization graph according to restored data obtained by restoring the multiple users' gaze data to the 3D virtual scene, so as to use the visualization graph for joint analysis of the multiple users' gaze data.
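One common form of such a visualization graph is a gaze heat map. The sketch below, whose names and grid-binning approach are assumptions rather than the patent's method, counts restored gaze points per grid cell; the counts can then drive a heat-map overlay on the 3D virtual scene.

```python
from collections import Counter

# Hypothetical sketch of the image generation unit: bin restored
# gaze points into grid cells for a heat-map style visualization.

def gaze_heatmap(points, cell=10):
    """Count restored (x, y) gaze points per cell of a square grid."""
    return Counter((int(x // cell), int(y // cell)) for x, y in points)
```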
An embodiment of the invention provides a storage medium on which a program is stored, the program implementing the gaze data restoring method when executed by a processor.
An embodiment of the invention further provides a processor for running a program, wherein the program executes the gaze data restoring method when run.
An embodiment of the invention provides a device including a processor, a memory, and a program stored on the memory and runnable on the processor, the processor implementing the following steps when executing the program:
obtaining a 3D virtual scene and an eye image of a user;
analyzing the eye image to obtain gaze data of the user;
restoring the gaze data to the 3D virtual scene.
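One plausible concrete form of the restoring step, assuming the 3D virtual scene is stored as an equirectangular 360-degree panorama, is to map a gaze direction (yaw/pitch in degrees) to panorama pixel coordinates. This mapping is an illustrative assumption, not the patent's prescribed implementation.

```python
# Hypothetical restore step for an equirectangular panorama scene.

def gaze_to_panorama(yaw_deg, pitch_deg, width, height):
    """Map yaw in [0, 360) and pitch in [-90, 90] degrees to (u, v)
    pixel coordinates on a width x height equirectangular panorama."""
    u = ((yaw_deg % 360.0) / 360.0) * width
    v = ((90.0 - pitch_deg) / 180.0) * height
    return (u, v)
```

With a 3600 x 1800 panorama, a level gaze straight ahead (yaw 0, pitch 0) lands on the vertical midline at the left edge, and a gaze straight up (pitch 90) lands on the top row.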
Further, the method further includes:
obtaining user gaze pictures in a real-scene recorded video;
performing image stitching on the user gaze pictures to build a 3D virtual scene, obtaining the 3D virtual scene.
Further, the 3D virtual scene includes a panorama, wherein the image stitching of the user gaze pictures comprises:
projecting each user gaze picture onto a target space plane;
comparing adjacent user gaze pictures on the target space plane to determine a picture overlap region;
fusing the picture overlap region, and stitching the multiple fused user gaze pictures to obtain the panorama.
Further, the method further includes:
obtaining, by shooting with a target device, a 360-degree panorama matching the user's field of view, the 3D virtual scene including the 360-degree panorama.
Further, the method further includes:
obtaining scene information in the user gaze pictures;
building a 3D virtual scene from the scene information through a preset 3D scene model, obtaining the 3D virtual scene.
Further, the method further includes:
acquiring position data of the user, wherein the analyzing the eye image to obtain the gaze data of the user comprises:
analyzing the eye image of the user based on the position data to obtain the gaze data of the user.
Further, the position data includes at least one of the user's position and angle information, travel path data, and behavior data.
Further, the method further includes:
restoring the gaze data of multiple users to the 3D virtual scene, realizing joint analysis of the multiple users' gaze data.
Further, the restoring the gaze data of multiple users to the 3D virtual scene comprises:
restoring the gaze data of the multiple users to the 3D virtual scene according to a spatial matching relationship between the multiple users' gaze data and the 3D virtual scene.
Further, the restoring the gaze data of multiple users to the 3D virtual scene comprises:
generating a target time axis, and matching time information of the multiple users' gaze data onto the target time axis;
restoring the gaze data on the target time axis to the 3D virtual scene.
Further, the method further includes:
generating a visualization graph according to restored data obtained by restoring the multiple users' gaze data to the 3D virtual scene, so as to use the visualization graph for joint analysis of the multiple users' gaze data.
The device herein may be a server, a PC, a PAD, a mobile phone, or the like.
The present invention further provides a computer program product adapted, when executed on a data processing device, to execute a program initializing the following method steps:
obtaining a 3D virtual scene and an eye image of a user;
analyzing the eye image to obtain gaze data of the user;
restoring the gaze data to the 3D virtual scene.
Further, the method further includes:
obtaining user gaze pictures in a real-scene recorded video;
performing image stitching on the user gaze pictures to build a 3D virtual scene, obtaining the 3D virtual scene.
Further, the 3D virtual scene includes a panorama, wherein the image stitching of the user gaze pictures comprises:
projecting each user gaze picture onto a target space plane;
comparing adjacent user gaze pictures on the target space plane to determine a picture overlap region;
fusing the picture overlap region, and stitching the multiple fused user gaze pictures to obtain the panorama.
Further, the method further includes:
obtaining, by shooting with a target device, a 360-degree panorama matching the user's field of view, the 3D virtual scene including the 360-degree panorama.
Further, the method further includes:
obtaining scene information in the user gaze pictures;
building a 3D virtual scene from the scene information through a preset 3D scene model, obtaining the 3D virtual scene.
Further, the method further includes:
acquiring position data of the user, wherein the analyzing the eye image to obtain the gaze data of the user comprises:
analyzing the eye image of the user based on the position data to obtain the gaze data of the user.
Further, the position data includes at least one of the user's position and angle information, travel path data, and behavior data.
Further, the method further includes:
restoring the gaze data of multiple users to the 3D virtual scene, realizing joint analysis of the multiple users' gaze data.
Further, the restoring the gaze data of multiple users to the 3D virtual scene comprises:
restoring the gaze data of the multiple users to the 3D virtual scene according to a spatial matching relationship between the multiple users' gaze data and the 3D virtual scene.
Further, the restoring the gaze data of multiple users to the 3D virtual scene comprises:
generating a target time axis, and matching time information of the multiple users' gaze data onto the target time axis;
restoring the gaze data on the target time axis to the 3D virtual scene.
Further, the method further includes:
generating a visualization graph according to restored data obtained by restoring the multiple users' gaze data to the 3D virtual scene, so as to use the visualization graph for joint analysis of the multiple users' gaze data.
It should be understood by those skilled in the art that embodiments of the application may be provided as a method, a system, or a computer program product. Therefore, the application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, and the like) containing computer-usable program code.
The application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device which realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), an input/output interface, a network interface, and a memory.
The memory may include non-volatile memory in computer-readable media, random access memory (RAM), and/or other forms such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may realize information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium usable for storing information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, such that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device including the element.
It will be understood by those skilled in the art that embodiments of the application may be provided as a method, a system, or a computer program product. Therefore, the application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, and the like) containing computer-usable program code.
The above are only embodiments of the application and are not intended to limit the application. Various changes and variations of the application are possible to those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the application shall be included within the scope of the claims of the application.

Claims (16)

CN201910433988.3A | Priority date: 2019-05-23 | Filing date: 2019-05-23 | A kind of gaze data restoring method and system | Pending | CN110147770A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201910433988.3A (CN110147770A) | 2019-05-23 | 2019-05-23 | A kind of gaze data restoring method and system

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201910433988.3A (CN110147770A) | 2019-05-23 | 2019-05-23 | A kind of gaze data restoring method and system

Publications (1)

Publication Number | Publication Date
CN110147770A | 2019-08-20

Family

ID=67592994

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201910433988.3A (Pending, CN110147770A) | A kind of gaze data restoring method and system | 2019-05-23 | 2019-05-23

Country Status (1)

Country | Link
CN (1) | CN110147770A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN112712487A (en) * | 2020-12-23 | 2021-04-27 | 北京软通智慧城市科技有限公司 | Scene video fusion method and system, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101067716A (en) * | 2007-05-29 | 2007-11-07 | 南京航空航天大学 | Augmented reality natural interactive helmet with eye tracking
CN104506836A (en) * | 2014-11-28 | 2015-04-08 | 深圳市亿思达科技集团有限公司 | Personal holographic three-dimensional display method and device based on eyeball tracking
CN105807931A (en) * | 2016-03-16 | 2016-07-27 | 成都电锯互动科技有限公司 | Realization method of virtual reality
US20170148215A1 (en) * | 2015-11-19 | 2017-05-25 | Oculus VR, LLC | Eye tracking for mitigating vergence and accommodation conflicts
CN108320333A (en) * | 2017-12-29 | 2018-07-24 | 中国银联股份有限公司 | Scene-adaptive virtual reality conversion equipment and scene-adaptive method of virtual reality
CN109032351A (en) * | 2018-07-16 | 2018-12-18 | 北京七鑫易维信息技术有限公司 | Fixation point function determination method, fixation point determination method and device, and terminal device
CN109428987A (en) * | 2017-07-04 | 2019-03-05 | 北京视境技术有限公司 | Head-mounted panoramic 360-degree stereo photographic device and image pickup processing method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101067716A (en) * | 2007-05-29 | 2007-11-07 | 南京航空航天大学 | Augmented reality natural interactive helmet with eye tracking
CN104506836A (en) * | 2014-11-28 | 2015-04-08 | 深圳市亿思达科技集团有限公司 | Personal holographic three-dimensional display method and device based on eyeball tracking
US20170148215A1 (en) * | 2015-11-19 | 2017-05-25 | Oculus VR, LLC | Eye tracking for mitigating vergence and accommodation conflicts
CN105807931A (en) * | 2016-03-16 | 2016-07-27 | 成都电锯互动科技有限公司 | Realization method of virtual reality
CN109428987A (en) * | 2017-07-04 | 2019-03-05 | 北京视境技术有限公司 | Head-mounted panoramic 360-degree stereo photographic device and image pickup processing method
CN108320333A (en) * | 2017-12-29 | 2018-07-24 | 中国银联股份有限公司 | Scene-adaptive virtual reality conversion equipment and scene-adaptive method of virtual reality
CN109032351A (en) * | 2018-07-16 | 2018-12-18 | 北京七鑫易维信息技术有限公司 | Fixation point function determination method, fixation point determination method and device, and terminal device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ERIN MARTEL ET AL.: "Controlling VR games: control schemes and the player experience", Entertainment Computing *
师书恩: 《信息技术教学应用》 (Applications of Information Technology in Teaching), 31 May 2004 *
潘世豪: "虚拟/增强环境中的视线追踪算法研究" (Research on gaze tracking algorithms in virtual/augmented environments), China Masters' Theses Full-text Database, Information Science and Technology *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN112712487A (en) * | 2020-12-23 | 2021-04-27 | 北京软通智慧城市科技有限公司 | Scene video fusion method and system, electronic equipment and storage medium
CN112712487B (en) * | 2020-12-23 | 2024-10-01 | 北京软通智慧科技有限公司 | Scene video fusion method, system, electronic equipment and storage medium


Legal Events

Date | Code | Title | Description
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 2019-08-20

