CA2209886A1 - Motor and eye activity performance analyzer - Google Patents

Motor and eye activity performance analyzer

Info

Publication number
CA2209886A1
Authority
CA
Canada
Prior art keywords
video
subject
eye
output
mixer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002209886A
Other languages
French (fr)
Inventor
Joan Vickers
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University Technologies International Inc
Original Assignee
University Technologies International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University Technologies International Inc
Publication of CA2209886A1 (en)
Legal status: Abandoned


Abstract

Apparatus for the analysis of eye movement, gaze and body movement of a subject, while the subject is involved in a motor activity has an eye video camera adapted to be supported on a subject's head, a scene video camera adapted to be supported on a subject's head, a mixer for mixing the video output of the eye and scene cameras such that in each frame of the mixer output there is included synchronized video information from each of the eye and scene video outputs and a video recorder connected to the output of the mixer for recording the mixer output. An external camera views the subject while the subject is engaged in the activity, and the mixer includes a stage for mixing the external camera with the eye and scene video. A method of simultaneously detecting eye movement and gaze of a subject, while the subject is involved in a motor activity, includes repetitively performing the steps of outputting a first video frame from a first video camera while directing the first video camera towards the subject's eye;
outputting a second video frame from a second video camera while directing the second video camera towards a scene viewed by the subject; mixing the first and second video frames into a mixed frame, such that each mixed frame of video information includes information from each of the first and second video cameras; and recording the mixed frame of video information.

Description

TITLE OF THE INVENTION:
Motor and Eye Activity Performance Analyzer

NAME(S) OF INVENTOR(S):
Joan Vickers

FIELD OF THE INVENTION
This invention relates to systems used in analysis of motor and eye activity performance.

BACKGROUND OF THE INVENTION
In previous work by this inventor, a Vision-in-Action (VIA) system was proposed (Gaze Control in Basketball Foul Shooting, in Eye Movement Research, Elsevier Science BV, 1995) in which a helmet-based mobile eye movement tracking system and external video camera were used to display simultaneously (1) video of a scene viewed by a subject and (2) video of the subject within the scene.

SUMMARY OF THE INVENTION
While this previously proposed system provides the first simultaneous display of gaze and motor activity, the inventor has found that, by presenting the gaze information as a single cursor point in a scene, information about ocular activity is lost. The inventor has therefore proposed in this patent document, the simultaneous recording and display of eye video, such that ocular characteristics may be observed, coded and analyzed.
There is thus provided in accordance with one embodiment of the invention, an apparatus for the analysis of eye movement, gaze and body movement of a subject, while the subject is involved in a motor activity, in which the apparatus includes an eye video camera adapted to be supported on a subject's head, a scene video camera adapted to be supported on a subject's head, a mixer for mixing the video output of the eye and scene cameras such that in each frame of the mixer output there is included synchronized video information from each of the eye and scene video outputs and a video recorder connected to the output of the mixer for recording the mixer output.
Preferably, the apparatus also includes an external camera for viewing the subject while the subject is engaged in the activity, and the mixer includes a stage for mixing the external camera with the eye and scene video.
In a further aspect of the invention, there is provided an eye view monitor computer, and related optical systems, connected to receive output from the eye video camera and the scene video camera for computing the location of gaze of the subject and injecting gaze information into the scene video.
In a still further aspect of the invention, time code information and audio may be mixed with the output from the mixer.
In a still further aspect of the invention, there is provided a method of simultaneously detecting eye movement and gaze of a subject, while the subject is involved in a motor activity, the method comprising repetitively performing the steps of:
outputting a first video frame from a first video camera while directing the first video camera towards the subject's eye;
outputting a second video frame from a second video camera while directing the second video camera towards a scene viewed by the subject;
mixing the first and second video frames into a mixed frame, such that each mixed frame of video information includes information from each of the first and second video cameras; and recording the mixed frame of video information.
In a further aspect of the invention, there is included outputting a third video frame from a third video camera while directing the third video camera at the subject within the scene; and mixing the first, second and third video frames into a mixed frame, such that each mixed frame of video information includes information from each of the first, second and third video cameras.
Preferably, the output is displayed on a video monitor, coded and analyzed. Subjects may then be trained for optimal performance of the activity by recognition of where the subject's activity deviates from the analyzed performance by a more successful subject of the same activity, or a more successful attempt at the activity by the same subject. Repeated recording of the subject during the same motor activity may be used to train the subject.
These and other aspects of the invention are described in the detailed description of the invention and claimed in the claims that follow.

BRIEF DESCRIPTION OF THE DRAWINGS
There will now be described preferred embodiments of the invention, with reference to the drawings, by way of illustration, in which like numerals denote like elements and in which:
Fig. 1 is a schematic showing the main electronic components used in an embodiment of the invention;
Fig. 1A is a schematic showing additional mixing of external video inputs for the embodiment of the invention shown in Fig. 1;

Fig. 2 shows a subject and the location of video cameras with output showing eye motion, field of view of the eye and the position of the subject in the scene;
Fig. 3 shows an exemplary video monitor displaying the outputs from the video cameras shown in Fig. 2; and
Fig. 4 is a graph showing data output from following the method of the invention for the case of a basketball player training to improve free throw performance.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Referring to Figs. 1 and 2, a mobile helmet based eye view monitor system is shown for collecting eye and gaze information from a subject 10. The eye view monitor system is preferably model ASL 3100H available from Applied Science Laboratories. The operation of the eye view monitor system will be described only briefly as the system itself is well known in the art. The eye view monitor system includes an eye video camera 12 mounted on a support 14.
The support 14 is formed from a helmet and attachments for fixing the eye video camera on the head 16 of the subject 10 with the eye video camera 12 directed at the eye of the subject 10 so that the eye of the subject is in the field of view of the eye video camera 12. The eye is illuminated by a near infrared light source (not shown) beamed coaxially with the optical axis of the eye video camera.
The light from the near infrared light source and the resulting backlighted bright pupil image are reflected from a visor 18 coated to be transparent in the visible spectrum but reflective to the near infrared. Use of the visor 18 allows unobstructed simultaneous recording of an image of the eye while the eye is occupied in viewing a scene.
Optics (not shown) for the eye video camera 12 or image processing may be used so that the eye video shows about a 2.5 cm x 2.5 cm square centered on the eye. Two eye video cameras 12 may be used, one for the left eye (OS) and one for the right eye (OD).
A light source 15 is also mounted on the helmet 14 using any of various attachment devices and is supported on the helmet 14 to provide a collimated beam of light that reflects from the visor 18 onto the surface of the cornea and returns a corneal reflection into the eye video camera 12. The light source 15 is also part of the ASL system.
A scene video camera 22 is mounted on a support 24 attached to the helmet 14 and oriented towards the visor 18 so that the field of view of the scene video camera 22 is nearly aligned with the view from the eye being monitored, thus avoiding parallax problems. The scene camera 22 thus appears to see the world from the same position as the subject's eyes. The visor 18 is oriented to pick up the optimal scene in front of the subject 10. Since the subject is free to move the head, the scene changes with shifts in the subject's head position.
A third video camera 32 is mounted in any conventional manner such as on a tripod and directed so that its field of view includes the subject 10 performing a motor activity. The third video camera 32 is not part of the ASL system and may be any conventional video camera with output compatible with the other electronic devices used in the system.
Output from the eye video camera 12 is provided along line 26 to eye view monitor computer 28, also part of the ASL system. The eye view monitor computer 28 is programmed by the manufacturer, Applied Science Laboratories, to use corneal reflection information and pupil center information to compute the gaze location of the subject and add gaze information to the scene video output from the eye view monitor computer 28. When the scene video is displayed on a video monitor, the gaze location appears in the scene as a cursor 54 (Fig. 3), which could be a solid block as shown or a cross-hair of any of various shapes. The algorithms used to compute the gaze location are known in the art. Briefly, the eye view monitor computer 28 measures the vertical and horizontal distance between the center of the pupil 76, 78 and the center of the corneal reflection 84, 86. After correction of second order effects, line of gaze with respect to the light source can be computed. Individual differences and second order effects may be accounted for during calibration, during which the subject fixates on critical calibration points in the scene. The helmet 14 has a 30 meter (or other length) cable, attached to the waist of the subject 10 and interfaced to the eye view monitor computer 28, thus permitting the subject near-normal mobility.
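The pupil-centre-to-corneal-reflection computation above is performed by the ASL system's proprietary software, but the general idea can be sketched: calibration fits a mapping from the measured offset to known scene coordinates, after which any new offset can be mapped to a gaze location. The following is a minimal linear sketch, not the ASL algorithm; the affine model and function names are illustrative assumptions.

```python
import numpy as np

def fit_gaze_calibration(offsets, scene_points):
    """Fit an affine map from pupil-to-CR offsets (N x 2) to known
    scene coordinates (N x 2) by least squares.  Returns a 3 x 2
    matrix A such that [dx, dy, 1] @ A approximates [scene_x, scene_y].
    This models the calibration step in which the subject fixates on
    critical calibration points in the scene."""
    offsets = np.asarray(offsets, dtype=float)
    design = np.hstack([offsets, np.ones((len(offsets), 1))])
    A, *_ = np.linalg.lstsq(design, np.asarray(scene_points, dtype=float), rcond=None)
    return A

def gaze_location(pupil_center, corneal_reflection, A):
    """Map one pupil-centre / corneal-reflection pair to scene coordinates."""
    dx, dy = np.subtract(pupil_center, corneal_reflection)
    return np.array([dx, dy, 1.0]) @ A
```

A linear fit ignores the second-order effects the text mentions; a practical calibration would add quadratic terms to the design matrix in the same way.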
Output from the eye view monitor computer 28, including scene and eye video, and output from the external camera are then mixed in a mixer 34 having an output 36 formed of sequential frames in which, in each frame of the mixer output, there is included synchronized video information from each of the cameras. Preferably, the signals are mixed so that, as shown in Fig. 3, each video output occupies a distinct window on a video monitor 38 (e.g. a Sony monitor). Various arrangements may perform the function of the mixer 34. As shown in Fig. 1, two effects generators 41 and 42, for example Panasonic Special Effects Generators Model WJ 4600a, may be cascaded together. As shown in Fig. 1A, a third effects generator 45 may be cascaded with the first and second effects generators 41 and 42. The third effects generator 45 receives input from two external cameras 32A and 32B, which are preferably oriented to give different views of the same subject 10, or views of two different subjects, and outputs mixed video on line 47 to the input 2 of effects generator 41. Similarly, additional external cameras 32 may be linked in a multi-stage cascade. Audio from other subjects involved in the motor activity, for example a teammate or competitor in a sporting activity, may also be mixed with the video and played back later for analysis.
The effects generator 41 receives scene video from the eye view monitor computer at its video input 1, and video of the subject from external camera 32 at its video input 2. Genlock signal out from blackburst output on the effects generator 41 is fed back to the external camera 32 on line 49. A mixed signal comprising both scene video and subject video is output from the effects generator 41 video output as shown at 43. The effects generator 42 receives eye video from the eye view monitor computer 28 at its video input 1, mixed scene and subject video from the effects generator 41, and mixes them to produce mixed eye, scene and subject video at its video output 36.
The mixer 34 may also be formed of a digital squeezer, itself known in the art, that squeezes each of the images from the eye video camera, scene video camera and external video camera so that a complete, but reduced, image from each camera appears in the displayed image on the display monitor.
Mixer output at 36 is input to a video time code generator 44 (e.g. a Datum model 9100A) which adds time code information to the mixer output for display. The combined mixer output and time code information is input to a first video recorder 48 (e.g. a Toshiba or Sony recorder), where it may be mixed with audio from an audio mixer 46, and recorded on video tape. The video tape or a direct feed may then be input to a second video recorder 52, from which the mixer output and time code information may be displayed on the monitor 38. The final mixed data from all cameras can also be recorded on compact disc, video disc, computer, or other recording devices.
Typical video output displayed on the monitor 38 is shown in Fig. 3 for the analysis of the sport of volleyball. The upper portion, about 50% of the monitor screen, shows a scene 60 as seen by scene video camera 22, which closely approximates the scene as viewed by the subject 10. The cursor 54 shows the gaze location of the subject 10 as computed by the eye view monitor computer 28.
The lower right quarter 62 of the screen shows the subject 10 as viewed by the external camera 32. The lower left quarter 64 shows the image of the eye of the subject 10. As shown in Fig. 3, two eye video cameras 12 may be used to display images of both the left eye 72 and right eye 74 of the subject 10. The eye view image 64 also shows the center of the pupils 76 and 78 as indicated by the cross-hairs 80 and 82 and the location of the corneal reflection 84 and 86 as indicated by the cross-hairs 88 and 89. Time code information 90 may be superimposed on the displayed image in any convenient location, such as above the eye image 64.
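The window layout just described — scene video across the top half, eye video in the lower left quarter, subject video in the lower right — can be sketched digitally. The patent's mixer is hardware (cascaded effects generators or a digital squeezer); this NumPy sketch only illustrates the resulting frame layout, with hypothetical function names and grayscale frames for simplicity.

```python
import numpy as np

def resize_nearest(img, h, w):
    """Minimal nearest-neighbour resize -- a stand-in for the
    'digital squeezer' that reduces each camera image."""
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return img[np.ix_(ys, xs)]

def composite_frame(scene, eye, subject, height=480, width=640):
    """Arrange three grayscale frames in the Fig. 3 layout:
    scene across the top half, eye in the lower left quarter,
    subject in the lower right quarter."""
    out = np.zeros((height, width), dtype=np.uint8)
    h2, w2 = height // 2, width // 2
    out[:h2, :] = resize_nearest(scene, h2, width)
    out[h2:, :w2] = resize_nearest(eye, h2, w2)
    out[h2:, w2:] = resize_nearest(subject, h2, w2)
    return out
```

Each source frame survives complete but reduced, which is what lets a coder later assign values to variables from every stream in the same synchronized frame.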
Each displayed frame of video is assigned a unique time code. Each frame is also preferably coded with a unique identifier in milliseconds to show the frame interval, according to the standard used (30 frames per second, 60 frames per second or 100 frames per second for example).
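The frame-interval arithmetic follows directly from the standard's frame rate. A small sketch with hypothetical helper names:

```python
def frame_interval_ms(fps):
    """Milliseconds per frame for a given video standard
    (e.g. 30, 60 or 100 frames per second)."""
    return 1000.0 / fps

def frame_timestamp_ms(frame_number, fps=30):
    """Unique millisecond identifier assigned to a frame."""
    return frame_number * frame_interval_ms(fps)
```

At 30 frames per second one frame lasts about 33.3 ms, which is why three frames correspond to the 99.9 ms minimum fixation duration used later in the text.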
In operation of the system thus described, a subject 10 performs a motor activity, such as participating in a sport. Examples include basketball free throws, golfing, and volleyball. The eye video camera 12 is directed at the subject's eye (two eye video cameras 12 may be used, one for each eye) after reflection from the visor 18, and eye video is output to the eye view monitor computer 28. The scene video camera 22 is directed at the scene by reflection from the visor 18, and scene video is also output to the eye view monitor computer 28. Eye video and scene video are mixed in the mixer 34. At the same time, the external video camera 32 is directed at the subject 10 and output from the external video camera 32 (or two or more external video cameras as shown in Fig. 1A) is mixed with the output from the eye video camera 12 and scene video camera 22. A frame of video from each camera 12, 22 and 32 is mixed by the mixer 34 such that each mixed frame of video information includes information from each of the video cameras. In some instances, it is possible to omit the subject video. By information is meant useful information: enough of the video output from each camera must be displayed to be able to assign values to variables representing physical activity viewed by the video cameras.
The output from the mixer may then be recorded for later viewing and displayed simultaneously on a monitor. It is preferred that each mixed frame be encoded with a unique time code so that each array of values representing physical activity has a unique identifier. It is preferred that the scene video mixed into the mixer output include gaze information, such that a cursor appears at the location of the subject's gaze in the scene. The subject may also be required to speak about the activity she is engaged in and this speech may be recorded with the mixer output.
Once the mixer output has been recorded, it may be played back and analyzed frame by frame. Data values are assigned to observable physical characteristics of the motor activity of the subject, people or objects in the scene, gaze locations, gaze behaviours, eye movement, and activity performance as outlined in Table 1. Activity performance, or outcome measures, are recorded, such as accuracy of the trial, optimal versus non-optimal use of objects or tools, and effective or ineffective interaction with others. Temporal motor phases are defined based on the mechanics of the motor skills and needs of the subject.
Onset, offset and duration, in milliseconds or other time durations, of the skill and phases of the skill are determined as is the movement time of the skill. Number of steps, onset of the first and last step, step duration, and direction of stepping may be assigned data values.
Likewise, arm and hand movement and body velocity may also be assigned values.
Gaze locations may be identified and coded, relative to the motor phases of the skill. A gaze behaviour is defined as the gaze held stable on a location in the environment, or a shift in gaze from one location to another. Percent of trials in which a gaze behaviour is observed is reported, as is the frequency, onset, offset, and duration of important gaze behaviours during each motor phase.
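The "percent of trials" statistic described above reduces to a simple aggregation over coded trials. A sketch, under the assumption that each trial has been coded as a list of behaviour records with a 'type' field (the data layout is illustrative, not specified in the patent):

```python
def percent_trials_with_behaviour(trials, behaviour_type):
    """Percent of trials in which the given gaze behaviour is
    observed at least once.  `trials` is a list of trials, each a
    list of behaviour records carrying a 'type' key."""
    hits = sum(
        any(b["type"] == behaviour_type for b in trial) for trial in trials
    )
    return 100.0 * hits / len(trials)
```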
Since a shift in gaze is normally initiated by eye movements, four or more gaze behaviours are identified based on definitions originating from the eye movement literature, for example, fixations, saccades, tracking and blink. Each is coded within accepted ranges using minimum duration parameters. For example, minimum fixation duration is 99.9 ms (3 or more frames) and defined as the stabilization of the gaze cursor on a location in the environment. A tracking gaze behaviour is coded when the eyes pursue a moving object or person for 2 or more frames or 66.6 ms. A saccade is coded when a shift in gaze is observed from one location to another with a movement time (MT) of 66.6 ms or more (2 or more frames). During a saccade, vision is normally suppressed. Gross head movement is also analyzed and occurs when the subject's gaze moves such that it is impossible to code a fixation, saccade, tracking or blink. This last category permits an assessment of eye/head stability during the skill.
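The minimum-duration parameters above (three frames for a fixation, two for tracking or a saccade, at roughly 33.3 ms per frame at 30 fps) lend themselves to a run-length coder. The sketch below assumes the gaze has already been labelled frame by frame; in the patent that labelling — deciding whether the cursor is stable, pursuing, or jumping — is done by a human coder, so this is an illustrative post-processing step, not the patent's method.

```python
FRAME_MS = 1000.0 / 30          # one frame at 30 fps, about 33.3 ms

# Minimum durations from the text, expressed in frames.
MIN_FRAMES = {"fixation": 3, "tracking": 2, "saccade": 2}

def code_gaze_behaviours(frame_labels):
    """Collapse per-frame labels ('fixation', 'tracking', 'saccade',
    'blink') into coded behaviours with onset, offset and duration
    in ms, dropping runs shorter than the minimum duration."""
    behaviours, start = [], 0
    for i in range(1, len(frame_labels) + 1):
        if i == len(frame_labels) or frame_labels[i] != frame_labels[start]:
            label, n = frame_labels[start], i - start
            if n >= MIN_FRAMES.get(label, 1):
                behaviours.append({
                    "type": label,
                    "onset_ms": start * FRAME_MS,
                    "offset_ms": i * FRAME_MS,
                    "duration_ms": n * FRAME_MS,
                })
            start = i
    return behaviours
```

A three-frame fixation comes out at just under 100 ms, matching the 99.9 ms minimum fixation duration in the text.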
Ocular functions are defined by pupil diameter, eye movements (ratio of pupil center to CR), blink rate and duration. Ocular motility, accommodation, binocular function, depth perception and amblyopia/strabismus can be determined during active movement.
Analysis output includes graphical presentation of the temporal integration of the subject's gaze behaviours, motor behaviours, and ocular function relative to objects, persons and events in the skill. Quiet eye is a measure of the subject's reaction time or cognitive function. Quiet eye occurs from the onset of fixation or tracking a primary object, person or event until onset of movement time or principal goal directed motor behaviour.
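Quiet eye, as defined above, is an interval measure and can be computed directly from coded gaze behaviours plus the movement onset. A sketch, assuming time-ordered behaviour records with 'type' and 'onset_ms' keys (an illustrative data layout, not one the patent specifies):

```python
def quiet_eye_ms(behaviours, movement_onset_ms):
    """Quiet-eye duration: from onset of the first fixation or
    tracking gaze on the primary target to onset of the principal
    goal-directed movement.  Returns None if no qualifying gaze
    behaviour precedes the movement."""
    for b in behaviours:
        if b["type"] in ("fixation", "tracking") and b["onset_ms"] < movement_onset_ms:
            return movement_onset_ms - b["onset_ms"]
    return None
```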
Analysis of gaze location and behaviour and relation of gaze location and behaviour to the specific motor phase during which it occurs is most important in the training of subjects in the performance of the motor skill.
The system also has application to testing of visual characteristics of hyperactive, Attention Deficit Disorder children, by for example indicating the presence or absence of inattention during performance of a motor skill.
For example, a subject may be monitored during a motor activity and the scene and eye video displayed. Even though a subject's head may be pointing in one direction, thus showing a specific scene, the eye movement can be readily observed to determine whether the eye movement is tracking objects in the scene, attending an object or saccading. Blinking and pupil diameter may be readily observed and measured on the eye video 64 as indicators of the degree of attention of the subject to the task at hand.
When two eye videos are used, deviations in gaze between the eyes may be detected by observation of the pupil centers. If gaze location is also observed using the cursor, even more accurate information can be obtained.
When all three video streams (eye video, scene video and subject video) are presented on the monitor, even more useful information may be obtained, since phase of the motor skill can more readily be observed and coded during playback of the video.
As an example, the invention may be used for the training of children who have attention or reading disorders. A child may be asked to read writing on a blackboard, and then to shift their gaze to a book, while wearing the helmet 14 with eye video camera 12 and scene video camera 22. The scene 60 in this instance is the blackboard, then the book. The external camera 32, when used, indicates accompanying levels of hyperactivity, but is optional if this measure is not needed. The eye video 64 may be monitored during and after the shift in gaze to assess eye movement as the eye begins to focus on the writing in the book. Clear deviations of the eyes during and after the shift in gaze, such as the eyes viewing at different angles, will be observable in the eye video 64.
The recorded video may be played back to the child, and repeated video recordings made of the child while reading, to assist in training the child to avoid deviations in gaze.
In a further example, a basketball player with a poor free throw record, even an expert professional player, may be assessed and trained to improve their visual control and technique. The player wears the helmet 14, and three video cameras 12, 22 and 32 are used. Gaze behaviour and ocular characteristics during the free throw are coded and analyzed to yield the graph shown in Fig. 4. For example, fixation on the hoop may be considered one critical phase of gaze behaviour and the duration of this gaze behaviour and its temporal relation to the stage of the free throw may be recorded. In addition, pupil diameter may be recorded and correlated with gaze and motor phase to assess the focus of attention of the player. It may be, for example, that the player's focus of attention is optimal during pre-shot preparation, but during initiation of the shot, may differ markedly from optimal, as shown by a reduction in pupil diameter (see Fig. 4, wherein the line graph shows pupil diameter). Identification of the motor phase (preparation, pre-shot, shot or flight) may be achieved from studying the scene video 62, and its duration (bar graph in Fig. 4) found easily from the time code 90.
By comparing ocular and gaze characteristics for particular motor phases during misses and hits, the player may be trained to avoid unsuccessful strategies. For example, the player may be taught to reduce or increase the duration of a motor phase, or gaze at a particular feature in the scene or attend with more attention during particular motor phases.
In a further example, unusual visual behaviour of an operator of a vehicle may be monitored using the eye video 64. In this instance, an exterior video camera is used to view the operator, while what the operator sees is shown in the scene video. At the same time, the eye video camera is trained on the operator's eye. Video from the three cameras may be simultaneously viewed as described above, and used in crash reconstruction or analysis of operator performance.
In a further practical application of the invention, stroke patients with visual neglect may also be observed. The system may be used to determine the severity of stroke damage, and the extent of visual neglect during motor tasks.
A person skilled in the art could make immaterial modifications to the invention described in this patent document without departing from the essence of the invention that is intended to be covered by the scope of the claims that follow.

TABLE 1

Variable group                     Variable                     Measure
Subject characteristics            Age                          years, months
                                   Gender                       male, female
                                   Accuracy                     accurate, inaccurate
                                   Background                   nominal
                                   Experimental group           nominal
                                   Other groupings              nominal
                                   Height, weight, etc.         numeric
Phase of movement characteristics  Phase of movement            nominal
                                   Duration of phase            milliseconds
                                   Onset of phase               milliseconds
                                   Offset of phase              milliseconds
                                   Direction of movement        nominal
                                   Accuracy                     % trials
                                   Velocity                     metres per second
                                   Acceleration                 metres per second
                                   Object reception location    nominal
                                   Errors                       numeric
                                   Body alignment               angle, cm
                                   Step length                  cm, metres
                                   FTR recordings               degrees
                                   Accuracy                     numeric, percent
                                   Errors                       numeric, ratio, percent
                                   Improvement                  numeric, percent
                                   Movement time (MT)           milliseconds
Gaze behaviour                     Type of gaze                 nominal (fixation, saccade, track, blink, head movement)
                                   Gaze location                nominal
                                   Type of gaze by location     nominal
                                   Gaze duration                milliseconds
                                   Gaze onset                   milliseconds
                                   Gaze offset                  milliseconds
                                   Gaze frequency               numeric, percentage
                                   Reaction time (RT)           milliseconds
                                   Quiet eye (RT)               milliseconds
Ocular characteristics             x/y pupil centre             millimetres, degrees of arc
                                   x/y corneal reflection       millimetres, degrees of arc
                                   Pupil diameter               millimetres
                                   Eccentricity                 degrees of arc, diopter
                                   Visual angle                 degrees of arc, diopter
                                   OD and OS co-ordination      degrees of arc, diopter
Object, event characteristics      Speed                        metres per second
                                   Acceleration                 metres per second
                                   Size                         metres, centimetres
                                   Change in direction          centimetres, visual angle
                                   Location                     nominal
                                   Flight onset                 milliseconds
                                   Flight offset                milliseconds
                                   Temporal onsets              milliseconds
                                   Temporal offsets             milliseconds
                                   Target characteristics       nominal

Claims (15)

THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE
PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. Apparatus for the analysis of eye movement, gaze and body movement of a subject, while the subject is involved in a motor activity, the apparatus comprising:
an eye video camera having a first field of view and a first video output;
first support means for supporting the eye video camera on the subject with the eye of the subject in the first field of view;
a scene video camera having a second field of view and a second video output;
second support means for locating the scene video camera in a position that the second field of view is substantially aligned with the field of view of the subject's eye;
a mixer for mixing the first and second video output, the mixer having an output in which, in each frame of the mixer output, there is included synchronized video information from each of the first and second video outputs; and
a video recorder connected to the output of the mixer for recording the mixer output.
2. The apparatus of claim 1 further including:
a subject video camera having a third field of view and a third video output connected to an input of the mixer; and the mixer having output in which, in each frame of the mixer output, there is included synchronized video information from each of the first, second and third video outputs.
3. The apparatus of claim 1 further including an eye view monitor computer connected to receive output from the eye video camera and the scene video camera for computing the location of gaze of the subject and injecting gaze information into the second video output.
4. The apparatus of claim 3 further including:
a light source for providing a corneal reflection into the eye video camera;
a support for supporting the light source on the subject; and the eye view monitor computer being programmed to use corneal reflection information and pupil center information to compute the gaze location of the subject and add gaze information to the scene video.
5. The apparatus of claim 4 further including a time code generator operatively connected to add time code information to the mixer output.
6. The apparatus of claim 5 further including an audio mixer operatively connected to add audio information to the mixer output.
7. The apparatus of claim 1 further including a time code generator operatively connected to add time code information to the mixer output.
8. The apparatus of claim 2 further including an audio mixer operatively connected to add audio information to the mixer output.
9. A method of simultaneously detecting eye movement and gaze of a subject, while the subject is involved in a motor activity, the method comprising repetitively performing the steps of:
outputting a first video frame from a first video camera while directing the first video camera towards the subject's eye;
outputting a second video frame from a second video camera while directing the second video camera towards a scene viewed by the subject;
mixing the first and second video frames into a mixed frame, such that each mixed frame of video information includes information from each of the first and second video cameras; and recording the mixed frame of video information.
10. The method of claim 9 further including:
outputting a third video frame from a third video camera while directing the third video camera at the subject within the scene; and mixing the first, second and third video frames into a mixed frame, such that each mixed frame of video information includes information from each of the first, second and third video cameras.
11. The method of claim 9 further comprising the step of:
displaying the mixed frame of video information on a video monitor.
12. The method of claim 11 further comprising the step of:
encoding each mixed frame with a unique time code.
13. The method of claim 12 further including the step of:
adding gaze data to each second frame, such that a cursor appears at the location of the subject's gaze in the scene.
14. The method of claim 13 further including the step of:
adding audio data to the mixed frame.
15. The method of claim 14 further including the step of:
analyzing the information displayed in the mixed frame by:
assigning first data values corresponding to the nature and duration of pupil movement of the subject;
assigning second data values corresponding to gaze location and duration of the subject; and
correlating the first and second data values with the subject's ability to perform the activity.
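The gaze location and duration values of claim 15 presuppose segmenting the recorded per-frame gaze coordinates into fixations. A minimal dispersion-style sketch, with a hypothetical pixel threshold and an assumed ~30 fps frame interval (neither figure comes from the application):

```python
def fixation_durations(gaze_points, dispersion_px=2.0, frame_ms=1000 / 30):
    """Split a sequence of per-frame gaze coordinates into fixations:
    consecutive points closer than dispersion_px are treated as one
    fixation, and each fixation's duration is returned in milliseconds."""
    if not gaze_points:
        return []
    durations = []
    run = 1                                   # frames in the current fixation
    for (x0, y0), (x1, y1) in zip(gaze_points, gaze_points[1:]):
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 <= dispersion_px:
            run += 1                          # gaze is still on the same spot
        else:
            durations.append(run * frame_ms)  # fixation ended; start a new one
            run = 1
    durations.append(run * frame_ms)          # close the final fixation
    return durations

# Three near-identical points, then a jump to a second location:
spans = fixation_durations([(0, 0), (1, 0), (1, 1), (50, 50), (50, 51)])
```

The resulting duration values are the kind of "second data values" that claim 15 correlates with the subject's performance of the motor activity.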
CA002209886A | 1996-07-15 | 1997-07-09 | Motor and eye activity performance analyzer | Abandoned | CA2209886A1 (en)

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US68007696A | 1996-07-15 | 1996-07-15
US08/680,076 | 1996-07-15

Publications (1)

Publication Number | Publication Date
CA2209886A1 (en) | 1998-01-15

Family

ID=24729551

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CA002209886A (Abandoned; published as CA2209886A1 (en)) | Motor and eye activity performance analyzer | 1996-07-15 | 1997-07-09

Country Status (1)

Country | Link
CA (1) | CA2209886A1 (en)


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2001074236A1 (en)* | 2000-03-31 | 2001-10-11 | University Technologies International Inc. | A diagnostic test for attention deficit hyperactivity disorder
EP1219243A1 (en)* | 2000-12-28 | 2002-07-03 | Matsushita Electric Works, Ltd. | Non-invasive brain function examination
US6702757B2 | 2000-12-28 | 2004-03-09 | Matsushita Electric Works, Ltd. | Non-invasive brain function examination
US9649030B2 | 2012-05-01 | 2017-05-16 | RightEye, LLC | Systems and methods for evaluating human eye tracking
US8864310B2 | 2012-05-01 | 2014-10-21 | RightEye, LLC | Systems and methods for evaluating human eye tracking
US8955974B2 | 2012-05-01 | 2015-02-17 | RightEye, LLC | Systems and methods for evaluating human eye tracking
WO2013166025A1 (en)* | 2012-05-01 | 2013-11-07 | RightEye, LLC | Systems and methods for evaluating human eye tracking
US10512397B2 | 2012-05-01 | 2019-12-24 | RightEye, LLC | Systems and methods for evaluating human eye tracking
EP3733051A1 (en)* | 2012-05-01 | 2020-11-04 | RightEye, LLC | Systems and methods for evaluating human eye tracking
US11160450B2 | 2012-05-01 | 2021-11-02 | RightEye, LLC | Systems and methods for evaluating human eye tracking
US11690510B2 | 2012-05-01 | 2023-07-04 | RightEye, LLC | Systems and methods for evaluating human eye tracking
US12251161B2 | 2012-05-01 | 2025-03-18 | RightEye, LLC | Systems and methods for evaluating human eye tracking
CN113509138A (en)* | 2021-05-11 | 2021-10-19 | Lingnan Normal University | Pupillary light reflex instrument using laser self-mixing interference for children with autism
CN115314684A (en)* | 2022-10-10 | 2022-11-08 | Computer Network Information Center, Chinese Academy of Sciences | Railway bridge inspection method, system, device and readable storage medium
CN115314684B (en)* | 2022-10-10 | 2022-12-27 | Computer Network Information Center, Chinese Academy of Sciences | Railway bridge inspection method, system, device and readable storage medium

Similar Documents

Publication | Title
US12059207B2 (en) | Cognitive training system with binocular coordination analysis and cognitive timing training feedback
US9895100B2 (en) | Eye movement monitoring of brain function
US10231614B2 (en) | Systems and methods for using virtual reality, augmented reality, and/or a synthetic 3-dimensional information for the measurement of human ocular performance
US7872635B2 (en) | Foveated display eye-tracking system and method
US9004687B2 (en) | Eye tracking headset and system for neuropsychological testing including the detection of brain damage
Pelz et al. | Oculomotor behavior and perceptual strategies in complex tasks
US4789235A (en) | Method and system for generating a description of the distribution of looking time as people watch television commercials
US5293187A (en) | Method and apparatus for eye tracking for convergence and strabismus measurement
KR102182605B1 (en) | Systems and methods for gaze-based media selection and editing
US10398309B2 (en) | Noninvasive rapid screening of mild traumatic brain injury using combination of subject's objective oculomotor, vestibular and reaction time analytic variables
US20110205167A1 (en) | Brain concussion screening method & apparatus
CN108366764B (en) | Viewer emotion determination device, viewer emotion determination system, and program
US11317861B2 (en) | Vestibular-ocular reflex test and training system
Baumann et al. | Neon accuracy test report
JPH0654808A (en) | Medical treatment diagnostic device by gaze point masking
Ma et al. | Eye tracking measures of bicyclists' behavior and perception: A systematic review
Cumming | Eye movements and visual perception
CA2209886A1 (en) | Motor and eye activity performance analyzer
WO2001074236A1 (en) | A diagnostic test for attention deficit hyperactivity disorder
US8567950B2 (en) | Apparatus for treating visual field loss
Lambert et al. | High-speed data processing and unobtrusive monitoring of eye movements
US20220386953A1 (en) | Impairment screening system and method
JP2021035499A (en) | Eyewear, data collection system and data collection method
KR102132294B1 (en) | Method for analyzing virtual reality content information in virtual reality and evaluation terminal adopting the same
CN113729609B (en) | Co-vision machine

Legal Events

Code | Description
FZDE | Discontinued
