RELATED APPLICATION
This application claims priority from U.S. Provisional Application No. 62/393,337, filed 12 Sep. 2016, the subject matter of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This disclosure relates to an apparatus and method for collecting neurological data and, more particularly, to a method and apparatus for analyzing neurocognitive data.
BACKGROUND
Minor head injury and concussion can cause neurological anomalies that may manifest via symptoms like loss of consciousness, amnesia, headache, dizziness, fatigue, and light sensitivity. These neurological anomalies may also or instead manifest via subtler symptoms like decreased neurocognitive function, impaired hand-eye coordination, and inhibited depth perception. Many other correlated neurocognitive functions are suspected of being causally related to minor head injury and concussion.
Modern traumatic brain injury testing solutions vary in scope of testing, portability, and price. Most modern testing solutions measure only a single facet of neurocognitive function at a time, while testing solutions that measure multiple neurocognitive functions in a single test are too expensive and cumbersome to reasonably allow for portability. These constraints restrict the locations in which, and the frequency with which, the test can be administered, and increase the delay between a possible traumatic head injury and the administration of the test.
SUMMARY
In an aspect, a neurological data collecting apparatus is described. A plurality of electronically interconnected interface panels configured to selectively display stimuli and accept input is provided. An interface panel frame supports the plurality of interface panels and aggregates the plurality of interface panels into a user interface screen that extends into the user's peripheral vision when the user is standing within a predetermined distance from the user interface screen. A supporting stand is connected to the user interface screen. The supporting stand includes a height adjustment mechanism for adjusting the height of the user interface screen to eye level of the user. A control unit electronically connects the user interface screen to a computer.
In an aspect, a method for analyzing neurocognitive data is described. A portable user interface screen that extends into the user's peripheral vision, supported on a supporting stand at the user's eye level, is provided, along with a computer electronically connected to both a data collection and aggregation system and the user interface screen via a control unit. A programmatically generated neurocognitive test is sent from the computer to the user interface screen via the electronic connection to the control unit. At least one neurocognitive test is displayed on the user interface screen. Input from the user responsive to the at least one neurocognitive test is accepted and recorded. The input is sent from the user interface screen through the control unit to the computer and from the computer to the data collection and aggregation system. Neurocognitive data derived from the input is analyzed using algorithms. Results are created responsive to the neurocognitive data. The results are sent from the data collection and aggregation system to the computer to be displayed on the computer in a user-perceptible format.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding, reference may be made to the accompanying drawings, in which:
FIG. 1A schematically depicts a front view of an aspect of the apparatus;
FIG. 1B schematically depicts a back view of the apparatus of FIG. 1A;
FIG. 2 is an exploded perspective schematic view of a component of the aspect of FIG. 1A;
FIG. 3 is a detail partial bottom perspective schematic view of a component of the aspect of FIG. 1A and FIG. 4A;
FIG. 4A is a schematic rear view of the aspect of FIG. 1A in an alternate configuration;
FIG. 4B is a schematic top view of the aspect of FIG. 4A;
FIG. 4C is a schematic bottom view of the aspect of FIG. 4A;
FIG. 5 is a schematic front view of a component of an alternative configuration;
FIG. 6 is a flow chart of a method for using the aspect of FIG. 1A; and
FIG. 7 is a flow chart of a method for creating an aspect in FIG. 6.
DESCRIPTION OF ASPECTS OF THE DISCLOSURE
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as is commonly understood by one of skill in the art to which the present disclosure pertains.
As used herein, the term “user” can be used interchangeably with the term “patient” and refer to any warm-blooded organism including, but not limited to, human beings, pigs, rats, mice, dogs, goats, sheep, horses, monkeys, apes, rabbits, cattle, farm animals, livestock, etc.
As used herein, the singular forms “a,” “an” and “the” can include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” as used herein, can specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “and/or” can include any and all combinations of one or more of the associated listed items.
As used herein, phrases such as “between X and Y” and “between about X and Y” can be interpreted to include X and Y.
It will be understood that when an element is referred to as being “on,” “attached” to, “connected” to, “coupled” with, “contacting,” etc., another element, it can be directly on, attached to, connected to, coupled with or contacting the other element or intervening elements may also be present. In contrast, when an element is referred to as being, for example, “directly on,” “directly attached” to, “directly connected” to, “directly coupled” with or “directly contacting” another element, there are no intervening elements present.
Spatially relative terms, such as “under,” “below,” “lower,” “over,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms can encompass different orientations of a device in use or operation, in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features.
It will be understood that, although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Thus, a “first” element discussed below could also be termed a “second” element without departing from the teachings of the present disclosure. The sequence of operations (or steps) is not limited to the order presented in the claims or figures unless specifically indicated otherwise.
The invention comprises, consists of, or consists essentially of the following features, in any combination.
An apparatus for collecting neurological data from a user is shown in FIG. 1A and FIG. 1B. The apparatus uses a plurality of electronically interconnected interface panels 2 configured to selectively display stimuli, such as visual, tactile, or audible stimuli, and accept input, such as tactile, verbal, or visual input. Each interface panel 2 includes an electronically controlled stimulus display 18 and one or more corresponding input sensors 20 that may be associated with an individual stimulus and the time the user interacted with that same individual stimulus. The electronically controlled stimulus display 18 may use liquid crystal display (LCD), light emitting diode (LED), organic light emitting diode (OLED), plasma display panel (PDP), or any other desired technology to create a two-dimensional video display that covers a substantial portion of the surface area of the front of the interface panel 2. (It is also contemplated that the display screen could be virtual, or augmented reality, such as by providing the user with a pair of virtual reality glasses or any suitable virtual reality headset and appropriate software/hardware to display stimuli and accept input in an analogous manner—in virtual space—to the use of the method and apparatus described and shown herein as occurring at least partially in physical space.) The input sensors 20 may use resistive touch screen, capacitive touch screen, surface acoustic wave, infrared grid, optical imaging technology, force sensors, buttons, or any other desired input or user interface component to measure where and when a user touches an interface panel 2. The user interface screen 6 could also or instead use speakers and/or a microphone to interact with the user when audible stimuli and/or verbal input are present in the apparatus.
As shown in FIG. 2, the interface panels 2 may each use a plurality of LEDs 22 soldered (or otherwise connected) to a printed circuit board 24 to create an electronically controlled stimulus display 18. Each LED 22 may be, for example, 5×5 mm, with a series of LEDs 22 arranged in any desired configuration, such as in a 16×28 matrix, on the printed circuit board 24. One example interface panel 2 uses a four-wire resistive touchscreen 26 that covers the entire stimulus display to create input sensors 20. The four-wire resistive touchscreen 26 could be connected to the printed circuit board 24 by a flexible printed circuit board or ribbon cable (not shown). The printed circuit board 24 contains an integrated circuit 28 which converts the input data into x and y coordinates in order to correspond the input with an associated individual stimulus. Between the printed circuit board 24 and the four-wire resistive touchscreen 26 is a buffer plate 30, which is a material that fills the gaps between the LEDs 22 so that the four-wire resistive touchscreen 26 rests on a substantially level surface (the LEDs 22 and the buffer plate 30). Behind the printed circuit board 24 is a back plate 32, which is a metal plate that protects the printed circuit board 24.
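By way of a non-limiting illustration (not part of the disclosed firmware), the following sketch shows how a touch reading from such a resistive touchscreen could be paired with the individual stimulus it answers on one interface panel, assuming the example 16×28 LED matrix with 5 mm pitch described above; the function names, field names, and timing source are illustrative assumptions only.

```python
import time

ROWS, COLS = 16, 28      # example LED matrix size from the panel described above
PITCH_MM = 5.0           # example 5 x 5 mm LED pitch

def touch_to_cell(x_mm, y_mm):
    """Convert touchscreen coordinates (mm from the panel's top-left corner)
    into the row/column of the LED cell the user touched."""
    col = min(int(x_mm // PITCH_MM), COLS - 1)
    row = min(int(y_mm // PITCH_MM), ROWS - 1)
    return row, col

def record_response(stimulus_cell, stimulus_onset, x_mm, y_mm):
    """Associate a touch with an individual stimulus and the time of interaction."""
    touched = touch_to_cell(x_mm, y_mm)
    return {
        "stimulus": stimulus_cell,
        "touched": touched,
        "reaction_s": time.monotonic() - stimulus_onset,
        "hit": touched == stimulus_cell,
    }

# Example: a stimulus lit at row 3, column 10; the user touches 52 mm right and
# 17 mm down from the panel corner, which maps back to that same cell.
onset = time.monotonic()
print(record_response((3, 10), onset, x_mm=52.0, y_mm=17.0))
```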
The interface panels 2 each slide into an interface panel frame 4, which aggregates the interface panels 2 to form a user interface screen 6 that extends into the user's peripheral vision when the user is standing within a predetermined distance, such as up to 20″, from the user interface screen 6. (A user-to-screen separation of 20″ or less has been shown to facilitate a desired field of vision, and of peripheral vision, for the user while still allowing the size of the user interface screen 6 to be relatively compact and thus portable, in some use environments of the present invention.)
For example, it is generally held in the field that peripheral vision begins at approximately 60 degrees from the center of the user's gaze. Therefore, a desired testing distance away from the screen can be calculated for a particular screen size based upon that 60-degree "rule of thumb". One of ordinary skill in the art will be able to specify sizing of a user interface screen 6, and corresponding placement of the user relative to the screen, to achieve the desired field of vision results for a particular use application of the present invention. It is contemplated, though, that the sizing, vertical placement in the field of view, horizontal placement in the field of view, and distance from the user of the user interface screen 6 may vary, depending upon the size, visual ability, and other characteristics of the user. Therefore, one of ordinary skill in the art will be able to configure a user interface screen 6 for a particular use environment and/or a particular user without undue experimentation, using the principles articulated herein along with the knowledge of one of ordinary skill in the art.
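As a hedged, worked example of the 60-degree geometry discussed above (the specific distances and resulting widths are illustrative, not prescribed dimensions), the minimum width of a flat screen whose edges just reach the peripheral-vision boundary is twice the user-to-screen distance multiplied by the tangent of 60 degrees:

```python
import math

def min_screen_width_in(distance_in, peripheral_onset_deg=60.0):
    """Width of a flat screen whose left/right edges sit at the peripheral-vision
    boundary for a user centered in front of it at `distance_in` inches."""
    half_width = distance_in * math.tan(math.radians(peripheral_onset_deg))
    return 2.0 * half_width

for d in (12, 16, 20):  # candidate user-to-screen separations, in inches
    print(f'{d}" away -> screen at least {min_screen_width_in(d):.1f}" wide')
```

At the 20″ separation mentioned above this comes to roughly 69″, which is consistent with the approximately 6-foot-wide example user interface screen 6 described later in this disclosure.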
A functionally significant feature of the apparatus is that it uses a user interface screen 6 that extends into the user's peripheral vision. A user interface screen 6 of any size can measure peripheral vision, depending on how close the user is to the user interface screen, but the closer a user is to the user interface screen, the more depth perception can be measured. A user interface screen that still extends into the user's peripheral vision when the user is, for example, up to 20″ away results in a relatively large and wide user interface screen 6 that increases both the measurements of peripheral vision and depth perception as the user interface screen 6 width increases and the user's distance from the user interface screen 6 decreases. Stated differently, the closer a user is to the user interface screen 6, the more depth perception can be measured, while the further away a user is, the more of the user interface screen 6 is in an area of central vision.
The interface panel frame 4 may be built, for example, from extruded aluminum bars 34 as shown in FIG. 3. As shown in FIGS. 1A and 1B, in one possible embodiment the interface panel frame 4 includes four extruded aluminum bars 34 formed into a rectangle. As shown in FIG. 3, the interior enclosed area 36 of each extruded aluminum bar 34 could have an adapter 40 to allow at least one interface panel 2 to be slid into the interior enclosed area 36, so as to allow the interface panels 2 to be combined using the interface panel frame 4 into the user interface screen 6.
As shown in FIG. 4A, in an alternate configuration of the interface panel frame 4 to that shown in FIGS. 1A and 1B, each interface panel 2 may attach to each adjacent interface panel 2 via at least one hinge 46, so that the plurality of interface panels 2 being aggregated into a user interface screen 6 are configured to fold in an accordion or map-like manner. As shown in FIG. 4B, the user interface screen 6 folds in an accordion or map-like manner when the last interface panel 2 folds inwards on the adjoining interface panel 2, followed by that interface panel 2 folding outwards on the next adjacent interface panel 2, which is then repeated until all interface panels 2 are collapsed on each other in a zig-zag type manner in the horizontal direction. When the user interface screen 6 includes one or more "rows" of interface panels 2 stacked vertically, rather than the single row shown in FIG. 4A, then the user interface screen 6 could also or instead collapse in a similar zig-zag type manner in the vertical direction.
In an example of this alternate configuration of FIGS. 4A-4C, the top and bottom of the interface panel frame 4 may include a plurality of extruded aluminum bars 34 that are each the same length as a corresponding attached interface panel 2. Each extruded aluminum bar 34 may be attached to an adjacent extruded aluminum bar 34 by hinges 46 with up to a 180-degree angle of rotation. As shown in FIG. 4C, when all hinges 46 are fully extended to 180 degrees, the extruded aluminum bars 34 and their corresponding attached interface panels 2 become horizontally aligned with each other to form the aggregated user interface screen 6. As shown in FIGS. 3, 4B, and 4C, within the exterior enclosed area 38 of the top and bottom extruded aluminum bars 34 are a plurality of rods 42, which may have any desired shape, including a contoured profile substantially matching that of the exterior enclosed area 38. For example, a number of rods 42 equal to one fewer than the number of interface panels 2 could be provided. The leftmost and rightmost rods 42 have knobs 44 attached to them. When the knobs 44 are slid along the top of the user interface screen 6, all of the rods 42 are shifted either left or right within the exterior enclosed area 38, in accordance with the sliding direction of the knobs 44. As shown in FIG. 4C, when the rods 42 are shifted halfway to the left or right, each rod 42 spans two adjacent extruded aluminum bars 34, thus bridging across the hinged connection and creating a lock to prevent relative pivoting of those adjacent interface panels 2 with respect to each other. As shown in FIG. 4B, when the rods 42 are shifted completely left or right, the rods 42 only span their respective extruded aluminum bars 34 and align with the corresponding interface panels 2 between the hinges 46, thus allowing the rotation of each interface panel 2 towards an adjacent interface panel 2 to facilitate folding of the user interface screen 6, with an entirely self-contained system which does not require additional, separate locking components for use.
Each interface panel 2 is electronically connected to an adjacent interface panel 2, to a control unit 12, and to a power supply 14. The control unit 12 controls the display of stimuli by the interface panels 2 and accepts the input from the interface panels 2. The control unit 12 may be an embedded single-board computer controller with microprocessor(s), memory, input/output (I/O), and other features found in a functional computer (which is used herein to reference any desired type of computer structure, including a micro control unit). The power supply 14 is an electronic power distributor that supplies the power to each interface panel 2 and the control unit 12. The power supply 14 can connect to any AC adapter, AC/DC adapter, or AC/DC converter power cord, such as a power cord typically used to charge a laptop, or to an external battery, and/or the power supply 14 could include its own onboard battery or other power storage feature.
The control unit 12 and power supply 14 may be physically attached to any interface panel 2. In the configuration of FIGS. 1A-1B, where each interface panel 2 is separate from the others until assembled for use, the interface panels 2 may be electronically connected to each other by a user during assembly using electronic connectors 16 like a ribbon cable, a flexible printed circuit board, rigid ports/sockets, or any other desired electrical connecting feature. In the configuration of FIGS. 4A-4C, where each interface panel 2 is always connected with an adjacent interface panel 2 during storage, transport, and use, the electronic connectors 16 may be, for example, a permanent flexible printed circuit board, and a flexible protective shield 48 may be placed between each pair of adjacent interface panels 2 to protect the electronic connectors 16.
The control unit 12 may be electronically connected to a computer 12. The computer 12 may be a desktop computer, a laptop computer, a tablet computer, a smartphone, a hand-held computer device, or any other desired computer component. The computer 12 may be electronically connected to the control unit 12 by a wired or wireless personal area network (PAN) technology such as, but not limited to, Bluetooth, USB cable, Wireless USB, or Zigbee; by a wired or wireless local area network (LAN) technology such as, but not limited to, an Ethernet cable or Wi-Fi; and/or by radio frequency (RF) and/or wide area network (WAN) technologies such as, but not limited to, digital radio and remote Internet.
A supporting stand 8 with a height adjustment mechanism 10 may connect to the interface panel frame 4 or directly to the user interface screen 6. The height adjustment mechanism 10, when present, allows for adjustment of the user interface screen 6 to the user's eye level. That is, the supporting stand 8 may be an adjustable stand that has a variable height above the ground corresponding to the placement of the user interface screen 6 at eye level of the user. Alternately, the supporting stand 8 may be a flat display mounting interface or other wall adapter (e.g., like those sometimes used to mount flat-screen TVs to an upright surface) that is adjustably attached to a vertical surface and configured to allow the user interface screen 6 to be moved up and down to correspond to eye level of the user. In the configuration shown in FIGS. 1A-1B, the supporting stand 8 may attach directly to the bottom extruded aluminum bar 34. As shown in the configuration of FIGS. 4A-4C, the back of the user interface screen 6 could contain a mount interface, such as a standard VESA mount, thus allowing any VESA-compatible stand or wall mount to be used as a supporting stand 8. A flat display mounting interface that is adjustably attached to a vertical surface is not limited to just a wall mount, but may also include supporting stand 8 designs compatible with the trailer hitch of a vehicle, a crossbar of audience seating bleachers, a tailgate of a vehicle, or any other relatively stationary surface in front of which a user can stand during use of the apparatus. One of ordinary skill in the art can readily provide a suitable supporting stand 8 design for a desired use environment of the present invention.
Another configuration of the apparatus could use one OLED stimulus display as the user interface screen 6, rather than a plurality of interface panels 2. Due to the flexibility of the OLED stimulus displays, the user interface screen 6 would be able to be rolled up or folded without the need for separate interface panels 2. A carbon fiber lattice support, or any other desired framing or mounting components, may be attached to the user interface screen 6 to support its weight, hold the OLED stimulus display in a desired steady position for use of the apparatus, and/or provide an attachment mechanism to the supporting stand 8. It is contemplated that, when an OLED stimulus display is used as the user interface screen 6, the OLED stimulus display could be curved to at least partially laterally surround the user and thus keep the user interface screen 6 surface at a substantially constant distance from a stationary user.
However accomplished, though (via a folding structure, a flexible OLED, a virtual reality interface, or any other desired scheme), the portability of the user interface screen 6 may be important for some use environments (e.g., beside a playing field), in order to facilitate rapid and frequent testing of users. As an example, a user interface screen 6 as described herein and shown in the Figures may be approximately 2′×6′×0.5″ in a use configuration and approximately 2′×1′×3″ as packed for transport/storage.
The apparatus may also include at least one external sensor 52, in addition to the input sensors 20, as shown in FIG. 5, that records tactile, visual, or audio sensations generated by the user or the environment. An external sensor 52 could be attached to the user interface screen and electronically connected to the computer or the control unit 12 via a direct electronic connection to the control unit 12 or an indirect connection to an interface panel 2. At least one of the external sensors 52 may be a camera configured to record at least one of the user's eye and body movements. As shown in FIG. 5, a plurality of external sensors in the form of cameras may be placed on a pair of glasses 54 the user would wear while interacting with the user interface screen 6. When a plurality of cameras are provided, one camera could be a video camera 56 that records the user interface screen 6 in order to record what stimulus the user is looking at, and another camera could be an infrared camera 58 recording the pupil movements of the user. Another configuration could attach a video camera to the user interface screen 6, pointed toward the user, in order to record the user's body movements. Additional external sensors (e.g., thermometer, pulse oximeter, and/or barometer; not shown) could be associated with the user interface screen 6, and by extension the control unit 12, to collect additional environmental, physiological, and/or health data such as, but not limited to, body temperature, pulse, blood pressure, blood oxygen, ambient temperature, light, atmospheric pressure, lactic acid levels, and/or heart rate.
The user interface screen 6, interface panel frame 4, and supporting stand 8 may be configured for repeated disassembly and reassembly by the user. The versatility and design of these structures make the apparatus portable, which allows the apparatus to be used in any environment. In the rectangular interface panel frame 4 configuration of FIGS. 1A-1B, the apparatus may be assembled by a single user in less than ten minutes by first constructing the interface panel frame 4, attaching the supporting stand 8 to the interface panel frame 4, sliding the interface panels 2 into the interface panel frame 4, and electronically connecting the interface panels 2 to each other. In the FIGS. 4A-4C configuration using the accordion design, or the OLED stimulus display (not shown), the apparatus can be assembled in less than one minute by first extending the user interface screen 6, and then attaching the supporting stand 8 to the VESA mount interface 50 or another mount interface associated with the user interface screen 6.
The apparatus and system described herein analyzes neurological data derived from the input of a user by using a method as shown in a flow chart 200 in FIG. 6. In a first action block 202 of the flow chart 200, a portable user interface screen 6 is provided. The user interface screen 6 extends into the user's peripheral vision when the user is standing within a predetermined distance from the user interface screen 6, and uses a supporting stand 8 to be at a user's eye level. Optionally, additional external sensors (e.g., thermometer, pulse oximeter, and/or barometer; not shown) could be associated with the user interface screen 6 and be used in the method described herein to collect additional environmental, physiological, and/or health data such as, but not limited to, body temperature, pulse, blood pressure, blood oxygen, ambient temperature, light, atmospheric pressure, lactic acid levels, and/or heart rate. This additional data can be used in any desired manner to inform, weight, control for, or otherwise affect the test results, and/or could simply be recorded for the record, as desired.
In a second action block 204 of the flow chart 200, a computer 12 electronically connected to both a data collection and aggregation system 14 (e.g., a computer server, optionally including a database function) and the user interface screen 6 via the control unit 12 is provided. The data collection and aggregation system 14 may be either a remote, Internet-connected, cloud-based data collection and aggregation system or a local data collection and aggregation system held on the computer 12.
In the third action block 206 of flow chart 200, the computer 12 sends a programmatically generated neurocognitive test 10 to the user interface screen 6 via the control unit 12. Every neurocognitive test 10 is programmatically generated according to a series of preset algorithmic rules located in the computer, data collection and aggregation system, and/or control unit. In the fourth action block 208 of flow chart 200, the user interface screen 6 displays at least one neurocognitive test 10, accepts the user's responses to said neurocognitive test 10, and records them as input. A neurocognitive test 10 is a sequence of stimuli displayed on the user interface screen 6, where the reactions of the user to such a sequence of stimuli are correlated with at least one neurocognitive data type chosen from data types including, but not limited to, psychomotor response, complex reaction time, memory, balance, peripheral awareness, and/or any other desired neurocognitive data type.
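One minimal, hypothetical sketch of "programmatic generation" of a neurocognitive test as a timed sequence of stimuli is shown below; the rule set, panel count, matrix size, timing ranges, and target/non-target mix are assumptions for illustration and are not the preset algorithmic rules referenced above.

```python
import random

def generate_test(n_stimuli=20, panels=6, rows=16, cols=28,
                  min_gap_s=0.8, max_gap_s=2.5, seed=None):
    """Build a sequence of stimuli: each entry says when and where a stimulus
    should appear on the aggregated screen, and whether it is a target."""
    rng = random.Random(seed)
    t = 0.0
    sequence = []
    for _ in range(n_stimuli):
        t += rng.uniform(min_gap_s, max_gap_s)      # random inter-stimulus interval
        sequence.append({
            "onset_s": round(t, 2),
            "panel": rng.randrange(panels),          # which interface panel lights up
            "cell": (rng.randrange(rows), rng.randrange(cols)),
            "target": rng.random() < 0.8,            # non-targets support complex reaction time
        })
    return sequence

print(generate_test(n_stimuli=3, seed=42))
```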
Psychomotor response is a response involving both the brain and motor activity. Psychomotor response of a user can include measurable parameters such as simple reaction time, peripheral awareness, and depth perception. Simple reaction time can be measured using the user interface screen 6 by measuring the time it takes for a user to recognize a stimulus and interact with it. A neurocognitive test 10 can also measure at least one of peripheral awareness and depth perception of the user responsive to the presence of the user interface screen 6 in the user's peripheral vision. Peripheral awareness can be measured using the user interface screen 6 by the location accuracy of a user's interaction with stimuli at least approximately 60 degrees to the left or right of a user's forward line of sight, in accordance with the conventional definition of "peripheral vision", as previously discussed. Optionally, the apparatus can detect and monitor the user's forward line of sight, in order to facilitate the measurement of peripheral awareness. Depth perception can be measured using the user interface screen 6 by measuring the location accuracy of a user's interaction with stimuli that are a large distance away from the user. As the user interface screen 6 size increases, the distance between the user and the farthest possible stimuli also increases (for a flat, not curved, screen); thus, the ability to measure depth perception increases. The user interface screen 6 and related components of the apparatus therefore allow the neurocognitive test 10 to help measure and directly characterize a user's psychomotor ability responsive to measurements of simple reaction time, peripheral awareness, and depth perception.
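A hedged sketch of classifying a displayed stimulus as "peripheral" under the 60-degree convention discussed above follows; it assumes the user's forward line of sight is centered on a flat screen at a known distance, and the example positions and distances are illustrative only.

```python
import math

def visual_angle_deg(stimulus_x_in, center_x_in, distance_in):
    """Horizontal angle, in degrees, between the forward line of sight (assumed
    to point at `center_x_in`) and a stimulus at `stimulus_x_in` on a flat
    screen `distance_in` inches from the user."""
    return math.degrees(math.atan(abs(stimulus_x_in - center_x_in) / distance_in))

def is_peripheral(stimulus_x_in, center_x_in, distance_in, threshold_deg=60.0):
    return visual_angle_deg(stimulus_x_in, center_x_in, distance_in) >= threshold_deg

# Example: a 72"-wide screen with the user centered 18" away; a stimulus 34" to
# the right of center sits at roughly 62 degrees and counts as peripheral.
print(round(visual_angle_deg(70.0, 36.0, 18.0), 1), is_peripheral(70.0, 36.0, 18.0))
```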
Complex reaction time can be measured using the user interface screen 6 by the time it takes for a user to recognize a stimulus, decide whether to interact with the stimulus, and then interact with the stimulus. Memory can be measured using the user interface screen 6 by the display of a sequence of stimuli and subsequent measurement of the accuracy by which the user repeats that sequence of stimuli through input. If the apparatus uses a video camera to track eye movement by obtaining and recording video input of the user responding physically to the neurocognitive test 10, then the neurocognitive test 10 may indirectly characterize a user's balance ability responsive to measurements of at least one of eye movement, peripheral awareness, and depth perception, because measurements of vision can be correlated with the neurocognitive data type of balance. In addition, other sensors added to the apparatus (e.g., thermometer, pulse oximeter, and/or barometer) may be used to generate environmental or other health data types.
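As an illustration only (the actual scoring algorithms are not specified here), the following sketch scores two of the data types just described: complex reaction time averaged over correct responses to target stimuli, and memory scored as the fraction of a displayed sequence that the user reproduces in order. The trial and sequence formats are assumptions.

```python
def complex_reaction_time_s(trials):
    """trials: dicts with 'target' (bool), 'responded' (bool), 'reaction_s' (float or None)."""
    correct = [t["reaction_s"] for t in trials if t["target"] and t["responded"]]
    return sum(correct) / len(correct) if correct else None

def memory_accuracy(shown, reproduced):
    """Fraction of positions at which the reproduced sequence matches the shown one."""
    matches = sum(1 for a, b in zip(shown, reproduced) if a == b)
    return matches / len(shown) if shown else 0.0

print(complex_reaction_time_s([
    {"target": True,  "responded": True,  "reaction_s": 0.42},
    {"target": False, "responded": False, "reaction_s": None},
    {"target": True,  "responded": True,  "reaction_s": 0.51},
]))                                                   # -> 0.465
print(memory_accuracy([(0, 3), (1, 7), (2, 2)],
                      [(0, 3), (1, 6), (2, 2)]))      # -> 0.666...
```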
In the fifth action block 210 of flow chart 200, the user interface screen 6 sends the input to the computer 12 via the control unit 12. Input may include data collected from input sensors 20 and multiple types of external sensors 52 that record tactile, visual, and/or audio sensations generated by the user or the environment. In the sixth action block 212 of flow chart 200, the computer 12 sends the input to the data collection and aggregation system 14. In the seventh action block 214 of flow chart 200, the data collection and aggregation system 14 analyzes the neurocognitive data derived from the input using algorithms and creates results responsive to that neurocognitive data 18. In the eighth action block 216 of flow chart 200, the data collection and aggregation system 14 sends these results 18 to the computer 12, and the computer 12 displays the results 18 in a user-perceptible format to the user.
The data collection and aggregation system may create results responsive to neurocognitive data by using a method as shown in a flow chart 300 in FIG. 7. In the first action block 302 of flow chart 300, the data collection and aggregation system creates a longitudinal set of neurocognitive data for the user from at least one neurocognitive test 16, each of which may be taken at any time during the user's involvement with the apparatus and method of the present invention. The neurocognitive tests 16 could be taken frequently throughout the year by the user in order to create a baseline and rich history of information regarding neurocognitive function for that user. The portability of the user interface screen 6 also allows for a more accurate and robust longitudinal data set because the neurocognitive tests 16 can be easily taken more often. Since the neurocognitive tests 16 can be administered wherever the user is located, there are many opportunities to run many neurocognitive tests 16, either in the regular course of the activity, or in response to some suspected trauma event.
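For illustration, a longitudinal data set of this kind could be summarized per metric as a baseline of mean and spread across a user's prior tests, as in the sketch below; the metric names are placeholders rather than the disclosure's data types, and the summary statistics are an assumption about one reasonable baseline form.

```python
import statistics

def build_baseline(history):
    """history: list of per-test result dicts, e.g. {'reaction_s': 0.41, 'memory': 0.90}."""
    metrics = {key for test in history for key in test}
    baseline = {}
    for m in metrics:
        values = [test[m] for test in history if m in test]
        baseline[m] = {
            "mean": statistics.mean(values),
            "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
            "n": len(values),
        }
    return baseline

history = [{"reaction_s": 0.41, "memory": 0.90},
           {"reaction_s": 0.44, "memory": 0.85},
           {"reaction_s": 0.39, "memory": 0.95}]
print(build_baseline(history))
```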
In the second action block 304 of flow chart 300, the user creates event data from at least one neurocognitive test 10 administered at any time, and for any reason. For the purpose of the below description, the flow chart 300 presumes that the neurocognitive test 10 which is being compared—as event data—to the longitudinal data set (of at least one previously administered neurocognitive test 16) is administered in conjunction with (e.g., directly after) an event that may cause a change in neurocognitive function (as opposed to just being administered to the user at a random time). The neurocognitive test 10 may, however, be administered at any time, and without regard to whether a change in neurocognitive function is suspected. For example, a "standing" weekly (or other periodically-administered) neurocognitive test 10 could be given to help detect latent neurocognitive damage which does not rise to a level of severity sufficient to come to the attention of a user or observer otherwise.
An event that may cause a change in neurocognitive function could be a suspected concussion, such as the user taking a significant blow to the head. For example, during a football game a user may be involved in a violent head-to-head collision that a medical trainer suspects may have caused a concussion. That user could be taken out of the game, and immediately be administered a neurocognitive test 10, which would then be used as event data in the aforementioned comparison.
Another event that may cause a significant change in neurocognitive function could be the beginning or conclusion of an event where a concussion may occur. The neurocognitive test 10 could be administered at some point in time tied to the timing of that event as a matter of course, even if there is no suspicion that a change in neurocognitive function actually did occur. For example, a user could play an entire game of football where they sustained multiple sub-concussive events, which could even be unsuspected or unnoticed by the team's medical trainers. After the conclusion of the game, if the user takes a neurocognitive test 10 for event data as a matter of post-game protocol, the apparatus could help determine if such micro concussions (or even impacts which do not rise to the level of a concussion) during the course of that game likely caused a significant decrease in neurocognitive function. For the sake of description, the term "a concussion" is used herein to encompass both a single-impact event and a series of sub-concussive events.
Another event that may cause a change in neurocognitive function could be a period of restricted activity. For example, if a user is "benched" from practice/games or otherwise restricted from activity, a neurocognitive test 10 could be used to produce a set of event data, with the event being the non-participation of the user. Use of such event data could help detect a gain of neurocognitive function related to recovery from trauma. It is contemplated that the neurocognitive test 10 for the production of event data could be administered at any time, and for any reason (or no particular reason), as desired by the user and/or coach/staff or other personnel. For example, the neurocognitive test 10 could be administered weekly, whether or not a game or practice has occurred. Because the neurocognitive test 10 is administered in an effort to detect potential neurocognitive function change, regularly scheduled testing (regardless of a user's activity status) should be considered to fall under the category of "an event that may cause a change in neurocognitive function".
The event data produced in, for example, the latest administered neurocognitive test 10 could be compared to any desired one or more—up to all—of the longitudinal data set produced by the at least one neurocognitive test 16 previously administered to that user. Additionally or alternatively, the event data from any chosen one of a series of neurocognitive tests taken by a user could be compared to the data set of information produced by any other one or more neurocognitive tests in the series, for any desired reason.
The portability and quick assembly/disassembly of the user interface screen 6 allows for more accurate and timely event data to be taken immediately after a suspected concussion or the conclusion of an event where a concussion may have occurred, because the user interface screen 6 could be located nearby for ease of prompt access related to any event where a concussion may occur, or where immediate post-game data is desirable.
In the third action block 306 of flow chart 300, the data collection and aggregation system uses algorithms to compare the event data with the longitudinal data set. In the fourth action block 308 of flow chart 300, the data collection and aggregation system determines, responsive to the comparison, whether there was a potential change in neurocognitive function over time. In the fifth action block 310 of flow chart 300, the data collection and aggregation system creates results responsive to the potential change in neurocognitive function of the user. In action blocks 312-316 of the flow chart, the results may take the form of a recommendation for a user activity restriction 312, a recommendation for the user to see a medical professional 314, and/or relevant neurocognitive data statistical information 316. The results 18 may be used by team medical trainers to make decisions like restricting (or choosing not to restrict, or to remove a previous restriction on) user activity and/or recommending that the user see a medical professional. The results 18 may be used by team personnel, the user, medical professionals, or any other party to make decisions including diagnosing potential or actual trauma-related medical issues, restricting a user's activities, providing medical treatment, determining recovery time, and/or any other reason.
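A minimal, hypothetical sketch of action blocks 306-316 follows: event data is compared against the longitudinal baseline metric by metric, and a result with an optional recommendation is produced. The z-score comparison and the two-standard-deviation threshold are illustrative assumptions, not the disclosed algorithms or any clinical criterion.

```python
def compare_to_baseline(event, baseline, threshold=2.0):
    """event: {'metric': value}; baseline: {'metric': {'mean': m, 'stdev': s}} as
    built from the user's longitudinal data set."""
    flagged = {}
    for metric, value in event.items():
        stats = baseline.get(metric)
        if not stats or stats["stdev"] == 0:
            continue                                   # no usable baseline for this metric
        z = (value - stats["mean"]) / stats["stdev"]
        if abs(z) >= threshold:
            flagged[metric] = round(z, 2)
    return {
        "change_detected": bool(flagged),
        "flagged_metrics": flagged,
        "recommendation": ("consider restricting activity and referral to a medical professional"
                           if flagged else "no change in neurocognitive function detected"),
    }

baseline = {"reaction_s": {"mean": 0.41, "stdev": 0.02},
            "memory": {"mean": 0.90, "stdev": 0.04}}
print(compare_to_baseline({"reaction_s": 0.49, "memory": 0.80}, baseline))
```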
While aspects of this disclosure have been particularly shown and described with reference to the example aspects above, it will be understood by those of ordinary skill in the art that various additional aspects may be contemplated. For example, the specific methods described above for using the apparatus are merely illustrative; one of ordinary skill in the art could readily determine any number of tools, sequences of steps, or other means/options for placing the above-described apparatus, or components thereof, into positions substantively similar to those shown and described herein. In an effort to maintain clarity in the Figures, certain ones of duplicative components shown have not been specifically numbered, but one of ordinary skill in the art will realize, based upon the components that were numbered, the element numbers which should be associated with the unnumbered components; no differentiation between similar components is intended or implied solely by the presence or absence of an element number in the Figures. Any of the described structures and components could be integrally formed as a single unitary or monolithic piece or made up of separate sub-components, with either of these formations involving any suitable stock or bespoke components and/or any suitable material or combinations of materials. Any of the described structures and components could be disposable or reusable as desired for a particular use environment. Any component could be provided with a user-perceptible marking to indicate a material, configuration, at least one dimension, or the like pertaining to that component, the user-perceptible marking potentially aiding a user in selecting one component from an array of similar components for a particular use environment. Though certain components described herein are shown as having specific geometric shapes, all structures of this disclosure may have any suitable shapes, sizes, configurations, relative relationships, cross-sectional areas, or any other physical characteristics as desirable for a particular application. Any structures or features described with reference to one aspect or configuration could be provided, singly or in combination with other structures or features, to any other aspect or configuration, as it would be impractical to describe each of the aspects and configurations discussed herein as having all of the options discussed with respect to all of the other aspects and configurations. A device or method incorporating any of these features should be understood to fall under the scope of this disclosure as determined based upon the claims below and any equivalents thereof.
Other aspects, objects, and advantages can be obtained from a study of the drawings, the disclosure, and the appended claims.