WO2018011615A1 - System for precise decoding of movement related human intention from brain signal - Google Patents

System for precise decoding of movement related human intention from brain signal

Info

Publication number
WO2018011615A1
WO2018011615A1 (PCT/IB2016/001185)
Authority
WO
WIPO (PCT)
Prior art keywords
intention
stimulation
imagination
thought
cues
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2016/001185
Other languages
French (fr)
Inventor
Eiichi Yoshida
Yasuharu KOIKE
Ganesh GOWRISHANKAR
Hideyuki Ando
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Centre National de la Recherche Scientifique CNRS
National Institute of Advanced Industrial Science and Technology AIST
Tokyo Institute of Technology NUC
Original Assignee
Centre National de la Recherche Scientifique CNRS
National Institute of Advanced Industrial Science and Technology AIST
Tokyo Institute of Technology NUC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Centre National de la Recherche Scientifique CNRS, National Institute of Advanced Industrial Science and Technology AIST, and Tokyo Institute of Technology NUC
Priority to PCT/IB2016/001185 (WO2018011615A1, en)
Priority to EP16760777.9A (EP3484356A1, en)
Priority to JP2019500670A (JP6884349B2, en)
Publication of WO2018011615A1
Anticipated expiration (legal status: critical)
Current legal status: Ceased


Abstract

An intention, imagination or thought decoding system for humans or animals comprising: a. A brain activity recording or measuring system. b. A sensory stimulation system which applies one or more artificially induced sensory cue/cues and/or stimulation/stimulations related to the intention, imagination or thought to be decoded. c. A computer program or algorithm which analyzes recorded brain activity in the presence of, or immediately after, the sensory cue/cues or stimulation/stimulations to decode the intention, imagination or thought.

Description

SYSTEM FOR PRECISE DECODING OF MOVEMENT RELATED HUMAN
INTENTION FROM BRAIN SIGNAL
The present disclosure relates to a method for precise decoding of movement related human intention from brain signals. Brain machine interfacing (BMI) refers to an artificial connection between an organism's brain and a machine. The interfacing usually involves three steps: 1) Brain imaging/recording: recording the electromagnetic or blood-flow signals during the activation of one's brain; 2) Decoding: understanding what the recorded signals mean; 3) Machine actuation: using that understanding to actuate and control a machine (Fig. 1). For example, BMI can be used to provide an artificial limb to an amputee, in which case one has to record the brain signals (as electroencephalography or EEG, functional magnetic resonance imaging or fMRI, near infrared spectroscopy or NIRS, etc.), decode which movement the human wants to make, and then actuate the artificial limb to make the desired movement.
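To make the three-stage architecture of Fig. 1 concrete, here is a minimal sketch of such a pipeline; the function names, montage and sampling rate are illustrative placeholders of ours, not part of the disclosure.

```python
import numpy as np

def record_brain_activity(duration_s: float) -> np.ndarray:
    """Stage 1 (recording): acquire a window of brain signals,
    e.g. multichannel EEG. Stubbed here with random data."""
    n_channels, fs = 32, 512  # illustrative montage and sampling rate
    return np.random.randn(n_channels, int(duration_s * fs))

def decode_intention(signals: np.ndarray) -> str:
    """Stage 2 (decoding): map the recorded window to an intended
    movement class. A trained classifier would go here."""
    return "left" if signals.mean() < 0 else "right"

def actuate(command: str) -> None:
    """Stage 3 (actuation): drive the machine, e.g. a prosthetic
    limb or a wheelchair."""
    print(f"actuating: turn {command}")

actuate(decode_intention(record_brain_activity(0.1)))
```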
The biggest challenge for BMI is probably the decoding, and specifically understanding from brain signals, in a short period of time, what movement a human wants to make. The best performance of (even two-class) decoders up till now has not exceeded 70% when movements are not made (only imagined). Even when subjects make movements, decoding accuracy has never exceeded 85% without including the signals related to the movement itself (NPL 23). Here we provide a novel (active decoding) methodology that is based on theories of motor neuroscience and that increases this performance radically, giving ~90% correct intention decoding within 100 ms of the subject's intention imagination.
[NPL 1] Ganesh G, Takagi A, Osu R, Yoshioka T, Kawato M, Burdet E. Two is better than one: Physical interactions improve motor performance in humans. Scientific Reports. 2014;4.
[NPL 2] Aglioti SM, Cesari P, Romani M, Urgesi C. Action anticipation and motor resonance in elite basketball players. Nat Neurosci. 2008 Sep;11(9):1109-16.
[NPL 3] Frith CD, Frith U. Interacting minds--a biological basis. Science. 1999 Nov 26;286(5445):1692-5.
[NPL 4] Pickering MJ, Garrod S. An integrated theory of language production and comprehension. Behav Brain Sci. 2013 Aug;36(4):329-47.
[NPL 5] Miall RC, Christensen LOD, Cain O, Stanley J. Disruption of state estimation in the human lateral cerebellum. PLoS Biology. 2007 Nov;5(11):2733-44.
[NPL 6] Wolpert DM, Ghahramani Z, Jordan MI. An internal model for sensorimotor integration. Science. 1995 Sep 29;269(5232):1880-2.
[NPL 7] Desmurget M, Grafton S. Forward modeling allows feedback control for fast reaching movements. Trends Cogn Sci. 2000 Nov 1;4(11):423-31.
[NPL 8] Christensen MS, Lundbye-Jensen J, Geertsen SS, Petersen TH, Paulson OB, Nielsen JB. Premotor cortex modulates somatosensory cortex during voluntary movements without proprioceptive feedback. Nat Neurosci. 2007 Apr;10(4):417-9.
[NPL 9] Duhamel JR, Colby CL, Goldberg ME. The updating of the representation of visual space in parietal cortex by intended eye movements. Science. 1992 Jan 3;255(5040):90-2.
[NPL 10] Sommer MA, Wurtz RH. A pathway in primate brain for internal monitoring of movements. Science. 2002 May 24;296(5572):1480-2.
[NPL 11] Mulliken GH, Musallam S, Andersen RA. Forward estimation of movement state in posterior parietal cortex. Proc Natl Acad Sci U S A. 2008 Jun 17;105(24):8170-7.
[NPL 12] Troyer TW, Doupe AJ. An associational model of birdsong sensorimotor learning I. Efference copy and the learning of song syllables. J Neurophysiol. 2000 Sep;84(3):1204-23.
[NPL 13] Mischiati M, Lin HT, Herold P, Imler E, Olberg R, Leonardo A. Internal models direct dragonfly interception steering. Nature. 2015 Jan 15;517(7534):333-8.
[NPL 14] Shadmehr R, Wise SP. The Computational Neurobiology of Reaching and Pointing. Cambridge, Massachusetts: The MIT Press; 2005.
[NPL 15] Wolpert DM, Kawato M. Multiple paired forward and inverse models for motor control. Neural Netw. 1998;11(7-8):1317-29.
[NPL 16] Christensen MS, Lundbye-Jensen J, Geertsen SS, Petersen TH, Paulson OB, Nielsen JB. Premotor cortex modulates somatosensory cortex during voluntary movements without proprioceptive feedback. Nat Neurosci. 2007;10:417-419. (doi:10.1038/nn1873)
[NPL 17] Wolpert DM, Flanagan JR. Motor prediction. Curr Biol. 2001;11:R729-R732. (doi:10.1016/S0960-9822(01)00432-8)
[NPL 18] Miall RC, King D. State estimation in the cerebellum. Cerebellum. 2008;7:572-576. (doi:10.1007/s12311-008-0072-6)
[NPL 19] Ganesh G, Osu R, Naito E. Feeling the force: returning haptic signals influence effort inference during motor coordination. Sci Rep. 2013;3:2648. (doi:10.1038/srep02648)
[NPL 20] Todorov E, Jordan MI. Optimal feedback control as a theory of motor coordination. Nat Neurosci. 2002;5:1226-1235. (doi:10.1038/nn963)
[NPL 21] Tseng Y-W, Diedrichsen J, Krakauer JW, Shadmehr R, Bastian AJ. Sensory prediction errors drive cerebellum-dependent adaptation of reaching. J Neurophysiol. 2007;98:54-62. (doi:10.1152/jn.00266.2007)
[NPL 22] Yamashita O, Sato M, Yoshioka T, Tong F, Kamitani Y. Sparse estimation automatically selects voxels relevant for the decoding of fMRI activity patterns. NeuroImage. 2008;42(4):1414-29.
[NPL 23] Shakeel A, Navid MS, Anwar MN, Mazhar S, Jochumsen M, Niazi IK. A review of techniques for detection of movement intention using movement-related cortical potentials. Comput Math Methods Med. 2015;2015:346217.
[NPL 24] Tian X, Poeppel D. Mental imagery of speech and movement implicates the dynamics of internal forward models. Front Psychol. 2010;1:166. (doi:10.3389/fpsyg.2010.00166)
[NPL 25] Gentili R, Han CE, Schweighofer N, Papaxanthis C. Motor learning without doing: trial-by-trial improvement in motor performance during mental training. J Neurophysiol. 2010;104(2):774-83. (doi:10.1152/jn.00257.2010)
The key feature of the new technique is that it is an "active" decoding technique: active because the technique uses the brain signal recording in parallel with an artificial stimulation of the sensory system that corresponds to the movement intention to be decoded.
[Fig. 1] Standard BMI architecture.
[Fig. 2] Proposed architecture/methodology: We propose not to decode movement intention directly (like other decoding methods, Fig. 1) but to utilize a sensory stimulator in parallel, and decode whether the intention matches the stimulation.
Neuroscientific motivation and principle
The reason for this active procedure comes from neuroscience. It is well established in motor neuroscience that human movements are critically determined by the brain's ability to estimate and predict the sensory signals of self-generated actions (NPL 1-4). Previous motor studies in humans (NPL 5-8), non-human primates (NPL 9-11), birds (NPL 12), and insects (NPL 13) have shown that estimation of self-generated actions is achieved by forward models, which transform motor commands into a predicted sensory consequence of one's own actions (NPL 14, 15). Recent studies have shown that the same forward model predicts the sensory outputs of observed movements as well. It is believed that the brain calculates the prediction error, the difference between the predictions (from the forward model) and the actual sensory outcomes, to develop perception of self-generated actions (NPL 16-19), to update internal models for online motor control (NPL 20), and for motor learning (NPL 21). Critically for us, forward models are believed to be active not just during action generation but also during action imagination (NPL 24, 25). Here we use this fact to develop a new way of decoding the imagined intention.
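In schematic notation (ours, added only to summarize the preceding sentences; the symbols are not from the original text), the forward model f maps a motor command u(t) to a predicted sensory consequence, and the prediction error e(t) is its mismatch with the actual sensory outcome s(t):

```latex
\hat{s}(t) = f\bigl(u(t)\bigr), \qquad e(t) = s(t) - \hat{s}(t)
```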
We propose to decode from the brain signals not the intention directly, but rather the prediction error, to determine what the imagined intention of a subject is. To promote the development of the prediction error by the brain, we need to actively provide the brain with actual sensory signals. Our idea is therefore to send a particular sensory stimulation when a subject is intending to move, and to decode the error signal (whether or not the subject intends to move in correspondence with the sensory signal we sent) in order to decode his movement intention (Fig. 2).
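As a sketch of this idea for the two-class (left/right) case described later: stimulate one candidate direction, record the response, and classify match versus mismatch. The callables and the 100 ms window are our illustrative assumptions, not a reference implementation from the disclosure.

```python
def active_decode(stimulate, record, match_classifier, candidate="right"):
    """Sketch of the Fig. 2 loop: stimulate one candidate direction,
    record the brain response, and classify the prediction-error
    signature as a match or mismatch with the imagined intention."""
    stimulate(direction=candidate)     # e.g. a sub-threshold GVS pulse
    window = record(duration_s=0.1)    # ~100 ms of post-stimulus EEG
    if match_classifier(window):       # True: intention matches stimulation
        return candidate
    return "left" if candidate == "right" else "right"
```

In the wheelchair experiment described below, stimulate would correspond to the GVS, record to the EEG acquisition, and match_classifier to the sparse logistic regression decoder.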
We believe that this procedure is much more robust and efficient for a few reasons:
1) Intention is a complex phenomenon whose brain activation can differ a lot between subjects. The final movement direction (or the error) is a much lower-dimensional signal and arguably more similar among subjects.
2) The forward model is a critical part of the motor system and the error signal is crucial for many motor operations. Therefore, we believe that the error signal will have a large signature in the brain activity.
3) Our method provides a way to perturb the forward model actively to probe the intention. This means that for the decoding of each intention, the method allows multiple (separate) recordings with different kinds of perturbations. The method hence promises much richer brain data related to movement intention.
Note that the crucial idea is to decode the prediction error, instead of the movement intention, directly from the brain activity. Brain activity may be measured by one or more brain imaging modality/modalities such as electroencephalography (EEG), magnetoencephalography (MEG), near infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI) and/or others.
While we propose this for the decoding of movement intentions, the same procedure can be used for the decoding of imagined movements or thoughts by oneself, or of those observed in others. Prediction error decoding requires stimulation in the sensory modality that is likely to be activated if a real movement is made (or observed) according to the intention, imagination or thought, as this is what the forward model will be strongly predicting. For example, if the goal is to decode the intended hand movement of a human or animal, the stimulation (required for decoding) should correspond to hand movement (for example tendon vibration) such that the human/animal feels his hands are moving (even though they may not be). If the goal is to decode the intention to walk or not, then the stimulation (required for decoding) should be such that the human/animal feels he is walking (even though he may not be). Prediction errors may be primed using trained associations (as in conditional training) of sensory cues, which can be auditory and/or visual and/or tactile and/or olfactory and/or gustatory, to either preceding or subsequent performed or observed movements. Similarly, other artificial electrical, mechanical, or magnetic stimulations may be trained as inputs to the bodily sensory system and associated with a movement intention, imagination or thought. With such training, cues may be used to increase the strength of prediction errors during stimulations; or, instead of the sensory stimulation, the trained cues may be presented to develop prediction errors in the brain and decode the intention, imagination or thought. Thus, depending on the task and training paradigms used, the stimulation or cues need to be presented in parallel with, or immediately subsequent to, the intention, imagination or thought.
Finally, through our experiments, we can show that the decoding works even when the stimulation is very small: decoding is greatly helped even by very small stimulations that are not enough to create a perceptual illusion of movement. In other words, the subject has no awareness of any stimulation or perceptual perturbation.
Preliminary testing/ Validation
As our preliminary setup, we have examined the technique for wheelchair users, and decoded in which direction they wanted the wheelchair to turn. We used a commercial electroencephalography (EEG) system to record the brain signals. As the key sensory feedback of movement direction change in humans comes from the vestibular system, we used a custom-made (by Osaka University) galvanic vestibular stimulator (GVS) that excites the human vestibular system. We, however, used very low GVS stimulations that were not detectable (as movement) by the human subjects. We then used a sparse logistic regression algorithm (NPL 22) that decodes whether (or not) the direction of the GVS signal matches the direction in which the user wants his wheelchair to turn.
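To illustrate this classification step, the sketch below trains an L1-penalized (sparse) logistic regression on synthetic match/mismatch trials. scikit-learn's LogisticRegression is used here as a readily available stand-in for the sparse logistic regression of NPL 22, and all dimensions and data are invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# X: trials x features, e.g. 32 EEG channels x 49 samples (~96 ms at 512 Hz).
# y: 1 if the imagined turn direction matched the GVS direction, else 0.
rng = np.random.default_rng(0)
n_trials, n_features = 200, 32 * 49
X = rng.standard_normal((n_trials, n_features))
y = rng.integers(0, 2, n_trials)
X[y == 1, :10] += 0.8  # plant a weak, sparse class difference

# The L1 penalty drives most weights to zero, automatically selecting
# the few features that carry the match/mismatch (prediction error) signal.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```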
In summary, we ask subjects to imagine their chair turning either left or right, we provide a small GVS stimulation while they imagine, and we can decode what they are imagining from the EEG signals by evaluating whether what they are imagining corresponds to the stimulation (direction) or not. We have tried this technique with 5 participants and achieved ~90% accuracy in intention decoding from 96 ms of EEG data after the start of the GVS.
Extension to online and continuous decoding
So far we have validated the methodology with an offline, discrete procedure in which subjects were asked to imagine themselves turning either right or left. In this particular experiment we stimulated their vestibular system only once, and then later checked whether we could decode their turning intention from the EEG signals.
However, the methodology (and the fact that decoding is done in <100 ms) promises extension to real-time applications where, online during any task (like wheelchair manipulation), movement intentions can be decoded continuously using repeated stimulations separated by 100 ms (which is the time required for the decoding). These stimulations can be random, few in number, periodic, or continuous. They may be aligned with, or triggered by, other physiological, behavioral or environmental variables during the task. Multiple stimulations can in fact increase the performance of the decoding even further.
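A hypothetical online extension could simply repeat the stimulate-record-decode cycle described above; the generator below sketches that scheduling idea, with every callable left as a placeholder to be supplied by the application.

```python
import time

def continuous_decode(stimulate, record, match_classifier, next_candidate):
    """Repeat the active-decoding cycle every ~100 ms (the decoding
    time reported above) so intentions can be tracked continuously,
    e.g. while the user drives a wheelchair."""
    while True:
        candidate = next_candidate()      # stimulation may be random,
        stimulate(direction=candidate)    # periodic, or event-triggered
        window = record(duration_s=0.1)   # ~100 ms of brain data
        yield candidate, match_classifier(window)
        time.sleep(0.1)                   # one decoding cycle per ~100 ms
```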
Applications
1) BMI: Our methodology will greatly increase the ability to decode motor intentions from the brain and will thus be crucial for brain machine interfaces used in prosthetic, entertainment and robotic applications.
2) Neuroscience: Our methodology gives evidence of prediction error signals in the brain and should be useful for neuroscientists to better understand the functioning of the brain, and specifically of the sensory-motor system.
3) Medical diagnosis: The method may be used for medical diagnosis, as decoding in parallel with stimulation may be useful to determine the normal/abnormal functioning of the brain.
The method may thus be useful for decoding the intention, imagination or thought of healthy individuals, the elderly, or individuals with pathologies.

Claims

1. An intention, imagination or thought decoding system for humans or animals comprising: a. A brain activity recording or measuring system;
b. A sensory stimulation system which applies one or more artificially induced sensory cue/cues and/or stimulation/stimulations related to the intention, imagination or thought to be decoded;
c. A computer program or algorithm which analyzes recorded brain activity in the presence of, or immediately after, the sensory cue/cues or stimulation/stimulations to decode the intention, imagination or thought.
2. A brain recording or measuring system according to claim 1, which measures and/or records brain activity using one or more brain imaging modality/modalities such as electroencephalography (EEG), magnetoencephalography (MEG), near infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI) and/or others.
3. A decoding system according to claim 1, where the cues are auditory and/or visual and/or tactile and/or olfactory and/or gustatory inputs to an individual, which that individual may or may not consciously perceive.
4. A decoding system according to claim 1, where the stimulations are electrical, mechanical, or magnetic inputs to the bodily sensory system, which an individual may or may not consciously perceive.
5. A cue or cues according to claim 3, on which the user whose intention, imagination or thought is to be decoded may have been previously trained, such that he forms an association between the cue/cues and the intention, imagination or thought.
6. A stimulation or stimulations according to claim 4, on which the user whose intention, imagination or thought is to be decoded may have been previously trained, such that he forms an association between the stimulation/stimulations and the intention, imagination or thought.
7. A stimulation system according to claim 2, that applies the stimulation/stimulations or cue/cues once or more, randomly, periodically, or aligned with, or triggered by, other physiological, behavioral or environmental variables.
8. A stimulation system according to claim 2, in which the applied stimulation/stimulations and/or cue/cues is/are triggered manually.
9. A stimulation system according to claim 2, that applies stimulation/stimulations and/or cue/cues either before, during or after an individual's intention, imagination or thought.
10. A decoding system according to claim 1, where the decoding may be done once or multiple times.
11. A computer program or algorithm according to claim 1, which decodes the differences between the intention, imagination or thought and the applied cues or stimulation.
12. A decoding system according to claim 1 that is used with healthy, elderly individuals, or individuals with pathologies to decode their intention, imagination or thought.
PCT/IB2016/001185 | Priority/filing date: 2016-07-13 | System for precise decoding of movement related human intention from brain signal | Ceased | WO2018011615A1 (en)

Priority Applications (3)

Application Number | Priority Date | Filing Date | Title
PCT/IB2016/001185 (WO2018011615A1, en) | 2016-07-13 | 2016-07-13 | System for precise decoding of movement related human intention from brain signal
EP16760777.9A (EP3484356A1, en) | 2016-07-13 | 2016-07-13 | System for precise decoding of movement related human intention from brain signal
JP2019500670A (JP6884349B2, en) | 2016-07-13 | 2016-07-13 | A new way to accurately decode human intentions about movement from brain signals

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
PCT/IB2016/001185 (WO2018011615A1, en) | 2016-07-13 | 2016-07-13 | System for precise decoding of movement related human intention from brain signal

Publications (1)

Publication Number | Publication Date
WO2018011615A1 (en) | 2018-01-18

Family

ID=56877070

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
PCT/IB2016/001185 (WO2018011615A1, en; Ceased) | System for precise decoding of movement related human intention from brain signal | 2016-07-13 | 2016-07-13

Country Status (3)

Country | Document
EP | EP3484356A1 (en)
JP | JP6884349B2 (en)
WO | WO2018011615A1 (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2011140303A1 (en)* | 2010-05-05 | 2011-11-10 | University of Maryland, College Park | Time domain-based methods for noninvasive brain-machine interfaces
WO2015058223A1 (en)* | 2013-10-21 | 2015-04-30 | G.Tec Medical Engineering GmbH | Method for quantifying the perceptive faculty of a person


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN109078262A (en)* | 2018-08-15 | 2018-12-25 | 北京机械设备研究所 | MI-BCI training method based on peripheral nerve electrical stimulation
CN109078262B (en)* | 2018-08-15 | 2022-11-01 | 北京机械设备研究所 | MI-BCI training method based on peripheral nerve electrical stimulation
CN109359403A (en)* | 2018-10-29 | 2019-02-19 | 上海市同济医院 | Early diagnosis model for schizophrenia based on facial expression recognition magnetic resonance imaging, and its application
KR20200052209A (en)* | 2018-11-06 | 2020-05-14 | 고려대학교 산학협력단 | Apparatus and method for controlling wearable robot by detecting motion intention of users based on brain machine interface
KR102276991B1 (en)* | 2018-11-06 | 2021-07-13 | 고려대학교 산학협력단 | Apparatus and method for controlling wearable robot by detecting motion intention of users based on brain machine interface
US12118718B2 | 2021-01-27 | 2024-10-15 | Canon Medical Systems Corporation | Medical information processing apparatus, medical information processing method, and non-transitory computer-readable medium

Also Published As

Publication number | Publication date
EP3484356A1 (en) | 2019-05-22
JP6884349B2 (en) | 2021-06-09
JP2019527416A (en) | 2019-09-26

Similar Documents

Publication | Title
Whitlock | Posterior parietal cortex
Mikołajewska et al. | Ethical considerations in the use of brain-computer interfaces
Guerrero et al. | Using "human state aware" robots to enhance physical human-robot interaction in a cooperative scenario
EP3484356A1 | System for precise decoding of movement related human intention from brain signal
Déli et al. | The thermodynamic brain and the evolution of intellect: the role of mental energy
Rolls | Brain mechanisms of emotion and decision-making
Mottura et al. | A virtual reality system for strengthening awareness and participation in rehabilitation for post-stroke patients
Trocellier et al. | Identifying factors influencing the outcome of BCI-based post-stroke motor rehabilitation towards its personalization with artificial intelligence
Taylor et al. | Feasibility of NeuCube SNN architecture for detecting motor execution and motor intention for use in BCI applications
Kilteni | Methods of somatosensory attenuation
Anema et al. | Motor and kinesthetic imagery
Cristaldi et al. | Predictive processing and embodiment in emotion
Haroon et al. | Mental fatigue classification aided by a machine-learning-driven model under the influence of foot and auditory binaural beats brain massage via fNIRS
Acolin | Towards a clinical theory of embodiment: A model for the conceptualization and treatment of mental illness
JP2019527416A5 (en) |
Tucciarelli et al. | Shaping the developing homunculus: the roles of deprivation and compensatory behaviour in sensory remapping
Lamti et al. | Emotion detection for wheelchair navigation enhancement
Ab Aziz et al. | Designing a robot-assisted therapy for individuals with anxiety traits and states
Thilakarathne et al. | Computational cognitive modelling of action awareness: prior and retrospective
Mazurek et al. | Highlights from the 28th Annual Meeting of the Society for the Neural Control of Movement
Grünbaum et al. | Sensation of movement
Lai et al. | Current practical applications of electroencephalography (EEG)
Love et al. | Highlights from the 32nd Annual Meeting of the Society for the Neural Control of Movement
Hassija et al. | The Skills Training in Affective and Interpersonal Regulation (STAIR) narrative model: A treatment approach to promote resilience
Giakoumis et al. | Service robot behaviour adaptation based on user mood, towards better personalized support of MCI patients at home

Legal Events

Code: 121 (Ep: the EPO has been informed by WIPO that EP was designated in this application)
  Ref document number: 16760777; Country of ref document: EP; Kind code of ref document: A1
Code: ENP (Entry into the national phase)
  Ref document number: 2019500670; Country of ref document: JP; Kind code of ref document: A
Code: NENP (Non-entry into the national phase)
  Ref country code: DE
Code: ENP (Entry into the national phase)
  Ref document number: 2016760777; Country of ref document: EP; Effective date: 2019-02-13

