US20110040155A1 - Multiple sensory channel approach for translating human emotions in a computing environment - Google Patents

Multiple sensory channel approach for translating human emotions in a computing environment

Info

Publication number
US20110040155A1
Authority
US
United States
Prior art keywords
emotion
sensory
user
input
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/540,735
Inventor
Barbara S. Guzak
Hung-Tack Kwan
Janki Y. Vora
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp
Priority to US12/540,735
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. Assignment of assignors interest (see document for details). Assignors: GUZAK, BARBARA S.; KWAN, HUNG-TACK; VORA, JANKI Y.
Publication of US20110040155A1
Priority to US14/071,148
Legal status: Abandoned (current)

Abstract

Sensory inputs of a user can be received by a computing device. At least one of the sensory inputs can include a physiological input providing a physiological measurement from a body of the user. Each sensory input can be processed in a unique one of a set of standards-defined sensory channels, each corresponding to a specific emotion dimension. Processing the sensory inputs can transform the physiological measurement into an emotion dimension value. The emotion dimension values from each of the sensory channels can be aggregated to generate at least one emotion datum value, which is a standards-defined value for an emotional characteristic of the user. Historical data for a user can be optionally collected and used by a learning and calibration component to improve the accuracy of the generated emotion datum values for a specific individual. A programmatic action driven by the emotion datum value can be performed.
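
For illustration only, the following Python sketch shows one way the pipeline summarized above could be wired together. The names (SensoryChannel, aggregate, run_pipeline), the min-max normalization, and the simple averaging are assumptions for the sketch, not details taken from the patent.

# Illustrative sketch (not the patent's implementation): each sensory channel maps a
# raw physiological reading onto a normalized emotion dimension value, and an
# aggregator combines the per-channel values into a single emotion datum value
# that drives a programmatic action.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class SensoryChannel:                      # hypothetical name
    dimension: str                         # e.g. "arousal", "valence"
    low: float                             # expected sensor minimum (device-specific)
    high: float                            # expected sensor maximum (device-specific)

    def to_dimension_value(self, raw: float) -> float:
        """Abstract the raw reading away from sensor idiosyncrasies: clamp and
        rescale it to a 0.0-1.0 emotion dimension value."""
        clamped = min(max(raw, self.low), self.high)
        return (clamped - self.low) / (self.high - self.low)

def aggregate(dimension_values: Dict[str, float]) -> float:
    """Combine per-channel emotion dimension values into one emotion datum value
    (a plain average here; the patent leaves the aggregation policy open)."""
    return sum(dimension_values.values()) / len(dimension_values)

def run_pipeline(readings: Dict[str, float],
                 channels: Dict[str, SensoryChannel],
                 action: Callable[[float], None]) -> float:
    dims = {name: channels[name].to_dimension_value(raw)
            for name, raw in readings.items()}
    datum = aggregate(dims)
    action(datum)                          # programmatic action driven by the datum value
    return datum

if __name__ == "__main__":
    channels = {
        "heart_rate": SensoryChannel("arousal", low=50.0, high=180.0),
        "skin_conductance": SensoryChannel("stress", low=1.0, high=20.0),
    }
    run_pipeline({"heart_rate": 96.0, "skin_conductance": 7.5},
                 channels,
                 action=lambda d: print(f"emotion datum value: {d:.2f}"))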

Description

Claims (20)

1. A method for discerning human emotions in a computing environment comprising:
receiving a plurality of sensory inputs of a user, wherein at least one of the sensory inputs comprises a physiological input providing a physiological measurement from a body of the user, said physiological measurement being obtained using a physiological sensor;
processing each sensory input in a unique one of a plurality of standards-defined sensory channels, each standards-defined sensory channel corresponding to a specific emotion dimension, wherein said processing comprises transforming the physiological measurement into an emotion dimension value, said emotion dimension value abstracting the physiological measurement from specifics attributable to unique characteristics of the physiological sensor;
aggregating the emotion dimension values from each of the sensory channels to generate at least one emotion datum value, which is a standards-defined value for an emotional characteristic of said user, wherein said emotion datum value is a value independent of any of said sensory devices and is a value independent of any single one of said standards-defined sensory channels; and
performing a programmatic action driven by the emotion datum value.
7. The method of claim 1, wherein said user participates in an application session of an application which performs said programmatic action driven by the emotion datum value, wherein during the application session, the plurality of sensory inputs from the user are continuously received, processed, and aggregated to continuously generate different emotion datum values, which drive programmatic actions of the application that are based upon emotional variances of the user during the application session, wherein the continuous receiving, processing, aggregating, and performing of the programmatic actions occur in at least one of real-time and near real-time, wherein the application session is a communication session in which said user communicates in real-time or near real-time with at least one other user, wherein during said communication session said other user is continuously apprised of emotional changes of said user determined from the continuously generated emotion datum values.
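
As a hedged sketch of the continuous behavior described in claim 7, the loop below samples sensory readings throughout a communication session, reduces each batch to an emotion datum value, and notifies the other participant when that value shifts appreciably. The names (monitor_session, notify_peer) and the 0.15 change threshold are illustrative assumptions; the threshold is just a simple debounce so the peer is only told about meaningful changes.

# Assumed sketch of claim 7's continuous, near-real-time session monitoring.
import time
from typing import Callable, Iterable

def monitor_session(samples: Iterable[dict],
                    to_datum: Callable[[dict], float],
                    notify_peer: Callable[[float], None],
                    threshold: float = 0.15,
                    poll_seconds: float = 0.5) -> None:
    last = None
    for reading in samples:                       # one batch of sensory inputs per tick
        datum = to_datum(reading)                 # receive -> process -> aggregate
        if last is None or abs(datum - last) >= threshold:
            notify_peer(datum)                    # keep the other user apprised of changes
            last = datum
        time.sleep(poll_seconds)                  # near-real-time pacing
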
10. The method of claim 1, further comprising:
collecting channel specific historical data for said user on a sensory channel specific basis for each of said plurality of standards-defined sensory channels;
analyzing said channel specific historical data for each sensory channel on an iterative basis;
adjusting parameters used to generate said emotion dimension values in accordance with results of said analyzing of the channel specific historical data to improve an accuracy of the generated emotion dimension values over time; and
collecting aggregation specific historical data for said user comprising emotion dimension values and emotion datum values generated from said emotion dimension values;
analyzing said aggregation specific historical data on an iterative basis; and
adjusting parameters used to generate said emotion datum values in accordance with results of said analyzing of the aggregation specific historical data to improve an accuracy of the generated emotion datum values over time.
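
One hedged way to realize the per-channel learning and calibration of claim 10 is sketched below. The patent only states that historical data is collected, analyzed iteratively, and used to adjust parameters; the bounded history and running mean/standard-deviation normalization here, and the ChannelCalibrator name, are assumptions.

# Assumed sketch of per-channel calibration against a user's own history.
import statistics

class ChannelCalibrator:                       # hypothetical name
    def __init__(self, window: int = 200):
        self.window = window
        self.history: list[float] = []         # channel-specific historical data

    def record(self, dimension_value: float) -> None:
        self.history.append(dimension_value)
        self.history = self.history[-self.window:]    # keep a bounded history

    def adjust(self, dimension_value: float) -> float:
        """Re-center the new value against this user's history so that the
        generated emotion dimension values become more accurate over time."""
        if len(self.history) < 2:
            return dimension_value
        mean = statistics.fmean(self.history)
        spread = statistics.pstdev(self.history) or 1.0
        return (dimension_value - mean) / spread
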
13. A computer program product for discerning human emotions in a computing environment comprising:
a computer usable storage medium having computer readable program code embodied therewith, the computer readable program code comprising:
computer readable program code configured to receive a plurality of sensory inputs of a user, wherein at least one of the sensory inputs comprises a physiological input providing a physiological measurement from a body of the user, said physiological measurement being obtained using a physiological sensor;
computer readable program code configured to process each sensory input in a unique one of a plurality of standards-defined sensory channels, each standards-defined sensory channel corresponding to a specific emotion dimension, wherein said processing comprises transforming the physiological measurement into an emotion dimension value, said emotion dimension value abstracting the physiological measurement from specifics attributable to unique characteristics of the physiological sensor;
14. A system for incorporating human emotions in a computing environment comprising:
a plurality of discrete sensory channels for handling sensory input, wherein each of the discrete sensory channels is a standards-defined sensory channel corresponding to a specific emotion dimension, wherein sensory input handled within the sensory channels comprises physiological input providing a physiological measurement from a body of the user;
a plurality of in-channel processors that process sensory input specific to the channel and that generate emotion dimension values from the sensory input, wherein each emotion dimension value has been transformed to be independent of idiosyncrasies of a sensory capture device from which the sensory input was originally obtained; and
a sensory aggregator for aggregating emotion dimension values generated in a per-channel basis by the in-channel processors to generate at least one emotion datum value, which is a standards-defined value for an emotional characteristic of a user from whom the sensory input was gathered, wherein said emotion datum value is a value independent of any single one of said standards-defined sensory channels, and is an application independent value that is able to be utilized by a plurality of independent applications to discern emotions of said user and to cause application specific code of the independent applications to be reactive to changes in sensory aggregator generated emotion datum values.
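
The decoupling described in claim 14, in which aggregator-generated emotion datum values are consumed by multiple independent applications, could look roughly like the observer-style sketch below. The SensoryAggregator name, the averaging rule, and the callback mechanism are assumptions rather than details from the patent.

# Assumed sketch: the aggregator publishes device- and channel-independent
# emotion datum values, and independent applications register to react to them.
from typing import Callable, Dict, List

class SensoryAggregator:                           # hypothetical name
    def __init__(self):
        self._subscribers: List[Callable[[float], None]] = []

    def subscribe(self, on_datum: Callable[[float], None]) -> None:
        """Register application-specific code to be reactive to new datum values."""
        self._subscribers.append(on_datum)

    def publish(self, dimension_values: Dict[str, float]) -> None:
        """Aggregate per-channel emotion dimension values and fan the resulting
        emotion datum value out to every registered application."""
        datum = sum(dimension_values.values()) / len(dimension_values)
        for on_datum in self._subscribers:
            on_datum(datum)

# Example: two unrelated applications consuming the same standards-defined value.
aggregator = SensoryAggregator()
aggregator.subscribe(lambda d: print(f"chat client: show mood indicator {d:.2f}"))
aggregator.subscribe(lambda d: print(f"game engine: adapt difficulty for {d:.2f}"))
aggregator.publish({"arousal": 0.8, "valence": 0.4})

Because subscribers receive only the standards-defined emotion datum value, none of the applications needs to know which sensors or channels produced it.
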
US12/540,735 | 2009-08-13 | 2009-08-13 | Multiple sensory channel approach for translating human emotions in a computing environment | Abandoned | US20110040155A1 (en)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
US12/540,735 | US20110040155A1 (en) | 2009-08-13 | 2009-08-13 | Multiple sensory channel approach for translating human emotions in a computing environment
US14/071,148 | US9329758B2 (en) | 2009-08-13 | 2013-11-04 | Multiple sensory channel approach for translating human emotions in a computing environment

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
US12/540,735 | US20110040155A1 (en) | 2009-08-13 | 2009-08-13 | Multiple sensory channel approach for translating human emotions in a computing environment

Related Child Applications (1)

Application Number | Title | Priority Date | Filing Date
US14/071,148 | Continuation | US9329758B2 (en) | 2009-08-13 | 2013-11-04 | Multiple sensory channel approach for translating human emotions in a computing environment

Publications (1)

Publication Number | Publication Date
US20110040155A1 (en) | 2011-02-17

Family

ID=43588986

Family Applications (2)

Application Number | Title | Priority Date | Filing Date
US12/540,735 | Abandoned | US20110040155A1 (en) | 2009-08-13 | 2009-08-13 | Multiple sensory channel approach for translating human emotions in a computing environment
US14/071,148 | Expired - Fee Related | US9329758B2 (en) | 2009-08-13 | 2013-11-04 | Multiple sensory channel approach for translating human emotions in a computing environment

Family Applications After (1)

Application Number | Title | Priority Date | Filing Date
US14/071,148 | Expired - Fee Related | US9329758B2 (en) | 2009-08-13 | 2013-11-04 | Multiple sensory channel approach for translating human emotions in a computing environment

Country Status (1)

Country | Link
US (2) | US20110040155A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US10061808B2 (en) * | 2014-06-03 | 2018-08-28 | Sap Se | Cached views
US10614432B2 (en) * | 2016-01-29 | 2020-04-07 | Ncr Corporation | Channel integration processing
US20180325390A1 (en) * | 2016-02-10 | 2018-11-15 | Hewlett-Packard Development Company, Lp. | Biometric data analysis
US20170364929A1 (en) * | 2016-06-17 | 2017-12-21 | Sanjiv Ferreira | Method and system for identifying, aggregating & transforming emotional states of a user using a temporal phase topology framework
US10255733B2 (en) | 2017-08-21 | 2019-04-09 | At&T Intellectual Property I, L.P. | Network controlled physical access restriction based upon physiological state
US11723579B2 (en) | 2017-09-19 | 2023-08-15 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement
US11717686B2 (en) | 2017-12-04 | 2023-08-08 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to facilitate learning and performance
US11273283B2 (en) | 2017-12-31 | 2022-03-15 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to enhance emotional response
US12280219B2 (en) | 2017-12-31 | 2025-04-22 | NeuroLight, Inc. | Method and apparatus for neuroenhancement to enhance emotional response
US11364361B2 (en) | 2018-04-20 | 2022-06-21 | Neuroenhancement Lab, LLC | System and method for inducing sleep by transplanting mental states
EP3849410A4 (en) | 2018-09-14 | 2022-11-02 | Neuroenhancement Lab, LLC | SLEEP ENHANCEMENT SYSTEM AND METHOD
US11386474B2 (en) * | 2018-10-09 | 2022-07-12 | Rovi Guides, Inc. | System and method for generating a product recommendation in a virtual try-on session
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
DE102004001801A1 (en) | 2004-01-05 | 2005-07-28 | Deutsche Telekom Ag | System and process for the dialog between man and machine considers human emotion for its automatic answers or reaction

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20060036751A1 (en) * | 2004-04-08 | 2006-02-16 | International Business Machines Corporation | Method and apparatus for governing the transfer of physiological and emotional user data
US20050261032A1 (en) * | 2004-04-23 | 2005-11-24 | Jeong-Wook Seo | Device and method for displaying a status of a portable terminal by using a character image
US20080215617A1 (en) * | 2006-01-10 | 2008-09-04 | Cecchi Guillermo Alberto | Method for using psychological states to index databases
US20090164403A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for indicating behavior in a population cohort

Cited By (106)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US10137368B2 (en) * | 2008-11-06 | 2018-11-27 | At&T Intellectual Property I, L.P. | Billing a subset of mobile devices associated with an online gaming environment rendered on the subset of mobile devices
US10691726B2 (en) * | 2009-02-11 | 2020-06-23 | Jeffrey A. Rapaport | Methods using social topical adaptive networking system
US20140236953A1 (en) * | 2009-02-11 | 2014-08-21 | Jeffrey A. Rapaport | Methods using social topical adaptive networking system
US11704681B2 (en) | 2009-03-24 | 2023-07-18 | Nielsen Consumer Llc | Neurological profiles for market matching and stimulus presentation
US8929616B2 (en) * | 2009-08-13 | 2015-01-06 | Sensory Logic, Inc. | Facial coding for emotional interaction analysis
US20130094722A1 (en) * | 2009-08-13 | 2013-04-18 | Sensory Logic, Inc. | Facial coding for emotional interaction analysis
US10987015B2 (en) | 2009-08-24 | 2021-04-27 | Nielsen Consumer Llc | Dry electrodes for electroencephalography
US20140247989A1 (en) * | 2009-09-30 | 2014-09-04 | F. Scott Deaver | Monitoring the emotional state of a computer user by analyzing screen capture images
US11669858B2 (en) | 2009-10-29 | 2023-06-06 | Nielsen Consumer Llc | Analysis of controlled and automatic attention for introduction of stimulus material
US11170400B2 (en) | 2009-10-29 | 2021-11-09 | Nielsen Consumer Llc | Analysis of controlled and automatic attention for introduction of stimulus material
US11481788B2 (en) | 2009-10-29 | 2022-10-25 | Nielsen Consumer Llc | Generating ratings predictions using neuro-response data
US10068248B2 (en) | 2009-10-29 | 2018-09-04 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material
US10269036B2 (en) | 2009-10-29 | 2019-04-23 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material
US9560984B2 (en) | 2009-10-29 | 2017-02-07 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material
US20110144452A1 (en) * | 2009-12-10 | 2011-06-16 | Hyun-Soon Shin | Apparatus and method for determining emotional quotient according to emotion variation
US8700009B2 (en) | 2010-06-02 | 2014-04-15 | Q-Tec Systems Llc | Method and apparatus for monitoring emotion in an interactive network
US10398366B2 (en) | 2010-07-01 | 2019-09-03 | Nokia Technologies Oy | Responding to changes in emotional condition of a user
US20120011477A1 (en) * | 2010-07-12 | 2012-01-12 | Nokia Corporation | User interfaces
US11816743B1 (en) | 2010-08-10 | 2023-11-14 | Jeffrey Alan Rapaport | Information enhancing method using software agents in a social networking system
US20120209793A1 (en) * | 2010-08-19 | 2012-08-16 | Henry Minard Morris, JR. | Ideation Search Engine
US11539657B2 (en) | 2011-05-12 | 2022-12-27 | Jeffrey Alan Rapaport | Contextually-based automatic grouped content recommendations to users of a social networking system
US11805091B1 (en) | 2011-05-12 | 2023-10-31 | Jeffrey Alan Rapaport | Social topical context adaptive network hosted system
US10142276B2 (en) | 2011-05-12 | 2018-11-27 | Jeffrey Alan Rapaport | Contextually-based automatic service offerings to users of machine system
US9208465B2 (en) | 2011-12-01 | 2015-12-08 | Xerox Corporation | System and method for enhancing call center performance
WO2013091677A1 (en) | 2011-12-20 | 2013-06-27 | Squarehead Technology As | Speech recognition method and system
US9064243B2 (en) * | 2012-02-16 | 2015-06-23 | Blackberry Limited | System and method for communicating presence status
US20130217350A1 (en) * | 2012-02-16 | 2013-08-22 | Research In Motion Corporation | System and method for communicating presence status
US10881348B2 (en) | 2012-02-27 | 2021-01-05 | The Nielsen Company (Us), Llc | System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
US20130280682A1 (en) * | 2012-02-27 | 2013-10-24 | Innerscope Research, Inc. | System and Method For Gathering And Analyzing Biometric User Feedback For Use In Social Media And Advertising Applications
US9569986B2 (en) * | 2012-02-27 | 2017-02-14 | The Nielsen Company (Us), Llc | System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications
US20140091897A1 (en) * | 2012-04-10 | 2014-04-03 | Net Power And Light, Inc. | Method and system for measuring emotional engagement in a computer-facilitated event
US9519640B2 (en) * | 2012-05-04 | 2016-12-13 | Microsoft Technology Licensing, Llc | Intelligent translations in personal see through display
US20130293577A1 (en) * | 2012-05-04 | 2013-11-07 | Kathryn Stone Perez | Intelligent translations in personal see through display
US9230022B1 (en) * | 2012-08-23 | 2016-01-05 | Amazon Technologies, Inc. | Customizable result sets for application program interfaces
US20140067397A1 (en) * | 2012-08-29 | 2014-03-06 | Nuance Communications, Inc. | Using emoticons for contextual text-to-speech expressivity
US9767789B2 (en) * | 2012-08-29 | 2017-09-19 | Nuance Communications, Inc. | Using emoticons for contextual text-to-speech expressivity
US9508008B2 (en) | 2012-10-31 | 2016-11-29 | Microsoft Technology Licensing, Llc | Wearable emotion detection and feedback system
US9019174B2 (en) | 2012-10-31 | 2015-04-28 | Microsoft Technology Licensing, Llc | Wearable emotion detection and feedback system
US9824698B2 (en) | 2012-10-31 | 2017-11-21 | Microsoft Technologies Licensing, LLC | Wearable emotion detection and feedback system
US8917854B2 (en) | 2013-01-08 | 2014-12-23 | Xerox Corporation | System to support contextualized definitions of competitions in call centers
US10104509B2 (en) * | 2013-02-19 | 2018-10-16 | Angel Sense Ltd | Method and system for identifying exceptions of people behavior
US20160007165A1 (en) * | 2013-02-19 | 2016-01-07 | Angel Sense Ltd. | Method and system for identifying exceptions of people behavior
US9141604B2 (en) * | 2013-02-22 | 2015-09-22 | Riaex Inc | Human emotion assessment reporting technology—system and method
US9996155B2 (en) | 2013-06-03 | 2018-06-12 | Daqri, Llc | Manipulation of virtual object in augmented reality via thought
US20140354532A1 (en) * | 2013-06-03 | 2014-12-04 | Daqri, Llc | Manipulation of virtual object in augmented reality via intent
US9996983B2 (en) | 2013-06-03 | 2018-06-12 | Daqri, Llc | Manipulation of virtual object in augmented reality via intent
US9383819B2 (en) * | 2013-06-03 | 2016-07-05 | Daqri, Llc | Manipulation of virtual object in augmented reality via intent
US9594807B2 (en) | 2013-08-06 | 2017-03-14 | Intel Corporation | Emotion-related query processing
WO2015020638A1 (en) * | 2013-08-06 | 2015-02-12 | Intel Corporation | Emotion-related query processing
US20150254563A1 (en) * | 2014-03-07 | 2015-09-10 | International Business Machines Corporation | Detecting emotional stressors in networks
US9600743B2 (en) | 2014-06-27 | 2017-03-21 | International Business Machines Corporation | Directing field of vision based on personal interests
US9892648B2 (en) | 2014-06-27 | 2018-02-13 | International Business Machine Corporation | Directing field of vision based on personal interests
US9471837B2 (en) | 2014-08-19 | 2016-10-18 | International Business Machines Corporation | Real-time analytics to identify visual objects of interest
US10120413B2 (en) * | 2014-09-11 | 2018-11-06 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data
US20160077547A1 (en) * | 2014-09-11 | 2016-03-17 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data
US11815951B2 (en) | 2014-09-11 | 2023-11-14 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data
US12253882B2 (en) | 2014-09-11 | 2025-03-18 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data
US11287848B2 (en) | 2014-09-11 | 2022-03-29 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data
US10768665B2 (en) | 2014-09-11 | 2020-09-08 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data
US20160180277A1 (en) * | 2014-12-17 | 2016-06-23 | Avaya Inc. | Automated responses to projected contact center agent fatigue and burnout
US20210097631A1 (en) * | 2015-03-30 | 2021-04-01 | Twiin, LLC | Systems and methods of generating consciousness affects
US11900481B2 (en) * | 2015-03-30 | 2024-02-13 | Twiin, LLC | Systems and methods of generating consciousness affects
US11290779B2 (en) | 2015-05-19 | 2022-03-29 | Nielsen Consumer Llc | Methods and apparatus to adjust content presented to an individual
US10771844B2 (en) | 2015-05-19 | 2020-09-08 | The Nielsen Company (Us), Llc | Methods and apparatus to adjust content presented to an individual
US9936250B2 (en) | 2015-05-19 | 2018-04-03 | The Nielsen Company (Us), Llc | Methods and apparatus to adjust content presented to an individual
US20170103360A1 (en) * | 2015-10-13 | 2017-04-13 | Genesys Telecommunications Laboratories, Inc. | System and method for intelligent task management and routing based on physiological sensor input data
US10607167B2 (en) * | 2015-10-13 | 2020-03-31 | Genesys Telecommunications Laboratories, Inc. | System and method for intelligent task management and routing based on physiological sensor input data
US10176161B2 (en) * | 2016-01-28 | 2019-01-08 | International Business Machines Corporation | Detection of emotional indications in information artefacts
US11736912B2 (en) | 2016-06-30 | 2023-08-22 | The Notebook, Llc | Electronic notebook system
US11228875B2 (en) | 2016-06-30 | 2022-01-18 | The Notebook, Llc | Electronic notebook system
US10484845B2 (en) | 2016-06-30 | 2019-11-19 | Karen Elaine Khaleghi | Electronic notebook system
US12167304B2 (en) | 2016-06-30 | 2024-12-10 | The Notebook, Llc | Electronic notebook system
US12150017B2 (en) | 2016-06-30 | 2024-11-19 | The Notebook, Llc | Electronic notebook system
US20230367448A1 (en) * | 2016-09-20 | 2023-11-16 | Twiin, Inc. | Systems and methods of generating consciousness affects using one or more non-biological inputs
US11755172B2 (en) * | 2016-09-20 | 2023-09-12 | Twiin, Inc. | Systems and methods of generating consciousness affects using one or more non-biological inputs
US20220230374A1 (en) * | 2016-11-09 | 2022-07-21 | Microsoft Technology Licensing, Llc | User interface for generating expressive content
US11321890B2 (en) * | 2016-11-09 | 2022-05-03 | Microsoft Technology Licensing, Llc | User interface for generating expressive content
US20180150130A1 (en) * | 2016-11-30 | 2018-05-31 | At&T Intellectual Property I, L.P. | Methods, and devices for generating a user experience based on the stored user information
US11086391B2 (en) * | 2016-11-30 | 2021-08-10 | At&T Intellectual Property I, L.P. | Methods, and devices for generating a user experience based on the stored user information
US11449136B2 (en) | 2016-11-30 | 2022-09-20 | At&T Intellectual Property I, L.P. | Methods, and devices for generating a user experience based on the stored user information
US10159435B1 (en) * | 2017-09-29 | 2018-12-25 | Novelic D.O.O. | Emotion sensor system
US10225621B1 (en) | 2017-12-20 | 2019-03-05 | Dish Network L.L.C. | Eyes free entertainment
US10645464B2 (en) | 2017-12-20 | 2020-05-05 | Dish Network L.L.C. | Eyes free entertainment
US12367532B2 (en) * | 2018-02-20 | 2025-07-22 | Fluence Bioengineering, Inc. | Controlled agricultural systems and methods of managing agricultural systems
US20200134741A1 (en) * | 2018-02-20 | 2020-04-30 | Osram Gmbh | Controlled Agricultural Systems and Methods of Managing Agricultural Systems
US11386896B2 (en) | 2018-02-28 | 2022-07-12 | The Notebook, Llc | Health monitoring system and appliance
US10235998B1 (en) * | 2018-02-28 | 2019-03-19 | Karen Elaine Khaleghi | Health monitoring system and appliance
US20190267003A1 (en) * | 2018-02-28 | 2019-08-29 | Karen Elaine Khaleghi | Health monitoring system and appliance
US11881221B2 (en) | 2018-02-28 | 2024-01-23 | The Notebook, Llc | Health monitoring system and appliance
US10573314B2 (en) * | 2018-02-28 | 2020-02-25 | Karen Elaine Khaleghi | Health monitoring system and appliance
US10492735B2 (en) * | 2018-04-27 | 2019-12-03 | Microsoft Technology Licensing, Llc | Intelligent warning system
CN110555204A (en) * | 2018-05-31 | 2019-12-10 | 北京京东尚科信息技术有限公司 | emotion judgment method and device
US11482221B2 (en) | 2019-02-13 | 2022-10-25 | The Notebook, Llc | Impaired operator detection and interlock apparatus
US12046238B2 (en) | 2019-02-13 | 2024-07-23 | The Notebook, Llc | Impaired operator detection and interlock apparatus
US10559307B1 (en) | 2019-02-13 | 2020-02-11 | Karen Elaine Khaleghi | Impaired operator detection and interlock apparatus
US11984136B2 (en) * | 2019-02-28 | 2024-05-14 | Nec Corporation | Emotion estimation apparatus, emotion estimation method, and computer readable recording medium
US20220148617A1 (en) * | 2019-02-28 | 2022-05-12 | Nec Corporation | Emotion estimation apparatus, emotion estimation method, and computer readable recording medium
CN113473913A (en) * | 2019-02-28 | 2021-10-01 | 日本电气株式会社 | Emotion estimation device, emotion estimation method, and computer-readable recording medium
US11157549B2 (en) * | 2019-03-06 | 2021-10-26 | International Business Machines Corporation | Emotional experience metadata on recorded images
US11163822B2 (en) * | 2019-03-06 | 2021-11-02 | International Business Machines Corporation | Emotional experience metadata on recorded images
US11582037B2 (en) | 2019-07-25 | 2023-02-14 | The Notebook, Llc | Apparatus and methods for secure distributed communications and data access
US12244708B2 (en) | 2019-07-25 | 2025-03-04 | The Notebook, Llc | Apparatus and methods for secure distributed communications and data access
US10735191B1 (en) | 2019-07-25 | 2020-08-04 | The Notebook, Llc | Apparatus and methods for secure distributed communications and data access
WO2021173256A1 (en) * | 2020-02-27 | 2021-09-02 | Microsoft Technology Licensing, Llc | Adjusting user experience for multiuser sessions based on vocal-characteristic models
US11170800B2 (en) | 2020-02-27 | 2021-11-09 | Microsoft Technology Licensing, Llc | Adjusting user experience for multiuser sessions based on vocal-characteristic models
CN112263252A (en) * | 2020-09-28 | 2021-01-26 | 贵州大学 | A PAD emotion dimension prediction method based on HRV features and three-layer SVR

Also Published As

Publication number | Publication date
US9329758B2 (en) | 2016-05-03
US20140068472A1 (en) | 2014-03-06

Similar Documents

Publication | Publication Date | Title
US9329758B2 (en) | Multiple sensory channel approach for translating human emotions in a computing environment
US12189854B2 (en) | Systems and methods for collecting, analyzing, and sharing bio-signal and non-bio-signal data
US20230221801A1 (en) | Systems and methods for collecting, analyzing, and sharing bio-signal and non-bio-signal data
US20210267514A1 (en) | Method and apparatus for monitoring emotional compatibility in online dating
US20220392625A1 (en) | Method and system for an interface to provide activity recommendations
US8700009B2 (en) | Method and apparatus for monitoring emotion in an interactive network
US10716501B2 (en) | System and method for classification and quantitative estimation of cognitive stress
US20210118323A1 (en) | Method and apparatus for interactive monitoring of emotion during teletherapy
Sannino et al. | A continuous noninvasive arterial pressure (CNAP) approach for health 4.0 systems
US20230099519A1 (en) | Systems and methods for managing stress experienced by users during events
Knapp et al. | Physiological signals and their use in augmenting emotion recognition for human–machine interaction
Currey et al. | Naturalistic decision making: a model to overcome methodological challenges in the study of critical care nurses’ decision making about patients’ hemodynamic status
EP4182875A1 (en) | Method and system for an interface for personalization or recommendation of products
CN113853161A (en) | System and method for identifying and measuring emotional states
US11822719B1 (en) | System and method for controlling digital cinematic content based on emotional state of characters
Ferrari et al. | Using Voice and Biofeedback to Predict User Engagement during Product Feedback Interviews
Girardi et al. | The way it makes you feel predicting users’ engagement during interviews with biofeedback and supervised learning
US20240008784A1 (en) | System and Method for Prevention, Diagnosis, and Treatment of Health Conditions
US20170354383A1 (en) | System to determine the accuracy of a medical sensor evaluation
Ferrari et al. | Using voice and biofeedback to predict user engagement during product feedback interviews
US20230107691A1 (en) | Closed Loop System Using In-ear Infrasonic Hemodynography and Method Therefor
CN117854662A (en) | Objective psychological health data acquisition method and related equipment
HUICHAPA et al. | Using Voice and Biofeedback to Predict User Engagement during Product Feedback Interviews
JP2023067016A (en) | Measurement support device and measurement support method
CN117973919A (en) | Personnel ability assessment training method, device and electronic equipment based on human factor intelligence

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: GUZAK, BARBARA S.; KWAN, HUNG-TACK; VORA, JANKI Y.; REEL/FRAME: 023097/0632

Effective date: 20090803

STCB | Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE

