CN103984413B - Information interacting method and information interactive device - Google Patents

Information interacting method and information interactive device
Download PDF

Info

Publication number
CN103984413B
CN103984413B (application CN201410209581.XA; publication CN103984413A)
Authority
CN
China
Prior art keywords
user
users
information
exchange
eyes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410209581.XA
Other languages
Chinese (zh)
Other versions
CN103984413A (en)
Inventor
杜琳
张宏江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201410209581.XA
Publication of CN103984413A
Application granted
Publication of CN103984413B
Legal status: Active (current)
Anticipated expiration


Abstract

The embodiments of the present application disclose an information interaction method and an information interaction device. The method includes: obtaining sight exchange information between a user and another user; confirming whether the sight exchange information meets a set interaction condition; and, in response to the interaction condition being met, carrying out an interaction of the user's and/or the other user's information with an external device corresponding to the other user. By driving the information interaction naturally from the sight exchange between users, the embodiments of the present application avoid interrupting the users' other normal communication, and carry out the interaction of personal information only between users who have exchanged sight, which ensures the security of the personal information interaction.

Description

Information interacting method and information interactive device
Technical field
The present application relates to the technical field of information interaction, and in particular to an information interaction method and an information interaction device.
Background technology
In business settings such as meetings, people meeting for the first time usually judge during the conversation whether they need to exchange business cards. When they do, they interrupt their current conversation, take out their own paper business card or electronic business card (an electronic business card containing, for example, the user's organization name, the user's name, and a mobile phone number), hand it to the other party, and receive the other party's business card or enter the other party's contact information.
The content of the invention
The purpose of the present application is to provide an information interaction scheme.
In a first aspect, the present application provides an information interaction method, including:
obtaining sight exchange information between a user and another user;
confirming whether the sight exchange information meets a set interaction condition; and
in response to the interaction condition being met, carrying out an interaction of the user's and/or the other user's information with an external device corresponding to the other user.
In a second aspect, the present application provides an information interaction device, including:
an acquisition module, configured to obtain sight exchange information between a user and another user;
a confirmation module, configured to confirm whether the sight exchange information meets a set interaction condition; and
an interaction module, configured to, in response to the interaction condition being met, carry out an interaction of the user's and/or the other user's information with an external device corresponding to the other user.
By driving the personal information interaction naturally from the sight exchange between users, at least one embodiment of the present application avoids interrupting the users' other normal communication, and carries out the interaction of personal information only between users who have exchanged sight, which ensures the security of the personal information interaction.
Brief description of the drawings
Fig. 1 is a flow chart of an information interaction method of an embodiment of the present application;
Fig. 2 is a flow chart of another information interaction method of an embodiment of the present application;
Fig. 3 is a schematic diagram of obtaining the eye contact time between the user and the other user in an information interaction method of an embodiment of the present application;
Fig. 4 is a schematic structural block diagram of an information interaction device of an embodiment of the present application;
Fig. 5a is a schematic structural block diagram of another information interaction device of an embodiment of the present application;
Fig. 5b is a schematic structural block diagram of a gaze confirmation unit of an information interaction device of an embodiment of the present application;
Fig. 5c is a schematic structural block diagram of a gaze confirmation unit of another information interaction device of an embodiment of the present application;
Fig. 6 is a schematic structural block diagram of a near-eye wearable device of an embodiment of the present application;
Fig. 7 is a schematic structural block diagram of smart glasses of an embodiment of the present application;
Fig. 8 is a schematic structural block diagram of another information interaction device of an embodiment of the present application.
Embodiment
The specific embodiments of the present application are described in further detail below with reference to the accompanying drawings (identical labels represent identical elements across the drawings) and the embodiments. The following embodiments are used to illustrate the present application, but do not limit its scope.
Those skilled in the art will understand that terms such as "first" and "second" in the present application are only used to distinguish different steps, devices, modules, and the like; they neither carry any particular technical meaning nor indicate a necessary logical order between them.
As shown in Fig. 1, an embodiment of the present application provides an information interaction method, including:
S110: obtaining sight exchange information between a user and another user;
S120: confirming whether the sight exchange information meets a set interaction condition;
S130: in response to the interaction condition being met, carrying out an interaction of the user's and/or the other user's information with an external device corresponding to the other user.
For example, an information interaction device provided by the present application serves as the execution subject of this embodiment and executes S110 to S130. Specifically, the information interaction device may be arranged in a user device in the form of software, hardware, or a combination of software and hardware, or the information interaction device may itself be the user device. The user device includes, but is not limited to: a smartphone, smart glasses, a smart helmet, and the like, where smart glasses are further divided into smart frame glasses and smart contact lenses. In the embodiments of the present application, the user is the user of the information interaction device, and the other user is the user of the external device.
In the embodiments of the present application, the information interaction mainly includes the interaction of the personal information of the user and/or the other user.
By driving the personal information interaction naturally from the sight exchange between users, the embodiments of the present application avoid interrupting the users' other normal communication, and carry out the interaction of personal information only between users who have exchanged sight, which ensures the security of the personal information interaction.
Each step of the embodiment of the present application is further illustrated by the following examples:
As shown in Fig. 2, in a possible implementation, the method further includes, before step S110:
S100: confirming whether the user is in a face-to-face exchange state;
and, when it is confirmed that the user is in the face-to-face exchange state, proceeding to step S110.
If the user is not exchanging with another user, step S110 cannot obtain the sight exchange information and would repeatedly attempt to obtain it. Therefore, in this implementation, the method need not attempt to obtain the sight exchange information all the time; step S110 is triggered only when it is confirmed that the user is exchanging with another user.
In a possible implementation, confirming whether the user is in the face-to-face exchange state includes:
obtaining the sound information around the user; and
analyzing the sound information by speech detection to confirm whether the user is in the face-to-face exchange state.
In a possible implementation of this embodiment, the sound information around the user may be obtained by a sound sensor device, which may be a part of the information interaction device. In another possible implementation, the sound information may also be obtained from another device by means of communication.
In this implementation, when the user exchanges face to face with another user, the sound information usually contains the voice of the user and the voice of at least one other user. By confirming with a speech detection algorithm whether the sound information contains the voice of the user and the voice of at least one other user, it can be confirmed whether the user is in the face-to-face exchange state.
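The speech-detection check above can be sketched as follows. This is a minimal illustration, not the patented implementation: the per-frame speaker-identification function `identify_speaker` and the speaker labels are hypothetical stand-ins for whatever speech detection algorithm the device actually uses; the sketch only shows the decision rule (the user's voice plus at least one other voice implies a face-to-face exchange).

```python
def is_face_to_face(frames, identify_speaker, user_id):
    """Decide whether the user is in a face-to-face exchange state.

    frames           -- audio frames captured around the user
    identify_speaker -- hypothetical function mapping a frame to a speaker
                        label, or None for a non-speech frame
    user_id          -- the label of this device's user
    """
    # Collect every distinct speaker heard in the sound information.
    speakers = {identify_speaker(frame) for frame in frames}
    speakers.discard(None)  # drop non-speech frames
    # Face-to-face exchange: the user spoke, and so did someone else.
    return user_id in speakers and len(speakers) > 1
```

Feeding pre-labelled frames through an identity function shows the rule: frames labelled `{"user", "other"}` yield `True`, while frames containing only the user's own voice yield `False`.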
In another possible implementation of the present application, confirming whether the user is in the face-to-face exchange state includes:
obtaining the head movement attitude information of the user; and
analyzing the head movement attitude information by head movement gesture pattern recognition to confirm whether the user is in the face-to-face exchange state.
In the embodiments of the present application, the head movement attitude information of the user may be obtained by a movement attitude sensor device arranged on the user's head, or may be obtained from another device by means of communication.
In the embodiments of the present application, since a user in the face-to-face exchange state has specific head movement and posture features, head movement gesture pattern recognition can confirm whether the user is in the face-to-face exchange state. Unlike the implementation based on speech detection, this implementation can confirm whether the user is in the face-to-face exchange state even when there is no verbal exchange between the users.
In another possible implementation, the sound information described above and the head movement attitude information may be used together to confirm whether the user is in the face-to-face exchange state, which can further improve the accuracy of the confirmation.
Of course, those skilled in the art will recognize that other methods for detecting whether the user is in the talk state can also be applied in the embodiments of the present application.
S110: obtaining sight exchange information between a user and another user.
In the embodiments of the present application, the sight exchange information includes: the eye contact time.
Here, the eye contact time is: the time during which, while the user gazes at the other user's eyes, the other user also gazes at the user's eyes.
In the embodiments of the present application, obtaining the eye contact time between the user and the other user includes:
obtaining first time information of the user gazing at the other user's eyes;
obtaining second time information of the other user gazing at the user's eyes; and
obtaining the sight exchange information according to the first time information and the second time information.
As shown in Fig. 3, according to the first time information and the second time information, the overlap between the period corresponding to the first time information and the period corresponding to the second time information can be obtained; this overlapping period is the eye contact time between the user and the other user.
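The overlap computation above reduces to intersecting two sets of time intervals. The following is a minimal sketch under the assumption that the first and second time information are each recorded as lists of `(start, end)` timestamps; the interval representation is illustrative, not prescribed by the text.

```python
def eye_contact_intervals(first_time, second_time):
    """Intersect the user's gaze periods with the other user's gaze periods.

    first_time  -- (start, end) intervals during which the user gazes at
                   the other user's eyes (the first time information)
    second_time -- (start, end) intervals during which the other user gazes
                   at the user's eyes (the second time information)
    Returns the overlapping sub-intervals: the eye-contact periods.
    """
    contacts = []
    for a_start, a_end in first_time:
        for b_start, b_end in second_time:
            start = max(a_start, b_start)  # latest of the two starts
            end = min(a_end, b_end)        # earliest of the two ends
            if start < end:                # non-empty overlap only
                contacts.append((start, end))
    return sorted(contacts)
```

Summing `end - start` over the returned intervals then gives the total eye contact time used later when checking the interaction condition.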
In the embodiments of the present application, obtaining the first time information of the user gazing at the other user's eyes includes:
confirming whether the user gazes at the other user's eyes; and
recording the time during which the user gazes at the other user's eyes as the first time information.
In the embodiments of the present application, confirming whether the user gazes at the other user's eyes is mainly realized by detecting whether the user's gaze point coincides with the other user's eyes.
Of course, since the detection may have an error, in a possible implementation, determining whether the user gazes at the other user's eyes may also be realized by detecting whether the user's gaze point falls in a preset region containing the other user's eyes. The preset region here may be, for example, the region within the other user's eye sockets, or even the other user's facial region.
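The preset-region variant above can be sketched as a tolerance-padded point-in-rectangle test. The rectangular region and the `margin` parameter are assumptions for illustration; in the text the preset region may equally be an eye-socket or whole-face region produced by a detector.

```python
def gaze_on_eyes(gaze_point, eye_region, margin=0.0):
    """Check whether the user's gaze point falls in a preset region
    containing the other user's eyes.

    gaze_point -- (x, y) gaze point in image coordinates
    eye_region -- (left, top, right, bottom) bounding box of the region
    margin     -- optional enlargement absorbing gaze-detection error
    """
    gx, gy = gaze_point
    left, top, right, bottom = eye_region
    return (left - margin <= gx <= right + margin
            and top - margin <= gy <= bottom + margin)
```

Widening `margin` trades false negatives (missed gazes) for false positives, which is exactly the accuracy concession the text makes by allowing a larger preset region.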
In a possible implementation, confirming whether the user gazes at the other user's eyes includes:
obtaining an image corresponding to the user's field of view;
obtaining the user's gaze direction; and
confirming whether the object in the image corresponding to the user's gaze direction is the other user's eyes.
In the embodiments of the present application, the image corresponding to the user's field of view is an image containing the objects in the user's viewing region (here, the viewing region may be a subset of the user's field of view, or may coincide with it). The image may be obtained, for example, by an image acquisition module of a near-eye device shooting along the direction the user's eyes face. The relation between the image and the user's viewing region (for example, whether the image and the viewing region coincide completely or partially) can be confirmed by calibration.
In the embodiments of the present application, the user's gaze direction can be obtained by a gaze tracking method.
In the embodiments of the present application, confirming whether the object in the image corresponding to the user's gaze direction is the other user's eyes may include:
calibrating the correspondence between the user's gaze direction and the image acquired by the image acquisition module;
analyzing the image, for example with a face detection algorithm, to identify the eye regions of the faces in the image; and
confirming, according to the correspondence, whether the user's gaze direction corresponds to an eye region, and thus whether the user's gaze direction points at the other user's eyes.
The user's gaze direction coinciding with an object cannot definitively confirm that the user is gazing at that object; for example, the user may be thinking with unfocused eyes, or may be looking at a transparent object (such as content on a see-through display). Therefore, in order to improve the accuracy of the gaze confirmation, in an optional implementation, confirming whether the user gazes at the other user's eyes further includes:
obtaining a first distance of the user's gaze point relative to the user;
obtaining a second distance of the other user relative to the user; and
confirming whether the first distance matches the second distance.
In the embodiments of the present application, when the first distance matches the second distance, and the user's gaze direction corresponds to the region of the other user's eyes in the image, it can be confirmed more accurately that the user is gazing at the other user's eyes.
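The distance-matching step can be sketched as a relative-tolerance comparison. The `tolerance` parameter is an assumption for illustration; the text only requires that the two distances "match", leaving the matching criterion open.

```python
def distances_match(first_distance, second_distance, tolerance=0.2):
    """Check whether the gaze-point depth matches the other user's distance.

    first_distance  -- distance of the user's gaze point from the user
    second_distance -- distance of the other user from the user
    tolerance       -- relative error bound (hypothetical; e.g. 20%)

    A match rules out unfocused eyes or gazing at a see-through display,
    since in those cases the fixation depth differs from the other
    user's actual distance.
    """
    return abs(first_distance - second_distance) <= tolerance * second_distance
```

A stricter `tolerance` increases confidence that the user is truly focusing on the other user, at the cost of rejecting slightly noisy depth estimates.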
In the embodiments of the present application, the first distance may be obtained in a variety of ways, for example one of the following:
1) detecting the gaze directions of the user's two eyes respectively, obtaining the position of the intersection of the two gaze directions relative to the user, and thus obtaining the first distance;
2) detecting the gaze direction of the user's eyes and a depth map in the user's gaze direction, and obtaining the first distance according to the correspondence between the gaze direction and the objects in the depth map;
3) acquiring an image of the fundus of the eye, and obtaining the first distance according to the imaging parameters of the fundus image acquisition module when a clear fundus image is collected, and the like.
In a possible implementation, the second distance of the other user relative to the user can be obtained by depth detection, for example by a depth sensor arranged on the user's side.
Since, during a talk, a user usually gazes at the other party from a certain specific direction, the predetermined region in the image corresponding to the gaze direction of a user in a talk can be confirmed by presetting, machine learning, or other means. Therefore, in an optional, simplified implementation, confirming whether the user gazes at the other user's eyes may include:
obtaining an image corresponding to the user's field of view; and
confirming whether the other user's eyes in the image are in the predetermined region.
When the user reads the personal information of another user, a face photo corresponding to the other user can more easily help the user associate the personal information with the other user. In addition, according to the implementations described above, when the user gazes at the other user's eyes while step S110 is carried out, the obtained image corresponding to the user's field of view may contain the other user (for example, including the other user's face). Therefore, in an optional implementation, the method further includes:
in response to the user gazing at the other user's eyes, saving a user image containing the other user from the image.
In the embodiments of the present application, obtaining the second time information of the other user gazing at the user's eyes includes:
obtaining the second time information from outside.
The outside here may be, for example, the external device corresponding to the other user: the other user also obtains the second time information by techniques such as the gaze tracking described above, and the external device sends it to the execution subject of the method of the present application. In addition, the outside may also be an external server: the external device may send the obtained second time information to the external server, and the execution subject of the method of the present application then obtains the second time information from the external server by means of communication.
In a possible implementation, there is a correspondence between the facial features of the user and of the other user and their respective corresponding devices; that is, an identified facial feature can be mapped to a corresponding user device. In this implementation, after the method determines that the user gazes at the other user's eyes, facial feature recognition can be carried out on the obtained image containing the other user's eyes, the identified facial features of the other user can be mapped to the external device corresponding to the other user, and communication with the external device can be established to obtain the second time information; the first time information can also be sent to the external device.
In addition, in another possible implementation, there may also be a correspondence between the voice features of the user and of the other user and their respective corresponding devices; that is, an identified voice feature can be mapped to a corresponding user device. The method can also establish a connection with the external device corresponding to the other user according to the collected voice features of the other user, and obtain the second time information.
The communication between the execution subject of the method of the present application and the external device can be completed in several ways, such as wireless communication (for example, Bluetooth or WiFi), visual communication (for example, by presenting respective two-dimensional codes), or acoustic communication (for example, by ultrasonic waves).
S120: confirming whether the sight exchange information meets the set interaction condition.
As described above, in one implementation, when the sight exchange information is the eye contact time, the interaction condition may include, for example:
the eye contact time between the user and the other user reaching a set threshold.
As shown in Fig. 3, in a possible implementation, the eye contact time may be the accumulated eye contact time between the user and the other user: when there are multiple sight contacts between the user and the other user, the accumulated eye contact time is the sum of these multiple eye contact times, for example t1+t2+t3 as shown in Fig. 3. In another possible implementation, the eye contact time may also be a single eye contact time between the user and the other user, for example t1, t2, or t3 as shown in Fig. 3; or the eye contact time may be the maximum of the multiple eye contact times between the user and the other user, for example t3 as shown in Fig. 3.
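The three readings of the eye contact time given above (accumulated, single, maximum) lead to three variants of the interaction-condition check. The following sketch gathers them behind one hypothetical `mode` parameter; the function name and parameterization are illustrative, not from the text.

```python
def meets_interaction_condition(contact_times, threshold, mode="accumulated"):
    """Check the set interaction condition against the eye contact times
    t1, t2, ... between the user and the other user.

    contact_times -- durations of the individual sight contacts
    threshold     -- the set threshold
    mode          -- "accumulated": t1 + t2 + ... reaches the threshold
                     "single":      any one contact reaches the threshold
                     "maximum":     the longest contact reaches the threshold
    """
    if not contact_times:
        return False
    if mode == "accumulated":
        return sum(contact_times) >= threshold
    if mode == "single":
        return any(t >= threshold for t in contact_times)
    if mode == "maximum":
        return max(contact_times) >= threshold
    raise ValueError(f"unknown mode: {mode}")
```

With contacts of 1.0, 0.5, and 2.0 seconds and a 3-second threshold, only the accumulated reading (3.5 s) satisfies the condition, which illustrates why the choice of reading matters for how quickly the interaction is triggered.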
S130: in response to the interaction condition being met, carrying out an interaction of the user's and/or the other user's information with the external device corresponding to the other user.
In a possible implementation of the embodiments of the present application, carrying out the interaction of the user's and/or the other user's information with the external device corresponding to the other user includes:
sending first personal information of the user to the external device; and
receiving second personal information of the other user from the external device.
In another possible implementation, only the first personal information of the user may be sent to the external device, or only the second personal information of the other user may be received from the external device.
In the embodiments of the present application, the first personal information may include at least one of the following: attribute information of the user (such as the user's name, organization name, and post) and communication information of the user (such as the user's telephone number, mail address, and instant messaging account). Of course, it may also include other information that the user wants to present to the other user (such as a photo of the user).
The second personal information includes the information that the other user wants to present to the user, for example attribute information of the other user and/or communication information of the other user.
In a possible implementation, the method further includes:
in response to the interaction condition being met, obtaining a user image containing the other user; and
associating the user image with the second personal information.
As described above, associating the user image of the other user with the second personal information can help the user better identify the other user.
The user image may be obtained from the user images saved above, or may be obtained from outside by means of communication.
Those skilled in the art will understand that, in the above method of the embodiments of the present application, the sequence numbers of the steps do not imply an execution order; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
As shown in Fig. 4, an embodiment of the present application provides an information interaction device 400, including:
an acquisition module 410, configured to obtain sight exchange information between a user and another user;
a confirmation module 420, configured to confirm whether the sight exchange information meets a set interaction condition; and
an interaction module 430, configured to, in response to the interaction condition being met, carry out an interaction of the user's and/or the other user's information with an external device corresponding to the other user.
By driving the personal information interaction naturally from the sight exchange between users, the embodiments of the present application avoid interrupting the users' other normal communication, and carry out the interaction of personal information only between users who have exchanged sight, which ensures the security of the personal information interaction.
The functions of the modules of the embodiments of the present application are further illustrated by the following examples.
As shown in Fig. 5a, in a possible implementation, the device 400 further includes:
an exchange confirmation module 440, configured to confirm whether the user is in a face-to-face exchange state;
and the acquisition module 410 is further configured to:
obtain the sight exchange information in response to the user being in the face-to-face exchange state.
In this implementation, to avoid the acquisition module 410 continuously attempting to obtain the sight exchange information when the user is not exchanging with another user (in which case it could not actually obtain the sight exchange information), the exchange confirmation module 440 triggers the work of the acquisition module 410 only after confirming that the user is exchanging with another user.
Since a user exchanging face to face with another user has certain specific movement features, the exchange confirmation module 440 can confirm in several ways whether the user is in the exchange state; the structure of the exchange confirmation module 440 may therefore be, for example, one or more of the following:
1) the exchange confirmation module 440 may include:
a sound acquisition submodule 441, configured to obtain the sound information around the user; and
a speech analysis submodule 442, configured to analyze the sound information by speech detection to confirm whether the user is in the face-to-face exchange state.
In this implementation, when the user exchanges face to face with another user, the sound information usually contains the voice of the user and the voice of at least one other user; by confirming with a speech detection algorithm whether the sound information contains these voices, it can be confirmed whether the user is in the face-to-face exchange state.
In the embodiments of the present application, the sound acquisition submodule 441 may be a sound sensor device; or the sound acquisition submodule 441 may be a communication device, configured to obtain the sound information from another device (such as another portable device carried by the user).
2) the exchange confirmation module 440 may include:
a head information acquisition submodule 443, configured to obtain the head movement attitude information of the user; and
a head pattern analysis submodule 444, configured to analyze the head movement attitude information by head movement gesture pattern recognition to confirm whether the user is in the face-to-face exchange state.
In the embodiments of the present application, since a user in the face-to-face exchange state has specific head movement and posture features, head movement gesture pattern recognition can confirm whether the user is in the face-to-face exchange state. Unlike the implementation with speech detection above, this implementation can identify whether the user is in the face-to-face exchange state even when there is no verbal exchange between the users.
In the embodiments of the present application, the head information acquisition submodule 443 may include a movement attitude sensor device arranged on the user's head (for example, an acceleration sensor and a gyroscope), configured to obtain the head movement attitude information of the user; or the head information acquisition submodule 443 may obtain the head movement attitude information from another device by means of communication.
As shown in Fig. 5a, in this implementation, the exchange confirmation module 440 includes the sound acquisition submodule 441, the speech analysis submodule 442, the head information acquisition submodule 443, and the head pattern analysis submodule 444 described above; the sound information and the head movement attitude information can be used together to confirm whether the user is in the face-to-face exchange state, which can further improve the accuracy of the confirmation.
Of course, those skilled in the art will recognize that other structures for detecting whether the user is in the talk state can also be applied in the embodiments of the present application.
In the embodiments of the present application, the sight exchange information includes the eye contact time. Here, the eye contact time is the time during which, while the user gazes at the other user's eyes, the other user also gazes at the user's eyes.
In the embodiments of the present application, the acquisition module 410 is further configured to obtain the eye contact time between the user and the other user.
As shown in Fig. 5a, in the embodiment of the present application, the acquisition module 410 includes:
a first acquisition submodule 411, for acquiring first time information of the user gazing at the eyes of the other user;
a second acquisition submodule 412, for acquiring second time information of the other user gazing at the eyes of the user;
a processing submodule 413, for obtaining the sight exchange information according to the first time information and the second time information.
As shown in Fig. 5a, in a possible embodiment, the first acquisition submodule 411 includes:
a gaze confirmation unit 4111, for confirming whether the user gazes at the eyes of the other user;
a time recording unit 4112, for recording the time during which the user gazes at the eyes of the other user as the first time information.
In the embodiment of the present application, whether the user gazes at the eyes of the other user is confirmed mainly by detecting whether the fixation point of the user coincides with the eyes of the other user.
In the embodiment shown in Fig. 5b, the gaze confirmation unit 4111 includes:
an image acquisition subunit 4111a, for acquiring an image corresponding to the field of view of the user;
an eye tracking subunit 4111b, for acquiring the gaze direction of the user;
a sight confirmation subunit 4111c, for confirming whether the object in the image corresponding to the gaze direction of the user is the eyes of the other user.
In the embodiment of the present application, the image acquisition subunit 4111a can be an image collector arranged at a near-eye position (for example, a camera on a near-eye wearable device), which captures the image along the direction the user's eyes face. The relation between the image and the user's viewing area (for example, whether the image completely overlaps or partially overlaps the viewing area) can be confirmed by a calibration unit. Alternatively, in other embodiments, the image acquisition subunit 4111a can also be a communication device, for acquiring the image from other equipment (for example, other near-eye equipment).
In the embodiment of the present application, the function of the sight confirmation subunit 4111c corresponds to the description in the embodiment shown in Fig. 2, and is not repeated here.
In order to improve the accuracy of the gaze confirmation, in a possible embodiment, the gaze confirmation unit 4111 can optionally further include:
a first distance acquisition subunit 4111d, for acquiring a first distance of the fixation point of the user relative to the user;
a second distance acquisition subunit 4111e, for acquiring a second distance of the other user relative to the user;
a distance confirmation subunit 4111f, for confirming whether the first distance matches the second distance.
In the embodiment of the present application, when the first distance matches the second distance, and the gaze direction of the user corresponds to the region of the eyes of the other user in the image, it can be confirmed more accurately that the user is gazing at the eyes of the other user.
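The text does not specify how the distance confirmation subunit 4111f tests the match; a minimal sketch in Python, assuming a simple relative-tolerance comparison (the function name and the 15% default tolerance are illustrative assumptions, not from the original):

```python
def distances_match(first_distance, second_distance, tolerance=0.15):
    """Return True when the fixation-point depth (first distance) agrees
    with the measured distance to the other user (second distance),
    within a relative tolerance.  The tolerance value is illustrative."""
    if second_distance <= 0:
        return False
    return abs(first_distance - second_distance) / second_distance <= tolerance
```

In this sketch a gaze would be confirmed only when both checks pass: the gaze direction falls on the eye region in the image, and the two depths agree.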
In the embodiment of the present application, the structure of the first distance acquisition subunit 4111d can take a variety of forms, for example one of the following:
1) the first distance acquisition subunit 4111d includes two eye tracking devices, for respectively detecting the gaze directions of the two eyes of the user; from the gaze directions of the two eyes, the first distance acquisition subunit 4111d obtains the position, relative to the user, of the intersection point of the gaze directions of the two eyes, and thereby obtains the first distance;
2) the first distance acquisition subunit 4111d includes one eye tracking device, for detecting the gaze direction of one eye of the user, and a depth sensing device, for acquiring a depth map in the gaze direction of the user; the first distance acquisition subunit 4111d obtains the first distance according to the correspondence between the objects on the gaze direction and the depth map;
3) the first distance acquisition subunit 4111d includes a fundus image collector, for collecting an image of the fundus of the eye; when a clear fundus image is collected, the first distance acquisition subunit 4111d obtains the first distance according to the imaging parameters of the fundus image collection module and the like.
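Variant 1) can be illustrated as a least-squares ray intersection: in 3-D the two gaze rays rarely meet exactly, so the fixation point is taken as the midpoint of their closest approach. A sketch under that assumption (all names are hypothetical; NumPy is used for the vector algebra):

```python
import numpy as np

def fixation_distance(left_eye, left_dir, right_eye, right_dir):
    """Estimate the first distance as the distance from the midpoint
    between the two eyes to the point where the two gaze rays come
    closest to each other."""
    p1, d1 = np.asarray(left_eye, float), np.asarray(left_dir, float)
    p2, d2 = np.asarray(right_eye, float), np.asarray(right_dir, float)
    d1 /= np.linalg.norm(d1)
    d2 /= np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimising |(p1+t1*d1)-(p2+t2*d2)|.
    b = d1 @ d2
    w = p1 - p2
    denom = 1.0 - b * b
    if denom < 1e-9:            # parallel gaze: fixation at infinity
        return float("inf")
    t1 = (b * (w @ d2) - (w @ d1)) / denom
    t2 = ((w @ d2) - b * (w @ d1)) / denom
    closest = ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0
    midpoint = (p1 + p2) / 2.0
    return float(np.linalg.norm(closest - midpoint))
```

For example, eyes 6 cm apart both aimed at a point 1 m straight ahead yield a first distance of 1 m.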
In a possible embodiment, the second distance acquisition subunit 4111e can include a depth sensing device, which obtains the second distance of the other user relative to the user by depth detection. For example, the second distance is obtained by a depth sensor arranged on the user side.
Since, when talking, a user generally gazes at the other party along a certain specific direction, the predetermined region in the image corresponding to the gaze direction of the user during a talk can be confirmed by means such as presetting or machine learning. In this case, in the embodiment shown in Fig. 5c, the gaze confirmation unit 4111 includes:
an image acquisition subunit 4111g, for acquiring an image corresponding to the field of view of the user;
a region confirmation subunit 4111h, for confirming whether the eyes of the other user in the image are within the predetermined region.
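The region test of subunit 4111h reduces to a containment check; a minimal sketch, assuming the detected eyes are available as a pixel-space bounding box (the box representation is an assumption of this illustration, not from the original):

```python
def eyes_in_region(eye_box, region):
    """Return True when a detected eye bounding box lies entirely inside
    the predetermined talk-gaze region of the image; both arguments are
    (x1, y1, x2, y2) corners in pixel coordinates."""
    ex1, ey1, ex2, ey2 = eye_box
    rx1, ry1, rx2, ry2 = region
    return rx1 <= ex1 <= ex2 <= rx2 and ry1 <= ey1 <= ey2 <= ry2
```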
As shown in Fig. 5a, in the embodiment of the present application, the second acquisition submodule 412 includes:
a communication unit 4121, for acquiring the second time information from outside.
Here, the outside can for example be the external equipment corresponding to the other user; for example, the other user also acquires the second time information through techniques such as the eye tracking described above, and the external equipment sends it to the communication unit 4121. In addition, the outside can also be an external server: the external equipment can send the acquired second time information to the external server, and the communication unit 4121 of the embodiment of the present application acquires the second time information from the external server.
The communication between the communication unit 4121 and the external equipment can be completed in various ways, for example by radio communication (such as Bluetooth or WiFi), visual communication (such as two-dimensional codes presented by each side respectively), acoustic communication (such as by ultrasonic waves) and other such means.
In a possible embodiment, there is a correspondence between the facial features of the user and of the other user and their respective corresponding equipment; that is, the identified facial features can be mapped to the corresponding user equipment. In this embodiment, after the gaze confirmation unit 4111 determines that the user gazes at the eyes of the other user, facial feature recognition can be performed on the acquired image containing the eyes of the other user, the identified facial features of the other user are mapped to the external equipment corresponding to the other user, and communication between the communication unit 4121 and the external equipment is established to acquire the second time information; the first time information can also be sent to the external equipment by the communication unit 4121.
In addition, in another possible embodiment, there can also be a correspondence between the voice features of the user and of the other user and their respective corresponding equipment; that is, the identified voice features can be mapped to the corresponding user equipment. The communication unit 4121 can also establish a connection with the external equipment corresponding to the other user according to the collected voice features of the other user, and acquire the second time information.
In a possible embodiment, the processing submodule 413 can obtain, according to the first time information and the second time information, the period during which the first time information overlaps the second time information (as shown in Fig. 3); this period is the eye contact time of the user and the other user.
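The overlap computation of the processing submodule 413 can be sketched as an interval intersection; a minimal illustration, assuming each user's gaze is recorded as (start, end) intervals in seconds (the data layout and function names are assumptions, not from the original):

```python
def eye_contact_periods(first_intervals, second_intervals):
    """Intersect the gaze intervals of both users; the result is the
    list of periods in which both users were gazing at each other's
    eyes (the eye contact time)."""
    contact = []
    for a_start, a_end in first_intervals:
        for b_start, b_end in second_intervals:
            start, end = max(a_start, b_start), min(a_end, b_end)
            if start < end:
                contact.append((start, end))
    return contact

def total_eye_contact(first_intervals, second_intervals):
    """Accumulated eye contact time, summed over all overlapping periods."""
    return sum(end - start for start, end in
               eye_contact_periods(first_intervals, second_intervals))
```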
When a user reads the personal information of an other user, a face photograph corresponding to the other user can more easily help the user map the personal information to that other user. In addition, according to the embodiments described above, when the user gazes at the eyes of the other user, the acquired image corresponding to the field of view of the user can contain the other user (for example, contain the face of the other user). Therefore, in an optional embodiment, the device 400 also includes:
a memory module 450, for saving, in correspondence with the user gazing at the eyes of the other user, the user image in the image that contains the other user.
In a possible embodiment, the interaction condition includes:
the eye contact time between the user and the other user reaching a set threshold.
That is: the confirmation module 420 is used for confirming whether the eye contact time between the user and the other user reaches the set threshold.
In a possible embodiment, the eye contact time can be the accumulated eye contact time between the user and the other user; in another possible embodiment, the eye contact time can also be the duration of a single continuous eye contact between the user and the other user.
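The two readings of the eye contact time lead to two threshold tests; a minimal sketch, assuming contact periods are given as (start, end) pairs (function and parameter names are illustrative assumptions):

```python
def meets_interaction_condition(contact_periods, threshold,
                                mode="accumulated"):
    """Check the interaction condition under the two readings above:
    'accumulated' sums all contact periods, while 'single' requires one
    continuous period to reach the threshold on its own."""
    durations = [end - start for start, end in contact_periods]
    if mode == "accumulated":
        return sum(durations) >= threshold
    return any(d >= threshold for d in durations)
```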
In a possible embodiment, the interaction module 430 includes:
a communication submodule 431, used for:
sending the first personal information of the user to the external equipment; or
receiving the second personal information of the other user from the external equipment; or
sending the first personal information of the user to the external equipment and receiving the second personal information of the other user from the external equipment.
In the embodiment of the present application, the first personal information can include at least one of the following: attribute information of the user (such as the name, affiliated organization and post of the user) and communication information of the user (such as the telephone number, mail address and instant messaging account of the user). Of course, it can also include other information the user wants to present to the other user (such as a photograph of the user).
The second personal information includes the information the other user desires to present to the user, for example the attribute information of the other user and/or the communication information of the other user.
As shown in Fig. 5a, in a possible embodiment, the device 400 also includes:
an image collection module 460, for, in correspondence with the interaction condition being met, acquiring a user image containing the other user;
a relating module 470, for associating the user image with the second personal information.
As described above, after the user image of the other user is associated with the second personal information, the user can be helped to better identify the other user.
The user image can be obtained from the user images saved by the memory module 450 above; or it can also be obtained from outside by a communication device.
As shown in Fig. 6, the embodiment of the present application provides a near-eye wearable device 600, containing the information interaction device 610 described in any one of Fig. 4 and Figs. 5a-5c.
Or, in a possible embodiment, the near-eye wearable device is itself the information interaction device.
For example, in a possible embodiment, the near-eye wearable device is a pair of smart glasses 700.
The smart glasses 700 include a camera 710, for realizing the function of the image acquisition subunit in the embodiments shown in Fig. 5b or Fig. 5c above, acquiring an image corresponding to the field of view of the user.
The smart glasses 700 also include an eye tracking device 720, for acquiring the gaze direction of the user.
In a possible embodiment, the camera 710 also includes a depth sensor 711, for acquiring the second distance of the other user from the user.
The smart glasses 700 also include a communication module 730, for communicating with the external equipment corresponding to the other user, acquiring the second time information and the second personal information, and sending the first time information and the first personal information to the external equipment.
The smart glasses 700 also include a processing module 740, used for:
determining the first time information according to the image and the second distance acquired by the camera 710 and the gaze direction acquired by the eye tracking device 720; obtaining the sight exchange information between the user and an other user in combination with the second time information acquired by the communication module 730, and at the same time confirming whether the sight exchange information meets the set interaction condition; and, when the interaction condition is met, carrying out the information exchange of the user and/or the other user with the external equipment through the communication module 730.
For the functional realization of each module of the smart glasses 700 of the embodiment of the present application, reference can be made to the corresponding descriptions in the embodiments of Figs. 5a-5c, and it is not repeated here.
Fig. 8 is a schematic structural diagram of another information interaction device 800 provided by the embodiment of the present application; the specific embodiments of the present application do not limit the specific implementation of the information interaction device 800. As shown in Fig. 8, the information interaction device 800 can include:
a processor 810, a communications interface 820, a memory 830 and a communication bus 840, wherein:
the processor 810, the communication interface 820 and the memory 830 complete mutual communication through the communication bus 840;
the communication interface 820 is used for communicating with network elements such as clients;
the processor 810 is used for executing a program 832, and can specifically perform the relevant steps in the method embodiments above.
Specifically, the program 832 can include program code, and the program code includes computer operating instructions.
The processor 810 may be a central processing unit (CPU), or an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiment of the present application.
The memory 830 is used for storing the program 832. The memory 830 may include a high-speed RAM memory, and may also include a non-volatile memory, for example at least one disk memory. The program 832 can specifically be used for causing the information interaction device 800 to perform the following steps:
obtaining the sight exchange information between a user and an other user;
confirming whether the sight exchange information meets the set interaction condition; and
in correspondence with the interaction condition being met, carrying out the information exchange of the user and/or the other user with the external equipment corresponding to the other user.
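The three steps above can be sketched end to end; a minimal illustration, assuming gaze intervals as input and an opaque send function for the exchange (all names and the threshold value are assumptions of this sketch, not from the original):

```python
THRESHOLD_SECONDS = 2.0  # assumed interaction-condition threshold

def run_interaction(first_intervals, second_intervals,
                    first_personal_info, send_to_external_equipment):
    """Sketch of the three claimed steps: (1) obtain the sight exchange
    information, here the eye contact time computed as the overlap of
    the two users' gaze intervals; (2) confirm the interaction
    condition; (3) exchange personal information when it is met."""
    # Step 1: accumulated eye contact time from overlapping intervals.
    contact = 0.0
    for a_start, a_end in first_intervals:
        for b_start, b_end in second_intervals:
            contact += max(0.0, min(a_end, b_end) - max(a_start, b_start))
    # Step 2: interaction condition of the set threshold.
    if contact < THRESHOLD_SECONDS:
        return False
    # Step 3: send the user's first personal information outward.
    send_to_external_equipment(first_personal_info)
    return True
```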
For the specific implementation of each step in the program 832, reference can be made to the corresponding descriptions of the corresponding steps and units in the embodiments above, and it is not described here. Those skilled in the art can clearly understand that, for convenience and simplicity of description, for the specific working process of the equipment and modules described above, reference can be made to the descriptions of the corresponding processes in the preceding method embodiments, which are not repeated here.
Those of ordinary skill in the art will appreciate that the units and method steps of each example described in connection with the embodiments disclosed herein can be realized with electronic hardware, or with a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the particular application and the design constraints of the technical solution. Skilled artisans can use different methods to realize the described functions for each particular application, but such realization should not be considered to go beyond the scope of the present application.
If the functions are realized in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which can be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
The embodiments above are merely for illustrating the present application and are not a limitation of the present application. Those of ordinary skill in the relevant technical field can also make various changes and variations without departing from the spirit and scope of the present application; therefore all equivalent technical solutions also fall within the scope of the present application, and the scope of patent protection of the present application should be defined by the claims.

Claims (29)

CN201410209581.XA | 2014-05-19 | 2014-05-19 | Information interacting method and information interactive device | Active | CN103984413B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201410209581.XA | 2014-05-19 | 2014-05-19 | Information interacting method and information interactive device


Publications (2)

Publication Number | Publication Date
CN103984413A (en) | 2014-08-13
CN103984413B (en) | 2017-12-08 (grant)

Family

ID=51276423

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201410209581.XA (Active, granted as CN103984413B (en)) | Information interacting method and information interactive device | 2014-05-19 | 2014-05-19

Country Status (1)

Country | Link
CN (1) | CN103984413B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
RU2596062C1 (en)* | 2015-03-20 | 2016-08-27 | Автономная Некоммерческая Образовательная Организация Высшего Профессионального Образования "Сколковский Институт Науки И Технологий" | Method for correction of eye image using machine learning and method of machine learning
CN104754497B (en)* | 2015-04-02 | 2018-10-16 | 清华大学 | Visual-attention-driven method for establishing a communication connection
CN105824419B (en)* | 2016-03-18 | 2018-12-11 | 苏州佳世达电通有限公司 | Wearable device interaction system and wearable device interaction method
CN106774919A (en)* | 2016-12-28 | 2017-05-31 | 苏州商信宝信息科技有限公司 | Information transmission system based on smart glasses locking onto a target object
CN106774918A (en)* | 2016-12-28 | 2017-05-31 | 苏州商信宝信息科技有限公司 | Information retrieval system based on smart glasses
CN109725699B (en)* | 2017-10-20 | 2022-05-20 | 荣耀终端有限公司 | Identification code recognition method, apparatus and device
CN111240471B (en)* | 2019-12-31 | 2023-02-03 | 维沃移动通信有限公司 | Information interaction method and wearable device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN102566756A (en)* | 2010-12-16 | 2012-07-11 | 微软公司 | Comprehension and intent-based content for augmented reality displays
CN102906623A (en)* | 2010-02-28 | 2013-01-30 | 奥斯特豪特集团有限公司 | Local advertising content on an interactive head-mounted eyepiece
CN102981616A (en)* | 2012-11-06 | 2013-03-20 | 中兴通讯股份有限公司 | Identification method and identification system and computer capable of enhancing reality objects

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8184983B1 (en)* | 2010-11-12 | 2012-05-22 | Google Inc. | Wireless directional identification and subsequent communication between wearable electronic devices
CN103946732B (en)* | 2011-09-26 | 2019-06-14 | 微软技术许可有限责任公司 | Video display modification based on sensor input for a see-through, near-eye display
US9823742B2 (en)* | 2012-05-18 | 2017-11-21 | Microsoft Technology Licensing, Llc | Interaction and management of devices using gaze detection
US9966075B2 (en)* | 2012-09-18 | 2018-05-08 | Qualcomm Incorporated | Leveraging head mounted displays to enable person-to-person interactions




Legal Events

Code | Title
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
GR01 | Patent grant
