Video Display Modification Based on Sensor Input for a See-Through, Near-Eye Display

Cross-reference to related applications
This application claims priority to the following U.S. Provisional Patent Application, which is hereby incorporated by reference in its entirety:
U.S. Provisional Application 61/539,269, filed September 26, 2011.
This application is a continuation-in-part of the following U.S. Non-Provisional Patent Applications, each of which is hereby incorporated by reference in its entirety:
U.S. Non-Provisional Application 13/591,187, filed August 21, 2012, which claims the benefit of the following provisional applications, each of which is hereby incorporated by reference in its entirety: U.S. Provisional Patent Application 61/679,522, filed August 3, 2012; U.S. Provisional Patent Application 61/679,558, filed August 3, 2012; U.S. Provisional Patent Application 61/679,542, filed August 3, 2012; U.S. Provisional Patent Application 61/679,578, filed August 3, 2012; U.S. Provisional Patent Application 61/679,601, filed August 3, 2012; U.S. Provisional Patent Application 61/679,541, filed August 3, 2012; U.S. Provisional Patent Application 61/679,548, filed August 3, 2012; U.S. Provisional Patent Application 61/679,550, filed August 3, 2012; U.S. Provisional Patent Application 61/679,557, filed August 3, 2012; U.S. Provisional Patent Application 61/679,566, filed August 3, 2012; U.S. Provisional Patent Application 61/644,078, filed May 8, 2012; U.S. Provisional Patent Application 61/670,457, filed July 11, 2012; and U.S. Provisional Patent Application 61/674,689, filed July 23, 2012.
U.S. Non-Provisional Application 13/441,145, filed April 6, 2012, which claims the benefit of the following provisional applications, each of which is hereby incorporated by reference in its entirety: U.S. Provisional Patent Application 61/598,885, filed February 14, 2012; U.S. Provisional Patent Application 61/598,889, filed February 14, 2012; U.S. Provisional Patent Application 61/598,896, filed February 12, 2012; and U.S. Provisional Patent Application 61/604,917, filed February 29, 2012.
U.S. Non-Provisional Application 13/429,413, filed March 25, 2012, which claims the benefit of the following provisional application, which is hereby incorporated by reference in its entirety: U.S. Provisional Patent Application 61/584,029, filed January 6, 2012.
U.S. Non-Provisional Application 13/341,758, filed December 30, 2011, which claims the benefit of the following provisional application, which is hereby incorporated by reference in its entirety: U.S. Provisional Patent Application 61/557,289, filed November 8, 2011.
U.S. Non-Provisional Application 13/232,930, filed September 14, 2011, which claims the benefit of the following provisional applications, each of which is hereby incorporated by reference in its entirety: U.S. Provisional Application 61/382,578, filed September 14, 2010; U.S. Provisional Application 61/472,491, filed April 6, 2011; U.S. Provisional Application 61/483,400, filed May 6, 2011; U.S. Provisional Application 61/487,371, filed May 18, 2010; and U.S. Provisional Application 61/504,513, filed July 5, 2011.
U.S. Non-Provisional Patent Application 13/037,324, filed February 28, 2011, and U.S. Non-Provisional Patent Application 13/037,335, filed February 28, 2011, each of which claims the benefit of the following provisional applications, each of which is hereby incorporated by reference in its entirety: U.S. Provisional Patent Application 61/308,973, filed February 28, 2010; U.S. Provisional Patent Application 61/373,791, filed August 13, 2010; U.S. Provisional Patent Application 61/382,578, filed September 14, 2010; U.S. Provisional Patent Application 61/410,983, filed November 8, 2010; U.S. Provisional Patent Application 61/429,445, filed January 3, 2011; and U.S. Provisional Patent Application 61/429,447, filed January 3, 2011.
Background
Field:
The present invention relates to augmented reality eyepieces and associated control technologies and applications, and more particularly to software applications operating on the eyepiece.
The present invention also relates to thin display technologies that use switchable mirrors operated in a serial fashion to provide an image from a waveguide.
Head-mounted displays with reflective surfaces are well known in the industry. United States Patent 4,969,714 describes a head-mounted display with a single angled partially reflective beam splitter plate. While this approach provides excellent uniformity of brightness and color over the displayed field of view, the optical system is relatively thick because of the angled beam splitter plate.
It is described in United States Patent (USP) 6829095 and 7724441 with partially reflecting surface array to provide relatively thin opticsThe head-mounted display of system, these displays are shown in Figure 124, and part of reflection surface array 12408 is used forIt shows and image light 12404 is provided on the visual field, so that user be allowed to check shown image, together with the view of the environment before userFigure.The image light 12404 that user is checked is by the combined reflected smooth structure from each of multiple portions reflecting surface 12408At.Light from image source 12402 must be by multiple portions reflecting surface 12408, and wherein a part of light 12402 is to userEye reflections, to provide image light 12404.In order to provide the homogeneous image on the display visual field, partially reflecting surface 12408Reflection characteristic must be accurately controlled.For the surface nearest from image source, the reflectivity of partially reflecting surface 12408 is necessaryIt is minimum, and for the surface farthest from image source, the reflectivity of partially reflecting surface 12408 must highest.In general, partThe reflectivity of reflecting surface 12408 must be relative to linearly increasing with a distance from image source.This present manufactures and cost to askTopic, because the reflectivity of each section reflecting surface 12408 is different from adjacent surface, and the reflectivity on each surface must be tightClose control.Therefore, using partially reflecting surface array, it is difficult to which providing has uniform brightness and color on the entirely display visual fieldImage.
Alternatively, as described in United States Patent 4,711,512, diffraction gratings can be used to redirect image light into and out of a waveguide toward the display field of view. However, diffraction gratings are costly and introduce chromatic aberrations.
There is therefore a continuing need for a relatively thin optical system for a head-mounted display that also provides an image with good brightness and color uniformity over the display field of view.
The present invention also relates to a compact, lightweight frontlight that includes a wire-grid polarizer film as a partially reflective surface to deflect illumination light downward onto a reflective image source.
In a display with a reflective image source and a frontlight, as shown in Figure 133, illumination light 13308 passes from an edge light source 13300 and is deflected by the frontlight 13304 to illuminate the reflective image source 13302. The illumination light 13308 then reflects from the reflective image source 13302, becoming image light 13310, which passes back through the frontlight 13304 and into the display optics. The frontlight 13304 thus deflects the illumination light 13308 arriving from the edge light source 13300 while allowing the reflected image light 13310 to pass through undeflected, so that the image light 13310 can pass into the display optics. The display optics can be diffusive when the display is a flat-screen display, or refractive or diffractive when the display is a near-eye display. In this embodiment, the display optics may include a diffuser.
With a reflective image source such as a liquid crystal on silicon (LCOS) image source, the illumination light is polarized, and the reflective image source includes a quarter-wave retarder film that changes the polarization state of the light as it reflects from the image source. A polarizer is then included in the display optics, so that the polarization effects imparted by the liquid crystal form an image as the image light passes through the display optics.
United States Patent 7,163,330 describes a series of frontlights that include grooves in the upper surface of the frontlight, so that light from the edge light source is deflected downward to the reflective image source while flat sections between the grooves allow the reflected image light to pass into the display optics. Figure 134 shows a diagram of a frontlight 13400 with grooves 13410 and flat sections 13408. Illumination light 13402 from the edge light source 13300 reflects from the grooves 13410 and is deflected downward to illuminate the reflective image source 13302. Image light 13404 reflects from the reflective image source 13302 and passes through the flat sections 13408 of the frontlight 13400. Both linear and curved grooves 13410 are described. However, for the grooves 13410 to deflect the illumination light 13402 efficiently, they must occupy a large area of the frontlight, which limits the area of the flat sections 13408; in addition, light scattered from the grooves as the image light passes back through the frontlight degrades the image quality delivered to the display optics. The frontlight 13400 is typically formed from a solid piece of material and can therefore be relatively heavy.
In United States Patent 7,545,571, a wearable display system is provided that includes a reflective image source 13502 with a polarizing beam splitter 13512 serving as a frontlight, so that the illumination light 13504 provided by an edge light source 13500 is deflected onto the polarized reflective image source 13502, as shown in Figure 135. The polarizing beam splitter 13512 is an angled plane within a solid block, with a separate curved reflector 13514 associated with the edge light source 13500. The curved reflector 13514 can be a total internal reflection block 13510 connected to the polarizing beam splitter 13512. As a result, the frontlight disclosed in this patent, with its solid polarizing beam splitter block and total internal reflection block, is large and relatively heavy. Figure 135 also shows the image light 13508.
There remains a need for a frontlight for displays with reflective image sources that provides good image quality with little scattered light while remaining compact and lightweight.
The present invention also relates to optically flat surfaces made from optical films. More specifically, the present invention provides methods for fabricating optically flat beam splitters using optical films.
Optical films are available for a variety of purposes, including beam splitters, polarizing beam splitters, holographic reflectors, and mirrors. In imaging applications, and particularly in reflective imaging applications, it is important that the optical film be very flat to preserve the image wavefront. Some optical films have a pressure-sensitive adhesive on one side, which allows the optical film to be attached to a substrate for structural support and helps keep the film flat. However, optical films attached to a substrate in this way often have surfaces with small-scale undulations and bumps, known as orange peel, which prevent the surface from being optically flat and therefore degrade the reflected image.
U.S. Patent Application 20090052030 provides a method for manufacturing an optical film in which the optical film is a wire-grid polarizer. However, no technique is provided for producing a film with optically flat properties.
United States Patents 4,537,739 and 4,643,789 provide methods for carrying artwork to a mold on a band and attaching the artwork to the molded structure. However, these methods do not anticipate the special requirements of optical films.
U.S. Patent Application 20090261490 provides methods and molds for manufacturing simple optical articles that include optical films. These methods are directed at the curved surfaces being produced, since they include limits on the ratio of radius of curvature to diameter to avoid wrinkles in the film caused by deformation of the film during molding. The special requirements for manufacturing an optically flat surface with an optical film are not addressed.
United States Patent 7,820,081 provides a method for laminating a functional film layer to a lens. The method bonds the functional film to the lens with a heat-cured adhesive. However, the process includes thermoforming the optical film while the lens is hot, so that the optical film, the adhesive, and the lens deform together during bonding. This method is therefore unsuitable for producing optically flat surfaces.
Thus, there remains a need for methods of using optical films that can provide surfaces including optical films with optically flat properties.
Summary of the invention
In embodiments, the eyepiece may include internal software applications that run on an integrated multimedia computing facility and are adapted to display 3D augmented reality (AR) content and interact with the eyepiece. 3D AR software applications may be developed in conjunction with mobile applications and provided through an application store, or developed specifically for the eyepiece as an end-use platform and provided through a dedicated 3D AR eyepiece store. Internal software applications may interface with the inputs and outputs provided by the internal and external facilities of the eyepiece, such as capture devices taking input from the surrounding environment, sensor devices, user action capture devices, internal processing facilities, internal multimedia processing facilities, other internal applications, the camera, sensors, the microphone, through a transceiver, through a tactile interface, from external computing facilities, external applications, event and/or data feeds, external devices, third parties, and the like. Command and control modes operated in conjunction with the eyepiece may be initiated by sensed inputs, inputs through input devices, user actions, external device interactions, events and/or data feeds, internal application execution, external application execution, and the like. In embodiments, there may be a series of steps executed in providing control through the internal software application, including a combination of at least two of the following: an event and/or data feed; sensed inputs and/or sensing devices; user action capture inputs and/or outputs; user movements and/or actions for controlling and/or initiating commands; command and/or control modes and interfaces in which the inputs may be reflected; applications on the platform that may use the inputs to respond to commands; communications and/or connections from the platform to external systems and/or devices; external devices; external applications; feedback to the user (such as related to external devices or external applications); and the like.
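The control chain described above (sensed input → command/control mode → responding application → feedback to the user) can be sketched as a toy dispatcher. All names below are hypothetical illustrations; the source describes the chain only in general terms:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional, Tuple

@dataclass
class EyepieceControl:
    """Toy model of the eyepiece control chain: sensed inputs are mapped to
    commands, commands are routed to the application that handles them, and
    feedback for the user is collected."""
    bindings: Dict[Tuple[str, str], str] = field(default_factory=dict)
    handlers: Dict[str, Callable[[], str]] = field(default_factory=dict)
    feedback: List[str] = field(default_factory=list)

    def bind(self, source: str, action: str, command: str) -> None:
        self.bindings[(source, action)] = command

    def on_command(self, command: str, handler: Callable[[], str]) -> None:
        self.handlers[command] = handler

    def sense(self, source: str, action: str) -> Optional[str]:
        command = self.bindings.get((source, action))
        if command is None:
            return None          # unmapped inputs are ignored
        self.feedback.append(self.handlers[command]())
        return command

control = EyepieceControl()
control.bind("head_tracker", "nod", "confirm")        # a nod confirms a dialog
control.on_command("confirm", lambda: "dialog accepted")
print(control.sense("head_tracker", "nod"))           # confirm
print(control.feedback)                               # ['dialog accepted']
```

The same structure accommodates any of the input sources listed above (camera, microphone, external device, data feed) by registering additional bindings.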
The present invention also provides methods for providing a relatively thin optical system that delivers an image with improved brightness and color uniformity over the display field of view. The present invention includes an array of narrow switchable mirrors spanning the display area to provide the display field of view, in which the switchable mirrors are used sequentially to reflect portions of the light from the image source, thereby presenting sequential portions of the image to the user. By rapidly switching the narrow switchable mirrors from transparent to reflective in a repeating sequence, the user perceives the portions of the image as combining into the whole image as provided by the image source. Provided that each narrow switchable mirror is switched at a frequency of 60 Hz or higher, the user does not perceive flicker in the portions of the image.
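The timing budget implied by the sequencing described above is easy to compute: if every mirror must complete a full cycle at 60 Hz or more, each of the n mirrors gets at most 1/(60·n) seconds of reflective dwell per cycle. A minimal sketch of that arithmetic (illustrative only, not from the source):

```python
def mirror_schedule(n_mirrors, refresh_hz=60.0):
    """Reflective time slots for one refresh cycle of a sequentially switched
    mirror array. Each mirror is reflective for an equal share of the cycle,
    and every mirror repeats at refresh_hz so the user perceives no flicker."""
    period = 1.0 / refresh_hz          # duration of one full-image cycle, s
    slot = period / n_mirrors          # reflective dwell per mirror, s
    return [(k * slot, (k + 1) * slot) for k in range(n_mirrors)]

slots = mirror_schedule(8)             # e.g. 8 narrow mirrors at 60 Hz
print(len(slots))                      # 8
print(round(slots[0][1] * 1e6, 1))     # 2083.3 microseconds per mirror
```

The shrinking per-mirror dwell as n grows is why the mirrors must switch quickly between their transparent and reflective states.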
Various embodiments of the narrow switchable mirror array are provided. In one embodiment, the switchable mirrors are liquid crystal switchable mirrors. In another embodiment, the switchable mirrors are moving prism elements that use an air gap to provide switchable total-internal-reflection mirrors.
In an alternative embodiment, not all of the switchable mirrors are used in sequence; instead, a selected group of switchable mirrors is used, chosen based on the spacing of the user's eyes.
The present invention also provides a compact, lightweight frontlight that includes a wire-grid polarizer film as a partially reflective surface to deflect illumination light downward onto a reflective image source. The edge light source is polarized, and the wire-grid polarizer is oriented so that the illumination light is reflected while the image light is transmitted through to the display optics. By using a flexible wire-grid polarizer film, the present invention provides a partially reflective surface that can be curved to focus the illumination light onto the reflective image source, thereby improving efficiency and improving the uniformity of image brightness. The wire-grid polarizer also has very low light scatter, so image quality is preserved as the image light passes through the frontlight on its way to the display optics. Further, because the partially reflective surface is a wire-grid polarizer film, the majority of the frontlight consists of air, and the frontlight is therefore light in weight.
The present invention also provides methods for fabricating surfaces with optically flat properties when using optical films. In various embodiments of the invention, the optical film may include a beam splitter, a polarizing beam splitter, a wire-grid polarizer, a mirror, a partial mirror, or a holographic film. An advantage provided by the present invention is that the surface of the optical film is optically flat, so that the wavefront of the light is preserved to provide an improved image.
In certain embodiments, the present invention provides an image display system that includes an optically flat optical film. The system includes an image source and a display module housing with a substrate that holds the optically flat optical film relative to a viewing position. The image provided by the image source is reflected from the optical film to the viewing position, and the substrate with the optical film can be replaced within the display module housing.
In other embodiments of the invention, the optical film is attached to a molded structure, so that the optical film is part of the display module housing.
In a prior-art display 18700 with a reflective image source 18720 and a solid beam splitter cube frontlight 18718, as shown in Figure 187, light 18712 passes from a light source 18702 to a diffuser 18704, where it is made more uniform to provide illumination light 18714. The illumination light 18714 is redirected by a partially reflective layer 18708 to illuminate the reflective image source 18720. The illumination light 18714 then reflects from the reflective image source 18720, becoming image light 18710, which passes back through the partially reflective layer 18708 and enters associated imaging optics (not shown), which present the image to the viewer. Thus, the solid beam splitter cube 18718 redirects the illumination light 18714 while allowing the reflected image light 18710 to pass through without being redirected, so that the image light can pass to the imaging optics. The imaging optics can be diffusive when the display is a flat-screen display, or refractive or diffractive when the display is a projector or a near-eye display.
With a reflective image source such as a liquid crystal on silicon (LCOS) image source, the illumination light is polarized, and as the illumination light reflects from the reflective image source, the image source changes the polarization state of the light according to the image content being presented, thereby forming the image light. An analyzer polarizer is then included, so that the polarization effects imparted by the LCOS form an image as the image light passes through the imaging optics, and the image is presented to the viewer.
In United States Patent 7,545,571, a wearable display system is provided that includes a reflective image source with a polarizing beam splitter serving as a frontlight, so that the illumination light provided by an edge light source is deflected onto the polarized reflective image source. The polarizing beam splitter is an angled plane within a solid block, with a separate curved reflector associated with the edge light source. The curved reflector can be a total internal reflection block connected to the polarizing beam splitter. As a result, the frontlight disclosed in this patent, with its solid polarizing beam splitter block and total internal reflection block, is large and relatively heavy.
United States Patent 6,195,136 discloses a series of frontlight illumination methods for reflective image sources, including a method for making the frontlight more compact by using a curved beam splitter. However, the curved beam splitter is positioned rather far from the image source to reduce the angle of the light from the light source that is then reflected by the beam splitter onto the image source. Moreover, light is provided on only one side of the frontlight, so the beam splitter must be at least as large as the image source. As a result, when measured along the optical axis, the overall size of the frontlight remains relatively large compared to the illuminated area of the image source.
There remains a need for a frontlight for displays with reflective image sources that provides good image quality with little scattered light while remaining compact, efficient, and lightweight.
The present invention provides a compact, efficient, and lightweight frontlight in a display assembly that includes a partially reflective surface to redirect illumination light from a side light source onto a reflective image source, in which the size of the display assembly, as measured by the height above the diffuser area, is much smaller than the width of the illuminated reflective image source. In certain embodiments, the partially reflective surface can be curved to focus or concentrate the light from the light source onto the reflective image source. The light source can be polarized, and a polarizing beam splitter film can be used as the curved partially reflective surface, so that the illumination light is redirected while the reflected image light is transmitted through to the imaging optics. The polarizing beam splitter film is lightweight and has very low light scatter, so image quality is preserved as the image light passes through the frontlight on its way to the display optics.
In other embodiments of the invention, light sources are located on opposite sides of the frontlight, so that light is provided to two opposite edges of the reflective image source. In this case, the partially reflective surface is composed of two surfaces, one of which deflects the illumination light from one light source onto one half of the image source, while the other surface deflects light onto the other half of the image source. In this embodiment, the partially reflective surfaces can be curved or flat.
In another embodiment of the present invention, the partially reflective surface is a polarizing beam splitter and the light source is polarized, so that light from the light source is first redirected by the polarizing beam splitter and is then transmitted after being reflected by the reflective image source with a change in polarization.
In another embodiment, the light from the light source is unpolarized, so the polarizing beam splitter reflects one polarization state of the light to illuminate half of the reflective image source and transmits the other polarization state. The transmitted polarization state of the light passes to the opposite side of the frontlight, where the light is recycled. Recycling of the transmitted polarization state can be accomplished by passing the light through a quarter-wave film and reflecting it from a mirror, so that it passes back through the quarter-wave film and thereby changes polarization state. After the polarization state of the transmitted and reflected light has been changed, the light is redirected by the polarizing beam splitter to illuminate the other half of the reflective image source. In an alternative embodiment, the light from the two side lamps of the frontlight works in a complementary fashion, in which the transmitted polarization state of the light from the opposite side becomes unpolarized when it interacts with the diffuser on the opposite side, and is thereby recycled.
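A simple energy account shows what the recycling scheme described above buys. An unpolarized source splits 50/50 at the polarizing beam splitter; without recycling, the transmitted half is lost, while with recycling most of it is converted and redirected. The numbers below are illustrative assumptions, not figures from the source:

```python
def recycled_illumination(total=1.0, recycle_efficiency=0.9):
    """Track light fractions in the polarization-recycling frontlight: half of
    the unpolarized input is reflected directly onto one half of the image
    source; the transmitted half is returned via a quarter-wave film and
    mirror (with some loss) in the rotated polarization, so the beam splitter
    redirects it onto the other half of the image source."""
    direct = 0.5 * total                          # reflected polarization state
    recycled = 0.5 * total * recycle_efficiency   # transmitted, then converted
    return direct, recycled

direct, recycled = recycled_illumination()
print(direct, recycled)   # 0.5 0.45
# Without recycling, the transmitted 0.5 would simply be discarded.
```

Even with an assumed 90% round-trip efficiency, total usable illumination rises from 50% to 95% of the source output, which is the motivation for the quarter-wave film and mirror.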
In a further embodiment of the present invention, methods are provided for fabricating a frontlight with a flexible partially reflective film. The flexible film can be supported at its edges and left unsupported over the reflective image source, or the flexible film can be sandwiched between two or more transparent solid members. The solid members can be shaped before being placed in contact with the flexible film. The solid members can hold the flexible film in a flat geometry or a curved geometry. In another embodiment, the flexible film can be supported at its edges and the solid member then cast in place, so that the flexible film is embedded in a transparent solid material.
In one embodiment, a system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly; the processor is adapted to modify the content, wherein the modification is made in response to a sensor input. The content may be a video image. The modification may be at least one of: adjusting the brightness, adjusting the color saturation, adjusting the color balance, adjusting the hue, adjusting the video resolution, adjusting the transparency, adjusting the compression ratio, adjusting the frames per second, isolating a portion of the video, stopping playback of the video, pausing the video, or restarting the video. The sensor input may be derived from at least one of: a charge-coupled device, a black silicon sensor, an IR sensor, an acoustic sensor, an induction sensor, a motion sensor, an optical sensor, an opacity sensor, a proximity sensor, an inductance sensor, an eddy-current sensor, a passive infrared proximity sensor, a radar, a capacitance sensor, a capacitive displacement sensor, a Hall-effect sensor, a magnetic sensor, a GPS sensor, a thermal imaging sensor, a thermocouple, a thermistor, a photoelectric sensor, an ultrasonic sensor, an infrared laser sensor, an inertial motion sensor, a MEMS internal motion sensor, an ultrasonic 3D motion sensor, an accelerometer, an inclinometer, a force sensor, a piezoelectric sensor, a rotary encoder, a linear encoder, a chemical sensor, an ozone sensor, a smoke sensor, a heat sensor, a magnetometer, a carbon dioxide detector, a carbon monoxide detector, an oxygen sensor, a glucose sensor, a smoke detector, a metal detector, a rain sensor, an altimeter, GPS, detection of whether the user is outdoors, detection of context, detection of activity, an object detector (e.g., for a billboard), a sign detector (e.g., a geographic location marker for advertising), a laser rangefinder, sonar, capacitance, photoresponse, a heart rate sensor, or an RF/micropower impulse radio (MIR) sensor. Playback of the content may be stopped in response to an indication, from an accelerometer input, that the user's head is moving. An audio sensor input may be generated by the speech of at least one participant in a video conference. A video sensor input may be a video image of at least one participant in the video conference or a video image of a visual presentation. The modification may be making the video image at least one of more or less transparent in response to an indication from a sensor that the user is moving.
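One of the modifications listed above — making the video more transparent when a sensor indicates the user is moving — can be sketched as a simple rule. The threshold and opacity values here are illustrative assumptions, not values from the source:

```python
def modify_video(frame_opacity, accel_magnitude_g, moving_threshold_g=1.2):
    """Sensor-responsive content modification: when an accelerometer input
    indicates the user's head is moving (magnitude above a threshold, in g),
    make the displayed video mostly see-through; when the user is still,
    restore full opacity."""
    if accel_magnitude_g > moving_threshold_g:
        return min(frame_opacity, 0.3)   # mostly transparent while moving
    return 1.0                           # fully opaque when still

print(modify_video(1.0, 1.5))   # 0.3  (user moving: video fades out)
print(modify_video(1.0, 1.0))   # 1.0  (user still: video fully visible)
```

Any of the other listed modifications (pausing playback, reducing frame rate, and so on) would slot into the same sensor-to-modification mapping.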
In one embodiment, a system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly, the processor being adapted to modify the content, wherein the modification is made in response to a sensor input; and the system further includes an integrated video image capture facility that records an aspect of the surrounding environment and provides the content for display.
These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the embodiments and the accompanying drawings.
All documents mentioned above are hereby incorporated by reference in their entirety. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context.
Brief description of the drawings
The present invention and the following detailed description of certain embodiments thereof may be understood by reference to the following figures:
Figure 1 depicts an illustrative embodiment of the optical arrangement.
Figure 2 depicts an RGB LED projector.
Figure 3 depicts the projector in use.
Figure 4 depicts an embodiment of the waveguide and correction lens disposed in a frame.
Figure 5 depicts a design of the waveguide eyepiece.
Figure 6 depicts an embodiment of the eyepiece with see-through lenses.
Figure 7 depicts an embodiment of the eyepiece with see-through lenses.
Figures 8A-C depict embodiments of the eyepiece arranged in a flip-up/flip-down configuration.
Figures 8D-E depict snap-fit elements of the secondary optics.
Figure 8F depicts an embodiment of a flip-up/flip-down electro-optics module.
Figure 9 depicts an electrochromic layer of the eyepiece.
Figure 10 depicts the advantages of the eyepiece in real-time image enhancement, keystone correction, and virtual perspective correction.
Figure 11 depicts a plot of responsivity versus wavelength for three substrates.
Figure 12 illustrates the performance of the black silicon sensor.
Figure 13A depicts a known night-vision system, Figure 13B depicts the night-vision system of the present invention, and Figure 13C illustrates the difference in responsivity between the two.
Figure 14 depicts a tactile interface of the eyepiece.
Figure 14A depicts motions in an embodiment of the eyepiece with nod control.
Figure 15 depicts a ring for controlling the eyepiece.
Figure 15AA depicts a ring for controlling the eyepiece with an integrated camera, which in one embodiment allows the user to provide a video image of themselves as part of a video conference.
Figure 15 A depicts the handset type sensor in the embodiment of virtual mouse.
Figure 15 B depicts the facial actuation sensor device being mounted on eyepiece.
Figure 15 C depicts the finger point control of eyepiece.
Figure 15 D depicts the finger point control of eyepiece.
Figure 15 E depicts the example of eye tracking control.
Figure 15 F depicts the hand location control of eyepiece.
Figure 16 depicts the location-based application model of eyepiece.
Figure 17 shows A) the flexible platform of an uncooled CMOS image sensor capable of VIS/NIR/SWIR imaging and B) the difference in image quality between it and an image-intensified night vision system.
Figure 18 depicts the customization billboard for enabling augmented reality.
Figure 19 depicts the customization advertisement for enabling augmented reality.
Figure 20 depicts the customization illustration for enabling augmented reality.
Figure 20A depicts a method for delivering a message to be sent when a viewer reaches a certain location.
Figure 21 depicts the replacement arrangement of eyepiece optics and electronic device.
Figure 22 depicts the replacement arrangement of eyepiece optics and electronic device.
Figure 22 A depicts eyepiece with the example that eyes shine.
Figure 22 B depicts the cross section with the eyepiece for reducing the luminous light control element of eyes.
Figure 23 depicts the replacement arrangement of eyepiece optics and electronic device.
Figure 24 depicts the locked position of a virtual keyboard.
Figure 24 A depicts the embodiment of the virtual projection image on anthropolith.
Figure 25 depicts the detailed view of projector.
Figure 26 depicts the detailed view of RGB LED module.
Figure 27 depicts gaming network.
Figure 28 depicts the method for using augmented reality glasses to carry out game.
Figure 29 depicts the exemplary electronic circuit figure for augmented reality eyepiece.
Figure 29 A depicts the control circuit of the eye tracking control for external equipment.
Figure 29 B depicts the communication network between the user of augmented reality eyepiece.
Figure 30 depicts partial image removal performed by the eyepiece.
Figure 31 depicts a flowchart of a method for identifying a person based on the person's speech as captured by a microphone of an augmented reality device.
Figure 32 depicts a typical camera for use in video calls or meetings.
Figure 33 shows the embodiment of the block diagram of video call camera.
Figure 34 depicts the embodiment of optics or digital stable eyepiece.
Figure 35 depicts the embodiment of classical Cassegrain configuration.
Figure 36 depicts a configuration of a micro Cassegrain telescoping folded optic camera.
Figure 37 depicts a swipe process of a virtual keyboard.
Figure 38 depicts a target marking process of a virtual keyboard.
Figure 38A depicts an embodiment of a visual word translator.
Figure 39 illustrates the glasses for biometric data capture according to an embodiment.
Figure 40 illustrates the iris recognition using biometric data capture glasses according to an embodiment.
Figure 41 depicts the face and iris recognition according to an embodiment.
Figure 42 illustrates the use of double omni-directional microphones according to an embodiment.
Figure 43 depicts directionality improvements achieved using multiple microphones.
Figure 44 shows the use that audio capture mechanism is controlled according to the adaptive array of an embodiment.
Figure 45 shows the mosaic finger and palm register system according to an example embodiment.
Figure 46 illustrates the traditional optical approach used by other fingerprint and palm print systems.
Figure 47 shows the method according to used in the mosaic sensor of an example embodiment.
Figure 48 shows the device layout of the mosaic sensor according to an example embodiment.
Figure 49 illustrates camera fields of view used in mosaic sensor according to another embodiment and multiple cameras.
Figure 50 shows a bio-phone and tactical computer according to an embodiment.
Figure 51 shows the use of the bio-phone and tactical computer of an embodiment in capturing latent fingerprints and palm prints.
Figure 52 shows a typical DOMEX collection.
Figure 53 shows the relationship between biometric images captured using the bio-phone and tactical computer and a biometric watch list, according to an embodiment.
Figure 54 shows a pocket bio-kit according to an embodiment.
Figure 55 shows the components of the pocket bio-kit according to an embodiment.
Figure 56 depicts a fingerprint, palm print, geolocation, and POI enrollment device according to an embodiment.
Figure 57 shows a system for multi-modal biometric collection, identification, geolocation, and POI enrollment according to an embodiment.
Figure 58 illustrates a forearm-wearable device for fingerprint, palm print, geolocation, and POI enrollment according to an embodiment.
Figure 59 shows a mobile folding biometric enrollment kit according to an embodiment.
Figure 60 is a high level system diagram of a biometric enrollment kit according to an embodiment.
Figure 61 is a system diagram of a folding biometric enrollment device according to an embodiment.
Figure 62 shows a thin-film fingerprint and palm print sensor according to an example embodiment.
Figure 63 shows a biometric collection device for collecting fingerprint, palm print, and enrollment data according to an example embodiment.
Figure 64 illustrates two-stage palm print capture according to an embodiment.
Figure 65 illustrates fingertip tap capture according to an embodiment.
Figure 66 illustrates slap and roll print capture according to an embodiment.
Figure 67 depicts a system for collecting contactless fingerprints, palm prints, or other biometric prints.
Figure 68 depicts a process for collecting contactless fingerprints, palm prints, or other biometric prints.
Figure 69 depicts an embodiment of a watch controller.
Figures 70A-D depict example embodiments of the eyepiece, including an embodiment with an integrated display and charging capability.
Figure 71 depicts the embodiment of grounding rod data system.
Figure 72 depicts the block diagram of the control mapped system including eyepiece.
Figure 73 depicts bioassay flashlight.
Figure 74 depicts a helmet-mounted version of the eyepiece.
Figure 75 depicts the embodiment of situational awareness glasses.
Figure 76 A depicts 360 ° of imagers of assembling, and Figure 76 B depicts the cross section view of 360 ° of imagers.
Figure 77 depicts an exploded view of a multi-view camera.
Figure 78 depicts flight eye.
Figure 79 depicts the decomposition plan view of eyepiece.
Figure 80 depicts the electrooptics assembly of decomposition.
Figure 81 depicts the decomposition view of the axis of electrooptics assembly.
Figure 82 depicts an embodiment of the optical presentation system using the planar illumination tool with reflective display.
Figure 83 depicts the constructive embodiment of planar illumination optical system.
Figure 84 depicts an embodiment assembly of a planar illumination tool and a reflective display with a laser speckle suppression component.
Figure 85 depicts an embodiment of a planar illumination tool with groove features for redirecting light.
Figure 86 depicts an embodiment of a planar illumination tool with paired groove features and 'anti-groove' features to reduce image aberrations.
Figure 87 depicts the embodiment of the planar illumination tool manufactured from laminar structure.
Figure 88 depicts the embodiment of the planar illumination tool with the wedge-shaped optical component for redirecting light.
Figure 89 depicts the block diagram of irradiation module according to an embodiment of the invention.
Figure 90 depicts the block diagram of optical frequency converter according to an embodiment of the invention.
Figure 91 depicts the block diagram of laser irradiation module according to an embodiment of the invention.
Figure 92 depicts the block diagram of LASER Illuminator System according to another embodiment of the present invention.
Figure 93 depicts the block diagram of imaging system according to an embodiment of the invention.
Figures 94A and B depict, in top view and side view respectively, a lens with a photochromic element and a heating element.
Figure 95 depicts the embodiment of LCoS headlamp designs.
Figure 96 depicts the optical bonding prism with polarizer.
Figure 97 depicts the optical bonding prism with polarizer.
Figure 98 depicts multiple embodiments of LCoS headlamp designs.
Figure 99 depicts a wedge plus OBS superimposed on the LCoS.
Figure 100 depicts two versions of the wedge.
Figure 101 depicts the bending PBS film on LCoS chip.
Figure 102 A depicts one embodiment of optics assembly.
Figure 102 B depicts one embodiment of the optics assembly with embedded camera.
Figure 103 depicts one embodiment of image source.
Figure 104 depicts one embodiment of image source.
Figure 105 depicts all embodiments of image source.
Figure 106 depicts a top-level block diagram of a software application store and marketplace in connection with function and control of the eyepiece in an embodiment of the invention.
Figure 107 depicts the functional block diagram of the eyepiece Application development environ-ment in one embodiment of the invention.
Figure 108 is depicted in one embodiment of the invention and is developed stack about the platform element of the software application of eyepiece.
Figure 109 is the diagram of the head-mounted display according to an embodiment of the invention with see-through capabilities.
Figure 110 is an illustration of a view of an unmarked scene as viewed through the head-mounted display depicted in Figure 109.
Figure 111 is the diagram of the view of the scene of Figure 110 with 2D superposition label.
Figure 112 is as shown in the 3D of the Figure 111 shown to the left eye of viewer label.
Figure 113 is as shown in the 3D of the Figure 111 shown to the right eye of viewer label.
Figure 114 is an illustration of the left and right 3D labels of Figure 111 superimposed on one another to show their differences.
Figure 115 is the diagram of the view of the scene of Figure 110 with 3D label.
Figure 116 is an illustration of a captured stereo image of the scene of Figure 110.
Figure 117 is an illustration of the left and right stereo images of Figure 116 superimposed to show the differences between the images.
Figure 118 is the diagram for showing the scene of Figure 110 of 3D label of superposition.
Figure 119 is the flow chart of the Depth cue embodiment of the method for providing 3D label of the invention.
Figure 120 is the flow chart of another Depth cue embodiment of the method for providing 3D label of the invention.
Figure 121 is the flow chart of the another Depth cue embodiment of the method for providing 3D label of the invention.
Figure 122 is the flow chart of another Depth cue embodiment of the method for providing 3D label of the invention.
Figure 123A depicts a processor for providing display sequence frames for image display through a display component.
Figure 123B depicts a display interface configured to eliminate the display driver.
Figure 124 is a schematic of a prior art waveguide with multiple partial reflectors;
Figure 125 is a schematic of a waveguide with multiple electrically switchable mirrors in a first position;
Figure 125A is an illustration of a waveguide assembly with electrical connections;
Figure 126 is a schematic of a waveguide with multiple electrically switchable mirrors in a second position;
Figure 127 is a schematic of a waveguide with multiple electrically switchable mirrors in a third position;
Figure 128 is a schematic of a waveguide with multiple mechanically switchable mirrors in a first position;
Figure 128A is a schematic of a waveguide assembly with micro-actuators and associated hardware;
Figure 129 is a schematic of a waveguide with multiple mechanically switchable mirrors in a second position;
Figure 130 is a schematic of a waveguide with multiple mechanically switchable mirrors in a third position;
Figures 131A and 131B are illustrations of waveguide displays with switchable mirrors on a user's face; and
Figures 132A-132C are illustrations of the display areas provided for users with different eye spacings.
Figure 133 is a schematic of a reflective image source with an edge light and a headlight, illustrating the passage of light;
Figure 134 is a schematic of a prior art headlight that includes a groove;
Figure 135 is a schematic of a prior art headlight of solid construction that includes a planar polarizing beam splitter and a curved reflector;
Figure 136 is the schematic diagram of the one embodiment of the invention for having single edge light and being bent wire-grid polarizer film;
Figure 137 is that there are two the schematic diagrames of edge light and one embodiment of the invention of bending wire-grid polarizer film for tool;
Figure 138 is a schematic of a side frame that holds the flexible wire-grid polarizer film in the required curved shape;
Figure 139 is the flow chart of method of the invention.
Figure 140 is the schematic diagram with the nearly eye imaging system of beam splitter;
Figure 141 is the schematic diagram for the optical module of nearly eye imaging system;
Figure 142 is the diagram of film pattern optical sheet;
Figure 143 is the diagram with the insertion molding modules shell of built-in optical piece;
Figure 144 is the diagram for being laminated the compression molding of pattern optical sheet;
Figure 145 A-C is the diagram for applying optical film in molding modules shell.
Figure 146 depicts the schematic front perspective view of the AR eyepiece (without its temple) according to one embodiment of the disclosure.
Figure 147 depicts a schematic rear perspective view of the AR eyepiece of Figure 146.
Figure 148 depicts a schematic partial rear perspective view of the wearer's right side of the AR eyepiece of Figure 146.
Figure 149 depicts a schematic partial rear perspective view of the wearer's right side of the AR eyepiece of Figure 146.
Figure 150 depicts a schematic perspective view of components of the AR eyepiece shown in Figure 146 that support one of its projection screens.
Figure 151 depicts the perspective illustration of the adjustment platform of AR eyepiece shown in Figure 146.
Figure 152 depicts the perspective illustration of the component of the transverse adjusting mechanism of AR eyepiece shown in Figure 146.
Figure 153 depicts the perspective illustration of the component of the tilt adjusting mechanism of AR eyepiece shown in Figure 146.
Figure 154 is the chart for showing the dark adaptation curve of human eye.
Figure 155 is a chart showing the effect of gradually decreasing illumination on the dark adaptation curve of the human eye.
Figure 156 is an illustration of a head-mounted display with see-through capability.
Figure 157 is a chart showing the relationship between display brightness and time when entering a dark environment.
Figure 158 is a flowchart of a dark adaptation method.
Figure 159 depicts a virtual keyboard presented in the user's field of view.
Figure 160 depicts the example of the display system with optically flat reflecting surface.
Figure 161 shows the diagram of nearly eye display module.
Figure 162 shows the diagram of optical device associated with the type of head-mounted display.
Figure 163 shows an illustration in which a baffle is added inside the housing between the illumination beam splitter and the lens.
Figure 164 shows an illustration of another embodiment of the invention in which a baffle is added at the entrance surface of the lens.
Figure 165 shows an illustration of another embodiment of the invention in which a baffle is added at the output of the lens.
Figure 166 shows an illustration of another embodiment of the invention in which a baffle is attached to the housing between the lens and the imaging beam splitter.
Figure 167 shows an illustration of another embodiment of the invention in which an absorptive coating is applied to the sidewalls of the housing.
Figure 168 shows an illustration of another source of stray light in a head-mounted display, in which stray light enters directly from the edge of the light source.
Figure 169 depicts stray light reflected from any reflective surface on the housing or the edges of the lens.
Figure 170 shows an illustration of a further embodiment of the invention in which a baffle is positioned adjacent to the light source.
Figure 171 depicts an absorptive coating in which ridges can be used, wherein a series of small ridges or steps act as a series of baffles over the entire sidewall area of the housing to block or trim edge rays.
Figure 172 shows another embodiment of a belt or strip that includes carriers and ridges which can be used to block reflected light.
Figure 173 depicts the decomposition view of an embodiment of glasses.
Figure 174 depicts the wiring design and wire guide of glasses.
Figure 175 depicts the amplified version of the wiring design and wire guide of glasses.
Figure 176 A shows the cross section view of the wiring design and wire guide of glasses.
Figure 176 B shows the cross section view of the wiring design and wire guide of glasses.
Figure 176 C shows the full release of the wiring design and wire guide of glasses.
Figure 177 depicts the U-shaped attachment for fixing glasses.
Figure 178 depicts the embodiment of the cable tension system on the head for glasses to be fixed to user.
Figure 179A and Figure 179B depict embodiments of a cable tensioning system, in a bent configuration, for securing the glasses to a user's head.
Figure 180 depicts the embodiment of the cable tension system on the head for glasses to be fixed to user.
Figure 181 depicts the embodiment of the system on the head for glasses to be fixed to user.
Figure 182 depicts the embodiment of the system on the head for glasses to be fixed to user.
Figure 183 depicts the embodiment of the system on the head for glasses to be fixed to user.
Figure 184 depicts the embodiment of the system on the head for glasses to be fixed to user.
Figure 185 A depicts one embodiment of optical element string.
Figure 185 B depicts the sample ray-traces of light in one embodiment of optical element string.
Figure 186 depicts an embodiment of an LCoS plus ASIC package.
Figure 187 is a schematic illustration of a prior art headlight using a single light source and a beam splitter block;
Figure 188 is a schematic illustration of a prior art headlight using a single light source and a reflective beam splitter layer;
Figure 189 is a schematic illustration of a headlight using a single light source, wherein a planar reflective beam splitter layer is positioned at a reduced angle;
Figure 190 is a schematic illustration of a headlight using a single light source, wherein the reflective beam splitter layer is curved;
Figure 191 is a schematic illustration of a headlight using dual light sources, wherein a folded reflective beam splitter film with a flat surface is immersed in a transparent solid;
Figure 192 is a schematic illustration of a headlight using dual light sources, using an unsupported folded reflective beam splitter film with a flat surface;
Figure 193 is a schematic illustration of a headlight using dual light sources, using an unsupported folded reflective beam splitter film with a curved surface;
Figure 194 is a schematic illustration of a headlight using dual light sources, wherein a folded reflective beam splitter film with a curved surface is immersed in a transparent solid;
Figure 195 is a schematic illustration of a headlight using a single light source, the headlight having an opposing mirror and a quarter-wave film to recycle a portion of the polarized light, wherein a folded reflective beam splitter film with a flat surface is immersed in a transparent solid;
Figure 196 is a schematic illustration of a headlight using a single light source, the headlight having an opposing mirror and a quarter-wave film to recycle a portion of the polarized light, provided with an unsupported folded reflective polarizing beam splitter film with a flat surface;
Figure 197 is a schematic illustration of a headlight using a single light source, the headlight having an opposing mirror and a quarter-wave film to recycle a portion of the polarized light, provided with an unsupported folded reflective polarizing beam splitter film with a curved surface;
Figure 198 is a schematic illustration of a method of manufacturing a headlight such as that shown in Figure 197 but with the folded reflective beam splitter film with a flat surface immersed in a transparent solid, wherein top and bottom film holders are used to shape and position the reflective beam splitter film, and portions of the polarized light are recycled;
Figure 199 is a schematic illustration of a headlight manufactured using the method shown in Figure 198, used with dual light sources and with partial recycling of the polarized light;
Figure 200 is a schematic illustration of a folded unsupported reflective beam splitter film, the film being supported at its edges in the first step of a method of casting a solid headlight;
Figure 201 is a schematic illustration showing side-by-side vent holes for injecting transparent casting material in the method of casting a solid headlight;
Figure 202 is a schematic illustration showing the cast top of a cast solid headlight;
Figure 203 is a schematic illustration showing the use of a flat transparent sheet to flatten the top of a cast solid headlight;
Figure 204 is a flowchart of a method of manufacturing a solid headlight by assembly;
Figure 205 is a flowchart of a method of manufacturing a solid headlight by casting; and
Figure 206 is a flowchart of a method of manufacturing a solid film holder using a multi-step molding process.
Figure 207 depicts an embodiment of a near field communication watch.
Figure 208 depicts an embodiment of a near field communication watch interfacing with an NFC-enabled point of service device.
Figure 209 depicts an embodiment of a near field communication watch docked with an NFC-enabled point of service device and a user's smartphone.
Detailed description
The present invention relates to eyepiece electro-optics. The eyepiece may include projection optics suitable for projecting an image onto a see-through or translucent lens, thereby allowing the wearer of the eyepiece to view the surrounding environment as well as the displayed image. The projection optics, also referred to as a projector, may include an RGB LED module that uses field sequential color. With field sequential color, a single full color image may be broken down into color fields based on the primary colors red, green, and blue, and imaged individually by an LCoS (liquid crystal on silicon) optical display 210. As each color field is imaged by the optical display 210, the corresponding LED color is turned on. When these color fields are displayed in rapid sequence, a full color image may be seen. With field sequential color illumination, the resulting projected image in the eyepiece can be adjusted for any chromatic aberrations, such as by shifting the red image relative to the blue and/or green image. The image may thereafter be reflected into a two-surface freeform waveguide, where the image light engages in total internal reflection (TIR) until reaching the active viewing area of the lens, where the user sees the image. A processor, which may include memory and an operating system, may control the LED light source and the optical display. The projector may also include, or be optically coupled to, a display coupling lens, a condenser lens, a polarizing beam splitter, and a field lens.
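As an illustration of the field-sequential decomposition and the chromatic-aberration shift described above, a minimal sketch follows (in Python with NumPy, which the disclosure itself does not use; the function and parameter names are invented for illustration, and a whole-pixel roll stands in for the sub-pixel warp a real pipeline would use):

```python
import numpy as np

def to_color_fields(frame, shifts=None):
    """Decompose an RGB frame into sequential full-frame color fields.

    frame:  H x W x 3 uint8 array (R, G, B planes).
    shifts: optional dict mapping channel index -> (dy, dx) pixel shift,
            e.g. offsetting the red field relative to green/blue as a
            crude lateral chromatic-aberration adjustment.
    Returns a list of three H x W fields shown in R, G, B order.
    """
    shifts = shifts or {}
    fields = []
    for ch in range(3):  # 0=R, 1=G, 2=B, displayed in rapid sequence
        plane = frame[:, :, ch]
        dy, dx = shifts.get(ch, (0, 0))
        # np.roll is a stand-in for a proper sub-pixel warp
        fields.append(np.roll(np.roll(plane, dy, axis=0), dx, axis=1))
    return fields
```

Each returned field would be imaged on the LCoS panel while only the corresponding LED is lit.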
With reference to Figures 123A and 123B, a processor 12302 (e.g., a digital signal processor) may provide display sequence frames 12324 for image display by a display component 12328 of the eyepiece 100 (e.g., an LCoS display component). In embodiments, the sequence frames 12324 may be generated with or without a display driver 12312 as an intermediate component between the processor 12302 and the display component 12328. For example, and referring to Figure 123A, the processor 12302 may include a frame buffer 12304 and a display interface 12308 (e.g., a Mobile Industry Processor Interface (MIPI) with a display serial interface (DSI)). The display interface 12308 may provide per-pixel RGB data 12310 to the display driver 12312 as an intermediate component between the processor 12302 and the display component 12328, where the display driver 12312 receives the per-pixel RGB data 12310 and generates separate full-frame display data for red 12318, separate full-frame display data for green 12320, and separate full-frame display data for blue 12322, thereby providing display sequence frames 12324 to the display component 12328. In addition, the display driver 12312 may provide timing signals to the display component 12328, such as to synchronize the delivery of the full frames 12318, 12320, 12322 as display sequence frames 12324. In another example, and referring to Figure 123B, a display interface 12330 may be configured to eliminate the display driver 12312 by providing full-frame display data for red 12334, full-frame display data for green 12338, and full-frame display data for blue 12340 directly to the display component 12328 as display sequence frames 12324. In addition, timing signals 12332 may be provided directly from the display interface 12330 to the display component. This configuration may provide significantly lower power consumption by removing the need for a display driver. This direct panel interface not only removes the need for a driver, but may also simplify the overall logic of the configuration and remove the redundant memory otherwise required to generate and convert pixel information from a per-pixel interface to a per-frame panel interface.
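The Figure 123B idea of converting per-pixel RGB data into the three full-frame color fields a field-sequential panel expects, each framed by a sync marker, can be sketched as follows. This is a hypothetical software model, not the actual MIPI/DSI hardware interface; all names are invented:

```python
from dataclasses import dataclass
from typing import Iterator, List, Tuple

@dataclass
class SequentialFrame:
    color: str    # which LED to enable while this field is shown
    field: bytes  # full-frame display data for that single color
    vsync: bool   # timing marker framing the field

def sequence_frames(rgb_pixels: List[Tuple[int, int, int]]) -> Iterator[SequentialFrame]:
    """Turn per-pixel (R, G, B) data into three sequential full-frame
    color fields, mirroring the driverless direct-to-panel path of
    Figure 123B in which the display interface itself emits the
    display sequence frames and timing signals."""
    for idx, color in enumerate(("red", "green", "blue")):
        yield SequentialFrame(
            color=color,
            field=bytes(px[idx] for px in rgb_pixels),
            vsync=True,
        )
```

A usage example: for two pixels, `list(sequence_frames([(255, 0, 0), (0, 128, 0)]))` yields a red field `bytes([255, 0])`, a green field `bytes([0, 128])`, and an all-zero blue field, each with its sync marker.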
With reference to Figure 186, in embodiments, to improve the yield of the LCoS+ASIC package 18600, the ASIC may be mounted on a flexible printed circuit (FPC) 18604 with a stiffener on top. If the stiffener on top is as tall as the ASIC, it will not add thickness to the overall package. The FPC may be connected to a standard LCoS package (e.g., a glass fiber reinforced epoxy laminate (FR4) 18608) via a connector 18602, such as a zero insertion force (ZIF) connector, or a board-to-board connector for higher pin counts. A pressure-sensitive adhesive may be used to adhere the ASIC, stiffener, and LCoS to the FPC.
With reference to Fig. 1, an illustrative embodiment of the augmented reality eyepiece 100 may be depicted. It will be understood that embodiments of the eyepiece 100 may not include all of the elements depicted in Fig. 1, and other embodiments may include additional or different elements. In embodiments, the optical elements may be embedded in the temple portions 122 of the eyepiece frame 102. Images may be projected with a projector 108 onto at least one lens 104 disposed in an opening of the frame 102. One or more projectors 108, such as a nanoprojector, picoprojector, microprojector, femtoprojector, laser-based projector, or holographic projector, may be disposed in a temple portion of the eyepiece frame 102. In embodiments, both lenses 104 are see-through or translucent, while in other embodiments only one lens 104 is translucent and the other is opaque or missing. In embodiments, more than one projector 108 may be included in the eyepiece 100.
In embodiments such as the one depicted in Fig. 1, the eyepiece 100 may also include at least one articulating earbud 120, a radio transceiver 118, and a heat sink 114 for absorbing heat from the LED light engine, keeping it cool and allowing it to operate at full brightness. There are also one or more TI OMAP4 (open multimedia applications processors) 112 and a flex cable with RF antenna 110, all of which are described in further detail herein.
In an embodiment, and referring to Fig. 2, the projector 200 may be an RGB projector. The projector 200 may include a housing 202, a heat sink 204, and an RGB LED engine or module 206. The RGB LED engine 206 may include LEDs, dichroics, concentrators, and the like. A digital signal processor (DSP) (not shown) may convert the images or video stream into control signals, such as voltage drops/current modifications, pulse width modulated (PWM) signals, and the like, to control the intensity, duration, and mixing of the LED light. For example, the DSP may control the duty cycle of each PWM signal to control the average current flowing through each LED generating a plurality of colors. A still image co-processor of the eyepiece may employ noise filtering, image/video stabilization, and face detection, and may be able to perform image enhancement. An audio back-end processor of the eyepiece may employ buffering, SRC, equalization, and the like.
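The duty-cycle control the DSP applies can be illustrated with a small sketch: the average LED current, and hence the perceived brightness of a color channel, scales linearly with the PWM duty cycle. The function name and the 20 mA peak current used below are invented example values, not figures from this disclosure:

```python
def avg_led_current(peak_current_ma: float, duty_cycle: float) -> float:
    """Average current through an LED driven by a PWM signal.

    A field-sequential driver can dim each color channel by lowering
    that channel's duty cycle, since average current (and so perceived
    brightness) scales linearly with it. Illustrative only; real drive
    currents and dimming curves are hardware-specific.
    """
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be within [0, 1]")
    return peak_current_ma * duty_cycle
```

For example, a 20 mA peak drive at 50% duty cycle yields a 10 mA average, halving that channel's contribution to the mixed color.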
The projector 200 may include an optical display 210, such as an LCoS display, and a number of components as shown. In embodiments, the projector 200 may be designed with a single panel LCoS display 210; however, a three panel display may be possible as well. In the single panel embodiment, the display 210 is illuminated sequentially with red, blue, and green (i.e., field sequential color). In other embodiments, the projector 200 may make use of alternative optical display technologies, such as a back-lit liquid crystal display (LCD), a front-lit LCD, a transflective LCD, an organic light emitting diode (OLED), a field emission display (FED), a ferroelectric LCoS (FLCoS), liquid crystal mounted on sapphire, a transparent liquid crystal micro-display, a quantum dot display, and the like.
In embodiments, the display may be a 3D display, an LCD, a thin film transistor LCD, an LED, liquid crystal on silicon, a ferroelectric liquid crystal on silicon display, a CMOS display, an OLED, a QLED, an OLED array with CMOS-style sensor elements at the crosspoints between OLED pixels, a transmissive LCoS display, a CRT display, a VGA display, an SXGA display, a QVGA display, a display with a video-based gaze tracker, a display with exit pupil expanding technology, an Asahi film display, a freeform optics display, an XY polynomial combiner display, a light guide transfer display, an AMOLED display, and the like. In embodiments, the display may be a holographic display enabling the eyepiece to display an image from an image source as a hologram. In embodiments, the display may be a liquid crystal reflective micro-display. Such a display may include polarizing optics and may have improved brightness compared with certain OLED micro-displays. In embodiments, the display may be a free-form prism display. A free-form prism display may enable 3D stereoscopic imaging capability. In embodiments, the display may be similar or identical to those described in U.S. Patents 6,384,983 and 6,181,475 of Canon and Olympus, respectively. In other embodiments, the display may include a video-based gaze tracker. In embodiments, the beam of an infrared light source may be split and expanded in an exit pupil expander (EPE) to produce collimated beams from the EPE toward the eye. A miniature video camera may image the cornea, and the eye gaze direction can be calculated by locating the pupil and the glints of the infrared beams. After user calibration, the data from the gaze tracker can reflect the user's point of focus in the displayed image, which can be used as an input device. Such devices may be similar to those offered by the Nokia Research Center in Tampere, Finland. In addition, in embodiments, the display may include an exit pupil expander, which enlarges the exit pupil and transfers the image to a new position. Thus, only a thin transparent sheet may need to be placed in front of the user's eyes, and the image source can be placed elsewhere. In further embodiments, the display may be an off-axis optical display. In embodiments, such a display may not be centered on the aperture of the system. This avoids blocking of the main aperture by auxiliary optical elements, toolkits and/or sensors, and can provide for the use of toolkits and/or sensors at the focal point. For example, an active matrix organic light emitting diode (AMOLED) display may use the pixel design known as PenTile from Nouvoyance, which passes more light in several ways. First, the red, blue, and green sub-pixels are bigger than the sub-pixels in a traditional display. Second, one out of every four sub-pixels is clear. This means the backlight can use less power and shine brighter. Fewer sub-pixels would normally mean lower resolution, but PenTile displays use the individual sub-pixels to trick the eye into perceiving the same resolution while using about one third of the sub-pixels of an RGB stripe panel. PenTile displays also use image processing algorithms to determine the brightness of a scene, automatically dimming the backlight for darker images.
To overcome the limitations of the prior art described above, the present invention provides an integral array of switchable mirrors in a waveguide; these mirrors can be used sequentially to provide a progressive scan of portions of the image over the display field of view. By rapidly switching the mirrors from reflective to transmissive in a sequential manner, the image can be presented to the user without perceptible flicker. Because each switchable mirror spends more time in the transmissive state than in the reflective state, the array of switchable mirrors appears transparent to the user while also presenting the displayed image to the user.
The presentation by a waveguide of light from an image source is well known to those skilled in the art and will therefore not be discussed here. Exemplary references on waveguides and the transmission of light from an image source to a display area are provided in U.S. Patents 5,076,664 and 6,829,095. The present invention includes methods and apparatus for redirecting image light within a waveguide to provide an image to a user, wherein the image light in the waveguide is provided from an image source.
Figure 125 shows a waveguide display device 12500 having an integral array of switchable mirrors 12508a-12508c that redirect the light transmitted through the waveguide 12510 from the image source 12502 to provide image light 12504 to the user. Three switchable mirrors 12508a-12508c are shown, but in the present invention the array may include a different number of switchable mirrors. The switchable mirrors shown in Figure 125 are electrically switchable mirrors comprising liquid crystal switchable mirrors. Cover sheets 12512 are provided to contain the liquid crystal material in the thin layers shown as the switchable mirrors 12508a-12508c. Power supply lines 12514 and 12518 are also shown in Figure 125.
The waveguide 12510 and the integral array of switchable mirrors 12508a-12508c can be made of plastic or glass material, provided the material is suitably flat. Uniformity of thickness is not as important as in most liquid crystal devices, because the switchable mirrors have high reflectivity. The construction of switchable liquid crystal mirrors is described in U.S. Patent 6,999,649.
Figures 126 and 127 show aspects of the sequence in the present invention, in which only one switchable mirror in the array is in the reflective state at a given moment, while the other switchable mirrors in the array are in the transmissive state. Figure 125 shows the first switchable mirror 12508a in the reflective state, thereby redirecting the light from the image source 12502 to become image light 12504 for one portion of the image presented to the user. The other switchable mirrors 12508b and 12508c are in the transmissive state. Waveguide 12410 is also shown in Figure 124.
In Figure 126, switchable mirrors 12508a and 12508c are in the transmissive state, while switchable mirror 12508b is in the reflective state. This condition provides the user with image light 12600 and its associated portion of the image. Finally, in Figure 127, switchable mirrors 12508a and 12508b are in the transmissive state, and switchable mirror 12508c is in the reflective state. This last condition provides the user with image light 12700 and its associated portion of the image. After this last condition, the sequence shown in Figures 125, 126, and 127 repeats, thereby providing a progressive scan of the image. This sequence is repeated continuously while the user views the displayed image. Thus, all of the light from the image source 12502 is redirected at any given time by a single switchable mirror in turn. The image source can operate continuously while the switchable mirrors provide a progressive scan of the image light 12504 over the field of view. If the image light is perceived as brighter, or as having a different color balance, for different switchable mirrors, the image source can be adjusted to compensate: the image source brightness or color balance can be modulated in synchronization with the switching sequence of the array of switchable mirrors. In another embodiment of the invention, the order in which the switchable mirrors are switched can be altered to provide the user with an interlaced image, for example, for an array of four switchable mirrors, in the repeating order 1, 3, 2, 4.
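The progressive and interlaced firing orders described above can be sketched in a few lines. This is an illustrative sketch only, not part of the patent; the function name and the generalization of the 1, 3, 2, 4 order (odd-indexed mirrors first, then even-indexed) to arbitrary array sizes are assumptions.

```python
def mirror_sequence(n, interlaced=False):
    """Return the firing order of n switchable mirrors for one scan cycle.

    Progressive scan fires mirrors 1, 2, ..., n in turn.  The assumed
    interlaced generalization fires odd-positioned mirrors first, then
    even-positioned ones, reproducing the 1, 3, 2, 4 order the text
    gives for a four-mirror array.
    """
    order = list(range(1, n + 1))
    if interlaced:
        # Interleave: odd positions (1, 3, ...) then even positions (2, 4, ...)
        order = order[0::2] + order[1::2]
    return order
```

For example, `mirror_sequence(4, interlaced=True)` yields the repeating order 1, 3, 2, 4 quoted in the text, while `mirror_sequence(3)` gives the progressive order 1, 2, 3 of Figures 125-127.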
Figure 128 shows another embodiment of the present invention, in which an integral array of mechanically driven switchable mirrors is provided. In this case, the switchable mirrors in the waveguide display device 12800 comprise prisms 12804a-12804c, and these prisms are moved to alternately provide an air gap or optical contact at the respective surfaces 12810a-12810c. As shown in Figure 128, prism 12804a is moved downward to provide an air gap, so that surface 12810a acts as a reflective surface operating by total internal reflection. Meanwhile, prisms 12804b and 12804c are forced upward to provide optical contact at surfaces 12810b and 12810c respectively, so that surfaces 12810b and 12810c are transmissive. This condition redirects the light from the image source 12502 into image light 12802 for one portion of the image presented to the user. In this embodiment, the switchable mirrors move from optical contact with nearly 100% transmissivity to total internal reflection with nearly 100% reflectivity. Figure 128 also shows power supply lines 12812, which connect to a common ground base 12814 and to micro-actuators 12818a-c.
Figures 129 and 130 show the other conditions in the sequence for the mechanically driven switchable mirrors in the switchable mirror array. In Figure 129, prisms 12804a and 12804c are forced upward to provide optical contact with surfaces 12810a and 12810c respectively, thereby providing a transmissive state for the light from the image source 12502. Meanwhile, prism 12804b is moved downward to create an air gap at surface 12810b, so that the light from the image source 12502 is redirected into image light 12900 for the associated portion of the image presented to the user. In the final step of the sequence, shown in Figure 130, prisms 12804a and 12804b are forced upward to provide optical contact at surfaces 12810a and 12810b respectively, so that the light from the image source passes on to surface 12810c. Prism 12804c is moved downward to provide an air gap at surface 12810c, so that surface 12810c becomes a reflective surface operating by total internal reflection, and the light from the image source 12502 is redirected into image light 13000 and its associated portion of the image.
In the preceding discussion, the conditions for total internal reflection are based on the optical properties of the material of the waveguide 12808 and of air, as known to those skilled in the art. To obtain a 90-degree reflection, the refractive index of the waveguide 12808 as shown in Figures 128-130 must be greater than 1.42. To provide optical contact between the prisms 12804a-12804c and the surfaces 12810a-12810c respectively, the surfaces of the prisms 12804a-12804c must match the surfaces 12810a-12810c to within 1.0 micron. Finally, for the light from the image source 12502 to travel through the waveguide 12808 and the prisms 12804a-12804c without being deflected at the interfaces, the refractive index of the prisms 12804a-12804c must match the refractive index of the waveguide 12808 to within approximately 0.1.
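The 1.42 figure follows directly from the critical-angle condition for total internal reflection, sin(θc) = 1/n: a 90-degree fold means the light strikes the mirror surface at 45 degrees, so TIR requires n > 1/sin(45°) = √2 ≈ 1.414, which rounds to the 1.42 quoted above. A minimal sketch of this check (illustrative only; the function name is an assumption):

```python
import math

def min_index_for_fold(fold_angle_deg=90.0):
    """Minimum refractive index for a TIR mirror that folds light by
    fold_angle_deg against air (n = 1).

    The incidence angle at the mirror is half the fold angle, and TIR
    requires that angle to exceed the critical angle sin(theta_c) = 1/n.
    """
    incidence = math.radians(fold_angle_deg / 2.0)
    return 1.0 / math.sin(incidence)

# For a 90-degree fold: 1/sin(45 deg) = sqrt(2) ~= 1.414, i.e. the
# waveguide index must exceed about 1.42 as the text states.
```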
Figures 131a and 131b show illustrations of a waveguide assembly 13102 with a switchable mirror array as included in the present invention. Figure 131a shows a side view of the waveguide assembly 13102 on the user's head, in which the long axis of the switchable mirror array is oriented vertically so that the image light 13100 is directed into the user's eye. Figure 131b shows a top view of the waveguide assembly 13102 on the user's head, in which the short axis of the switchable mirror array 13104 can be seen, and the image light 13100 is provided to the user's eye 13110. In Figures 131a and 131b, the field of view provided in the image light 13100 can be seen. In Figure 131b, the various portions of the image as provided by the different switchable mirrors in the array can also be seen. Figure 131b also shows an embodiment of the waveguide assembly 13102 that includes the image source 13108, wherein the image source 13108 has an internal light source to provide light from a micro-display such as an LCoS display or an LCD display; the light is then transmitted by the waveguide to the switchable mirrors, where it is redirected by a switchable mirror to become the image light 13100 presented to the user's eye 13110.
To reduce the flicker perceived by the user when the switchable mirrors are used to provide sequential portions of the image, the switchable mirror sequence preferably operates at a frequency faster than 60 Hz. In this case, each of the n switchable mirrors in the array is in the reflective state for (1/60) X 1/n seconds in each cycle of the sequence, and is then in the transmissive state for (1/60) X (n-1)/n seconds. Thus, each switchable mirror spends a larger portion of each cycle of the sequence in the transmissive state than in the reflective state, and the array of switchable mirrors is therefore perceived by the user as relatively transparent.
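The dwell times above can be sketched as a small helper. This is an illustrative sketch only, not part of the patent; the function name is an assumption.

```python
def mirror_timing(n, refresh_hz=60):
    """Per-cycle reflective and transmissive dwell times, in seconds, for
    each of n switchable mirrors sequenced at refresh_hz.

    Implements the text's (1/60) x 1/n reflective and
    (1/60) x (n-1)/n transmissive durations for refresh_hz = 60.
    """
    cycle = 1.0 / refresh_hz
    reflective = cycle / n
    transmissive = cycle * (n - 1) / n
    return reflective, transmissive
```

For the three-mirror array of Figures 125-127 at 60 Hz, each mirror is reflective for about 5.6 ms and transmissive for about 11.1 ms per cycle, which is why the array appears mostly transparent.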
In another embodiment of the invention, the integral array of switchable mirrors has more switchable mirrors than are needed to cover the display area. The additional switchable mirrors are used to provide adjustment for different users with different eye spacings (also known as interpupillary distances). In this case, the switchable mirrors used to present the image to the user are adjacent to one another, so that they present a continuous image area. Depending on the eye spacing of the user, the switchable mirrors at the edges of the array may or may not be used. As an example, shown in Figures 132A-132C, an array 13200 with seven switchable mirrors is provided, each mirror being 3 mm wide. During use, five adjacent switchable mirrors are used to provide a display area (13202a-13202c) that is 15 mm wide, with ±3 mm of adjustment for eye spacing. In the narrow eye spacing condition shown in Figure 132A, the five switchable mirrors toward the inner edge are used for display, and the two outer switchable mirrors are unused. In the wide eye spacing condition shown in Figure 132C, the five switchable mirrors toward the outer edge are used for display, and the two inner switchable mirrors are unused. An intermediate condition is shown in Figure 132B, in which the five middle switchable mirrors are used, and the outermost and innermost switchable mirrors are unused. In the present invention, the term "unused" means that the switchable mirror is kept in the transmissive state, while the other switchable mirrors are used according to the repeating sequence of transmissive and reflective states.
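The selection of which five of the seven mirrors are active for a given interpupillary-distance offset can be sketched as follows. This is an illustrative sketch only, not part of the patent; the function name and the quantization of the offset to whole mirror widths are assumptions.

```python
def active_mirrors(total=7, used=5, offset_mm=0.0, mirror_width_mm=3.0):
    """Return the 0-based indices of the mirrors kept in the switching
    sequence for a display region shifted laterally by offset_mm.

    With the defaults (7 mirrors, 5 used, 3 mm wide), the adjustment
    range is +/-3 mm, matching Figures 132A-132C; mirrors outside the
    returned set stay in the transmissive ("unused") state.
    """
    shift = round(offset_mm / mirror_width_mm)   # quantize to whole mirrors
    start = (total - used) // 2 + shift          # centered, then shifted
    start = max(0, min(total - used, start))     # clamp to the array edges
    return list(range(start, start + used))
```

For example, an offset of 0 mm selects the five middle mirrors (Figure 132B), while offsets of -3 mm and +3 mm slide the active block to one edge or the other (Figures 132A and 132C).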
Examples
In a first example, liquid crystal switchable mirrors with fast response, available from Kent Optronics, Inc. of Hopewell Junction, New York, USA (http://www.kentoptronics.com/), are used. The waveguide is made of glass or plastic, and the liquid crystal is contained in the spaces between the layers so that the liquid crystal is 5 microns thick. Cover sheets contain the liquid crystal on the outer surfaces. The response time is 10 milliseconds, the reflectivity is 87% in the reflective state, and the transmissivity is 87% in the transmissive state. Three switchable mirrors can be driven in a sequence operating at 30 Hz. If the switchable mirrors are 5 mm wide, a 15 mm wide display area is provided, which equates to a 38-degree field of view as viewed by an eye 10 mm from the waveguide with an 8 mm wide eyebox.
In a second example, a mechanically driven array of prisms made of glass or plastic with a refractive index of 1.53 is provided, along with a waveguide made of the same material with a refractive index of 1.53. The surfaces of the prisms are polished to provide a flatness of less than 1 micron, and piezoelectric micro-actuators are used to move the prisms approximately 10 microns to switch from the transmissive state to the reflective state. The waveguide is molded to provide mating surfaces for the prisms with a flatness of less than 1 micron. Five switchable mirrors can be driven by the piezoelectric actuators to operate in a sequence running at 100 Hz. Piezoelectric micro-actuators are available from Steiner & Martins, Inc. of Miami, Florida (http://www.steminc.com/piezo/PZ_STAKPNViewPN.asp?PZ_SM_MODEL=SMPAK155510D10); these micro-actuators provide 10 microns of movement with more than 200 pounds of force in a 5X5X10 mm package driven at 150 V. An array of five prisms, each 5 mm wide, is used to provide a 25 mm wide display area, which equates to a 72-degree field of view as viewed by an eye 10 mm from the waveguide with an 8 mm wide eyebox. Alternatively, using only 3 prisms at a time provides a 15 mm wide display area (38-degree field of view), with the ability to laterally shift the display area ±5 mm to adjust for the different eye spacings of different users.
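The field-of-view figures in these examples can be checked with a simple eyebox geometry. This is an illustrative sketch only, not part of the patent: the assumed model takes the usable half-extent of the display as (display width - eyebox width)/2, viewed from the stated 10 mm eye relief, which reproduces the 38-degree figure quoted for the 15 mm display area; the other quoted figures may rest on a different geometric model.

```python
import math

def field_of_view_deg(display_mm, eyebox_mm=8.0, eye_relief_mm=10.0):
    """Approximate monocular field of view, in degrees, for a display
    area of width display_mm viewed from an eyebox of width eyebox_mm
    at a distance eye_relief_mm.

    Assumed model: the eye, anywhere in the eyebox, sees a usable
    half-extent of (display - eyebox)/2 at the given eye relief.
    """
    half_extent = (display_mm - eyebox_mm) / 2.0
    return 2.0 * math.degrees(math.atan(half_extent / eye_relief_mm))
```

Under these assumptions, a 15 mm display with an 8 mm eyebox at 10 mm gives about 38.6 degrees, consistent with the first example.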
In embodiments, a waveguide display system may include an image source that provides image light from a displayed image, a waveguide that transmits the image light to a display area, and an integral array of switchable mirrors that redirects the image light from the waveguide to a display area where the displayed image can be viewed by the user. In embodiments, the switchable mirrors can be electrically driven. In various embodiments, the switchable mirrors can be mechanically driven. In other embodiments, micro-actuators can be used to mechanically drive the switchable mirrors. Furthermore, the micro-actuators can be piezoelectric. The switchable mirrors can be switched between transmissive and reflective states to provide portions of the image light in a progressive scan over the display area.
In embodiments, a method of providing a displayed image from a waveguide may include providing image light from an image source to the waveguide, providing an integral array of switchable mirrors over a display area in the waveguide, and operating the switchable mirrors sequentially between transmissive and reflective states to provide portions of the image light in a progressive scan over the display area.
In further embodiments, a waveguide display system with interpupillary distance adjustment may include an image source that provides image light from a displayed image, a waveguide that transmits the image light to a display area, and an integral array of switchable mirrors that redirects the image light from the waveguide to a display. Furthermore, the array of switchable mirrors can have more mirrors than are needed to cover the display area, and the switchable mirrors at the edges of the array can be used to provide a display area that matches the eye spacing of the user.
The eyepiece may be powered by any power source, such as battery power, solar power, line power, and the like. The power source may be integrated into the frame 102, or placed outside the eyepiece 100 and in electrical communication with the powered elements of the eyepiece 100. For example, solar collectors may be placed on the frame 102, on a belt clip, and the like. Battery charging may occur using a wall charger or car charger, on a belt clip, in an eyepiece case, and so on.
The projector 200 may include an LED light engine 206, which may be mounted on a heat sink 204 and a holder 208 for ensuring friction-free mounting of the LED light engine, a hollow tapered light tunnel 220, a diffuser 212, and a condenser lens 214. The hollow tunnel 220 helps to homogenize the rapidly varying light from the RGB LED light engine. In one embodiment, the hollow light tunnel 220 includes a silvered coating. The diffuser lens 212 further homogenizes and mixes the light before it is directed to the condenser lens 214. The light leaves the condenser lens 214 and then enters the polarizing beam splitter (PBS) 218. In the PBS, the LED light is propagated and split into polarization components before being refracted to the field lens 216 and the LCoS display 210. The LCoS display provides the image for the micro-projector. The image is then reflected from the LCoS display, passes back through the polarizing beam splitter, and is then reflected through 90 degrees. Thus, the image leaves the micro-projector 200 approximately at the middle of the micro-projector; the light is then directed to the coupling lens 504, described below.
Fig. 2 depicts an embodiment of the projector assembly and its other supporting components as described herein, but those skilled in the art may use other configurations and optical technologies. For example, instead of using reflective optics, a transparent configuration, such as one with a sapphire substrate, can be used to implement the optical path of the projector system, thereby potentially changing and/or eliminating optical components such as the beam splitter, the redirecting mirror, and the like. The system may have a backlight system, in which an RGB LED triplet can be the light source oriented to pass light through the display. As a result, the backlight and display may perhaps be mounted adjacent to the waveguide, or columnar/directing optics may be present behind the display to enable the light to properly enter the optics. If no directing optics are used, the display may be mounted on top of the waveguide, on its side, and the like. In one example, a small transparent display can be implemented with a silicon active backplane on a transparent substrate (e.g., sapphire), where the transparent electrodes, liquid crystal material, polarizers, and the like are controlled by the silicon active backplane. The function of the polarizers may be to correct for depolarization of the light passing through the system, to improve the contrast of the display. In another example, the system may utilize a spatial light modulator that applies some form of spatially varying modulation to the optical path, such as a microchannel spatial light modulator in which diaphragm mirror optical shutters are based on micro-electro-mechanical systems (MEMS). The system may also utilize other optical components, such as tunable optical filters (e.g., with deformable membrane actuators), high-angle deflection micro-mirror systems, discrete phase optical elements, and the like.
In other embodiments, the eyepiece may use OLED displays, quantum dot displays, and the like; these displays can provide higher power efficiency, brighter displays, lower-cost components, and so on. In addition, display technologies such as OLED and quantum dot displays can provide flexible displays, thus allowing greater packaging efficiency that reduces the overall size of the eyepiece. For example, OLED and quantum dot display materials can be printed onto plastic substrates by stamping techniques, thereby yielding flexible display assemblies. For instance, an OLED (organic LED) display can be a flexible, low-power display that does not need a backlight. It can be curved, like a standard eyeglass lens. In one embodiment, the OLED display may be a transparent display or provide a see-through display. In embodiments, a high modulation transfer function permits combinations of resolution levels and device sizes (e.g., frame thickness) that were previously unachievable.
With reference to Figure 82, the eyepiece may utilize a planar illumination facility 8208 in association with a reflective display 8210, wherein the light source 8202 is coupled 8204 to an edge of the planar illumination facility 8208, and wherein the planar side of the planar illumination facility 8208 illuminates the reflective display 8210; the display 8210 provides imaging of the content presented to the wearer's eye 8222 through directing optics 8212. In embodiments, the reflective display 8210 may be an LCD, an LCoS (liquid crystal on silicon) display, cholesteric liquid crystal, guest-host liquid crystal, polymer dispersed liquid crystal, phase retardation liquid crystal, and the like, or any other liquid crystal technology known in the art. In other embodiments, the reflective display 8210 may be a bistable display, such as electrophoretic, electrofluidic, electrowetting, electrokinetic, cholesteric liquid crystal, and the like, or any other bistable display known in the art. The reflective display 8210 may also be a combination of LCD technology and bistable display technology. In embodiments, the coupling 8204 between the light source 8202 and the 'edge' of the planar illumination facility 8208 may be made through other surfaces of the planar illumination facility 8208, with the light then directed into the plane of the planar illumination facility 8208, such as initially through the top surface, the bottom surface, an angled surface, and the like. For example, light may enter the planar illumination facility from the top surface but encounter a 45-degree facet, so that the light is bent into the direction of the plane. In an alternate embodiment, this bending of the light may be implemented with optical coatings.
In one example, the light source 8202 can be an RGB LED source (e.g., an LED array) that is directly coupled 8204 to the edge of the planar illumination facility. The light entering the edge of the planar illumination facility is then directed to the reflective display for imaging, such as described herein. The light may enter the reflective display for imaging and then be redirected back through the planar illumination facility (such as by using a reflective surface on the back of the reflective display). The light may then enter the directing optics 8212 to direct the image to the wearer's eye 8222, such as through a lens 8214, reflecting off a surface 8220, reflecting off a beam splitter 8218, passing back through the beam splitter 8218, and so on, to the eye 8222. The directing optics 8212 are described here in terms of elements 8214, 8218, and 8220, but those skilled in the art will appreciate that the directing optics 8212 may comprise any known directing optics configuration, including configurations more complex or simpler than described herein. For example, a different focal length in the field lens 8214 may allow the beam splitter 8218 to bend the image directly toward the eye, thus eliminating the curved mirror 8220 and enabling a simpler design and implementation. In embodiments, the light source 8202 may be an LED light source, a laser light source, a white light source, and the like, or any other light source known in the art. The optical coupling 8204 may be a direct coupling between the light source 8202 and the planar illumination facility 8208, or may be through a coupling medium or mechanism, such as a waveguide, an optical fiber, a light pipe, a lens, and the like. The planar illumination facility 8208 may receive the light and redirect it out of the planar side of its structure by means of interference gratings, optical imperfections, scattering features, reflective surfaces, refractive elements, and the like. The planar illumination facility 8208 may be a cover glass on the reflective display 8210, such as to reduce the combined thickness of the reflective display 8210 and the planar illumination facility 8208. The planar illumination facility 8208 may also include a diffuser located on the side nearest the directing optics 8212, so as to enlarge the cone angle of the image light as it passes through the planar illumination facility 8208 to the directing optics 8212. The directing optics 8212 may include multiple optical elements, lenses, mirrors, beam splitters, and the like, or any other light-delivery elements known in the art.
Figure 83 presents an embodiment of an optical system 8302 for an eyepiece 8300, in which a planar illumination facility 8310 and a reflective display 8308 mounted on a substrate 8304 are shown interfacing with directing optics 8212; the directing optics 8212 include an initial diverging lens 8312, a beam splitter 8314, and a spherical mirror 8318, and the directing optics 8212 present the image to the eyebox 8320, at which the wearer's eye receives the image. In one example, the flat beam splitter 8314 can be a wire-grid polarizer, a partially transmitting metal mirror coating, or the like, and the spherical mirror 8318 can have a series of dielectric coatings on its surface to provide a partially reflecting mirror. In another embodiment, the coating on the spherical mirror 8318 can be a thin metal coating to provide a partially transmitting mirror.
In an embodiment of the optical system, Figure 84 shows a planar illumination facility 8408 as part of a ferroelectric lightwave circuit (FLC) 8404; the FLC 8404 includes a configuration in which a laser light source 8402 is coupled to the planar illumination facility 8408 using waveguide wavelength converters 8420, 8422, and in which the planar illumination facility 8408 uses grating technology to present the incoming light from its edge to the plane surface facing the reflective display 8410. The image light from the reflective display 8410 is then redirected back through the planar illumination facility 8408 to the directing optics, via a hole 8412 in the support structure 8414. Because this embodiment utilizes lasers, the FLC also uses optical feedback to broaden the laser spectrum, as described in U.S. Patent 7,265,896, to reduce speckle from the lasers. In this embodiment, the laser light source 8402 is an IR laser light source, where the FLC combines the beams into RGB and has a back-reflection that causes the lasers to jump and produce a broadened bandwidth that provides speckle suppression. In this embodiment, the speckle suppression occurs in the waveguide 8420. The laser light from the laser light source 8402 is coupled to the planar illumination facility 8408 through a multi-mode interference combiner (MMI) 8422. Each laser source port is positioned so that the light passing through the MMI combiner is superimposed at a single output port to the planar illumination facility 8408. The grating of the planar illumination facility 8408 steers the light to produce uniform illumination of the reflective display. In embodiments, the grating element may use a very fine pitch (e.g., interferometric) to produce the illumination of the reflective display; when the light passes back through the planar illumination facility to the directing optics, the illumination is reflected back from the grating with very low scatter. That is, the light propagates out under the aligned condition, so that the grating is almost fully transparent. Note that the optical feedback used in this embodiment is due to the use of laser light sources; when LEDs are used, speckle suppression may not be necessary, because LEDs have sufficient bandwidth.
Figure 85 shows an embodiment of an optical system utilizing a planar illumination facility 8502 that includes a configuration with optical imperfections (in this case, a 'grooved' configuration). In this embodiment, the light source 8202 is directly coupled 8204 to the edge of the planar illumination facility 8502. The light then passes through the planar illumination facility 8502 and encounters small grooves 8504A-D in the planar illumination facility material, such as grooves in a sheet of polymethyl methacrylate (PMMA). In embodiments, as the grooves 8504A-D become progressively farther from the input port, their spacing may be varied (e.g., more 'aggressive', becoming progressively smaller as they proceed from 8504A to 8504D), their height may be varied, and their pitch may be varied. The grooves 8504A-D then redirect the light to the reflective display 8210 as an incoherent array of light sources, so that a fan of light traveling to the reflective display 8210 is produced, where the reflective display 8210 is far enough from the grooves 8504A-D to produce an illumination pattern in which the light from the individual grooves overlaps to provide uniform illumination of the area of the reflective display 8210. In further embodiments, there may be an optimum spacing of the grooves: the number of grooves per pixel on the reflective display 8210 can be increased so that the light is more incoherent (fuller), but having more grooves in turn introduces interference within the image and produces lower contrast in the image provided to the wearer. Although this embodiment is described with reference to grooves, other optical imperfections (such as dots) are also possible.
In embodiments, and with reference to Figure 86, opposing protrusions 8604 ('anti-grooves') may be applied into the grooves of the planar illumination facility, such as in a 'snap-fit' protrusion assembly 8602, in which the opposing protrusions 8604 are placed into the grooves 8504A-D so that an air gap remains between the sidewalls of the grooves and the sidewalls of the opposing protrusions. This air gap provides an abrupt change in the refractive index perceived by the light as it passes through the planar illumination facility, which facilitates reflection of the light at the groove sidewalls. The application of the opposing protrusions 8604 reduces aberrations and deflection of the image light caused by the grooves. That is, the image light reflected from the reflective display 8210 is refracted by the groove sidewalls, so that it changes direction in accordance with Snell's law. By providing the opposing protrusions in the grooves, with the sidewall angles of the grooves matching the sidewall angles of the opposing protrusions, the refraction of the image light is compensated and the image light is redirected toward the directing optics 8214.
In embodiments, and with reference to Figure 87, the planar illumination facility 8702 can be a laminate structure made of multiple laminated layers 8704, in which the laminated layers 8704 have alternating different refractive indices. For example, the planar illumination facility 8702 can be made by laminating sheets along two diagonal planes 8708. In this way, the groove structure shown in Figures 85 and 86 is replaced with the laminate structure 8702. For example, the laminated sheets can be made of similar materials (PMMA1 versus PMMA2, where the difference is the molecular weight of the PMMA). As long as each layer is relatively thick, there can be no interference effects, and the laminate acts as a transparent plastic sheet. In the configuration shown, the diagonal lamination redirects a small percentage of the light from the light source 8202 to the reflective display, and the pitch of the lamination is chosen to minimize aberrations.
In an embodiment of the optical system, Figure 88 shows a planar illumination tool 8802 using a 'wedge' configuration. In this embodiment, the light source 8204 is coupled directly to the edge of the planar illumination tool 8802. Light then passes through the planar illumination tool 8802 and encounters the inclined surface of the first wedge 8804, where the light is redirected to the reflective display 8210, then returns through the illumination tool 8802, passes through the first wedge 8804 and the second wedge 8812, and reaches the light-guiding optics. In addition, multilayer coatings 8808, 8810 can be applied to the wedges to improve their light-guiding properties. In one example, the wedge can be made of PMMA, with dimensions of 1/2 mm high by 10 mm wide, spanning the entire reflective display, with an angle of 1 to 1.5 degrees, and the like. In embodiments, the light can undergo multiple reflections within the wedge 8804 before passing out of the wedge 8804 to illuminate the reflective display 8210. If a highly reflective coating 8808 and 8810 were applied to the wedge 8804, the light would undergo multiple reflections within the wedge 8804 before turning around and transferring back to the light source 8202. However, by using multilayer coatings 8808 and 8810 on the wedge 8804, such as coatings of SiO2, niobium pentoxide, and the like, the light can instead be directed out to illuminate the reflective display 8210. The coatings 8808 and 8810 are designed to reflect light of the specified wavelengths over a wide range of angles, but to transmit light within a certain angular range (for example, angles beyond θ). In various embodiments, this allows the light to reflect internally within the wedge until it reaches a transmission window presented to the reflective display 8210, where the coating is arranged to allow transmission. The angle of the wedge directs light from the LED illumination system so as to uniformly illuminate the reflective image display, thereby generating an image reflected back through the illumination system. By providing the light from the light source 8202 with a wide cone angle entering the wedge 8804, different rays reach transmission windows at different locations along the length of the wedge 8804, thereby providing uniform illumination across the surface of the reflective display 8210, and the image supplied to the wearer's eyes has uniform brightness as determined by the image content.
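The multiple-reflection behavior in the wedge can be sketched with a deliberately simplified ray model (an assumption for illustration, not the patent's design method): each bounce off the tilted face steepens the ray by roughly twice the wedge angle, and the ray escapes once its incidence angle drops below the total-internal-reflection critical angle. The launch angle and indices below are assumed values:

```python
import math

def bounces_until_escape(incidence_deg, wedge_deg, n_guide=1.49, n_out=1.0):
    """Toy wedge-guide model: count TIR bounces before the ray falls below
    the critical angle (measured from the surface normal) and can exit."""
    critical = math.degrees(math.asin(n_out / n_guide))
    angle = incidence_deg
    bounces = 0
    while angle > critical:
        angle -= 2 * wedge_deg  # each bounce steepens the ray by ~2x the wedge angle
        bounces += 1
    return bounces

# Assumed grazing launch (85 deg from normal) in a 1.5-degree PMMA wedge
n_bounces = bounces_until_escape(85.0, 1.5)
print(n_bounces)
```

The model shows qualitatively how rays entering at different angles within the wide input cone escape after different numbers of bounces, i.e., at different positions along the wedge, which is what spreads the illumination over the display surface.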
In embodiments, the see-through optical system including the planar illumination tool 8208 and the reflective display 8210 as described herein can be applied to any head-worn device known in the art, such as including the eyepiece as described herein, but can also be applied to helmets (for example, military helmets, flight helmets, bicycle helmets, motorcycle helmets, deep-sea helmets, space helmets, and the like), ski goggles, glasses, diving masks, night-vision masks, gas masks, hazmat helmets, virtual-reality helmets, simulation devices, and the like. In addition, head-worn devices and protective covers can incorporate the optical system in various ways, including inserting the optical system into the head-worn device in addition to the optical components traditionally associated with the head-worn device and cover. For example, the optical system can be included in ski goggles as a separate unit so as to provide projected content to the user, but where the optical system does not replace any component of the ski goggles, such as the see-through cover of the ski goggles (for example, the clear or tinted plastic cover exposed to the external environment that protects the user's eyes from wind and snow). Alternatively, the optical system can at least partially replace certain optics traditionally associated with the head-worn device. For example, the light-guiding optics 8212 may replace the outer lens of certain eyewear applications. In one example, the beam splitter, lens, or mirror of the light-guiding optics 8212 can replace the front lens of an eyewear application (for example, sunglasses), thus eliminating the need for the front lens of the glasses; for instance, if the curved reflective mirror 8220 is extended to cover the glasses, the need for a lens cover is eliminated. In embodiments, the see-through optical system including the planar illumination tool 8208 and the reflective display 8210 can be located in the head-worn device so as to be unobtrusive to the function and aesthetics of the head-worn device. For example, in the case of glasses, or specifically in the case of the eyepiece, the optical system can be located adjacent to the upper portion of the lens, such as in the top of the frame.
In embodiments, the optics assembly can be used in configurations such as head-mounted or helmet-mounted displays, and/or may also include monoculars, binoculars, holographic binoculars, helmet visors, head-mounted displays with Mangin mirrors, integrated helmet and display sighting systems, advanced head-mounted displays (AHMD) with linked displays, and multiple micro-display optics. In embodiments, the optics assembly may include a telephoto lens. Such lenses can be spectacle-mounted or mounted in other ways. Such an embodiment is beneficial for people with vision defects. In various embodiments, the wide-angle Keplerian telescope of Eli Peli can be built into a spectacle lens. Such designs can use mirrors embedded within the carrier lens to fold the optical path, and power elements for magnification at higher powers. This allows the wearer to view the magnified and unmagnified fields of view simultaneously in a spectacle format. In embodiments, the optics assembly can be used in a helmet-mounted display configuration with the Q-Sight developed by BAE Systems of London, UK. Such a configuration provides a head-up, eyes-out capability for improved situational awareness. Moreover, any of the optics assemblies in the configurations described above can be used in the various embodiments.
The planar illumination tool, also referred to as the illumination module, can provide light of multiple colors, including red-green-blue (RGB) light and/or white light. Light from the illumination module can be directed into a 3LCD system, a digital light processing (DLP) system, a liquid crystal on silicon (LCoS) system, or another micro-display or micro-projection system. The illumination module can use wavelength combining, together with nonlinear frequency conversion having nonlinear feedback to the sources, to provide high-brightness, long-lifetime, speckle-reduced or speckle-free light. Various embodiments of the invention provide such multi-color light, including RGB light and/or white light, to these display and projection systems. The illumination modules described herein can be used in the optics assembly of the eyepiece 100.
One embodiment of the present invention includes a system comprising: a laser, LED, or other light source configured to generate a light beam of a first wavelength; a planar lightwave circuit coupled to the laser and configured to guide the light beam; and a waveguide optical frequency converter coupled to the planar lightwave circuit and configured to receive the light beam of the first wavelength and transform it into an output light beam of a second wavelength. The system can provide optically coupled feedback to the laser, the feedback depending nonlinearly on the power of the light beam of the first wavelength.
Another embodiment of the present invention includes a system comprising: a substrate; a light source (such as a diode laser array or one or more LEDs) disposed on the substrate and configured to emit multiple light beams of a first wavelength; a planar lightwave circuit disposed on the substrate and coupled to the light source, configured to combine the multiple light beams and produce a combined beam of the first wavelength; and a nonlinear optical element disposed on the substrate and coupled to the planar lightwave circuit, configured to transform the combined beam of the first wavelength into a light beam of a second wavelength using nonlinear frequency conversion. The system can provide optically coupled feedback to the laser diode array, the feedback depending nonlinearly on the power of the combined beam of the first wavelength.
Another embodiment of the present invention includes a system comprising: a light source (such as a semiconductor laser array or one or more LEDs) configured to generate multiple light beams of a first wavelength; an arrayed waveguide grating coupled to the light source and configured to combine the multiple light beams and output a combined beam of the first wavelength; and a quasi-phase-matched wavelength conversion waveguide coupled to the arrayed waveguide grating and configured to generate an output beam of a second wavelength from the combined beam of the first wavelength using second harmonic generation.
Optical power can be extracted from within the wavelength conversion device and fed back to the source. The fed-back power has a nonlinear dependence on the input power supplied by the source to the wavelength conversion device. Nonlinear feedback can reduce the sensitivity of the output power of the wavelength conversion device to variations in the device's nonlinear coefficient, because if the nonlinear coefficient decreases, the returned power increases. The increased feedback tends to increase the power supplied to the wavelength conversion device, thereby mitigating the effect of the reduced nonlinear coefficient.
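The self-stabilizing behavior described above can be sketched with a toy undepleted-pump model (an illustrative assumption, not the patent's specification): second-harmonic output scales roughly with the square of pump power, so when the nonlinear coefficient drops, more unconverted pump remains available as feedback:

```python
def shg_split(p_in, eta):
    """Toy undepleted-pump model: (converted SHG power, residual pump power).
    eta is an effective, dimensionless nonlinear conversion coefficient."""
    p_shg = eta * p_in ** 2   # second-harmonic output, quadratic in pump power
    p_residual = p_in - p_shg # unconverted pump, available as feedback to the source
    return p_shg, p_residual

# If eta drops, residual (feedback) power rises; in the device this raises the
# power supplied to the converter and partly offsets the reduced coefficient.
shg_hi, fb_hi = shg_split(1.0, 0.30)
shg_lo, fb_lo = shg_split(1.0, 0.20)
print(fb_hi, fb_lo)
```

The point of the sketch is only the sign of the effect: lower conversion coefficient implies higher feedback, which is the mechanism the paragraph describes.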
With reference to Figures 109A and 109B, a processor 10902 (for example, a digital signal processor) can provide display sequential frames 10924 for image display by the display component 10928 of the eyepiece 100 (for example, an LCoS display component). In various embodiments, the sequential frames 10924 can be generated with, or without, a display driver 10912 as an intermediary component between the processor 10902 and the display component 10928. For example, and referring to Figure 109A, the processor 10902 may include a frame buffer 10904 and a display interface 10908 (for example, a Mobile Industry Processor Interface (MIPI) display serial interface (DSI)). The display interface 10908 can supply per-pixel RGB data 10910 to the display driver 10912 acting as the intermediary between the processor 10902 and the display component 10928, where the display driver 10912 receives the per-pixel RGB data 10910 and generates separate full-frame display data for red 10918, separate full-frame display data for green 10920, and separate full-frame display data for blue 10922, thereby supplying the display sequential frames 10924 to the display component 10928. In addition, the display driver 10912 can provide timing signals to the display component 10928, for example to synchronize the delivery of the full frames 10918, 10920, 10922 as the display sequential frames 10924. In another example, and referring to Figure 109B, the display interface 10930 can be configured to eliminate the display driver 10912 by directly providing to the display component 10928 the full-frame display data for red 10934, the full-frame display data for green 10938, and the full-frame display data for blue 10940 as the display sequential frames 10924. In addition, the timing signal 10932 can be supplied directly from the display interface 10930 to the display component. This configuration can provide significantly lower power consumption by removing the need for the display driver. This direct-to-panel configuration not only removes the need for the driver, but can also simplify the overall logic of the configuration and remove the redundant memory needed to generate per-pixel information from the frames.
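The color-field separation that the display driver 10912 performs can be sketched as follows. This is a minimal illustration of splitting per-pixel RGB frames into three single-color full frames for field-sequential display; the data layout is an assumption for clarity, not the driver's actual format:

```python
def split_rgb_frame(frame):
    """frame: list of rows of (r, g, b) pixel tuples.
    Returns three single-channel full frames (red, green, blue),
    as a field-sequential display driver would emit them in turn."""
    red   = [[px[0] for px in row] for row in frame]
    green = [[px[1] for px in row] for row in frame]
    blue  = [[px[2] for px in row] for row in frame]
    return red, green, blue

# A tiny 2x2 example frame of (R, G, B) pixels
frame = [[(255, 0, 0), (0, 128, 0)],
         [(0, 0, 64), (10, 20, 30)]]
r, g, b = split_rgb_frame(frame)
print(r)  # [[255, 0], [0, 10]]
```

In the Figure 109B variant, this same separation would be performed by the display interface itself, with timing signals synchronizing the presentation of the three fields.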
Figure 89 is a block diagram of an illumination module according to an embodiment of the present invention. Illumination module 8900 includes a light source, a combiner, and an optical frequency converter according to an embodiment of the invention. Light sources 8902, 8904 emit light radiation 8910, 8914 toward input ports 8922, 8924 of the combiner 8906. The combiner 8906 has a combiner output port 8926, which emits combined radiation 8918. The combined radiation 8918 is received by the optical frequency converter 8908, which provides output optical radiation 8928. The optical frequency converter 8908 can also provide feedback radiation 8920 to the combiner output port 8926. The combiner 8906 separates the feedback radiation 8920 to provide source feedback radiation 8912 emitted from input port 8922 and source feedback radiation 8916 emitted from input port 8924. Source feedback radiation 8912 is received by light source 8902, and source feedback radiation 8916 is received by light source 8904. The light radiation 8910 and source feedback radiation 8912 between light source 8902 and combiner 8906 can propagate through any combination of free space and/or guiding structures (for example, optical fiber or any other optical waveguide). The light radiation 8914, source feedback radiation 8916, combined radiation 8918, and feedback radiation 8920 can likewise propagate through any combination of free space and/or guiding structures.
Suitable light sources 8902 and 8904 include one or more LEDs, or any optical emitter whose emission wavelength is influenced by optical feedback. Examples of sources include lasers, which can be semiconductor diode lasers. For example, light sources 8902 and 8904 can be elements of a semiconductor laser array. Sources other than lasers can also be used (for example, an optical frequency converter can serve as a source). Although two sources are shown in Figure 89, the invention can also be practiced with more than two sources. The combiner 8906 is shown generally as a three-port device having ports 8922, 8924, and 8926. Although ports 8922 and 8924 are referred to as input ports and port 8926 is referred to as the combiner output port, these ports can be bidirectional and can both receive and emit light radiation as described above.
The combiner 8906 may include waveguide dispersive elements and optical elements defining the ports. Suitable waveguide dispersive elements can include arrayed waveguide gratings, reflective diffraction gratings, transmissive diffraction gratings, holographic optical elements (HOEs), assemblies of wavelength-selective filters, and photonic bandgap structures. Accordingly, the combiner 8906 can be a wavelength combiner, in which each input port has a corresponding, non-overlapping input port wavelength range for efficient coupling into the combiner output port.
Various nonlinear optical processes can be carried out in the optical frequency converter 8908, including but not limited to: harmonic generation, sum frequency generation (SFG), second harmonic generation (SHG), difference frequency generation, parametric generation, parametric amplification, parametric oscillation, three-wave mixing, four-wave mixing, stimulated Raman scattering, stimulated Brillouin scattering, stimulated emission, acousto-optic frequency shifting, and/or electro-optic frequency shifting.
Generally, the optical frequency converter 8908 receives optical input in an input set of optical wavelengths and provides optical output in an output set of optical wavelengths, where the output set differs from the input set.
The optical frequency converter 8908 may include a nonlinear optical material, such as lithium niobate, lithium tantalate, potassium titanyl phosphate, potassium niobate, quartz, silicon, silicon oxynitride, gallium arsenide, lithium borate, and/or barium metaborate. The optical interaction in the optical frequency converter 8908 can take place in various structures, including bulk structures, waveguides, quantum well structures, quantum wire structures, quantum dot structures, photonic bandgap structures, and/or multi-component waveguiding structures.
In cases where the optical frequency converter 8908 provides a parametric nonlinear optical process, the nonlinear optical process is preferably phase-matched. Such phase matching can be birefringent phase matching or quasi-phase matching. Quasi-phase matching can include the methods disclosed in U.S. Patent 7,116,468 to Miller, the disclosure of which is incorporated herein by reference.
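As a brief illustration of quasi-phase matching (with illustrative refractive indices that are assumptions, not values from this disclosure), the first-order poling period Λ compensates the phase-velocity mismatch between a pump and its second harmonic, Λ = λ_pump / (2·(n_2ω − n_ω)):

```python
def qpm_period_um(pump_wavelength_um, n_pump, n_shg):
    """First-order quasi-phase-matching poling period for SHG:
    lambda_pump / (2 * (n_2w - n_w)), all lengths in micrometers."""
    return pump_wavelength_um / (2.0 * (n_shg - n_pump))

# Assumed, lithium-niobate-like indices at a 1.064 um pump (illustrative only)
period = qpm_period_um(1.064, n_pump=2.15, n_shg=2.23)
print(f"poling period: {period:.2f} um")
```

A few-micrometer poling period of this order is what makes periodically poled waveguide converters practical to fabricate with lithographic techniques, consistent with the planar-lightwave-circuit fabrication described elsewhere in this section.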
The optical frequency converter 8908 may also include various elements that improve its operation, such as wavelength-selective reflectors for wavelength-selective output coupling, wavelength-selective reflectors for wavelength-selective resonance, and/or wavelength-selective loss elements for controlling the spectral response of the converter.
In embodiments, multiple illumination modules as described in Figure 89 can be combined to form a compound illumination module.
One component of the illumination module can be a diffraction grating, or grating, as described further herein. The diffraction grating thickness can be less than 1 mm, yet the grating remains robust enough to be permanently glued in place or to replace the cover glass of the LCoS. One advantage of using a grating in the illumination module is that, with a laser light source, it increases efficiency and reduces power. The grating can inherently produce less stray light, and since its waveband is relatively narrow, it allows more options for filtering the light directed toward the eyes with a smaller reduction in see-through brightness.
Figure 90 depicts a block diagram of an optical frequency converter according to an embodiment of the present invention. Figure 90 shows how feedback radiation 8920 is provided by an exemplary optical frequency converter 8908 that provides parametric frequency conversion. The combined radiation 8918 provides forward radiation 9002 propagating toward the right side of Figure 90 within the optical frequency converter 8908; parametric radiation 9004, also propagating toward the right side of Figure 90, is generated in the optical frequency converter 8908 and exits the optical frequency converter 8908 as output light radiation 8928. In general, as the interaction proceeds (in this example, as the radiation propagates to the right), there is a net power transfer from the forward radiation 9002 to the parametric radiation 9004. A reflector 9008 having wavelength-dependent transmittance can be placed within the optical frequency converter 8908 to reflect (or partially reflect) the forward radiation 9002 so as to provide backward radiation 9006, or it can be disposed outside the optical frequency converter 8908, beyond the end face 9010. The reflector 9008 can be a grating, an internal interface, a coated or uncoated end face, or any combination thereof. A preferred level of reflectivity for the reflector 9008 is greater than 90%. A reflector at the input interface 9012 provides purely linear feedback (that is, feedback uncorrelated with the process efficiency). A reflector at the end face 9010 provides maximum nonlinear feedback, because the forward power at the output interface has the maximum correlation with the process efficiency (given a phase-matched parametric interaction).
Figure 91 is a block diagram of a laser illumination module according to an embodiment of the invention. Although a laser is used in this embodiment, it should be understood that other light sources, such as LEDs, may also be used. Laser illumination module 9100 includes diode laser array 9102, waveguides 9104 and 9106, star couplers 9108 and 9110, and optical frequency converter 9114. Diode laser array 9102 has laser emitting elements coupled to waveguides 9104, which serve as the input ports of the slab-waveguide star coupler 9108 (such as ports 8922 and 8924 of Figure 89). Star coupler 9108 is coupled through waveguides 9106 of differing lengths to another slab-waveguide star coupler 9110. The combination of star couplers 9108 and 9110 and waveguides 9106 can be an arrayed waveguide grating, and serves as a wavelength combiner providing combined radiation 8918 to waveguide 9112 (for example, the combiner 8906 of Figure 89). Waveguide 9112 provides the combined radiation 8918 to the optical frequency converter 9114. In the optical frequency converter 9114, an optional reflector 9116 can provide back reflection of the combined radiation 8918. As described above in connection with Figure 90, this back reflection provides nonlinear feedback according to various embodiments of the present invention. One or more of the elements described with reference to Figure 91 can be manufactured on a common substrate using planar coating processes and/or photolithographic methods, thereby reducing cost, part count, and alignment requirements.
A second waveguide can be placed so that its core is near the waveguide core in the optical frequency converter 8908. As is known in the art, such a waveguide arrangement acts as a directional coupler, so that radiation in the second waveguide can provide additional radiation in the optical frequency converter 8908. Significant coupling can be avoided by providing radiation of a wavelength different from that of the forward radiation 9002, or parasitic radiation can be coupled into the optical frequency converter 8908 at a location where the forward radiation 9002 is depleted.
Although a standing-wave feedback configuration, in which the feedback power counter-propagates along the same path traveled by the input power, is advantageous, a traveling-wave feedback configuration can also be used. In a traveling-wave feedback configuration, the feedback re-enters the gain medium at a position different from the exit position of the input power.
Figure 92 is a block diagram of a compound laser illumination module according to another embodiment of the present invention. Compound laser illumination module 9200 includes one or more of the laser illumination modules 9100 described with reference to Figure 91. Although for simplicity Figure 92 shows a compound laser illumination module 9200 including three laser illumination modules 9100, the compound laser illumination module 9200 may include more or fewer laser illumination modules 9100. Diode laser array 9210 may include one or more diode laser arrays 9102, which can be an array of laser diodes, a diode laser bar, and/or a semiconductor laser array configured to emit light radiation in the infrared spectrum (that is, at wavelengths shorter than radio waves but longer than visible light).
Laser array output waveguides 9220 are coupled to the diode lasers in diode laser array 9210, and direct the output of diode laser array 9210 to star couplers 9108A-C. The laser array output waveguides 9220, arrayed waveguide gratings 9230, and optical frequency converters 9114A-C can be manufactured on a single substrate using planar lightwave circuits, and may include silicon oxynitride waveguides and/or lithium tantalate waveguides.
Arrayed waveguide grating 9230 includes star couplers 9108A-C, waveguides 9106A-C, and star couplers 9110A-C. Waveguides 9112A-C provide the combined radiation to optical frequency converters 9114A-C, respectively, which provide feedback radiation to star couplers 9110A-C.
Optical frequency converters 9114A-C may include nonlinear optical (NLO) elements, for example optical parametric oscillator elements and/or quasi-phase-matched optical elements.
The compound laser illumination module 9200 can produce output light radiation of multiple wavelengths. These multiple wavelengths can be in the visible spectrum, that is, at wavelengths shorter than infrared light but longer than ultraviolet light. For example, waveguide 9240A can provide output light radiation between about 450 nm and about 470 nm, waveguide 9240B can provide output light radiation between about 525 nm and about 545 nm, and waveguide 9240C can provide output light radiation between about 615 nm and about 660 nm. These ranges of output light radiation can be selected to provide visible wavelengths pleasing to a human viewer (for example, blue, green, and red wavelengths, respectively), and can be combined to produce a white light output.
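As a hedged illustration of how infrared diode lasers can yield these visible bands via second harmonic generation (the pump bands below are assumptions for illustration, not values stated in this disclosure), SHG emits at half the pump wavelength, so infrared pump bands near 900-940 nm, 1050-1090 nm, and 1230-1320 nm would map onto the quoted blue, green, and red output ranges:

```python
def shg_wavelength_nm(pump_nm):
    """Second harmonic generation emits at half the pump wavelength."""
    return pump_nm / 2.0

# Assumed infrared pump bands; halving lands each in the quoted visible range.
pump_bands_nm = {"blue": (900, 940), "green": (1050, 1090), "red": (1230, 1320)}
for color, (lo, hi) in pump_bands_nm.items():
    print(color, shg_wavelength_nm(lo), shg_wavelength_nm(hi))
```

This is only the wavelength bookkeeping; the disclosure's converters may also use parametric processes (per the OPO discussion below in the section), for which the output wavelength is not simply half the pump.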
Waveguides 9240A-C can be manufactured on the same planar lightwave circuit as the laser array output waveguides 9220, arrayed waveguide gratings 9230, and optical frequency converters 9114A-C. In some embodiments, the output light radiation provided by each of waveguides 9240A-C can provide optical power in the range between about 1 watt and about 20 watts.
Optical frequency converter 9114 may include a quasi-phase-matched wavelength conversion waveguide configured to perform second harmonic generation (SHG) on the combined radiation of the first wavelength to generate radiation of a second wavelength. The quasi-phase-matched wavelength conversion waveguide can be configured to use the radiation of the second wavelength to pump an optical parametric oscillator integrated into the quasi-phase-matched wavelength conversion waveguide and generate radiation of a third wavelength, the third wavelength optionally differing from the second wavelength. The quasi-phase-matched wavelength conversion waveguide can also generate feedback radiation that travels via waveguide 9112 through the arrayed waveguide grating 9230 to the diode laser array 9210, so that each laser in the diode laser array 9210 can be made to operate at a unique wavelength determined by its corresponding port on the arrayed waveguide grating.
For example, the compound laser illumination module 9200 can be configured to use a diode laser array 9210 nominally operating at a wavelength of about 830 nm to generate output light radiation in the visible spectrum corresponding to any of the colors red, green, or blue.
The compound laser illumination module 9200 can optionally be configured to directly illuminate a spatial light modulator without intervening optics. In some embodiments, the compound laser illumination module 9200 can use a diode laser array 9210 nominally operating at a single first wavelength to simultaneously generate output light radiation of multiple second wavelengths (such as wavelengths corresponding to red, green, or blue). Each different second wavelength can be generated by an instance of the laser illumination module 9100.
The compound laser illumination module 9200 can be configured to combine the output light radiation of the multiple second wavelengths into a single waveguide by using waveguide-selective taps (not shown), to produce diffraction-limited white light.
The diode laser array 9210, laser array output waveguides 9220, arrayed waveguide grating 9230, waveguide 9112, optical frequency converter 9114, and frequency converter output waveguides 9240 can be manufactured on a common substrate using manufacturing processes such as coating or photolithography. As described with reference to Figure 92, beam shaping element 9250 is coupled to the compound laser illumination module 9200 by waveguides 9240A-C.
The beam shaping element 9250 can be placed on the same substrate as the compound laser illumination module 9200. The substrate may include, for example, a thermally conductive material, a semiconductor material, or a ceramic material. The substrate may include copper-tungsten, silicon, gallium arsenide, lithium tantalate, silicon oxynitride, and/or gallium nitride, and can be processed using semiconductor manufacturing processes including coating, photolithography, etching, deposition, and implantation.
Certain of these elements (such as the diode laser array 9210, laser array output waveguides 9220, arrayed waveguide grating 9230, waveguide 9112, optical frequency converter 9114, waveguides 9240, beam shaping element 9250, and the various associated planar lightwave circuits) can be passively coupled and/or aligned, and in certain embodiments can be passively aligned on a common substrate with high precision. Each of waveguides 9240A-C can be coupled to a different instance of the beam shaping element 9250, rather than to a single element as shown.
The beam shaping element 9250 can be configured to shape the output light radiation from waveguides 9240A-C into a substantially rectangular, diffraction-limited beam, and can also be configured so that the output light radiation from waveguides 9240A-C has a brightness uniformity greater than about 95% over the substantially rectangular beam shape.
The beam shaping element 9250 may include aspheric lenses, such as "top-hat" microlenses, holographic elements, or gratings. In some embodiments, the diffraction-limited beam output by the beam shaping element 9250 produces substantially reduced speckle, or is speckle-free. The beam output by the beam shaping element 9250 can provide optical power in the range between about 1 watt and about 20 watts, and has a substantially flat phase front.
Figure 93 is a block diagram of an imaging system according to an embodiment of the present invention. Imaging system 9300 includes light engine 9310, light beam 9320, spatial light modulator 9330, modulated light beam 9340, and projection lens 9350. The light engine 9310 can be a compound light illumination module, such as a compound of the multiple illumination modules described in Figure 89, the compound laser illumination module 9200 described with reference to Figure 92, or a laser illumination system. Spatial light modulator 9330 can be a 3LCD system, a DLP system, an LCoS system, a transmissive liquid crystal display (for example, transmissive LCoS), a liquid crystal on silicon array, a grating-based light valve, or another micro-display or micro-projection system or reflective display.
The spatial light modulator 9330 can be configured to spatially modulate the light beam 9320. The spatial light modulator 9330 can be coupled to electronic circuitry configured to cause the spatial light modulator 9330 to modulate a video image (such as a video image displayable by a television or computer monitor) onto the light beam 9320 to produce the modulated light beam 9340. In some embodiments, using the principle of optical reflection, the modulated light beam 9340 can exit the spatial light modulator from the same side on which the spatial light modulator receives the light beam 9320. In other embodiments, using the principle of optical transmission, the modulated light beam 9340 can exit the spatial light modulator from the side opposite that on which the spatial light modulator receives the light beam 9320. The modulated light beam 9340 can optionally be coupled into the projection lens 9350. The projection lens 9350 is generally configured to project the modulated light beam 9340 onto a display, such as a video display screen.
A method of illuminating a video display can be performed using a compound illumination module (such as a compound illumination module including multiple illumination modules 8900), the compound laser illumination module 9100, the laser illumination system 9200, or the imaging system 9300. A diffraction-limited output beam can be generated using the compound illumination module, the compound laser illumination module 9100, the laser illumination system 9200, or the light engine 9310. The output beam can be directed using a spatial light modulator (such as spatial light modulator 9330) and, optionally, a projection lens 9350. The spatial light modulator can project the image onto a display, such as a video display screen.
The illumination module can be configured to emit any number of wavelengths, including one, two, three, four, five, six, or more, with the wavelengths spaced apart by differing amounts and having equal or unequal power levels. The illumination module can be configured to emit a single wavelength per beam or multiple wavelengths per beam. The illumination module may also include additional components and functions, including polarization controllers, polarization rotators, power supplies, power circuitry such as power FETs, electronic control circuitry, thermal management systems, heat pipes, and safety interlocks. In some embodiments, the illumination module can be coupled to an optical fiber or a light guide, such as glass (for example, BK7).
Some options for LCoS front-light designs include: 1) a wedge with multilayer coatings (MLC). This concept uses the MLC to define the specific angles that are reflected and transmitted; 2) a wedge with a polarizing beam splitter coating. This concept works like a conventional PBS cube, but at a much narrower angle. This can be a PBS coating or a wire-grid film; 3) PBS prism bars (these are similar to option 2, but have a seam down the center of the panel); 4) a wire-grid polarizer sheet beam splitter (similar to the PBS wedge, but only a sheet, so that it is mostly air rather than solid glass); and 5) a flexible film polarizing beam splitter (PBS), such as the 3M polarizing beam splitter made by alternating layers of different plastics with tailored refractive indices (so that the indices are matched in one in-plane direction but not in the other). In the mismatched direction, a quarter-wave stack forms a high reflector, and in the matched direction the film acts as a transparent plastic plate. The film may be laminated between glass prisms to form a high-performance, wide-angle PBS for fast beams over the entire visual range. The MLC wedge can be robust and can be glued securely in place, with no air gap that could deflect with cold or heat. It can be used with a broadband LED light source. In embodiments, for a complete module, the MLC wedge can replace the cover glass of the LCoS. The MLC wedge can be approximately less than 4 mm thick. In one embodiment, the MLC wedge can be 2 mm thick or thinner.
It will be appreciated that the present invention provides for deployment of the front light systems described herein in all types of optical arrangements, which may include, but need not include, an augmented reality eyepiece. The front light system may be used as a component in any type of optical system, and is especially suited for use as a source of direct or indirect illumination of any one or more optical elements, optical surfaces, or optical sensors, most preferably those components having selectively configurable optical paths, such as LCoS or liquid crystal displays, and/or reflected light. In some embodiments, at least some of the light generated by the front light system may be reflected so that it passes back through a portion of the front light system on its way to its final destination (for example, an eye, an optical sensor, etc.), while in other embodiments the generated light does not pass back through the front light system on its way to its final destination. For example, the front light system may illuminate an optical device such as an LCoS to obtain image light; the image light may be directed back through components of the front light system and thereafter through additional optical systems that condition the image light for final reception by one or more eyes of the user. Such other optical systems may be, or may include, waveguides (which may be free-form-surface waveguides), beam splitters, collimators, polarizers, mirrors, lenses, and diffraction gratings.
Figure 95 depicts an embodiment of an LCoS front-light design. In this embodiment, light from RGB LED 9508 illuminates front light 9504, which may be a wedge, a PBS, or the like. The light strikes polarizer 9510 and is emitted in its S state toward LCoS 9502, where it is reflected back in its P state through asphere 9512 as image light. In-line polarizer 9514 may re-polarize the image light and/or cause a half-wave rotation to the S state. The image light then strikes wire-grid polarizer 9520 and is reflected toward curved (spherical) partial mirror 9524, passing through half-wave retarder 9522 on the way. The image light reflects from the mirror to the user's eye 9518, passing again through half-wave retarder 9522 and wire-grid polarizer 9520. Several examples of the front light 9504 will now be discussed.
In embodiments, the optics assembly includes partially reflective, partially transmissive optical elements that reflect respective portions of the image light from the image source and transmit scene light from a see-through view of the surrounding environment, so as to provide to the user's eye a combined image composed of the reflected portions of the image light and the transmitted scene light.
In portable display systems, it is important to provide a bright, compact, and lightweight display. Portable display systems include mobile phones, laptop computers, tablet computers, and head-mounted displays.
The present invention provides a compact and lightweight front light for portable display systems, composed of a curved or otherwise non-planar wire-grid polarizer film acting as a partial reflector, which efficiently deflects light from an edge light to illuminate a reflective image source. Wire-grid polarizers are known to provide highly efficient reflection of one polarization state while allowing the other polarization state to pass through. Although glass-plate wire-grid polarizers are well known in the industry, and rigid wire-grid polarizers can be used in the present invention, in a preferred embodiment of the invention a flexible wire-grid polarizer film is used as the curved wire-grid polarizer. A suitable wire-grid polarizer film is available from Asahi-Kasei E-materials Corp. of Tokyo, Japan.
An edge light provides illumination in a compact form for a display, but since it is located at the edge of the image source, the light must be deflected 90 degrees to illuminate the image source. In one embodiment of the invention, a curved wire-grid polarizer film is used as a partially reflective surface, so that the light provided by the edge light is deflected downward to illuminate the reflective image source. A polarizer is provided adjacent to the edge light so that polarized illumination light is provided to the curved wire-grid polarizer. The polarizer and the wire-grid polarizer are oriented such that the light passing through the polarizer is reflected by the wire-grid polarizer. Due to the quarter-wave retarder film included in the reflective image source, the reflected image light has the opposite polarization state compared with the illumination light. The reflected image light therefore passes through the wire-grid polarizer film and continues on to the display optics. By using a flexible wire-grid polarizer film as the partial reflector, the partially reflective surface can be curved in a lightweight structure; in this configuration the wire-grid polarizer takes on the dual role of a reflector for the illumination light and a transparent member for the image light. The wire-grid polarizer film offers the advantage that it can accept image light over a wide range of incidence angles, so that the curve does not interfere with the image light passing through to the display optics. Further, since the wire-grid polarizer film is thin (for example, less than 200 microns), the curved shape does not significantly distort the image light as it passes through to the display optics. Finally, wire-grid polarizers have a very low tendency to scatter light, so high image contrast is maintained.
Figure 136 shows a schematic diagram of a front-lit image source 13600 of the invention. Edge light 13602 provides illumination light that passes through polarizer 13614 so that illumination light 13610 is polarized, where polarizer 13614 can be an absorptive polarizer or a reflective polarizer. The polarizer is oriented such that the polarization state of illumination light 13610 causes the light to be reflected by curved wire-grid polarizer 13608, deflecting illumination light 13610 downward toward reflective image source 13604. Thus, the pass axis of polarizer 13614 is perpendicular to the pass axis of wire-grid polarizer 13608. Those skilled in the art will appreciate that although Figure 136 shows the front-lit image source 13600 oriented horizontally, other orientations are equally possible. As already described, reflective image sources such as LCOS image sources typically include a quarter-wave retarder film, so that the polarization state of the illumination light is changed during reflection by the reflective image source and, as a result, the image light has a polarization state generally opposite to that of the illumination light. As is known to those skilled in the art and as described in United States Patent 4,398,805, this change in polarization state is fundamental to the operation of all liquid-crystal-based displays. For the various portions of the image, the liquid crystal cells in reflective image source 13604 cause a greater or lesser change in polarization state, so that the reflected image light 13612 has a mixed elliptical polarization state before passing through the curved wire-grid polarizer. After passing through curved wire-grid polarizer 13608 and any additional polarizers that may be included in the display optics, the polarization state of image light 13612 is determined by curved wire-grid polarizer 13608, and the image content contained in image light 13612 determines the local intensity of image light 13612 in the image displayed by the portable display system.
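The polarization round trip described above can be sketched with Jones calculus. The following is a minimal numeric model under idealized assumptions (perfect extinction, lossless components; the real film is a partial reflector with finite extinction): the wire-grid polarizer reflects the s component and transmits the p component, and a bright pixel of the reflective image source rotates linear polarization by 90 degrees via its double-passed quarter-wave film.

```python
import numpy as np

# Jones vectors: [Ex, Ey]; take x = p (pass axis of the WGP), y = s (reflect axis).
s_pol = np.array([0.0, 1.0])

# Idealized wire-grid polarizer: reflects the s component, transmits the p component.
reflect_wgp = np.array([[0.0, 0.0], [0.0, 1.0]])   # projector onto s
transmit_wgp = np.array([[1.0, 0.0], [0.0, 0.0]])  # projector onto p

# Idealized reflective image source (bright pixel): the quarter-wave film
# traversed twice plus the mirror reflection rotates linear polarization 90 deg.
flip_90 = np.array([[0.0, 1.0], [1.0, 0.0]])

illum = reflect_wgp @ s_pol    # s-polarized light deflected toward the image source
image = flip_90 @ illum        # reflected image light, now p-polarized
out = transmit_wgp @ image     # passes through the film toward the display optics

print(np.abs(out) ** 2)        # per-component intensity reaching the display optics
```

In this idealization all of the image light from a bright pixel transmits through the film, while unmodulated illumination light (still s-polarized) would be reflected rather than transmitted, which is the mechanism the paragraph above describes.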
The flexible nature of the wire-grid polarizer film used in curved wire-grid polarizer 13608 allows it to be shaped so as to focus illumination light 13610 onto reflective image source 13604. The shape of the curve of the wire-grid polarizer is selected to provide uniform illumination of the reflective image source. Figure 136 shows curved wire-grid polarizer 13608 with a parabolic shape, but radial curves, complex spline curves, or planes are also possible, depending on the nature of edge light 13602, for deflecting illumination light 13610 uniformly onto reflective image source 13604. Experiments show that parabolic, radial, and complex spline curves all provide more uniform illumination than flat surfaces. However, in some very thin front-lit image sources, a flat wire-grid polarizer film can be used effectively to provide a lightweight portable display system. The shape of the flexible wire-grid polarizer film can be maintained with side frames, as shown in Figure 138, which have slots contoured to hold the wire-grid polarizer in place; Figure 138 shows a schematic diagram of front-lit image source assembly 13800. Side frame 13802 is shown with curved slot 13804 for holding the flexible wire-grid polarizer film in the required curved shape. Although only one side frame 13802 is shown in Figure 138, two side frames 13802 and other components of the front-lit image source can be used to support the curved shape on either side. In any case, because the majority of the front-lit image source of the invention consists of air and the wire-grid polarizer film is very thin, the weight is low compared with prior-art front light systems.
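One reason a parabolic curve is a natural choice is the classical focal property of the parabola: rays leaving the focus reflect into a parallel bundle, so a small edge light placed near the focus is redirected into an even, collimated sheet over the image source. The sketch below numerically checks that property; it illustrates the geometry only and is not taken from the patent.

```python
import numpy as np

f = 2.0  # focal length; parabola y = x^2 / (4 f), focus at (0, f)

def reflect_from_focus(x):
    """Reflect a ray launched from the focus (0, f) off the parabola at abscissa x."""
    y = x * x / (4.0 * f)
    d = np.array([x, y - f])
    d = d / np.linalg.norm(d)              # unit direction of the incoming ray
    slope = x / (2.0 * f)                  # dy/dx of the parabola at x
    n = np.array([-slope, 1.0])
    n = n / np.linalg.norm(n)              # unit surface normal
    return d - 2.0 * np.dot(d, n) * n      # mirror reflection of the direction

# All reflected rays leave parallel to the parabola's axis (the +y direction).
for x in (0.5, 1.0, 3.0, -2.0):
    print(x, reflect_from_focus(x))
```

Every reflected direction comes out as (0, 1) regardless of where the ray strikes the curve, which is why a source at the focus yields a well-collimated, evenly spread illumination sheet.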
In another embodiment of the invention, a front-lit image source 13700 is provided, having two or more edge lights 13702 placed along two or more edges of reflective image source 13604. A polarizer 13712 is provided adjacent to each edge light 13702 so that illumination light 13708 is polarized. Illumination light 13708 is deflected by curved wire-grid polarizer 13704 to illuminate reflective image source 13604. The reflected image light 13710 then passes through curved wire-grid polarizer 13704 and onto the display optics. An advantage of using two or more edge lights 13702 is that more light can be applied to reflective image source 13604, thereby providing a brighter image.
The edge light can be a fluorescent lamp, incandescent lamp, organic light-emitting diode, laser, or electroluminescent lamp. In a preferred embodiment of the invention, the edge light is an array of three or more light-emitting diodes. For uniform illumination of the reflective image source, the edge light should have a fairly large cone angle; for example, the edge light can be a Lambertian source. In the case of a laser light source, the cone angle of the light needs to be expanded. By using an array of light sources or multiple edge lights, the distribution of light onto the reflective image source can be adjusted to provide more uniform illumination, with the result that the brightness of the displayed image can be made more uniform.
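For context on why a Lambertian source provides a usefully large cone angle: a Lambertian emitter has radiant intensity I(θ) = I₀ cos θ, and the fraction of its total emitted power contained within a cone of half-angle θ works out to sin²θ. The short check below verifies that closed form against direct numerical integration; it is illustrative background, not part of the patented design.

```python
import math

def lambertian_fraction(theta):
    """Fraction of a Lambertian source's power inside cone half-angle theta,
    by numerically integrating cos(t) over solid angle (weight ~ cos(t) sin(t))."""
    n = 100_000
    total, inside = 0.0, 0.0
    for i in range(n):
        t = (i + 0.5) * (math.pi / 2) / n      # midpoint samples over [0, pi/2]
        w = math.cos(t) * math.sin(t)          # constants cancel in the ratio
        total += w
        if t < theta:
            inside += w
    return inside / total

for deg in (30, 45, 60):
    th = math.radians(deg)
    print(deg, round(lambertian_fraction(th), 4), round(math.sin(th) ** 2, 4))
```

A 60-degree cone already captures 75% of the emitted power, which is why broad Lambertian edge lights illuminate the image source more evenly than narrow-cone sources such as unexpanded lasers.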
The image light provided by the front-lit image source of the invention is passed into the display optics of the portable display system. Depending on how the image from the display is to be used, various display optics are possible. For example, when the display is a flat-screen display, the display optics can be diffusive; alternatively, when the display is a near-eye display or head-mounted display, the display optics can be refractive or diffractive.
Figure 139 is a flow chart of a method of the invention for a portable display system with a reflective image source. In step 13900, polarized illumination light is provided at one or more edges of the reflective image source. In step 13902, the curved wire-grid polarizer receives the illumination light and deflects it to illuminate the reflective image source, where the curve of the wire-grid polarizer is chosen to improve the uniformity of illumination over the area of the reflective image source. In step 13904, the reflective image source receives the illumination light, reflects it, and simultaneously changes the polarization state of the illumination light in correspondence with the image being displayed. The image light then passes through the curved wire-grid polarizer in step 13908 and into the display optics. In step 13910, the image is displayed by the portable display system.
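The per-pixel modulation in step 13904 can be modeled, under the common idealization that the liquid crystal cell acts as a variable retarder analyzed against the crossed illumination polarization, as a transmitted fraction T = sin²(Γ/2), where Γ is the round-trip retardance set by the pixel drive. This is a generic textbook model of a reflective LC pixel, assumed here for illustration rather than taken from the patent:

```python
import math

def pixel_transmission(retardance):
    """Transmitted fraction for a variable retarder analyzed between
    effectively crossed polarization states: T = sin^2(retardance / 2)."""
    return math.sin(retardance / 2.0) ** 2

# Round-trip retardance 0 -> dark pixel; pi -> fully bright pixel.
levels = [pixel_transmission(g * math.pi / 4) for g in range(5)]
print([round(v, 4) for v in levels])
```

Intermediate retardances give intermediate intensities, which is how the "greater or lesser change in polarization state" described earlier produces gray levels in the displayed image.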
In embodiments, a lightweight portable display system for displaying images with a reflective LCD image source can include one or more edge lights (adjoining one or more edges of the reflective LCD image source to provide polarized illumination light), a curved wire-grid polarizer partial reflector (which receives the polarized illumination light and deflects it to illuminate the reflective LCD image source), and display optics (which receive the reflected image light from the reflective LCD image source and display the image). In addition, the one or more edge lights may include light-emitting diodes. In embodiments, the wire-grid polarizer can be a flexible film, and the flexible film can be held in a curved shape by side frames. In embodiments, the curved wire-grid polarizer of the display system can be a parabolic, radial, or complex spline curve. In addition, the reflective LCD image source of the display system can be an LCOS. In embodiments, the display optics of the display system may include a diffuser, and the display system can be a flat-screen display. In embodiments, the display optics of the display system may include refractive or diffractive elements, and the display system can be a near-eye display or head-mounted display.
In embodiments, a method for providing images on a lightweight portable display system with a reflective LCD image source can include: providing polarized illumination light at one or more edges of the reflective LCD image source; receiving the illumination light with a curved wire-grid polarizer and deflecting the light to illuminate the reflective LCD image source; reflecting the light while changing the polarization state of the illumination light in correspondence with the image displayed by the reflective LCD image source, to provide image light; passing the image light through the curved wire-grid polarizer; receiving the image light with display optics; and displaying the image. In embodiments of the method, the curved shape of the curved wire-grid polarizer can be chosen to improve the uniformity of illumination of the reflective LCD image source. In addition, the one or more edge lights may include light-emitting diodes. In embodiments, the wire-grid polarizer can be a flexible film. Further, the flexible film can be held in a curved shape by side frames. In various embodiments of the invention, the curved wire-grid polarizer can be a parabolic, radial, or complex spline curve. In addition, in embodiments of the above method, the reflective LCD image source can be an LCOS. In embodiments, the display optics may include a diffuser, and the display system can be a flat-screen display. In embodiments of the above method, the display optics may include refractive or diffractive elements, and the display system can be a near-eye display or head-mounted display.
Figure 96 depicts an embodiment of the front light 9504 comprising optically bonded prisms with a polarizer. The prisms appear as two cuboids with a substantially transparent interface 9602 between them. Each cuboid is bisected diagonally, with polarizing film 9604 placed along the bisecting interface. The lower triangular portions of the bisected cuboids can optionally be made as a single piece 9608. The prisms can be made of BK-7 or equivalent. In this embodiment, the cuboids have square ends measuring 2 mm by 2 mm. The length of the cuboids in this embodiment is 10 mm. In an alternative embodiment, the bisection includes a 50% mirror 9704 surface, and the interface between the two cuboids includes a polarizer 9702 that transmits light of the P state.
Figure 98 depicts three versions of the LCoS front-light design. Figure 98A depicts a wedge with a multi-layer coating (MLC). This concept uses the MLC to define the specific angles of reflection and transmission. In this embodiment, image light of either the P or S polarization state is viewed by the user's eye. Figure 98B depicts a PBS with a polarizer coating. Here, only S-polarized image light is transmitted to the user's eye. Figure 98C depicts a right-angle prism, which eliminates much of the prism material while allowing the image light to be transmitted through air as S-polarized light.
Figure 99 depicts a wedge plus PBS with polarizing film 9902 stacked on LCoS 9904.
Figure 100 depicts two embodiments of the prism: one in which light enters the short end (A) and one in which light enters along the long end (B). In Figure 100A, a wedge is formed by an offset bisection of the cuboid, forming an angle of at least 8.6 degrees at the bisecting interface. In this embodiment, the offset bisection yields a portion 0.5 mm high and another portion 1.5 mm high on the side through which RGB LED 10002 emits light. Polarizing film 10004 is placed along the bisection. In Figure 100B, a wedge is formed by an offset bisection of the cuboid, forming an angle of at least 14.3 degrees at the bisecting interface. In this embodiment, the offset bisection yields a portion 0.5 mm high and another portion 1.5 mm high on the side through which RGB LED 10008 emits light. Polarizing film 10010 is placed along the bisection.
Figure 101 depicts a curved PBS film 10104, illuminated by RGB LEDs 10102, placed over LCoS chip 10108. RGB light from LED array 10102 is reflected by PBS film 10104 onto the surface of the LCoS chip, while light reflected from the imaging chip passes unimpeded through the film to the optics assembly and eventually reaches the user's eye. Films used in this system include the Asahi film, which has a triacetate cellulose or cellulose acetate (TAC) substrate. In embodiments, the film can have 100 nm UV-embossed corrugations, with a calendered coating constructed on the ridges, which can be angled with respect to the incidence angle of the light. The Asahi film can be supplied in rolls 20 cm wide by 30 meters long, and has BEF (brightness enhancement film) properties when used in LCD illumination. The Asahi film can support wavelengths from visible light to IR, and remains stable up to 100 °C.
In another embodiment, Figures 21 and 22 depict an alternative arrangement of the waveguide and projector in exploded view. In this arrangement, the projector is placed just behind the hinge of the eyepiece arm, and it is oriented vertically so that the RGB LED signals initially travel vertically until their direction is changed by a reflecting prism to enter the waveguide lens. The vertically arranged projection engine can have the PBS 218 at the center, the RGB LED array at the bottom, a hollow tapered tunnel with a thin-film diffuser for color mixing, and a condenser lens for collecting the light into the optics. The PBS can have a pre-polarizer on its entrance face. The pre-polarizer can be aligned to transmit light of one polarization (such as p-polarized light) and to reflect (or absorb) light of the opposite polarization (such as s-polarized light). The polarized light can then pass through the PBS to field lens 216. The purpose of field lens 216 can be to produce near-telecentric illumination of the LCoS panel. The LCoS display can be truly reflective, reflecting colors in the correct temporal sequence so that the image is displayed correctly. Light can reflect from the LCoS panel and, for bright areas of the image, can be rotated to s polarization. The light is then refracted by field lens 216, can be reflected at the internal interface of the PBS, and leaves the projector toward the coupling lens. The hollow tapered tunnel 220 can replace the homogenizing lenslets of other embodiments. By orienting the projector vertically and placing the PBS in the center, space is conserved, and the projector can be placed in the hinge space with almost no moment arm hanging from the waveguide.
Light reflected or scattered from the image source or from the optics associated with the eyepiece can escape outward into the environment. These light losses are perceived by outside observers as "eye glow" or "night glow," in which, when viewed in a relatively dark environment, portions of the lens or the areas around the eyepiece appear to glow. In some cases of eye glow, as shown in Figure 22A, when viewed from the outside by an external observer, the displayed image is seen as an observable image 2202A in the display area. It is preferable to reduce eye glow, both to maintain the privacy of the image being viewed and to make the user less conspicuous when using the eyepiece in a relatively dark environment, thereby maintaining the privacy of the user's viewing experience. Various methods and devices can reduce eye glow through light control elements, such as using partially reflective mirrors in the optics associated with the image light source, using polarizing optics, and the like. For example, the light entering the waveguide can be polarized, such as s-polarized. The light control element can include a linear polarizer. Here, the linear polarizer in the light control element is oriented relative to the linearly polarized image light such that the second portion of the linearly polarized image light passing through the partially reflective mirror is blocked, and eye glow is reduced. In embodiments, eye glow from light reflected off the user's eye (for example, p-polarized in this instance) can be minimized or eliminated by a polarizer attached to the lens, waveguide, or frame (such as the fastened optical devices described herein).
In embodiments, the light control element may include a second quarter-wave film and a linear polarizer. The second quarter-wave film transforms the second portion of the circularly polarized image light into linearly polarized image light whose polarization state is blocked by the linear polarizer in the light control element, so that eye glow is reduced. For example, when the light control element includes a linear polarizer and a quarter-wave film, incoming unpolarized scene light from the environment in front of the user is transformed into linearly polarized light, and 50% of the light is blocked. The first portion of the scene light passing through the linear polarizer is linearly polarized light, which is transformed into circularly polarized light by the quarter-wave film. The third portion of the scene light reflected from the partially reflective mirror has the reversed circular polarization state, and this light is then converted to linearly polarized light by the second quarter-wave film. The linear polarizer then blocks the reflected third portion of the scene light, thereby reducing the escaping light and reducing eye glow. Figure 22B shows an example of a see-through display assembly in an eyeglass frame with a light control element. Eyeglass cross-section 2200B shows the components of the see-through display assembly in eyeglass frame 2202B. The light control element covers the entire see-through view seen by the user. In the field of view 2214B of the user's eye, supporting members 2204B and 2208B are shown supporting partially reflective mirror 2210B and beam splitter layer 2212B, respectively. Supporting members 2204B and 2208B and light control element 2218B are connected to eyeglass frame 2202B. Other components, such as fold mirror 2220B and first quarter-wave film 2222B, are also connected to supporting members 2204B and 2208B, so that the combined assembly is structurally rigid.
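The blocking sequence described above (linear polarizer, quarter-wave film, partial reflector, quarter-wave film again, then the same polarizer) is the classical circular-polarizer isolator, and its effect can be checked with Jones matrices. A minimal sketch under ideal-component assumptions, using the standard "unfolded" treatment in which an ideal normal-incidence reflector is the identity and the quarter-wave film is simply traversed twice:

```python
import numpy as np

pol_x = np.array([[1, 0], [0, 0]], dtype=complex)   # linear polarizer, pass axis x
qwp_45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                         [1 - 1j, 1 + 1j]])         # quarter-wave film, fast axis 45 deg

scene_x = np.array([1.0, 0.0], dtype=complex)       # the half of scene light passing pol_x

# Unfolded round trip: polarizer -> QWP -> (ideal reflector = identity in this
# convention) -> QWP -> polarizer.  The QWP traversed twice acts as a half-wave
# plate at 45 deg, rotating x-polarization to y, which pol_x then blocks.
escaping = pol_x @ qwp_45 @ qwp_45 @ pol_x @ scene_x

print(np.abs(escaping) ** 2)   # escaping intensity per component
```

The escaping intensity is zero in this idealization: whatever the partial mirror sends back is absorbed by the polarizer it first passed through, which is the mechanism by which the light control element suppresses eye glow.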
Stray light in compact optics such as head-mounted displays typically comes from scattering off the sidewalls of the housing or other structures, where the light encounters the surface at a steep angle. Such stray light produces a bright area of scattered light surrounding the displayed image.
There are two methods of reducing such stray light. One is to darken or roughen the sidewalls or other structures to reduce their reflectivity. However, although this does increase the absorption at the surface, light scattered from the surface can still be noticeable. The other method is to provide baffles that block or trim the stray light. Blocking or trimming the light scattered from the surface greatly reduces the effect of this stray light. In a head-mounted display, it is beneficial to use both methods to reduce stray light, because the bright area surrounding the displayed image is eliminated and the contrast of the displayed image is increased.
United States Patent 5,949,583 provides a viewing window at the top of a head-mounted display to block stray light from entering from above. However, this does not address the need to control and reduce stray light originating inside the head-mounted display system.
United States Patent 6,369,952 provides two shields to block light around the perimeter of the liquid crystal display image source in a head-mounted display. The first shield is located on the input side of the liquid crystal image source, adjoining the backlight, and the second shield is located on the output side of the liquid crystal display. Since the two shields are each positioned near the liquid crystal display, "the first shield 222 and the second shield 224 respectively have apertures or windows 232, 234, which are substantially equal and congruent to the active area of the LCD" (column 15, lines 15-19). By positioning the shields near the image source, the shields have very little effect on light emitted from the image source over large cone angles from regions nearer the center of the image source's active area. This large-cone-angle light can reflect in various ways from the sidewalls of the housing, thereby constituting stray light in the form of bright areas and resulting in reduced contrast.
Thus, there remains a need for methods of reducing stray light from the various sources inside head-mounted displays.
Figure 160 shows an example of a display system with an optically flat reflective surface, the optically flat reflective surface being a beam splitter composed of an optical film on a substrate, where the display system is a near-eye display 16002. In this example, image source 16012 includes a projection system (not shown) that provides image light using an optical layout with a folded optical axis 16018 located in near-eye display 16002. The optics along optical axis 16018 can include lenses that focus the image light to provide a focused image from image source 16012 to the user's eye 16004. Beam splitter 16008 folds optical axis 16018 from image source 16012 toward spherical (or aspheric) reflector 16010. Beam splitter 16008 can be a partially reflective mirror or a polarizing beam splitter. The beam splitter 16008 in near-eye display 16002 is oriented at an angle to redirect at least a portion of the image light from image source 16012 toward reflector 16010. From reflector 16010, at least another portion of the image light is reflected back toward the user's eye 16004. This further portion of the reflected image light passes back through beam splitter 16008 and is focused at the user's eye 16004. Reflector 16010 can be a mirror or a partial mirror. In the case where reflector 16010 is a partial mirror, scene light from the scene in front of near-eye display 16002 can be combined with the image light, thereby presenting to the user's eye 16004 combined image light 16020 composed of the image light along axis 16018 and scene light 16014. The combined image light 16020 presents to the user's eye 16004 a combined image of the scene and the overlaid image from the image source.
Figure 161 shows a diagram of near-eye display module 200. Module 200 is composed of reflector 16104, image source module 16108, and beam splitter 16102. The module can be open at the sides, with attachments between at least some of the connecting edges of reflector 16104, image source module 16108, and beam splitter 16102. Alternatively, module 200 can be closed at the sides by sidewalls, providing a sealed module that prevents dust, dirt, and water from contacting the interior surfaces of module 200. Reflector 16104, image source module 16108, and beam splitter 16102 can be manufactured separately and then attached together, or at least some of them can be fabricated together in connected subassemblies. In module 200, optical films can be used on beam splitter 16102 or reflector 16104. In Figure 161, beam splitter 16102 is shown as a flat surface, while reflector 16104 is shown as a spherical surface. In near-eye display module 200, both reflector 16104 and beam splitter 16102 can be used to provide the image to the user's eye, as shown in Figure 160; it is therefore important that the surfaces be optically flat or of optical-quality uniformity.
Given that image source 16108 includes a projection system having a light source with a large cone angle, the image light also has a large cone angle. As a result, the image light interacts with the sidewalls of module 200, and this interaction can produce reflected and scattered light in the form of bright areas (observed by the user as bright areas surrounding the displayed image). These bright areas are very distracting to the user, because they can appear as a halo around the displayed image. In addition, the scattered light can randomly cast a dim glow over the displayed image, degrading the contrast of the image.
Figure 162 shows a diagram of optics associated with a type of head-mounted display 16200. In the optics, light source 16204 provides large-cone-angle light including central ray 16202 and edge rays 16224. Light source 16204 can provide polarized light. The light passes from light source 16204 to illumination beam splitter 16210, which reflects a portion of the light toward reflective image source 16208, which can be an LCOS display. A first portion of the light is reflected by image source 16208 and simultaneously changed in polarization state in correspondence with the image content being displayed. A second portion of the light then passes through illumination beam splitter 16210 and then through one or more lenses 16212, which expand the cone angle of the light. A third portion of the light is reflected at an angle by imaging beam splitter 16220 toward spherical (or aspheric) partial mirror 16214. A fourth portion of the light is reflected by partial mirror 16214, causing the light to converge and the image to be focused at the user's eye 16228. After the fourth portion of the light is reflected by partial mirror 16214, a fifth portion of the light passes through imaging beam splitter 16220 and onto the user's eye 16228, where a magnified version of the image displayed by image source 16208 is provided to the user's eye 16228. In a see-through head-mounted display, light 16218 from the environment (or scene light) passes through partial mirror 16214 and imaging beam splitter 16220 to provide a see-through image of the environment. The user is then provided with a combined image composed of the displayed image from the image source and the see-through image of the environment.
Central ray 16202 travels along the optical axis of the head-mounted display's optics, through the center of the optics. The optics include illumination beam splitter 16210, image source 16208, lens 16212, imaging beam splitter 16220, and partial mirror 16214. Edge rays 16224 travel along the sides of housing 16222, where the light can interact with the sidewalls of housing 16222; edge rays 16224 can be reflected or scattered by the sidewalls, as shown in Figure 162. This reflected or scattered light from edge rays 16224 is visible to the user as bright areas surrounding the displayed image or as reduced contrast in the image. The present invention provides various methods of reducing the reflected and scattered light by blocking or trimming the light reflected or scattered from the sidewalls, thereby reducing the bright areas.
Figure 163 shows a diagram of a first embodiment of the invention, wherein a baffle 16302 is added to the sides of the housing 16222, between the illuminating beam splitter 16210 and the lens 16212. The baffle 16302 blocks or trims the edge rays 16224 before they pass to the lens 16212. The baffle 16302 can be made of any material that is opaque, so that the edge rays 16224 are blocked or trimmed. In a preferred embodiment, the baffle 16302 can be made of a black material with a matte finish, so that incident light is absorbed by the baffle. The baffle 16302 can be made of a sheet material with a hole, positioned within the housing 16222, or the baffle 16302 can be made as part of the housing 16222. Because the baffle 16302 is placed at a distance from the image source 16208 and the image light is diverging, the hole defined by the surrounding baffle 16302 is larger than the active area of the image source 16208, so the image provided by the image source 16208 is not trimmed at the edges by the baffle; as a result, the entire image provided by the image source 16208 can be seen by the user's eye, as shown in Figure 163. In addition, the baffle preferably has a thin cross section (as shown in Figure 163) or sharp edges, so that light is not scattered from the edges of the baffle.
Figure 164 shows a diagram of another embodiment of the invention, wherein a baffle 16402 is added at the entrance face of the lens. The baffle 16402 can be made as part of the housing 16222, or the baffle 16402 can be a coating applied onto the lens 16212. In either case, the baffle 16402 should be opaque, and preferably black with a non-gloss finish, to block and absorb incident light.
Figure 165 shows a diagram of an embodiment of the invention similar to the embodiment shown in Figure 164, except that the baffle is on the exit side of the lens 16212. In this embodiment, a baffle 16502 is provided to block or trim the edge rays 16224 after they pass through the lens 16212.
Figure 166 shows a diagram of another embodiment of the invention, wherein a baffle 16602 is attached to the housing 16222 between the lens 16212 and the imaging beam splitter 16220. The baffle 16602 can be a part of the housing 16222, or the baffle 16602 can be a separate structure within the housing 16222. The baffle 16602 blocks or trims the edge rays 16224 so that bright areas are not presented to the user's eye 16228 around the periphery of the displayed image.
Figure 167 shows a diagram of yet another embodiment of the invention, wherein an absorbing coating 16702 is applied to the sidewalls of the housing 16222 to reduce the reflection and scattering of incident light and edge rays 16224. The absorbing coating 16702 can be combined with baffles 16302, 16402, 16502 or 16602.
Figure 168 shows a diagram of another source of stray light in a head-mounted display, wherein stray light 16802 enters directly from the edge of the light source 16204. This stray light 16802 can be especially bright, because it comes directly from the light source 16204 without first being reflected from the illuminating beam splitter 16210 and then reflected from the image source 16208. Figure 169 shows a diagram of another source of stray light 16902 from the light source 16204, wherein the stray light is reflected from the surface of the image source 16208, where its polarization state is changed, and the stray light 16902 can then pass through the illuminating beam splitter at a relatively steep angle. This stray light 16902 can then be reflected from any reflective surface in the housing or from the edges of the lens 16212, as shown in Figure 169. Figure 170 shows a diagram of a further embodiment of the invention, wherein a baffle 17002 has been placed adjacent to the light source 16204. The baffle 17002 is opaque and extends out from the light source 16204, so that the stray light 16802 and 16902 is blocked or trimmed just after the light source 16204, thereby preventing the stray light from reaching the user's eye 16228.
In another embodiment, the baffles or coatings shown in Figures 163-167 and 169-170 are combined to further reduce stray light in the head-mounted display, thereby reducing the bright areas around the periphery of the displayed image or increasing the contrast in the displayed image. Multiple baffles can be used between the light source 16204 and the imaging beam splitter 16220. In addition, as shown in Figure 171, an absorbing coating with ridges can be used, wherein a series of small ridges or steps act as a series of baffles to block or trim edge rays over the entire sidewall area of the housing 16222. The ridges 17102 can be made as part of the housing 16222, or as a separate layer attached to the inner sidewalls of the housing 16222.
Figure 172 shows another embodiment, a tape or sheet 17210, which includes a carrier 17212 and ridges 17214 that can be used to block reflected light as in Figure 171. The ridges 17214 have a non-perpendicular angle on one side and a sharp slope on the other side, so that incident light entering from the sharply sloped side is blocked. The ridges 17214 can be solid ridges with a triangular cross section having a sharp edge, as shown in Figure 172; alternatively, they can be thin angled blades attached at one edge, or they can be fibers attached at an angle at one end, so that one surface forms an angle relative to the sidewall and incident light is blocked. An advantage of the tape or sheet 17210 is that the ridges 17214 can be relatively thin, and the ridges can cover major areas of the housing 16222. Another advantage of the tape or sheet 17210 is that the ridges 17214 are easier to manufacture than the ridges shown in Figure 171, which may be difficult to mold as part of the housing.
In all of the embodiments, the surrounding baffles can define holes whose sizes correspond to their distance along the optical axis from the image source, so that the image light can diverge along the optical axis, thereby providing an untrimmed view of the image source 16208 to the user's eye 16228.
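The relationship between baffle placement and hole size described above reduces to simple geometry: a baffle farther from the image source needs a larger hole so the diverging image light is not trimmed. The sketch below is illustrative only; the function name, dimensions and divergence half-angle are assumed example values, not taken from this specification.

```python
import math

def min_baffle_aperture(source_width_mm, half_angle_deg, distance_mm):
    # Minimum hole width for a baffle placed `distance_mm` from an image
    # source of active width `source_width_mm`, so that image light
    # diverging at `half_angle_deg` from the source edges is not trimmed.
    spread = 2.0 * distance_mm * math.tan(math.radians(half_angle_deg))
    return source_width_mm + spread

# Illustrative: a baffle twice as far from the source needs a larger hole.
near_hole = min_baffle_aperture(10.0, 15.0, 8.0)
far_hole = min_baffle_aperture(10.0, 15.0, 16.0)
```

With zero divergence the required hole equals the source's active width; any divergence makes the hole grow linearly with distance along the optical axis, consistent with the description above.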
In an embodiment, an absorptive polarizer in the optics assembly is used to reduce stray light. The absorptive polarizer may include an antireflection coating. The absorptive polarizer can be placed after the focusing lens of the optics assembly, to reduce the light that passes through the optically flat film of the optics assembly. Light from the image source can be polarized to increase contrast.
In an embodiment, antireflection coatings in the optics assembly can be used to reduce stray light. An antireflection coating can be placed on a polarizer of the optics assembly or on a retarder film of the optics assembly. The retarder film can be a quarter-wave film or a half-wave film. An antireflection coating can be placed on the outer surface of the partially reflective mirror. Light from the image source can be polarized to increase contrast.
With reference to Figure 102A, an image source 10228 directs image light to the beam splitter layer of the optics assembly. Figure 103 depicts a magnified view of the image source 10228. In this particular embodiment, the image source 10228 is shown including a light source (LED strip 10302), whose light is directed through a diffuser 10304 and a prepolarizer 10308 to a curved wire-grid polarizer 10310, where the light is reflected to an LCoS display 10312. The image light from the LCoS then passes through the curved wire-grid polarizer 10310 and a half-wave film and is reflected back to the beam splitter layer of the optics assembly 10200. In embodiments, the optics assembly, including the optical components 10204, 10210, 10212 and 10230, may be provided as a sealed optics assembly, such as detachable (e.g., snapping on and off), replaceable, and the like, and the image source 10228 may be provided as a sealed unit in the eyepiece frame. This allows the sealed optics assembly to be waterproof and dustproof, replaceable, customizable, and so on. For example, a given sealed optics assembly may be provided with corrective optics for one person, and may be replaced with a second sealed optics assembly for another person with different corrective optics requirements (e.g., a different prescription). In embodiments, there may be applications in which both eyes need not receive input from the eyepiece. In that case, a person can simply detach one side and use only the single remaining side for projection of content. In this way, the user has an unobstructed optical path for the eye from which the assembly has been removed, and the eyepiece saves battery life and the like by running only half of the system.
The optics assembly can be divided into separate portions with respect to which portions are sealed, such as an image generation facility 10228 and a presentation optics facility comprising components 10204, 10210, 10212 and 10230, as shown in Figure 102A. In another illustration, Figure 147 shows an eyepiece configuration in which the presentation optics are shown as 'projection screens' 14608a and 14608b. Figure 102A also shows the eyepiece electronics and a portion of the projection system 14602, where this portion of the projection system may be referred to as the image generation facility. The image generation facility and the presentation optics facility can be sealed subassemblies, such as where the delivery optics are protected from intrusion of contaminants from the surrounding environment. In addition, the presentation optics can be removable, such as for replacement, for removal to allow the user an unobstructed view, for forcible removal in a non-destructive manner (e.g., where the presentation optics are struck and detach from the body of the eyepiece without damage), and the like. In embodiments, the present invention may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optics assembly (through which the user views the surrounding environment and displayed content) and an integrated image source (adapted to introduce the content into the optics assembly), wherein the optics assembly includes an image generation facility mounted in the frame of the eyepiece and a presentation optics facility positioned in front of the user's eye and removable from the frame of the eyepiece, and wherein the image generation facility is sealed within the frame to reduce contamination from the surrounding environment. In embodiments, the seal can be a sealed optical window. As described herein, the eyepiece may also include a processing facility, a power management facility, a removal sensor, a battery, and the like, wherein the power management facility can detect removal of the presentation optics facility through a removal indication from the removal sensor, and selectively reduce electrical power to components of the eyepiece to reduce the power drawn from the battery. For example, the component whose power is reduced can be the image source, such as by reducing the brightness of the image source, shutting off power to the image source, and the like, wherein the power management facility can monitor for reattachment of the presentation optics facility and restore the power usage of the image source to its working level before removal. The presentation optics facility can be removable in a breakaway manner, so that when the presentation optics facility is removed unintentionally, it is removed without damaging the eyepiece. The presentation optics facility can be removable by an attachment mechanism, such as a magnet, a bolt, a rail, a snap-on connector, and the like. The presentation optics facility can provide vision correction to a user who needs corrective eyewear, wherein the presentation optics facility can be replaced for the purpose of changing the vision-correction prescription of the eyepiece. The eyepiece can have two separate removable optics assemblies, one for each eye, wherein one of the separate optics assemblies is removed to allow monocular use of the remaining one of the separate optics assemblies. For example, a monocular use can be gun sighting, where the presentation optics facility is removed on the side of the eyepiece used for gun sighting, thereby allowing the user an unobstructed visual path for aiming the gun, while retaining the facilities provided by the eyepiece to the other eye. The presentation optics facility can be removable to allow replacement of a presentation optics facility suited to indoor use with a presentation optics facility suited to outdoor use. For example, indoor use versus outdoor use may call for different filters, fields of view, contrast, shading, and the like. The presentation optics facility can be adapted to receive attachment elements, optical elements, mechanical elements, adjustment elements, and the like. For example, an optical element can be inserted to make adjustments on the side of the optics facing the user. The presentation optics facility can also be replaced to change the field of view provided, such as by replacing a presentation optics facility having a first field of view with a presentation optics facility having a second field of view.
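The power-management behavior just described — cutting power to a side's image source when the removal sensor reports that that side's presentation optics are detached, and restoring it on reattachment — can be sketched as follows. The class and method names are hypothetical, not from this specification.

```python
class EyepiecePowerManager:
    # Hypothetical sketch: tracks whether each side's image source is powered.
    def __init__(self):
        self.image_source_on = {"left": True, "right": True}

    def on_removal_sensor(self, side, attached):
        # Cut image-source power for a detached side to save battery;
        # restore the prior working level when the optics are reattached.
        self.image_source_on[side] = attached

pm = EyepiecePowerManager()
pm.on_removal_sensor("right", attached=False)  # right-side optics removed
```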
Referring to Figure 104, the LED provides unpolarized light. The diffuser spreads the light from the LED and homogenizes it. An absorptive prepolarizer converts the light to S polarization. The S-polarized light is then reflected by the curved wire-grid polarizer toward the LCOS. The LCOS reflects the S-polarized light and, depending on the local image content, converts it to P-polarized light. The P-polarized light passes through the curved wire-grid polarizer and becomes P-polarized image light. A half-wave film converts the P-polarized light into S-polarized light.
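This polarization chain can be checked with a simple Jones-vector model. The sketch below treats every element as ideal and lossless, and the S = [1, 0], P = [0, 1] convention is an assumption of ours, not the specification's.

```python
import numpy as np

S = np.array([1.0, 0.0])  # S-polarized Jones vector (our convention)
P = np.array([0.0, 1.0])  # P-polarized Jones vector

reflect_S = np.array([[1.0, 0.0], [0.0, 0.0]])   # wire grid reflects S
transmit_P = np.array([[0.0, 0.0], [0.0, 1.0]])  # wire grid transmits P
half_wave = np.array([[0.0, 1.0], [1.0, 0.0]])   # ideal half-wave retarder at 45 deg

lcos_bright = half_wave  # a bright LCOS pixel converts S to P on reflection

light = reflect_S @ S        # curved wire grid reflects S toward the LCOS
light = lcos_bright @ light  # bright pixel: S -> P
light = transmit_P @ light   # P image light passes the wire grid
light = half_wave @ light    # half-wave film: P -> S

# A dark pixel leaves the light S-polarized, so the wire grid passes nothing:
dark = transmit_P @ (np.eye(2) @ (reflect_S @ S))
```

The bright-pixel path emerges S-polarized, matching the last step of the paragraph above, while a dark pixel's light is rejected at the wire-grid polarizer, which is what produces image contrast.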
Referring again to Figure 102A, an embodiment in which the image source provides polarized image light 10208 and the beam splitter layer 10204 is a polarizing beam splitter, so that the reflected image light 10208 is linearly polarized, is shown in Figure 102A along with the associated polarization control. In the case where the image source provides linearly polarized image light and the beam splitter layer 10204 is a polarizing beam splitter, the polarization state of the image light is aligned with the polarizing beam splitter so that the image light 10208 is reflected by the polarizing beam splitter. Figure 102A shows the reflected image light as being in the S polarization state. In the case where the beam splitter layer 10204 is a polarizing beam splitter, a first quarter-wave film 10210 is disposed between the beam splitter layer 10204 and the partially reflective mirror 10212. The first quarter-wave film 10210 converts the linearly polarized image light into circularly polarized image light (illustrated in Figure 102A as S being converted to CR). A reflected first portion of the image light 10208 is then also circularly polarized, with the circular polarization state reversed (illustrated as CL in Figure 102A), so that, compared with the polarization state of the image light 10208 provided by the image source (illustrated as S), after passing back through the quarter-wave film the polarization state of the reflected first portion of the image light 10208 is rotated (to P polarization); as a result, the reflected first portion of the image light 10208 passes through the polarizing beam splitter without reflection losses. When the beam splitter layer 10204 of the see-through display assembly 10200 is a polarizing beam splitter and includes the first quarter-wave film 10210, the light control element 10230 includes a second quarter-wave film and a linear polarizer 10220. In embodiments, the light control element 10230 includes a controllable darkening layer 10214. The second quarter-wave film 10218 converts a second portion of the circularly polarized image light 10208 into linearly polarized image light 10208 (illustrated as CR being converted to S), and this linearly polarized image light has a polarization state that is blocked by the linear polarizer 10220 in the light control element 10230, so that eye glow is reduced.
When the light control element 10230 includes the linear polarizer 10220 and the quarter-wave film 10218, incoming unpolarized scene light 10222 from the external environment in front of the user is converted into linearly polarized light (illustrated in Figure 102A as the P polarization state), and 50% of the light is blocked. A first portion of the scene light 10222 that passes through the linear polarizer 10220 is linearly polarized light, which is converted into circularly polarized light by the quarter-wave film (illustrated in Figure 102A as P being converted to CL). A third portion of the scene light that is reflected from the partially reflective mirror 10212 has a reversed circular polarization state (illustrated in Figure 102A as CL being converted to CR), and this light is then converted into linearly polarized light by the second quarter-wave film 10218 (illustrated in Figure 102A as CR being converted to S polarization). The linear polarizer 10220 then blocks the reflected third portion of the scene light, thereby reducing escaping light and reducing eye glow.
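This eye-glow suppression path can be traced symbolically using the polarization-state labels of Figure 102A. The sketch below is pure bookkeeping, treating each element as ideal; the dictionaries encode exactly the state transitions stated in the paragraph above.

```python
# State labels as used in Figure 102A: P/S linear, CL/CR circular.
quarter_wave = {"P": "CL", "S": "CR", "CL": "P", "CR": "S"}
mirror = {"CL": "CR", "CR": "CL"}  # reflection reverses circular handedness

state = "P"                  # scene light after the linear polarizer 10220
state = quarter_wave[state]  # second quarter-wave film 10218: P -> CL
state = mirror[state]        # partially reflective mirror 10212: CL -> CR
state = quarter_wave[state]  # back through the quarter-wave film: CR -> S
blocked = (state == "S")     # the linear polarizer 10220 passes P, blocks S
```

The reflected scene light returns in the S state, orthogonal to the polarizer's pass axis, which is why the reflected third portion is absorbed rather than escaping toward the scene.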
The reflected first portion of the image light 10208 shown in Figure 102A and the transmitted second portion of the scene light have the same circular polarization state (illustrated as CL), so that they combine and are converted by the first quarter-wave film 10210 into linearly polarized light (illustrated as P), and when the beam splitter layer 10204 is a linear beam splitter, the linearly polarized light passes through the beam splitter. The linearly polarized combined light 10224 then provides a combined image to the eye 10202 of a user positioned at the back of the see-through display assembly 10200, wherein the combined image is composed of the displayed image from the image source superimposed on the see-through view of the external environment in front of the user.
The beam splitter layer 10204 includes an optically flat film, such as the Asahi TAC film described herein. The beam splitter layer 10204 can be placed at an angle in front of the user's eye, so that the beam splitter layer reflects corresponding portions of the image light and transmits the scene light of the see-through view of the surrounding environment, so that a combined image composed of portions of the image light and the transmitted scene light is provided to the user's eye. The optically flat film can be a polarizer, such as a wire-grid polarizer. The optically flat film can be laminated to a transparent substrate. The optically flat film can be molded, cast, glued, etc., onto or into one of the optical surfaces of the eyepiece, such as the beam splitter. The optically flat film can be set at less than 40 degrees to vertical. The curved polarizing film can have a ratio of light source height to illuminated-area width of less than 1:1. The highest point of the curved film is less than the length of the narrowest axis of the display. In embodiments, once the optical film is in place on the beam splitter, additional optics, such as corrective optics, prescription optics, and the like, can be added to the surface, such as to hold the film flat in an interlayer between them.
The present invention also provides methods for providing an optically flat surface with an optical film. Optical films are a convenient way to form an optical structure with optical characteristics very different from those of the rest of the imaging device's structure. To provide its function in the imaging device, the optical film needs to be attached to the optics. When an optical film is used in a reflective manner, it is critical that the reflective surface be optically flat, because otherwise the wavefront of the light reflected from the reflective surface will not be preserved and the image quality will be degraded. An optically flat surface can be defined as a surface that is uniform to within 5 wavelengths of light per inch, measured at the wavelength of the light used by the imaging device, compared to either a flat surface or a desired optical curve.
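The flatness criterion stated above lends itself to a one-line check. The sketch below assumes an example visible wavelength of 550 nm; the specification defines the tolerance in wavelengths of whatever light the imaging device actually uses.

```python
def is_optically_flat(peak_deviation_nm, span_inches, wavelength_nm=550.0):
    # Surface counts as optically flat if its deviation from the ideal
    # flat or curved reference stays within 5 wavelengths per inch of span.
    return peak_deviation_nm <= 5.0 * wavelength_nm * span_inches

# 5 x 550 nm = 2750 nm of allowed deviation per inch at 550 nm.
ok = is_optically_flat(2000.0, 1.0)
too_rough = is_optically_flat(3000.0, 1.0)
```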
Optically flat surfaces including optical films as described in the present invention can be included in display systems, including: projectors, projection televisions, near-eye displays, head-mounted displays, see-through displays, and the like.
Figure 140 shows an example of a display system with an optically flat reflective surface, where the optically flat reflective surface is a beam splitter composed of an optical film on a substrate, and where the display system is a near-eye display 14000. In this example, an image source 14010 including a projection system (not shown) provides image light using an optical layout that includes a folded optical axis 14014 within the near-eye display 14000. The optics along the optical axis 14014 can include lenses that focus the image light to provide a focused image from the image source 14010 to a user's eye 14002. A beam splitter 14004 folds the optical axis 14014 from the image source 14010 toward a spherical or aspherical reflector 14008. The beam splitter 14004 can be a partially reflective mirror or a polarizing beam splitter layer. The beam splitter 14004 in the near-eye display 14000 is oriented at an angle to redirect at least a portion of the image light from the image source 14010 toward the reflector 14008. From the reflector 14008, at least another portion of the image light is reflected back toward the user's eye 14002. The reflected other portion of the image light passes back through the beam splitter 14004 and is focused toward the user's eye 14002. The reflector 14008 can be a mirror or a partial mirror. In the case where the reflector 14008 is a partial mirror, scene light from the scene in front of the near-eye display 14000 can be combined with the image light, thereby presenting to the user's eye 14002 combined image light 14018 composed of the image light along axis 14014 and the scene light along axis 14012. The combined image light 14018 presents to the user's eye a combined image of the scene with an overlaid image from the image source.
Figure 141 shows a diagram of a near-eye display module 14100. The module 14100 is composed of a reflector 14104, an image source module 14108 and a beam splitter 14102. The module can be open at the sides, with attachments located at at least some of the connecting edges between the reflector 14104, the image source module 14108 and the beam splitter 14102. Alternatively, the module 14100 can be closed at the sides by sidewalls, to provide a closed module that prevents dust, dirt and water from contacting the inner surfaces of the module 14100. The reflector 14104, the image source module 14108 and the beam splitter 14102 can be manufactured separately and then attached together, or at least some of them can be fabricated together in connected subassemblies. In the module 14100, optical films can be used in the beam splitter 14102 or in the reflector. In Figure 141, the beam splitter 14102 is shown as a flat surface, and the reflector 14104 is shown as a spherical surface. In the near-eye display module 14100, both the reflector 14104 and the beam splitter 14102 can be used to provide an image to the user's eye, as shown in Figure 140, so it is important that the surfaces be optically flat or of uniform optical quality.
Figure 142 shows a schematic diagram of one embodiment of the invention, a pellicle-type film assembly 14200. The pellicle-type film assembly 14200 includes a frame 14202 composed of an upper frame member 14202a and a lower frame member 14202b. An optical film 14204 is held between the frame members 14202a and 14202b using adhesive or fasteners. To improve the flatness of the optical film 14204, the optical film 14204 can be stretched in one or more directions while the adhesive is applied and the frame members 14202a and 14202b are bonded to the optical film 14204. After the optical film 14204 has been bonded to the frame 14202, the edges of the optical film can be trimmed to provide a smooth surface at the outer edges of the frame 14202.
In some embodiments of the invention, the optical film 14204 is a folded film composed of a series of optically flat surfaces, and the interfaces of the frame members 14202a and 14202b have a matching folded shape. The folded film is then stretched along the direction of the folds and bonded in position, so that the frame members 14202a and 14202b hold the optical film 14204 in the folded shape, with each surface of the series of optically flat surfaces held in its proper position.
In all cases, after the frame members 14202a and 14202b have been bonded to the optical film 14204, the resulting pellicle-type film assembly 14200 is a rigid assembly that can be placed into the optics of, for example, the near-eye display module 14100 to form the beam splitter 14102. In this embodiment, the pellicle-type film assembly 14200 is a replaceable beam splitter 14102 assembly in the near-eye display module 14100. The sidewalls in the near-eye display module 14100 can have slots into which the frame 14202 snaps, or, alternatively, flat surfaces connecting the sidewalls can be provided and the frame 14202 can be placed on top of the flat surfaces.
Figure 143 is a diagram of an insert-molded assembly 14300 that includes an optical film 14302. In this embodiment, the optical film 14302 is placed into a mold, and viscous plastic material is injected into the mold through the mold gate 14308, so that the plastic fills the mold cavity and forms a molded structure 14304 that adjoins the optical film 14302 and lies behind the optical film 14302. After the plastic material hardens in the mold, the mold is opened along the parting line 14310, and the insert-molded assembly 14300 is removed from the mold. The optical film 14302 is then embedded into and attached to the insert-molded assembly 14300. To improve the optical flatness of the optical film 14302 in the insert-molded assembly 14300, the inner surface of the mold against which the optical film 14302 is placed is an optically flat surface. In this way, the viscous plastic material forces the optical film 14302 against the optically flat surface of the mold during the molding process. This technique can be used to provide optically flat surfaces that are flat or that have a desired optical curve, as described above. In another embodiment, the optical film 14302 can be provided with an adhesive layer or a tie layer to increase the adhesion between the optical film 14302 and the molded structure 14304.
In another embodiment, the optical film 14302 is placed into the mold with a protective film between the mold surface and the optical film 14302. The protective film can be attached to the optical film 14302 or to the mold. The protective film is smoother or flatter than the mold surface, so that it provides a smoother or flatter surface to the optical film 14302 molded against it. The protective film can therefore be any material, such as plastic or metal.
Figure 144 shows a diagram of a laminating process for making a laminate with an optical film 14400. In this embodiment, upper and lower platens 14408a and 14408b are used to laminate the optical film 14400 onto a substrate 14404. An adhesive 14402 can optionally be used to bond the optical film 14400 to the substrate 14404. In addition, one or more of the platens 14408a and 14408b can be heated, or the substrate 14404 can be heated, to provide a higher degree of adhesion between the substrate 14404 and the optical film 14400. Heating the substrate or one or more of the platens 14408a and 14408b can also be used to soften the substrate 14404, thereby providing more uniform pressure behind the optical film 14400 to improve the flatness or smoothness of the optical film 14400 in the laminate. The laminate with the optical film 14400 of this embodiment can be used as a replaceable beam splitter in the near-eye display module 14100, as described above for the pellicle-type film assembly 14200.
Figures 145A-C show diagrams of an application process for making a molded structure 14502 with an optical surface that includes an optical film 14500. In this embodiment, the optical film 14500 is applied to an optically flat surface 14504 of the molded structure 14502 with a rubber applicator 14508. An adhesive layer can be applied either to the optically flat surface 14504 of the molded structure 14502 or to the underside of the optical film 14500 to bond the optical film 14500 to the molded structure 14502. The rubber applicator 14508 can be a relatively soft and flexible material with a curved surface, so that the central portion of the optical film 14500 is forced into contact with the optically flat surface 14504 of the molded structure 14502 first. As the rubber applicator 14508 is pushed down further, the size of the contact area between the optical film 14500 and the optically flat surface 14504 of the molded structure 14502 increases, as shown in Figures 145A, 145B and 145C. This progressive application process provides a highly uniform application of pressure, which allows air at the interface to be expelled during the application process. The progressive application process and the optically flat surface 14504 of the molded structure 14502 together provide an optically flat optical film 14500 attached to the inner surface of the molded structure 14502, as shown in Figure 145C. The adhesive layer for bonding the optical film 14500 to the molded structure 14502 can be applied to the optical film 14500 or to the optically flat surface 14504 on the inside of the molded structure 14502. Those skilled in the art will understand that this application process can similarly be used to apply an optical film to an outer surface of a molded structure. In addition, the optically flat surface can be a flat surface, a surface with a desired optical curve, or a series of optically flat surfaces, with the rubber applicator shaped accordingly to provide progressive application of pressure as the optical film is applied.
In embodiments, an image display system including an optically flat optical film may include a display module housing, wherein the housing includes a substrate that holds the optical film optically flat, an image source, and a viewing position, wherein an image provided by the image source is reflected from the optical film to the viewing position. In embodiments, the optical film of the image display system can be molded into the display module. In embodiments, the optical film can be applied onto the display module. Further, in embodiments, the optical film of the display system can be a wire-grid polarizer, a mirror, a partial mirror, a holographic film, and the like. In embodiments, the image display system can be a near-eye display. In embodiments, when the optical film is molded into the display module, the optical film can be held against an optically flat surface. In embodiments, the optical film of the image display system may have an optical flatness of 5 wavelengths of light per inch.
In an embodiment, an image display system including an optically flat optical film may include a substrate that holds the optical film optically flat, a display module housing, an image source, and a viewing position, wherein an image provided by the image source is reflected from the optical film to the viewing position, and the substrate with the optical film is replaceable within the display module housing. In such an embodiment, the substrate of the image display system can be a frame, the optical film can be held under tension by the frame, the substrate can be a plate molded behind the film, and/or the substrate can be a laminate. Further, the optical film of the image display system can be a beam splitter, a polarizing beam splitter, a wire-grid polarizer, a mirror, a partial mirror, a holographic film, and the like. Further, the image display system can be a near-eye display. In embodiments, when the optical film of the image display system is molded behind a plate, the optical film can be held against an optically flat surface. Further, in embodiments, when a plate is laminated to the optical film of the image display system, the optical film can be held against an optically flat surface. In embodiments, the optical film of the image display system may have an optical flatness of 5 wavelengths of light per inch.
In one embodiment, the components in Figure 102A together form an electro-optics module. The optical axis associated with the display can be at an angle of 10 degrees or more from vertical. This tilt refers to a forward lean of the top of the optics module. The tilt allows the beam splitter angle to be reduced, and reducing the beam splitter angle makes the optics module thinner. The ratio of the height of the curved polarizing film to the width of the reflective image display is less than 1:1. The curve of the polarizing film determines the width of the illuminated area on the reflective display, and the tilt of the curved region determines the position of the illuminated area on the reflective display. The curved polarizing film reflects illuminating light of a first polarization state onto the reflective display, which changes the polarization of the illuminating light and produces image light, and the curved polarizing film transmits the reflected image light. The curved polarizing film includes a portion over the light source that is parallel to the reflective display. The height of the image source can be at least 80% of the width of the active display area, at least 3.5mm, or less than 4mm.
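The thickness benefit of tilting the optical axis can be sketched geometrically. In a simplified, hedged model (a flat beam splitter over the display, ignoring the curved film), a beam splitter at angle β over a display of width w requires a cover height of roughly w·tan(β); tilting the optical axis by θ degrees lets β drop from 45° to about 45° − θ/2. The 8 mm display width is an assumed example value:

```python
import math

def module_height(display_width_mm: float, axis_tilt_deg: float) -> float:
    """Approximate frontlight height over a display of the given width
    when the optical axis is tilted forward by axis_tilt_deg degrees.
    Simplified flat-beam-splitter model: beta = 45 - tilt/2 degrees."""
    beta = math.radians(45.0 - axis_tilt_deg / 2.0)
    return display_width_mm * math.tan(beta)

w = 8.0  # hypothetical active-area width, mm
print(module_height(w, 0.0))   # 45-degree splitter: height roughly equals width
print(module_height(w, 10.0))  # 10-degree tilt: noticeably thinner module
```

In this model the 10-degree tilt reduces the required height from about the full display width to roughly 84% of it, consistent with the stated height-to-width ratio of less than 1:1.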
In portable display systems, it is important to provide a display that is bright, compact, and lightweight. Portable display systems include mobile phones, laptop computers, tablet computers, near-eye displays, and head-mounted displays.
The present invention provides a compact and lightweight frontlight for a portable display system, in which the frontlight is comprised of a partially reflective film that redirects light from an edge light source to illuminate a reflective image source. The partially reflective film can be a partial mirror beam splitter film or a polarizing beam splitter film. The polarizing beam splitter film can be a multilayer dielectric film or a wire grid polarizer film. Known polarizing beam splitter films provide efficient reflection of one polarization state while allowing the other polarization state to pass through. Multilayer dielectric film can be obtained from the 3M Company of Minneapolis, Minnesota under the name DBEF. Wire grid polarizer film can be obtained from Asahi-Kasei E-Materials of Tokyo, Japan under the name WGF.
An edge light provides a compact light source for a display, but because it is located at the edge of the image source, the light must be redirected 90 degrees to illuminate the image source. When the image source is a reflective image source, such as a liquid crystal on silicon (LCOS) image source, the illuminating light must be polarized. The polarized light is reflected by the surface of the image source, and the polarization state of the light changes in correspondence with the image content being displayed. The reflected light then passes back through the frontlight.
Figure 187 shows a schematic illustration of a prior art display assembly 18700 with a solid beam splitter block 18718 as a frontlight. The display assembly includes the frontlight, one or more light sources, and an image source. In display assembly 18700, one or more light sources 18702 are included to provide light shown as light rays 18712. The light sources can be LEDs, fluorescent lights, OLEDs, incandescent lights, or solid state lights. The light rays 18712 pass through a diffuser 18704 to scatter the light and obtain more uniform illumination. If the diffused light is to be polarized, the diffuser includes a linear polarizer. Diffuse light rays 18714 proceed through the solid beam splitter block 18718 toward the partially reflective layer 18708, where the diffuse light rays are partially reflected toward the reflective image source 18720. The diffuse light rays 18714 are then reflected by the reflective image source 18720, thereby forming image light 18710, which is transmitted by the partially reflective layer 18708. The image light 18710 can then pass into associated imaging optics (not shown) to present an image to a viewer. However, as can be seen in Figure 187, the height of the illuminated area of the light source, shown as diffuser 18704, is the same as the width of the illuminated reflective image source 18720. The partially reflective layer 18708 is placed at a 45 degree angle to provide image light 18710 that proceeds straight, or vertically as shown, into the associated imaging optics. As a result, the frontlight shown in Figure 187 is relatively large.
In imaging systems, it is generally important to preserve the wavefront of light from the image source, to provide a high quality image with good resolution and contrast. Consequently, as is known to those skilled in the art, the image light 18710 must proceed perpendicular to the reflective image source 18720 to provide a uniform wavefront to the associated imaging optics and thereby a high quality image to the viewer. As such, the diffuse light rays 18714 must be redirected by the partially reflective film 18708 to be perpendicular to the reflective image source 18720, so that they can be reflected vertically (as illustrated in Figures 187-198) into the associated imaging optics.
Figure 188 shows another prior art display assembly 18802, which includes a partially reflective film 18804 that is supported at its edges and is otherwise freestanding and unsupported over the reflective image source 18720. This display assembly operates in a manner similar to the display assembly shown in Figure 187, with the difference that display assembly 18802 is lighter in weight than display assembly 18700 because it does not have the solid beam splitter block 18718. As can be seen in Figure 188, the height of the diffuser 18704 is again the same as the width of the reflective image source 18720, to provide image light 18808 that proceeds vertically into the associated imaging optics after being reflected by the reflective image source 18720.
Figure 189 shows a schematic illustration of what happens to the light rays in display assembly 18902 if the partially reflective film 18804 is placed at an angle of less than 45 degrees. In this case, portions of the reflective image source 18720 are not illuminated uniformly. Light that illuminates the portion of the reflective image source farthest from the diffuser either does not proceed straight into the associated imaging optics (as in the case of light ray 18904), or is first reflected from the surface of the reflective image source (as in the case of light ray 18908), which changes its polarization state, so that if the partially reflective film is a polarizing beam splitter film (also known as a reflective polarizer film), the light then passes through the film. Consequently, when the associated imaging optics can only use image light that proceeds straight from the reflective image source 18720, the illuminated area of the reflective image source 18720 is reduced when the partially reflective film 18804 is placed at an angle of less than 45 degrees, and dark portions are produced in the corresponding areas of the image.
In an embodiment of the invention shown in Figure 190, a curved partially reflective surface 19004 is provided to redirect the diffuse light 19010 provided by the light source 18702 downward to illuminate the reflective image source 18720. The curved partially reflective surface 19004 can be a polarizing beam splitter film, which is thin and flexible. In this case, the diffuser 18704 includes a linear polarizer, so that the light 18712 is diffused and then linearly polarized, and the diffuse light 19010 is thereby polarized. The linear polarizer in the diffuser 18704 and the polarizing beam splitter film 19004 are oriented such that the linearly polarized light is reflected by the polarizing beam splitter film. In this way, when the reflective image source 18720 changes the polarization of the diffuse light 19010, the polarization of the reflected image light 19008 is the opposite polarization state compared to the diffuse light 19010. The reflected image light 19008 then passes through the partially reflective film 19004 and continues on to the display optics. By using a flexible polarizing beam splitter film as the partially reflective surface 19004, the partially reflective surface 19004 can be curved and lightweight. The polarizing beam splitter film serves the dual role of a reflector for the diffuse light 19010 that illuminates the reflective image source 18720 and a transparent member for the reflected image light 19008. As is known to those skilled in the art, an advantage provided by polarizing beam splitter films is that they can accept light over a wide range of incident angles, so that the curve does not interfere with the light reaching the film. In addition, because the polarizing beam splitter film is thin (e.g., less than 200 microns), the curved shape does not significantly distort the image light 19008 as it passes through the film on its way to the display optics. Finally, polarizing beam splitter films have a very low tendency to scatter light, so high image contrast is maintained.
The flexible nature of polarizing beam splitter films allows them to be formed into curved shapes that redirect and focus the light from the diffuser onto the reflective image source. The shape of the curve of the polarizing beam splitter film can be selected based on the light distribution provided by the diffuser, to provide uniform illumination of the reflective image source. Figure 190 shows a curved partially reflective film 19004 with a parabolic shape, but depending on the nature of the light source 18702 and the effectiveness of the diffuser 18704, radiused curves, complex spline curves, relatively flat curves, planes, or segmented planes may also be used to uniformly redirect the diffuse light 19010 and focus it onto the reflective image source 18720. Experiments have shown that a curved partially reflective surface 19004 tends to concentrate the diffuse light 19010 toward the center of the reflective image source 18720, so that a curved surface works best when the diffuser 18704 provides a light distribution that is brighter at the edges. Conversely, experiments have shown that a relatively flat partially reflective surface 19004 works best when the diffuser 18704 provides a light distribution that is brighter at the center. When the partially reflective surface 19004 is comprised of a flexible film, as shown in Figure 190 as a freestanding film, the shape of the partially reflective surface can be maintained by side frames that have grooves with a suitable contour to hold the flexible film in position. The two side frames, together with the other components, support the curved shape on either side of the display assembly 19002. Because a significant portion of display assembly 19002 is comprised of air and the partially reflective surface 19004 is a film, the weight is much lighter compared to the prior art display assembly 18700 shown in Figure 187. In addition, as can be seen in Figure 190, the width of the illuminated reflective image source 18720 is greater than the height of the diffuser 18704, so that display assembly 19002 is more compact than the prior art display assembly shown in Figure 188.
Figure 191 shows another embodiment of the invention, in which dual light sources 19104 are used in display assembly 19102 and two relatively flat partially reflective surfaces are placed back to back. The arrangement shown in Figure 191 provides a frontlight with two sides in a solid film holder 19120, so that display assembly 19102 is similar to two display assemblies 18700 as shown in Figure 187 arranged back to back. In Figure 191, light rays are shown for only one side, but the components and light rays of the other side are symmetric with the side shown. Within the solid film holder 19120 is a partially reflective film 19110 that extends continuously between the two sides. The solid film holder 19120 is also continuous between the two sides, so that the image light 19112 is not interrupted or deflected by a joint line between the two sides of display assembly 19102. The solid film holder 19120 and the partially reflective film 19110 together provide a constant optical thickness, so the image light is not deflected or distorted. As a result, image light 19112 with continuous image quality can be provided while being illuminated by the light from the two light sources 19104. Each light source 19104 provides light rays 19114 to a diffuser 19108, which scatters the light rays 19114 to provide diffuse light 19118 that illuminates one half of the reflective image source 18720. The partially reflective film 19110 is held in the required shape by the solid film holder 19120. Most importantly, relative to the illuminated width of the reflective image source 18720, the height of the diffusers 19108 is reduced to half that of the prior art diffuser 18704 shown for display assembly 18700 in Figure 187.
Figure 192 shows a schematic illustration of a display assembly 19202 with dual light sources 19104 and a freestanding, unsupported partially reflective film 19204 that is supported only at its edges. In Figure 192, light rays are shown for only one side, but the components and light rays of the other side are symmetric with the side shown. The functions of the various components of display assembly 19202 are the same as those of the assembly shown in Figure 191, but with the benefit that display assembly 19202 is lighter in weight than display assembly 19102, because the majority of display assembly 19202 is comprised of air.
Figure 193 shows a display assembly 19302 with dual light sources 19104 and a freestanding, unsupported partially reflective film 19308 that is supported only at its edges, so that two curved surfaces are provided. In Figure 193, light rays are shown for only one side, but the components and light rays of the other side are symmetric with the side shown. The partially reflective film 19308 is continuous across the two sides, with similar curves on both sides. The curves are selected to reflect the diffuse light 19312 provided by the diffusers and to focus the diffuse light onto the reflective image source 18720. The reflective image source 18720 reflects the diffuse light 19312, thereby forming image light 19310. The height of the diffusers 19304 is less than half that of the prior art diffuser 18704 shown in Figure 187, so that the frontlight and display assembly 19302 are very compact.
Figure 194 shows a schematic illustration of a display assembly 19402 with the continuous partially reflective film 19308 in a solid film holder 19404; display assembly 19402 is otherwise similar to display assembly 19302 shown in Figure 193. In Figure 194, light rays are shown for only one side, but the components and light rays of the other side are symmetric with the side shown. The solid film holder 19404 is used on either side of the partially reflective film 19308 to hold the film in the prescribed two-sided curve and to protect the partially reflective film 19308. The two sides of the solid film holder 19404 are connected in the middle by a relatively thin section at the bottom of the solid film holder 19404, to further avoid providing a joint line that would disrupt the image light 19310 at the center of the image.
In preferred embodiments of the invention, the partially reflective films in the display assemblies shown in Figures 191-194 are polarizing beam splitter films. In these embodiments, the diffusers include linear polarizers, so that the diffuse light is polarized. The linear polarizers are aligned with the polarizing beam splitter film, so that the diffuse light has the polarization state that is reflected by the polarizing beam splitter film. The polarizing beam splitter film also acts as an analyzer for the image light. The advantage of using a polarizing beam splitter film with polarized diffuse light in the frontlight is that stray light is reduced in the display assembly, because all of the polarized diffuse light is reflected by the polarizing beam splitter film toward the reflective image source, where the polarized diffuse light is converted into image light. If the diffuse light were not polarized, the polarization state of the diffuse light that is not reflected would be transmitted by the polarizing beam splitter film, and if this light were not controlled, it would contribute scattered light to the image light, which would reduce the contrast of the image presented to the viewer.
Figure 195 shows a schematic illustration of a display assembly 19502 with a single light source 19104 on one side and polarization control to effectively illuminate the reflective image source 18720 from both sides. In this case, the light source 19104 provides unpolarized light 19114 and unpolarized diffuse light 19508. The partially reflective film is a polarizing beam splitter film 19504 in a solid film holder 19514. The polarizing beam splitter film 19504 reflects one polarization state of the diffuse light (shown as light ray 19510) while transmitting the other polarization state (shown as light ray 19518). The polarizing beam splitter film 19504 is folded and continuous, so that light with the other polarization state 19518 passes through both sides of the folded polarizing beam splitter film 19504. This light 19518 then passes through a quarter wave retarder film 19524, which changes the polarization state from linear to circular. The circularly polarized light is then reflected by a mirror 19528 and passes back through the quarter wave retarder film 19524, which changes the polarization state from circular back to linear, but now with the one polarization state (shown as light ray 19520), so that the light ray 19520 is then reflected by the polarizing beam splitter film 19504 toward the reflective image source 18720. As a result, light of the same polarization state, provided by the light source 19104 in display assembly 19502, illuminates the reflective image source 18720 on both sides. Because the diffuse light 19508 is unpolarized and both polarization states (19510, 19518) are used to illuminate the reflective image source 18720, substantially all of the light provided by the light source is converted into image light (19512, 19522). The image light (19512, 19522) passes directly into the associated imaging optics. Once again, the height of the diffuser 19108 is half that of the diffuser 18704 shown in Figure 187, thereby providing a compact and efficient frontlight and display assembly.
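The quarter-wave/mirror path described above can be checked with Jones calculus. This is an illustrative sketch rather than anything stated in the text, and it uses one common sign convention in which the double pass (quarter wave plate at 45°, mirror, quarter wave plate again) composes to a half wave plate at 45°, rotating the transmitted linear polarization by 90° so the beam splitter film reflects it on the second encounter:

```python
import numpy as np

# Quarter wave plate with fast axis at 45 degrees (global phase omitted;
# conventions vary between optics texts).
QWP45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                        [1 - 1j, 1 + 1j]])

p_state = np.array([1.0, 0.0])  # linear state transmitted by the beam splitter

# Double pass: QWP -> mirror -> QWP. In this convention the mirror is
# absorbed into reusing the same lab-frame QWP matrix, so the round trip
# is QWP45 applied twice, i.e. a half wave plate at 45 degrees.
returned = QWP45 @ QWP45 @ p_state

print(np.round(returned, 6))  # proportional to [0, 1]: the orthogonal state
```

Since the returned state is orthogonal to the transmitted one, the polarizing beam splitter film reflects it toward the image source, which is why substantially all of the source light is converted into image light in this arrangement.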
Figure 196 shows a display assembly 19602 with a geometry similar to the arrangement shown in Figure 195, but in which the polarizing beam splitter film 19604 is freestanding and unsupported, supported only at its edges, to reduce the weight of the frontlight while still providing a diffuser height that is low relative to the width of the illuminated reflective image source.
Figure 197 shows yet another embodiment of the invention, comprising a display assembly 19702 with dual light sources 19704 and 19708 and a folded polarizing beam splitter film 19714, in which the two sides of the folded polarizing beam splitter film 19714 are curved. The light 19718, 19720 from the light sources 19704, 19708 is unpolarized, and the diffusers 19710, 19712 do not include polarizers, so that the diffuse light 19722, 19724 is also unpolarized. The curved and angled sides of the polarizing beam splitter film 19714 redirect one polarization state of the diffuse light (shown as light rays 19728, 19730) toward the reflective image source 18720, while also converging the light rays 19728, 19730 onto the imaging area of the reflective image source 18720. In this display assembly, the dual light sources 19704, 19708 and the folded polarizing beam splitter 19714 work in a complementary fashion, because the polarizing beam splitter film 19714 is continuous. Accordingly, unpolarized diffuse light 19722, 19724 is provided on each side of display assembly 19702; the first polarization state (typically S) is redirected by the polarizing beam splitter film 19714 toward the reflective image source 18720, while light 19740, 19738 with the other polarization state (typically P) is transmitted by the polarizing beam splitter film 19714. The transmitted light 19740, 19738 with the other polarization state passes through both sides of the folded polarizing beam splitter film 19714, so that it arrives at the diffuser 19712, 19710 on the respective opposite side. When the light 19740, 19738 strikes the diffuser 19712, 19710 on the opposite side, the light is diffusely reflected by the diffuser and, in the process, becomes unpolarized. Reflectors can be added to the light sources 19704, 19708 and the surrounding areas to increase the reflection of the light 19740, 19738. This diffusely reflected unpolarized light then mixes with the diffuse light 19722, 19724 provided by the light sources 19704, 19708 on the respective side and passes back toward the polarizing beam splitter film 19714, where the light 19730, 19728 with the first polarization state is reflected toward the reflective image source and the light 19738, 19740 with the other polarization state is transmitted, and the process repeats continuously. Thus, in this embodiment of the invention, light of the other polarization state is continuously recycled, thereby increasing the efficiency of display assembly 19702, because both polarization states of the light 19718, 19720 provided by the dual light sources 19704, 19708 are used to illuminate the reflective image source 18720. The increased diffuse reflection of the recycled light also improves the uniformity of the illuminating light provided to the reflective image source 18720. The image light (19732, 19734) can be provided directly to the associated imaging optics.
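The efficiency gain from recycling the other polarization state can be estimated with a simple geometric series. In this hedged model (the per-round-trip survival fraction is an assumed parameter, not a value from the text), each cycle delivers half of the surviving recycled light to the image source and returns the rest, depolarized, for another pass:

```python
def usable_fraction(survival: float, cycles: int = 50) -> float:
    """Fraction of unpolarized source light eventually delivered in the
    usable polarization state. 'survival' is the assumed fraction of
    transmitted light that returns, depolarized, from the opposite
    diffuser on each round trip."""
    total, remaining = 0.0, 1.0
    for _ in range(cycles):
        total += 0.5 * remaining                # half reflects toward the image source
        remaining = 0.5 * remaining * survival  # other half recycles, with losses
    return total

print(usable_fraction(0.0))  # no recycling: 0.5
print(usable_fraction(0.8))  # with recycling: closer to 0.83
```

Even with substantial round-trip losses, recycling raises the usable fraction well above the 50% available from a single pass, consistent with the efficiency argument above.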
A method similar to the process described above in connection with Figure 197 can be used in another embodiment, in which the display assembly has flat surfaces on each side of the folded polarizing beam splitter film. In this embodiment, because each side of the reflective polarizer film is flat, the light from the edge lights retains the illumination uniformity provided by the diffusers.
In another embodiment of the display assembly shown in Figure 197, a solid film holder can be used in which the light of the other polarization state is recycled to improve efficiency. In this embodiment, each side of the folded polarizing beam splitter film can be flat or curved.
Figure 198 shows a schematic illustration of a method for making a frontlight 19902 such as that shown in Figure 199, which has a folded beam splitter film 19808 and dual light sources on each side. In Figure 198, the dual light sources are not shown, because they can be part of another assembly step or located in the surrounding module. A flow chart of the assembly method is provided in Figure 204. In this method, in step 20402, top 19810 and bottom 19812 film holders are provided. The top and bottom film holders 19810, 19812 can be made of any transparent material by diamond turning, injection molding, compression molding, or grinding. The combination of material and fabrication technique is selected to provide top 19810 and bottom 19812 film holders with low birefringence. Suitable low birefringence materials for the film holders 19810, 19812 include glass materials or plastics such as Zeonex F52R from Zeon Chemicals, APL5514 from Mitsui, or OKP4 from Osaka Gas. The surfaces of the top and bottom film holders that will contact the folded polarizing beam splitter film 19808 are matched, so as to hold the film 19808 in place at the required shape and angle without introducing significant air gaps, so that the image light is essentially not deflected by the frontlight 19902. In step 20404, the bottom film holder 19812 is attached to the reflective image source 18720, either by bonding with an adhesive or by providing a surrounding structure that holds the bottom film holder 19812 in a defined relationship to the reflective image source (either in contact or at a specified distance). In step 20408, the polarizing beam splitter film is folded. Then, in step 20410, the folded polarizing beam splitter film 19808 is placed onto the bottom film holder 19812 and the top film holder 19810 is placed on top, thereby forcing the polarizing beam splitter film 19808 to conform to the matched surfaces of the top 19810 and bottom 19812 film holders. In an alternate embodiment of the method of the invention, an adhesive is applied to the surface of the top 19810 or bottom 19812 film holder, so that the polarizing beam splitter film 19808 is bonded to the top 19810 or bottom 19812 film holder. In step 20412, diffusers 19802, 19804 are attached to each side of the bottom film holder 19812. A schematic illustration of the assembled frontlight 19902 is shown in Figure 199. Similar methods can be used to make the frontlights shown in Figures 191, 194, and 195. The order of assembly can be changed within the scope of the invention.
In an alternate embodiment of the method described above, the film holders 19810, 19812 and the folded polarizing beam splitter film 19808 are assembled together before the film holders 19810, 19812 are attached to the diffusers 19802, 19804 or the reflective image source 18720 or any other part. Steps 20402, 20408, and 20410 are then carried out in order to make a solid film holder with the folded polarizing beam splitter film 19808 inside, similar to those shown in Figures 191, 194, and 195. The reflective image source 18720 and the diffusers 19802, 19804 are attached afterwards (steps 20404, 20412).
Various methods can be used to hold the reflective beam splitter film in place between the top and bottom film holders. The film can be bonded in place to the top or bottom film holder. The top or bottom film holder can be bonded to a surrounding structural member (not shown) or to the associated imaging optics (not shown). When the reflective beam splitter film is a polarizing beam splitter film with a wire grid polarizer, the performance of the wire grid polarizer may be degraded if adhesive is used on the side with the wire grid structure. In this case, the polarizing beam splitter film can be bonded, on the side opposite the wire grid structure, to the top or bottom film holder (depending on which of the top or bottom film holders adjoins the wire grid structure). The adhesive used to bond the polarizing beam splitter film to the film holder must be transparent and of low birefringence. Examples of suitable adhesives include UV curable adhesives and pressure sensitive adhesives.
Figures 200-203 show a series of schematic illustrations of another method for making a frontlight with lights on both sides. Figure 205 is a flow chart listing the steps of this method. In this method, the top and bottom film holders can be cast in place around the folded beam splitter film. In step 20502, the polarizing beam splitter film 20008 is folded. In step 20504, the folded polarizing beam splitter film 20008 is inserted into side frames that have grooves or matching features to hold the polarizing beam splitter film 20008 in the shape required for the frontlight (see Figure 200 for the doubly curved shape shown). In step 20508, the side frames are then attached to the reflective image source 18720. In step 20510, diffusers 20002, 20004 are attached to each side of the side frames. At this point, the folded polarizing beam splitter film 20008 is surrounded on the sides by the side frames and the diffusers 20002, 20004, and on the bottom by the reflective image source 18720. Figure 200 shows a schematic illustration of the reflective image source 18720 with the attached diffusers 20002, 20004 and the freestanding, unsupported reflective beam splitter film 20008, which is supported at its edges so that the required shape is imparted to the reflective beam splitter film 20008.
Figure 201 shows holes in the side frames or surrounding structure that are used to introduce transparent casting material under the folded reflective beam splitter film. As shown, the larger holes 20102 near the reflective image source 18720 are used to introduce the transparent casting material, and the smaller holes 20104 are used to let air escape from under the folded reflective beam splitter film 20008. In this method, the folded reflective beam splitter film 20008 forms a closed cavity over the reflective image source 18720, the cavity being enclosed by the diffusers 20002, 20004 and the side frames or surrounding structure. As the transparent casting resin is slowly injected into the holes 20102, air from the closed cavity is expelled from the smaller holes 20104. When the cavity is full, some of the transparent casting material exits the holes 20104, thereby preventing pressure from building up under the reflective beam splitter film 20008, which could distort the shape of the film. The holes 20102 and 20104 can then be plugged to prevent the transparent casting material from leaking out.
In step 20512, transparent liquid casting material 20202 is poured over the top of the polarizing beam splitter film 20008, as shown in Figure 202. In step 20514, a transparent top sheet or plate 20302 is then applied to provide a flat top to the material 20202, as shown in Figure 203. When applying the flat plate of transparent material onto the transparent casting material, care must be taken to prevent air from being trapped under the flat plate of transparent material. Stops can be provided in the surrounding structure so that the flat plate of transparent material is held parallel to the reflective image source.
The transparent liquid casting material can be any transparent liquid casting material, such as epoxy, acrylic, or urethane. The same transparent liquid casting material should be used for the top film holder as for the bottom film holder, so that the image light is exposed to a solid block of uniform optical thickness and the image light is not deflected by the surfaces of the folded polarizing beam splitter film. The transparent liquid casting material can be cured after casting by allowing cure time, by exposure to UV light, or by exposure to heat. The curing of the transparent casting material can be done in a single step or in multiple steps. The curing of the bottom portion shown in Figure 201 can be done before the casting of the top portion shown in Figure 202. Alternatively, the curing of the entire cast frontlight can be done after the step shown in Figure 203.
An advantage of the method shown in Figures 200-203 is that intimate contact is obtained between the transparent casting material and the reflective beam splitter film, so that light can pass unimpeded through the sections of the frontlight. The casting method can also be used with a solid top or bottom film holder, so that only the bottom or top film holder is cast. Although Figures 200-203 show the fabrication of a frontlight with curved surfaces, the method can also be used to make frontlights with flat surfaces.
In another embodiment, one of the film holders is fabricated as a solid member and the other film holder is cast in place with the folded polarizing beam splitter film. The folded polarizing beam splitter film can be bonded to the solid member before the other film holder is cast in place. In this way, the cast film holder will be in intimate contact with the surface of the polarizing beam splitter film. The material for the solid film holder should have the same refractive index as the cast film holder, to avoid deflecting the image light as it passes from the reflective image source into the associated imaging optics. An example of a suitably matched pair of materials is APEC 2000 from Bayer, which has a refractive index of 1.56 and can be molded, and EpoxAcast 690 from Smooth-On, which has a refractive index of 1.565 and can be cast.
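The significance of matching the 1.56 and 1.565 indices can be checked with the normal-incidence Fresnel reflectance formula, R = ((n1 − n2)/(n1 + n2))². The residual index step of 0.005 produces a negligible reflection at the internal interface, whereas an uncoated plastic-air surface, shown for comparison, reflects several percent:

```python
def fresnel_r(n1: float, n2: float) -> float:
    """Normal-incidence Fresnel power reflectance at an n1/n2 interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

print(fresnel_r(1.56, 1.565))  # matched pair: reflection on the order of 1e-6
print(fresnel_r(1.56, 1.0))    # plastic/air surface, for comparison
```

This is why the cast and solid holder materials are selected for matched indices: the internal interface then contributes essentially no reflection or deflection to the image light.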
In yet another embodiment of the invention, a multistep molding process, such as that shown in the flow chart of Figure 206, is used to make the solid film holder. In step 20602, the bottom film holder is molded. Suitable molding techniques include injection molding, compression molding, and casting. In step 20604, the polarizing beam splitter film is folded. In step 20608, the folded polarizing beam splitter film is placed onto the molded bottom film holder, which is then placed as an insert into the mold for the top film holder. In step 20610, the top film holder is then molded over the folded polarizing beam splitter film and the bottom film holder. The final result is a solid film holder with the folded polarizing beam splitter film inside, as shown in Figures 191, 194, and 195. The advantage of the multistep molding technique is that the folded polarizing beam splitter film is forced to conform to the surface of the bottom film holder, and the top and bottom film holders are in intimate contact with the folded polarizing beam splitter film. In a preferred embodiment, the refractive indices of the top and bottom film holders are the same to within 0.03. In a further preferred embodiment, the glass transition temperature of the material used for the bottom film holder is higher than the glass transition temperature of the material used for the top film holder, or the material used for the bottom film holder is crosslinked, so that the bottom film holder does not deform when the top film holder is molded over the folded polarizing beam splitter film and the bottom film holder. An example of a suitable combination of moldable materials is cyclic olefin materials, such as Zeonex E48R from Zeon Chemicals, with a Tg of 139C and a refractive index of 1.53, and Topas 6017 from Topas Advanced Polymers, with a Tg of 177C and a refractive index of 1.53.
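The two selection rules in this embodiment (refractive indices matched within 0.03, and a bottom holder that stays rigid during the overmold) can be written as a simple compatibility check. This is an illustrative sketch; the assignment of the higher-Tg resin to the bottom holder is an assumption inferred from the stated Tg requirement, not stated explicitly in the text:

```python
def compatible(n_top: float, n_bottom: float,
               tg_top_c: float, tg_bottom_c: float,
               max_dn: float = 0.03) -> bool:
    """True if the material pair meets the stated molding criteria:
    refractive indices match within max_dn, and the bottom holder's
    glass transition exceeds the top's so it survives overmolding."""
    return abs(n_top - n_bottom) <= max_dn and tg_bottom_c > tg_top_c

# Assumed assignment: Topas 6017 (Tg 177C) as the bottom holder,
# Zeonex E48R (Tg 139C) molded over it; both have n = 1.53.
print(compatible(n_top=1.53, n_bottom=1.53, tg_top_c=139, tg_bottom_c=177))
```

With both cyclic olefins at n = 1.53 and a 38C margin in glass transition temperature, the example pair satisfies both criteria.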
It will be appreciated that some embodiments of the AR eyepiece of the invention permit previously unattainable resolution levels and high modulation transfer functions in various combinations with equipment sizes such as frame thickness. For example, in some embodiments, the virtual-image pixel resolution presented to the user may be in the range of about 28 to 46 pixels per degree.
With reference to Figures 105A to C, the angle of the curved wire-grid polarizer controls the direction of the image light, and the curvature of the curved wire-grid polarizer controls the width of the image light. The curve permits the use of a narrow light source because it spreads the light, such that when the light hits the curve, the folded/reflected light illuminates the image display with uniform irradiation. Image light passed back through the wire-grid polarizer is not disturbed. Thus, the curve also allows miniaturization of the optics assembly.
In Figures 21-22, the augmented reality eyepiece 2100 includes a frame 2102 and left and right temple pieces 2104. Protective lenses 2106, such as ballistic lenses, are mounted at the front of the frame 2102 to protect the eyes of the user or, where they are prescription lenses, to correct the user's view of the surrounding environment. The front portion of the frame may also be used to mount a camera or image sensor 2130 and one or more microphones 2132. Not visible in Figure 21, waveguides are mounted in the frame 2102 behind the protective lenses 2106, one on each side of the center or adjustable nose bridge 2138. The front cover 2106 may be interchangeable, so that the tint or prescription may easily be changed for a particular user of the augmented reality device. In one embodiment, each lens is quickly interchangeable, allowing a different prescription for each eye. In one embodiment, the lenses are quickly interchangeable with snap-fits as discussed elsewhere herein. Some embodiments may have only a projector and waveguide combination on one side of the eyepiece, with the other side filled with a conventional lens, reading lens, prescription lens, or the like. The left and right temple pieces 2104 may each support a projector or microprojector 2114 or other image source mounted vertically atop a spring-loaded hinge 2128, for easier assembly and vibration/shock protection. Each temple piece also includes a temple housing 2116 for mounting the electronics associated with the eyepiece, and each may also include an elastomeric head-grip pad 2120 for better retention on the user's head. Each temple piece also includes an extending wrap-around earbud 2112 and an orifice 2126 for mounting a headstrap 2142.
As noted, the temple housing 2116 contains electronics associated with the augmented reality eyepiece. The electronics may include several circuit boards as shown, such as a circuit board 2122 for a microprocessor and radios, a circuit board 2124 for a communications system on a chip (SOC), and an open multimedia applications processor (OMAP) processor board 2140. The communications system on a chip (SOC) may include electronics for one or more communications capabilities, including wireless local area network (WLAN), BlueTooth™ communications, frequency modulation (FM) radio, global positioning system (GPS), a 3-axis accelerometer, one or more gyroscopes, and the like. In addition, the right temple piece may include an optical trackpad (not shown) on the outside of the temple piece for user control of the eyepiece and one or more applications.
In one embodiment, a digital signal processor (DSP) may be programmed and/or configured to receive video feed information and configure the video feed to drive whatever type of image source is being used with the optical display. The DSP may include a bus or other communication mechanism for communicating information, and an internal processor coupled to the bus for processing the information. The DSP may include memory coupled to the bus for storing information and instructions to be executed, such as random access memory (RAM) or other dynamic storage (e.g., dynamic RAM (DRAM), static RAM (SRAM), and synchronous DRAM (SDRAM)). The DSP may include non-volatile memory coupled to the bus for storing static information and instructions for the internal processor, such as read-only memory (ROM) or other static storage (e.g., programmable ROM (PROM), erasable PROM (EPROM), and electrically erasable PROM (EEPROM)). The DSP may include special-purpose logic devices (e.g., application-specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field programmable gate arrays (FPGAs)).
The DSP may include at least one computer-readable medium or memory for holding programmed instructions and for containing data structures, tables, records, or other data needed to drive the optical display. Examples of computer-readable media suitable for the present invention are compact discs, hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, flash EPROM), DRAM, SRAM, SDRAM, or any other magnetic medium, compact discs (e.g., CD-ROM) or any other optical medium, punch cards, paper tape or any other physical medium with patterns of holes, a carrier wave (described below), or any other medium from which a computer can read. Various forms of computer-readable media may be involved in carrying out one or more sequences of one or more instructions for execution with the optical display. The DSP may also include a communication interface to provide a data communication coupling to a network link that can be connected to, for example, a local area network (LAN), or to an alternative communications network such as the Internet. Wireless links may also be implemented. In any such implementation, a suitable communication interface sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information (such as video information) to the optical display.
The eyepiece is capable of context-aware capture of video, where the capture adjusts video capture parameters based on the motion of the viewer, and where the parameters may be image resolution, video compression rate, frames per second, and the like. The eyepiece may be used for a variety of video applications, such as recording video shot by the integrated camera or sent from an external video device, playing back video to the wearer through the eyepiece (through the methods and systems described herein), streaming video from an external source (e.g., a conference call, a live news feed, a video stream from another eyepiece), live video from an integrated camera (e.g., from an integrated non-line-of-sight camera), and the like. In embodiments, the eyepiece may accommodate multiple video applications presented to the wearer at once, for example viewing a streamed external video link while playing back a video file stored on the eyepiece. The eyepiece may provide a 3D viewing experience, such as by providing images to each eye, or alternatively may provide a simplified 3D experience, such as by providing reduced content to one of the two eyes. The eyepiece may provide text-enhanced video, such as when audio conditions are too noisy for the included audio to be heard, when the audio is in a language foreign to the user, when the user wants a transcription of the recorded audio, and the like.
In embodiments, the eyepiece may provide context-aware video applications, such as adjusting at least one parameter of video capture and/or viewing according to the wearer's environment. For example, in a context where the wearer's focused attention needs to be on the external environment rather than on the video, the eyepiece may present video to the wearer with at least one parameter adjusted so that the presented video is less distracting (e.g., adjusting the spatial resolution; adjusting the number of frames per second; replacing the video with a still image representing the video content, such as a stored photo of a person or a single frame from the video), and the like. In another context, video may be captured by the integrated camera on the eyepiece while the wearer is moving (e.g., walking, running, cycling, driving), where at least one parameter of the video being captured is adjusted to help accommodate the motion (e.g., making adjustments during rapid movements where the eyepiece senses the video would blur, making adjustments while the wearer is walking or moving slowly).
In embodiments, the parameter may be a spatial resolution parameter (e.g., pixels per region, restricting a region to pixels of a specified color, restricting a region to only single ('black and white') pixels), the field of view, the frames recorded over time, the frames presented over time, the data compression rate, periods during which nothing is recorded/presented, and the like.
In embodiments, the parameter may be adjusted based on inputs sensed by the eyepiece, such as a motion detection input (as described herein) used to determine head motion (e.g., to determine rapid head movement or slow head movement), motion of the surrounding environment determined by processing images received via the integrated camera to establish the relative motion between the wearer and the environment in the video being captured, the wearer's motion or movement within the environment, the wearer's eye movements (as described herein) used to determine whether the wearer is being distracted by the video being presented, ambient light and/or sound conditions, and the like.
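The mapping from sensed inputs to adjusted capture parameters can be sketched as follows. This is a minimal illustration only; the function name, thresholds, and returned parameter values are assumptions, not values from the specification.

```python
def adjust_capture_params(head_speed_dps, ambient_lux):
    """Map sensed conditions to video-capture parameters (illustrative).

    head_speed_dps: head angular speed in degrees/second (motion sensor input).
    ambient_lux:    ambient light level (embedded light sensor input).
    All thresholds and returned values are hypothetical.
    """
    params = {"resolution": (1280, 720), "fps": 30, "compression": "low"}
    if head_speed_dps > 90:        # rapid head movement: video would blur
        params["fps"] = 5          # capture fewer frames per time period
        params["compression"] = "high"
        params["resolution"] = (640, 360)  # reduce spatial resolution
    elif head_speed_dps > 30:      # moderate movement: milder adjustment
        params["fps"] = 15
        params["compression"] = "medium"
    if ambient_lux < 10:           # dim environment: cap the frame rate
        params["fps"] = min(params["fps"], 15)
    return params
```

A real implementation would likewise fold in eye-movement and sound inputs, but the control structure, sensed inputs in and capture parameters out, would be the same.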
In embodiments, the eyepiece may provide image processing to reduce the effects of motion or of the environment on the quality of the wearer's video experience, or on the quality of the video being stored as it is captured, such as compensating for slight movements, jumps, and rapid movements; adjusting for background lighting and/or the acoustic environment; adjusting color mixing, brightness, and the like. The choice of processing may depend on the sensed inputs, environmental conditions, video content, and the like. For example, in some cases a high-quality image is preferred, so that a reduction in quality is unacceptable and the video may be paused under those circumstances. In another case, when conditions are determined to interfere with capture at an acceptable quality level but some continuity of capture is still required, video and/or audio compression may be applied. Processing may also be applied differently to each eye of the eyepiece, such as with respect to the wearer's dominant eye, with respect to one eye experiencing different environmental conditions than the other, and the like. Processing may compensate for a bright-light environment, where an embedded sensor checks the ambient light level in order to adjust how the content is displayed, such as determining which color channels to compress and/or manipulate based on the environment, modifying the color curve/palette to be more or less visible relative to the environment, changing the color depth, changing how colors are compressed into the color curve, and the like.
In embodiments, as a result of a sensed condition, the eyepiece may initiate an action, such as switching to a screen-capture mode while continuing the audio portion of the video when, for example, the eyepiece exceeds a predetermined amount of movement or the like; stopping the shooting of video when the movement would degrade it below a predetermined quality level; triggering a change in the video presentation when the motion level in the received video is exceeded; and the like.
In embodiments, the eyepiece may initiate an action as a result of receiving a control signal. The control signal may be based on the position of the eyepiece, the content currently being viewed through the eyepiece, or a user gesture. The action may be the upload or download of video captured by the eyepiece to or from a storage location. The action may be initiated by the reception of the control signal alone, or by the reception of the control signal together with a user-initiated confirmation control signal. The action may be moving to a designated position in the video being displayed by the glasses, initiating a process to add a bookmark at a designated position in the video being displayed by the glasses, and the like.
In embodiments, adjustments made as a result of sensed conditions may be governed by user preferences, organizational policies, state or federal regulations, and the like. For example, a preference may be to always provide a certain quality, resolution, compression rate, and the like, regardless of what the sensed inputs indicate.
In one example, the wearer of the eyepiece may be in an environment where their head, and therefore the integrated camera of the eyepiece, is shaking rapidly while the eyepiece records video. In this case, the eyepiece may adjust at least one parameter to reduce the extent to which shaky video is captured, such as increasing the compression rate applied to the video, reducing the number of frames captured per time period (e.g., capturing one frame every few seconds), discarding frames that show a large change in the image from frame to frame, reducing the spatial resolution, and the like.
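One of the adjustments above, discarding frames that change too much from the previous frame, can be sketched directly. This is a hedged illustration with a hypothetical threshold, not the specification's actual stabilization method, and it uses plain nested lists for frames to stay self-contained.

```python
def filter_shaky_frames(frames, max_mean_diff=40.0):
    """Keep only frames that do not differ too much from the last kept frame.

    frames: list of grayscale frames, each a 2-D list of pixel intensities.
    max_mean_diff: illustrative threshold on mean absolute pixel change.
    """
    if not frames:
        return []
    kept = [frames[0]]
    for frame in frames[1:]:
        prev = kept[-1]
        # mean absolute difference against the last kept frame
        diff = sum(abs(a - b)
                   for row_a, row_b in zip(frame, prev)
                   for a, b in zip(row_a, row_b))
        n = sum(len(row) for row in frame)
        if diff / n <= max_mean_diff:   # small change: keep the frame
            kept.append(frame)
    return kept
```

A frame dominated by camera shake produces a large mean difference and is dropped, so the stored video has fewer, steadier frames.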
In one example, the wearer of the eyepiece may be using video conferencing through the eyepiece, where the eyepiece senses through its motion sensors that the wearer is moving. As a result, during this movement a still image may replace a participant's video feed, such as an image of one of the other participants, or an image of the user as sent to the other members. In this way, the distracting effects of the wearer's movement may be reduced for the wearer and/or for the other participants in the video conference.
In one example, the wearer may be watching a video and then begin to drive, which may become a safety problem if the wearer continues watching the currently displayed video. In this case, the eyepiece may detect that the motion of the environment indicates being in an automobile, and change the viewing experience to be less distracting, such as where the wearer's eye movements indicate that the user's gaze is alternating rapidly between the view down the road ahead of the car and the displayed video. The eyepiece may, for example, pause the video and offer the viewer the option to resume. The eyepiece may also distinguish whether the sensed environmental motion indicates being in a car, on a bicycle, walking, and the like, and adjust accordingly.
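The decision logic in this driving example can be sketched as a small classifier over the two sensed quantities named above. The function name, speed/gaze thresholds, and action labels are assumptions for illustration only.

```python
def viewing_action(environment_motion_mps, eye_alternation_hz):
    """Decide how to alter video presentation from sensed context (sketch).

    environment_motion_mps: relative speed of the surroundings, m/s, from
        processing integrated-camera images or motion sensors.
    eye_alternation_hz: rate at which the wearer's gaze alternates between
        the road ahead and the displayed video. Thresholds are hypothetical.
    """
    if environment_motion_mps > 8:
        if eye_alternation_hz > 0.5:
            # likely driving and distracted: pause, offer option to resume
            return "pause_video_offer_resume"
        return "reduce_distraction"      # in a vehicle; simplify the display
    if environment_motion_mps > 2:
        return "walking_mode"            # moving slowly; minor adjustments
    return "no_change"
```

Distinguishing car, bicycle, and walking, as the paragraph suggests, would refine the speed bands rather than change the structure.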
In one example, the wearer may need assistance navigating to a location, whether in a car, on a bicycle, walking, or otherwise. In this case, the eyepiece will display a video navigation application to the user. The navigation instructions the eyepiece displays to the user may be selected by a control signal. The control signal may be generated from a position specified by the wearer, from the content currently being displayed in the glasses, or from a destination spoken by the wearer. The location may be one of food/drink, education, events, exercise, home, outdoors, retail stores, transportation locations, and the like.
In one example, the wearer may be capturing video where the surrounding environment is distracting or in some respect reduces the quality of the video, whether in color contrast, mixing, depth, resolution, brightness, or the like. The eyepiece may make adjustments for the wearer being indoors versus outdoors, under different lighting conditions, under unfavorable audio conditions, and the like. In such a case, the eyepiece may adjust the recorded images and sound to produce a video product that more effectively represents the content being captured.
In embodiments, the eyepiece may provide an external interface to computer peripherals, such as monitors, displays, televisions, keyboards, mice, memory storage (e.g., external hard drives, optical drives, solid-state memory), network interfaces (e.g., a network interface to the Internet), and the like. For example, the external interface may provide a direct connection to an external computer peripheral (e.g., connecting directly to a monitor), an indirect connection to an external computer peripheral (e.g., through a central external peripheral interface device), connection through a wired link, through a wireless link, and the like. In one example, the eyepiece may connect to a central external peripheral interface device that provides connectivity to external peripherals, where the external peripheral interface device may include computing facilities such as a computer processor, memory, an operating system, peripheral drivers and interfaces, USB ports, external display interfaces, network ports, speaker interfaces, microphone interfaces, and the like. In embodiments, the eyepiece may connect to the central external peripheral interface through a wired connection, a wireless connection, by being placed directly in a cradle, and the like, and when connected may be provided with computing facilities similar or identical to those of a personal computer. In embodiments, a device to be controlled through the eyepiece may be selected by the user looking at the device, pointing the eyepiece at it, making a selection from a user interface displayed on the eyepiece, and the like. In other embodiments, the eyepiece may display the user interface of a device when the user views or senses the device.
The frame 2102 has the general shape of a pair of wrap-around sunglasses. The sides of the glasses include shape-memory alloy headbands 2134, such as Nitinol headbands. The Nitinol or other shape-memory alloy headbands fit the user of the augmented reality eyepiece; the headbands are tailored so that they assume their trained or preferred shape when worn by the user and warmed to near body temperature. In embodiments, the fit of the eyepiece may be provided with eye-width alignment and measurement techniques. For example, the position of the projected display may be adjusted and/or aligned to the proper position for the wearer of the eyepiece, to accommodate the various eye widths of different wearers. The positioning and/or alignment may be automatic, such as by detecting the position of the wearer's eyes through the optical system (e.g., iris or pupil detection), or manual, such as performed by the wearer, and the like.
Other features of this embodiment include detachable noise-cancelling earbuds. As seen in the figure, the earbuds are intended to connect to the controls of the augmented reality eyepiece to deliver sound to the ears of the user. The sound may include inputs from the wireless Internet or telecommunications capability of the augmented reality eyepiece. The earbuds may also include soft, deformable plastic or foam portions, so that the inner ears of the user are protected in a manner similar to earplugs. In one embodiment, the earbuds limit inputs to the user's ears to about 85 dB. This allows normal hearing by the wearer while providing protection from gunshot noise or other explosive noises, and allows hearing in high-background-noise environments. In one embodiment, the controls of the noise-cancelling earbuds include an automatic gain control for very fast adjustment of the cancelling feature to protect the wearer's ears.
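The 85 dB limiting behavior described above can be illustrated as a simple gain rule: pass normal levels at unity gain, and attenuate fast impulses so the output never exceeds the limit. This is a conceptual sketch, not the earbuds' actual AGC algorithm; the function names are hypothetical.

```python
def agc_gain_db(input_level_db, limit_db=85.0):
    """Gain (in dB) to apply so the output never exceeds limit_db.

    0.0 means unity gain (normal listening); negative values attenuate
    loud impulses such as gunshot noise. A sketch of the behavior only.
    """
    if input_level_db <= limit_db:
        return 0.0                       # normal listening: no attenuation
    return limit_db - input_level_db     # fast cut to the protection limit

def output_level_db(input_level_db, limit_db=85.0):
    """Resulting level delivered to the ear after gain control."""
    return input_level_db + agc_gain_db(input_level_db, limit_db)
```

A real limiter would work on the audio samples with attack/release time constants, but the input/output relationship is the one shown.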
Figure 23 depicts the layout of the vertically arranged projector 2114 in an eyepiece 2300, in which the illumination light passes from bottom to top along one side of the PBS on its way to the display and imager board, which may be a silicon-backplane display, and in which the illumination light is refracted into image light as it hits the internal interface of the triangular prisms that constitute the polarizing beam splitter, and is reflected out of the projector and into the waveguide lens. In this example, the dimensions of the projector are shown with a width of 11 mm for the imaging board, a distance of 10.6 mm from one end of the imaging board to the image centerline, and a distance of about 11.8 mm from the image centerline to one end of the LED board.
Detailed and assembled views of the components of the projector discussed above are visible in Figure 25. This view depicts how compact the microprojector 2500 is when it is assembled, for example, near the hinge of the augmented reality eyepiece. The microprojector 2500 includes a housing and a holder 2508 for mounting certain of the optical elements. As each color field is imaged by the optical display 2510, the corresponding LED color is turned on. The RGB LED light engine 2502 is depicted near the bottom, mounted on a heat sink 2504. The holder 2508 is mounted atop the LED light engine 2502, the holder mounting a light tunnel 2520, a diffuser lens 2512 (to eliminate hot spots), and a condenser lens 2514. Light passes from the condenser lens into the polarizing beam splitter 2518 and then to the field lens 2516. The light is then refracted onto the LCoS (liquid crystal on silicon) chip 2510, where an image is formed. The light for the image is then reflected back through the field lens 2516 and is polarized and reflected 90° by the polarizing beam splitter 2518. The light then leaves the microprojector for transmission to the optical display of the glasses.
Figure 26 depicts an exemplary RGB LED module 2600. In this example, the LED is a 2x2 array with 1 red, 1 blue, and 2 green dies, and the LED array has 4 cathodes and a common anode. The maximum current may be 0.5 A per die, and the maximum voltage (≈4 V) may be needed for the green and blue dies.
In embodiments, the display system uses an optical system that can present a monochrome display to the wearer, where the monochrome display may provide benefits in image sharpness, image resolution, frame rate, and the like. For example, the frame rate may be tripled (compared with an RGB system), which may be useful in night vision or similar situations, where a camera images the surroundings and those images may then be processed and displayed as content. The image may also be brighter, such as three times brighter if three LEDs are used, or space may be saved by using only one LED. If multiple LEDs are used, they may be the same color or different colors (RGB). The system may be a switchable monochrome/color system that uses RGB, but when the wearer wants monochrome, either an individual LED or multiple LEDs can be selected. All three LEDs may also be used simultaneously, rather than sequentially, to produce white light. Using the three LEDs non-sequentially may raise the frame rate by a factor of three, as in any other case where white light is used non-sequentially. The 'switch' between monochrome and color may be made 'manually' (e.g., through a physical button or a GUI interface selection), or the switch may occur automatically depending on the application that is running. For example, the wearer may enter a night vision mode or a defog mode, and the processing portion of the system automatically determines that the eyepiece needs to enter the monochrome high-refresh-rate mode.
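The mode-switching logic above, manual selection or automatic switching based on the running application, together with the tripled frame rate, can be sketched as follows. The base frame rate and application names are illustrative assumptions only.

```python
def display_mode(user_choice=None, active_app=None):
    """Select monochrome vs. RGB field-sequential display mode (sketch).

    With field-sequential color, each displayed frame requires three LED
    fields (R, G, B); driving a single color frees those field slots, so
    the frame rate can triple. Application names and the 60 Hz base rate
    are hypothetical.
    """
    base_color_fps = 60
    monochrome_apps = {"night_vision", "defog"}   # automatic-switch cases
    if user_choice == "monochrome" or active_app in monochrome_apps:
        return {"mode": "monochrome", "fps": base_color_fps * 3}
    return {"mode": "color", "fps": base_color_fps}
```

The 'manual' path corresponds to `user_choice`, and the automatic path corresponds to `active_app`, mirroring the two switching mechanisms described in the paragraph.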
Fig. 3 depicts an embodiment of a horizontally disposed projector in use. The projector 300 may be placed in an arm of the eyepiece frame. The LED module 302, under processor control 304, may emit a single color at a time in rapid sequence. The emitted light may travel down a light tunnel 308 and pass through at least one homogenizing lenslet 310 before encountering the polarizing beam splitter 312 and being deflected toward the LCoS display 314, where a full-color image is displayed. The LCoS display may have a resolution of 1280x720p. The image may then be reflected back up through the polarizing beam splitter, reflected off the fold mirror 318, and travel through a collimator on its way out of the projector and into the waveguide. The projector may include a diffractive element to eliminate aberrations.
In one embodiment, an interactive head-mounted eyepiece includes an optical assembly through which a user views the surrounding environment and displayed content, where the optical assembly includes a corrective element that corrects the user's view of the surrounding environment, a freeform optical waveguide enabling internal reflections, and a coupling lens positioned to direct an image from an optical display, such as an LCoS display, to the optical waveguide. The eyepiece further includes one or more integrated processors for handling content for display to the user, and an integrated image source, such as a projector facility, for introducing the content to the optical assembly. In embodiments where the image source is a projector, the projector facility includes a light source and the optical display. Light from the light source, such as an RGB module, is emitted under control of the processor and traverses a polarizing beam splitter, where it is polarized before being reflected off the optical display, such as the LCoS display or, in certain other embodiments, an LCD display, and into the optical waveguide. A surface of the polarizing beam splitter may reflect the color image from the optical display into the optical waveguide. The RGB LED module may emit light sequentially to form the color image reflected off the optical display. The corrective element may be a see-through corrective lens attached to the optical waveguide, enabling proper viewing of the surrounding environment whether the image source is on or off. This corrective element may be a wedge-shaped corrective lens, and may be prescription, tinted, coated, and the like. The freeform optical waveguide, which may be described by higher-order polynomials, may include dual freeform surfaces that enable curvature and sizing of the waveguide. The curvature and sizing of the waveguide enable its placement in a frame of the interactive head-mounted eyepiece. This frame may be sized to fit a user's head in a manner similar to sunglasses or eyeglasses. Other elements of the optical assembly of the eyepiece include a homogenizer through which light from the light source is propagated to ensure that the beam of light is uniform, and a collimator that improves the resolution of the light entering the optical waveguide.
In embodiments, prescription lenses may be mounted on the inside or outside of the eyepiece lens. In some embodiments, the prescription power may be divided between prescription lenses mounted on the outside and the inside of the eyepiece lens. In embodiments, the prescription correction is provided by corrective optics attached, such as by surface tension, to the eyepiece lens or to a component of the optical assembly, such as the beam splitter. In embodiments, the corrective optics may be disposed partly in one position of the optical path and partly in another position of the optical path. For example, half of the corrective optics may be disposed outside the convergence plane of the beam splitter and the other half disposed inside the convergence plane. In this way, correction may be provided differently to the image light from the internal source and to the scene light. That is, light from the source may be corrected only by the portion of the corrective optics on the inside of the convergence plane, because that image is reflected to the user's eye, while the scene light may be corrected by both portions, because that light is transmitted through the beam splitter and is thus exposed to both optical corrections. In another embodiment, the optical assembly associated with the beam splitter may be a sealed assembly, making the assembly waterproof, dustproof, and the like, where an inner surface of the sealed optical assembly carries one portion of the corrective optics and an outer surface of the sealed optical assembly carries the other portion of the corrective optics. Suitable optics may be provided by the Press-On Optics from the 3M Company, which are available at least as prisms (i.e., Fresnel prisms), aspheric minus lenses, aspheric plus lenses, and bifocal lenses. The corrective optics may be a user-removable and interchangeable refractive error correction facility adapted to be removably attached in a suitable position between the user's eyes and the displayed content, so that the refractive error correction facility corrects the user's vision with respect to both the displayed content and the surrounding environment. The refractive error correction facility may be adapted to mount to the optical assembly. The refractive error correction facility may be adapted to mount to the head-mounted eyepiece. The refractive error correction facility may be mounted using a friction fit. The refractive error correction facility may be mounted using a magnetic attachment facility. Depending on the user's vision, the user may select from a plurality of different refractive error correction facilities.
In embodiments, the present invention may provide 'snap-on' corrective optics for the eyepiece, such as where a user-removable and interchangeable refractive error correction facility is adapted to be removably attached in a suitable position between the user's eyes and the displayed content, so that the refractive error correction facility corrects the user's vision with respect to the displayed content and the surrounding environment. The refractive error correction facility may be adapted to mount to the optical assembly, to the head-mounted eyepiece, and the like. The refractive error correction facility may be mounted using a friction fit, a magnetic attachment facility, and the like. Depending on the user's vision, the user may select from a plurality of different refractive error correction facilities.
With reference to Fig. 4, the image light, which may be polarized and collimated, may optionally pass through a display coupling lens 412 and enter the waveguide 414; the display coupling lens may or may not itself be, or be in addition to, the collimator. In embodiments, the waveguide 414 may be a freeform waveguide, where the surfaces of the waveguide are described by polynomial equations. The waveguide may be straight. The waveguide 414 may include two reflective surfaces. As the image light enters the waveguide 414, it may strike the first surface at an angle of incidence greater than the critical angle at which total internal reflection (TIR) occurs. The image light may engage in TIR bounces between the first surface and the opposing second surface, eventually reaching the active viewing area 418 of the composite lens. In an embodiment, the light may engage in at least three TIR bounces. Because the waveguide 414 is tapered to allow the TIR bounces to eventually exit the waveguide, the thickness of the composite lens 420 may not be uniform. Distortion across the viewing area of the composite lens 420 may be minimized by disposing a wedge-shaped correcting lens 410 along the length of the freeform waveguide 414, in order to provide a uniform thickness across at least the viewing area of the lens 420. The correcting lens 410 may be a prescription lens, a tinted lens, a polarized lens, a ballistic lens, and the like, mounted on the inside or outside of the eyepiece lens, or in some embodiments, mounted on both the inside and outside of the eyepiece lens.
In some embodiments, although the optical waveguide may have a first surface and a second surface enabling total internal reflection of the light entering the waveguide, the light may not actually enter the waveguide at an internal angle of incidence that would result in total internal reflection. The eyepiece may include a mirrored surface on the first surface of the optical waveguide to reflect the displayed content toward the second surface of the optical waveguide. Thus, the mirrored surface enables total reflection of the light entering the optical waveguide, or reflection of at least a portion of the light entering the optical waveguide. In embodiments, the surface may be 100% mirrored or mirrored to a lower percentage. In some embodiments, in place of a mirrored surface, an air gap between the waveguide and the corrective element may cause reflection of light that enters the waveguide at an angle of incidence that would not result in TIR.
In one embodiment, the eyepiece includes an integrated image source, such as a projector, that introduces content for display to the optical assembly from the side of the optical waveguide adjacent an eyepiece arm. Unlike prior-art optical assemblies in which image injection occurs from the top side of the optical waveguide, the present invention provides image injection into the waveguide from the side of the waveguide. The aspect ratio of the displayed content is between approximately square and approximately rectangular, with the long axis approximately horizontal. In embodiments, the aspect ratio of the displayed content is 16:9. In embodiments, a rectangular aspect ratio of the displayed content with an approximately horizontal long axis may be achieved by rotating the injected image. In other embodiments, it may be achieved by stretching the image until it reaches the desired aspect ratio.
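The stretching approach mentioned above amounts to scaling one dimension of the injected image until the target aspect ratio is reached. A minimal sketch, with the function name and example image size chosen here for illustration only:

```python
def stretch_to_aspect(width: int, height: int, target_w: int = 16, target_h: int = 9):
    """Stretch the shorter dimension of an image so it reaches the target
    aspect ratio (long axis horizontal); returns the new (width, height)."""
    target = target_w / target_h
    if width / height < target:
        width = round(height * target)   # too narrow: widen
    else:
        height = round(width / target)   # too wide: heighten
    return width, height

# e.g. a square 800x800 injected image stretched to fill a 16:9 display area
dims = stretch_to_aspect(800, 800)
```

An image that already has the target aspect ratio passes through unchanged, so the same routine can run unconditionally in the injection path.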
Fig. 5 depicts a design of a waveguide eyepiece showing sample dimensions. For example, in this design the width of the coupling lens 504, which is optically coupled to the optical display 502, may be 13-15 mm. These elements may be placed in one arm of the eyepiece, or redundantly in both arms. Image light from the optical display 502 is projected through the coupling lens 504 into the freeform waveguide 508. The thickness of the composite lens 520, comprising the waveguide 508 and the correction lens 510, may be 9 mm. In this design, the waveguide 502 enables an 8 mm exit pupil diameter with 20 mm of eye clearance. The resulting see-through field of view 512 is approximately 60-70 mm. The distance along the image light path from the pupil to the point where the image light enters the waveguide 502 (dimension a) is approximately 50-60 mm, which accommodates a large percentage of human head widths. In one embodiment, the field of view may be larger than the pupil. In embodiments, the field of view may not fill the lens. It should be understood that these dimensions are for a particular illustrative embodiment and should not be construed as limiting. In one embodiment, the waveguide, snap-fit optics, and/or correction lens may comprise optical plastic. In other embodiments, the waveguide, snap-fit optics, and/or correction lens may comprise glass, marginal glass, bulk glass, metallic glass, palladium-enriched glass, or other suitable glass. In embodiments, the waveguide 508 and correction lens 510 may be made of different materials selected to produce little or no chromatic aberration. The materials may include a diffraction grating, a holographic grating, and the like.
In embodiments such as that shown in Fig. 1, where two projectors 108 are used for the left and right images, the projected images may be stereoscopic. To enable stereo viewing, the projectors 108 may be placed at an adjustable distance from one another that permits adjustment based on the interpupillary distance of each individual wearer. For example, a single optical assembly may include two independent electro-optic modules with various adjustments for horizontal, vertical, and tilt positioning. Alternatively, the optical assembly may include only a single electro-optic module.
Figures 146 through 149 schematically show an embodiment of an augmented reality (AR) eyepiece 14600 (without its temple portions) in which the placement of the images can be adjusted. Figures 146 and 147 show front and rear perspective views, respectively, of the AR eyepiece 14600. In this embodiment, the electronics and portions of the projection system (collectively 14602) are located above the lenses 14604a, 14604b. The AR eyepiece 14600 has two projection screens 14608a, 14608b, which are adjustably suspended from an adjustment platform 14610 on the wearer's side of the lenses 14604a, 14604b. Mounted on the adjustment platform 14610 are mechanisms for adjusting the lateral position of each projection screen 14608a, 14608b relative to the bridge 14612 of the AR eyepiece 14600, and the tilt of each projection screen 14608a, 14608b.
The structures for adjusting the position of one or both of the display screens may be controlled by software-activated motors actuated manually (e.g., via buttons), by manual control devices (thumb wheels, lever arms, etc.), or by a combination of both motorized and manual devices. The AR eyepiece 14600 uses manual devices, which will now be described. Those skilled in the art will appreciate that the adjustment mechanisms are designed so that lateral adjustment is decoupled from tilt adjustment.
Figure 148 shows a rear perspective view of the wearer's left-side portion of the AR eyepiece 14600, in which the adjustment mechanism 14614 for projection screen 14608a on the adjustment platform 14610 can be seen more clearly. The projection screen 14608a is mounted on a frame 14618, which is fixedly attached to a movable carriage 14620 (or a portion thereof). On its bridge 14612 side, the carriage 14620 is rotatably and slidably supported by a carrying shaft 14622 captured in an arcuate slot of a first block 14624 attached to the adjustment platform 14610. On its temple side, the carriage 14620 is rotatably and slidably supported by a yoke 14628. With reference to Figure 150, the yoke 14628 has a shaft portion 14630 that is fixedly attached to the carriage 14620 and is coaxial with the carrying shaft 14622 so as to provide an axis of rotation for the carriage 14620. The yoke 14628 is slidably and rotatably supported in an arcuate slot of a second supporting block 14632 (see Figure 151) attached to the adjustment platform 14610.
The yoke 14628 also has two parallel arms 14634a, 14634b extending radially outward from the shaft portion 14630. The free end of each arm 14634a, 14634b has a hole, such as hole 14638 of arm 14634b, for fixedly capturing a shaft 14678 between them, as described below (see Figure 149). Arm 14634a has an anchor portion 14640 where it attaches to the shaft portion 14630 of the yoke 14628. The anchor portion 14640 has a through-hole 14642 for slidably capturing a pin 14660, as described below (see Figure 152).
Referring again to Figure 148, the adjustment mechanism has a first thumb wheel 14644 for controlling the lateral position of the projection screen 14608a and a second thumb wheel 14648 for controlling the tilt of the projection screen 14608a. The first thumb wheel 14644 extends partially through a slot 14650 in the adjustment platform 14610 and is threadedly engaged with and supported by a first threaded shaft 14652. The first threaded shaft 14652 is slidably supported in through-holes in third and fourth supporting blocks 14654, 14658. The third and fourth blocks 14654, 14658 and/or the sides of the slot 14650 prevent lateral movement of the first thumb wheel 14644. Consequently, rotating the thumb wheel 14644 about its axis (indicated by arrow A) causes the first threaded shaft 14652 to move laterally (indicated by arrow B). As best seen in Figure 152, the first threaded shaft 14652 has a pin 14660 extending radially outward on its bridge side. (Note that the threads of the first threaded shaft 14652 are not depicted in the drawings, but may be single- or multiple-pitch threads.) The pin 14660 is slidably captured in the vertical through-hole 14642 in the anchor portion 14640 of arm 14634a of the yoke 14628. When the first thumb wheel 14644 is rotated in the direction that causes the first threaded shaft 14652 to travel laterally toward the bridge 14612, the pin 14660 pushes against the bridge 14612 side of the through-hole 14642, which in turn moves the yoke 14628, carriage 14620, frame 14618, and first projection screen 14608a laterally toward the bridge 14612 (see arrow C). Similarly, turning the first thumb wheel 14644 in the opposite direction moves the first projection screen laterally away from the bridge 14612.
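The parenthetical note above on single- versus multiple-pitch threads has a simple quantitative consequence: a multiple-start thread advances the shaft by the number of starts times the pitch per turn of the thumb wheel. A small sketch under assumed, illustrative values (the specification gives no pitch):

```python
def lateral_travel_mm(turns: float, pitch_mm: float, starts: int = 1) -> float:
    """Lateral travel of a threaded shaft per rotation of its thumb wheel.
    A multiple-start ('multiple pitch') thread advances by starts * pitch
    each turn, giving coarser adjustment from the same thread profile."""
    return turns * pitch_mm * starts

fine   = lateral_travel_mm(turns=2.0, pitch_mm=0.5)            # single-start
coarse = lateral_travel_mm(turns=2.0, pitch_mm=0.5, starts=2)  # two-start
```

This is why the choice between single- and multiple-pitch threads trades adjustment resolution against adjustment speed.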
The second thumb wheel 14648 is used to control the tilt of the first projection screen 14608a about the axis defined by the carriage shaft 14622 and the yoke shaft portion 14630. Referring now to Figure 153, the second thumb wheel 14648 is fixedly attached to a narrow portion 14662 of a hollow flanged shaft 14664. The flange portion 14668 of the flanged shaft 14664 threadedly receives the threaded shank portion 14670 of an eye hook 14672. (Note that the threads of the threaded shank portion 14670 are not depicted in the drawings, but may be single- or multiple-pitch threads.) In use, the narrow portion 14662 of the flanged shaft 14664 passes rotatably through a countersunk hole 14674 in the adjustment platform 14610 (see Figure 151), so that the thumb wheel 14648 is on the underside of the adjustment platform 14610 and the eye hook 14672 is on the top side, with the flange portion 14668 of the flanged shaft 14664 captured in the countersunk portion of the hole 14674. Referring again to Figure 149, the eye of the eye hook 14672 slidably engages the shaft 14678, which is captured in the holes in the free ends of the yoke arms 14634a, 14634b. Consequently, rotating the second thumb wheel 14648 about its axis (as indicated by arrow D) causes the flanged shaft 14664 to rotate with it, which causes the threaded shank portion 14670 of the eye hook 14672 to move vertically into or out of the flange portion 14668 (as indicated by arrow E), which causes the eye of the eye hook 14672 to push against the shaft 14678, which causes the yoke 14628 to rotate about its axis, thereby causing the first projection screen 14608a to tilt away from or toward the wearer (as indicated by arrow F).
Referring again to Figure 148, note that the electronics and portions of the projection system 14602a are located on a platform 14680 fixed to the top of the carriage 14620. Consequently, the spatial relationship between the projection screen 14608a and its associated projection system electronics and components 14602a remains substantially unchanged by any lateral or tilt adjustment made to the projection screen 14608a.
The AR glasses 14600 also include an adjustment mechanism similar to the adjustment mechanism 14614 just described, for laterally positioning and tilting the second projection screen 14608b on the wearer's right side of the AR eyepiece 14600.
In one embodiment, the eyepiece may include tilted or curved guide rails for IPD adjustment, such that the guide rails keep the optical modules conformal to a curved frame. In some embodiments, the displays may be adapted to attach to such tilted or curved guide rails.
In embodiments, one or more display screens of the AR eyepiece are arranged parallel to the line connecting the user's eyes. In some embodiments, one or more display screens are rotated about their vertical axes so that the end closest to the nose is rotated inward toward the eyes, away from parallel with the line connecting the user's eyes, by an angle in the range of about 0.1 to about 5 degrees, i.e., "toed in". In some of these latter embodiments, the toe-in angle is permanently fixed, while in others the toe-in angle is user-adjustable. In some of the user-adjustable embodiments, the adjustability is limited to two or more preset positions, for example positions representing near convergence, mid-distance convergence, and far convergence. In other embodiments, the adjustability is continuous. Preferably, in embodiments of AR glasses that also include automatic vergence correction as disclosed herein, the amount of toe-in is taken into account in the vergence correction. In embodiments where the toe-in is constant, the amount of toe-in can be included directly in the automatic vergence correction without a position sensor; in user-adjustable embodiments, however, it is preferred to use a position sensor to communicate the existing amount of toe-in to the processor for use in the vergence-correction calculation. In embodiments where the toe-in angle is user-adjustable, the adjustment may be performed manually, for example by means of a thumb wheel that, directly or indirectly (e.g., through a gear train), selectively rotates one or both display screens about their vertical axes, or it may be motorized so that a selectable rotation is performed when activated by the user through a user interface or a control switch.
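The statement that a fixed toe-in "can be included directly in the automatic vergence correction" can be illustrated with simple geometry: the per-eye convergence angle for a point straight ahead follows from the half-IPD and the viewing distance, and any mechanical toe-in already built into the display mounting is subtracted so that only the residual rotation is applied in software. This is a sketch under assumed names and values, not the specification's algorithm:

```python
import math

def residual_vergence_deg(ipd_mm: float, distance_mm: float,
                          toe_in_deg: float = 0.0) -> float:
    """Per-eye inward rotation needed to converge on a point straight ahead
    at the given distance, minus any fixed toe-in built into the display
    mounting (so only the residual is applied by the processor)."""
    half_angle = math.degrees(math.atan2(ipd_mm / 2.0, distance_mm))
    return half_angle - toe_in_deg

# 64 mm IPD, object at 0.5 m, displays permanently toed in by 1 degree:
residual = residual_vergence_deg(64.0, 500.0, toe_in_deg=1.0)
```

In the user-adjustable case, `toe_in_deg` would come from the position sensor rather than being a constant.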
In some cases, the toe-in adjustment can be used to relax the user's eyes during long sessions in which the eyes remain at a particular focal length (for example, when reading, or when watching a monitor, a ball game, or the horizon). The toe-in adjustment described above can thus be used to better align the displays with the user's eyes by effectively rotating the display screens, and to adjust for the user's interpupillary distance.
In embodiments, the present invention provides mechanical interpupillary distance adjustment, such as where the optical assembly of the eyepiece is adapted to be position-adjustable within the frame by the user, so that the user has the ability to change the position of the optical assembly relative to the user's eyes. The position adjustment may control the horizontal position, vertical position, tilt, and the like of the optical assembly within the spectacle frame.
In embodiments, the present invention may provide digital interpupillary distance adjustment, such as where the integrated processor executes a pupil-alignment process that allows the user to adjust the placement of displayed content within the field of view of the eyepiece's optical assembly, thereby setting a pupil-alignment calibration factor for use in the placement of other displayed content. The calibration factor may include horizontal and/or vertical adjustment of the displayed content within the field of view. The calibration may include multiple calibration factors, each representing a distance to a real object, such that the appropriate distance calibration factor is used when content is positioned in the field of view based on a calculation of the distance to a real object. The positioning of an image may be adjusted on the display to move it within the field of view. Moving the two images apart makes the imaged object appear farther away, while moving the images closer together makes the object appear closer. This difference in an object's position within the field of view of each eye is referred to as parallax. Parallax is related to the distance of the perceived object from the user.
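The relationship stated above, that parallax is related to the distance of the perceived object from the user, can be sketched with a pinhole-style disparity model, with a per-user offset standing in for the pupil-alignment calibration factor. The function name, focal length in pixels, and example values are assumptions for illustration, not taken from the specification:

```python
def parallax_shift_px(ipd_mm: float, distance_mm: float,
                      focal_px: float, cal_offset_px: float = 0.0) -> float:
    """Horizontal separation, in display pixels, between the left- and
    right-eye renderings of a point at the given distance.  A per-user
    calibration offset (from the pupil-alignment process) is added so the
    same content lands correctly for different wearers."""
    return focal_px * (ipd_mm / distance_mm) + cal_offset_px

near = parallax_shift_px(64.0, 500.0, focal_px=1000.0)   # object at 0.5 m
far  = parallax_shift_px(64.0, 5000.0, focal_px=1000.0)  # object at 5 m
```

The larger shift for the nearer object matches the behavior described in the text: images moved farther apart read as more distant, images moved closer together read as nearer.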
Referring now to Figure 173, an exploded view of the glasses is depicted. The electronics 17302, including the CPU, display drivers, camera, radio, processor, user interface, and so on, are located in the front frame of the glasses above the eyebrows. The optical modules 17308 are attached to the frame with lenses 17304 covering them; the lenses 17304 are optional, and may be tinted or tintable. A stereo embodiment is shown here, but it should be understood that a single optical module 17308 may also be used. The electronics 17302 are enclosed by a cover 17314 that includes a physical user interface 17310, which may be a button, a touch interface, a rotary dial, a switch, or any other physical user interface. The physical user interface 17310 may control various aspects of the glasses, such as functions of the glasses, applications running on the glasses, or control of applications on external devices. The user may conveniently operate this control by grasping the lower part of the frame to stabilize it while touching the control/UI on top of the frame. The arms 17312 rest on the ears and may include a head strap for securing the glasses, a socket for audio/earphone functionality or external audio equipment, a battery 17318, power-supply functionality, and the like. The battery 17318 may be placed in either arm; options for the battery 17318 are disclosed herein, but the battery 17318 may also be of any available battery type. The head strap may be an ear band made of nitinol or another shape-memory alloy. The ear band may be in the form of a strap or, as in Figure 177, the ear band 17702 may be in the form of bent wire, which is slimmer, lighter, and lower in cost. For aesthetic purposes, the frame may be any color, the lenses may be any color, and the tips of the eyepiece arms, or at least the arms, may be colored. For example, the nitinol forming the arm tips may be colored.
Referring now to Figure 174, power and signals are routed from the battery to the electronics in the front frame across an operable hinge 17408 by means of a wiring design that uses a minimal number of wires and passes them through the hinge in a wire guide 17404. The wiring design may include wires 17402 running from the front-frame electronics to an earphone located on the arm. Figure 175 depicts an enlarged version of Figure 174, focusing on the wires 17402 in the wire guide 17404. Figures 176A-C depict the wire guide with various portions of the frame and the internal structure of the glasses cut away. The view is from the user's side of the frame, looking toward the hinge. Figure 176A shows a section through most of the material, Figure 176B shows a section through nearly as much of the material, and Figure 176C shows the full version of the glasses.
Fig. 6 depicts an embodiment of the eyepiece 600 with a see-through or translucent lens 602. A projected image 618 can be seen on the lens 602. In this embodiment, the image 618 being projected onto the lens 602 happens to be an augmented reality version of the scene the wearer is viewing, in which tagged points of interest (POIs) in the field of view are shown to the wearer. The augmented reality version may be enabled by a forward-facing camera embedded in the eyepiece (not shown in Fig. 6) that images what the wearer is looking at and identifies the location/POIs. In one embodiment, the output of the camera or optical transmitter may be sent to the eyepiece controller or to memory for storage, for transmission to a remote location, or for viewing by the person wearing the eyepiece or glasses. For example, the video output may be streamed to the virtual screen for viewing by the user. The video output may thus be used to help determine the user's position, or may be sent remotely to others to assist in locating the wearer, or for any other purpose. GPS, RFID, manual input, or other detection technologies may be used to determine the wearer's position. Using the position or identification data, a database may be accessed by the eyepiece to obtain information that can be overlaid, projected, or otherwise displayed together with what is being seen. Augmented reality applications and technologies are discussed further herein.
In Fig. 7, an embodiment of the eyepiece 700 with a translucent lens 702 is depicted, on which streaming media (an e-mail application) and an incoming-call notification 704 are being displayed. In this embodiment, the media obscures part of the viewing area; it should be understood, however, that the displayed image may be placed anywhere in the field of view. In embodiments, the media may be made more or less transparent.
In one embodiment, the eyepiece may receive input from an external source, such as an external converter box. The source may be depicted in the lens of the eyepiece. In one embodiment, when the external source is a phone, the eyepiece may use the phone's location capabilities to display location-based augmented reality, including marker overlays from marker-based AR applications. In embodiments, a VNC client running on the eyepiece's processor or on an associated device may be used to connect to and control a computer, where the computer's display is viewed in the eyepiece by the wearer. In one embodiment, content from any source may be streamed to the eyepiece, such as the display of a panoramic camera mounted on the roof of a vehicle, the user interface of a device, imagery from a drone or helicopter, and so on. For example, when the feed of a camera mounted on a gun is directed to the eyepiece, the lens permits shooting at targets that are not in the direct line of sight.
The lenses may be chromic, such as photochromic or electrochromic. An electrochromic lens may include an integral chromic material, or a chromic coating, that changes the opacity of at least part of the lens in response to a burst of charge applied across the chromic material by the processor. For example, and referring to Fig. 9, a chromic portion 902 of the lens 904 is shown darkened, such as to provide the wearer with greater viewability while that portion is presenting displayed content to the eyepiece wearer. In embodiments, there may be multiple chromic areas on the lens that may be controlled independently, such as a large portion of the lens, sub-portions of the projected area, programmable areas of the lens and/or the projected area, areas controlled down to the pixel level, and the like. Activation of the chromic material may be controlled via the control techniques further described herein, or enabled automatically for certain applications (e.g., a streaming video application, a solar tracking application, an ambient light sensor, a camera tracking brightness in the field of view), or controlled in response to a UV sensor embedded in the frame. In embodiments, the electrochromic layer may be on an optical element, between optical elements, and/or on a surface of the eyepiece, on a corrective lens, on a ballistic lens, and the like. In one example, the electrochromic layer may be made of a stack, such as ITO-coated (indium tin oxide) PET/PC films with two electrochromic (EC) layers between them, which may eliminate another PET/PC layer and thereby reduce reflections (e.g., the layer stack may comprise PET/PC-EC-PET/PC-EC-PET/PC). In embodiments, the electrically controllable optical layer may be provided as a liquid-crystal-based scheme for a bi-state configuration with color. In other embodiments, multiple layers of liquid crystal, or alternating electrochromic layers forming the optical layer, may be used to provide variable color, such that certain layers or segments of the optical layer can be switched on or off in stages. The electrochromic layer as used herein may generally be any electrically controlled transparency in the eyepiece, including SPD, LCE, electrowetting, and the like.
In embodiments, the lenses may have an angular-sensitive coating that transmits light waves with low angles of incidence and reflects light with high angles of incidence, such as s-polarized light. The chromic coating may be controlled in portions or in its entirety, such as by the control techniques described herein. The lenses may be variable contrast, and the contrast may be controlled by a push button or by any other control technique described herein. In embodiments, a user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The optical assembly may include an electrochromic layer that provides display-characteristic adjustment depending on the requirements of the displayed content and on ambient conditions. In embodiments, the display characteristic may be brightness, contrast, and the like. The ambient condition may be a level of luminance at which, absent the display-characteristic adjustment, the displayed content would be difficult for the eyepiece wearer to view, and the display-characteristic adjustment may be applied to the area of the optical assembly where content is being displayed.
In embodiments, the eyepiece may have controls for brightness, contrast, spatial resolution, and the like within the eyepiece's projected area, in order to alter or improve the user's view of the projected content against a bright or dark surrounding environment. For example, a user may be using the eyepiece under bright daylight conditions, and in order for the user to clearly see the displayed content, the display area may need to be altered in brightness and/or contrast. Alternatively, the viewing area surrounding the display area may be altered. In addition, whether inside or outside the display area, the spatially altered area may be targeted or controlled according to the application being implemented. For example, only a small portion of the display area may need to be altered, such as when that portion deviates from some determined or predetermined contrast ratio between the displayed portion of the display area and the surrounding environment. In embodiments, portions of the lens may be altered in brightness, contrast, spatial extent, resolution, and the like, such as confined to the entire display area, adjusted for only a portion of the lens, adaptive and dynamic with respect to changes in the lighting conditions of the surrounding environment and/or the brightness-contrast of the displayed content, and the like. The spatial extent (e.g., the area affected by the alteration) and the resolution (e.g., the display optical resolution) may vary across different portions of the lens, including high-resolution segments, low-resolution segments, single-pixel segments, and the like, where different segments may be combined to achieve the viewing objectives of the application being executed. In embodiments, technologies for implementing alterations of brightness, contrast, spatial extent, resolution, and the like may include electrochromic materials, LCD technologies, beads embedded in the optics, flexible displays, suspended particle device (SPD) technologies, colloid technologies, and the like.
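The brightness control described above can be driven from the ambient light sensor mentioned earlier: dim surroundings call for little display brightness, while direct sunlight calls for the maximum. A minimal sketch; the mapping, lux thresholds, and function name are illustrative assumptions, not the specification's control law:

```python
def display_brightness(ambient_lux: float,
                       min_brightness: float = 0.2,
                       max_brightness: float = 1.0,
                       bright_lux: float = 10000.0) -> float:
    """Map an ambient-light-sensor reading to a display-brightness setting,
    rising linearly with ambient light and clamping at full brightness."""
    frac = min(max(ambient_lux / bright_lux, 0.0), 1.0)
    return min_brightness + frac * (max_brightness - min_brightness)

indoor = display_brightness(300.0)     # typical office lighting
sunlit = display_brightness(50000.0)   # direct sunlight: clamped to maximum
```

The same shape of mapping could instead drive the electrochromic darkening of the area surrounding the display, dimming the see-through view rather than boosting the projected content.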
In embodiments, there may be various activation modes of the electrochromic layer. For example, the user may enter a sunglasses mode, in which the composite lenses appear only somewhat darkened, or the user may enter a "blackout" mode, in which the composite lenses appear fully darkened.
One example of a technology that may be employed in implementing alterations of brightness, contrast, spatial extent, resolution, and the like is an electrochromic material, film, ink, or the like. Electrochromism is the phenomenon exhibited by certain materials of reversibly changing appearance when an electric charge is applied. Various types of materials and structures may be used to construct electrochromic devices, depending on the specific application. For instance, electrochromic materials include tungsten oxide (WO3), a primary chemical used in the production of electrochromic windows or smart glass. In embodiments, electrochromic coatings may be used on the lenses of the eyepiece when implementing alterations. In another example, electrochromic displays may be used to implement "electronic paper", which is designed to mimic the appearance of ordinary paper, in that the electronic paper displays reflected light in the manner of ordinary paper. In embodiments, electrochromism may be implemented in a wide variety of applications and materials, including Gyricon (consisting of polyethylene spheres embedded in a transparent silicone sheet, with each sphere suspended in a bubble of oil so that it can rotate freely), electrophoretic displays (which form images by rearranging charged pigment particles with an applied electric field), E-Ink technology, electrowetting, electrofluidics, interferometric modulators, organic transistors embedded in flexible substrates, nanochromic displays (NCDs), and the like.
Another example of a technology that may be employed in implementing alterations of brightness, contrast, spatial extent, resolution, and the like is a suspended particle device (SPD). When a small voltage is applied to an SPD film, its microscopic particles, which in their stable state are randomly dispersed, become aligned and allow light to pass through. The response may be immediate and uniform, with stable color throughout the film. Adjustment of the voltage may allow the user to control the amount of light, glare, and heat passing through. The system's response may range from a dark blue appearance with full light blocking in its off state to a clear appearance in its on state. In embodiments, SPD technology may be an emulsion applied to a plastic substrate, creating an active film. This plastic film may be laminated (as a single pane of glass), suspended between two sheets of glass, plastic, or other transparent material, and the like.
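The voltage-to-transmittance behavior described for SPD film, near-opaque with no drive, clear at full drive, can be sketched as a simple monotonic mapping. The numbers below (drive range, off-state and on-state transmittance, linear shape) are illustrative assumptions; real SPD films have their own characteristic curves:

```python
def spd_transmittance(voltage: float, v_max: float = 100.0,
                      t_off: float = 0.005, t_on: float = 0.65) -> float:
    """Sketch of an SPD film's light transmittance versus drive voltage:
    dark (randomly dispersed particles) with no voltage, clear (aligned
    particles) at full drive, clamped outside the drive range."""
    frac = min(max(voltage / v_max, 0.0), 1.0)
    return t_off + frac * (t_on - t_off)

dark  = spd_transmittance(0.0)    # off state: particles scattered, ~opaque
clear = spd_transmittance(100.0)  # on state: particles aligned, ~clear
```

Intermediate voltages give intermediate transmittance, which is what allows continuous user control over light, glare, and heat as described above.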
Referring to Figs. 8A-C, in certain embodiments the electro-optics may be mounted in a monocular or binocular flip-up/flip-down arrangement in two parts: 1) the electro-optics; and 2) the correction lenses. Fig. 8A depicts a two-part eyepiece in which the electro-optics are contained in a module 802 that may be electrically connected to the eyepiece 804 via an electrical connector such as a plug, pin, socket, or wiring. In this arrangement, the lens 818 in the frame 814 may be entirely a correction lens. The interpupillary spacing between the two halves of the electro-optic module 802 may be adjusted at the bridge 808 to accommodate various IPDs. Similarly, the placement of the display 812 may be adjusted via the bridge 808. Fig. 8B depicts the binocular electro-optic module 802 with one half flipped up and the other half flipped down. The nose bridge may be fully adjustable and elastomeric. This permits a three-point mount on the head, at the nose bridge and the ears, ensuring image stability on the user's eyes, unlike helmet-mounted optics, which shift unstably on the scalp. Referring to Fig. 8C, the lenses 818 may be ANSI-compliant, hard-coated, scratch-resistant polycarbonate ballistic lenses, may be chromic, may have an angular-sensitive coating, may include UV-sensitive materials, and the like. In this arrangement, the electro-optic module may include a CMOS-based VIS/NIR/SWIR black silicon sensor for night vision. The electro-optic module 802 may feature quick-disconnect capability for user flexibility, field replacement, and upgrades. The electro-optic module 802 may have an integrated power socket.
In Figure 79, the flip-up/flip-down lenses 7910 may include a light block 7908. The removable, elastomeric eye cup/light dam/light block 7908 may be used to shield the flip-up/flip-down lenses 7910, such as for nighttime operation. An exploded top view of the eyepiece further depicts the head strap 7900, the frame 7904, and the adjustable nose bridge 7902. Figure 80 depicts exploded views of the electro-optic assembly in front (A) and side-angle (B) views. A retainer 8012 holds the see-through optics together with the correction lens 7910. An O-ring 8020 and a screw 8022 secure the retainer to the shaft 8024. A spring 8028 provides a spring-loaded connection between the retainer 8012 and the shaft 8024. The shaft 8024 is connected to the steel frame 8014, which is fastened to the eyepiece with thumb screws 8018. The shaft 8024 acts as a hinge and provides an IPD adjustment tool by means of the IPD adjustment knob 8030. As seen in Figure 81, the knob 8030 rotates along adjustment threads 8134. The shaft 8024 also has two fixing helical grooves 8132.
In embodiments, a photochromic layer may be included as part of the optics of the eyepiece. Photochromism is the reversible transformation of a chemical species between two forms by the absorption of electromagnetic radiation, where the two forms have different absorption spectra, such as a reversible change of color or darkness upon exposure to a given frequency of light. In one example, a photochromic layer may be included between the waveguide and the corrective optics of the eyepiece, on the outer surface of the corrective optics, and the like. In embodiments, the photochromic layer (such as when used as a darkening layer) may be activated with a UV diode or with other photochromically responsive wavelengths as known in the art. In the case of a UV-photoactivated photochromic layer, the eyepiece optics may also include a UV coating on the outside of the photochromic layer to prevent UV light from the sun from activating it unintentionally.
Current photochromic devices change quickly from light to dark, and change slowly from dark back to light. This is a result of the molecular changes involved in the photochromic material's transition from clear to dark. After the photochromic molecules are removed from UV light, such as UV light from the sun, they vibrate back to the clear state. By increasing the vibration of the molecules, such as by exposure to heat, the optics will clear more quickly. The speed at which a photochromic layer changes from dark to light is related to temperature. Changing quickly from dark to light is especially important for military applications, in which a sunglasses user often moves from a bright external environment into a dark internal environment, and it is important to be able to see quickly in the internal environment.
The present disclosure provides a photochromic layer with an attached heater, where the heater is used to accelerate the dark-to-clear transition in the photochromic material. This method relies on the relationship between temperature and the speed of the photochromic material's dark-to-clear transition, wherein the transition is faster at higher temperatures. To allow the heater to rapidly increase the temperature of the photochromic material, the photochromic material is provided as a thin layer with a thin heater. By keeping the thermal mass per unit area of the photochromic film device low, the heater only needs to supply a small amount of heat to rapidly produce a large temperature change in the photochromic material. Since the photochromic material only needs to be at the higher temperature during the dark-to-clear transition, the heater only needs to be used for a short time, so the power requirement is low.
The heater may be a thin, transparent heating element, such as an ITO heater or any other transparent and conductive film material. When the user needs the eyepiece to clear quickly, the user can activate the heater element by any of the control techniques described herein.
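The user-triggered heating step described above can be sketched as a short, low-power pulse: because the film's thermal mass is low, a brief pulse suffices, after which the heater shuts off. All names, timings, and duty-cycle values below are illustrative assumptions, not part of the original disclosure.

```python
# Hypothetical sketch: pulse a thin ITO heater briefly on a user command
# to accelerate the dark-to-clear transition, then shut off to keep
# power draw low.
import time

class PhotochromicHeater:
    def __init__(self, pulse_seconds=2.0, duty=0.8):
        self.pulse_seconds = pulse_seconds  # short pulse: low thermal mass heats fast
        self.duty = duty                    # fraction of maximum heater power
        self.on = False

    def set_power(self, duty):
        # Stand-in for driving the ITO film, e.g. through a PWM output.
        self.on = duty > 0

    def clear_quickly(self):
        # User-triggered: heat briefly, then turn the heater off.
        self.set_power(self.duty)
        time.sleep(self.pulse_seconds)
        self.set_power(0)
        return self.on  # False once the pulse has completed
```

In a real device the pulse length would be chosen from the measured dark-to-clear time of the particular film at the elevated temperature.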
In one embodiment, the heating element may be used to calibrate the photochromic element and to compensate for cold environmental conditions in which the lenses darken on their own.
In another embodiment, a thin coating of photochromic material may be placed on a thick substrate, with the heater element stacked on top. For example, an over-lens for sunglasses may include the accelerated photochromic scheme, but still have over the display region an optional, separate electrochromic sheet, used with or without UV photo-control.
Figure 94A depicts a photochromic film device with a serpentine heater pattern, and Figure 94B depicts a side view of the photochromic film device, where the device is a lens for sunglasses. The photochromic film device does not contact the protective cover lens shown, in order to reduce the thermal mass of the device.
United States Patent 3,152,215 describes a heater layer combined with a photochromic layer, in which the photochromic material is heated for the purpose of reducing the dark-to-clear transition time. However, the photochromic layer is placed into a wedge, which greatly increases the thermal mass of the device and thereby reduces the rate at which the heater can change the temperature of the photochromic material, or greatly increases the power needed to change the temperature of the photochromic material.
The present disclosure includes the use of a thin carrier layer to which the photochromic material is applied. The carrier layer can be glass or plastic. As is known in the art, the photochromic material can be applied to the carrier layer by vacuum coating, by dipping, or by thermal diffusion. The thickness of the carrier layer can be 150 microns or less. The carrier layer thickness is selected based on the required darkness of the photochromic film device in the dark state and the required speed of change between the dark state and the clear state. A thicker carrier layer can be darker in the dark state, while heating more slowly to an elevated temperature because of its larger thermal mass. Conversely, a thinner carrier layer is less dark in the dark state, while heating more quickly to an elevated temperature because of its smaller thermal mass.
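The thermal-mass trade-off above can be checked with a back-of-the-envelope calculation: the energy per unit area needed to raise a thin carrier layer by a temperature step is density × thickness × specific heat × ΔT. The material constants below are typical handbook values for glass, assumed here for illustration only.

```python
# Energy per cm^2 to heat a thin glass carrier layer by d_temp_k kelvin.
def heat_energy_per_cm2(thickness_um, d_temp_k,
                        density_g_cm3=2.5, c_j_g_k=0.84):
    thickness_cm = thickness_um * 1e-4
    return density_g_cm3 * thickness_cm * c_j_g_k * d_temp_k  # J/cm^2

thin = heat_energy_per_cm2(150, 30)    # 150 um layer, +30 K: ~0.95 J/cm^2
thick = heat_energy_per_cm2(1500, 30)  # a 1.5 mm wedge-like layer: ~9.5 J/cm^2
```

The tenfold reduction in energy for the thin layer illustrates why a low thermal mass lets a small heater produce a large, fast temperature change.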
The protective layer shown in Figure 94 is separated from the photochromic film device so that the thermal mass of the photochromic film device is kept low. In this way, the protective layer can be made thicker to provide higher impact strength. The protective layer can be glass or plastic; for example, the protective layer can be polycarbonate.
The heater can be a transparent conductor formed into a relatively uniform conductive path, so that the heat generated along the length of the heater is relatively uniform. One example of a transparent conductor that can be formed in this way is indium tin oxide. As shown in Figure 94A, a large area is provided at each end of the heater pattern for making electrical contact.
As noted in the discussion of Figures 8A-C, the augmented reality glasses can include a lens 818 for each eye of the wearer. The lenses 818 can be made to fit easily into the frame 814, so that each lens can be customized for the person intended to use the glasses. Thus, the lenses can be corrective lenses, and they can also be tinted for use as sunglasses, or have other qualities suitable for the intended environment. Accordingly, the lenses can be tinted yellow, dark, or another suitable color, or can be photochromic, so that the transparency of the lens decreases when the lens is exposed to brighter light. In one embodiment, the lenses are also designed to snap into or onto the frames, i.e., snap-on lenses are one embodiment. For example, the lenses can be made of high-quality Schott optical glass, and can include a polarizing filter.
Of course, the lenses need not be corrective lenses; they may simply serve as sunglasses or as protection for the optical system within the frame. In non-flip-up/flip-down arrangements, it goes without saying that the outer lenses are important in helping to protect the waveguides, viewing systems, and electronics of the fairly expensive augmented reality glasses. At a minimum, the outer lenses provide protection against abrasion from the user's environment, whether sand, brambles, thorn bushes, and the like in one environment, or flying debris, bullets, and ordnance in another environment. In addition, the outer lenses can be decorative, serving to change the look of the composite lens, perhaps appealing to the user's individuality or sense of fashion. The outer lenses can also help an individual user distinguish his or her glasses from those of others, such as when many users are gathered together.
It is desirable for the lenses to be resistant to impact, such as ballistic impact. Thus, in one embodiment, the lenses and frames meet ANSI standard Z87.1-2010 for ballistic resistance. In one embodiment, the lenses also meet ballistic standard CE EN166B. In another embodiment, for military use, the lenses and frames may meet MIL-PRF-31013, standard 3.5.1.1 or 4.4.1.1. Each of these standards has slightly different requirements for ballistic resistance, and each is intended to protect the user's eyes from impact by high-speed projectiles or debris. Although no particular material is specified, a suitable grade of polycarbonate is usually sufficient to pass the tests specified in the appropriate standard.
In one embodiment, as seen in Figure 8D, the lenses snap into the frames from the outside rather than the inside, for better impact resistance, since any impact is expected to come from the outside of the augmented reality glasses. In this embodiment, the replaceable lens 819 has a plurality of snap-fit arms 819a that fit into recesses 820a of the frame 820. The engagement angle 819b of the arms is greater than 90°, and the engagement angle of the recesses is also greater than 90°. Making the angles greater than a right angle has the practical effect of allowing removal of the lens 819 from the frame 820. The lens 819 may need to be removed if a person's vision has changed or if a different lens is desired for any reason. The design of the snap-fit is such that there is a slight compression or bearing load between the lens and the frame. That is, the lens may be held firmly within the frame, such as by a slight interference fit of the lens within the frame.
The cantilever snap-fit of Figure 8D is not the only possible way to removably snap the lens to the frame. For example, an annular snap-fit may be used, in which a continuous sealing lip of the frame engages an enlarged edge of the lens, and the enlarged edge of the lens then snaps into, or possibly over, the lip. Such a snap-fit is commonly used to attach a pen cap to a pen. This configuration may have the advantage of a sturdier joint with less chance for ingress of very small dust and dirt particles. Possible disadvantages include the fairly tight tolerances required around the entire periphery of both the lens and the frame, and the requirement for dimensional integrity in all three dimensions over time.
An even simpler interface, which may still be considered a snap-fit, is also possible. A groove may be molded into an outer surface of the frame, with the lens having a protruding surface, which may be considered a tongue that fits into the groove. If the groove is semi-cylindrical, such as from about 270° to about 300°, the tongue will snap into the groove and be firmly retained, with removal still possible through the gap remaining in the groove. In this embodiment, depicted in Figure 8E, a lens or replacement lens or cover 826 with a tongue 828 may be inserted into a groove 827 in the frame 825, even though the lens or cover does not snap into the frame. Because this fit is a close one, it will act as a snap-fit and securely retain the lens within the frame.
In another embodiment, the frame may be made in two pieces, such as a lower portion and an upper portion, assembled with a conventional tongue-and-groove fit. In another embodiment, this design may also use standard fasteners to ensure a tight grip of the frame on the lens. The design should not require disassembly of anything on the inside of the frame. Thus, the snap-on or other lens or cover should be capable of being assembled onto the frame, or removed from the frame, without entering the interior of the frame. As noted elsewhere in this disclosure, the augmented reality glasses have many component parts. Some of the assemblies and subassemblies require careful alignment. Moving and jarring these assemblies may be detrimental to their function, as may moving and jarring the frame and the outer or snap-on lenses or covers.
In embodiments, the flip-up/flip-down arrangement enables a modular design for the eyepiece. For example, not only can the eyepiece be equipped with a monocular or binocular module 802, but the lens 818 can also be replaced. In embodiments, additional features may be included with the module 802, whether the module 802 is associated with one display 812 or two displays 812. Referring to Figure 8F, monocular or binocular versions of the module 802 may be display-only (852 monocular, 854 binocular), or may be equipped with a forward-looking camera (858 monocular, and 860 and 862 binocular). In some embodiments, the modules may have additional integrated electronics, such as a GPS, a laser range finder, and the like. In the embodiment 862 enabling Urban Leader Tactical Response, Awareness and Visualization (also referred to as 'Ultra-Vis'), a binocular electro-optic module 862 is equipped with stereo forward-looking cameras 870, a GPS, and a laser range finder 868. These features enable the Ultra-Vis embodiment to provide panoramic night vision, and panoramic night vision with laser range finding and geo-location.
In one embodiment, the electro-optics characteristics may be, but are not limited to, as follows:
In one embodiment, the projector characteristics may be as follows:
In another embodiment, the augmented-display eyepiece may include electrically-controllable lenses as part of the micro-projector, or as part of the optics between the micro-projector and the waveguide. Figure 21 depicts one embodiment with such liquid lenses 2152.
The glasses may also include at least one camera or optical sensor 2130 that can provide one or more images for viewing by the user. The images are formed by the micro-projector 2114 on each side of the glasses for conveyance to the waveguide 2108 on that side. In one embodiment, an additional optical element, a variable-focus lens 2152, may also be provided. The lens is electrically adjustable by the user so that the image seen in the waveguide 2108 is brought into focus for the user. In embodiments, the camera may be a multi-lens camera, such as an 'array camera', where the eyepiece processor can combine data from the multiple viewpoints of the multiple lenses and lens arrays to construct a high-quality image. This technique may be referred to as computational imaging, since software is used to process the images. Computational imaging may provide image-processing advantages, such as allowing the composite image to be processed as a function of the individual lens images. For example, since each lens provides its own image, the processor may provide image processing such as creating a foveated image with a special focus, where the focus from one of the lens images is sharp, high resolution, and the like, while the remaining images are defocused, low resolution, and the like. The processor may also select portions of the composite image to store in memory while deleting the rest, such as when memory storage is limited and only certain portions of the composite image are important enough to save. In embodiments, the use of an array camera may provide the ability to change the image focus after the image has been taken. In addition to its imaging advantages, an array camera can provide a thinner mechanical profile than a traditional single-lens assembly, and therefore be easier to integrate into the eyepiece.
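The foveated-composite idea mentioned above can be sketched as follows: keep one sub-camera's image sharp inside a region of interest and fill the rest from a low-resolution neighbor image. This is an illustrative assumption about how an eyepiece processor might combine array-camera data, not the patent's actual algorithm.

```python
# Minimal foveated composite: sharp pixels inside a circular region,
# low-resolution (blurred) pixels everywhere else.
import numpy as np

def foveated_composite(sharp, blurred, center, radius):
    """sharp, blurred: HxW grayscale arrays. Keep `sharp` within `radius`
    of `center` (row, col); use `blurred` elsewhere."""
    h, w = sharp.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    return np.where(mask, sharp, blurred)
```

Storing only the masked region and discarding the rest would correspond to the memory-saving selection the passage describes.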
The variable lenses may include the so-called liquid lenses furnished by Varioptic, S.A., of Lyon, France, or by LensVector, Inc., of Mountain View, California, U.S.A. Such lenses may include a central portion with two immiscible liquids. Typically, in these lenses, the path of light through the lens, i.e., the focal length of the lens, is altered or focused by applying an electric potential between electrodes immersed in the liquids. At least one of the liquids is affected by the resulting electric or magnetic field potential. Thus, electrowetting may occur, as described in U.S. Patent Application Publication 2010/0007807, assigned to LensVector, Inc. Other techniques are described in LensVector patent application publications 2009/021331 and 2009/0316097. All three of these disclosures are incorporated herein by reference, as though each page and figure were set forth verbatim herein.
Other patent documents from Varioptic describe other devices and techniques for a variable-focus lens, which may also work through the electrowetting phenomenon. These documents include U.S. Patents 7,245,440 and 7,894,440 and U.S. Patent Application Publications 2010/0177386 and 2010/0295987, each of which is also incorporated herein by reference, as though each page and figure were set forth verbatim herein. In these documents, the two liquids typically have different refractive indices and different electrical conductivities, e.g., one liquid is conductive, such as an aqueous liquid, and the other liquid is insulating, such as an oily liquid. Applying an electric potential may change the thickness of the lens, and does change the path of light through the lens, thus changing the focal length of the lens.
The electrically-adjustable lenses may be controlled by the controls of the glasses. In one embodiment, a focus adjustment is made by calling up a menu from the controls and adjusting the focus of the lens. The lenses may be controlled separately or may be controlled together. The adjustment is made by physically turning a control knob, by indicating with a gesture, or by voice command. In another embodiment, the augmented reality glasses may also include a rangefinder, and the focus of the electrically-adjustable lenses may be controlled automatically by pointing the rangefinder, such as a laser rangefinder, at a target at the desired distance from the user.
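The automatic, rangefinder-driven adjustment described above can be sketched as a mapping from measured target distance to a lens control voltage via the thin-lens relation. The voltage-per-diopter constant and the voltage limits are made-up assumptions; a real liquid lens would use a calibration specific to the device.

```python
# Map a rangefinder distance (meters) to a liquid-lens control voltage.
def focus_voltage(distance_m, volts_per_diopter=5.0, v_min=0.0, v_max=60.0):
    diopters = 1.0 / max(distance_m, 0.1)   # required optical power, clamped
    v = diopters * volts_per_diopter
    return min(max(v, v_min), v_max)

# e.g. a target 2 m away needs 0.5 D, i.e. 2.5 V under these assumptions
```

The clamp at 0.1 m models a near-focus limit; distances closer than that saturate the lens drive.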
As shown in U.S. Patent 7,894,440, discussed above, variable lenses may also be applied to the outer lenses of the augmented reality glasses or eyepiece. In one embodiment, the lenses may simply take the place of a corrective lens. The variable lenses with their electrically-adjustable control may be used instead of, or as a supplement to, the lenses mounted on the image source or projector. The corrective lens insert provides corrective optics for the user's environment, the outside world, whether or not the waveguide displays are active.
It is important that the images presented to the wearer of the augmented reality glasses or eyepiece, i.e., the images seen in the waveguide, be stabilized. The views or images presented travel from one or two digital cameras or sensors mounted on the eyepiece to digital circuitry, where the images are processed and, if desired, stored as digital data before they appear in the displays of the glasses. In any event, and as noted above, the digital data is then used to form an image, such as by using an LCOS display and a series of RGB light-emitting diodes. The light images are processed using a series of lenses, a polarizing beam splitter, an electrically-powered liquid corrective lens, and at least one transition lens from the projector to the waveguide.
The process of gathering and presenting images involves several mechanical and optical linkages between components of the augmented reality glasses. It therefore seems clear that some form of stabilization will be needed. This may include optical stabilization of the most immediate cause, the camera itself, since it is mounted on a mobile platform, the glasses, which themselves are movably mounted on a mobile user. Accordingly, camera stabilization or correction may be required. In addition, at least some stabilization or correction should be applied to the liquid variable lens. Ideally, a stabilization circuit at this point could correct not only for the liquid lens, but also for any aberration and vibration from many parts of the circuit upstream of the liquid lens, including the image source. One advantage of the present system is that many commercial off-the-shelf cameras are very advanced and typically have at least one image-stabilization feature or option. Accordingly, there may be many embodiments of the present disclosure, each with the same or a different method of stabilizing an image or a very fast stream of images, as described below. The term optical stabilization is typically used herein in the sense of physically stabilizing the camera, camera platform, or other physical object, while image stabilization refers to data manipulation and processing.
One technique of image stabilization is performed on digital images as the images are formed. This technique may use pixels outside the border of the visible frame as a buffer for undesired motion. Alternatively, the technique may use another relatively steady area or basis in successive frames. This technique may be applied to video cameras, shifting the electronic image of the video frame by frame in a manner sufficient to counteract the motion. This technique does not depend on sensors, and stabilizes the images directly by reducing vibrations from the moving camera or other distracting motion. In some techniques, the speed of the images may be slowed in order to add the stabilization process to the remainder of the digital process, requiring more time per image. These techniques may use a global motion vector, calculated from frame-to-frame motion differences, to determine the direction of stabilization.
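The buffer-pixel scheme just described can be sketched as follows: the sensor captures a frame larger than the visible window, an upstream global-motion estimator (not shown) supplies the frame-to-frame motion vector, and the crop window is shifted by the opposite amount so the displayed image stays steady. The function and its interface are illustrative assumptions.

```python
# Digital stabilization by shifting a crop window inside a border of
# buffer pixels, opposite to the estimated camera motion.
import numpy as np

def stabilized_crop(frame, motion_vector, view_h, view_w, margin):
    """frame: (view_h + 2*margin, view_w + 2*margin) array;
    motion_vector: (dy, dx) estimated camera motion since the last frame.
    The compensating shift is clamped to the available buffer margin."""
    dy = int(np.clip(-motion_vector[0], -margin, margin))
    dx = int(np.clip(-motion_vector[1], -margin, margin))
    top, left = margin + dy, margin + dx
    return frame[top:top + view_h, left:left + view_w]
```

Clamping to the margin reflects the technique's limitation: motion larger than the buffer of extra pixels cannot be fully counteracted.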
Optical stabilization of images uses a gravity- or electrically-driven mechanism to move or adjust an optical element or the image sensor so that it counteracts ambient vibration. Another way to optically stabilize the displayed content is to provide gyroscopic correction or sensing of the platform housing the augmented reality glasses, e.g., the user. As noted above, the sensors available and in use on the augmented reality glasses or eyepiece include MEMS gyroscopic sensors. These sensors capture movement and motion in three dimensions in very small increments, and can be used as feedback to correct, in real time, the images sent from the camera. Clearly, a large part of the undesired and unwelcome movement is likely caused by movement of the user or of the camera itself. These larger movements may include gross movements of the user, e.g., walking or running, or riding a vehicle. Smaller vibrations may originate within the augmented reality glasses, that is, vibrations in the components of the electrical and mechanical linkages that form the path from the camera (input) to the image in the waveguide (output). These gross movements may be more important to correct or account for than, for instance, independent and tiny movements in the linkages of components downstream of the projector. In embodiments, gyroscopic stabilization may stabilize the image when the image undergoes periodic motion. For such periodic motion, the gyroscope may determine the periodicity of the user's movement and send that information to the processor, so that the placement of the content in the user's view is corrected. The gyroscope may utilize a rolling average of two, three, or more cycles of the periodic motion in determining the periodicity. Other sensors may also be used to stabilize the image or place the image correctly in the user's field of view, such as an accelerometer, a position sensor, a range sensor, a rangefinder, a biometric sensor, a geodetic sensor, an optical sensor, a video sensor, a camera, an infrared sensor, a photocell sensor, or an RF sensor. When a sensor detects user head or eye movement, the sensor provides output to the processor, and the processor can determine the direction, speed, amount, and rate of the user's head or eye movement. The processor may convert this information into a suitable data structure for further processing by the processor controlling the optics assembly (which may be the same processor). The data structure may be one or more vectors. For example, the direction of a vector may define the orientation of the movement, and the length of the vector may define the rate of the movement. Using the processed sensor output, the display of content is adjusted accordingly.
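The vector data structure described above can be sketched as follows: gyroscope angular-rate samples are reduced to a vector whose direction gives the orientation of the movement and whose length gives its rate, which the display process then uses to offset the content. The function names and the simple proportional offset rule are assumptions for illustration only.

```python
# Reduce head-motion angular rates to an (orientation, rate) vector and
# derive a compensating content offset from it.
import math

def motion_vector(dx_deg_s, dy_deg_s):
    """Angular rates about two axes -> (orientation_rad, rate_deg_s)."""
    return math.atan2(dy_deg_s, dx_deg_s), math.hypot(dx_deg_s, dy_deg_s)

def content_offset(vector, gain=0.5):
    """Shift displayed content opposite to the sensed motion."""
    angle, rate = vector
    return (-gain * rate * math.cos(angle), -gain * rate * math.sin(angle))
```

In a real eyepiece the gain would be tunable, matching the manual gain control mentioned later in this section.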
Motion sensing may thus be used either to sense motion and correct for it, as in optical stabilization, or to sense motion and then correct the images that are being taken and processed, as in image stabilization. An apparatus for sensing motion and correcting the images or data is depicted in Figure 34A. In this apparatus, one or more kinds of motion sensors may be used, including accelerometers, angular position sensors, or gyroscopes, such as MEMS gyroscopes. Data from the sensors is fed back to an appropriate sensor interface, such as an analog-to-digital converter (ADC), or other suitable interface, such as a digital signal processor (DSP). A microprocessor then processes this information, as discussed above, and sends image-stabilized frames to the display driver, and then to the see-through display or waveguide discussed above. In one embodiment, the display begins with the RGB display in the micro-projector of the augmented reality eyepiece.
In another embodiment, a video sensor, or augmented reality glasses or another device with a video sensor, may be mounted on a vehicle. In this embodiment, the video stream may be communicated through a telecommunication capability or an Internet capability to personnel in the vehicle. One application could be sightseeing or touring of an area. Another embodiment could be exploration or surveying of an area, or even patrolling. In these embodiments, gyroscopic stabilization of the image sensor would be helpful, rather than applying a gyroscopic correction to the images or to the digital data representing the images. An embodiment of this technique is depicted in Figure 34B. In this technique, a camera or image sensor 3407 is mounted on a vehicle 3401. One or more motion sensors 3406, such as gyroscopes, are mounted in the camera assembly 3405. A stabilizing platform 3404 receives information from the motion sensors and stabilizes the camera assembly 3405, so that jitter and wobble are minimized while the camera operates. This is true optical stabilization. Alternatively, the motion sensors or gyroscopes may be mounted on or within the stabilizing platform itself. This technique would in fact provide optical stabilization, stabilizing the camera or image sensor, in contrast to digital stabilization, which corrects the images afterwards by computer processing of the data taken by the camera.
In one technique, the key to optical stabilization is to apply the stabilization or correction before the image sensor converts the images into digital information. In one technique, feedback from sensors, such as gyroscopes or angular velocity sensors, is encoded and sent to an actuator, which moves the image sensor, much as an autofocus mechanism adjusts the focus of a lens. The image sensor is moved in such a way as to maintain the projection of the image onto the image plane, which is a function of the focal length of the lens in use; autoranging and focal length information, possibly from a rangefinder of the interactive head-mounted eyepiece, may be acquired through the lens itself. In another technique, angular velocity sensors, sometimes also called gyroscopic sensors, can be used to detect horizontal and vertical movements, respectively. The detected motion can then be fed back to electromagnets to move a floating lens of the camera. This optical stabilization technique, however, would have to be applied to each lens contemplated, making the result rather expensive.
Stabilization of the liquid lens is discussed in U.S. Patent Application Publication 2010/0295987, assigned to Varioptic, S.A., of Lyon, France. In theory, control of a liquid lens is relatively simple, since there is only one variable to control: the level of voltage applied to the electrodes in the conducting and non-conducting liquids of the lens, e.g., using the lens housing and the cap as electrodes. Applying a voltage causes a change or tilt in the liquid-liquid interface via the electrowetting effect. This change or tilt adjusts the focus or output of the lens. In its most basic terms, a control scheme with feedback would then apply a voltage and determine the effect of the applied voltage on the result, i.e., on the focus or astigmatism of the image. The voltages may be applied in various patterns, e.g., equal and opposite + and - voltages, both positive voltages of differing magnitudes, both negative voltages of differing magnitudes, and so forth. Such lenses are known as electrically variable optic lenses or electro-optic lenses.
Voltages may be applied to the electrodes in patterns for a short period of time, and a check made of the focus or aberration. The check may be made, for instance, by an image sensor. In addition, sensors on the camera, or in this case on the lens, may detect motion of the camera or lens. Motion sensors may include accelerometers, gyroscopes, angular velocity sensors, or piezoelectric sensors mounted on the liquid lens or on a portion of the optical train in very close proximity to the liquid lens. In one embodiment, a table, such as a calibration table, is then constructed of voltages applied and the degree of correction, or the voltages needed, for given levels of movement. More sophistication may also be added, for example, by using segmented electrodes in different portions of the liquid, so that four voltages may be applied rather than two. Of course, if four electrodes are used, four voltages may be applied, in many more patterns than with only two electrodes. These patterns may include equal and opposite positive and negative voltages to opposite segments, and so forth. An example is depicted in Figure 34C. Four electrodes 3409 are mounted within a liquid lens housing (not shown). Two electrodes are mounted in or near the non-conducting liquid, and two are mounted in or near the conducting liquid. Each electrode is independent in terms of the voltage that may be applied.
A look-up or calibration table may be constructed and placed in the memory of the augmented reality glasses. In use, the accelerometer or other motion sensor will sense the motion of the glasses, i.e., of the camera or the lens on the glasses. A motion sensor such as an accelerometer will sense, in particular, small vibration-type movements that interfere with smooth delivery of images to the waveguide. In one embodiment, the image stabilization techniques described here can be applied to the electrically-controllable liquid lens, so that the image from the projector is corrected immediately. This will stabilize the output of the projector, at least partially correcting for the vibration and movement of the augmented reality eyepiece, as well as for at least some movement of the user. There may also be a manual control for adjusting the gain or other parameters of the corrections. Note that this technique may also be used to correct for the near-sightedness or far-sightedness of the individual user, in addition to the focus adjustment provided by the image sensor controls and discussed as part of the adjustable-focus projector.
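The calibration-table scheme above can be sketched as a quantized lookup: the sensed motion level selects a row, and the row supplies the electrode voltages for the liquid lens. The table values and the two-electrode layout below are invented for illustration; a real table would be built by calibrating the specific lens, as the passage describes.

```python
# Look up liquid-lens correction voltages from a sensed angular rate.
from bisect import bisect_left

# (upper bound on angular rate, deg/s) -> (V_electrode_1, V_electrode_2)
CAL_TABLE = [(0.5, (0.0, 0.0)),
             (2.0, (1.5, -1.5)),
             (5.0, (3.0, -3.0)),
             (10.0, (6.0, -6.0))]

def correction_voltages(angular_rate):
    thresholds = [t for t, _ in CAL_TABLE]
    # Rates above the last threshold clamp to the strongest correction.
    i = min(bisect_left(thresholds, angular_rate), len(CAL_TABLE) - 1)
    return CAL_TABLE[i][1]
```

With segmented electrodes, each row would simply hold four voltages instead of two, enlarging the table without changing the lookup.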
Another variable-focus element uses tunable liquid crystal cells to focus an image. These are disclosed, for example, in U.S. Patent Application Publications 2009/0213321, 2009/0316097, and 2010/0007807, which are hereby incorporated by reference in their entirety and relied upon. In this method, the liquid crystal material is contained within a transparent cell, preferably with a matching index of refraction. The cell includes transparent electrodes, such as those made from indium tin oxide (ITO). Using one spiral-shaped electrode, and a second spiral-shaped electrode or a planar electrode, a spatially non-uniform magnetic field is applied. Other shapes of electrodes may be used. The shape of the magnetic field determines the rotation of the molecules in the liquid crystal cell, to achieve a change in the refractive index and thus a change in the focus of the lens. The liquid crystals can thus be electromagnetically manipulated to change their index of refraction, making the tunable liquid crystal cell act as a lens.
In a first embodiment, a tunable liquid crystal cell 3420 is depicted in Figure 34D. The cell includes an inner layer of liquid crystal 3421 and thin layers 3423 of an orienting material, such as polyimide. This material helps to orient the liquid crystals in a preferred direction. Transparent electrodes 3425 are on each side of the orienting material. The electrodes may be planar, or may be spiral-shaped, as shown on the right side of Figure 34D. Transparent glass substrates 3427 contain the materials within the cell. The electrodes are formed so that they lend their shape to the magnetic field. As noted, a spiral-shaped electrode on one or both sides is used in one embodiment, such that the two sides are not symmetric. A second embodiment is depicted in Figure 34E. A tunable liquid crystal cell 3430 includes a central liquid crystal material 3431, transparent glass substrate walls 3433, and transparent electrodes. A bottom electrode 3435 is planar, while a top electrode 3437 is in the shape of a spiral. The transparent electrodes may be made of indium tin oxide (ITO).
Additional electrodes may be used for quick reversion of the liquid crystal to a non-shaped or natural state. A small control voltage is thus used to dynamically change the refractive index of the material the light passes through. The voltage generates a spatially non-uniform magnetic field of the desired shape, allowing the liquid crystal to act as a lens.
In one embodiment, the camera includes a black silicon, short-wave infrared (SWIR) CMOS sensor, described elsewhere in this patent. In another embodiment, the camera is a 5 megapixel (MP) optically stabilized video sensor. In one embodiment, the controls include a 3 GHz microprocessor or microcontroller, and may also include a 633 MHz digital signal processor with a 30M polygon/second graphics accelerator for processing images from the camera or video sensor. In one embodiment, the augmented reality glasses may include wireless internet, radio, or telecommunications capability for broadband, personal area network (PAN), local area network (LAN), wide area network (WAN), or wireless local area network (WLAN) communications complying with IEEE 802.11, or reach-back communications. In one embodiment, the equipment provided includes Bluetooth capability complying with IEEE 802.15. In one embodiment, the augmented reality glasses include an encryption system for secure communications, such as a 256-bit Advanced Encryption Standard (AES) encryption system or another suitable encryption program.
In one embodiment, the wireless telecommunications may include capability for 3G or 4G networks and may also include wireless internet capability. For extended battery life, the augmented reality eyepiece or glasses may also include at least one lithium-ion battery and, as described above, recharging capability. The recharging plug may comprise an AC/DC power converter and may be capable of using multiple input voltages, such as 120 or 240 V. In one embodiment, the controls for adjusting the focus of the adjustable focus lenses comprise a 2D or 3D wireless air mouse or other non-contact control responsive to gestures or movements of the user. A 2D mouse is available from Logitech of Fremont, California, USA. A 3D mouse is described herein, or others may be used, such as the Cideko AVK05 available from Cideko of Taiwan.
In one embodiment, the eyepiece may comprise electronics suitable for controlling the optics and associated systems, including a central processing unit, non-volatile memory, a digital signal processor, 3-D graphics accelerators, and the like. The eyepiece may provide additional electronic elements or features, including an inertial navigation system, a camera, a microphone, audio output, a power supply, communication systems, sensors, stopwatch or chronometer functions, a thermometer, vibratory temple motors, a motion sensor, a microphone to enable audio control of the system, a UV sensor to enable contrast and dimming with photochromic materials, and the like.
In one embodiment, the central processing unit (CPU) of the eyepiece may be an OMAP 4 with dual 1 GHz processor cores. The CPU may include a 633 MHz DSP, giving the CPU a capability of 30 million polygons/second.
The system may also be provided with dual micro-SD (secure digital) slots for provisioning additional removable non-volatile memory.
An on-board camera may provide 1.3 MP color and record up to 60 minutes of video footage. The recorded video may be transferred wirelessly, or a mini-USB transfer device may be used to offload the footage.
A communications system-on-a-chip (SOC) may be capable of operating with wireless local area networks (WLAN), Bluetooth version 3.0, a GPS receiver, FM radio, and the like.
The eyepiece may operate on a 3.6 VDC lithium-ion rechargeable battery for long battery life and ease of use. An additional power source may be provided through solar cells external to the frame of the system. These solar cells may supply power and may also be capable of recharging the lithium-ion battery.
The total power consumption of the eyepiece may be approximately 400 mW, but varies depending on the features and applications used. For example, processor-intensive applications with a great deal of video and graphics demand more power and will be closer to 400 mW. Simpler, less video-intensive applications will use less power. The operating time on a charge may also vary with the applications and features used.
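The dependence of power draw and runtime on the active feature set can be sketched as a simple budget. Only the roughly 400 mW figure for a video-heavy workload comes from the text above; the per-subsystem breakdown and battery capacity below are hypothetical numbers chosen for illustration.

```python
# Hypothetical per-subsystem draws in milliwatts; only the ~400 mW
# total for a processor-intensive workload is stated in the text.
SUBSYSTEM_MW = {
    "cpu_dsp": 220.0,
    "projector_leds": 90.0,
    "camera": 60.0,
    "radios_idle": 30.0,
}

def total_draw_mw(active):
    """Sum the draw of the currently active subsystems."""
    return sum(SUBSYSTEM_MW[name] for name in active)

def hours_on_charge(battery_mwh, active):
    """Estimated runtime: battery capacity divided by average draw."""
    return battery_mwh / total_draw_mw(active)

heavy = total_draw_mw(SUBSYSTEM_MW)                 # video-heavy application
light = total_draw_mw(["cpu_dsp", "radios_idle"])   # simple application
print(f"heavy: {heavy:.0f} mW, light: {light:.0f} mW")
print(f"runtime (1600 mWh, heavy use): {hours_on_charge(1600.0, SUBSYSTEM_MW):.1f} h")
```

This makes concrete why a simpler application extends the operating time on a single charge: halving the average draw roughly doubles the runtime.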
The micro-projector illumination engine, also known herein as the projector, may include multiple light-emitting diodes (LEDs). In order to provide lifelike color, Osram red, Cree green, and Cree blue LEDs are used. These are die-based LEDs. The RGB engine may provide an adjustable color output, allowing a user to optimize viewing for various programs and applications.
In embodiments, illumination may be added to the glasses or controlled through various means. For example, LED lights or other lights may be embedded in the frame of the eyepiece, such as in the nose bridge, around the composite lens, or at the temples.
The intensity of the illumination and/or the color of the illumination may be modulated. Modulation may be accomplished through the various control technologies described herein, through various applications, and through filtering and magnification.
By way of example, illumination may be modulated through various control technologies described herein, such as adjustment of a control knob, a gesture, eye movement, or a voice command. If a user desires to increase the intensity of illumination, the user may adjust a control knob on the glasses, adjust a control knob in the user interface displayed on the lens, or do so by other means. The user may use eye movements to control a knob displayed on the lens, or may control the knob by other means. The user may adjust the illumination through a movement of the hand or other body movement such that the intensity or color of the illumination changes based on the movement made by the user. Also, the user may adjust the illumination through a voice command, such as by speaking a phrase requesting increased or decreased illumination or requesting that other colors be displayed. Additionally, illumination modulation may be achieved through any control technology described herein or by other means.
Further, the illumination may be modulated per the particular application being executed. As an example, an application may automatically adjust the intensity or color of the illumination based on the optimal settings for that application. If the current level of illumination is not optimal for the application being executed, a message or command may be sent to provide for an illumination adjustment.
In embodiments, illumination modulation may be accomplished through filtering and/or through magnification. For example, filtering techniques may be employed that allow the intensity and/or color of the light to be changed such that the optimal or desired illumination is achieved. Also, in embodiments, the intensity of the illumination may be modulated by applying greater or lesser magnification to reach the desired illumination intensity.
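The illumination-control behavior described in the preceding paragraphs can be sketched as a small event dispatcher: user commands nudge a normalized brightness level, while launching an application snaps the level to that application's optimal preset. The command names, preset values, and step size are hypothetical; only the control modalities (knob, gesture, eye movement, voice, per-application adjustment) come from the text.

```python
# Hypothetical application presets (normalized 0.0 = off .. 1.0 = full).
APP_PRESETS = {"night_vision": 0.2, "video_playback": 0.6, "outdoor_map": 1.0}

class IlluminationController:
    def __init__(self, level=0.5):
        self.level = level

    def _clamp(self):
        self.level = max(0.0, min(1.0, self.level))

    def handle(self, event):
        """Map a control event (voice command, application launch) to an
        illumination change, clamping to the valid range."""
        if event == "voice:brighter":
            self.level += 0.1
        elif event == "voice:dimmer":
            self.level -= 0.1
        elif event.startswith("app:"):
            # Starting an application snaps illumination to its preset.
            self.level = APP_PRESETS.get(event[4:], self.level)
        self._clamp()
        return self.level

ctl = IlluminationController()
ctl.handle("app:night_vision")   # preset for the running application
ctl.handle("voice:brighter")     # user override on top of the preset
```

Gesture or eye-movement inputs would feed the same `handle` entry point after being translated into events by the recognition front end.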
The projector may be connected to the display to output video and other display elements to the user. The display used may be an SVGA 800x600 dots/inch SYNDIANT liquid crystal on silicon (LCoS) display.
The target MPE dimensions for the system may be 24 mm x 12 mm x 6 mm.
The focus may be adjustable, allowing a user to refine the projector output to suit their needs.
The optics system may be contained within a housing fabricated of 6061-T6 aluminum and glass-filled ABS/PC.
In one embodiment, the weight of the system is estimated to be 3.75 ounces, or 95 grams.
In one embodiment, the eyepiece and associated electronics provide night vision capability. This night vision capability may be enabled by a black silicon SWIR sensor. Black silicon is a complementary metal-oxide-semiconductor (CMOS) processing technique that enhances the photo response of silicon over 100 times. The spectral range is expanded deep into the short-wave infrared (SWIR) wavelength range. In this technique, a 300 nm deep absorbing and anti-reflective layer is added to the glasses. This layer offers improved responsivity, as shown in Figure 11, where the responsivity of black silicon is much greater than that of silicon in the visible and NIR ranges and extends well into the SWIR range. This technology is an improvement over current technology, which suffers from extremely high cost, performance issues, and high-volume manufacturability problems. Incorporating this technology into night vision optics brings the economic advantages of CMOS technology into the design.
Unlike current night vision goggles (NVGs), which amplify starlight or other ambient light in the visible spectrum, the SWIR sensor picks up individual photons and converts light in the SWIR spectrum into an electrical signal, similar to digital photography. Photons may be generated by the natural recombination of oxygen and hydrogen atoms in the night atmosphere (also known as "nightglow"). Short-wave infrared devices see objects at night by detecting invisible short-wave infrared radiation in reflected starlight, city lights, or moonlight. They also work in daytime, or through fog, haze, or smoke, where the infrared sensors of current NVG image intensifiers would be overwhelmed by heat or brightness. Because short-wave infrared devices pick up invisible radiation at the edge of the visible spectrum, SWIR images look like images produced by visible light, with the same shadows, contrast, and facial detail, only in black and white, sharply enhancing identification, so a person looks like a person and not the blob commonly seen with thermal imagers. One important SWIR capability is providing a battlefield view of targeting lasers. Targeting lasers (1.064 µm) are invisible with current night vision goggles. With SWIR electro-optical devices, a soldier will see every targeting laser in use, including those operated by enemy forces. Unlike thermal imagers, which do not see through windows on vehicles or buildings, visible/near-infrared/short-wave infrared sensors see through them, whether by day or by night, giving users an important tactical advantage.
Certain advantages include using active illumination only when needed. In some instances there may be sufficient natural illumination at night, such as during a full moon. When such is the case, artificial night vision using active illumination may not be necessary. With black silicon CMOS-based SWIR sensors, active illumination may not be needed during these conditions and is not provided, thereby improving battery life.
In addition, a black silicon image sensor may have over eight times the signal-to-noise ratio found in costly InGaAs sensors under night sky conditions. This technology also provides better resolution, delivering much higher resolution than is available with current night vision technology. Typically, long-wavelength images produced by CMOS-based SWIR have been difficult to interpret, having good heat detection but poor resolution. This problem is solved with a black silicon SWIR sensor, which relies on much shorter wavelengths. For these reasons, SWIR is highly desirable for battlefield night vision goggles. Figure 12 illustrates the effectiveness of black silicon night vision technology, providing images taken before and after viewing through a) dust, b) fog, and c) smoke. The images in Figure 12 demonstrate the performance of the new VIS/NIR/SWIR black silicon sensor. In embodiments, the image sensor may distinguish changes in a natural environment, such as disturbed vegetation, disturbed ground, and the like. For example, enemy combatants may have recently placed an explosive device in the ground, so the ground over the explosive would be "disturbed ground," and the image sensor (along with processing facilities internal or external to the eyepiece) may distinguish recently disturbed ground from the surrounding ground. In this way, a soldier may detect the placement of buried explosive devices (e.g., improvised explosive devices (IEDs)) at a distance.
Previous night vision systems suffered from "blooms" from bright light sources, such as streetlights. These "blooms" were particularly acute in image intensifying technology and were also associated with a loss of resolution. In some cases, cooling systems were necessary in image intensifying technology systems, increasing weight and shortening battery life. Figure 17 shows the difference in image quality between A) a flexible platform of uncooled CMOS image sensors capable of VIS/NIR/SWIR imaging and B) an image-intensified night vision system.
Figure 13 depicts the structural difference between current or incumbent vision enhancement technology 1300 and uncooled CMOS image sensors 1307. The incumbent platform (Figure 13A) limits deployment because of cost, weight, power consumption, spectral range, and reliability issues. Incumbent systems are typically comprised of a front lens 1301, photocathode 1302, micro-channel plate 1303, high-voltage power supply 1304, phosphorous screen 1305, and eyepiece 1306. This is in contrast to a flexible platform (Figure 13B) of uncooled CMOS image sensors 1307 capable of VIS/NIR/SWIR imaging at a fraction of the cost, power consumption, and weight. These much simpler sensors include a front lens 1308 and an image sensor 1309 with a digital image output.
These advantages derive from the CMOS-compatible processing technique, which enhances the photo response of silicon over 100 times and extends the spectral range deep into the short-wave infrared region. The difference in responsivity is illustrated in Figure 13C. While typical night vision goggles are limited to the UV, visible, and near-infrared (NIR) ranges, to about 1100 nm (1.1 µm), the newer CMOS image sensor range also includes the short-wave infrared (SWIR) spectrum, extending out to as much as 2000 nm (2 µm).
The black silicon core technology may offer a significant improvement over current night vision goggles. Femtosecond laser doping may enhance the light detection properties of silicon across a broad spectrum. Additionally, optical response may be improved by 100 to 10,000 times. Compared to current night vision systems, the black silicon technology is a fast, scalable, and CMOS-compatible technology at a very low cost. The black silicon technology also offers a low operating bias, typically 3.3 V. In addition, uncooled performance may be possible up to 50 °C. The cooling requirements of current technology increase both weight and power consumption and also create discomfort for users. As noted above, the black silicon core technology offers a high-resolution replacement for current image intensifier technology. The black silicon core technology may provide high-speed electronic shuttering at speeds of up to 1000 frames/second with minimal cross-talk. In certain embodiments of the night vision eyepiece, an OLED display may be preferred over other optical displays, such as an LCoS display.
A housed VIS/NIR/SWIR black silicon sensor may provide improved situational awareness (SAAS) monitoring and real-time image enhancement.
In some embodiments, the VIS/NIR/SWIR black silicon sensor may be incorporated into form factors suited only to night vision, such as night vision goggles or a night vision helmet. The night vision goggles may include features adapted to the military market, such as a rugged and alternative form of power supply, while other form factors may be adapted to consumer or toy markets. In one example, the night vision goggles may have an extended range, such as 500-1200 nm, and may also function as a camera.
In some embodiments, the VIS/NIR/SWIR black silicon sensor and other external sensors may be incorporated into cameras mounted on a transport or combat vehicle so that video may be sent by real-time feed to the driver or other occupants of the vehicle by superimposing it, unobstructed, on the forward view. The driver can better see where he or she is going, a gunner can better see a sudden threat or target, and a navigator can better maintain situational awareness (SAAS) while also searching for threats. The feed may also be sent on demand to off-site locations, such as a higher-echelon headquarters, or to memory/storage locations for later use in targeting, navigation, surveillance, data mining, and the like.
A further advantage of the eyepiece may include robust connectivity. This connectivity enables download and transmission using Bluetooth, Wi-Fi/internet, cellular, satellite, 3G, FM/AM, TV, and UWB transceivers for rapidly sending and receiving large amounts of data. For example, a UWB transceiver may be used to create a very high data rate, low probability of intercept/low probability of detection (LPI/LPD) wireless personal area network (WPAN) to connect weapon sights, weapon-mounted mouse/controllers, E/O sensors, medical sensors, audio/visual displays, and the like. In other embodiments, other communications protocols may be used to create the WPAN; for example, the WPAN transceiver may be a COTS-compliant modular front end with power management that keeps combat radios highly responsive and avoids compromising the robustness of the radio. By integrating an ultra-wideband (UWB) transceiver, baseband/MAC, and encryption chip in one module, a physically small, dynamic, and configurable transceiver is obtained that addresses a variety of operational requirements. The WPAN transceiver creates a low-power, encrypted wireless personal area network (WPAN) among the equipment a soldier wears. The WPAN transceiver may be attached to, or embedded in, virtually any battlefield military device with a network interface (handheld computer, combat display, etc.). The system can support many users and AES encryption, offers RF robustness against man-made interference and jamming, and is preferable for combat, providing low probability of intercept and detection (LPI/LPD). The WPAN transceiver eliminates the bulk, weight, and "snag hazard" of data cables on the soldier. Interfaces include USB 1.1, USB 2.0 OTG, Ethernet 10/100Base-T, and RS232 9-pin D-Sub. For a variable range of up to 2 m, the power output may be -10 to -20 dBm. The data capacity may be 768 Mbps and greater. The bandwidth may be 1.7 GHz. Encryption may be 128-, 192-, or 256-bit AES. The WPAN transceiver may include optimized message authentication code (MAC) generation. The WPAN transceiver may conform to MIL-STD-461F. The WPAN transceiver may take the form of a connector dust cap and may be attachable to any battlefield military device. The WPAN transceiver allows simultaneous video, voice, still photos, text, and chat, eliminates the need for data cables between electronic devices, and allows multiple devices to be operated hands-free and without distraction, characterized by an adjustable connectivity range, Ethernet and USB 2.0 interfaces, an adjustable frequency of 3.1 to 10.6 GHz, 200 mW peak power consumption, and nominal standby.
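The message authentication code (MAC) generation mentioned above can be illustrated with a minimal sketch. The specification does not describe the transceiver's actual MAC construction, so the sketch below uses HMAC-SHA256 from Python's standard library as a stand-in for an AES-based MAC; the key size, frame contents, and function names are assumptions for illustration only.

```python
import hashlib
import hmac
import os

def make_mac(key: bytes, frame: bytes) -> bytes:
    """Tag a WPAN frame so the receiver can detect tampering.
    HMAC-SHA256 stands in here for the transceiver's AES-based MAC."""
    return hmac.new(key, frame, hashlib.sha256).digest()

def verify(key: bytes, frame: bytes, tag: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(make_mac(key, frame), tag)

key = os.urandom(32)                    # 256-bit shared network key
frame = b"sensor-id=17;reading=ok"      # hypothetical WPAN payload
tag = make_mac(key, frame)
assert verify(key, frame, tag)          # unmodified frame accepted
assert not verify(key, frame + b"x", tag)  # any modification is caught
```

The design point the sketch captures is that authentication is separate from encryption: even on an encrypted link, a per-frame tag is what lets a receiver reject forged or corrupted frames.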
For example, the WPAN transceiver allows a WPAN to be created among the eyepiece 100 in the form of GSE stereo head-up display glasses, a combat head-up computer, a remote computer set controller, and a biometric enrollment device as seen in Figure 58. In another example, the WPAN transceiver allows a WPAN to be created among combat eyewear with a flip-up/flip-down head-up display, a HUD CPU (which is external), a weapon fore-grip controller, and a forearm-mounted computer similar to that shown in Figure 58.
The eyepiece may provide its own cellular connectivity, such as through a personal wireless connection with a cellular system. The personal wireless connection may be available only to the wearer of the eyepiece, or it may be available to a plurality of proximate users, such as in a Wi-Fi hotspot, where the eyepiece provides a local hotspot for others to utilize. These proximate users may be other wearers of the eyepiece or users of some other wireless computing device, such as a mobile communications facility (e.g. a mobile phone). Through this personal wireless connection, the wearer may not need other cellular or internet wireless connections to connect to wireless services. For instance, without a personal wireless connection integrated into the eyepiece, the wearer might have to find a WiFi connection point or tether to their mobile communications facility in order to establish a wireless connection. In embodiments, the eyepiece may replace the need for a separate mobile communications device (such as a mobile phone or mobile computer) by integrating these functions and user interfaces into the eyepiece. For instance, the eyepiece may have an integrated WiFi connection or hotspot, a real or virtual keyboard interface, a USB hub, speakers (e.g. to stream music to) or speaker input connections, an integrated camera, an external camera, and the like. In embodiments, an external device in connection with the eyepiece may provide a single unit with a personal network connection (e.g. WiFi, cellular connection), a keyboard, and a control pad (e.g. a touch pad).
Communications from the eyepiece may include communications links utilized for special purposes. For example, an ultra-wideband communications link may be used to send and/or receive large amounts of data in a short period of time. In another example, a near-field communications (NFC) link with a very limited transmission range may be used to pass information to an individual only when that individual is in close proximity, such as for tactical reasons, for local directions, for warnings, and the like. For instance, a soldier may securely transmit/hold information that is transferred only to people in close proximity who need to know, or need to use, the information. In another example, a wireless personal area network (PAN) may be used to connect, for example, weapon-sight-mounted mouse/controllers, electro-optic sensors, medical sensors, audio-visual displays, and the like.
The eyepiece may include MEMS-based inertial navigation systems, such as a GPS processor, an accelerometer (e.g. for enabling head control of the system and other functions), a gyroscope, an altimeter, an inclinometer, a speedometer/odometer, a laser rangefinder, and a magnetometer, which also enable image stabilization.
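A common way such MEMS sensors are fused for head tracking and image stabilization is a complementary filter, which trusts the integrated gyroscope in the short term and the accelerometer's gravity-derived angle in the long term. The sketch below is a minimal single-axis illustration under that assumption; the specification does not state which fusion algorithm the eyepiece uses, and the blend factor and rates here are illustrative.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step for a head-tilt estimate (degrees).
    The gyro term tracks fast motion without lag; the accelerometer
    term slowly corrects the gyro's drift toward the gravity reference."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
# Head held still at 10 degrees of tilt: the gyro reads ~0 deg/s while
# the accelerometer reads 10 degrees; the estimate converges to 10.
for _ in range(500):           # 5 seconds at a 100 Hz update rate
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=10.0, dt=0.01)
print(f"converged tilt estimate: {angle:.3f} degrees")
```

The same corrected angle stream can then drive display re-projection so that projected content stays registered to the world as the head moves.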
The eyepiece may include integrated headphones, such as the articulating earbud 120, that provide audio output to the user or wearer.
In one embodiment, a forward-facing camera integrated with the eyepiece (see Figure 21) may enable basic augmented reality. In augmented reality, a viewer can image what is being viewed and then layer an augmented, edited, tagged, or analyzed version on top of the basic view. In the alternative, associated data may be displayed with or over the basic image. If two cameras are provided and mounted at the correct interpupillary distance for the user, stereo video imagery may be created. This capability may be useful for persons requiring vision assistance. Many people suffer from deficiencies in their vision, such as near-sightedness, far-sightedness, and so forth. A camera and a very close, virtual screen as described herein provide a "video" for such persons, the video adjustable in focal point (nearer or farther) and fully under the control of the person via voice or other commands. This capability may also be useful for persons suffering from diseases of the eye, such as cataracts, retinitis pigmentosa, and the like. So long as some organic vision capability remains, an augmented reality eyepiece can help a person see more clearly. Embodiments of the eyepiece may feature one or more of: magnification, increased brightness, and the ability to map content to the areas of the eye that are still healthy. Embodiments of the eyepiece may be used as bifocals or a magnifying glass. The wearer may be able to increase zoom in the field of view or increase zoom within a partial field of view. In an embodiment, an associated camera may make an image of an object and then present the user with a zoomed picture. A user interface may allow a wearer to point at the area that he wants zoomed, such as with the control techniques described herein, so that the image processing can stay on task, as opposed to merely zooming in on everything in the camera's field of view.
In a further embodiment, a rear-facing camera (not shown) may also be incorporated into the eyepiece. In this embodiment, the rear-facing camera may enable eye control of the eyepiece, with the user making application or feature selections by directing his or her eyes to a particular item displayed on the eyepiece.
A further embodiment of a device for capturing biometric data about individuals may incorporate a microcassegrain telescoping folded optic camera into the device. The microcassegrain telescoping folded optic camera may be mounted on a handheld device, such as a bio-print device or a bio-phone, and may also be mounted on glasses used as part of a bio-kit to collect biometric data.
A cassegrain reflector is a combination of a primary concave mirror and a secondary convex mirror. These reflectors are often used in optical telescopes and radio antennas because they deliver good light (or sound) collecting capability in a shorter, smaller package.
In a symmetrical cassegrain, both mirrors are aligned about the optical axis, and the primary mirror usually contains a hole in the center, allowing light to reach the eyepiece or a camera chip or light detection device, such as a CCD chip. An alternate design, often used in radio telescopes, places the final focus in front of the primary reflector. A further alternate design may tilt the mirrors to avoid obstructing the primary or secondary mirror, and may eliminate the need for a hole in the primary or secondary mirror. The microcassegrain telescoping folded optic camera may use any of the above variations, with the final selection determined by the desired size of the optic device.
The classic cassegrain configuration 3500 uses a parabolic reflector as the primary mirror and a hyperbolic mirror as the secondary mirror. Further embodiments of the microcassegrain telescoping folded optic camera may use a hyperbolic primary mirror and/or a spherical or elliptical secondary mirror. In operation, the classic cassegrain, with a parabolic primary mirror and a hyperbolic secondary mirror, reflects the light back down through a hole in the primary, as shown in Figure 35. Folding the optical path makes the design more compact and, in a micro size, suitable for use with the bio-print sensor and bio-print kit described herein. In a folded optic system, the beam is bent to make the optical path much longer than the physical length of the system. One common example of folded optics is prismatic binoculars. In a camera lens, the secondary mirror may be mounted on an optically flat, optically clear glass plate that closes the lens tube. This support eliminates "star-shaped" diffraction effects caused by a straight-vaned support spider. This allows for a sealed, closed tube and protects the primary mirror, albeit at some loss of light collecting power.
The cassegrain design also makes use of the special properties of parabolic and hyperbolic reflectors. A concave parabolic reflector reflects all incoming light rays parallel to its axis of symmetry to a single focal point. A convex hyperbolic reflector has two foci and reflects all light rays directed at one focal point toward the other focal point. The mirrors in this type of lens are designed and positioned to share one focal point, placing the second focal point of the hyperbolic mirror at the same point as where the image is to be observed, usually just outside the eyepiece. The parabolic mirror reflects parallel light rays entering the lens to its focus, which is coincident with the focus of the hyperbolic mirror. The hyperbolic mirror then reflects those light rays to its other focal point, where the camera records the image.
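The compactness benefit of the two-mirror arrangement can be made concrete with the standard thin-element combination formula for two optical elements separated by a distance d: 1/F = 1/f1 + 1/f2 - d/(f1·f2). The sketch below applies it to a cassegrain-like pair (concave primary, convex secondary); the focal lengths and separation are illustrative numbers, not dimensions from this specification.

```python
def effective_focal_length(f_primary, f_secondary, separation):
    """Two-mirror combination via the thin-element formula:
        1/F = 1/f1 + 1/f2 - d/(f1*f2)
    Sign convention: concave primary f1 > 0, convex secondary f2 < 0."""
    inv = (1.0 / f_primary
           + 1.0 / f_secondary
           - separation / (f_primary * f_secondary))
    return 1.0 / inv

# Illustrative dimensions in millimeters:
F = effective_focal_length(f_primary=100.0, f_secondary=-25.0, separation=80.0)
print(f"effective focal length ~ {F:.0f} mm from a tube only ~80 mm long")
```

With these assumed numbers the pair behaves like a 500 mm lens folded into a tube roughly 80 mm long, which is precisely why the folded design suits a micro-sized biometric camera.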
Figure 36 shows the configuration of the microcassegrain telescoping folded optic camera. The camera may be mounted on augmented reality glasses, a bio-phone, or another biometric collection device. The assembly 3600 has multiple telescoping segments that allow the camera to extend with cassegrain optics, providing a longer optical path. Threads 3602 allow the camera to be mounted on a device, such as augmented reality glasses or another biometric collection device. While the embodiment depicted in Figure 36 uses threads, other mounting schemes, such as a bayonet mount, knobs, or a press-fit, may also be used. A first telescoping section 3604 also serves as an external housing when the lens is in the fully retracted position. The camera may also incorporate a motor to drive the extension and retraction of the camera. A second telescoping section 3606 may also be included. Other embodiments may incorporate varying numbers of telescoping sections, depending on the length of optical path needed for the selected task or the data to be collected. A third telescoping section 3608 includes the lens and a reflecting mirror. The reflecting mirror may be the primary reflecting mirror if the camera is designed following the classic cassegrain design. The secondary mirror may be contained in the first telescoping section 3604.
Further embodiments may use micro mirrors to form the camera, while still providing a longer optical path through the use of folded optics. The same design principles as the cassegrain are used.
A lens 3610 provides optics for use with the folded optics of the cassegrain design. The lens 3610 may be selected from a variety of types and may vary depending on the application. The threads 3602 permit a variety of cameras to be interchanged according to the needs of the user.
Eye control of feature and option selection may be controlled and activated by object recognition software loaded on the system processor. Object recognition software may enable augmented reality, combine the recognition output with querying a database, and combine the recognition output with a computational tool to determine dependencies/likelihoods.
Three-dimensional viewing is also possible in additional embodiments that incorporate a 3D projector. Two stacked picoprojectors (not shown) may be used to create the three-dimensional image output.
With reference to Figure 10, a plurality of digital CMOS sensors with redundancy (each sensor array and projector with its own microprocessor and DSP) detect visible light, near-infrared light, and short-wave infrared light to enable passive day and night operation, such as real-time image enhancement 1002, real-time keystone correction 1004, and real-time virtual perspective correction 1008. The eyepiece may utilize digital CMOS image sensors and directional microphones as described herein (e.g. a microphone array), such as for visible imaging to monitor the visible scene (e.g. biometric recognition, gesture control, coordinated imaging with 2D/3D projected maps), IR/UV imaging for scene enhancement (e.g. seeing through haze, smoke, or darkness), and audio direction sensing (e.g. the direction of gunfire or an explosion, voice detection). In embodiments, each of these sensor inputs may be fed to a digital signal processor (DSP) for processing, such as a DSP internal to the eyepiece or one interfaced with an external processing facility. The outputs of the DSP processing of each of the sensor input streams may then be algorithmically combined in a manner that generates useful informational data. For example, the system may be useful for the combination of real-time face recognition, real-time voice detection, and analysis through links to databases, especially with distortion correction, together with GPS positioning of soldiers, maintenance personnel, and the like, such as when monitoring a remote region of interest, for example a known path or trail, or a high-security area. In an embodiment, an audio direction sensor input to the DSP may be processed to produce one or more visual, audible, or vibrational cues to the user of the eyepiece indicating the direction of a sound. For example, if the sound of a loud explosion or of gunfire cannot be heard through hearing protection meant to protect the soldier's hearing, or if an explosion is so loud that the soldier cannot tell where it came from and their ears may now be ringing so loudly that they cannot hear anything at all, an audible, visual, or vibrational cue may be used to indicate to the operator the direction of the original threat.
The augmented reality eyepiece or glasses may be powered by any energy storage or supply system, such as battery power, solar power, line power, and the like. Solar collectors may be placed on the frame, on a belt clip, and so forth. Battery charging may take place using a wall charger, a car charger, on a belt clip, in an eyeglass case, and the like. In one embodiment, the eyepiece may be rechargeable and be equipped with a mini-USB connector for recharging. In another embodiment, the eyepiece may be equipped for remote inductive recharging by one or more remote inductive power conversion technologies, such as those provided by Powercast of Ligonier, Pennsylvania, USA, and by Fulton Int'l. Inc. of Ada, Michigan, USA, which also owns another provider, Splashpower, Inc. of Cambridge, UK.
The augmented reality eyepiece also includes a camera and any interface circuitry needed to connect the camera. The output of the camera may be stored in memory and may also be displayed on the display available to the wearer of the glasses. A display driver may also be used to control the display. The augmented reality device also includes a power source, such as the battery shown, power management circuitry, and circuitry for recharging the power source. As noted elsewhere, recharging may take place through a hard-wired connection (e.g. a mini-USB connector), or through an inductive charger, solar panel input, and so on.
The control system of the eyepiece or glasses may include a control algorithm for conserving power when a power source, such as a battery, indicates low charge. The conservation algorithm may include shutting down power to energy-intensive applications, such as lighting, the camera, or sensors that require high levels of energy, for example any sensor requiring a heater. Other conservation steps may include slowing down the power used for a sensor or a camera, for example lowering the sampling or frame rate, moving to an even lower sampling or frame rate when power is low, or shutting down the sensor or camera at a still lower level. Thus, there may be at least three operating modes depending on the available power: a normal mode, a power-saving mode, and an emergency or shutdown mode.
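The three operating modes described above can be sketched as a simple battery-driven policy. The threshold percentages, feature names, and frame-rate divisor below are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative sketch of the three-mode power-conservation policy.
# Thresholds and the set of energy-hungry features are assumptions.

NORMAL, POWER_SAVE, EMERGENCY = "normal", "power_save", "emergency"

def select_mode(battery_pct, save_threshold=30, emergency_threshold=10):
    """Pick an operating mode from the remaining battery percentage."""
    if battery_pct <= emergency_threshold:
        return EMERGENCY
    if battery_pct <= save_threshold:
        return POWER_SAVE
    return NORMAL

def apply_policy(mode, base_frame_rate=30):
    """Return the feature set and sensor frame rate for a mode."""
    if mode == NORMAL:
        return {"lighting": True, "camera": True, "frame_rate": base_frame_rate}
    if mode == POWER_SAVE:
        # Shut down energy-sensitive applications; slow the frame rate.
        return {"lighting": False, "camera": True, "frame_rate": base_frame_rate // 4}
    # Emergency/shutdown mode: close sensors and cameras entirely.
    return {"lighting": False, "camera": False, "frame_rate": 0}
```

With these assumed thresholds, a battery reading of 25% selects the power-saving mode, which keeps the camera running at a quarter of the normal frame rate.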
Applications of the present disclosure may be controlled through movements and direct actions of the wearer, such as movement of his or her hand, finger, foot, head, eyes, and the like, enabled through facilities of the eyepiece (e.g. accelerometers, gyroscopes, cameras, optical sensors, GPS sensors) and/or through facilities worn by or mounted on the wearer (e.g. body-mounted sensor control facilities). In this way, the wearer may directly control the eyepiece through movements and/or actions of their body without the use of a traditional hand-held remote control. For instance, the wearer may have a sense device, such as a position sense device, mounted on one or both hands (such as on at least one finger, on the palm, on the back of the hand, and the like), where the position sense device provides position data of the hand and provides wireless communication of the position data to the eyepiece as command information. In embodiments, the sense devices of the present disclosure may include a gyroscopic device (e.g. electronic gyroscope, MEMS gyroscope, mechanical gyroscope, quantum gyroscope, ring laser gyroscope, fiber-optic gyroscope), accelerometer, MEMS accelerometer, velocity sensor, force sensor, pressure sensor, optical sensor, proximity sensor, RFID, and the like, in the provision of positional information. For example, the wearer may have a position sense device mounted on their right index finger, where the device is able to sense motion of the finger. In this example, the user may activate the eyepiece either through some switching mechanism on the eyepiece or through some predetermined motion sequence of the finger, such as moving the finger quickly, tapping the finger against a hard surface, and the like. Note that tapping against a hard surface may be interpreted through sensing by an accelerometer, force sensor, pressure sensor, and the like. The position sense device may then transmit motions of the finger as command information, such as moving the finger in the air to move a cursor across the displayed or projected image, moving in quick motions to indicate a selection, and so forth. In embodiments, the position sense device may send sensed command information directly to the eyepiece for command processing, or the command processing circuitry may be co-located with the position sense device, such as in this example as part of an assembly mounted on the finger that includes the position sense device. The command information may be accompanied by a visual indicator. For instance, a cursor may change color when interacting with different content. For example, so that you know where your finger is when using a peripheral to control the glasses, a visual indication of the command information may be realized in the glasses.
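One way a finger-mounted sense device might distinguish the deliberate hard-surface tap described above from ordinary hand motion is a magnitude-threshold detector over accelerometer samples. This is only a sketch; the 2.5 g threshold, the refractory window, and the sample values are assumptions.

```python
def detect_taps(accel_samples, threshold=2.5, refractory=3):
    """Return indices of samples whose acceleration magnitude (in g)
    exceeds the threshold, skipping a refractory window after each tap
    so one physical tap is not counted more than once."""
    taps, skip_until = [], -1
    for i, g in enumerate(accel_samples):
        if i <= skip_until:
            continue
        if abs(g) >= threshold:
            taps.append(i)
            skip_until = i + refractory
    return taps

# A quick double-tap (two spikes) amid low-level hand motion:
samples = [0.1, 0.2, 3.1, 2.9, 0.3, 0.1, 3.4, 0.2, 0.1]
assert detect_taps(samples) == [2, 6]
```

A detector like this could run in the command processing circuitry co-located with the position sense device, emitting a command event per detected tap.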
In embodiments, the wearer may have a plurality of sense devices mounted on their body. For example, and continuing the preceding example, the wearer may have position sense devices mounted at a plurality of points on the hand, such as with individual sensors on different fingers, or as a collection of devices, such as in a glove. In this way, the aggregate command information from the collection of sensors at different locations on the hand may be used to provide more complex command information. For instance, in the use of the present disclosure in simulations and simulated play, the wearer may use a sensor-device glove to play a game, where the glove senses the grasp of the user's hand on a ball, bat, racket, and the like. In embodiments, the plurality of sense devices may be mounted on different parts of the body, allowing the wearer to transmit complex motions of the body to the eyepiece for use by an application.
In embodiments, the sense device may have a force sensor, pressure sensor, and the like, such as for detecting when the sense device comes into contact with an object. For instance, a sense device may include a pressure sensor at the tip of the wearer's finger. In this case, the wearer may tap, multiple-tap, swipe, touch, and the like, to generate a command to the eyepiece. Force sensors may also be used to indicate degrees of touch, grip, push, and the like, where predetermined or learned thresholds determine different command information. In this way, commands may be delivered as a series of continuously updated commands through a stream of eyepiece command information used by an application. In an example, the wearer may be running a simulation, such as a game application, military application, business application, and the like, where movements and contact with objects (such as through at least one of a plurality of sense devices) are fed to the eyepiece as commands that influence the simulation displayed through the eyepiece. For instance, a sense device may be included in a controller, where the controller may have a force sensor, pressure sensor, inertial measurement unit, and the like, and where the controller may be used to generate a display of virtual writing, control a cursor associated with the display of the eyepiece, act as a computer mouse, provide control commands through physical motion and/or contact, and so forth.
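The idea of predetermined or learned force thresholds determining different command information can be sketched as a lookup over ordered pressure bands. The band boundaries (arbitrary sensor units) and the command names below are illustrative assumptions.

```python
# Map a fingertip pressure reading to a command via ordered thresholds.
# Boundaries and command names are assumed for illustration only.
PRESSURE_BANDS = [
    (0.2, None),             # below 0.2: no contact, no command
    (1.0, "touch"),          # light contact
    (3.0, "hold"),           # firm grip
    (float("inf"), "push"),  # hard press
]

def pressure_to_command(reading):
    """Return the command for a pressure reading, or None for no contact."""
    prev = 0.0
    for upper, command in PRESSURE_BANDS:
        if prev <= reading < upper:
            return command
        prev = upper
    return None
```

In a learned-threshold variant, the band boundaries would be fitted to the individual wearer's typical touch and grip forces rather than fixed in advance.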
In embodiments, the sense device may include an optical sensor or optical transmitter as a way for movement to be interpreted as a command. For instance, a sense device may include an optical sensor mounted on the hand of the wearer, and the eyepiece housing may include an optical transmitter, such that when the user moves their hand past the optical transmitter on the eyepiece, the movement may be interpreted as a command. Motions detected through the optical sensor may include swipes past at different speeds, with repeated motions, with combinations of dwelling and movement, and the like. In embodiments, optical sensors and/or transmitters may be located on the eyepiece, mounted on the wearer (e.g. on the hand, on the foot, in a glove, on a piece of clothing), or used in combinations between different areas on the wearer and on the eyepiece.
In one embodiment, a number of sensors useful for monitoring the condition of the wearer, or of a person in proximity to the wearer, are mounted within the augmented reality glasses. Sensors have become much smaller thanks to advances in electronics technology. Signal transducing and signal processing technologies have also made major progress in the direction of size reduction and digitization. Accordingly, it is possible to have not merely a temperature sensor in the AR glasses, but an entire sensor array. As noted, these sensors may include a temperature sensor, and also sensors to detect: pulse rate; beat-to-beat heart variability; EKG or ECG; respiration rate; core body temperature; heat flow from the body; galvanic skin response, or GSR; EMG; EEG; EOG; blood pressure; body fat; hydration level; activity level; oxygen consumption; glucose or blood sugar level; body position; and UV radiation exposure or absorption. In addition, there may also be a retinal sensor and a blood oxygenation sensor (such as an SpO2 sensor), among others. Such sensors are available from a variety of manufacturers, including Vermed of Bellows Falls, Vermont, USA; VTI of Vantaa, Finland; and ServoFlow of Lexington, Massachusetts, USA.
In some embodiments, it may be more useful for sensors to be mounted on the person, or on equipment of the person, rather than on the glasses themselves. For example, accelerometers, motion sensors, and vibration sensors may be usefully mounted on the person, on clothing of the person, or on equipment worn by the person. These sensors may maintain continuous or periodic contact with the controller of the AR glasses through a Bluetooth radio transmitter or other radio device adhering to the IEEE 802.11 specifications. For example, if a physician wishes to monitor motion or shock experienced by a patient during a foot race, the sensors may be more useful if mounted directly on the person's skin, or even on a T-shirt worn by the person, rather than mounted on the glasses. In these cases, a more accurate reading may be obtained by a sensor placed on the person or on the clothing rather than on the glasses. Such sensors need not be as tiny as sensors suitable for mounting on the glasses, and, as will be seen, may be more useful.
The AR glasses or goggles may also include environmental sensors or sensor arrays. These sensors are mounted on the glasses and sample the atmosphere or air in the vicinity of the wearer. These sensors or sensor arrays may be sensitive to certain substances or concentrations of substances. For example, sensors and arrays are available to measure carbon monoxide, oxides of nitrogen ("NOx"), temperature, relative humidity, noise level, volatile organic chemicals (VOC), ozone, particulates, hydrogen sulfide, barometric pressure, and ultraviolet light and its intensity. Vendors and manufacturers include Sensares of Crolles, France; CairPol of Ales, France; Critical Environmental Technologies of Canada of Delta, British Columbia, Canada; Apollo Electronics Technology Co., Ltd. of Shenzhen, China; and AV Technology Ltd. of Stockport, Cheshire, UK. Many other sensors are well known. These sensors may also be useful if mounted on the person, or on clothing or equipment of the person. These environmental sensors may include radiation sensors, chemical sensors, poisonous gas sensors, and the like.
In one embodiment, environmental sensors, health monitoring sensors, or both, are mounted on the frame of the augmented reality glasses. In another embodiment, the sensors may be mounted on the person, or on clothing or equipment of the person. For example, a sensor for measuring electrical activity of the heart of the wearer may be implanted, with suitable accessories for transducing and transmitting a signal indicative of the person's heart activity.
The signal may be transmitted over a very short distance via a Bluetooth® radio transmitter or other radio device adhering to the IEEE 802.15.1 specifications. Other frequencies or protocols may be used instead. The signal may then be processed by the signal monitoring and processing equipment of the augmented reality glasses, recorded, and displayed on the virtual screen available to the wearer. In another embodiment, the signal may also be sent via the AR glasses to a friend or to a squad leader of the wearer. Thus, the health and well-being of the person may be monitored by the person and by others, and may also be tracked over time.
In another embodiment, the environmental sensors may be mounted on the person or on equipment of the person. For example, radiation or chemical sensors may be more useful if worn on an outer coat or on a waistband of the person, rather than mounted directly on the glasses. As noted above, the signals from the sensors may be monitored locally by the person through the AR glasses. The sensor readings may also be transmitted elsewhere, either on demand or automatically, perhaps at set intervals, such as every quarter-hour or every half-hour. Thus, a history of sensor readings, whether of the person's bodily readings or of the environment, may be made available for tracking or trending purposes.
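The interval-based forwarding of sensor readings, with a retained history for trending, might be sketched as follows. The 30-minute interval and the timestamped-pair reading format are illustrative assumptions.

```python
class SensorHistory:
    """Retain timestamped sensor readings and forward the unsent ones
    either on demand or at a set interval (e.g. every half-hour)."""
    def __init__(self, interval_s=1800):
        self.interval_s = interval_s
        self.history = []        # (timestamp_s, reading) pairs
        self.sent_count = 0      # how many readings have been forwarded
        self.last_sent = None

    def record(self, timestamp, reading):
        self.history.append((timestamp, reading))

    def due(self, now):
        """True if the forwarding interval has elapsed (or never sent)."""
        return self.last_sent is None or now - self.last_sent >= self.interval_s

    def flush(self, now):
        """Return unsent readings when the interval has elapsed, else []."""
        if not self.due(now):
            return []
        pending = self.history[self.sent_count:]
        self.sent_count = len(self.history)
        self.last_sent = now
        return pending
```

The full `history` list remains available locally for trend display in the glasses even after readings are forwarded.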
In an embodiment, an RF/micropower impulse radio (MIR) sensor may be associated with the eyepiece and serve as a short-range medical radar. The sensor may operate on an ultra-wide band. The sensor may include an RF/impulse generator, receiver, and signal processor, and may be useful for detecting and measuring cardiac signals by measuring the ion flow in cardiac cells within 3 mm of the skin. The receiver may be a phased-array antenna to enable determining the location of the signal in a region of space. The sensor may be used to detect and identify cardiac signals through blockages, such as walls, water, concrete, dirt, metal, wood, and the like. For example, a user may be able to use the sensor to determine how many people are located within a concrete structure by detecting how many heart rates are present. In another embodiment, a detected heart rate may serve as a unique identifier for a person so that they may be recognized in the future. In an embodiment, the RF/impulse generator may be embedded in one device, such as the eyepiece or some other device, while the receiver is embedded in a different device, such as another eyepiece or device. In this way, a virtual "tripwire" may be created when a heart rate is detected between the transmitter and receiver. In an embodiment, the sensor may be used as an in-field diagnostic or self-diagnosis tool. EKGs may be analyzed and stored for future use as a biometric identifier. The user may receive alerts of sensed heart rate signals, and of how many heart rates are present, as displayed content in the eyepiece.
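The occupant-counting and virtual-tripwire behaviors described above can be sketched by clustering the cardiac rates a receiver reports. The 3 bpm clustering tolerance and the input format are illustrative assumptions, not details of the MIR sensor itself.

```python
def count_occupants(detected_bpms, tolerance=3):
    """Estimate how many people are present by clustering detected
    cardiac rates (beats per minute): rates within `tolerance` bpm of
    the last rate in an existing cluster are treated as one heart."""
    clusters = []
    for bpm in sorted(detected_bpms):
        if clusters and bpm - clusters[-1][-1] <= tolerance:
            clusters[-1].append(bpm)
        else:
            clusters.append([bpm])
    return len(clusters)

def tripwire_alert(detected_bpms):
    """Virtual tripwire: alert when any heartbeat is detected between
    the transmitter and the receiver."""
    return len(detected_bpms) > 0
```

The resulting count could be rendered as displayed content in the eyepiece alongside the alert.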
Figure 29 depicts an embodiment 2900 of an augmented reality eyepiece or glasses with a variety of sensors and communication equipment. One or more environmental or health sensors are connected to a sensor interface, locally or remotely through a short-range radio circuit and an antenna, as shown. The sensor interface circuit includes all devices for detecting, amplifying, processing, and sending on or transmitting the signals detected by the sensors. The remote sensors may include, for example, an implanted heart rate monitor or other body sensor (not shown). The other sensors may include an accelerometer, an inclinometer, a temperature sensor, a sensor suitable for detecting one or more chemicals or gases, or any of the other health or environmental sensors discussed in this disclosure. The sensor interface is connected to the microprocessor or microcontroller of the augmented reality device, from which point the information gathered may be recorded in memory, such as random access memory (RAM) or permanent memory, read-only memory (ROM), as shown.
In an embodiment, a sense device enables simultaneous electric field sensing through the eyepiece. Electric field (EF) sensing is a method of proximity sensing that allows computers to detect, evaluate, and work with objects in their vicinity. Physical contact with the skin, such as a handshake with another person, or some other physical contact with a conductive or non-conductive device or object, may be sensed as a change in an electric field, and may either enable data transfer to or from the eyepiece, or terminate data transfer. For example, videos captured by the eyepiece may be stored on the eyepiece until a wearer of the eyepiece with an embedded electric field sensing transceiver touches an object and initiates data transfer from the eyepiece to a receiver. The transceiver may include a transmitter, including a transmitter circuit that induces electric fields toward the body, and a data sense circuit that distinguishes transmitting and receiving modes by detecting both transmitted and received data and outputs control signals corresponding to the two modes, enabling two-way communication. An instantaneous private network between two people may be generated with a contact such as a handshake. Data may be transferred between the eyepiece of one user and the data receiver or eyepiece of a second user. Additional security measures may be used to enhance the private network, such as facial or audio recognition, detection of eye contact, fingerprint detection, biometric entry, iris or retina tracking, and the like.
In embodiments, there may be an authentication facility associated with accessing functionality of the eyepiece, such as access to displayed or projected content, access to restricted projected content, or enabling functionality of the eyepiece itself, in whole or in part (such as through a login to access functionality of the eyepiece), and the like. Authentication may be provided through recognition of the wearer's voice, iris, retina, fingerprint, and the like, or through another biometric identifier. For example, the eyepiece or an associated controller may have an IR, ultrasonic, or capacitive touch sensor for receiving control input related to authentication or other eyepiece functions. A capacitive sensor may detect a fingerprint and launch an application or otherwise control a function of the eyepiece. Since every finger has a different fingerprint, each finger may be used to control a different eyepiece function, quick-launch a different application, or provide a different level of authentication. Capacitive sensors cannot work through gloves, but ultrasonic sensors can, and these may likewise be utilized to provide biometric authentication or control. An ultrasonic sensor useful in the eyepiece or an associated controller includes Sonavation's SonicSlide™ sensor, which uses Sonavation's SonicTouch™ technology. It works by acoustically measuring the ridges and valleys of a fingerprint, imaging the fingerprint in 256 shades of gray to distinguish the finest details in the fingerprint. The key imaging component of the SonicSlide™ sensor is a ceramic microelectromechanical system (MEMS) piezoelectric transducer array made from a ceramic composite material.
The authentication system may provide for a database of biometric inputs from a plurality of users, such that access control for use of the eyepiece may be provided based on the policies and associated access privileges entered into the database for each user. The eyepiece may provide for an authentication process. For instance, the authentication facility may sense when a user has taken off the eyepiece, and require re-authentication when the user puts it back on. This better ensures that the eyepiece provides access only to those users who are authorized, and only to those privileges for which the wearer is authorized. In an example, the authentication facility may detect the presence of a user's eye or head as the eyepiece is put on. In a first level of access, the user may only be able to access low-sensitivity items until authentication is complete. During authentication, the authentication facility may identify the user and look up their access privileges. Once these privileges have been determined, the authentication facility may then provide the appropriate access to the user. In the case of an unauthorized user being detected, the eyepiece may maintain access to low-sensitivity items only, further restrict access, deny access entirely, and the like.
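The tiered access policy described above can be sketched as follows: low-sensitivity items are reachable before authentication, full privileges are looked up per user once a biometric identifier has been matched, and removing the eyepiece invalidates the session. The user IDs and sensitivity levels below are illustrative assumptions.

```python
# Sketch of the per-user access policy; the database contents are assumed.
USER_PRIVILEGES = {
    "wearer-001": {"low", "medium", "high"},
    "wearer-002": {"low", "medium"},
}

def allowed_items(biometric_id, authenticated):
    """Return the sensitivity levels the current wearer may access."""
    if not authenticated or biometric_id not in USER_PRIVILEGES:
        # Unknown or not-yet-authenticated wearer: low-sensitivity only.
        return {"low"}
    return USER_PRIVILEGES[biometric_id]

def on_eyepiece_removed(session):
    """Taking the eyepiece off invalidates the session; re-authentication
    is required when it is put back on."""
    session["authenticated"] = False
```

Further restriction or outright denial for a detected unauthorized user would replace the `{"low"}` fallback with a narrower (or empty) set.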
In an embodiment, a receiver may be associated with an object so as to enable control of that object via touch by the wearer of the eyepiece, wherein the touch enables the transmission or execution of a command signal in the object. For example, a receiver may be associated with a car door lock. When the wearer of the eyepiece touches the car, the car door may unlock. In another example, a receiver may be embedded in a medicine bottle. When the wearer of the eyepiece touches the medicine bottle, an alarm signal may be initiated. In another example, a receiver may be associated with a wall along a sidewalk. As the wearer of the eyepiece passes or touches the wall, an advertisement may be launched in the eyepiece or on a video panel of the wall.
In an embodiment, when the wearer of the eyepiece initiates physical contact, a WiFi exchange of information with a receiver may provide an indication that the wearer is connected to an online activity, such as a game, or may provide verification of identity in an online environment. In this embodiment, a representation of the person could change color or undergo some other visual indication in response to the contact.
In embodiments, the eyepiece may include a haptic interface as in Figure 14, such as to enable haptic control of the eyepiece, as with a swipe, tap, touch, press, click, roll of a rollerball, and the like. For instance, the haptic interface 1402 may be mounted on the frame of the eyepiece 1400, such as on one arm, on both arms, on the nose bridge, on the top of the frame, on the bottom of the frame, and the like. In embodiments, the haptic interface 1402 may include controls and functions similar to a computer mouse with left and right buttons, a 2D position control pad such as described herein, and the like. For example, the haptic interface may be mounted on the eyepiece at the user's temple, acting as a "temple mouse" controller of the content projected to the user, and may include a rotary selector and an enter button mounted on the temple. In another example, the haptic interface may be one or more vibratory temple motors, which may vibrate to alert or notify the user, such as of danger to the left, danger to the right, a medical condition, and the like. The haptic interface may be mounted on a controller separate from the eyepiece, such as a worn controller, a controller carried in the hand, and the like. If there is an accelerometer in the controller, it may sense the user tapping, such as on a keyboard, on their hand (on the hand wearing the controller, or with the hand that has the controller), and the like. The wearer may then touch the haptic interface in a plurality of ways to be interpreted by the eyepiece as commands, such as by tapping once or multiple times on the interface, by brushing a finger across the interface, by pressing and holding, by pressing more than one interface at a time, and the like. In embodiments, the haptic interface may be attached to the wearer's body (e.g. their hand, arm, leg, torso, neck), to their clothing, as an attachment to their clothing, as a ring 1500, as a bracelet, as a necklace, and the like. For example, the interface may be attached on the body, such as on the back of the wrist, where touching different parts of the interface provides different command information (e.g. touching the front, the back, the center, holding for a period of time, tapping, swiping, and the like). In embodiments, the user's contact with the haptic interface may be interpreted through force, pressure, motion, and the like. For example, the haptic interface may incorporate resistive touch technology, capacitive touch technology, proportional pressure touch technology, and the like. In one example, the haptic interface may utilize discrete resistive touch technology, such as where the application calls for the interface to be simple, rugged, low-power, and the like. In another example, the interface may utilize capacitive touch technology where greater functionality is demanded of the interface (such as through motion, swiping, multi-touch, and the like). In another example, the haptic interface may utilize pressure touch technology, such as when variable-pressure commands are called for. In embodiments, any of these touch technologies, or similar touch technologies, may be used in any haptic interface described herein.
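Interpreting touches on the haptic interface as distinct commands (tap, double-tap, press-and-hold, swipe) could be sketched as a small classifier over touch events. The event encoding and the 0.5 s hold threshold are illustrative assumptions.

```python
def classify_touch(events):
    """Classify a sequence of (kind, duration_s) touch events from the
    haptic interface into a command. Event kinds ("down"/"move"/"up")
    and the 0.5 s press-and-hold threshold are assumed."""
    if not events:
        return None
    kinds = [kind for kind, _ in events]
    if kinds == ["down", "move", "up"]:
        return "swipe"
    if kinds == ["down", "up"]:
        # Distinguish a quick tap from a press-and-hold by down duration.
        return "press_hold" if events[0][1] >= 0.5 else "single_tap"
    if kinds == ["down", "up", "down", "up"]:
        return "double_tap"
    return None
```

Each classified command could then be dispatched to whatever eyepiece function (selection, scrolling, application launch) the active context maps it to.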
In an embodiment, a hand-held accessory may be used to control a virtual keyboard for input to the eyepiece. For example, if the hand-held device has a touch screen, the user may interact with the touch screen, or with an on-screen keyboard presented on it, or the device may be adapted to enable the user to interact with it in coordination with the virtual keyboard to provide input to the glasses. For instance, the virtual keyboard may be presented in the glasses, but instead of selecting items in the air, the user enables the touchpad device to act as a surface suited to receiving input corresponding to the virtual keyboard. The device may track the finger as it slides over a capacitive module, and clicks on the device may provide the feel of a keystroke. The device may have a touch surface on the front and one or more select buttons on the back or top, allowing the user to click to select without needing to lift their finger off the touch surface. The letter the user has selected may be highlighted. The user may still perform swipe-style text entry, lifting their finger to end a word, inserting a space, double-tapping to insert a period, and the like. Figure 159 depicts a virtual keyboard 15902 presented in the user's field of view. On the keyboard, two keys are highlighted, 'D' and 'Enter'. In this figure, a touch screen accessory device 15904 is being used to provide the input to the keyboard, which is then transferred to the glasses as input. Visual indicators of the input or control commands being executed with the actual touch screen on the external device are provided on the virtual interface.
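The coordination between the accessory's touch surface and the virtual keyboard amounts to mapping a touch position to the key to highlight in the glasses. The 3-row key grid and normalized (0..1) coordinates below are illustrative assumptions about the layout, not details from the figure.

```python
# Sketch: map a touch position on the hand-held accessory's surface
# to the virtual-keyboard key to highlight. Layout and coordinate
# normalization are assumed for illustration.
ROWS = ["qwertyuiop", "asdfghjkl;", "zxcvbnm,./"]

def touch_to_key(x, y):
    """Map normalized touch coordinates (0..1, origin top-left) to the
    key under the finger on the virtual keyboard."""
    row = min(int(y * len(ROWS)), len(ROWS) - 1)
    col = min(int(x * len(ROWS[row])), len(ROWS[row]) - 1)
    return ROWS[row][col]
```

On each tracked finger position, the glasses would highlight `touch_to_key(x, y)`; a click on the device's select button would then commit that key as a keystroke.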
In embodiments, the eyepiece may include a haptic communication interface that uses magnetic fields to transmit and/or receive commands, telemetry, information, and the like, between the eyepiece and an external device, or to transmit or receive commands, telemetry, information, and the like, directly to or from the user. For example, the user may have a patterned magnetic material applied directly to some portion of their body (e.g. on the skin, on a fingernail, in the body), where the material responds physically (e.g. vibration, force, motion) to an oscillating magnetic field generated by the haptic communication interface. The oscillating magnetic field may convey information through modulation of the field, such as through the amplitude of the signal, time-based variations of the signal, the frequency of the signal, and the like. The information conveyed may be an alert, an incoming call indication, for entertainment, for communications, an indicator associated with an eyepiece application, an indication of the proximity of the user to the eyepiece, haptic feedback provided from the eyepiece to the user, and the like. Different commands may cause different stimulation effects on the patterned magnetic material for different commands or indicators. For example, different stimulation effects may be realized through different frequencies and/or sequence patterns for incoming calls from different people in the user's contact list, different intensities for different alert levels, interesting patterns for entertainment purposes, and the like.
The haptic communication interface may include a coil for transmitting and/or receiving the oscillating magnetic signals. The magnetic material may be a ferromagnetic material, a paramagnetic material, and the like, and may be applied as a powder, an ink, a tattoo, a decal, a tape, a transfer, a spray, and the like. In embodiments, the magnetic material may have the ability to be demagnetized when the user is not using the eyepiece, to remain unmagnetized when away from the magnetic fields present in the eyepiece, and the like. The magnetic material may be applied in a functionalized spatial pattern, such as to respond to a specific communication signal modulation, to have a specific impedance, to respond to a specific frequency, and the like. The applied magnetic material may be a visible image, an invisible image, a tattoo, a mark, a label, a symbol, and the like. The applied magnetic material may include a pattern that utilizes an incoming magnetic signal to generate a return transmission signal back to the eyepiece haptic communication interface (such as carrying an identifier pertaining to the user), a signal indicative of a proximity between the eyepiece and the magnetic material, and the like. For example, the identifier may be a user ID that is compared against an ID stored on the eyepiece to verify that the user is an authorized user of the eyepiece. In another example, the magnetic material may generate the return transmission signal to the eyepiece only when the magnetic material is in close proximity to the eyepiece. For instance, a user may apply the magnetic material to a fingernail, and the user may provide a command indication to the eyepiece by bringing their finger into proximity with a user haptic interface.
In another example, wearer can have the interface being installed in finger ring as shown in figure 15, handpiece etc., whereinThe interface can have and have at least one of multiple command interface types for connecting of wireless command with eyepiece, as haptic interface,Positional sensor devices etc..In one embodiment, finger ring 1500 can have the control of mapping calculation machine mouse, such as button 1504(such as playing single button, more buttons and similar mouse function), 2D position control 1502, idler wheel etc..1504 He of button2D position control 1502 can be as shown in figure 15, and wherein button is located at the side towards thumb, and 2D positioner is located at top.Alternatively, other configurations can be used in button and 2D position control, such as all towards thumb side, be entirely located in top surface or anyOther combinations.2D position control 1502 can be 2D button position controller and (such as be embedded in the keyboard of certain laptop computersIn be used to control TrackPoint (TrackPoint) pointing device of position of mouse etc.), TrackPoint, control stick, optical tracking pad,Photoelectricity wheel trolley, touch screen, touch tablet, Trackpad, rolling Trackpad, trace ball, any other position or position control device etc..In embodiments, the control signal from haptic interface (such as finger ring haptic interface 1500) can be provided with wired or wireless interfaceTo eyepiece, wherein user can easily provide control input with their hand, thumb, finger etc..In embodiments,Finger ring perhaps can be expanded to adapt to any finger or shrink more close hand.For example, finger ring can have customized restraint strap orThe hinge of spring is installed.For example, user perhaps can clearly express control with their thumb, wherein finger ring is worn on userIndex finger on.In embodiments, the interactive mode that a kind of method or system can provide user's wearing wears eyepiece, wherein the 
eyepiece including an optical assembly through which the user observes the surrounding environment and displayed content, a processor for handling the content for display to the user, an integrated projector facility for projecting the content to the optical assembly, and a control device worn on the body of the user (such as on the user's hand), where the control device includes at least one control component actuated by the user and provides a control command derived from the actuation of the at least one control component to the processor as a command instruction. The command instruction may be directed to the manipulation of the content displayed to the user. The control device may be worn on a first digit of the user's hand, and the at least one control component may be actuated by a second digit of the user's hand. The first digit may be the index finger, the second digit may be the thumb, and the first and second digits may be on the same hand of the user. The control device may have at least one control component mounted on the side of the index finger facing the thumb. The at least one control component may be a button. The at least one control component may be a 2D position controller. The control device may have at least one button-actuated control component mounted on the side of the index finger facing the thumb, and a 2D-position-controller-actuated control component mounted on the top-facing side of the index finger. The control components may be mounted on at least two digits of the user's hand. The control device may be worn as a glove on the user's hand. The control device may be worn on the user's wrist. The at least one control component may be worn on at least one digit of the hand, and a transmission facility may be worn separately on the hand. The transmission facility may be worn on the wrist. The transmission facility may be worn on the back of the hand. The control component may be at least one of a plurality of buttons. The at least one button may provide a function substantially similar to a traditional computer mouse button. Two of the plurality of buttons may function substantially similar to the primary buttons of a traditional two-button computer mouse. The control component may be a scrolling wheel. The control component may be a 2D position control component. The 2D position control component may be a button position controller, pointing stick, joystick, optical track pad, opto-touch wheel, touch pad, track pad, scrolling track pad, trackball, capacitive touch screen, and the like. The 2D position control component may be controlled with the user's thumb. The single control component may be a touch screen capable of implementing touch controls including button-like functions and 2D manipulation functions. The control components may be actuated when the user points at the processor-projected content with the control device in place. The ring controller may be powered by an on-board battery, such as a disposable battery, a rechargeable battery, a solar-charged battery, and the like.
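The dispatch from control-component actuations to command instructions for the processor is not spelled out in the text; a minimal sketch follows, in which the event names (`"button"`, `"pos2d"`) and the `CommandInstruction` structure are hypothetical, chosen only to illustrate the button-like and 2D-positioner-like functions described above.

```python
from dataclasses import dataclass

@dataclass
class CommandInstruction:
    """Command delivered to the eyepiece processor (hypothetical structure)."""
    action: str
    payload: tuple = ()

def dispatch(event: str, data=()):
    """Translate a control-component actuation into a command instruction.

    'button' stands for the thumb-facing button on the index finger;
    'pos2d' stands for the 2D position controller on the top of the finger.
    """
    if event == "button":
        return CommandInstruction("select")            # mouse-button-like click
    if event == "pos2d":
        dx, dy = data
        return CommandInstruction("move_cursor", (dx, dy))
    raise ValueError(f"unknown control event: {event}")

# A 2D controller actuation becomes a cursor-manipulation command.
cmd = dispatch("pos2d", (3, -1))
```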
In embodiments, the wearer may have an interface mounted in a ring 1500AA that includes a camera 1502AA, as shown in Figure 15AA. In embodiments, the ring controller 1500AA may have control interface types as described herein, such as through buttons 1504, 2D position control 1502, 3D position control (such as utilizing accelerometers, gyroscopes), and the like. The ring controller 1500AA may be used to control functions in the eyepiece, such as controlling the manipulation of displayed content projected to the wearer. In embodiments, the control interfaces 1502, 1504 may provide control aspects for the embedded camera 1502AA, such as on/off, zoom, pan, focus, recording of still image photographs, recording of video, and the like. Alternatively, camera functions may be controlled through other control aspects of the eyepiece, such as through voice control, other tactile control interfaces, eye-gaze detection, and the like, as described herein. The camera may also enable automatic control functions, such as auto-focus, timer functions, face detection and/or tracking, auto-zoom, and the like. For example, the ring controller 1500AA with the integrated camera 1502AA may be used to view the wearer 1508AA during a video conference initiated through the eyepiece, where the wearer extends the ring controller 1500AA (such as mounted on their finger) to allow the camera 1502AA to capture a view of their face for transmission to at least one other participant of the video conference. Alternatively, the wearer may remove the ring controller 1500AA and set it down on a surface 1510AA (such as a tabletop surface) so that the camera 1502AA views the wearer. An image of the wearer 1512AA may then be displayed in a display area 1518AA of the eyepiece and transmitted to the other people on the video conference, such as together with images 1514AA of the other participants of the conference call. In embodiments, the camera 1502AA may provide manual or automatic FOV (field of view) 1504AA adjustment. For example, the wearer may set the ring controller 1500AA down on the surface 1510AA for use in the conference call, and the FOV 1504AA may be controlled manually (such as through button controls 1502, 1504, voice control, other tactile interfaces) or automatically (such as through face recognition) to point the camera's FOV 1504AA at the wearer's face. The FOV 1504AA may be enabled to change as the wearer moves, such as through tracking via face recognition. The FOV 1504AA may also zoom in/out to accommodate changes in the position of the wearer's face. In embodiments, the camera 1502AA may be used for a variety of still and/or video applications, where the field of view of the camera is provided to the wearer in the display area 1518AA of the eyepiece, where memory may be present in the eyepiece for storing the images/video, and where the images/video may be transferred from the eyepiece, communicated to some external storage facility, user, web application, and the like. In embodiments, cameras may be incorporated into a plurality of different mobile devices, such as worn on the arm, on the hand, on the wrist, on a finger, and the like, such as a watch 3202 with an embedded camera 3200 as shown in Figures 32 to 33. As with the ring controller 1502AA, any of these mobile devices may include the manual and/or automatic functions described for the ring controller 1502AA. In embodiments, the ring controller 1502AA may have additional sensors, embedded functions, control features, and the like, such as a fingerprint scanner, haptic feedback, an LCD screen, an accelerometer, Bluetooth, and the like. For example, the ring controller may provide synchronized monitoring between the eyepiece and other control components, as described herein.
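The automatic FOV adjustment described above amounts to recentering and rescaling the camera view on a detected face. A minimal sketch of that geometry follows; the function name and the assumption that a face detector supplies an `(x, y, w, h)` bounding box are illustrative, not from the source.

```python
def adjust_fov(frame_w, frame_h, face_box, target_frac=0.4):
    """Given a detected face bounding box (x, y, w, h) in pixels, return
    (pan_x, pan_y, zoom) adjustments that recenter the face in the frame
    and scale it to occupy `target_frac` of the frame height.
    Positive pan values move the FOV center right/down."""
    x, y, w, h = face_box
    face_cx, face_cy = x + w / 2, y + h / 2
    pan_x = face_cx - frame_w / 2          # pixels to shift FOV center
    pan_y = face_cy - frame_h / 2
    zoom = (target_frac * frame_h) / h     # >1 zooms in, <1 zooms out
    return pan_x, pan_y, zoom
```

Applied every frame against a face-tracking detector, this yields the follow-the-wearer behavior described for the tabletop conference-call case.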
In embodiments, the eyepiece may provide a system and method for providing an image of the wearer to video conference participants through the use of an external mirror, where the wearer views themselves in the mirror and the image of themselves is captured through the integrated camera of the eyepiece. The captured image may be used directly, or the image may be reversed to correct for the image reversal of the mirror. In one example, the wearer may have joined a video conference with a plurality of other people, where the wearer may be able to view real-time video images of the other people through the eyepiece. Through the use of an ordinary mirror in conjunction with the camera integrated with the eyepiece, the user may be able to view themselves in the mirror, have that image captured by the integrated camera, and provide the other people on the video conference with an image of themselves. The captured image may also be available as a projected image to the wearer through the eyepiece, such as alongside the images of the other people involved in the video conference.
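Correcting the mirror's left-right reversal is a simple horizontal flip of each pixel row. A minimal sketch, assuming the frame is represented as a row-major list of rows:

```python
def unmirror(image):
    """Correct the left-right reversal introduced by capturing the wearer's
    reflection in a mirror: flip each pixel row horizontally.
    `image` is a row-major list of rows of pixel values."""
    return [row[::-1] for row in image]
```

Applying the flip twice returns the original frame, which is a convenient sanity check for the transform.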
In embodiments, a control component may also be provided where a surface sensing facility in the control device detects motion across a surface. The surface sensing facility may be disposed on the palmar side of the user's hand. The surface may be at least one of a hard surface, a soft surface, the surface of the user's skin, the surface of the user's clothing, and the like. The control commands may be provided wirelessly, through a wired connection, and the like. The control device may control a pointing function associated with the displayed processor content. The pointing function may be control of a cursor position; selection of displayed content; selection and movement of displayed content; control of the displayed content's zoom, pan, field of view, size, position, and the like. The control device may control a pointing function associated with the viewed surrounding environment. The pointing function may be placing a cursor on a viewed object in the surrounding environment. The location position of a viewed object may be determined by the processor in conjunction with a camera integrated with the eyepiece. The identification of a viewed object may be determined by the processor in conjunction with a camera integrated with the eyepiece. The control device may control a function of the eyepiece. The function may be associated with the displayed content. The function may be a mode control of the eyepiece. The control device may be foldable for ease of storage when not worn by the user. In embodiments, the control device may be used with an external device, such as to jointly control the external device with the eyepiece. The external device may be an entertainment facility, an audio facility, a portable electronic device, a navigation facility, a weapon, an automotive controller, and the like.
In embodiments, a body-worn control device (such as worn on a finger, on the palm, strapped to the hand, on the arm, on the leg, on the torso, and the like) may provide 3D position sensor information to the eyepiece. For instance, the control device may act as an "air mouse", where 3D position sensors (such as accelerometers, gyroscopes, and the like) provide position information upon a user command (such as through a button click, a voice command, a visually detected gesture, and the like). The user may be able to use this capability to navigate a 2D or 3D image projected to the user by the eyepiece projection system. Further, the eyepiece may provide an external relay of the image for display or projection to other people, such as in the instance of a presentation. The user may be able to change the mode of the control device between 2D and 3D to accommodate different functions, applications, user interfaces, and the like. In embodiments, multiple 3D control devices may be used in certain applications, such as in simulation applications.
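One common way to realize an "air mouse" while a command button is held is to map the hand's angular rates from a gyroscope into 2D cursor deltas. The sketch below assumes yaw/pitch rates in radians per second and an arbitrary illustrative gain; none of these parameter choices come from the source.

```python
def gyro_to_cursor(yaw_rate, pitch_rate, dt, sensitivity=500.0):
    """Map angular rates (rad/s) from a hand-worn gyroscope to a 2D cursor
    delta in pixels over one sample interval `dt` (seconds).
    `sensitivity` is pixels of cursor travel per radian of hand rotation."""
    dx = yaw_rate * dt * sensitivity
    dy = -pitch_rate * dt * sensitivity  # pitching the hand up moves the cursor up
    return dx, dy
```

A 3D mode could pass the rates through unscaled as orientation updates instead of flattening them to a 2D delta, matching the 2D/3D mode switch described above.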
In embodiments, a system may comprise: an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, and where the optical assembly comprises a corrective element that corrects the user's view of the surrounding environment; an integrated processor for handling content for display to the user; an integrated image source for introducing the content to the optical assembly; and a tactile control interface mounted on the eyepiece that accepts control inputs from the user through at least one of the user touching the interface and the user being proximate to the interface.
In embodiments, control of the eyepiece, and especially control of a cursor associated with displayed content, may be enabled through hand control, such as with a wearable device 1500 as shown in Figure 15, with a virtual computer mouse 1500A as shown in Figure 15A, and the like. For instance, the wearable device 1500 may transmit commands through a physical interface (such as a button 1502, a scroll wheel 1504), and the virtual computer mouse 1500A may be able to interpret commands through detecting the motions and actions of the user's thumb, fist, hand, and the like. In computing, a physical mouse is a pointing device that functions by detecting two-dimensional motion relative to its supporting surface. A physical mouse traditionally consists of an object held under one of the user's hands, with one or more buttons. It sometimes features other elements, such as "wheels" that allow the user to perform various system-dependent operations, or extra buttons or features that can add more control or dimensional input. The motion of the mouse is translated into the motion of a cursor on a display, which allows for fine control of a graphical user interface. In the case of the eyepiece, the user may be able to utilize a physical mouse, a virtual mouse, or a combination of the two. In embodiments, a virtual mouse may involve one or more sensors attached to the user's hand, such as on the thumb 1502A, finger 1504A, palm 1508A, wrist 1510A, and the like, where the eyepiece receives signals from the sensors and translates the received signals into the motion of a cursor on the eyepiece display to the user. In embodiments, the signals may be received through an external interface, such as the tactile interface 1402, through a receiver on the interior of the eyepiece, at a secondary communications interface, on an associated physical mouse or worn interface, and the like. The virtual mouse may also include actuators or other output-type elements attached to the user's hand, such as for providing haptic feedback to the user through vibration, force, pressure, electrical impulse, temperature, and the like. Sensors and actuators may be attached to the user's hand by way of a wrap, a ring, a pad, a glove, and the like. As such, the eyepiece virtual mouse may allow the user to translate motions of the hand into the motion of the cursor on the eyepiece display, where "motions" may include slow movements, rapid motions, jerky motions, position, changes in position, and the like, and may allow the user to work in three dimensions, without the need for a physical surface, and including some or all of the six degrees of freedom. Note that because the "virtual mouse" may be associated with multiple portions of the hand, the virtual mouse may be implemented as multiple "virtual mouse" controllers, or as a distributed controller across multiple control members of the hand. In embodiments, the eyepiece may provide for the use of a plurality of virtual mice, such as one for each of the user's hands, one or more for the user's feet, and the like.
In embodiments, the eyepiece virtual mouse may need no physical surface to operate, and may detect motion such as through one of a plurality of accelerometer types (such as tuning fork, piezoelectric, shear mode, strain mode, capacitive, thermal, resistive, electromechanical, resonant, magnetic, optical, acoustic, laser, three-dimensional, and the like) of sensor, and through the output signals of the sensors determine the translational and angular displacement of the hand, or some portion of the hand. For instance, accelerometers may produce output signals of magnitudes proportional to the translational acceleration of the hand in the three directions. Pairs of accelerometers may be configured to detect rotational accelerations of the hand or portions of the hand. The translational velocity and displacement of the hand or portions of the hand may be determined by integrating the accelerometer output signals, and the rotational velocity and displacement of the hand may be determined by integrating the difference between the output signals of the accelerometer pairs. Alternatively, other sensors may be utilized, such as ultrasound sensors, imagers, IR/RF, magnetometers, gyro magnetometers, and the like. As accelerometers, or other sensors, may be mounted on various portions of the hand, the eyepiece may be able to detect a plurality of movements of the hand, ranging from simple motions normally associated with computer mouse motion to the interpretation of highly complex motions, such as complex hand motions in a simulation application. In embodiments, the user may require only small translational or rotational actions to have these actions translated into the motions associated with the user's intended actions on the eyepiece projection to the user.
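The double integration described above — acceleration to velocity to displacement — can be sketched along one axis with the trapezoidal rule. This is a minimal numerical illustration of the stated principle, not a full inertial-navigation implementation (a real system would also need drift correction):

```python
def integrate_motion(accel, dt):
    """Recover translational velocity and displacement along one axis by
    twice integrating evenly-spaced accelerometer samples (m/s^2) with the
    trapezoidal rule. Returns (velocity, displacement) at the final sample,
    assuming the hand starts at rest at the origin."""
    v, x = 0.0, 0.0
    for i in range(1, len(accel)):
        v_new = v + dt * (accel[i - 1] + accel[i]) / 2.0  # integrate accel -> velocity
        x += dt * (v + v_new) / 2.0                        # integrate velocity -> position
        v = v_new
    return v, x
```

The same scheme applied to the *difference* of a paired-accelerometer signal would yield rotational velocity and displacement, per the passage above.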
In embodiments, the virtual mouse may have physical switches associated with it for controlling the facility, such as an on/off switch mounted on the hand, the eyepiece, or another part of the body. The virtual mouse may also have on/off control and the like through predefined motions or actions of the hand. For example, operation of the virtual mouse may be enabled through a rapid back-and-forth motion of the hand. In another example, the virtual mouse may be disabled through a motion of the hand past the eyepiece, such as in front of the eyepiece. In embodiments, the virtual mouse for the eyepiece may provide interpretations of a plurality of motions as the operations normally associated with physical mouse control, and as such be familiar to the user without the need for training, such as single clicking with a finger, double clicking, triple clicking, right clicking, left clicking, clicking and dragging, combination clicking, roller wheel motion, and the like. In embodiments, the eyepiece may provide gesture recognition, such as interpreting hand gestures via mathematical algorithms.
In embodiments, gesture control may be provided through technologies that sense capacitive changes resulting from the variation in the distance of the user's hand from a conductor element that is part of the eyepiece's control system, and so would require no devices to be mounted on the user's hand. In embodiments, the conductor may be mounted as part of the eyepiece, such as on the arm or another portion of the frame, or as some external interface mounted on the user's body or clothing. For example, the conductor may be an antenna, where the control system behaves in a fashion similar to the touchless musical instrument known as the theremin. The theremin uses the heterodyne principle to generate an audio signal, but in the case of the eyepiece the signal may be used to generate a control input signal. The control circuitry may include a number of radio-frequency oscillators, such as where one oscillator operates at a fixed frequency and another is controlled by the user's hand, where the changing distance of the hand varies the input at the control antenna. In this technique, the user's hand acts as the grounded plate (the user's body being the connection to ground) of a variable capacitor in an L-C (inductor-capacitor) circuit that is part of the oscillator and determines its frequency. In another example, the circuit may use a single oscillator, two pairs of heterodyne oscillators, and the like. In embodiments, a plurality of different conductors may be used as control inputs. In embodiments, this type of control interface may be ideal for control inputs that vary across a range (such as a volume control, a zoom control, and the like). However, this type of control interface may also be used for more discrete control signals (such as on/off control), where a predetermined threshold determines the state change of the control input.
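The heterodyne scheme above can be made concrete with the standard L-C resonance formula. A sketch follows, mapping hand capacitance to a beat-frequency control value; all component values are illustrative assumptions, not taken from the source.

```python
import math

def lc_frequency(L, C):
    """Resonant frequency of an L-C oscillator: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def control_value(f_fixed, L, C_circuit, C_hand):
    """Beat (difference) frequency between a fixed-frequency oscillator and
    the hand-tuned oscillator, per the heterodyne principle described above.
    The approaching hand adds capacitance C_hand to the variable oscillator,
    lowering its frequency; the beat frequency serves as the control input,
    which rises continuously as the hand nears the antenna."""
    f_var = lc_frequency(L, C_circuit + C_hand)
    return abs(f_fixed - f_var)
```

For an on/off-style discrete control, the beat frequency would simply be compared against a predetermined threshold, as the passage notes.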
In embodiments, the eyepiece may interface with a physical remote control device, such as a wireless track pad mouse, a hand-held remote control, a body-mounted remote control, a remote control mounted on the eyepiece, and the like. The remote control device may be mounted on an external piece of equipment, such as for personal use, gaming, professional use, military use, and the like. For example, the remote control may be mounted on a soldier's weapon, such as mounted on a pistol grip, on a muzzle shroud, on a fore grip, and the like, thus providing remote control to the soldier without the need to remove their hands from the weapon. The remote control may be removably mounted to the eyepiece.
In embodiments, a remote control for the eyepiece may be activated and/or controlled through a proximity sensor. A proximity sensor may be a sensor able to detect the presence of nearby objects without any physical contact. For example, a proximity sensor may emit an electromagnetic or electrostatic field, or a beam of electromagnetic radiation (such as infrared), and look for changes in the field or return signal. The object being sensed is often referred to as the proximity sensor's target. Different proximity sensor targets may demand different sensors. For example, a capacitive or photoelectric sensor might be suitable for a plastic target, while an inductive proximity sensor requires a metal target. Other examples of proximity sensor technologies include capacitive displacement sensors, eddy-current, magnetic, photocell (reflective), laser, passive thermal infrared, passive optical, CCD, reflection of ionizing radiation, and the like. In embodiments, the proximity sensor may be integral to any of the control embodiments described herein, including physical remote controls, virtual mice, interfaces mounted on the eyepiece, controls mounted on an external piece of equipment (such as a game controller, a weapon), and the like.
In embodiments, sensors for measuring the body motion of the user may be used to control the eyepiece, or used as an external input, such as through an inertial measurement unit (IMU), a 3-axis magnetometer, a 3-axis gyroscope, a 3-axis accelerometer, and the like. For instance, a sensor may be mounted on the user's hand, thereby allowing signals from the sensor to be used to control the eyepiece, as described herein. In another instance, sensor signals may be received and interpreted by the eyepiece to assess and/or utilize the body motion of the user for purposes other than control. In one example, sensors mounted on each leg and each arm of the user may provide signals to the eyepiece to allow the eyepiece to measure the gait of the user. The gait of the user may then in turn be used to monitor changes in the user's gait over time, such as for monitoring progress during physical behavioral changes, physical therapy, recovery from changes caused by head trauma, and the like. In the example of monitoring for head trauma, the eyepiece may initially determine a baseline gait profile for the user, and then monitor the user over time, such as before and after a physical event (such as a sports-related collision, an explosion, a car accident). In the case of an athlete or an individual in physical therapy, the eyepiece may be used to periodically measure the user's gait and maintain the measurements in a database for analysis. A running gait-over-time profile may be generated, such as for monitoring the user's gait as an indication of physical trauma, physical improvement, and the like.
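The baseline-versus-current gait comparison above can be sketched with a simple summary statistic. The metric (mean stride interval) and the 10% threshold are illustrative assumptions; a real gait analysis would use richer features.

```python
def gait_deviation(baseline_intervals, current_intervals):
    """Fractional change in mean stride time (seconds) of a current
    measurement session relative to the stored baseline profile."""
    mean = lambda xs: sum(xs) / len(xs)
    b, c = mean(baseline_intervals), mean(current_intervals)
    return (c - b) / b

def flag_change(baseline_intervals, current_intervals, threshold=0.10):
    """Flag the session when mean stride time drifts more than `threshold`
    (here 10%), e.g. comparing gait before and after a physical event."""
    return abs(gait_deviation(baseline_intervals, current_intervals)) > threshold
```

Periodic measurements stored in a database could be run through `gait_deviation` to produce the running gait-over-time profile described above.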
In embodiments, control of the eyepiece, and especially control of a cursor associated with content displayed to the user, may be initiated through sensing, via a facial actuation sensor 1502B, the motion of the facial features, the tensing of facial muscles, the clicking of teeth, the motion of the jaw, and the like, of the user wearing the eyepiece. For instance, as shown in Figure 15B, the eyepiece may have the facial actuation sensor as an extension from the eyepiece earphone assembly 1504B, as an extension from the arm 1508B of the eyepiece, and the like, where the facial actuation sensor may sense forces, vibrations, and the like associated with the motion of a facial feature. The facial actuation sensor may also be mounted separate from the eyepiece assembly, such as part of a standalone earpiece, where the sensor output of the earpiece and the facial actuation sensor may be transferred to the eyepiece by wired or wireless communication (such as Bluetooth or another communications protocol known in the art). The facial actuation sensor may also be attached around the ear, in the mouth, on the face, on the neck, and the like. The facial actuation sensor may also be comprised of a plurality of sensors, such as to optimize the sensing of different facial or interior motions or actions. In embodiments, the facial actuation sensor may detect motions and interpret them as commands, or the raw signals may be sent to the eyepiece for interpretation. Commands may be commands for the control of eyepiece functions, controls associated with a cursor or pointer provided as part of the display of content to the user, and the like. For example, a user may click their teeth once or twice to indicate a single or double click, such as normally associated with the click of a computer mouse. In another example, the user may tense a facial muscle to indicate a command, such as a selection associated with the projected image. In embodiments, the facial actuation sensor may utilize noise-reduction processing to minimize the background motions of the face, the head, and the like, such as through adaptive signal processing technologies. A voice activity sensor may also be used to reduce interference, such as from the user speaking, from other individuals nearby, from surrounding environmental noise, and the like. In one example, the facial actuation sensor may also improve communications and eliminate noise by detecting vibrations in the cheek of the user during speech, such as with multiple microphones to identify the background noise and eliminate it through noise cancellation, volume augmentation, and the like.
In embodiments, the user of the eyepiece may be able to obtain information about some environmental feature, location, object, and the like viewed through the eyepiece by raising their hand into the field of view of the eyepiece and pointing at the object or position. For instance, the pointing finger of the user may indicate an environmental feature, where the finger is not only in the view of the eyepiece but also in the view of an embedded camera. The system may then be able to correlate the position of the pointing finger with the location of the environmental feature as seen by the camera. Additionally, the eyepiece may have position and orientation sensors, such as GPS and a magnetometer, to allow the system to know the location and line of sight of the user. From this, the system may be able to extrapolate the position information of the environmental feature, such as to provide the location information to the user, to overlay the position of the environmental information onto a 2D or 3D map, to further associate the established location information by correlating that location information with secondary information about that location (such as an address, the names of individuals at the address, the name of a business at that location, the coordinates of the location), and the like. Referring to Figure 15C, in one example, the user is looking through the eyepiece 1502C and pointing with their hand 1504C at a house 1508C in their field of view, where an embedded camera 1510C has both the pointing hand 1504C and the house 1508C in its field of view. In this instance, the system is able to determine the location of the house 1508C and provide location information 1514C and a 3D map superimposed onto the user's view of the environment. In embodiments, the information associated with an environmental feature may be provided by an external facility (such as communicated over a wireless communication connection), stored internal to the eyepiece (such as downloaded to the eyepiece for the current location), and the like. In embodiments, the information provided to the wearer of the eyepiece may include any of a plurality of information related to the scene viewed by the wearer, such as geographic information, point-of-interest information, social networking information (such as Twitter or Facebook information related to a person standing in front of the wearer in the surrounding environment, such as information "hovering" around the person), profile information (such as stored in the wearer's contact list), historical information, consumer information, product information, retail information, safety information, advertisements, business information, security information, game-related information, humorous annotations, news-related information, and the like.
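The extrapolation step — from the user's GPS fix and magnetometer bearing to the coordinates of the pointed-at feature — can be sketched with a flat-earth approximation valid over short ranges. The assumption that a range estimate is available (e.g. from the camera or map data) is mine, not the source's:

```python
import math

def extrapolate_feature(lat, lon, bearing_deg, distance_m, R=6371000.0):
    """Estimate the latitude/longitude of a pointed-at feature from the
    user's GPS position (degrees), magnetometer bearing (degrees clockwise
    from true north), and an estimated range (meters), using a flat-earth
    approximation. A fielded system would fuse the camera's pointing ray
    with map data rather than assume a range."""
    b = math.radians(bearing_deg)
    dlat = distance_m * math.cos(b) / R                            # radians north
    dlon = distance_m * math.sin(b) / (R * math.cos(math.radians(lat)))  # radians east
    return lat + math.degrees(dlat), lon + math.degrees(dlon)
```

The resulting coordinates could then be looked up against address and point-of-interest databases to supply the secondary information described above.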
In embodiments, the user may be able to control their view perspective relative to a 3D projected image, such as a 3D projected image associated with the external environment, a 3D projected image that has been stored and retrieved, a 3D displayed movie (such as downloaded for viewing), and the like. For instance, and referring again to Figure 15C, the user may be able to change the view perspective of the 3D displayed image 1512C, such as by turning their head, where the 3D displayed image may be maintained even as the user turns their head, moves their position, and the like, within the real-time external environment. In this way, the eyepiece may be able to provide an augmented reality by overlaying information onto the user's viewed external environment, such as the overlaid 3D displayed map 1512C, the location information 1514C, and the like, where the displayed map, information, and the like may change as the user's view changes. In another instance, with 3D movies or movies converted to 3D, the viewing perspective may be controllable so as to put the viewer "into" the movie environment by changing the viewer's perspective, where the user may be able to turn their head around and have the view change in correspondence with the changed head position, where the user may be able to "walk into" the image when they physically walk forward, have the perspective change as the user moves the gazing view of their eyes, and the like. In addition, additional image information may be provided, such as at the sides of the user's view, which may be accessed by turning the head.
In embodiments, the user of an eyepiece may be able to synchronize at least their view of a projected image or video with the view of another user of an eyepiece or of another video display device. For instance, two separate eyepiece users may want to watch the same 3D map, game projection, point-of-interest projection, video, and the like, where the two viewers not only see the same projected content but have their views of the projected content synchronized between them. In one example, two users may wish to jointly view a certain region of a 3D map, where the map is synchronized such that one user may be able to point to positions on the 3D map that the other user can see and interact with. The two users may be able to move around the 3D map together, share a virtual-physical interaction between both users and the 3D map, and the like. Further, a group of eyepiece wearers may be able to jointly interact with a projection as a group. In this way, two or more users may be able to have a unified augmented reality experience through the synchronization of their eyepieces' coordinates. Synchronization of two or more eyepieces may be provided by conveying position information between the eyepieces, such as absolute position information, relative position information, translational and rotational position information, and the like, such as from position sensors as described herein (such as gyroscopes, IMUs, GPS, and the like). Communications between the eyepieces may be conducted over the Internet, through a cellular network, through a satellite network, and the like. The processing of position information that contributes to synchronization may be executed in the master processor of a single eyepiece, executed collectively among a group of eyepieces, executed in a remote server system, and the like, or any combination thereof. In embodiments, the coordinated, synchronized view of projected content between multiple eyepieces may provide an augmented reality experience that is extended from one individual to a plurality of individuals, where the plurality of individuals benefits from the group augmented reality experience. For example, a group of concertgoers may synchronize their eyepieces with a feed from the concert production such that visual effects or audio may be pushed to the people with eyepieces by the concert production, the performers, other audience members, and the like. In one example, a performer may have an eyepiece and may control the content transmitted to the audience members. In one embodiment, the content may be the performer's view of the surrounding environment. The eyepiece may also be used by the performer for various applications, such as controlling an external lighting system, interacting with an augmented reality drum kit or sampling pad, recalling lyrics, and the like.
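The position information conveyed between eyepieces for synchronization can be sketched as a small pose message exchanged over whichever network carries the session. The field names and the JSON wire format here are assumptions for illustration only:

```python
import json

def encode_pose(lat, lon, alt, yaw, pitch, roll):
    """Serialize one eyepiece's position/orientation for transmission to
    peer eyepieces, so their views of shared projected content (a 3D map,
    a game projection, etc.) can be kept synchronized."""
    return json.dumps({"lat": lat, "lon": lon, "alt": alt,
                       "yaw": yaw, "pitch": pitch, "roll": roll})

def apply_pose(message, shared_view):
    """Update a peer's model of the sender's viewpoint from a received
    pose message, e.g. to render the sender's pointer on the shared map."""
    shared_view["peer_pose"] = json.loads(message)
    return shared_view
```

Whether such updates are merged on one master eyepiece, collectively among the group, or on a remote server is a deployment choice, matching the processing options listed above.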
In embodiments, the image or video being displayed on the eyepiece may be synchronized with what is being displayed on a connected device (one that has a communications link with the eyepiece), with an image or video captured by the connected device, or with a feed directly from a remote camera. The selection of the feed, or another action or control signal, may be initiated by a sensor input received from one of the connected devices, by metadata sent by one of the other connected devices, and the like. The other video display devices may be other eyepieces, desktop computers, laptop computers, smart phones, tablet computers, televisions, and the like. The eyepiece, the devices, and the remote cameras may be connected by wide area, local area, metropolitan area, personal area, and cloud network communications links. The sensor input may be an audio sensor input, a video sensor input, and the like. Other actions that may be initiated by a received sensor input or control signal may include actions such as tracking a target, sending a message, or initiating the audio-video synchronization described elsewhere herein. For example, video captured by the eyepiece of a guard at a remote inspection station or screening area may have facial recognition applied and be automatically selected from among the guard-eyepiece video feeds when a person of interest is identified, for display on the eyepiece of an administrator.
In embodiments, the eyepiece may implement an indication of audio direction to the wearer of the eyepiece utilizing sound projection techniques, such as with surround audio technologies. Implementation of audio direction for the wearer may include reproducing the sound from the direction of the source (in real time or as playback). Visual or audible indicators may be included to provide the direction of the sound source. Sound projection techniques may be useful for individuals who have a hearing deficiency or obstruction, such as due to the user suffering a hearing impairment, the user wearing headphones, the user wearing hearing protection, and the like. In this instance, the eyepiece may provide an enhanced 3D audio reproduction. In one example, the wearer may have headphones on when a shot is fired. In this instance, the eyepiece may be able to reproduce a 3D sound profile of the shot, allowing the wearer to react to the shot knowing where the sound came from. In another example, a wearer with headphones, with hearing loss, in a noisy environment, and the like, may not be able to tell what is being said and/or the direction of the person speaking, but may be provided a 3D sound enhancement from the eyepiece (for example, the wearer is listening to other nearby individuals through headphones, and so there is no directional information). In another example, the wearer may be in a noisy surrounding environment, or in an environment where loud noises may be generated periodically. In this instance, the eyepiece may have the ability to cut off the loud sound to protect the hearing of the wearer, but then the wearer may not be able to tell where the sound came from, or their ears may be ringing so that they cannot hear anything at the moment. To aid in such a situation, the eyepiece may provide a visual, auditory, vibratory, or similar cue to the wearer to indicate the direction of the sound source. In embodiments, in the case where the wearer's ears are plugged to protect them from loud noise, the eyepiece may provide "enhanced" hearing, utilizing earbuds to produce a reproduction of sound to substitute for those sounds being missed from the natural world. This artificial sound may then be used to give directionality to communications from wireless transmissions that cannot be heard naturally by the operator.
In embodiments, an example of a configuration for establishing the directionality of a source sound may be microphones pointed in different directions. For instance, at least one microphone may be used for the wearer's voice, at least one for the ambient environment, at least one pointed downward, and possibly several aimed in multiple different discrete directions. In this example, the downward-pointing microphone may be used to subtract out isolated other sounds, and this may be combined with 3D surround sound and enhanced-hearing techniques, as described herein.
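One common way directionality can be established with a pair of microphones is from the time difference of arrival (TDOA) between them. The toy sketch below illustrates that geometry under stated assumptions; it is not taken from the specification, and the constants are illustrative.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at room temperature

def bearing_from_tdoa(tdoa_s, mic_spacing_m):
    """Angle of the source relative to broadside of a two-mic array.

    tdoa_s: arrival-time difference (right minus left), in seconds.
    Positive result means the source is toward the right microphone.
    """
    # Path-length difference implied by the delay, clamped to the
    # physical maximum (the mic spacing itself).
    path_diff = max(-mic_spacing_m,
                    min(mic_spacing_m, tdoa_s * SPEED_OF_SOUND))
    return math.degrees(math.asin(path_diff / mic_spacing_m))

# A source directly to the right of a 0.15 m array arrives earlier at the
# right mic by the full spacing / speed-of-sound delay:
print(round(bearing_from_tdoa(0.15 / SPEED_OF_SOUND, 0.15)))  # -> 90
print(round(bearing_from_tdoa(0.0, 0.15)))                    # -> 0
```

A real system would estimate the delay itself by cross-correlating the two microphone signals; this sketch only covers the delay-to-angle step.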
In an example of the sound enhancement system as part of the eyepiece, there may be several users with eyepieces, such as in a noisy environment where all the users have their ears "blocked", as implemented by an artificial noise barrier created with the eyepiece earbuds. One of the wearers may shout out that they need a certain piece of equipment. With all the ambient noise and the hearing protection created by the eyepieces, no one would hear the request for the equipment. Here, the wearer making the verbal request has a filtering microphone near their mouth that may wirelessly communicate the request to the others, where their eyepiece may relay the voice signal to the other users' eyepieces and to the ear on the correct side, so the others will know to look to the right or left to see who made the request. The system may be further enhanced with the geographic location of all the wearers and a "virtual" surround sound system that uses the two earbuds to give a perception of 3D space (such as SRS TruSurround).
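The relay-to-the-correct-ear behavior described above can be sketched from geographic positions alone: compute the speaker's bearing relative to the listener's facing direction and route the relayed audio to the left or right earbud accordingly. The coordinate convention and function names below are illustrative assumptions, not part of the disclosure.

```python
import math

def relative_bearing(listener_pos, listener_heading_deg, speaker_pos):
    """Bearing of the speaker relative to the listener's facing, in (-180, 180].

    Positions are (east, north) pairs; heading 0 deg = north, clockwise.
    """
    dx = speaker_pos[0] - listener_pos[0]
    dy = speaker_pos[1] - listener_pos[1]
    absolute = math.degrees(math.atan2(dx, dy))  # compass-style bearing
    return (absolute - listener_heading_deg + 180.0) % 360.0 - 180.0

def ear_for(listener_pos, listener_heading_deg, speaker_pos):
    """Pick the earbud on the side the speaker is located."""
    rel = relative_bearing(listener_pos, listener_heading_deg, speaker_pos)
    return "right" if rel >= 0 else "left"

# Listener at the origin facing north; a speaker due east is heard on the right:
print(ear_for((0, 0), 0.0, (10, 0)))   # -> right
print(ear_for((0, 0), 0.0, (-10, 0)))  # -> left
```

A full virtual-surround implementation would also pan and delay the signal continuously rather than making a binary left/right choice.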
In embodiments, the auditory cues may also be computer generated, so the communicating user does not need to speak their communication out loud, but may select from a list of commonly used commands, or the computer may generate the communication based on preconfigured conditions, and the like. In one example, the wearer may be in a situation where they do not want a display in front of their eyes, but do want earbuds placed in their ears. In this case, if they want to notify someone in the group to get up and follow them, they may simply click a controller a specific number of times, or provide a visual gesture via the camera, IMU, and the like. The system may select a "follow me" command and send it, together with the location of the user making the communication, to the other users' 3D audio systems, enticing them to listen toward a place that is actually located beyond their line of sight. In embodiments, directional information may be determined and/or provided by the location information of the users from the eyepieces.
The eyepiece may include a facility for providing tactile sensation (such as vibration) to the user, such as through a vibration actuator in the frame or temples of the eyeglass structure (such as a mechanical vibration motor, a piezoelectric vibration actuator, an ultrasonic vibration actuator, and the like). The vibration may be provided to give the user a message indication, to serve as an indicator for a user with impaired vision (such as due to darkness, smoke, clouds, blindness), as part of a game, as part of a simulation, and the like. Vibration actuators may be used in the temples of the eyepiece, alone or together with speakers, to help create a 3D visual-audio-vibration virtual reality environment, such as for gaming, simulation, and the like. For example, a vibration actuator may be mounted in each temple of the eyepiece, so that when an application presents a projectile flying past the left side of the user's head, the left vibration actuator is commanded to vibrate in a manner that simulates the feeling of the projectile actually flying past the user. In addition, the speaker on that side may synchronously apply a sound imitating what the projectile would produce when flying past the user's head. The vibration actuators and/or speakers may be mounted on the eyepiece in a manner that provides a 3D vibration-audio experience to the user, to enhance the experience provided by visually displayed content, such as 3D visual display content. In this way, the user may be enveloped in a more fully perceived virtual 3D environment. In embodiments, the disclosure may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, an integrated image source adapted to introduce the content to the optical assembly, and a processing facility adapted to manage the functions of the eyepiece, where the head-mounted eyepiece has a structure including a frame through which the user views the surrounding environment, left and right temples for supporting the frame on the user's head, and a vibration actuator in each of the left and right temples, each vibration actuator independently responsive to vibration commands from the processing facility. A vibration command may initiate a vibration in one of the vibration actuators in response to a virtual projectile presented as part of the displayed content, a virtual explosion, a message indication, a visual cue, a warning, and the like. The displayed content may be provided as part of a simulation the user is participating in, a gaming application, a utility application, and the like. The application invoking the vibration command may run locally on the eyepiece, or be run partially or entirely by an external platform, where the eyepiece has a communications interconnection with the external platform. In addition, the eyepiece may include integrated speakers as described herein, such as in each of the left and right temples, where a vibration command that initiates a vibration in one of the vibration actuators is synchronized with an audible command that, upon receipt of the vibration command, initiates a sound in the speaker in the same-side temple.
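The same-side, synchronized vibration-plus-audio behavior described above might be organized as a simple event dispatcher, sketched below under invented names. The content event format, intensity model, and device identifiers are illustrative assumptions only.

```python
def dispatch_event(event):
    """Map a content event to per-temple actuator commands.

    event: {"x": lateral position (negative = left of the user's head),
            "sound": sound asset to play}.
    Returns commands for the vibrator and speaker on the matching side,
    issued together so the vibration and sound stay synchronized.
    """
    side = "left" if event["x"] < 0 else "right"
    # Nearer events feel/sound stronger (toy falloff model).
    intensity = min(1.0, 1.0 / max(abs(event["x"]), 1.0))
    return [
        {"device": f"{side}_vibrator", "cmd": "vibrate", "level": intensity},
        {"device": f"{side}_speaker", "cmd": "play",
         "sound": event["sound"], "gain": intensity},
    ]

for c in dispatch_event({"x": -2.0, "sound": "whoosh"}):
    print(c["device"], c["cmd"])
# -> left_vibrator vibrate
# -> left_speaker play
```

Pairing the two commands in one dispatch call is one way to keep the haptic and audio channels aligned in time.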
In embodiments, the eyepiece may provide aspects of signals intelligence (SIGINT), such as using communications signals already in use (e.g., WiFi, 3G, Bluetooth) to collect signals intelligence about devices and users in the vicinity of the eyepiece wearer. These signals may come from other eyepieces, such as for collecting information about other known friendly users; from other eyepieces picked up by unauthorized individuals, such as through signals generated when an unauthorized user attempts to use the eyepiece; from other communications devices (such as radios, cellular phones, pagers, walkie-talkies, and the like); from electrical signals from devices that may not be directly used for communications; and the like. The information collected by the eyepiece may be directional information, location information, motion information, the quantity and/or rate of communications, and the like. In addition, information may be collected through the coordinated operation of multiple eyepieces, such as in triangulating a signal to determine the position of its source.
With reference to Figure 15D, in embodiments the user of the eyepiece 1502D may be able to use multiple hand/finger points from their hand 1504D to define the field of view (FOV) 1508D of the camera 1510D relative to the see-through view, such as for augmented reality applications. For instance, in the example shown, the user is using their first finger and thumb to adjust the FOV 1508D of the eyepiece camera 1510D. The user may utilize other combinations to adjust the FOV 1508D, such as combinations of fingers, of fingers and thumb, of fingers and thumbs from both hands, use of the palm(s), cupped hands, and the like. The use of multiple hand/finger points may enable the user to alter the FOV 1508D of the camera 1510D in much the same way as a user of a touch screen, where different points of the hand/fingers establish points of the FOV to establish the desired view. In this instance, however, no physical contact is made between the user's hands and the eyepiece. Here, the camera may be commanded to associate portions of the user's hand(s) with establishing or changing the FOV of the camera. The command may be of any command type described herein, including, but not limited to, hand motions in the FOV of the camera, commands associated with a physical interface on the eyepiece, commands associated with a sensed motion near the eyepiece, commands received from a command interface at some position on the user, and the like. The eyepiece may be able to recognize the finger/hand motion as the command, such as through some repetitive motion. In embodiments, the user may also utilize this technique to adjust some portion of the projected image, where the eyepiece relates the image viewed by the camera to some aspect of the projected image, such as relating the hand/finger points in view to the projected image. For example, the user may be simultaneously viewing the external environment and the projected image, and the user may utilize this technique to change the projected viewing area, region, magnification, and the like. In embodiments, the user may perform a change of the FOV for a plurality of reasons, including zooming in or out from a viewed scene in the live environment, zooming in or out from a viewed portion of the projected image, changing the viewing area allocated to the projected image, changing the perspective view of the environment or the projected image, and the like.
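The touch-screen-like mapping above, where two fingertip points span the desired view, reduces to deriving an axis-aligned rectangle from two detected points. The sketch below illustrates that mapping under assumed normalized coordinates; the minimum-size guard and function name are invented for illustration.

```python
def fov_from_points(p1, p2, min_size=0.05):
    """Axis-aligned FOV rectangle spanned by two fingertip points.

    p1, p2: (x, y) points normalized to [0, 1] in the camera frame.
    Returns the rectangle as top-left corner plus width/height.
    """
    left, right = sorted((p1[0], p2[0]))
    top, bottom = sorted((p1[1], p2[1]))
    # Guard against a degenerate "pinch" smaller than the minimum FOV.
    width = max(right - left, min_size)
    height = max(bottom - top, min_size)
    return {"x": left, "y": top, "w": width, "h": height}

# Thumb at (0.25, 0.25) and first finger at (0.75, 0.75) select the
# central region of the camera frame:
print(fov_from_points((0.25, 0.25), (0.75, 0.75)))
# -> {'x': 0.25, 'y': 0.25, 'w': 0.5, 'h': 0.5}
```

Re-running this as the fingertips are tracked frame to frame would give the continuous, contact-free zoom/pan adjustment described above.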
In embodiments, the eyepiece may enable simultaneous FOVs. For instance, wide, medium, and narrow camera FOVs may be used at the same time, where the user may have different FOVs presented simultaneously in the view (i.e., wide for showing the whole site, which may be static, and narrow for focusing on a specific target, perhaps moving with the eyes or a cursor).
In embodiments, the eyepiece may be able to track the eye through light reflected off the user's eye, to determine where the user is gazing or the motion of the user's eye. This information may then be used to help correlate the user's line of sight relative to the projected image, a camera view, the external environment, and the like, and may be used in control techniques as described herein. For instance, the user may gaze at a location on the projected image and make a selection, such as with an external remote control or with some detected eye movement (such as blinking). In an example of this technique, and referring to Figure 15E, transmitted light 1508E, such as infrared light, may be reflected 1510E from the eye 1504E and sensed at the optical display 502 (such as with a camera or other optical sensor). The information may then be analyzed to extract eye rotation from changes in the reflection. In embodiments, an eye tracking facility may use the corneal reflection and the center of the pupil as features to track over time; use reflections from the front of the cornea and the back of the lens as features to track; image features from inside the eye (such as the retinal blood vessels) and follow these features as the eye rotates; and the like. Alternatively, the eyepiece may use other techniques to track the motions of the eye, such as with components surrounding the eye, components mounted in a contact lens on the eye, and the like. For instance, the user may be provided with a special contact lens with embedded optical components for measuring the motion of the eye, such as a mirror, a magnetic field sensor, and the like. In another instance, electric potentials may be measured and monitored with electrodes placed around the eyes, utilizing the steady electric potential field of the eye as a dipole, such as with its positive pole at the cornea and its negative pole at the retina. In this instance, the electric signal may be derived using contact electrodes placed on the skin around the eye, on the frame of the eyepiece, and the like. If the eye moves from the center position toward the periphery, the retina approaches one electrode while the cornea approaches the opposing one. This change in the orientation of the dipole, and consequently of the electric potential field, results in a change in the measured signal. By analyzing these changes, eye movement can be tracked.
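The electrode-based (electrooculography-style) approach just described relies on the differential voltage varying roughly linearly with gaze angle over a limited range. The sketch below illustrates only that final signal-to-angle step; the sensitivity constant is an invented placeholder, not a clinical or specified value.

```python
# Assumed horizontal EOG sensitivity, microvolts per degree of gaze angle.
# Real sensitivity varies per person and requires per-user calibration.
UV_PER_DEGREE = 16.0

def gaze_angle_deg(v_left_uv, v_right_uv, baseline_uv=0.0):
    """Horizontal gaze angle from a differential horizontal electrode pair.

    Positive angle = gaze toward the right electrode (the cornea, the
    dipole's positive pole, approaches that electrode).
    """
    differential = (v_right_uv - v_left_uv) - baseline_uv
    return differential / UV_PER_DEGREE

# Eye centered: the electrodes see no differential signal.
print(gaze_angle_deg(100.0, 100.0))   # -> 0.0
# Cornea swung toward the right electrode by ~10 degrees:
print(gaze_angle_deg(20.0, 180.0))    # -> 10.0
```

The `baseline_uv` term stands in for the drift compensation a real EOG system would need, since skin-electrode baselines wander over time.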
Another example of how the user's eye gaze direction and associated control may be applied relates to the placement (by the eyepiece) of a visual indicator in the user's peripheral vision, and its optional selection (by the user), such as in order to reduce clutter in the narrow portion of the user's visual field around the gaze direction, where visual input is highest. Since the brain is limited in how much information it can process at once, and the brain pays the most attention to visual content close to the gaze direction, the eyepiece may provide a projected visual indicator in the visual periphery as a cue to the user. In this way, the brain may only need to process the detection of the indicator, rather than the information associated with the indicator, thus reducing the chance of the information overloading the user. The indicator may be an icon, a photo, a color, a symbol, a blinking object, and the like, and may indicate an alert, the arrival of an email, an incoming call, a calendar event, an internal or external processing facility needing attention from the user, and the like. With the visual indicator in the periphery, the user may become aware of it without being distracted by it. The user may then optionally decide to bring up the content associated with the visual cue to see more information, such as by gazing at the visual indicator and, in doing so, opening up its content. For example, an icon indicating an incoming email may indicate that an email has been received. The user may notice the icon and choose to ignore it (where, for instance, the icon disappears if not activated for a period of time, such as by gazing at it or through some other control facility). Alternatively, the user may notice the visual indicator and select to "activate" it by gazing in the direction of the visual indicator. In the case of the email, when the eyepiece detects that the user's eye gaze coincides with the position of the icon, the eyepiece may open the email and display its contents. In this way, the user maintains control over what information they are attending to, and as a result minimizes interruptions and maximizes the usefulness of the content.
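The gaze-to-activate / ignore-to-dismiss behavior above amounts to a small dwell-time state machine over gaze samples. The sketch below illustrates one possible form; the dwell and timeout values, sample rate, and state names are illustrative assumptions.

```python
DWELL_S = 0.8      # continuous gaze time required to activate the indicator
TIMEOUT_S = 10.0   # an unattended indicator disappears after this

def indicator_state(gaze_samples, icon_box, sample_dt=0.1):
    """Classify an indicator as activated, dismissed, or still pending.

    gaze_samples: ordered (x, y) gaze points, one per sample_dt seconds.
    icon_box: (x0, y0, x1, y1) bounds of the peripheral indicator.
    """
    x0, y0, x1, y1 = icon_box
    needed = round(DWELL_S / sample_dt)     # consecutive in-box samples
    timeout_n = round(TIMEOUT_S / sample_dt)
    streak = 0
    for i, (gx, gy) in enumerate(gaze_samples, start=1):
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            streak += 1
            if streak >= needed:
                return "activated"          # open the associated content
        else:
            streak = 0                      # dwell must be continuous
        if i >= timeout_n:
            return "dismissed"              # ignored long enough: remove it
    return "pending"

inside = [(0.9, 0.1)] * 8                   # 0.8 s steadily inside the icon
print(indicator_state(inside, (0.85, 0.0, 1.0, 0.2)))            # -> activated
print(indicator_state([(0.5, 0.5)] * 3, (0.85, 0.0, 1.0, 0.2)))  # -> pending
```

Counting samples rather than summing floating-point durations keeps the dwell comparison exact.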
In embodiments, and in association with certain optical configurations described herein (such as a front-lit LCoS), feedback between two or more displays may ensure that the displays have the same brightness and contrast. In embodiments, a camera in each display may be utilized. The current to the LEDs may be controlled and a color balance obtained, such as by selecting LEDs of similar quality, output, and/or color (such as from similar frequency bins); right and left pulse-width modulation (PWM) values may be provided; and periodic calibration may be performed. In embodiments, a calibration of the power spectrum may be achieved. If the displays are turned down due to high outside illumination, the user may know the calibration of each display. In embodiments, equal brightness, color saturation, color balance, hue, and the like may be created between the two displays. This may prevent the user's brain from ignoring one display. In various embodiments, a feedback system from the displays may be created that allows the user or another person to adjust the brightness and the like, so that each display has a constant and/or consistent brightness, color saturation, balance, hue, and the like. In embodiments, there may be a brightness sensor on each display, which may be a color, RGB, white, or full light sensor, and the like. In embodiments, the sensor may be a power sensor monitoring or checking the power delivered to, or consumed by, the LEDs. The user or another person may adjust one or more displays by increasing or decreasing the power supplied to the LEDs. This may be done during manufacture, and/or may be done during the lifetime of the eyepiece and/or periodically. In embodiments, there may be a dynamic range aspect. As the LEDs and/or power supply gradually dim, there may be a power algorithm that fine-tunes both so that the brightness remains consistent across the displays. In various embodiments, the user and/or the manufacturer or the eyepiece may adjust the LEDs to follow the same brightness curve as the power changes. There may be RGB LEDs, and the LED curves may be matched between the two displays. Accordingly, brightness, color saturation, color balance, hue, and the like may be controlled over a dynamic range. In embodiments, these may be measured and controlled during manufacture, over the dynamic range, during the lifetime of the glasses, and the like. In embodiments, equal brightness, color saturation, color balance, hue, and the like between the two displays may be created in actuality, or may be created based on differences between the user's eyes such that they are perceived as equal by the user. In embodiments, the adjustment of brightness, color saturation, color balance, hue, and the like may be performed by the user, by the manufacturer, and/or may be performed automatically by the eyepiece based on feedback, various programmed algorithms, and the like. In embodiments, sensor feedback may cause automatic and/or manual adjustment of at least one of brightness, color saturation, color balance, hue, and the like.
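The sensor-feedback idea above can be sketched as a simple closed loop that nudges each display's LED PWM duty cycle toward a shared brightness target. This is an illustrative sketch only; the sensor model, efficiency figures, gain, and tolerance are invented, and a real loop would run against hardware sensors rather than a lambda.

```python
def balance_displays(pwm, read_brightness, target, tol=1.0, gain=0.5, steps=50):
    """Drive both displays' PWM duty cycles toward a common brightness.

    pwm: {'left': duty %, 'right': duty %} (mutated and returned).
    read_brightness(side, duty) -> measured brightness in nits.
    """
    for _ in range(steps):
        done = True
        for side in pwm:
            measured = read_brightness(side, pwm[side])
            error = target - measured
            if abs(error) > tol:
                # Proportional correction, clamped to the valid duty range.
                pwm[side] = min(100.0, max(0.0, pwm[side] + gain * error))
                done = False
        if done:
            break
    return pwm

# Toy panels with mismatched efficiency (e.g., LEDs from different bins):
efficiency = {"left": 2.0, "right": 1.6}          # nits per % duty
sensor = lambda side, duty: efficiency[side] * duty

out = balance_displays({"left": 40.0, "right": 40.0}, sensor, target=120.0)
print(round(out["left"]), round(out["right"]))     # -> 60 75
```

Note that the loop converges to different duty cycles per side, which is exactly the point: matched brightness despite mismatched LED bins.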
In embodiments, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, and an integrated image source for introducing the content to the optical assembly, where the optical assembly includes two or more displays, and where at least one of the brightness, color saturation, color balance, and hue of at least one of the displays is adjusted so that at least one of the brightness, color saturation, color balance, and hue of the two or more displays is balanced relative to the others within a predetermined range. In various embodiments, the system may balance brightness, color saturation, color balance, hue, and the like so that at least one of these is within a predetermined range across the two or more displays. In embodiments, the adjustment of at least one of brightness, color saturation, color balance, hue, and the like may be made based on detection of the power delivered to the integrated image source. In various embodiments, the adjustment may be based on a power algorithm, so that at least one of brightness, color saturation, color balance, hue, and the like is consistent between the two or more displays. In further embodiments, the adjustment may be based on feedback from a full light sensor. In embodiments, at least one of brightness, color saturation, color balance, hue, and the like may be adjusted during manufacture, over the dynamic output range produced by the integrated image source, and the like. In embodiments, the system may be adapted to periodically and automatically check, over the life cycle of the eyepiece, at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another. In embodiments, the system may be adapted to automatically check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another, and to selectively set at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays to a predetermined value. In addition, an embodiment of the system may be adapted to automatically check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another and, based on sensor feedback measurements, selectively set at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays to a predetermined value.
In embodiments, and in association with certain optical configurations described herein (such as a front-lit LCoS), the contrast between the two or more displays may be adjusted to be equal, or to be perceived as equal by the user. In embodiments, the contrast may be checked and adjusted accordingly for each display, may be adjusted during the manufacturing process to calibrate and adjust the displays, and may be measured during the manufacturing process, over the dynamic range, during the lifetime of the glasses, and the like. In embodiments, the contrast of the system may be automatically calibrated between the two displays and relative to the outside world. In embodiments, the user may compensate for differences between his eyes. The contrast may be adjusted on demand to compensate for the user's vision and/or sensory deficiencies. In various embodiments, the contrast ratio may vary according to how the optical modules are assembled. As described herein, reducing stray light may be addressed through assembly techniques used to provide a high contrast ratio. In embodiments, various types of single-pixel brightness and/or multi-pixel color detectors may be inserted into the optics train to sample some or all of the light that does not fully enter the eyebox of the display. In embodiments, and depending on where in the optical path the detectors are placed, real-time feedback may be provided to the system to compensate for assembly errors, LED and LCoS panel yields, and binning errors, to compensate for hot and cold panels, and/or to maintain individual user calibrations. In embodiments, the brightness and contrast of the displays may be managed by good manufacturing practices. In addition, during manufacture, quality analysis testing may be performed, and the displays calibrated and compensated as needed. Further, as components wear over the lifetime of the system, or as the system heats and cools during use, the calibrations may be modified using a look-up table of offsets. In embodiments, the adjustment of brightness, color saturation, color balance, hue, contrast, and the like may be performed by the user, by the manufacturer, and/or may be performed automatically by the eyepiece based on feedback, various programmed algorithms, and the like. In embodiments, sensor feedback may cause automatic and/or manual adjustment of at least one of brightness, color saturation, color balance, hue, contrast, and the like.
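The look-up-table correction mentioned above can be sketched as a base contrast setting modified by an offset interpolated from calibration points keyed to panel temperature. The table values below are illustrative placeholders, not real panel data.

```python
import bisect

# (temperature_C, contrast_offset) pairs captured during calibration;
# these numbers are invented for illustration.
OFFSET_LUT = [(-10, 6.0), (0, 3.0), (25, 0.0), (40, -2.0), (60, -5.0)]

def contrast_offset(temp_c):
    """Linearly interpolate the offset for the current panel temperature."""
    temps = [t for t, _ in OFFSET_LUT]
    if temp_c <= temps[0]:
        return OFFSET_LUT[0][1]          # clamp below the table
    if temp_c >= temps[-1]:
        return OFFSET_LUT[-1][1]         # clamp above the table
    i = bisect.bisect_right(temps, temp_c)
    (t0, o0), (t1, o1) = OFFSET_LUT[i - 1], OFFSET_LUT[i]
    return o0 + (o1 - o0) * (temp_c - t0) / (t1 - t0)

def corrected_contrast(base_setting, temp_c):
    return base_setting + contrast_offset(temp_c)

print(corrected_contrast(50.0, 25))    # -> 50.0 (at a calibration point)
print(corrected_contrast(50.0, 32.5))  # -> 49.0 (halfway between 25 and 40)
```

An aging-related table could be layered the same way, keyed to accumulated operating hours instead of temperature.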
In embodiments, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, and an integrated image source for introducing the content to the optical assembly, where the optical assembly includes two or more displays, and where the contrast of at least one of the displays is adjusted so that the contrast of the two or more displays is balanced relative to the others within a predetermined range. In further embodiments, the contrast is adjusted so that it is equal between the two or more displays. In various embodiments, the contrast may be adjusted during the manufacturing process, over the dynamic output range produced by the integrated image source, and the like. In various embodiments, the system may be adapted to periodically and automatically check the contrast of the two or more displays relative to one another over the life of the eyepiece. In embodiments, the system may be adapted to automatically check the contrast of the two or more displays relative to one another, and to selectively set the contrast of the two or more displays to a predetermined value. In embodiments, the system may be adapted to automatically check the contrast of the two or more displays relative to one another and, based on sensor feedback measurements, selectively set the contrast of the two or more displays to a predetermined value. In embodiments, the contrast may be adjusted to compensate for a deficiency of the user. In embodiments, the contrast may be adjusted according to at least one of stray light and the light generated by the integrated image source. In embodiments, the contrast may be adjusted based on feedback from a detector in the optical path of the system. Further, the detector may include at least one of a single-pixel brightness detector and a multi-pixel color detector. In various embodiments, real-time feedback may be provided to the system to compensate for at least one of assembly errors, LED and LCoS panel yields, binning errors, and hot and cold panel compensation, and to maintain individual user calibrations. In embodiments, the calibration of contrast may be adjusted based on a look-up table of one or more offsets.
In one embodiment, particular optical configurations described herein (such as a front-lit LCoS) may allow a camera to be inserted at many positions along the optics train, placing the camera directly on the optical axis. For example, a camera sensor may be placed near the LCoS, such as the camera 10232 in Figure 102B. This in turn allows measurements of the position, diameter, speed, and direction of the pupil, and direct imaging of the iris. These measurements and imaging may be used for secure login or loading user settings, detecting health conditions by measuring capillary size and/or thickness, setting a placeholder/bookmark based on the last viewed area in a book, and the like. The data the camera collects about the various components of the eye may be used to control the user interface, determine stress levels, monitor alertness, detect reactions to external or projected stimuli, and the like. Because the front-lighting optics are sharp and compact, a camera with very small pixels may be placed in the optics train, keeping the overall dimensions of the optics small and ensuring high-resolution images. In embodiments, the camera may be placed in many parts of the optical path by inserting a beam splitter as in Figure 185, but placements that put the camera on the LCoS PCB, embed it directly in the LCoS silicon substrate, or place it elsewhere in the optics train may also be allowed.
In embodiments, when the camera is placed directly on the optical axis, the camera may be able to see or detect the eye, or directly see or detect the inside of the eye. In embodiments, the system may track eye movement, detect pupil dilation, measure the position, diameter, speed, and direction of the pupil, and directly image the iris. In embodiments, the camera may determine whether the user is simply looking around or is controlling the eyepiece. Merely by way of example, the camera may sense an eye movement pattern that causes it to send a signal to track the eye movement, such that it senses a predetermined control command that the user may be executing with his eyes. By way of example, the camera may recognize, based on the pattern of the user's eye movement, that the user's eyes are reading something in the user interface. In such cases, the camera may initiate detection of a certain set of eye commands and send them to the eyepiece to perform a certain function, such as opening an email. In embodiments, the camera may detect that the user is focusing on an object in a predetermined manner for controlling the eyepiece, such as by focusing on something for an extended period of time, or by focusing on something, quickly moving the eyes away, and then focusing on the object again, and the like. As the camera detects movement patterns such as these, it may send signals to the eyepiece to perform a certain function. Merely by way of example, focusing, looking away, and refocusing may cause the camera to signal a "double-click" on the item in the display that the eyepiece user intends. Of course, any such patterns and/or algorithms may be used to control the device through the user's eye movements. In various embodiments, the camera may detect a certain movement pattern, and when that movement is detected while a specific application is in use, the camera may send a specific signal to the eyepiece based on that combination. By way of example, if an email program is open and the user's eyes exhibit a pattern consistent with reading, the camera may signal the eyepiece to open the specific email that the user's eyes are focused on. In embodiments, commands for controlling the eyepiece may be initiated based on the camera's detection.
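The focus / look-away / refocus "double-click" pattern described above can be sketched as a scan over gaze fixations. The thresholds, fixation format, and item identifiers below are invented for illustration and would need tuning against real gaze data.

```python
def detect_double_click(fixations, min_focus=0.3, max_gap=1.0):
    """Detect a focus -> brief look-away -> refocus gesture.

    fixations: ordered (item_id, duration_s) pairs; item_id None means
    the gaze was away from any selectable item.
    Returns the 'double-clicked' item id, or None if no gesture found.
    """
    for i in range(len(fixations) - 2):
        (a, ta), (b, tb), (c, tc) = fixations[i:i + 3]
        if (a is not None and a == c          # same item before and after
                and b is None                 # a glance away in between
                and tb <= max_gap             # ...but only a brief one
                and ta >= min_focus and tc >= min_focus):
            return a
    return None

gaze = [("menu", 0.2), ("email_3", 0.5), (None, 0.4), ("email_3", 0.6)]
print(detect_double_click(gaze))                 # -> email_3
print(detect_double_click([("email_3", 2.0)]))   # -> None
```

The `min_focus` requirement on both fixations is what separates a deliberate gesture from ordinary saccades across the display.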
In embodiments, the camera's detection of the position, diameter, speed, and direction of the pupil, direct imaging of the retina and/or iris, and the like may be contemplated as security measures. By way of example, when the user puts on the eyepiece, the camera may perform a retinal scan, and a database stored on the eyepiece or stored remotely may identify the user from the retinal scan. In embodiments, if the user is identified as the owner of the eyepiece or a permitted user of the glasses, applications may be opened and the user provided with access. If the glasses do not recognize the user, they may lock or prevent all or some functions. In embodiments, the user may not need a password for this; the eyepiece may perform this function automatically. In embodiments, when a user is unrecognized, the camera may obtain identifying information about the wearer, such as in the case where the eyepiece has been stolen.
In embodiments, the eyepiece may perform diagnostics of the user based on detection of eye movement, detection of the position, diameter, speed, and direction of the pupil, direct imaging of the retina and/or iris, and the like. For example, a diagnosis may be based on pupil dilation. For instance, if the user's pupils dilate in a manner consistent with lying, the camera and/or eyepiece may detect that the user is lying. Further, if the user has sustained a concussion, the pupil may change size even though a given amount of light is entering the eye. The eyepiece may alert the user as to whether he has sustained a concussion. In embodiments, soldiers, athletes, and the like may be given eyepieces when exiting physical activity, and the eyepiece may be used, for example, to diagnose the user for a concussion. The eyepiece may have a user database, on board or separate from the eyepiece, that stores information related to each user. In one embodiment, when an athlete comes off the field to the sideline, he may put the glasses on, which perform a retinal scan to identify the user against the database, and then diagnose or check the user by detecting the user's pupil size and comparing it with the pupil size expected under the given lighting conditions. If the user's data falls outside the expected range, the glasses may tell the user that his pupils are consistent with having sustained a concussion. Similar uses may detect possible drug intoxication, detect retinal damage, detect eye conditions, and the like.
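The sideline check described above compares a measured pupil diameter against the diameter expected for the ambient light level. The sketch below illustrates only the comparison logic; the pupil-vs-illuminance model and tolerance are invented placeholders and are not clinical values.

```python
import math

def expected_pupil_mm(lux):
    """Toy model: pupil shrinks roughly logarithmically with illuminance.

    Constants here are illustrative; a real system would use per-user
    baselines from the stored user database.
    """
    lux = max(lux, 0.01)                       # avoid log of zero
    return max(2.0, min(8.0, 7.0 - 1.2 * math.log10(lux)))

def screen_user(measured_mm, lux, tolerance_mm=1.0):
    expected = expected_pupil_mm(lux)
    if abs(measured_mm - expected) > tolerance_mm:
        return "outside expected range - refer for exam"
    return "within expected range"

# Under 100 lux this toy model expects ~4.6 mm:
print(screen_user(4.5, lux=100))   # -> within expected range
print(screen_user(7.5, lux=100))   # -> outside expected range - refer for exam
```

The same comparison pattern could be reused for the drug-intoxication screening mentioned above, with a different expected-response model.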
In embodiments, organic light-emitting diodes (OLEDs) may be used for the micro-display and/or sensor applications herein, and may be used with Fraunhofer systems such as the OLEDCam (OLED camera), or otherwise used for detecting eye movement, or otherwise used with the eyepiece to illuminate the user's eye, and the like. In various embodiments, the facility for detecting eye movement may be placed on the user's optical axis along the optics train. In embodiments, micro-scale optical emitters and receivers may be integrated on the same chip. They may be implemented as bidirectional or unidirectional micro-displays with an array-type structure. In embodiments, the device may present and/or capture images simultaneously. The micro-display may be the basis of a system for personalized information, presenting information to the user and recognizing the interactions the user makes. With an eyepiece equipped with a bidirectional display, the user may perceive the environment as usual while additional information is presented. The visual information may be adapted to the operational context of the system, and the user may interact through movements or actions of the eyes. In embodiments, a CMOS chip may include a micro-display and a camera on one substrate, the central element of the substrate being a nested active matrix made of OLED pixels with photodiodes. In embodiments, the pixel cells may be made of red-green-blue-white and red-green-blue-photodiode pixel cells, and the like.
In embodiments, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated image source adapted to introduce the content to be displayed into the optical assembly, and a camera disposed in the optical assembly along an optical axis so that the camera can view at least a portion of the user's eye. In embodiments, the camera may be adapted to capture images of the eye, pupil, retina, eyelid, and/or eyelashes. In embodiments, a command for controlling the eyepiece may be initiated based on at least one image captured by the camera. In embodiments, a diagnosis of the user may be based on at least one image captured by the camera; an identification of the user may likewise be based on at least one captured image. As an example, the diagnosis may include a diagnosis of concussion. In embodiments of the system, the identification of the user may be deployed as a security feature of the eyepiece. In embodiments, the integrated image source may illuminate the eye while the camera performs image capture. Further, the light from the image source may be modulated during image capture by the camera. In embodiments, the camera may include one or more organic light-emitting diodes (OLEDs). In embodiments, the user's eyes, or the other features listed herein including the iris, pupil, eyelids, eyelashes, and the like, may be illuminated by various lights, LEDs, OLEDs, and the like. In embodiments, the illumination of the user's eye may be used with imaging techniques, for capturing data of the eye, for identification, and the like.
In one embodiment, the system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated image source adapted to introduce the content to be displayed into the optical assembly, and a device for detecting eye movement. In embodiments, the device for detecting eye movement may include micro-scale optical emitters and receivers integrated on the same chip. In embodiments, the device may include a CMOS chip comprising a micro-display and a camera on a single substrate. In embodiments, the device for detecting eye movement may be placed on the user's optical axis along the string of optical elements.
In embodiments, the camera is disposed in the optical assembly along an optical axis so that the camera views at least a portion of the user's eye and can image one or more of the eye, pupil, retina, eyelid, and eyelashes. The integrated processor and camera may be adapted to: track the user's eye movement; measure at least one of pupil dilation, pupil position, pupil diameter, pupil velocity, and pupil direction; distinguish user eye movements intended as control or command from user eye movements used for reading or gazing; use the user's eye movement as a command for the processor to control a function of the integrated processor or of the interactive head-mounted eyepiece; and use the user's eye movement as a command for controlling a device external to the user and to the interactive head-mounted eyepiece. A diagnosis or identification of the user, such as concussion, may be based on at least one image captured by the camera. The identification of the user may be deployed as a security feature of the eyepiece. The system may include a user input interface for controlling or signaling an external device based on the user's eye movement. The camera may be adapted to capture an image of the eye, where the image is compared with a database including other images of eyes to indicate a diagnosis. The optical axis of the integrated image source and the optical axis of the camera may be different, or the optical axis of the integrated image source and at least a portion of the optical axis of the camera may be the same.
In the augmented reality eyepiece, a device such as a camera, micro-scale optical emitters and receivers integrated on the same chip, or a CMOS chip including a micro-display and a camera on a single substrate, may detect the user's eye movement. The integrated image source may be adapted to at least one of: modulating the light from the image source during image capture by the camera, and illuminating the eye. The camera may include one or more organic light-emitting diodes (OLEDs). The device for detecting eye movement may be placed along the string of optical elements on the user's optical axis, or on an axis different from that of the user's eye. The integrated processor may be adapted to interpret the user's eye movement as a command for operating the interactive head-mounted eyepiece or an external device.
A method of detecting a user's eye movement may include wearing a head-mounted eyepiece, the eyepiece including an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor and integrated image source adapted to introduce the content to be displayed into the optical assembly, and a camera; detecting the user's eye movement with the camera and the integrated processor; and controlling a device through the eye movement and the integrated processor, where the camera detects a movement of at least one eye of the user and interprets the movement as a command. The integrated processor may distinguish between an eye movement intended as a command and an eye movement intended as gazing. The method may include interpreting a predetermined eye movement as a command to perform a certain function. The method may include scanning at least one eye of the user to determine the user's identity. The method may include scanning at least one eye of the user to diagnose the user's physical condition. The camera may include at least one organic light-emitting diode (OLED). Specific eye movements may be interpreted as specific commands. The eye movement may be selected from the group consisting of: a blink, repeated blinks, a blink count, a blink rate, an eye open-close (slow blink), gaze tracking, eye movement to a side, up-and-down eye movement, side-to-side eye movement, eye movement through a sequence of positions, eye movement to a specific position, a dwell time in a certain position, a gaze toward a fixed object, and a gaze through a specific portion of the lens of the head-mounted eyepiece. The method may include controlling the device through the eye movement and a user input interface. The method may include displaying to the user a view of the surrounding environment captured with the camera or a second camera.
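The interpretation step above, mapping detected eye events to commands while ignoring ordinary reading movements, can be sketched as a lookup with a simple reading filter. The event names, command names, and the fixation threshold are all illustrative assumptions, not part of the original disclosure.

```python
from typing import Optional

# Hypothetical event -> command table, following the group listed above.
COMMANDS = {
    "blink_double": "select",
    "blink_triple": "open_menu",
    "gaze_left": "pan_left",
    "gaze_right": "pan_right",
    "dwell_2s": "activate",
    "slow_close": "dismiss",
}

def interpret(event: str, fixation_ms: int = 0) -> Optional[str]:
    """Return a command, or None for ordinary reading/viewing movement.

    Short fixations punctuated by saccades look like reading, so the
    processor ignores them rather than treating them as commands.
    """
    if event == "saccade" and fixation_ms < 400:
        return None
    return COMMANDS.get(event)
```

A real implementation would also use dwell position and the gaze-to-display correlation described elsewhere in this document to disambiguate intent.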
In embodiments, the eyepiece may employ subliminal control aspects, such as images of the wearer's surroundings presented at a rate below conscious perception, subliminal perception of a scene the viewer is seeing, and the like. For example, the eyepiece may present images to the wearer at a rate the wearer does not consciously perceive, but which allows the wearer to subconsciously register the presented content, such as reminders, alerts (e.g., asking the wearer to raise their level of attention to something, without being so intrusive that the user requires a fully conscious reminder), indications related to the wearer's immediate environment (e.g., the eyepiece detects something in the wearer's field of view that may be of interest, and the indication draws the wearer's interest toward that object), and the like. In another example, the eyepiece may provide indicators to the wearer through a brain activity monitoring interface, where electrical signals in the brain fire before the individual's consciousness recognizes an image. For example, the brain activity monitoring interface may include electroencephalogram (EEG) sensors that monitor brain activity while the wearer observes the current environment. When the eyepiece senses through the brain activity monitoring interface that the wearer is beginning to 'notice' a certain element of the surroundings, the eyepiece may provide conscious-level feedback to the wearer so that the wearer perceives the element more fully. For example, the wearer may unconsciously begin to perceive a known face seen in a crowd (such as a friend, a suspect, a celebrity), and the eyepiece may provide a visual or audio indication to make the wearer more consciously aware of the individual. In another example, the wearer may view a product that draws their attention at some subconscious level, and the eyepiece may provide the wearer a conscious indication, more information about the product, an enhanced view of the product, a link to more information about the product, and the like. In embodiments, by extending the wearer's reality to the subconscious level, the eyepiece may be able to provide the wearer an augmented reality that goes beyond the wearer's normal conscious experience of the world around them.
In embodiments, the eyepiece may have multiple operating modes, where control of the eyepiece is based in part on the position, shape, motion, and the like of the hand. To provide this control, the eyepiece may use hand recognition algorithms to detect the shape of the hand and fingers, and then associate those hand configurations, possibly in combination with hand motion, with commands. Realistically, since only a limited number of hand configurations and motions may be available for commanding the eyepiece, these hand configurations may need to be reused depending on the eyepiece's operating mode. In embodiments, certain hand configurations or motions may be assigned for transitioning the eyepiece from one mode to the next, thereby allowing hand motions to be reused. For example, and referring to Figure 15F, the user's hand 1504F may be moved into the field of view of the camera on the eyepiece, and the movement may then be interpreted as a different command depending on the mode, such as a circular motion 1508F, a motion across the field of view 1510F, a back-and-forth motion 1512F, and the like. In a simplified example, suppose there are two operating modes: mode one for panning a view of the projected image, and mode two for zooming the projected image. In this example, the user may want to command a right pan with a left-to-right, finger-pointed hand motion. However, the user may also want to command a zoom-in of the image with a left-to-right, finger-pointed hand motion. To allow this dual use of the same hand motion for both command types, the eyepiece may be configured to interpret the hand motion differently depending on the mode the eyepiece is currently in, where specific hand motions have been assigned for mode transitions. For example, a clockwise rotation may indicate a transition from pan to zoom mode, and a counter-clockwise rotation may indicate a transition from zoom to pan mode. This example is intended to be illustrative and not limiting in any way; one skilled in the art will recognize how this general technique may be used to implement a variety of command/mode configurations using the hand and fingers, such as hand-finger configurations and motions, two-handed configurations and motions, and the like.
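The mode-dependent reuse of a single hand motion described above is essentially a small state machine: the same swipe gesture yields a different command depending on the current mode, and rotation gestures switch modes. The gesture and command names below are illustrative stand-ins for the recognizer's actual outputs.

```python
class GestureController:
    """Minimal sketch of the two-mode pan/zoom example above."""

    def __init__(self):
        self.mode = "pan"  # mode one: panning the projected image

    def handle(self, gesture: str) -> str:
        # Rotations are reserved for mode transitions, per the example.
        if gesture == "rotate_cw":
            self.mode = "zoom"
            return "mode -> zoom"
        if gesture == "rotate_ccw":
            self.mode = "pan"
            return "mode -> pan"
        # The same left-to-right swipe is reused across modes.
        if gesture == "swipe_right":
            return "pan right" if self.mode == "pan" else "zoom in"
        return "ignored"
```

The same pattern extends naturally to more modes and to hand-finger configuration gestures by widening the dispatch table.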
In embodiments, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, the optical assembly comprising a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content into the optical assembly; and an integrated camera facility that images gestures, where the integrated processor identifies and interprets the gestures as command instructions. The control instructions may provide manipulation of the content for display, a command communicated to an external device, and the like.
In embodiments, control of the eyepiece may be enabled through eye movement, actions of the eyes, and the like. For example, there may be a camera on the eyepiece that views back to the wearer's eyes, where eye movements or actions may be interpreted as command information, such as through a blink, repeated blinks, a blink count, a blink rate, eye open-close, gaze tracking, eye movement to the side, up-and-down eye movement, side-to-side eye movement, eye movement through a sequence of positions, eye movement to a specific position, a dwell time in a certain position, a gaze toward a fixed object (such as a corner of the lens of the eyepiece), a gaze through a specific portion of the lens, a gaze at a real-world object, and the like. In addition, eye control may enable the viewer to focus on a specified point of the image displayed by the eyepiece, and because the camera may be able to correlate the viewing direction of the eye to a point on the display, the eyepiece may be able to interpret commands through a combination of where the wearer is looking and an action by the wearer (e.g., a blink, touching an interface device, movement of a position-sensing device, etc.). For example, the viewer may be able to look at an object on the display and, as realized through a position-sensing device, select the object with a movement of the finger.
In certain embodiments, the glasses may be equipped with eye-tracking devices for tracking the movement of the user's eye, or preferably both eyes; alternatively, the glasses may be equipped with sensors for six-degree-of-freedom movement tracking, i.e., head movement tracking. These devices or sensors are available from Chronos Vision GmbH of Berlin, Germany, and ISCAN of Woburn, Massachusetts, USA. Retinal scanners may also be used for tracking eye movement. Retinal scanners may also be mounted in the augmented reality glasses and are available from a variety of companies, such as Tobii of Stockholm, Sweden, SMI of Teltow, Germany, and the aforementioned ISCAN.
The augmented reality eyepiece also includes a user input interface, as shown, for allowing the user to control the device. Inputs for controlling the device may include any of the sensors discussed above, and may also include a trackpad, one or more function keys, and any other suitable local or remote device. For example, the eye-tracking device may be used to control another device, such as a video game or an external tracking device. As an example, Figure 29A depicts a user with an augmented reality eyepiece equipped with the eye-tracking device 2900A discussed elsewhere in this document. The eye-tracking device allows the eyepiece to track the direction of the user's eye (or, preferably, eyes), and the movements are sent to the controller of the eyepiece. The control system includes the augmented reality eyepiece and a control device for a weapon. The movements may then be transmitted to the control device for the weapon, the weapon being controlled by the control device, in the user's line of sight. Suitable software then converts the movements of the user's eyes into signals for controlling the movement of the weapon, such as quadrant (range) and azimuth (direction). Additional controls may be used in conjunction with eye tracking, such as the user's trackpad or function keys. The weapon may be a large-caliber weapon such as a howitzer or mortar, or may be a small-caliber weapon such as a machine gun.
Suitable software then converts the movements of the user's eyes into signals for controlling the movement of the weapon, such as the quadrant (range) and azimuth (direction) of the weapon. Additional controls may be used for single or continuous firing of the weapon, such as the user's trackpad or function keys. Alternatively, the weapon may be fixed and non-directional, such as an emplaced mine or a shaped-charge weapon, and may be protected by a safety device, such as by requiring a specifically coded command. The user of the augmented reality device may activate the weapon by sending the appropriate codes and commands, without using the eye-tracking features.
In embodiments, control of the eyepiece may be enabled through gestures of the wearer. For example, the eyepiece may have a camera that views outward (e.g., forward, to the side, down) and interprets the gestures or movements of the wearer's hand as control signals. Hand signals may include passing the hand in front of the camera, hand positions or sign language in front of the camera, pointing to a real-world object (such as to activate an enhancement of the object), and the like. Hand movements may also be used to manipulate objects displayed on the inside of the translucent lens, such as moving an object, rotating an object, deleting an object, opening and closing a screen or window in the image, and the like. Although hand movements were used in the preceding examples, any portion of the body, or an object held or worn by the wearer, may also be used for gesture recognition by the eyepiece.
In embodiments, head-motion control may be used to send commands to the eyepiece, where a motion sensor such as an accelerometer, a gyroscope, or any other sensor described herein may be mounted on the wearer's head, on the eyepiece, in a hat, in a helmet, and the like. Referring to Figure 14A, head motions may include quick motions of the head, such as a forward and/or backward head jerk 1412, an up and/or down head jerk 1410, a nod from side to side, dwelling in a position, such as to the side, moving and holding in position, and the like. Motion sensors may be integrated into the eyepiece, or mounted on the user's head or in a head covering (e.g., a hat, a helmet) connected to the eyepiece by wire or wirelessly. In embodiments, the user may wear the interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content into the optical assembly. At least one of a plurality of head-motion-sensing control devices may be integrated with or associated with the eyepiece, providing control commands to the processor as command instructions based on sensing a predefined head-motion characteristic. The head-motion characteristic may be a nod of the user's head such that the nod is an overt motion dissimilar to ordinary head motions; the overt motion may be a jerking motion of the head. The control instructions may provide manipulation of the content for display, a command communicated to an external device, and the like. Head-motion control may be used in combination with other control mechanisms, such as using another control mechanism as discussed herein to activate a command and using the head motion to execute it. For example, the wearer may want to move an object to the right and, through eye control as discussed herein, select the object and activate head-motion control. Then, by tilting their head to the right, the object may be commanded to move to the right, and the command may be terminated through eye control.
In embodiments, the eyepiece may be controlled through audio, such as through a microphone. Audio signals may include speech recognition, voice recognition, sound recognition, sound detection, and the like. Audio may be detected through a microphone on the eyepiece, a throat microphone, a jawbone microphone, a boom microphone, a headphone, an earbud with microphone, and the like.
In embodiments, command inputs may provide a plurality of control functions, such as turning the eyepiece projector on/off, turning audio on/off, turning the camera on/off, turning augmented reality projection on/off, turning GPS on/off, interacting with the display (e.g., selecting/accepting a displayed function, replaying a captured image or video, etc.), interacting with the real world (e.g., capturing an image or video, turning a page of a displayed book, etc.), performing actions with an embedded or external mobile device (e.g., a mobile phone, navigation device, music device, VoIP, etc.), browser controls for the Internet (e.g., submit, next result), email controls (e.g., read email, display text, text-to-speech, compose, select, etc.), GPS and navigation controls (e.g., save position, recall a saved position, show directions, view a map of a location), and the like. In embodiments, the eyepiece or its components may be turned on and/or off automatically through an indication from a sensor, such as from an IR sensor, an accelerometer, a force sensor, a micro-switch, a capacitive sensor, through eye-tracking detection equipment, and the like. For example, when the user takes the eyepiece off their head, the eyepiece may automatically turn off through a capacitive sensor that senses the eyepiece no longer has physical contact with the user's skin (such as at the bridge of the user's nose). One skilled in the art will recognize other similar configurations for sensing when the eyepiece has been taken off. In embodiments, the eyepiece may sense when a detachable component is attached to or removed from the eyepiece, and may use this sensing to turn aspects of the eyepiece on/off. For example, a portion of the optics may be detachable, and when that optic portion is removed, power to half of the eyepiece system is cut in order to save battery power. The disclosure may include a power management facility, where the power management facility controls the power supplied to selected eyepiece components in response to a sensor. The eyepiece may be mounted in a frame with a nose bridge and foldable temples, where hinges attach the frame to the folding temples, and where sensors may be mounted in the frame's nose bridge, in a temple, in a hinge, and so on. The selected components may be the image source, the processor, and the like. When the user is not wearing the eyepiece, the power control device may be in a sleep mode, where the sleep mode may include periodically reading the sensors, and where, when the power management facility detects that the user is wearing the eyepiece, it transitions to an awake mode and powers the eyepiece. The power management facility may reduce the power supplied to components based on use of the eyepiece's functions, remaining power in the integrated battery, network availability, rate of power consumption, and the like. Reducing power may be based on user preferences. The user may override the power reduction through a command. When power is reduced, an indication may be provided to the user through the eyepiece's user interface. If the brightness level of the image source is lowered due to reduced power to the image source, the electrochromic density in the optical assembly may be increased.
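The sleep/wake and power-shedding behavior described above can be sketched as a small controller driven by the capacitive skin-contact sensor and battery level. The component names, battery threshold, and sensor interface below are illustrative assumptions, not the disclosure's actual design.

```python
class PowerManager:
    """Sketch of the power management facility described above."""

    def __init__(self):
        self.state = "sleep"        # sleep until skin contact detected
        self.components_on = set()

    def on_sensor(self, skin_contact: bool):
        """Called on each periodic sensor read (e.g., nose-bridge
        capacitive sensor): wake on contact, sleep on removal."""
        if skin_contact and self.state == "sleep":
            self.state = "awake"
            self.components_on = {"image_source", "processor", "camera"}
        elif not skin_contact and self.state == "awake":
            self.state = "sleep"
            self.components_on.clear()

    def reduce_power(self, battery_pct: float):
        """Shed power-hungry components as the battery drains; a user
        command could override this per the description above."""
        if self.state == "awake" and battery_pct < 20:
            self.components_on.discard("camera")
```

The periodic-read loop in sleep mode would simply poll the sensor at a low rate and call `on_sensor` with each reading.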
In embodiments, the eyepiece may provide 3D display imaging to the user, such as by conveying stereoscopic display, auto-stereoscopic display, computer-generated holography, volumetric display imaging, stereogram/spatial imaging, view-sequential display, electro-holographic display, parallax 'two-view' display and parallax panoramagrams, re-imaging systems, and the like, to create a perception of 3D depth for the viewer. Display of 3D imagery to the user may employ different images presented to the user's left and right eyes, such as where the left and right optical paths have some optical component that differentiates the images, where a projector facility projects different images to the user's left and right eyes, and the like. The optical path, including the path from the projector facility through the optical path to the user's eye, may include a graphical display device that forms a visual representation of an object in three physical dimensions. A processor, such as the integrated processor in the eyepiece or a processor in an external facility, may provide 3D image processing as at least one step in the generation of the 3D image to the user.
In embodiments, holographic projection technologies may be employed in the presentation of a 3D imaging effect to the user, such as computer-generated holography (CGH), which is a method of digitally generating holographic interference patterns. For example, a holographic image may be projected by a holographic 3D display, such as a display that operates on the basis of interference of coherent light. Computer-generated holograms have the advantage that the objects one wants to show do not have to possess any physical reality at all; that is, they may be completely generated as 'synthetic holograms'. There are a plurality of different methods for calculating the interference pattern for a CGH, including from the fields of holographic information and computational reduction, as well as in computational and quantization techniques. For instance, the Fourier transform method and point-source holograms are two examples of computational techniques. The Fourier transform method may be used to simulate the propagation of each depth plane of the object to the hologram plane, where the reconstruction of the image may occur in the far field. In an example process, there may be two steps, where first the light field in the far observer plane is calculated, and then the field is Fourier-transformed back to the lens plane, where the wavefront to be reconstructed by the hologram is the superposition of the Fourier transforms of each object plane in depth. In another example, a target image may be multiplied by a phase pattern to which an inverse Fourier transform is applied. Intermediate holograms may then be generated by shifting the image products, and combined to create a final set. The final set of holograms may then be approximated to form kinoforms for sequential display to the user, where the kinoform is a phase hologram in which the phase modulation of the object wavefront is recorded as a surface-relief profile. In the point-source hologram method, the object is broken down into self-luminous points, where an elementary hologram is calculated for each point source, and the final hologram is synthesized by superimposing all the elementary holograms.
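The point-source method described above can be sketched numerically: each object point contributes a spherical-wave elementary hologram at each sample of the hologram plane, and the final interference pattern is the superposition of all contributions. This is a toy illustration only; the wavelength, geometry, and the omission of a reference beam are simplifying assumptions.

```python
import cmath
import math

def point_source_hologram(points, plane, wavelength=0.5e-6):
    """Superpose spherical waves from object points onto a hologram plane.

    points: [(x, y, z), ...] self-luminous object points (z > 0, meters).
    plane:  [(x, y), ...] sample positions on the hologram plane at z = 0.
    Returns the intensity |field|^2 at each plane sample.
    """
    k = 2 * math.pi / wavelength  # wavenumber
    intensities = []
    for (hx, hy) in plane:
        field = 0 + 0j
        for (px, py, pz) in points:
            r = math.sqrt((hx - px) ** 2 + (hy - py) ** 2 + pz ** 2)
            field += cmath.exp(1j * k * r) / r  # spherical wave from point
        intensities.append(abs(field) ** 2)
    return intensities
```

In the Fourier-transform method, by contrast, whole depth planes would be propagated at once via an FFT rather than point by point.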
In one embodiment, 3D or holographic imagery may be enabled by a dual projector system, where two projectors are stacked on top of one another for a 3D image output. Holographic projection mode may be entered through a control mechanism described herein or by capture of an image or signal (such as an outstretched hand with palm up, an SKU, an RFID read, etc.). For example, a wearer of the eyepiece may view a letter 'X' on a piece of cardboard, which causes the eyepiece to enter holographic mode and turn on the second, stacked projector. Selecting which hologram to display may be done with a control technique. The projector may project the hologram onto the letter 'X' on the cardboard. Associated software may track the position of the letter 'X' and move the projected image along with the movement of the letter 'X'. In another example, the eyepiece may scan an SKU, such as the SKU on a toy construction kit, and a 3D image of the completed toy may be accessed from an online source or non-volatile memory. Interaction with the hologram (rotating it, zooming in/out, etc.) may be done using the control mechanisms described herein. Scanning may be enabled by associated bar code/SKU scanning software. In another example, a keyboard may be projected in space or onto a surface. The holographic keyboard may be used in, or to control, any of the associated applications/functions.
In embodiments, the eyepiece facility may be used to lock the position of a virtual keyboard relative to a real environmental object (e.g., a table, a wall, a vehicle dashboard, etc.), where the virtual keyboard then does not move as the wearer moves their head. In an example, and referring to Figure 24, the user may be sitting at a table and wearing the eyepiece 2402, and may wish to input text into an application, such as a word processing application, a web browser, a communications application, and the like. The user may be able to generate a virtual keyboard 2408, or other interactive control element (e.g., a virtual mouse, calculator, touch screen), to use for input. The user may provide a command for generating the virtual keyboard 2408 and use a gesture 2404 to indicate the fixed location of the virtual keyboard 2408. The virtual keyboard 2408 may then remain fixed in space relative to the external environment, such as fixed to a location on the table 2410, where the eyepiece facility maintains the location of the virtual keyboard 2408 on the table 2410 even when the user turns their head. That is, the eyepiece 2402 may compensate for the user's head motion in order to keep the user's view of the virtual keyboard 2408 located on the table 2410. In embodiments, the user may wear the interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content into the optical assembly. An integrated camera facility may be provided that images the surrounding environment and identifies a user gesture as an interactive control element location command (e.g., a hand-finger configuration moved in a certain way, positioned in a certain way, etc.). In response to the interactive control element location command, the location of the interactive control element may then remain fixed in position relative to an object in the surrounding environment, regardless of a change in the viewing direction of the user. In this way, the user may be able to utilize the virtual keyboard in much the same way as they would a physical keyboard, where the virtual keyboard remains in the same location. However, in the case of a virtual keyboard, there are no 'physical limitations', such as gravity, to limit where the user may position the keyboard. For instance, the user could stand next to a wall, set the keyboard location on the wall, and so on. One skilled in the art will recognize that the 'virtual keyboard' technique may be applied to any controller, such as a virtual mouse, virtual touchpad, virtual game interface, virtual phone, virtual calculator, virtual paintbrush, virtual drawing pad, and the like. For example, a virtual touchpad may be visualized to the user through the eyepiece, positioned by the user through gestures, and used in place of a physical touchpad.
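The head-motion compensation described above, keeping the keyboard world-fixed on the table as the head turns, reduces to re-rendering the keyboard each frame at the world anchor transformed into current display coordinates. The 2-D, yaw-only model below is a deliberately simplified illustration of that idea, not the full 6-DOF tracking a real system would use.

```python
import math

def world_to_display(obj_xy, head_yaw_rad):
    """Rotate a world-fixed anchor point into head/display coordinates.

    Applying the inverse of the current head yaw each frame makes the
    rendered keyboard appear stationary on the table while the head turns.
    """
    x, y = obj_xy
    c, s = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    return (c * x - s * y, s * x + c * y)
```

With full head tracking, the same idea uses a 4x4 inverse head-pose matrix applied to the keyboard's anchored world pose.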
In embodiments, the eyepiece facility may employ visualization techniques, such as presenting the projection of an object (e.g., a virtual keyboard, keypad, calculator, notepad, joystick, control panel, book) on a surface by applying deformations similar to parallax, keystone distortion, and the like. For example, projecting the keyboard with the proper perspective onto the tabletop in front of the user may be assisted by applying a keystone distortion effect, where the projection provided to the user through the eyepiece is distorted so that it appears to be lying on the surface of the table. Further, these techniques may be applied dynamically, so that the proper perspective is provided even as the user moves around with respect to the surface.
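The keystone pre-distortion mentioned above can be illustrated in its simplest form: rows of the source image nearer the viewer are rendered wider than farther rows, so the projected rectangle reads as lying flat on the surface. A full system would compute a projective homography from the surface pose; this linear row-scaling is only a simplified stand-in, with illustrative scale values.

```python
def keystone_row_scale(row, n_rows, near_scale=1.0, far_scale=0.6):
    """Horizontal scale factor for image row `row` (0 = near edge).

    Interpolates linearly from near_scale to far_scale so the rendered
    image narrows with distance, approximating a flat-on-the-table look.
    """
    t = row / max(1, n_rows - 1)
    return near_scale + t * (far_scale - near_scale)
```

Reapplying the scaling (or the full homography) every frame as the head pose changes gives the dynamic perspective correction described above.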
In embodiments, the eyepiece facility may provide gesture recognition, which may be used to provide a keyboard and mouse experience with the eyepiece. For example, the system may be able to enable a virtual desktop by tracking finger locations in real time, using images of a keyboard, a mouse, and the fingers overlaid on the lower-middle portion of the display. With gesture recognition, tracking may be performed without wires and externally powered equipment. In another example, without wires and external power, fingertip locations may be tracked through gesture recognition performed by the eyepiece, such as with gloves that have a passive RFID chip in each fingertip. In this example, each RFID chip may have its own response characteristics, so that multiple fingers can be read simultaneously. The RFID chips may be paired with the glasses, so that they can be distinguished from other RFID chips operating nearby. The glasses may provide a signal to activate the RFID chips and have two or more receiving antennas. Each receiving antenna may be connected to a phase-measurement circuit element, which in turn provides input to a position-determination algorithm. The position-determination algorithm may also provide velocity and acceleration information, and may ultimately provide keyboard and mouse information to the eyepiece operating system. In embodiments, using two receiving antennas, the angular position of each fingertip may be determined from the phase difference between the receiving antennas. The relative phase differences between RFID chips may then be used to determine the radial positions of the fingertips.
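The two-antenna phase-difference step described above is the classic angle-of-arrival calculation: the phase difference between the antennas corresponds to a path-length difference, which fixes the bearing of the fingertip tag. The idealized formula below assumes far-field geometry, no noise, and no phase ambiguity, which are simplifying assumptions over the disclosure's description.

```python
import math

def bearing_from_phase(delta_phase_rad, antenna_sep_m, wavelength_m):
    """Angle of arrival (radians) from the inter-antenna phase difference.

    Path difference = (delta_phi / 2*pi) * wavelength = d * sin(theta),
    so theta = asin(delta_phi * wavelength / (2*pi * d)).
    """
    sin_theta = (delta_phase_rad * wavelength_m) / (2 * math.pi * antenna_sep_m)
    sin_theta = max(-1.0, min(1.0, sin_theta))  # guard against noise overshoot
    return math.asin(sin_theta)
```

Combining this bearing with the inter-chip relative phases (for radial position) yields the 2-D fingertip positions that feed the keyboard/mouse algorithm described above.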
In embodiments, the eyepiece may use visualization techniques to present a previously obtained medical scan (such as an X-ray, ultrasound, MRI, or PET scan) projected onto the body of the wearer. For example, and referring to Figure 24A, the eyepiece may access an X-ray image taken of the wearer's hand. The eyepiece may then use its integrated camera to view the wearer's hand 2402A and superimpose the projected X-ray image 2404A on the hand. Further, the eyepiece may be able to maintain the superposition as the wearer moves the hand, keeping the image and the hand registered relative to one another. In embodiments, the technique may also be applied while the wearer is looking in a mirror, where the eyepiece superimposes an image on the reflection. The technique may be used as part of a diagnostic procedure, for physical rehabilitation, for encouraging exercise and diet during treatment, for explaining a diagnosis or condition to a patient, and the like. The image may be an image of the wearer, a generic image from a database of medical-condition images, and the like. A generic overlay may show internal problems typical of a certain type of physical condition, a projection of how the body would appear after following a particular routine for a period of time, and the like. In embodiments, an external control device, such as a pointer controller, may allow manipulation of the image. Further, the image superposition may be synchronized among multiple people, each wearing an eyepiece as described herein. For example, a patient and a physician may both have an image projected onto the patient's hand, where the physician can explain an ailment while the patient views the synchronized projection of the scan together with the physician's explanation.
In embodiments, the eyepiece may be used to remove portions of a projected virtual keyboard where an intervening obstruction is present (such as where the user's hand is in the way and the keyboard is not intended to be projected onto the hand). In one example, and referring to Figure 30, the eyepiece 3002 may provide the wearer with a projected virtual keyboard 3008, such as on a tabletop. The wearer may then reach "over" the virtual keyboard 3008 to type on it. Since the keyboard is only a projected virtual keyboard rather than a physical keyboard, without some compensation to the projected image the virtual keyboard would be projected "onto" the back of the user's hand. However, as in this example, the eyepiece may compensate the projected image so that the portion of the virtual keyboard's intended projection on the table that is obstructed by the wearer's hand 3004 is removed from the projection. That is, it may be undesirable for some portions of the keyboard projection 3008 to be visualized on the user's hand, so the eyepiece subtracts the portion of the virtual keyboard projection that is co-located with the wearer's hand 3004. In embodiments, a user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The displayed content may include an interactive control element (such as a virtual keyboard, virtual mouse, calculator, or touch screen). An integrated camera facility may image a body part of the user as it interacts with the interactive control element, where the processor, based on the user's view, removes a portion of the interactive control element by subtracting the portion determined to be co-located with the imaged body part. In embodiments, this technique of removing portions of a projected image may be applied to other projected images and obstructions, and is not meant to be limited to this example of a hand over a virtual keyboard.
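The subtraction of the co-located portion of the control element can be sketched as a per-pixel mask operation in display coordinates. This is a minimal sketch: the mask representation and function name are illustrative assumptions, not details from the specification.

```python
def subtract_occlusion(keyboard_mask, hand_mask):
    """Return a render mask for the projected keyboard with any pixels
    co-located with the imaged hand removed.

    Both inputs are 2-D lists of 0/1 values in display coordinates:
    keyboard_mask marks where the virtual keyboard would be drawn, and
    hand_mask marks where the camera detected the wearer's hand.
    """
    return [
        [k & (1 - h) for k, h in zip(krow, hrow)]
        for krow, hrow in zip(keyboard_mask, hand_mask)
    ]
```

Pixels where both masks are set are dropped from the projection, so the keyboard appears to pass "behind" the hand rather than being drawn on it.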
In embodiments, the eyepiece may provide intervening-obstacle handling for any virtual content displayed against "real" world content. If a frame of reference is determined to be placed at a certain distance, any object that passes between the virtual image and the viewer may be removed from the displayed content, so as not to interrupt a user viewing information intended to be displayed at a specific distance. In embodiments, various adjustable-focus techniques may also be used to increase the perceived distance to the viewed content.
In embodiments, the eyepiece may be used to determine intended text input from a series of character contacts swiped across a virtual keyboard, such as with a finger, a stylus, the whole hand, and the like. For example, and referring to Figure 37, the eyepiece may project a virtual keyboard 3700, where the user wishes to enter the word "wind". Normally, the user would discretely press the key positions corresponding to "w", then "i", then "n", and finally "d", and a facility associated with the eyepiece (a camera, accelerometer, and the like, as described herein) would interpret each position as the letter at that position. However, the system may also be able to monitor the movement, or swipe, of the user's finger or other pointing device across the virtual keyboard and determine a best-fit match for the pointer's movement. In the figure, the pointer starts at the character "w" and sweeps a path 3704 through the characters e, r, t, y, u, i, k, n, b, v, f, and d, stopping at d. The eyepiece may observe this sequence, such as by determining it through an input-path analyzer, feed the sensed sequence to a word-matching search facility, and output a best-fit word, in this case "wind" as text 3708. In embodiments, the eyepiece may monitor the movement of the pointing device across the keyboard and determine words more directly, such as through automatic whole-word matching, pattern recognition, object recognition, and the like, where a "delimiter" indicates the space between words, such as a pause in the motion of the pointing device, a tap of the pointing device, a circular motion of the pointing device, and the like. For example, an entire swipe path may be used with a pattern- or object-recognition algorithm to associate an entire word with the motion of the user's finger, with each word formed character by character as a discrete pattern and with pauses between motions serving as delimiters between words. The eyepiece may provide the best-fit word, a list of best-fit words, and the like. In embodiments, a user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The displayed image may include an interactive keyboard control element (such as a virtual keyboard, calculator, or touch screen), where the keyboard control element is associated with an input-path analyzer, a word-matching search facility, and a keyboard input interface. The user may input text by sliding a pointing device (such as a finger or stylus) across the character keys of the keyboard input interface in an approximate sequence of the word the user wishes to input, where the input-path analyzer determines the characters contacted along the input path, the word-matching facility finds a best word match for the contacted character string, and the best word match is entered as input text. In embodiments, the referenced display content may be something other than a keyboard, such as a sketch pad for handwritten text, a joystick pad or other interface for controlling a game or a real robot or aircraft, and the like. Another example may be a virtual drum set, such as colored pads that make sounds as the user "taps" them. The eyepiece's ability to interpret patterns of movement across a surface allows projected reference content to give the user something to point to, and allows visual and/or audible feedback to be provided to the user. In embodiments, the "movement" detected by the eyepiece may be the movement of the user's eyes as the user views a surface. For example, the eyepiece may have a facility for tracking the movement of the user's eyes, and by knowing both the display position of the projected virtual keyboard and the gaze direction of the user's eyes, the eyepiece may be able to detect the user's gaze moving across the keyboard and then interpret the movement as text as described herein.
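The input-path analysis and word matching described above can be sketched as follows, treating a candidate word as a best fit when its letters appear in order along the swept key sequence and it shares the path's first and last keys. This is a toy model, the tie-breaking and dictionary are illustrative assumptions; a real matcher would weight key geometry and word frequency.

```python
def is_subsequence(word, path):
    """True if the letters of `word` appear in order within `path`."""
    it = iter(path)
    return all(ch in it for ch in word)  # `in` consumes the iterator

def best_fit_word(path, dictionary):
    """Pick the dictionary word that best fits the swept key sequence.

    A candidate must start and end on the path's first and last keys
    and appear in order along the path; prefer the longest candidate.
    """
    candidates = [
        w for w in dictionary
        if w and w[0] == path[0] and w[-1] == path[-1]
        and is_subsequence(w, path)
    ]
    return max(candidates, key=len) if candidates else None
```

For the Figure 37 example, a path of "wertyuiknbvfd" matches "wind" rather than "wed", since both fit the path but "wind" covers more of it.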
In embodiments, the eyepiece may provide the ability to command the eyepiece through "air writing" gestures, such as where the wearer draws out letters, words, and the like in the air within the field of view of the eyepiece's embedded camera using a finger, and the eyepiece interprets the finger motions as letters, words, or symbols for commanding, signing, writing, emailing, texting, and the like. For example, the wearer may use the technique to sign a document with an "air signature". The wearer may use the technique to write text, such as in an email, a text message, a document, and the like. The eyepiece may recognize symbols made through hand motion as control commands. In embodiments, as described herein, air writing may be implemented through gesture recognition interpreted from images captured by the eyepiece camera, or through other input control devices, such as an inertial measurement unit (IMU) in a device mounted on the user's finger or hand.
In embodiments, the eyepiece may be used to present displayed content corresponding to an identified marker that indicates an intent to display the content. That is, the eyepiece may be commanded to display specific content based on sensing a predetermined external visual cue. The visual cue may be an image, an icon, a photo, a recognized face, a hand configuration, a body configuration, and the like. The displayed content may be an interface device brought up for use, navigation to help the user find a location upon reaching a certain travel destination, an advertisement when the eyepiece views a target image, an informative profile, and the like. In embodiments, visual marker cues and their associated content for display may be stored in memory on the eyepiece, stored in an external computer storage facility and imported as needed (such as by geographic location, proximity to a trigger target, or user command), generated by a third party, and the like. In embodiments, a user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. An integrated camera facility may be provided that images an external visual cue, where the integrated processor identifies the external visual cue and interprets it as a command to display content associated with the cue. Referring to Figure 38, in embodiments the visual cue 3812 may be included on a sign 3814 in the surrounding environment, where the projected content is associated with an advertisement. The sign may be a billboard, and the advertisement may be a personalized advertisement based on the user's preference profile. The visual cue 3802, 3808 may be a hand gesture, and the projected content may be a projected virtual keyboard 3804, 3810. For example, the gesture may be a thumb-and-index-finger gesture 3802 from a first user hand, with the virtual keyboard 3804 projected on the palm of the first user hand, where the user may type on the virtual keyboard with a second user hand. The gesture 3808 may be a thumb-and-index-finger gesture combination of both user hands, with the virtual keyboard 3810 projected between the user's hands as configured by the gesture, where the user is able to type on the virtual keyboard using the thumbs of both hands. Visual cues may provide the wearer of the eyepiece with an automated resource associating a predetermined external visual cue with a desired outcome for the projected content, so that the wearer does not have to search for the cue himself.
In embodiments, the eyepiece may include a visual-recognition language translation facility that provides translation of visually presented content, such as road signs, menus, billboards, shop signs, books, magazines, and the like. The visual-recognition language translation facility may use optical character recognition to identify letters from the content, matching letter strings against words and phrases through a translation database. This capability may be fully contained in the eyepiece, such as through an offline mode, or at least partially contained in an external computing facility, such as on an external server. For example, the user may be in a foreign country, where the wearer of the eyepiece does not understand the signs, menus, and the like, but the eyepiece may provide translations of them. These translations may appear as annotations to the user, may replace the foreign words with the translation (such as on a sign), may be provided to the user as an audio translation, and the like. In this way, the wearer would not have to labor to look up word translations; rather, translations may be presented automatically. In one example, the user of the eyepiece may be an Italian visiting the United States who needs to interpret the many road signs in order to drive safely. Referring to Figure 38A, the Italian user of the eyepiece is looking at a U.S. STOP sign 3802A. In this example, the eyepiece may recognize the letters on the sign, translate the word "stop" to the Italian "arresto", and make the stop sign 3804A appear to read "arresto" rather than "stop". In embodiments, the eyepiece may also provide a simple translation message to the wearer, provide an audio translation, provide the wearer with a translation dictionary, and the like. The disclosure may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, and an integrated image source adapted to introduce the content to the optical assembly; an integrated camera for imaging text viewed in the surrounding environment; and an optical character recognition facility for relating one or more characters from the viewed text to one or more characters of a first language and relating the one or more characters of the first language to one or more characters of a second language, where the integrated image source presents the one or more characters of the second language as displayed content, and where the displayed content is locked in position relative to the one or more characters from the viewed text. The presentation of the one or more characters of the second language may appear as an annotation to the user, with the displayed content placed relative to the original viewed text. The presentation of the one or more characters of the second language may be superimposed over the viewing location of the original viewed text, such as with a presentation that matches the character features of the original viewed text. The viewed text may be located on a sign, a printed document, a book, a road sign, a billboard, a menu, and the like. The optical character recognition facility may be incorporated in the eyepiece, provided external to the eyepiece, or provided through a combination of internal and external facilities. The one or more characters may be a word, a phrase, an alphanumeric string, and the like. The one or more characters of the second language may be saved in an external facility and tagged as available when a second eyepiece views the text, where the tagging may include a geographic location indication, an object identifier, and the like. In addition, the presentation of the one or more characters of the second language may be stored when the text passes out of the view of the eyepiece, so that when the text moves back within the view of the eyepiece it may be recalled for presentation purposes.
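The lookup step of the translation facility, after OCR has produced a letter string, can be sketched as a word-for-word substitution against a translation table. This is a minimal sketch; the table contents, language codes, and function name are illustrative assumptions, and a real facility would handle phrases and grammar via its translation database.

```python
# Hypothetical translation table; a real system would query a translation database.
SIGN_TRANSLATIONS = {
    ("en", "it"): {"stop": "arresto", "exit": "uscita", "yield": "dare precedenza"},
}

def translate_sign_text(ocr_text, src="en", dst="it"):
    """Translate each OCR-recognized word, preserving unknown words and
    the original casing style (e.g., all-caps road signs)."""
    table = SIGN_TRANSLATIONS.get((src, dst), {})
    out = []
    for word in ocr_text.split():
        repl = table.get(word.lower(), word)
        out.append(repl.upper() if word.isupper() else repl)
    return " ".join(out)
```

For the Figure 38A example, an OCR result of "STOP" would be rendered back onto the sign as "ARRESTO".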
In one example, the eyepiece may be used as adaptive glasses, such as for blind users. In embodiments, the results of facial recognition or object identification may be processed into audible results and presented as audio to the wearer of the glasses through associated earbuds/headphones. In other embodiments, the results of facial recognition or object identification may be converted into tactile vibrations in the glasses or an associated controller. In one example, if a person stands in front of the user of the adaptive glasses, the camera may image the person and send the image to the integrated processor for processing by facial recognition software, or send it to facial recognition software on a server or in the cloud. The result of the facial recognition may be presented as written text in the display of the glasses for certain individuals, but for blind or low-vision users the result may be processed into audio. In other examples, object recognition may determine that the user is approaching a curb, a doorway, or another object, and the glasses or controller will audibly or tactilely alert the user. For low-vision users, text on the display may be magnified, or the contrast may be enhanced.
In embodiments, a GPS sensor may be used to determine the position of the user wearing the adaptive display. The GPS sensor may be accessed by a navigation application to audibly notify the user when the user approaches or reaches various points of interest. In embodiments, the user is audibly guided to a destination by the navigation application.
The eyepiece may be useful for various applications and markets. It should be understood that the control mechanisms described herein may be used to control the functions of the applications described herein. The eyepiece may run a single application at a time, or may run multiple applications at a time. Switching between applications may be done with the control mechanisms described herein. The eyepiece may be used in military applications, gaming, image recognition applications, viewing/ordering e-books, GPS navigation (position, direction, speed, and estimated time of arrival (ETA)), mobile TV, athletics (viewing pacing, rank, and game times; receiving coaching), telemedicine, industrial inspection, aviation, shopping, inventory management tracking, firefighting (enabled by VIS/NIR/SWIR sensors that see through smoke, haze, and darkness), outdoor/adventure, custom advertising, and the like. In an embodiment, the eyepiece may be used with e-mail, such as GMAIL in Figure 7, the Internet, web browsing, viewing sports scores, video chat, and the like. In one embodiment, the eyepiece may be used for education/training purposes, such as through the display of step-by-step guides (such as hands-free, wireless maintenance and repair instructions). For example, a video manual and/or instructions may be displayed in the field of view. In one embodiment, the eyepiece may be used in fashion, health, and beauty. For example, potential outfits, hairstyles, or makeup may be projected onto a mirror image of the user. In one embodiment, the eyepiece may be used for business intelligence, meetings, and conferences. For example, a user's name tag may be scanned, their face run quickly through a facial recognition system, or their spoken name searched in a database to obtain biographical information. Scanned name tags, faces, and conversations may be recorded for subsequent review or filtering.
In one embodiment, a "mode" may be entered with the eyepiece, in which certain applications may be available. For example, a consumer version of the eyepiece may have a Tourist mode, Educational mode, Internet mode, TV mode, Gaming mode, Exercise mode, Designer mode, Personal Assistant mode, and the like.
A user of the augmented reality glasses may wish to participate in a video call or video conference while wearing the glasses. Many computers, both desktop and laptop, have integrated cameras to facilitate the use of video calling and conferencing. Typically, software applications are used to integrate the camera with the calling or conferencing features. With the augmented reality glasses providing much of the functionality of laptops and other computing devices, many users may wish to utilize video calling and video conferencing while on the move wearing the augmented reality glasses.
In one embodiment, a video call or video conference application may work with a WiFi connection, or may be part of a 3G or 4G calling network associated with the user's cell phone. The camera for the video call or conference is placed on a device controller, such as a watch or another personal electronic computing device. Placing the video call or conference camera on the augmented reality glasses is not practical, since such placement would provide the user with a view only of themselves, and would not display the other participants in the conference or call. However, the user may choose to use the forward-facing camera to display their surroundings or another individual in the video call.
Figure 32 depicts a typical camera 3200 for use in a video call or conference. Such a camera is typically small, and may be mounted on a watch 3202, as shown in Figure 32, on a cell phone, or on another portable computing device, including a laptop computer. Video calling works by connecting the device controller with the cell phone or other communication device. The devices utilize software compatible with the operating system of the glasses and the communication or computing hardware. In one embodiment, the screen of the augmented reality glasses may display a list of options for making the call, and the user may gesture using a pointing control device, or use any other control technique described herein, to select the video call option on the screen of the augmented reality glasses.
Figure 33 illustrates an embodiment 3300 of a block diagram of a video call camera. The camera incorporates a lens 3302, a CCD/CMOS sensor 3304, an analog-to-digital converter 3306 for the video signal, and an analog-to-digital converter 3314 for the audio signal. A microphone 3312 collects the audio input. Both analog-to-digital converters 3306 and 3314 send their output signals to a signal enhancement module 3308. The signal enhancement module 3308 forwards the enhanced signal, which is a composite of both the video and audio signals, to an interface 3310. The interface 3310 is connected to an IEEE 1394 standard bus interface, along with a control module 3316.
In operation, the video call camera depends on signal capture, which transforms incident light, as well as incident sound, into electrons. For light, this process is performed by the CCD or CMOS chip 3304. The microphone transforms sound into electrical impulses.
The first step in the process of generating an image for a video call is to digitize the image. The CCD or CMOS chip 3304 dissects the image and converts it into pixels. If a pixel has collected many photons, the voltage will be high. If the pixel has collected few photons, the voltage will be low. This voltage is an analog value. During the second step of digitization, the voltage is transformed into a digital value by the analog-to-digital converter 3306, which performs the image processing. At this point, a raw digital image is available.
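The analog-to-digital conversion step can be modeled as a simple quantizer mapping a pixel voltage to a digital code. This is a toy model of an ideal converter; the reference voltage and bit depth below are illustrative assumptions, not values from the specification.

```python
def quantize(voltage, v_ref=3.3, bits=8):
    """Model an ideal A/D converter: map an analog voltage in [0, v_ref]
    to an unsigned integer code of the given bit depth."""
    levels = (1 << bits) - 1          # e.g., 255 for an 8-bit converter
    v = max(0.0, min(v_ref, voltage))  # clamp out-of-range inputs
    return round(v * levels / v_ref)
```

A pixel that collected many photons (high voltage) maps to a code near full scale; a dark pixel maps near zero.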
The audio captured by the microphone 3312 is also transformed into a voltage. This voltage is sent to the analog-to-digital converter 3314, where the analog values are transformed into digital values.
The next step is enhancing the signal so that it may be sent to the viewers of the video call or conference. Signal enhancement includes creating color in the image using a color filter located in front of the CCD or CMOS chip 3304. This filter is red, green, or blue and changes its color from pixel to pixel; in one embodiment it may be a color filter array, or Bayer filter. The raw digital images are then enhanced by the filter to meet aesthetic requirements. The audio data may also be enhanced for a better calling experience.
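The color reconstruction from a Bayer color filter array can be sketched as a toy demosaic that collapses each 2x2 mosaic cell into one RGB pixel. This is a deliberately simplified model under the assumption of an RGGB layout; real demosaicing interpolates a full-resolution color value at every photosite.

```python
def demosaic_2x2(raw):
    """Toy demosaic for an RGGB Bayer mosaic: collapse each 2x2 cell
    (R G / G B) into a single RGB pixel, averaging the two green samples.

    `raw` is a 2-D list of sensor values with even dimensions.
    """
    rgb = []
    for y in range(0, len(raw), 2):
        row = []
        for x in range(0, len(raw[0]), 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb
```

The averaging of the two green sites reflects the Bayer pattern's double weighting of green, which matches the eye's greater sensitivity to green light.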
In the final step before transmission, the image and audio data are compressed and output as a digital video stream, in one embodiment using a digital video camera. If a photo camera is used, single images may be output, and in a further embodiment, voice comments may be appended to the files. The enhancement of the raw digital data takes place away from the camera, and in one embodiment may occur in the device controller or computing device that the augmented reality glasses communicate with during the video call or conference.
Further embodiments may provide portable cameras for use in industry, medicine, astronomy, microscopy, and other fields requiring specialized camera use. These cameras often forgo signal enhancement and output the raw digital image. Such cameras may be mounted on other electronic devices or on the user's hand for ease of use.
The camera interfaces to the augmented reality glasses and the device controller or computing device using an IEEE 1394 interface bus. This interface bus transmits time-critical data, such as video and data whose integrity is critically important, including parameters or files to manipulate data or transfer images.
In addition to the interface bus, protocols define the behavior of the devices associated with the video call or conference. In embodiments, the camera used with the augmented reality glasses may employ one of the following protocols: AV/C, DCAM, or SBP-2.
AV/C is a protocol for Audio Video Control and defines the behavior of digital video devices, including video cameras and video recorders.
DCAM refers to the 1394-based Digital Camera Specification and defines the behavior of cameras that output uncompressed image data without audio.
SBP-2 refers to Serial Bus Protocol and defines the behavior of mass storage devices, such as hard drives or disks. Devices that use the same protocol are able to communicate with each other. Thus, for video calls using the augmented reality glasses, the same protocol may be used by the video camera on the device controller and by the augmented reality glasses. Because the augmented reality glasses, the device controller, and the camera use the same protocol, data may be exchanged among these devices. Files that may be transferred among the devices include image and audio files, image and audio data streams, parameters to control the camera, and the like.
In one embodiment, a user desiring to initiate a video call may select a video call option from a screen presented when the call is initiated. The user selects by making a gesture using a pointing device, or by gesturing to signal the selection of the video call option. The user then positions the camera located on the device controller, watch, or other separable electronic device so that the user's image is captured by the camera. The image is processed through the process described above and is then streamed to the augmented reality glasses and to the other participants for display to the users.
In embodiments, the camera may be mounted on a cell phone, personal digital assistant, watch, pendant, or other small portable device capable of being carried, worn, or mounted. The images or video captured by the camera may be streamed to the eyepiece. For example, when a camera is mounted on a rifle, the wearer may be able to image targets not in the line of sight and wirelessly receive the imagery as a stream of displayed content to the eyepiece.
In embodiments, the disclosure may provide the wearer with GPS-based content reception, as in Figure 6. As noted, the augmented reality glasses of the disclosure may include memory, a global positioning system, a compass or other orienting device, and a camera. GPS-based computer programs available to the wearer may include a number of applications typically available from the Apple Inc. App Store for iPhone use. Comparable versions of these programs are available for other brands of smartphones and may be applied to embodiments of the disclosure. These programs include, for example, SREngine (scene recognition engine), NearestTube, TAT Augmented ID, Yelp, Layar, and TwittARound, as well as other more specialized applications, such as RealSki.
SREngine is a scene recognition engine that is able to identify objects viewed by the user's camera. It is a software engine able to recognize static scenes, such as scenes of architecture, structures, pictures, objects, rooms, and the like. It is then able to automatically apply a virtual "label" to the structures or objects as recognized. For example, the program may be called up by a user of the disclosure when viewing a street scene, such as Figure 6. Using the camera of the augmented reality glasses, the engine will recognize the Fontaines de la Concorde in Paris. The program will then summon a virtual label, shown in Figure 6 as part of the virtual image 618 projected onto the lens 602. The label may be text only, as seen at the bottom of the image 618. Other labels applicable to this scene may include "fountain", "museum", "hotel", or the name of the columned building in the rear. Other programs of this type may include the Wikitude AR Travel Guide, Yelp, and many others.
NearestTube, for example, uses the same technology to direct a user to the closest subway station in London, and other programs may perform the same or similar function in other cities. Layar is another application that uses the camera, a compass or direction, and GPS data to identify the user's location and field of view. With this information, an overlay or label may appear virtually to help orient and guide the user. Yelp and Monocle perform similar functions, but their databases are somewhat more detailed, helping to direct users to restaurants or other service providers in a similar manner.
The user may control the glasses, and call up these functions, using any of the controls described in this patent. For example, the glasses may be equipped with a microphone to pick up voice commands from the user and process them using software contained within a memory of the glasses. The user may then respond to prompts from small speakers or earbuds also contained within the glasses frame. The glasses may also be equipped with a tiny track pad, similar to those found on smartphones. The track pad may allow the user to move a pointer or indicator on the virtual screen within the AR glasses, similar to a touch screen. When the user reaches a desired point on the screen, the user presses the track pad to indicate his or her selection. Thus, the user may call up a program, such as a travel guide, and then find his or her way through several menus, perhaps selecting a country, a city, and then a category. The category selections may include, for example, hotels, shopping, museums, restaurants, and so forth. The user makes his or her selection and is then guided by the AR program. In one embodiment, the glasses also include a GPS locator, and the present country and city provide default locations that may be overridden.
In one embodiment, the eyepiece's object recognition software may process the images received by the eyepiece's forward-facing camera in order to determine what is in the field of view. In other embodiments, the GPS coordinates of the location, as determined by the eyepiece's GPS, may be enough to determine what is in the field of view. In other embodiments, an RFID or other beacon in the environment may broadcast a location. Any one or combination of the above may be used by the eyepiece to identify the location and the identity of what is in the field of view.
When an object is recognized, the resolution with which it is imaged may be increased, or the image or video may be captured with less compression. In addition, the resolution for the other objects in the user's field of view may be lowered, or those objects may be captured at a higher compression rate, in order to reduce the required bandwidth.
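A toy sketch of that bandwidth trade-off follows: detected objects of interest are assigned a high (near-lossless) quality setting while the rest of the scene is compressed harder. The quality numbers and object dictionary shape are assumptions for illustration only.

```python
def choose_capture_settings(objects, roi_quality=95, background_quality=30):
    """Map each detected object id to a JPEG-style quality setting:
    recognized objects of interest get low compression, everything else
    is compressed aggressively to reduce required bandwidth."""
    return {
        obj["id"]: (roi_quality if obj.get("of_interest") else background_quality)
        for obj in objects
    }
```

The returned mapping could then drive a per-region encoder before the frame is transmitted.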
Once determined, content related to points of interest in the field of view may be superimposed on the real-world image, such as social networking content, interactive tours, local information, and the like. Information and content related to movies, local information, weather, restaurants, restaurant availability, local events, local taxis, music, and so on may be accessed by the eyepiece and projected onto the lenses of the eyepiece for the user to view and interact with. For example, as the user looks at the Eiffel Tower, the forward-facing camera may take an image and send it to the eyepiece's associated processor for processing. The object recognition software may determine that the structure in the wearer's field of view is the Eiffel Tower. Alternatively, the GPS coordinates determined by the eyepiece's GPS may be searched in a database to determine that the coordinates match those of the Eiffel Tower. In either case, content may then be found relating to visitor information for the Eiffel Tower, restaurants nearby and in the tower itself, the local weather, local subway information, local hotel information, other nearby tourist attractions, and the like. Interaction with the content may be enabled through the control mechanisms described herein. In one embodiment, GPS-based content reception may be activated when a tourist mode of the eyepiece is entered.
In one embodiment, the eyepiece may be used to view streaming video. For example, videos may be identified through a search by GPS location, through object recognition of an object in the field of view, through a voice search, through a holographic keyboard search, and the like. Continuing the Eiffel Tower example, a video database may be searched using the tower's GPS coordinates, or by the term "Eiffel Tower" once it has been determined that the tower is the structure in the field of view. Search results may include geo-tagged videos or videos otherwise associated with the Eiffel Tower. The videos may be scrolled or flipped through using the control techniques described herein. Videos of interest may be played using the control techniques described herein. A video may be superimposed on the real-world scene, or it may be displayed on the lens out of the line of sight. In one embodiment, the eyepiece may be darkened via the mechanisms described herein to permit higher-contrast viewing. In another example, the eyepiece may be able to use a camera and network connectivity, such as described herein, to provide streaming video conferencing capabilities to the wearer. The streamed video may be video of at least one other video conference participant, a visual presentation, and the like. The streamed video may be automatically uploaded to a video storage location as it is captured, without interaction by the eyepiece user. The streamed video may be uploaded to a physical or virtual storage location, where a virtual storage location may be at a single physical site or a cloud storage location. Streamed video of the video conference may also be modified by the eyepiece, where the modification may be based on sensor input. The sensor input may be visual sensor input or audio sensor input; the visual sensor input may be an image of another participant of the video conference, a visual presentation, and the like, and the audio sensor input may be the speech of a certain participant of the video conference.
In embodiments, the eyepiece may provide an interface for receiving wireless streaming media (such as video, audio, text messaging, phone calls, and calendar alerts) from external devices such as a smartphone, tablet, personal computer, entertainment device, portable audio-video device, home theater system, home entertainment system, another eyepiece, and the like. The wireless streaming media may be connected through any wireless communication system and protocol known in the art, such as Bluetooth, WiFi, a wireless home network, a wireless local area network (WLAN), the Wireless Home Digital Interface (WHDI), cellular mobile telecommunications, and so forth. The eyepiece may also use multiple wireless communication systems, such as one for streaming or transmitting high-data-rate media (e.g., video), one for low-data-rate media (e.g., text messaging), one for command data between the external device and the eyepiece, and so on. For example, high-data-rate video may be streamed through a WiFi DLNA (Digital Living Network Alliance) interface, while Bluetooth is used for low-data-rate applications such as text messaging. In embodiments, the external device may be provided with an application that supports interfacing with the eyepiece. For example, a mobile application may enable users to interface their smartphone with the eyepiece. In embodiments, the external device may be provided with a transmission device for interfacing with the eyepiece. For example, a transmitter dongle may be provided to interface the user's smartphone with the eyepiece. Because streaming media from an external device may place much of the processing burden on the external device, the eyepiece may require less onboard processing capability to accommodate the streaming media. For example, one embodiment of an eyepiece for accommodating streaming media includes an interface for receiving the streaming media, buffering the data, and presenting the streaming media to the user through the optical assembly through which the user views the surrounding environment and the displayed content. That is, an embodiment of an eyepiece for receiving streaming media may be a simplified version of the other eyepiece embodiments described herein, used as a display for the external device. In one example, the user may be able to stream video from their smartphone to such a "simplified" eyepiece. However, those skilled in the art will appreciate that any of the other functions described herein may also be included to create versions of the eyepiece spanning a range of embodiments, from the simplest version that functions only as a display interface for an external device to versions that include the full range of capabilities described herein, with wireless streaming as only one of the many functions and capabilities the eyepiece provides. For example, even in simpler versions of the eyepiece, the control techniques, power-saving techniques, use of one or both displays with streaming media, 3D display modes, and the like described herein may be useful, such as to provide assistance in command modes for streaming media, battery management, selectable media viewing modes for extended battery life, and so on. Alternatively, an ultra-simple version of the eyepiece may provide an embodiment of minimal cost and complexity, such as where the interface between the external device and the eyepiece is a wired interface. For example, an embodiment of the eyepiece may provide a wired interface between the user's smartphone or tablet and the eyepiece, where the processing capability of the eyepiece may be limited to only the processing required to present the streamed media to the optical assembly for viewing on the lenses of the eyepiece.
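The multi-radio arrangement described above, routing each media class over a link suited to its data rate, can be sketched as a simple dispatch table. The link names and the default are assumptions chosen to mirror the WiFi DLNA and Bluetooth examples in the text, not a prescribed protocol stack.

```python
# Hypothetical transport map: high-rate media over WiFi/DLNA,
# low-rate messaging over Bluetooth, command data over a third channel.
TRANSPORTS = {
    "video":   "wifi-dlna",     # high data rate streaming
    "audio":   "wifi-dlna",
    "text":    "bluetooth",     # low data rate messaging
    "command": "bluetooth-le",  # command data between device and eyepiece
}

def route_media(kind):
    """Pick a transport for a media kind, defaulting to plain WiFi."""
    return TRANSPORTS.get(kind, "wifi")
```

A real implementation would also handle link availability and fallback; this sketch shows only the policy decision.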
In other embodiments, an application running on a smartphone may act as a remote input device for the glasses. For example, a user interface such as a keyboard lets the user enter characters via the smartphone, with the application making the phone appear to be a Bluetooth keyboard. Alternatively, the application may simply be a full-screen blank application that forwards touches to a pseudo touch-screen driver running on the glasses, enabling the user to use the smartphone for two-finger pinch zooming and dragging as though those movements were taking place in the actual physical location, while getting tactile feedback on the hand and visual feedback in the glasses. Many common applications that use these types of input gestures can thus work well on the glasses while the user operates the smartphone touch screen. The command information may be accompanied by a visual indicator. For example, so that you know where your finger is when you are controlling the glasses or a glasses application with an external device, a visual indication of the command information, such as a highlighted track of the finger movement, may be displayed in the glasses. The disclosure may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views the surrounding environment and the displayed content; an integrated image source adapted to introduce the content to the optical assembly; an integrated processor; an external device having a physical user interface; and an application that turns the external device into a user interface of the eyepiece operable through the integrated processor, where interactions with the physical interface of the external device are indicated in the displayed content. In embodiments, the external device may be a smartphone, tablet, mobile navigation device, and the like. The physical user interface may be a keypad, touchpad, control interface, and so on. For example, the physical interface may be an iPhone, and the displayed content may show the user's actions on the iPhone keypad as movements on a virtual keypad displayed on the eyepiece, such as a virtual keypad on which highlighted keys, key-press indications, and the like are shown as the user's fingers actually interact with the iPhone's physical keypad. The finger movements may be one of a selection of content and a manipulation of the displayed content. The manipulation may be a multiple-finger movement on the touchpad, such as a two-finger pinch-zoom manipulation that resizes the content displayed on the eyepiece.
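The "blank touch-forwarding app" arrangement above can be sketched as a tiny event relay: the phone emits raw touch events and a pseudo touch-screen driver on the glasses replays them against the virtual screen. The event field names and driver API are assumptions for this sketch; a real driver would also handle multi-touch gestures such as the pinch-zoom described in the text.

```python
class PseudoTouchDriver:
    """Hypothetical driver on the glasses that consumes forwarded touches."""

    def __init__(self):
        self.pointer = (0, 0)   # last known finger position (visual feedback)
        self.taps = []          # recorded selection taps

    def handle(self, event):
        if event["type"] == "move":
            self.pointer = (event["x"], event["y"])
        elif event["type"] == "tap":
            self.taps.append((event["x"], event["y"]))

def forward(phone_events, driver):
    """Relay every touch event from the phone to the glasses driver."""
    for ev in phone_events:
        driver.handle(ev)
    return driver
```

The `pointer` state is what would drive the highlighted finger-track indicator mentioned above.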
As described above, a user of augmented reality may receive content from a great many sources. A visitor or tourist may wish to limit the choices to local merchants or institutions; on the other hand, merchants seeking visitors or tourists may wish to limit their offers or solicitations to people who are visiting their area or location but are not local. Thus, in one embodiment, a visitor or tourist may confine his or her searches to local businesses, that is, those within a particular geographic limit. These limits may be set via GPS criteria or by manually indicating a geographic restriction. For example, a person may require that the sources of streaming content or advertisements be limited to those within a certain radius of the person (a set number of kilometers or miles). Alternatively, the criterion may require that the sources be limited to those within a town or province. These limits may be set by the augmented reality user just as a user of a computer at home or in an office would limit his or her searches using a keyboard and mouse; the inputs of the augmented reality user are simply made by speech, hand movement, or the other modes described in the portions of this disclosure discussing the various controls.
In addition, the content available for a user's selection may be restricted or limited by the type of provider. For example, a user may restrict selections to websites operated by government agencies (.gov) or to those operated by non-profit institutions or organizations (.org). In this way, a tourist or visitor who is more interested in government offices, museums, historical sites, and the like may find his or her choices less cluttered. The person may be better able to make a decision once the available choices have been pared down to a more reasonable number. The ability to quickly cut down the available choices is desirable in urban areas with many choices, such as Paris or Washington, D.C.
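A sketch of the provider-type restriction above is a simple suffix filter on search-result hostnames. The default allowed suffixes mirror the .gov and .org examples in the text.

```python
from urllib.parse import urlparse

def filter_by_tld(urls, allowed=(".gov", ".org")):
    """Keep only results whose hostname ends in an allowed suffix,
    e.g. restricting a tourist's results to government and
    non-profit sites."""
    kept = []
    for url in urls:
        host = urlparse(url).hostname or ""
        if host.endswith(allowed):   # str.endswith accepts a tuple
            kept.append(url)
    return kept
```

The example URLs in the usage below are illustrative only.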
The user controls the glasses in any of the manners or modes described elsewhere in this patent. For example, the user may call up a desired program or application by speech or by indicating a choice on the virtual screen of the augmented reality glasses. The augmented glasses may respond to a trackpad mounted on the frame of the glasses, as described above. Alternatively, the glasses may be responsive to one or more motion or position sensors mounted on the frame. The signals from the sensors are then sent to a microprocessor or microcontroller within the glasses, the glasses also providing any needed signal transducing or processing. Once the program of choice has begun, the user makes selections and enters responses by any of the methods discussed herein, such as signaling "yes" or "no" with a head movement, a hand gesture, a trackpad press, or a voice command.
Meanwhile content provider (i.e. advertiser) may also wish their supply being limited to specific geographical area (such asTheir city scope) in people.Meanwhile advertiser's (may is that museum) may not want that and provide content to localPeople, but may want to touch visitor or stranger.In another example, advertisement can not be presented when the user is at home, but worked as and usedAdvertisement is presented when family travels or is away from home.Augmented reality equipment discussed in this article is preferably provided with GPS ability and telecommunications energyPower, and for realizing the integrated processor based on geographical rule presented for advertisement.Pass through limit for museumIts broadcasting power is made to provide streamed content in finite region will be a simple thing.However, museum may lead toIt crosses internet and content is provided, and its content may can get in the world.In this example, user can be set by augmented realityStandby reception content is opened the door and to be apprised of for visit museum's today.
The user may respond to the content by the augmented reality equivalent of clicking on a link for the museum. The augmented reality equivalent may be a voice indication, a hand or eye movement, or another sensory indication of the user's selection, or the use of an associated body-mounted controller. The museum then receives a cookie indicating the identity of the user, or at least the user's internet service provider (ISP). If the cookie indicates or suggests an internet service provider other than the local providers, the museum server may then respond with an advertisement or offer tailored to visitors. The cookie may also include an indication of a telecommunications link, such as a telephone number. If the telephone number is not a local number, that is an additional clue that the person responding is a visitor. The museum or other institution may then follow up with the content desired or suggested by its marketing department.
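The visitor-detection heuristic above, treating a non-local ISP or a non-local phone prefix as a clue, can be sketched as a small classifier on the server side. The local ISP name, the area-code prefix, and the cookie fields are all made-up illustrative values, not part of the patent's disclosure.

```python
LOCAL_ISPS = {"paris-telecom"}   # hypothetical local provider names
LOCAL_AREA_CODES = ("01",)       # hypothetical local dialing prefixes

def classify_requester(cookie):
    """Label the requester 'local' or 'visitor' from ISP and phone clues."""
    isp_local = cookie.get("isp") in LOCAL_ISPS
    phone = cookie.get("phone", "")
    phone_local = phone.startswith(LOCAL_AREA_CODES)
    return "local" if (isp_local or phone_local) else "visitor"

def respond(cookie):
    """Serve the visitor-targeted offer when the clues suggest a visitor."""
    if classify_requester(cookie) == "visitor":
        return "visitor-special-offer"
    return "standard-page"
```

In practice, either clue alone is weak evidence, as the text notes; a real system would treat them as hints rather than proof.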
Another application of the augmented reality eyepiece exploits the user's ability to control the eyepiece and its tools with minimal use of the hands, using instead voice commands, gestures, or motions. As noted above, a user may call upon the augmented reality eyepiece to retrieve information. This information may already be stored in the memory of the eyepiece, but may instead be located remotely, such as in a database accessible over the internet, or perhaps via an intranet accessible only to the employees of a particular company or organization. The eyepiece may thus be compared to a computer, or to a display screen, that can be viewed and heard at an extremely close range and generally controlled with minimal use of a person's hands.
Applications may thus include providing field data on the spot to a mechanic or electronics technician. The technician may don the glasses when seeking information about a particular structure or a problem encountered, such as when repairing an engine or a power supply. Using voice commands, he or she may then access a database and search within it for specific information, such as manuals or other repair and maintenance documents. The desired information may thus be accessed immediately and applied with minimal effort, allowing the technician to more quickly perform the needed repair or maintenance and to return the equipment to service. For mission-critical equipment, such time savings may also save lives, in addition to saving repair or maintenance costs.
The information imparted may include repair manuals and the like, but may also include the full range of audio-visual information; that is, the eyepiece screen may display to the technician or mechanic a video of how to perform a particular task at the same time the person is attempting to perform it. The augmented reality devices also include telecommunications capabilities, so the technician also has the ability to call on others for assistance if the task involves some complication or unexpected difficulty. This educational aspect of the disclosure is not limited to maintenance and repair, but may be applied to any educational endeavor, such as secondary or post-secondary classes, continuing-education courses or topics, seminars, and the like.
In an embodiment, a WiFi-enabled eyepiece may run a location-based application for geo-location of opted-in users. Users may opt in by logging into the application on their phone and allowing their location to be broadcast, or by enabling geo-location on their own eyepiece. As the wearer of the eyepiece scans people, and thus their opted-in devices, the application may identify opted-in users and send an instruction to the projector to project an augmented reality indicator onto an opted-in user in the wearer's field of view. For example, green rings may be placed around people who have opted in to have their location seen. In another example, yellow rings may indicate people who have opted in but do not meet certain criteria, such as not having a FACEBOOK account, or, if they do have a FACEBOOK account, not having any mutual friends.
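The ring-color logic described above (green for opted-in users who meet the criteria, yellow for opted-in users who do not, nothing for everyone else) can be sketched directly. The profile field names and the mutual-friends criterion are assumptions patterned on the FACEBOOK example in the text.

```python
def ring_color(viewer, target):
    """Choose an AR ring overlay color for a scanned person.

    Returns "green" when the target is opted in and meets the criteria
    (has an account and shares at least one mutual friend with the
    viewer), "yellow" when opted in but criteria unmet, and None when
    the person has not opted in and gets no annotation at all.
    """
    if not target.get("opted_in"):
        return None
    has_account = target.get("has_account", False)
    mutual = set(viewer.get("friends", ())) & set(target.get("friends", ()))
    if has_account and mutual:
        return "green"
    return "yellow"
```

The projector would then draw a ring of the returned color around the person's position in the field of view.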
Certain social networking, career networking, and dating applications may work in cooperation with the location-based application. Software resident on the eyepiece may coordinate data from the networking and dating sites with the location-based application. For example, TwittARound is one such program, which utilizes a mounted camera to detect and label location-stamped tweets from other tweeters nearby. This would enable a person using the present disclosure to locate other nearby Twitter users. Alternatively, users may have to set up their devices to coordinate information from the various networking and dating sites. For example, the wearer of the eyepiece may want to see all E-HARMONY users who are broadcasting their location. If an opted-in user is identified by the eyepiece, an augmented reality indicator may be overlaid on the opted-in user. The indicator may take on a different appearance if the user has something in common with the wearer, if the user has many things in common with the wearer, and so forth. For example, and referring to Figure 16, two people are being viewed by the wearer. Both people are identified as E-HARMONY users by the rings placed around them. However, the woman shown with a solid-line ring has at least one item in common with the wearer, while the woman shown with a dotted-line ring has nothing in common with the wearer. Any available profile information may be accessed and displayed to the user.
In an embodiment, when the wearer points the eyepiece in the direction of a user who has a networking account, such as FACEBOOK, TWITTER, BLIPPY, LINKEDIN, GOOGLE, WIKIPEDIA, and the like, that user's recent posts or profile information may be displayed to the wearer. For example, recent status updates, "tweets" (as described above with respect to TwittARound), "blips" (posts on the shopping-sharing service BLIPPY), and the like may be displayed. In an embodiment, when the wearer points the eyepiece in a target user's direction for some dwell time, and/or a gesture, head, eye, or audio control is activated, this may indicate interest in that user. The target user may then receive an indication of interest on their phone or in their glasses. If the target user had marked the wearer as interesting but was waiting for the wearer to show interest first, an indication of the target user's interest may immediately pop up in the eyepiece. A control mechanism may be used to capture an image and store the target user's information on associated nonvolatile memory or in an online account.
In other applications for social networking, a facial recognition program may be used, such as TAT Augmented ID from TAT-The Astonishing Tribe, a company of Malmö, Sweden. Such a program may be used to identify a person by his or her facial characteristics; the software uses facial recognition to identify an individual. Using other applications, such as photo-identifying software from Flickr, one may then identify the particular person nearby, and one may then download information about that person from social networking sites. This information may include the person's name and the profile the person has made available on sites such as Facebook, Twitter, and the like. This application may be used to refresh a user's memory of someone, to identify a person nearby, and to gather information about that person.
In other applications for social networking, the wearer may be able to utilize the location-based facilities of the eyepiece to leave notes, comments, reviews, and the like, at locations, in association with people, in association with places, in association with products, and so on. For example, a person may be able to post a comment on a place they have visited, where the posting may then be made available to others through the social network. In another example, a person may be able to post a comment at the location of the place, such that the comment is available to another person when that person comes to the location. In this way, a wearer may be able to access comments left by others whenever they come to a location. For instance, a wearer may come to the entrance of a restaurant and be able to access reviews of the restaurant, such as reviews sorted by some criterion (e.g., the most recent reviews, the age of the reviewer, and the like).
The user may initiate the desired program by speech, by making a choice from a virtual touch screen as described above, by using a trackpad to select the desired program, or by any of the control techniques described herein. Menu selections may then be made in a similar or complementary fashion. Sensors or input devices mounted in convenient locations on the user's body may also be used, such as sensors and a trackpad mounted on a wrist pad or on a glove, or even a separate device, perhaps of the size of a smartphone or a personal digital assistant.
Applications of the present disclosure may provide the wearer with internet access, such as for browsing, searching, shopping, entertainment, and the like, for example through a wireless communications interface to the eyepiece. For instance, the wearer may initiate a web search through a control gesture, such as through a control device worn on some portion of the wearer's body (e.g., on the hand, the head, or the foot), on some component being used by the wearer (e.g., a personal computer, smartphone, or music player), on a piece of furniture near the wearer (e.g., a chair, desk, table, or lamp), and the like, where the image of the web search is projected for viewing by the wearer through the eyepiece. The wearer may then view the search through the eyepiece and control the web interaction through the control device.
In an example, a user may be wearing an embodiment configured as a pair of glasses, with the projected image of an internet web browser provided through the glasses while retaining the ability to simultaneously view at least portions of the surrounding real environment. In this instance, the user may be wearing a motion-sensitive control device on their hand, where the control device may transmit the relative motion of the user's hand to the eyepiece as control motions for web control, much like a mouse in a conventional personal computer configuration. It will be appreciated that the user would thereby be enabled to perform web actions in a manner similar to a conventional personal computer configuration. In this case, the image of the web search is provided through the eyepiece, while control of the selection of actions to carry out the search is provided through motions of the hand. For instance, the overall motion of the hand may move a cursor within the projected image of the web search, a flick of a finger may provide a selection action, and so forth. In this way, through an embodiment connected to the internet, the wearer may be enabled to perform the desired web search, or any other internet-browser-enabled function. In one example, the user may have downloaded the computer programs Yelp or Monocle from the App Store, or a similar product, such as NRU ("near you") from Zagat for locating nearby restaurants or other stores, Google Earth, Wikipedia, and the like. The person may initiate a search, for example, for restaurants, for other goods or service providers (such as hotels or repairmen), or for information. When the desired information is found, locations are displayed, or a distance and direction to a desired location is displayed. The display may take the form of a virtual label co-located with a real-world object in the user's view.
Other applications from Layar (Amsterdam, the Netherlands) include a variety of "layers" tailored to the specific information desired by a user. A layer may include restaurant information, information about a specific company, real estate listings, gas stations, and so on. Using the information provided in a software application, such as a mobile application, together with the user's global positioning system (GPS), information may be presented on the screen of the glasses with tags containing the desired information. Using the haptic controls or other controls discussed elsewhere in this disclosure, a user may pivot or otherwise rotate his or her body and view buildings tagged with virtual tags containing information. If the user seeks restaurants, the screen will display restaurant information, such as name and location. If the user seeks a particular address, virtual tags will appear on buildings in the wearer's field of view. The user may then make selections or choices by voice, by trackpad, by virtual touch screen, and so forth.
Applications of the present disclosure may provide a way for advertisements to be delivered to the wearer. For example, advertisements may be displayed to the viewer through the eyepiece as the viewer goes about his or her day, while browsing the internet, while conducting a web search, while walking through a store, and the like. For instance, the user may be performing a web search, and through the web search the user may be targeted with an advertisement. In this example, the advertisement may be projected in the same space as the projected web search, floating off to the side, or above or below the view angle of the wearer. In another example, advertisements may be triggered for delivery to the eyepiece when some advertisement-providing facility, perhaps one in proximity to the wearer, senses the presence of the eyepiece (such as through a wireless connection, RFID, and the like) and directs the advertisement to the eyepiece. In embodiments, the eyepiece may be used to track advertising interactions, such as the user viewing, or interacting with, a billboard, promotion, advertisement, and the like. For example, the user's behavior with regard to advertisements may be tracked, such as for providing the user with benefits, rewards, and the like. In one example, the user may be paid five dollars in a virtual currency every time the user views a billboard. The eyepiece may provide impression tracking, such as based on the viewing of brand images (e.g., based on time, geography, and the like). As a result, offers may be targeted based on locations and events associated with the eyepiece, such as what the user sees, what the user hears, and what the user interacts with. In embodiments, advertising targeting may be based on historical behavior, such as what the user has interacted with in the past, the user's patterns of interaction, and the like.
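The impression-tracking and reward behavior described above can be sketched as a small ledger. The five-dollar virtual-currency figure comes from the example in the text; the class shape and the timestamp/location fields are assumptions for illustration.

```python
class ImpressionTracker:
    """Toy version of the ad-interaction tracking described above:
    record each billboard view (what, when, where) and credit the
    stated per-view reward in virtual currency."""

    REWARD_PER_VIEW = 5  # virtual dollars, per the text's example

    def __init__(self):
        self.impressions = []  # (billboard_id, timestamp, location) tuples
        self.balance = 0       # accumulated virtual-currency reward

    def record_view(self, billboard_id, timestamp, location):
        self.impressions.append((billboard_id, timestamp, location))
        self.balance += self.REWARD_PER_VIEW
```

The stored impressions would also feed the time- and geography-based brand-image analytics mentioned above.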
For example, the wearer may be window-shopping in Manhattan, where stores are equipped with such advertisement-providing facilities. As the wearer walks by a store, the advertisement-providing facility may trigger the delivery of an advertisement to the wearer based on the known location of the user as determined by an integrated location sensor of the eyepiece, such as GPS. In an embodiment, the location of the user may be further refined by other integrated sensors, such as a magnetometer, to enable hyper-localized augmented reality advertising. For example, a user on the ground floor of a mall may receive specific advertisements if the magnetometer and GPS readings place the user in front of a particular store. When the user goes up one level in the mall, the GPS location may remain the same, but the magnetometer readings may indicate the change in the user's elevation and the user's new placement in front of a different store. In embodiments, personal profile information may be stored such that the advertisement-providing facility is better able to match advertisements to the needs of the wearer, the wearer may provide preferences regarding advertisements, the wearer may block at least some advertisements, and so forth. The wearer may also be able to pass advertisements, and associated discounts, along to friends; the wearer may communicate them directly to friends nearby who are enabled with their own eyepieces, or may communicate them through a wireless internet connection, such as to a social network of friends, by email, by SMS, and the like. The wearer may be connected to facilities and/or infrastructure that enable the communication of advertisements from a sponsor to the wearer; feedback from the wearer to an advertisement facility, the advertisement sponsor, and the like; to other users, such as friends and family members, or to someone in proximity to the wearer; and to a store, either locally at the eyepiece or at a remote site, such as on the internet or on the user's home computer. These interconnectivity facilities may include integrated facilities of the eyepiece that provide the user's location and gaze direction, such as through the use of GPS, 3-axis sensors, a magnetometer, gyros, accelerometers, and the like, for determining the direction, speed, and attitude (e.g., gaze direction) of the wearer. The interconnectivity facilities may provide telecommunications facilities, such as a cellular link, a WiFi/MiFi bridge, and the like. For instance, the wearer may be able to communicate through an available WiFi link, through an integrated MiFi (or any other personal or group cellular link) to the cellular system, and so forth. There may be facilities for the wearer to store advertisements for later use. There may be facilities integrated with the wearer's eyepiece, or located in local computing facilities, that enable the caching of advertisements, such as within a local area, where the cached advertisements may enable the delivery of an advertisement as the wearer nears a location associated with that advertisement. For example, local advertisements may be stored on a server containing geo-located local advertisements and specials, and these advertisements may be delivered to the wearer individually as the wearer approaches a particular location, or a set of advertisements may be delivered to the wearer in bulk when the wearer enters a geographic area associated with the advertisements, so that the advertisements are available when the user comes near a particular location. The geographic location may be a city, a part of a city, a number of blocks, a single block, a street, a portion of a street, a sidewalk, and the like, representing regional, local, and hyper-local areas. Note that the preceding discussion uses the term "advertisement," but those skilled in the art will appreciate that this may also mean an announcement, a broadcast, a circular, a commercial, a sponsored communication, an endorsement, a notice, a promotion, a bulletin, a message, and the like.
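The batch-caching behavior described above, delivering all ads tagged with a geographic area when the wearer enters it, can be sketched as a keyed cache plus an area-entry callback. The area-key scheme and cache layout are assumptions; in a real system the key could represent any of the granularities named in the text (city, block, street, sidewalk).

```python
def ads_for_area(cache, area_id):
    """Look up the cached ads tagged with a geographic area key."""
    return cache.get(area_id, [])

def on_enter_area(cache, area_id, deliver):
    """Batch-deliver the area's cached ads via the given callback when
    the wearer enters the area; returns how many were delivered."""
    delivered = ads_for_area(cache, area_id)
    for ad in delivered:
        deliver(ad)
    return len(delivered)
```

For example, `deliver` could append to the eyepiece's pending-ad queue, making the ads available as the user nears each specific storefront.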
Figures 18-20A depict ways of delivering custom messages to people within a short distance of a facility, such as a retail store, that wishes to send a message. Referring now to Figure 18, embodiments may provide for ways to view custom billboards, such as when the wearer of the eyepiece is walking or driving, with the application described above for searching for providers of goods and services. As depicted in Figure 18, the billboard 1800 shows an exemplary augmented-reality-based advertisement displayed by a seller or service provider. As depicted, the exemplary advertisement may relate to a drink offer provided by a bar; for example, two drinks may be provided for the cost of only one. With such augmented-reality-based advertisements and offers, the attention of the wearer can easily be directed toward the billboard. The billboard may also provide details about the location of the bar, such as the street address, floor number, phone number, and the like. In accordance with other embodiments, several devices other than the eyepiece may be utilized to view the billboard. These devices may include, without limitation, smartphones, IPHONEs, IPADs, windshields, user glasses, helmets, wristwatches, headphones, vehicle mounts, and the like. In accordance with an embodiment, a user (the wearer, in the case where the augmented reality technology is embedded in the eyepiece) may automatically receive the offer or view the scene of the billboard when passing or driving through the road. In accordance with another embodiment, the user may receive the offer or view the scene of the billboard based on his request.
Figure 19 illustrates two exemplary roadside billboards 1900 containing offers and advertisements from sellers or service providers that may be viewed in an augmented reality manner. The augmented advertisement may provide a live and near-to-reality perception to the user or the wearer.
As illustrated in Figure 20, augmented-reality-enabled devices, such as the camera lens provided in the eyepiece, may be utilized to receive and/or view graffiti 2000, posters, drawings, and the like displayed on the roadside or on the top, side, or front of buildings and shops. The roadside billboards and graffiti may have a visual indicator (e.g., a code, a shape) or a wireless indicator that may link the billboard to an advertisement or an advertisement database. When the wearer nears and views the billboard, a projection of the billboard advertisement may then be provided to the wearer. In embodiments, personal profile information may also be stored so that advertisements can better match the needs of the wearer, the wearer may provide preferences for advertisements, the wearer may block at least some of the advertisements, and the like. In embodiments, the eyepiece may have brightness and contrast control over the eyepiece projected area of the billboard so as to improve the readability of the advertisement, such as in a bright outside environment.
In other embodiments, users may post information or messages at a particular location, based on its GPS location or another indicator of location, such as a magnetometer reading. The intended viewer is able to see the message when that viewer is within a certain distance of the location, as explained in connection with Figure 20A. In a first step 2001 of the method of Figure 20A, a user decides the location at which the message is to be received by the persons to whom the message is sent. The message is then posted 2003, to be sent to the appropriate person or persons when a recipient is close to the intended "viewing area." The location of the wearer of the augmented reality eyepiece is continuously updated 2005 by the GPS system that forms a part of the eyepiece. When the GPS system determines that the wearer is within a certain distance (e.g., 10 meters) of the desired viewing area, the message is then sent 2007 to the viewer. In an embodiment, the message may then appear as an e-mail or a text message to the recipient, or, if the recipient is wearing an eyepiece, the message may appear in the eyepiece. Because the message is sent to a person based on the person's location, in a sense the message may be displayed as "graffiti" on a building or feature at or near the specified location. Specific settings may be used to determine whether all passersby of the "viewing area" can see the message, or whether only a specific person or group, or devices with specific identifiers, can see the message. For example, a soldier who has cleared a village may virtually mark a house as cleared by associating a message or identifier with the house, such as a large X marking the location of the house. The soldier may indicate that only other US soldiers may receive the location-based content. When other US soldiers pass the house, they may receive an indication automatically, such as by seeing a virtual "X" on the side of the house (if they have an eyepiece or some other augmented-reality-enabled device), or by receiving a message indicating that the house has been cleared. In another example, content related to security applications, such as alerts, target identification, communications, and the like, may be streamed to the eyepiece.
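Steps 2001-2007 can be sketched as a simple delivery check. The fragment below is an illustrative assumption, not the disclosed method: it uses flat local coordinates in meters for brevity, and the `audience` field stands in for the "only other US soldiers" restriction described above.

```python
def should_deliver(message, wearer, threshold_m=10.0):
    """Deliver a posted message once a permitted viewer is in range.

    message: dict with 'x', 'y' (posting location, meters) and an
             optional 'audience' set restricting who may receive it.
    wearer:  dict with 'x', 'y' (current GPS-derived fix) and 'group'.
    """
    audience = message.get("audience")
    if audience and wearer["group"] not in audience:
        return False  # e.g., only other US soldiers may see the mark
    dx = wearer["x"] - message["x"]
    dy = wearer["y"] - message["y"]
    return (dx * dx + dy * dy) ** 0.5 <= threshold_m
```

In this sketch, the eyepiece's continuously updated GPS fix (step 2005) would be fed in as `wearer`, and a `True` result corresponds to step 2007, i.e., sending the message or rendering the virtual "X."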
Embodiments may provide for a way to view information associated with products, such as in a store. Information may include nutritional information for food products, care instructions for clothing products, technical specifications for consumer electronics products, e-coupons, promotions, price comparisons with other similar products, price comparisons with other stores, and the like. This information may be projected in relative position to the product, such as to the periphery of the wearer's sight, in relation to the store layout, and the like. The product may be identified visually through a SKU, a brand tag, and the like; transmitted by the product packaging, such as through an RFID tag on the product; or transmitted by the store, such as based on the wearer's position in the store, the position relative to products, and the like.
For example, a viewer may be walking through a clothing store, and as they walk they may be provided with information on the clothes on the rack, where the information is provided through RFID tags on the products. In embodiments, the information may be delivered as a list of information, as a graphical representation, as audio and/or video presentations, and the like. In another example, the wearer may be shopping for food, and an advertisement-providing facility may provide information to the wearer in association with products in the wearer's proximity, such as when the wearer picks up a product and views the brand, product name, SKU, and the like. In this way, the wearer may be provided a more informative environment in which to shop effectively.
One embodiment may allow a user to receive or share information about shopping or an urban area through the use of an augmented-reality-enabled device, such as a camera lens fitted in the eyepiece of exemplary sunglasses. These embodiments will use augmented reality (AR) software applications, such as those mentioned above in conjunction with searching for providers of goods and services. In one scenario, the wearer of the eyepiece may walk down a street or through a market for shopping purposes. Further, the user may activate various modes that assist in defining user preferences for a particular scenario or environment. For example, the user may enter a navigation mode, through which the wearer may be guided across streets and through the market for shopping for preferred accessories and products. The mode may be selected, and various directions may be provided by the wearer, through various methods, such as text commands, voice commands, and the like. In an embodiment, the wearer may give a voice command to select the navigation mode, which may result in an augmented display in front of the wearer. The augmented information may depict information pertinent to the locations of the various shops and vendors in the market, the offers being provided in the various shops and by the various vendors, current happy hours, the current date and time, and the like. Various types of options may also be displayed to the wearer. The wearer may scroll the options and walk down the street, guided by the navigation mode. Based on the options provided, the wearer may select the place best suited to his shopping based on, for example, offers and discounts. In embodiments, the eyepiece may provide the capability to search, browse, select, save, and share items for purchase (such as by viewing through the eyepiece), to receive advertisements for items for purchase, and the like. For example, the wearer may search for an item on the Internet and make a purchase (through an application store, an e-commerce application, and the like) without having to make a phone call.
The wearer may give a voice command to navigate toward the place, and the wearer may then be directed toward it. The wearer may also receive advertisements and offers automatically, or on request, regarding current deals, promotions, and events at the place of interest, such as a nearby shopping store. The advertisements, deals, and offers may appear in proximity to the wearer, and options may be displayed for purchasing desired products based on the advertisements, deals, and offers. The wearer may, for example, select a product and purchase it through a Google checkout. A message or an email may appear on the eyepiece, similar to that depicted in Figure 7, with the information that the transaction for the purchase of the product has been completed. A product delivery status/information may also be displayed. The wearer may further convey or alert friends and relatives regarding the offers and events through social networking platforms and may also ask them to join.
In embodiments, the user may wear a head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user may view a surrounding environment and displayed content. The displayed content may comprise one or more local advertisements. The location of the eyepiece may be determined by an integrated location sensor, and the local advertisement may have a relevance to the location of the eyepiece. By way of example, the user's location may be determined via GPS, RFID, manual entry, and the like. Further, the user may be walking by a coffee shop, and based on the user's proximity to the shop, an advertisement similar to the brand signage 1900 depicted in Figure 19 for a fast-food restaurant (or a brand of coffee) may appear in the user's field of view. The user may experience similar types of local advertisements as he or she moves about the surrounding environment.
In other embodiments, the eyepiece may contain a capacitive sensor capable of sensing whether the eyepiece is in contact with human skin. The sensor may be a capacitive sensor, a resistive sensor, an inductive sensor, an EMF sensor, and the like. Such a sensor or group of sensors may be placed on the eyepiece and/or the eyepiece frame in a manner that allows detection of when the glasses are being worn by a user. In other embodiments, sensors may be used to determine whether the eyepiece is in a position such that it may be worn by a user, for example, when the eyepiece is in an unfolded position. Further, local advertisements may be sent only when the eyepiece is in contact with human skin, in a wearable position, a combination of the two, when it is actually worn by the user, and the like. In other embodiments, the local advertisement may be sent in response to the eyepiece being powered on, or in response to the eyepiece being powered on and worn by the user, and the like. By way of example, an advertiser may choose to send local advertisements only when a user is in proximity to a particular establishment and when the user is actually wearing the glasses and the glasses are powered on, thereby allowing the advertiser to target the advertisement to the user at an appropriate time.
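The sensor-gated delivery policies enumerated above can be sketched as a small predicate. This is an illustrative sketch only; the policy names and their combinations are assumptions chosen to mirror the combinations the paragraph lists.

```python
def permit_local_ad(powered_on, skin_contact, wearable_position, policy):
    """Decide whether a local advertisement may be delivered.

    powered_on:        eyepiece power state
    skin_contact:      capacitive/resistive/inductive/EMF sensor result
    wearable_position: e.g., frame unfolded into a wearable position
    policy:            advertiser-selected gating rule:
      'worn'       - require detected skin contact
      'wearable'   - require the eyepiece to be in a wearable position
      'worn+power' - require skin contact AND the eyepiece powered on
    """
    if policy == "worn":
        return skin_contact
    if policy == "wearable":
        return wearable_position
    if policy == "worn+power":
        return skin_contact and powered_on
    return False  # unknown policy: deny delivery by default
```

Under the `'worn+power'` policy, an advertisement reaches the user only when the glasses are both powered on and actually being worn, matching the example of targeting the advertisement at the appropriate time.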
In accordance with other embodiments, the local advertisement may be displayed to the user as a banner, a two-dimensional graphic, text, and the like. Further, the local advertisement may be associated with a physical aspect of the user's view of the surrounding environment. The local advertisement may also be displayed as an augmented reality advertisement, where the advertisement is associated with a physical aspect of the surrounding environment. Such an advertisement may be two- or three-dimensional. By way of example, a local advertisement may be associated with a physical billboard, as further described in Figure 18, where the user's attention may be drawn to displayed content showing a beverage being poured from the billboard 1800 onto an actual building in the surrounding environment. The local advertisement may also contain sound that is presented to the user through earphones, an audio device, or other means. Further, the local advertisement may be animated in embodiments. For example, the user may view the beverage flowing from the billboard onto an adjacent building and, optionally, into the surrounding environment. Similarly, an advertisement may display any other type of motion as desired in the advertisement. Additionally, the local advertisement may be displayed as a three-dimensional object that may be associated with, or interact with, the surrounding environment. In embodiments where the advertisement is associated with an object in the user's view of the surrounding environment, the advertisement may remain associated with, or in proximity to, the object even as the user turns his head. For example, if an advertisement, such as the coffee cup described in Figure 19, is associated with a particular building, the coffee cup advertisement may remain associated with the building and in the appropriate position on the building even as the user turns his head to view another object in his environment.
In other embodiments, local advertisements may be displayed to the user based on a web search conducted by the user, where the advertisement is displayed in the content of the web search results. For example, the user may search for "happy hour" while walking down the street, and a local advertisement for a local bar's beer prices may be displayed in the content of the search results.
Further, the content of the local advertisement may be determined based on the user's personal information. The user's information may be made available to a web application, an advertising facility, and the like. Further, a web application, an advertising facility, or the user's eyepiece may filter the advertising based on the user's personal information. Generally, for example, a user may store personal information about his likes and dislikes, and such information may be used to direct advertising to the user's eyepiece. By way of specific example, the user may store data about his affinity for a local sports team, and as advertisements are made available, those advertisements related to his favorite sports team may be given preference and pushed to the user. Similarly, a user's dislikes may be used to exclude certain advertisements from view. In various embodiments, the advertisements may be cached on a server, where they may be accessed by at least one of an advertising facility, a web application, and the eyepiece, and displayed to the user.
In various embodiments, the user may interact with any type of local advertisement in numerous ways. The user may request additional information related to a local advertisement by making at least one action of an eye movement, a body movement, and other gestures. For example, if an advertisement is displayed to the user, he may wave his hand over the advertisement in his field of view, or move his eyes over the advertisement, in order to select the particular advertisement and receive more information relating to it. Moreover, the user may choose to ignore the advertisement through any movement or control technology described herein (through an eye movement, a body movement, other gestures, and the like). Further, the user may choose to ignore the advertisement by allowing it to be ignored by default, by not selecting the advertisement for further interaction within a given period of time. For example, if the user chooses not to gesture for more information from the advertisement within five seconds of the advertisement being displayed, the advertisement may be ignored by default and disappear from the user's view. Furthermore, the user may choose to not allow local advertisements to be displayed, whereby the user selects such an option on a graphical user interface, or otherwise turns off such a feature via a control on the eyepiece.
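The default-dismissal behavior described above reduces to a small timing rule. The sketch below is illustrative only: timestamps are plain seconds, and the flag names are assumptions standing in for the gesture/eye-movement events the embodiments describe.

```python
def ad_visible(shown_at, now, interacted, dismissed, timeout_s=5.0):
    """Whether an advertisement should remain in the user's view.

    shown_at:   time (seconds) the ad first appeared
    now:        current time (seconds)
    interacted: True if the user selected the ad (gesture, eye movement)
    dismissed:  True if the user explicitly ignored the ad
    timeout_s:  window for default dismissal (five seconds in the text)
    """
    if dismissed:
        return False          # explicit ignore via gesture/control
    if interacted:
        return True           # user requested more information
    return (now - shown_at) < timeout_s  # ignored by default after timeout
```

Thus an ad with no interaction within the five-second window quietly disappears, while a selected ad persists so that additional information can be presented.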
In other embodiments, the eyepiece may include an audio device. Accordingly, the displayed content may comprise a local advertisement and audio, such that the user may also hear a message or other sound effects as they relate to the local advertisement. By way of example, and referring again to Figure 18, while the user sees the beer being poured, he may actually hear an audio transmission corresponding to the actions in the advertisement. In this case, the user may hear the bottle being opened and then the sound of the liquid pouring out of the bottle and onto the rooftop. In yet other embodiments, a descriptive message may be played, and/or general information may be given as part of the advertisement, or both. In embodiments, any audio may be played as desired for the advertisement.
In accordance with another embodiment, social networking may be facilitated with the use of an augmented-reality-enabled device, such as a camera lens fitted in the eyepiece. This may be utilized to connect several users, or other persons who may not have an augmented-reality-enabled device, so that they may share thoughts and ideas with each other. For instance, the wearer of the eyepiece may be sitting on a school campus along with other students. The wearer may connect with, and send a message to, a first student who is present in a coffee shop. The wearer may ask the first student about persons interested in a particular subject, such as environmental economics, for example. As other students pass through the field of view of the wearer, the camera lens fitted inside the eyepiece may track the students and match them with a networking database, such as one containing public profiles (e.g., "Google me"). Profiles of interested and relevant persons from the public database may appear and pop up in front of the wearer on the eyepiece. Some profiles that are not relevant may either be blocked or appear blocked to the user. The relevant profiles may be highlighted for quick reference by the wearer. The relevant profiles selected by the wearer may be interested in the subject of environmental economics, and the wearer may connect with them. Further, they may also be connected with the first student. In this manner, a social network may be established by the wearer with the use of the eyepiece enabled with features of augmented reality. The social networks managed by the wearer, and the conversations therein, may be saved for future reference.
The present invention may be applied in a real estate scenario with the use of augmented-reality-enabled devices, such as a camera lens fitted in the eyepiece. The wearer, in accordance with this embodiment, may want to get information about a place in which the wearer may be present at a particular time, such as while driving, walking, jogging, and the like. The wearer may, for instance, want to know about residential benefits and drawbacks in that place. He may also want detailed information about the facilities in that place. Therefore, the wearer may utilize a map, such as a Google online map, and identify the real estate available there for lease or purchase. As noted above, the user may receive information about real estate for sale or rent using a mobile Internet application such as Layar. In one such application, information about buildings within the user's field of view is projected onto the inside of the glasses for consideration by the user. Options may be displayed to the wearer on the eyepiece lens for scrolling, such as with a track pad mounted on a frame of the glasses. The wearer may select and receive information about a selected option. Augmented-reality-enabled scenes of the selected option may be displayed to the wearer, and the wearer may view pictures and take a facility tour in the virtual environment. The wearer may further receive information about real estate agents and fix an appointment with one of them. An email notification or a call notification may also be received on the eyepiece for confirmation of the appointment. If the wearer finds the selected real estate worthwhile, a deal may be made, and it may be purchased by the wearer.
In accordance with another embodiment, customized and sponsored tours and travels may be enhanced through the use of augmented-reality-enabled devices, such as a camera lens fitted in the eyepiece. For instance, the wearer (as a tourist) may arrive in a city, such as Paris, and want to receive tour and sightseeing information about the place, so as to plan his visit for the consecutive days of his stay. The wearer may put on his eyepiece, or operate any other augmented-reality-enabled device, and give a voice or text command regarding his request. The augmented-reality-enabled eyepiece may locate the wearer's position through geo-sensing techniques and determine the touring preferences of the wearer. The eyepiece may receive and display customized information, based on the request of the wearer, on the screen. The customized tour information may include, without limitation, information about: art galleries and museums, monuments and historical places, shopping complexes, entertainment and nightlife spots, restaurants and bars, the most popular tourist destinations and tour centers/attractions, the most popular local/cultural/regional destinations and attractions, and the like. Based on a user selection of one or more of these categories, the eyepiece may prompt the user with other questions, such as time of stay, investment in the tour, and the like. The wearer may respond through voice commands and in return receive customized tour information in an order selected by the wearer. For example, the wearer may give the art galleries priority over the monuments. Accordingly, the information may be made available to the wearer. Further, a map may also appear in front of the wearer with different sets of tour options, with different priority ranks, such as:
Priority Rank 1: First tour option (Champs-Élysées, Louvre, Rodin, Museum, Famous Café)
Priority Rank 2: Second option
Priority Rank 3: Third option
The first option, for example, may be ranked highest in priority based on the preferences indicated by the wearer, and the wearer may select the first option. Advertisements related to sponsors may pop up right after selection. Subsequently, a virtual tour may begin in an augmented reality manner that is very close to the real environment. The wearer may, for example, take a dedicated 30-second tour of a vacation special at the Atlantis Resort in the Bahamas. The virtual 3D tour may include a quick look at the rooms, beach, public spaces, parks, facilities, and the like. The wearer may also experience the shopping facilities in the area and receive offers and discounts at those places and shops. At the end of the day, the wearer might have experienced a whole day's tour while sitting in his chamber or hotel. Finally, the wearer may decide and schedule his plan accordingly.
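The priority ranking of tour options against the wearer's stated preferences can be sketched as a simple weighted sort. This is an illustrative assumption about one possible ranking scheme, not a disclosed algorithm; the category names and weights are made up for the example.

```python
def rank_tour_options(options, preferences):
    """Order candidate tour options by preference match.

    options:     list of (name, categories) tuples, where categories
                 are labels like 'galleries' or 'monuments'
    preferences: dict mapping a category to a weight; a higher weight
                 (e.g., galleries over monuments) ranks an option earlier
    """
    def score(option):
        _, categories = option
        return sum(preferences.get(c, 0) for c in categories)
    return sorted(options, key=score, reverse=True)
```

With galleries weighted above monuments, as in the wearer's example, a Louvre-centric option would surface as Priority Rank 1 while a monuments walk drops lower in the list.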
Another embodiment may allow information concerning auto repair and maintenance services with the use of augmented-reality-enabled devices, such as a camera lens fitted in the eyepiece. The wearer may receive advertisements related to auto repair shops and dealers by sending a voice command for the request. The request may, for example, include a requirement for an oil change in the vehicle/car. The eyepiece may receive information from the repair shop and display it to the wearer. The eyepiece may pull up a 3D model of the wearer's vehicle and show the amount of oil left in the car through an augmented-reality-enabled scene/view. The eyepiece may also show other relevant information about the wearer's vehicle, such as maintenance requirements in other parts, for example, the brake pads. The wearer may see a 3D view of the worn brake pads and may be interested in getting them repaired or replaced. Accordingly, the wearer may schedule an appointment with a vendor to fix the problem by using the integrated wireless communication capability of the eyepiece. The confirmation may be received via an email or an incoming call alert on the eyepiece camera lens.
In accordance with another embodiment, gift shopping may benefit through the use of augmented-reality-enabled devices, such as a camera lens fitted in the eyepiece. The wearer may post a request for a gift for some occasion through a text or voice command. The eyepiece may prompt the wearer to answer his preferences, such as the type of gift, the age group of the person to receive the gift, the cost range of the gift, and the like. Various options may be presented to the user based on the received preferences. For instance, the options presented to the wearer may be: a cookie basket, a wine and cheese basket, an assortment of chocolates, a golfer's gift basket, and the like.
The available options may be scrolled through by the wearer, and the best-suited option may be selected via a voice command or text command. For example, the wearer may select the golfer's gift basket. A 3D view of the golfer's gift basket, along with a golf course, may appear in front of the wearer. The virtual 3D view of the golfer's gift basket and the golf course, enabled through the augmented display, may be perceived as being very close to the real-world environment. The wearer may finally respond to the address, location, and other similar queries prompted through the eyepiece. A confirmation may then be received via an email or an incoming call alert on the eyepiece camera lens.
In embodiments, the eyewear platform may be used in conjunction with various control mechanisms to take physical and informational inputs, execute processing functions, and control panels, surfaces, and systems (including those based on feedback loops), in order to interact with content and execute e-commerce transactions. While such e-commerce and content scenarios are numerous, some such scenarios include, without limitation, retail shopping environments, education environments, transportation environments, home environments, event environments, dining environments, and outdoor environments. While these areas are described herein, various other scenarios will be apparent to those skilled in the art.
In embodiments, the eyewear platform may be used in a retail shopping environment. For example, a user may use the glasses to receive content relating to items of interest and/or the environment. The user may receive and/or search for pricing information or alternative offers, product information (SKU/bar codes), ratings, advertisements, GroupOn offers, and the like, in the retail shopping environment. In various embodiments, the user may find or obtain location information for a particular item. The user may also obtain information relating to loyalty program information associated with a particular brand, item, and/or shopping environment. Further, the user may use glasses equipped with a camera, scanner, QR reader, and the like, to scan an item into a shopping basket. In addition, the user may use the eyepiece to detect the best item in a group of items. By way of example, the user may activate a feature of the glasses to visualize the items in a particular mode, such as using a program to determine or sense the density or thickness of an item to find the best one of the bunch. In embodiments, the user may use the glasses to negotiate the price of an item or to offer the price he prefers. For example, after scanning the item, virtually, with a scanner associated with the glasses, or otherwise, the user may make a gesture, move his eyes, use a voice command, or use another means to offer the price he will pay for the item. The user may further use the glasses to order the scanned item and then pay by displaying or providing a method of payment via the user interface. Such payment may be indicated by hand gestures, eye movements, and the like, as described herein. Similarly, the user may redeem "points" or rewards, such as through GroupOn, during her shopping trip, and receive promotions related to a particular item and/or establishment. Further, the user may use the glasses for image recognition, such that an item is recognized in an interface and an order is placed for the item. For example, a program employed by the glasses may allow the user to use the glasses to recognize a watch in a store window, such that when the item is recognized, a menu for ordering the item is triggered in the interface. In other embodiments, information may be entered into the eyewear platform by scanning a bar code, QR code, product tag, and the like. As the user moves about the retail environment, or while he is engaged with a retail interface using the glasses, promotional information (deals, signage, advertisements, coupons, and the like) may be scanned, or otherwise received or recognized, by the glasses. The user may scan a loyalty card with the glasses for use in a transaction, or otherwise enter such information for use during a retail transaction. In embodiments, the glasses may assist with navigation and guidance. For example, the user may be presented with a detailed map of the store and may be provided with aisle content markers, thereby allowing the user to better navigate to items and to better navigate the retail environment. The user may capture product images, or download product images, from the actual environment, such that the images may be used to purchase the item, create notes for the item, generate or receive ratings, reviews, and product information for the item, and the like. Further, application of item images and the geographic location of the glasses may allow the user to receive the closest location of an item, local reviews of the item, and the like. In embodiments, the geographic location of the user may allow particular item images to be generated or more appropriately recognized.
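Detecting the "best item in a group of items" admits many criteria (the text mentions sensed density or thickness). Purely for illustration, and not as a disclosed method, the sketch below substitutes a simpler stand-in criterion — lowest price per unit among items the catalog recognizes by SKU/bar code; the field names are assumptions.

```python
def best_item(scanned_skus, catalog):
    """Pick the 'best' of several scanned items.

    scanned_skus: SKU/bar-code strings read by the glasses' scanner
    catalog:      dict mapping SKU -> {'price': float, 'units': float}
    Here 'best' is taken, for illustration only, as lowest price per
    unit; a density or thickness score could be substituted the same way.
    """
    known = [sku for sku in scanned_skus if sku in catalog]
    if not known:
        return None  # nothing the catalog recognizes
    return min(known, key=lambda sku: catalog[sku]["price"] / catalog[sku]["units"])
```

The same shape of comparison could drive the price-comparison and negotiation flows described above, with the winning SKU feeding the ordering menu triggered in the interface.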
As a more specific example, the system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes a module for determining that the eyepiece is proximate to a retail environment, and the system may include an optical assembly through which the user views the surrounding retail environment, a 3-D processing module for identifying a feature of the environment and rendering, on the head-worn eyepiece, a 3-D display of advertising content for the retail location of the eyepiece, an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, and an integrated image source for introducing content to the optical assembly, wherein the integrated image source displays the 3-D rendering as an overlay on the environment, and wherein the integrated image source presents a 3-D display of advertising content related to the retail environment. In embodiments, the 3-D processing module may lock display elements to an identified feature of the environment, and the content may be presented with a relationship to the identified feature in the display. In embodiments, the rendering of the 3-D display of the advertisement may be the result of scanning at least one of a bar code, a QR code, a product label, and the like. It may also be the result of: purchasing a product, entering a product image into the eyepiece, entering a location of the retail environment (or moving into a location of the retail environment), fixing the user's eyes on a product, and entering loyalty program information into the eyepiece. In embodiments, a second 3-D display may be rendered as the result of at least one of: scanning a product, purchasing a product, entering a location of the retail environment, and the like. In embodiments, the user may execute an e-commerce transaction through the eyepiece, and the transaction may include scanning an item to purchase it, selecting an item by comparison with other items, negotiating a price, redeeming loyalty points, redeeming a promotion, ordering an item, and the like. In embodiments, the advertisement may include the location of an item near the user. The location of the item may be displayed relative to the location of the user, and the user may be given guidance to the item. In embodiments, the eyepiece may be used for social networking, and the eyepiece may use facial recognition of another user in the retail environment. Furthermore, the eyepiece may be used to identify the presence of a person in the environment and to present social networking content related to the relationship between the wearer and the identified person. In addition, the user may send and/or receive friend requests by making a gesture with a part of his body. In embodiments, the user may compare the prices of items through advertisements. In embodiments, the advertisement may include audio content. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, checking the feature with a signal, communicating with the feature, identifying the feature by processing its location, retrieving information about the feature from a database, user designation of the feature, and the like. In addition, the user may designate a feature for holding overlay content through user-interface interaction with the eyepiece. In embodiments, the overlay may present content on or adjacent to the identified feature, and in further embodiments the identified feature may be at least one of: an item to be purchased, an item for sale, a sign, an advertisement, an aisle, a location in a store, a kiosk, an information desk, a cash register, a television, a screen, a shopping cart, and the like.
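The feature-locking behavior described above may be pictured in simplified form. The following is an illustrative sketch only, assuming a pre-existing recognizer that yields a feature's bounding box in camera-frame pixels; the class, field, and content names are hypothetical and do not correspond to any named component of the embodiments:

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """A feature recognized in the wearer's camera frame (e.g. a product label)."""
    name: str
    x: int  # bounding-box top-left, camera-frame pixels
    y: int
    w: int
    h: int

class OverlayLock:
    """Keeps advertising content positioned relative to an identified feature."""

    def __init__(self):
        self._content = {}  # feature name -> ad content

    def register(self, feature_name, ad_content):
        self._content[feature_name] = ad_content

    def render(self, feature):
        """Return (content, anchor) for the display, anchoring the ad just
        above the feature's bounding box; None if no content is registered."""
        content = self._content.get(feature.name)
        if content is None:
            return None
        anchor = (feature.x, feature.y - 20)  # 20 px above the feature
        return content, anchor

lock = OverlayLock()
lock.register("cereal_box", "2-for-1 this week")
print(lock.render(Feature("cereal_box", x=120, y=80, w=60, h=90)))
# -> ('2-for-1 this week', (120, 60))
```

Because the anchor is recomputed from the feature's bounding box each frame, the advertisement keeps its relative position as the wearer's view changes.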
In embodiments, the glasses may be used in an educational environment. As an example, the glasses may display e-learning content, such as is found in textbooks or elsewhere. The glasses may allow the user to view, study, and review items in preparation for a quiz. In embodiments, the user may be monitored while taking a quiz. The glasses may time the user as he pages through material, and may track the user's responses, adjusting an examination as necessary according to the user's answers and/or progress through the test. In further embodiments, the user may view augmented reality (AR) overlays through the glasses. In embodiments, the AR overlays may include step-by-step guidance in a laboratory, in a classroom, and the like. In embodiments, a virtual professor may be displayed, allowing interaction through video, audio, and conversation. The user may view blackboard/whiteboard notes through the glasses, and he may enter additional items on the board; these additional items may be shared with other users when the other users view the blackboard/whiteboard in a user interface, or when the other users view the actual board, so that AR notes may be added and/or overlaid when a user views a particular blackboard/whiteboard. In embodiments, the glasses may provide a social networking platform to the members of a class or educational session, and provide social networking content directed to and about class members.
In embodiments, the glasses may be used in conjunction with commerce in the educational environment. As an example, the user may use the glasses to purchase courses or otherwise track content progress and course credits. In addition, the user may monitor examination and test rankings as well as upcoming examination and test administration dates; the user may download course credit/class information; the user may capture assignments discussed in the classroom, listed on a syllabus, or otherwise obtained, and add the same content to a calendar; and the user may meet with friends or class members by communicating with others via the glasses. In embodiments, the user may view his bills and tuition reports for review and tracking. In embodiments, the user may purchase a course that does the same, or he may use a course that provides advertising associated with it.
In further embodiments, the user may use the glasses in the educational environment to enter items. The user may scan examination/test papers through the glasses for review, grading, and other operations. The user may scan or otherwise capture data associated with textbook content, manuals and/or workbooks, and blackboard/whiteboard content for note-taking and assignment tracking. The user may scan or capture data related to posters/signs. The user may thereby track upcoming student meetings, event descriptions, venues, and the like. In embodiments, the user may capture the faces of classmates, friends, persons of interest, and so on. In embodiments, the glasses may track the user's eye movement to verify interaction with content. In embodiments, the glasses may allow "Lifestride" or other functions for absorbing content and the like. The user may gesture with a pen that communicates with the glasses through its movement, and the glasses may store the user's notes. In other embodiments, the user may make gestures and the glasses may record notes according to such gestures, and in further embodiments, another sensor associated with the user's hand may allow the user's pen strokes to be tracked, with the notes recorded by the glasses.
In embodiments, the system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes a module for determining that the eyepiece is proximate to an educational environment. In addition, the system may include: an optical assembly through which the user views the surrounding environment; a processing module for identifying a feature of the environment and rendering educational content related to the environment; an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, which image-processing module may lock display elements to an identified feature of the environment; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source renders the educational content as an overlay on the environment, and wherein the content may be presented in a relationship relative to the identified feature in the display. In embodiments, the integrated image source may present a display of content related to the environment, and such a relationship with the identified feature may not be presented. In embodiments, the rendering of the educational content may be the result of scanning a bar code, a QR code, and the like; it may also be the result of: entering a textbook image into the eyepiece, entering an image of handout material into the eyepiece, identifying a sign in the environment, and entering a location of the educational environment. In embodiments, the educational environment may be a classroom, gymnasium, auto shop, garage, outdoor environment, fitness center, laboratory, factory, place of business, kitchen, hospital, and the like. In addition, the educational content may be text, textbook excerpts, instructions, video, audio, laboratory protocols, chemical structures, 3-D images, 3-D overlays, text overlays, class books, tests, prescriptions, class notes, medical records, client files, safety instructions, and exercise routines. In embodiments, the educational content may be associated with an object in the environment or overlaid on the object. In embodiments, the object may be a whiteboard, blackboard, machine, automobile, aircraft, patient, textbook, projector, and the like. In embodiments, the system may be used for social networking, and further may use facial recognition of at least one of a classmate, a teacher, and the like in the environment. In embodiments, the user may send and/or receive friend requests by making a gesture with a part of his body. In embodiments, the user may interact with the content for taking examinations, completing assignments, checking a syllabus, checking course projects, practicing skills, tracking course progress, tracking credits, taking notes, recording notes, asking questions, and the like. In embodiments, the overlay may present content on or adjacent to the identified feature. In addition, the identified feature may be at least one of: a poster, blackboard, whiteboard, screen, machine, automobile, aircraft, patient, textbook, projector, monitor, desk, smart board, and the like. As an example, notes may appear on a display framed by a blackboard; a movie may appear on a display located where a screen is; a molecular display may appear on a blackboard; and so on. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, checking the feature with a signal, communicating with the feature, identifying the feature by processing its location, retrieving information about the feature from a database, and user designation of the feature. In addition, the user may designate a feature for holding overlay content through user-interface interaction with the eyepiece.
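The identification alternatives enumerated above (automatic image processing, a signal check, communication, location processing, database retrieval, or user designation) amount to a fallback chain. The sketch below illustrates one such ordering under assumed inputs; the function and parameter names are hypothetical and not part of any described embodiment:

```python
def identify_feature(image_result=None, signal_result=None,
                     location_result=None, user_label=None):
    """Return (method, label) from the first identification source that
    produced a result, or None if no listed means identified the feature."""
    for method, label in (("image", image_result),
                          ("signal", signal_result),
                          ("location", location_result),
                          ("user", user_label)):
        if label is not None:
            return method, label
    return None

print(identify_feature(signal_result="whiteboard"))  # -> ('signal', 'whiteboard')
```

Trying automatic image processing first and user designation last matches the intuition that the user need only designate a feature when no automatic means succeeded.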
In embodiments, the glasses may be used in a transportation environment. As an example, the user may retrieve or capture content related to transportation, such as timetables, availability, delays, and cancellations. For example, when the user arrives at the airport, he may view his flight information through the glasses and see whether his flight is on time or delayed. In embodiments, the user may view his seat/berth selection and select snack and meal preferences. He may check in through the glasses, and in embodiments he may exchange or update his seat selection through the glasses. In embodiments, a pilot user may be given a step-by-step run-through of the pre-flight checklist required by the FAA. In embodiments, a train conductor, pilot, or the like may be given guidance and navigation indications when operating a train, aircraft, and so on. In other embodiments, a passenger user may review safety information through the glasses; for example, the user may view pre-flight safety instructions showing him how to operate emergency equipment and the like. In embodiments, the user may use the glasses to make related reservations, such as rental cars, hotels, and the like. In embodiments, the user may schedule an in-person visit and/or take a virtual tour of a region of interest through the glasses. He may view the surroundings of a destination to which he will travel, so that he is familiar with the region before arriving. In other embodiments, he may also view and compare various itinerary items. The user may view and/or receive loyalty content, such as the rewards available to a particular account, which points he can redeem and for what items, and so on. The user may use the glasses to redeem points with a given airline, rental car company, hotel, and the like. In embodiments, the user may use the glasses for networking purposes in a travel or transportation environment. For example, the user may find out who is on his particular flight or train. The user may also use the glasses to view entertainment content during transportation. As an example, an in-flight movie may be streamed to the user's glasses. In embodiments, the user may view content related to various locations, and he may view AR landmarks and the like. As an example, as a train or aircraft passes through a landscape, the user may view AR overlays (such as landmarks) associated with items of interest in a particular region. In embodiments, the user may receive advertisements as he passes billboards/signs in transit. In addition, the user may receive personal information related to the transportation professionals participating in his transport. As an example, the user may receive information related to a taxi driver, such as the driver's safety record, or he may view a pilot's record of accidents and/or violations, which may be reflected in a safety rating.
In addition, the user may use the glasses in connection with commerce in the transportation environment. For example, the user may use the glasses to reserve a seat, redeem reward points to reserve a seat, and arrange and pay for meals during transport. The user may find a flight and make a reservation/payment for it, rent a car, and reserve a hotel, taxi, bus, and so on. The user may network with people related to his travel (such as other passengers). In embodiments, the user may use the glasses to navigate. For example, the user may be given a map of buses and taxis showing the best routes and methods for getting around a city. The user may pay for an application for this purpose and/or view advertisements associated with the application for the same purpose. During his travels, the user may interact with AR content in and around landmarks, and interact with advertisements and promotions from billboards, AR-based signs, and the like.
In embodiments, the user may enter items into the eyewear platform in the transportation environment. For example, he may scan his ticket using the glasses to begin the check-in process. He may be provided a dashboard displaying speed, fuel, and GPS location during his transport. The glasses may communicate with the vehicle's IT system via Bluetooth to display instrument dials and provide information about the vehicle and/or mode of transportation. In embodiments, the user may use the glasses to identify the faces of other passengers and/or store images of other passengers by entering an image into the glasses. The user may enter landmark-related content into the glasses for interaction, or to create a database of the content for future recall. In embodiments, the user may enter billboards/signs, which may or may not be AR-based, for storage and interaction.
In addition, the system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate to a transportation environment; an optical assembly through which the user views the surrounding environment; a processing module for identifying a feature of the environment and rendering transportation content related to the transportation environment; an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, which image-processing module may lock display elements to an identified feature of the environment; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source renders the transportation content as an overlay on the environment, and wherein the content may be presented in a relationship relative to the identified feature in the display. In embodiments, the integrated image source may present a display of transportation content related to the transportation environment, and such a relationship with the identified feature may not be presented. In embodiments, the rendering of the transportation content may be the result of: scanning a bar code, a QR code, a ticket, and the like; entering an image of a ticket for transport; and entering a train, railway station, taxi stand, taxi, airport, aircraft, ship, platform, subway, subway station, and the like. In embodiments, the transportation content may be text, video, audio, 3-D images, 3-D overlays, text overlays, guides, schedules, maps, navigation, advertisements, locations of points of interest, auxiliary resources, safety instructions, flight instructions, operator checklists, FAA information, flight information, arrival and departure airport information, itineraries, and the like. In embodiments, the auxiliary resources may include making hotel reservations, making rental car reservations, making dinner reservations, noting personal preferences, changing seat selections, finding local entertainment, resources for arranging local tours, and the like. In embodiments, the user may use the eyepiece to purchase tickets for travel by air, ship, or train; the user may purchase a pass for riding the subway, check schedules, compare travel prices, retrieve directions, retrieve transit routes, consult a map of the current location, view high-usage routes for a mode of transportation, and the like. In embodiments, the content may be associated with a vehicle for displaying information about the vehicle, where such information includes emergency exit information, maintenance information, operating information, dashboard information, model information, and the like. The system may be used for social networking, and the system may use facial recognition of a traveler, an operator, or the like in the environment. The user may send and/or receive friend requests by making a gesture with a part of his body. In embodiments, the eyepiece may be used to identify the presence of a person in the environment, and to present social networking content related to the relationship between the wearer and the identified person. In embodiments, the user may interact with a displayed advertisement to obtain additional information. In embodiments, the content may enhance the environment (and the content may be enhanced by the environment), including any of the following: visual indications, audio indications, visual markers, and overlaid route plans for various reasons (including for escaping from the environment in case of emergency, and the like). In embodiments, the overlay may present content on or adjacent to the identified feature. In addition, the identified feature may be at least one of: a poster, train, aircraft, taxi, ship, subway, screen, kiosk, map, window, and wall. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, checking the feature with a signal, communicating with the feature, identifying the feature by processing its location, retrieving information about the feature from a database, and user designation of the feature. In addition, the user may designate a feature for holding overlay content through user-interface interaction with the eyepiece.
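As one way to picture the ticket-scan result described above, the sketch below parses a hypothetical QR payload and composes overlay text; the payload format "FLIGHT|SEAT|GATE|DEPARTS" is an assumption made for illustration, not a format taken from any embodiment:

```python
def parse_ticket(payload):
    """Split a hypothetical 'FLIGHT|SEAT|GATE|DEPARTS' QR payload into fields."""
    flight, seat, gate, departs = payload.split("|")
    return {"flight": flight, "seat": seat, "gate": gate, "departs": departs}

def boarding_overlay(ticket):
    """Compose the text the eyepiece might overlay after a successful scan."""
    return (f"Flight {ticket['flight']} departs {ticket['departs']} "
            f"from gate {ticket['gate']} - seat {ticket['seat']}")

print(boarding_overlay(parse_ticket("UA123|14C|B7|18:05")))
# -> Flight UA123 departs 18:05 from gate B7 - seat 14C
```

In practice the scanned payload would come from the eyepiece's camera and decoder rather than a literal string, and the overlay would be handed to the integrated image source for display.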
In embodiments, the eyewear platform may be used in a home environment. In embodiments, the glasses may be used in conjunction with content. For example, the glasses may be used for entertainment, where the user watches media at home. The glasses may also be used for shopping, for example to generate grocery lists and the like, and to take and store an inventory of needed items for later review. The user may use the glasses for household coordination, paying bills via the glasses, producing household task lists to be completed, and the like. For example, the glasses may be used to make and keep doctor appointments, track upcoming soccer games, and so on. The glasses may be used for program guidance. As an example, the glasses may be used to instruct the user in controlling appliances, DVD players, VCRs, remote controls, and the like. In addition, the glasses may be used for security and/or safety. The user may activate an alarm system to ensure it is armed when he is at home or away from home. The user may check home cameras when away, turn home lights on and off, and so on. The user may be given instructions for an emergency; for example, the user may be given instructions on what to do during a fire, a hurricane, and the like. The user may turn on features of the glasses described herein to see through smoke and the like. During such an emergency, the user may use the glasses to track family members and communicate with them. The glasses may provide assistance, CPR guidance, 911 calling, and the like.
In embodiments, the glasses may be used in conjunction with commerce in the home environment. For example, the user may use the glasses to order food delivery, check on dry cleaning, schedule a dry-cleaning pickup, and the like. The user may order entertainment content, movies, video games, and so on. In embodiments, the user may find and use guidance materials for household projects, pay bills, and the like. The user may view advertisements and/or promotions while at home and act on them. For example, if an advertisement is displayed in the user's glasses while he is using a blender in the kitchen, the advertisement may prompt the user to discover more information about a new blender, and the user may select the advertisement to learn more about the device.
In embodiments, the user may use the glasses in the home environment such that the user enters information into the glasses. As an example, the user may enter paperwork for storage, recall, interaction, and the like. The user may enter shopping lists, bills, checklists, manuals, mail, and so on. The user may enter advertisements from AR-enabled paper mail advertisements, television, radio, and the like. The user may scan a paper advertisement to view or receive additional AR information associated with the advertisement. The user may enter embedded symbols and/or identifiers, for example to identify an appliance or other hardware. The user may enter Wi-Fi network content into the glasses. In addition, the user may enter television content, such as screen and smart-television content. The user may thereby interact with such content through the eyewear platform. The user may enter remote-control commands into the eyewear platform so that the user can operate various devices, such as a television, VCR, DVD player, appliances, and the like. In addition, the user may enter security system content, enabling the user to interact with and control the security system, cameras associated with the security system, and the like. The user may view various camera feeds associated with the security system, so that he can view each area around the home environment through the eyewear platform. The glasses may connect with such cameras via Bluetooth, via the Internet, via Wi-Fi connections, and the like. The user may further be able to set an alarm, turn off an alarm, check an alarm, and interact with alarms associated with the security system.
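The remote-control input described in this paragraph can be pictured as a mapping from recognized gestures to device commands. The following is a minimal sketch; the device names, gestures, and command strings are hypothetical assumptions rather than any defined command set:

```python
# Hypothetical mapping from (device, gesture) pairs to remote-control commands.
GESTURE_COMMANDS = {
    ("tv", "swipe_left"): "channel_down",
    ("tv", "swipe_right"): "channel_up",
    ("dvd_player", "tap"): "play_pause",
}

def dispatch(device, gesture):
    """Translate a recognized gesture into a command for the target device,
    or return None when the gesture has no mapping for that device."""
    command = GESTURE_COMMANDS.get((device, gesture))
    if command is None:
        return None
    return f"{device}:{command}"

print(dispatch("tv", "swipe_right"))  # -> tv:channel_up
```

A real eyewear platform would relay the resulting command over Bluetooth, Wi-Fi, or an IR emitter to the device rather than returning a string.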
In addition, the system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate to a home environment; an optical assembly through which the user views the surrounding home environment; a processing module for identifying a feature of the environment and rendering home-related content related to the environment; an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, which image-processing module may lock display elements to an identified feature of the environment; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source may render the home-related content as an overlay on the environment, and wherein the content may be presented in a relationship relative to the identified feature in the display. In embodiments, the integrated image source may present a display of content related to the environment, and such a relationship between the identified feature and the content may not be presented. In embodiments, the rendering of the content may be the result of: entering the home, fixing the user's eyes on an item in the home, identifying a sign in the environment with the eyepiece, operating another device in the home, and the like. In embodiments, the content may include a user interface for operating devices such as: a VCR, DVR, satellite receiver, set-top box, video-on-demand device, audio device, video game console, alarm system, home computer, heating and cooling system, and the like. In embodiments, the user may interact with the user interface through eye movements, hand gestures, nods, and the like. In embodiments, the content may allow the user to complete tasks such as: generating a shopping list, checking a grocery inventory, paying bills, viewing bills, activating a device, operating lights, generating virtual communications for family members and/or others, ordering delivery services (dry cleaning, food, and the like), and acting on an advertisement in the environment. In embodiments, the user may identify, through the eyepiece, the face of another person in or near the home environment. In embodiments, the content may include instructions, and the instructions in an emergency setting may be at least one of audio, video, video instructions, and the like. In embodiments, the content may enhance the environment, or the content may be enhanced by the environment, and may include any of the following: visual indications, audio indications, visual markers, overlaid route plans for escaping from the environment in case of emergency, and the like. In embodiments, the content may be generated in response to embedded symbols, television audio and/or video content, advertisements, and the like. In embodiments, the content may be retrieved from a user manual stored in the eyepiece, downloaded from the Internet, and the like. The content may include 3-D advertisements, audio, video, text, and so on. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, checking the feature with a signal, communicating with the feature, identifying the feature by processing its location, retrieving information about the feature from a database, and user designation of the feature. In embodiments, the user may designate a feature for holding overlay content through user-interface interaction with the eyepiece. In embodiments, the overlay may present content on or adjacent to the identified feature. In addition, the identified feature may be at least one of: an appliance, a note-writing station, a notepad, a calendar, a wall, an electronic device, a security system, a room, a door, a gateway, a key holder, and a fixture.
In embodiments, the user may use the glasses in an event environment. In various event environments, the user may use the eyewear platform to interact with content. As an example, the user may view schedules, ticket information, and/or ticket/seat availability for events such as concerts, ball games, various entertainment events, business events, and the like. The user may view or otherwise interact with promotional information for an event. The user may view loyalty program content, such as points or reward values associated with an event, and the like. The user may be provided access to an event through, or in association with, a loyalty plan and the like. The user may be offered the opportunity to view "premium" material for an event based on loyalty status and the like. In embodiments, the user may view ancillary services and merchandise associated with an event, purchase those services and merchandise, and so on. In embodiments, the user may view AR content at the event, such as a first-down line, goal markers, access to athletes/performers, and the like. In embodiments, the user may view optional video feeds, such as side angles and backstage angles/video feeds from another location, for example elsewhere in the stadium.
In embodiments, the glasses may be used in conjunction with commerce in the event environment. As an example, the user may purchase/reserve tickets, view selected/available seats, and so on. The user may make ancillary purchases, such as buying a backstage pass, upgrading his seat, and the like. In embodiments, the user may purchase event-related merchandise, jerseys, concert apparel, posters, and so on. The user may further redeem points, such as those associated with a reward or frequent-attendee program. In embodiments, the user may purchase pictures and/or scene pictures as souvenirs, event chronicles, and items such as a digitally "autographed" video of part or all of a game or event, and the like. The user may, for additional cost or for free, view additional video or commentary of athletes and/or performers during such an event.
In embodiments, the user may enter items and/or data into the eyewear platform in the event environment. In embodiments, the user may enter a ticket/pass to find his seat, check in to an event, and the like. The user may enter AR-enhanced promotional material (such as posters and signs) for viewing and/or interacting with it. The user may enter loyalty plan information and may scan a card for a particular event, and so on. Such a user may interact with an account related to the event, provide data to the account, activate the account, and the like. In embodiments, the user may enter web content into the glasses via Wi-Fi, Bluetooth, and the like.
In addition, the system may include an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate an event environment; an optical assembly through which the user views the surrounding event environment; a processing module for rendering, on the head-mounted eyepiece, a display of event content for the event environment of the eyepiece; an image-processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece, the processing including identifying a feature related to the event and storing the position of the feature; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source renders the event content as an overlay on the environment viewed by the eyepiece wearer and associates the content with the feature, and wherein the integrated image source presents content related to the environment. In embodiments, the image-processing module may lock a display element onto the identified feature of the environment, and the content may be presented in a fixed relationship to the identified feature in the display. In embodiments, the rendering of content may be the result of at least one of: entering the event environment, the user fixing his eyes on an item at the event, identifying a feature in the environment, scanning the user's ticket, identifying the presence of a person at the event, inputting an image from the event, and the like. In embodiments, the content may include an enhanced visual feed, including any of the following: a first-down line, a venue marking line, a display of a performer, a display of a performer's instrument, an instant replay, an enhanced view, live video, selectable views, advertisements related to the event, 3-D content, seat upgrade availability, and the like. In embodiments, the content may include an enhanced audio feed, including any of the following: racer commentary, commentary audio, race sounds, enhanced performance sound, performer commentary, live audio, and the like. The user may interact with the content through at least one of eye movement, hand gestures, nodding, and the like. In embodiments, the eyepiece may be used to identify the presence of a person at the event, and social networking content related to the relationship between the wearer and the identified person may be presented. Further, the user may send and receive at least one of friend requests by making a gesture with a part of his body, such as nodding his head. The system may include a user interface for at least one of purchasing event items, purchasing images and views of the event, and obtaining digitally signed memorabilia from the event. Further, the content may be generated in response to at least one of an embedded symbol, television content, an advertisement, and the like. In embodiments, the content may include enhanced video and audio of at least one of backstage, a dressing room, a baseball players' clubhouse, a baseball bullpen, a racers' bench, and the like. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, signaling the feature for examination, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and the like. Further, the user may designate a feature for holding overlay content through user-interface interaction with the eyepiece. In embodiments, the overlay may present content at or adjacent to the identified feature, and in embodiments the identified feature may be an object of a sporting match, including at least one of: the venue, a ball, a goal, a scoreboard, a jumbotron screen, a screen, the distance traveled by a ball, the path of a ball, a stadium seat, and the like. In embodiments, the identified feature may be an object of an artist's performance, including at least one of: a musician, a musical instrument, a stage, a music stand, a performer, a set, a prop, a curtain, and the like. In embodiments, the identified feature may be an object of a concession area, including at least one of: a doll, a stuffed animal, concert clothing, food, a beverage, a hat, apparel, a beach towel, a toy, sports memorabilia, concert memorabilia, and the like.
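The "lock a display element onto an identified feature" behavior described above can be sketched as follows; this is a minimal illustration under assumed names and a simple pixel-offset model, not the implementation described in this application.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """An environment feature identified by the image-processing module."""
    name: str
    x: float  # stored position of the feature in display coordinates
    y: float

class OverlayAnchor:
    """Keeps a display element in a fixed relationship to an identified feature."""
    def __init__(self, feature: Feature, dx: float, dy: float):
        self.feature = feature
        self.dx, self.dy = dx, dy  # designated offset of the overlay from the feature

    def update_feature(self, x: float, y: float) -> None:
        # Called each frame as the feature's position shifts in the wearer's view.
        self.feature.x, self.feature.y = x, y

    def overlay_position(self) -> tuple[float, float]:
        # The overlay tracks the feature, preserving the fixed relationship.
        return (self.feature.x + self.dx, self.feature.y + self.dy)

scoreboard = Feature("scoreboard", 100.0, 50.0)
anchor = OverlayAnchor(scoreboard, dx=0.0, dy=-20.0)
print(anchor.overlay_position())    # (100.0, 30.0)
anchor.update_feature(140.0, 60.0)  # wearer turns his head; feature moves in view
print(anchor.overlay_position())    # (140.0, 40.0)
```

As the wearer moves, re-detecting the feature and calling `update_feature` keeps the rendered content locked adjacent to the feature rather than fixed on the display.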
In embodiments, the eyewear platform can be used in a dining environment. For example, the glasses may be used to request content relevant to the dining environment. In embodiments, the glasses may be used to make reservations for a table and the like, to view possible seating availability, and to view ratings, reviews, venue locations and content, and so on. The user may also view menu content and prices, comparisons between the venue and other venues, details about food and beverages (such as reviews, nutritional content, how an item is prepared, and so on), wine ratings, automated wine pairings, and the like. The user may view social content, such as recognizing or identifying a person and/or interacting with patrons of the same venue. In embodiments, the user may view loyalty-program content related to the user's account and/or the particular venue, such as dining points. The user may use the glasses to translate items on a menu, to look up the names and definitions of ingredients through search, and so on. The user may view videos or images of menu items. In embodiments, the user may view an AR version of the menu. In embodiments, the user may capture an image of the menu and view the image with infinite focus, increase the magnification, adjust the contrast, illuminate the menu, and so on. In embodiments, the user may view menu items with ratings, prices, and the like, along with automated wine and beverage pairings. The user may access a database of what he has eaten and what he likes, and view reminders of past meals. In embodiments, the user may view an item as a different item than the one he is consuming. For example, if the user has ordered a side salad, he may view it as a filet mignon, and the like.
In embodiments, the glasses can be used for commerce in the dining environment. For example, the glasses may be used to find a venue, make or update a reservation, browse a menu, select items of interest or items to purchase from the menu, and order items at the venue. The glasses may be used to pay a bill, split a payment, calculate a tip, redeem loyalty points, and the like.
In embodiments, the glasses can be used in the dining environment to input data and items. In embodiments, the user may input content via Wi-Fi, Bluetooth, and the like. In embodiments, the user may input menus, signs, and the like with AR enhancements to view and/or interact with them. In embodiments, the user may input advertising content with AR enhancements to view and/or interact with it. The user may input items for payment, a credit/debit card, loyalty-point payment/redemption, and the like. Such input may be made by near-field communication and the like. In embodiments, the user may pay via facial recognition. In embodiments, the glasses may be used to identify the face of an employee, and such payment may be requested based on such facial recognition. In other embodiments, the face of the user or of another individual may be identified, and the corresponding account may be debited to make the payment.
In embodiments, the system may include an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate at least one of a dining environment and a drinking environment; an optical assembly through which the user views the surrounding environment; a processing module for identifying a feature of the environment and rendering dining-related content related to the environment; an image-processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece, the image-processing module capable of locking a display element onto the identified feature of the environment; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source may render at least one item of the dining-related content as an overlay on the environment, and the content may be presented in a fixed relationship to the identified feature in the display. In embodiments, the integrated image source may present a display of content related to the environment without such a relationship between the content and the identified feature. In embodiments, the rendering of content may be the result of at least one of: entering at least one of the dining environment and the drinking environment, the user fixing his eyes on a menu in the environment, opening a menu, identifying a sign in the environment, focusing on a sign in the environment, and the like. In embodiments, the content may include enhanced menu content, including at least one of: menu ratings, comparisons of menu items, nutritional values of menu items, wine pairings with menu items, images of menu items, videos of menu items, audio descriptions of menu items, enhanced magnification, contrast, and illumination of menu items, and sorting of menu items by geographic region, ingredient, item rating, whether the user has previously consumed the item, and the like. In embodiments, the content may be received as the user is seated, such as a menu. In embodiments, the user may interact with the content through at least one of eye movement, hand gestures, nodding, and the like. In embodiments, the user may place an order via the eyepiece. In embodiments, the user may pay a check, bill, or fee via the eyepiece. In embodiments, the eyepiece may be used for social networking and provide at least one of: the user's review of the environment and facial recognition of another person in the environment. In embodiments, the user may send and receive at least one of friend requests by making a gesture with a part of his body. The content may include additional information related to a menu item retrieved from the Internet. In embodiments, the overlay may present content at or adjacent to the identified feature. In embodiments, the identified feature may be at least one of: a poster, a picture frame, a menu board, a menu, a beverage container, a food display cart, a bar, a table, a window, a wall, and the like. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, signaling the feature for examination, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and the like. In embodiments, the user may designate a feature for holding overlay content through user-interface interaction with the eyepiece.
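A minimal sketch of assembling the "enhanced menu content" described above for a recognized menu item follows. The local dictionary stands in for the database and Internet lookups the text describes; all item names, fields, and values are illustrative assumptions.

```python
# Illustrative stand-in for a menu-item database; a deployed eyepiece would
# retrieve this from a remote service rather than a local dictionary.
MENU_DB = {
    "filet mignon": {"rating": 4.6, "calories": 680, "wine_pairing": "Cabernet Sauvignon"},
    "side salad":   {"rating": 4.1, "calories": 150, "wine_pairing": "Sauvignon Blanc"},
}
HISTORY = {"side salad"}  # items the user has previously consumed

def enhance_menu_item(name: str) -> dict:
    """Build the overlay record for one recognized menu item."""
    info = dict(MENU_DB.get(name.lower(), {}))
    info["previously_consumed"] = name.lower() in HISTORY
    return info

print(enhance_menu_item("Filet Mignon"))
# {'rating': 4.6, 'calories': 680, 'wine_pairing': 'Cabernet Sauvignon', 'previously_consumed': False}
```

The `previously_consumed` flag supports the sorting-by-history behavior mentioned in the text; the same record could feed the fixed-relationship overlay next to the menu line the user is gazing at.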
In embodiments, the eyewear platform can be used in an outdoor environment. In embodiments, the glasses may be used to interact with and/or view content. The user may view navigation information, such as trail position, time to destination, an AR overlay of the trail or of travel along the trail, a trail map, trail obstacles the user might not otherwise be able to see, and the like. The user may be given the conditions of the outdoor environment, such as temperature, weather, snow conditions, fishing conditions, water level, tide conditions, and the like. The user may use the glasses to communicate, such as coordinating a group by position in relation to the outdoor environment, weather alerts, and the like. The user may gather information, such as to identify plants, trees, animals, birds, sounds, and bird calls. In embodiments, the user may view an object and, by asking the glasses "What is this?", be presented with content and/or information about the object. In embodiments, the user may obtain safety information, such as whether something is edible, poisonous, dangerous, and the like. For example, the user may ask the question "Is this a dangerous snake?" when looking through the glasses, and the glasses may then be used to provide the user with information about the snake, whether it is venomous, and the like. In embodiments, the user may use the glasses to identify and/or receive content related to landmarks associated with the outdoor environment. Such landmarks may help the user navigate or understand the environment. In embodiments, the user may use the glasses to view instructional guides, such as how to pitch a tent, tie a particular knot, cross difficult terrain, and the like. The user may ask "How do I put up this tent?" and receive step-by-step instructions for doing so. In embodiments, the user may view content about, or an analysis of, himself, his behavior, or his condition. The user may request updates from the glasses, such as "Am I dehydrated?", "Am I hypothermic?", or "Is my oxygen level low?". Depending on the result, the user may change his behavior to prevent a particular outcome or to promote a particular outcome. In embodiments, the user may view social content related to other people's experiences on the trail, environment and experience blogs, and the like. In embodiments, the user may be alerted that a ski trail is for experts only, or the user may be further informed of current conditions, such as serious icy patches on various parts of the venue.
In embodiments, the glasses can be used by the user for commerce related to the outdoor environment. The user may download related content relevant to the environment. As an example, the user may download trail maps, fishing maps, data about catching fish, skiing, snowboarding, and the like. The user may arrange a stay, order supplies, rent equipment, arrange a guide or a tour, enter an event, obtain a license such as for fishing, a hunting permit, or the like. In such a setting, the user may interact with social networks via the glasses; for example, the user may join a hiking club, communicate with others on the trail or in a particular environment, and the like. The user may mark and/or track goal-directed achievements. For example, the user may track or mark a goal of climbing Mount Whitney, may mark a goal for a charity "fun run," and the like. The user may use blog-based business models and the like. In embodiments, the user may raise sponsorship for an outdoor event using social networking via the eyewear platform.
In embodiments, the user may input content, data, and the like into the glasses while in the outdoor environment or related to the outdoor environment. In embodiments, the user may use the camera in the glasses for scenery recognition, and the glasses may use GPS to provide the user with information related to a particular environment or for navigating in a particular environment. The user may send communications to and receive communications from other users in the surrounding environment, or send and receive communications related to the environment. The user may input landmark data, view landmarks of the environment with AR enhancement, and the like. The user may input features such as leaves and flowers, make notes related to these features, capture pictures of these features, and/or learn about them in the environment. The user may capture images of the items, animals, and the like that make up the environment in order to learn more about them, store data related to them, and interact with AR content related to them.
In embodiments, the system may include an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is proximate an outdoor environment; an optical assembly through which the user views the surrounding outdoor environment; a processing module for rendering, on the head-mounted eyepiece, a display of outdoor content for the outdoor environment of the eyepiece; an image-processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece, the processing including identifying a feature related to the environment and storing the position of the feature; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source renders the outdoor content as an overlay on the environment viewed by the eyepiece wearer and associates the content with the feature, and wherein the integrated image source presents content related to the outdoor environment. In further embodiments, the image-processing module may lock a display element onto the identified feature of the environment, and the content may be presented in a fixed relationship to the identified feature in the display. In embodiments, the content may be rendered as the result of at least one of: entering the outdoor environment, the user fixing his eyes on an item in the environment, identifying a feature in the environment, identifying the presence of a person in the environment, inputting an image of the environment, focusing on a sign in the environment, and the like. In embodiments, the content may include enhanced environment characteristics, including at least one of: overlaid trail information, time to destination, user progress information, landmark information, safety information about the environment, position in the environment relative to other resources, and information about organisms in the environment. In embodiments, the content may include instructions for the user, and the instructions may be at least one of: audio, video, images, 3D images, overlays on objects, step-by-step instructions, and the like. The user may interact with the content through at least one of eye movement, hand gestures, nodding, and the like. The user may perform at least one of: arranging a stay, ordering supplies, renting equipment, arranging a tour, obtaining a license or permit for an activity, inputting comments about the environment, and the like. Further, the content may enhance at least one of: camera input, GPS information, landmarks in the environment, and features in the outdoor environment. In embodiments, the eyepiece is used to identify the presence of a person in the environment, and social networking content related to the relationship between the wearer and the identified person is presented. Further, the user may send and receive at least one of friend requests by making a gesture with a part of his body. In embodiments, the content may be rendered according to an analysis of the user's condition. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, signaling the feature for examination, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and the like. Further, the user may designate a feature for holding overlay content through user-interface interaction with the eyepiece. In embodiments, the overlay may present content at or adjacent to the identified feature. Further, the identified feature may be at least one of: a plant, a tree, a shrub, a trail, a rock, a fence, a path, a field, a campsite, a cabin, a tent, a mode of water transportation, a watercraft, and an animal.
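The "time to destination" and landmark-overlay figures mentioned above could be computed from the eyepiece's GPS fix roughly as follows; the haversine formula is a standard great-circle calculation, and the sample coordinates and walking pace are illustrative assumptions, not values from this application.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two GPS fixes."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def minutes_to_destination(dist_km, pace_kmh=4.0):
    """Estimated time on foot at an assumed hiking pace."""
    return 60.0 * dist_km / pace_kmh

# Wearer's GPS fix versus a stored trailhead landmark (illustrative coordinates).
d = haversine_km(36.5785, -118.2923, 36.6002, -118.2257)
print(f"{d:.1f} km, ~{minutes_to_destination(d):.0f} min on foot")
```

The resulting distance and time estimate are exactly the kind of values the integrated image source could render adjacent to the identified trail or landmark feature.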
In embodiments, the user can use the glasses in an exercise environment. In embodiments, the user may use the glasses to view, download, or otherwise interact with content. For example, the user may obtain an analysis of his own behavior or condition, such as by asking the glasses "Am I dehydrated?", "Am I hypothermic?", "Is my oxygen level low?", and the like. In embodiments, the user may view health-club-oriented content, such as club fees and offers, upcoming training courses, and the like. The user may view training-oriented content, such as coaching and instructional content. For example, the user may view instructions, videos, AR content, and the like on how to do a squat, how to stretch, how to use a piece of equipment, and so on. The user may view, comment on, and update blogs, such as a personal exercise blog related to the exercise environment.
In embodiments, the user can use the glasses for commerce in the exercise environment. As an example, the user may download a training plan, for a fee or for free, such as a plan related to instruction, coaching, or other guidance. In embodiments, the user may track success and/or progress through the plan to completion. In embodiments, applications may have advertisements associated with them that are to be displayed to the user. In embodiments, the user may use the glasses for related equipment purchases and sales. For example, the user may purchase new athletic shoes with arch-support insoles for running. In embodiments, the user may use the glasses for charitable activities, such as, but not limited to, a "fun run" or "climbing Mount Everest for charity X," where the user raises money via the eyewear platform and/or views or updates blog entries for the activity.
In embodiments, the user can use the glasses to input information and/or data in the exercise environment. In embodiments, the user may input data for performance tracking, input data via sensors, and input images and video. Merely as an example, the user may record video of another person engaged in an activity, and then use the video during his own training to perfect his form, technique, and the like.
In embodiments, the system may include an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece wearer is at least one of in and proximate an exercise environment; an optical assembly through which the user views the surrounding exercise environment; a processing module for rendering exercise-related content on the head-mounted eyepiece; an image-processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece and identifying a feature of the environment; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source renders the exercise content as an overlay on the environment viewed by the user, wherein the overlay remains fixed proximate the identified feature as the user moves the eyepiece, and wherein the integrated image source presents content related to the exercise environment. In embodiments, the rendering of content may be the result of at least one of: entering the exercise environment, the user fixing his eyes on an item in the environment, automatically identifying a feature in the field of view of the eyepiece, using a piece of equipment in the exercise environment, identifying a sign in the environment, focusing on a sign in the environment, and the like. In embodiments, the content may include enhanced exercise content, including at least one of: training-oriented content, club information content, instructions for an exercise, information about upcoming courses, and the like. In embodiments, the content may include at least one of 3-D content, audio, visual, video, and textual content. The user may interact with the content through at least one of eye movement, hand gestures, nodding, and the like. In embodiments, the content may include user information, including at least one of vital signs such as heart rate, exercise duration, the time required for a lap in a swimming race, personal best times, historical usage data, and the like. The content may allow the user to purchase training courses, time using a machine, additional time at the club, beverages, health items, and the like. In embodiments, the content may be an advertisement for at least one of: an upcoming course, the health club, a discount at the juice bar, an equipment sale, and the like. Further, in embodiments, the eyepiece may be used for social networking, wherein the eyepiece provides at least one of: the user's review of the environment and facial recognition of another person in the environment. Further, the user may send and receive at least one of friend requests by making a gesture with a part of his body. In embodiments, the user may send a friend request to, or receive a friend request from, another member, a coach, an instructor, and the like. In embodiments, the overlay may present content at or adjacent to the identified feature. Further, the identified feature may be at least one of: a calendar, a wall, a window, a board, a mirror, a scale, a treadmill, a bicycle, a stationary bicycle, an elliptical machine (a piece of gymnasium equipment), a punching bag, a track, a scoreboard, a goal, a region of a court, a region of a tennis court, and the like. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, signaling the feature for examination, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and the like. In embodiments, the user may designate a feature for holding overlay content, such as by interacting with the user interface of the eyepiece.
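As a sketch of turning the heart-rate vital sign mentioned above into displayable exercise content, the snippet below classifies a reading into a training zone. The 220-minus-age maximum and the zone boundaries are common rules of thumb used purely for illustration; they are not specified in this application.

```python
def training_zone(heart_rate: int, age: int) -> str:
    """Classify a heart-rate reading into a displayable training zone."""
    max_hr = 220 - age            # rule-of-thumb maximum heart rate
    pct = heart_rate / max_hr
    if pct < 0.50:
        return "rest"
    if pct < 0.70:
        return "fat burn"
    if pct < 0.85:
        return "aerobic"
    return "peak"

print(training_zone(heart_rate=130, age=40))  # 130/180 ≈ 0.72 -> aerobic
```

The returned label is the sort of user-information content the integrated image source could overlay near a treadmill or other identified feature during a workout.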
Another application that may be attractive to users is mobile online gaming using the augmented reality glasses. These games may be computer video games, such as those sold by Electronic Arts Mobile, UbiSoft, and Activision Blizzard, e.g., World of Warcraft (WoW). Just as games and entertainment applications are played on computers at home (rather than on computers used for work), the augmented reality glasses may also use gaming applications. The screen may appear on the inside of the glasses so that the user may observe the game and participate in it. In addition, controls for playing the game may be provided through a virtual game controller, such as a joystick, a control module, or a mouse, described elsewhere herein. The game controller may include sensors or other output-type elements attached to the user's hand, such as for feedback from the user through acceleration, vibration, force, pressure, electrical impulse, body temperature, electric-field sensing, and the like. Sensors and actuators may be attached to the user's hand by way of a wrap, a ring, a pad, a glove, a bracelet, and the like. As such, an eyepiece virtual mouse may allow the user to translate motions of the hand, wrist, and/or fingers into motion of a cursor on the eyepiece display, where "motions" may include slow movements, rapid movements, jerky movements, position, change in position, and the like, and may allow users to work in three dimensions without the need for a physical surface, and may include some or all of the six degrees of freedom.
As seen in Figure 27, a game application 2700 may make use of both the Internet and GPS. In one embodiment, a game is downloaded from a game provider's customer database, perhaps using their web services and the Internet as shown, to a user's computer or to the augmented reality glasses. At the same time, the glasses, which also have telecommunication capability, receive and send telecommunication and telemetry signals via a cellular tower and a satellite. Thus, the online gaming system has access to information about the user's location and the user's desired gaming activities.
The game may take advantage of this knowledge of the location of each player. For example, the game may build in features that use the player's location, via a GPS locator or a magnetometer locator, to award points for reaching the location. The game may also send a message, e.g., display a clue, or a scene or image, when the player reaches a particular location. A message, for example, may direct the player to a next location that the player is to reach. Scenes or images may be provided as part of a struggle or obstacle that must be overcome, or as an opportunity to earn game points. Thus, in one embodiment, augmented reality eyepieces or glasses may use the wearer's location to quicken and enliven computer-based video games.
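The "award points when the player reaches a location" mechanic described above can be sketched as a geofence check against each GPS fix. The waypoint list, point values, clue strings, and the flat-earth distance approximation (adequate at game scales) are all illustrative assumptions.

```python
import math

# Illustrative waypoints a game provider might push to the glasses.
WAYPOINTS = [
    {"name": "fountain", "lat": 40.7580, "lon": -73.9855, "points": 50,
     "clue": "Head north to the arch."},
    {"name": "arch", "lat": 40.7312, "lon": -73.9971, "points": 100,
     "clue": "Final stage unlocked!"},
]

def check_player_fix(lat, lon, score=0, radius_m=30.0):
    """Award points (once) for any waypoint within radius_m of the GPS fix."""
    messages = []
    for wp in WAYPOINTS:
        dy = (lat - wp["lat"]) * 111_320.0  # meters per degree of latitude
        dx = (lon - wp["lon"]) * 111_320.0 * math.cos(math.radians(lat))
        if math.hypot(dx, dy) <= radius_m and not wp.get("visited"):
            wp["visited"] = True            # each waypoint pays out only once
            score += wp["points"]
            messages.append(wp["clue"])     # clue directing the player onward
    return score, messages

score, msgs = check_player_fix(40.75801, -73.98551)  # player arrives at the fountain
print(score, msgs)  # 50 ['Head north to the arch.']
```

Each clue returned here corresponds to the displayed message directing the player to the next location, as the paragraph above describes.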
One method of playing an augmented reality game is depicted in Figure 28. In this method 2800, a user logs onto a website, whereby access to a licensed game is gained. A game is selected. In one example, the user may join a game if multiple players are available and desirable; alternatively, the user may create a custom game, perhaps using special roles desired by the user. The game may be scheduled, and in some cases, players may select a particular time and place for the game, assign directions to the site where the game will be played, and so on. Later, the players meet and log into the game, with one or more players using the augmented reality glasses. The participants then play the game and, if applicable, the game results and any statistics (players' scores, game times, etc.) may be stored. Once the game has begun, the locations may change for different players in the game, sending one player to one location and another player or players to a different location. The game may then have different scenarios for each player or group of players based on the location provided by the GPS or magnetometer of each player or group of players. Each player may also be sent different messages or images based on his or her role, his or her location, or both. Of course, each scenario may then lead to other scenarios, other interactions, directions to other locations, and so on. In one sense, such a game mixes the reality of the player's location with the game in which the player is participating.
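The session flow of the method just described (log in, select or create a game, schedule it, play, store results) can be outlined in code. The class and field names are illustrative; the figure describes a flow, not this API.

```python
class GameSession:
    """Minimal sketch of the Figure-28 flow: login, selection, scheduling, results."""
    def __init__(self):
        self.players, self.results = [], {}
        self.game = self.schedule = None

    def login(self, player: str) -> None:
        # Each participant logs onto the website to gain access.
        self.players.append(player)

    def select_game(self, name: str, time=None, place=None) -> None:
        # Players may pick a particular time and place for the game.
        self.game, self.schedule = name, (time, place)

    def record_result(self, player: str, score: int) -> None:
        # Game results and statistics may be stored after play.
        self.results[player] = score

session = GameSession()
for p in ("alice", "bob"):
    session.login(p)
session.select_game("AR scavenger hunt", time="18:00", place="city park")
session.record_result("alice", 150)
session.record_result("bob", 120)
print(max(session.results, key=session.results.get))  # alice
```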
Games may range from simple game types that may be played in the palm of a player's hand, such as small, single-player games, to more complex, multi-player games. In the former category are games such as SkySiege, AR Drone, and Fire Fighter 360. In addition, multi-player games are also easily envisioned. Since all players must log into the game, a particular game may be played by one who logs in and specifies one or more friends. The locations of the players are also available, via GPS or other methods. Sensors in the augmented reality glasses or in the game controller described above, such as accelerometers, gyroscopes, or even a magnetic compass, may also be used for orientation and game play. An example is AR Invaders, available for iPhones from the App Store. Other games may be obtained from other vendors and for non-iPhone type systems, such as Layar of Amsterdam and Parrot SA of Paris, France, supplier of AR Drone, AR Flying Ace, and AR Pursuit.
In embodiments, games may be 3D, such that a user can experience 3D gaming. For example, when playing a 3D game, the user may view a virtual, augmented reality, or other environment in which the user is able to control his perspective of viewing. The user may turn his head to view various aspects of the virtual or other environment. As such, when the user turns his head or makes other movements, he may view the game environment as if he were actually in that environment. For example, the perspective of the user may be such that the user is put "into" the 3D game environment with at least some control over the viewing perspective, where the user may be able to move his head and have the view of the game environment change in correspondence to the changed head position. Further, the user may be able to "walk into" the game when he physically walks forward, and have the perspective change as the user moves. Further, the perspective may also change as the user moves the gaze of his eyes. Additional image information may be provided, such as at the sides of the user's field of view, that may be accessed by turning the head.
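The head-tracked perspective described above amounts to mapping the wearer's head orientation to the game camera's view direction. A minimal sketch follows; the yaw/pitch coordinate convention is an illustrative choice, not one specified in this application.

```python
import math

def view_direction(yaw_deg: float, pitch_deg: float):
    """Unit forward vector for the game camera.

    Convention (illustrative): yaw 0 looks along +x, positive yaw turns
    toward +y, and positive pitch looks up toward +z.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

print(view_direction(0, 0))   # (1.0, 0.0, 0.0): looking straight ahead
print(view_direction(90, 0))  # head turned 90 degrees: now looking along +y
```

Feeding the glasses' head-tracker yaw and pitch into this function each frame yields the camera direction, so the rendered game view changes in correspondence with the changed head position.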
In embodiments, the 3D game environment may be projected onto the lenses of the glasses or viewed by other means. Further, the lenses may be opaque or transparent. In embodiments, the 3D game images may be associated with, and incorporate use of, the external environment of the user, such that the user may be able to turn his head and the 3D images move together with the external environment. Further, such 3D game images and the external environment associations may change, such that the 3D images associate with more than one object, or more than one part of an object, in the external environment at various instances, thereby making it appear to the user that the 3D images are interacting with various aspects or objects of the actual environment. As an example, the user may view a 3D game monster climb up a building or onto an automobile, where such building or automobile is an actual object in the user's environment. In such a game, the user may interact with the monster as part of the 3D gaming experience. The actual environment around the user may be part of the 3D gaming experience. In embodiments where the lenses are transparent, the user may interact with the 3D gaming environment while moving about in his or her actual environment. The 3D game may incorporate elements of the user's environment into the game, it may be entirely fabricated by the game, or it may be a mixture of both.
In embodiments, the 3D images may be associated with, or generated by other means from, an augmented reality program, 3D game software, and the like. In embodiments where augmented reality is employed for the purpose of 3D gaming, the 3D images may appear or be perceived by the user based on the user's location or other data. Such an augmented reality application may provide for the user to interact with such 3D images to provide the 3D gaming environment when using the glasses. As the user changes his location, for example, play in the game may advance and various 3D elements of the game may become accessible or inaccessible to the viewer. As an example, various 3D enemies of the user's game character may appear in the game based on the actual location of the user. The user may interact with, or cause reactions from, other users playing the game and/or 3D elements associated with the other users playing the game. Such elements associated with users may include weapons, messages, currency, a 3D image of the user, and the like. Based on a user's location or other data, he or she may encounter, view, or engage, by any means, other users and 3D elements associated with other users. In embodiments, 3D gaming may also be provided by software installed in or downloaded to the glasses, where the user's location is or is not used.
In embodiments, the lenses may be opaque to provide the user with a virtual reality or other virtual 3D gaming experience, where the user is "placed into" the game and where the user's movements may change the viewing perspective of the 3D gaming environment for the user. The user may move through or explore the virtual environment, and thereby play the 3D game, through various body, head, and/or eye movements, use of game controllers, one or more touch screens, or any of the control techniques described herein that may allow the user to navigate, manipulate, and interact with the 3D environment.
In various embodiments, the user may navigate, interact with, and manipulate the 3D game environment and experience 3D gaming via body, hand, finger, eye, or other movements, through the use of one or more wired or wireless controllers, one or more touch screens, any of the control techniques described herein, and the like.
In embodiments, internal and external facilities available to the eyepiece may be used to learn the behavior of the eyepiece user and to store learned behaviors in a behavior database to enable location-aware control, activity-aware control, predictive control, and the like. For example, the user may have events and/or tracked actions recorded by the eyepiece, such as commands from the user, images sensed through the camera, the GPS location of the user, sensor inputs over time, actions triggered by the user, communications to and from the user, user requests, web activity, music listened to, directions requested, recommendations used or provided, and the like. This behavioral data may be stored in the behavior database, such as tagged with a user identifier or autonomously. The eyepiece may collect this data in a learning mode, a collection mode, and the like. The eyepiece may use past data taken by the user to inform or remind the user of what they did before, or alternatively, the eyepiece may use the data to predict what eyepiece functions and applications the user may need based on past collected experience. In this way, the eyepiece may act as an automated assistant to the user, for example, launching applications at the times the user habitually starts them, turning off augmented reality and GPS when near a location or entering a building, streaming music when the user enters the gym, and the like. Alternately, the learned behaviors and/or actions of a plurality of eyepiece users may be autonomously stored in a collective behavior database, where learned behaviors among the plurality of users may be made available to individual users based on similar conditions. For example, a user may be visiting a city and waiting for a train on a platform, and the user's eyepiece may access the collective behavior database to determine what other users have done while waiting for the train, such as getting directions, searching for points of interest, listening to certain music, looking up the train schedule, contacting a city website for travel information, connecting to a social networking site to find entertainment venues in the area, and the like. In this way, the eyepiece may provide automated assistance to the user by drawing on many different user experiences. In embodiments, the learned behaviors may be used to develop preference profiles, recommendations, advertisement targeting, social network contacts, behavior profiles for the user or groups of users, and the like, for the user.
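The per-user and collective behavior database described above can be sketched minimally as a store of (context, action) events with frequency-based prediction. This is an illustrative sketch only; the class, method, and context names (`BehaviorDatabase`, `record`, `"gym"`, and so on) are hypothetical and not taken from the disclosure.

```python
from collections import Counter, defaultdict

class BehaviorDatabase:
    """Minimal store of (context, action) events keyed by user,
    supporting frequency-based prediction of a likely next action."""

    def __init__(self):
        # user_id -> context -> Counter of actions taken in that context
        self._events = defaultdict(lambda: defaultdict(Counter))

    def record(self, user_id, context, action):
        """Log one observed behavior, e.g. context='gym', action='stream_music'."""
        self._events[user_id][context][action] += 1

    def predict(self, user_id, context):
        """Return the action this user most often takes in this context, or None."""
        actions = self._events[user_id].get(context)
        if not actions:
            return None
        return actions.most_common(1)[0][0]

    def collective_predict(self, context):
        """Pool all users' events for a context (the 'collective behavior database')."""
        pooled = Counter()
        for contexts in self._events.values():
            pooled.update(contexts.get(context, Counter()))
        return pooled.most_common(1)[0][0] if pooled else None

db = BehaviorDatabase()
db.record("u1", "gym", "stream_music")
db.record("u1", "gym", "stream_music")
db.record("u1", "gym", "check_email")
db.record("u2", "train_platform", "get_directions")
db.record("u3", "train_platform", "get_directions")
db.record("u3", "train_platform", "search_schedule")
print(db.predict("u1", "gym"))                  # stream_music
print(db.collective_predict("train_platform"))  # get_directions
```

In the second query the eyepiece of a user who has never waited for a train falls back on the pooled experience of other users, matching the platform example in the text.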
In an embodiment, the augmented reality eyepiece or glasses may include one or more acoustic sensors 2900 for detecting sound. An example is depicted above in Figure 29. In one sense, acoustic sensors are similar to microphones in that both detect sounds. Acoustic sensors typically have one or more frequency bandwidths at which they are more sensitive, and the sensors can therefore be chosen for the intended application. Acoustic sensors are available from a variety of manufacturers and may be used with appropriate transducers and other required circuitry. Manufacturers include ITT Electronic Systems (Salt Lake City, Utah, USA); Meggitt Sensing Systems (San Juan Capistrano, California, USA); and National Instruments (Austin, Texas, USA). Suitable microphones include those comprising a single microphone as well as those comprising an array of microphones, or a microphone array.
Acoustic sensors may include those using micro electro-mechanical systems (MEMS) technology. Because of the very fine structure in a MEMS sensor, the sensor is extremely sensitive and typically has a wide range of sensitivity. MEMS sensors are typically made using semiconductor manufacturing techniques. An element of a typical MEMS accelerometer is a moving beam structure composed of two sets of fingers. One set is fixed to a solid ground plane on a substrate; the other set is attached to a known mass mounted on springs that can move in response to an applied acceleration. This applied acceleration changes the capacitance between the fixed and moving beam fingers. The result is a very sensitive sensor. Such sensors are made, for example, by STMicroelectronics (Austin, Texas) and Honeywell International (Morristown, New Jersey).
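The sensing principle just described can be illustrated with the standard parallel-plate approximation: the proof mass deflects by x = m·a/k, narrowing the gap on one side of each finger pair and widening it on the other, so the differential capacitance grows with acceleration. All numerical values below are hypothetical, chosen only to show the relationship.

```python
# Illustrative physics of a comb-finger MEMS accelerometer (values hypothetical):
# the proof mass deflects x = m*a/k, which changes the gap in each
# parallel-plate finger pair and hence the differential capacitance.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def finger_capacitance(area_m2, gap_m):
    """Parallel-plate approximation for one fixed/moving finger pair."""
    return EPS0 * area_m2 / gap_m

def differential_capacitance(accel, mass=1e-9, k=1.0, area=1e-8, gap=2e-6):
    """Change C_plus - C_minus for an applied acceleration (m/s^2)."""
    x = mass * accel / k                          # spring deflection, m
    c_plus = finger_capacitance(area, gap - x)    # gap narrows on one side
    c_minus = finger_capacitance(area, gap + x)   # and widens on the other
    return c_plus - c_minus

# The differential signal is zero at rest and grows with acceleration:
dc_1g = differential_capacitance(9.81)
dc_2g = differential_capacitance(2 * 9.81)
print(dc_1g > 0, dc_2g > dc_1g)  # True True
```

Reading the difference of the two capacitances, rather than either one alone, is what makes the structure so sensitive: common-mode effects cancel, and the signal is nearly linear for small deflections.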
Besides identification, the sound capabilities of the augmented reality devices may also be applied to locating the origin of a sound. As is well known, at least two sound or acoustic sensors are needed to locate a sound. The acoustic sensors will be equipped with appropriate transducers and signal processing circuitry, such as a digital signal processor, for interpreting the signals and accomplishing the desired goals. One application of the sound-locating sensors may be to determine the origin of sounds from within an emergency location, such as a burning building, an automobile accident, and the like. Emergency workers equipped with embodiments described herein may each have one or more acoustic sensors or microphones embedded within the frame. Of course, the sensors could also be worn on the person's clothing or even attached to the person. In any event, the signals are transmitted to the controller of the augmented reality eyepiece. The eyepiece or glasses are equipped with GPS technology and may also be equipped with direction-finding capabilities; alternatively, with two sensors per person, the microcontroller can determine the direction from which the noise originated.
If there are two or more firefighters, or other emergency responders, their locations are known from their GPS capabilities. Either of the two, or a fire chief, or the control headquarters, then knows the position of the two responders and the direction from each responder to the detected noise. The exact point of origin of the noise can then be determined using known techniques and algorithms. See, e.g., Acoustic Vector-Sensor Beamforming and Capon Direction Estimation, M. Hawkes and A. Nehorai, IEEE Transactions on Signal Processing, vol. 46, no. 9, September 1998, pp. 2291-2304; see also Cramér-Rao Bounds for Direction Finding by an Acoustic Vector Sensor Under Nonideal Gain-Phase Responses, Noncollocation or Nonorthogonal Orientation, P.K. Tam and K.T. Wong, IEEE Sensors Journal, vol. 9, no. 8, August 2009, pp. 969-982. The techniques used may include timing differences (differences in time of arrival of the sensed parameter), differences in sound speed, and differences in sound pressure. Of course, acoustic sensors typically measure sound pressure levels (e.g., in decibels), and these other parameters may be used with appropriate types of acoustic sensors, including acoustic emission sensors and ultrasonic sensors or transducers.
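When two responders each report a GPS position and a bearing to the noise, the origin is the intersection of the two bearing rays. The following is a minimal planar sketch of that geometry (the function name and the coordinate convention are illustrative; real GPS coordinates would first be projected to a local metric frame, and the cited vector-sensor methods are considerably more sophisticated).

```python
import math

def sound_origin(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays (planar approximation) to estimate a
    sound source.  p1, p2 are (x, y) positions in metres; bearings are
    degrees clockwise from north, as a compass would report."""
    # Unit direction vectors: north = +y, east = +x.
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via the 2x2 cross-product form.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None  # bearings are parallel; no unique intersection
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two responders 100 m apart both hear a noise; the intersection of their
# bearings (45 degrees and 315 degrees) estimates its origin.
origin = sound_origin((0.0, 0.0), 45.0, (100.0, 0.0), 315.0)
print(origin)  # approximately (50.0, 50.0)
```

The degenerate case (parallel bearings) is exactly why a wider separation between responders, as noted later in the text, improves the estimate.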
The appropriate algorithms and all other necessary programming may be stored in the microcontroller of the eyepiece, or in memory accessible to the eyepiece. Using more than one responder, or several responders, a likely location may then be determined, and the responders can attempt to locate the person to be rescued. In other applications, responders may use these acoustic capabilities to determine the location of a person of interest in law enforcement. In still other applications, a number of persons on maneuvers may encounter hostile fire, including direct fire (line of sight) or indirect fire (out of line of sight, including high-angle fire). The same techniques described herein may be used to estimate the location of the hostile fire. If there are several persons in the area, the estimate may be more accurate, especially if the persons are separated over at least some distance, in a wider area. This may be an effective tool for directing counter-battery or counter-mortar fire against hostiles. Direct fire may also be used if the target is close enough.
An example using an embodiment of the augmented reality eyepiece is depicted in Figure 29B. In example 2900B, a number of soldiers are on patrol, each equipped with an augmented reality eyepiece, and are alert for hostile fire. The sounds detected by their acoustic sensors or microphones may be relayed to a squad vehicle as shown, to their platoon leader, or to a remote tactical operations center (TOC) or command post (CP). Alternatively, or in addition, the signals may also be sent to a mobile device, such as the airborne platform shown. Communications among the soldiers and the additional locations may be facilitated using a local area network or other network. In addition, all transmitted signals may be protected by encryption or other protective measures. One or more of the squad vehicle, platoon leader, mobile platform, TOC, or CP will have an integration capability for combining the inputs from the several soldiers and determining a possible location of the hostile fire. The signal from each soldier will include the soldier's location from the GPS capability inherent in the augmented reality glasses or eyepiece. The acoustic sensors on each soldier may indicate a possible direction of the noise. Using the signals from several soldiers, the direction and possibly the location of the hostile fire may be determined. The soldiers can then neutralize the threat at that location.
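The integration step described above, combining each soldier's GPS fix and sensed noise direction, is an overdetermined bearings-only localization problem and can be sketched as a least-squares fit. This is a simplified illustration under a planar approximation; the function name and sample coordinates are hypothetical.

```python
import numpy as np

def fuse_bearings(positions, bearings_deg):
    """Least-squares estimate of a source from several (position, bearing)
    reports, e.g. each soldier's GPS fix plus sensed noise direction.
    Each bearing defines a line; we minimise the summed squared
    perpendicular distance to all of the lines."""
    A, b = [], []
    for (x, y), brg in zip(positions, np.radians(bearings_deg)):
        dx, dy = np.sin(brg), np.cos(brg)  # bearing: degrees clockwise from north
        # The normal to the ray gives one linear constraint per report:
        # (-dy)*(X - x) + (dx)*(Y - y) = 0
        A.append([-dy, dx])
        b.append(-dy * x + dx * y)
    est, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return est  # (x, y) of the likely firing position

# Three reports whose bearings all point toward the same spot (50, 50):
pos = [(0, 0), (100, 0), (50, -50)]
brg = [45.0, 315.0, 0.0]
print(fuse_bearings(pos, brg))
```

With noisy real bearings the three lines would not meet at a point, and the least-squares solution returns the best compromise, which is why additional, well-separated reporters tighten the estimate.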
In addition to microphones, the augmented reality eyepiece may be equipped with earbuds, which, as mentioned elsewhere herein, may be articulating earbuds, may be removably attached 1403, or may be equipped with an audio output jack 1401. The eyepiece and earbuds may be equipped to deliver noise-cancelling interference, allowing the user to better hear sounds delivered from the audio-video communications capabilities of the augmented reality eyepiece or glasses, and may feature automatic gain control. The speakers or earbuds of the augmented reality eyepiece may also connect with the full audio and visual capabilities of the device, with the ability to deliver high-quality and clear sound from the included telecommunications device. As mentioned elsewhere herein, this includes radio or cellular telephone (smartphone) audio capabilities, and may also include complementary technologies, such as Bluetooth™ capabilities or related technologies, such as IEEE 802.11, for wireless personal area networks (WPAN).
Another aspect of the enhanced audio capabilities includes speech recognition and identification. Speech recognition is concerned with understanding what is said, while speech identification is concerned with understanding who the speaker is. Speech identification may work hand in hand with the facial recognition capabilities of these devices to more positively identify persons of interest. As described elsewhere herein, a camera connected as part of the augmented reality eyepiece can unobtrusively focus on desired persons, such as a single person in a crowd, or multiple faces in a crowd. Using the camera and appropriate facial recognition software, an image of the person or persons may be taken. The features of the image are then broken down into any number of measurements and statistics, and the results are compared to a database of known persons. An identification may then be made. In the same manner, a voice or voice sample may be taken from the person of interest. The sample may be marked or tagged, e.g., at a particular time interval, and labeled, e.g., with a description of the person's physical characteristics or a number. The voice sample may be compared to a database of known persons, and if the person's voice matches, an identification may be made. In embodiments, multiple individuals of interest may be selected, such as for biometric identification. The multiple selections may be made using a cursor, hand gestures, eye movements, and the like. As a result of the multiple selections, information regarding the selected individuals may be provided to the user, such as by a display, by audio, and the like.
In embodiments where the camera is used for biometric identification of multiple persons in a crowd, the control techniques described herein may be used to select the faces or irises for imaging. For example, cursor selection using a hand-worn control device may be used to select multiple faces in the view of the user's surrounding environment. In another example, gaze tracking may be used to select which faces to capture for biometric identification. In another example, a hand-worn control device may sense a gesture used to select the individuals, such as pointing at each individual.
In an embodiment, the important characteristics of a person's speech may be learned from a sample, or from many samples, of the person's voice. The samples are typically broken into segments, frames, and subframes. Typically, the important characteristics include the fundamental frequency of the person's voice, energy, formants, speaking rate, and the like. These characteristics are analyzed by software that analyzes the voice according to certain formulas or algorithms. This field is constantly changing and improving. However, such classifiers may presently include algorithms such as neural network classifiers, k-classifiers, hidden Markov models, Gaussian mixture models, pattern matching algorithms, and the like.
A general template 3100 for speech recognition and speaker identification is depicted in Figure 31. A first step 3101 is to provide a speech signal. Ideally, one has a known sample from a previous encounter with which the signal may be compared. The signal is then digitized in step 3102 and partitioned in step 3103 into fragments, such as segments, frames, and subframes. Features and statistics of the speech sample are then generated and extracted in step 3104. The classifier, or more than one classifier, is then applied in step 3105 to determine a general classification of the sample. Post-processing of the sample may then be applied in step 3106, e.g., to compare the sample to known samples for possible matching and identification. The results may then be output in step 3107. The output may be directed to the person requesting the match, and may also be recorded and sent to other persons and to one or more databases.
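The steps of template 3100 can be sketched end-to-end as follows. This is a deliberately crude illustration: the features (per-frame energy and zero-crossing rate) and the nearest-template classifier stand in for the far richer features and classifiers (GMMs, HMMs, neural networks) named above, and the synthetic sinusoidal "voices" are hypothetical test data, not real speech.

```python
import numpy as np

def frame_signal(x, frame_len=160, hop=80):
    """Step 3103: partition the digitised signal into overlapping frames."""
    frames = [x[i:i + frame_len] for i in range(0, len(x) - frame_len + 1, hop)]
    return np.array(frames)

def features(x):
    """Step 3104: crude stand-ins for energy / pitch-related statistics,
    averaged over frames into a single feature vector."""
    f = frame_signal(x)
    energy = np.mean(f ** 2, axis=1)
    zcr = np.mean(np.abs(np.diff(np.sign(f), axis=1)) > 0, axis=1)  # zero crossings
    return np.array([energy.mean(), zcr.mean()])

def identify(sample, enrolled):
    """Steps 3105-3106: classify by nearest enrolled template."""
    vec = features(sample)
    return min(enrolled, key=lambda name: np.linalg.norm(features(enrolled[name]) - vec))

# Synthetic 'voices': a low-pitch and a high-pitch sinusoid sampled at 8 kHz.
t = np.arange(8000) / 8000.0
enrolled = {"alice": np.sin(2 * np.pi * 120 * t), "bob": np.sin(2 * np.pi * 300 * t)}
unknown = np.sin(2 * np.pi * 125 * t)  # pitch closer to alice's template
print(identify(unknown, enrolled))  # alice
```

The zero-crossing rate here acts as a cheap proxy for fundamental frequency, which is why the 125 Hz "unknown" matches the 120 Hz template rather than the 300 Hz one.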
In an embodiment, the audio capabilities of the eyepiece include hearing protection with the associated earbuds. The audio processor of the eyepiece may enable automatic noise suppression, such as if a loud noise is detected near the wearer's head. Any of the control techniques described herein may be used with automatic noise suppression.
In an embodiment, the eyepiece may include a nitinol headband. The headband may be a thin band of curved metal that may either pull out from, or rotate out of, the frame of the eyepiece and extend behind the head to secure the eyepiece to the head. In one embodiment, the tip of the nitinol band may have a silicone cover such that the silicone cover is grasped to pull the band out from the ends of the temples. In embodiments, only one temple has a nitinol band, and it is secured to the other temple to form the headband. In other embodiments, both temples have nitinol bands, and both sides are pulled out to either join to form a headband, or to independently grasp portions of the head to secure the eyepiece on the wearer's head. In embodiments, the eyepiece may have interchangeable devices for attaching the eyepiece to a person's head, such as a headband, an eyeglass frame, a helmet strap, a connector that snaps onto a helmet, and the like. For example, there may be a connector near the user's temple where the eyepiece may be attached to a headband and where the headband may be disconnected, such that the user can attach an eyeglass frame so that the eyepiece takes the form of glasses, attach it to a helmet, and the like. In embodiments, the interchangeable devices for attaching the eyepiece to the user's head or a helmet may include an embedded antenna. For example, the nitinol headband may have an antenna embedded inside it, for a particular frequency, for multiple frequencies, and the like. In addition, the frame, headband, and the like may include an RF-absorbing foam to help absorb RF energy while the antenna is being used to transmit.
With reference to Figure 21, the eyepiece may include one or more adjustable wrap-around extendable temples 2134. The adjustable wrap-around extendable temples 2134 may secure the position of the eyepiece on the user's head. One or more of the extendable temples 2134 may be made of a shape-memory material. In embodiments, one or both of the temples may be made of nitinol and/or any shape-memory material. In other instances, the ends of at least one of the wrap-around extendable temples 2134 may be covered with silicone. Further, the adjustable wrap-around extendable temples 2134 may extend from the ends of the eyepiece temples 2116. They may extend telescopically and/or they may slide out from the ends of the eyepiece temples. They may slide out from within the eyepiece temples 2116, or they may slide along the outside surface of the eyepiece temples 2116. Further, the extendable temples 2134 may meet and secure to each other. The extendable temples may also attach to another portion of the head-mounted eyepiece to create a means for securing the eyepiece to the user's head. The wrap-around extendable temples 2134 may meet to secure to each other, interlock, connect, magnetically couple, or secure by other means to provide a secure attachment to the user's head. In embodiments, the adjustable wrap-around extendable temples 2134 may also be independently adjusted to attach to, or grasp, portions of the user's head. As such, the independently adjustable temples may allow the user increased customizability for a personalized fit to secure the eyepiece to the user's head. Further, in embodiments, at least one of the wrap-around extendable temples 2134 may be detachable from the head-mounted eyepiece. In yet other embodiments, the wrap-around extendable temples 2134 may be an add-on feature of the head-mounted eyepiece. In such instances, the user may choose to put extendable, non-extendable, or other temples onto the head-mounted eyepiece. For example, the temples may be sold as a kit, or as part of a kit, that allows the user to customize the eyepiece to meet his or her specific preferences. Accordingly, the user may customize the type of material from which the adjustable wrap-around extendable temples 2134 are made by selecting a different kit with specific extendable temples suited to his preferences. Accordingly, the user can customize his eyepiece for his particular needs and preferences.
In other embodiments, an adjustable strap 2142 may be attached to the eyepiece temples such that it extends around the rear of the user's head in order to secure the eyepiece in place. The strap may be adjusted to a proper fit. It may be made of any suitable material, including, but not limited to, rubber, silicone, plastic, cotton, and the like.
In an embodiment, the eyepiece may be secured to the user's head by a variety of other structures, such as rigid temples, flexible temples, gooseneck flex temples, a cable tensioning system, and the like. For example, flexible temples may be constructed from a flexible tube, such as in a gooseneck configuration, where the flexible temple may be bent into position to adjust the fit for a given user, and where the flexible temple may be re-shaped on demand. In another example, flexible temples may be constructed from a cable tensioning system, such as in a robot-finger configuration, where the flexible temple has a plurality of linked segments joined at joints, each segment being manipulated into a bent shape by tension applied through cables running through the joints and segments. In this instance, the cable tensioning system may implement an articulating ear horn for large adjustments, the eyepiece headwear-retention cable tensioning system may have two or more links, and the cables may be stainless steel, nitinol-based, electrically actuated, ratcheted, wheel-adjusted, and the like.
Embodiments of a cable tensioning system 17800 are shown in Figures 178, 179A, and 179B. In embodiments, the cable tensioning system may include an ear horn 17802 composed of a plurality of linked segments joined at joints, each segment being manipulated into a bent shape by tension applied through a cable 17804 running through the joints and/or segments. In the mounted position shown in Figure 178, the ear horn may be positioned straight along the user's head. The cable 17804 may be attached through an adjuster 17808, and adjusting its position thereby increases the tension in the cable 17804 and causes the ear horn to bend or curve to fit the shape of the user's head. By increasing such tension, the ear horn may tighten and/or become more rigid. By conforming to the user's head, the ear horn 17802 may be adjusted for a particular user's head to hold the eyepiece securely on the user's head and/or to assist in retention of the eyepiece. In embodiments, as the tension in the cable 17804 is increased, the ear horn becomes more rigid, or less slack, to grip the user's head, and as the tension in the cable 17804 is released, the ear horn becomes more flexible, thereby allowing one or both ear horns to be extended and/or folded. In embodiments, the adjuster 17808 may be a ratchet, electrically actuated, a wheel adjuster including a wedge, and the like. In embodiments, the wedge may be a tapered adjustment member that may be pushed in or pulled out, such as by a pull tab, to provide adjustment, allowing one or more portions of the ear horn and/or eyepiece to be raised or lowered in position. In embodiments, as shown in Figure 179B, the ear horn 17804 may be configured and shaped in a robot-finger configuration. The adjustable ear horns described herein may provide the benefit of folding conveniently while providing an easy-to-use means of securing the eyepiece to the user's head. In embodiments, the ear horns may provide a wrap-around design, wherein the ear horns on the left and right sides of the eyepiece wrap around the user's head and contact, or nearly contact, the rear of the user's head. In embodiments, the ear horns may fasten to each other to provide increased security. Such fastening may be accomplished by magnets on the ear horns, fastening teeth on the ear horns, and the like. In embodiments, the ear horns may partially or completely wrap around the user's head, or conform to the contour of the user's head, and/or they may secure to the user's head along the sides of the head and/or behind the user's ears. In embodiments, the ear horns may attach to the earphones of the eyepiece, such as the earphones 2104 shown in Figure 22. The ear horns may attach to the earphones permanently or removably. In embodiments, as shown in Figure 180, the ear horns may include a portion of the earphones of the eyepiece, or they may include the entire earphones (not shown). In embodiments, the adjuster 17808 may be located on the portion of the ear horn immediately adjacent to the eyepiece, on the portion of the ear horn passing over the user's ear, at the end of the ear horn, and/or at any other location on the ear horn and/or eyepiece. In embodiments, as described herein, one or both of the ear horns may be adjustable. In embodiments as described herein, as shown in Figure 184, the ear horns (shown alone, without the eyepiece) may wrap around the user's head and/or conform to the contour of the user's head.
In embodiments, a switchable attractive force between layers of a laminate may be used for the ear horns. For example, one or more of the ear horns may include layers in a laminate, and the attractive force between the layers may come from magnetic, electrostatic, and/or vacuum means. In embodiments, magnets may be used in such a way that the magnetic poles are rotated to an attracting or repelling position, allowing the layers of the laminate to attract each other so that the ear horn stiffens, or to repel each other so that the ear horn relaxes. In embodiments in which the layers of the laminate are held together, a voltage may be applied to create an electrostatic attraction that can be switched electrically. As the attractive force is created, the ear horn may stiffen. When the voltage is removed, or the electrostatic attraction is switched off, the ear horn may relax. In embodiments, a vacuum may be created by laminating two layers together, where the two layers are bonded together and have a resilience in one or more portions of the layers such that a cavity or space is created between the resilient layers to create the vacuum. As the layers are pulled together, they may cause the ear horn to stiffen. The vacuum seal may be broken to allow the ear horn to relax. In various embodiments, as the ear horns are stiffened, they may provide a more rigid and/or secure retention of the eyepiece on the user's head. In embodiments, the ear horns may partially or completely wrap around the user's head, or conform to the contour of the user's head, and/or they may secure to the user's head along the sides of the head, behind the user's ears, and/or at the rear of the user's head. As the electrostatic voltage, magnetic polarity, and/or vacuum is adjusted, the ear horns may stiffen to allow them to secure to the user's head, or may relax or release to allow them to extend and/or fold into a closed position.
In embodiments, one or more of the ear horns may include an internal rod and/or cable structure, wherein each ear horn further includes a magnet. The magnets of the ear horns may connect with each other, thereby allowing the two ear horns to wrap around the user's head. The act of connecting the magnets may allow the cables and/or internal rod structure to be drawn taut, providing a more secure fit to the user's head. In embodiments, by connecting the magnets, the internal rods of the ear horns may stand up or become stiffer to allow the ear horns to wrap around the user's head, and/or the internal cables may tighten to allow the ear horns to wrap around the user's head. In embodiments, the ear horns may partially or completely wrap around the user's head, or conform to the contour of the user's head, and/or they may secure to the user's head along the sides of the head and/or behind the user's ears. When the magnets are not connected, the ear horns may extend and/or may be folded.
In embodiments, one or more of the ear horns may employ air pressure in a cavity inside the ear horn, which may stiffen the ear horn. The air pressure may be increased to stiffen the ear horn. Such stiffening may allow the ear horn to be fitted to, and/or wrap around, the user's head while the eyepiece is in use. In embodiments, the ear horns may partially or completely wrap around the user's head, or conform to the contour of the user's head, and/or they may secure to the user's head along the sides of the head and/or behind the user's ears. The air pressure may be decreased to relax the ear horn. When the ear horns are relaxed, they may extend and/or be folded. The air pressure may be adjusted before or after the eyepiece is placed on, or removed from, the user's head. In embodiments, the air pressure may be adjusted by a pump in the side frame operated by finger pressure or otherwise. In embodiments, the pump may be adjusted via a user interface displayed in the glasses or via other means.
In the embodiments described herein, the stiffness of the ear horn may relate to its thickness by a cubic relationship. As an example, two unconnected layers are up to twice as stiff as a single layer; however, if the layers are joined into a single layer, the combined layer of double thickness will have a stiffness increased eight times. As a further example, three separate layers have three times the stiffness of a single layer, but three layers joined together will have 27 times the stiffness of a single layer.
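The numbers above follow directly from the standard beam-bending result that stiffness scales with the cube of thickness: n loose layers each bend independently (n times one layer), while n bonded layers act as one beam of thickness n·t (n³ times one layer). A minimal sketch of that arithmetic:

```python
def relative_stiffness(n_layers, bonded):
    """Bending stiffness of a stack relative to a single layer: unbonded
    layers simply add (n), while bonded layers act as one beam of
    thickness n*t, and beam stiffness scales with thickness cubed (n**3)."""
    return n_layers ** 3 if bonded else n_layers

print(relative_stiffness(2, bonded=False))  # 2   (two loose layers)
print(relative_stiffness(2, bonded=True))   # 8   (2**3)
print(relative_stiffness(3, bonded=False))  # 3
print(relative_stiffness(3, bonded=True))   # 27  (3**3)
```

This is why interlocking or pinning the layers together, rather than merely stacking them, is what makes the ear horn rigid.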
In embodiments, as shown in Figure 181, one or more of the ear horns may include inner and outer portions, whereby the inner portion is formed from one part of the ear horn and the outer portion is formed from another part of the ear horn. The inner and outer portions may be formed from a bifurcation in the ear horn, or otherwise formed from the ear horn, to create two separate portions, one of which is the outer portion and the other of which is the inner portion. In embodiments, the inner portion may contact the user's head, and the outer portion may contact the inner portion. In embodiments, the inner and outer portions may interlock, as shown in the embodiment depicted in Figure 182. The inner and outer portions may include interlocking grooves, teeth, or other means by which they interlock or join. The top and/or outer portion may include a pull tab or other protrusion by which the user may cause the inner and outer portions to no longer lock together. In embodiments, the portions may be bent to the user's head. Further, the inner portion may press outward against the outer portion. By interlocking the inner and outer portions, the thickness of the portions may be doubled. Therefore, by increasing the thickness of the ear horn portions, the stiffness may be increased. In embodiments, by doubling the thickness of the ear horn, the stiffness may be increased eight times compared to a single layer. Peeling off the outer layer may return the ear horn portion to a flexible state, thereby allowing the ear horn to be folded. In embodiments, the ear horns may attach by magnets, clips, hooks, or otherwise in order to secure to the user's head.
Further, in embodiments, as depicted in Figure 183, one or more of the ear horns may include three portions. In such embodiments, the ear horn may include the inner and outer portions as described with reference to Figures 181 and 182; however, the embodiment may also include a middle portion 18302, such that the ear horn is composed of three portions, as shown in Figure 183. The ear horn may further include one or more buttons, snaps, interlocking grooves, teeth, pins, or other means of locking the portions of the ear horn together. One or more of the portions may include a pull tab or other protrusion by which the user may cause the inner and outer portions to no longer lock together, by releasing the teeth or other means by which the portions are locked together. In embodiments, three unconnected layers may have three times the stiffness of a single layer, but when the three layers are locked or joined together, the ear horn may have 27 times the stiffness of a single layer. When the three portions are not connected or locked together, the ear horns may be flexible, such that they may be extended and/or folded. Further, while the portions are not locked together, the portions may slide relative to each other, thereby allowing them to be flexible and more easily stowed when not in use; when the layers are locked or pinned together, they may not slide relative to each other. The portions of the ear horn may reside within a sheath, tube, or other structure comprising the ear horn, such that the individual portions are not exposed. While ear horns with two and three portions have been described, those skilled in the art will appreciate that, in embodiments, an ear horn may be composed of more than three portions and/or portions of varying thickness.
In the embodiments described herein, the wrap-around ear horns may be foldable. When the ear horns are folded into a closed position (such as when the user is not using the eyepiece), the ear horns may straighten so that they fold, and the ability of the ear horns to wrap around, or conform to the contour of, the user's head and/or ears may not interfere with folding. In the embodiments described herein, the ear horns may fold and thereby lie flat, thus allowing the eyepiece to be stowed in a flat configuration. In embodiments, when released at a hinge or otherwise, the ear horns may straighten, thereby allowing the eyepiece to be folded. As described herein, in various embodiments, the ear horns may become less rigid, thereby allowing them to fold.
In embodiments, a leveling pad may be used on one or more ear horns, enabling them to accommodate different positions of the user's ears or eyes, such as ears that are not at the same height. In embodiments, a pad may be placed at the point where the ear horn contacts the user's ear, to adjust the eyepiece to fit different positions of the user's ears and/or eyes. In embodiments, the leveling pad may be adjusted by a wedge or by various other means. The leveling pad may be part of the ear horn, or the leveling pad may be attached to the ear horn by a clip, adhesive, friction, or other means.
In the embodiments described herein, the eyepiece and ear horns may be fitted with closed-cell foam on one or more areas that contact the user. The foam may provide comfort to the user while also preventing moisture and sweat from penetrating the foam. In addition, the closed-cell foam provides a non-porous surface that prevents the eyepiece from carrying bacteria, microorganisms, and other organisms and prevents their growth. In the embodiments described herein, the foam may be antimicrobial and/or antibacterial, and/or treated with substances for such purposes.
In one embodiment, the eyepiece may include security features, such as M-Shield Security, secure content, DSM, secure runtime, IPsec, and the like. Other software features may include: user interface, applications, frameworks, BSP, codecs, integration, testing, system validation, and the like.
In one embodiment, the eyepiece materials may be selected to achieve ruggedization.
In one embodiment, the eyepiece may include a 3G radio, or may be able to access a 3G access point, an 802.11b connection, and a Bluetooth connection, so that data can hop from one device to a 3G-enabled eyepiece embodiment.
The present disclosure also relates to methods and apparatus for capturing biometric data about individuals. The methods and apparatus provide wireless capture of an individual's fingerprints, iris patterns, facial structure, and other unique biometric features, and then send the data to a network or directly to the eyepiece. Data collected from an individual may also be compared with previously collected data and used to identify a particular individual.
In embodiments, the eyepiece 100 may be associated with a mobile biometric device such as a bio-print flashlight 7300, a bio-phone 8000, a biometric camera, a pocket biometric device 5400, an arm-strap biometric device 5600, or the like, where the mobile biometric device may operate as a standalone device or communicate with the eyepiece, such as for control of the device, display of data from the device, storage of data, linking to external systems, linking to other eyepieces and/or other mobile biometric devices, and the like. Mobile biometric devices may enable soldiers or other personnel to collect, or make use of existing, biometric data to profile a given individual. The device may provide tracking, monitoring, and collection of biometric records, such as including video, voice, gait, face, and iris biometric features, and the like. The device may geo-locate collected data, tagging it with time, date, place, data collector, environment, and so on. For example, using thin-film sensors, the device may record, collect, identify, and verify faces, fingerprints, irises, latent fingerprints, latent palm prints, voice, pocket litter, and other identifying marks and environmental data, and may be able to capture and record fingerprints, palm prints, scars, marks, tattoos, audio, video, annotations, and the like. The device may be able to read wet or dry prints. The device may include a camera, such as with IR illumination, UV illumination, and the like, with the ability to see through dust, smoke, haze, and so on. The camera may support extended dynamic range, adaptive defect pixel correction, advanced sharpness enhancement, geometric distortion correction, advanced color management, hardware-based face detection, video stabilization, and the like. In embodiments, the camera output may be transmitted to the eyepiece for presentation to the soldier. Depending on requirements, the device may accommodate a number of other sensors, such as those described herein, including accelerometers, compasses, ambient light sensors, proximity sensors, barometric pressure sensors, temperature sensors, and the like. The device may also have a mosaic print sensor as described herein, generating high-resolution images of an individual's fingerprints, of the whorls and ridge lines of multiple fingerprints simultaneously, of palm prints, and the like. Soldiers may use mobile biometric devices to more easily collect personal information, such as for Document and Media Exploitation (DOMEX). For example, during interviews, enrollment, interrogation, and the like, an operator may photograph and read identifying data or 'pocket litter' (e.g., passports, ID cards, personal documents, cell-phone directories, photos), collect biometric data, and enter it into a person-of-interest profile, which may be entered into a searchable, secure database. In embodiments, archival biometric data may be captured using best-available images plus manual entry, to achieve complete data capture. Data may be automatically geo-located, time/date-stamped, and filed to a digital dossier, such as with a locally or network-assigned globally unique identifier (GUID). For example, a facial image may be captured at an IED (improvised explosive device) blast site, a left iris image may be captured at a suicide-bombing scene, and a latent fingerprint may be lifted from a sniper rifle, each collected at a different place and time with a different mobile biometric device, and together the multiple inputs may identify a person of interest, such as at a vehicle checkpoint.
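The geo-located, time-stamped, GUID-filed record described above can be sketched as a simple data structure; the field names here are illustrative assumptions, not the patent's own schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

@dataclass
class BiometricRecord:
    """One captured biometric sample with provenance metadata:
    automatically geo-located, time/date-stamped, and assigned a GUID."""
    modality: str                 # e.g. "face", "iris", "latent_fingerprint"
    payload: bytes                # raw image or audio sample
    latitude: float
    longitude: float
    collector: str                # who collected the data
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    guid: str = field(default_factory=lambda: str(uuid.uuid4()))

# Example: an iris sample collected at a checkpoint
rec = BiometricRecord("iris", b"\x00...", 34.52, 69.18, "operator-7")
```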
Further embodiments of the eyepiece may be used to provide biometric data collection and result reporting. The biometric data may be visual biometric data, such as facial biometric data or iris biometric data, or may be audio biometric data. Figure 39 depicts an embodiment providing biometric data capture. The assembly 3900 incorporates the eyepiece 100 discussed above in connection with Figure 1. The eyepiece 100 provides an interactive head-mounted eyepiece that includes an optical assembly. Other eyepieces providing similar functionality may also be used. The eyepiece may also incorporate global positioning system capability to allow location information to be displayed and reported.
The optical assembly allows the user to view the surrounding environment, including individuals in the vicinity of the wearer. An embodiment of the eyepiece allows the user to biometrically identify nearby individuals using facial images and iris images, or both facial and iris images, or audio samples. The eyepiece incorporates a corrective element that corrects the user's view of the surrounding environment, and also displays content provided to the user through an integrated processor and image source. The integrated image source introduces the content to be displayed to the user into the optical assembly.
The eyepiece also includes an optical sensor for capturing biometric data. In one embodiment, the integrated optical sensor may incorporate a camera mounted on the eyepiece. This camera is used to capture biometric images of an individual near the eyepiece user. The user directs the optical sensor or camera toward a nearby individual by positioning the eyepiece in the appropriate direction, which may be done simply by looking at the individual. The user may select whether to capture one or more of a facial image, an iris image, or an audio sample.
The biometric data capturable by the eyepiece shown in Figure 39 includes facial images for facial recognition, iris images for iris recognition, and audio samples for voice identification. The eyepiece 3900 incorporates multiple microphones 3902 in an endfire array configuration disposed along both the right and left temples of the eyepiece. The microphone arrays 3902 are specifically tuned to allow capture of human voices in environments with high levels of ambient noise. The microphones may be directional, steerable, and switchable. The microphones 3902 provide selectable options for improved audio capture, including omnidirectional operation or directional-beam operation. Directional-beam operation allows the user to record audio samples from a particular individual by steering the microphone array toward the direction of the target individual. An adaptive microphone array may be created that allows the operator to steer the direction of the microphone array in three dimensions, where the directional beam may be adjusted in real time to maximize signal from, or minimize interfering noise for, a moving target. Array processing allows summation of cardioid elements by analog or digital means, with switching possible between omnidirectional and directional array operation. In embodiments, beamforming, array steering, adaptive array processing (voice source localization), and the like may be performed by the onboard processor. In one embodiment, the microphones may be capable of 10 dB directional recording.
Audio biometric capture is enhanced by incorporating phased-array audio and video tracking for audio and video capture. Audio tracking allows audio samples to be captured continuously while the target individual moves through an environment containing other noise sources. In embodiments, the user's own voice may be subtracted from the track so that a clearer reproduction of the target individual is possible, such as for distinguishing what was said, providing better position tracking, providing better audio tracking, and the like.
To power the display optics and the biometric data collection, the eyepiece 3900 also incorporates a lithium-ion battery 3904, capable of operating for more than twelve hours on a single charge. In addition, the eyepiece 100 incorporates a processor and solid-state memory 3906 for processing the captured biometric data. The processor and memory are configurable to work with any software or algorithm used as part of a biometric capture protocol or format, such as the .wav format.
A further embodiment of the eyepiece assembly 3900 provides an integrated communications capability that transmits the captured biometric data to a remote facility that stores the biometric data in a biometric data database. The biometric data database interprets the captured biometric data and prepares content for display on the eyepiece.
In operation, a wearer of the eyepiece desiring to capture biometric data from a nearby observed individual positions himself or herself so that the individual appears in the field of view of the eyepiece. Once in position, the user initiates capture of the biometric information. Biometric information that may be captured includes iris images, facial images, and audio data.
In operation, a wearer of the eyepiece desiring to capture audio biometric data from a nearby observed individual positions himself or herself so that the individual is near the eyepiece, specifically near the microphone arrays located in the eyepiece temples. Once in position, the user initiates capture of the audio biometric information. The audio biometric information consists of a recorded sample of the target individual speaking. Audio samples may be captured together with visual biometric data, such as iris and facial images.
To capture an iris image, the wearer/user observes the desired individual and positions the eyepiece so that the optical sensor assembly or camera can collect an image of the desired individual's biometrics. Once the image is captured, the eyepiece processor and solid-state memory prepare the captured image for transmission to a remote computing facility for further processing.
The remote computing facility receives the transmitted biometric image and compares the transmitted image with previously captured biometric data of the same type. Iris or facial images are compared with previously collected iris or facial images to determine whether the individual has been previously encountered and identified.
Once the comparison has been made, the remote computing facility transmits a report of the comparison to the wearer/user's eyepiece for display. The report may indicate that the captured biometric image matches a previously captured image. In such cases, the user receives a report including the identity of the individual, along with other identifying information or statistics. Not all captured biometric data allows an unambiguous determination of identity. In such cases, the remote computing facility provides a report of findings and may request that the user collect additional biometric data, possibly of a different type, to aid the identification and comparison process. Visual biometric data may be supplemented with audio biometric data as a further aid to identification.
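The capture-compare-report loop described above can be sketched as follows. This is an illustrative sketch only: the matcher, the score thresholds, and all names are assumptions, not the facility's actual implementation.

```python
MATCH_T, INCONCLUSIVE_T = 0.90, 0.60  # assumed score thresholds

def match_score(sample: bytes, enrolled: bytes) -> float:
    """Placeholder for the remote facility's biometric matcher;
    a real matcher would return a graded similarity score."""
    return 1.0 if sample == enrolled else 0.0

def compare_and_report(sample: bytes, database: dict[str, bytes]) -> dict:
    """Compare a captured sample against enrolled records and build
    the report sent back to the eyepiece for display."""
    best_id, best = None, 0.0
    for person_id, enrolled in database.items():
        score = match_score(sample, enrolled)
        if score > best:
            best_id, best = person_id, score
    if best >= MATCH_T:
        return {"status": "match", "identity": best_id}
    if best >= INCONCLUSIVE_T:
        # Ambiguous result: ask the wearer for a different modality
        return {"status": "inconclusive", "request": "additional_biometric"}
    return {"status": "no_match"}
```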
Facial images are captured in a manner similar to iris images. Due to the size of the image collected, the field of view must be larger. This also permits the user to stand further away from the target whose facial biometric data is being captured.
In operation, the user may initially capture a facial image of the individual. However, the facial image may be incomplete or inconclusive, because the individual may be wearing clothing or other apparel, such as a hat, that obscures facial features. In this case, the remote computing facility may request that a different type of biometric capture be used and additional images or data be transmitted. In the case described above, the user may be directed to obtain an iris image to supplement the captured facial image. In other instances, the additionally requested data may be an audio sample of the individual's voice.
Figure 40 illustrates capturing an iris image for iris recognition. The figure illustrates the focus parameters used to analyze the image, and includes the geographic location of the individual at the time of biometric data capture. Figure 40 also depicts a sample report displayed on the eyepiece.
Figure 41 illustrates capture of multiple types of biometric data, in this instance facial and iris images. The capture may be done simultaneously, or upon request by the remote computing facility when the first type of biometric data leads to an inconclusive result.
Figure 42 shows the electrical configuration of the multiple microphone arrays contained in the temples of the eyepiece of Figure 39. The endfire microphone arrays allow greater discrimination of the signal, and better directionality, at a greater distance. Signal processing is improved by incorporating a delay into the transmission line of the rear microphone. The use of dual omnidirectional microphones enables switching from an omnidirectional microphone to a directional microphone. This allows better direction finding for audio capture of a desired individual. Figure 43 illustrates the directionality improvements attainable with different microphone configurations.
As shown in the top portion of Figure 43, a single omnidirectional microphone may be used. The microphone may be placed at a given distance from the sound source, and the sound pressure or directivity index (DI) at the microphone will be at a given dB level. Instead of a single microphone, multiple microphones or a microphone array may be used. For example, two microphones may be placed at twice the distance from the source, for a distance factor of 2 and a sound-pressure increase of 6 dB. Alternatively, four microphones may be used, for a distance factor of 2.7 and an increase of 8.8 dB. Arrays may also be used. For example, an eight-microphone array at a distance factor of 4 may have a DI increase of 12 dB, and a twelve-microphone array at a distance factor of 5 may have a DI increase of 13.2 dB. The diagram of Figure 43 depicts the points from which a given sound pressure level produces the same signal level at the microphone. As shown in Figure 43, a first-order supercardioid microphone may be used at the same distance, in this example with a 6.2 dB increase, as may a second-order microphone. Multiple microphones may be arranged in a composite microphone array. Instead of using one standard high-quality microphone to capture an audio sample, the eyepiece temple pieces house multiple microphones of different character. This may be provided when the user is generating a biometric fingerprint of someone's voice for future capture and comparison. One example of multiple-microphone use employs microphones from cut-off cell phones to reproduce the exact electrical and acoustic properties of the individual's voice. The sample is stored in a database for future comparison. If the individual's voice is later captured, the earlier sample is available for comparison, and because the acoustic properties of the two samples will match, a match will be reported to the eyepiece user.
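Several of the distance-factor/dB pairs quoted above (factor 2 at 6 dB, factor 4 at 12 dB) match the free-field inverse-distance law, under which sound pressure falls 6 dB per doubling of distance; a microphone arrangement whose gain offsets that loss can stand proportionally farther away. A small sketch of that relationship (the array figures at factors 2.7 and 5 deviate slightly from this idealization):

```python
import math

def distance_factor_gain_db(distance_factor: float) -> float:
    """Free-field gain (dB) needed to offset moving the microphone
    distance_factor times farther from the source, per the
    inverse-distance law: 20 * log10(d2 / d1)."""
    return 20.0 * math.log10(distance_factor)

# distance factor 2 -> ~6 dB (two-microphone case),
# distance factor 4 -> ~12 dB (eight-microphone array case)
```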
Figure 44 shows the use of adaptive arrays to improve audio data capture. By modifying pre-existing algorithms for audio processing, adaptive arrays may be created that allow the user to steer the directionality of the array in three dimensions. Adaptive array processing permits location of the source of speech, thus tying the captured audio data to a specific individual. Array processing permits simple summing of the cardioid elements of the signal to be done either digitally or using analog techniques. In normal use, the user may switch the microphones between omnidirectional and directional array modes. The processor allows beamforming, array steering, and adaptive array processing to be performed on the eyepiece. In embodiments, the audio phased array may be used for audio tracking of a particular individual. For example, the user may lock onto the audio signature of a certain individual in the surrounding environment (such as obtained in real time or from a database of voice signatures) to track that individual's position without maintaining eye contact or moving the user's head. The individual's position may be projected to the user through the eyepiece display. In embodiments, tracking of a certain individual may also be provided by the embedded camera in the eyepiece, where the user is likewise not required to maintain eye contact with, or move their head to follow, the individual. That is, in either the audio or the visual tracking case, the eyepiece may be able to track the individual through the local environment without the user needing to exhibit physical motion indicating that tracking is occurring, even as the user shifts their gaze.
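The simplest form of the beamforming and array steering mentioned above is delay-and-sum: delay each microphone channel so that a wavefront arriving from the chosen look direction lines up across channels, then sum. The following is a minimal sketch with integer-sample delays, not the eyepiece's actual adaptive algorithm:

```python
import numpy as np

def delay_and_sum(signals: np.ndarray, mic_positions: np.ndarray,
                  direction: np.ndarray, fs: float,
                  c: float = 343.0) -> np.ndarray:
    """Steer a microphone array toward `direction` (unit vector from the
    array toward the source) by delaying each channel so the plane
    wavefront aligns, then averaging.

    signals: (n_mics, n_samples); mic_positions: (n_mics, 3) in meters;
    fs: sample rate in Hz; c: speed of sound in m/s.
    """
    proj = mic_positions @ direction / c     # larger proj -> hears earlier
    delays = proj - proj.min()               # delay early channels to align
    shifts = np.round(delays * fs).astype(int)
    n = signals.shape[1]
    out = np.zeros(n)
    for sig, s in zip(signals, shifts):
        if s > 0:
            out[s:] += sig[:n - s]           # apply integer-sample delay
        else:
            out += sig
    return out / len(signals)
```

Signals arriving from the look direction add coherently, while off-axis noise adds incoherently, which is the source of the directional gain discussed with Figure 43.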
In one embodiment, the integrated camera may continuously record a video file, and the integrated microphone may continuously record an audio file. The integrated processor of the eyepiece may enable event tagging within long sections of a continuous audio or video recording. For example, a full day of passive recording may be tagged whenever an event, conversation, encounter, or other item of interest takes place. Tagging may be accomplished through the explicit press of a button, a noise or physical tap, a hand gesture, or any other control technique described herein. A marker may be placed in the audio or video file, or stored in a metadata header. In embodiments, the marker may include the GPS coordinates of the event, conversation, encounter, or other item of interest. In other embodiments, the marker may be time-synchronized with a GPS log of the day. Other logic-based triggers may also tag the audio or video file, such as proximity relationships to other users, devices, locations, and the like. Event tags may be active event tags triggered manually by the user, passive event tags that occur automatically (such as through pre-programming, through an event-profile management facility, and the like), location-sensitive tags triggered by the user's location, and so on. The event triggering an event tag may be initiated by a sound, a sight, a visual marker, a reception from a network connection, an optical trigger, an audio trigger, a proximity trigger, a time trigger, a geo-spatial trigger, and the like. The event trigger may generate feedback to the user (such as an audio tone, a visual indicator, a message), store information (such as a file, a document, an entry in a list, an audio file, a video file, and the like), generate a transmission of information, and so on.
In one embodiment, the eyepiece may be used as SigInt (signals intelligence) glasses. Using one or more of the integrated WiFi, 3G, or Bluetooth radios, the eyepiece may be used to unobtrusively and passively gather signals intelligence for devices and individuals in the user's proximity. Signals intelligence may be gathered automatically, or may be triggered when a particular device ID is in proximity, when a particular audio sample is detected, when a particular geographic location is reached, and the like.
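The trigger conditions listed above (nearby device ID, geographic position) reduce to simple predicates over sensor inputs. A minimal sketch, with an assumed watch list and a deliberately crude planar geofence test:

```python
def should_collect(nearby_ids: set[str], watch_ids: set[str],
                   lat: float, lon: float,
                   geofence: tuple[float, float, float]) -> bool:
    """Decide whether to begin passive signals collection: trigger on a
    watched device ID in radio range, or on entering a geofenced area.
    geofence = (center_lat, center_lon, radius_deg); the flat-earth
    distance test is an illustrative simplification."""
    if nearby_ids & watch_ids:          # a watched device is in range
        return True
    glat, glon, radius_deg = geofence   # crude circular fence in degrees
    return (lat - glat) ** 2 + (lon - glon) ** 2 <= radius_deg ** 2
```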
Various embodiments of the tactical glasses may include separately identifying or collecting biometric information, geo-locating a person of interest (POI) at a safe distance using visual biometric information (face, iris, walking gait), and positively identifying POIs with robust sparse recognition algorithms for face and iris. The glasses may include a hands-free display serving as a biometric computer interface, merging of fingerprint and visual biometric information on one integrated display (with augmented target highlighting), and checking for matches and alerts without alerting the POI. The glasses may include location awareness, such as displaying current and average speed plus the route and ETA (estimated time of arrival) to a destination, and preloading or recording trouble spots and exfiltration routes. The glasses may include real-time networked tracking of blue and red forces, to always know where friendly forces are, to achieve visual separation ranges between blue and red forces, and to geo-locate hostile forces and share their positions in real time. A processor associated with the glasses may include OCR translation and voice translation capability.
Tactical glasses may be used in combat to provide a graphical user interface projected on the lens, the graphical user interface providing the user with directions and augmented-reality data about things such as: team-member position data, map information for an area, SWIR/CMOS night vision, the soldier's vehicle S/A, a laser rangefinder for geo-locating POIs or larger targets at 500 meters with accuracy typically better than 2 meters, S/A blue-force range rings, geo-located Domex registration, battlefield AR patch overlays, and real-time UAV video. In one embodiment, the laser rangefinder may be a 1.55-micron eye-safe laser rangefinder.
The eyepiece may utilize GPS as described herein and inertial navigation (such as utilizing an inertial measurement unit), such as those described herein, to provide position and direction accuracy. However, the eyepiece may enhance position and direction accuracy with additional sensors and associated algorithms, such as utilizing a three-axis digital compass, inclinometer, accelerometer, gyroscope, and the like. For example, military operations may require greater positional accuracy than is available from GPS alone, and so other navigation sensors may be used in combination to augment the positional accuracy of GPS.
The tactical glasses may feature enhanced resolution, such as 1280 × 1024 pixels, and may feature auto-focus.
In dismounted operations and engagements with enemy forces, the undisputed key to winning low-intensity, low-density, asymmetric warfare is effective information management. The tactical glasses system combines unobtrusive data logging with an intuitive tactical display synthesizing a picture for situational awareness, delivering the ES2 (Every Soldier is a Sensor) capability.
In embodiments, the tactical glasses may include one or more waveguides integrated into the frame. In certain embodiments, total-internal-reflection lenses in a monocular or binocular flip-up/flip-down configuration may be affixed to a pair of ballistic glasses. The tactical glasses may include omnidirectional earbuds for enhanced hearing and protection, and a noise-canceling boom microphone for clear communication of voice commands.
In certain embodiments, the waveguide may have contrast control. Contrast may be controlled using any control technique described herein, such as gesture control, automatic sensor control, or manual control using a temple-mounted controller.
The tactical glasses may include a non-slip, adjustable elastic head strap. The tactical glasses may include a snap-in corrective lens insert.
In certain embodiments, a total-internal-reflection lens is affixed to a helmet-mounted device, as in Figure 74, and may include a day/night VIS/NIR/SWIR CMOS color camera. The device utilizes a 'see-through', flip-up, electro-optic projected-image display, allowing an unobstructed view of threats and of the soldier's own weapon. The helmet-mounted device shown in Figure 74A may include an IR/SWIR illuminator 7402, a UV/SWIR illuminator 7404, a visible-to-SWIR wide-angle lens 7408, a visible-to-SWIR objective lens (not shown), a transparent viewing pane 7410, an iris-recognition objective lens 7412, a laser emitter 7414, a laser receiver 7418, or any other sensor, processor, or technology described herein in association with the eyepiece, such as an integrated IMU, an eye-safe laser rangefinder, an integrated GPS receiver, a compass and inclinometer for positional accuracy, see-through control that changes the viewing angle of the image to match eye position, electronic image stabilization and real-time enhancement, a threat library stored onboard or stored remotely and accessed over the tactical network, and the like. A body-worn wireless computer may interoperate with the device of Figure 74. The helmet-mounted device includes visible-to-SWIR projector optics, such as RGB micro-projector optics. Multi-spectral IR and UV imaging helps identify counterfeit or altered documents. The helmet-mounted device may be controlled with an encrypted wireless UWB wristband or a weapon-mounted fore-grip controller.
In one embodiment, the transparent viewing pane 7410 may rotate through 180° to project the image onto a surface for sharing with others.
Figure 74B shows an exploded side view of the helmet-mounted device. The device may include an ambidextrous mount for mounting on the left or right side of the helmet. In certain embodiments, two devices may be mounted on the left and right sides of the helmet to allow binocular viewing. The device or devices may snap into a standard MICH or PRO-TECH helmet mount.
Today, soldiers cannot effectively utilize data devices on the battlefield. The tactical glasses system combines a low-profile form, lightweight materials, and a fast processor for quick and accurate decision-making on the battlefield. The modular design of the system allows the equipment to be efficiently deployed to an individual, squad, or company, while retaining the ability to interoperate with any battlefield computer. The tactical glasses system incorporates real-time access to data. Using the onboard computer interface, an operator can view, upload, or compare data in real time. This provides valuable situational and environmental data that can be promptly broadcast to all networked individuals, command posts (CP), and tactical operations centers (TOC).
Figures 75A and 75B depict an exemplary embodiment of biometric and situational awareness glasses in front view and side view, respectively. The embodiment may include multiple field-of-view sensors 7502 for biometric information collection, situational awareness, and an augmented-view user interface; a fast-lock GPS receiver and IMU (including a 3-axis digital compass, gyroscope, accelerometer, and inclinometer for position and direction accuracy); a 1.55-micron eye-safe laser rangefinder 7504 to aid biometric information capture and targeting; an integrated digital video recorder storing to two flash SD cards; real-time electronic image stabilization and real-time image enhancement; a threat library stored on an onboard micro-SD card or loaded remotely over the tactical network; flip-up photochromic lenses 7508; a flexible noise-canceling boom microphone 7510; and 3-axis removable stereo earbuds with an enhanced hearing and protection system. For example, the multiple field-of-view sensors 7502 may permit a 100° × 40° FOV, which may be panoramic SXGA. For example, the sensors may be VGA sensors, SXGA sensors, and VGA sensors whose stitched 100° × 40° FOV produces a panoramic SXGA view on the display of the glasses. The display may be semi-transparent and have see-through control, where the see-through control changes the viewing angle of the image to match eye position. The embodiment may also include SWIR detection so that the wearer sees 1064 nm and 1550 nm laser designation invisible to the enemy, and may be characterized by: ultra-low-power 256-bit AES-encrypted links between glasses, tactical radio, and computer; instant 2× magnification; automatic face tracking; face and iris recording and recognition with an automatic recognition range of 1 meter; and GPS geo-location. The embodiment may include a power supply, such as 24-hour-duration 4-AA alkaline, lithium, or rechargeable battery packs, and computer and memory expansion slots with waterproof, dust-proof seals. In one embodiment, the glasses include a curved holographic waveguide.
In embodiments, the eyepiece may be able to sense lasers, such as those used in battlefield targeting. For example, sensors in the eyepiece may be able to detect lasers in typical military laser transmission bands (e.g., 1064 nm, 1550 nm). In this way, the eyepiece may be able to detect whether its own position is being targeted, whether another location is being targeted, the position of a spotter using a laser as a targeting aid, and the like. Further, because the eyepiece may be able to sense laser light either directly or by reflection, a soldier may not only detect hostile laser sources directed or reflected at their position, but may also provide their own laser source to locate optical surfaces (such as eyes) in a battlefield scene. For example, a soldier may scan the battlefield with a laser and use the eyepiece to watch for laser reflections returned from eyes as possible positions of observing enemies. In embodiments, the eyepiece may continuously scan the surrounding environment for lasers and provide feedback and/or action based on the result of the detection, such as an audible alarm to the soldier, a visual indicator on the eyepiece display indicating the position, and the like.
In certain embodiments, a pocket camera may record video and capture pictures, allowing an operator to record environmental data into a mobile, lightweight, rugged biometric device sized to fit in a pocket for analysis. One embodiment may be 2.25" × 3.5" × 0.375", may perform face capture at 10 feet and iris capture at 3 feet, and may record voice, pocket litter, walking gait, and other identifying marks and environmental data in EFTS- and EBTS-compliant formats compatible with any iris/facial algorithm. The device is designed to capture specific images pre-qualified to comply with EFTS/EBTS/NIST/ISO/ITL 1-2007, for matching and filing by any biometric matching software or user interface. The device may include an HD video chip, a 1 GHz processor with a 533 MHz DSP, a GPS chip, active illumination, and pre-qualification algorithms. In certain embodiments, the pocket bio cam may collect biometrics without being tied to a biometric watch list, and so may be used at all echelons and/or for police reserve operations. Data may be automatically geo-located and date/time-stamped. In certain embodiments, the device may run the Linux SE operating system, meet MIL-STD-810 (Military Standard 810) environmental standards, and be waterproof to a depth of 3 feet (about 1 meter).
In one embodiment, the device for fingerprint collection may be referred to as a bio-print device. The bio-print device comprises a transparent platen with two beveled edges. The platen is illuminated by a bank of LEDs and imaged by one or more cameras. Multiple cameras are used, arranged closely together and pointed at the beveled edge of the platen. A finger or palm is placed on, and pressed against, the upper surface of the platen, where the cameras capture the ridge pattern. The image is recorded using frustrated total internal reflection (FTIR). In FTIR, light escapes the platen across the air gaps formed by the ridges and valleys of the finger or palm pressed against the platen.
Other embodiments are also possible. In one embodiment, multiple cameras are placed in an inverted "V" arrangement in a sawtooth pattern. In another embodiment, a rectangle is formed and light passing directly through its side is used, with a camera array capturing the resulting images. The light enters the rectangle through its side, while the cameras are beneath the rectangle, so that the cameras can capture the ridges and valleys illuminated by the light passing through the rectangle.
After the images are captured, software is used to stitch the images from the multiple cameras together. A custom FPGA may be used for the digital image processing.
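The stitching step described above can be sketched in simplified form. The following is an illustrative example only, not the patent's FPGA implementation: it assumes the camera tiles arrive as a row-major grid of equally sized pixel arrays whose neighbors share a known number of overlapping pixels, and merges them by trimming the duplicated overlap. The function name and overlap convention are assumptions for illustration.

```python
# Hypothetical sketch of stitching images from a camera array into one image.
# Assumes a row-major grid of equally sized tiles whose neighbors share
# `overlap` duplicated pixels along each touching edge.

def stitch_tiles(tiles, overlap):
    """tiles: 2-D grid (list of rows) of equally sized 2-D pixel lists.
    Returns one stitched 2-D pixel list with the overlap trimmed."""
    rows_out = []
    for r, tile_row in enumerate(tiles):
        tile_height = len(tile_row[0])
        # Rows duplicated by the tile above have already been emitted.
        top = overlap if r > 0 else 0
        for y in range(top, tile_height):
            line = []
            for c, tile in enumerate(tile_row):
                # Columns duplicated by the tile to the left are trimmed.
                left = overlap if c > 0 else 0
                line.extend(tile[y][left:])
            rows_out.append(line)
    return rows_out
```

In a real sensor the overlap region would more likely be blended (or used for registration) rather than simply cropped, and lens distortion would be corrected first, but the bookkeeping is the same.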
Once captured and processed, the images may be streamed to a remote display, such as a smart phone, computer, handheld device, or eyepiece, or to another device.
The description above provides an overview of the operation of the disclosed methods and devices. Additional description and discussion of these and other embodiments is provided below.
Figure 45 illustrates the construction and layout of an optics-based fingerprint and palm print system according to an embodiment. The optical array consists of approximately 60 wafer-level cameras 4502. The optics-based system uses continuous perimeter illumination 4503, 4504 for high-resolution imaging of the whorls and ridge lines that make up a fingerprint or palm print. This configuration provides a low-profile, lightweight, extremely rugged configuration. Durability is enhanced by a scratch-resistant, transparent platen.
The mosaic print sensor uses a frustrated total internal reflection (FTIR) optical faceplate to supply images to an array of wafer-level cameras mounted on a PCB-like substrate 4505. The sensor may be scaled to any flat width and length, with a depth of about 1/2". Sizes may range from a plate small enough to capture just one rolled fingerprint to a plate large enough to capture the prints of both hands simultaneously.
The mosaic print sensor allows the operator to capture prints and compare the collected data against an onboard database. Data may also be uploaded and downloaded wirelessly. The unit may operate as a standalone unit or may be integrated with any biometric system.
In operation, the mosaic print sensor provides high reliability in harsh environments with excessive sunlight. To provide this capability, multiple wafer-level optical sensors are digitally stitched together using pixel reduction. The resulting images are designed to exceed 500 dots per inch (dpi). Power is supplied by a battery or drawn parasitically from other sources using the USB protocol. Formatting complies with EFTS, EBTS, NIST, ISO, and ITL 1-2007.
Figure 46 illustrates the traditional optical approach used by other sensors. This approach is likewise based on FTIR (frustrated total internal reflection). In the figure, the finger contacts a prism and scatters the light, and a camera captures the scattered light. The ridges of the print on the finger appear as dark lines, while the valleys of the fingerprint appear as bright lines.
Figure 47 shows the approach used by the mosaic sensor 4700. The mosaic sensor also uses FTIR. However, the plate is illuminated from the side, and the internal reflections are contained within the plate of the sensor. As shown in the image in the lower portion of the figure, the ridges of the fingerprint contact the prism and scatter the light, allowing the camera to capture the scattered light. The ridges on the finger appear as bright lines, while the valleys appear as dark lines.
Figure 48 depicts the layout of the mosaic sensor 4800. An LED array is arranged around the perimeter of the plate. Beneath the plate are the cameras used to capture the fingerprint image. The image is captured on this bottom plate, referred to as the capture plane. The capture plane is parallel to the sensor plane on which the finger is placed. The thickness of the plate, the number of cameras, and the number of LEDs may vary with the size of the active capture area of the plate. The thickness of the plate may be reduced by adding mirrors that fold the optical path of the cameras, reducing the required thickness. Each camera should cover one inch of space, with some pixels overlapping between cameras. This allows the mosaic sensor to achieve 500 ppi. The cameras may have a 60-degree field of view; however, there may be significant distortion in the images.
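The "one inch of coverage per camera, with some overlap" rule above implies a simple sizing calculation for the camera array. The following back-of-envelope sketch is an assumption-laden illustration (the helper name, overlap value, and ceiling-based tiling rule are not from the patent): it estimates how many cameras tile a platen when each camera covers a fixed span and adjacent spans overlap by a fixed amount.

```python
import math

def cameras_needed(plate_w_in, plate_h_in, cover_in=1.0, overlap_in=0.1):
    """Estimate the camera count for a platen of plate_w_in x plate_h_in inches,
    where each camera covers cover_in inches and neighbors overlap by overlap_in."""
    step = cover_in - overlap_in  # effective advance per additional camera
    cols = math.ceil((plate_w_in - overlap_in) / step)
    rows = math.ceil((plate_h_in - overlap_in) / step)
    return rows * cols
```

For a hypothetical 5" × 2.5" single-hand platen with 1" coverage and 0.1" overlap, this gives a 6 × 3 grid, i.e. 18 cameras; a full two-hand plate would scale up accordingly, consistent with the roughly 60-camera array of Figure 45.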
Figure 49 shows an embodiment 4900 of the fields of view, and their interaction, of the multiple cameras used in the mosaic sensor. Each camera covers a small capture region. This region depends on the camera's field of view and the distance between the camera and the top surface of the plate. α is half the horizontal field of view of the camera, and β is half the vertical field of view of the camera.
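The geometry described for Figure 49 can be made concrete: for a camera at distance d below the top surface, the captured footprint is 2·d·tan(α) wide and 2·d·tan(β) high. The sketch below is an illustrative reading of that geometry (function names are assumptions), including a helper relating sensor pixel count and footprint width to delivered resolution in pixels per inch.

```python
import math

def capture_region(d, alpha_deg, beta_deg):
    """Footprint (width, height) on the platen for a camera at distance d
    with half field-of-view angles alpha (horizontal) and beta (vertical).
    Units of the result follow the units of d."""
    w = 2 * d * math.tan(math.radians(alpha_deg))
    h = 2 * d * math.tan(math.radians(beta_deg))
    return w, h

def resolution_ppi(sensor_px_across, region_width_in):
    """Pixels per inch delivered across the captured width (inches)."""
    return sensor_px_across / region_width_in
```

For example, a camera with the 60-degree field of view mentioned for Figure 48 (α = 30°) at a distance of about 0.87" yields a footprint close to one inch wide, and would then need at least 500 pixels across that footprint to meet the 500 ppi target.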
The mosaic sensor may be incorporated into the bio-phone and tactical computer illustrated in Figure 50. The bio-phone and tactical computer uses a complete mobile computer architecture, combining a dual-core processor, DSP, 3D graphics accelerator, 3G-4G, Wi-Fi (per 802.11a/b/g/n), Bluetooth 3.0, and a GPS receiver. The bio-phone and tactical computer delivers capability comparable to a standard laptop in a phone-sized package.
Figure 50 illustrates the components of the bio-phone and tactical computer. The bio-phone and tactical computer assembly 5000 provides a display screen 5001, speaker 5002, and keyboard 5003 housed in enclosure 5004. These elements are visible on the front of the bio-phone and tactical computer assembly 5000. Located on the back of the assembly are a camera 5005 for iris imaging, a camera 5006 for face imaging and video recording, and a bio-print fingerprint sensor 5009.
To provide secure communications and data transmission, the device incorporates selectable 256-bit AES encryption together with COTS sensors and pre-audited software for biometric capture from a POI. The software matches and archives via any approved biometric matching software for sending and receiving secure, perishable voice, video, and data communications. In addition, the bio-phone supports the Windows Mobile, Linux, and Android operating systems.
The bio-phone is a 3G-4G-enabled handheld device for reaching web portals and biometrics-enabled watch list (BEWL) databases. These databases allow live comparison of captured biometric images and data. The device is designed to fit in a standard LBV or pocket. In embodiments, the bio-phone and tactical computer may use a mobile computer architecture featuring: a dual-core processor, DSP, 3D graphics accelerator, 3G-4G, Wi-Fi (802.11a/b/g/n), Bluetooth 3.0 over secure and civilian networks, a GPS receiver, a WVGA sunlight-readable capacitive touch display capable of outputting stereoscopic 3D video, a tactile backlit QWERTY keyboard, onboard storage, support for multiple operating systems, and the like, in a lightweight design that provides the capability of a laptop computer.
The bio-phone can search, collect, enroll, and verify multiple types of biometric data, including face, iris, two-finger fingerprints, and biographic data. The device also records video, voice, gait, identifying marks, and pocket litter. Pocket litter includes the various small items normally carried in a pocket, wallet, or pack, and may include items such as spare change, identification cards, passports, credit cards, and the like. Figure 52 shows a typical collection of this kind of information. Depicted in Figure 52 is an example of a collection 5200 of pocket items. The types of items that may be included are personal documents and photographs 5201, books 5202, notebooks and paper 5203, and documents such as a passport 5204.
The bio-phone and tactical computer may include cameras capable of biometric data capture and video conferencing, such as high-definition still and video cameras. In embodiments, the eyepiece camera and video conferencing capabilities described herein may be used together with the bio-phone and tactical computer. For example, a camera integrated into the eyepiece may capture images and communicate them to the bio-phone and tactical computer, and vice versa. Data may be exchanged between the eyepiece and the bio-phone, and either may establish or share a network connection. In addition, the bio-phone and tactical computer may be housed in a rugged, fully militarized construction, tolerating militarized temperature ranges, being waterproof (to a depth of 5 meters), and the like.
Figure 51 illustrates an embodiment 5100 of using the bio-phone to capture latent fingerprints and palm prints. Fingerprints and palm prints are captured at 1000 dpi, with scale overlay, under active illumination from ultraviolet diodes. Both fingerprints and palm prints 5100 may be captured using the bio-phone.
Using the GPS capability, data collected by the bio-phone is automatically geo-located and date and time stamped. Data may be uploaded or downloaded and compared against onboard or networked databases. This data transfer is facilitated by the 3G-4G, Wi-Fi, and Bluetooth capabilities of the device. Data entry may be performed with the QWERTY keyboard or by other available methods, such as a stylus or touch screen. Biometric data is archived after the most specific image available has been collected. Manual input allows partial data capture. Figure 53 shows the interaction 5300 between digital dossier images maintained in a database and a biometric watch list. The biometric watch list is used to compare data captured in the field against previously captured data.
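The automatic geo-location and date/time tagging described above amounts to wrapping each raw capture in a small tagged record before archiving or upload. The sketch below is a minimal illustration under stated assumptions: the function name, record fields, and field names are hypothetical, standing in for whatever EFTS/EBTS record structure the device actually emits.

```python
import datetime

def tag_capture(sample, lat, lon, kind, when=None):
    """Wrap a raw biometric capture with geo-location and UTC date/time tags.
    `when` may be supplied for testing; it defaults to the current UTC time."""
    when = when or datetime.datetime.utcnow()
    return {
        "kind": kind,                     # e.g. "fingerprint", "iris", "face"
        "data": sample,                   # raw sensor bytes
        "utc": when.isoformat(),          # automatic date/time stamp
        "geo": {"lat": lat, "lon": lon},  # from the onboard GPS receiver
    }
```

A record produced this way can then be queued for comparison against the onboard database or for encrypted upload when a network link is available.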
Formatting of the biometric data using the EFTS, EBTS, NIST, ISO, and ITL 1-2007 formats provides compatibility with a range of databases.
The specifications of the bio-phone and tactical computer are given below:
Operating temperature: -22°C to +70°C
Connectivity I/O: 3G, 4G, WLAN a/b/g/n, Bluetooth 3.0, GPS, FM
Connectivity output: USB 2.0, HDMI, Ethernet
Physical dimensions: 6.875" (height) × 4.875" (width) × 1.2" (depth)
Weight: 1.75 lbs.
Processor: dual-core 1GHz processor, 600MHz DSP, and 30M-polygon-per-second 3D graphics accelerator
Display: 3.8" WVGA (800 × 480) sunlight-readable, transflective, capacitive touch screen; scalable display output for driving 3 1080p high-definition screens simultaneously
Operating systems: Windows Mobile, Linux SE, Android
Storage: 128GB solid-state drive
Additional storage: dual SD card slots for an additional 128GB of storage
Memory: 4GB RAM
Cameras: 3 high-definition still and video cameras: face, iris, and conference (user's face)
3D support: capable of outputting stereoscopic 3D video
Camera sensor support: extended sensor dynamic range, adaptive defect pixel correction, advanced sharpness enhancement, geometric distortion correction, advanced power management, hardware-based (HW) face detection, video stabilization
Biometrics: onboard optical 2-fingerprint sensor; face, DOMEX, and iris cameras
Sensors: accelerometer, compass, ambient light sensor, proximity sensor, barometric pressure sensor, and temperature sensor may be added as required
Battery: less than 8 hours; 1400mAh rechargeable lithium-ion, hot-swappable battery pack
Power: various power supply options for continuous operation
Software features: face/gesture detection, noise filtering, pixel correction
Powerful display processor with overlay, rotation, and resizing capabilities
Audio: onboard microphone, speakers, and audio/video input
Keyboard: full tactile QWERTY keyboard with adjustable backlighting
Additional devices and kits may also incorporate the mosaic sensor and may work together with the bio-phone and tactical computer to provide a complete field solution for collecting biometric data.
One such device is the pocket bio-kit illustrated in Figure 54. The components of the pocket bio-kit 5400 include a GPS antenna 5401, a bio-print sensor 5402, and a keyboard 5404, contained in enclosure 5403. The specifications of the bio-kit are given below:
Dimensions: 6" × 3" × 1.5"
Weight: 2 lbs. total
Processor and memory: 1GHz OMAP processor
650MHz core
3D accelerator capable of processing up to 18 million polygons per second
64KB L2 cache
166MHz, 32-bit FSB
1GB embedded PoP memory, expandable to up to 4GB NAND
64GB solid-state hard drive
Display: 75mm × 50mm, 640 × 480 (VGA) sunlight-readable LCD with anti-glare, anti-reflective, scratch-resistant screen treatment
Interface: USB 2.0
10/100/1000 Ethernet
Power: battery operated; approximately 8 hours of continuous enrollment at approximately 5 minutes per enrollment
Embedded capabilities: mosaic sensor optical fingerprint reader
Digital iris camera with active IR illumination
Digital face and DOMEX camera (visible) with flash
Fast-lock GPS
The features of the bio-phone and tactical computer may also be provided in the bio-kit for a biometric data collection system, which folds into a rugged, compact enclosure. Data is collected using biometric standards-compliant images and data formats, and can be cross-referenced against the Department of Defense biometrics authoritative database for near-real-time data communication.
The pocket bio-kit shown in Figure 55 can capture latent fingerprints and palm prints at 1000 dpi, with scale overlay, under active illumination from ultraviolet diodes. The bio-kit has a 32GB memory storage card and can interoperate with combat radios or computers to upload and download data under real-time battlefield conditions. Power is provided by lithium-ion batteries. The components of the bio-kit assembly 5500 include a GPS antenna 5501, a bio-print sensor 5502, and an enclosure 5503 with base 5505.
Biometric data collection is geo-located for monitoring and tracking the movement of individuals. The bio-kit may be used to collect fingerprints and palm prints, iris images, face images, latent fingerprints, and video, and to enroll them in a database. Algorithms for fingerprints and palm prints, iris images, and face images facilitate collection of these data types. To help capture iris images and latent fingerprint images, the bio-kit has IR and UV diodes for actively illuminating an iris or a latent fingerprint. In addition, the pocket bio-kit is fully EFTS/EBTS compliant, including ITL 1-2007 and WSQ. The bio-kit meets MIL-STD-810 for operation in extreme environments and uses the Linux operating system.
To capture images, the bio-kit uses a high-dynamic-range camera with wavefront coding for maximum depth of field, ensuring that the details in latent fingerprints and iris images are captured. Once an image is captured, real-time image enhancement software and image stabilization set about improving readability and providing excellent visual discrimination.
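As a toy illustration of the kind of readability improvement a real-time enhancement stage might apply (this is not the patent's algorithm; the function and its min-max stretch are assumptions for illustration), a simple contrast stretch remaps faint intensity variation, such as the low-contrast ridge detail of a latent print, onto the full output range:

```python
# Toy min-max contrast stretch over a flat list of grayscale pixel values.
def contrast_stretch(pixels, out_max=255):
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        # A flat image carries no contrast to stretch.
        return [0] * len(pixels)
    # Map [lo, hi] linearly onto [0, out_max].
    return [round((p - lo) * out_max / (hi - lo)) for p in pixels]
```

Production enhancement pipelines would instead use local, adaptive methods (and stabilization across frames), but the goal is the same: make faint ridge and iris detail visually discriminable.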
The bio-kit can also record video, storing full-motion (30 fps) color video in the onboard "camera-on-a-chip".
The eyepiece 100 may dock with a mobile, fold-out biometric enrollment kit (i.e. bio-kit) 5500, a biometric data collection system folded into a compact, rugged enclosure such that, when opened, it becomes a small workstation for collecting biometric data such as fingerprint, iris, and face recognition data, latent fingerprints, and the like, as described herein. As with other mobile biometric devices, the mobile fold-out biometric enrollment kit 5500 may be used as a standalone device or in association with the eyepiece 100, as described herein. In one embodiment, the mobile fold-out biometric enrollment kit can fold into a small form factor, such as 6" × 3" × 1.5", and weigh, for example, 2 lbs. It may include a processor, digital signal processor, 3D accelerator, front-side bus (FSB), solid-state memory (e.g. package-on-package (PoP)), hard disk drive, embedded battery, display (e.g. a 75mm × 50mm, 640 × 480 (VGA) sunlight-readable LCD with anti-glare, anti-reflective, scratch-resistant screen), USB, Ethernet, mosaic optical fingerprint reader, digital iris camera (e.g. with active IR illumination), digital face and DOMEX camera with flash, fast-lock GPS, and the like. Data may be collected in biometric standards-compliant image and data formats, which can be cross-referenced against the Department of Defense biometrics authoritative database for near-real-time data communication. The device may be able to collect the biometric data and geographic locations of persons of interest for monitoring and tracking, with wireless data upload/download and the like using combat radios or computers with standard networking interfaces.
In addition to the bio-kit, the mosaic sensor may be incorporated into a wrist-mounted fingerprint, palm print, geo-location, and POI enrollment device, as shown in Figure 56. The eyepiece 100 may dock with the biometric device 5600, which straps onto a soldier's wrist or forearm and folds open into a biometric data collection system for collecting biometric identification data such as fingerprints, iris recognition data, and the like, as described herein. The device may have an integrated computer, keyboard, sunlight-readable display, biometric sensing platen, and the like, so that an operator can quickly store data, or compare it remotely, for collection and identification purposes. For example, the armband biometric sensing platen may be used to scan a palm, fingerprints, and the like. The device may provide geo-location tagging of persons of interest and collect data tagged with time, date, location, and the like. As with other mobile biometric devices, the biometric device may be used as a standalone device or in association with the eyepiece 100, as described herein. In one embodiment, the biometric device may be small and light enough to be worn comfortably on a soldier's arm, such as with an active fingerprint and palm print sensor measuring 5" × 2.5" and a weight of 16 ounces. Algorithms for fingerprint and palm capture may be present. The device may include a processor, digital signal processor, transceiver, QWERTY keyboard, a large weather-resistant pressure-actuated print sensor, a sunlight-readable transflective QVGA color backlit LCD display, an internal power supply, and the like.
In one embodiment, the wrist-mounted assembly 5600 includes the following elements in enclosure 5601: strap 5602, setup and on/off buttons 5603, protective cover 5604 for the sensor, pressure-actuated sensor 4405, and keyboard with LCD screen 5606.
The fingerprint, palm print, geo-location, and POI enrollment device includes an integrated computer, QWERTY keyboard, and display. The display is designed to allow easy operation in strong sunlight, and uses the LCD screen or LED indicators to alert the operator that a fingerprint or palm print has been successfully captured. The display uses a transflective QVGA color backlit LCD screen to improve readability. The device is lightweight and compact, weighing 16 ounces and measuring 5" × 2.5" at the mosaic sensor. This compact size and weight allow the device to slide into an LBV pocket or be strapped to the user's forearm, as shown in Figure 56. As with other devices incorporating the mosaic sensor, all POIs are tagged with geo-location information at the time of capture.
The size of the sensor screen allows ten-finger, palm, four-finger slap, and fingertip capture. The sensor incorporates a large pressure-actuated print sensor for rapid enrollment at a rate of 500 dpi under any weather condition, as specified by MIL-STD-810. Software algorithms support both a fingerprint capture mode and a palm print capture mode, and the Linux operating system is used for device management. Capture is rapid thanks to the 720MHz processor with a 533MHz DSP. This processing capability delivers correctly formatted images to the system software of any existing approved system. In addition, the device is fully EFTS/EBTS compliant, including ITL 1-2007 and WSQ.
As with other mosaic sensor devices, a removable UWB wireless 256-bit AES transceiver is used to enable wireless communication. This also provides secure upload to, and download from, biometric databases stored outside the device.
Power is provided by lithium polymer or AA alkaline batteries.
The wrist-mounted device described above may also be used together with other devices, including an augmented reality eyepiece with data and video display, as shown in Figure 57. The assembly 5700 includes the following components: eyepiece 5702 and bio-print sensor device 5700. The augmented reality eyepiece provides redundant, binocular, stereoscopic sensors and display, and provides the ability to see in a range of lighting conditions, from glaring midday sun to the extremely low light levels of night. Operation of the eyepiece is simple, using a rotary switch located on the temple of the eyepiece, and a user can access data from an arm-mounted computer or sensor, or from a laptop device. The eyepiece also provides omnidirectional earbuds for hearing protection and improved hearing. A noise-cancelling boom microphone may also be incorporated into the eyepiece to provide better communication of voice-differentiated commands.
The eyepiece is able to communicate wirelessly with the bio-phone sensor and the forearm-mounted device using UWB with 256-bit AES encryption. This also allows the device to communicate with laptops or combat radios, and to network with a CP, TOC, and biometric databases. The eyepiece is ABIS, EBTS, EFTS, and JPEG 2000 compatible.
Similar to the other mosaic sensor devices described above, the eyepiece uses networked GPS and RF filter arrays to provide highly accurate geo-location of a POI.
In operation, the low-profile forearm-mounted computer and tactical display integrates face, iris, fingerprint, palm print, and fingertip collection and identification. The device also records video, voice, gait, and other distinguishing characteristics. Face and iris tracking is automatic, allowing the device to help identify non-cooperative POIs. Using the transparent display provided by the eyepiece, the operator may also view sensor imagery, moving maps, applications with overlays for navigation and targeting, individuals or other targets/POIs located from sensor positions or other information, UAVs and the like, data, and the biometric data being captured.
Figure 58 shows a further embodiment of the fingerprint, palm print, geo-location, and POI enrollment device. The device weighs 16 ounces (approximately 450 grams) and uses a 5" × 2.5" active capacitive fingerprint and palm print sensor. The sensor can enroll ten fingers, palms, four-finger slap prints, and fingertip prints at 500 dpi. A 0.6-1GHz processor with a 430MHz DSP provides rapid enrollment and data capture. The hardware is ABIS, EBTS, EFTS, and JPEG 2000 compatible, and features networked GPS for highly accurate positioning of persons of interest. In addition, the device communicates wirelessly with laptops or combat radios over UWB with 256-bit AES encryption. Database information may also be stored on the device, allowing on-the-spot comparison without uploading information. This onboard data may also be shared wirelessly with other devices, such as laptops or combat radios.
A further embodiment of the wrist-mounted bio-print sensor assembly 5800 includes the following elements: bio-print sensor 5801, wrist strap 5802, keyboard 5803, and combat radio connector interface 5804.
Data may be stored on the forearm device, as the device can use military-connector (Mil-con) data storage caps to increase storage capacity. Data entry is performed on the QWERTY keyboard and can be done while wearing gloves.
The display is a sunlight-readable, transflective QVGA, color backlit LCD display. In addition to operating in strong sunlight, the device can operate in a wide range of environments, as it meets the MIL-STD-810 requirements for service in extreme conditions.
The mosaic sensor described above may also be incorporated into a mobile, fold-out biometric enrollment kit, as shown in Figure 59. The mobile fold-out biometric enrollment kit 5900 folds into itself and is sized to fit in a tactical vest pocket, measuring 8 × 12 × 4 inches when unfolded.
Figure 60 illustrates an embodiment 6000 of how the eyepiece and the forearm-mounted device may interface to provide a complete system for biometric data collection.
Figure 61 provides a system diagram 6100 for the mobile fold-out biometric enrollment kit.
In operation, the mobile fold-out biometric enrollment kit allows a user to search, collect, identify, verify, and enroll the face, iris, palm print, fingertip, and biographic data of a subject, and can also record voice samples, pocket litter, and other visible identifying marks. Once collected, the data is automatically geo-located and date and time stamped. The collected data may be searched and compared against onboard and networked databases. For communicating with databases not on the device, wireless data upload/download is provided using combat radios or laptops with standard networking interfaces. Formatting complies with EFTS, EBTS, NIST, ISO, and ITL 1-2007. Pre-audited images may be sent directly to matching software, as the device can use any matching and enrollment software.
The devices and systems described above provide a comprehensive solution for mobile biometric data collection, identification, and situational awareness. The devices can collect fingerprint, palm print, fingertip, face, iris, voice, and video data for identification of non-cooperative persons of interest (POIs). Video is captured using high-speed video, enabling capture under unstable conditions, such as capture from moving video. The captured information is easily shared, and additional data may be entered via the keyboard. In addition, all data is tagged with date, time, and geo-location. This facilitates rapid dissemination of the information necessary for situational awareness in potentially volatile environments. With more personnel equipped with these devices, additional data collection is also possible, demonstrating the concept that "every soldier is a sensor." Sharing is facilitated by integrating the biometric devices with combat radios and battlefield computers.
In embodiments, the eyepiece may utilize flexible thin-film sensors, such as integrated into the eyepiece itself, into an external device that docks with the eyepiece, and the like. A thin-film sensor may comprise a thin multilayer electromechanical configuration that generates an electrical signal upon a sudden contact force or a continuously varying force. Typical uses of electromechanical thin-film sensors include both on-off electrical switching on force sensing and time-resolved force sensing. Thin-film sensors may include switches, force gauges, and the like, where the thin-film sensor may rely on the following effects: abrupt electrical contact (switching), gradual change of electrical impedance under force, gradual release of electric charge under pressure, generation of a progressive electromotive force as a conductor moves through a magnetic field, and the like. For example, flexible thin-film sensors may be used in force-pressing sensors, with tiny force-sensitive pixels forming a two-dimensional force sensor array. This may be useful for: touch screens for computers, smart phones, notebooks, and MP3-like devices, especially those with military applications; screens for controlling anything under computer control (including unmanned aerial vehicles (UAVs), target drones, mobile robots, and exoskeleton-based equipment); and the like. Thin-film sensors may be useful in security applications, such as remote or local sensors for detecting intrusion, or the opening or closing of equipment, windows, gear, and the like. Thin-film sensors may be useful for trip-wire detection, such as together with the electronics used in silent, remote trip-wire detectors and radios. Thin-film sensors may be used for open-close detection, or as force sensors for detecting stress and strain in vehicle compartments, ship hulls, aircraft decks, and the like. Thin-film sensors may be used as biometric sensors, such as when taking fingerprints, palm prints, fingertip prints, and the like. Thin-film sensors may be useful for leak detection, such as detecting leaking tanks, storage facilities, and the like. Thin-film sensors may be useful in medical sensors, such as for detecting fluids or blood outside the body. These sensor applications are intended as illustrations of the many applications in which thin-film sensors may be employed in association with the eyepiece's ability to control and monitor external devices, and are not intended to be limiting in any way.
Figure 62 illustrates an embodiment 6200 of a thin-film fingerprint and palm print collection device. The device can record four-fingerprint slaps and rolls, palm prints, and fingerprints meeting the NIST standard. Excellent-quality fingerprint images can be captured with wet or dry hands. The device is reduced in weight and power consumption compared with other, larger sensors. In addition, the sensor is self-contained and hot-swappable. The configuration of the sensor may be altered to accommodate various needs, and the sensor may be manufactured in various shapes and sizes.
Figure 63 depicts an embodiment 6300 of a fingerprint, palm print, and enrollment data collection device. The device records fingertip prints, rolls, slaps, and palm prints. A built-in QWERTY keyboard allows entry of written enrollment data. As with the devices described above, all data is tagged with the date, time, and geo-location of collection. A built-in database provides onboard matching of potential persons of interest against the built-in database. Matching may also be performed against other databases via the battlefield network. The device may be integrated with the optical biometric collection eyepiece described above to support face and iris recognition.
The specifications for the fingerprint, palm, and enrollment device are given below:
Weight and size: 16 ounces; forearm strap or insertion into an LBV pocket
5" × 2.5" fingerprint/palm print sensor
5.75" × 2.75" QWERTY keyboard
3.5" × 2.25" LCD display
One-handed operation
Environment: operates in all weather conditions, −20 °C to +70 °C
Waterproof: 1 meter for up to 4 hours with no reduction in performance
Biometric collection: fingerprint and palm print collection and identification
Keyboard and LCD display for person-of-interest enrollment
Holds more than 30,000 complete template portfolios (2 irises, 10 fingerprints, facial image, 35 fields of biographic information) for onboard matching of persons of interest
All collected biometric data is tagged with time, date, and location
Pressure-capacitance fingerprint/palm print sensor
30 fps high-contrast bitmap images
1000 dpi
Wireless: fully interoperable with battlefield radios and handheld or laptop computers; 256-bit AES encryption
Battery: dual 2000 mAh lithium polymer batteries
Battery life greater than 12 hours; batteries quick-change in 15 seconds
Processing and memory: 256 MB flash and 128 MB SDRAM; supports 3 SD cards of up to 32 GB each
600 MHz–1 GHz ARM Cortex-A8 processor
1 GB RAM
Figures 64-66 depict use of devices incorporating the sensor to collect biometric data. Figure 64 shows an embodiment 6400 of a two-stage palm print capture. Figure 65 shows collection 6500 using a fingertip tap. Figure 66 illustrates an embodiment 6600 of collecting slap and roll prints.
The discussion above relates to methods of collecting biometric data in which a platen or touch screen is used to obtain fingerprints or palm prints, as shown in Figures 62-66. The present disclosure further includes methods and systems for touchless or contactless fingerprinting using polarized light. In one embodiment, fingerprints may be collected by illuminating a person with a polarized light source and retrieving fingerprint images from reflected light polarized in two planes. In another embodiment, fingerprints may be collected by illuminating a person with a light source and retrieving fingerprint images using multispectral processing, such as with two imagers at two different locations having different inputs. The different inputs may result from using different filters or different sensors/imagers. Applications of this technology may include biometric checking of unknown persons or objects in situations where the safety of the person performing the check may be at risk.
In this method, an unknown person or object may approach a checkpoint, for example in order to be allowed to proceed further to his or her destination. As depicted in the system 6700 shown in Figure 67, the person P and the appropriate body part (such as a hand, palm P, or other part) are illuminated by a polarized light source 6701. As is well known to persons skilled in the optical arts, the polarized light source may simply be a lamp or other light source with a polarizing filter, emitting light polarized in one plane. The light travels to the person located in the region designated for contactless fingerprinting, so that the polarized light is incident on the person P's fingers or other body part. The incident polarized light is then reflected from the fingers or other body part, traveling outward from the person in all directions. Two imagers or cameras 6704 receive the reflected light after it has passed through optical elements such as a lens 6702 and a polarizing filter 6703. The cameras or imagers may be mounted on augmented reality glasses, as discussed above with respect to Figure 8F.
The light thus travels from the palm or one or more fingers of the person of interest through different polarizing filters 6704a, 6704b and then to the imagers or cameras 6705. The light passing through the polarizing filters may differ in orientation by 90° (horizontal and vertical) or by another angle, such as 30°, 45°, 60°, or 120°. The cameras may be digital cameras with suitable digital imaging sensors that convert the incident light into suitable signals. The signals are then processed by suitable processing circuitry 6706, such as digital signal processors. The signals may then be combined in a conventional manner, such as by a digital microprocessor 6707 with memory. The digital processor with suitable memory is programmed to produce data suitable for an image of the palm, fingerprint, or other desired image. The digital data from the imagers may then be combined in this process, for example using the techniques of U.S. Patent No. 6,249,616 or others. As noted above in this disclosure, the combined "image" may then be checked against a database to determine the identity of the person. The augmented reality glasses may include such a database in memory, or may refer the signal data elsewhere 6708 for comparison and checking.
A process for obtaining contactless fingerprints, palm prints, or other biometric prints is disclosed in the flow chart of Figure 68. In one embodiment, a polarized light source is provided 6801. In a second step 6802, the person of interest and the selected body part are positioned for illumination by the light. In another embodiment, incident white light may be used rather than a polarized light source. When the image is ready to be captured, the light is reflected 6803 from the person to two cameras or imagers. A polarizing filter is placed in front of each of the two cameras so that the light received by the cameras is polarized 6804 in two different planes (such as horizontal and vertical planes). Each camera then detects 6805 the polarized light. The cameras or other sensors then convert the incident light into signals or data 6806 suitable for preparing an image. Finally, the images are combined 6807 to form a clear, reliable print. The result is an image of very high quality, which may be compared with digital databases to identify the person and to detect persons of interest.
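The final combining step 6807 is not specified in detail; a minimal sketch of one plausible fusion of the two polarization-plane images is shown below, assuming the images are already registered. The per-pixel contrast heuristic is illustrative only, not the method of the cited U.S. Patent No. 6,249,616.

```python
import numpy as np

def combine_polarized_images(img_h: np.ndarray, img_v: np.ndarray) -> np.ndarray:
    """Fuse two 8-bit grayscale images taken through orthogonal
    polarizing filters.

    Ridge detail appears with different contrast in each polarization
    plane; this toy fusion keeps, per pixel, whichever sample deviates
    more from its image mean (a crude local-contrast measure).  A
    production system would register the images and use a
    multi-resolution fusion instead.
    """
    if img_h.shape != img_v.shape:
        raise ValueError("images must be registered to the same shape")
    h = img_h.astype(np.float64)
    v = img_v.astype(np.float64)
    contrast_h = np.abs(h - h.mean())
    contrast_v = np.abs(v - v.mean())
    fused = np.where(contrast_h >= contrast_v, h, v)
    return np.clip(fused, 0, 255).astype(np.uint8)
```

The fused array would then be passed to the matcher for comparison against the database, as described in the flow chart.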
It should be understood that while digital cameras are used in this contactless system, other imagers may be used, such as active pixel imagers, CMOS imagers, imagers that image at multiple wavelengths, CCD cameras, photodetector arrays, TFT imagers, and the like. It should also be understood that while polarized light is used to create two different images, other variations in the reflected light may also be used. For example, instead of polarized light, white light may be used with different filters applied to the imagers, such as a Bayer filter, a CYGM filter, or an RGBE filter. In other embodiments, the polarized light source may be eliminated, with natural or white light used instead of polarized light.
The use of touchless or contactless fingerprinting has been under development for some time, as evidenced by earlier systems. For example, U.S. Patent Application 2002/0106115 used polarized light in a contactless system, but required metal spraying on the finger of the person being fingerprinted. Later systems, such as the technology described in U.S. Patent No. 7,651,594 and U.S. Patent Application Publication No. 2008/0219522, required contact with a platen or other surface. The contactless system described herein requires no contact at the time of imaging, nor any prior contact, such as placing a coating or reflective coating on the body part of interest. Of course, the positions of the imagers or cameras relative to each other should be known for easier processing.
In use, the contactless fingerprint system may be employed at a checkpoint, such as a compound entrance, a building entrance, a roadside checkpoint, or other convenient location. Such a location may be one where it is desirable to admit certain persons and to refuse entry to, or even detain, other persons of interest. In practice, if polarized light is used, the system may utilize an external light source, such as a lamp. The cameras or other imagers used for contactless imaging may be mounted on opposite sides of one pair of augmented reality glasses (for one person). For example, a two-camera version is shown in Figure 8F, where two cameras 870 are mounted on the frame 864. In this embodiment, software for at least processing the images may be contained in the memory of the augmented reality glasses. Alternatively, the digital data from the cameras/imagers may be routed to a nearby data center for appropriate processing. This processing may include combining the digital data to form an image of the print. The processing may also include checking a database of known persons to determine whether the subject is a person of interest.
Another method of contactless fingerprinting uses a quantum dot laser to scan a finger and hand without contact, in order to detect extremely low concentrations (parts per billion or even parts per trillion) of explosive compounds and narcotic compounds. For example, quantum dot or other types of lasers or laser arrays may be mounted in the back of the bio-phone or in the frame of the glasses, so that detection may occur at very close range but without contact, preventing contamination between subjects. As a result, in addition to the biometric data related to iris, fingerprint, face, and voice collected by the glasses or other accessory device, explosive or drug contamination identification may also be collected.
Alternatively, two people may each use one camera, as seen with the camera 858 in Figure 8F. In this configuration, the two people stand relatively close so that their respective images are sufficiently similar to be suitably combined by software. For example, the two cameras 6705 in Figure 67 may be mounted on two different pairs of augmented reality glasses, such as where two soldiers operate one checkpoint. Alternatively, the cameras may be mounted on a wall or fixed position of the checkpoint itself. The two images may then be combined by a remote processor 6707 with memory, such as a computer system at the building checkpoint.
As discussed above, persons using augmented reality glasses may remain in constant contact with each other through at least one of many wireless technologies, especially when on duty at a checkpoint. Accordingly, data from the single-camera or two-camera versions may be sent to a data center or other command post for appropriate processing, followed by a database check to find a match of the palm print, fingerprint, iris print, and the like. The data center may conveniently be located near the checkpoint. With the availability of modern computers and storage, the cost of providing multiple data centers and wirelessly updating software will not be a prime cost consideration for these systems.
The touchless or contactless collection of biometric data discussed above may be controlled in several ways, such as by the control techniques discussed elsewhere in this disclosure. For example, in one embodiment, a user may initiate a data collection session by pressing a touch pad on the glasses or by giving a voice command. In another embodiment, a user may initiate a session by a hand movement or gesture, or using any of the control techniques described herein. Any of these techniques may bring up a menu, from which the user may select an option, such as "begin data collection session," "end data collection session," or "continue session." If a data collection session is selected, the computer-controlled menu may offer menu choices for the number of cameras, which camera, and so forth, much as a user selects a printer. There may also be various modes, such as a polarized light mode, a color filter mode, and the like. After each selection, the system may complete the task or offer another choice, as appropriate. User intervention may also be required, such as turning on the polarized light source or other light source, applying filters or polarizers, and the like.
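The menu-driven session control above can be modeled as a small state machine. The sketch below is illustrative only; the command strings, mode names, and camera-selection fields are assumptions standing in for whatever menu the disclosure's control techniques would present.

```python
from dataclasses import dataclass, field

@dataclass
class CaptureSession:
    """Toy model of the menu-driven capture control described above.

    Tracks whether a data collection session is active, plus the
    illustrative mode and camera selections a menu might offer.
    """
    active: bool = False
    mode: str = "polarized"          # e.g. polarized / color-filter
    cameras: list = field(default_factory=list)

    def command(self, choice: str) -> str:
        if choice == "begin data collection session":
            self.active = True
            return "session started"
        if choice == "end data collection session":
            self.active = False
            return "session ended"
        if choice == "continue session" and self.active:
            return "session continuing"
        return "invalid selection"
```

Voice, touch-pad, or gesture input would simply be mapped to one of these menu choices before being dispatched.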
After the fingerprints, palm prints, iris images, or other desired data have been obtained, the menu may then offer choices as to which database to use for comparison, which device to use for storage, and so forth. The touchless or contactless biometric data collection system may be controlled by any of the methods described herein.
While these systems and sensors have obvious uses in identifying potential persons of interest, there are also positive battlefield uses. The fingerprint sensor may be used to call up a soldier's medical history, quickly and easily providing immediate information about allergies, blood type, and other time-sensitive and treatment-determining data, thereby allowing proper treatment to be provided under battlefield conditions. This is especially helpful for patients who are unconscious at initial treatment and who may have lost their identification tags.
Further embodiments of devices for capturing biometric data from individuals may incorporate a server to store and process the collected biometric data. The captured biometric data may include a hand image with multiple fingers, a palm print, a facial camera image, an iris image, an audio sample of the individual's voice, and video of the individual's gait or movement. The collected data must be accessible to be useful.
Processing of the biometric data may be carried out locally or remotely at a separate server. Local processing may offer the option of capturing raw images and audio and making the information available over a WiFi or USB link when a host computer requests it. Alternatively, another local processing method processes the images and then transmits the processed data over the internet. This local processing includes the steps of finding a fingerprint, rating the fingerprint, finding a face and then cropping it, finding an iris and then rating it, and similar steps for audio and video data. While processing the data locally requires more complex code, it offers the advantage of reduced data transfer over the internet.
A scanner associated with the biometric data collection device may use code compatible with the USB imaging device protocol common in scanner standards. Other embodiments may use different scanner standards, depending on the need.
When a WiFi network is used to transfer data, the bio-print device described further herein may function, or appear to the network, as a web server. Each of the various types of images may be obtained from a browser client by selecting or clicking a web page link or button. This web server functionality may be part of the bio-print device, specifically included in its microcomputer functions.
The web server may be part of the bio-print micro-mainframe computer, allowing the bio-print device to create a web page that exposes the captured data and also provides certain controls. Additional embodiments of the browser application may provide controls to capture high-resolution prints of the hand, facial images, and iris images, to set the camera resolution, to set the capture time for audio samples, and also to allow a streaming connection using a web camera, Skype, or a similar mechanism. Such a connection may be attached to the audio and face cameras.
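A minimal sketch of the on-device web server described above follows: one link per capture type, served to any browser client on the WiFi network. The in-memory `CAPTURES` store and the URL layout are assumptions; a real device would serve images from its capture hardware and expose the additional controls the text mentions.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory store standing in for the capture hardware.
CAPTURES = {"palm": b"...jpeg bytes...", "iris": b"...jpeg bytes..."}

class BioCaptureHandler(BaseHTTPRequestHandler):
    """Serves a JSON listing at '/' and each capture as image/jpeg."""

    def do_GET(self):
        name = self.path.strip("/")
        if name == "":
            self._reply(200, "application/json",
                        json.dumps(sorted(CAPTURES)).encode())
        elif name in CAPTURES:
            self._reply(200, "image/jpeg", CAPTURES[name])
        else:
            self._reply(404, "text/plain", b"no such capture")

    def _reply(self, code, ctype, body):
        self.send_response(code)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep example output quiet
        pass
```

A browser client then obtains each image type by clicking the corresponding link, as the disclosure describes.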
A further embodiment provides a browser application giving access to the captured images and audio by File Transfer Protocol (FTP) or other protocols. A further embodiment of the browser application may refresh automatically and repeatedly at a selectable rate to grab preview images.
An additional embodiment provides local processing of the captured biometric data using the microcomputer, and provides additional controls to display the ratings of the captured images, allow the user to rate each print found, retrieve the captured face, retrieve the cropped iris image, and allow the user to rate each iris print.
Another embodiment provides a USB port compatible with the Open Multimedia Application Platform (OMAP3) system. OMAP3 is a proprietary system-on-chip for portable multimedia applications. The OMAP3 device port is equipped with the Remote Network Driver Interface Specification (RNDIS), a proprietary protocol usable over USB. These systems provide the ability for the bio-print device to appear as an IP interface when plugged into the USB host port of a Windows computer. This IP interface works the same as it would over WiFi (a TCP/IP web server). This allows data to be moved off the micro-mainframe computer, and provides display of the captured prints.
An application on the microcomputer may implement the above by receiving data from the FPGA over the USB bus. Once the data is received, JPEG content is created. The content may be written to a socket connected to a server running on a laptop computer, or written to a file. Alternatively, the server may receive the socket stream, pop up each image, and open it in a window, creating a new window for each biometric capture. If the microcomputer runs the Network File System (NFS, a protocol used with Sun-based systems) or SAMBA (a free-software re-implementation providing file and print services for Windows clients), the captured files may be shared and accessed by any client computer running NFS or SMB. In this embodiment, a JPEG viewer displays the files. The display client may include a laptop computer, augmented reality glasses, or a phone running the Android platform.
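The socket path above — JPEG content written to a socket and read frame by frame by the server — can be sketched with a simple length-prefixed framing. The 4-byte big-endian length header is an assumption; the disclosure does not specify a wire format.

```python
import io
import socket
import struct

def send_capture(sock: socket.socket, jpeg: bytes) -> None:
    """Write one length-prefixed JPEG frame to the socket."""
    sock.sendall(struct.pack(">I", len(jpeg)) + jpeg)

def recv_capture(sock: socket.socket) -> bytes:
    """Read one length-prefixed JPEG frame.  In the described system
    the server would then pop the frame up in a new window or write
    it to a file for a JPEG viewer."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = io.BytesIO()
    while buf.tell() < n:
        chunk = sock.recv(n - buf.tell())
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf.write(chunk)
    return buf.getvalue()
```

Each received frame corresponds to one biometric capture, matching the one-window-per-capture behavior described.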
An additional embodiment provides a server-side application that offers the same services described above.
An alternative embodiment of the server-side application displays the results on the augmented reality glasses.
A further embodiment provides the microcomputer on a removable platform, similar to a mass storage device or streaming camera. The removable platform also incorporates an active USB serial port.
In embodiments, the eyepiece may include audio and/or visual sensors for capturing 360 degrees of sound and/or vision around the wearer of the eyepiece. This may come from sensors mounted with the eyepiece itself, or coupled from sensors mounted on a vehicle in which the wearer is located. For example, sound sensors and/or cameras may be mounted on the outside of a vehicle, where the sensors are communicatively coupled to the eyepiece to provide ambient sound and/or a landscape "view" of the surrounding environment. In addition, the audio system of the eyepiece may provide sound protection, cancellation, enhancement, and the like, to help improve the wearer's acoustic quality when the wearer is surrounded by external or loud noise. In one example, the wearer may be coupled to cameras mounted on a vehicle they are driving. The cameras may then communicate with the eyepiece and provide a 360-degree view around the vehicle, such as provided in a graphic image delivered to the wearer through the eyepiece display.
In one example, and referring to Figure 69, the various control aspects of the eyepiece may include a remote control device in the form of a watch controller 6902, such as one including a receiver and/or transmitter for messaging with and/or controlling the eyepiece when the user is not wearing it. The watch controller may include a camera, a fingerprint scanner, discrete control buttons, a 2D control pad, an LCD screen, a capacitive touch screen for multi-finger control, vibration motors/piezoelectric actuators for haptic feedback, tactile buttons, Bluetooth, an accelerometer, and the like, such as provided in the control function area 6904 of the watch controller 6902 or on other functional portions 6910. For example, the watch controller may have a standard watch display 6908, but additionally have functions for controlling the eyepiece, such as through the control functions 6914 in the control function area 6904. The watch controller may display, and/or otherwise notify the user of (such as by vibration or audible sound), messages from the eyepiece, such as emails, advertisements, calendar alerts, and the like, and display the content of messages from an eyepiece the user is not currently wearing. Vibration motors, piezoelectric actuators, and the like may provide haptic feedback to the touch screen control interface. The watch receiver may provide virtual buttons and clicks in the control function area 6904 user interface, and may buzz or tap the user's wrist when a message is received, and so on. Communications connectivity between the eyepiece and the watch receiver may be provided through Bluetooth, WiFi, a cellular network, or any other communications interface known in the art. The watch controller may utilize an embedded camera for video conferencing (as described herein), iris scanning (such as storing an image of the iris in a database for authentication against iris images already in storage), photographs, video, and the like. The watch controller may have a fingerprint scanner, such as described herein. The watch controller, or any other haptic interface described herein, may measure the user's pulse, such as by a pulse sensor 6912 located in the watch band, below the watch body, and the like. In embodiments, the eyepiece and other control/haptic interface components may have pulse detection, so that pulses from the different control interface components are monitored in a synchronized manner, for health and activity monitoring, authorization, and the like. For example, the watch controller and the eyepiece may both have pulse monitoring, where the eyepiece may sense whether the two are synchronized, whether both match a previously measured profile (for authentication), and the like. Similarly, other biometric information may be used for authentication between multiple control interfaces and the eyepiece, such as fingerprints, iris scans, pulse, health profiles, and the like, where the eyepiece knows whether the same person is wearing both the interface component (such as the watch controller) and the eyepiece. Personal biometric/health information may be determined, for example, by an IR LED view of the skin, viewing the pulse under the surface, and the like. In embodiments, multi-device authentication may be used (such as a token for a Bluetooth handshake), using the sensors in, for example, two devices (such as fingerprints as a hash of a Bluetooth token).
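The pulse-synchronization authentication described above — the eyepiece accepting the watch controller only when both devices sense the same wearer's pulse — can be sketched as a simple comparison of sampled heart-rate series. The sample format, window length, and tolerance are all illustrative assumptions.

```python
def pulses_synchronized(watch_bpm: list, eyepiece_bpm: list,
                        tolerance: float = 3.0) -> bool:
    """Sketch of the multi-device check described above.

    Two wearables sample the wearer's heart rate (beats per minute)
    over the same interval; the pairing is accepted only if every
    paired sample agrees within `tolerance` BPM.
    """
    if len(watch_bpm) != len(eyepiece_bpm) or not watch_bpm:
        return False
    return all(abs(w - e) <= tolerance
               for w, e in zip(watch_bpm, eyepiece_bpm))
```

A real system would additionally match both series against the wearer's previously measured profile, as the text notes.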
In one embodiment, the watch controller may have a touch screen, which may be useful for controlling the glasses even when the glasses are not mounted on the user's face (such as when they are in a backpack). The transparent lens of the watch has an OLED display, with a switchable mirror mounted behind the lens. In other embodiments, the watch controller lens may include an electronic mirror or an electronic ink display. In any case, the lens may be laid over a standard analog watch movement, and includes a transparent lens with a switchable mirror, electronic mirror, or electronic ink display that can be activated to display content. The watch may be used for gesture control, using integrated sensors that detect gestures. The watch may be used as an AR marker, so that when the camera of the glasses identifies the watch, an application may be activated. The watch may be used as a physical surface over which a virtual image is overlaid in an application, effectively turning the watch into a touch screen interface.
With reference to Figures 70A-70D, the eyepiece may be stored in a portable eyepiece case, such as one including recharging capability, an integrated display, and the like. Figure 70A depicts an embodiment of the case shown closed, with an integrated rechargeable AC plug and digital display, and Figure 70B shows the same embodiment with the case open. Figure 70C shows another embodiment with the case closed, and Figure 70D shows the same embodiment in the open state, where the digital display shows through the lid. In embodiments, the case may have the ability to recharge the eyepiece while the eyepiece is in the case, such as through an AC connection or a battery (such as a rechargeable lithium-ion battery built into the portable case for charging the eyepiece when away from AC power). Power may be delivered to the eyepiece by wired or wireless connection, such as through a wireless inductive pad configuration between the case and the eyepiece. In embodiments, the case may include a digital display in communication with the eyepiece, such as over wireless Bluetooth and the like. The display may provide information about the eyepiece's state, such as received messages, battery level, indications, notifications, and the like.
With reference to Figure 71, the eyepiece 7120 may be used together with unattended ground sensor units 7102, such as a ground sensor unit formed as a stake 7104 that may be inserted into the ground 7118 by a person, launched from an aircraft, dropped by a UAV, and the like. The ground sensor unit 7102 may include a camera 7108, a controller 7110, sensors 7112, and the like. The sensors 7112 may include a magnetic sensor, sound sensor, vibration sensor, heat sensor, passive IR sensor, motion detector, GPS, real-time clock, and the like, and provide monitoring at the location of the ground sensor 7102. The camera 7108 may have a field of view 7114 in both azimuth and elevation, such as a full or partial 360-degree camera array in azimuth and ±90 degrees in elevation. The ground sensor unit 7102 may capture sensor and image data of an event and transmit it to the eyepiece 7120 over a wireless network connection. Further, the eyepiece may then transfer the data to an external communications facility 7122, such as a cellular network, a satellite network, a WiFi network, another eyepiece, and the like. In embodiments, ground sensor units 7102 may relay data from one unit to another, such as from 7102A to 7102B to 7102C. Further, the data may then be relayed from eyepiece 7120A to eyepiece 7120B and on to the communications facility 7122, such as in a backhaul data network. Data collected from a ground sensor unit 7102 or an array of ground sensor units may be shared with multiple eyepieces, such as from eyepiece to eyepiece, from communications facility to eyepiece, and the like, so that the users of the eyepieces may utilize and share the data in its raw form or in a post-processed form (such as a graphical display of the data through the eyepiece). In embodiments, the ground sensor units may be inexpensive, disposable, toy-grade, and the like. In various embodiments, a ground sensor unit 7102 may provide backup for computer files from the eyepiece 7120.
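The hop-by-hop relaying described above (7102A to 7102B to 7102C, and onward to an eyepiece or uplink) can be sketched as a simple forwarding chain. The unit names, event dictionary, and path-recording field are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SensorUnit:
    """Toy model of an unattended ground sensor unit that relays
    event reports hop-by-hop toward the eyepiece or backhaul uplink."""
    name: str
    next_hop: Optional["SensorUnit"] = None
    delivered: list = field(default_factory=list)

    def report(self, event: dict) -> list:
        """Forward an event along the chain, recording the path taken."""
        path = event.setdefault("path", [])
        path.append(self.name)
        if self.next_hop is None:        # final hop: eyepiece/uplink
            self.delivered.append(event)
            return path
        return self.next_hop.report(event)
```

In a real deployment each hop would be a wireless transmission, and the final hop would hand the data to the eyepiece for display or to the backhaul network.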
With reference to Figure 72, the eyepiece may provide control through devices internal and external to the eyepiece, such as initiated from the surrounding environment 7202, from input devices 7204, from sensing devices 7208, from user action capture devices 7210, from internal processing devices 7212, from internal multimedia processing devices, from internal applications 7214, from a camera 7218, from sensors 7220, from earphones 7222, from a projector 7224, through a transceiver 7228, through a tactile interface 7230, from external computing devices 7232, from external applications 7234, from event and/or data feeds 7238, from external devices 7240, from third parties 7242, and the like. Command and control modes 7260 of the eyepiece may be initiated by sensing inputs through input devices 7244, user actions 7248, external device interactions 7250, reception of events and/or data feeds 7252, internal application execution 7254, external application execution 7258, and the like. In embodiments, there may be a series of steps included in executing control, including combinations of at least two of the following: events and/or data feeds; sensing inputs and/or sensing devices; user action capture inputs and/or outputs; user movements and/or actions for controlling and/or initiating commands; command and/or control modes and interfaces in which the inputs may be reflected; commands on the platform; applications that may use the inputs to respond; communication and/or connection from the platform interface to external systems and/or devices; external devices; external applications; feedback 7262 to the user (such as related to external devices or external applications); and the like.
In embodiments, events and/or data feeds may include email, military-related communications, calendar alerts, security events, financial events, personal events, requests for input, entering an active state, entering a military security event, battle indications, entering a combat active state, entering a certain type of environment, entering a hostile environment, entering a certain location, and the like, and combinations thereof.
In certain embodiments, the sensing inputs and/or sensing devices may include: a charge-coupled device, black silicon sensor, IR sensor, acoustic sensor, inductive sensor, motion sensor, optical sensor, opacity sensor, proximity sensor, inductance sensor, eddy-current sensor, passive infrared proximity sensor, radar, capacitive sensor, capacitive displacement sensor, Hall effect sensor, magnetic sensor, GPS sensor, thermal imaging sensor, thermocouple, thermistor, photoelectric sensor, ultrasonic sensor, infrared laser sensor, inertial motion sensor, MEMS internal motion sensor, ultrasonic 3D motion sensor, accelerometer, inclinometer, force sensor, piezoelectric sensor, rotary encoder, linear encoder, chemical sensor, ozone sensor, smoke sensor, heat sensor, magnetometer, carbon dioxide detector, carbon monoxide detector, oxygen sensor, glucose sensor, smoke detector, metal detector, rain sensor, altimeter, GPS, detection of whether one is outdoors, detection of the environment, detection of activity, object detectors (e.g., for a billboard), marker detectors (e.g., for geo-locating an advertisement), laser rangefinder, sonar, capacitance, optical response, heart rate sensor, RF/micropower impulse radio (MIR) sensor, and the like, and combinations thereof.
In embodiments, the user action capture inputs and/or devices may include a head tracking system, camera, voice recognition system, eye-gaze detection system, tongue touch pad, body movement sensors (such as motion sensors), sip-and-puff system, joystick, cursor, mouse, touch screen, touch sensor, finger tracking device, 3D/2D mouse, inertial movement tracking, microphone, wearable sensor sets, robotic motion detection systems, optical motion tracking systems, laser motion tracking systems, keyboard, virtual keyboard, a virtual keyboard on a physical platform, context determination systems, activity determination systems (such as on a train, on a plane, walking, exercising, and the like), finger-following cameras, a virtualized hand in the display, sign language systems, trackball, hand-mounted cameras, sensors located at the temple, sensors located on the glasses, Bluetooth communications, wireless communications, satellite communications, and the like, and combinations thereof.
In embodiments, the user movements or actions for controlling or initiating commands may include: head movement, head shaking, nodding, head circling, forehead twitching, ear movement, eye movement, opening the eyes, closing the eyes, blinking, rolling the eyes, hand movement, clenching a fist, opening a fist, shaking a fist, extending a fist, withdrawing a fist, voice commands, sipping and puffing through a straw, tongue movement, finger movement, movement of one or more fingers, extending a finger, bending a finger, withdrawing a finger, extending the thumb, making a symbol with the fingers, making a symbol with a finger and thumb, pressing a finger against the thumb, drag-and-drop with a finger, touch-and-drag, touch-and-drag with two fingers, wrist movement, wrist circling, wrist flipping, arm movement, arm extension, arm withdrawal, a left-turn arm signal, a right-turn arm signal, arms akimbo, extending both arms, leg movement, kicking, leg extension, leg bending, jumping jacks, body movement, walking, running, turning left, turning right, turning around, rotating, raising and rotating both arms, raising and rotating one arm, rotating with various hand and arm positions, finger pinch and spread movements, finger movements (such as virtual typing), swiping, tapping, hip movements, shoulder movements, foot movements, making brushing movements, sign language (such as ASL), and combinations thereof.
In embodiments, command and/or control modes and interfaces in which inputs may be reflected may include: a graphical user interface (GUI), an audible command interface, clickable icons, navigable lists, a virtual reality interface, an augmented reality interface, a heads-up display, a semi-transparent display, a 3D navigation interface, a command line, a virtual touch screen, a robot control interface, typing (such as with a non-persistent virtual keyboard locked in place), a predictive and/or learning-based user interface (such as one that learns in a 'training mode' what the wearer does, and when and where they do it), a simple command mode (such as a gesture that launches a certain application, and the like), a Bluetooth controller, cursor hold, a locked virtual display, cursor positioning through head movement, and the like, and combinations thereof.
In embodiments, applications on the eyepiece that may use commands and/or respond to inputs may include: military applications, weapons control applications, military targeting applications, war game simulation, hand-to-hand combat simulators, repair manual applications, tactical operation applications, mobile phone applications (such as iPhone applications), information processing, fingerprint capture, facial recognition, information display, information transfer, information gathering, iris capture, entertainment, pilot heads-up information, locating objects in 3D in the real world, targeting civilians, targeting police, teaching, hands-free tutorial guidance (such as in maintenance, in assembly, in first aid, and the like), navigation assistance for the blind, communications, music, search, advertising, video, computer games, e-books, advertising, shopping, e-commerce, video conferencing, and the like, and combinations thereof.
In embodiments, communications and/or connections from the eyepiece interface to external systems and devices may include a microcontroller, microprocessor, digital signal processor, steering wheel control interface, joystick controller, motion and sensor resolvers, stepper controller, audio system controller, program for integrating sound and image signals, application programming interface (API), graphical user interface (GUI), navigation system controller, network router, network controller, mediation system, payment system, gaming device, pressure sensors, and the like.
In embodiments, external devices to be controlled may include: weapons, weapons control systems, communication systems, bomb detection systems, bomb disposal systems, remotely controlled vehicles, computers (and hence the many devices controllable by a computer), cameras, projectors, cellular phones, tracking devices, displays (such as computer, video, and TV screens), video games, war game simulators, mobile games, pointing or tracking devices, radios or audio systems, rangefinders, audio systems, iPods, smart phones, TVs, entertainment systems, computer-controlled weapons systems, drones, robots, automobile dashboard interfaces, lighting devices (such as mood lighting), athletic equipment, gaming platforms (such as one that identifies the user and preloads the games they like to play), vehicles, memory-enabled devices, payment systems, ATMs, POS systems, and the like.
In embodiments, applications associated with external devices may be military applications, weapons control applications, military targeting applications, war game simulation, hand-to-hand combat simulator applications, repair manuals, tactical operation applications, communications, information processing, fingerprint capture, facial recognition, iris capture, entertainment, pilot heads-up information, locating objects in 3D in the real world, targeting civilians, targeting police, teaching, hands-free tutorial guidance (such as in maintenance, in assembly, in first aid), navigation assistance for the blind, music, search, advertising, video, computer games, e-books, automobile dashboard applications, advertising, military enemy targeting, shopping, e-commerce, and the like, and combinations thereof.
In embodiments, feedback to the wearer related to external devices and applications may include: visual displays, heads-up displays, bulls-eye or target tracking displays, tone outputs or audio alerts, performance or rating indicators, scores, task completion indications, action completion indications, content playback, information display, reports, data mining, recommendations, targeted advertising, and the like.
In one example, aspects of eyepiece control may include the following combination: a nod of the soldier's head as a motion-initiated silent command (such as during an engagement), a graphical user interface as the command and/or control mode and interface in which the input is reflected, a military application on the eyepiece that uses commands and/or responds to control inputs, an audio system controller for the communication and/or connection from the eyepiece interface to external systems or devices, and so on. For example, during an engagement a soldier may control secure communications equipment through the eyepiece, and may wish to change some aspect of the communications, such as the channel, frequency, or encryption level, without making a sound and with minimal movement, so as to minimize the chance of being heard or seen. In this example, nods of the soldier's head may be programmed to indicate the change, such as a quick nod forward to indicate the start of a transmission and a quick nod backward to indicate the end of a transmission. In addition, the eyepiece may project a graphical user interface for the secure communications equipment to the soldier, such as showing which channel is active, what alternate channels are available, who else in their unit is currently transmitting, and so on. The soldier's nods may then be interpreted as change commands by the eyepiece's processing facility, the commands transmitted to the audio system controller, and the change shown on the communication equipment's graphical user interface. Further, certain nods or body movements may be interpreted as specific commands to be transmitted, so that the eyepiece sends a pre-established communication without the soldier needing to be heard. That is, the soldier may be able to send pre-prepared communications (such as agreed with the unit before the engagement) to their unit through body movement. In this way, a soldier wearing the eyepiece may be able to connect and interface with external secure communications equipment in a completely covert manner, to maintain silent communication with their unit during an engagement, even when out of the unit's line of sight. In embodiments, other movements or actions as described herein may also be employed for controlling or initiating commands, as inputs to the command and/or control modes and interfaces in which inputs may be reflected, with applications on the platform that may use commands and/or respond to inputs, with communications or connections from the platform interface to external systems and devices, and so on.
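The nod-to-command flow in this example can be sketched as a simple dispatch table. This is a minimal illustration, not part of the specification: the gesture names, command names, and the `AudioSystemController` stand-in class are all hypothetical.

```python
# Hypothetical sketch of the nod-to-command dispatch described above.
# Gesture names, command names, and the controller class are illustrative.

class AudioSystemController:
    """Stands in for the secure communications equipment's controller."""
    def __init__(self):
        self.transmitting = False
        self.log = []

    def handle(self, command):
        if command == "START_TRANSMISSION":
            self.transmitting = True
        elif command == "END_TRANSMISSION":
            self.transmitting = False
        self.log.append(command)

# Mapping from detected head gestures to silent commands,
# as programmed for the soldier in the example.
GESTURE_COMMANDS = {
    "quick_nod_forward": "START_TRANSMISSION",
    "quick_nod_backward": "END_TRANSMISSION",
}

def interpret_gesture(gesture, controller):
    """Eyepiece processing facility: translate a gesture into a command
    and forward it to the audio system controller."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is not None:
        controller.handle(command)
    return command

ctrl = AudioSystemController()
interpret_gesture("quick_nod_forward", ctrl)   # begin transmitting
assert ctrl.transmitting
interpret_gesture("quick_nod_backward", ctrl)  # end transmission
assert not ctrl.transmitting
```

Unrecognized gestures fall through without effect, which matches the idea that only pre-programmed movements are treated as commands.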
In one example, aspects of eyepiece control may include the following combination: motion and position sensors as the sensing input, an augmented reality interface as the command and control interface in which inputs may be reflected to the soldier, a motion sensor and the rangefinder of a weapons system as external devices to be controlled and from which information is acquired, feedback to the soldier related to the external devices, and so on. For example, a soldier wearing the eyepiece may use motion sensors to monitor for military movement in an environment, and when a motion sensor is triggered, an augmented reality interface may be projected to the wearer to help identify the target, person, vehicle, or the like, for further monitoring and/or targeting. In addition, a rangefinder may be able to determine the distance to the object, with the information fed back to the soldier for targeting (such as manually, where the soldier executes the firing action; or automatically, where the weapons system receives the information and takes aim, with the soldier providing the firing command). In embodiments, the augmented reality interface may provide the soldier with information about the target, such as the object's position on a projected 2D or 3D map, the identity of the target from previously acquired information (such as stored in an object database, including facial recognition and object recognition), the coordinates of the target, night-vision imaging of the target, and so on. In embodiments, the triggering of a motion detector may be interpreted by the eyepiece's processing facility as an alert event, which may be transmitted to the rangefinder to determine the object's position, and passed to the speaker of the eyepiece earphone to provide the soldier with an audio alert that movement has been sensed in the monitored area. The audio alert, plus a visual indicator of the moving object the soldier should note, may serve as input to the soldier, such as when the object has been identified through an accessed database as a suspected enemy soldier, a known vehicle type, and the like. For example, a soldier may be monitoring the perimeter of a post at night from a sentry position. In this instance, the environment may be dark and the soldier may slip into a state of low attention, since it may be late at night with all environmental conditions quiet. The eyepiece may then act as a sentry-augmentation facility, keeping 'watch' over the post from the soldier's personal vantage point (as opposed to some external monitoring facility). When the eyepiece senses movement, the soldier is immediately alerted and directed to the position, distance, identity, and so on of the movement. In this way, the soldier may be able to react to avoid personal danger, take aim at and fire upon the located movement, and alert the post to the potential danger. Further, if a fight ensues, the soldier may have an improved reaction time due to the alert from the eyepiece, make better decisions through the information about the target, and minimize casualties or dangers slipping into the post. In embodiments, other sensing inputs and/or sensing devices as described herein may also be employed, with command and/or control modes and interfaces in which inputs may be reflected, with useful external devices to be controlled, with feedback related to external devices and/or external applications, and so on.
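The sentry alert pipeline above (motion trigger, rangefinder query, identification, alert to the soldier) can be sketched as below. All class names, field names, and the fixed sensor readings are illustrative assumptions standing in for real hardware and database lookups.

```python
# Hypothetical sketch of the sentry alert pipeline: a motion-detector
# trigger is treated as an alert event, the rangefinder is queried, an
# identity lookup is attempted, and an audio/visual alert is composed.

from dataclasses import dataclass

@dataclass
class MotionEvent:
    bearing_deg: float  # direction of the sensed movement

def range_to_target(event):
    """Stand-in for the rangefinder query; returns distance in meters."""
    return 240.0  # fixed reading for the sketch

def identify(event):
    """Stand-in for a database lookup (face/object recognition)."""
    return "unknown vehicle"

def build_alert(event):
    distance = range_to_target(event)
    identity = identify(event)
    return {
        "audio": f"Movement bearing {event.bearing_deg:.0f}, {distance:.0f} m",
        "visual_highlight": identity,
    }

alert = build_alert(MotionEvent(bearing_deg=110))
assert alert["visual_highlight"] == "unknown vehicle"
```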
In embodiments, the eyepiece may allow remote control of vehicles such as trucks, robots, drones, helicopters, ships, and the like. For example, a soldier wearing the eyepiece may be able to issue commands through an internal communications interface to control the vehicle. Vehicle control may be provided through voice commands, body movement (such as with the soldier outfitted with motion sensors in interactive communication with the eyepiece, controlling the vehicle through the eyepiece interface), a keyboard interface, and so on. In one example, a soldier wearing the eyepiece may be provided remote control of a bomb disposal robot or vehicle, where the commands are generated by the soldier through the eyepiece's command interface, such as described herein. In another example, the soldier may command an aircraft such as a remotely piloted drone, a remotely controlled counter-rotating tactical helicopter, and the like. Again, the soldier may provide control of the remotely piloted aircraft through control interfaces as described herein.
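A voice-command path for such remote vehicle control might look like the following sketch. The keyword parser stands in for a real speech recognition system, and the phrases and command tuples are hypothetical.

```python
# Hypothetical sketch of voice-command vehicle control: recognized
# phrases are parsed into commands and applied to a remote vehicle.
# The vocabulary and command encoding are illustrative assumptions.

def parse_voice_command(utterance):
    """Tiny keyword parser standing in for speech recognition."""
    words = utterance.lower().split()
    if "forward" in words:
        return ("MOVE", +1)
    if "reverse" in words or "back" in words:
        return ("MOVE", -1)
    if "stop" in words:
        return ("STOP", 0)
    return None  # unrecognized: no command issued

class RemoteVehicle:
    """Stand-in for, e.g., a bomb disposal vehicle's drive controller."""
    def __init__(self):
        self.position = 0
    def apply(self, cmd):
        if cmd and cmd[0] == "MOVE":
            self.position += cmd[1]

robot = RemoteVehicle()
for phrase in ["move forward", "move forward", "reverse", "stop"]:
    robot.apply(parse_voice_command(phrase))
assert robot.position == 1
```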
In one example, aspects of eyepiece control may include the following combination: a wearable sensor set as the soldier's motion capture input, a robot control interface as the command and control interface in which inputs may be reflected, a drone or other robotic device as the external device to be controlled, and so on. For example, a soldier wearing the eyepiece may be outfitted with a sensor set for control of a military drone, such as using motion sensor inputs to control the movement of the drone, using hand recognition to manipulate the drone's control features (for example, through a graphical user interface shown through the eyepiece), using voice command inputs to control the drone, and so on. In embodiments, control of the drone through the eyepiece may include flight control, control of onboard interrogation sensors (such as visible cameras, IR cameras, radar), threat avoidance, and the like. The soldier may be able to use the body-mounted sensors, with the actual battlefield depicted through projected virtual 2D/3D images, to direct the drone to its intended target, where flight, camera, and surveillance control are commanded through the soldier's body movements. In this way, the soldier may be able to maintain a personalized, fully visual immersion in the drone's flight and environment, for more intuitive control. The eyepiece may have a robot control interface for managing and mediating the various control inputs from the sensor set worn by the soldier, and for providing the interface for controlling the drone. The drone may then be remotely controlled through the soldier's physical actions, such as through a wireless connection to a military control and management center for the drone. In another similar example, the soldier may control a bomb disposal robot, which may be controlled through a sensor set worn by the soldier and the associated eyepiece robot control interface. For example, the soldier may be provided with a graphical user interface giving 2D or 3D views of the environment around the bomb disposal robot, where the sensor set converts movements of the soldier (such as of the arms, hands, and the like) into movements of the robot. In this way, the soldier may be able to provide a remote control interface to the robot for better, more sensitive control during the delicate bomb disposal process. In embodiments, other user action capture inputs and/or devices as described herein may also be employed, with command and/or control modes and interfaces in which inputs may be reflected, with useful external devices to be controlled, and so on.
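The conversion of the soldier's arm movements into robot movements could be as simple as a scaled joint-angle mapping, sketched below. The joint names and the scaling gain are illustrative assumptions; a real system would involve calibration, filtering, and safety limits.

```python
# Hypothetical sketch of mapping worn-sensor arm angles to robot joint
# targets. Scaling the motion down gives the finer, more sensitive
# control mentioned for the bomb disposal example. Joint names and the
# gain value are illustrative assumptions.

def sensors_to_joint_targets(arm_angles, gain=0.5):
    """Translate measured arm joint angles (degrees) into scaled-down
    targets for the robot's arm."""
    return {joint: angle * gain for joint, angle in arm_angles.items()}

reading = {"shoulder": 40.0, "elbow": -20.0, "wrist": 10.0}
targets = sensors_to_joint_targets(reading)
assert targets == {"shoulder": 20.0, "elbow": -10.0, "wrist": 5.0}
```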
In one example, aspects of eyepiece control may include the following combination: an indication of an event to the soldier when the soldier enters a certain location, a prediction- and learning-based user interface as the command and control mode and/or interface in which the event occurrence input may be reflected, a weapons control system as the external device to be controlled, and so on. For example, the eyepiece may be programmed to learn the soldier's behavior, such as what the soldier usually does upon entering a particular environment with a particular weapons control system, such as whether the wearer turns the system on, arms the system, calls up a visual display for the system, and the like. From this learned behavior, the eyepiece may be able to make a prediction of what the soldier wants in terms of eyepiece control functions. For example, the soldier may be thrust into a combat situation and need immediate use of the weapons control system. In this instance, the eyepiece may sense the position and/or identity of the weapons system as the soldier approaches it, and configure/enable the weapons system as the soldier usually configures it when approaching, such as from previous uses of the weapons system while the eyepiece was in a learning mode, commanding the weapons control system to turn on with the last-used configuration. In embodiments, the eyepiece may sense the position and/or identity of the weapons system through a variety of methods and systems that identify position, such as a vision system, an RFID system, a GPS system, and the like. In embodiments, the weapons control system may be commanded by providing the soldier a graphical user interface with a view of the weapons system's fire control, by providing the soldier an audio-voice command system interface with selection and speech recognition for issuing commands, by predetermined automatic activation of a certain function, and so on. In embodiments, there may be a profile associated with these learned commands, where the soldier can modify the learned profile and/or set preferences in the learned profile to help optimize automatic actions and the like. For example, the soldier may have separate weapons control profiles for weapons-ready (i.e., on duty and awaiting action) and for active engagement with the enemy. The soldier may need to modify a profile to accommodate changes in the weapons system or associated changes in conditions, such as in firing command protocols, ammunition types, increased capabilities of the weapons system, and the like. In embodiments, other events and/or data feeds as described herein may also be employed, with command and/or control modes and interfaces in which inputs may be reflected, with useful external devices to be controlled, and so on.
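The learning-mode behavior above can be sketched as recording what the soldier does each time they approach a given weapons system and predicting, by majority vote, the configuration to apply automatically. The class, the weapon identifier, and the action names are hypothetical illustrations.

```python
# Hypothetical sketch of the learned profile: observed actions are
# recorded per weapons system, and the most frequent action is the
# prediction applied on the next approach. Names are illustrative.

from collections import Counter

class LearnedProfile:
    def __init__(self):
        self.history = {}  # weapon id -> list of observed actions

    def observe(self, weapon_id, action):
        """Learning mode: record what the soldier actually did."""
        self.history.setdefault(weapon_id, []).append(action)

    def predict(self, weapon_id):
        """Prediction: the action most often observed for this system."""
        actions = self.history.get(weapon_id)
        if not actions:
            return None
        return Counter(actions).most_common(1)[0][0]

profile = LearnedProfile()
for action in ["power_on_armed", "power_on_armed", "power_on_safe"]:
    profile.observe("turret-7", action)
assert profile.predict("turret-7") == "power_on_armed"
```

A soldier-editable profile, as the passage suggests, would amount to letting the wearer override entries in `history` or pin a preferred prediction.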
In one example, aspects of eyepiece control may include the following combination: a personal responsibility event of the soldier (such as being deployed in an active war zone and managing their time) as the event and/or data feed, a speech recognition system as the user motion capture input device, an audible command interface as the command and control interface in which inputs may be reflected, a video-based communications application on the eyepiece for responding to inputs from the soldier, and so on. For example, a soldier wearing the eyepiece may be given a projected visual indication of a scheduled event for a group video communication between commanders in support of the unit. The soldier may then use voice commands through the audible command interface on the eyepiece to call up contact information for the call, and initiate the group video communication by voice command. In this way, the eyepiece may act as the soldier's personal assistant, calling up scheduled events and providing the soldier with a hands-free command interface for executing the scheduled event. In addition, the eyepiece may provide the visual interface for the group video communication, where images of the other commanders are projected to the soldier by the eyepiece, and where an external camera provides video images of the soldier through a communications connection with the eyepiece (with an external camera facility, with a mirror used with the internally integrated camera, and the like, such as described herein). In this way, the eyepiece may provide a fully integrated personal assistant and phone/video-based communications platform, incorporating the functions of other traditionally separate electronic devices, such as a radio, mobile phone, video phone, personal computer, scheduler, hands-free command and control interfaces, and so on. In embodiments, other events and/or data feeds, user action capture inputs and/or devices, command and/or control modes and interfaces in which inputs may be reflected, applications on the platform that may use commands and/or respond to inputs, and the like, as described herein, may also be employed.
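The personal-assistant flow in this example (surface a due scheduled event, then act on a voice command by looking up contacts and initiating the call) can be sketched as follows. The contact list, event names, and returned action dictionary are hypothetical.

```python
# Hypothetical sketch of the assistant flow: a due scheduled event is
# surfaced, and a voice command looks up the contacts and initiates the
# group video call. All names and data here are illustrative.

import datetime

CONTACTS = {"command group": ["CO", "XO", "S-3"]}

def due_events(schedule, now):
    """Return events whose scheduled time has arrived."""
    return [e for t, e in schedule if t <= now]

def handle_voice(utterance, event):
    """On a 'call' command, resolve contacts and start the call."""
    if "call" in utterance.lower():
        participants = CONTACTS.get(event, [])
        return {"action": "start_group_video", "participants": participants}
    return None

now = datetime.datetime(2012, 9, 26, 14, 0)
schedule = [(datetime.datetime(2012, 9, 26, 13, 55), "command group")]
event = due_events(schedule, now)[0]
call = handle_voice("call them now", event)
assert call["participants"] == ["CO", "XO", "S-3"]
```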
In one example, aspects of eyepiece control may include the following combination: a security event of the soldier as the event and/or data feed; a camera and touch screen as user action capture input devices; information processing, fingerprint capture, and facial recognition applications on the eyepiece for responding to inputs; a graphical user interface for communication and/or connection between the eyepiece and external systems and devices; external information processing, fingerprint capture, and facial recognition applications and databases for accessing external security facilities and connectivity; and so on. For example, a soldier may receive a 'security event' while on duty at a military checkpoint, where a number of individuals are to be security checked and/or identified. In this instance, there may be a need to record biometric information for these individuals, because they do not appear in the security database, because of suspicious behavior, because they fit the profile of a combatant, and the like. The soldier may then use biometric input devices, such as a camera for photographing faces and a touch screen for recording fingerprints, where the biometric inputs are managed through the internal information processing, fingerprint capture, and facial recognition applications on the eyepiece. In addition, the eyepiece may provide a graphical user interface as the communications connection to the external information processing, fingerprint capture, and facial recognition applications, where the graphical user interface provides a data capture interface, external database access, access to a persons-of-interest database, and the like. The eyepiece may provide an end-to-end security management facility, including monitoring of suspects, input devices for taking biometric data, display of inputs and database information, connectivity to external security and database applications, and so on. For example, a soldier may be screening people through a military checkpoint, and the soldier may be ordered to collect facial images, such as with iris biometric information, for anyone who fits a profile but is not currently in the security database. As individuals approach the soldier, such as in a line of people about to pass through the checkpoint, the soldier's eyepiece takes high-resolution images of each individual for facial and/or iris recognition, such as checked through a database accessible over a network communications link. If a person does not fit the profile (such as a young child), or is not indicated in the database as associated with a threat, the individual may be allowed through the checkpoint. If a person has been indicated as a threat, or fits the profile but is not in the database, the individual may not be allowed through the checkpoint and may be pulled aside. If they need to be entered into the security database, the soldier may be able to directly control external facilities to process the individual, such as through the eyepiece or facilities utilizing the eyepiece collecting the individual's personal information, taking close-up images of the individual's face and/or iris, recording fingerprints, and the like, such as described herein. In embodiments, other events and/or data feeds, user action capture inputs and/or devices, applications on the platform that may use commands and/or respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, and the like, as described herein, may also be employed.
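The checkpoint decision logic described above (pass, deny, or pull aside for biometric enrollment) can be sketched as a small routing function. The database contents, person records, and matching rules are illustrative assumptions, not a real screening policy.

```python
# Hypothetical sketch of the checkpoint routing described above: each
# person is checked against a threat database and a profile, and routed
# to pass, deny, or biometric enrollment. Data and rules are illustrative.

def screen(person, threat_db):
    in_db = person["id"] in threat_db
    if in_db and threat_db[person["id"]] == "threat":
        return "deny"          # indicated as a threat: pulled aside
    if person["fits_profile"] and not in_db:
        return "enroll"        # fits profile, not in database: capture
                               # face/iris images and fingerprints
    return "pass"              # e.g., a young child, or cleared in the db

threat_db = {"P-17": "threat"}
assert screen({"id": "P-01", "fits_profile": False}, threat_db) == "pass"
assert screen({"id": "P-17", "fits_profile": True}, threat_db) == "deny"
assert screen({"id": "P-02", "fits_profile": True}, threat_db) == "enroll"
```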
In one example, aspects of eyepiece control may include the following combination: finger movement as the user action for initiating an eyepiece command, clickable icons as the command and control mode and/or interface in which the user action may be reflected, applications on the eyepiece (such as weapons control, troop movement, information data feeds), a military application tracking API as the communication and/or connection from eyepiece applications to external systems, an external personnel tracking application, feedback to military personnel, and so on. For example, a system for monitoring a soldier's selection of applications on the eyepiece may be implemented through an API, so that the monitoring provides the military with a service for monitoring and tracking application usage, feedback to the soldier about other available applications based on their monitored behavior, and so on. In one procedure, the soldier may select a certain application to use and/or download, such as through a graphical user interface presenting a clickable icon, and the soldier may be able to select the icon through a finger-movement-based control facility (such as a camera or inertial system that uses the soldier's finger movements as control input, in this instance selecting the clickable icon). The selection may then be tracked and monitored through the military application tracking API, which sends the selection, or multiple stored selections (such as selections stored over a period of time), to the external personnel tracking application. The soldier's selections of applications, in this instance 'virtual clicks', may then be analyzed to optimize utilization, such as by increasing bandwidth, changing available applications, improving existing applications, and the like. Further, the application tracking facility may use the analysis to determine what the wearer's preferences are with respect to application usage, and send feedback to the wearer in the form of recommendations for applications they may be interested in, preference profiles, lists of applications currently in use by other similar military users, and the like. In embodiments, while helping to direct the military's use of the eyepiece and its applications, the eyepiece may provide services to improve the soldier's experience with the eyepiece, such as providing the soldier with usage recommendations from which they may benefit. For example, a soldier who is new to using the eyepiece may not be utilizing its full capabilities, such as when using the augmented reality interface, organizational facilities, task support, and the like. The eyepiece may have the ability to monitor the soldier's utilization, compare that utilization against usage metrics (such as stored in an external eyepiece utilization facility), provide feedback to the soldier to improve their use of the eyepiece and the associated efficiencies, and so on. In embodiments, other user movements or actions for controlling or initiating commands, command and/or control modes and interfaces in which inputs may be reflected, applications on the platform that may use commands and/or respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, feedback related to external devices and/or external applications, and the like, as described herein, may also be employed.
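The tracking-API flow above (buffer 'virtual clicks', forward them to an external tracker, derive recommendations from the most-used application) can be sketched as below. The class names, method names, and the application catalog are hypothetical, not an actual API from the specification.

```python
# Hypothetical sketch of the application-tracking API: selections are
# buffered on the eyepiece, flushed to an external personnel tracker,
# and used to recommend related applications. Names are illustrative.

from collections import Counter

class AppTrackingAPI:
    """Eyepiece-side: records 'virtual clicks' and forwards them."""
    def __init__(self):
        self.buffer = []

    def record_selection(self, app_name):
        self.buffer.append(app_name)

    def flush(self, tracker):
        tracker.receive(self.buffer)
        self.buffer = []

class PersonnelTracker:
    """External side: aggregates usage and produces recommendations."""
    def __init__(self):
        self.usage = Counter()

    def receive(self, selections):
        self.usage.update(selections)

    def recommend(self, catalog):
        """Suggest catalog apps related to the most-used application."""
        if not self.usage:
            return []
        top = self.usage.most_common(1)[0][0]
        return catalog.get(top, [])

api, tracker = AppTrackingAPI(), PersonnelTracker()
for app in ["repair_manual", "nav_3d", "repair_manual"]:
    api.record_selection(app)
api.flush(tracker)
catalog = {"repair_manual": ["parts_lookup", "vehicle_diagnostics"]}
assert tracker.recommend(catalog) == ["parts_lookup", "vehicle_diagnostics"]
```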
In one example, aspects of eyepiece control may include the following combination: sensors such as IR, thermal, force, and carbon monoxide sensors as inputs; a microphone as an additional input device; voice commands as the soldier's action for initiating commands; a heads-up display as the command and control interface in which inputs may be reflected; a teaching-guidance application that provides instruction while reducing the soldier's need to use their hands, such as in on-the-spot repair, maintenance, assembly, and the like; a visual display providing feedback to the soldier based on the soldier's movements and sensor inputs; and so on. For example, a soldier's vehicle may be damaged in combat, leaving the soldier stranded without immediate transportation. The soldier may be able to call up the teaching-guidance application, which when run through the eyepiece provides hands-free instruction and access to computer-based expertise for diagnosing the vehicle's problems. In addition, the application may provide the soldier with tutorials on unfamiliar procedures, such as restoring basic, temporary vehicle function. The eyepiece may also monitor the various sensor inputs relevant to the diagnosis, such as IR, thermal, force, ozone, and carbon monoxide sensors, so that the sensor inputs may be accessed by the teaching application and/or directly by the soldier. The application may also provide a microphone for accepting voice commands; a heads-up display for showing instructional information and 2D or 3D depictions of the vehicle parts being repaired; and so on. In embodiments, the eyepiece may be able to provide the soldier with a hands-free virtual assistant to help them diagnose and repair the vehicle, re-establishing transportation and allowing the soldier to re-engage with the enemy or move to a safe location. In embodiments, other sensing inputs and/or sensing devices, user action capture inputs and/or devices, user movements or actions for controlling or initiating commands, command and/or control modes and interfaces in which inputs may be reflected, applications on the platform that may use commands and/or respond to inputs, feedback related to external devices and/or external applications, and the like, as described herein, may also be employed.
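The diagnostic step in this example (compare sensor readings against nominal ranges and surface repair hints on the heads-up display) can be sketched as follows. The sensor names, nominal ranges, and hints are illustrative assumptions, not real vehicle limits.

```python
# Hypothetical sketch of sensor-assisted diagnosis: readings outside
# nominal ranges produce repair hints for the heads-up display.
# Ranges and hint text are illustrative assumptions.

NOMINAL = {                     # sensor -> (low, high)
    "engine_temp_c": (60, 110),
    "co_ppm": (0, 50),
}

HINTS = {
    "engine_temp_c": "check coolant system",
    "co_ppm": "check exhaust for leaks",
}

def diagnose(readings):
    """Return (sensor, hint) pairs for out-of-range readings."""
    findings = []
    for sensor, value in readings.items():
        low, high = NOMINAL[sensor]
        if not (low <= value <= high):
            findings.append((sensor, HINTS[sensor]))
    return findings

result = diagnose({"engine_temp_c": 128, "co_ppm": 12})
assert result == [("engine_temp_c", "check coolant system")]
```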
In one example, aspects of eyepiece control may include the eyepiece entering an 'active state', such as a 'military engagement' activity mode, for example through a received task instruction, through the soldier commanding the eyepiece to enter a military engagement mode, or through the eyepiece sensing that it is near a certain military activity, perhaps even a predetermined or targeted engagement area, which may be developed in part from monitoring and learning the wearer's typical engagement assignments. Continuing the example, entering the active state (such as a military engagement active state, for instance when driving a vehicle into an encounter with the enemy or entering hostile territory) may be combined with the following: an object detector as the sensing input or sensing device, a head-mounted camera and/or eye gaze detection system as the user action capture input, eye movement as the user movement or action for controlling or initiating commands, a 3D navigation interface as the command and control mode and/or interface in which inputs may be reflected, an engagement management application onboard the eyepiece as the application for coordinating command inputs and the user interface, a navigation system controller as the communication or connection to the external system or device to be controlled, a vehicle navigation system as the external device to be controlled and/or interfaced with, a military planning and execution facility as the external application for processing user actions related to military instructions, a bulls-eye or target tracking system as feedback to the wearer about opportunities to target the enemy within view while driving, and so on. For example, a soldier may enter a hostile environment while driving their vehicle, and the eyepiece, detecting the enemy engagement area (such as through GPS, through direct observation of targets with the integrated camera, and the like), may enter a 'military engagement active state' (such as enabled and/or approved by the soldier). The eyepiece may then use the object detector to locate enemy targeting opportunities, to detect enemy vehicles, enemy dwellings, and the like, such as through the head-mounted camera. Further, the eye gaze detection system on the eyepiece may monitor where the soldier is looking, and may highlight information about targets at the wearer's gaze location, such as enemy personnel, enemy vehicles, enemy weapons, and friendly forces, where friend and foe are identified and differentiated. The soldier's eye movements may also be tracked, such as for changing the target of attention, or for command input (such as a quick nod to indicate a selection command, a downward eye movement to indicate an additional-information command, and the like). The eyepiece may call up the projection of the 3D navigation interface to help provide the soldier with information related to their surroundings, and the military engagement application for coordinating the state of the military engagement activity, such as taking inputs from the soldier, providing outputs to the 3D navigation interface, interfacing with external devices and applications, and the like. The eyepiece may, for example, use the navigation system controller to dock with the vehicle navigation system, so as to include the vehicle navigation system in the military engagement experience. Alternatively, the eyepiece may use its own navigation system, such as in place of the vehicle's system or to augment it, such as when the soldier leaves the vehicle and wishes to be provided directions on the ground. As part of the military engagement active state, the eyepiece may interface with the external military planning and execution facility, such as for providing current status, troop movements, weather conditions, friendly positions and strengths, and the like. In embodiments, by entering an active state, the soldier may be provided feedback associated with that active state, such as, in the military engagement active state, feedback in the form of information associated with identified targets. In embodiments, other events and/or data feeds, sensing inputs and/or sensing devices, user action capture inputs and/or devices, user movements or actions for controlling and/or initiating commands, command and/or control modes and interfaces in which inputs may be reflected, applications on the platform that may use commands and/or respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, feedback related to external devices and/or external applications, and the like, as described herein, may also be employed.
In one example, control aspects of the eyepiece may include the following combination: a secure communication received as the trigger event to the soldier; inertial movement tracking as the user-action capture input device; a swipe movement and a finger drag-and-drop made by the soldier as the user movements or actions for controlling or initiating a command; a navigable list as the command-and-control interface into which input may be reflected; an information-conveying application on the eyepiece as a type of application that may use commands and respond to input; a calling system as the interface for communication or connection from the eyepiece to external systems and devices; an iris-capture and identification system as the external application for external systems and devices; and so on. A soldier wearing the eyepiece may receive a secure communication, which may enter the eyepiece as an "event" to the soldier, such as triggering a certain operating mode of the eyepiece with a visual and/or audible alert, launching an application or action on the eyepiece, and the like. The soldier may be able to react to the event through a number of control mechanisms, such as the wearer using their fingers and hands to "drag and drop" or swipe through a gesture interface (e.g. through the eyepiece's on-board camera and a gesture application, where the wearer drags an email or message from the communication into a folder, an application, another communication, etc.). The wearer may bring up a navigable list as part of acting on the communication. The user may communicate information from the secure-communication application on the eyepiece to external systems and devices, such as a reconciliation system for tracking communications and associated actions. In embodiments, the eyepiece and/or a secure access system may require identity verification, such as through biometric authentication — fingerprint capture, iris capture and identification, and the like. For example, the soldier may receive a secure communication that is a security alert, where the communication is accompanied by a secure link to further information, and where the soldier is required to provide biometric authentication before being granted access. Once authenticated, the soldier may be able to respond to and manipulate the content made available through the eyepiece, such as with gestures and lists, manipulating links, data, and images obtained directly from the communication and/or through the included link. Providing the soldier the ability to respond to and manipulate content associated with the secure communication may better allow the soldier to interact with messages and content in a way that does not compromise whatever insecure environment they may currently be in. In embodiments, other events and/or data feeds, user-action capture inputs and/or devices, user movements or actions for controlling or initiating commands, command-and-control modes and interfaces into which inputs may be reflected, applications on the platform that may use commands and/or respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, and the like, as described herein, may also be applied.
In one example, control aspects of the eyepiece may include the following combination: using an inertial user interface as the user-action capture input device to supply military instructions, provided to the soldier through the eyepiece, to an external display device. For example, a soldier wearing the eyepiece may wish to provide instructions from a briefing, available through the eyepiece facility, to a group of other soldiers in the battlefield. The soldier may use a physical 3D or 2D mouse (such as with inertial motion sensors, MEMS inertial sensors, ultrasonic 3D motion sensors, IR, ultrasonic or capacitive touch sensors, accelerometers, etc.), a virtual mouse, a virtual touch screen, a virtual keyboard, and the like, to provide an interface for manipulating the content of the briefing. The briefing may be viewed and manipulated through the eyepiece, but also exported in real time, such as through an external router connected to an external display device (e.g. a computer monitor, projector, video screen, TV screen, etc.). The eyepiece may thereby provide the soldier a way to let others watch what is being seen through the eyepiece and controlled through the eyepiece's control facilities, allowing the soldier to export multimedia content associated with a briefing opened through the eyepiece to other, non-eyepiece wearers. In one example, a mission briefing may be provided to a battlefield commander, who may then be able to deliver to their troops a briefing with the multimedia and augmented-reality resources available through the eyepiece, providing the advantages of those visual resources as described herein. In embodiments, other sensing inputs and/or sensing devices, user-action capture inputs and/or devices, command-and-control modes and interfaces into which inputs may be reflected, communications or connections from the platform interface to external systems and devices, useful external devices to be controlled, feedback related to external devices and/or external applications, and the like, as described herein, may also be applied.
In one example, control aspects of the eyepiece may include the following combination: a nodding movement as the user action that initiates a command; a graphical user interface as the mode and/or interface into which the control input is reflected; an entertainment application on the eyepiece that uses commands and/or responds to input to control audio; an audio system controller as the system with which the eyepiece interfaces, communicates, and/or connects as an external system or device; and so on. For example, the wearer of the eyepiece may be controlling an audio player through the eyepiece and wish to change to the next track. In this example, a nod of the wearer's head may be programmed to indicate a track change. In addition, the eyepiece may project to the wearer a graphical user interface for the audio player, such as showing which track is playing. The wearer's nod may then be interpreted by the eyepiece's processing facility as a change-track command; the command may then be transmitted to the audio system controller to change the track, and the graphical user interface for the audio player may then display the track change to the wearer.
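The nod-to-track-change flow above can be pictured as a simple mapping from recognized head gestures to audio-player commands. The following is an illustrative sketch only — the specification does not define an API, and every name here (the gesture labels, `AudioController`, the command strings) is a hypothetical stand-in for the eyepiece's processing facility and audio system controller.

```python
# Illustrative sketch: mapping head-gesture events from the eyepiece's
# motion sensor to audio-player commands. All names are hypothetical.

GESTURE_COMMANDS = {
    "nod": "next_track",        # quick nod advances to the next track
    "shake": "previous_track",  # head shake goes back one track
    "nod_down_hold": "more_info",
}

class AudioController:
    """Stand-in for the external audio system controller."""

    def __init__(self, playlist):
        self.playlist = playlist
        self.index = 0  # currently playing track

    def handle_gesture(self, gesture):
        """Translate a recognized gesture into a command and apply it,
        returning the track now playing (shown in the projected GUI)."""
        command = GESTURE_COMMANDS.get(gesture)
        if command == "next_track":
            self.index = (self.index + 1) % len(self.playlist)
        elif command == "previous_track":
            self.index = (self.index - 1) % len(self.playlist)
        return self.playlist[self.index]
```

In this sketch the gesture table plays the role of the on-board database of command functions mentioned elsewhere in the description: changing the wearer's programmed meaning of a nod is a one-line table edit rather than a code change.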
In one embodiment, control aspects of the eyepiece may include the following combination: a motion sensor as the sensing input; an augmented-reality interface as the command-and-control interface into which input may be reflected to the wearer; a rangefinder as the external device to be controlled and from which information is collected; and so on. For example, the wearer of the eyepiece may be monitoring movement in a certain environment with the motion sensor, and when the motion sensor is triggered, an augmented-reality interface may be projected to the wearer to help identify the object. In addition, other sensors may assist with identification, such as a rangefinder used to determine the distance to the object. The augmented-reality interface may provide the wearer with information about the object, such as the object's position on a projected 2D or 3D map, the identity of the object from previously collected information (such as stored in an object database, including facial recognition and object recognition), the object's coordinates, night-vision imaging of the object, and the like. The triggering of the motion detector may be interpreted by the eyepiece's processing facility as an alert event; the resulting command may then be transmitted to the rangefinder to determine the object's position, and delivered to the speaker of the eyepiece earphone to provide the wearer with an audio alert that a moving object has been sensed. The audio alert, in addition to a visual indicator, may act as input to the wearer that attention should be paid to the moving object, such as when the object has been identified as one of interest to the wearer.
In one example, control aspects of the eyepiece may include the following combination: a wearable sensor set as the user-action capture input; a robotic control interface as the command-and-control interface into which input may be reflected; a drone or other robotic device as the external device to be controlled; and so on. For example, the wearer of the eyepiece may be provided with input to a sensor set for controlling a drone, such as a motion sensor for controlling the drone's movement, hand-recognition control for manipulating the drone's control features (e.g. through a graphical user interface displayed through the eyepiece), voice-command input for controlling the drone, and the like. The eyepiece may have a robotic control interface for managing and reconciling the various control inputs from the sensor set, and for providing the interface for controlling the drone. The drone may then be remotely controlled through the wearer's movements, such as through a control center for drone command and management, through a more direct wireless connection to the drone, and the like. In another similar example, a robot (such as a bomb-disposal robot) may be controlled through the sensor set and the eyepiece's robotic control interface. For example, a graphical user interface may be provided to the wearer that presents a 2D or 3D view of the environment around the robot, where the sensor set provides a translation of the wearer's movements (arms, hands, etc.) into movements of the robot. In this way, the wearer may be able to provide a remote-control interface to the robot.
In one example, control aspects of the eyepiece may include the following combination: entering a certain location as the event to the eyepiece; a prediction-learning-based user interface as the command-and-control mode and/or interface into which the event occurrence may be reflected; an entertainment system as the external device to be controlled; and so on. For example, the eyepiece may be programmed to learn the wearer's behavior, such as what the wearer usually does on entering a room with an entertainment system — whether the wearer turns on the television, the audio system, a game system, and so on. From the learned behavior, the eyepiece may be able to predict what the wearer wants in terms of eyepiece control functions. For example, on the wearer entering the living room, the eyepiece may sense the location, note that the wearer usually turns on music through the entertainment system when entering that room, and command the entertainment system to play the music last played. In embodiments, the eyepiece may sense location by a variety of methods and systems, such as a vision system that recognizes the location, an RFID system, a GPS system, and the like. Issuing the command to the entertainment system may be performed by providing a graphical user interface of selections to the wearer, providing an audio-voice command system interface with recognition of the wearer's selections and commands, automatic activation of the command, and the like. There may be a profile associated with such learned commands, where the wearer may modify the learned profile and/or set preferences in the learned profile to help optimize the automatic actions, and the like.
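The learned-behavior prediction described above amounts to recording what the wearer does in each location and predicting the most frequent action when that location is entered again. The sketch below is a minimal, hypothetical illustration of that idea — the specification does not prescribe any particular learning method, and a real system would presumably weight recency, time of day, and the wearer's profile preferences.

```python
from collections import Counter, defaultdict

class BehaviorLearner:
    """Toy sketch of location-triggered behavior prediction: record the
    wearer's actions per location, then predict the most frequent one
    when the location is entered again. Names are illustrative."""

    def __init__(self):
        # location -> Counter of observed actions
        self.history = defaultdict(Counter)

    def observe(self, location, action):
        """Record one observed action at a location."""
        self.history[location][action] += 1

    def predict(self, location):
        """Return the most frequently observed action for the location,
        or None if the location has never been seen."""
        actions = self.history.get(location)
        if not actions:
            return None
        return actions.most_common(1)[0][0]
```

Under this sketch, the "profile" the wearer may edit would correspond to overriding or re-weighting entries in `history` before prediction.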
In one example, control aspects of the eyepiece may include the following combination: a personal event as the event and/or data feed; a speech recognition system as the user-action capture input device; an audible command interface as the command-and-control interface into which input may be reflected; video conferencing as the application on the eyepiece used to respond to input from the wearer; and so on. For example, the wearer may receive a visual indication of a calendar event for a certain conference call projected to them. The user may then use voice commands through the audible command interface on the eyepiece to call up the dial-in information for the call, and initiate the video conference by voice command. In this way, the eyepiece may act as a personal assistant, calling up calendar events and providing the wearer with a hands-free command interface for executing them. In addition, the eyepiece may provide a visual interface for the video conference, where images of the other participants are projected to the wearer through the eyepiece, and an external camera provides a video image of the wearer through the eyepiece's communication connection. The eyepiece may provide a fully integrated personal assistant and phone/video-conferencing platform, incorporating the functions of other, traditionally separate electronic devices — mobile phone, PDA, calendar, and so on — with a hands-free command-and-control interface.
In one example, control aspects of the eyepiece may include the following combination: a security event as the event and/or data feed; a camera and touch screen as the user-action capture input devices; information-processing, fingerprint-capture, and facial-recognition applications on the eyepiece for responding to the input; a graphical user interface for communication and/or connection between the eyepiece and external systems and devices; external information-processing, fingerprint-capture, and facial-recognition applications and databases for accessing external security facilities and connectivity; and so on. For example, a security officer may be handling a "security event", which may be the security screening of a number of people at some checkpoint, the flagging of some individual who needs to be checked and/or identified, and the like, where a need to record the individual's biometric information is identified (e.g. because they do not appear in a security database, because of suspicious behavior, etc.). The security officer may then use biometric input devices, such as a camera for photographing the face and a touch screen for recording fingerprints, where the biometric inputs are managed on the eyepiece through the internal information-processing, fingerprint-capture, and facial-recognition applications. In addition, the eyepiece may provide a graphical user interface as the communication connection to the external information-processing, fingerprint-capture, and facial-recognition applications, where the graphical user interface provides a data-capture interface, external database access, access to a persons-of-interest database, and the like. The eyepiece may provide an end-to-end security management facility, including monitoring for persons of interest, input devices for collecting biometric data, display of inputs and database information, connectivity to external security and database applications, and so on.
In one example, control aspects of the eyepiece may include the following combination: finger movement as the user action that initiates an eyepiece command; a clickable icon as the command-and-control mode and/or interface into which the user action may be reflected; an application on the eyepiece (such as a phone application, music search, or advertisement selection); an advertisement-tracking API as the communication and/or connection from the eyepiece to an external system application; an external advertising application; feedback to the user; and so on. For example, a system for monitoring the user's selections of applications on the eyepiece may be implemented through an API, such that the monitoring provides a service to an advertisement-placement facility, feedback to the wearer about other applications the wearer might be interested in based on the monitored behavior, and the like. In one process, the wearer may select a certain application to use and/or download, such as through a graphical user interface presenting clickable icons, where the wearer may be able to select the icon based on a finger-movement control facility (such as a camera or inertial system, where the wearer's finger movement as seen by the camera or inertial system is used as the control input — in this case, selecting the clickable icon). The selection may then be monitored through the advertisement-tracking API, which transmits the selection, or multiple stored selections (e.g. selections stored over a period of time), to the external advertising application. The wearer's application selections (in this case "virtual clicks") may then be analyzed to generate advertising revenue, such as by placing advertisements back to the wearer, selling the data to third-party advertising facilities, and the like. Further, the external advertising application may use the analysis to determine the wearer's preferences in application use, and send feedback to the wearer in the form of recommendations of applications the wearer may be interested in, a preference profile, lists of what other, similar users have been interested in downloading, and the like. In embodiments, while helping the external advertising application generate advertising revenue for third parties, the eyepiece may provide services that improve the wearer's experience of the eyepiece, such as recommendations of downloads possibly of interest to the wearer, advertisements more strongly targeted to the wearer's likely interests, and the like.
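The advertisement-tracking API described above essentially logs "virtual clicks" and forwards them, singly or in stored batches, to an external advertising application. The following sketch is purely illustrative — the specification names no concrete API, and `AdTracker`, the batch size, and the event fields are all assumptions made here for clarity.

```python
import time

class AdTracker:
    """Hypothetical sketch of the advertisement-tracking layer: it logs
    the wearer's icon selections ("virtual clicks") and batches them for
    transmission to an external advertising application."""

    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.pending = []        # selections not yet transmitted
        self.sent_batches = []   # stand-in for the external application

    def record_selection(self, app_id):
        """Log one virtual click; flush a batch once it is full."""
        self.pending.append({"app": app_id, "ts": time.time()})
        if len(self.pending) >= self.batch_size:
            # In a real system this would be a network call to the
            # external advertising application.
            self.sent_batches.append(self.pending)
            self.pending = []
```

Batching selections over a period of time, as the text suggests, also reduces how often the eyepiece must open a connection to the external application — a plausible power consideration for a head-worn device.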
In one example, control aspects of the eyepiece may include the following combination: body movement (e.g. a motion sensor for sensing head movement, a camera for sensing hand movement, a motion sensor for sensing body movement) and a touch sensor or audio sensor as the user-motion capture sensing devices (such as a game device that senses a steering wheel, a sword, or the like; that senses another player in the game; and so on); head and hand movements as the user actions for controlling and/or initiating commands (such as through gesture control); a virtual-reality interface as the command-and-control interface into which input may be reflected; an information display on the eyepiece as the application that may respond to the input; a computer gaming device as the external device to be controlled by the game application; and game content played back to the wearer, together with performance, ratings, scores, and the like, as feedback related to the user, the external device, and the application. For example, the wearer may be able to play an interactive computer game with the eyepiece (such as on a computer, on a computer-based gaming platform, or on a mobile gaming platform), where the wearer's body movements are interpreted as control inputs, such as through body-motion sensors, touch sensors, infrared sensors, IR cameras, visible cameras, and the like. In this way, the movements of the wearer's body may be fed into the computer game, rather than using more traditional control inputs such as a handheld game controller. For example, the eyepiece may process the sensed movements of the user's hands through the IR and/or visible camera on the eyepiece together with on-board or external gesture-recognition algorithms; the eyepiece may sense the user's head movement through motion sensors on the eyepiece (such as sensing the user jumping, moving back and forth, or moving side to side in response to the game); and so on. In one embodiment, gestures may be captured by a downward-looking camera, or by a camera that images downward through the use of folded optics. In other embodiments, a camera may capture gestures for non-line-of-sight gesture recognition. For example, a foveated or segmented camera may perform motion tracking and room mapping while looking forward, but have a downward-looking quadrant or hemisphere that captures gestures and user-interface control commands, where the user's hands are placed at their sides or on their thighs, not parallel to the central axis of the display. Hand gestures may, of course, also be tracked as described, such as through IMU sensors, magnetic markers, RF tags, and the like, in a device (such as a ring finger control, a watch, or a pistol grip). These body-movement control inputs may then be fed into the virtual-reality interface and information-display application on the eyepiece to provide the user with a visual depiction of the game environment; fed into a computer gaming platform so that the platform controls the game according to the user's motion; supplied to both the eyepiece's virtual-reality interface and the gaming platform's information display to create an augmented-reality gaming platform through the eyepiece; and so on. In embodiments, a control device for sensing body movement or otherwise indicating user interaction, or an interactive control element, may be removed from the computer image by a processor associated with the eyepiece. Where the sensor is not intended to be part of the game, all or part of the image of the control device may be removed from the images generated by the game. For example, where a sensor is applied only to detect hand/limb movement, the sensor may be removed from image generation; however, where the sensor or control device is an object relevant to the game play (such as a sword), the object itself may be depicted in the game or AR environment. In embodiments, it may be desirable for the control device to be viewed in a position other than where it actually is. For example, a target at which the user throws a dart may be displayed at the end of a hallway in front of the user, rather than shown in association with the dart the user throws. As a further example, if the user never actually releases the dart while playing the game, the dart serving as the control device may be shown traveling to the target based on the characteristics of the user's throw. In embodiments, the computer game may operate entirely on board the eyepiece as a local gaming application, dock with external gaming devices local to the wearer, interface with networked gaming devices (such as massively multiplayer online games, MMOGs), run on the eyepiece and through a gaming platform in combination, and so on. Where the eyepiece docks with, and controls, a local external gaming device (such as a gaming platform in the wearer's home), the eyepiece's portion of the game execution may provide the visual environment and information display to the wearer, while the external gaming device provides the game application execution. Alternatively, the eyepiece may provide the user-motion sensing interface and supply that information to the gaming platform, with the gaming platform then supplying the visual interface of the game to the user. Alternatively, the eyepiece may provide the user-motion sensing interface, with that information used by both the eyepiece and the gaming platform to create an augmented-reality interface that combines the in-game visual interface presented to the user with the gaming platform. In embodiments, an AR application may enhance advertisements, for example on the face of a building or other structure, as objects pass through the foreground. As the user drives past, the camera may notice that an object (such as a lamp post on the curbside) is moving through the field of view at a faster rate than the enhanced surface in the background. The display system may subtract part of the enhanced image to preserve the virtual layering of content behind the image. This may require proper calibration of the parallax among the user's glasses, the display, and the camera. In embodiments, this calibration can be used to generate a depth map. Those skilled in the art will appreciate that many different divisions between the processing provided by the eyepiece and the processing provided by external devices may be realized. In addition, the game implementation may extend across the Internet to external gaming devices, such as with an MMOG. The external device (whether local or across the Internet) may then provide feedback to the wearer, such as at least part of the played content (e.g. a locally provided game projection combined with content from the external device and other players), performance indications, scores, ratings, and the like. In embodiments, the eyepiece may provide the user environment for computer gaming, where the eyepiece, external control inputs, and external processing facilities interface to create a next-generation gaming platform.
As a substitute for the eyepiece detecting body movement through sensors in direct physical contact with the wearer (such as motion sensors, body-movement sensors, and touch sensors), the eyepiece may incorporate an active remote-sensing system for indirectly sensing and interpreting the wearer's body movements, such as a 3D active depth sensor that senses the position of the wearer's hands, feet, and the like using projected IR, sonar, RF, or other energy. The active 3D depth sensor may also be used in combination with a visible or IR camera on the eyepiece. The combination of camera and 3D depth sensor may provide 3D motion capture, which is processed on the eyepiece to provide advanced gesture recognition. The 3D active depth sensor may include a source (such as an IR laser projector) and a receiving sensor (such as an IR sensor). In embodiments, the camera and the 3D active depth sensor may be pointed downward, to the side, or outward relative to the eyepiece's line of sight, to improve the system's visibility of the user's hands, feet, and the like. In embodiments, there may be multiple cameras on the eyepiece, such as one or more cameras for imaging as described herein (e.g. one facing forward, one detecting eye motion, one facing rearward) and one or more cameras for sensing the wearer's movements to command and control eyepiece functions, applications, external devices, and the like. In one example, the combination of depth sensor and camera may be pointed so as to capture images and movements of the wearer's hands, where the eyepiece processor uses the input from the depth sensor and the camera to track hand movement (such as the translation and rotation of the hand and the movement of individual fingers), computes the hand's motion with a motion algorithm, and controls eyepiece functions based on the detected motion and an on-board database of command functions corresponding to the detected motion. In embodiments, the interpreted hand movements may be used to control eyepiece functions or eyepiece applications, to control external devices through the eyepiece, as input to an external gaming platform, as input to an internal virtual-reality game, and the like. In embodiments, the camera, the 3D active depth sensor, and the associated algorithms may be combined with an on-board microphone or microphone array to detect sounds and movements of the surrounding environment.
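A minimal way to picture the fused depth-sensor/camera pipeline above is a classifier that reduces a short sequence of tracked 3D hand positions to a command gesture. This is a deliberately simplified sketch — the specification describes "advanced gesture recognition" without an algorithm, and the function name, thresholds, and gesture vocabulary here are all illustrative assumptions.

```python
def classify_hand_motion(positions, threshold=0.05):
    """Toy sketch of gesture classification from fused depth-sensor and
    camera tracking: given a sequence of (x, y, z) hand positions in
    metres, report the dominant translation as a command gesture.
    The 5 cm threshold is an illustrative dead zone, not a spec value."""
    if len(positions) < 2:
        return "none"
    dx = positions[-1][0] - positions[0][0]  # lateral hand travel
    dy = positions[-1][1] - positions[0][1]  # vertical hand travel
    if abs(dx) < threshold and abs(dy) < threshold:
        return "none"  # movement too small to be intentional
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_up" if dy > 0 else "swipe_down"
```

The returned gesture label would then be looked up in the on-board database of command functions the description mentions, exactly as in the earlier nod-to-command example.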
The disclosure may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly that simultaneously provides the user a forward-looking line-of-sight view of the surrounding environment and a view of displayed content introduced into the optical assembly from an integrated image source, the eyepiece providing an integrated camera with a downward-looking line-of-sight view of the surrounding environment for user gesture recognition through an integrated gesture-recognition facility. In embodiments, the gesture-recognition facility may interpret a motion sensed by the eyepiece as a command to the eyepiece. The motion may be a hand motion, an arm motion, a finger motion, a foot motion, a leg motion, and the like. The integrated camera may be able to view the surrounding environment in the forward-looking line of sight while the downward-looking line of sight is used for gesture recognition. The integrated camera may have a segmented optical element for simultaneously imaging the forward-looking view and the downward-looking line-of-sight view. In addition, the eyepiece may have an active sensing system for indirectly sensing and interpreting the user's body movements, where the active sensing system provides an active signal along the downward-looking line-of-sight view through the optical assembly. The active signal may be IR, sonar, RF, or the like. The active sensing system may include a 3D active depth sensor for sensing the position of at least one of the user's hands, feet, body, and the like. The active sensing system may be used together with the integrated camera to further provide user gesture recognition.
In embodiments, the eyepiece may include a dual mode for tag positioning and tracking. A GPS tag marking an approximate position near a POI may be created, and then a second tag may be created. The second tag may be generated by sensor readings, image processing, image recognition, user feedback, and the like. The second tag may be used for tracking between the acquisition of GPS readings. In embodiments, the second tag may be used to provide the distance to or from a point of interest. The dual-tag system may provide the user with the distance, time, and direction between two points. The points may be points of interest for travel, transportation, commerce, business, and the like. This dual mode for tag positioning and tracking may allow the user to locate items to purchase, projects, travel destinations, modes of transportation to be used, and the like. Transportation items may include the user's car, a train, an aircraft, a taxi stand, a taxi, a subway, and the like. Commercial items may include a variety of items, such as, without limitation, food, entertainment, shopping, clothing, books, services, and the like. In embodiments, the item to be located may be a tourist attraction, a restaurant, a park, a street, and the like. There may be an ecosystem of tags, from QR codes to a broad range of communication devices (routers and switches) or passive sensors (RFID tags that can be interrogated), all of which may wish to forward certain relevant information to the glasses — whether delivering the exact position, letting a back-end network infer certain content specific to the position of the tag, or delivering the content itself. A single party may use two tags to help orient or triangulate the position of the glasses, thereby providing exact orientation and range information more easily than with certain single tags (especially those that are not visual). In embodiments, two tags may be processed: the eyepiece may be able to identify two tags nearby or in the field of view and act on them simultaneously (such as for triangulation), or assign priority to one of them (for example, in an advertising scenario a paid tag may take priority over an unpaid tag; a safety-oriented tag may take priority over an advertising tag; etc.). Tags may come from the glasses, but may also be placed by, for example, other glasses or other systems (such as the systems of an advertiser, a government party, etc.).
In embodiments, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user observes the surrounding environment and displayed content, an integrated image source for introducing the content to the optical assembly, and an integrated processor that reads a tag generating a GPS-based point of interest and stores the tag in the eyepiece's memory, where the integrated processor creates a second tag related to the GPS point and stores the second tag in memory. In embodiments, the second tag may be generated by at least one of a sensor reading, image processing, image recognition, user feedback on the current location, and the like. The second tag may be used to compute at least one of the distance, direction, and time to the point of interest. In embodiments, the point of interest may be at least one of a tourist attraction, a restaurant, a park, a street, and the like. In embodiments, the GPS point may be used together with the second tag to provide at least one of the distance, direction, and time to a certain commercial item, and the like. In embodiments, the GPS point may be used together with the second tag to provide at least one of the distance, direction, and time to a certain mode of transportation, tracking of a certain mode of transportation, and the like. In these embodiments, the mode of transportation may include at least one of a train, a subway, a car, and the like. In embodiments of the system, the GPS point may be used together with the second tag to provide tracking of the point of interest. In embodiments, the GPS point may be used together with the second tag to provide tracking of a certain commercial item. In embodiments, the user feedback may be verbal input to the eyepiece. The second tag may be generated by various means. For example, the second tag may be based on processing of still and/or video images captured by the eyepiece to obtain position information for the subject of the image. In embodiments, the second tag may be based on data obtained from an Internet search, scanning a QR code, a bar code, an object, and the like.
In various embodiments, the eyepiece may include earphones for providing enhanced hearing, wherein the user can hear his or her surroundings as well as additional audio. This audio may include game content, sports commentary, and the like. In embodiments, the microphone and/or earbuds may play audio binaurally or otherwise. In embodiments, bone-conduction earphones may be used with the eyepiece. Such earphones allow the user to receive audio waves transmitted to the inner ear through the skull (thereby bypassing the user's eardrum). In embodiments, the earphone may be used against the cheekbone just in front of the user's ear, or against other bones capable of transmitting audio. The user may thereby be allowed to monitor or hear audio while remaining aware of his or her surroundings. In embodiments, the earphones may also use acoustic lasers, whereby the earphone emits sound waves through the use of a laser. In embodiments, the earphones may also use devices that allow the user to experience increased volume and/or clarity of external sounds or of the sounds generated by the earphone. In various embodiments, the earphones of the eyepiece may play and/or transmit audio from a radio, audio obtained wirelessly, audio obtained over the Internet, and the like. In embodiments, the eyepiece may also deliver satellite radio audio to the user.
In various embodiments, the eyepiece may include RF shielding for the brain or other parts of the user's body. In embodiments, any portion of the eyepiece that emits an electromagnetic field may be shielded by a barrier made of conductive or magnetic material or other materials. In embodiments, the barrier may include sheet metal, metal mesh, metal foam, foamed material, and the like. The holes in the shield or mesh are much smaller than the wavelength of the radiation being blocked, or of other radiation. In embodiments, the interior or other portions of the eyepiece and/or the eyepiece housing may be coated with metallic ink or other materials to provide shielding. Such metal may be copper, nickel, and the like, in the form of very small particles, and may be sprayed onto the housing. In further embodiments, such RF shielding may be worn by the user at other locations to prevent various frequencies from reaching his or her brain, eyes, or body.
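The rule stated above, that the mesh holes must be much smaller than the wavelength of the blocked radiation, can be sketched numerically. The sketch below assumes "much smaller" to mean no more than a tenth of the wavelength; that 0.1 margin and the function names are illustrative, not figures from the original:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(freq_hz):
    """Free-space wavelength of radiation at the given frequency."""
    return C / freq_hz

def aperture_ok(hole_m, freq_hz, margin=0.1):
    """True when a mesh hole is small enough (<= margin * wavelength) to
    usefully attenuate radiation at the given frequency."""
    return hole_m <= margin * wavelength_m(freq_hz)
```

For instance, at a 2.4 GHz radio frequency the wavelength is about 12.5 cm, so a 5 mm mesh aperture passes the rule while a 20 mm aperture does not.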
In embodiments, an interactive head-worn eyepiece worn by a user may include an optical assembly through which the user views a surrounding environment and displayed content, an integrated image source for introducing the content to the optical assembly, a radio element, and a shield, wherein the radio element emits electromagnetic radiation, and wherein the shield blocks the radiation from a portion of the eyepiece from leaving the eyepiece. In further embodiments, the shield may be positioned to protect the user from the radiation. Furthermore, the shield may be positioned to protect the user and others from being irradiated. In embodiments, the shield may shield at least the user's brain, a certain part of the user's head, another part of the user's body, and the like. In embodiments, the shield may be made of at least one of a conductive material, a magnetic material, sheet metal, metal mesh, a grid, and metal foam. In embodiments described herein, the shield may include holes smaller than the wavelength of a particular radiation, and the holes are smaller than the wavelength of the radiation emitted from the eyepiece. In embodiments, the shield may include at least one of metallic ink, copper ink, nickel ink, and the like. In embodiments, the shield may be coated on the interior of the eyepiece. In embodiments, the shield may be disposed in at least one of a temple of the eyepiece, the forehead portion of the eyepiece, and the like. In embodiments, the shield may be worn by the user.
In one example, the control aspect of the eyepiece includes a combination of the following: sensors such as IR, heat, force, ozone, carbon monoxide, and the like as inputs; a microphone as an additional input device; voice commands as the commands issued through actions made by the wearer; a head-mounted display as the command and control interface in which the input can be reflected; an instruction-guidance application to provide guidance while reducing the need to use the wearer's hands, such as during maintenance and assembly; and a visual display that provides feedback to the wearer according to the wearer's actions and sensor input; and the like. For example, an automotive technician may be a wearer of the eyepiece, where the technician is using the eyepiece to assist in the repair of a vehicle. An instruction-guidance application, such as one run by the eyepiece, may provide the technician with hands-free instructions and access to computer-based expertise while diagnosing a problem on the vehicle. In addition, the application may provide a guide to procedures unknown to the technician. The eyepiece may also monitor and diagnose with various safety-related sensor inputs, such as sensors for IR, heat, force, ozone, carbon monoxide, and the like, so that the sensor input may be accessed by the instruction application and/or directly by the technician. The application may also provide a microphone through which voice commands can be received; a display for presenting information, such as a head-mounted display presenting a 2D or 3D depiction of the part of the vehicle under repair; timely feedback on the repair; and the like. In embodiments, the eyepiece may provide the technician with hands-free virtual assistance to aid in the diagnosis and repair of the vehicle.
In one example, the control aspect of the eyepiece may include a combination of the following: the eyepiece entering an "active state" (such as a "shopping" activity mode), for example when the user commands the eyepiece to enter a shopping mode, or when the eyepiece senses that it is near a shopping area, perhaps a shopping area of interest to the wearer as derived from a preference profile, which may in part be further refined through self-monitoring and learning of the wearer's shopping preferences. Continuing this example, entering an activity state (such as a shopping activity state) while driving may be combined with the following: an object detector as a sensing input or sensing device; a head-mounted camera and/or gaze-detection system as a user-action capture input; eye movement as the user movement or action used to control or initiate commands; a 3D navigation interface as the command and control mode and/or interface in which the input can be reflected; an e-commerce application onboard the eyepiece as the application that coordinates command input and the user interface; navigation system control as the communication with or connection to external systems and devices, with the vehicle navigation system as the external device to be controlled and/or interfaced with; an advertising facility as the external application for processing user actions against an advertising database; and a bull's-eye or target tracking system as feedback to the wearer about shopping opportunities in view while driving. For example, a wearer may enter a shopping area while driving their vehicle, and upon detecting the presence of the shopping area (by GPS, by directly viewing a target via the integrated camera, etc.), the eyepiece may enter the "shopping activity state" (e.g., as enabled and/or approved by the wearer). An object detector may then be used, such as through the eyepiece's head-mounted camera, to detect and locate billboards, storefronts, and the like presenting shopping opportunities. In addition, the gaze-detection system on the eyepiece may monitor where the wearer is looking and call out information that may be salient about the target of the wearer's gaze, such as merchandise currently on offer in a store, or special events. The wearer's eye movements may also be tracked, such as for changing the target of interest or for command input (for example, a quick nod indicating a select command, a downward eye movement indicating a command for additional information, etc.). For example, the user's iris or retina may be tracked to provide control input. The eyepiece may invoke the projection of a 3D navigation interface to assist in providing the wearer with information about their surroundings, and an e-commerce application to coordinate the shopping activity state, such as taking input from the wearer, providing output to the 3D navigation interface, interfacing with external devices and applications, and the like. The eyepiece may, for example, interface with the vehicle navigation system through navigation system control, and may thereby include the vehicle navigation system in the shopping experience. Alternatively, the wearer may use the eyepiece's own navigation system (e.g., instead of, or as an enhancement to, the vehicle's system), such as when the wearer steps out of the vehicle and wishes to have travel directions provided to them. As part of the shopping activity state, the eyepiece may interface with an external advertising facility, such as to provide current offers and special events for surrounding merchants, pop-up advertisements, and the like. The external advertising facility may also connect with third-party advertisers, publishers, merchant supply organizations, and the like, which may contribute to the information supplied to the wearer. In each of these embodiments, by entering an active state, the wearer may be provided with feedback associated with that active state, such as feedback in the form of information associated with the targets identified for the shopping activity state.
In one example, the control aspect of the eyepiece may include a combination of the following: email receipt as a trigger event; inertial movement tracking as a user-action capture input device; drag-and-drop and swipe movements made with a finger as the user movements or actions used to control or initiate commands; a navigable list as the command and control interface in which the input can be reflected; bill paying as an application usable on the eyepiece; information transfer as the communication with or connection to an external payment system from the eyepiece's interface; and an iris capture and recognition system as the application of external systems and devices; and the like. For example, a wearer may receive a bill via email, and the email may enter the eyepiece as an "event" to the wearer, such as triggering an operational mode of the eyepiece with a visual and/or audible alert to launch an application on the eyepiece, and the like. The wearer may react to the email event through multiple control mechanisms, such as using a finger to "drag and drop", swipe, and the like through a hand-gesture interface (for example, via the camera onboard the eyepiece and a hand-gesture application, where the wearer drags the email, or information in the email, into a folder, an application, another email, etc.). The wearer may call up a navigable list of bills to pay. The user may, via an eyepiece application, transmit information from the email (for example, billing information, account number, payment amount, etc.) to external systems and devices, such as a payment system for paying the bill. In embodiments, the eyepiece and/or the payment system may require identity verification, such as through biometric verification, e.g., fingerprint capture, iris capture and recognition, and the like.
In one example, the control aspect of the eyepiece may include using an inertial user interface as a user-action capture input device to provide instruction through the eyepiece to an external display device. For example, a wearer may wish to provide instruction to a group of individuals from a presentation viewable through the eyepiece. The wearer may assist the interface for operating on content in the presentation by using a physical 3D or 2D mouse (for example, one with inertial motion sensors, MEMS inertial sensors, an ultrasonic 3D motion sensor, accelerometers, etc.), a virtual mouse, a virtual touch screen, a virtual keyboard, and the like. The presentation may be viewed through the eyepiece and manipulated through the eyepiece, but may also be exported in real time, such as to an external router connected to an external display device (for example, a computer display, projector, display screen, video screen, etc.). The eyepiece may thus provide the wearer with a way to let others view what the wearer sees through the eyepiece, controlled through the eyepiece's control facilities, thereby allowing the wearer to export a multimedia event enabled by the eyepiece to other, non-eyepiece wearers.
In one example, the control aspect of the eyepiece may include using a combination of an event/data feed and a sensing input/sensor device, such as where an additional acoustic sensor is implemented for a security event. There may be a security alert sent to a soldier, with acoustic sensors used as input devices to monitor voice content in the surrounding environment, the direction of artillery fire, and the like. For example, a security alert is broadcast to all military personnel in a specific area, and upon the alert the eyepiece activates an application that monitors an embedded acoustic sensor array, where the array analyzes sounds to identify the type of sound source and the direction from which the sound came. In embodiments, other events and/or data feeds, sensing inputs and/or sensing devices, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and a user-action capture input/device, such as using a camera in addition to a request input. A soldier may be located at a position of interest and be sent a request for photos or video from their position, such as where the request includes instructions for what to photograph. For example, a soldier is staffing a checkpoint, and at some central command it is determined that an individual of concern may attempt to pass through that checkpoint. Central command may then provide instructions to eyepiece users near the checkpoint to record and upload images and video, which in embodiments may be performed automatically, without the soldier having to manually turn on the camera. In embodiments, other events and/or data feeds, user-action capture inputs and/or devices, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and a user movement or action for control, such as when a soldier enters an "active state" and uses hand gestures to control or initiate commands. A soldier may enter an active state in preparation for engagement with the enemy, and the soldier silently commands the eyepiece using hand gestures in the engagement's command and control environment. For example, the soldier may suddenly enter hostile territory, as determined from newly received information that places the eyepiece into a heightened state of alert. In this case, silence may be a requirement, and so the eyepiece transitions to a hand-gesture command mode. In embodiments, other events and/or data feeds, user movements or actions for controlling or initiating commands, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and a command/control mode and interface in which the input can be reflected, such as a user entering a certain type of environment, and a virtual touch screen. A soldier may enter a weapon system area, and a virtual touch screen may become part of how the wearer controls the weapon system. For example, a soldier enters a weapon vehicle, and the eyepiece virtual touch screen, detecting the presence of the weapon system and that the soldier is authorized to use the weapon, calls up a virtual fire-control interface. In embodiments, other events and/or data feeds, command and/or control modes and interfaces in which the input can be reflected, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and an on-platform application that uses commands/responds to input, such as the combination of a security event for a pilot with simple access to information. A squadron pilot (or a person responsible for the flight check of a UAV) may receive a security event notification as they approach the aircraft before takeoff, which may bring up an application that walks them through the preflight check. For example, a drone expert approaches a drone in preparation for launching it, and the eyepiece displays an interactive checklist procedure to the soldier. In addition, a communication channel may be opened to the drone's pilot, so that they can be included in the preflight check. In embodiments, other events and/or data feeds, on-platform applications that use commands and/or respond to input, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and communication with or connection to external systems and devices from the platform's interface, such as a soldier arriving at a position, and a graphical user interface (GUI). A soldier may enter a position where they are required to interact with external devices, and where the external devices are interfaced through a GUI. For example, the soldier enters a military transport vehicle, and a GUI is presented to the soldier that unfolds an interactive interface indicating to the user what interactions they need to perform at the different stages of transport. In embodiments, other events and/or data feeds, communication with or connection to external systems and devices from the platform's interface, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and a useful external device to be controlled, such as a feed of instructions, and a weapon system. A feed of instructions or commands may be provided to the soldier, where at least one instruction concerns control of an external weapon system. For example, a soldier may be operating a piece of artillery, and the eyepiece not only provides him with information on performance and procedures associated with the weapon, but also provides a feed of instructions associated with aiming, corrections, and the like. In embodiments, other events and/or data feeds, useful external devices to be controlled, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and a useful external device application, such as a security event/feed, and biometric capture/recognition. A soldier may be notified through a transmitted security event (such as through a security feed) to capture the biometrics (fingerprint, iris scan, gait profile) of a particular individual, where the biometrics are stored, evaluated, analyzed, and the like, by an external biometrics application (such as one provided from a server based in a secure military network/cloud). In embodiments, other events and/or data feeds, external device applications, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of an event/data feed and feedback related to the soldier and to external devices and applications, such as entering an active state, and the soldier being provided with a display of information. The soldier may place the eyepiece into an active state, such as for military assembly, preparation, action, or debriefing, and as feedback for being placed into the active state, the soldier receives a display of information about the state entered. For example, a soldier enters the assembly phase of a mission, where the eyepiece pulls from a remote server the information on what the soldier must complete as part of assembly for the mission, including securing equipment, additional training, and the like. In embodiments, other events and/or data feeds, feedback related to external devices and/or applications, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of a sensing input/sensing device and a user-action capture input/device, such as utilizing inertial motion sensors and a head-tracking system. The soldier's head movements may be tracked by the eyepiece's inertial motion sensors, such as for nod control of the eyepiece, look-direction sensing by the eyepiece, and the like. For example, a soldier may be aiming a weapon system, and the eyepiece senses the look direction of the soldier's head through the inertial motion sensors to provide continuous aiming of the weapon. In addition, the weapon system may move continuously in response to the soldier's look direction, and thus be continuously ready to fire on the target. In embodiments, other sensing inputs and/or sensing devices, user-action capture inputs and/or devices, and the like, as described herein, may also be applied.
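Continuous look-direction sensing from inertial head tracking, as in this example, typically reduces to integrating gyroscope rates and correcting long-term drift against an absolute heading reference. A hedged sketch follows; the filter constant, the magnetometer reference, and the function names are assumptions for illustration, not from the original:

```python
def integrate_yaw(yaw_deg, gyro_rates_dps, dt):
    """Dead-reckon head yaw (degrees) by integrating gyro yaw-rate samples (deg/s)."""
    for rate in gyro_rates_dps:
        yaw_deg = (yaw_deg + rate * dt) % 360.0
    return yaw_deg

def complementary_yaw(yaw_deg, gyro_rate_dps, mag_heading_deg, dt, alpha=0.98):
    """One complementary-filter step: trust the gyro over short intervals,
    and pull slowly toward the magnetometer heading to cancel drift."""
    gyro_est = yaw_deg + gyro_rate_dps * dt
    err = (mag_heading_deg - gyro_est + 540.0) % 360.0 - 180.0  # shortest arc
    return (gyro_est + (1.0 - alpha) * err) % 360.0
```

The complementary filter is a common lightweight alternative to a full Kalman filter on head-worn hardware, since it needs only one tunable constant.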
In one example, the control aspect of the eyepiece may include a combination of a sensing input/sensing device and a user movement or action for controlling or initiating commands, such as utilizing an optical sensor and eye-closing or blinking movements. The state of the soldier's eye may be sensed by an optical sensor included in the optical train of the eyepiece, such as for controlling the eyepiece with eye movements. For example, the soldier may aim their rifle, where the rifle has the capability of being fired through a control command from the eyepiece (such as in the case of a sniper, where initiating the command through the eyepiece may reduce aiming errors caused by manually pulling the trigger). The soldier may then issue the command to fire the weapon through the optical sensor detecting a predetermined eye movement (such as one maintained in a command profile on the eyepiece). In embodiments, other sensing inputs and/or sensing devices, user movements or actions for controlling or initiating commands, and the like, as described herein, may also be applied.
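Detecting a predetermined eye movement as a command, as described here, can be reduced to a small state machine over eye-open/closed samples that separates a deliberate long blink from natural blinks and sustained eye closure. This is a hedged sketch; the timing thresholds are illustrative assumptions, not values from the original:

```python
def detect_commands(samples, dt, min_closed=0.4, max_closed=1.5):
    """Scan a stream of eye state samples (True = eye closed), sampled every
    dt seconds, and return the times at which a deliberate long blink ended.
    Short natural blinks fall below min_closed; an eye held closed longer
    than max_closed (e.g. resting) is ignored rather than treated as a command."""
    commands, closed_for, t = [], 0.0, 0.0
    for closed in samples:
        t += dt
        if closed:
            closed_for += dt
        else:
            if min_closed <= closed_for <= max_closed:
                commands.append(round(t, 6))
            closed_for = 0.0
    return commands
```

In practice such a detector would be gated behind an armed command profile, so that an ordinary blink can never fire the weapon.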
In one example, the control aspect of the eyepiece may include a combination of a sensing input/sensing device and a command/control mode and interface in which the input can be reflected, such as utilizing a proximity sensor and a robot control interface. A proximity sensor integrated in the eyepiece may be used to sense the soldier's proximity to a robot control interface in order to activate and enable use of the robot. For example, the soldier walks toward a bomb-detection robot, and the robot automatically activates and initializes a configuration directed to that specific soldier (for example, configured for the soldier's preferences). In embodiments, other sensing inputs and/or sensing devices, commands and/or control modes and interfaces in which the input can be reflected, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of a sensing input/sensing device and an on-platform application that uses commands/responds to input, such as utilizing an audio sensor and a music/sound application. The audio sensor may monitor ambient sound and initiate and/or adjust the volume of music, ambient sound, sound cancellation, and the like, to help counter undesired ambient sound. For example, a soldier is loaded onto a transport vehicle whose engine is initially off. During this time the soldier may have no task other than to rest, so they turn on music to help them do so. When the transport's engine starts, the music/sound application adjusts the volume and/or initiates sound-cancelling audio to help keep the music input as it was perceived before the engine started. In embodiments, other sensing inputs and/or sensing devices, on-platform applications that use commands and/or respond to input, and the like, as described herein, may also be applied.
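Adjusting music volume against rising engine noise, as in this example, can be sketched as a simple ambient-level measurement driving a playback gain. A minimal illustration; the dB thresholds, the volume floor, and the function names are invented for the sketch:

```python
import math

def rms_db(samples):
    """RMS level of an audio frame in dBFS (0 dB = full scale)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return -120.0 if rms == 0 else 20.0 * math.log10(rms)

def next_volume(ambient_db, quiet_db=-40.0, loud_db=-10.0):
    """Map the measured ambient level to a playback volume, scaled linearly
    between a floor of 0.2 (quiet room) and 1.0 (engine-level noise)."""
    span = (ambient_db - quiet_db) / (loud_db - quiet_db)
    return 0.2 + 0.8 * max(0.0, min(1.0, span))
```

A real implementation would smooth the measured level over time to avoid pumping the volume on brief transients, and could drive an active noise-cancellation stage from the same measurement.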
In one example, the control aspect of the eyepiece may include a combination of a sensing input/sensing device and communication with or connection to external systems and devices from the platform's interface, such as utilizing a passive IR proximity sensor and an external digital signal processor. The soldier may use the passive IR proximity sensor to monitor a night scene; the sensor indicates movement, and the eyepiece initiates a connection to an external digital signal processor to help identify the target from the proximity sensor data. In addition, an IR imaging camera may be initiated to contribute additional data to the digital signal processor. In embodiments, other sensing inputs and/or sensing devices, communication with or connection to external systems and devices from the platform's interface, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of a sensing input/sensing device and a useful external device to be controlled, such as utilizing an acoustic sensor and a weapon system, where a loud sound (such as a possible explosion or gunshot) is sensed through the eyepiece worn by the soldier, and where the eyepiece initiates control of the weapon system for possible action against a target associated with the generation of the loud sound. For example, a soldier is standing guard and hears a gunshot. The eyepiece is able to detect the direction of the gunshot and directs the soldier to the position from which the gunshot was fired. In embodiments, other sensing inputs and/or sensing devices, useful external devices to be controlled, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece includes a combination of a sensing input/sensing device and the application of a useful external device, such as utilizing a camera and external instructions. A camera embedded in the soldier's eyepiece may view an icon indicating that instructions are available, and the eyepiece accesses an external application to obtain the instructions. For example, a soldier is delivered to an assembly area, and upon entering, the eyepiece camera views an icon, accesses the instructions externally, and provides the soldier with directions for what to do, where all the steps may be automatic, such that the instructions are provided even if the soldier does not recognize the icon. In embodiments, other sensing inputs and/or sensing devices, external device applications, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of a sensing input/sensing device and feedback related to the user and to external devices and applications, such as utilizing a GPS sensor and a visual display from a remote application. A soldier may have an embedded GPS sensor that sends/streams position coordinates to a remote location facility/application, which in turn sends/streams back a visual display of the physical environment surrounding the eyepiece for display. For example, the soldier may continuously view the surroundings through the eyepiece, and through the embedded GPS sensor and continuous streaming, the eyepiece allows the soldier to have a visual augmented-reality overlay of the surroundings even while changing position. In embodiments, other sensing inputs and/or sensing devices, feedback related to external devices and/or external applications, and the like, as described herein, may also be applied.
In one example, the control method of the eyepiece may include a combination of a user-action capture input/device and a user movement or action for controlling or initiating commands, such as utilizing a body-worn movement sensor (for example, a motion sensor) and arm movement. The soldier may have a body movement sensor attached to their arm, where the movement of the arm conveys commands. For example, the soldier may have a motion sensor on their arm, and the movement of the arm is replicated in an aircraft landing lighting system, so that the lights normally held by the personnel assisting the landing can be made larger and more visible. In embodiments, other user-action capture inputs and/or devices, user movements or actions for controlling or initiating commands, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of a user-action capture input/device and a command/control mode and interface in which the input can be reflected, such as a wearable sensor set and a user interface based on predictive learning. The soldier may wear a sensor set, where data from the sensor set is continuously collected and fed to a machine-learning facility through a learning-based user interface, and where the soldier can accept, reject, modify, and so on, what is learned from their actions and behaviors. For example, the soldier generally performs the same tasks in the same physical manner every Monday morning, and the machine-learning facility may establish a learned routine, offering that learned routine to the soldier the next Monday morning, such as with prompts to clean particular equipment, fill out certain forms, play particular music, meet with a particular person, and so on. In addition, the soldier may modify the learning results, such as in a learned behavior profile, by directly editing the routine. In embodiments, other user-action capture inputs and/or devices, commands and/or control modes and interfaces in which the input can be reflected, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of a user-action capture input/device and an on-platform application that uses commands/responds to input, such as a finger-mounted camera and a video application. The soldier may control the direction in which the eyepiece's embedded camera shoots video through the onboard video application. For example, a soldier may survey a battle scene where they must gaze in one direction (such as to remain alert to new developments in an engagement) while shooting video in a different direction (such as toward the current point of engagement). In embodiments, other user-action capture inputs and/or devices, on-platform applications that use commands and/or respond to input, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a combination of a user-action capture input/device and communication with or connection to external systems and devices from the platform's interface, such as microphone and speech-recognition input combined with a steering wheel control interface. The soldier may be able to change aspects of the vehicle's handling via voice commands, where the commands are received by the eyepiece and delivered to the vehicle's steering wheel control interface (such as through wireless communication between the eyepiece and the steering wheel control interface). For example, the soldier is driving the vehicle on public roads, and accordingly the vehicle has a particular handling capability ideal for highways. But the vehicle also has other modes for driving under different conditions, such as off-road, in snow, in mud, in heavy rain, while pursuing another vehicle, and the like. In such instances, the soldier may be able to change the mode by voice command as driving conditions change. In embodiments, other user-action capture inputs and/or devices, communication with or connection to external systems and devices from the platform's interface, and the like, as described herein, may also be applied.
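Voice-commanded mode changes like these reduce to matching a recognized utterance against a table of vehicle modes, with unknown phrases leaving the vehicle unchanged. A hedged sketch; the mode names, the "mode ..." command phrasing, and the settings are invented for illustration and are not taken from any actual vehicle interface:

```python
# Hypothetical drive-mode table: mode name -> settings the vehicle would apply.
DRIVE_MODES = {
    "highway": {"suspension": "firm", "traction": "standard"},
    "off road": {"suspension": "soft", "traction": "locked"},
    "snow": {"suspension": "soft", "traction": "snow"},
    "pursuit": {"suspension": "firm", "traction": "sport"},
}

def handle_command(utterance, current_mode="highway"):
    """Match a recognized utterance such as 'mode off road' against the mode
    table; unrecognized commands leave the current mode unchanged."""
    text = utterance.lower().strip()
    if text.startswith("mode "):
        requested = text[len("mode "):]
        if requested in DRIVE_MODES:
            return requested
    return current_mode
```

Keeping the table on the eyepiece side means the speech recognizer only needs to emit short, fixed phrases, which matters in the loud acoustic environments described below.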
In one example, the control aspect of the eyepiece may include a combination of user-action capture inputs/devices and useful external devices to be controlled, such as microphone and speech-recognition input plus an automobile dashboard interface device. A soldier may use voice commands to control the various devices associated with the vehicle's dashboard, such as heating and ventilation, radio, music, lights, trip computer, and the like. For example, a soldier may be driving a vehicle on a mission through rough terrain, such that they cannot take either hand off the steering wheel to manually control dashboard devices. In such instances, the soldier may control dashboard devices through voice control of the eyepiece. Voice commands through the eyepiece may be particularly beneficial relative to voice control through a dashboard microphone system, because a military vehicle may be immersed in a very loud acoustic environment, and using the microphone in the eyepiece can therefore provide substantially improved performance under such conditions. In embodiments, as described herein, other user-action capture inputs and/or devices, useful external devices to be controlled, and the like may also be applied.
In one example, the control aspect of the eyepiece includes a combination of user-action capture inputs/devices and applications of those useful external devices, such as a joystick device and an external entertainment application. A soldier may access a game controller joystick and play games through an external entertainment application, such as a multi-player game hosted on a network server. For example, a soldier may be experiencing down-time during a deployment, and at base they access a joystick device interfaced with the eyepiece, which in turn interfaces with external entertainment equipment. In embodiments, the soldier may network together with other military personnel over the network. The soldier may have stored preferences, profiles, and the like associated with game play. The external entertainment application may manage their game play according to, for example, the soldier's deployment, current readiness state, required readiness state, past history, ability rating, command post position, rank, geographic location, future deployments, and the like. In embodiments, as described herein, other user-action capture inputs and/or devices, applications of external devices, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of user-action capture inputs/devices and feedback to the user related to external devices and applications, such as an activity determination system and tone outputs or audible alerts. A soldier may access an activity determination system through the eyepiece to monitor and determine the soldier's activity state, such as during extreme activity, rest, boredom, stress, exercise, and the like, and where, when conditions exceed limits in some way (such as preset, learned, or typical limits), the eyepiece may provide feedback in the form of a tone output or audible alert. For example, the soldier's current health state may be monitored during combat, and where, when the health condition enters a danger level, the soldier and/or another individual (for example, a doctor, hospital personnel, another member of the soldier's team, the command center, and the like) is provided an audible signal, such as an indication that the soldier has been wounded in combat. As a result, others can be alerted to the soldier's injury and can attend to the injury in a more efficient manner. In embodiments, as described herein, other user-action capture inputs and/or devices, feedback related to external devices and/or external applications, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of user movements or actions used to control or initiate commands and command/control modes and interfaces in which the input is reflected, such as a clenched fist and a navigable list. A soldier may use gestures, such as a tightly clenched fist, to call up a navigable list of the projected content on the eyepiece display. For example, the eyepiece camera may view the soldier's hand gesture, recognize and identify the hand gesture, and execute a command by matching it against a command database of predetermined gestures. In embodiments, gestures may include postures of the hand, fingers, arms, legs, and the like. In embodiments, as described herein, other user movements or actions used to control or initiate commands, command and/or control modes and interfaces in which the input is reflected, and the like may also be applied.
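The gesture flow described above, in which a recognized hand posture is matched against a database of predetermined gestures and the mapped command is executed, can be sketched as follows. This is a minimal illustration under stated assumptions, not the specification's implementation: the gesture labels and command names are hypothetical, and a real system would derive the gesture label from camera-based recognition rather than receive it as a string.

```python
# Hypothetical sketch: map recognized hand gestures to eyepiece commands.
# Gesture labels and command names are illustrative only.

GESTURE_COMMANDS = {
    "clenched_fist": "show_navigable_list",
    "open_palm": "dismiss_display",
    "thumbs_up": "confirm_selection",
}

def execute_gesture(gesture: str) -> str:
    """Look up a recognized gesture in the command database and
    return the command to execute; unknown postures are ignored."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return "no_op"  # unrecognized posture: take no action
    return command

# A clenched fist calls up the navigable list of projected content.
result = execute_gesture("clenched_fist")
```

A practical design point reflected here is the fall-through to `no_op`: a wearable gesture interface should fail silently on unrecognized postures rather than raise errors, since the camera will constantly see incidental hand positions.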
In one example, the control aspect of the eyepiece may include a combination of user movements or actions used to control or initiate commands and applications on the platform that use/respond to command inputs, such as a nod of the head and an information display. A soldier may apply gestures such as a head shake, arm motion, leg motion, or eye motion to call up an information display. For example, a soldier may wish to access an application, database, network connection, or the like through the eyepiece, and may call up a display application presented as part of a graphical user interface with a nod of their head (such as sensed by a motion detector in the eyepiece, on the soldier's head, on the soldier's helmet, or the like). In embodiments, as described herein, other user movements or actions used to control or initiate commands, applications on the platform that use commands and/or respond to inputs, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of user movements or actions used to control or initiate commands and interfaces from the platform for communication and connection to external systems and devices, such as an eye blink and an API to external applications. A soldier may call up an application program interface with an eye blink, a nod of the head, a movement of an arm or leg, or the like to access external applications. For example, a soldier may access external applications through an API embedded in the eyepiece facility, and do so with eye blinks (such as detected through the optical monitoring capability of the eyepiece's optical system). In embodiments, as described herein, other user movements or actions used to control or initiate commands, interfaces from the platform for communication or connection to external systems and devices, and the like may also be applied.
In one example, the control aspect of the eyepiece includes a combination of user movements or actions used to control or initiate commands and external devices to be controlled, such as accessing an external rangefinder device by a tap of the foot. A soldier may have a sensor that detects movement of the soldier's foot (such as a motion sensor on their shoe), and the soldier may use a foot movement (such as a tap of their foot) to determine the distance to an object (such as an enemy target) using the external rangefinder device. For example, the soldier may be aiming a weapon system, using both hands in the process. In this case, issuing commands through the eyepiece by way of foot action allows commands to be issued "hands free". In embodiments, as described herein, other user movements or actions used to control or initiate commands, useful external devices to be controlled, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of user movements or actions used to control or initiate commands and applications of those useful external devices, such as making a sign with the hand and an information conveyance application. A soldier may use a sign formed with the hand to trigger the sharing of information through an external information transmission application (such as an external information feed, a photo/video sharing application, a text application, and the like). For example, the soldier uses a hand signal to turn on the embedded camera and share a video stream with another person, share it to storage, and the like. In embodiments, as described herein, other user movements or actions used to control or initiate commands, applications of external devices, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of user movements or actions used to control or initiate commands and feedback to the soldier related to external devices and applications, such as a head jerk plus an audible alert. A soldier may wear an eyepiece equipped with an accelerometer (or a similar sensor capable of detecting g-forces and head jerks), where, when the soldier experiences a dangerous head jerk, an audible alert is heard as feedback to the user, such as dangerously high g-forces as determined by an application on the eyepiece or otherwise by an application separate from the eyepiece. In addition, the accelerometer output may be recorded and stored for analysis. For example, a soldier may experience a g-force head jerk produced by a nearby explosion, and the eyepiece may sense and record sensor data related to the jerk. Further, a head jerk at a dangerous level may trigger automatic actions by the eyepiece, such as transmitting a warning to other soldiers and/or to the command center, beginning to monitor and/or transmit the soldier's health information from other worn sensors, providing the soldier an audible indication related to their possible injuries, and the like. In embodiments, as described herein, other user movements or actions used to control or initiate commands, feedback related to external devices and/or external applications, and the like may also be applied.
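The dangerous-head-jerk behavior described here, in which accelerometer output is logged for later analysis and an audible alert plus automatic notifications fire when g-forces exceed a danger level, might be sketched like this. The threshold value and action names are assumptions made for illustration, not values from the specification.

```python
# Illustrative sketch: flag dangerous g-forces from an accelerometer stream.
# The 8 g threshold and the action names are hypothetical assumptions.

DANGER_G_THRESHOLD = 8.0  # assumed danger level, in g

def process_samples(samples_g):
    """Record every accelerometer sample and return the automatic
    actions triggered when any sample exceeds the danger threshold."""
    log = list(samples_g)  # stored for later analysis
    actions = []
    if any(g > DANGER_G_THRESHOLD for g in samples_g):
        actions = [
            "sound_audible_alert",        # feedback to the wearer
            "warn_squad_and_command",     # transmit warning to others
            "stream_health_sensor_data",  # begin monitoring worn sensors
        ]
    return log, actions

# e.g. a nearby explosion produces a 12.4 g spike
log, actions = process_samples([0.9, 1.1, 12.4])
```

Note that the sketch logs unconditionally and alerts conditionally, matching the passage's distinction between always-on recording and threshold-triggered automatic actions.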
In one example, the control aspect of the eyepiece may include a combination of command/control modes and interfaces in which the input is reflected and applications on the platform that use/respond to command inputs, such as a graphical user interface plus the various applications residing on the eyepiece. The eyepiece may provide the soldier a graphical user interface that presents applications for selection. For example, the soldier may have a graphical user interface projected by the eyepiece that offers applications in different domains, such as military, personal, civilian, and the like. In embodiments, as described herein, other command/control modes and interfaces in which the input is reflected, applications on the platform that use commands and/or respond to inputs, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of command/control modes and interfaces in which the input is reflected and interfaces from the platform for communication or connection to external systems and devices, such as a 3D navigation eyepiece interface plus a navigation system control interface to an external system. The eyepiece may enter a navigation mode and connect to an external system through the navigation system control interface. For example, a soldier conducting military maneuvers may call up a pre-loaded 3D rendering of the surrounding terrain through the eyepiece navigation mode, and the eyepiece automatically connects to the external system for updates, current objects of interest (such as covered by satellite imagery), and the like. In embodiments, as described herein, other command/control modes and interfaces in which the input is reflected, interfaces from the platform for communication or connection to external systems and devices, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of command/control modes and interfaces in which the input is reflected and external devices to be controlled, such as an augmented reality interface plus an external tracking device. The soldier's eyepiece may enter an augmented reality mode and, together with the external tracking device, retrieve information related to the position of an object or person being tracked, displayed as an augmented reality overlay. For example, the augmented display mode may include a 3D map, and the position of a person as determined by the external tracking device may be overlaid on the map, with the displayed track following the movement of the tracked person. In embodiments, as described herein, other command/control modes and interfaces in which the input is reflected, useful external devices to be controlled, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of command/control modes and interfaces in which the input is reflected and applications of those external devices, such as a translucent display mode plus a simulation application. The eyepiece may be set to a translucent display mode to enhance the display of the simulation application to the soldier. For example, a soldier is preparing for a mission, and before entering the battlefield the soldier is provided a simulation of the mission environment; because the user has no actual need to view the true environment around them during the simulation, the eyepiece is placed in the translucent display mode. In embodiments, as described herein, other command/control modes and interfaces in which the input is reflected, applications of external devices, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of command/control modes and interfaces in which the input is reflected and feedback to the user related to external devices and applications, such as an audible command interface plus tone output feedback. The soldier may place the eyepiece in an audible command interface mode, and the eyepiece may respond with a tone output as feedback that the eyepiece system is ready to receive audible commands. For example, the audible command interface may include at least a portion of the audible command interface at an external location (such as out on the network), and once the entire system is ready to receive audible commands, the tone is provided. In embodiments, as described herein, other command/control modes and interfaces in which the input is reflected, feedback related to external devices and/or external applications, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of applications on the platform that use/respond to command inputs and interfaces from the platform for communication or connection to external systems and devices, such as a communications application plus a network router, where the soldier may open the communications application and the eyepiece automatically searches for network routers to find a connection to the network facility. For example, a soldier is with their unit in the field, and a new campsite is established. Once the communications facility has been set up, the soldier's eyepiece can connect over a secure wireless connection. In addition, once the communications facility has been set up, the eyepiece may alert the soldier even if the soldier has not yet attempted to communicate. In embodiments, as described herein, other applications on the platform that use commands and/or respond to inputs, interfaces from the platform for communication or connection to external systems and devices, and the like may also be applied.
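The automatic router search described above, in which the eyepiece scans for the campsite's network facility and connects only over a secure link, could look schematically like the following. The SSID naming and scan-result format are invented for this sketch; a real eyepiece would query its radio hardware rather than iterate over a list.

```python
# Schematic sketch: pick a secure, known network facility from scan results.
# Network names and the (ssid, is_encrypted) result format are hypothetical.

KNOWN_FACILITIES = {"campsite-7-secure"}

def find_connection(scan_results):
    """Return the first known, encrypted network found, or None.
    scan_results is a list of (ssid, is_encrypted) tuples."""
    for ssid, is_encrypted in scan_results:
        if ssid in KNOWN_FACILITIES and is_encrypted:
            return ssid  # connect, then alert the soldier
    return None  # facility not yet established; keep scanning

chosen = find_connection([("coffee-shop", False), ("campsite-7-secure", True)])
```

The insistence on `is_encrypted` mirrors the passage's requirement that the connection be made over a secure wireless link rather than any open network in range.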
In one example, the control aspect of the eyepiece may include a combination of applications on the platform that use/respond to command inputs and useful external devices to be controlled, such as a video application plus an external camera. A soldier may interface with a deployed camera, such as for monitoring the battlefield. For example, a deployable camera may be dropped from an aircraft during an operation, and the soldier then connects to the camera through the eyepiece video application. In embodiments, as described herein, other applications on the platform that use commands and/or respond to inputs, useful external devices to be controlled, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of applications on the platform that use/respond to command inputs and applications of external devices, such as a search application on the eyepiece plus an external search application. The search application on the eyepiece may be augmented with the external search application. For example, a soldier may search for the identity of an individual being questioned, and when the search on the eyepiece turns up nothing, the eyepiece connects to the external search facility. In embodiments, as described herein, other applications on the platform that use commands and/or respond to inputs, applications of external devices, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of applications on the platform that use/respond to command inputs and feedback to the soldier related to external devices and applications, such as an entertainment application plus performance indicator feedback. The entertainment application may serve as a relaxation mechanism for a soldier who needs rest but may be stressed for other reasons, and the performance feedback may be designed for soldiers under particular conditions, such as during deployments when they need to rest yet stay sharp, during idle periods when attention is flagging and needs to be brought back, and the like. For example, a soldier may be on a transport and about to enter an engagement. In such instances, the entertainment application may be an action thinking game to improve attention and motivation, where the performance indicator feedback is designed to maximize the soldier's drive to perform and to reason through problems quickly and efficiently. In embodiments, as described herein, other applications on the platform that use commands and/or respond to inputs, feedback related to external devices and/or external applications, and the like may also be applied.
In one example, may include in terms of the control of eyepiece using from platform interface to the communication of external system and equipmentOr the combination of the additional external equipment to be controlled of connection, such as thrown to processor interface applied external on the eyepiece of outside plantShadow instrument.Eyepiece processor is connectable to external projector, so that other people can check to the available content of eyepiece.For example,Soldier in battlefield and can access them and need not wearing the shared content of people's (such as non-military individual) of eyepiece with other.In this example, the eyepiece of soldier perhaps can be docked with external projector, and content is fed to projection from eyepieceInstrument.In embodiments, projector can be pocket projectors, the projector in vehicle, the projector in meeting room, remotely determineThe projector etc. of position.In embodiments, projector can be also integrated into eyepiece, so that content can be from integrated projectorFrom outer projections.In embodiments, as described in this, other from platform interface to external system and equipment communication orConnection, useful external equipment to be controlled etc. can also be applied.
In one example, may include in terms of the control of eyepiece using from platform interface to the communication of external system and equipmentOr the combination of the application of connection applied external equipment, such as audio system controller interface applied external audio system.Soldier's energyIt is enough that the audio-frequency unit (for example, music, audio playback, audio network file etc.) of eyepiece facility is connected to external sound system.For example, soldier perhaps can repair the communication for just being received vehicle sounds system by eyepiece, so that other people can hear.EachIn embodiment, as described in this, other from platform interface to the communication or connection of external system and equipment, external equipmentUsing etc. can also be applied.
In one example, may include in terms of the control of eyepiece using from platform interface to the communication of external system and equipmentOr the additional combination with external equipment and the related feedback of application to soldier of connection, such as additional shape in steeper controller interfaceState feedback.Soldier can by steeper controller interface using digital steeper control access and control mechanism, wherein mechanism toUser provides the feedback about the mechanism status.For example, the soldier for removing roadblock can have lifting on their vehicleMechanism, and soldier can directly be docked by eyepiece with the elevating mechanism.In embodiments, as described in this, otherFrom platform interface to external system and equipment communication or connection, it is related with external equipment and/or applications feedback etc.It can also be applied.
In one example, the control aspect of the eyepiece may include a combination of external devices to be controlled and applications of those external devices, such as a storage-enabled device plus an automated backup application. A soldier in the field may be provided a data storage facility and an associated automated backup application. For example, the storage facility may be located in a military vehicle so that data can be backed up from multiple soldiers' eyepieces to the vehicle, especially in cases where a network link is not available for downloading to a remote backup site. The storage device may be associated with a campsite, associated with a subset of soldiers in the field (for example, within a unit), carried on the soldiers themselves, and the like. In embodiments, the local storage facility may upload the backup when a network service connection becomes available. In embodiments, as described herein, other useful external devices to be controlled, applications of external devices, and the like may also be applied.
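The local-then-remote backup behavior described here, in which eyepiece data is held in vehicle storage while the network link is down and uploaded once a connection becomes available, can be sketched as a small store-and-forward queue. The class and method names below are illustrative assumptions, not drawn from the specification.

```python
# Minimal sketch of a store-and-forward backup queue.
# Class, method, and record names are illustrative assumptions.

class LocalBackupStore:
    """Vehicle-local storage that holds eyepiece backups until a
    network link to the remote backup site becomes available."""

    def __init__(self):
        self.pending = []   # backups held locally in the vehicle
        self.uploaded = []  # backups delivered to the remote site

    def backup(self, record):
        """Accept a backup from an eyepiece while offline."""
        self.pending.append(record)

    def on_network_available(self):
        """Flush all locally held backups to the remote site."""
        self.uploaded.extend(self.pending)
        self.pending.clear()

store = LocalBackupStore()
store.backup("eyepiece-A: mission log")
store.backup("eyepiece-B: sensor data")
store.on_network_available()  # link restored: both records upload
```

The two-list design makes the offline/online distinction explicit: nothing is lost while disconnected, and the flush is idempotent once `pending` is empty.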
In one example, the control aspect of the eyepiece may include a combination of external devices to be controlled and feedback to the soldier related to external devices and applications, such as an external payment system plus feedback from the system. A soldier may access a militarily administered payment system, where the system provides the soldier feedback (for example, receipts, account balances, account activity, and the like). For example, a soldier may make a payment to a vendor through the eyepiece, where the eyepiece and the external payment system exchange data, authorizations, funds, and the like, and the payment system provides feedback data to the soldier. In embodiments, as described herein, other useful external devices to be controlled, feedback related to external devices and/or external applications, and the like may also be applied.
In one example, the control aspect of the eyepiece may include a combination of applications of external devices and feedback to the soldier related to external devices and applications, such as information from an external 3D map-rendering facility plus display feedback together with the information. A soldier may be able to have 3D map information displayed through the eyepiece, where the map facility may provide feedback to the soldier according to, for example, past information sent, past requests, requests by others in the region, changes associated with the geographic area, and the like. For example, a soldier may receive a 3D map rendering from an external application, where the external application also provides the 3D map rendering to at least a second soldier in the same geographic region. The soldier may then receive feedback from the external facility related to the second soldier, such as their position in the 3D map rendering, identity information, movement history, and the like. In embodiments, as described herein, other applications of external devices, feedback related to external devices and/or external applications, and the like may also be applied.
In embodiments, the eyepiece may provide the user various forms of guidance in response to a medical condition. As a first example, the user may use the eyepiece to simulate, for training purposes, medical conditions that may occur in combat, in training, while on duty, while off duty, and the like. The simulation may be tailored toward a medical professional or a non-medical individual.
As an example, a lower-level combat soldier may use the eyepiece to view a medical simulation as part of a training module, providing training for responding to medical conditions on the battlefield. The eyepiece may provide an enhanced environment in which the user views wounds overlaid on another soldier, simulating those wounds that are common in the field or may be found in the field. The soldier may then obtain prompts through the user interface for responding to the situation presented. The user may be given step-by-step instructions for a series of actions for providing emergency medical relief in the field, or the user may perform actions in response to the situation, with those actions then being corrected until the appropriate response is presented.
Similarly, the eyepiece may provide a training environment for medical professionals. The eyepiece may present to the user conditions or situations requiring a medical response, for the training of medical professionals. The eyepiece may walk its user through common battlefield scenarios requiring a response and the appropriate lifesaving skills.
As an example, an augmented reality presentation of a wounded soldier with a bullet wound to the body may be presented to the user. The medical professional may then carry out the steps he feels are the appropriate response to the situation, select from the eyepiece's user interface the steps he feels are appropriate for the situation, input steps into the eyepiece's user interface, and the like. The user may respond through the use of sensors and/or input devices, or he may input the steps of his response into the user interface via eye movement, hand gestures, and the like. Similarly, he may select appropriate steps presented to him through the user interface via eye movement, hand gestures, and the like. As actions are carried out and the user makes decisions about treatment, additional guidance and instruction may be presented to the user based on his performance. For example, if the user is presented with a soldier with a bullet wound to the chest and the user begins to raise the soldier into a dangerous position, the user may be cautioned or prompted to change his course of treatment. The user may alternatively be prompted with the correct steps to carry out the appropriate process. In addition, an example of the wounded soldier's medical record may be presented to the trainee in the training scenario, where the user may have to make at least some of his decisions based on content included in the medical record. In embodiments, the user's actions and performance may be recorded and/or documented by the eyepiece for further judgment and instruction after the training session is paused or otherwise stopped.
In embodiments, the eyepiece may provide the user various forms of guidance in response to actual medical conditions in combat. As an example, an untrained soldier may be prompted with step-by-step lifesaving instructions for a comrade under conditions where a doctor cannot be present immediately. When a comrade is injured, the user may input the type of injury, the eyepiece may detect the injury, or a combination of these may occur. At that point, the user may be provided lifesaving instructions for treating the wounded soldier. Such instructions may be presented in augmented reality form as a step-by-step program of instructions for the user. In addition, the eyepiece may provide the user enhanced visual aids, such as an anatomical overlay of the positions of vital organs of the soldier's body near the wounded soldier's injury. Further, the eyepiece may record video of the situation, which may then be sent back to a doctor away from the field or to a doctor rushing toward the field, allowing the doctor to coach the untrained user through the appropriate battlefield lifesaving skills. In addition, the wounded soldier's eyepiece may send important information to the eyepiece of the soldier providing treatment (such as information about the wounded soldier collected by integrated or associated sensors), which may be relayed to a doctor, or it may be sent directly to a doctor at a remote location, so that the treating soldier can provide medical help to the wounded soldier based on the information collected from the wounded soldier's eyepiece.
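The sensor relay described above, in which the wounded soldier's eyepiece packages sensor readings and forwards them to the treating soldier and onward to a remote doctor, might be represented with a simple record like the following. The field set, identifiers, and routing labels are assumptions made purely for illustration.

```python
# Illustrative sketch: package wounded-soldier sensor data for relay.
# Field names, identifiers, and destinations are hypothetical.

from dataclasses import dataclass, field

@dataclass
class CasualtyReport:
    soldier_id: str
    heart_rate_bpm: int
    systolic_bp: int
    # By default the report reaches the soldier providing treatment.
    destinations: list = field(default_factory=lambda: ["treating_soldier"])

    def relay_to_doctor(self):
        """Forward the report onward to a doctor at a remote location."""
        if "remote_doctor" not in self.destinations:
            self.destinations.append("remote_doctor")
        return self.destinations

report = CasualtyReport("alpha-3", heart_rate_bpm=128, systolic_bp=88)
routes = report.relay_to_doctor()  # now reaches both recipients
```

Modeling the report as a structured record rather than free text is what would let the treating soldier's eyepiece and the remote doctor's station interpret the same readings consistently.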
In other embodiments, when a condition presents itself on the battlefield, a trained doctor may use the eyepiece to provide an anatomical overlay of the soldier's body so that he can respond more appropriately to the situation at hand. Solely as an example and without limiting the invention, if a wounded soldier is bleeding from a bullet wound to the leg, an augmented reality overlay of the soldier's arteries may be presented to the user so that the user can determine whether an artery was hit and how serious the injury is. An appropriate protocol for the given wound may be presented to the user through the eyepiece, so that he can review each step over the course of treatment. Such protocols may also be presented to the user using augmented reality, video, audio, or other formats. The eyepiece may provide the doctor protocols in the form of step-by-step augmented reality instructions. In embodiments, an augmented reality overlay of the wounded soldier's organs may also be presented to the user to guide the doctor through any procedure, so that the doctor does not cause additional injury to the soldier's organs over the course of treatment. In addition, the eyepiece may provide the user enhanced visual aids, such as an anatomical overlay of the positions of vital organs of the soldier's body near the wounded soldier's injury.
In embodiments, the eyepiece may be used to scan the retina of a wounded soldier in the field to obtain his medical record. This may alert the doctor to possible drug allergies, or may provide other important details of benefit during a medical procedure.
In addition, if the wounded soldier wears an eyepiece, the device may send information including the wounded soldier's heart rate, blood pressure, respiratory pressure, and the like to the doctor's glasses. The eyepiece may also help the user observe the soldier's gait to determine whether the soldier has a head injury, and may help the user determine the location of bleeding or injury. Such information may provide the user information about possible medical treatments, and in embodiments, an appropriate protocol or a selection of protocols may be displayed to the user to help him treat the patient.
In other embodiments, the eyepiece allows the user to monitor other symptoms of the patient for psychological health checks. Similarly, the user may check whether the patient is exhibiting rapid eye movement to make a determination, and may further use the eyepiece to provide sedation treatment, eye movement exercises, breathing exercises, and the like to the patient. In addition, when information about the wounded soldier's vital signs and health data is collected and sent from the wounded soldier's eyepiece to the doctor's eyepiece, the doctor may be provided that information. This may provide the doctor real-time data from the wounded soldier without his having to determine such data himself, such as by measuring the wounded soldier's blood pressure.
In various embodiments, the user may be provided with an alert from the eyepiece telling him how far away an air or ground rescue is from his position in the field. This can provide the physician with important information, alerting him, given the time available in the situation, whether certain procedures should or must be attempted, and it can provide the wounded soldier with the reassurance of knowing that rescue is on the way, or remind him that he may need other sources of help.
In other embodiments, the user may be provided with a warning about his own vital signs if a problem is detected. For example, if a soldier's blood pressure is elevated, he may be warned so that he knows he must take medication or, if possible, withdraw himself from combat to return his blood pressure to a safe level. Also, the user may be warned about other such data, such as changes in his pupil size, heart rate, gait, and the like, to determine whether the user is experiencing a medical problem. In other embodiments, the user's eyepiece may also alert medical personnel at another location to the user's medical condition, in order to send help for the user whether or not he knows he needs such help. Further, general data may be aggregated from multiple eyepieces to provide a commanding officer with details about his wounded soldiers, such as how many of his soldiers are in the fight, how many of them are injured, and the like.
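The vital-sign warning described above may be sketched as a simple threshold check. The specific limits and reading names below are hypothetical placeholders, not values from this disclosure; actual limits would come from medical guidance.

```python
# Hypothetical safe ranges; illustrative only, not medical guidance.
THRESHOLDS = {
    "systolic_bp": (90, 140),   # mmHg, (low, high)
    "heart_rate": (50, 120),    # beats per minute
}

def check_vitals(vitals):
    """Return a list of warning strings for readings outside safe bounds."""
    warnings = []
    for name, value in vitals.items():
        low, high = THRESHOLDS[name]
        if value < low:
            warnings.append(f"{name} low: {value}")
        elif value > high:
            warnings.append(f"{name} high: {value}")
    return warnings

# A soldier with elevated blood pressure triggers one warning.
print(check_vitals({"systolic_bp": 165, "heart_rate": 88}))
# ['systolic_bp high: 165']
```

In a deployed system, such warnings would be raised locally in the wearer's display and could also be forwarded to remote medical personnel, as the embodiment describes.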
In various embodiments, trained medical professionals may also use the eyepiece in medical response outside of combat. Such an eyepiece has uses similar to those of the physician described above, at headquarters or away from headquarters but outside of combat conditions. In this way, the eyepiece may provide the user with a means of obtaining augmented reality assistance during a medical procedure, recording a medical procedure, performing a medical procedure remotely under the guidance of a commanding officer, on or off a military base, via video and/or audio, and the like. This can provide assistance in a variety of situations where a physician may need additional help. One example of such a situation may occur when the physician is on duty during training routines, exercises, military hikes, and the like. Such assistance may be important when the physician is the only responder, when he is a new physician, when he is approaching a new situation, and the like.
In some embodiments, the eyepiece may provide user guidance in environments related to military transport aircraft. For example, the eyepiece may be used in such environments during training, when entering combat, during reconnaissance or rescue missions, in a moving vehicle, when performing maintenance aboard the aircraft, and the like. Such use may be suitable for personnel of various grades and ranks.
For purposes of illustration, the user may be aboard a transport aircraft and receive audio and visual information through the eyepiece when entering a training routine. This information may provide the user with details about the training mission, such as battlefield conditions, weather conditions, mission instructions, maps of the region, and the like. The eyepiece may simulate a real combat scene to prepare the user for combat. The eyepiece may also record the user's responses and actions by various means. Such data collection allows the user to receive feedback on his performance. Further, the eyepiece may then alter the simulation according to the results obtained, changing the simulation while it is underway during the training routine, or changing future simulations for the user or for each user.
In embodiments, when a military transport aircraft is about to enter combat, the eyepiece may provide guidance and/or interaction to the user aboard the aircraft. The user may receive audio and visual information about the mission while aboard. A checklist may be displayed to the user to ensure that he has the proper materials and equipment for the mission. In addition, instructions for the proper use of securing equipment and seat belts may be presented, along with information about the aircraft (such as the locations of emergency exits, oxygen tanks, and safety equipment). The user may be presented with instructions, such as when to rest before the mission and when medication will be administered for that purpose. The eyepiece may provide the user with noise cancellation for pre-mission rest, and may then remind the user when his rest is over and further mission preparation will begin. Additional information may be provided, such as maps of the battlefield, the number of vehicles and/or personnel on the battlefield, battlefield weather conditions, and the like. The device may provide a link to other soldiers, so that instruction and combat preparation may include soldier interaction, where a commanding officer may be heard by subordinates, and the like. Further, the information for each user may be formatted to suit his specific needs. For example, a commanding officer may receive higher-level or more classified information that may not be provided to officers of lower grade.
In embodiments, the user may use the eyepiece on a military transport aircraft during reconnaissance or rescue missions, where the eyepiece captures and stores various images and/or video of places of interest as the aircraft flies over each region, which may be used to obtain information about potential battlefields and the like. The eyepiece may be used to detect the movement of people and vehicles on the ground, and thereby to detect enemies to be defeated or friendly forces to be rescued or assisted. The eyepiece may provide the ability to apply markers to maps or images of the regions flown over and searched, so as to color-code regions already searched or regions still requiring search.
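The color-coding of searched and unsearched regions described above may be sketched as marking cells of a map grid. The grid representation and color names below are illustrative assumptions, not part of this disclosure.

```python
# Hypothetical color codes for overlay rendering on the map.
SEARCHED, UNSEARCHED = "green", "red"

def make_grid(rows, cols):
    """A map of the flown-over region, divided into cells, all initially unsearched."""
    return [[UNSEARCHED] * cols for _ in range(rows)]

def mark_searched(grid, cells):
    """Apply the 'searched' marker to the given (row, col) cells."""
    for r, c in cells:
        grid[r][c] = SEARCHED

def remaining(grid):
    """Cells still needing search, for tasking further passes."""
    return [(r, c) for r, row in enumerate(grid)
            for c, state in enumerate(row) if state == UNSEARCHED]

grid = make_grid(2, 2)
mark_searched(grid, [(0, 0), (1, 1)])
print(remaining(grid))  # [(0, 1), (1, 0)]
```

An overlay renderer would then paint each cell in its assigned color on the displayed map or image.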
In embodiments, the user on a military transport aircraft may be provided with instructions and/or checklists of the quantity and position of equipment to be inventoried and moved, as well as special handling instructions for various pieces of equipment. When items are being unloaded or loaded, the user may be provided with warnings about approaching vehicles to ensure safety.
For the maintenance and safety of military transport aircraft, the user may be provided with pre-flight checks for the proper operation of the aircraft. The pilot may be warned before a mission if proper maintenance has not been completed. Further, aircraft operators may be provided with a graphical overview or list of the aircraft's history to track the history of the aircraft's maintenance.
In some embodiments, the eyepiece may provide user guidance in environments related to military fighter aircraft. For example, the eyepiece may be used in such environments during training, when entering combat, for maintenance, and the like. Such use may be suitable for personnel of various grades and ranks.
As an example, the eyepiece may be used by a user for military fighter aircraft combat training. The user may be presented with an augmented reality situation simulating combat conditions in a particular military jet or aircraft. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, judge his performance, and alter the training routine according to past data.
In embodiments related to actual combat, the user may be presented with information showing friendly and non-friendly aircraft around and/or near him. The user may be presented with information about enemy aircraft, such as maximum speed, maneuverability, and range. In embodiments, the user may receive information related to the appearance of ground hazards and be warned of the situation. The eyepiece may be synchronized with the user's aircraft and/or aircraft instruments so that the pilot can see urgent warnings and additional information about the aircraft not generally shown in the cockpit. Further, the eyepiece may display the number of seconds to a target area, or the time to a threat posed by a missile launched from a vehicle or a pop-up threat. The eyepiece may suggest maneuvers for the pilot to execute according to the surrounding environment, potential threats, and the like. In embodiments, the eyepiece may detect and display friendly aircraft even when the friendly aircraft are in stealth mode.
In embodiments, the user may be provided with pre-flight checks for the proper operation of the fighter aircraft. If proper routine maintenance has not been completed before a mission, the pilot may be alerted by linking with maintenance records, the aircraft's computer, and the like. The eyepiece may allow the pilot to review the aircraft's maintenance history, along with charts and diagrams of that history.
In some embodiments, the eyepiece may provide user guidance in environments related to military helicopters. For example, the eyepiece may be used in such environments during training, when entering combat, for maintenance, and the like. Such use may be suitable for personnel of various grades and ranks.
As an example, the eyepiece may be used by a user for training in military helicopter operation under combat or high-pressure situations. The user may be presented with an augmented reality situation simulating combat conditions in a given aircraft. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, judge his performance, and alter the training routine according to past data.
During training and/or combat, the user's eyepiece may be synchronized with the aircraft to provide important statistics about the aircraft along with maintenance alerts. The user may review plans as he boards the aircraft, as well as safety and emergency procedures for passengers. Such procedures may explain how to ride the aircraft safely, how to operate the doors to enter and exit the aircraft, the locations of life-saving equipment, and other information. In embodiments, the eyepiece may present the user with the locations and/or orientations of threats, such as those that may pose a danger during the helicopter's flight. For example, the user may be presented with the locations of low-altitude threats (such as drones or other helicopters) and the locations of ground threats. In embodiments, a noise-canceling headset may be provided together with the eyepiece, along with a multi-user interface, to allow communication during flight. In the event of the helicopter going down, the user's eyepiece may transmit its position along with helicopter information to the commanding officer and rescue team. Further, during low-altitude flight missions, the eyepiece's night vision may allow the user to search for or locate the enemy with the powerful helicopter spotlight switched off, so as to remain undetected.
In embodiments, as described in the various examples herein, the eyepiece may provide assistance in tracking aircraft maintenance and in determining whether proper routine maintenance has been performed. Further, as with the other aircraft and vehicles mentioned herein, augmented reality may be used to provide assistance in the maintenance and operation of the aircraft.
In some embodiments, the eyepiece may provide user guidance in environments related to military drone aircraft or robots. For example, the eyepiece may be used in such environments during reconnaissance, during capture and rescue missions, in combat, in situations posing particular risk to humans, and the like.
In embodiments, the eyepiece may provide the user with a video feed of the drone's surroundings. Real-time video with information about each region of interest may be displayed within seconds. Collecting such information can provide soldiers with knowledge of the number of enemy soldiers in a region, the layout of buildings, and the like. In addition, data may be collected from the drone and/or robot and sent to the eyepiece to gather information about the position of a person of interest who is to be captured or rescued. For example, a user outside a safe house or bunker may use a drone and/or robot to send back video or data feeds about the position, number, and activity of the people inside the safe house, in preparation for a capture or rescue.
In embodiments, use of the eyepiece in conjunction with drones and/or robots allows a commanding officer to collect battlefield data during a mission, make planning changes, and provide various instructions to the team according to the collected data. In addition, the eyepiece and its associated controls allow the user to deploy weapons on the drone and/or robot through the user interface in the eyepiece. Data feeds sent from the drone and/or robot may provide the user with information about what weapons will be deployed and when to deploy them.
In embodiments, data collected from drones and/or robots allows the user to approach potentially dangerous situations. For example, this allows the user to investigate biological spills, bombs, projectiles, foxholes, and the like, providing the user with data about the situation and environment while keeping the user out of direct harm.
In some embodiments, the eyepiece may provide user guidance in environments related to military marine vessels. For example, the eyepiece may be used in such environments during training, when entering combat, during search and rescue missions, when performing post-disaster cleanup, when performing maintenance, and the like. Such use may be suitable for personnel of various grades and ranks.
In embodiments, the eyepiece may be used in training to allow users to prepare for the performance of the various skill sets required by their duties aboard ship. Training may include simulations testing the user's ability to navigate, control the ship, and/or perform various tasks under combat conditions, and the like. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, judge his performance, and alter the training routine according to past data.
In embodiments, the eyepiece may allow the user to view potential ship threats beyond the horizon by providing the user with an augmented reality view of the situation. Such threats may be represented by dots, diagrams, or other means. Once the eyepiece detects a specific threat, the eyepiece may send the user instructions for preparing to engage the enemy. In addition, the user may view maps or video of the harbor where the ship will dock and be provided with hostile positions. In embodiments, the eyepiece may synchronize with the ship and/or weapons equipment to guide the user in the use of navigation equipment during combat. The user may be alerted through the eyepiece as to where international and national waters lie.
In embodiments where search and rescue is needed, the eyepiece may be used to track currents and/or to mark waters that have recently been searched. In embodiments where currents are tracked, this may provide the user with information about the potential or changed location of the person of interest to be rescued. Similarly, the eyepiece may be used in environments where the user must survey his surroundings. For example, the user may be alerted to significant changes in water pressure and/or water movement, where such significant changes could signal mantle movement and/or an oncoming disaster. Alerts about changes in the Earth's mantle, earthquakes, and/or tsunami threats, and the like, may be sent to the user through the eyepiece. Such alerts may be provided by the eyepiece, synchronized with equipment on the ship, by tracking ocean water movement, current changes, water pressure changes, the falling or rising of surrounding water levels, and the like.
In embodiments where a military vessel is deployed for post-disaster cleanup, the eyepiece may be used to detect contaminated regions, the speed and depth at which the contamination is traveling, and to predict where the contamination will stop. In embodiments, the eyepiece may be used to determine changes in the position of a contamination volume by detecting the ppm (parts per million of volume) of contaminant contained in a volume and changes therein.
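The ppm measure mentioned above, and the comparison of successive readings to follow a plume, may be sketched as follows. The sample volumes and station names are hypothetical illustrations, not data from this disclosure.

```python
def ppm(contaminant_volume, total_volume):
    """Parts per million by volume of contaminant in a sampled volume."""
    return contaminant_volume / total_volume * 1_000_000

def peak_station(readings):
    """Given {station: ppm} samples, return the station with peak concentration;
    comparing successive peaks over time indicates where the plume is moving."""
    return max(readings, key=readings.get)

# 0.5 liters of contaminant in 10,000 liters of water -> 50 ppm.
print(ppm(0.5, 10_000))  # 50.0
print(peak_station({"north": 12.0, "east": 48.5, "south": 3.1}))  # east
```

Repeating the `peak_station` comparison across timed sampling passes gives a rough direction and speed of travel, which is the kind of change the embodiment describes detecting.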
In various embodiments, the eyepiece may provide the user with plans for checking the proper operation of the ship and the equipment aboard. In addition, the operators of the ship may be alerted if proper routine maintenance has not been completed before deployment. In embodiments, the user may be able to review the ship's maintenance history and the state of the ship's important functions.
In embodiments, the eyepiece may provide the user with various forms of guidance in a submarine environment. For example, the eyepiece may be used in such environments during training, when entering combat, for maintenance, and the like. Such use may be suitable for personnel of various grades and ranks.
As an example, the eyepiece may be used by a user for training in submarine operation under combat or high-pressure situations. The user may be presented with an augmented reality situation simulating combat conditions and the like in a particular submarine. The training program may be based on the user's grade, such that his grade determines the types of situations presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, judge his performance, and alter the training routine according to past data. In embodiments, the eyepiece may also train the user in maintaining the submarine, using the submarine, and following proper procedures.
In a combat environment, the eyepiece may be used to provide the user with information about the user's depth, the positions of enemies and objects, and friendly and/or enemy forces on the surface. In embodiments, such information may be conveyed to the user in visual presentations, by audio, and the like. In various embodiments, the eyepiece may be synchronized with the submarine's equipment and fittings and/or may use the submarine's equipment and status to collect data from GPS, sonar, and the like, in order to gather various information, such as the positions of other objects, submarines, and the like. The eyepiece may display to the soldier instructions regarding security procedures, mission details, and the appearance of enemies in the region. In embodiments, the device may communicate or synchronize with the ship and/or weapons equipment to instruct the user when using such equipment and to provide displays related to the specific equipment. Such displays may include visual and audio data related to the equipment. As a further example, the device may be used together with a periscope to enhance the user's visual images and/or audio, showing the locations of potential threats and points of interest, as well as displayed information that may not be obtainable by using the periscope alone, such as the positions of enemies outside the field of view, national and international waters, various threats, and the like.
The eyepiece may also be used in the maintenance of the submarine. For example, it may provide the user with pre-voyage checks of the ship's proper operation, and it may provide reminders before a mission of maintenance operations not yet performed or not properly completed. In addition, the user may be provided with a detailed history to review the maintenance performed, and the like. In embodiments, the eyepiece may also assist in maintaining the submarine by providing augmented reality or other instructions indicating the user's plan for performing such maintenance.
In embodiments, the eyepiece may provide the user with various forms of guidance in the environment of ships in port. For example, the eyepiece may be used in such environments during training, when entering combat, for maintenance, and the like. Such use may be suitable for personnel of various grades and ranks.
As an example, the eyepiece may be used by a user for training with a ship in port under combat, attack, or high-pressure situations. The user may be presented with an augmented reality situation simulating combat conditions that might be seen in a particular harbor and aboard such a ship. The training program may show terrain data from harbors around the world and their surroundings, data about the number of allied or enemy ships that may be in the harbor at a given time, and it may show local fueling stations and the like. The training program may be based on the user's grade, such that his grade determines the types of situations presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, judge his performance, and alter the training routine according to past data. In embodiments, the eyepiece may also train the user in maintaining and performing machine maintenance aboard the ship, in the use of the ship, and in the use of proper security procedures aboard the ship, and the like.
In a combat environment, the eyepiece may be used to provide the user with information related to the harbor where the user will dock or has docked. The user may be provided with information about the positions of enemy and/or friendly ships in the harbor, or other visual representations. In embodiments, the user may receive warnings about approaching aircraft and enemy ships, and the user may synchronize with the ship and/or weapons equipment to be instructed in the use of the equipment, while simultaneously being provided with information and/or display data about the equipment. Such data may include the quantity and effect of specific munitions, and the like. The eyepiece may display to the soldier instructions regarding security procedures, mission details, and the appearance of enemies in the region. Such displays may include visual and/or audio information.
The eyepiece may also be used in the maintenance of the ship. For example, it may provide the user with pre-voyage checks of the ship's proper operation, and it may provide reminders of routine maintenance not yet performed or not properly completed before a mission. In addition, the user may be provided with a detailed history to review the maintenance performed, and the like. In embodiments, the eyepiece may also assist in maintaining the ship by providing augmented reality or other instructions indicating the user's plan in performing such maintenance.
In other embodiments, the user may use the eyepiece or other devices to obtain biometric information about individuals approaching the harbor. Such information may provide the identity of a person and allow the user to know whether the person is a threat or someone of interest. In other embodiments, the user may scan items or containers imported into the harbor to find potential threats shipped in cargo, and the like. The user may detect dangerous substances according to density or various other information collected by sensors associated with the eyepiece or device. The eyepiece may record information or scan documents to determine whether a document has been forged or modified in some way. This can help the user check personal credentials, and it may be used to check proof documents associated with specific goods, in order to alert the user to potential threats or problems related to the goods, inaccurate inventories, forged documents, and the like.
In embodiments, the eyepiece may provide the user with various forms of guidance when using tanks or other land vehicles. For example, the eyepiece may be used in such environments during training, when entering combat, for surveillance, for group transport, for maintenance, and the like. Such use may be suitable for personnel of various grades and ranks.
As an example, the user may use the eyepiece for training in the use of tanks or other land vehicles in combat, under attack, or in high-pressure situations. The user may be presented with an augmented reality situation simulating combat conditions as seen in a tank and/or when operating a tank. The training program may test the user on the proper use of equipment and weapons, and the like. The training program may be based on the user's grade, such that his grade determines the types of situations presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, judge his performance, and alter the training routine according to past data. In embodiments, the eyepiece may also train the user in maintaining the tank, in using the tank, and in proper safety procedures when using the tank or boarding the vehicle, and the like.
In a combat environment, the eyepiece may be used to provide the user with information and/or visual presentations related to the positions of enemy and/or friendly land vehicles. In embodiments, the user may receive warnings about approaching aircraft and enemy vehicles, and the user may synchronize with the tank and/or weapons equipment to be instructed in the use of the equipment, while simultaneously being provided with information and/or display data about the equipment. Such data may include the quantity and effect of specific munitions, and the like. The eyepiece may display to the soldier instructions regarding security procedures, mission details, and the appearance of enemies and friendly forces in the region. Such displays may include visual and audio information. In embodiments, the user may view a 360-degree view of the surroundings sent from outside the tank, which may be accomplished by using the eyepiece synchronized with cameras or other devices having such views. Video/audio feeds may be provided to as many users inside or outside the tank/vehicle as desired. This allows users to monitor the vehicle and stationary threats. The eyepiece may communicate with the vehicle, and with the various other vehicles, aircraft, and equipment described herein or otherwise apparent to those skilled in the art, to monitor vehicle statistics, armor damage, engine status, and the like. The eyepiece may further provide GPS for navigation purposes, and may use black silicon or other technologies described herein to detect enemy forces at night and to navigate the environment in suboptimal viewing conditions, and the like.
In addition, the eyepiece may be used for surveillance in the tank/land vehicle environment. In embodiments, the user may be able to synchronize with cameras or other devices to obtain a 360-degree field of view for collecting information. Night vision and/or SWIR described herein, and the like, may be used to collect further information if needed. The user may use the eyepiece to detect heat signatures in order to survey the environment and detect potential threats, and may check soil density and the like to detect roadside bombs, vehicle tracks, various dangers, and the like.
In embodiments, the eyepiece may be used to facilitate group transport using tanks or other land vehicles. For example, the user may be provided with an inventory of the items and individuals to be transported, where the inventory may be visual, interactive, and the like. The user may be able to track and update the inventory of items in order to track those items in transit, and the like. The user may be able to view maps of the surrounding region, scan proof documents and files for the identification of personnel, identify and track items related to individuals in transit, view the routes/mission information of individuals in the transport, and the like.
The eyepiece may also be used in the maintenance of vehicles. For example, it may provide the user with pre-trip checks for the proper operation of the tank or other vehicle, and it may provide reminders of routine maintenance not yet performed or not properly completed before a mission. In addition, the user may be provided with a detailed history to review the maintenance performed, and the like. In embodiments, the eyepiece may also assist in maintaining the vehicle by providing augmented reality or other instructions indicating the user's plan in performing such maintenance.
In embodiments, the eyepiece may provide the user with various forms of guidance in urban or suburban environments. For example, the eyepiece may be used in such environments during training, when entering combat, for surveillance, and the like. Such use may be suitable for personnel of various grades and ranks.
As an example, the user may use the eyepiece for training in urban or suburban environments in combat, under attack, or in high-pressure situations, when interacting with local personnel, and the like. The user may be presented with an augmented reality situation simulating combat conditions as seen in such environments. The training program may test the user on the proper use of equipment and weapons, and the like. The training program may be based on the user's grade, such that his grade determines the types of situations presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, judge his performance, and alter the training routine according to past data. In embodiments, the user may view alternate scenes in urban and suburban settings, where the settings include real buildings and building layouts and potential combat regions. Before entering a region, the user may be provided with weather and climate information, and may be informed of the number of people generally in the region at a given time or time of day, in order to prepare for possible attacks or other engagements. Further, the user may be provided with the positions of individuals in, around, and on top of buildings in a given region, so that the user is prepared before entering the environment.
In urban and suburban environments, the eyepiece or other devices may also allow the user to survey local personnel. The user may collect face, iris, voice, fingerprint, and palm print data of persons of interest. The user may scan such data without being noticed, from a distance of 0-5 meters from the POI, from greater distances, or from right beside the POI. In embodiments, the user may use the eyepiece to see through smoke and/or a destroyed environment, to mark and record the appearance of vehicles in the region, to record images of the surroundings for future use (such as in action planning), to mark the population density of the region at various times of day, the layouts of various buildings and paths, and the like. Further, the user may collect and receive intelligence about specific local persons of interest associated with soldiers.
In combat, the user may use the eyepiece or other devices in the urban/suburban environment. The device may allow the user to locate and destroy hostile targets using geolocation via a laser range finder. In embodiments, a bird's-eye view of the surroundings and buildings may be provided. Enemies in the region surrounding the user may be shown, and the positions of identified individuals (such as those of enemy or friendly forces, or members of the user's group) may be marked. The user may use the eyepiece or other device to stay in contact with his headquarters and to view/listen to instructions from a commanding officer through the eyepiece, where such instructions may be given after reviewing or listening to data from the user's environment. In addition, the eyepiece may also allow the user to give orders to other members of his group. In embodiments, the user may perform biometric data collection on nearby individuals, and may record such information and/or retrieve information about them for use in combat. The user may link with other soldiers' devices and monitor the various devices carried by the soldiers. In embodiments, the eyepiece may alert the user when the edge of a building appears suddenly at a rooftop, and may warn him when approaching a change in ground level or a collision. The user may view a map of the environment combined with the positions of the members of his group, and he may be able to detect possible nearby enemies and signal warnings to others around him. In various embodiments, the eyepiece may be used by the user to communicate with other group members and to execute plans. Further, the user may use the eyepiece to detect enemies in dark tunnels and in other regions where enemies may be located.
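The geolocation via laser range finder mentioned above can be sketched as combining the user's own GPS position, a compass bearing, and the measured range into an estimated target position. This is a minimal flat-earth approximation valid only for short ranges; the coordinates and function names are illustrative assumptions, not part of this disclosure.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, meters

def target_position(lat, lon, bearing_deg, range_m):
    """Estimate target lat/lon (degrees) from the user's position, a compass
    bearing (degrees clockwise from north), and a rangefinder distance in
    meters, using a small-distance flat-earth approximation."""
    bearing = math.radians(bearing_deg)
    d_north = range_m * math.cos(bearing)   # meters toward north
    d_east = range_m * math.sin(bearing)    # meters toward east
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# A target 1000 m due north shifts latitude by roughly 0.009 degrees.
t_lat, t_lon = target_position(34.0, -118.0, 0.0, 1000.0)
print(round(t_lat - 34.0, 4))  # 0.009
```

Over kilometers this approximation suffices for marking a target on a map overlay; longer ranges would call for proper geodesic (great-circle) formulas.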
The eyepiece may be used in desert environments. In addition to the general and/or applicable uses described herein related to training, combat, survival, surveillance, and the like, the eyepiece may further be used in the various usage scenarios that may be encountered in environments such as desert environments. As an example, when entering combat or training, the user may use the eyepiece to correct for diminished vision through a sandstorm in combat, surveillance, and training. In addition, the eyepiece may simulate for the user, in a training mode, the poor visibility of sandstorms and other desert dangers. In combat, the eyepiece may assist the user in seeing or detecting the appearance of enemies in a sandstorm in the manners described above. Further, the user may be warned of, and/or may be able to see, the difference between sand clouds caused by wind and those generated by vehicles, so as to be warned of the approach of potential enemies.
In various embodiments, the user may use the eyepiece to detect ground and environmental hazards. For example, the user may use the eyepiece to detect the edges of sand dunes, sand barriers, and the like. The user may also use the eyepiece to detect sand density in order to detect various dangers, such as holes in the ground, cliffs, and buried devices such as mines and bombs. A map of the desert may be presented to the user to show the positions of such hazards. In embodiments, the user may be provided with a device by which his vital signs are monitored and by which he is warned when he is endangered by extreme environmental conditions (such as heat, cold, temperature swings within a day, dehydration, and the like). Such warnings and monitoring may illustratively be provided in a user interface displayed in the eyepiece and/or provided by audio messages.
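The vital-sign monitoring and warning behavior described above amounts to checking each monitored sign against a safe range. The following is a minimal sketch, not the specification's implementation; the sign names and limits are assumptions chosen purely for illustration.

```python
# Hypothetical sketch of vital-sign threshold alerting for extreme
# environmental conditions. Sign names and limits are illustrative only.

# Assumed safe ranges: (low, high) per monitored sign.
SAFE_RANGES = {
    "core_temp_c": (35.0, 39.0),    # hypothermia / heat-stress bounds
    "heart_rate_bpm": (40, 180),
    "hydration_pct": (60, 100),     # estimated hydration level
}

def check_vitals(readings):
    """Return a list of alert strings for any reading outside its range."""
    alerts = []
    for sign, value in readings.items():
        low, high = SAFE_RANGES[sign]
        if value < low:
            alerts.append(f"{sign} low: {value}")
        elif value > high:
            alerts.append(f"{sign} high: {value}")
    return alerts

# A dehydrated, overheating user triggers two alerts.
print(check_vitals({"core_temp_c": 39.8,
                    "heart_rate_bpm": 150,
                    "hydration_pct": 48}))
```

In practice the alert list would drive the eyepiece's visual overlay and/or audio warning rather than a print statement.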
In embodiments, a map of the desert may be presented to the user to show the positions of his group, and he may use the eyepiece to detect nearby signals or to obtain warnings of possible hostile forces, with such warnings displayed on the map or delivered as audio alerts from the headset. In such embodiments, the user may have an advantage over his enemy because he may have the ability to determine the positions of his group and of the enemy in a sandstorm, inside a building, in a vehicle, and the like. The user may view a map of his position in which the regions where the user has recently traveled are displayed in one color and new regions in another color. In this way, or by other means, the equipment may keep the user from becoming lost and/or keep him moving in the correct direction. In embodiments, the user may be provided with weather satellite overlays to alert the user to sandstorms and hazardous weather.
The eyepiece may be used in wilderness environments. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in various usage scenarios that may be encountered in environments such as wilderness environments.
As an example, the user may use the eyepiece in training to prepare for the wilderness. For example, the user may use the eyepiece to simulate varying degrees of wilderness environments. In embodiments, the user may experience dangerous animals amid very dense trees and bushes, and in other training environments he may be subjected to the challenge of having fewer places to hide from the enemy.
In combat, the user may use the eyepiece for numerous purposes. The user may use the eyepiece to detect freshly broken branches in order to detect recent enemy presence. In addition, the user may use the eyepiece to detect dangerous cliffs, caves, changes in terrain, recently moved or disturbed dust, and the like. As an example, by detecting the presence of recently disturbed dust (which may be detected if it has a density or heat signature different from the surrounding dust and foliage, or which may be detected otherwise), the user may be warned of traps, bombs, or other hazardous devices. In each of the environments described herein, the user may use the eyepiece to communicate with his group through a user interface or other means, so that communication can remain silent in enclosed environments, in open environments sensitive to echoes, and the like, and/or go undetected by the enemy. Also, in each environment, the user may use the night vision described herein to detect the presence of enemies. The user may also view overlays of trajectory maps and/or mountain trajectory maps in the eyepiece, so that the user can review a path before encountering potentially hazardous areas and/or situations in which enemies may be located. In each of the environments described herein, the eyepiece may also amplify the user's hearing for the detection of potential enemies.
In embodiments, the user may use the eyepiece in wilderness environments in search and rescue use cases. For example, the user may use the eyepiece to detect soil or foliage movement, determining whether it has been disturbed, in order to track human trails and to find buried bodies. The user may view a map of the area marked to show the regions already covered by air forces and/or other group members, so that the user is directed from regions already searched to regions not yet searched. In addition, the user may use the eyepiece for night vision to detect humans and/or animals through trees, undergrowth, bushes, and the like. Furthermore, by using the eyepiece to detect the presence of freshly broken twigs, the user, when on surveillance and/or rescue duty, is able to detect the presence, or recent presence, of a person of interest. In embodiments, the user may also view overlays of trajectory maps and/or mountain trajectory maps in the eyepiece, so that the user can review a path before encountering potentially hazardous regions and/or situations.
In other embodiments, the user may use the eyepiece in the wilderness for purposes of living off the land and in survival-type situations. As an example, when foraging, the user may use the eyepiece to track animal trails and movement. In addition, the user may use the eyepiece to detect soil moisture and to detect the presence and location of bodies of water. In embodiments, the eyepiece may also amplify the user's hearing to detect animals that are potential prey.
The eyepiece may be used in arctic environments. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in various use cases that may be encountered in environments such as arctic environments. For example, when in training, the eyepiece may simulate the visual and audio whiteout weather conditions the user may encounter in arctic environments, so that the user may adapt to operating under such pressures. In addition, the eyepiece may provide the user with a program that simulates various extreme-cold situations and scenarios, and the program may track and display data related to the user's predicted heat loss. Furthermore, the program may be adapted to simulate the conditions the user might experience under such heat loss. In various embodiments, the user may be unable to properly control his limbs, which may appear as a reduction in weapon accuracy. In other embodiments, the user may be provided with help information, with instructions on matters such as burrowing into snow to keep warm, and with various survival skills for arctic conditions. In other embodiments, the eyepiece may be synchronized to a vehicle so that the vehicle appears to perform and react as it would in a specific environment with arctic conditions, ice, and snow. Accordingly, the vehicle may react to the user in a corresponding manner, and the eyepiece may likewise simulate visuals and audio as though the user were in such an environment.
In embodiments, the user may use the eyepiece in combat. The eyepiece may be used to allow a soldier to see through whiteout weather conditions. The user may call up overlay maps and/or audio providing information on trenches around buildings, land hazards, and the like to allow the soldier to move safely through the environment. The eyepiece may alert the user to detected increases or decreases in snow density, allowing him to know when the terrain beneath the snow has changed, indicating possible trenches, holes or other dangers, objects buried in the snow, and the like. In addition, under conditions where it is difficult to see, the positions of his group members and of the enemy may be provided to him, regardless of whether snow obstructs the user's field of view. The eyepiece may also provide heat signatures in arctic environments to show the user animals and individuals. In various embodiments, the user interface in the eyepiece may display the soldier's vital signs to him and provide a warning when he is endangered by extreme surrounding environmental conditions. In addition, the eyepiece may help the user operate a vehicle in snowy conditions by providing the user with reminders from the vehicle regarding transmission slip, wheel slip, and the like.
The eyepiece may be used in jungle environments. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in various usage scenarios that may be encountered in environments such as jungle environments. For example, in training, the eyepiece may provide the user with information about which plants can be eaten, which are toxic, and which insects and animals may put the user at risk. In embodiments, the eyepiece may simulate the various noises and surroundings the user may encounter in a jungle, so that the environment will not be a distraction when in combat. In addition, when in combat or in a real jungle environment, the user may be provided with map overlays or other maps to show the area around him and/or to help him track where he came from and where he must go. These may alert him to allied and enemy forces in the area, and they may sense movement so as to alert the user to nearby animals and/or insects. Such alerts may help the user survive by avoiding attacks and by foraging for food. In other embodiments, the user may be provided with augmented reality data, such as with a map overlay method, allowing the user to compare organisms and/or animals with those previously encountered, to help the user distinguish which are safe to eat, which are toxic, and so on. With information on the threat a specific organism poses to the user, he may avoid deploying a weapon unnecessarily when in covert operations or quiet mode.
The eyepiece may also be used in connection with special forces tasks. In addition to the general and/or applicable uses described herein relating to training, combat, survival, surveillance, and the like, the eyepiece may further be used in various usage scenarios that may be encountered in connection with special forces tasks. In embodiments, the eyepiece may be used for purposes specific to covert operations. For example, the user may communicate completely silently with his group through a user interface that each member can see on his own eyepiece. Information shared by the user may be navigated in the user interface by eye movement and/or a controller device and the like. As the user gives instructions and/or navigates in the user interface and specifies data about the information to be transmitted, other users can also see that data. In various embodiments, each user may submit, through the user interface, questions to be answered by the commanding leader. In embodiments, the user may speak or initiate other audio that all users can hear through their eyepieces or other equipment. This allows each user at a position on the battlefield to send plans of action, instructions, questions, shared information, and the like, and allows them to do so without being detected.
In embodiments, the eyepiece may also be used for military fire-fighting. As an example, the user may use the eyepiece to run simulations of fire-fighting scenarios. The equipment may use augmented reality to simulate fire behavior and the progressive structural damage of a building over time, and it may reproduce realistic scenarios in other ways. As mentioned herein, a training program may monitor the user's progress and/or change the scenario and training modules according to the user's actions. In embodiments, the glasses may be used in actual fire-fighting. The eyepiece allows the user to see through smoke by various means described herein. The user may view, download, or otherwise access the layout of a burning building, container, aircraft, vehicle, or structure. In embodiments, the user may have an overview map or other display showing where each group member is located on the map. The eyepiece may monitor equipment worn by the user or other equipment during fire-fighting. The user may see his oxygen supply level in his eyepiece and be reminded when he should withdraw to obtain more oxygen. The eyepiece may send notifications from the user's equipment to a command post outside the structure in order to deploy new personnel into or out of the fire, and to update status and alerts regarding possible dangers to firefighters. The user may have his vital signs displayed to determine whether his temperature is too high, whether he has lost too much oxygen, and the like. In embodiments, the eyepiece may be used to analyze, according to the density of beams, heat signatures, and the like, whether there are cracks in beams or moldings, and to inform the user of the structural integrity of the building or other environment. When structural integrity is compromised, the eyepiece may provide an automatic warning.
In embodiments, the eyepiece may also be used for maintenance purposes. For example, the eyepiece may provide the user, before a task and/or for an article to be used, with a checklist for correct operation. If correct maintenance has not been logged in the article's database, the operator may be reminded. A virtual maintenance and/or performance history may be provided to the user to determine the safety of an article, or the measures to be taken for its safety and/or performance. In embodiments, the eyepiece may be used to run augmented reality programs and the like for training the user in weapon maintenance and repair, and in courses for skilled workers on related new and/or advanced equipment. In embodiments, the eyepiece may be used in the maintenance and/or repair of various articles (weapons, vehicles, aircraft, equipment, and the like). The user may use the eyepiece to view visual overlays and/or audio instructions for an article, so that the user can perform maintenance without needing to hold a manual. In embodiments, video, still images, 3D and/or 2D images, animated images, audio, and the like may be used for such maintenance. In embodiments, the user may view overlays on an article and/or videos of various images, showing the user which parts are to be removed, in what order, and how to remove them, and which parts are to be added, replaced, repaired, or enhanced. In embodiments, such a maintenance program may be an augmented reality program and the like. In embodiments, the user may use the eyepiece to connect with a machine or piece of equipment in order to monitor its operation and/or vital statistics, to assist in repair, and/or to provide maintenance information. In embodiments, the eyepiece may suggest to the user the next consecutive action during maintenance, and the eyepiece may convey to the user information about the possibility that such an action could damage the machine, how to help repair the machine, and/or whether the machine will operate after the following step, and the like. In embodiments, the eyepiece may be used for the maintenance of all items, machines, vehicles, equipment, aircraft, and the like mentioned herein, or otherwise applicable to or encountered in military environments.
The eyepiece may also be used in environments where the spoken language is to some degree unfamiliar to the user. As an example, a soldier may use the eyepiece and/or equipment to obtain near-real-time translation of those speaking around him. Through the equipment's earphones, he may hear what is said to him translated into his native language. In addition, he may record and translate comments made by prisoners of war and/or other detainees. In embodiments, the soldier may have a user interface that can translate phrases and provide the translation to the user by earphone, as a text image through the user's eyepiece, or otherwise. In embodiments, the eyepiece may be used by linguists to provide an experienced linguist with supplemental information about the dialect spoken in a specific region, or about which dialect the people near him are speaking. In embodiments, the linguist may use the eyepiece to record speech samples for further comparison and/or study. Other experts may use the eyepiece for speech analysis to determine, by monitoring changes in voice, tone, stuttering, and the like, whether the speaker is experiencing anger, shame, is lying, and so on. Even if the listener and the speaker speak different languages, this can still give the listener the speaker's original intent.
In embodiments, the eyepiece allows the user to interpret body language, facial expressions, and/or other biometric identification data from another person. For example, the user's equipment may be used to analyze a person's pupil dilation, blink rate, changes in voice, body movement, and the like to determine whether the person is lying, is hostile, is under pressure, may be a threat, and so on. In embodiments, the eyepiece may also collect data such as facial expressions in order to detect and alert the user as to whether the speaker is lying or may be making unreliable statements, is hostile, and the like. In embodiments, when interacting with a group or with other individuals, the eyepiece may provide the user with alerts warning of potentially threatening individuals who may be disguised as non-combatants, ordinary citizens, or other individuals. The user alert may be audio and/or visual, and may appear in the user interface in the user's eyepiece, or be overlaid on the user's vision and/or associated with the individual under investigation in the user's line of sight. As described herein, such monitoring may be performed while the user employs the eyepiece and/or equipment at close range in a disguised or discreet manner, collecting data unseen, or from a distance, or it may be performed with the knowledge and/or consent of the suspected individual.
The eyepiece may be used when handling bombs and in other hazardous environments. As an example, the eyepiece may provide the user with alerts about changes in soil density near a roadside, which can warn the user and/or group of buried bombs. In various embodiments, similar approaches may be used in various environments, such as testing snow density to determine whether bombs or other explosives are present in arctic environments and the like. In embodiments, the eyepiece may provide density calculations to determine whether luggage and/or transported articles tend to have unexpected densities, or densities falling outside the particular range expected for the transported articles. In embodiments, the eyepiece may provide similar density calculations and, if a density is found to fall within the range expected for explosive devices, other weapons, and the like, provide a warning. Those skilled in the art will appreciate that bomb detection may also employ chemical sensors and/or other ways known in the art, and may be used by the eyepiece in various embodiments. In embodiments, the glasses may be used in bomb disposal. The user may be provided with augmented reality or other audio and/or visual overlays to obtain instructions on how to disarm an existing bomb of a particular type. Similar to the maintenance programs described above, the user may be provided with instructions for disarming a bomb. In various embodiments, if the bomb type is unknown, the user may be provided with a user interface for safe handling and with instructions for the next steps that might be taken. In embodiments, the user may be warned of nearby potential bombs and may be presented with instructions for handling the situation safely, such as how to safely flee the bomb area, how to safely exit a vehicle containing a bomb, how far from the bomb is safe for the user, and how to disarm the bomb with instructions suited to the situation and the user's skill level, and the like. In embodiments, the eyepiece may also provide the user with training in such hazardous environments and the like.
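The density screening described above reduces to comparing a measured density against expected ranges. A minimal sketch follows; the reference ranges and cargo table are invented for illustration and would in practice come from a vetted threat library, not these made-up numbers.

```python
# Hypothetical density-screening sketch. Reference ranges (g/cm^3) are
# invented for illustration, not real threat-library values.
EXPLOSIVE_RANGES = [(1.5, 1.9)]  # densities flagged as explosive-like

def expected_range(cargo_type):
    """Assumed lookup of the declared cargo's normal density range."""
    table = {"clothing": (0.2, 0.6), "books": (0.7, 1.1)}
    return table[cargo_type]

def screen(cargo_type, measured_density):
    low, high = expected_range(cargo_type)
    if not (low <= measured_density <= high):
        if any(lo <= measured_density <= hi for lo, hi in EXPLOSIVE_RANGES):
            return "WARN: density matches explosive-like range"
        return "FLAG: density unexpected for declared cargo"
    return "OK"

print(screen("clothing", 0.4))   # within the declared range
print(screen("books", 1.7))      # falls in the explosive-like range
```

The two-tier result (unexpected vs. explosive-like) mirrors the text's distinction between merely anomalous cargo and densities matching known weapon ranges.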
In embodiments, the eyepiece may detect various other dangers, such as biological leaks, chemical leaks, and the like, and provide the user with warnings of unsafe conditions. In embodiments, the user may also be provided with various instructions for mitigating the situation, reaching safety, and keeping others safe in such contexts and/or under such conditions. Although situations involving bombs have been described, it is intended that the eyepiece may similarly be used to watch for and counter various dangerous and/or unsafe conditions, and/or to provide instructions and the like when such dangers and hazards are encountered.
In various embodiments, the eyepiece may be used in general fitness and training environments. The eyepiece may provide the user with information such as the miles traveled during his runs, hikes, walks, and the like. The eyepiece may provide the user with information such as the number of exercises performed, the calories burned, and so on. In embodiments, the eyepiece may provide the user with virtual instruction related to performing a specific exercise correctly, and it may provide the user with additional exercises as desired or appropriate. In addition, the eyepiece may provide a user interface or other means by which physical fitness benchmarks are presented to a soldier in order to meet the requirements of his specific program. Furthermore, the eyepiece may provide data related to the number and type of exercises needed for the user to meet such requirements. Such requirements may be adjusted toward special forces qualification, basic training, and the like. In embodiments, the user may work with virtual obstacles during exercise, avoiding the need to set up real rails, obstacles, and the like.
Although specific embodiments and usage scenarios are described herein, such descriptions are not intended to be limiting. Rather, it is intended that the eyepiece may be used in each example that would be apparent to those skilled in the art. It is also expected that an eyepiece mentioned for a specific environment is suitable for use in other environments, even if not specifically mentioned in connection with them.
In embodiments, the user may access and/or otherwise manipulate a library of information stored on a secure digital (SD) card, a mini SD card, other memory, loaded remotely over a tactical network, or stored by other means. The library may be part of the user's equipment and/or may be remotely accessible. The user's equipment may include a DVR or other device for storing information collected by the user, and recorded data and/or feeds may be delivered on demand to other locations. In embodiments, the library may include images of local threats, information and/or images of individuals listed as threats, and the like. The library of threats may be stored on an onboard mini SD card or other device. In embodiments, it may be loaded remotely over a tactical network. In addition, in various embodiments, the information library may include programs and other information or data useful in the maintenance of military vehicles, and it may be of any type or concern any kind of information. In various embodiments, the information library may be used together with the equipment so that data is transmitted and/or sent to and from the storage medium and the user's equipment. As an example, data may be sent from the stored library to the user and displayed in the eyepiece, so that he can view images of local persons of interest. In various embodiments, data may be sent to and from a library included in the soldier's equipment or located remotely, and data may be sent to and from the various equipment described herein. In addition, data may be sent between the various equipment described herein and the kinds of libraries described above.
In embodiments, military simulation and training may be employed. As an example, game scenarios generally used for entertainment may be adapted and used for battlefield simulation and training. The various equipment, such as the eyepieces described herein, may be used for such purposes. In such simulations, near-field communication may be used to change personnel, present dangers, change tactics and scenarios, and for various other communications. Such information may be posted to share information at the places where it is required and to provide instructions and/or information. Various scenarios, training modules, and the like may be run on the user's equipment. Merely as an example, and not as a limitation on the use of such training, the user's eyepiece may display an augmented reality combat environment. In embodiments, the user may act and react in this environment as he would in actual combat. The user may advance or fall back according to his performance. In various embodiments, the user's movements may be recorded so that feedback on his performance can be provided. In various embodiments, the user may be provided with feedback regardless of whether his performance is recorded. In embodiments, the information posted as described above may be password or biometrically protected and/or encrypted, and it may be available immediately or only after a specific period of time. Information stored electronically in this way can be updated immediately for all changes in orders and any desirable updates.
Near-field communication or other means may be used in training environments and for maintenance to share and post information, and to provide instructions and/or information, at the places where the information is needed. As an example, information may be posted in a classroom, in a maintenance bay, on a workshop floor, or anywhere else training and instruction are required. User equipment such as the eyepieces described herein allows such information to be transmitted and received. Information may be shared via augmented reality, whereby the user encounters a specific region and, once there, receives a notification of such information. Similar to what is described herein, near-field communication may be used for maintenance. As an example, information may be posted precisely where it is needed, such as in a maintenance bay, on a workshop floor, associated with the article to be repaired, and the like. More specifically, but not as a limitation of the present invention, repair instructions may be posted under the hood of a military vehicle and may be visible through the soldier's eyepiece. Similarly, various instructions and training information may be shared with various users in any given training situation, such as training for combat and/or instruction for military equipment maintenance. In embodiments, the information posted as described above may be password or biometrically protected and/or encrypted, and it may be available immediately or only after a specific period of time. Information stored electronically in this way can be updated in time for all changes in orders and any desirable updates.
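The posted-information behavior described above (protected content that becomes readable only after a release time and only with the right credential) can be sketched as a simple access check. The class and field names, and the use of a password hash rather than biometrics, are assumptions made for this illustration only.

```python
# Hypothetical sketch of a posted information item that is password
# protected and becomes available only after a release time.
import hashlib
import time

class PostedInfo:
    def __init__(self, content, password, available_at):
        self._content = content
        # Store only a hash of the credential, never the plaintext.
        self._pw_hash = hashlib.sha256(password.encode()).hexdigest()
        self._available_at = available_at  # epoch seconds

    def read(self, password, now=None):
        now = time.time() if now is None else now
        if now < self._available_at:
            return None  # not yet released
        if hashlib.sha256(password.encode()).hexdigest() != self._pw_hash:
            return None  # wrong credential
        return self._content

post = PostedInfo("repair steps for vehicle 7", "s3cret", available_at=1000)
print(post.read("s3cret", now=500))    # before release time: withheld
print(post.read("wrong", now=1500))    # bad credential: withheld
print(post.read("s3cret", now=1500))   # released and authorized
```

A biometric-protected post would swap the password hash for a template match, and an encrypted post would store ciphertext rather than plaintext; the gating logic stays the same.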
In embodiments, applications of the present invention may be used for face recognition or sparse face recognition. Such sparse face recognition may use one or more facial features to exclude possibilities when identifying a person of interest. Sparse face recognition may have automatic obstruction masking and error and angle correction. In embodiments, as an example and not as a limitation of the present invention, the eyepiece, flashlight, and equipment described herein allow sparse face recognition. This may work similarly to human vision, by using sparse matching against all image vectors to rapidly and immediately exclude unmatched regions or entire profiles. This may make false positives nearly impossible. In addition, multiple images may be used simultaneously to expand the vector space and raise accuracy. This may work with multiple databases or with multiple target images, according to availability or operational requirements. In embodiments, the equipment may manually or automatically identify one or more specific clean features that cause the least reduction in accuracy. As an example, accuracy may span various ranges, and may be at least 87.3% for the nose, 93.7% for the eyes, and 98.3% for the mouth and chin. In addition, angle correction through face reconstruction may be used, and in embodiments angle correction of up to 45 degrees through face reconstruction can be achieved. This may be further enhanced with 3D image mapping techniques. In addition, blurred region masking and replacement may be used. In embodiments, blurred region masking and replacement of 97.5% and 93.5% may be achieved for sunglasses and scarves, respectively. In embodiments, an ideal input image may be 640 by 480. A target image may, due to long range or atmospheric obscuration, be less than 10% of the input resolution and still match reliably. In addition, the particular ranges mentioned above may, in various embodiments, be greater or smaller.
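The sparse matching just described (excluding a candidate as soon as one feature region fails, and skipping obscured regions) can be illustrated with a toy region-wise comparison. This is only a sketch of the exclusion idea; the regions, the Euclidean distance, the thresholds, and the two-entry database are all assumptions for the example, not the specification's algorithm.

```python
# Toy sketch of sparse, region-wise face matching with early exclusion.
# Feature vectors, regions, and thresholds are invented for illustration.
import math

REGION_THRESHOLDS = {"eyes": 0.5, "nose": 0.6, "mouth_chin": 0.4}

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def matches(probe, candidate):
    """Compare region by region; exclude as soon as one region fails."""
    for region, threshold in REGION_THRESHOLDS.items():
        if region not in probe or region not in candidate:
            continue  # masked/obscured region (e.g. sunglasses): skip it
        if dist(probe[region], candidate[region]) > threshold:
            return False  # sparse exclusion: no need to test more regions
    return True

probe = {"eyes": [0.1, 0.2], "mouth_chin": [0.4, 0.1]}  # nose obscured
db = {
    "alpha": {"eyes": [0.1, 0.25], "nose": [0.3, 0.3],
              "mouth_chin": [0.45, 0.1]},
    "bravo": {"eyes": [0.9, 0.9], "nose": [0.3, 0.3],
              "mouth_chin": [0.4, 0.1]},
}
print([name for name, feats in db.items() if matches(probe, feats)])
```

The early `return False` is what makes the scan fast: most of the database is rejected on the first mismatching region, and the masked-region skip mirrors the obstruction masking the text describes.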
In various embodiments, the equipment and/or networks described herein may be applied to the identification and/or tracking of friends and/or allied forces. In embodiments, face recognition may be used to positively identify friends and/or friendly forces. In addition, real-time network tracking and/or real-time blue force and red force network tracking allow the user to know where his allies and/or friendly forces are. In various embodiments, there may be a visual separation between blue forces and red forces and/or forces identified by various markings and/or means. In addition, the user may geo-locate enemies and share enemy positions in real time. Likewise, the positions of friendly forces may also be shared in real time. The equipment for such applications may be the biometric collection glasses, eyepieces, and other equipment described herein, as well as equipment known to persons skilled in the art.
In embodiments, the equipment and/or networks described herein may be used in medical diagnosis. As an example, such equipment may enable healthcare providers to make remote diagnoses. Furthermore, and as an example, when battlefield medics arrive at a scene, or when working remotely, they may use equipment such as fingerprint sensors to immediately pull up a soldier's medical history, allergies, blood type, and other time-sensitive medical data so as to administer the most effective treatment. In embodiments, such data may be pulled up by face recognition, iris recognition, and the like of the soldier, which may be realized via the eyepiece described herein or another device.
In embodiments, the user may share various data through the various networks and equipment described herein. As an example, a 256-bit AES encrypted wireless video transceiver may bidirectionally share video between groups and/or between vehicle computers. In addition, biometric data sets, potential enrollments, identification and verification of persons of interest, biometric data of persons of interest, and the like may be shared locally and/or shared remotely over wireless networks. Furthermore, such identification and verification of potential persons of interest may be completed, or assisted, by data shared locally and/or supported remotely over wireless networks. The biometric recognition systems and equipment described herein likewise enable data sharing over networks. In embodiments, data may be shared with, from, and/or among various equipment, individuals, vehicles, positions, units, and the like. In various embodiments, there may be intra-unit and extra-unit communication and data sharing. Data may be shared via, from, and/or among the following: existing communication resources, mesh networks or other networks, military-controlled ultra-wideband transceiver caps with 256-bit encryption, military-controlled cables, removable SD and/or micro SD memory cards, Humvees, PSDS2, unmanned aerial vehicles, WBOTM or other relays, combat radios, mesh-networked computers, various equipment such as, but not limited to, that described herein, 3G/4G-networked biometric phone computers, digital archives, tactical operations centers, command posts, DCSG-A, BAT servers, individuals and/or groups of individuals, and any eyepiece and/or equipment described herein and/or equipment known to those skilled in the art, and the like.
In embodiments, the devices described herein or other devices may include a viewing pane that reversely projects an image onto any surface for viewing by a squad and/or group leader conducting a combat group. The transparent viewing pane or other viewing pane may be rotated 180 degrees, or another amount, in projection mode to share data with a group and/or multiple individuals. In embodiments, devices including but not limited to monocular and binocular NVGs may dock with all or nearly all tactical radios in use, and allow users to share live video, S/A, biometric data, and other data in real time or otherwise. Devices such as the binoculars and monoculars mentioned above may be standalone VIS, NIR, and/or SWIR binoculars or monoculars, may include color day/night viewing and/or digital displays, and may dock with tactical radios via a compact, encrypted, wireless-enabled computer. Various data may be shared in real time or near real time via combat radios, mesh networks, and remote tactical networks. In addition, data may be organized into digital dossiers. Data on a person of interest (POI) may be organized into a digital dossier regardless of whether such POI is enrolled. In embodiments, shared data may be compared and manipulated. Although specific devices are mentioned herein, any device may share information as described herein and/or as understood by those skilled in the art.
In embodiments, biometric data, video, and various other types of data may be collected by various devices, methods, and apparatus. For example, fingerprints and other data may be collected from weapons and other objects from war, terrorist activity, and/or crime scenes. Such collection may be captured by video or other means. The pocket bio-camera, the flashlight with embedded still camera described herein, various other devices described herein, or other devices may collect video, record, monitor, and collect identifying biometric imagery. In embodiments, various devices may record, collect, identify, and verify data and biometric data related to faces, fingerprints, latent fingerprints, latent palm prints, irises, voices, pocket litter, scars, tattoos, and other identifying marks, as well as environmental data. Data may be geo-located and date/time stamped. Devices may capture EFTS/EBTS-compliant images for matching and filing by any biometric matching software. In addition, video scanning and potential matching against embedded or remote iris and facial databases may be performed. In embodiments, various biometric data may be captured and/or compared against databases, and/or may be organized into electronic records. In embodiments, imaging and detection systems may provide biometric scanning and allow face tracking and iris recognition of multiple subjects. Subjects may move into or out of a crowd at high speed and be identified immediately, and such images and/or data may be stored and/or analyzed locally and/or remotely. In embodiments, devices may perform multi-modal biometric recognition. For example, a device may collect and identify face and iris, iris and latent fingerprint, various other combinations of biometric data, and the like. In addition, devices may record video, voice, gait, fingerprints, latent fingerprints, palm prints, latent palm prints, and the like, as well as other identifying marks and/or movements.
In various embodiments, biometric data capture may use manual input to trigger capture of the best image for filing. Data may be automatically geo-located, time/date stamped, and filed into a digital dossier with a locally or network-assigned GUID. In embodiments, devices may record full-fingerprint sensitive scans, four-finger slaps and rolls, fingerprint slaps and rolls, palm prints, fingertips, and fingerprints. In embodiments, an operator may collect POIs while screening local forces and verify the POIs against onboard or remote databases. In embodiments, devices may access a web portal to enable a biometric watch-list database and/or may include existing biometric pre-screening software for POI acquisition. In embodiments, biometrics may be matched and filed by any approved biometric matching software for sending and receiving secure voice, video, and data. Devices may combine and/or otherwise analyze biometric content. In embodiments, biometric data may be collected into biometric standard image and data formats that can be cross-referenced for near-real-time or real-time data communication with the Department of Defense Biometric Authoritative databases or other databases. In embodiments, devices may use algorithms for detection, analysis, and the like related to fingerprint, palm print, iris, and facial images. In embodiments, devices may simultaneously illuminate an iris or latent fingerprint for integrated analysis. In various embodiments, devices may use high-speed video to capture the best image under unstable conditions, and may use intuitive tactical displays to promote rapid dissemination of situational awareness. Real-time situational awareness may be provided to command posts and/or tactical operations centers. In various embodiments, devices allow each soldier to become a sensor, observing and reporting. Collected data may be tagged with the date, time, and geographic location of collection. In addition, biometric images may be NIST/ISO compliant, including ITL 1-2007. Further, in embodiments, a laser rangefinder may assist biometric capture and target location. Threat libraries may be stored on an onboard micro-SD card or loaded remotely over a tactical network. In embodiments, devices may wirelessly transmit encrypted data between devices using band transceivers and/or ultra-wideband transceivers. Devices may perform onboard matching of potential POIs against an embedded database, or securely against databases on the battlefield network. In addition, devices may use high-speed video to capture the best image under all environmental conditions. Biometric profiles may be uploaded, downloaded, and searched in seconds or less. In various embodiments, a user may use a device to geo-locate a POI at a safe distance using visual biometrics, and positively identify the POI with robust sparse recognition algorithms for face, iris, and the like. In embodiments, a user may combine and overlay visual biometric features on a single display without alerting the POI, the display having enhanced target highlighting and comprehensive viewing, matching, and alerting. Such a display may be on various devices, eyepieces, handheld devices, and the like.
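The filing scheme described above — automatic geo-location, time/date stamping, and a locally or network-assigned GUID per dossier entry — can be sketched minimally as follows. The record fields and dossier structure here are illustrative assumptions, not the format defined by any standard named above (EFTS/EBTS, NIST/ISO ITL 1-2007):

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class BiometricRecord:
    """One entry in a digital dossier: a capture plus its context metadata."""
    modality: str       # e.g. "face", "iris", "latent_fingerprint"
    subject_label: str  # operator-assigned label for the POI
    latitude: float     # geo-location of the collection point
    longitude: float
    captured_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    guid: str = field(default_factory=lambda: str(uuid.uuid4()))

def file_record(dossier: dict, record: BiometricRecord) -> str:
    """File a record into the dossier keyed by its GUID; return the GUID."""
    dossier[record.guid] = record
    return record.guid

dossier = {}
rid = file_record(dossier, BiometricRecord("iris", "POI-1", 34.5, 69.2))
print(len(dossier), dossier[rid].modality)  # → 1 iris
```

A network-assigned GUID would simply replace the `default_factory` with a value issued by the tactical network.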
In embodiments, when local nationals are screened at controlled checkpoints and/or stations, operators may use unobtrusive face and iris biometrics to collect, enroll, identify, and verify POIs against a watch list. In various embodiments, biometric collection and identification may be performed at a crime scene. For example, at a bombing or other crime scene, an operator may rapidly collect biometric data from all potential POIs. Data may be collected, geo-tagged, and stored in digital dossiers for comparison against past and future crime scenes. In addition, during house and building searches, biometric data may be collected from POIs in real time. Such displayed data may allow an operator to knowingly release, detain, or arrest potential POIs. In other embodiments, unobtrusive data collection and identification may be performed in street environments and the like. A user may, for example, walk through a market and blend in with local residents while collecting biometric features, geographic location, and/or environmental data with minimal visible impact. In addition, biometric data may be collected from the dead or wounded to identify whether they are POIs. In various embodiments, a user may identify known or unknown POIs by face recognition, iris recognition, fingerprints, visible identifying marks, and the like of the dead or wounded or other persons, and keep electronic records updated with such data.
In embodiments, a laser rangefinder and/or inclinometer may be used to determine the location of a person of interest, an improvised explosive device, another article of interest, or the like. Various devices described herein may include a digital compass, inclinometer, and laser rangefinder to provide the geographic location of POIs, targets, IEDs, articles of interest, and the like. The geographic location of a POI and/or article of interest may be sent over networks, tactical networks, and the like, and such data may be shared among individuals. In embodiments, devices allow optical arrays and laser rangefinders to simultaneously geo-locate and range multiple POIs in uncontrolled environments through continuous observation of battlefield groups or crowds. Further, in embodiments, a device may include a laser rangefinder and designator to simultaneously range and paint one or more targets through continuous observation. Further, in embodiments, devices may be soldier-worn, handheld, and the like, and may locate enemies on the battlefield using target geographic positions from an integrated laser rangefinder, digital compass, inclinometer, and GPS receiver. In embodiments, a device may include an integrated digital compass, inclinometer, MEMS gyroscope, and GPS receiver to record and display a soldier's position and the direction of his line of sight. In addition, various devices may include an integrated GPS receiver for positional and directional precision or another GPS receiver, an IMU, a 3-axis digital compass or other compass, a laser rangefinder, a gyroscope, a MEMS-based gyroscope, an accelerometer, and/or an inclinometer. The various devices and methods described herein enable users to locate enemies and POIs on the battlefield, and to share such information with friendly forces over a network or by other means.
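The target-location scheme just described — observer GPS fix plus compass bearing, inclinometer elevation angle, and laser-rangefinder slant range — reduces to simple trigonometry over short distances. The following is a minimal sketch using a local flat-earth approximation (a fielded system would use a proper geodetic datum):

```python
import math

def locate_target(lat, lon, alt, bearing_deg, elev_deg, slant_range_m):
    """Estimate target position from an observer GPS fix, compass bearing,
    inclinometer elevation angle, and laser-rangefinder slant range.
    Flat-earth approximation, adequate only for short ranges."""
    horiz = slant_range_m * math.cos(math.radians(elev_deg))  # ground distance
    dz = slant_range_m * math.sin(math.radians(elev_deg))     # height offset
    north = horiz * math.cos(math.radians(bearing_deg))
    east = horiz * math.sin(math.radians(bearing_deg))
    dlat = north / 111_320.0                                  # m per deg latitude
    dlon = east / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon, alt + dz

# Observer at the equator looking due north, level sight line, 1 km range:
tlat, tlon, talt = locate_target(0.0, 0.0, 100.0, 0.0, 0.0, 1000.0)
print(round(tlat, 5), round(tlon, 5), talt)  # → 0.00898 0.0 100.0
```

The returned coordinates are what would be tagged onto the POI record and shared over the tactical network.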
In embodiments, communications together with geographic location may be mesh-networked or otherwise networked by users. In addition, each user may be provided with a pop-up window or other position map of all users or of nearby users. This can give a user knowledge of where friendly forces are located. As described above, enemy positions may be found. Enemy positions may be tracked and provided in a pop-up window or other enemy position map, giving the user knowledge of where enemies are located relative to friendly forces. Friendly and enemy positions may be shared in real time. Users may be provided with maps depicting such positions. Such maps of friendly forces, enemy positions and/or counts, and combinations thereof may be displayed in the user's eyepiece or other device for viewing.
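The shared position map described above amounts to merging position reports from many units into one view, with the newest fix per unit winning. A minimal sketch, with an invented report format for illustration:

```python
def update_position_map(pos_map, report):
    """Merge one position report into the shared map; a newer timestamp
    for the same unit ID replaces the older fix."""
    uid = report["id"]
    if uid not in pos_map or report["t"] > pos_map[uid]["t"]:
        pos_map[uid] = report
    return pos_map

reports = [
    {"id": "friend-1", "side": "friendly", "lat": 34.50, "lon": 69.20, "t": 10},
    {"id": "poi-7",    "side": "enemy",    "lat": 34.51, "lon": 69.21, "t": 11},
    {"id": "friend-1", "side": "friendly", "lat": 34.52, "lon": 69.22, "t": 12},
]
pos_map = {}
for r in reports:
    update_position_map(pos_map, r)
print(len(pos_map), pos_map["friend-1"]["lat"])  # → 2 34.52
```

A display layer would then draw friendly and enemy entries from `pos_map` as distinct overlay symbols.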
In embodiments, devices, methods, and applications may provide hands-free, wireless, visually and/or audio-enhanced maintenance and repair instructions. Such applications may include RFID sensing for part location and related equipment. In various examples, a user may use the device to perform augmented-reality-guided field repairs. Such field repairs may be guided by hands-free, wireless maintenance and repair instructions. Eyepieces, projectors, monoculars, and the like, and/or the other devices described herein, may display images of the maintenance and repair procedure. In embodiments, such images may be still and/or video, animated, 3-D (three-dimensional), 2-D (two-dimensional), and the like. In addition, a user may be provided with voice and/or audio annotation of such procedures. In embodiments, this application may be used in high-threat environments where working undetected is a security consideration. Augmented reality images and video may be projected onto, or otherwise overlaid on, the actual object the user is working on, or on the user's field of view of the object, to provide video, diagrams, text, or other instructions for the procedure to be performed. In various embodiments, libraries of programs for various procedures may be downloaded and accessed, wired or wirelessly, from a worn computer or from remote devices, databases, and/or servers. Such programs may be used for actual maintenance or for training purposes.
In embodiments, the devices, methods, and descriptions found herein may be used in an inventory tracking system. In various embodiments, such a tracking system allows scanning from distances of up to 100 meters and handles more than 1,000 simultaneous links at a 2 Mb/s data rate. When viewing and/or near inventory, the system may provide annotated audio and/or visual information related to inventory tracking. In embodiments, the equipment may include the eyepieces described herein, monoculars, binoculars, and/or other devices, and inventory tracking may use SWIR, color SWIR, and/or night vision technology, worn wired or wireless computers, wireless UWB security tags, RFID tags, helmet/hard-hat readers and displays, and the like. In embodiments, and only as an example, a user may receive visual and/or audio information related to inventory, such as which articles are to be destroyed or transferred, the quantity of articles to be destroyed or transferred, where articles are to be transferred or discarded, and the like. In addition, such information may highlight or otherwise provide visible marking and indication of the articles in question. Such information may be displayed on the user's eyepiece, projected onto the articles, shown on digital or other displays or monitors, and the like. The articles in question may be tagged by UWB and/or RFID tags, and/or augmented reality programs may be used to provide visualization and/or indication to the user, so that the various devices described herein can provide the information needed for inventory tracking and management.
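The annotation step above — pairing each scanned tag with its disposition (destroy, transfer, discard) for display to the wearer — can be sketched as a simple lookup. The tag IDs and disposition table are invented for illustration:

```python
# Disposition list: tag ID -> instruction the wearer should see or hear.
DISPOSITIONS = {
    "tag-001": "destroy",
    "tag-002": "transfer to depot B",
}

def annotate_scan(scanned_tags):
    """Pair each scanned RFID/UWB tag with its disposition; tags with no
    entry are flagged for manual review."""
    return {tag: DISPOSITIONS.get(tag, "review: no disposition on file")
            for tag in scanned_tags}

notes = annotate_scan(["tag-001", "tag-999"])
print(notes)
```

Each resulting annotation would be rendered on the eyepiece next to, or projected onto, the corresponding article.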
In various embodiments, when fighting fires, the SWIR, color SWIR, monocular, night vision, worn wireless computer, eyepiece, and/or other devices described herein may be used. In embodiments, a user may have improved visibility through smoke, and the user's device may display the position of each individual on an overlay map or other map, so that the user can know the positions of firefighters and/or others. Devices may provide a real-time display of the positions of all firefighters, and provide hot-spot detection of areas above and below 200 degrees Celsius without triggering false alarms. Facility maps may also be provided by the device, displayed on the device, projected from the device, and/or overlaid on the user's line of sight by augmented reality or other means, to help guide the user through a structure and/or environment.
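Hot-spot detection of the kind described — flagging regions of a thermal frame against the 200 °C figure mentioned above — can be sketched as a threshold scan over a temperature grid. The frame layout is an illustrative assumption:

```python
def find_hotspots(thermal_frame, threshold_c=200.0):
    """Return the (row, col) cells of a thermal frame at or above threshold."""
    return [(r, c)
            for r, row in enumerate(thermal_frame)
            for c, temp in enumerate(row)
            if temp >= threshold_c]

frame = [
    [21.0,  35.5, 19.8],
    [22.3, 240.0, 25.1],  # one cell well above 200 degrees C
]
print(find_hotspots(frame))  # → [(1, 1)]
```

A real system would also debounce over several frames to avoid the false alarms the text warns about.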
The systems and devices described herein can be configured with any software and/or algorithms to meet mission-specific needs and/or system upgrades.
With reference to Figure 73, the eyepiece 100 may dock with a "biometric flashlight" 7300, which, for example, includes biometric data acquisition sensors for recording an individual's biometric signature, and has the form factor and function of a typical handheld flashlight. The biometric flashlight may dock with the eyepiece directly, such as through a wireless connection directly from the biometric flashlight to the eyepiece 100, or, as shown in the embodiment depicted in Figure 73, through an intermediary transceiver 7302 that docks wirelessly with the biometric flashlight and connects to the eyepiece through a wired or wireless interface (e.g., where the transceiver device is wearable, such as on a belt). Although other mobile biometric devices are depicted in the figures without a transceiver shown, those skilled in the art will appreciate that any of the mobile biometric devices may communicate indirectly with the eyepiece 100 through the transceiver 7300, communicate directly with the eyepiece 100, or operate independently. Data may be transmitted from the biometric flashlight to eyepiece memory, to memory in the transceiver device, to a removable memory card 7304 included as part of the biometric flashlight, and the like. As described herein, the biometric flashlight may include an integrated camera and display. In embodiments, the biometric flashlight is used as a standalone device without the eyepiece, where data is stored internally and information is provided on the display. In this way, civilian personnel can use the biometric flashlight more easily and safely. The biometric flashlight may have ranges for capturing certain types of biometric data, such as ranges of 1 meter, 3 meters, 10 meters, and the like. The camera may provide monochrome or color images. In embodiments, the biometric flashlight may provide covert biometric data collection as a flashlight-camera that can rapidly geo-locate, monitor, and collect environmental and biometric data for onboard or remote biometric identification matching. In an example usage scenario, a soldier may be assigned to a night sentry post. The soldier may use the biometric flashlight merely as a typical flashlight, but where an unwitting individual is illuminated by the device, the device also runs and/or acquires biometric features as part of a data collection and/or biometric identification process.
Referring now to Figure 76, a 360° imager uses digital foveation down to any given region of the pixel set to deliver a high-resolution image of the designated region. Embodiments of the 360° imager may feature an ultra-high-resolution foveal field of view and a simultaneous, independent 10x (10-times) optical zoom over a continuous 360° x 40° panoramic field of view (FOV). The 360° imager may include two 5-megapixel sensors with 30 fps (frames per second) imaging capability and an image acquisition time of less than 100 ms. The 360° imager may include a gyro-stabilized platform with independently stabilized image sensors. The 360° imager may have only one moving part and two image sensors, which allows reduced image-processing bandwidth in a compact optical system design. The 360° imager may also feature low angular resolution and high-speed video processing, and may be sensor-agnostic. The 360° imager may be used in one device, in a moving vehicle with a gyro-stabilized platform, mounted on a traffic light or telephone pole, or as surveillance equipment on a robot, on an aircraft, or in other positions that allow persistent monitoring. Multiple users may independently and simultaneously view the environment imaged by the 360° imager. For example, video captured by the 360° imager may be displayed in eyepieces to give all recipients of the data (such as all occupants of a combat vehicle) real-time 360° situational awareness. The panoramic 360° imager can recognize a person at 100 meters, and the foveal 10x (10-times) zoom can be used to read a license plate at 500 meters. The 360° imager allows persistent recording of the environment and features an independently controllable foveal imager.
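Steering the foveal imager within the 360° x 40° panorama reduces to mapping a target direction (azimuth, elevation) onto panorama pixel coordinates. The following sketch assumes an illustrative panorama size and a simple linear mapping; the actual optics described here (MEMS mirror pointing) would add calibration on top of this:

```python
def panorama_pixel(az_deg, el_deg, width, height, v_fov=40.0):
    """Map a target direction (azimuth 0-360 deg, elevation within
    +/- v_fov/2) to the pixel the foveal imager is steered toward in a
    360 deg x v_fov panoramic frame. Top row corresponds to +v_fov/2."""
    if abs(el_deg) > v_fov / 2:
        raise ValueError("elevation outside panoramic field of view")
    x = int((az_deg % 360.0) / 360.0 * width)
    y = int((v_fov / 2 - el_deg) / v_fov * height)
    return min(x, width - 1), min(y, height - 1)

# Illustrative 2592x288-pixel panorama: due east, level horizon.
print(panorama_pixel(90.0, 0.0, 2592, 288))  # → (648, 144)
```

The returned pixel would be handed to the pointing control as the center of the foveal region.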
Figure 76A depicts the assembled 360° imager and Figure 76B depicts a cross-sectional view of the 360° imager. The 360° imager includes a capture mirror 7602, objective lens 7604, beam splitter 7608, lenses 7610 and 7612, MEMS mirror 7614, full-field-of-view sensor 7618, panoramic image lens 7620, folding mirror 7622, foveal sensor 7624, and foveal image lens 7628. Images collected with the 360° imager may be geo-located and time- and date-stamped. Other sensors may be included in the 360° imager, such as thermal imaging sensors, NIR sensors, SWIR sensors, and the like. The MEMS mirror 7614 is the only reflective prism in the capture system using a single-viewpoint hemispherical surface, allowing high and uniform resolution. The imager design enables a scanning accuracy of < 0.1°, foveal distortion of < 1%, 50% MTF at 400 lp/mm, and foveal acquisition in < 30 milliseconds.
The 360° imager may be part of a network, wireless or physically wired back to a TOC or database. For example, a user may use a display with a 360° imager driver to view images from the 360° imager wirelessly, or use a wired connection (such as a military-grade cable) to view images from the 360° imager. The display may be a combat radio device or a mesh-networked computer networked with headquarters. Data from a database (such as the Department of Defense authoritative database) may be accessed by a combat radio or mesh-networked computer, such as by using a removable memory card or through a network connection.
Referring now to Figure 77, a coincident multi-field-of-view camera may be used for imaging. The feed from the coincident multi-field-of-view camera may be sent to the eyepiece 100 or any other suitable display device. In one embodiment, the coincident multi-field-of-view camera may be a fully connected, 3- or 4-coincident-field SWIR/LWIR imaging and target designation system that simultaneously allows wide, medium, and narrow field-of-view surveillance, where each sensor has VGA or SXVGA resolution for day or night operation. The lightweight, gimbal-mounted sensor array may be inertially stabilized and geographically referenced, enabling highly accurate sensor positioning and target designation under all conditions using its NVG-compatible laser pointer capability. Its unique multiple simultaneous fields of view enable wide-area surveillance in the visible, near-infrared, short-wave infrared, and long-wave infrared regions. When coupled with output from a digital compass, inclinometer, and GPS receiver, it also allows the high-resolution narrow field of view to be used for more accurate point-to-grid-coordinate target identification and designation.
In one embodiment of the coincident multi-field-of-view camera, there may be separate, steerable, coincident fields of view, such as 30°, 10°, and 1°, with automatic POI or multiple-POI tracking, face and iris recognition, onboard matching, and wireless communication by 256-bit AES-encrypted UWB with laptop computers, combat radios, or other networked or mesh-networked devices. The camera may be networked to CPs, TOCs, and biometric databases, and may include 3-axis, gyro-stabilized, high-dynamic-range, high-resolution sensors that deliver capability under conditions ranging from blinding sunlight to extremely low light. Identifications (IDs) may be immediately stored and analyzed in local or remote storage. The camera may feature accurate "find and fix" geo-location of POIs and threats to distances of > 1,000 meters, an integrated 1550-nanometer eye-safe laser rangefinder, networked GPS, a 3-axis gyroscope, a 3-axis magnetometer, an accelerometer and inclinometer, electronic image enhancement and electronic stabilization to assist tracking, recording of full-motion (30 frames per second) color video, ABIS, EBTS, EFTS, and JPEG 2000 compatibility, and MIL-STD-810 compliance for operation in extreme environments. The camera may be mounted via a gimbaled ball system, which integrates to isolate the motion of the biometric capture scheme and enable non-cooperative biometric collection and identification with laser ranging and POI geo-location, such as at blocking positions, checkpoints, and facilities. Multi-modal biometric recognition includes collecting and identifying face and iris, and recording video, gait, and other identifying marks or movements. The camera may include the ability to geo-locate all POIs and collected data, tagged with time, date, and position. The camera promotes rapid dissemination of situational awareness to network-enabled squad CPs and TOCs.
In another embodiment of the coincident multi-field-of-view camera, the camera features 3 separate color VGA SWIR electro-optical modules providing coincident 20°, 7.5°, and 2.5° fields of view, and 1 LWIR thermo-optical module, in an ultra-compact configuration for wide-area imaging and precision pointing at POIs and targets. The 3-axis, gyro-stabilized, high-dynamic-range, color VGA SWIR cameras deliver the ability to see under conditions ranging from blinding sunlight to extremely low light, and through fog, smoke, and haze, without "blooming". Geo-location may be obtained by integrating a 3-axis micro-electro-mechanical-system (MEMS) gyroscope and an enhanced GPS receiver with magnetometer data and a 3-axis accelerometer. An integrated 1840-nanometer eye-safe laser rangefinder and target designator, GPS receiver, and IMU provide "find and fix" accurate geo-location of POIs and threats to distances of 3 kilometers. The camera displays and stores full-motion (30 frames per second) color video in its "camcorder on a chip", and stores it on a removable solid-state drive for remote access during flight or for post-operation review. Electronic image enhancement and electronic stabilization assist POI tracking, geo-location, and target ranging and designation. The eyepiece 100 thereby delivers an unobstructed "vision" of threats by displaying the feed from the coincident multi-field-of-view camera. In some embodiments of the eyepiece 100, the eyepiece 100 may also use a "see-through" display of sensor imagery, moving maps, and data, with a flip-up/flip-down electro-optical display mechanism providing an unobstructed view of the soldier's own weapon. In one embodiment, the flip-up/flip-down electro-optical display mechanism can snap into the NVG mount of any standard MICH or PRO-TECH helmet.
Figure 77 depicts one embodiment of the coincident multi-field-of-view camera, including a laser rangefinder and designator 7702, total internal reflection mirror 7704, mounting ring 7708, total internal reflection mirror 7710, total internal reflection mirror 7714, anti-reflection honeycomb ring 7718, 1280x1024 SWIR 380-1600 nanometer sensor 7720, anti-reflection honeycomb ring 7722, 1280x1024 SWIR 380-1600 nanometer sensor 7724, anti-reflection honeycomb ring 7728, and 1280x1024 SWIR 380-1600 nanometer sensor 7730. Other embodiments may include additional TIR lenses, FLIR sensors, and the like.
With reference to Figure 78, a fly's eye imager ("flight eye") is depicted. The feed from the fly's eye may be sent to the eyepiece 100 or any other suitable display device. The fly's eye may include multiple individual SWIR sensors mounted in a folded imager array with multiple FOVs. The fly's eye is a low-profile surveillance and target designation system that can use each sensor at VGA to SXGA resolution, day or night, to achieve continuous imaging of an entire battlefield in a single low-altitude pass through fog, smoke, and haze. Its modular design allows selective, fixed-point resolution changes for any element from 1° to 30°, for telephoto to wide-angle imaging of any region of the array. The resolution of each SWIR imager is 1280x1024, sensitive at 380-1600 nanometers. A multi-DSP array board "stitches" all images together and automatically removes overlapping elements for a seamless image. A coincident 1064-nanometer laser designator and rangefinder 7802 may be mounted coincidently with any imager without blocking its FOV.
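The stitching step just described — joining adjacent sensor images and discarding the columns they share — can be illustrated with a toy sketch over small integer "images" (a real multi-DSP stitcher would register the overlap rather than assume a fixed width):

```python
def stitch_strips(strips, overlap):
    """Concatenate sensor strips left to right, dropping the leading
    `overlap` columns of every strip after the first (the region it
    shares with its left-hand neighbour)."""
    rows = len(strips[0])
    out = [list(strips[0][r]) for r in range(rows)]
    for strip in strips[1:]:
        for r in range(rows):
            out[r].extend(strip[r][overlap:])
    return out

# Two 2x4 strips whose last/first 2 columns overlap:
a = [[1, 2, 3, 4], [5, 6, 7, 8]]
b = [[3, 4, 9, 10], [7, 8, 11, 12]]
print(stitch_strips([a, b], overlap=2))
# → [[1, 2, 3, 4, 9, 10], [5, 6, 7, 8, 11, 12]]
```

The same pattern extends to any number of strips in the folded array.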
With reference to Figure 106, the eyepiece 100 may operate in conjunction with internal software applications 7214 of the eyepiece (developed in association with an eyepiece application development environment 10604), where the eyepiece 100 may include a projection facility suitable for projecting images onto a see-through or translucent lens, so that the wearer of the eyepiece can view the surrounding environment as well as the displayed images provided by the internal software application 7214. A processor, which may include memory and an operating system (OS) 10624, may host the internal software application 7214, control the interface between eyepiece command and control and the software application, control the projection facility, and the like.
In embodiments, the eyepiece 100 may include an operating system 10624 running on a multimedia computing facility 7212 that hosts internal software applications 7214, where the internal applications 7214 may be software applications developed by third parties 7242 and made available for download to the eyepiece 100, such as from an application store 10602, a 3D AR eyepiece application store 10610, a networked third-party application server 10612, and the like. Internal applications 7214, such as in combination with the API 10608, may interact with processes of the eyepiece control 10634 processing facility through input devices 7204, external devices 7240, external computing 10630 facilities 7232, eyepiece command and control, and the like. Internal applications 7214 may use network communication connections with the eyepiece 100, such as the Internet 10622 (local area networks, mesh networks with other eyepieces and mobile devices, satellite communications links, cellular networks, etc.). Internal applications 7214 may be purchased through application stores (the application store 10602, the 3D AR eyepiece application store 10610, etc.). Internal applications 7214 may be provided by the 3D AR eyepiece store 10610, such as internal software applications 7214 developed specifically for the eyepiece 100.
The eyepiece application development environment 10604 may be used by software developers to create new eyepiece applications (e.g., 3D applications), modify base applications, create new 3D application versions of base applications, and the like. The eyepiece application development environment 10604 may include a 3D application environment adapted so that an application, once completed and loaded onto the eyepiece or otherwise made accessible to the eyepiece, provides the developer access to the control schemes, UI parameters, and other specifications available on the eyepiece. The eyepiece may include an API 10608 designed to facilitate communication between completed applications and the eyepiece computing system. An application developer working in the development environment can then focus on developing applications with specific functionality, without concern for the details of how they interact with the eyepiece hardware. The API may also enable developers to more directly modify existing applications to create 3D applications for use on the eyepiece 100. In embodiments, internal applications 7214 may utilize the networked server 10612 for client-server configurations, mixed client-server configurations (e.g., where the internal application 7214 runs partly locally on the eyepiece 100 and partly on the application server 10612), fully server-hosted applications, server downloads, and the like. Network data storage 10614 may be provided in association with the internal applications 7214, such as further in association with the application server 10612, purchased applications, and the like. In embodiments, internal applications 7214 may interact with the sponsor facility 10614, the marketplace 10620, and the like, such as to provide sponsored advertisements, to provide marketplace content to the user of the eyepiece 100 in combination with the execution of the internal application 7214, and the like.
In embodiments, software and/or applications may be developed for use with the eyepiece or to complement it. Applications for the eyepiece may be developed via an open-source platform, a closed-source platform, and/or a software development kit. Software development kits for developing eyepiece applications, and the software developed with them, may be open source or closed source. Applications may be developed to be compatible with Android, Apple, or other platforms, and the like. Applications may be sold through, or downloaded from, an application store associated with the eyepiece, an independent application store, and the like.
For example, an integrated processor of the eyepiece may run at least one software application and process content for display to the user, and the content may be introduced into the optical assembly of the eyepiece by an integrated image source. The software application may provide interactive 3D content to the user through interaction with at least one of the control and sensing facilities of the eyepiece.
In embodiments, the eyepiece may be used for a wide variety of applications. The eyepiece may be used for consumer applications. By way of example and not as an exhaustive list, the eyepiece may be used for or with the following: travel applications, educational applications, video applications, exercise applications, personal assistant applications, augmented reality applications, search applications, local search applications, navigation applications, movie applications, facial recognition applications, place recognition applications, character and marker recognition applications, text applications, instant messaging applications, email applications, to-do list applications, social networking applications, and the like. Social networking applications may include applications such as Facebook, Google+, and the like. In embodiments, the eyepiece may be used for enterprise applications. By way of example and not as an exhaustive list, the eyepiece may be used for or with the following: billing applications, customer relationship management applications, business intelligence applications, human resource management applications, sales force automation applications, office productivity applications, Microsoft Office, and the like. In embodiments, the eyepiece may be used for industrial applications. By way of example and not as an exhaustive list, the eyepiece may be used for or with the following: advanced product quality planning software applications, production part approval software applications, statistical process control applications, professional training applications, and the like.
With reference to Figure 107, the eyepiece application development environment 10604 may be used for the development of applications that may be submitted to the application store 10602, the 3D AR eyepiece application store 10610, and the like. The eyepiece application development environment 10604 may include a user interface 10702, access to control schemes 10704, and the like. For example, a developer may access control schemes 10704 for selection through menus and dialog boxes in the user interface, so that the application developer may select among schemes. The developer may select a template scheme for the general operation of the application, but there may also be individual controls for various functions that can be selected to override the template functionality at certain points in the application's execution. The developer may also use the user interface 10702 and the control scheme development to control the field of view (FOV) of the application, such as through a FOV interface. The FOV interface may provide a way to mediate between displaying in the FOV of both displays (one for each eye) and of a single display. In embodiments, a 3D application for the eyepiece may be designed in a single display view, because the API 10608 will provide the interpretation that determines which display is used for which content, although the developer may be able to select a specific eye's display for specific content. In embodiments, the developer may manually select and/or review what will be displayed to each eye, such as through the user interface 10702.
The eyepiece may have a software stack 10800, as depicted in Figure 108. The software stack 10800 may have a head-worn hardware and software platform layer 10818, an interface-to-platform API wrapper 10814, a libraries layer 10812 for development, an application layer 10801, and the like. The application layer 10801 may in turn include consumer applications 10802, enterprise applications 10804, industrial applications 10808, and other similar applications 10810. In addition, hardware 10820 associated with the execution or development of internal applications 7214 may also be incorporated into the software stack 10800.
In embodiments, the user experience may be optimized by ensuring that the augmenting images are in focus alignment with the ambient environment, and by setting the display to a brightness appropriate for the given ambient lighting and the content being displayed.
In one embodiment, the eyepiece optical assembly may include an electro-optic module, also referred to as a display, that delivers content to each eye in a stereoscopic manner. In some cases a stereoscopic view is not desirable. In embodiments, for particular content, only one display may be turned on, or only one electro-optic module may be included in the optical assembly. In other embodiments, the brightness of each display may be varied so that the brain ignores the darker display. An automatic brightness control of the image source may control the brightness of the displayed content according to the brightness of the environment. The rate at which the brightness changes may depend on the change in the environment. The rate of brightness change may be matched to the adjustment of the eyes. The displayed content may be switched off for a period of time after a sudden change in ambient brightness. The displayed content may be dimmed as the environment dims, and may brighten as the environment brightens.
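The rate-matched automatic brightness control described above can be sketched as a slew-rate-limited tracking step, so that the display follows the ambient-derived target no faster than the eyes themselves adjust. This is a minimal illustration; the function name, parameter names, and default rate are assumptions, not values from this disclosure.

```python
def step_display_brightness(current, ambient_target, dt_s, max_rate=0.05):
    """One control step of an auto-brightness loop: move the display
    brightness toward the ambient-derived target, but no faster than
    max_rate brightness units per second, so the displayed change tracks
    the eyes' own adjustment instead of jumping with the ambient light."""
    delta = ambient_target - current
    max_step = max_rate * dt_s
    if abs(delta) <= max_step:
        return ambient_target
    return current + max_step if delta > 0 else current - max_step
```

A controller would call this once per frame with the elapsed time `dt_s`; a sudden ambient drop then produces a gradual ramp rather than an abrupt change.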
When entering a dark environment from a bright one, the human eye needs a period of time to adapt to the dark. During this period, the eyes have only limited visibility of the dark surroundings. In some cases, such as in security or law enforcement situations, it is important to be able to move from a bright environment to a dark one and quickly determine what activities or objects are present in the dark environment. However, full adaptation of a person's eyes to a dark environment can take up to 20 minutes. During this time, the person's vision of the environment is compromised, which can lead to dangerous situations.
In some cases, a strong light such as a flashlight is used to illuminate the dark environment. In other cases, it is possible to cover the person's eyes for a period of time before entering the dark environment, to allow the eyes to partially adapt to the dark before entry. However, in situations where a strong light cannot be used in the dark environment and covering the person's eyes before entry is impractical, it is desirable to provide a method of assisted viewing that reduces the time during which the person's vision is impaired in the transition from bright to dark.
Night vision goggles and binoculars are known for providing images of dark surroundings. However, these devices provide an image of constant brightness and thus do not allow the user's eyes to adapt to the dark, so the devices must be used continuously in the dark environment. As a result, these devices do not take advantage of the fact that people can see quite well in a dark environment after their eyes have fully adapted to the dark.
United States Patent 8094118 provides a method of adjusting the brightness of a display in accordance with the brightness of the surrounding environment in order to save power. That method is directed to the perceived brightness of the display, and does not relate to the adjustment of the user's eyes in the transition from a bright environment to a dark one. In addition, that method does not assist the user in viewing the environment.
Therefore, there is a need for a method of assisting a person moving from a bright environment to a dark environment during the period of time in which the person's eyes are adapting to the dark.
Head-mounted display devices with see-through capability provide a clear view of the scene in front of the user while also providing the ability to display images, wherein the user sees a combined image comprised of the see-through view and an overlaid displayed image. The present disclosure provides a method of providing an assisted view of the environment while the user transitions from a bright environment to a dark one. The method uses a camera on the head-mounted display device to rapidly adjust the capture conditions so that images of the dark surroundings can be captured and displayed to the user. The brightness of the displayed images is progressively reduced so that the user's eyes adapt to the dark environment.
Figure 154 (with data from Hecht and Mandelbaum, obtained from "The Eye," Vol. 2, Davson, H., ed., Academic Press, London, 1962, Chapter 5, "Dark Adaptation and Night Vision," by Pirenne, M.H.) shows a chart of typical dark adaptation curves for the human eye, where the shaded region represents 80% of the study's subject population. In this chart, the curves show the minimum illuminance that can be perceived at a particular time, where time 0 corresponds to the end of exposure to a brightly lit environment followed immediately by entry into dark surroundings, and where the minimum perceivable illuminance is determined by showing the person spots of light of different illuminance in a region and having the person report which spots can be seen after different times in the dark. As can be seen from the curves, the human eye adjusts over time so that spots of progressively lower illuminance can be seen over a period of approximately 25 minutes. As annotated in the chart of Figure 154, there are actually two mechanisms that contribute to the dark adaptation of the human eye. The cones in the eye (associated with photopic vision) adjust quickly under brighter conditions, compared to the relatively slowly adjusting rods (associated with scotopic vision). As a result, the time needed to adapt when moving from brighter conditions to darker conditions can take a considerable period, depending on how dark the environment is. During the dark adaptation period, the person can be nearly blind.
Table 2 provides typical values for common lighting conditions in two units: lux and lamberts. The range of illuminance for outdoor lighting conditions spans 9 orders of magnitude, from bright sunlight to an overcast, moonless night. Values for indoor lighting conditions are also provided for comparison.
Table 2 shows typical illumination levels, from the website http://www.engineeringtoolbox.com/light-level-rooms-d_708.html:
| Typical illumination level | Lux | Lamberts |
|---|---|---|
| Sunlight | 107,527 | 10.7527 |
| Full daylight | 10,752 | 1.0752 |
| Overcast day | 1,075 | 0.1075 |
| Very dark day | 107 | 0.0107 |
| Twilight | 10.8 | 0.00108 |
| Deep twilight | 1.08 | 0.000108 |
| Full moon | 0.108 | 0.0000108 |
| Crescent moon | 0.0108 | 0.00000108 |
| Starlight | 0.0011 | 0.00000011 |
| Overcast night | 0.0001 | 0.00000001 |
| Supermarket | 750 | 0.075 |
| Typical office | 500 | 0.05 |
| Classroom | 250 | 0.025 |
| Warehouse | 150 | 0.015 |
| Dark public areas | 35 | 0.0035 |
Table 3 provides the perceived brightness values (in units of brils) when the lighting condition to which the eyes are fully adapted changes to a darker condition. The changes in lighting condition shown correspond to the illumination values, in lamberts, provided in Table 2. An interpretation of the bril is given in the example at the bottom of Table 3: a perceived brightness of 1 bril corresponds approximately to the brightness provided by a crescent moon on a clear night, or 0.000001 lambert. The equation relating perceived brightness in the human visual system to a change in lighting conditions is provided in United States Patent 8094118, and is reproduced below as Equation 3 for reference.
B = λ (L/La)^σ    (Equation 3)

where

σ = 0.4 log10(La) + 2.92

λ = 10^2.0208 × La^0.336

and where La is the luminance (in lamberts) to which the eye is adapted, and L is the luminance of the new lighting condition.
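As a worked example, Equation 3 can be evaluated directly. The following is a minimal Python sketch (the function name and the usage comment are our own; La is the adapted luminance and L the new luminance, both in lamberts):

```python
import math

def perceived_brightness_brils(L, La):
    """Perceived brightness B in brils (Equation 3, per US Patent 8094118)
    for a scene of luminance L (lamberts) viewed by an eye fully adapted
    to luminance La (lamberts)."""
    sigma = 0.4 * math.log10(La) + 2.92
    lam = 10 ** 2.0208 * La ** 0.336
    return lam * (L / La) ** sigma

# Exemplary Scenario 1 below: a daylight-adapted eye (1.0 lambert)
# entering a 0.0035 lambert room perceives about 0.000007 bril.
officer = perceived_brightness_brils(0.0035, 1.0)  # ≈ 7e-6 bril
```

This reproduces the 0.000007 bril figure cited in Exemplary Scenario 1, and gives approximately 1 bril at the crescent-moonlight reference level (L = La = 0.000001 lambert).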
As Table 3 shows in its other examples, many situations encountered in real life involve a change of lighting conditions that produces a perceived dark condition. Table 3 shows various changes in lighting conditions and the brightness perceived when each change first occurs. In many of these examples, the brightness perceived when first moving from the bright condition to the darker condition is substantially less than the brightness perceived, after full dark adaptation, under crescent moonlight. Moving from daylight into a warehouse or a dark public area is especially problematic, as the eyes are essentially blind for a period of time until they adapt to the new lighting conditions. The invention described herein provides a method of assisting the human eye during the transition from a bright condition to a darker condition, while the eyes are adapting to the darker condition.
Table 3 shows the perceived luminance levels when changing from a bright environment to a dark one, computed using Equation 3 and the luminance values from Table 2:
Figure 155 provides measurement data on the speed of dark adaptation, from the paper by Spillmann, L., Nowlan, A.T., and Bernholz, C.D. on dark adaptation in the presence of declining background luminance (Journal of the Optical Society of America, Vol. 62, No. 2, February 1972). Figure 155 shows increment threshold measurements made against a log background luminance that was reduced linearly. The background was extinguished over 3.5 minutes, 7 minutes (Δ), 14 minutes (○), and 21 minutes (◇), spanning 7 log units, with no pre-exposure (■). The arrows indicate the times of background extinction. The usual dark adaptation thresholds, recorded in the absence of any background luminance (×), largely coincide with the steep background-slope curves and are omitted after becoming invariant.
The data in Figure 155 are based on measuring the minimum illuminance level (threshold) at which a bright spot can be detected by the human eye; this threshold progressively decreases as the eyes adapt to the dark (i.e., become more sensitive) while the lighting conditions change from 0.325 lambert (a partly cloudy day) to complete darkness. The different curves shown in the chart of Figure 155 are for lighting conditions in which the change from bright to dark is completed at different linear rates (rather than the immediate change shown in Figure 154). The curves toward the left side of the chart show that adaptation to the dark completes more quickly under conditions in which the change from bright to dark occurs more quickly. As supported by the data shown in Figure 154 and Figure 155, the typical period to adapt to the dark when moving directly from bright to complete darkness is about 15 minutes. The data in Figure 155 show that when the brightness is changed linearly over a period of 14 minutes, there is only a small loss in dark adaptation time: the adaptation time increases from 15 minutes for an immediate change to 19 minutes for a 14 minute linear ramp. The present invention therefore provides a method of displaying images of the dark environment with a brightness that is progressively reduced over time, so that the user is provided with a viewable image of the dark surroundings while the user's eyes are still allowed to adapt to the dark environment. The method uses a camera that adjusts quickly to the dark environment so that it can capture images of the dark surroundings. The captured images are provided to the user on the see-through head-mounted display, with the brightness of the images progressively reduced over time, so that the user's eyes can adapt to the dark and the user can progressively see the environment through the see-through capability of the head-mounted display.
Figure 156 is an illustration of a head-mounted display device 15600 with see-through capability. The head-mounted display device 15600 includes a see-through display 15602, one or more cameras 15604, and electronics 15608, where the electronics 15608 can include one or more of the following: a processor, a battery, a global positioning system (GPS) sensor, a direction sensor, data storage, a wireless communication system, and a user interface.
In one embodiment, the head-mounted display device 15600 with at least one camera 15604 or 15610 is used to provide an enhanced view of the dark surroundings on the see-through display 15602 while the user's eyes are adapting to the dark surroundings. The camera 15604 or 15610 can use an automatic exposure system to quickly and automatically adjust capture settings such as gain, ISO, resolution, or pixel binning. In some embodiments, the lens of the camera 15604 or 15610 can be changed to enable improved image capture in dark surroundings. The brightness of the images shown on the see-through display 15602 can be adjusted over time to match the adjustment of the eyes and any changes in photochromic materials that may be associated with the head-mounted display device 15600. In this way, rapidly changing photochromic materials are not needed; photochromic materials that transition from dark to clear with a time constant on the order of minutes are well suited to various embodiments of the invention. In any case, the field of view of the displayed image of the environment should match the field of view of the head-mounted display device 15600, to provide a display of the dark surroundings that is easy to interact with in an augmented reality mode.
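The automatic exposure adjustment described above can be sketched as a simple lookup from the measured ambient level to capture settings. The lux thresholds and setting values below are purely illustrative assumptions, not values from this disclosure; a real system would use the camera's own auto-exposure metering.

```python
def capture_settings(ambient_lux):
    """Choose illustrative camera capture settings for a measured ambient
    light level, trading resolution (via pixel binning) for sensitivity
    as the environment darkens."""
    if ambient_lux >= 100:    # daylight or bright interior
        return {"iso": 100, "gain_db": 0, "binning": 1}
    if ambient_lux >= 1:      # twilight to dim interior
        return {"iso": 1600, "gain_db": 12, "binning": 2}
    # moonlight and darker: maximum sensitivity, 4x4 binning
    return {"iso": 6400, "gain_db": 24, "binning": 4}
```

On a sudden bright-to-dark change, such a lookup lets capture settings be switched within a frame or two, far faster than the eyes themselves adapt.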
The present invention provides the head-mounted display device 15600 with one or more cameras 15604 or 15610, wherein captured images of the scene in front of the user can be displayed over time at a brightness within a controlled range. Compared to the user's eyes adapting to changes in ambient brightness, the camera 15604 or 15610 and its associated automatic exposure system adapt to changes in ambient brightness much more quickly, typically within 1 second. In one embodiment, the camera 15604 or 15610 captures images of the scene in front of the user, and when the brightness of the scene changes rapidly from bright to dark, the captured images of the scene are displayed to the user on the see-through display 15602. The brightness of the displayed images is reduced over time, so that immediately after moving into the dark surroundings the user is presented with a bright image of the scene, and the brightness is then reduced over time at a rate that allows the user's eyes to adapt to the darkness of the environment. Figure 157 shows a chart of the brightness of the displayed images provided to the user over time, where t1 is the time at which the brightness of the environment changes from bright to dark. Capture of images of the environment can begin at or before time t1. After t1, the brightness of the displayed images is reduced until time t2, at which point the user's eyes have adapted to the dark surroundings. After time t2, the brightness of the displayed images is held constant at a level at which the user can observe the environment in see-through mode. In other embodiments of the invention, the brightness of the displayed images after time t2 is 0, so that the user observes the dark surroundings only in see-through mode. In a further embodiment of the invention, the image content of the displayed images changes after t2 from the captured images of the environment in front of the user to other images, or to information such as augmented reality information (for example, instructions or directions). In another embodiment of the invention, if the environment is darker than a predetermined level, the brightness of the displayed images of the environment is reduced to a level that is then maintained after time t2, thereby providing a version of night vision, in which the night vision responds to rapid changes in ambient lighting and also provides longer-term night vision when conditions are too dark for the eyes to adapt to the task at hand. The dark level at which night vision imaging is provided after time t2 can be selected by the user in an operating-mode setting, wherein tasks that require more details of the environment to be discerned use a brighter displayed-image setting during the night vision mode.
In a preferred embodiment, the brightness of the displayed images of the environment is reduced at a rate corresponding to the rate at which the user's eyes adapt to the dark surroundings, such as a 14 minute transition from a bright image to a dark image, or to no image, corresponding to the curves shown in Figure 155. In this way, the user is temporarily provided with images of the environment while the user's eyes adapt to the dark, and the time to adapt to the dark surroundings is not significantly extended compared to the adaptation time in the case where no image is displayed.
In a further embodiment of the invention, when the user enters the dark surroundings, the lens of the camera 15604 or 15610 is changed to provide improved low-light image capture capability. In this case, the camera 15604 or 15610, or another light-sensitive detector in the electronics 15608, detects the change from the bright environment to the dark environment, wherein the brightness of the environment is detected by an automatic exposure sensor in the electronics 15608 or by detecting a reduction in the pixel code values from the image sensor in the camera 15604 or 15610. The lens of the camera 15604 or 15610 is then changed to improve the light gathering capability or to enable the camera 15604 or 15610 to capture infrared images. As an example, light gathering capability can be improved by changing to a lens with a lower f#. As an example, infrared image capture can be enabled in the camera 15604 or 15610 by removing an infrared cut filter in the lens assembly, by shifting the lens elements relative to one another to reset the focus, or by changing one or more of the lens elements to infrared lens elements. In another embodiment, the image sensor of the camera 15604 or 15610 is changed to enable infrared image capture.
Figure 158 shows a flowchart of a method of the invention. In step 15802, the user moves from a bright environment to a dark environment. In step 15804, the camera 15604 (or another light-sensitive detector in the electronics 15608) detects the change of lighting conditions in the environment to dark conditions. In step 15808, the capture settings used by the camera 15604 or 15610 are adjusted by the automatic exposure system to enable image capture, and in particular video image capture, in the dark environment. In step 15810, images of the environment are captured by the camera 15604 or 15610 and displayed on the see-through display 15602 at a first brightness level, where the first brightness level of the displayed images is similar to the brightness perceived in the see-through view of the environment immediately before the lighting conditions of the environment changed from bright to dark. Then, in step 15812, the brightness of the displayed images of the environment is reduced over time, so that the user's eyes can adapt to the dark while the user still views images of the environment. The reduction in brightness can be linear over the period, or nonlinear as shown in Figure 157. The period over which the image brightness is reduced can correspond to the change in lighting conditions in the environment. Depending on how dark the environment is, in step 15812 the brightness of the displayed images can be reduced to 0, or maintained at a predetermined level to provide a version of night vision.
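Steps 15810 and 15812 amount to ramping the displayed-image brightness from a first level down to a floor over the adaptation period. A minimal sketch follows, assuming the linear ramp option permitted in step 15812; the function and parameter names are our own:

```python
def display_brightness(t_s, b_first, b_floor, t_adapt_s):
    """Brightness of the displayed environment image t_s seconds after the
    bright-to-dark change (time t1): start at b_first (step 15810), ramp
    down linearly, and hold at b_floor once the eyes have adapted at
    t_adapt_s seconds (time t2). b_floor = 0 turns the image off; a
    nonzero b_floor provides the night-vision version."""
    if t_s >= t_adapt_s:
        return b_floor
    return b_first - (b_first - b_floor) * (t_s / t_adapt_s)
```

With `t_adapt_s` on the order of 14 to 25 minutes (per the data of Figures 154 and 155), the ramp tracks the eye's own dark adaptation.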
Exemplary Scenario 1
A police officer working in daylight (about 1.0 lambert) breaks down a door that leads into a somewhat dark room, similar in darkness (about 0.0035 lambert) to the dark public areas shown in the data of Table 2. When the door is opened, the police officer will perceive the dark room at a brightness of 0.000007 bril, or 10000X darker than the illumination provided by a crescent moon, as shown by the data in Table 3. In effect, he will not be able to see anything in the dark room. According to the curves in Figure 155, it will be about 1 minute before the police officer can see anything in the dark room (which is at 0.0035 lambert = 0.54 log millilambert). This is a dangerous situation, because the eyes of any people in the dark room will already have adapted to the dark, so they will be able to see the police officer. If the police officer is wearing a head-mounted display device with a camera and a see-through display as described herein, images of the dark room can be presented to the police officer during the approximately 1.5 minutes in which the police officer's eyes are adapting to the dark. After that time, the police officer can view the dark room through the see-through display. The see-through display can still be used to send instructions or other information to the police officer while he views the dark room through the see-through display (such as in an augmented reality presentation). The head-mounted display device of the invention thereby provides the police officer with essentially instantaneous vision in the dark room, limited only by the low-light capability of the camera.
As long as the field of view presented in the displayed images closely matches that portion of the police officer's field of view, and the video images are captured and displayed with only a limited lag, the police officer will easily be able to move about in the dark room using only the displayed images. As the police officer's eyes adapt to the dark room, the brightness of the displayed images is reduced over time.
The camera can be a fairly standard digital camera with good low-light performance, operated in a high-ISO and binning mode to provide video imaging down to partial-moonlight illumination levels. Short-wave infrared cameras, or cameras with visible plus near-infrared imaging capability (such as cameras with the infrared cut filter removed), can be used to provide imaging down to darker levels. As the data shown in Figures 154 and 155 indicate, under very dark conditions it may be necessary to provide images to the user for up to 25 minutes, at which point the user's eyes will have fully adapted to the dark.
Exemplary Scenario 2
A soldier inside a lighted house (illumination 0.025 lambert = 1.40 log millilambert) opens a door and walks out into a night with a full moon (illumination 0.00001 lambert = -2 log millilambert). As can be seen from the numbers in Table 3, when the soldier first steps into the night, the perceived darkness is complete darkness, at a brightness of 0.000001 bril (which is 1000000X darker than the perceived brightness of a night with a crescent moon when the eyes are fully adapted). The curves in Figure 155 show that for this change in illumination, it will be about 2 minutes before the soldier's eyes can see objects under the darker conditions. As in the previous example, this can be a dangerous situation, since the soldier is essentially blind for 2 minutes. The present invention provides a see-through head-mounted display that captures images of the environment and displays them to the soldier to eliminate the period of blindness. In this case, the brightness of the images can be reduced over a period of 3-4 minutes, so that the soldier's eyes adapt to the dark, and at the end of that time the soldier can operate with the head-mounted display in see-through mode or augmented reality mode.
Instantaneous visibility over the display field of view can thus be provided by means of the displayed images. As the user's eyes adapt to the dark conditions, a transition to see-through viewing is provided by gradually reducing the brightness of the displayed images.
This technique can also be used to compensate for photochromic lenses associated with the head-mounted display device, which cannot become clear very rapidly.
In alternative embodiments, the images presented to the user can be 2D, captured by a single camera 15610 (in which case identical images are presented to the user's two eyes), or 3D, captured by stereo cameras 15604 (in which case the images presented to the user's eyes provide different perspectives of the scene). As known to those skilled in the art, other methods of producing stereo images are also possible, such as using a lens with a split pupil, or using a lens with a microlens array to enable light field imaging.
The displayed images can also be shifted to a different color (such as red or green) to help the eyes adapt to the dark more quickly, as is commonly done in night vision binoculars.
In embodiments, the augmented reality (AR) eyepiece of the invention is adapted to determine and/or compensate for the vergence of the user's eyes. Vergence is the simultaneous rotation of the user's eyes about their vertical axes in opposite directions to obtain or maintain binocular vision. When a person looks at a nearer object, the person's eyes move their respective optical axes inward toward the nose, a compound motion referred to as convergence. To look at a more distant object, the person's eyes move their respective optical axes outward away from the nose, a compound motion referred to as divergence. When the person gazes at infinity, or very far away, the person's eyes diverge until their respective optical axes are essentially parallel to each other. Vergence operates together with accommodation of the eyes to allow a person to maintain a clear image of an object as the object moves relative to the person. Compensating for vergence becomes important in cases where a virtual image (i.e., an AR image, such as a label or other information) is to be placed adjacent to a real image or overlaid on a real image, or where a virtual image of an object is to be superimposed on the real image of the object, so that the placement of the virtual image is correct relative to the real image. The methods of the invention for vergence compensation and/or determination are described herein and are collectively referred to as the vergence methods.
A vergence method may include determining the distance of an object of interest from the AR eyepiece and then using that distance to determine the vergence angle, i.e., the angle formed at the intersection of the optical axes of the user's eyes when the user's eyes are looking at the object. The vergence angle is then used to determine the correct placement of an AR image relative to the object, which may be at a position in front of the object, behind it, or matched to it. For example, in a first group of vergence method embodiments, a single autofocus digital camera with an output signal is assembled at some convenient position in the AR eyepiece, for example, in the bridge region or near one of the temples. The output of the camera is provided to a microprocessor in the AR eyepiece and/or transmitted to a remote processor. In either case, the camera's signal related to its autofocus capability is used to determine the distance of the objects the user can see when looking straight ahead. That distance and the interpupillary distance of the user's eyes are used to determine the vergence and the correct placement of virtual images (for example, labels) relevant to those objects. The distance and/or vergence angle may also be used to determine the level of focus at which a virtual object can be correctly observed by the user. Optionally, additional information about the vergence characteristics of the specific user can be entered, stored in memory associated with the microprocessor, and used in the determination to adjust the vergence.
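The vergence (steering) angle described above follows from the measured object distance and the interpupillary distance by simple triangulation. A sketch follows; the 63 mm default IPD is an assumed typical adult value, not a figure from this disclosure:

```python
import math

def vergence_angle_deg(object_distance_m, ipd_m=0.063):
    """Angle, in degrees, formed where the two visual axes intersect on an
    object at object_distance_m meters, for interpupillary distance ipd_m
    meters. The angle approaches 0 (parallel axes) as distance grows."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / object_distance_m))
```

An object at 0.5 m yields roughly a 7.2 degree vergence angle, while at tens of meters the angle is a small fraction of a degree, consistent with the essentially parallel optical axes described for distant gaze.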
In a second group of vergence method embodiments, an electronic rangefinder independent of any camera is integrated into the AR eyepiece at a convenient position, for example in the nose bridge region or near one of the temples. In these embodiments, the output of the electronic rangefinder is used in the same manner as the output of the autofocus camera described in connection with the first group of vergence method embodiments.
In a third group of vergence method embodiments, the AR eyepiece includes multiple distance-measuring devices, which may be autofocus cameras and/or electronic rangefinders. The multiple devices may all be aligned in the same direction to determine the distance of an object as a function of direction, or one or more of the devices may be aligned differently from the others so that information about the distances to various objects is obtainable. The output from one or more of the devices is input and analyzed in the same manner as the output of the autofocus camera described in connection with the first group of vergence method embodiments.
In a fourth group of vergence method embodiments, one or more distance-measuring devices are used in the manner discussed above. Additionally, the AR eyepiece includes one or more eye-tracking devices configured to track the movement and/or viewing direction of one or both of the user's eyes. The output of the eye-tracking devices is provided to the microprocessor in the AR eyepiece or may be passed to a remote processor. The output is used to determine the direction in which the user is looking and, when eye-tracking information from both eyes is available, to determine the vergence of the user's eyes. The direction and vergence information (if applicable) is then used alone, or together with distance information determined from the distance-measuring devices, to determine the placement and (optionally) the degree of focus of one or more virtual images related to one or more objects the user is likely to be looking at.
In a fifth group of vergence methods, one or more distance-measuring devices are aimed in directions away from the region directly in front of the user of the AR eyepiece. The distance to an object detected by the rangefinder devices is used in the manner described above to display a virtual image of the object. While the user is looking straight ahead, he may or may not be aware of the virtual image, but when the user looks in the direction of the object associated with the virtual image, he will perceive it.
A calibration sequence can be used with any of the vergence method embodiments. The calibration sequence can use mechanical calibration steps, electronic calibration steps, or both. During the calibration sequence, the interpupillary distance of the user can be determined. Also, the user can be asked to look at a series of real or virtual objects over a range of real or apparent distances (for example, from near to far), with the vergence of the eyes measured mechanically, electronically, or both. Information from the calibration sequence can then be used in the determination of vergence, focus, and/or virtual image placement when the AR eyepiece is in use. The calibration sequence is preferably performed when the user first puts on the AR eyepiece, but it can also be performed at any time the user considers recalibration useful. The information related to a specific user obtained during the calibration sequence can be stored so that it is available whenever the user identifies himself to the AR eyepiece, for example using any of the techniques described herein.
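A minimal sketch of how per-user calibration data of this kind might be stored and reused is shown below. The class name, the sample values, and the use of linear interpolation between calibration targets are all illustrative assumptions, not part of the specification.

```python
import bisect

class VergenceCalibration:
    """Hypothetical per-user calibration record: measured vergence angles
    (degrees) for targets presented at known distances (meters)."""

    def __init__(self):
        self.samples = []  # sorted list of (distance_m, vergence_deg)

    def add_sample(self, distance_m, vergence_deg):
        self.samples.append((distance_m, vergence_deg))
        self.samples.sort()

    def vergence_for(self, distance_m):
        """Linearly interpolate the expected vergence at a query distance,
        clamping to the nearest measured sample outside the range."""
        dists = [d for d, _ in self.samples]
        i = bisect.bisect_left(dists, distance_m)
        if i == 0:
            return self.samples[0][1]
        if i == len(self.samples):
            return self.samples[-1][1]
        (d0, v0), (d1, v1) = self.samples[i - 1], self.samples[i]
        t = (distance_m - d0) / (d1 - d0)
        return v0 + t * (v1 - v0)
```

In use, the eyepiece would populate the record during the near-to-far target sequence and query it later, at display time, to adjust virtual image placement for the identified user.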
It should be noted that some distance-measuring devices use distance determination methods in which the information received from a sensor of the device is mapped onto a straight or non-straight grid in a spatial representation. Information from each section of the grid is used to determine range distances relative to the other sections. In vergence method embodiments, the raw sensor data, the mapped information, the computed distances, or any combination of these can be used for the placement and/or focusing of one or more virtual images.
It will be understood that the vergence method embodiments include virtual image placement for one of the user's eyes or for both of the user's eyes. In some embodiments, one virtual image is provided to the user's left eye and a different virtual image is provided to the user's right eye. This allows, for example, one or more virtual images to be provided to one eye while information is obtained from the other eye for calibration. In cases where multiple images are placed before the user, whether the images are the same or different, the placement can be simultaneous, at different times, or interleaved in time, for example with the images displayed at one or more predetermined flicker rates (for example, 30, 60, and/or 180 hertz), where the image for the left eye is presented while no image for the right eye is presented, and vice versa. In some embodiments, virtual images are displayed only to the person's dominant eye, and in other embodiments, virtual images are displayed only to the person's non-dominant eye. In some embodiments using images interleaved in time, virtual images for various objects positioned at different distances from the user are displayed in the manner described above; when the user looks from the real image of one object to the real image of another object, only the virtual image corresponding to the real image of the object being viewed will be perceived by the user's brain. For example, by using a focus mechanism operated at high speed (for example, 30 to 60 hertz), such as a piezoelectric actuator attached to the LCOS or a variable-focus lens inserted into the optical path, one or more of the same or different virtual images can be placed in more than one depth plane for one of the user's eyes or for both of the user's eyes.
In some embodiments, the focal distance of a virtual image can be adjusted to provide the user with the illusion of a virtual image at a desired distance. Such adjustment is particularly useful when images are being presented to both of the user's eyes and the relative lateral positions of the two images are being adjusted for vergence. The adjustment can be achieved, for example, by adjusting the length of the optical path over which the image is displayed, or by using one or more variable lenses; in some embodiments of the invention this can be realized, for example, by raising or lowering the LCOS panel.
In embodiments, the invention provides methods for conveying depth cues by means of augmented reality virtual objects or virtual information, whereby a wide range of perceived depths can be conveyed to individuals with widely differing eye characteristics. These depth cue method embodiments of the invention use differences between the lateral positions of the augmented reality images supplied to the individual's two eyes to provide differences in vergence that convey a sense of depth for the virtual objects or virtual information. One advantage of these methods is that the lateral shift of an augmented reality image can differ for different parts of the image, so that the perceived depth differs for those parts. In addition, the lateral shifts can be realized by image processing of the respective parts of the augmented reality image. By these methods, the user can experience the full range of perceived depths, from the closest distance at which the individual can focus out to infinity, regardless of the individual's age.
To better understand these depth cue method embodiments of the invention, it is useful to recall the following: in some aspects of augmented reality, a head-mounted display is used to add images of virtual objects or virtual information related to the scene view that the user sees. To add to the augmented reality effect, it is useful to place the virtual objects or virtual information at perceived depths within the scene. As an example, a virtual label (such as the name of a building) can be placed on an object in the scene. If the virtual label and the building are perceived to be at the same depth in the scene, the perceived association between the virtual label and the building is enhanced. Head-mounted displays with see-through capability are well suited to providing augmented reality information (such as labels and objects) because they provide the user with a clear view of the environment. However, for augmented reality information to be valuable, it must be readily associated with objects in the environment, and as a result the positioning of the augmented reality information relative to the objects in the see-through view is important. While the horizontal and vertical positioning of augmented reality information is relatively straightforward in head-mounted displays that have a camera that can be calibrated to the see-through view, depth positioning is more complicated. United States Patent 6,690,393 describes a method for positioning 2D labels in a 3D virtual world. However, this method does not address see-through displays, in which the major part of the image the user sees is not provided digitally and the 3D positions of objects are therefore unknown. United States Patent 7,907,166 describes a robotic surgery system using a stereo viewer, in which telestration illustrations are overlaid on a stereo image of the operative site. However, similar to the method described in United States Patent 6,690,393, that system uses captured images which are then operated on to add the illustrations, and it therefore does not address the specific situation of a see-through display in which the major part of the image is not provided digitally and the relative positions of the objects the user sees are unknown. Another prior art approach to augmented reality is to adjust the focus of the virtual objects or virtual information so that the user experiences differences in focus depth, which provide the user with a depth cue. As the user must refocus his/her eyes to look at an object seen in the scene and then at the virtual object or virtual information, the user senses the associated depth. However, the range of depths that can be associated with focus is limited by the accommodation that the user's eyes can achieve. Accommodation is limited in certain individuals, particularly older individuals, whose eyes have lost a major portion of their accommodation range. In addition, the accommodation range differs depending on whether the user is nearsighted or farsighted. These factors make results based on focus cues unreliable for a large population of users of different ages and with different eye characteristics. Consequently, there remains a need, beyond the prior art, for widely applicable methods of associating depth information with augmented reality.
Some of the depth cue method embodiments of the invention are described herein and in the following paragraphs with reference to Figures 109 through 121. A head-mounted display with see-through capability provides a clear view of the scene in front of the user and at the same time provides the ability to display images, so that the user sees a combined image composed of the see-through view and an overlaid displayed image. The methods use the see-through display to present 3D labels and other 3D information that help the user interpret the environment around the user. The 3D labels and other 3D information can be presented as stereo image pairs to the user's left and right eyes, positioning the 3D labels and other 3D information at different depths in the scene as perceived by the user. In this way, the 3D labels and other 3D information can be more easily associated with the see-through view and the surrounding environment.
Figure 109 is an illustration of a head-mounted display device 109100 with see-through capability, shown as a particular version of the augmented reality eyepiece 100 described herein with reference to Figure 1. The head-mounted display device 109100 may include see-through displays 109110, stereo cameras 109120, electronics 109130, and a rangefinder 109140. The electronics may include one or more of the following: a processor, a battery, a global positioning system (GPS) sensor, a direction sensor, data storage, a wireless communication system, and a user interface.
Figure 110 is an illustration of the scene in front of the user as the user sees it in the see-through view. Multiple objects at different depths in the scene are shown for use in the discussion. In Figure 111, several objects in the scene have been identified and labeled. However, the labels are presented in a two-dimensional (2D) manner: either the labels are presented to only one of the user's eyes, or labels at the same position in the image are presented to each eye, so that the labels coincide when viewed simultaneously. Labels of this type make it more difficult to associate a label with an object, particularly when there are foreground and background objects, because the labels are all perceived to lie at the same depth.
To make it easier to associate labels or other information with the desired objects or aspects of the environment, it is beneficial to present the labels or other information as three-dimensional (3D) labels or other 3D information, so that the information is perceived by the user at different depths. This can be accomplished by presenting overlay images to the user's two eyes in which the positions of the 3D labels or other 3D information overlaid on the see-through view are laterally shifted between the two overlay images, so that the overlaid content has perceived depth. To those skilled in the field of stereoscopic imaging, this lateral shift between images is known as disparity (or parallax), and it causes the user to change the relative orientation of his/her eyes to visually align the images, which produces the perception of depth. The images with disparity are the images of the 3D labels or other 3D information overlaid on the see-through view of the scene the user sees. By providing a 3D label with large disparity, so that the user must rotate the optical axes of his/her eyes slightly inward to bring the labels in the stereo images into alignment, the perception is provided that the label is located close to the user. A 3D label with small disparity (or no disparity) can be visually aligned with the user's eyes looking straight ahead, and this provides the perception that the 3D label is located far away.
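The lateral-shift mechanism described above can be illustrated with a small helper that places one copy of a label in each eye's overlay image, shifted in opposite directions by half the desired disparity. The function name and the sign convention (crossed disparity for near objects) are illustrative assumptions, not part of the specification.

```python
def label_positions(x_center_px, disparity_px):
    """Return the horizontal pixel position of a label in the left-eye
    and right-eye overlay images.  Larger disparity -> the user must
    converge more to fuse the label -> nearer perceived depth; zero
    disparity -> the label appears at a distance."""
    left_x = x_center_px + disparity_px / 2.0
    right_x = x_center_px - disparity_px / 2.0
    return left_x, right_x
```

Applying different disparities to different labels in the same overlay is what lets foreground and background labels be perceived at different depths simultaneously.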
Figures 112 and 113 show the stereo image pair of 3D labels to be applied to the see-through view shown in Figure 110. Figure 112 is the image of the 3D labels to be displayed to the user's left eye, and Figure 113 is the image of the 3D labels to be displayed to the user's right eye. Figures 112 and 113 together provide a stereo image pair. In this stereo pair, the lateral positions of the 3D labels differ between the images shown in Figures 112 and 113. Figure 114 provides an overlay of Figures 112 and 113. To increase clarity in Figure 114, the 3D labels from Figure 113 are shown in gray while the 3D labels from Figure 112 are shown in black. In the foreground of Figure 114, the 3D label from Figure 113 is located to the left of the 3D label from Figure 112 with a relatively large disparity. In the background of Figure 114, the 3D label from Figure 113 coincides with the 3D label from Figure 112, positioned on top of it with no disparity. In the middle region of the scene shown in Figure 114, the 3D labels from Figures 112 and 113 have an intermediate disparity. Presenting 3D labels to the left and right eyes with such relative disparities corresponds to the depths perceived by the user. By selecting, for the 3D labels, depths consistent with the depths of the objects in the scene with which the 3D labels are associated, the user can readily understand the connection between the 3D labels and the objects or other aspects of the environment seen in the see-through view. Figure 115 shows the see-through view of the scene with the 3D labels shown with their disparities. When viewing in real life, however, the user will change the orientation of his/her eyes to bring each left/right set of 3D labels into coincidence, and it is precisely this that provides the user with the perception of depth.
The calculation of disparity is well known to those skilled in the art. The equation relating disparity to distance is given by Equation 1:

Z = Tf/d

where Z is the distance from the stereo cameras to the object, T is the separation distance between the stereo cameras, f is the focal length of the camera lenses, and d is the disparity distance on the camera sensors between the images of the same object in the scene. Rearranging terms to solve for the disparity, the equation becomes Equation 2:

d = Tf/Z
As an example, for cameras with 7 millimeter focal length lenses, separated by 120 millimeters and used in conjunction with image sensors having a 2.2 micron center-to-center pixel spacing, the disparity, expressed as the number of pixels by which an object point is shifted in one display relative to the other, is given in Table 1 for some representative distances (given in meters).
Table 1
| Distance (meters) | Disparity (pixels) |
| --- | --- |
| 1 | 381.8 |
| 2 | 190.9 |
| 10 | 38.2 |
| 50 | 7.6 |
| 100 | 3.8 |
| 200 | 1.9 |
Note that in the prior art the disparity values of stereo images are sometimes described using numbers running from negative to positive, where zero disparity is defined for objects at a selected location relative to the observer, at which location the observer perceives the objects to be in the middle ground. Given this shift of the zero point, the equations listed above must be adapted accordingly. When disparity values are described in this way, the disparities of near objects and far objects can be identical in magnitude but opposite in sign.
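The values in Table 1 follow directly from Equation 2 once the disparity distance on the sensor is divided by the pixel pitch. The following sketch reproduces the table with the stated parameters (T = 120 mm, f = 7 mm, 2.2 µm pixels):

```python
def disparity_px(distance_m, baseline_m=0.120, focal_m=0.007, pixel_m=2.2e-6):
    """Equation 2 expressed in pixels: d = T*f/Z, divided by the pixel pitch."""
    return baseline_m * focal_m / distance_m / pixel_m

# Reproduce Table 1 for the representative distances.
for z in (1, 2, 10, 50, 100, 200):
    print(z, round(disparity_px(z), 1))
```

Running the loop yields 381.8, 190.9, 38.2, 7.6, 3.8, and 1.9 pixels, matching Table 1.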
Figure 116 shows an illustration of a stereo image pair captured by the stereo cameras 109120 on the head-mounted display device 109100. Because these images are captured from different perspectives, they will have disparities corresponding to the distances from the head-mounted display device 109100. In Figure 117, the two images from Figure 116 are overlaid to show the disparities between the images of the stereo pair. These disparities match the disparities seen in the 3D labels shown for the objects in Figures 114 and 115. As a result, the 3D labels will be perceived to be located at the same depths as the objects with which they are intended to be associated. Figure 118 shows an illustration of the 3D labels, as seen by the user, overlaid on the see-through view seen with the left and right eyes.
Figure 119 is a flow chart of a depth cue method embodiment of the invention. In step 119010, the electronics 109130 in the head-mounted display device 109100 determine the GPS location of the head-mounted display device 109100 using GPS. In optional step 119020, the electronics 109130 determine the direction of the field of view using an electronic compass. This makes it possible to determine the position and direction of the field of view, so that objects within the field of view and nearby objects can be located relative to the user's field of view by comparing the GPS location of the head-mounted display device 109100 with a database of the GPS locations of other objects held in the head-mounted display device 109100, or by connecting to other databases using a wireless connection. In step 119030, objects of interest are identified relative to the user's field of view, either by the electronics 109130 analyzing a database stored in the device 109100, or by communicating wirelessly with another device. In step 119040, the distance to an object of interest is determined by comparing the GPS location of the head-mounted display device 109100 with the GPS location of the object of interest. In step 119050, labels relating to the names of, or other information about, the objects of interest are then generated together with disparities corresponding to the distances to the objects of interest, so as to provide 3D labels at the distances perceived by the user. Figure 111 shows examples of labels including the names, distances, and descriptions of objects of interest in the user's field of view. In step 119060, the 3D labels for the objects of interest are displayed to the user's left and right eyes with the disparities that provide the 3D labels at the desired depths.
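Steps 119040 and 119050 can be sketched as follows. The haversine distance between GPS fixes, the dictionary layout of a point of interest, and the reuse of the example camera parameters from Table 1 to pick a display disparity are all illustrative assumptions, not part of the specification:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (degrees)."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def make_3d_label(display_fix, poi, baseline_m=0.120, focal_m=0.007,
                  pixel_m=2.2e-6):
    """Distance from GPS fixes (step 119040), then the disparity at which
    the point of interest's label should be rendered (step 119050)."""
    z = haversine_m(display_fix[0], display_fix[1], poi["lat"], poi["lon"])
    d_px = baseline_m * focal_m / z / pixel_m  # Equation 2 in pixels
    return {"text": poi["name"], "distance_m": z, "disparity_px": d_px}
```

Step 119060 would then render the label twice, shifted laterally by the computed disparity between the left-eye and right-eye images.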
Figure 120 is a flow chart of another depth cue method embodiment of the invention, in which steps similar to those in Figure 119 are numbered with the same reference numerals as used in Figure 119. In step 120140, the distance and direction to an object of interest relative to the user's field of view are determined, either by the electronics 109130 in the device or in conjunction with wirelessly connected other devices. In step 120160, the 3D labels are displayed to the user's left and right eyes with disparities that provide the 3D labels at the desired depths, and in addition the 3D labels are provided in the portion of the user's field of view corresponding to the direction to the object of interest. Figure 111 shows an example in which the label for a distant object of interest is provided toward the rear of the user's field of view and carries a direction toward that distant object of interest, shown in this example as the label "town 10 miles in this direction." This feature provides visual cues in the 3D information that make it easy for the user to navigate to the object of interest. It should be noted that the 3D labels can be provided over other objects in the see-through view.
Figure 121 is a flow chart of another depth cue method embodiment of the invention. In this embodiment, a distance measuring device 109140 (such as a rangefinder) is used to determine the distances to objects of interest in the scene. In step 121010, one or more images of the scene adjacent to the head-mounted display device 109100 are captured using the stereo cameras 109120. Alternatively, a single camera can be used to capture the one or more images of the scene. The one or more images of the scene can be images of different spectral types; for example, the images can be visible light images, ultraviolet light images, infrared light images, or hyperspectral images. The one or more images are analyzed in step 121020 to identify one or more objects of interest, where the analysis can be performed by the electronics 109130, or the images can be transmitted wirelessly to another device for analysis. In step 121030, the distances to the objects of interest are determined using the distance measuring device 109140. In step 121040, the disparities related to the respective distances to the objects of interest are determined. In step 121050, the labels or other information for the objects of interest are determined. In step 121060, the 3D labels or other 3D information for the objects of interest are displayed.
Figure 122 is a flow chart of another depth cue method embodiment of the invention. In this embodiment, the distances to objects in the scene are measured directly by using the stereo cameras to obtain a depth map of the scene. In step 122010, the stereo cameras 109120 are used to capture one or more stereo image sets of the scene adjacent to the head-mounted display device 109100. The one or more stereo image sets of the scene can be of different spectral image types; for example, the stereo images can be visible light images, ultraviolet light images, infrared light images, or hyperspectral images. The one or more stereo image sets are analyzed in step 122020 to identify one or more objects of interest, where the analysis can be performed by the electronics 109130, or the one or more stereo image sets can be transmitted wirelessly to another device for analysis. In step 122030, the images in the one or more stereo image sets are compared to determine the disparities of the one or more objects of interest. In step 122040, the labels or other information related to the one or more objects of interest are determined. In step 122050, the 3D labels and/or 3D information for the one or more objects of interest are displayed.
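The depth-from-stereo step (122030) rests on Equation 1. A hedged sketch of converting a measured pixel disparity back to a distance, reusing the example parameters from Table 1 as assumed defaults, is:

```python
def distance_from_disparity(disparity_px, baseline_m=0.120, focal_m=0.007,
                            pixel_m=2.2e-6):
    """Equation 1 rearranged: Z = T*f/d, with d measured in sensor pixels."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity: object effectively at infinity
    return baseline_m * focal_m / (disparity_px * pixel_m)
```

Matching an object between the left and right captured images to measure its pixel disparity, then inverting it this way, yields the distance at which the corresponding 3D label should be rendered so that label and object are perceived at the same depth.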
In embodiments, the invention can use camera focus information to provide display content placement, such as by operating in conjunction with an integrated camera having an autofocus determination facility, where information relating to the distances of real-world objects in the surrounding environment is extracted by the integrated processor from the autofocus determination facility, and where the integrated processor determines, as a function of that distance, the placement location of content in the field of view of the optics assembly. The field of view may include two separately controllable fields of view, each aligned with one of the user's eyes, so that the user can view the surrounding area and the content with both eyes, and the placement location of the content includes a placement location for each of the two separately controllable fields of view. The content may include two separate images, where the two separate images are placed separately, one in each of the two separately controllable fields of view, and where the two separate images, when displayed to the user in the two separately controllable fields of view, can form a 3D image. The placement location can be determined by extracting a placement value from a table of placement values corresponding to the distances of real-world objects. The integrated processor can calculate the placement location.
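A minimal sketch of the placement-value-table mechanism mentioned above is given below. The table entries, the nearest-entry lookup, and the splitting of the offset between the two separately controllable fields of view are all illustrative assumptions:

```python
import bisect

# Hypothetical placement-value table: autofocus distance (m) -> total
# lateral offset (px) applied in opposite directions to the two fields
# of view, so the two images fuse into a 3D image at the right depth.
PLACEMENT_TABLE = [(0.5, 24.0), (1.0, 12.0), (2.0, 6.0), (5.0, 2.4), (10.0, 1.2)]

def placement_offsets(autofocus_distance_m):
    """Pick the tabulated offset whose distance is closest to the
    autofocus reading and split it between the two displays."""
    dists = [d for d, _ in PLACEMENT_TABLE]
    i = bisect.bisect_left(dists, autofocus_distance_m)
    i = min(max(i, 0), len(PLACEMENT_TABLE) - 1)
    if i > 0 and (abs(dists[i - 1] - autofocus_distance_m)
                  <= abs(dists[i] - autofocus_distance_m)):
        i -= 1
    offset = PLACEMENT_TABLE[i][1]
    return +offset / 2.0, -offset / 2.0  # (left field, right field)
```

A table lookup of this kind is cheap enough for the integrated processor to evaluate on every autofocus update; alternatively, as the text notes, the processor can calculate the placement directly from the distance.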
In embodiments, the invention can use rangefinder information to provide display content placement, such as by using a rangefinder integrated with and operating with the eyepiece to determine the distances of real-world objects in the surrounding environment, where the integrated processor determines, as a function of that distance, the placement location of content in the field of view of the optics assembly. The field of view may include two separately controllable fields of view, each aligned with one of the user's eyes, so that the user can view the surrounding area and the content with both eyes, and the placement location of the content includes a placement location for each of the two separately controllable fields of view. The content may include two separate images, where the two separate images are placed separately, one in each of the two separately controllable fields of view, and where the two separate images, when displayed to the user in the two separately controllable fields of view, can form a 3D image. The placement location can be determined by extracting a placement value from a table of placement values corresponding to the distances of real-world objects. The integrated processor can calculate the placement location.
In embodiments, the invention can use multiple distance determination sensors to provide display content placement, such as by utilizing multiple integrated distance determination sensors operating to determine the distances of real-world objects in the surrounding environment, where the integrated processor determines, as a function of that distance, the placement location of content in the field of view of the optics assembly. The field of view may include two separately controllable fields of view, each aligned with one of the user's eyes, so that the user can view the surrounding area and the content with both eyes, and the placement location of the content includes a placement location for each of the two separately controllable fields of view. The content may include two separate images, where the two separate images are placed separately, one in each of the two separately controllable fields of view, and where the two separate images, when displayed to the user in the two separately controllable fields of view, can form a 3D image. The placement location can be determined by extracting a placement value from a table of placement values corresponding to the distances of real-world objects. The integrated processor can calculate the placement location. In embodiments, the multiple integrated distance determination sensors can be camera sensors, rangefinders, and the like.
In embodiments, the invention can use a combination of distance determination sensors and tracking of the user's eyes to provide display content placement, such as by using multiple integrated sensors (for example, cameras, rangefinders) together with eye tracking information from an eye tracking facility included in conjunction with the optics assembly of the eyepiece, to establish the position of an object relative to the field of view (for example, the angle to the object, the distance to the object). In embodiments, the invention may utilize other facilities related to the placement of content in the field of view of the optics assembly (such as the position and placement of images in the user's peripheral vision), utilize a calibration sequence, use grid positions for assistance and/or calibration, interleave images to each eye for images at different locations, and the like.
In embodiments, the invention can provide display content control during movement of the eyepiece, such as by an integrated movement detection facility adapted to detect movement of the head-worn eyepiece when worn by the user, where the integrated processor determines the type of movement and reduces the display of the displayed content as a function of the type of movement. The type of movement can be a vibration, a rapid movement, and the like. The reduction of the display can be the elimination of the displayed content, a reduction of the brightness of the displayed content, a reduction of the contrast of the displayed content, a change of focus of the displayed content, and the like.
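One possible sketch of the movement-based reduction described above is given below. The classification thresholds (acceleration magnitudes in g) and the brightness factors are arbitrary illustrative choices, not values from the specification:

```python
def classify_motion(accel_samples_g):
    """Crude classifier over a window of acceleration-magnitude samples:
    sustained large acceleration -> rapid movement; small high-frequency
    jitter -> vibration; otherwise steady.  Thresholds are illustrative."""
    mean = sum(accel_samples_g) / len(accel_samples_g)
    jitter = max(accel_samples_g) - min(accel_samples_g)
    if mean > 1.5:
        return "rapid"
    if jitter > 0.3:
        return "vibration"
    return "steady"

def displayed_brightness(motion_type, nominal=1.0):
    """Reduce the display according to the motion type: eliminate content
    during rapid movement, dim it during vibration."""
    return {"rapid": 0.0, "vibration": 0.5 * nominal}.get(motion_type, nominal)
```

The same structure could instead reduce contrast or shift focus; brightness is used here only as the simplest example of the reductions the text enumerates.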
Near field communication (NFC) allows short-range wireless data exchange between an NFC reader and a passive NFC device, where the NFC reader acts as the "initiator" of the communication (providing power for the exchange) and the passive NFC device acts as the "target" (harvesting power from the RF field of the NFC reader in order to provide data back to the reader). One example of this configuration is an NFC reader device that reads identification information from a tag (such as a garment tag). Note that NFC is also compatible with radio frequency identification (RFID) technology. If two electronic devices both include NFC readers and are brought very close to each other, the NFC wireless exchange of data can also be two-way. Examples of this configuration include two NFC-enabled smartphones exchanging information between them (for example, an exchange of electronic business cards), an NFC-enabled smartphone exchanging information with an NFC-enabled point-of-sale (POS) device (for example, for an electronic funds transfer, such as with the GOOGLE Wallet mobile payment system), two NFC-enabled mobile gaming devices exchanging information, and the like. Applications of NFC technology may include electronic funds transfer, mobile payment, file sharing, electronic business card exchange, mobile gaming, social network connection, ticket purchase, boarding pass check-in, POS, coupon collection and/or redemption, tour guide station initiators, identity cards, keycards, car or hotel keys, and the like. NFC technology has a practical range of about 4 centimeters (theoretically about 20 centimeters), and thus the initiator and target should be in close proximity for communication to occur.
In one example, users can store their credit card information in their NFC-enabled smartphone, allowing them to make an electronic money payment to a POS device at a retail store by placing their smartphone in very close proximity to the NFC-enabled POS terminal (again, such as implemented in the GOOGLE Wallet mobile payment system). In this way, the user does not need to pull out an actual credit card to transact business, because the credit card information is read from the smartphone by the POS terminal over the NFC connection. However, the user still has the inconvenience of having to take the smartphone out of a pocket or purse, hold it near the POS terminal, and then put the smartphone away again.
The present invention provides a scheme for NFC-enabled wireless transactions by providing the user with an NFC watch device (such as worn on the user's wrist), which is then always conveniently available to be held up to another NFC-enabled device for data exchange. Although embodiments are described herein in terms of an NFC "watch", this is not meant to be limiting in any way, and one skilled in the art will appreciate alternate implementations that realize the spirit of the invention, such as embodied in a bracelet, a watch fob, a ring, and the like. Embodiments of the NFC watch may include a stand-alone NFC device, an NFC relay device, and the like, where the NFC relay device establishes communication with both an NFC target device (e.g., an NFC-enabled POS terminal) and a second NFC-enabled device (e.g., the user's smartphone). For example, in the case where the NFC watch acts as a stand-alone NFC device, it may include the information to be exchanged (e.g., credit card information). In the case where the NFC watch acts as an NFC relay device, the watch does not include the information to be exchanged; rather, the information to be exchanged is stored on another electronic device with which the NFC relay device communicates, such as a smartphone, mobile computing device, personal computer, and the like.
In embodiments where the NFC watch acts as an NFC relay device, the user may leave their personal device (such as a smartphone) in their pocket or purse and simply hold the NFC watch near another NFC-enabled device for data exchange, where the NFC watch provides communication between the other NFC-enabled device and the user's personal device. For example, a user may wear the NFC watch on their wrist and keep the smartphone containing their credit card information in their pocket. When they approach an NFC-enabled POS terminal for an electronic payment, the user may now leave their smartphone where it is and simply hold their NFC watch near the POS terminal, where the NFC watch and the user's smartphone communicate over some non-proximity communication link (e.g., Bluetooth, WiFi, and the like). The NFC watch reads the user's credit card information from the smartphone and transfers the data to the POS terminal. In this configuration, the user never needs to take out their smartphone at all, because all they need to do is hold their NFC watch near the POS terminal to make the electronic payment, while all of their personal and financial information remains stored centrally on their smartphone.
In embodiments, and referring to Fig. 207, the NFC watch 20702 may provide the common functions of a typical watch, such as a display face 20704 for the display of time and date 20708, function buttons 20710, an embedded controller for watch functions, and the like. In addition, however, the NFC watch may provide communication facilities, such as near-field communication to NFC-enabled devices, medium-range communication (e.g., Bluetooth) to nearby electronic devices, longer-range communication (e.g., WiFi) to nearby electronic devices, and the like. In embodiments, antennas 20712A-B for near-field communication may be provided as an antenna 20712A in the watchband (e.g., with an NFC loop antenna), an antenna 20712B in the watch body (e.g., with an NFC "stamp" antenna), and the like. In the case where the antenna 20712A is located in the watchband, the user operatively holds the band portion of the NFC watch in proximity to the NFC-enabled target device for data exchange. In the case where the antenna 20712B is located in the watch body, the user operatively holds the body portion of the NFC watch in proximity to the NFC-enabled target device for data exchange. In embodiments, the watch display 20704 may also provide a control interface 20718 that enables the user to enter and/or select information, such as a credit card number, verification code, transfer data, and the like in an electronic funds exchange, and where the control interface 20718 may include a display, control buttons, a 2D control pad, a touch screen, and the like.
Referring to Fig. 208, one example usage scenario may include a user 20802 wearing an NFC watch 20702A acting as a stand-alone NFC device, where the NFC watch 20702A is held up to an NFC-enabled POS terminal 20804 in order to pay for a purchase over an NFC communication link 20804A. In this instance, the NFC watch 20702A includes the payment information to be exchanged with the NFC-enabled POS terminal 20804. In embodiments, the information included in the NFC watch 20702A may have been previously entered through a wired or wireless connection to a computing facility (e.g., a mobile computing device, smartphone, personal computer), through a network connection (e.g., a local network connection, WiFi connection), manually via the control interface 20718, and the like.
Referring to Fig. 209, one example usage scenario may include a user 20902 wearing an NFC watch 20702B acting as an NFC relay device in wireless communication 20908A with a smartphone 20908 in the user's pocket, where the information for the data exchange is contained on the user's smartphone 20908. Without the NFC watch, the user would have to take their smartphone out of their pocket and hold it near the NFC-enabled POS terminal for the data exchange. By using the present invention, the user 20902 may leave the smartphone 20908 in their pocket and simply hold the NFC watch 20702B up to the NFC-enabled POS terminal 20804, where the NFC watch 20702B communicates 20804A with the NFC-enabled POS terminal 20804, thus enabling the transfer of information between the smartphone 20908 and the NFC-enabled POS terminal 20804 via the two communication channels 20804A and 20908A established through the NFC watch 20702B. In this configuration, the smartphone 20908 does not need to be NFC-enabled, because all the smartphone 20908 requires is a communication link to the NFC watch 20702B via the medium-range communication link 20908A (e.g., utilizing Bluetooth or the like).
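The two-hop relay path in this scenario can be sketched as follows. The device names, the single request/response message shape, and the `GET_PAYMENT_INFO` command are hypothetical simplifications of the relay behavior described above, not an actual Bluetooth or NFC API.

```python
# Sketch of the NFC-watch relay path: POS <-NFC-> watch <-Bluetooth-> phone.
# All names and the one-shot request/response shape are illustrative assumptions.

class Smartphone:
    """Holds the payment credentials; reachable only over a medium-range link."""

    def __init__(self, credit_card: str):
        self._credit_card = credit_card

    def handle_bluetooth_request(self, request: str) -> str:
        if request == "GET_PAYMENT_INFO":
            return self._credit_card
        raise ValueError(f"unknown request: {request}")


class NFCWatch:
    """Relay device: stores no credentials, bridges the two links."""

    def __init__(self, paired_phone: Smartphone):
        self._phone = paired_phone

    def handle_nfc_request(self, request: str) -> str:
        # Forward the POS terminal's NFC request over Bluetooth to the phone,
        # then return the phone's answer back over the NFC link.
        return self._phone.handle_bluetooth_request(request)


class POSTerminal:
    """Initiator: asks whatever NFC device is presented for payment info."""

    def charge(self, nfc_device: NFCWatch) -> str:
        return nfc_device.handle_nfc_request("GET_PAYMENT_INFO")


phone = Smartphone(credit_card="4111-XXXX-XXXX-1111")
watch = NFCWatch(paired_phone=phone)  # the phone stays in the pocket
pos = POSTerminal()
assert pos.charge(watch) == "4111-XXXX-XXXX-1111"
```

The key design point matches the text: the watch class holds no payment data of its own, so losing the watch does not expose the credentials stored on the phone.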
In embodiments, the NFC watch may communicate with a number of other electronic devices through a non-NFC medium-range communication link, such as with a personal computer, mobile computer, mobile communication device, navigation device, worn electronic device, augmented reality glasses, head-mounted electronic device, home entertainment device, home security device, home automation device, local network connection, personal network connection, and the like. For example, the NFC watch may communicate with an eyepiece, such as an eyepiece including optics enabling a see-through display on which content provided from an integrated processor may be presented, and where aspects of the eyepiece may be controlled through one or more complex control techniques involving sensors, cameras, tactile interfaces, accessory devices, and the like. In embodiments, the NFC watch may connect with the glasses to present transaction details in the glasses. The glasses control system may act as the interface for any interaction required for the transaction.
In embodiments, the NFC watch may provide computing resources, such as a microcontroller, memory, input/output facilities independent of the communication link (e.g., a memory card, wired connection), a wireless connection to a local network for updates, programmability, and the like. For example, the NFC relay device may provide memory to store a history of purchases, preferences, a personal profile, sale offers, redemption codes, preferred customer IDs, incentive information, loyalty program data, and the like.
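One way the on-watch memory contents listed above could be organized is as a simple keyed record. The field names, types, and the per-merchant/per-program keying are illustrative assumptions, not a specified data format.

```python
# Illustrative layout for the relay device's on-watch memory described above;
# field names and types are assumptions, not a specified data format.
from dataclasses import dataclass, field


@dataclass
class WatchMemory:
    purchase_history: list = field(default_factory=list)
    preferences: dict = field(default_factory=dict)
    personal_profile: dict = field(default_factory=dict)
    sale_offers: list = field(default_factory=list)
    redemption_codes: list = field(default_factory=list)
    preferred_customer_ids: dict = field(default_factory=dict)  # keyed by merchant
    loyalty_points: dict = field(default_factory=dict)          # keyed by program

    def record_purchase(self, item: str, program: str, points: int) -> None:
        # Append to the purchase history and accrue loyalty points locally.
        self.purchase_history.append(item)
        self.loyalty_points[program] = self.loyalty_points.get(program, 0) + points


mem = WatchMemory()
mem.record_purchase("coffee", program="cafe-club", points=10)
mem.record_purchase("bagel", program="cafe-club", points=5)
assert mem.purchase_history == ["coffee", "bagel"]
assert mem.loyalty_points["cafe-club"] == 15
```

Keeping such a record on the watch itself (rather than on the phone) would let loyalty and incentive data be exchanged at the POS even when the medium-range link is unavailable.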
The methods and systems described herein, and especially embodiments of the inventive augmented reality eyepiece, may be adapted to communicate and receive communications through and/or via any electronic communication system or network. Examples of such electronic communication systems and network types, with their associated protocols, topologies, network elements, and the like, include the following: (1) wired networks, such as: (a) wide area networks (WANs) that use leased lines with protocols such as Point-to-Point Protocol (PPP), High-Level Data Link Control (HDLC), and Synchronous Data Link Control (SDLC), and digital subscriber lines; circuit switching using protocols such as PPP and ISDN; packet switching using protocols such as Frame Relay, X.25 (predating the OSI stack), Synchronous Optical Networking/Synchronous Digital Hierarchy (SONET/SDH), Multiprotocol Label Switching (MPLS), Switched Multi-megabit Data Service (SMDS), and Ethernet (e.g., 10GB, 100GB); cell relay using protocols such as the Asynchronous Transfer Mode (ATM) protocol; and network elements such as routers, switches, network hubs, and firewalls; (b) metropolitan area networks (MANs) that use: protocols such as ATM, Fiber Distributed Data Interface (FDDI), SMDS, Metro Ethernet, and Distributed Queue Dual Bus (DQDB); topologies such as star, bus, mesh, ring, and tree; and network elements such as routers, switches, network hubs, and firewalls; (c) local area networks (LANs) that use, for example: high-speed serial interface protocols such as Ethernet (e.g., Ethernet, Fast, 1GB, 10GB, and 100GB); topologies such as star and tree; and network elements such as routers, switches, network hubs, and firewalls; (d) personal area networks (PANs) that use technologies such as USB and FireWire; (2) wireless networks, such as: (a) wide area networks (WANs) that use: mobile Internet standards such as RTT (CDMA), EDGE (GSM), EV-DO (CDMA/TDMA), Flash-OFDM (Flash-OFDM), GPRS (GSM), HSPA D and U (UMTS/3GSM), LTE (3GPP), UMTS-TDD (UMTS/3GSM), WiMAX (802.16), satellite, and general 3G and 4G; network elements such as the base station subsystem, the network and switching subsystem, the GPRS core network, operations support systems, the subscriber identity module (SIM), the universal terrestrial radio access network (UTRAN), and the core network; and interfaces such as W-CDMA (UTRA-FDD)-UMTS, UTRA-TDD HCR-UMTS, TD-SCDMA-UMTS, the user equipment interface-UMTS, radio resource control (radio link control, media access control), and the Um interface (for the GSM air interface, with layers such as a physical layer with GMSK or 8PSK modulation, a data link layer such as LAPDm, and a network layer such as radio resource, mobility management, and call control); (b) metropolitan area networks (MANs) that use protocols such as WiMAX (802.16); local area networks (LANs) that use technologies such as Wi-Fi with modes such as ad hoc and infrastructure, with OSI layers, and with sub-technologies such as CSMA/CA, OFDM, and spread spectrum; with network elements such as routers, switches, network hubs, firewalls, access points, and base stations; and with clients such as personal computers, laptop computers, IP phones, mobile phones, and smartphones; (c) personal area networks (PANs) that use topologies such as star, tree, and mesh, and technologies such as: (i) Bluetooth (e.g., using roles (such as master and slave, and simultaneous master/slave), a protocol stack (such as core protocols, cable replacement protocols, telephony control protocols, and adopted protocols), mandatory protocols (such as the Link Manager Protocol (LMP), the Logical Link Control and Adaptation Protocol (L2CAP), and the Service Discovery Protocol (SDP)), pairing methods (such as legacy pairing and secure simple pairing), and an air interface (such as the license-free ISM band (2.402-2.480 GHz))), (ii) the Infrared Data Association (IrDA) (e.g., using mandatory protocol stack layers (e.g., the infrared physical layer specification (IrPHY), the infrared link access protocol (IrLAP), and the infrared link management protocol (IrLMP)) or optional protocol stack layers (e.g., the tiny transport protocol (Tiny TP), the infrared communications protocol (IrCOMM), object exchange (OBEX), infrared local area network (IrLAN), IrSimple, and IrSimpleShot)), (iii) wireless USB, (iv) Z-Wave (e.g., a mesh network topology with one or more master controllers that route with source routing, and with GFSK modulation and security), (v) ZigBee (e.g., with physical and medium access control layers defined in 802.15.4, components such as the network layer, application layer, ZigBee device objects, and manufacturer-defined applications, and using CSMA/CA), (vi) body area networks, and (vii) Wi-Fi; and (3) near-field communication (NFC), such as NFC operating at 13.56 MHz in a peer-to-peer network mode per ISO/IEC 18000-3, with data rates of 106 kbit/s to 424 kbit/s, and with passive and/or active communication modes. The methods and systems described herein, and especially embodiments of the inventive augmented reality eyepiece, may be adapted for any or all aspects of mobile device network management systems, such as policy management, user management, profile management, business intelligence, event management, performance management, enterprise grade, multi-platform support, mobile device management (including sub-aspects such as software and SaaS), security management (including sub-aspects such as certificate control (e.g., relating to email, applications, Wi-Fi access, and VPN access), password enforcement, device wipe, remote lock, audit trail/logging, centralized device configuration, jailbreak/root detection, secure containers, and application wrapping), platform support (e.g., Android, iOS, BlackBerry, Symbian, Windows Mobile, and Windows Phone), compliance management, software management (including sub-aspects such as application downloaders, application verification, application update support, application patch support, and application store support (e.g., enterprise applications and third-party applications)), and hardware/device management (e.g., including device enrollment (e.g., ownership, classification, registration, user identification, EULA deployment, and restriction deployment), external memory blocking, and configuration change history). The methods and systems described herein, and especially embodiments of the inventive augmented reality eyepiece, may be adapted for use with any type of private, community, or hybrid cloud computing network or cloud computing environment, including those that involve features of software as a service (SaaS), platform as a service (PaaS), and/or infrastructure as a service (IaaS).
The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software, program code, and/or instructions on a processor. The processor may be part of a server, cloud server, client, network infrastructure, mobile computing platform, stationary computing platform, or other computing platform. A processor may be any kind of computational or processing device capable of executing program instructions, code, binary instructions, and the like. The processor may be or may include a signal processor, digital processor, embedded processor, microprocessor, or any variant such as a co-processor (math co-processor, graphic co-processor, communication co-processor, and the like) that may directly or indirectly facilitate execution of program code or program instructions stored thereon. In addition, the processor may enable execution of multiple programs, threads, and codes. The threads may be executed simultaneously to enhance the performance of the processor and to facilitate simultaneous operations of the application. By way of implementation, methods, program codes, program instructions, and the like described herein may be implemented in one or more threads. A thread may spawn other threads that may have assigned priorities associated with them; the processor may execute these threads based on priority or any other order based on instructions provided in the program code. The processor may include memory that stores methods, codes, instructions, and programs as described herein and elsewhere. The processor may access, through an interface, a storage medium that may store methods, codes, and instructions as described herein and elsewhere. The storage medium associated with the processor for storing methods, programs, codes, program instructions, or other types of instructions capable of being executed by a computing or processing device may include, but may not be limited to, one or more of a CD-ROM, DVD, memory, hard disk, flash drive, RAM, ROM, cache, and the like.
A processor may include one or more cores that may enhance the speed and performance of a multiprocessor. In embodiments, the processor may be a dual-core processor, quad-core processor, or other chip-level multiprocessor that combines two or more independent cores (called a die), and the like.
The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software on a server, client, firewall, gateway, hub, router, or other such computer and/or networking hardware. The software program may be associated with a server that may include a file server, print server, domain server, Internet server, intranet server, and other variants such as a secondary server, host server, distributed server, and the like. The server may include one or more of memories, processors, computer-readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other servers, clients, machines, and devices through a wired or wireless medium, and the like. The methods, programs, or codes described herein and elsewhere may be executed by the server. In addition, other devices required for execution of the methods described in this application may be considered part of the infrastructure associated with the server.
The server may provide an interface to other devices including, without limitation, clients, other servers, printers, database servers, print servers, file servers, communication servers, distributed servers, social networks, and the like. Additionally, this coupling and/or connection may facilitate remote execution of programs across the network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the server through an interface may include at least one storage medium capable of storing methods, programs, code, and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.
The software program may be associated with a client that may include a file client, print client, domain client, Internet client, intranet client, and other variants such as a secondary client, host client, distributed client, and the like. The client may include one or more of memories, processors, computer-readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other clients, servers, machines, and devices through a wired or wireless medium, and the like. The methods, programs, or codes described herein and elsewhere may be executed by the client. In addition, other devices required for execution of the methods described in this application may be considered part of the infrastructure associated with the client.
The client may provide an interface to other devices including, without limitation, servers, cloud servers, other clients, printers, database servers, print servers, file servers, communication servers, distributed servers, social networks, and the like. Additionally, this coupling and/or connection may facilitate remote execution of programs across the network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the client through an interface may include at least one storage medium capable of storing methods, programs, code, and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.
The methods and systems described herein may be deployed in part or in whole through network infrastructures. The network infrastructure may include elements such as computing devices, servers, cloud servers, routers, hubs, firewalls, clients, personal computers, communication devices, routing devices, and other active and passive devices, modules, and/or components as known in the art. The computing and/or non-computing devices associated with the network infrastructure may include, apart from other components, a storage medium such as flash memory, a buffer, stack, RAM, ROM, and the like. The processes, methods, program codes, and instructions described herein and elsewhere may be executed by one or more of the network infrastructural elements.
The methods, program codes, and instructions described herein and elsewhere may be implemented on a cellular network having multiple cells. The cellular network may be either a frequency division multiple access (FDMA) network or a code division multiple access (CDMA) network. The cellular network may include mobile devices, cell sites, base stations, repeaters, antennas, towers, and the like. The cellular network may be a GSM, GPRS, 3G, EVDO, mesh, or other network type.
The methods, program codes, and instructions described herein and elsewhere may be implemented on or through mobile devices. The mobile devices may include navigation devices, cell phones, mobile phones, mobile personal digital assistants, laptops, palmtops, netbooks, pagers, electronic book readers, music players, and the like. These devices may include, apart from other components, a storage medium such as flash memory, a buffer, RAM, ROM, and one or more computing devices. The computing devices associated with mobile devices may be enabled to execute program codes, methods, and instructions stored thereon. Alternatively, the mobile devices may be configured to execute instructions in collaboration with other devices. The mobile devices may communicate with base stations interfaced with servers and configured to execute program codes. The mobile devices may communicate on a peer-to-peer network, mesh network, or other communication network. The program code may be stored on the storage medium associated with the server and executed by a computing device embedded within the server. The base station may include a computing device and a storage medium. The storage device may store program codes and instructions executed by the computing devices associated with the base station.
The computer software, program codes, and/or instructions may be stored and/or accessed on machine readable media that may include: computer components, devices, and recording media that retain digital data used for computing for some interval of time; semiconductor storage known as random access memory (RAM); mass storage typically for more permanent storage, such as optical discs and forms of magnetic storage like hard disks, tapes, drums, cards, and other types; processor registers, cache memory, volatile memory, non-volatile memory; optical storage such as CD and DVD; removable media such as flash memory (e.g., USB sticks or keys), floppy disks, magnetic tape, paper tape, punch cards, standalone RAM disks, Zip drives, removable mass storage, off-line storage, and the like; and other computer memory such as dynamic memory, static memory, read/write storage, mutable storage, read-only, random-access, sequential-access, location-addressable, file-addressable, content-addressable, network-attached storage, storage area networks, bar codes, magnetic ink, and the like.
The methods and systems described herein may transform physical and/or intangible items from one state to another. The methods and systems described herein may also transform data representing physical and/or intangible items from one state to another.
The elements described and depicted herein, including in flow charts and block diagrams throughout the figures, imply logical boundaries between the elements. However, according to software or hardware engineering practices, the depicted elements and their functions may be implemented on machines through computer executable media having a processor capable of executing program instructions stored thereon as a monolithic software structure, as standalone software modules, as modules that employ external routines, code, services, and so forth, or as any combination of these, and all such implementations may be within the scope of the present disclosure. Examples of such machines may include, but may not be limited to, personal digital assistants, laptops, personal computers, mobile phones, other handheld computing devices, medical equipment, wired or wireless communication devices, transducers, chips, calculators, satellites, tablet PCs, electronic books, gadgets, electronic devices, devices having artificial intelligence, computing devices, networking equipment, servers, routers, and the like. Furthermore, the elements depicted in the flow charts and block diagrams, or any other logical component, may be implemented on a machine capable of executing program instructions. Thus, while the foregoing drawings and descriptions set forth functional aspects of the disclosed systems, no particular arrangement of software for implementing these functional aspects should be inferred from these descriptions unless explicitly stated or otherwise clear from the context. Similarly, it will be appreciated that the various steps identified and described above may be varied, and that the order of the steps may be adapted to particular applications of the techniques disclosed herein. All such variations and modifications are intended to fall within the scope of this disclosure. As such, the depiction and/or description of an order for various steps should not be understood to require a particular order of execution for those steps, unless required by a particular application, or explicitly stated or otherwise clear from the context.
The methods and/or processes described above, and the steps thereof, may be realized in hardware, software, or any combination of hardware and software suitable for a particular application. The hardware may include a general purpose computer and/or dedicated computing device, or a specific computing device, or a particular aspect or component of a specific computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, or other programmable devices, along with internal and/or external memory. The processes may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as computer executable code capable of being executed on a machine readable medium.
The computer executable code may be created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled, or interpreted to run on one of the above devices, as well as on heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or on any other machine capable of executing program instructions.
Thus, in one aspect, each method described above, and combinations thereof, may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. Accordingly, the spirit and scope of the present invention is not to be limited by the foregoing examples, but is to be understood in the broadest sense allowable.
All documents referenced herein are hereby incorporated by reference.