CN206833252U - A kind of mobile electronic device - Google Patents

A kind of mobile electronic device

Info

Publication number
CN206833252U
CN206833252U (application CN201720702958.4U)
Authority
CN
China
Prior art keywords
electronic device
mobile electronic
mission area
image
locating module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201720702958.4U
Other languages
Chinese (zh)
Inventor
潘景良
陈灼
李腾
陈嘉宏
高鲁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ju Da Technology Co Ltd
Original Assignee
Ju Da Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ju Da Technology Co Ltd
Priority to CN201720702958.4U
Application granted
Publication of CN206833252U
Expired - Fee Related (current legal status)
Anticipated expiration


Abstract

A first mobile electronic device includes a wireless signal transceiver, an image processor, a locating module, and a motion module. The wireless signal transceiver obtains an image, captured by a camera, of a task area that includes the first mobile electronic device. The image processor identifies characteristic information of the first mobile electronic device in the image of the task area and establishes a coordinate system for the image. The locating module identifies the position and outline of the first mobile electronic device by comparing, on the coordinate system, the identified characteristic information of the first mobile electronic device against the coordinate origin, and draws a map of the task area. The locating module is also communicatively coupled to a second mobile electronic device; it sends the map to the second mobile electronic device, and the second mobile electronic device receives a task location set by the user on the map. The motion module receives the task location from the second mobile electronic device, plans a path according to the task location and the identified position and outline of the first mobile electronic device, and moves along the path.

Description

A kind of mobile electronic device
Technical field
The utility model relates to the field of electronic devices, and in particular to the field of intelligent robot systems.
Background technology
A traditional sweeping robot either localizes itself autonomously and moves according to a scanned map, or wanders randomly, changing direction upon collision, while cleaning the floor. Because its mapping and localization technology is immature or inaccurate, a traditional sweeping robot cannot fully judge complex floor conditions during operation and easily loses track of its position and heading. In addition, some models have no localization capability at all and can only change direction through the physics of collision, which may damage household articles or the robot itself, cause personal injury, and disturb the user.
Utility model content
The mobile electronic device system described in the embodiments of the utility model, for example a robot system, uses a fixed camera installed indoors to perform localization and map building, determines the task area from user feedback, plans a path, and localizes the robot in real time, so that the robot moves along the planned path. This solves the problem that a robot cannot judge the floor conditions, its own position, or the optimal moving path. The fixed camera replaces the robot's tracking sensors, and the robot's repetitive labor replaces that of a person, saving the cost of intelligent-robot research and equipment as well as the investment in sweeping and dust-collecting mechanisms. Moreover, because the camera is fixed, the accumulated error introduced by a camera mounted inside the robot is eliminated, enabling accurate localization and mapping.
The robot system described in the embodiments provides accurate localization and mapping, which improves the robot's working efficiency while reducing the user's workload.
A first mobile electronic device according to one embodiment includes a wireless signal transceiver, an image processor, a locating module, and a motion module, wherein: the wireless signal transceiver is communicatively connected to a camera installed indoors and is configured to obtain an image, captured by the camera, of a task area that includes the first mobile electronic device; the image processor is communicatively coupled to the wireless signal transceiver and is configured to identify characteristic information of the first mobile electronic device in the image of the task area; the locating module is communicatively coupled to the image processor and is configured to identify the position and outline of the first mobile electronic device by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device against the coordinate origin, and to draw a map of the task area; the locating module is also communicatively coupled to a second mobile electronic device and is configured to send the map to the second mobile electronic device, the second mobile electronic device receiving a task location set by the user on the map; and the motion module is communicatively coupled to the locating module and the second mobile electronic device and is configured to receive the task location from the second mobile electronic device, plan a path according to the task location and the identified position and outline of the first mobile electronic device, and move along the path.
Alternatively or additionally, the image of the task area also includes at least one first obstacle. The image processor is further configured to identify characteristic information of the at least one first obstacle in the image of the task area; the locating module is further configured to compare the characteristic information of the at least one first obstacle against the coordinate origin and identify the position and outline of the at least one obstacle; and the motion module is further configured to plan a path according to the task location, the located position of the first mobile electronic device, and the identified position and outline of the at least one first obstacle, and to move along the path.
Alternatively or additionally, while the motion module moves, the camera captures in real time images of the task area including the first mobile electronic device, and the image processor and the locating module dynamically determine the characteristic information, position, and outline of the first mobile electronic device.
Alternatively or additionally, the camera is mounted vertically on the ceiling of the task area.
Alternatively or additionally, the first mobile electronic device further includes a charging pile, the charging pile including the image processor and the locating module.
Alternatively or additionally, the first mobile electronic device may further include a sensor that sends information about a second obstacle around the first mobile electronic device to the motion module, and the motion module is further configured to adjust the heading of the first mobile electronic device to avoid the second obstacle.
Alternatively or additionally, the sensor of the first mobile electronic device includes an ultrasonic sensor and/or a laser sensor.
According to another aspect of the utility model, a method in a first mobile electronic device is provided, the first mobile electronic device including a wireless signal transceiver, an image processor, a locating module, and a motion module. The method includes: obtaining, through the wireless signal transceiver communicatively connected to at least one camera installed indoors, an image, captured by the at least one camera, of a task area that includes the first mobile electronic device; identifying, through the image processor communicatively coupled to the wireless signal transceiver, characteristic information of the first mobile electronic device in the image of the task area, establishing a coordinate system for the image, and assigning a corresponding coordinate value to each point in the task area; identifying, through the locating module communicatively coupled to the image processor, the position and outline of the first mobile electronic device by comparing, on the established coordinate system, the identified characteristic information of the first mobile electronic device against the coordinate origin, and drawing a map of the task area; sending, through the locating module communicatively coupled to a second mobile electronic device, the map to the second mobile electronic device, the second mobile electronic device receiving a task location set by the user on the map; and receiving, through the motion module communicatively coupled to the locating module and the second mobile electronic device, the task location from the second mobile electronic device, planning a path according to the task location and the identified position and outline of the first mobile electronic device, and moving along the path.
Brief description of the drawings
A more complete understanding of the utility model may be obtained from the detailed description when read together with the accompanying drawings, in which like reference numerals refer to like parts.
Fig. 1 shows a schematic diagram of a system containing a first mobile electronic device and a second mobile electronic device according to one embodiment of the utility model.
Fig. 2 shows a block diagram of the processor in the first mobile electronic device according to one embodiment of the utility model.
Figs. 3A-3D show schematic diagrams of the first mobile electronic device according to one embodiment of the utility model.
Fig. 4 shows a flowchart of a method in the first mobile electronic device according to one embodiment of the utility model.
Embodiment
Embodiment one
Fig. 1 shows a schematic diagram of a system containing a first mobile electronic device 100 and a second mobile electronic device 120 according to one embodiment of the utility model.
Referring to Fig. 1, the first mobile electronic device 100 includes, but is not limited to, a sweeping robot, an industrial automation robot, a service robot, a rescue and disaster-relief robot, an underwater robot, a space robot, an autonomous vehicle, and the like. When the first mobile electronic device 100 is a field-operation device such as an autonomous vehicle, the camera may, for example, be carried by a drone flying overhead. In one embodiment, the first mobile electronic device 100 itself does not include a camera.
The second mobile electronic device 120 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a remote control, and the like. The mobile electronic device optionally includes an operation interface. In one optional embodiment, the mobile electronic device is a mobile phone and the operation interface is a mobile phone app.
Signal transmission between the first mobile electronic device 100 and the second mobile electronic device 120 includes, but is not limited to, Bluetooth, WiFi, ZigBee, infrared, ultrasonic, UWB, and so on. In this embodiment, WiFi is taken as the example transmission form.
The task area is the place where the first mobile electronic device 100 performs its task. For example, when the first mobile electronic device 100 is a sweeping robot, the task area is the region the sweeping robot needs to clean; when the first mobile electronic device 100 is a rescue and disaster-relief robot, the task area is the place where the robot needs to carry out rescue work.
As shown in Fig. 1, in one embodiment the first mobile electronic device 100 includes a wireless signal transceiver 102, an image processor 1040, a locating module 1042, and a motion module 106. The first mobile electronic device 100 also includes an encoder and an inertial measurement unit (IMU), both of which are carried by the first mobile electronic device 100, i.e. the robot 100, itself. In addition, the first mobile electronic device 100 may include a memory 110 and a data processor 1046; these two entities may be on the robot 100 or inside the charging pile, optionally inside the charging pile. The memory 110 and the data processor 1046 store and process image information, data, and the like. The memory 110 is communicatively connected to the processor 104. The data processor 1046 is located in the processor 104. As shown in Fig. 2, the data processor 1046 is connected to the image processor 1040 and the path planning module 1044.
The wireless signal transceiver 102 is communicatively connected to the camera 140 installed indoors and is configured to obtain an image, captured by the camera 140, of the task area including the first mobile electronic device 100. For example, the fixed camera 140 on the ceiling shoots the indoor plane from a top-down angle. The wireless communication module in the camera 140 is connected to the wireless communication module 102 in the first mobile electronic device 100 over WiFi.
The image processor 1040 is communicatively coupled to the wireless signal transceiver 102 and is configured to identify characteristic information of the first mobile electronic device 100 in the image of the task area. In addition, the image processor 1040 establishes a two-dimensional coordinate system for the plane image and assigns a corresponding coordinate value to each indoor point. The locating module 1042 is communicatively coupled to the image processor 1040 and is configured to compare, on the established coordinate system, the identified characteristic information of the first mobile electronic device 100 against the coordinate origin, identify the position and outline of the first mobile electronic device 100, and draw a map of the task area.
For example, the indoor plan view shot from above by the camera 140 is passed to the image processor 1040 in the first mobile electronic device 100 for analysis. The function of the image processor 1040 is to identify the features of the first mobile electronic device 100 in the image. Optionally, the image processor 1040 may identify these features using the Scale Invariant Feature Transform (SIFT) algorithm or the Speeded Up Robust Features (SURF) algorithm. Using SIFT requires a reference image stored in the memory 110: the image processor 1040 first identifies the key points of the object in the reference image stored in the memory 110 and extracts their SIFT features; it then compares the SIFT features of each key point in the memory 110 with those of the newly captured image, matching features based on K-Nearest Neighbor (KNN), to identify the object in the new image. SURF is based on approximate 2D Haar wavelet responses and performs image convolution using integral images; it uses a Hessian matrix-based measure for the detector and a distribution-based descriptor. Those skilled in the art will appreciate that the image processor 1040 and the locating module 1042 may be included in the processor 104, which is further described below with reference to Fig. 2.
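The KNN matching step described above can be sketched in pure Python. This is only an illustration of the matching logic: the function name `knn_match`, the toy two-dimensional descriptors, and the 0.75 ratio threshold are assumptions for the example; real SIFT descriptors are 128-dimensional and would normally be extracted and matched with a library such as OpenCV.

```python
import math

def knn_match(ref_descs, new_descs, ratio=0.75):
    """For each reference descriptor, find its two nearest neighbours among
    the new image's descriptors and keep the match only when the closest
    one is clearly better than the second closest (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(ref_descs):
        dists = sorted((math.dist(d, n), j) for j, n in enumerate(new_descs))
        (d1, j1), (d2, _) = dists[0], dists[1]
        if d1 < ratio * d2:  # keep only unambiguous matches
            matches.append((i, j1))
    return matches

# Toy example: two reference key points of the robot, three key points in
# the new ceiling image (one of them background clutter).
ref = [(0.0, 0.0), (10.0, 0.0)]
new = [(0.1, 0.0), (50.0, 50.0), (9.9, 0.1)]
print(knn_match(ref, new))  # [(0, 0), (1, 2)]
```

Each returned pair `(i, j)` says reference key point `i` was found at key point `j` of the new image; the clutter point at (50, 50) attracts no match because it fails the ratio test.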
The locating module 1042 is communicatively coupled to the image processor 1040 and is configured to identify the position and outline of the first mobile electronic device 100 by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device 100 against the coordinate origin, and to draw a map of the task area. The locating module 1042 is also communicatively coupled to the second mobile electronic device 120 and is further configured to send the map to the second mobile electronic device 120, the second mobile electronic device 120 receiving a task location set by the user on the map. For example, the locating module 1042 establishes a plane rectangular coordinate system on the processed plane image, sets the location of the charging pile as the coordinate origin, and gives each point in the image a corresponding coordinate value (X, Y), marking the position and outline of the first mobile electronic device 100 and thereby producing the indoor map. The user's second mobile electronic device 120, for example a mobile phone 120, is connected to the wireless signal transceiver 102 of the first mobile electronic device 100 over WiFi. The app on the phone 120 reads the produced indoor map, and the user sets a task location on the map through the app, that is, one or more positions that need to be cleaned (with corresponding X, Y coordinates); the locating module 1042 of the first mobile electronic device 100 records this position information. Meanwhile, the first mobile electronic device 100 uses the ceiling camera 140 and its internal image processing module 1040 to find the image of the first mobile electronic device 100, for example a sweeping robot 100, and together with the locating module 1042 tracks in real time the robot 100's coordinates (X, Y) on the map.
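The mapping from a pixel in the ceiling image to a map coordinate with the charging pile as origin can be sketched as below. The function name and the `metres_per_pixel` scale factor are assumptions for illustration; the patent only specifies that the charging pile's location is taken as the coordinate origin.

```python
def to_map_coords(px, py, origin_px, origin_py, metres_per_pixel):
    """Convert a pixel position in the ceiling image to map coordinates
    (X, Y) in metres, with the charging pile's pixel position as origin.
    Image rows grow downwards, so the Y axis is flipped to keep the map
    right-handed."""
    x = (px - origin_px) * metres_per_pixel
    y = (origin_py - py) * metres_per_pixel
    return (x, y)

# Charging pile at pixel (300, 200); robot detected at pixel (320, 120).
print(to_map_coords(320, 120, 300, 200, 0.5))  # (10.0, 40.0)
```

A task location tapped by the user in the app would be converted to map coordinates with the same function before being sent to the robot.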
The motion module 106 is then communicatively coupled to the locating module 1042 and the second mobile electronic device 120 and is configured to receive the task location from the second mobile electronic device 120, plan a path according to the task location and the identified position and outline of the first mobile electronic device, and move along the path.
In addition, the first mobile electronic device 100, for example the robot 100, also includes a robot encoder and an inertial measurement module (IMU), which assist the camera 140 in obtaining the robot's position and pose. For example, when the robot is occluded and out of the camera's line of sight, the encoder and IMU can still provide the robot's position and pose. The encoder can serve as an odometer: by recording the rotation of the robot's wheels, it computes the track the robot 100 has traveled.
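The encoder-as-odometer idea can be illustrated with a standard differential-drive dead-reckoning update. This is a textbook sketch, not the patent's implementation; the function name, the `wheel_base` parameter, and the two-wheel drive assumption are all illustrative.

```python
import math

def dead_reckon(pose, d_left, d_right, wheel_base):
    """Advance a differential-drive pose (x, y, heading) from the distances
    travelled by the left and right wheels, as reported by the wheel
    encoders. A straight segment and a turning arc are handled separately."""
    x, y, th = pose
    d = (d_left + d_right) / 2.0          # distance of the robot centre
    dth = (d_right - d_left) / wheel_base  # change in heading (radians)
    if abs(dth) < 1e-9:                    # effectively straight
        x += d * math.cos(th)
        y += d * math.sin(th)
    else:                                  # arc of radius d / dth
        r = d / dth
        x += r * (math.sin(th + dth) - math.sin(th))
        y += r * (math.cos(th) - math.cos(th + dth))
    return (x, y, th + dth)

print(dead_reckon((0.0, 0.0, 0.0), 1.0, 1.0, 0.3))   # (1.0, 0.0, 0.0)
print(dead_reckon((0.0, 0.0, 0.0), -0.1, 0.1, 0.2))  # (0.0, 0.0, 1.0)
```

Integrating such updates between camera fixes gives a position estimate while the robot is occluded; the IMU's gyroscope would typically be fused in to correct the heading term.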
Alternatively or additionally, the image of the task area also includes at least one first obstacle. The image processor 1040 is further configured to identify the characteristic information of the at least one first obstacle in the image of the task area; as discussed above, the image processor 1040 may use the SIFT or SURF algorithm for this image recognition. The locating module 1042 is further configured to compare the characteristic information of the at least one first obstacle against the coordinate origin and identify the position and outline of the at least one obstacle. The obstacle avoidance module 1048 and the motion module 106 are further configured to plan a path according to the task location selected by the user on the second mobile electronic device 120, for example the phone 120, the located position of the first mobile electronic device 100, and the identified position and outline of the at least one first obstacle, and to move along the path. For example, once the cleaning position and the robot position have both been confirmed, the path planning module 1044 in the first mobile electronic device 100 plans an optimal path that avoids the obstacles and delivers the path information to the motion module 106 of the robot 100 over the wireless network; the robot 100 then sets out along this path to the cleaning point and begins its task.
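The patent does not specify which planning algorithm the path planning module 1044 uses, so as a minimal sketch, breadth-first search on an occupancy grid (cells marked blocked where obstacles were identified) finds a shortest obstacle-avoiding path; the function name and grid encoding are assumptions.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = obstacle).
    Returns the shortest 4-connected path from start to goal as a list of
    (row, col) cells, or None if the goal cannot be reached."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # walk back along predecessors
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A wall (column 1, rows 0-1) forces the robot to detour below it.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (0, 2)))
# [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2)]
```

A production planner would more likely use A* with a distance heuristic, but the interface — grid of identified obstacles in, waypoint list out to the motion module — is the same.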
Alternatively or additionally, while the motion module 106 moves, the camera 140 captures in real time images of the task area including the first mobile electronic device, and the image processor 1040 and the locating module 1042 dynamically determine the characteristic information, position, and outline of the first mobile electronic device 100.
Alternatively or additionally, the first mobile electronic device 100 may further include a sensor 108, which sends information about a second obstacle around the first mobile electronic device 100 to the motion module 106. The motion module 106 is further configured to adjust the heading of the first mobile electronic device 100 to avoid the second obstacle. It will be appreciated that, because of the different mounting heights, the ceiling-mounted camera 140 captures a top-down view while the sensor on the first mobile electronic device 100 obtains a head-on view; the obstacles captured by the ceiling-mounted camera 140 and those detected by the robot's sensor may therefore differ, since occlusion may occur. The motion module 106 combines the first obstacle information captured by the ceiling camera 140 with the second obstacle information perceived by the sensor to plan an obstacle-avoiding path. Alternatively or additionally, the sensor includes an ultrasonic sensor and/or a laser sensor. It will be appreciated that the first obstacle and the second obstacle may be the same or different, so the camera 140 and the sensor 108 can assist each other. For example, where there is occlusion, the robot 100 relies on its own laser sensor, ultrasonic sensor, and the like to avoid obstacles in the occluded part.
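One simple way to combine the two obstacle sources, sketched under the assumption that both are rasterized onto the same map grid (the function name and grid representation are illustrative, not from the patent), is a cell-wise union: a cell is blocked if either the camera or the onboard sensor reports an obstacle there, so an occlusion in one view is covered by the other.

```python
def fuse_obstacles(camera_grid, sensor_grid):
    """Combine the ceiling camera's obstacle grid with the robot's onboard
    sensor grid: a cell counts as blocked (1) if either source reports an
    obstacle there."""
    return [
        [a | b for a, b in zip(cam_row, sen_row)]
        for cam_row, sen_row in zip(camera_grid, sensor_grid)
    ]

camera = [[0, 1], [0, 0]]   # obstacle the camera sees from above
onboard = [[0, 0], [1, 0]]  # obstacle hidden from the camera
print(fuse_obstacles(camera, onboard))  # [[0, 1], [1, 0]]
```

The fused grid is what the avoidance path planning would run on.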
For example, during the movement of the first mobile electronic device 100, e.g. the robot 100, the image processor 1040 and the locating module 1042 in the first mobile electronic device 100 use the ceiling camera 140 to dynamically track the robot 100, ensuring that the robot 100's position information is accurate; meanwhile, the laser and ultrasonic sensors carried by the robot 100 detect the static and dynamic environment around it, helping it avoid static and dynamic obstacles and adjust the optimal path.
Alternatively or additionally, the camera 140 is mounted vertically on the ceiling of the task area. The camera 140 may, for example, be a wide-angle camera such as a fisheye camera. When the image processor 1040 uses photos shot by the camera 140, it therefore only needs to take the height information into account and scale the captured image proportionally to obtain the actual distance between targets in the image (for example, between the first mobile electronic device 100, e.g. the robot, and the first obstacle). The height information can be obtained by an altimeter or a ranging device such as a laser rangefinder or an infrared rangefinder; it can be measured by the first mobile electronic device 100 or stored in advance in the first mobile electronic device 100.
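The proportional-scaling step can be sketched with an idealized pinhole camera pointing straight down at a flat floor; under that assumption a floor length maps into the image scaled by focal length over height, so the real separation of two floor targets is recovered by inverting that scale. The function name and the focal length in pixels (`focal_px`, a camera intrinsic not mentioned in the patent) are assumptions; a real fisheye lens would additionally need distortion correction.

```python
def ground_distance(pixel_dist, height_m, focal_px):
    """Real distance (metres) between two floor targets separated by
    pixel_dist pixels, for a downward-looking pinhole camera mounted
    height_m metres above the floor with focal length focal_px pixels."""
    return pixel_dist * height_m / focal_px

# Camera 2.5 m above the floor, focal length 500 px: a 100 px gap between
# the robot and an obstacle corresponds to half a metre on the ground.
print(ground_distance(100, 2.5, 500))  # 0.5
```

This is why the text says only the height needs to be known once the camera is fixed vertically: the pixel-to-metre scale is `height_m / focal_px` everywhere in the (undistorted) image.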
Alternatively or additionally, the camera 140 can reuse an existing monitoring camera. For example, the monitoring camera 140 may be mounted on a wall, and the first mobile electronic device 100 then obtains the distances between targets using trigonometric functions and the like. To use an existing monitoring camera, its height above the ground and its lens angle must be known, so that the scale of the image can be calculated and accurate distance values obtained.
Alternatively or additionally, if the lens angle of view of a single camera is limited, multiple cameras mounted on the ceiling may be used together to cover a wider region, or even the entire indoor space.
Alternatively or additionally, the first mobile electronic device 100 also includes a charging pile 160, where the charging pile 160 includes the processor 104. For example, the charging pile includes the image processor 1040 and the locating module 1042.
Embodiment two
Fig. 3 A-3D show the schematic diagram of the first mobile electronic device 100 according to one embodiment of the present utility model.Illustrated below so that the first mobile electronic device 100 is robot 300 as an example.
Reference picture 3A-3D, robot 300 is mainly by main body 310, wireless transceiver 320, microprocessor 330, sensor340, and the part of charging pile 350 5 composition.User can be realized by wireless transceiver 320, microprocessor 330 and charging pile 350Fixed point function in certain region, main body 310 is aided in complete the feature operation specified by sensor 340.
Reference picture 3A and Fig. 3 B, main body 310 are the carrier of microprocessor 330 and sensor 340, and microprocessor 330 canTo assist the control unit of main body 310 to perform the function such as mobile.
Reference picture 3C and Fig. 3 D, wireless transceiver 320 can by mobile electronic devices such as mobile phone, computers, send bluetooth,The signal of communication such as WIFI, ZigBee, infrared, ultrasonic wave, row positional information can also be entered with microprocessor 330 and charging pile 350Interactive function, and positional information is sent to microprocessor 330 in a manner of instructing.
Reference picture 3B, microprocessor 330 are located on main body 310, can receive the instruction of the transmission of wireless transceiver 320 simultaneouslyEnter line program processing to the instruction received, the result after processing is transmitted to the control unit of main body 310, perform corresponding appointBusiness.Microprocessor 330 can also launch the signal of communication such as bluetooth, WIFI, ZigBee, infrared, ultrasonic wave to charging pile 350, realStake charge function is now returned automatically.
Reference picture 3A, sensor 340 can detect surrounding environment in real time, realize intelligent barrier avoiding, signal of communication detection and hairThe function of penetrating.
Reference picture 3D, charging pile 350 can receive the signal of the transmitting of wireless transceiver 320, while can also receive micro- placeThe signal of communication that reason device 330 is sent, realizes positional information interactive function, so as to guide 330 control main frame body of microprocessor 310 realNow return the function of stake charging.
Embodiment three
Fig. 4 shows a flowchart of a method in the first mobile electronic device according to one embodiment of the utility model.
Fig. 4 shows a method 400 in a first mobile electronic device, where the first mobile electronic device 100 includes a wireless signal transceiver 102, an image processor 1040, a locating module 1042, and a motion module 106. The method 400 includes: in block 410, obtaining, through the wireless signal transceiver 102 communicatively connected to the camera 140 installed indoors, an image, captured by the camera 140, of the task area including the first mobile electronic device 100; in block 420, identifying, through the image processor 1040 communicatively connected to the wireless signal transceiver 102, the characteristic information of the first mobile electronic device 100 in the image of the task area; in block 430, identifying, through the locating module 1042 communicatively coupled to the image processor 1040, the position and outline of the first mobile electronic device 100 by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device 100 against the coordinate origin, and drawing a map of the task area; in block 440, sending, through the locating module 1042 communicatively coupled to the second mobile electronic device, the map to the second mobile electronic device, the second mobile electronic device receiving a task location set by the user on the map; and in block 450, receiving, through the motion module 106 communicatively coupled to the locating module 1042 and the second mobile electronic device, the task location from the second mobile electronic device, planning a path according to the task location and the identified position and outline of the first mobile electronic device 100, and moving along the path.
Alternatively or additionally, the image of the task area also includes at least one first obstacle, and the method 400 further includes (not shown): identifying, through the image processor 1040, the characteristic information of the at least one first obstacle in the image of the task area; comparing, through the locating module 1042, the characteristic information of the at least one first obstacle against the coordinate origin and identifying the position and outline of the at least one obstacle; and planning, through the motion module 106, a path according to the task location, the located position of the first mobile electronic device 100, and the identified position and outline of the at least one first obstacle, and moving along the path.
Alternatively or additionally, the method 400 further includes (not shown): moving through the motion module 106; capturing in real time, through the camera, images of the task area including the first mobile electronic device 100; and dynamically determining, through the image processor 1040 and the locating module 1042, the characteristic information, position, and outline of the first mobile electronic device 100.
Alternatively or additionally, the method 400 further includes (not shown): assisting the camera in obtaining the position and pose of the first mobile electronic device, through the encoder and inertial measurement module communicatively connected to the processor.
Alternatively or additionally, the camera is mounted vertically on the ceiling of the task area.
Alternatively or additionally, the first mobile electronic device 100 also includes a charging pile 160, where the charging pile 160 includes the image processor 1040 and the locating module 1042.
Alternatively or additionally, the first mobile electronic device 100 may also include a sensor 108, and the method 400 further includes (not shown): sending, through the sensor 108, information about a second obstacle around the first mobile electronic device 100 to the motion module 106; and adjusting, through the motion module 106, the heading of the first mobile electronic device 100 to avoid the second obstacle.
Alternatively or additionally, the sensor includes an ultrasonic sensor and/or a laser sensor.
In the foregoing description, the utility model has been described with reference to specific exemplary embodiments; it should be understood, however, that various modifications and variations may be made without departing from the scope of the utility model described herein. The specification and drawings are to be regarded in an illustrative rather than a restrictive manner, and all such modifications are intended to be included within the scope of the utility model. Accordingly, the scope of the utility model should be determined by the general embodiments described herein and their legal equivalents rather than merely by the specific embodiments described above. For example, the steps described in any method or process embodiment may be performed in any suitable order and are not limited to the explicit order presented in a particular embodiment. In addition, the components and/or elements described in any device embodiment may be assembled in various arrangements or otherwise operatively configured to produce substantially the same results as the utility model, and are therefore not limited to the specific configurations described in the specific embodiments.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments; however, no benefit, advantage, or solution to a problem, nor any element that may cause any particular benefit, advantage, or solution to occur or become more pronounced, is to be construed as a critical, required, or essential feature or component.
As used herein, the terms "comprising", "including", or any variation thereof are intended to refer to non-exclusive inclusion, such that a process, method, article, composition, or apparatus comprising a list of elements includes not only the recited elements but may also include other elements not expressly listed or inherent to such process, method, article, composition, or apparatus. Except where specifically stated otherwise, combinations and/or modifications of the above structures, arrangements, applications, proportions, elements, materials, or components used in the practice of the utility model may be varied or otherwise particularly adapted to specific environments, manufacturing specifications, design parameters, or other operating requirements without departing from its general principles.
Although the utility model has been described herein with reference to certain preferred embodiments, those skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the utility model. Accordingly, the utility model is limited only by the following claims.

Claims (8)

CN201720702958.4U | 2017-06-16 | 2017-06-16 | A kind of mobile electronic device | Expired - Fee Related | CN206833252U (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201720702958.4U | 2017-06-16 | 2017-06-16 | A kind of mobile electronic device

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201720702958.4U | 2017-06-16 | 2017-06-16 | A kind of mobile electronic device

Publications (1)

Publication Number | Publication Date
CN206833252U (en) | 2018-01-02

Family

ID=60776329

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201720702958.4U (Expired - Fee Related, CN206833252U) | A kind of mobile electronic device | 2017-06-16 | 2017-06-16

Country Status (1)

Country | Link
CN (1) | CN206833252U (en)

Cited By (4)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
CN108459595A (en) * | 2017-06-16 | 2018-08-28 | 炬大科技有限公司 | A mobile electronic device and a method in the mobile electronic device
WO2018228258A1 (en) * | 2017-06-16 | 2018-12-20 | 炬大科技有限公司 | Mobile electronic device and method therein
WO2019001237A1 (en) * | 2017-06-30 | 2019-01-03 | 炬大科技有限公司 | Mobile electronic device, and method in mobile electronic device
CN110299030A (en) * | 2019-06-28 | 2019-10-01 | 汉王科技股份有限公司 | Handheld terminal, aircraft and its airspace measurement method, control method

Similar Documents

Publication | Title
CN207488823U (en) | A kind of mobile electronic device
CN112740274B (en) | System and method for VSLAM scale estimation on robotic devices using optical flow sensors
WO2019001237A1 (en) | Mobile electronic device, and method in mobile electronic device
CN207115193U (en) | A mobile electronic device for processing tasks in a mission area
US7865267B2 | Environment recognizing device, environment recognizing method, route planning device, route planning method and robot
CN108459595A (en) | A mobile electronic device and a method in the mobile electronic device
US9020301B2 | Method and system for three dimensional mapping of an environment
CN109074083A (en) | Movement control method, mobile robot and computer storage medium
CN108459597B (en) | Mobile electronic device and method for processing tasks in task area
CN207051978U (en) | A system for determining an indoor task target position by image recognition
CN108073167A (en) | A positioning and navigation method based on a depth camera and laser radar
WO2018228256A1 (en) | System and method for determining indoor task target location by image recognition mode
CN110334701A (en) | Data acquisition method based on deep learning and multi-eye vision in digital twin environment
CN207067803U (en) | A mobile electronic device for processing tasks in a mission area
CN206833252U (en) | A kind of mobile electronic device
CN104535047A (en) | Multi-agent target tracking global positioning system and method based on video stitching
CN207051738U (en) | A kind of mobile electronic device
CN110260866A (en) | A robot localization and obstacle-avoidance method based on a vision sensor
CN108780319A (en) | Software updating method, system, mobile robot and server
CN108459594A (en) | A mobile electronic device and a method in the mobile electronic device
CN112762929B (en) | An intelligent navigation method, device and equipment
CN120339396A (en) | A positioning method and system for a tunnel boring machine
JP2020149186A (en) | Position/orientation estimation device, learning device, mobile robot, position/orientation estimation method, learning method
Jensen et al. | Laser range imaging using mobile robots: From pose estimation to 3D-models
CN212044739U (en) | Positioning device and robot based on inertial data and visual characteristics

Legal Events

Code | Title | Description
GR01 | Patent grant |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20180102; Termination date: 20200616

