CN207067803U - Mobile electronic device for handling a task in a task area - Google Patents

Mobile electronic device for handling a task in a task area
Download PDF

Info

Publication number
CN207067803U
Authority
CN
China
Prior art keywords
electronic device
mobile electronic
picture
processor
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201721069377.8U
Other languages
Chinese (zh)
Inventor
潘景良
陈灼
李腾
陈嘉宏
高鲁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ju Da Technology Co Ltd
Original Assignee
Ju Da Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ju Da Technology Co Ltd
Priority to CN201721069377.8U
Application granted
Publication of CN207067803U
Expired - Fee Related (current legal status)
Anticipated expiration

Abstract

A mobile electronic device for handling a task in a task area includes a first wireless signal transceiver, a processor, a positioning module, a path planning module and a motion module. The first wireless signal transceiver is communicatively coupled to a second mobile electronic device and obtains an instruction from the second mobile electronic device; the instruction contains the name of the target task area to be handled by the mobile electronic device. The processor is communicatively coupled to the first wireless signal transceiver and determines the environment space corresponding to the name of the target task area. The positioning module is communicatively coupled to the processor and records the distance range between the current position of the mobile electronic device and the environment space. The path planning module is communicatively coupled to the processor and generates a path planning scheme according to the name of the task area. The motion module is communicatively coupled to the path planning module and the positioning module and carries out the task according to the path planning scheme and the distance range recorded by the positioning module.

Description

Mobile electronic device for handling a task in a task area
Technical field
The utility model relates to the field of electronic devices, and more specifically to the field of intelligent robot systems.
Background technology
A traditional sweeping robot either localizes itself on a scanned map and moves autonomously, or changes direction on collision and walks randomly, while cleaning the floor. Because its mapping and localization technology is immature or inaccurate, a traditional sweeping robot cannot fully judge complicated floor conditions during operation and easily loses track of its position and heading. In addition, some models have no localization capability at all and can only change direction through the physics of collision, which may damage household articles or the robot itself, or even cause personal injury and disturb the user.
Utility model content
Embodiments of the utility model propose taking pictures with a mobile phone, defining names for the photos, or for target regions selected in them, as picture-subspace names on the phone, picking up the user's voice through the microphone of the APP or of the robot, associating the voice command with a named region by speech recognition, and having the robot complete the task in the region indicated by the command. In embodiments of the utility model a command is sent to the robot by voice or through the APP, and the robot automatically reaches the space named after the defined picture and completes the task there, which makes automatic cleaning by the robot convenient.
According to an embodiment of one aspect of the utility model, a mobile electronic device for handling a task in a task area is provided, including a first wireless signal transceiver, a processor, a positioning module, a path planning module and a motion module. The first wireless signal transceiver is communicatively coupled to a second mobile electronic device and is configured to obtain an instruction from the second mobile electronic device; the instruction contains the name of the target task area to be handled by the mobile electronic device, and the name of the task area is associated with a picture subspace of the picture library in the mobile electronic device. The processor is communicatively coupled to the first wireless signal transceiver and is configured to determine the environment space corresponding to the name of the target task area. The positioning module is communicatively coupled to the processor and is configured to record the distance range between the current position of the mobile electronic device and the environment space. The path planning module is communicatively coupled to the processor and is configured to generate a path planning scheme according to the name of the task area. The motion module is communicatively coupled to the path planning module and the positioning module and is configured to carry out the task according to the path planning scheme and the distance range recorded by the positioning module.
According to an embodiment of one aspect of the utility model, a method for handling a task in a task area in a mobile electronic device is provided, the mobile electronic device including a first wireless signal transceiver, a processor, a positioning module, a path planning module and a motion module. The method includes: obtaining, through the first wireless signal transceiver communicatively coupled to a second mobile electronic device, an instruction from the second mobile electronic device, the instruction containing the name of the target task area to be handled by the mobile electronic device, the name of the target task area being associated with a picture subspace of the picture library in the mobile electronic device; determining, through the processor communicatively coupled to the first wireless signal transceiver, the environment space corresponding to the name of the target task area; recording, through the positioning module communicatively coupled to the processor, the distance range between the current position of the mobile electronic device and the environment space; generating, through the path planning module communicatively coupled to the processor, a path planning scheme according to the name of the task area; and carrying out the task, through the motion module communicatively coupled to the path planning module and the positioning module, according to the path planning scheme and the distance range recorded by the positioning module.
Brief description of the drawings
A more complete understanding of the utility model can be obtained from the detailed description read in conjunction with the accompanying drawings, in which like reference numerals refer to like parts.
Fig. 1 shows a schematic diagram of the system in which a mobile electronic device according to an embodiment of the utility model is located.
Fig. 2 shows a flowchart of a method according to an embodiment of the utility model.
Detailed description of embodiments
Fig. 1 shows a schematic diagram of the system in which a mobile electronic device according to an embodiment of the utility model is located.
Referring to Fig. 1, the mobile electronic device 100 includes, but is not limited to, a sweeping robot, an industrial automation robot, a service robot, a rescue and disaster-relief robot, an underwater robot, a space robot, an unmanned aerial vehicle and the like. It will be appreciated that, to distinguish it from the second mobile electronic device 140 described below, the mobile electronic device 100 is referred to as the first mobile electronic device 100.
The second mobile electronic device 140 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a remote controller and the like. The mobile electronic device optionally includes an operation interface. In an optional embodiment, the second mobile electronic device is a mobile phone and the operation interface is a mobile phone APP.
Forms of signal transmission between the mobile electronic device 100 and the charging pile 180 include, but are not limited to, Bluetooth, WIFI, ZigBee, infrared, ultrasonic and ultra-wide band (Ultra-wide Bandwidth, UWB); in this embodiment, WIFI is taken as the example signal transmission mode.
The task area is the place where the mobile electronic device 100 performs its task. For example, when the task of the mobile electronic device 100 is cleaning the floor, the task area is the region that the sweeping robot needs to clean. As another example, when the task of the mobile electronic device 100 is rescue and disaster relief, the task area is the region where the rescue robot needs to operate. The task site is the place containing the whole task area.
As shown in Fig. 1, the mobile electronic device 100 for handling a task in a task area includes a first wireless signal transceiver 102, a processor 104, a positioning module 106, a path planning module 108 and a motion module 110. The first wireless signal transceiver 102 is communicatively coupled to the second mobile electronic device 140 and is configured to obtain an instruction from the second mobile electronic device 140; the instruction contains the name of the target task area to be handled by the mobile electronic device 100, and the name of the target task area is associated with a picture subspace of the picture library in the mobile electronic device 100.
The second mobile electronic device 140 can be, for example, a mobile phone. The second mobile electronic device 140 includes a second camera 144, a second processor 146 and a second wireless signal transceiver 142. The user of the second mobile electronic device 140 uses the second camera 144 to take multiple pictures of the task area. The second processor 146 is communicatively coupled to the second camera 144 and, according to the user's instruction, defines at least one picture subspace for the captured pictures.
For example, after the mobile phone has taken photos, the second wireless signal transceiver 142 of the second mobile electronic device 140 transmits the photos to the mobile electronic device 100, so that a picture library is formed in the mobile electronic device 100. The picture library can be stored, for example, at the charging pile 180 of the mobile electronic device 100, at a server of the mobile electronic device 100, or in the cloud of the mobile electronic device 100. Within the picture library, the processor 104 or the second processor 146 defines names for different types of picture subspaces according to the user's indication. For example, the processor 104 or the second processor 146 defines six subspaces, such as bedroom, living room, corridor, study, the whole home, and so on. Note that a picture can belong to several picture subspaces at the same time. For example, the user can include a picture of the living room both in the picture subspace named 'living room' and in the picture subspace named 'whole home'.
In addition, the processor 104, more specifically the image processor 1040 in the processor 104, establishes a coordinate system for each image in the picture library and assigns a corresponding coordinate value to each point of the task area, thereby building the environment space map. The coordinate system can, for example, use the charging pile 180 as the coordinate origin.
The second wireless signal transceiver 142 is communicatively coupled to the second processor 146 and is configured to send the name of at least one picture subspace, as the name of the target task area to be handled, to the mobile electronic device 100.
Optionally, the processor 104 of the mobile electronic device 100 or the second processor 146 is further configured to subdivide the name of the at least one picture subspace according to the user's instruction. For example, the user can also circle a selection on a captured picture and store it in the picture library under a space name, thereby further subdividing the picture subspace. For example, according to the user's circling and text input, the processor 104 or the second processor 146 can further define picture names such as 'bedroom bedside', 'living room tea table' or 'living room dining table', and store them in storage accessible to the processor 104, for example in the memory of the charging pile 180, in a server, or in the cloud.
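For illustration only (this sketch is not part of the utility model's text), the picture library with named, possibly overlapping subspaces described above could be organized as a simple mapping from subspace names to pictures; all identifiers below are hypothetical.
```python
from dataclasses import dataclass, field

@dataclass
class Picture:
    path: str                   # image file taken by the phone camera
    world_region: tuple = None  # optional (x_min, y_min, x_max, y_max) in map
                                # coordinates, filled in once matched against the map

@dataclass
class PictureLibrary:
    subspaces: dict = field(default_factory=dict)  # subspace name -> list of Picture

    def add(self, name: str, picture: Picture):
        """A picture may be filed under several subspace names at once."""
        self.subspaces.setdefault(name, []).append(picture)

    def lookup(self, name: str):
        return self.subspaces.get(name, [])

library = PictureLibrary()
photo = Picture("living_room_1.jpg")
library.add("living room", photo)   # named subspace
library.add("whole home", photo)    # the same picture can belong to both
```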
Further, suppose the user of the second mobile electronic device 140 wants the mobile electronic device 100 to clean the living room. The user of the second mobile electronic device 140 therefore sends the instruction 'living room' to the mobile electronic device 100. The user can issue the instruction by voice, or can type the word 'living room' in the APP to express the instruction.
The first wireless signal transceiver 102 obtains the instruction from the second mobile electronic device 140. The instruction contains the name of the target task area to be handled by the mobile electronic device 100; for example, the task area that the user wants cleaned is the living room, and the name 'living room' of this task area is associated with a picture subspace in the picture library of the mobile electronic device 100, namely the picture subspace named 'living room'.
The processor 104 is communicatively coupled to the first wireless signal transceiver 102 and is configured to determine the environment space corresponding to the name of the target task area. The environment space, that is, the environment space map, can be established by the mobile electronic device 100 when the mobile electronic device 100 is used for the first time, for example in any of the following ways.
The various ways in which the mobile electronic device 100 establishes the indoor environment map on first use are described separately below.
Mode one: the mobile electronic device 100 (for example, a robot) includes a camera, and the user of the second mobile electronic device 140 wears a positioning receiver.
The mobile electronic device 100 further includes a first camera 112, the second mobile electronic device 140 further includes the second wireless signal transceiver 142, and the mobile electronic device 100 is configured to work in a map-building mode. The first wireless signal transceiver 102 and the second wireless signal transceiver 142 are each communicatively coupled to multiple reference radio signal sources and are configured to determine the positions of the mobile electronic device 100 and the second mobile electronic device 140 according to the signal strengths obtained from the multiple reference radio signal sources. For example, the signals received from the reference radio signal sources can be converted into distance information by any method known in the art, including but not limited to the Time of Flight (ToF), Angle of Arrival (AoA), Time Difference of Arrival (TDOA) and Received Signal Strength (RSS) algorithms.
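As a minimal, assumed illustration of the RSS-based option named above (the utility model does not prescribe an implementation), the sketch below converts received signal strength into distance with a log-distance path-loss model and estimates a 2D position from three or more reference radio signal sources by linear least squares; the path-loss constants are assumptions.
```python
import numpy as np

def rss_to_distance(rss_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Log-distance path-loss model: rss = tx_power - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rss_dbm) / (10.0 * path_loss_exp))

def multilaterate(anchors, distances):
    """Least-squares 2D position from three or more reference sources.

    Subtracting the first range equation from the others linearizes
    ||p - a_i||^2 = d_i^2 into A p = b.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = anchors[0]
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos  # (x, y) in the map frame, e.g. with the charging pile at the origin

# Example: three UWB/WIFI reference sources at known indoor positions.
anchors = [(0.0, 0.0), (5.0, 0.0), (0.0, 4.0)]
rss = [-52.0, -60.0, -58.0]
print(multilaterate(anchors, [rss_to_distance(r) for r in rss]))
```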
The motion module 110 is configured to follow the motion of the second mobile electronic device 140 according to the positions of the mobile electronic device 100 and the second mobile electronic device 140. For example, the mobile electronic device 100 includes a monocular camera 112, and the user of the second mobile electronic device 140 wears a wireless positioning receiver bracelet, or carries a mobile phone equipped with a wireless positioning receiver peripheral. Using the monocular camera 112 reduces hardware and computation cost while achieving the same effect as a depth camera; image depth information is not required, since depth and distance are perceived by the ultrasonic sensor and the laser sensor. In this embodiment a monocular camera is taken as the example; those skilled in the art will understand that a depth camera or the like can also serve as the camera of the mobile electronic device 100. The mobile electronic device 100 follows the user by means of its own wireless positioning receiver. For example, on first use, the user of the second mobile electronic device 140 interacts with the mobile electronic device 100 through the mobile phone APP to complete indoor map building. A group of wireless signal transmitters (for example, UWB) placed at fixed indoor positions serves as reference points; the mobile phone APP of the second mobile electronic device 140 and the wireless signal module in the mobile electronic device 100 read the signal strength (RSS) from each signal source to determine the indoor positions of the user of the second mobile electronic device 140 and of the mobile electronic device 100. The motion module 110 of the mobile electronic device 100 then completes user following according to the real-time position information (of the mobile phone and of the robot) sent by the intelligent charging pile.
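A minimal sketch of one possible following behaviour (an assumption, not the utility model's own control law): steer toward the reported phone position and advance only while farther away than the set following distance.
```python
import math

def follow_step(robot_xy, robot_heading, user_xy,
                follow_distance=1.0, k_turn=1.5, max_speed=0.4):
    """One control step of user following.

    Returns (forward_speed, turn_rate) from the two positions reported by the
    positioning system (e.g. UWB/RSS), both expressed in the map frame.
    """
    dx = user_xy[0] - robot_xy[0]
    dy = user_xy[1] - robot_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # Heading error wrapped to [-pi, pi]
    error = (bearing - robot_heading + math.pi) % (2 * math.pi) - math.pi
    turn_rate = k_turn * error
    forward = max_speed if distance > follow_distance else 0.0
    return forward, turn_rate

print(follow_step((0.0, 0.0), 0.0, (2.0, 1.0)))
```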
The first camera 112 is configured to shoot multiple images while the motion module 110 moves; these images contain feature information and the corresponding shooting position information. For example, map building is completed through the robot's monocular camera during following. While following, the mobile electronic device 100 uses the first camera 112, for example a monocular camera, to photograph the whole indoor layout, and sends the images, which contain a large number of features and their corresponding shooting positions, together with the following path coordinates of the mobile electronic device 100, in real time over a local wireless communication network (WIFI, Bluetooth, ZigBee, etc.) to the memory 116. In Fig. 1 the memory 116 is shown as included in the mobile electronic device 100; alternatively, the memory 116 can be included in the intelligent charging pile 180, that is, in the cloud.
The image processor 1040 is communicatively coupled to the first camera 112 and is configured to stitch the multiple images, extract the feature information and shooting position information in them, and generate an image map. For example, according to the height and the intrinsic and extrinsic parameters of the first camera 112 of the mobile electronic device 100, the image processor 1040 in the processor 104 performs map stitching on the large number of images captured by the first camera 112, feature selection and extraction (for example SIFT or SURF algorithms) and addition of feature point position information, thereby generating indoor image map information containing a large number of image feature points; the processed image map information is then stored in the memory 116. The intrinsic parameters of the camera are parameters related to the camera's own characteristics, such as the lens focal length and pixel size; the extrinsic parameters are the camera's parameters in the world coordinate system (the actual coordinate system of the room containing the charging pile), such as the camera's position, rotation direction and angle. A photo taken by the camera has its own camera coordinate system, so the intrinsic and extrinsic parameters are needed to convert between coordinate systems.
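A minimal sketch of the feature-extraction step described above, using OpenCV's SIFT implementation (an assumed choice; the utility model names SIFT/SURF but no particular library). Each captured frame is reduced to keypoints and descriptors tagged with the shooting position, which is the raw material for map stitching and later matching.
```python
import cv2  # requires opencv-python >= 4.4 for cv2.SIFT_create

def extract_map_entry(image_path, shooting_pose):
    """Return one image-map entry: SIFT features plus the shooting position.

    shooting_pose is assumed to be (x, y, heading) in the charging-pile frame,
    as reported by the positioning module while following the user.
    """
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    return {
        "pose": shooting_pose,
        "keypoints": [kp.pt for kp in keypoints],  # pixel coordinates
        "descriptors": descriptors,                # N x 128 float32 array
    }

# The image map would be accumulated while the robot follows the user, then
# stored in memory 116 (locally, at the charging pile, or in the cloud), e.g.:
# image_map = [extract_map_entry("frame_0001.jpg", (0.5, 1.2, 0.0)), ...]
```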
Mode two: the mobile electronic device 100 (robot) includes a camera and can display a black-and-white camera-calibration checkerboard; the user of the second mobile electronic device 140 does not need to wear a positioning receiver.
Optionally, in another embodiment, the mobile electronic device 100 further includes a display screen 118 and is configured to work in the map-building mode, and the second mobile electronic device 140 includes the second camera 144. The first wireless signal transceiver 102 is communicatively coupled to multiple reference radio signal sources and is configured to determine the position of the mobile electronic device 100 according to the signal strengths obtained from the multiple reference radio signal sources.
The first camera 112 is configured to detect the position of the second mobile electronic device 140. Optionally, the mobile electronic device 100 also includes an ultrasonic sensor and a laser sensor, which can detect the distance between the mobile electronic device 100 and the second mobile electronic device 140.
The motion module 110 is configured to follow the motion of the second mobile electronic device 140 according to the positions of the mobile electronic device 100 and the second mobile electronic device 140. For example, on first use, the user of the second mobile electronic device 140 interacts with the mobile electronic device 100 through the mobile phone APP to complete indoor map building. A group of wireless signal transmitters (UWB, etc.) placed at fixed indoor positions serves as reference points, and the first wireless signal transceiver 102 in the mobile electronic device 100 reads the signal strength (RSS) from each signal source to determine the indoor position of the mobile electronic device 100. Target positioning and following of the user of the second mobile electronic device 140 are achieved through the first camera 112 of the mobile electronic device 100, for example a monocular camera, together with the ultrasonic sensor and the laser sensor 114. For example, the user of the second mobile electronic device 140 can set the following distance through the mobile phone APP, so that the mobile electronic device 100 adjusts its distance and angle to the second mobile electronic device 140 according to the set following distance and the angle to the second mobile electronic device 140 measured in real time. During following, the mobile electronic device 100 sends the following path coordinates to the intelligent charging pile 180 in real time.
In addition, the display screen 118 of the mobile electronic device 100 is configured to display, for example, a black-and-white checkerboard. The image processor 1040 in the processor 104 is communicatively coupled to the second camera 144 and is configured to receive, from the second camera 144, the multiple images shot while the motion module 110 moves. For example, the image processor 1040 in the processor 104 can receive the multiple images captured by the second camera 144 through the first wireless signal transceiver 102 and the second wireless signal transceiver 142. The multiple images include images of the display screen 118 of the mobile electronic device 100 showing the black-and-white checkerboard. The image processor 1040 in the processor 104 is further configured to stitch the multiple images, extract the feature information and shooting position information in them, and generate the image map. In this mode the user of the second mobile electronic device 140 does not wear a positioning receiver, so the extrinsic parameters of the camera of the second mobile device 140, for example the mobile phone camera, need to be obtained by camera calibration with a calibration image. The calibration image is a checkerboard pattern of alternating black and white rectangles.
For example, the mobile electronic device 100, that is, the robot, includes the first camera 112, for example a monocular camera, and the display screen 118 that can display the black-and-white camera-calibration checkerboard. The user neither wears a wireless positioning receiver bracelet nor carries a mobile phone with a wireless positioning receiver peripheral; the mobile electronic device 100 follows the user by vision, and the user of the second mobile electronic device 140 completes map building by taking pictures with the mobile phone APP. For example, on reaching each room, the user of the second mobile electronic device 140 starts the room map-building function in the mobile phone APP, and the LCD screen 118 of the mobile electronic device 100 then displays the classic black-and-white checkerboard used for camera calibration. At the same time, the mobile electronic device 100 sends its current coordinates and orientation information to the positioning module 106. The user of the second mobile electronic device 140 then photographs the room environment with the mobile phone APP; the user takes multiple pictures according to the room layout (each photo must capture the black-and-white checkerboard on the robot's LCD screen) and sends the captured images, which contain the room environment and the mobile electronic device 100, for example the robot 100, to the memory 116 through the local wireless communication network (WIFI, Bluetooth, ZigBee, etc.). According to the position and orientation of the mobile electronic device 100, for example the robot, at that time, and the height and the intrinsic and extrinsic parameters of the camera 112, the image processor 1040 in the processor 104 performs map stitching on the large number of images taken by the user of the second mobile electronic device 140, feature selection and extraction, and addition of feature point position information, generates the indoor image feature point map information, and stores the processed image map information in the memory 116.
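A minimal sketch of the checkerboard calibration step (library choice and board size are assumptions): the corners of the displayed black-and-white checkerboard are detected in each phone photo, and the phone camera's intrinsic and extrinsic parameters are then estimated.
```python
import cv2
import numpy as np

def calibrate_from_checkerboard(image_paths, board_size=(9, 6), square_mm=25.0):
    """Estimate phone-camera intrinsics/extrinsics from photos of the robot's
    on-screen checkerboard. board_size counts inner corners (cols, rows)."""
    # 3D corner positions in the checkerboard's own plane (Z = 0).
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_mm

    obj_points, img_points, image_size = [], [], None
    for path in image_paths:
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if gray is None:
            continue
        found, corners = cv2.findChessboardCorners(gray, board_size, None)
        if not found:
            continue
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

    # rvecs/tvecs are the per-photo extrinsics relative to the checkerboard,
    # i.e. relative to the robot's known pose when the screen was photographed.
    rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    return camera_matrix, dist_coeffs, rvecs, tvecs
```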
Mode three: the mobile electronic device 100 (robot) does not include a camera, and the user of the second mobile electronic device 140 wears a positioning receiver.
Optionally, in another embodiment, the second mobile electronic device 140 further includes the second wireless signal transceiver 142 and the second camera 144. The second wireless signal transceiver 142 is communicatively coupled to multiple reference radio signal sources and is configured to determine the position of the second mobile electronic device 140 according to the signal strengths obtained from the multiple reference radio signal sources. The second camera 144 is configured to shoot multiple images of the task site. The image processor 1040 in the processor 104 is communicatively coupled to the second camera 144 and is configured to stitch the multiple images, extract the feature information and shooting position information in them, and generate the image map.
For example, in this embodiment the mobile electronic device 100, for example the robot, does not include a monocular camera and does not follow the user of the second mobile electronic device 140. The user of the second mobile electronic device 140 wears a wireless positioning receiver bracelet, or carries a mobile phone equipped with a wireless positioning receiver peripheral, and completes indoor map building with the mobile phone APP. For example, on first use the user of the second mobile electronic device 140 establishes the indoor map through the mobile phone APP and the wireless positioning receiver bracelet worn by the user or the wireless positioning receiver peripheral of the mobile phone. Reference radio signal sources (UWB, etc.) placed at fixed indoor positions serve as reference points, and the wireless signal transceiver 142 in the second mobile electronic device 140 reads the signal strength (Received Signal Strength, RSS) from each reference radio signal source to determine the indoor position of the user of the second mobile electronic device 140. On reaching each room, the user of the second mobile electronic device 140 starts the room map-building function through the mobile phone APP and photographs the room environment with the APP; multiple pictures can be taken according to the room layout. The mobile phone APP of the second mobile electronic device 140 records, for each shot, the attitude information of the second camera 144 and the information recorded by the second wireless signal transceiver 142 about the second mobile electronic device 140, for example the height of the mobile phone above the ground and its indoor position, and sends them to the memory 116 over the local wireless communication network (WIFI, Bluetooth, ZigBee, etc.). According to the intrinsic and extrinsic parameter information of the second camera 144 and the attitude, height and position information at the time of shooting, the image processor in the processor 104 performs map stitching on the large number of captured images, feature selection and extraction, and addition of feature point position information, generates the indoor image feature point map information, and stores the processed image map information in the memory 116.
The image processor 1040 in the processor 104 is communicatively coupled to the first wireless signal transceiver 102 and is configured to extract the feature information of a photo containing a selected area and, by comparing the extracted feature information with the stored image map feature information that contains position information, determine the actual coordinate range corresponding to the selected area in the photo. The position information refers to the location information of the image feature points recorded while the map was being built, that is, their real coordinate positions in the image map. The position information includes, for example, the position of the charging pile 180 and/or the position of the mobile electronic device 100 itself. For example, the image processor 1040 in the processor 104 can take the position of the charging pile 180 as the coordinate origin.
Mode four: the user can arrange at least one camera indoors, for example on the ceiling, and multiple pictures including the mobile electronic device 100 are collected by this at least one camera. The at least one camera transmits the picture information to the image processor 1040 of the mobile electronic device 100 via the first wireless signal transceiver 102 of the mobile electronic device 100. The image processor 1040 then identifies the feature information of the mobile electronic device 100 in the images of the task area, establishes a coordinate system for the images, and assigns a corresponding coordinate value to each point of the task area, thereby building the environment space map.
Mode five: the mobile electronic device 100 uses the first camera 112, for example a depth camera, to collect planar graphic information and the distance information of objects in the image while the mobile electronic device 100 moves, and sends multiple pieces of three-dimensional information containing the planar graphic information and the distance information to the image processor 1040. The image processor 1040 is communicatively coupled to the first wireless signal transceiver 102 and is configured to process the received three-dimensional information. A mapping module communicatively coupled to the image processor 1040 then obtains the environment space map of the task area from the three-dimensional information processed by the image processor 1040, by drawing a three-dimensional image of the task area.
The processor 104 in the mobile electronic device 100 is communicatively coupled to the first wireless signal transceiver 102 and is configured to determine the environment space corresponding to the name of the target task area. For example, the mobile electronic device 100 can first determine the picture subspace corresponding to the name of the target task area, and then, from the picture subspace, determine the environment space corresponding to that picture subspace.
For example, the memory 116 of the mobile electronic device 100 stores the environment map established while building the indoor environment map on first use, for example the indoor image map information, including the image feature points and their position information. In addition, the memory 116 of the mobile electronic device 100 also stores the correspondence between the name of each picture subspace and at least one picture representative of that subspace. For example, a representative picture of the living room can be stored in the memory 116 and named 'living room'. The description below uses the representative picture of the living room as an example; those skilled in the art will understand that the embodiment also applies to other kinds of rooms.
The processor 104 first determines, by techniques such as speech recognition, the picture subspace, for example the representative picture, corresponding to the received instruction 'living room'. For example, the processor 104 searches the names in the picture library stored in the mobile electronic device 100 and finds the representative picture named 'living room'.
The processor 104 includes the image processor 1040. The image processor 1040 then extracts the feature information and position information in, for example, the representative picture of the living room, and further uses an image feature point matching algorithm (such as SIFT or SURF) to quickly compare it with and analyse it against the indoor environment map (containing position information) in the memory 116. The image feature points can be identified with the Scale Invariant Feature Transform (SIFT) algorithm or the Speeded Up Robust Features (SURF) algorithm. Using the SIFT algorithm requires reference pictures to be stored in the memory 116. The image processor 1040 first identifies the key points of the objects in the reference pictures stored in the memory 116 and extracts their SIFT features, then compares the SIFT features of each key point in the memory 116 with the SIFT features of the newly collected image, and identifies the objects in the new image by K-Nearest Neighbor (KNN) feature matching. The SURF algorithm is based on approximate 2D Haar wavelet responses, uses integral images for image convolution, uses a Hessian matrix-based measure for the detector, and uses a distribution-based descriptor.
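Continuing the hypothetical SIFT sketch above, the KNN matching with a ratio test mentioned here could look as follows (OpenCV is an assumed choice):
```python
import cv2

def match_room_picture(room_descriptors, map_descriptors, ratio=0.75):
    """KNN matching (k=2) with Lowe's ratio test between the SIFT descriptors
    of a named room picture and those of one stored image-map entry.

    Returns the list of good cv2.DMatch objects; a large count suggests the
    map entry shows the same room as the picture subspace.
    """
    matcher = cv2.BFMatcher(cv2.NORM_L2)  # L2 norm suits SIFT descriptors
    knn_matches = matcher.knnMatch(room_descriptors, map_descriptors, k=2)
    good = []
    for pair in knn_matches:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return good
```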
Alternatively or additionally, the coordinate range of the indoor actual area corresponding to the representative picture of the living room, that is, the actual coordinate range of the task area, can be determined by coordinate mapping and conversion. By matching the feature points in the representative living-room picture stored in the mobile electronic device 100 with the image feature points in the image map, the real coordinate positions of the feature points in the picture can be determined. At the same time, the matching yields the transformation between the camera coordinate system of the representative living-room picture and the real-world coordinate system in which the charging pile is located. For example, the representative picture of the living room in the picture library stored in the memory of the mobile electronic device 100 contains feature points of a sofa, a tea table and a TV cabinet, together with the respective coordinate ranges of these pieces of furniture; the environment map stored in the memory of the mobile electronic device 100 also contains the sofa, tea table and TV cabinet of the living room. The image processor 1040 compares the living-room picture in the picture library with the environment map, extracts the feature information, compares the respective coordinate values and performs the coordinate conversion, thereby obtaining the actual world coordinate range of the living room to be cleaned.
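One possible way to realise this coordinate conversion (an assumption, not a formula given in the utility model) is to estimate a homography from the good matches and map the picture's corners, or a circled region, into image-map coordinates:
```python
import cv2
import numpy as np

def picture_region_to_map(good_matches, room_keypoints, map_keypoints, region_px):
    """Map a region selected in the room picture into image-map coordinates.

    good_matches   : cv2.DMatch list from the ratio test above (needs >= 4 matches)
    room_keypoints : pixel coordinates of keypoints in the room picture
    map_keypoints  : coordinates of the matched keypoints in the image map
                     (already tied to real positions when the map was built)
    region_px      : list of (x, y) corners of the selected region in the picture
    """
    src = np.float32([room_keypoints[m.queryIdx] for m in good_matches]).reshape(-1, 1, 2)
    dst = np.float32([map_keypoints[m.trainIdx] for m in good_matches]).reshape(-1, 1, 2)
    H, _mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    corners = np.float32(region_px).reshape(-1, 1, 2)
    mapped = cv2.perspectiveTransform(corners, H).reshape(-1, 2)
    # Axis-aligned bounding box of the region in map coordinates.
    (x_min, y_min), (x_max, y_max) = mapped.min(axis=0), mapped.max(axis=0)
    return float(x_min), float(y_min), float(x_max), float(y_max)
```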
The positioning module 106 is communicatively coupled to the processor 104 and is configured to record the distance range between the current position of the mobile electronic device 100 and the environment space, for example the living room. For example, the positioning module 106 takes the location of the charging pile 180 as the coordinate origin, with each point in the image having a corresponding coordinate value (X, Y). The positioning module 106 and the encoder allow the mobile electronic device 100 to know its own current position. The positioning module 106 is the module that computes the indoor position of the mobile electronic device 100; the mobile electronic device 100 needs to know its indoor position at all times while working, and this is achieved through the positioning module 106.
The path planning module 108 is communicatively coupled to the processor 104 and is configured to generate a path planning scheme according to the name of the task area. Optionally, the path planning module 108 is further used to plan a path over the selected area with a grid-based spanning tree path planning algorithm. For example, the path planning module 108 uses grid-based spanning tree path planning to plan the cleaning path for the selected target cleaning area: the corresponding coordinate region is divided into a grid, tree nodes and a spanning tree are built over the grid, and the Hamiltonian circuit surrounding the spanning tree is then used as the optimized cleaning path covering the region.
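A much-simplified sketch of the grid-based spanning-tree idea (for illustration only): the selected region is rasterised into grid cells, a spanning tree is built by depth-first search, and the cells are visited by walking the tree. The full algorithm referenced above instead circumnavigates the spanning tree to obtain a Hamiltonian circuit over sub-cells without backtracking; this sketch shows only the tree construction and a tree-following visit order.
```python
def spanning_tree(cells, start):
    """Depth-first spanning tree over free grid cells; returns child lists."""
    children = {c: [] for c in cells}
    seen, stack = {start}, [start]
    while stack:
        c, r = stack.pop()
        for n in ((c + 1, r), (c - 1, r), (c, r + 1), (c, r - 1)):
            if n in cells and n not in seen:
                seen.add(n)
                children[(c, r)].append(n)
                stack.append(n)
    return children

def coverage_order(children, node):
    """Visit every cell by walking the tree; returns the cell sequence, with
    backtracking steps included (the full spanning-tree coverage algorithm
    avoids these by circumnavigating the tree instead)."""
    order = [node]
    for child in children[node]:
        order += coverage_order(children, child)
        order.append(node)  # return along the tree edge
    return order

# Example: a 3x2 block of cells inside the living-room coordinate range.
room = {(c, r) for c in range(3) for r in range(2)}
tree = spanning_tree(room, start=(0, 0))
print(coverage_order(tree, (0, 0)))
```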
In addition, the mobile electronic device 100 is initially located at the intelligent charging pile 180. As for how the mobile electronic device 100 reaches the coordinate range of the selected area from the intelligent charging pile 180, the path planning module 108 first reads, as the path to the area, either the path along which the mobile electronic device 100 followed the user to reach that area on first use (if the mobile electronic device 100 used the following mode), or the walking path of the user of the second mobile electronic device 140 during map building (if, on first use, the mobile electronic device 100 did not follow the user), and then combines this path with the optimized cleaning path of the selected area into a cleaning task path. The combination can be a simple sequential connection of the two path segments: the first segment reaches the target cleaning area, and the second segment optimally covers the delineated cleaning area, completing the cleaning task.
The above task is then sent to the mobile electronic device 100 to be performed automatically. For example, the motion module 110 is communicatively coupled to the path planning module 108 and is configured to move according to the path planning scheme.
Optionally, the mobile electronic device 100 further includes the first camera 112 and the memory 116, and shoots pictures of the target task area while carrying out the task. The first wireless signal transceiver 102 is further communicatively coupled to the first camera 112, obtains the pictures of the task area shot by the first camera 112, and stores the pictures in the memory 116 in correspondence with the picture subspace. For example, after the first camera 112 of the mobile electronic device 100 has taken an image of the living room, the image is stored in the memory 116, for example in the picture subspace named 'living room'. As another example, while cleaning the bedroom the mobile electronic device 100 photographs the bedroom and stores the current bedroom layout under the bedroom subspace name, so that pictures are added to the corresponding picture library of the mobile electronic device 100 by way of self-learning.
Alternatively or additionally, the mobile electronic device 100, for example the robot 100, further includes an encoder and an inertial measurement unit (IMU) to assist the first camera 112 in obtaining the position and attitude of the mobile electronic device 100, for example of the robot. For example, when the robot is occluded and not in the line of sight of the first camera 112, the encoder and the IMU can still provide the robot's position and attitude. The encoder can, for example, serve as an odometer: by recording the rotation of the robot's wheels, it computes the trajectory the robot has travelled.
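A minimal dead-reckoning sketch of the encoder-as-odometer idea for a differential-drive robot (wheel radius, track width and tick counts are assumptions):
```python
import math

def odometry_step(x, y, heading, left_ticks, right_ticks,
                  ticks_per_rev=360, wheel_radius=0.035, wheel_base=0.23):
    """Update the robot pose from one encoder reading (differential drive)."""
    left = 2 * math.pi * wheel_radius * left_ticks / ticks_per_rev
    right = 2 * math.pi * wheel_radius * right_ticks / ticks_per_rev
    forward = (left + right) / 2.0          # distance travelled by the robot centre
    rotation = (right - left) / wheel_base  # change of heading in radians
    x += forward * math.cos(heading + rotation / 2.0)
    y += forward * math.sin(heading + rotation / 2.0)
    heading += rotation
    return x, y, heading

pose = (0.0, 0.0, 0.0)                        # start at the charging pile, facing +x
for ticks in [(40, 40), (40, 50), (40, 50)]:  # encoder ticks per control period
    pose = odometry_step(*pose, *ticks)
print(pose)
```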
Alternatively or additionally, the mobile electronic device 100 can further include a sensor 114, and the sensor 114 sends information about obstacles around the mobile electronic device 100 to the motion module 110. The motion module 110 is further configured to adjust the heading of the mobile electronic device 100 to avoid obstacles. It will be appreciated that, because of their different mounting heights, the first camera 112 on the mobile electronic device 100 and the sensor 114 on the mobile electronic device 100 sit at different heights, so the obstacle information captured by the first camera 112 may differ from that captured by the sensor because of occlusion. The first camera 112 can change its viewing direction by rotating, pitching and the like to obtain a wider visual range. In addition, the sensor 114 may be mounted at a relatively low position that is likely to be a blind zone of the first camera 112; if an object does not appear in the field of view of the first camera 112, obstacle avoidance must then rely on these conventional sensors 114. Optionally, the camera 112 can obtain obstacle information and combine it with the information from the ultrasonic and laser sensors 114: the image obtained by the monocular camera 112 is used for object recognition, and the ultrasonic and laser sensors 114 are used for ranging.
Alternatively or additionally, the sensor 114 includes an ultrasonic sensor and/or a laser sensor. The first camera 112 and the sensor 114 can assist each other. For example, where there is occlusion, the mobile electronic device 100 needs to rely on its own laser sensor, ultrasonic sensor 114 and the like for obstacle avoidance in the occluded part.
For example, the laser sensor and ultrasonic sensor carried by the mobile electronic device 100 detect the static and dynamic environment around the mobile electronic device 100, assist in avoiding static and dynamic obstacles, and help adjust the optimal path.
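A minimal sketch of how range readings from the ultrasonic/laser sensor 114 could be turned into a heading adjustment for the motion module 110 (assumed behaviour, not the utility model's own avoidance logic):
```python
def avoidance_adjustment(ranges_m, safe_distance=0.35, turn_rate=0.8):
    """ranges_m: dict of distances (metres) from the ultrasonic/laser sensors,
    keyed by direction. Returns (forward_scale, turn_command)."""
    left = ranges_m.get("left", float("inf"))
    front = ranges_m.get("front", float("inf"))
    right = ranges_m.get("right", float("inf"))
    if front < safe_distance:
        # Obstacle ahead: stop advancing and turn toward the freer side.
        return 0.0, turn_rate if left > right else -turn_rate
    if min(left, right) < safe_distance:
        # Obstacle beside: keep moving but steer away from it.
        return 1.0, -turn_rate if left < right else turn_rate
    return 1.0, 0.0  # path clear, no adjustment

print(avoidance_adjustment({"front": 0.2, "left": 1.0, "right": 0.4}))
```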
Fig. 2 shows a flowchart of a method 200 in a mobile electronic device according to an embodiment of the utility model. The method 200 is used for handling a task in a task area in the mobile electronic device.
Fig. 2 shows the method 200 for handling a task in a task area in a mobile electronic device. The mobile electronic device includes a first wireless signal transceiver, a processor, a positioning module, a path planning module and a motion module. The method 200 includes: in block 210, obtaining, through the first wireless signal transceiver communicatively coupled to a second mobile electronic device, an instruction from the second mobile electronic device, the instruction containing the name of the target task area to be handled by the mobile electronic device, the name of the target task area being associated with a picture subspace of the picture library in the mobile electronic device; in block 220, determining, through the processor communicatively coupled to the first wireless signal transceiver, the environment space corresponding to the name of the target task area; in block 230, recording, through the positioning module communicatively coupled to the processor, the distance range between the current position of the mobile electronic device and the environment space; in block 240, generating, through the path planning module communicatively coupled to the processor, a path planning scheme according to the name of the task area; and in block 250, carrying out the task, through the motion module communicatively coupled to the path planning module and the positioning module, according to the path planning scheme and the distance range recorded by the positioning module.
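Read purely as pseudocode, the five blocks of method 200 chain together roughly as follows; the module interfaces are hypothetical placeholders, not an API defined by the utility model.
```python
class _Stub:
    """Placeholder for the transceiver 102, processor 104, positioning module 106,
    path planning module 108 and motion module 110; their real interfaces are not
    defined by the utility model."""
    def receive(self):
        return {"target_area_name": "living room"}           # block 210
    def resolve_environment(self, name):
        return {"name": name, "bbox": (0.0, 0.0, 4.0, 3.0)}  # block 220
    def range_to(self, environment):
        return 2.5                                            # block 230 (metres)
    def plan(self, name, environment):
        return ["approach " + name, "cover region"]           # block 240
    def execute(self, plan, distance_range):
        print("executing", plan, "distance range", distance_range)  # block 250

def handle_task(transceiver, processor, positioning, planner, motion):
    """Skeleton of method 200: blocks 210 to 250 executed in sequence."""
    instruction = transceiver.receive()
    area_name = instruction["target_area_name"]
    environment = processor.resolve_environment(area_name)
    distance_range = positioning.range_to(environment)
    plan = planner.plan(area_name, environment)
    motion.execute(plan, distance_range)

handle_task(_Stub(), _Stub(), _Stub(), _Stub(), _Stub())
```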
Alternatively or additionally, the method 200 further includes determining the picture subspace corresponding to the name of the target task area, and, from the picture subspace, determining the environment space corresponding to the picture subspace.
Alternatively or additionally, the mobile electronic device further includes a camera and a memory, and the method 200 further includes shooting pictures of the target task area while carrying out the task, and, through the first wireless signal transceiver communicatively coupled to the camera, obtaining the pictures of the task area shot by the camera and storing the pictures in the memory in correspondence with the picture subspace.
Alternatively or additionally, the mobile electronic device further includes an encoder and an inertial measurement unit communicatively coupled to the processor, and the method 200 further includes assisting the camera, by means of the encoder and the inertial measurement unit, in obtaining the position and attitude of the mobile electronic device.
Alternatively or additionally, the mobile electronic device further includes a charging pile, wherein the charging pile includes the processor, the path planning module and the positioning module.
Alternatively or additionally, the mobile electronic device can further include a sensor, and the method 200 further includes sending, through the sensor, information about obstacles around the mobile electronic device to the motion module, and adjusting, through the motion module, the heading of the mobile electronic device to avoid the obstacles.
Alternatively or additionally, the sensor includes an ultrasonic sensor and/or a laser sensor.
In the description above, the utility model has been described with reference to specific exemplary embodiments; however, it should be understood that various modifications and variations can be made without departing from the scope of the utility model set forth herein. The specification and drawings are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the utility model. Accordingly, the scope of the utility model should be determined by the general embodiments described herein and their legal equivalents rather than only by the specific embodiments described above. For example, the steps of any method or process embodiment may be performed in any order and are not limited to the explicit order presented in a particular embodiment. In addition, the components and/or elements of any device embodiment may be assembled or otherwise operatively configured in various arrangements to produce substantially the same result as the utility model and are therefore not limited to the specific configuration of a particular embodiment.
Benefits, other advantages and solutions to problems have been described above with regard to specific embodiments; however, no benefit, advantage or solution to a problem, and no element that may cause any particular benefit, advantage or solution to occur or become more pronounced, is to be construed as a critical, required or essential feature or component.
As used herein, the terms 'comprise', 'include' or any variation thereof are intended to refer to a non-exclusive inclusion, such that a process, method, article, composition or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, composition or apparatus. Except where specifically stated otherwise, the structures, arrangements, applications, proportions, elements, materials or components used in the practice of the utility model, and other combinations and/or modifications thereof, may be varied or otherwise particularly adapted to specific environments, manufacturing specifications, design parameters or other operating requirements without departing from its general principles.
Although the utility model has been described herein with reference to certain preferred embodiments, those skilled in the art will readily understand that other applications may be substituted for those described herein without departing from the spirit and scope of the utility model. Accordingly, the utility model is limited only by the following claims.

Claims (7)

CN201721069377.8U | priority date 2017-08-24 | filing date 2017-08-24 | Mobile electronic device for handling a task in a task area | Expired - Fee Related | CN207067803U (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201721069377.8U (CN207067803U, en) | 2017-08-24 | 2017-08-24 | Mobile electronic device for handling a task in a task area

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201721069377.8U (CN207067803U, en) | 2017-08-24 | 2017-08-24 | Mobile electronic device for handling a task in a task area

Publications (1)

Publication Number | Publication Date
CN207067803U | 2018-03-02

Family

ID=61514168

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201721069377.8U (CN207067803U, en; Expired - Fee Related) | Mobile electronic device for handling a task in a task area | 2017-08-24 | 2017-08-24

Country Status (1)

Country | Link
CN (1) | CN207067803U (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2019001237A1 (en)* | 2017-06-30 | 2019-01-03 | 炬大科技有限公司 | Mobile electronic device, and method in mobile electronic device
CN108459598A (en)* | 2017-08-24 | 2018-08-28 | 炬大科技有限公司 | Mobile electronic device and method for handling a task in a task area
WO2019037517A1 (en)* | 2017-08-24 | 2019-02-28 | 炬大科技有限公司 | Mobile electronic device and method for processing task in task area
CN108459598B (en)* | 2017-08-24 | 2024-02-20 | 炬大科技有限公司 | Mobile electronic device and method for processing tasks in task area
CN108827309A (en)* | 2018-06-29 | 2018-11-16 | 炬大科技有限公司 | Robot path planning method and vacuum cleaner having the same
CN113168180A (en)* | 2018-11-21 | 2021-07-23 | 三星电子株式会社 | Mobile device and object detection method therefor
CN111226800A (en)* | 2020-01-19 | 2020-06-05 | 中国农业科学院农业信息研究所 | Method, device and system for cow cooling based on position detection
CN111226800B (en)* | 2020-01-19 | 2021-10-15 | 中国农业科学院农业信息研究所 | Method, device and system for cow cooling based on position detection

Similar Documents

Publication | Title
CN207067803U (en) | Mobile electronic device for handling a task in a task area
CN207115193U (en) | Mobile electronic device for handling a task in a task area
US11100260B2 (en) | Method and apparatus for interacting with a tag in a wireless communication area
US20210192774A1 (en) | Mapping Optimization in Autonomous and Non-Autonomous Platforms
US12248737B2 (en) | Agent supportable device indicating an item of interest in a wireless communication area
CN207488823U (en) | Mobile electronic device
JP6705465B2 (en) | Observability grid-based autonomous environment search
EP4068206B1 (en) | Object tracking in local and global maps systems and methods
CN108459597B (en) | Mobile electronic device and method for processing tasks in task area
US9628675B2 (en) | Method and apparatus for object tracking and recognition
US11561553B1 (en) | System and method of providing a multi-modal localization for an object
CN109547769B (en) | A highway traffic dynamic three-dimensional digital scene acquisition and construction system and its working method
WO2019001237A1 (en) | Mobile electronic device, and method in mobile electronic device
CN106291517A (en) | Indoor cloud robot angle positioning method based on position and visual information optimization
US10949579B2 (en) | Method and apparatus for enhanced position and orientation determination
JP6959888B2 (en) | A device, program and method for estimating the terminal position using a model related to object recognition information and received electromagnetic wave information
KR102542556B1 (en) | Method and system for real-time detection of major vegetation in wetland areas and location of vegetation objects using high-resolution drone video and deep learning object recognition technology
CN108459595A (en) | Mobile electronic device and method in the mobile electronic device
CN207051738U (en) | Mobile electronic device
US12242963B2 (en) | User-in-the-loop object detection and classification systems and methods
CN206833252U (en) | Mobile electronic device
JP7160257B2 (en) | Information processing device, information processing method, and program
CN108459598A (en) | Mobile electronic device and method for handling a task in a task area
CN109641351A (en) | Object feature identification method, visual identification device and robot
CN115830280A (en) | Data processing method and device, electronic equipment and storage medium

Legal Events

Code | Title | Description
GR01 | Patent grant |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 2018-03-02; Termination date: 2020-08-24

