Unmanned vehicle semantic map building and application method based on perceptual positioning monitoring
Technical field
The present invention relates to the field of unmanned driving, and more particularly to a method for building and applying an unmanned vehicle semantic map based on perceptual positioning monitoring.
Background art
In recent years, unmanned driving has received extensive attention from academia and industry at home and abroad, and the related supporting technologies have developed rapidly. In terms of application products, unmanned products can generally be divided into two major classes: those used in industrial production and those used in personal consumption. In terms of system composition, an unmanned driving system can generally be divided into sub-modules such as environment perception, decision planning and motion control. Environment perception acquires real-time scene information of the traffic environment through various sensors and builds an environment model (i.e., a perception map); decision planning, on the basis of the environment model, makes behaviour decisions that comply with traffic rules and safety requirements and plans a corresponding obstacle-avoiding driving trajectory; motion control converts the planned trajectory into the discrete control commands actually required by the unmanned vehicle, such as throttle, brake and steering wheel angle, and sends them to the execution system of the unmanned vehicle, realizing autonomous driving behaviour. Environment perception is equivalent to the "eyes" of the unmanned vehicle; its content includes autonomous positioning, road identification, static and dynamic obstacle detection and so on, among which autonomous positioning is the key.
At present, the implementation methods of autonomous positioning and navigation for unmanned driving include: early track-following navigation with magnetic strips or colour bands, outdoor satellite navigation, simultaneous localization and mapping (SLAM) navigation, and digital map navigation. In the current unmanned driving field at home and abroad, high-precision digital map navigation is the mainstream technology for specified scenes.
The unmanned transfer robot is an important component of industrial autonomous driving vehicles. According to application scenarios, current unmanned transfer robots can be divided into indoor unmanned transfer robots and outdoor transfer robots. Positioning technology is the core technology of mobile robots. The positioning and navigation principles of indoor unmanned transfer robots mainly include laser beacon navigation, laser SLAM navigation, visual navigation and magnetic strip / colour band navigation. These navigation and positioning technologies basically cover most indoor scenes and have achieved good results. For outdoor unmanned transfer robots, current products are mainly based on the magnetic guidance technology used at major ports; the technical maturity of outdoor laser and visual navigation is low, and most solutions cannot yet be commercialized. For outdoor transfer scenes, we propose to use a multi-line laser radar as the core navigation sensor to construct a three-dimensional high-precision beacon map of a large scene. It should be pointed out that this high-precision map can not only satisfy the positioning and navigation demands of unmanned transfer robots, but also satisfy indoor positioning and navigation demands, thus solving the unmanned transfer demands of indoor, outdoor and mixed indoor-outdoor scenes.
Existing Chinese patent: Publication No. CN104535070A (application No. 20141083873.5) provides an accurate map data structure and an acquisition and processing system and method, which divides the map data structure into four layers: road network, lane network, lane line information and specific information data. Although database-level associations are defined between these layers, due to the lack of semantic information it is difficult for an unmanned vehicle to establish complete semantic relations between all kinds of map elements and traffic participants in this map data structure, to discriminate the real-time scene information of the unmanned vehicle, and to realize scene understanding. Meanwhile, information such as intersections and U-turns is difficult to embody in its data structure, and the association between lane lines and lanes is not accurate enough; for example, if a certain road section changes from two lanes to three lanes, the relation between the middle lane of the three lanes and its lane lines cannot be expressed.
Existing Chinese patent: Publication No. CN104089619A (application No. 201410202876.4) provides an accurate GPS navigation map matching system for a driverless automobile and its operating method, which completes accurate matching of the navigation map through the process of determining an initial point by obtaining road information, obtaining vehicle position information, information matching and screening. However, its matching method mainly searches through discrete points and does not use the associations between map elements, which leads to low matching efficiency.
The main problems of the prior art are that the data structure of high-precision maps is not convenient for an unmanned vehicle to perform scene understanding, and that the associations between map elements are not well utilized, resulting in relatively low map retrieval efficiency. How to effectively and efficiently complete the unmanned vehicle's understanding of driving scene elements and improve the associative retrieval efficiency of map elements becomes the problem to be solved.
Summary of the invention
In order to solve the above technical problems, the present invention provides an unmanned vehicle semantic map building and application method based on perceptual positioning monitoring, so as to efficiently complete the unmanned vehicle's understanding of driving scene elements and improve the associative retrieval efficiency of map elements.
In order to solve the above technical problems, the present invention is achieved by the following technical solutions:
The present invention provides an unmanned vehicle semantic map modeling method based on perceptual positioning monitoring, which includes modeling of the beacon map, construction of the beacon map, definition of map elements and camera calibration based on the beacon map, and includes unmanned vehicle semantic map modeling;
The unmanned vehicle semantic map model comprises two conceptual modules: entity and attribute.
The entities include the ego-vehicle entity, the road network entity and the obstacle entity. The road network entity includes domain (area) entities and point entities. The domain entities include the whole road section, the tie point, the boundary, the road separator, the special area, the pedestrian crossing, the lane line, the lane and the road section; the tie point (connection area) includes intersections, U-turn areas and areas where the number of lanes increases or decreases. The point entities include ground markings, roadside signs and stop lines. The obstacle entities include dynamic obstacles, static obstacles, traffic-facility obstacles, pedestrians, animals, vehicles, natural obstacles and road-interception obstacles. The attributes include point coordinates, region ranges and constraints: the point coordinate in the attributes is the point coordinate of a map element; the region range in the attributes is the region range of a map element; the constraint in the attributes is the constraint type between map elements.
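For illustration only, the concept hierarchy described above could be sketched in Python as follows; the class names, enumeration members and field types are assumptions made for this sketch and are not part of the claimed data format.

```python
# Illustrative sketch of the entity/attribute concept hierarchy described above.
# All names and types are assumptions, not a normative data format.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional, Tuple


class DomainEntityType(Enum):        # area-type road-network entities
    WHOLE_ROAD_SECTION = auto()
    TIE_POINT = auto()               # connection area: intersection, U-turn, lane-count change
    BOUNDARY = auto()
    ROAD_SEPARATOR = auto()
    SPECIAL_AREA = auto()
    PEDESTRIAN_CROSSING = auto()
    LANE_LINE = auto()
    LANE = auto()
    ROAD_SECTION = auto()


class PointEntityType(Enum):         # point-type road-network entities
    GROUND_MARKING = auto()
    ROADSIDE_SIGN = auto()
    STOP_LINE = auto()


class ObstacleType(Enum):
    DYNAMIC = auto()
    STATIC = auto()
    TRAFFIC_FACILITY = auto()
    PEDESTRIAN = auto()
    ANIMAL = auto()
    VEHICLE = auto()
    NATURAL = auto()                 # e.g. puddle, large stone
    ROAD_INTERCEPTION = auto()       # e.g. cone, water-filled barrier, construction signboard


@dataclass
class Attributes:
    point_coord: Optional[Tuple[float, float]] = None               # point coordinate of a map element
    region: List[Tuple[float, float]] = field(default_factory=list)  # region range of an area element
    constraints: List[str] = field(default_factory=list)            # constraint types between map elements
```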
The unmanned vehicle semantic map model contains the semantic relations among the concepts of the two conceptual modules. The semantic relations are divided into two parts: object properties and data properties. The object property part includes the inheritance relations and association relations between the corresponding concepts, that is, establishing the hierarchical relations between the corresponding concepts and establishing the association relations between the corresponding concepts. The data property part includes the global path planning information of the ego vehicle.
Wherein, the ego-vehicle entity is the unmanned vehicle itself or an unmanned vehicle entity of a corresponding type; a road section contains several lanes in the same direction; a ground marking is a ground traffic sign; a roadside sign is a roadside traffic sign.
Wherein, natural obstacles include recessed ground-surface obstacles and protruding ground-surface obstacles; road-interception obstacles include fault signboards, traffic cones, water-filled barriers, dividing lines and construction signboards.
Wherein, the constraints include connection constraints, which are the connection direction constraints between road sections; the connection constraints include left-turn connection constraints, right-turn connection constraints, U-turn connection constraints and straight-ahead connection constraints.
Wherein, the association relations between the corresponding concepts include: the composition relations between the whole road section and the road separators, the road sections and the tie points; the connection relation between a road section and a tie point; the positional relation between a road section and a road separator; the positional relation between a road section and a pedestrian crossing; the positional relation between a road section and a stop line; the positional relation between a road section and a boundary; the relation between a road section and its lanes; the relation between a road section and roadside signs; the existence relation between a tie point and connection constraints; the relation between a tie point and a pedestrian crossing; the positional relations between a lane and its lane lines; the orientation relations between a lane and other lanes; the positional relation between a lane and a special area; the relation between a lane and ground markings; the relations between a connection constraint and the road sections that state its connection direction; the orientation relations between the ego vehicle and obstacle entities; the positional relation between the ego vehicle and a lane; the relation between a domain entity and its region range; and the relation between a point entity and its point coordinate.
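As an illustration of how such association relations might be stored and queried, the following minimal Python sketch represents them as (subject, relation, object) triples; the identifiers and the exact relation-name strings are illustrative assumptions.

```python
# Illustrative sketch: association relations stored as (subject, relation, object) triples.
# Entity identifiers and relation names are assumptions for this example.
from typing import List, Tuple

Triple = Tuple[str, str, str]

triples: List[Triple] = [
    ("whole_road_section_001", "has road section", "road_section_003"),
    ("road_section_003", "associated tie point", "tie_point_002"),
    ("road_section_003", "has lane", "lane_003"),
    ("road_section_003", "has lane", "lane_004"),
    ("lane_003", "has left lane line", "lane_line_002"),
    ("lane_003", "left adjacent lane in the same direction", "lane_004"),
    ("tie_point_002", "has connection constraint", "connection_constraint_004"),
    ("ego_vehicle", "belonging lane", "lane_004"),
]


def objects_of(subject: str, relation: str, kb: List[Triple] = triples) -> List[str]:
    """Return every object linked to `subject` by `relation`."""
    return [o for s, r, o in kb if s == subject and r == relation]


# Example query: which lanes does road section 003 contain?
print(objects_of("road_section_003", "has lane"))   # ['lane_003', 'lane_004']
```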
Wherein, the data properties include: the current speed of the ego vehicle; the distances from the ego vehicle to the next tie point, pedestrian crossing and stop line to be reached; the distance from the ego vehicle to an obstacle entity; the current speed and pose of an obstacle entity; the data value of a point coordinate; the data value of a region range; the speed limit information of a lane, the allowed-direction information of a lane, the flags indicating whether a lane is the leftmost or rightmost lane, and the lane width; the number of lanes contained in a road section; the type information of a whole road section; and the basic attributes of the corresponding entity concepts.
The construction method of the unmanned vehicle semantic map based on perceptual positioning monitoring generates the semantic map by instantiating the static map data and the real-time obstacles. The specific steps are as follows (an illustrative sketch is given after the steps):
In the first step, the detailed data information of the real driving environment is obtained through the perception system, and the detailed map data is instantiated into static road network entities according to the map concept structure;
In the second step, the real-time obstacle pose information is obtained through the sensors, and the obstacle information is instantiated into obstacle map entities;
In the third step, the mutual semantic relations between the entities in the static map and the obstacle map obtained in the first and second steps are established, and the semantic map for the unmanned vehicle is obtained.
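A minimal sketch of the three-step construction procedure is given below, under assumed input formats (the detailed map as a list of element records and each obstacle observation as a pose record); the function names, record fields and region test are illustrative assumptions, not the claimed implementation.

```python
# Illustrative sketch of the three-step construction pipeline; data formats are assumptions.

def instantiate_road_network(map_elements):
    """Step 1: instantiate detailed map data as static road-network entities."""
    return [{"id": e["id"], "type": e["type"], "region": e.get("region")} for e in map_elements]


def instantiate_obstacles(observations):
    """Step 2: instantiate real-time obstacle poses as obstacle map entities."""
    return [{"id": "obstacle_%03d" % i, "pose": o["pose"], "speed": o.get("speed", 0.0)}
            for i, o in enumerate(observations, start=1)]


def link_entities(static_entities, obstacle_entities):
    """Step 3: establish semantic relations between static entities and obstacle entities."""
    relations = []
    for obs in obstacle_entities:
        for ent in static_entities:
            if ent["type"] == "lane" and ent["region"] and _in_region(obs["pose"][:2], ent["region"]):
                relations.append((obs["id"], "located in lane", ent["id"]))
    return relations


def _in_region(point, bbox):
    """Placeholder region test: axis-aligned bounding box ((xmin, ymin), (xmax, ymax))."""
    (xmin, ymin), (xmax, ymax) = bbox
    return xmin <= point[0] <= xmax and ymin <= point[1] <= ymax


# Example usage with toy data:
static_map = instantiate_road_network([{"id": "lane_004", "type": "lane",
                                        "region": ((0.0, 0.0), (100.0, 3.5))}])
obstacles = instantiate_obstacles([{"pose": (42.0, 1.5, 0.0), "speed": 0.0}])
print(link_entities(static_map, obstacles))  # [('obstacle_001', 'located in lane', 'lane_004')]
```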
Wherein, the perception system uses a laser radar, a camera, GPS or a photo-monitoring satellite, or a corresponding sensing device system; the sensors use a laser radar, a camera, GPS or a corresponding sensing device system.
The application method of the unmanned vehicle semantic map based on perceptual positioning monitoring performs semantic reasoning on the semantic map, the globally planned path, the current pose of the unmanned vehicle and the real-time obstacle information around it to obtain the local scene information of the unmanned vehicle, realizing the scene understanding of the unmanned vehicle and assisting its decision-making. The specific steps are as follows (an illustrative sketch is given after the steps):
In the first step, the target driving path of the unmanned vehicle is obtained through the global planning system of the unmanned vehicle, and the current pose of the unmanned vehicle is obtained in real time through the GPS/INS positioning and orientation system;
In the second step, the surrounding obstacle information is perceived in real time through the environment perception system of the unmanned vehicle, and the poses of the obstacles relative to the unmanned vehicle are obtained through semantic reasoning;
In the third step, semantic reasoning is performed on the semantic map, the globally planned path, the current pose of the unmanned vehicle and the relative poses of the surrounding obstacles to obtain the local scene information of the unmanned vehicle;
In the fourth step, the unmanned vehicle is assisted to make different decisions according to the corresponding scene information.
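The four application steps above could be organised, for example, as the following loop; the interfaces of the planning, positioning and perception systems are stubbed out with placeholder functions, and the reasoning shown is a greatly simplified stand-in, all of which are assumptions of this sketch.

```python
# Illustrative sketch of the four-step application loop; all interfaces are placeholder assumptions.

def get_global_path():
    """Step 1 (part a): target driving path from the global planning system (stubbed)."""
    return ["road_section_003", "road_section_008"]


def get_current_pose():
    """Step 1 (part b): current pose from the GPS/INS positioning system (stubbed)."""
    return (42.0, 1.5, 0.0)          # x, y, heading


def perceive_obstacles():
    """Step 2: surrounding obstacles with poses relative to the ego vehicle (stubbed)."""
    return [{"id": "obstacle_vehicle_002", "relative_bearing": "directly ahead",
             "distance_m": 7.0, "speed_mps": 0.0}]


def reason_local_scene(semantic_map, global_path, pose, obstacles):
    """Step 3: greatly simplified stand-in for semantic reasoning over map, path, pose and obstacles."""
    blocked = any(o["relative_bearing"] == "directly ahead" and o["distance_m"] < 10.0
                  for o in obstacles)
    return {"path_blocked": blocked, "next_road_section": global_path[-1]}


def decide(scene):
    """Step 4: assist the behaviour decision according to the scene information."""
    return "stop" if scene["path_blocked"] else "continue"


scene = reason_local_scene({}, get_global_path(), get_current_pose(), perceive_obstacles())
print(decide(scene))                 # 'stop'
```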
Compared with the prior art, the beneficial effects of the present invention are as follows:
By constructing a set of map element data hierarchies suitable for the unmanned vehicle and designing sufficient semantic relations among the map elements, the present invention facilitates the generation of the semantic map; and by performing semantic reasoning on the semantic map, the globally planned path, the current pose of the unmanned vehicle and the real-time obstacle information around it, the local scene information of the unmanned vehicle is obtained, assisting the unmanned vehicle in making behaviour decisions, efficiently completing the unmanned vehicle's understanding of driving scene elements, and improving the associative retrieval efficiency of map elements.
Description of the drawings
Fig. 1 is the flow chart of the unmanned vehicle semantic map building and application of the present invention;
Fig. 2 is the concept hierarchy structure chart of the semantic map elements;
Fig. 3 is the inclusion relation diagram of the semantic map elements;
Fig. 4 is the concept association diagram of the semantic map elements;
Fig. 5 is the orientation relation diagram between the unmanned vehicle and obstacles;
Fig. 6 is the schematic diagram of the semantic map generation process;
Fig. 7 is the schematic diagram of part of the implementation content of a whole road section in the semantic map;
Fig. 8 is the schematic diagram of part of the implementation content of the ego vehicle in the semantic map;
Fig. 9 is the schematic diagram of part of the semantic reasoning content.
Specific embodiment
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further elaborated below with reference to the accompanying drawings and embodiments. It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit the present invention.
Specific embodiment one:
As shown in Fig. 1 and Fig. 2, this embodiment provides a modeling method of the semantic map, including the concept structure and semantic relations of the semantic map and the method of instantiating the real map to generate the semantic map.
As shown in Fig. 3, the ontology is divided into two major modules: entity and attribute:
(1) The entities include the ego-vehicle (unmanned vehicle) entity, the road network element entity and the obstacle entity, which respectively represent the ego vehicle, the road network elements and the obstacles.
(11) The ego vehicle refers to the unmanned vehicle itself, and can be extended to different types of unmanned vehicles according to demand.
(12) The road network entity includes domain entities and point entities, which respectively represent area-type entities and point-type entities.
(121) The domain entities include the whole road section, the tie point, the boundary, the road separator, the special area, the pedestrian crossing, the lane line, the lane and the road section. Wherein, the whole road section represents the entirety of one road, including its tie points, road sections, boundaries and road separators; the tie point is a connection area including intersections, U-turn areas and areas where the number of lanes increases or decreases; a road section contains multiple lanes in the same direction.
(122) The point entities include ground markings, roadside signs and stop lines, which respectively represent ground traffic signs, roadside traffic signs and stop lines (a stop line has a one-to-one relation with a road section and can therefore be simplified into a point).
(13) The obstacle entities include dynamic obstacles, static obstacles, traffic-facility obstacles, pedestrians, animals, vehicles, natural obstacles and road-interception obstacles. Wherein, natural obstacles include recessed ground-surface obstacles (e.g. puddles) and protruding ground-surface obstacles (e.g. large stones). Road-interception obstacles include fault signboards, traffic cones, water-filled barriers, dividing lines and construction signboards.
(2) The attributes include the point coordinate, the region range and the constraint, which respectively represent the point coordinate of a map element, the region range of a map element and the constraint type between map elements. The constraints include connection constraints, which represent the connection direction constraints between road sections. The connection constraints include left-turn connection constraints, right-turn connection constraints, U-turn connection constraints and straight-ahead connection constraints.
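A connection constraint as described here could be represented, for example, by the following small record; the field names mirror the "starting road section" and "target road section" relations defined later and are assumptions of this sketch rather than a normative format.

```python
# Illustrative sketch of a connection constraint entity; field names are assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class TurnType(Enum):
    LEFT_TURN = auto()
    RIGHT_TURN = auto()
    U_TURN = auto()
    STRAIGHT_AHEAD = auto()


@dataclass
class ConnectionConstraint:
    entity_id: str
    turn_type: TurnType
    starting_road_section: str       # road section the constraint starts from
    target_road_section: str         # road section the constraint leads to


# Example: the left-turn constraint used in the reasoning example of embodiment four.
constraint_004 = ConnectionConstraint("connection_constraint_004", TurnType.LEFT_TURN,
                                      "road_section_003", "road_section_008")
```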
As shown in Fig. 4, the semantic map contains the semantic relations between the various concepts defined above. The semantic relations are divided into two parts: object properties and data properties:
(1) The object property part includes the inheritance relations (generalization/specialization) and the association relations between different concepts.
(11) The hierarchical relations between the different concepts have been described above.
(12) The association relations between different concepts include: the composition relations between the whole road section and the road separators, road sections and tie points (relation names: "has road separator", "has road section", "has tie point"); the connection relation between a road section and a tie point (relation name: "associated tie point"); the positional relation between a road section and a road separator (relation name: "associated road separator"); the positional relation between a road section and a pedestrian crossing (relation name: "associated pedestrian crossing"); the positional relation between a road section and a stop line (relation name: "associated stop line"); the positional relation between a road section and a boundary (relation name: "associated boundary"); the relation between a road section and its lanes (relation name: "has lane"); the relation between a road section and roadside signs (relation name: "has roadside sign"); the existence relation between a tie point and connection constraints (relation name: "has connection constraint"); the relation between a tie point and a pedestrian crossing (relation name: "has pedestrian crossing"); the positional relations between a lane and its lane lines (relation names: "has left lane line", "has right lane line"); the orientation relations between a lane and other lanes (relation names: "left adjacent lane in the same direction", "right adjacent lane in the same direction"); the positional relation between a lane and a special area (relation name: "has special area"); the relation between a lane and ground markings (relation name: "has ground marking"); the relations between a connection constraint and the road sections that state its connection direction (relation names: "starting road section", "target road section"); the orientation relations between the ego vehicle and obstacle entities (the orientations are shown in Fig. 5; relation names: "obstacle at left rear", "obstacle directly behind", "obstacle at right rear", "obstacle at left front", "obstacle directly ahead", "obstacle at right front", "obstacle directly to the left", "obstacle directly to the right"); the positional relation between the ego vehicle and a lane (relation name: "belonging lane"); the relation between a domain entity and its region range (relation name: "associated region range"); and the relation between a point entity and its point coordinate (relation name: "associated point coordinate").
(2) The data property part includes: the global path planning information of the ego vehicle (relation name: "next intersection turn") and its current speed (relation name: "ego vehicle real-time speed"); the distances from the ego vehicle to the next tie point, pedestrian crossing and stop line to be reached (relation names: "distance to tie point", "distance to pedestrian crossing", "distance to stop line"); the distance from the ego vehicle to an obstacle (relation name: "distance to obstacle"); the current speed of an obstacle entity (relation name: "obstacle speed") and its pose (relation name: "obstacle movement direction"); the data value of a point coordinate (relation name: "point coordinate value"); the data value of a region range (relation name: "region range value"); the speed limit information of a lane (relation names: "lane maximum speed", "lane minimum speed"), the allowed-direction information of a lane (relation name: "lane intersection turn"), the flags indicating whether a lane is the leftmost or rightmost lane in its direction (relation names: "leftmost lane in the same direction", "rightmost lane in the same direction") and the lane width (relation name: "lane width"); the number of lanes contained in a road section (relation name: "number of lanes in road section"); the type information of a whole road section (relation name: "whole road section type"); and the basic attributes of each concept (relation names: "entity ID", "entity class name").
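To illustrate how the data properties listed above might be attached to instances, the following sketch records them as plain key/value pairs on an ego-vehicle, a lane and an obstacle instance; the identifiers, units and most numeric values are illustrative assumptions, while the obstacle values are taken from the example of Fig. 8.

```python
# Illustrative sketch: data properties attached to entity instances as key/value pairs.
# Identifiers and most numeric values are assumptions made for this example only.
ego_vehicle = {
    "entity ID": "ego_vehicle_001",
    "entity class name": "ego vehicle",
    "next intersection turn": "left turn",        # from the global path plan
    "ego vehicle real-time speed": 2.5,           # m/s (assumed)
    "distance to tie point": 12.0,                # m (assumed)
    "distance to pedestrian crossing": 15.0,      # m (assumed)
    "distance to stop line": 10.0,                # m (assumed)
}

lane_004 = {
    "entity ID": "lane_004",
    "entity class name": "lane",
    "lane maximum speed": 8.0,                    # m/s (assumed)
    "lane minimum speed": 0.0,                    # m/s (assumed)
    "lane intersection turn": ["left turn", "straight ahead"],
    "leftmost lane in the same direction": True,
    "rightmost lane in the same direction": False,
    "lane width": 3.5,                            # m (assumed)
}

obstacle_vehicle_002 = {
    "entity ID": "obstacle_vehicle_002",
    "entity class name": "vehicle obstacle",
    "obstacle speed": 0.0,                        # value taken from the example in Fig. 8
    "obstacle movement direction": "same direction",
    "distance to obstacle": 7.0,                  # m, value taken from the example in Fig. 8
}
```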
Specific embodiment two:
As shown in Fig. 6, the method of generating the semantic map by instantiating the static map data and the real-time obstacles comprises the following steps:
Step 1: obtain the detailed data information of the real driving environment through perception systems such as a laser radar, a camera, GPS and satellite photos, and instantiate the detailed map data into static road network entities according to the map concept structure;
Step 2: obtain the real-time obstacle pose information through sensors such as a laser radar, a camera and GPS, and instantiate the obstacle information into obstacle map entities;
Step 3: establish the mutual semantic relations between the entities in the static map and the obstacle map obtained in Step 1 and Step 2, and finally obtain the semantic map for the unmanned vehicle.
Specific embodiment three:
As shown in Fig. 7, which is a modeling example diagram of a section of real map, the map contains a crossroad, a U-turn, multiple road sections and other map elements; the key elements are all marked out with arrows, and only one ground marking and one roadside sign are taken as examples for illustration.
Firstly, the detailed map data is obtained; then the detailed map data is divided into map elements of different categories according to the concept structure of the semantic map, and these elements are instantiated into static road network entities according to the aforementioned concept structure.
As shown in the figure, the transverse and longitudinal roads represent two whole road section entities; the crossroad entity is tie point 002 and the U-turn entity is tie point 001, and each road section is connected with the other road sections through tie points. The dashed arrows in the middle of the road represent connection constraint entities, which are associated with tie point 002. Tie point 002 should have 12 connection constraint entities here, which respectively represent the connection relations existing between the road sections in different directions through tie point 002; only some of the connection constraint entities are labelled here. The other map elements such as lane lines, lanes, road separators and boundaries are all marked in Fig. 7.
Next, the semantic relations existing between the previously created map element entities are established. For example, road section 003 has lanes lane 003 and lane 004, the left lane line of lane 003 is lane line 002, and the left adjacent lane of lane 003 in the same direction is lane 004. Since the complete set of associations is relatively complex and not easy to describe in detail, the object properties and data properties of each entity are established one by one. The obstacle pose information is then obtained in real time through the perception system and instantiated into obstacle map entities according to the aforementioned concept structure, and semantic relations are established between the obstacle entities and the static road network entities. Finally, the static road network entities, the real-time obstacle map entities and their associations obtained in the preceding steps are integrated to obtain the semantic map.
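A minimal sketch of how the entities and relations named in this embodiment (road section 003, lanes 003 and 004, lane line 002, tie point 002) might be instantiated and then combined with a perceived obstacle is given below; the storage format (dicts plus relation triples) is an assumption of this sketch.

```python
# Illustrative instantiation of the entities and relations named in this embodiment.
# The storage format (dicts plus relation triples) is an assumption of this sketch.
entities = {
    "road_section_003": {"entity class name": "road section"},
    "lane_003":         {"entity class name": "lane"},
    "lane_004":         {"entity class name": "lane"},
    "lane_line_002":    {"entity class name": "lane line"},
    "tie_point_002":    {"entity class name": "tie point"},
}

relations = [
    ("road_section_003", "has lane", "lane_003"),
    ("road_section_003", "has lane", "lane_004"),
    ("road_section_003", "associated tie point", "tie_point_002"),
    ("lane_003", "has left lane line", "lane_line_002"),
    ("lane_003", "left adjacent lane in the same direction", "lane_004"),
]


def add_obstacle(entities, relations, obstacle_id, lane_id):
    """Instantiate a perceived obstacle entity and link it to the lane it occupies."""
    entities[obstacle_id] = {"entity class name": "vehicle obstacle"}
    relations.append((obstacle_id, "located in lane", lane_id))


# Combining the static road network with a real-time obstacle yields the semantic map.
add_obstacle(entities, relations, "obstacle_vehicle_002", "lane_004")
```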
Specific embodiment four:
As shown in Fig. 8, which contains all the map semantic information of Fig. 7, the square pointed to by the ego-vehicle arrow represents the current position of the unmanned vehicle. The unmanned vehicle is currently travelling close to a tie point (a tie point may include areas such as intersections, U-turns and places where the number of lanes increases or decreases). The current pose of the unmanned vehicle and the surrounding obstacle information are obtained through real-time perception by the perception system, the poses of the obstacles relative to the unmanned vehicle are obtained through semantic reasoning, and on this basis semantic reasoning is performed on the semantic map, the globally planned path, the current pose of the unmanned vehicle and the relative poses of the surrounding obstacles to obtain the local scene information of the unmanned vehicle, thereby assisting the unmanned vehicle to make behaviour decisions.
In Fig. 8, it is found that there is obstacle vehicle 002 directly ahead (distance to obstacle 7 m, obstacle speed 0, obstacle movement direction the same), obstacle vehicle 001 at the right front (distance to obstacle 15 m, obstacle speed 0, obstacle movement direction the same) and obstacle vehicle 003 directly to the right (distance to obstacle 2 m, obstacle speed 0, obstacle movement direction the same); therefore it is judged that the unmanned vehicle should stop. Meanwhile, Fig. 9 illustrates a piece of the reasoning process: according to the global path planning it is known that the ego vehicle will turn left at the next intersection; at the same time, the lane the ego vehicle belongs to is lane 004, road section 003 has lane 004, the associated tie point is tie point 002, and tie point 002 has connection constraint 004 (concept class: left-turn connection constraint (parent class: connection constraint); starting road section: road section 003; target road section: road section 008). Therefore semantic reasoning can predict that the next road section to be reached is road section 008, and from road section 008 the local map information around it can be obtained, helping the unmanned vehicle to know in advance the local map information of the next area to be reached. The specific steps are as follows (an illustrative sketch is given after the steps):
Step 1: obtain the target driving path of the unmanned vehicle through the global planning system of the unmanned vehicle, and obtain the current pose of the unmanned vehicle in real time through the GPS/INS positioning and orientation system;
Step 2: perceive the surrounding obstacle information in real time through the environment perception system of the unmanned vehicle, and obtain the poses of the obstacles relative to the unmanned vehicle through semantic reasoning;
Step 3: perform semantic reasoning on the semantic map, the globally planned path, the current pose of the unmanned vehicle and the relative poses of the surrounding obstacles to obtain the local scene information of the unmanned vehicle;
Step 4: assist the unmanned vehicle to make different decisions according to the different scene information.
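A hedged sketch of the two inferences described in this embodiment (the stop decision based on nearby obstacles, and the prediction of the next road section from the left-turn connection constraint) is given below; the stop threshold and the data layout are assumptions of this sketch, while the obstacle values and connection-constraint facts are taken from the example of Fig. 8 and Fig. 9.

```python
# Illustrative sketch of the semantic reasoning in this embodiment.
# The 10 m stop threshold and the data layout are assumptions; the obstacle values and the
# connection-constraint facts are taken from the example of Fig. 8 and Fig. 9.
obstacles = [
    {"id": "obstacle_vehicle_002", "bearing": "directly ahead", "distance_m": 7.0, "speed": 0.0},
    {"id": "obstacle_vehicle_001", "bearing": "right front",    "distance_m": 15.0, "speed": 0.0},
    {"id": "obstacle_vehicle_003", "bearing": "directly right", "distance_m": 2.0, "speed": 0.0},
]

connection_constraints = [
    {"id": "connection_constraint_004", "turn": "left turn",
     "starting_road_section": "road_section_003", "target_road_section": "road_section_008"},
]


def should_stop(obstacles, threshold_m=10.0):
    """Stop if a stationary obstacle directly ahead is closer than the (assumed) threshold."""
    return any(o["bearing"] == "directly ahead" and o["speed"] == 0.0 and o["distance_m"] < threshold_m
               for o in obstacles)


def predict_next_section(current_section, planned_turn, constraints):
    """Predict the next road section from the connection constraint matching the planned turn."""
    for c in constraints:
        if c["starting_road_section"] == current_section and c["turn"] == planned_turn:
            return c["target_road_section"]
    return None


print(should_stop(obstacles))                                            # True
print(predict_next_section("road_section_003", "left turn",
                           connection_constraints))                      # road_section_008
```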
In short, the present invention relates to an ontology-based unmanned vehicle semantic map model construction method, which can be applied in unmanned vehicle software systems to help the unmanned vehicle understand scene information. The semantic map model constructed by the present invention performs model construction specifically for the map information elements of interest to the unmanned vehicle, can accurately express the scenes that the unmanned vehicle may face, and establishes semantic relations between all map elements and traffic participants; the semantic map application method provided by the present invention can help the unmanned vehicle quickly understand the scene it is in.
In the unmanned vehicle semantic map building and application method based on perceptual positioning monitoring, the present invention uses the precise ranging characteristic of laser perception to segment the point cloud of a laser beacon from a single frame of point cloud, and then uses the preset beacon physical model to calculate the geographic information of the beacon, including its three-dimensional distance information relative to the laser sensor. The dead-reckoning information and an extended Kalman filter algorithm are used to associate the beacon position information solved from consecutive frames, obtaining the three-dimensional beacon map of the region. The geographic information of the fixed obstacles in the environment is calculated from the multi-line laser radar data; through optimization by a graph optimization algorithm, the layered sub-maps of each beacon are combined to obtain a globally extensible beacon-obstacle map. By using an index mechanism for the map, secondary positioning and navigation can be performed from any sub-map, ensuring the real-time navigation requirement of the unmanned mobile robot.
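The beacon-map construction described here relies on dead reckoning and an extended Kalman filter for the actual frame-to-frame association; the following greatly simplified sketch only illustrates the idea of selecting a sub-map through an index and matching one frame's beacon detections against it, using plain nearest-neighbour gating as a stand-in for the EKF step, and all data structures are assumptions.

```python
# Greatly simplified illustration of frame-to-sub-map beacon association.
# Plain nearest-neighbour matching stands in for the dead-reckoning/EKF association
# described in the text, and all data layouts are assumptions.
import math

# Sub-map index: region key -> list of beacon positions (x, y, z) in that sub-map.
submap_index = {
    "region_A": [(0.0, 0.0, 1.2), (5.0, 0.5, 1.2), (10.0, -0.3, 1.1)],
    "region_B": [(50.0, 2.0, 1.3), (55.0, 1.8, 1.2)],
}


def select_submap(ego_xy, index):
    """Pick the sub-map whose beacons are on average closest to the current position."""
    def mean_dist(beacons):
        return sum(math.dist(ego_xy, b[:2]) for b in beacons) / len(beacons)
    return min(index, key=lambda k: mean_dist(index[k]))


def associate(detections, map_beacons, gate_m=1.0):
    """Match each detected beacon to the nearest map beacon within a gating distance."""
    matches = []
    for d in detections:
        nearest = min(map_beacons, key=lambda b: math.dist(d[:2], b[:2]))
        if math.dist(d[:2], nearest[:2]) <= gate_m:
            matches.append((d, nearest))
    return matches


ego_position = (4.6, 0.4)
frame_detections = [(4.8, 0.6, 1.2), (9.7, -0.2, 1.1)]
key = select_submap(ego_position, submap_index)
print(key, associate(frame_detections, submap_index[key]))
```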
The present invention has the following characteristics in application:
First, the beacon map constructed by the present invention is a three-dimensional mapping technology applicable to both outdoor and indoor scenes, and has good robustness to harsh environmental conditions such as outdoor rain and snow, day-and-night light differences, and spray and dust.
Second, the map constructed by the present invention has high resolution and can reach centimetre-level precision in scenes of kilometre scale in length and width. During mapping, point cloud feature matching is fused with the dead-reckoning algorithm to obtain layered regional sub-maps and a global map associating the sub-maps, which avoids sensor detection errors and algorithm errors, so that the overall precision of the map meets the requirement of accurate navigation.
Third, the beacon map constructed by the present invention is three-dimensional, and the beacons are distributed in the map with (x, y, z) three-dimensional information; the mapping method is effective not only for flat road scenes but also for sloped outdoor scenes.
Fourth, the beacon map constructed by the present invention can be widely used in the positioning and navigation of industrial transfer robots and regional driverless automobiles; map indexing is efficient and can satisfy real-time requirements.
The parts of the present invention not described in detail belong to the well-known technology of those skilled in the art.
The above content is a detailed description of the present invention in combination with specific embodiments, but it cannot be concluded that the specific implementation of the present invention is limited to these contents. Without departing from the principle and spirit of the present invention, those skilled in the art can make several adjustments and modifications to these embodiments, and the protection scope of the present invention is defined by the appended claims and their equivalents.
The foregoing is merely a description of preferred embodiments of the present invention and is not intended to limit the present invention; any modifications, equivalent replacements and improvements made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.