CN113449158A - Adjoint analysis method and system among multi-source data - Google Patents

Adjoint analysis method and system among multi-source data

Info

Publication number
CN113449158A
CN113449158A (application CN202110691885.4A)
Authority
CN
China
Prior art keywords
accompanying
source
data
track
perception
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110691885.4A
Other languages
Chinese (zh)
Inventor
Ma Wanli (马万里)
Guo Pengzhan (郭鹏展)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Electronics Import And Export Co ltd
Original Assignee
China Electronics Import And Export Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Electronics Import And Export Co., Ltd.
Priority to CN202110691885.4A (priority, critical; patent/CN113449158A/en)
Publication of CN113449158A (publication, critical; patent/CN113449158A/en)
Legal status: Pending (current)

Links

Images

Classifications

Landscapes

Abstract

The invention relates to a method and a system for accompanying (adjoint) analysis among multi-source data. The method comprises the following steps: acquiring, from a first data source, a main track of a main target passing its perception-source devices within an accompanying-analysis time range; setting accompanying parameters and querying track data in a second data source according to those parameters to obtain a plurality of accompanying tracks; and merging the accompanying tracks according to the accompanying parameters, then comparing the main track with the merged accompanying tracks to obtain the number of accompanying occurrences and the number of accompanying perception-source devices. The invention realizes an efficient accompanying-analysis method across data sources such as identity-card collection records, vehicle records from perception sources, and mobile-phone IMSI and MAC collection records; it can discover hidden accompanying information among different data sources and greatly improves data-use efficiency and service extensibility.

Description

Adjoint analysis method and system among multi-source data
Technical Field
The invention relates to a data adjoint analysis method, in particular to an adjoint analysis method and a system among multi-source data.
Background
In the prior art there is no technical scheme for the fused use and correlation analysis of multi-source track data, so cross-data-source accompanying analysis cannot be realized. Accompanying analysis compares track data from different data sources, applying time-approximation processing and position-approximation normalization of the perception-source devices, according to the times at which targets pass the respective sensing devices and the distances between those devices, in order to find potential co-travel relationships.
Disclosure of Invention
The invention discloses a method and a system for concomitant analysis among multi-source data.
Based on the above, the invention provides the following technical scheme:
a method for adjoint analysis among multi-source data comprises the following steps:
1) acquiring a main track of a main target passing through a perception source device in a certain time range (accompanying analysis time range) from a first data source;
2) setting an accompanying parameter, and performing track data query in a second data source according to the set accompanying parameter to obtain tracks of a plurality of accompanying targets, namely accompanying tracks;
3) merging the accompanying tracks according to the accompanying parameters, and comparing the main track with the merged accompanying tracks to obtain the number of accompanying occurrences and the number of accompanying perception-source devices, thereby providing unified data support for accompaniment and identity determination.
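The three steps above operate on per-pass records of targets at perception-source devices. A minimal sketch of such a data model follows; the field names (target_id, device_id, ts) and the helper name to_trajectory are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical record layout for one pass of a target by a perception-source
# device; field names are illustrative, not from the patent text.
@dataclass(frozen=True)
class TrackPoint:
    target_id: str   # e.g. a license plate, IMSI code, or MAC code
    device_id: str   # perception-source device code (camera, fence, ...)
    ts: int          # capture time as Unix seconds

def to_trajectory(points):
    """A trajectory is simply a target's points sorted by capture time."""
    return sorted(points, key=lambda p: p.ts)
```

A main track and each accompanying track can then both be handled as lists of `TrackPoint` records.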
Furthermore, longitude and latitude information of unified standards is set for perception source equipment in each data source.
Further, the accompanying parameters in step 2) include offset time of passing through the sensing source devices and offset distance between the sensing source devices.
Further, step 2) comprises:
2.1) The main-track details of the main target are obtained by the query within the accompanying-analysis time range, i.e., the precise times at which the main target passed each perception-source device. The main target is one of an identity card number, a license plate number, an IMSI code, or a MAC code, and the perception-source device is correspondingly an identity-card recognition terminal, a license-plate recognition device, or an IMSI/MAC acquisition device.
2.2) setting an offset distance, and inquiring a spatial position based on the perception source equipment passed by the main track in the step 2.1), wherein the inquiry comprises the following steps: and inquiring other sensing source equipment in the range by taking the position of each sensing source equipment passed by the main track as a circle center and the offset distance as a radius, and calling the other sensing source equipment as the accompanying sensing source equipment, and grouping according to the type of the sensing source equipment to obtain the one-to-many virtual co-location relation between the sensing source equipment passed by all the main targets and the other accompanying sensing source equipment in the offset distance range.
And 2.3) setting offset time, taking the accurate time of the main target passing through each perception source device obtained by inquiring in the step 2.1) as the center, adding or subtracting the offset time from front to back, and calculating the offset time range of each accompanying perception source device, thereby obtaining the time range of other types of accompanying perception source devices corresponding to the offset time range of the perception source device through which the main track passes.
2.4) according to all the accompanying perception source devices obtained in the step 2.2) and the offset time ranges of all the accompanying perception source devices obtained in the step 2.3), inquiring all the accompanying target data passing through the accompanying perception source devices in the second data source, aggregating all the data according to the accompanying targets, and then performing time sequencing on the data of each accompanying target to obtain the tracks of all the accompanying targets passing through the accompanying perception source devices, namely a plurality of accompanying tracks.
Further, step 3) comprises:
3.1) according to the one-to-many virtual same-position relation between the perception source equipment of the main track and the accompanying perception source equipment, the accompanying perception source equipment in the accompanying track is classified into corresponding main track perception source equipment;
3.2) carrying out deduplication operation on the data of the accompanying track in the offset time range according to the offset time range of each perception source device of the main track, wherein only one piece of data exists in one offset time range;
3.3) carrying out statistics on the accompanying tracks operated in the step 3.1) and the step 3.2), wherein the total number of records is the accompanying times of the main track;
3.4) carrying out statistics on the accompanying tracks after the operations of the step 3.1) and the step 3.2), wherein the number of different perception source devices passing through the accompanying tracks is the number of the accompanying perception source devices.
An adjoint analysis system among multi-source data adopting the above method comprises:
the main track query module is used for acquiring a main track of a main target passing through the perception source equipment in an accompanying analysis time range from a first data source;
the accompanying track query module is used for setting accompanying parameters and performing track data query in a second data source according to the accompanying parameters to obtain a plurality of accompanying tracks;
and the merging analysis module is used for merging the accompanying tracks according to the accompanying parameters, and performing track comparison on the main track and the merged accompanying tracks to obtain the accompanying times and the number of the accompanying perception source devices.
Compared with the prior art, the accompanying-analysis method for multi-source data can perform accompanying analysis on data sources such as identity-card acquisition records, vehicle records from perception-source devices, and mobile-phone IMSI and MAC acquisition records, and is equally applicable to richer data sources. It realizes an efficient accompanying-analysis method, can discover hidden accompanying information among different data sources, and greatly improves data-use efficiency and service extensibility.
Drawings
FIG. 1 is a flow chart of a companion analysis method between multi-source data according to the present disclosure.
Detailed Description
The present invention will be described in further detail below with reference to specific examples and the accompanying drawings.
Referring to fig. 1, the technical solution provided in this embodiment includes the following specific steps:
1) The accompaniment of multi-source data depends directly on the longitude and latitude of the perception-source devices, so a unified coordinate system must be used; for example, Baidu BD-09 coordinates should be converted to the GCJ-02 coordinates used by Google (in China), Tencent, and AMap (Gaode).
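As a concrete illustration of this coordinate unification, the widely published closed-form conversion from Baidu BD-09 to GCJ-02 can be sketched as follows. The function name is an assumption; the constants are the commonly circulated ones, accurate to roughly metre level, which suffices for offset radii of tens of metres (this is not an official specification):

```python
import math

# Commonly published BD-09 -> GCJ-02 conversion constant.
_X_PI = math.pi * 3000.0 / 180.0

def bd09_to_gcj02(bd_lon, bd_lat):
    """Convert Baidu BD-09 coordinates to GCJ-02 (approximate, metre-level)."""
    x = bd_lon - 0.0065
    y = bd_lat - 0.006
    z = math.sqrt(x * x + y * y) - 0.00002 * math.sin(y * _X_PI)
    theta = math.atan2(y, x) - 0.000003 * math.cos(x * _X_PI)
    return z * math.cos(theta), z * math.sin(theta)
```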
2) A primary trajectory of a primary target through its perceptual source device over a time range (accompanying analysis time range) is obtained.
For example, the main target to be queried is a certain license plate number "Jing XXXXXXX"; data (the first data source) are captured by perception-source vehicle cameras in the time range 2020-01-01 00:00:00 to 2020-01-01 23:59:59, and the basic structure of the main-track data is shown in Table 1.
TABLE 1  Main track information table

No. | Main target  | Snapshot time       | Vehicle camera code
 1  | Jing XXXXXXX | 2020-01-01 01:30:00 | S1
 2  | Jing XXXXXXX | 2020-01-01 03:30:00 | S2
 3  | Jing XXXXXXX | 2020-01-01 05:30:00 | S3
 4  | Jing XXXXXXX | 2020-01-01 07:30:00 | S4
 5  | Jing XXXXXXX | 2020-01-01 09:30:00 | S5
 6  | Jing XXXXXXX | 2020-01-01 11:30:00 | S6
 7  | Jing XXXXXXX | 2020-01-01 13:30:00 | S7
 8  | Jing XXXXXXX | 2020-01-01 15:30:00 | S8
 9  | Jing XXXXXXX | 2020-01-01 17:30:00 | S9
10  | Jing XXXXXXX | 2020-01-01 19:30:00 | S10
11  | Jing XXXXXXX | 2020-01-01 21:30:00 | S5
12  | Jing XXXXXXX | 2020-01-01 23:30:00 | S3
3) Setting an accompanying parameter, and inquiring track data in a second data source (information acquired by an electronic fence of sensing source equipment of the IMSI code of the mobile phone) according to the set accompanying parameter to obtain a plurality of tracks of accompanying targets, namely accompanying tracks, wherein the specific method comprises the following steps:
3.1) The main-track details of the main target are obtained by the query within the analysis time range, i.e., the precise time at which the main target passed each perception-source device S1, S2, ..., as shown in Table 1.
3.2) An offset distance is set, and a spatial query is performed based on the perception-source devices passed by the main track: taking the position of each such device as the circle center and the offset distance as the radius, the other perception-source devices within that range (called accompanying perception-source devices) are queried and grouped by device type, yielding the one-to-many virtual co-location relation between the devices passed by the main target and the accompanying devices within the offset-distance range. For example, with an offset distance of 20 meters, roughly the span of one intersection, the simulated relation is shown in Table 2.
TABLE 2  Main and accompanying trajectory perception-source offset-position lookup table

Main track No. | Vehicle camera code | Electronic fence code(s)
 1 | S1  | D11
 2 | S2  | D21
 3 | S3  | D31
 4 | S4  | (none)
 5 | S5  | D51
 6 | S6  | D61, D62
 7 | S7  | (none)
 8 | S8  | D81
 9 | S9  | (none)
10 | S10 | D101
An empty entry in Table 2 indicates that no electronic-fence device is installed within the offset range of that vehicle camera.
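The offset-distance query of step 3.2) can be sketched with a plain haversine distance check; a production system would use a spatial index instead of a full scan. The device dictionaries keyed by code with (lon, lat) values, the function names, and the 20-metre default are illustrative assumptions:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance in metres between two lon/lat points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def accompanying_devices(main_devices, other_devices, offset_m=20.0):
    """Map each main-track device to the other-source devices within offset_m,
    i.e. the one-to-many virtual co-location relation of step 3.2)."""
    return {
        mid: [oid for oid, (olon, olat) in other_devices.items()
              if haversine_m(mlon, mlat, olon, olat) <= offset_m]
        for mid, (mlon, mlat) in main_devices.items()
    }
```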
3.3) An offset time is set. Taking as the center the precise time at which the main target passed each perception-source device (obtained by the query in step 3.1)), the offset time is added and subtracted to compute the offset time range for each accompanying perception-source device, thereby obtaining the query time range of the other-type accompanying devices corresponding to each device passed by the main track. For example, with the offset time set to 60 seconds, the resulting electronic-fence query times within the offset range are shown in Table 3.
3.4) inquiring all the accompanying target data passing through the accompanying perception source equipment in the second data source according to all the accompanying perception source equipment obtained in the step 3.2) and the offset time ranges of all the accompanying perception source equipment obtained in the step 3.3), aggregating all the data according to the accompanying targets, and then performing time sequencing on the data of each accompanying target to obtain the tracks of all the accompanying targets passing through the accompanying perception source equipment, namely a plurality of accompanying tracks.
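Steps 3.3) and 3.4) can be sketched as follows, assuming times are Unix seconds, the main track is a list of (device, time) pairs, and second-source records are (target, device, time) tuples; all names are illustrative assumptions:

```python
from collections import defaultdict

def offset_windows(main_track, offset_s=60):
    """Step 3.3): a [ts - offset_s, ts + offset_s] window around each pass of
    the main track, which is a list of (device, unix_seconds) pairs."""
    return [(dev, ts - offset_s, ts + offset_s) for dev, ts in main_track]

def accompanying_trajectories(records, colocated, windows):
    """Step 3.4): keep second-source records (target, device, ts) that fall in
    the window of a co-located device, aggregate by target, sort by time."""
    by_target = defaultdict(list)
    for tgt, dev, ts in records:
        for mdev, lo, hi in windows:
            if dev in colocated.get(mdev, ()) and lo <= ts <= hi:
                by_target[tgt].append((dev, ts))
                break
    return {tgt: sorted(pts, key=lambda p: p[1]) for tgt, pts in by_target.items()}
```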
The data for the first accompanying target, 460016798008534, are shown in Table 4.
The data for the second accompanying target, 460027754062396, are shown in Table 5.
TABLE 3 Master and companion trajectory offset time lookup tables
(table data rendered as an image in the original publication; not recoverable as text)
TABLE 4  Accompanying target 460016798008534 trajectory information

No. | Accompanying target | Electronic fence code | Acquisition time
 1  | 460016798008534 | D11  | 2020-01-01 01:29:10
 2  | 460016798008534 | D21  | 2020-01-01 03:29:35
 3  | 460016798008534 | D31  | 2020-01-01 05:29:56
 4  | 460016798008534 | D51  | 2020-01-01 09:30:10
 5  | 460016798008534 | D61  | 2020-01-01 11:29:25
 6  | 460016798008534 | D62  | 2020-01-01 11:30:20
 7  | 460016798008534 | D81  | 2020-01-01 15:29:47
 8  | 460016798008534 | D81  | 2020-01-01 15:30:45
 9  | 460016798008534 | D101 | 2020-01-01 19:29:29
10  | 460016798008534 | D51  | 2020-01-01 21:30:45
11  | 460016798008534 | D31  | 2020-01-01 23:29:29
TABLE 5 accompanying target 460027754062396 trajectory information
(table data rendered as an image in the original publication; not recoverable as text)
Also, trajectory information for a third companion target, a fourth companion target, or even more companion targets may be obtained.
4) Merging, normalization, and deduplication operations are performed on the accompanying tracks according to the accompanying parameters, and the main track is compared with the processed accompanying tracks to obtain the number of accompanying occurrences and the number of accompanying perception-source devices, thereby providing unified data support for accompaniment and identity determination.
The specific method comprises the following steps:
4.1) according to the one-to-many virtual same-position relation between the main track perception source equipment and the accompanying perception source equipment, enabling the accompanying perception source equipment in the accompanying track to be grouped into corresponding main track perception source equipment;
the data accompanying the target 460016798008534 was subjected to trajectory processing in accordance with the relationship of Table 2, where D61, D62 were both normalized to S6, as shown in Table 6.
TABLE 6 track information after normalization with object 460016798008534
(table data rendered as an image in the original publication; not recoverable as text)
4.2) According to the offset time range of each main-track perception-source device, the accompanying-track data within each offset time range are deduplicated so that at most one record remains per offset time range, as shown in Table 7.
TABLE 7 De-duplication of track information with object 460016798008534
(table data rendered as an image in the original publication; not recoverable as text)
4.3) The accompanying tracks after the operations of steps 4.1) and 4.2) are counted; the total number of records is the number of accompanying occurrences of the main track.
Counting the data for 460016798008534 in Table 6, the number of accompanying occurrences is the total count of accompanying-track records: with the offset distance at 20 meters and the offset time at 60 seconds, the mobile phone with IMSI code 460016798008534 accompanies the license plate Jing XXXXXXX 9 times.
4.4) The accompanying tracks after the operations of steps 4.1) and 4.2) are counted; the number of distinct perception-source devices passed is the number of accompanying perception-source devices.
Counting the data for 460016798008534 in Table 6, the perception-source devices S5 and S3, each passed twice in different time ranges, are deduplicated, so the number of distinct perception-source devices passed is 7; that is, with the offset distance at 20 meters and the offset time at 60 seconds, the mobile phone with IMSI code 460016798008534 shares 7 accompanying perception-source devices with the license plate Jing XXXXXXX.
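Steps 4.1) through 4.4) (normalization to main-track devices, per-window deduplication, and the two counts) can be sketched as follows, assuming an accompanying track is a list of (device, time) tuples and windows are (main_device, start, end) tuples; all names are illustrative assumptions:

```python
def accompanying_stats(companion_track, colocated, windows):
    """Steps 4.1)-4.4): normalize accompanying devices to their main-track
    device, keep at most one record per offset window (deduplication), then
    count accompanying occurrences and distinct accompanying devices."""
    # invert the one-to-many co-location relation: fence code -> camera code
    to_main = {d: m for m, devs in colocated.items() for d in devs}
    hits = set()
    for dev, ts in companion_track:
        mdev = to_main.get(dev)
        for wdev, lo, hi in windows:
            if wdev == mdev and lo <= ts <= hi:
                hits.add((mdev, lo))  # one hit per (device, window) after dedup
                break
    times = len(hits)                    # accompanying occurrences
    devices = len({m for m, _ in hits})  # distinct accompanying devices
    return times, devices
```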
5) Each accompanying track from step 3) is processed repeatedly as in step 4) to obtain all accompanying targets of the main target (the license plate Jing XXXXXXX), and the accompanying situation is analyzed from each accompanying target's number of accompanying occurrences and number of accompanying perception-source devices.
The method works well at discovering accompanying relations for perception-source data with a determinate primary key, such as identity-card acquisition devices, license-plate recognition systems, and IMSI or MAC acquisition devices. With the development of artificial-intelligence technology, however, key features structured by face-recognition systems, gait-recognition systems, and video analysis can also be used: for a person, height, clothing color, and whether a hat is worn; for a vehicle whose plate cannot be captured, its color, appearance, model, and brand. The information captured by each perception-source device is compared and normalized into virtual primary-key information, for example a portrait file in which photos with similarity of roughly 98% or more are merged into the same file; the file is then encoded to obtain a virtual primary key, and the accompanying analysis is performed on that key.
Based on the same inventive concept, another embodiment of the present invention provides a system for adjoint analysis between multi-source data by using the method of the present invention, which comprises:
the main track query module is used for acquiring a main track of a main target passing through the perception source equipment in an accompanying analysis time range from a first data source;
the accompanying track query module is used for setting accompanying parameters and performing track data query in a second data source according to the accompanying parameters to obtain a plurality of accompanying tracks;
and the merging analysis module is used for merging the accompanying tracks according to the accompanying parameters, and performing track comparison on the main track and the merged accompanying tracks to obtain the accompanying times and the number of the accompanying perception source devices.
The specific operation processes executed by each module are referred to the description of the method of the invention.
Based on the same inventive concept, another embodiment of the present invention provides an electronic device (computer, server, smartphone, etc.) comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the computer program comprising instructions for performing the steps of the method of the invention.
Based on the same inventive concept, another embodiment of the present invention provides a computer-readable storage medium (e.g., ROM/RAM, magnetic disk, optical disk) storing a computer program, which when executed by a computer, performs the steps of the inventive method.
The particular embodiments of the present invention disclosed above are illustrative only and are not intended to be limiting, since various alternatives, modifications, and variations will be apparent to those skilled in the art without departing from the spirit and scope of the invention. The invention should not be limited to the disclosure of the embodiments in the present specification, but the scope of the invention is defined by the appended claims.

Claims (10)

1. A method for adjoint analysis among multi-source data is characterized by comprising the following steps:
acquiring a main track of a main target passing through a perception source device of the main target within an accompanying analysis time range from a first data source;
setting an accompanying parameter, and performing track data query in a second data source according to the accompanying parameter to obtain a plurality of accompanying tracks;
and merging the accompanying tracks according to the accompanying parameters, and performing track comparison on the main track and the merged accompanying tracks to obtain the accompanying times and the number of the accompanying sensing source devices.
2. The method of claim 1, wherein unified latitude and longitude information is set for the perception-source devices in the first data source and the second data source.
3. The method of claim 1, wherein the accompanying parameters comprise: the offset time for passing the perception-source devices, and the offset distance between the perception-source devices.
4. The method of claim 1, wherein the setting of the accompanying parameters and the performing of the track data query in the second data source according to the accompanying parameters to obtain the plurality of accompanying tracks comprises:
setting an offset distance, and inquiring the spatial position based on the perception source equipment passed by the main track, wherein the method comprises the following steps: inquiring other sensing source equipment in the range by taking the position of each sensing source equipment passed by the main track as the center of a circle and the offset distance as the radius, and calling the other sensing source equipment as accompanying sensing source equipment, and grouping according to the type of the sensing source equipment to obtain one-to-many virtual same-position relation between the sensing source equipment passed by all main targets and the other accompanying sensing source equipment in the offset distance range; setting offset time, taking the accurate time of the main target passing through each perception source device as a center, adding or subtracting the offset time from front to back, calculating the offset time range passing through each accompanying perception source device, and obtaining the time range of other types of accompanying perception source devices corresponding to the offset time range of the perception source device passing through the main track;
according to all the accompanying perception source devices and the offset time ranges of all the accompanying perception source devices, data of all the accompanying targets passing through the accompanying perception source devices are inquired in the second data source, all the data are aggregated according to the accompanying targets, and then time sequencing is carried out on the data of each accompanying target, so that tracks of all the accompanying targets passing through the accompanying perception source devices, namely a plurality of accompanying tracks, are obtained.
5. The method of claim 1, wherein said merging the companion trajectory according to the companion parameters comprises:
according to the one-to-many virtual same-position relation between the perception source equipment of the main track and the accompanying perception source equipment, the accompanying perception source equipment in the accompanying track is grouped into corresponding main track perception source equipment;
and according to the offset time range of each perception source device of the main track, carrying out deduplication operation on the data of the accompanying track in the offset time range, wherein only one piece of data exists in one offset time range.
6. The method of claim 5, wherein the comparing the main track and the merged companion track to obtain the companion times and the number of companion sensing source devices comprises:
counting the accompanying tracks after the merging operation, wherein the total number is the accompanying times of the main track;
and counting the accompanying tracks after the merging operation, wherein the number of different sensing source devices passing through the accompanying tracks is the number of the accompanying sensing source devices.
7. The method of claim 1, wherein the main target is an identity card number, a license plate number, an IMSI code, or a MAC code, and the perception-source device is an identity-card recognition terminal, a license-plate recognition device, or an IMSI/MAC acquisition device.
8. A system for companion analysis between multi-source data, comprising:
the main track query module is used for acquiring a main track of a main target passing through the perception source equipment in an accompanying analysis time range from a first data source;
the accompanying track query module is used for setting accompanying parameters and performing track data query in a second data source according to the accompanying parameters to obtain a plurality of accompanying tracks;
and the merging analysis module is used for merging the accompanying tracks according to the accompanying parameters, and performing track comparison on the main track and the merged accompanying tracks to obtain the accompanying times and the number of the accompanying perception source devices.
9. An electronic apparatus, comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the computer program comprising instructions for performing the method of any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a computer, implements the method of any one of claims 1 to 7.
CN202110691885.4A, filed 2021-06-22, published as CN113449158A (en): Adjoint analysis method and system among multi-source data. Status: Pending.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110691885.4A | 2021-06-22 | 2021-06-22 | Adjoint analysis method and system among multi-source data

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202110691885.4A | 2021-06-22 | 2021-06-22 | Adjoint analysis method and system among multi-source data

Publications (1)

Publication Number | Publication Date
CN113449158A (en) | 2021-09-28

Family

ID=77812089

Family Applications (1)

Application Number | Status | Publication | Priority Date | Filing Date
CN202110691885.4A | Pending | CN113449158A (en) | 2021-06-22 | 2021-06-22

Country Status (1)

Country | Link
CN | CN113449158A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN114638937A (en)* | 2022-03-25 | 2022-06-17 | 中电达通数据技术股份有限公司 | Accompanying person searching method based on three-dimensional grid principle
CN115408624A (en)* | 2022-08-25 | 2022-11-29 | 中国电子产业工程有限公司 | Peer analysis method and system among heterogeneous data
WO2025117097A1 (en)* | 2023-11-28 | 2025-06-05 | Flatiron Health, Inc. | Techniques for transmitting unstructured data between different systems

Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20190056423A1 (en)* | 2016-03-25 | 2019-02-21 | Alibaba Group Holding Limited | Adjoint analysis method and apparatus for data
CN109388663A (en)* | 2018-08-24 | 2019-02-26 | 中国电子科技集团公司电子科学研究院 | A society-facing big-data intelligent analysis platform for the security field
CN109977109A (en)* | 2019-04-03 | 2019-07-05 | 深圳市甲易科技有限公司 | A track data cleaning method and adjoint analysis method
CN109977108A (en)* | 2019-04-03 | 2019-07-05 | 深圳市甲易科技有限公司 | A multiple-trajectory collision analysis method based on a behavior track library
CN112000736A (en)* | 2020-08-14 | 2020-11-27 | 济南浪潮数据技术有限公司 | Spatiotemporal trajectory adjoint analysis method and system, electronic device and storage medium
CN112131278A (en)* | 2020-09-28 | 2020-12-25 | 浙江大华技术股份有限公司 | Method and device for processing track data, storage medium and electronic device
CN112561948A (en)* | 2020-12-22 | 2021-03-26 | 中国联合网络通信集团有限公司 | Method, device and storage medium for recognizing accompanying track based on space-time track



Similar Documents

Publication | Publication Date | Title
CN110334111B (en)Multidimensional track analysis method and device
CN113449158A (en)Adjoint analysis method and system among multi-source data
CN106506705B (en)Crowd classification method and device based on location service
US10901967B2 (en)License plate matching systems and methods
CN103942811B (en)Distributed parallel determines the method and system of characteristic target movement locus
CN110837582B (en)Data association method and device, electronic equipment and computer-readable storage medium
TWI425454B (en)Method, system and computer program product for reconstructing moving path of vehicle
CN105307121B (en)A kind of information processing method and device
CN107665289B (en)Operator data processing method and system
CN109656973B (en)Target object association analysis method and device
CN112770265B (en)Pedestrian identity information acquisition method, system, server and storage medium
CN110737786A (en)data comparison collision method and device
WO2021114615A1 (en)Method, apparatus, and device for visualization of behavior risk identification, and storage medium
CN114185964A (en)Data processing method, device, equipment, storage medium and program product
CN113971821A (en)Driver information determination method and device, terminal device and storage medium
WO2018176191A1 (en)Method and apparatus for identifying vehicle with fake registration plate
CN107729416B (en)Book recommendation method and system
CN108566620A (en)A kind of indoor orientation method based on WIFI
CN113094388A (en)Method and related device for detecting user workplace and residence
CN112925899B (en)Ordering model establishment method, case clue recommendation method, device and medium
CN114078277A (en)One-person-one-file face clustering method and device, computer equipment and storage medium
CN112257666B (en)Target image content aggregation method, device, equipment and readable storage medium
Shankar et al.An enhanced car parking detection using yolov8 deep learning capabilities
CN112925948A (en)Video processing method and device, medium, chip and electronic equipment thereof
CN112818067A (en)Big data and multidimensional feature combined data tracing method and big data cloud server

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
WD01 | Invention patent application deemed withdrawn after publication

Application publication date: 2021-09-28

