CN110969704A - Marker generation tracking method and device based on AR guide - Google Patents

Marker generation tracking method and device based on AR guide

Info

Publication number
CN110969704A
Authority
CN
China
Prior art keywords
terminal
target point
information
coordinate information
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911182161.6A
Other languages
Chinese (zh)
Other versions
CN110969704B (en)
Inventor
王一男
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xinshijie Technology Co Ltd
Original Assignee
Beijing Xinshijie Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xinshijie Technology Co Ltd
Priority to CN201911182161.6A
Publication of CN110969704A
Application granted
Publication of CN110969704B
Legal status: Active (current)
Anticipated expiration

Abstract

The application provides a marker generation and tracking method and device based on AR guidance. The marker generation method comprises the following steps: acquiring coordinate information of a first terminal and taking the coordinate information as coordinate information of a target point; selecting a preset 3D mark pattern; placing the 3D mark pattern above the target point to form an AR stereo graph; binding the coordinate information of the target point with the AR stereo graph to form an AR aerial mark information point; and sending the AR aerial mark information point to a second terminal for tracking. The marker tracking method determines the distance and direction between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and loads the distance and direction onto the terminal screen in AR form. By displaying the target position and distance in AR form, the application allows the target to be quickly located and tracked.

Description

Marker generation tracking method and device based on AR guide
Technical Field
The application belongs to the technical field of AR information processing, and particularly relates to a marker generation tracking method and device based on an AR guide.
Background
In daily life, there are many scenarios in which a place needs to be marked so that it can be found again, such as finding people, finding vehicles, or meeting strangers (for example, airport pickup, meeting a visitor, or collecting an express delivery).
In the prior art, a party with such a need can use a landmark building as a reference for finding the place, can mark the place with a physical object, or can take photos of the place with a camera as a reminder or to share with the other party. However, such methods are not intuitive: a large landmark building gives only a coarse reference, physical markers can be blocked, and photos may differ from the real scene, all of which make it troublesome to find the marked location.
Among existing digital approaches, an app with a planar map or navigation function can be used to record a place as a mark, which makes it easier for others to track or find it. However, such methods usually present the location to the user on a planar (2D) map, and a user with a poor sense of direction finds it difficult to determine his or her current position and the position of the marked point from where the location is shown on the map.
Disclosure of Invention
In order to solve at least one of the above technical problems, the present application provides a method and an apparatus for marker generation and tracking based on an AR guide, which make it easier for the other party to track or search by generating an AR aerial mark.
In a first aspect of the present application, a method for generating a marker based on an AR guide is applied to a first terminal that locates a target point, and the method for generating the marker includes: acquiring coordinate information of the first terminal, and taking the coordinate information as coordinate information of a target point; selecting a preset 3D mark pattern; placing the 3D mark pattern above the target point to form an AR stereo graph; binding the coordinate information of the target point with the AR three-dimensional graph to form an AR aerial mark information point; and sending the AR aerial mark information point to a second terminal for tracking.
Preferably, the forming of the AR stereoscopic image further includes: and loading the coordinate information of the target point on the 3D marking pattern.
Preferably, the forming of the AR stereoscopic image further includes: and acquiring the time when the AR stereo graph is formed, and loading the time onto the 3D mark pattern.
Preferably, before sending the AR aerial mark information point to the second terminal for tracking, the method includes: selecting a marking mode, wherein the marking mode comprises a fixed positioning marking mode or a mobile positioning marking mode, and in the mobile positioning marking mode, the coordinate information of the first terminal is updated at a predetermined time interval, and the AR aerial mark information point is generated and sent to the second terminal.
In a second aspect of the present application, a marker tracking method based on an AR guide is applied to a second terminal that tracks a target point, and the marker tracking method includes: acquiring coordinate information and an AR stereo graph of a target point according to an AR aerial mark information point sent by a first terminal for positioning the target;
acquiring coordinate information of the second terminal and screen orientation information of the second terminal; moving the orientation of the second terminal screen, and loading the AR stereo graph to a target point in the second terminal screen when the coordinate information of the target point is located in the coverage range of the orientation of the second terminal screen; and determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and placing the distance on the AR stereo graph.
Preferably, the acquiring the coordinate information of the second terminal and the second terminal screen orientation information includes: acquiring the screen orientation of the second terminal through a gyroscope; and acquiring the longitude and latitude coordinates of the second terminal through a GPS or a network.
Preferably, the loading the AR stereoscopic graphic to a target point within a screen of the second terminal includes: calculating the offset of the coordinate information of the target point relative to the second terminal; and determining the loading position of the target point on the screen of the second terminal according to the offset by taking the second terminal as an origin, and displaying the AR stereo graph at the loading position.
In a third aspect of the present application, a marker generating device based on an AR guide is applied to a first terminal for locating a target point, and the marker generating device includes: the target positioning module is used for acquiring the coordinate information of the first terminal and taking the coordinate information as the coordinate information of a target point; the marking pattern selection module is used for selecting a preset 3D marking pattern; the AR three-dimensional graph generating module is used for placing the 3D marking pattern above the target point to form an AR three-dimensional graph; the AR aerial mark information point generating module is used for binding the coordinate information of the target point with the AR stereo graph to form an AR aerial mark information point; and the communication module is used for sending the AR air mark information point to a second terminal for tracking.
Preferably, the device further comprises a marking mode selection module, configured to select a marking mode before the AR aerial mark information point is sent to the second terminal for tracking, where the marking mode includes a fixed positioning marking mode or a mobile positioning marking mode, and in the mobile positioning marking mode, the coordinate information of the first terminal is updated at a predetermined time interval, and the AR aerial mark information point is generated and sent to the second terminal.
In a fourth aspect of the present application, a marker tracking apparatus based on an AR guide is applied to a second terminal for tracking a target point, the marker tracking apparatus includes: the target point acquisition module is used for acquiring coordinate information and an AR stereo graph of a target point according to an AR aerial mark information point sent by a first terminal for positioning a target; the AR information initialization module is used for acquiring coordinate information of the second terminal and screen orientation information of the second terminal; the AR stereo graphic loading module is used for moving the orientation of the second terminal screen, and loading the AR stereo graphic to a target point in the second terminal screen when the coordinate information of the target point is positioned in the coverage range of the orientation of the second terminal screen; and the target point distance loading module is used for determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and placing the distance on the AR stereo graph.
According to the present application, a digital AR aerial mark is erected in the air and shared with a third party; the position and distance of the target are displayed in AR form, so the target can be quickly located and tracked.
Drawings
FIG. 1 is a flowchart of a preferred embodiment of the method for generating a marker based on an AR guide according to the present application.
FIG. 2 is a flowchart of a preferred embodiment of the marker tracking method based on AR guide of the present application.
FIG. 3 is a schematic diagram of vehicle location tracking according to the embodiment shown in FIG. 2.
Fig. 4 is a schematic diagram of setting up an airborne SOS flag and viewing the flag according to the embodiment of fig. 2.
Fig. 5 is a schematic diagram of the embodiment of fig. 2 of the present application for locating persons.
Fig. 6 is a block diagram of a preferred embodiment of the marker generating device based on AR guide according to the present application.
FIG. 7 is a block diagram of a preferred embodiment of the marker tracking device based on AR guide according to the present application.
Detailed Description
In order to make the implementation objects, technical solutions and advantages of the present application clearer, the technical solutions in the embodiments of the present application will be described in more detail below with reference to the accompanying drawings in the embodiments of the present application. In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are some, but not all embodiments of the present application. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application, and should not be construed as limiting the present application. All other embodiments obtained by a person of ordinary skill in the art without any inventive work based on the embodiments in the present application are within the scope of protection of the present application. Embodiments of the present application will be described in detail below with reference to the drawings.
According to a first aspect of the present application, a marker generating method based on an AR guide is applied to a first terminal for locating a target point, as shown in fig. 1, and the marker generating method includes:
step S11, acquiring coordinate information of the first terminal, and taking the coordinate information as coordinate information of a target point;
step S12, selecting a preset 3D mark pattern;
step S13, placing the 3D marking pattern above the target point to form an AR stereo graph;
step S14, binding the coordinate information of the target point with the AR stereo graph to form an AR aerial mark information point;
and step S15, sending the AR air mark information point to a second terminal for tracking.
Before step S15, the user may choose to create a general mark, a distress mark or a real-time mark by clicking a mark-issuing button on the screen of the smart mobile device (e.g., a mobile phone). For the distress mark, the AR aerial mark can penetrate any obstruction, so the method can be applied to field rescue: a trapped person can set an aerial SOS mark, and rescuers can visually see the trapped person's AR aerial mark and thereby determine the direction.
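The flow of steps S11 to S15 can be illustrated with a short sketch. The data structure and transport below are assumptions made for illustration (the application does not specify field names, an `ARAerialMark` class, a `generate_mark` helper, or a `channel.send` API); the sketch only shows how the target coordinates, the chosen 3D pattern, and optional time/text are bound into one AR aerial mark information point and handed to the second terminal.

```python
# Minimal sketch of steps S11-S15, under assumed data structures and transport.
import time
from dataclasses import dataclass, field

@dataclass
class ARAerialMark:
    """An AR aerial mark information point: target coordinates bound to a 3D pattern."""
    lat: float              # latitude of the target point (first-terminal position)
    lon: float              # longitude of the target point
    altitude: float         # height at which the pattern floats above the target
    pattern: str            # preset 3D mark pattern id, e.g. "arrow_circle" or "sos"
    created_at: float = field(default_factory=time.time)  # optional creation time
    text: str = ""          # optional custom text, e.g. an SOS message

def generate_mark(first_terminal_fix, pattern="arrow_circle", text=""):
    lat, lon, alt = first_terminal_fix          # S11: terminal coordinates become the target point
    return ARAerialMark(lat, lon, alt + 10.0,   # S12/S13: chosen pattern floats above the point
                        pattern=pattern, text=text)  # S14: coordinates and graphic bound together

def send_to_second_terminal(mark, channel):
    channel.send(mark)                          # S15: hand the information point to the tracker
```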
In some optional embodiments, when forming the AR stereoscopic image, the method further includes: and loading the coordinate information of the target point on the 3D marking pattern.
In some optional embodiments, when forming the AR stereoscopic image, the method further includes: and acquiring the time when the AR stereo graph is formed, and loading the time onto the 3D mark pattern.
Fig. 4 shows the style of the AR stereo graph used for SOS search and rescue; while the first terminal performs the marker generation step, the time may be bound to the AR stereo graph in response to a user request.
On the other hand, the AR stereo graphic may also load custom text information, such as SOS distress information.
In some optional embodiments, before sending the AR aerial mark information point to the second terminal for tracking, the method includes: selecting a marking mode, wherein the marking mode comprises a fixed positioning marking mode or a mobile positioning marking mode, and in the mobile positioning marking mode, the coordinate information of the first terminal is updated at a predetermined time interval, and the AR aerial mark information point is generated and sent to the second terminal.
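As a sketch of the mobile positioning marking mode only, the loop below re-reads the first terminal's position at a fixed interval, regenerates the information point, and resends it. The `get_gps_fix` and `channel` names, the interval value, and the `generate_mark` helper (from the earlier sketch) are assumptions, not the application's actual API.

```python
# Hedged sketch: regenerate and resend the AR aerial mark information point every
# `interval` seconds while the first terminal moves.
import time

def run_mobile_positioning_mode(get_gps_fix, channel, interval=5.0, pattern="arrow_circle"):
    while True:
        lat, lon, alt = get_gps_fix()                    # updated first-terminal coordinates
        mark = generate_mark((lat, lon, alt), pattern)   # rebuild the information point (see above)
        channel.send(mark)                               # push the fresh mark to the second terminal
        time.sleep(interval)                             # predetermined update interval
```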
The first terminal or the second terminal can be a mobile phone, a tablet, or a wearable mobile device. The second terminal can be the same device as the first terminal, in which case the first terminal and the second terminal communicate through the terminal's internal bus; the second terminal can also be a terminal device different from the first terminal, in which case the two terminals communicate through a network. In the first case, after the first terminal performs steps S11 to S15, the formed AR aerial mark information point is stored internally, which is equivalent to being sent to the second terminal, and the terminal may then move to another location.
As can be seen from the schematic diagram of vehicle positioning and tracking in the embodiment shown in fig. 3, after a user parks a vehicle at a fixed position, the user performs positioning through the (first) terminal and generates an AR aerial mark information point in which only the coordinate information of the vehicle is stored. When selecting the marking mode, the user uses the fixed positioning marking mode, so the AR aerial mark information point is stored (equivalent to being sent to the second terminal). From that moment, the coordinate information of the AR aerial mark information point is decoupled from the coordinate information of the first terminal itself, and the first terminal can move along with the user; when performing mark tracking, the AR aerial mark information point is tracked and searched through the (second) terminal.
In the present application, in consideration of the positioning accuracy, the coordinate information generally indicates a certain area, which may be a two-dimensional area or a three-dimensional area. Under the two-dimensional condition, different interest areas can be marked and distinguished according to the information of longitude and latitude coordinates; under the three-dimensional condition, the height information can be added on the basis of the longitude and latitude coordinates, so that the areas with the same longitude and latitude coordinates and different heights can be divided into different areas, and more accurate area division is realized.
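One simple way to treat a coordinate fix as an area rather than a point is to quantize it into a grid cell, in 2D by latitude and longitude and in 3D by adding a height band. The sketch below is purely illustrative; the cell sizes and function names are assumptions, and the application does not prescribe any particular quantization.

```python
# Illustrative sketch: quantize a coordinate fix into a 2D or 3D "area" so that
# nearby fixes map to the same region. Cell sizes are arbitrary examples.
def region_2d(lat, lon, cell_deg=0.0001):          # ~11 m of latitude per cell
    return (round(lat / cell_deg), round(lon / cell_deg))

def region_3d(lat, lon, alt, cell_deg=0.0001, cell_m=3.0):
    return region_2d(lat, lon, cell_deg) + (round(alt / cell_m),)  # add a height band
```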
In an embodiment, when the second terminal is different from the first terminal, the AR aerial mark information point may be generated by selecting the mobile positioning marking mode; for example, in the scenario of meeting a stranger, the first terminal updates its coordinate information at a predetermined time interval, generates the AR aerial mark information point in real time, and sends it to the second terminal.
It should be understood that the roles of first terminal and second terminal are relative: a single handset can serve as the first terminal and also as the second terminal, depending on which function module of the app is opened, with different modules corresponding to different roles. In the above scenario of meeting a stranger, the first person executes the mark generation steps of the first terminal through the coordinate generation module of the app, and the second person does the same; after the two communicate, each sends his or her own AR aerial mark information point to the other. The first person then executes the mark tracking steps of the second terminal through the tracking module of the app, and likewise the second person can execute the mark tracking steps of the second terminal through the tracking module of his or her own mobile phone app to find and track the first person.
The 3D mark pattern of step S12 in this application may be an arrow-plus-circle positioning pattern. As shown in figs. 3-5, the aerial mark floats in the air at a preset height, with a preset shape and color; after leaving the current location, the user can use the application to view the aerial mark and can share it with a third party.
The first terminal and the second terminal can interact directly or through a server, and the related functions can be integrated into corresponding apps, with different functions started by different buttons. The AR aerial mark information point used for an SOS distress call is generally sent to the second terminal through the server in a point-to-point manner. In alternative implementations, the server can broadcast it instead: the server sets a receiving range according to the target point position information of the first terminal, so that, for example, users within 3 km of the trapped person can act as second terminals and receive the related information.
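A minimal sketch of that server-side broadcast selection follows: it picks every registered second terminal within a receiving radius (e.g. 3 km) of the first terminal's target point. The haversine great-circle distance is a standard formula; the terminal-registry structure and function names are assumptions made for illustration.

```python
# Sketch of a server-side radius filter for broadcasting an AR aerial mark.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def select_receivers(target_lat, target_lon, terminals, radius_m=3_000.0):
    """terminals: iterable of (terminal_id, lat, lon); returns ids inside the radius."""
    return [tid for tid, lat, lon in terminals
            if haversine_m(target_lat, target_lon, lat, lon) <= radius_m]
```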
In a second aspect of the present application, a marker tracking method based on an AR guide is applied to a second terminal for tracking a target point, as shown in fig. 2, the marker tracking method includes:
step S21, acquiring coordinate information of a target point and an AR stereo graph according to an AR aerial mark information point sent by a first terminal for positioning a target;
step S22, acquiring coordinate information of the second terminal and screen orientation information of the second terminal;
step S23, moving the orientation of the second terminal screen, and loading the AR stereo graph to a target point in the second terminal screen when the coordinate information of the target point is located in the coverage range of the orientation of the second terminal screen;
step S24, determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and placing the distance on the AR stereo graph.
In this embodiment, after the second terminal opens the App, it acquires the terminal's gyroscope and GPS information and performs AR scene initialization: based on the gyroscope and GPS data, an AR three-dimensional scene is established with the x axis pointing east, the y axis pointing opposite to gravity, and the z axis pointing south, and initialization is completed by setting the device position at startup as the (0, 0, 0) point. After initialization, the second terminal can view the aerial mark erected by the first terminal.
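The initialization convention can be captured in a small sketch: the GPS fix at startup becomes the scene origin, with x east, y up (opposite to gravity) and z south. The `ARScene` class and `init_scene` function are illustrative assumptions, not the app's actual API.

```python
# Sketch of AR scene initialization: the device pose at startup is the (0, 0, 0) origin.
from dataclasses import dataclass

@dataclass
class ARScene:
    origin_lat: float    # GPS latitude of the device when the scene was created
    origin_lon: float    # GPS longitude at startup
    origin_alt: float    # altitude at startup (meters)

def init_scene(gps_fix):
    """gps_fix: (lat, lon, alt) from GPS/network at App start; device sits at (0, 0, 0)."""
    lat, lon, alt = gps_fix
    return ARScene(lat, lon, alt)
```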
In an alternative embodiment, after initialization is completed, the terminal can choose, through a filter button, to view guide information or aerial marks. When viewing aerial marks is selected, the terminal requests from the server the AR aerial mark information centered on the current terminal's GPS coordinate within a certain nearby radius (configurable on the server); as shown in fig. 5, the AR aerial mark information of several first terminals can be acquired in this way. The x-axis and z-axis offsets are then calculated from the AR aerial mark GPS coordinate returned by the server and the current device's GPS coordinate; the x and z coordinates of the aerial mark information point are calculated from these offsets and the current device's position in the scene; the y coordinate is calculated from the difference between the altitude of the aerial mark returned by the server and the altitude of the current terminal; and finally the AR graphic of the aerial mark is placed at the calculated xyz coordinate position.
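A hedged sketch of that placement step follows, using the `ARScene` origin from the previous sketch: small latitude/longitude differences are converted into meters east and north with an equirectangular approximation (one simple choice; the application does not prescribe a formula), the altitude difference gives y, and the result is the scene position at which the AR graphic would be placed.

```python
# Sketch: convert an aerial mark's GPS coordinate into scene-space (x, y, z),
# with x pointing east, y up, z south, relative to the startup origin.
from math import radians, cos

METERS_PER_DEG_LAT = 111_320.0   # approximate meters per degree of latitude

def mark_scene_position(scene, mark_lat, mark_lon, mark_alt):
    d_north = (mark_lat - scene.origin_lat) * METERS_PER_DEG_LAT
    d_east = (mark_lon - scene.origin_lon) * METERS_PER_DEG_LAT * cos(radians(scene.origin_lat))
    x = d_east                          # x axis points east
    y = mark_alt - scene.origin_alt     # y axis points up (opposite to gravity)
    z = -d_north                        # z axis points south, so a mark to the north has negative z
    return (x, y, z)                    # position at which the AR graphic is placed
```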
When searching for the aerial mark, the application program calculates the position of the AR graphic using the gyroscope and GPS of the smart mobile device (such as a mobile phone); when the device faces that position, the device screen displays the AR graphic and the distance between the terminal and the aerial mark. For an aerial mark of the distress type, the first terminal or the server can additionally display the SOS graphic pattern and the time the mark was created, on top of the general mark.
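As a sketch of the "facing" check and the displayed distance, the code below compares the bearing from the second terminal to the aerial mark with the screen heading from the gyroscope/compass, and reports the mark as visible when it falls inside an assumed horizontal field of view. The field-of-view width, heading source and function names are assumptions; `haversine_m` is the distance helper from the earlier server-side sketch.

```python
# Sketch: is the aerial mark inside the screen's coverage, and how far away is it?
from math import radians, degrees, sin, cos, atan2

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees clockwise from north."""
    p1, p2, dlmb = radians(lat1), radians(lat2), radians(lon2 - lon1)
    y = sin(dlmb) * cos(p2)
    x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dlmb)
    return (degrees(atan2(y, x)) + 360.0) % 360.0

def mark_in_view(device_lat, device_lon, device_heading_deg,
                 mark_lat, mark_lon, fov_deg=60.0):
    b = bearing_deg(device_lat, device_lon, mark_lat, mark_lon)
    diff = (b - device_heading_deg + 180.0) % 360.0 - 180.0   # signed angular difference
    visible = abs(diff) <= fov_deg / 2                        # inside the screen's coverage?
    distance = haversine_m(device_lat, device_lon, mark_lat, mark_lon)  # shown on the AR graphic
    return visible, distance
```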
In some optional embodiments, the obtaining the coordinate information of the second terminal and the second terminal screen orientation information comprises: acquiring the screen orientation of the second terminal through a gyroscope; and acquiring the longitude and latitude coordinates of the second terminal through a GPS or a network.
In some optional embodiments, loading the AR stereoscopic graphic at a target point within a screen of the second terminal comprises: calculating the offset of the coordinate information of the target point relative to the second terminal; and determining the loading position of the target point on the screen of the second terminal according to the offset by taking the second terminal as an origin, and displaying the AR stereo graph at the loading position.
In a third aspect of the present application, there is provided a mark generating apparatus based on an AR guide, corresponding to the mark generating method above and applied to a first terminal for locating a target point. As shown in fig. 6, the mark generating apparatus includes: a target positioning module, used for acquiring the coordinate information of the first terminal and taking the coordinate information as the coordinate information of a target point; a marking pattern selection module, used for selecting a preset 3D marking pattern; an AR stereo graph generating module, used for placing the 3D marking pattern above the target point to form an AR stereo graph; an AR aerial mark information point generating module, used for binding the coordinate information of the target point with the AR stereo graph to form an AR aerial mark information point; and a communication module, used for sending the AR aerial mark information point to a second terminal for tracking.
In some optional embodiments, the device further includes a marking mode selection module, configured to select a marking mode before the AR aerial mark information point is sent to a second terminal for tracking, where the marking mode includes a fixed positioning marking mode or a mobile positioning marking mode, and in the mobile positioning marking mode, the coordinate information of the first terminal is updated at a predetermined time interval, and the AR aerial mark information point is generated and sent to the second terminal.
In a fourth aspect of the present application, there is provided a marker tracking apparatus based on an AR guide, corresponding to the marker tracking method described above and applied to a second terminal for tracking a target point. As shown in fig. 7, the marker tracking apparatus comprises: a target point acquisition module, used for acquiring the coordinate information and AR stereo graph of a target point according to an AR aerial mark information point sent by a first terminal that positions the target; an AR information initialization module, used for acquiring coordinate information of the second terminal and screen orientation information of the second terminal; an AR stereo graph loading module, used for moving the orientation of the second terminal screen and loading the AR stereo graph to the target point in the second terminal screen when the coordinate information of the target point lies within the coverage range of the second terminal screen's orientation; and a target point distance loading module, used for determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and placing the distance on the AR stereo graph.
The application also provides a terminal device, which comprises a processor, a memory and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the computer program to realize the marker generation tracking method based on the AR guide.
The present application also provides a readable storage medium storing a computer program for implementing an AR guide-based marker generation tracking method as described above when the computer program is executed by a processor. The computer-readable storage medium may be included in the apparatus described in the above embodiment; or may be present separately and not assembled into the device. The computer readable storage medium carries one or more programs which, when executed by the apparatus, process data in the manner described above.
A terminal device suitable for implementing the embodiments of the present application, such as a mobile device like a mobile phone, includes a Central Processing Unit (CPU) that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) or a program loaded from a storage section into a Random Access Memory (RAM). The RAM also stores various programs and data necessary for the operation of the device. The CPU, ROM, and RAM are connected to each other via a bus. An input/output (I/O) interface is also connected to the bus.
The following components are connected to the I/O interface: an input section including a touch screen, a key, a scan/camera, and the like; an output section including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section including a hard disk and the like; and a communication section including a network interface card such as a LAN card, a modem, or the like. The communication section performs communication processing via a network such as the internet.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program, when executed by a Central Processing Unit (CPU), performs the above-described functions defined in the method of the present application.
It should be noted that the computer storage media of the present application can be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules or units described in the embodiments of the present application may be implemented by software or hardware. The modules or units described may also be provided in a processor, the names of which in some cases do not constitute a limitation of the module or unit itself.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

CN201911182161.6A | 2019-11-27 | 2019-11-27 | Mark generation tracking method and device based on AR guide | Active | CN110969704B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201911182161.6A (CN110969704B) | 2019-11-27 | 2019-11-27 | Mark generation tracking method and device based on AR guide

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201911182161.6A (CN110969704B) | 2019-11-27 | 2019-11-27 | Mark generation tracking method and device based on AR guide

Publications (2)

Publication Number | Publication Date
CN110969704A (en) | 2020-04-07
CN110969704B (en) | 2023-09-22

Family

ID=70031806

Family Applications (1)

Application Number | Status | Publication | Priority Date | Filing Date
CN201911182161.6A | Active | CN110969704B (en) | 2019-11-27 | 2019-11-27

Country Status (1)

Country | Link
CN (1) | CN110969704B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111521193A (en)* | 2020-04-23 | 2020-08-11 | 广东博智林机器人有限公司 | Live-action navigation method, live-action navigation device, storage medium and processor
CN114158022A (en)* | 2021-12-07 | 2022-03-08 | 阿维塔科技(重庆)有限公司 | Feed rescue method, device, system and equipment
CN115660016A (en)* | 2022-10-26 | 2023-01-31 | 珠海格力电器股份有限公司 | Processing method, processing device, terminal equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN104598504A (en)* | 2014-05-15 | 2015-05-06 | 腾讯科技(深圳)有限公司 | Information display control method and device for electronic map
US20170046876A1 (en)* | 2015-08-12 | 2017-02-16 | International Business Machines Corporation | Marker positioning for augmented reality overlays
CN107077739A (en)* | 2017-01-23 | 2017-08-18 | 香港应用科技研究院有限公司 | Three-dimensional marker model construction and real-time tracking using monocular camera
CN107566793A (en)* | 2017-08-31 | 2018-01-09 | 中科云创(北京)科技有限公司 | Method, apparatus, system and electronic equipment for remote assistance
CN108830894A (en)* | 2018-06-19 | 2018-11-16 | 亮风台(上海)信息科技有限公司 | Remote guide method, apparatus, terminal and storage medium based on augmented reality
CN109618055A (en)* | 2018-12-25 | 2019-04-12 | 维沃移动通信有限公司 | A location sharing method and mobile terminal
CN109887003A (en)* | 2019-01-23 | 2019-06-14 | 亮风台(上海)信息科技有限公司 | A kind of method and apparatus initialized for carrying out three-dimensional tracking



Also Published As

Publication number | Publication date
CN110969704B (en) | 2023-09-22

Similar Documents

Publication | Title
US11164379B2 (en) | Augmented reality positioning method and apparatus for location-based service LBS
CN107179524B (en) | Fire fighting equipment positioning method, device and system and computer readable storage medium
CN112307363A (en) | Virtual-real fusion display method and device, electronic equipment and storage medium
CN103424113A (en) | Indoor positioning and navigating method of mobile terminal based on image recognition technology
CN110969704B (en) | Mark generation tracking method and device based on AR guide
JP2003111128A (en) | Current position specifying method, current position information providing method, travel route guidance method, position information management system, and information communication terminal
CN107656961B (en) | Information display method and device
US20200265644A1 (en) | Method and system for generating merged reality images
JP2016110245A (en) | Display system, display method, computer program, computer readable recording medium
JP2019027799A (en) | Positioning accuracy information calculation apparatus and positioning accuracy information calculation method
CN115439635A (en) | Method and equipment for presenting mark information of target object
US9418351B2 (en) | Automated network inventory using a user device
CN112422653A (en) | Scene information pushing method, system, storage medium and equipment based on location service
CN116164730A (en) | Photovoltaic equipment positioning navigation method and related equipment
CN107221030B (en) | Augmented reality providing method, augmented reality providing server, and recording medium
CN107766476B (en) | Crowdsourcing data processing method, device and equipment based on building block data and storage medium
CN110942521B (en) | AR information point display method and device
CN113237464A (en) | Positioning system, positioning method, positioner, and storage medium
CN111028516A (en) | Traffic police duty information transmission method, system, medium and device
CN110796706A (en) | Visual positioning method and system
CN113452842B (en) | Flight AR display method, system, computer equipment and storage medium
CN110162658A (en) | Position information acquisition method, device, terminal and storage medium
CN117029815A (en) | Scene positioning method and related equipment based on space Internet
JP6431494B2 (en) | User terminal and program
CN111814824B (en) | Method, device, server and system for acquiring association relation

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
