US9854395B2 - Methods and systems for notifying user of destination by route guide - Google Patents


Info

Publication number
US9854395B2
US9854395B2 (application US 14/489,946; US 201414489946 A)
Authority
US
United States
Prior art keywords
destination
user
location
route
street view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/489,946
Other versions
US20150094955A1 (en)
Inventor
Yoon Shick LEE
Min Sik PARK
Min Oh KIM
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Naver Corp
Original Assignee
Naver Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Naver Corp
Assigned to NAVER CORPORATION. Assignment of assignors interest (see document for details). Assignors: PARK, MIN SIK; KIM, MIN OH; LEE, YOON SHICK
Publication of US20150094955A1
Application granted
Publication of US9854395B2
Legal status: Active

Abstract

A method of providing destination information to a user through a location-based service includes: determining whether the user has approached a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user; and activating a visual notification function associated with the destination when the user is determined to have approached the destination.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority under 35 U.S.C. §119 to Korean Patent Application No. 10-2013-0115060, filed on Sep. 27, 2013, the entire contents of which are incorporated herein by reference for all purposes as if fully set forth herein.
BACKGROUND
Field
Example embodiments relate to methods and/or apparatuses for more accurately notifying a user of a location of a destination when the user arrives at or near the destination using a location-based service.
Description of Conventional Art
Recently, with the development of global positioning system (GPS) technology and geographical information system (GIS) technology, a user may verify his/her location and peripheral information without restriction as to time or place. A user may also be provided with travel route information associated with a destination through a variety of location-based services, such as a location verification service, a route find service, a route guide service, a local search service, etc.
A route guide service refers to a service that recognizes a current location of a terminal, such as a mobile terminal (e.g., a navigation terminal, a smartphone, etc.), identifies the recognized current location of the terminal on a map, and provides a travel route to a destination as well as additional information to a user.
A route find service refers to a service that notifies a user of a route based on departure, passage, and destination input from the user. A route find service may guide the user along a route by continuously updating a current location of the user on a map so long as the user allows use of his/her location information.
In an outdoor environment, GPS may be used to identify a current location of a user. Alternatively, or in conjunction with GPS, a current location of a user may be identified by: calculating a current location using a base station signal of a mobile communications network; calculating a current location using a WiFi base station signal; and/or requesting and receiving location information and/or a current location of a terminal from a location-based server.
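The prioritized use of several positioning sources described above can be sketched as a simple fallback chain. All names in this sketch are illustrative assumptions, not part of the patent:

```python
from typing import Callable, List, Optional, Tuple

Position = Tuple[float, float]  # (latitude, longitude)

def resolve_position(sources: List[Callable[[], Optional[Position]]]) -> Optional[Position]:
    """Try each positioning source in priority order (e.g., GPS, cellular
    base station signal, WiFi base station signal, location-based server)
    and return the first fix obtained, or None if no source has a fix."""
    for source in sources:
        fix = source()
        if fix is not None:
            return fix
    return None
```

A terminal might, for example, list GPS first outdoors and an indoor positioning source first indoors.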
In an indoor environment, a current location of a user may be identified using access point (AP) information and/or an indoor positioning system (IPS).
A conventional location-based service may guide a user along a route to a location near a destination designated by the user. However, conventional location-based services may not accurately notify the user of the actual location of the destination upon arrival near the destination. Rather, this must be done using a separate method.
In one conventional example, a user is provided with a destination arrival alarm when the user approaches a base station area of a mobile communications network corresponding to a destination set by the user. In this example, however, the location-based service terminates the guidance after outputting only a simple message such as, “you just arrived around your destination” when the user arrives near the destination. Accordingly, when it is difficult for the user to immediately determine or identify the actual location of the destination based on only location (e.g., GPS) information (e.g., when a user is located around a station, a market, a roadside of multiple lanes, an alley, a trail, etc.), the user may wander around the destination, delaying arrival.
SUMMARY
Some example embodiments provide systems capable of more accurately notifying a user of a location of a destination using information associated with the destination when the user arrives near the destination using a location-based service. Some example embodiments also provide methods of more accurately notifying a user of a location of a destination using information associated with the destination when the user arrives near the destination using a location-based service.
According to example embodiments, it is possible to enhance quality and/or satisfaction of a route guide service by more accurately notifying a user of a destination using information associated with the destination when the user arrives near the destination using a location-based service.
Additional features of the example embodiments will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of example embodiments.
At least one example embodiment provides a method of providing destination information to a user through a location-based service. According to at least this example embodiment, the method includes: determining whether the user is approaching a destination designated by the user when guiding the user along a route using a map screen on which location information for the user is mapped; and activating a visual notification function associated with the destination when the user is determined to be approaching the destination.
At least one other example embodiment provides a method of providing destination information to a user through a location-based service. According to at least this example embodiment, the method includes: determining whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user; and activating a visual notification function associated with the destination when the user is determined to be approaching the destination.
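The determining step in the methods above can be illustrated as a geodesic distance check against a notification radius. The threshold value and the function names below are assumptions for illustration, not the patent's implementation:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two WGS84 points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def has_approached(user: tuple, destination: tuple, threshold_m: float = 50.0) -> bool:
    """Return True when the user's mapped location is within the
    (assumed) notification radius of the designated destination."""
    return haversine_m(*user, *destination) <= threshold_m
```

When this check becomes true during route guidance, the visual notification function would be activated.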
The route may be a travel route traversed by the user at least one of by walking, using a vehicle, using public transportation, and using a bicycle. The destination may be an intermediate destination along the route.
The activating the visual notification function associated with the destination may include: displaying, on a street view screen, an actual street view around a current location of the user; highlighting an object corresponding to the destination on the street view screen; and displaying the highlighted object on the street view screen.
According to at least some example embodiments, the highlighting the object may include: highlighting the object corresponding to the destination on the street view screen using a color or figure such that the highlighted object is distinguished from other objects on the street view screen.
According to at least some other example embodiments, the highlighting the object may include: processing an edge of the object corresponding to the destination to be bold on the street view screen.
The object may correspond to one of an individual building, a location within a building complex, a street, a sidewalk, and a trail.
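The two highlighting variants above (a distinguishing color or figure, or a bold edge) can be sketched as an overlay specification attached to the street view object. The data model is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Highlight:
    """Overlay spec marking the destination object on the street view
    screen so it stands out from neighboring objects (illustrative)."""
    object_id: str
    style: str             # "color_fill" or "bold_edge"
    color: str = "#FF3B30"
    edge_width_px: int = 4

def make_highlight(object_id: str, style: str = "color_fill") -> Highlight:
    """Build a highlight for the street view object corresponding to
    the destination, rejecting unknown styles."""
    if style not in ("color_fill", "bold_edge"):
        raise ValueError(f"unknown highlight style: {style}")
    return Highlight(object_id=object_id, style=style)
```

A renderer would then draw the filled shape or thickened edge over the matching object in the street view image.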
According to at least some example embodiments, the activating the visual notification function associated with the destination may include: displaying, on a street view screen, an actual street view around a current location of the user; and displaying, on the street view screen, an instruction indicator on an object corresponding to the destination.
According to at least some other example embodiments, the activating the visual notification function associated with the destination may include: displaying, on a street view screen, an actual street view around a current location of the user; and displaying, on the street view screen, at least one of text describing the destination and an instruction indicator on an object corresponding to the destination.
According to at least some example embodiments, the activating the visual notification function associated with the destination may include: displaying an enlarged photo of the destination.
According to at least some example embodiments, the activating the visual notification function associated with the destination may include: displaying at least one of an actual photo of the destination and a three-dimensional (3D) map, the 3D map being generated using a vector rendering method.
According to at least some example embodiments, the activating the visual notification function associated with the destination may include: activating the visual notification function at a point in time at which route guidance for the user to the destination is completed.
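Activating the notification at the point route guidance completes can be sketched as a once-only trigger. This minimal state machine is a hypothetical illustration, not the patent's implementation:

```python
class RouteGuide:
    """Fires the visual notification exactly once, at the point in time
    at which route guidance to the destination is completed."""

    def __init__(self, notifier):
        self.notifier = notifier  # callable that activates the visual notification
        self.completed = False

    def on_guidance_complete(self):
        # Guard against duplicate completion events from the guidance loop.
        if not self.completed:
            self.completed = True
            self.notifier()
```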
At least one other example embodiment provides a system for providing destination information to a user through a location-based service. According to at least this example embodiment, the system includes: a determiner configured to determine whether the user is approaching a destination designated by the user when guiding the user along a route using a map screen on which location information for the user is mapped; and a processing unit configured to activate a visual notification function associated with the destination when the user is determined to be approaching the destination.
According to at least one other example embodiment, a system for providing destination information to a user through a location-based service, includes: a determiner configured to determine whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user, and a processing unit configured to activate a visual notification function associated with the destination when the user is determined to be approaching the destination.
The route may be a travel route traversed by the user at least one of by walking, using a vehicle, using public transportation, and using a bicycle. The destination may be an intermediate destination along the route.
The processing unit may be further configured to: display, on a street view screen, an actual street view around a current location of the user; highlight an object corresponding to the destination on the street view screen; and display the highlighted object on the street view screen.
The processing unit may be further configured to display an enlarged photo of the destination.
The processing unit may be further configured to display at least one of an actual photo of the destination and a three-dimensional (3D) map, the 3D map being generated using a vector rendering method.
The processing unit may be configured to activate the visual notification function at a point in time at which route guidance for the user to the destination is completed.
At least one other example embodiment provides a non-transitory computer-readable medium storing instructions to control a computer or processor-based system to perform a method for providing destination information to a user through a location-based service. According to at least this example embodiment, the method includes: determining whether the user is approaching a destination designated by the user when guiding the user along a route using a map screen on which location information for the user is mapped; and activating a visual notification function associated with the destination when the user is determined to be approaching the destination.
At least one other example embodiment provides a non-transitory computer-readable medium storing instructions to control a computer or processor-based system to perform a method for providing destination information to a user through a location-based service. According to at least this example embodiment, the method includes: determining whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user; and activating a visual notification function associated with the destination when the user is determined to be approaching the destination.
At least one other example embodiment provides a file distribution system for distributing a file of an application installed at a user terminal to provide destination information for the user terminal through a location-based service. According to at least this example embodiment, the file distribution system includes: a file transmitter configured to transmit the file in response to a request from the user terminal. The application includes: a first module configured to control the user terminal to determine whether the user is approaching a destination designated by the user when guiding the user along a route using a map screen on which location information of the user is mapped; and a second module configured to control the user terminal to activate a visual notification function associated with the destination when the user is determined to be approaching the destination.
Still at least one other example embodiment provides a file distribution system for distributing a file of an application installed at a user terminal to provide destination information for the user terminal through a location-based service. According to at least this example embodiment, the file distribution system includes: a file transmitter configured to transmit the file in response to a request from the user terminal. The application includes: a first module configured to control the user terminal to determine whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user, and a second module configured to control the user terminal to activate a visual notification function associated with the destination when the user is determined to be approaching the destination.
It is to be understood that both the foregoing general description and the following detailed description are explanatory and are intended to provide further explanation of the example embodiments as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are included to provide a further understanding of example embodiments and are incorporated in and constitute a part of this specification, illustrate example embodiments, and together with the description serve to explain the principles of example embodiments.
FIG. 1 illustrates a relationship between a mobile device and a destination information providing system according to example embodiments.
FIG. 2 is a flowchart illustrating a method of providing destination information to more accurately notify a user of a destination location, according to example embodiments.
FIGS. 3 through 7 illustrate examples of visual notifications for more accurately notifying a user of a destination location, according to example embodiments.
FIG. 8 is a block diagram illustrating a destination information providing system to more accurately notify a user of a destination location, according to example embodiments.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
Example embodiments will now be described more fully with reference to the accompanying drawings in which some example embodiments are shown. Example embodiments may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of example embodiments to those of ordinary skill in the art. In the drawings, the thicknesses of layers and areas are exaggerated for clarity. Like reference numerals in the drawings denote like elements, and thus their description may be omitted.
It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Other words used to describe the relationship between elements or layers should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” “on” versus “directly on”).
Although the terms “first”, “second”, etc. may be used herein to describe various elements, components, areas, layers and/or sections, these elements, components, areas, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, area, layer or section from another element, component, area, layer or section. Thus, a first element, component, area, layer or section discussed below could be termed a second element, component, area, layer or section without departing from the teachings of example embodiments.
Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the example term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and not the individual elements of the list.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, such as those defined in commonly-used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Specific details are provided in the following description to provide a thorough understanding of example embodiments. However, it will be understood by one of ordinary skill in the art that example embodiments may be practiced without these specific details. For example, systems, apparatuses and/or devices may be shown in block diagrams so as not to obscure the example embodiments in unnecessary detail. In other instances, well-known processes, structures and techniques may be shown without unnecessary detail in order to avoid obscuring example embodiments.
In the following description, example embodiments may be described with reference to acts and symbolic representations of operations (e.g., in the form of flow charts, flow diagrams, data or signal flow diagrams, structure diagrams, block diagrams, etc.) that may be implemented as program modules or functional processes including routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types, and may be implemented using existing hardware at, for example, existing mobile devices, systems, servers, etc. Such existing hardware may include one or more Central Processing Units (CPUs), system-on-chip (SOC) devices, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), controllers, arithmetic logic units, microcomputers, programmable logic units, microprocessors, or any other device capable of responding to and executing instructions in a defined manner.
Although a flow chart may describe the operations as a sequential process, many of the operations may be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. Further, in some alternative implementations, the functions/acts discussed herein may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed substantially concurrently or simultaneously, or may be executed in the reverse order, depending upon the functionality/acts involved.
A process may be terminated when its operations are completed, but may also have additional steps not included in the drawings. A process may correspond to a method, function, procedure, subroutine, subprogram, etc. When a process corresponds to a function, its termination may correspond to a return of the function to the calling function or the main function.
As discussed herein, the term “storage medium”, “tangible computer-readable storage medium”, “computer-readable storage medium” or “non-transitory computer-readable storage medium” may represent one or more devices for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other tangible machine readable mediums for storing information. The term “computer-readable medium” may include, but is not limited to, portable or fixed storage devices, optical storage devices, and various other mediums capable of storing, containing or carrying instruction(s) and/or data.
Example embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as the computer-readable storage medium discussed herein. When implemented in software, a processor or processors may perform the one or more necessary tasks.
Software may include a computer program, one or more pieces of code, one or more code segments, instruction(s), or some combination thereof, for independently or collectively instructing or configuring a processing device to operate as desired. Software and data may be embodied permanently or temporarily in any type of machine, component, physical or virtual equipment, or computer-readable storage medium or device. Software also may be distributed over network coupled computer systems so that the software is stored and executed in a distributed fashion. In particular, the software and data may be stored by one or more computer readable recording mediums.
A code segment may represent a procedure, function, subprogram, program, routine, subroutine, module, software package, class, or any combination of instructions, data structures or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
As discussed herein, example embodiments and/or one or more components thereof (e.g., components of mobile devices, destination information providing systems, systems to more accurately notify users of destination locations, etc.) may be hardware, firmware, hardware executing software or any combination thereof. In this regard, example embodiments may be described as circuits, units, devices, etc. When example embodiments and/or one or more components thereof are hardware, such hardware may include one or more Central Processing circuits (CPUs), system-on-chips (SOCs), digital signal processors (DSPs), application-specific-integrated-circuits (ASICs), field programmable gate arrays (FPGAs), controllers, arithmetic logic units, digital signal processors, microcomputers, programmable logic units, microprocessors, computers, or the like configured as special purpose machines to perform the functions described herein. CPUs, SOCs, DSPs, ASICs, FPGAs, etc. may sometimes generally be referred to as processors and/or microprocessors.
A processing device (e.g., at a mobile device and/or destination information providing system) may run an operating system (OS) and one or more software applications that run on the OS. The processing device also may access, store, manipulate, process, and create data in response to execution of the software. For purposes of simplicity, the description of a processing device is used as singular; however, one skilled in the art will appreciate that a processing device may include multiple processing elements and multiple types of processing elements. For example, a processing device may include multiple processors or a processor and a controller. In addition, different processing configurations are possible, such as parallel processors.
Hereinafter, example embodiments will be described with reference to the accompanying drawings.
One or more example embodiments relate to technology for more accurately notifying a user of a location of a destination using information associated with the destination when the user arrives near or at the destination using a location-based service. One or more example embodiments may be applied to a variety of location-based services, such as a location verification service, a route find service, a route guide service, a local search service associated with global positioning system (GPS) technology and/or geographical information system (GIS) technology, etc. In a more specific example, one or more example embodiments may be applied to a personal navigation system (PNS) corresponding to a pedestrian navigation service as well as a car navigation system (CNS) corresponding to a car navigation service.
FIG. 1 illustrates an example relationship between a mobile device and a destination information providing system, according to an example embodiment.
Referring to FIG. 1, a mobile device 101 and a destination information providing system 100 are shown. Here, the arrow indicator indicates that data may be transmitted and received between the mobile device 101 and the destination information providing system 100 over a network (e.g., a wired and/or wireless network).
According to one or more example embodiments, the destination information providing system 100 may be a physical computer hardware system configured to provide services for mobile devices (e.g., mobile device 101) connected to a network, such as the Internet, a wide area network (WAN), a personal area network (PAN), a local area network (LAN), a campus area network (CAN), a metropolitan area network (MAN), a virtual local area network, and/or any other like network capable of physically or logically connecting computers or other devices (not shown). In one or more example embodiments, the destination information providing system 100 may be a server or other like network element that employs one or more connection-oriented protocols, such as the Session Initiation Protocol (SIP), the Hypertext Transfer Protocol (HTTP), and the Transmission Control Protocol/Internet Protocol (TCP/IP), and includes network devices that use connectionless protocols, such as the User Datagram Protocol (UDP), Internet Packet Exchange (IPX), and the like. The destination information providing system 100 may be configured to establish, manage, and terminate communications sessions, for example, between the destination information providing system 100 and the mobile device 101. The destination information providing system 100 may also be configured to establish, manage, and terminate communications sessions with two or more client devices.
In a more specific example, the destination information providing system 100 may serve as a platform configured to provide a location-based service to the mobile device 101 corresponding to a client or user. For example, the destination information providing system 100 may provide a location-based service in a mobile web or mobile application (“app”) environment through a service platform based on a mobile search. In this example, the destination information providing system 100 may provide a function of more accurately notifying a user of a location of a destination when the user arrives near the location of the destination using the location-based service.
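A server-side exchange of this kind might look like the following sketch, in which the message fields and the default radius are invented for illustration and are not defined in the patent:

```python
import json

def handle_location_update(raw: str) -> str:
    """Server-side sketch: receive the terminal's mapped location update
    (as JSON) and reply whether the terminal should activate the visual
    notification for the designated destination."""
    msg = json.loads(raw)
    near = msg["distance_to_destination_m"] <= msg.get("radius_m", 50)
    return json.dumps({"activate_visual_notification": near})
```

The terminal would send such updates periodically during route guidance and switch to the street view notification screen when the reply indicates activation.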
Still referring to FIG. 1, according to at least this example embodiment, the mobile device 101 may be any hardware computing device capable of providing communications services via a cellular network, computer network, and/or other like communications network. In various example embodiments, the mobile device 101 may be capable of communicating with a server (e.g., the destination information providing system 100), such that the mobile device 101 is able to receive services from the server. In one example, the mobile device 101 may include memory, one or more processors, and a transceiver. The mobile device 101 may be configured to send/receive data to/from network devices, such as routers, switches, or other like network devices. The mobile device 101 may be designed to sequentially and automatically carry out a sequence of arithmetic or logical operations; equipped to record/store digital data on a machine readable medium; and transmit and receive digital data via one or more network devices. The mobile device 101 may include a wireless transceiver configured to operate in accordance with wireless communications standards (e.g., 3rd Generation Partnership Project (3GPP) standards, etc.).
In a more specific example, the mobile device 101 may be any type of terminal device capable of accessing the destination information providing system 100 through a mobile web or a mobile app. For example, the mobile device 101 may be any hardware computing device having portability or mobility, such as: a smart phone; a tablet computer; a laptop computer; a digital multimedia broadcasting (DMB) terminal; a navigation device or terminal; an Internet-enabled video camera; an Internet-enabled digital photo camera; an MP3 player; a portable media player (PMP); and/or any logical device capable of capturing/recording, storing, and/or transferring data via a communications network. According to at least this example, the mobile device 101 may perform the overall service operation, including service screen configuration, data input, data transmission and reception, data storage according to control of the mobile web or the mobile app, etc.
As mentioned above, in some cases, although a user may arrive near a destination using the location-based service provided through the mobile device 101, the user may not immediately find or identify the actual location of the destination, for example, at a location near a station, a market, a multi-lane roadside, a forked sidewalk, or a trail. In this case, a method and/or function of more accurately notifying the user of the actual destination may be helpful.
A method of more accurately notifying a user of a location of a destination within a location-based service, according to example embodiments, may be provided to a user that is to travel, for example, by walking, running, using a vehicle, public transportation (e.g., a subway, a bus, etc.), a bicycle, etc.
Herein, the route guide or route guidance of the location-based service may provide guiding or route guidance information (e.g., directions) for a travel route using at least one of vehicles, public transportation, bicycles, walking, etc. with respect to a destination designated by the user. The route guide may include a case in which the user travels using a single method of travel, and a case in which the user travels through at least two methods of travel (e.g., "walk+public transportation", "bicycle+walk", etc.). Also, the route guide may inclusively provide guidance along a travel route with respect to an indoor geography (e.g., within a building or other structure) based on an indoor positioning system (IPS), in addition to providing guidance along a travel route with respect to an outdoor geography.
As discussed herein, the term “destination” may refer to a final destination, but may also inclusively refer to an intermediate destination, such as a passage or route segment through which the user passes before arriving at the final destination. Accordingly, one or more example embodiments may provide more accurate notification of a corresponding point with respect to the passage or route segment, as well as the final destination using the same or substantially the same method.
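A mixed-mode travel route of the kind described above can be modeled as an ordered list of segments, where each segment end before the last is an intermediate destination that may also trigger the visual notification. The sketch below is illustrative only; the segment fields and mode names are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class RouteSegment:
    mode: str   # e.g., "walk", "bus", "subway", "bicycle", "car"
    start: str
    end: str    # an intermediate destination, or the final destination

# Example "walk + public transportation" route
route = [
    RouteSegment("walk", "home", "bus stop"),
    RouteSegment("bus", "bus stop", "subway station"),
    RouteSegment("walk", "subway station", "final destination"),
]

def intermediate_destinations(route):
    """Segment ends before the last one are intermediate destinations."""
    return [seg.end for seg in route[:-1]]

def final_destination(route):
    return route[-1].end
```

Under this model, the same notification method can be applied at each intermediate destination as well as at the final one.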
FIG. 2 is a flowchart illustrating a method of providing destination information to more accurately notify a user of a location of a destination, according to an example embodiment. Operations of the destination information providing method according to at least this example embodiment will be described as being performed by a destination information providing system, such as that described with reference to FIGS. 1 and 8.
Referring to FIG. 2, in operation S210, the destination information providing system 100 determines whether a user has approached (or is approaching) a destination designated by the user when guiding the user along a route using a map screen on which location information of the user is mapped. Here, when guiding the user along a route using a location-based service through a mobile device using the map screen on which location information of the user (e.g., a current location of the mobile device) is mapped, the destination information providing system 100 may determine whether the user has arrived near the destination.
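The approach test in operation S210 can be sketched as a distance threshold between the mapped location of the mobile device and the destination coordinates. The haversine formulation and the 50-meter threshold below are illustrative assumptions; the disclosure does not fix a particular distance metric or radius:

```python
import math

def has_approached(user_lat, user_lon, dest_lat, dest_lon, threshold_m=50.0):
    """True when the user's mapped location is within threshold_m meters
    of the designated destination (great-circle distance)."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(user_lat), math.radians(dest_lat)
    dphi = math.radians(dest_lat - user_lat)
    dlmb = math.radians(dest_lon - user_lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance_m = 2 * r * math.asin(math.sqrt(a))
    return distance_m <= threshold_m
```

In an indoor environment, the same threshold test could be applied to coordinates reported by an indoor positioning system instead of GPS.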
When the user is determined to have approached the destination, the destination information providing system 100 activates a visual notification function associated with the destination in operation S220.
In the example embodiment shown in FIG. 2, the destination information providing system 100 may provide the user with a visual notification function including peripheral image information such that the user is able to more accurately recognize the destination as well as the actual location of the destination. In one example, the peripheral image information may include a street view and/or a photo of a current location of the mobile device and/or a map. In more detail, for example, the destination information providing system 100 may activate the visual notification function at a point in time at which guidance for a passage or segment of the route to an intermediate destination along the route is gradually completed, and/or at a point in time at which the guidance along the route is completed by approaching a final destination.
As an example, the destination information providing system 100 may provide a street view screen on which an actual street view around a current location of the user is displayed, and then may highlight and display an object corresponding to the destination on the street view screen. In another example, the destination information providing system 100 may provide an indoor view screen on which an actual indoor view is displayed in the case of an indoor environment, and then may highlight and display an object corresponding to the destination on the indoor view screen.
Referring to FIG. 3, when a user arrives near a destination on a guided travel route to the destination using a map screen 310 on which location information of the user is mapped, the destination information providing system 100 may highlight the destination using, for example, a color, a figure, text, etc., and may display the highlighted destination such that the destination is distinguished from the remaining portion (e.g., a peripheral facility) on a street view screen 320. According to one or more example embodiments, the destination information providing system 100 may display the street view screen 320 by switching from the map screen 310 to the street view screen 320, or by displaying the street view screen 320 together with the map screen 310 using a split screen effect. The term "object" used herein may refer to any type of point of interest (POI) that may be designated as a destination, such as an individual building, a given (or alternatively desired or predetermined) location within a building complex, a road, a sidewalk, a trail, geographical coordinates in the form of longitude and latitude, etc.
As an example, a destination notification function using a street view may be configured using: 1) a method of processing an edge of the object 321 corresponding to a destination to be bold, and displaying the processed object 321 to be distinguished from other portions of (e.g., a peripheral facility on) the street view screen 320 as illustrated in FIG. 3; 2) a method of displaying an object 421 corresponding to a destination using a different color or figure to be distinguished from other portions of (e.g., a peripheral facility on) a street view screen 420 as illustrated in FIG. 4; and/or 3) a method of displaying an instruction indicator (e.g., an arrow indicator) 521-a indicating (e.g., directly indicating) an object 521 corresponding to the destination, or indicating a direction from a current location of a user towards the destination, or a direction towards an entrance of a parking lot, on a street view screen 520, or a method of displaying text (e.g., a name and/or a point of interest (POI)) 521-b describing the destination and an instruction indicator 521-a corresponding to the destination, as illustrated in FIG. 5.
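The highlighting methods above can be thought of as styles of a single overlay directive handed to a street view renderer. The following sketch is illustrative only; the type names and the renderer interface are assumptions, not the disclosed implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class HighlightStyle(Enum):
    BOLD_EDGE = auto()   # FIG. 3: thicken the outline of the object
    COLOR_FILL = auto()  # FIG. 4: distinct color or figure
    INDICATOR = auto()   # FIG. 5: arrow indicator, optionally with text

@dataclass
class Highlight:
    object_id: str
    style: HighlightStyle
    label: Optional[str] = None  # POI name/text shown with an INDICATOR

def build_highlight(object_id, style, poi_name=None):
    """Assemble the overlay directive a street view screen would draw."""
    label = poi_name if style is HighlightStyle.INDICATOR else None
    return Highlight(object_id, style, label)
```

A renderer could then draw the returned directive on top of the street view image, leaving the rest of the scene untouched.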
As another example, the destination information providing system 100 may display an enlarged photo of a destination when a user arrives near the location of the destination. For example, referring to FIG. 6, the destination information providing system 100 may display a popup screen 620, on which a photo of an object 621 corresponding to a destination is enlarged, as an overlay on a map screen 610.
As yet another example, the destination information providing system 100 may more accurately designate and display a corresponding destination using a three-dimensional (3D) map of a location around the destination when a user arrives near the destination. In this example, the 3D map may include a map configured using one or more real (e.g., satellite) images and/or a 3D map configured using one or more vector rendering methods. When a map screen for a route guide is a two-dimensional (2D) map, a 3D map may be provided in association with surroundings of the destination so that the user may more intuitively recognize and identify the destination. Referring to FIG. 7, for example, the destination information providing system 100 may display a popup screen 720, on which one or more real images of a destination is/are displayed, as an overlay on a map screen 710.
The aforementioned destination information providing methods, devices, systems and/or computer-readable storage mediums, according to one or more example embodiments, may provide visual notification functionality about destinations so that a user may more accurately recognize and/or identify the destination when the user approaches the location of the destination using a location-based service.
The aforementioned destination information providing methods, devices, systems and/or computer-readable storage mediums, according to one or more example embodiments, may be applied to any and all location-based services (e.g., GPS devices, mobile terminals, tablet computers, etc.). Also, the destination notification function may refer to methods of recognizing an object using image information and a map. Accordingly, when using a destination as a single point of continuous actions, the destination notification function may be used to, for example, transmit a user's location, share and/or record a point in the case of leisure activities such as trekking and/or geo-caching.
As mentioned above, methods according to one or more example embodiments may be performed by, or implemented as, a variety of computer systems and may be recorded in non-transitory computer-readable media in a program instruction form. For example, example embodiments may include non-transitory computer-readable media storing one or more programs including processor-executable instructions that, when executed, cause the processor to execute a method including: determining whether a user has approached a destination designated by the user when guiding the user along a route using a map screen on which location information of the user is mapped; and activating a visual notification function for the destination when the user is determined to have approached the destination.
A program according to some example embodiments may include a personal computer (PC)-based program or a mobile application (“app”) for a mobile terminal. An app for a mobile search may be configured in an independently operating program form or an in-app form of a given (or alternatively, desired or predetermined) application to thereby be operable on the given (or alternatively, desired or predetermined) application.
Destination information providing methods according to one or more example embodiments may be performed in such a manner that a mobile app associated with a server system (e.g., a location-based service server) controls a user terminal. For example, the applications may include one or more modules configured to control the user terminal to perform operations included in the aforementioned destination information providing methods. In one example, the application may include: a module (e.g., determination module) configured to control the user terminal to determine whether a user has approached a destination designated by the user when guiding the user along a route using a map screen on which location information of the user is mapped; and a module (e.g., activation module) configured to control the user terminal to activate a visual notification function associated with the destination when the user is determined to have approached the destination. Also, the application may be installed on the user terminal through a file provided from a file distribution system. For example, the file distribution system may include a file transmitter (not shown) configured to transmit the file in response to a request of the user terminal.
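The determination module and activation module described above can be sketched as follows. The class names, the terminal interface (`show_notification`), the proximity threshold, and the one-shot notification flag are assumptions for illustration, not the disclosed implementation:

```python
class DeterminationModule:
    """Controls the terminal to decide whether the user has approached
    the destination designated along the guided route."""
    def __init__(self, threshold_m=50.0):
        self.threshold_m = threshold_m  # assumed proximity radius

    def has_approached(self, distance_to_destination_m):
        return distance_to_destination_m <= self.threshold_m

class ActivationModule:
    """Controls the terminal to activate the visual notification function."""
    def activate(self, terminal, destination_id):
        # e.g., switch to a street view screen with the destination highlighted
        terminal.show_notification(destination_id)

class DestinationApp:
    """In-app composition of the two modules."""
    def __init__(self, terminal):
        self.terminal = terminal
        self.determiner = DeterminationModule()
        self.activator = ActivationModule()
        self.notified = False

    def on_location_update(self, distance_to_destination_m, destination_id):
        # Activate the visual notification once, when the user first
        # comes within the threshold of the destination.
        if not self.notified and self.determiner.has_approached(distance_to_destination_m):
            self.activator.activate(self.terminal, destination_id)
            self.notified = True
```

Each location update from the terminal feeds `on_location_update`, so the notification fires once as the user crosses the proximity threshold.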
FIG. 8 is a block diagram illustrating a destination information providing system to more accurately notify a user of a destination location, according to example embodiments.
Referring to FIG. 8, the destination information providing system includes: a processor 800; a memory 801; and a database 802. The processor 800 includes a determiner 810 and a processing unit 820.
The memory 801 may store a program including processor-executable instructions that, when executed, cause the processor to provide a visual notification function associated with a destination so that a user may more accurately identify and/or recognize the destination when guided along a route using a map screen on which location information of the user is mapped. Operations performed by the destination information providing system described above with reference to FIGS. 1 through 7 may be executed in accordance with the program stored in the memory 801. According to at least some example embodiments, the memory 801 may be a hard disc, a solid state disk (SSD), a secure digital (SD) card, or other storage media.
The database 802 may store and maintain information (e.g., all or substantially all information) required to provide a location-based service, and information (e.g., all information) required to provide a visual notification function associated with a destination when a user approaches the destination using the location-based service.
The processor 800 refers to a device configured to perform processing in response to instructions of the program stored in the memory 801. The processor 800 may include a microprocessor, for example, a central processing unit (CPU). Hereinafter, a detailed configuration of the processor 800 will be described.
Still referring to FIG. 8, the determiner 810 determines whether a user has approached a destination designated by the user when guiding the user along a route using a map screen on which location information of the user is mapped. Here, when guiding the user along a route using a location-based service through a mobile device using the map screen on which location information of the user (e.g., a current location of the mobile device) is mapped, the determiner 810 determines whether the user has arrived near the destination.
When the user is determined to have approached or arrived near the destination, the processing unit 820 activates a visual notification function for (or associated with) the destination. Here, the processing unit 820 enables the user to more accurately identify and/or recognize the destination through the visual notification function using peripheral image information (e.g., a street view and a photo) of a current location of the mobile device and a map. In one example, the processing unit 820 activates the visual notification function at a point in time at which route guidance for a passage or segment of the route corresponding to an intermediate destination is gradually completed, and/or at a point in time at which the route guidance is completed by approaching a final destination. As an example, the processing unit 820 may provide a street view screen on which an actual street view around a current location of the user is displayed, and then may highlight and display an object corresponding to the destination on the street view screen. In another example, the processing unit 820 may provide an indoor view screen on which an actual indoor view is displayed in the case of an indoor environment, and then may highlight and display an object corresponding to the destination on the indoor view screen. When a user arrives near a destination while guided along a travel route to the destination using a map screen on which location information of the user is mapped, the processing unit 820 may highlight the destination according to a visualization method using, for example, a color, a figure, and/or text, and may display the highlighted destination to be distinguished from other portions of the map screen (e.g., a peripheral facility) using a street view screen.
As an example, the processing unit 820 may provide a visual notification function for a corresponding destination using: (1) a method of processing an edge of an object corresponding to a destination to be bold, and displaying the processed object to be distinguished from other portions of the map screen (e.g., a peripheral facility) on a street view screen; (2) a method of displaying an object corresponding to a destination using a different color and/or figure to be distinguished from other portions of the map screen (e.g., a peripheral facility); (3) a method of displaying an instruction indicator on an object corresponding to a destination; and/or (4) a method of displaying text describing a destination and/or an instruction indicator on an object corresponding to a destination. In another example, the processing unit 820 may provide a visual notification function for a corresponding destination using a method of displaying an enlarged photo of the destination, a method of displaying a real (e.g., satellite) photo of the destination, and/or a 3D map configured using one or more vector rendering methods.
Accordingly, destination information providing systems, according to example embodiments, may enable more accurate notification of a destination to a user when the user arrives near the destination using a location-based service.
Although FIG. 8 is discussed with regard to the processor 800 as part of the destination information providing system 100, it should be understood that the determiner 810 and the processing unit 820 may be located at the mobile device 101 and operate in the same or substantially the same manner.
Moreover, although the example embodiment shown in FIG. 2 is discussed herein as being performed at the destination information providing system 100, it should be understood that the mobile device 101 may perform operations S210 and S220.
Referring back to FIG. 2, for example, in this example embodiment, in operation S210, the mobile device 101 determines whether a user has approached (or is approaching) a destination designated by the user when guiding the user along a route using a map screen on which location information of the user is mapped. As discussed above, when guiding the user along a route using a location-based service, the mobile device 101 determines whether the user has approached (or is approaching) the destination by determining whether the user is currently located near the destination.
When the user is determined to have approached the destination, the mobile device 101 activates a visual notification function associated with the destination in operation S220.
Destination information providing systems according to example embodiments may omit a portion of constituent elements or may further include additional constituent elements based on the detailed description of example embodiments of destination information providing methods described above with reference toFIGS. 1 through 7. Also, at least two constituent elements may be combined and operation orders or methods between constituent elements may be modified.
As described above, according to example embodiments, quality and/or satisfaction of a route guide service may be enhanced by more accurately notifying a user of a destination using information associated with the destination when the user arrives near the destination using a location-based service.
It will be apparent to those skilled in the art that various modifications and variations can be made in the example embodiments without departing from the spirit or scope of example embodiments. Thus, it is intended that the example embodiments cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims (19)

What is claimed is:
1. A method of providing destination information to a user through a location-based service, the method comprising:
determining whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user and being within a building or other structure, the guided route being a travel route traversed by the user from a starting location to the destination; and
activating a visual notification function associated with the destination after and in response to completion of the guided route for the user from the starting location to the destination along the guided route, wherein
the completion of the guided route for the user is determined based at least partially on indoor positioning information for the user, and
the visual notification function includes at least an indoor view screen in which an actual indoor view of the destination is displayed.
2. The method of claim 1, wherein the travel route is traversed by the user at least one of by walking, using a vehicle, using public transportation, and using a bicycle.
3. The method of claim 1, wherein the destination is an intermediate destination along the guided route.
4. The method of claim 1, further comprising:
displaying, on a street view screen, an actual street view around a current location of the user;
highlighting an object corresponding to the destination on the street view screen; and
displaying the highlighted object on the street view screen.
5. The method of claim 4, wherein the highlighting the object comprises:
highlighting the object corresponding to the destination on the street view screen using a color or figure such that the highlighted object is distinguished from other objects on the street view screen.
6. The method of claim 4, wherein the highlighting the object comprises:
processing an edge of the object corresponding to the destination to be bold on the street view screen.
7. The method of claim 4, wherein the object corresponds to one of the building or other structure, a location within a building complex, a street, a sidewalk, and a trail.
8. The method of claim 1, further comprising:
displaying, on a street view screen, an actual street view around a current location of the user; and
displaying, on the street view screen, an instruction indicator on an object corresponding to the destination.
9. The method of claim 1, further comprising:
displaying, on a street view screen, an actual street view around a current location of the user; and
displaying, on the street view screen, at least one of text describing the destination and an instruction indicator on an object corresponding to the destination.
10. The method of claim 1, wherein the activating the visual notification function associated with the destination comprises:
displaying an enlarged photo of the destination.
11. The method of claim 1, wherein the activating the visual notification function associated with the destination comprises:
displaying at least one of an actual photo of the destination and a three-dimensional (3D) map, the 3D map being generated using a vector rendering method.
12. A system for providing destination information to a user through a location-based service, the system comprising:
a processor configured to execute computer-readable instructions that, when executed, cause the processor to
determine whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user and being within a building or other structure, the guided route being a travel route traversed by the user from a starting location to the destination; and
activate a visual notification function associated with the destination after and in response to completion of the guided route for the user from the starting location to the destination along the guided route; wherein
the completion of the guided route for the user is determined based at least partially on indoor positioning information for the user, and
the visual notification function includes at least an indoor view screen in which an actual indoor view of the destination is displayed.
13. The system of claim 12, wherein the travel route is traversed by the user at least one of by walking, using a vehicle, using public transportation, and using a bicycle.
14. The system of claim 12, wherein the destination is an intermediate destination along the guided route.
15. The system of claim 12, wherein the processor is further configured to execute computer-readable instructions to,
display, on a street view screen, an actual street view around a current location of the user,
highlight an object corresponding to the destination on the street view screen; and
display the highlighted object on the street view screen.
16. The system of claim 12, wherein the processor is further configured to execute computer-readable instructions to cause the system to display an enlarged photo of the destination.
17. The system of claim 12, wherein the processor is further configured to execute computer-readable instructions to cause the system to display at least one of an actual photo of the destination and a three-dimensional (3D) map, the 3D map being generated using a vector rendering method.
18. A non-transitory computer-readable medium storing instructions to control a computer or processor-based system to perform a method for providing destination information to a user through a location-based service, the method comprising:
determining whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user and being within a building or other structure, the guided route being a travel route traversed by the user from a starting location to the destination; and
activating a visual notification function associated with the destination after and in response to completion of the guided route for the user from the starting location to the destination along the guided route, wherein
the completion of the guided route for the user is determined based at least partially on indoor positioning information for the user, and
the visual notification function includes at least an indoor view screen in which an actual indoor view of the destination is displayed.
19. A file distribution system for distributing a file of an application installed at a user terminal to provide destination information for the user terminal through a location-based service, the file distribution system comprising:
a file transmitter configured to transmit the file in response to a request from the user terminal; and
wherein the application includes,
a first module configured to control the user terminal to determine whether the user is approaching a destination along a guided route for the user based on mapped location information for the user, the destination being designated by the user and being within a building or other structure, the guided route being a travel route traversed by the user from a starting location to the destination, and
a second module configured to control the user terminal to activate a visual notification function associated with the destination after and in response to completion of the guided route for the user from the starting location to the destination along the guided route, wherein
the completion of the guided route for the user is determined based at least partially on indoor positioning information for the user, and
the visual notification function includes at least an indoor view screen in which an actual indoor view of the destination is displayed.

Applications Claiming Priority (2)

KR20130115060A / KR20150034997A (en), priority date 2013-09-27: Method and system for notifying destination by route guide
KR10-2013-0115060, priority date 2013-09-27

Publications (2)

US20150094955A1 (en), published 2015-04-02
US9854395B2 (en), published 2017-12-26

Family

ID=52740945

Family Applications (1)

Application NumberTitlePriority DateFiling Date
US14/489,946Active2034-11-27US9854395B2 (en)2013-09-272014-09-18Methods and systems for notifying user of destination by route guide

Country Status (3)

CountryLink
US (1)US9854395B2 (en)
JP (1)JP2015068828A (en)
KR (1)KR20150034997A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20150324389A1 (en)*2014-05-122015-11-12Naver CorporationMethod, system and recording medium for providing map service, and file distribution system
US10553113B2 (en)2018-06-182020-02-04Skip Transport, Inc.Method and system for vehicle location

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
KR102244248B1 (en)*2014-04-012021-04-26삼성전자주식회사Operating Method For content and Electronic Device supporting the same
US9638538B2 (en)2014-10-142017-05-02Uber Technologies, Inc.Street-level guidance via route path
JP6661883B2 (en)*2015-02-092020-03-11株式会社デンソー Vehicle display control device and vehicle display control method
KR102392998B1 (en)*2015-08-032022-05-02현대모비스 주식회사Route guidandce apparatus and control method for the same
US11095727B2 (en)*2015-12-222021-08-17Samsung Electronics Co., Ltd.Electronic device and server for providing service related to internet of things device
JP7008315B2 (en)*2017-08-012022-01-25株式会社冲セキ Server system, methods and programs executed by the server system
KR102541069B1 (en)*2017-12-282023-06-07현대오토에버 주식회사Apparatus and method for guiding destination
KR102040566B1 (en)*2018-01-042019-11-06라인플러스 주식회사Method, system, and non-transitory computer readable medium for providing pick up place
JP7000296B2 (en)*2018-10-312022-01-19トヨタ自動車株式会社 Demand forecast information display control method, display control device, and display control program
KR102584440B1 (en)*2018-12-062023-10-05한국전자통신연구원Driving Guide Apparatus and System for Providing with a Language Description of Image Characteristics

Patent Citations (52)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
KR100235240B1 (en) | 1995-10-31 | 1999-12-15 | Mori Haruo | Navigation apparatus
JP2000207577A (en) | 1999-01-13 | 2000-07-28 | Sumitomo Electric Ind Ltd | 3D map display device
US20030093216A1 (en)* | 2000-04-27 | 2003-05-15 | Yoshikazu Akiyama | Navigation system and memory medium storing the position data of the specific facilities
US6721654B2 (en)* | 2000-04-27 | 2004-04-13 | Toyota Jidosha Kabushiki Kaisha | Navigation system and memory medium storing the position data of the specific facilities
US7260473B2 (en)* | 2000-06-29 | 2007-08-21 | Nokia Corporation | Method and mobile station for route guidance
US20020091793A1 (en)* | 2000-10-23 | 2002-07-11 | Isaac Sagie | Method and system for tourist guiding, including both navigation and narration, utilizing mobile computing and communication devices
JP2002168637A (en) | 2000-11-30 | 2002-06-14 | Alpine Electronics Inc | Navigation system
JP2003254775A (en) | 2002-03-05 | 2003-09-10 | Denso Corp | Navigation device
JP2004048674A (en) | 2002-05-24 | 2004-02-12 | Olympus Corp | Information presentation system of visual field agreement type, portable information terminal, and server
US20090119008A1 (en)* | 2002-08-05 | 2009-05-07 | Sony Corporation | Electronic guide system, contents server for electronic guide system, portable electronic guide device, and information processing method for electronic guide system
US20040107048A1 (en)* | 2002-11-30 | 2004-06-03 | Tatsuo Yokota | Arrival detection method for navigation system
JP2004219293A (en) | 2003-01-16 | 2004-08-05 | Hitachi Software Eng Co Ltd | Destination guiding system associated with photograph data representing real scene
US20060156209A1 (en)* | 2003-02-25 | 2006-07-13 | Satoshi Matsuura | Application program prediction method and mobile terminal
US20040243307A1 (en)* | 2003-06-02 | 2004-12-02 | Pieter Geelen | Personal GPS navigation device
US7383123B2 (en)* | 2003-06-03 | 2008-06-03 | Samsung Electronics Co., Ltd. | System and method of displaying position information including an image in a navigation system
US20040249565A1 (en)* | 2003-06-03 | 2004-12-09 | Young-Sik Park | System and method of displaying position information including an image in a navigation system
US20050021227A1 (en)* | 2003-06-06 | 2005-01-27 | Shuichi Matsumoto | Navigation apparatus
US20050004754A1 (en)* | 2003-07-03 | 2005-01-06 | David Hayes | Navigation method and apparatus for learning and updating position of street address
US20050096842A1 (en)* | 2003-11-05 | 2005-05-05 | Eric Tashiro | Traffic routing method and apparatus for navigation system to predict travel time and departure time
US20050182564A1 (en)* | 2004-02-13 | 2005-08-18 | Kim Seung-Ii | Car navigation device using forward real video and control method thereof
US20120059720A1 (en)* | 2004-06-30 | 2012-03-08 | Musabji Adil M | Method of Operating a Navigation System Using Images
US20060075442A1 (en)* | 2004-08-31 | 2006-04-06 | Real Data Center, Inc. | Apparatus and method for producing video drive-by data corresponding to a geographic location
US20060052934A1 (en)* | 2004-09-08 | 2006-03-09 | Aisin Aw Co., Ltd. | Navigation apparatus
US20070198182A1 (en)* | 2004-09-30 | 2007-08-23 | Mona Singh | Method for incorporating images with a user perspective in navigation
KR20060066859A (en) | 2004-12-14 | 2006-06-19 | Samsung Electronics Co., Ltd. | Map display device and method in navigation system
US20060129316A1 (en) | 2004-12-14 | 2006-06-15 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying map in a navigation system
US20060140448A1 (en)* | 2004-12-27 | 2006-06-29 | Konica Minolta Photo Imaging, Inc. | Image capturing apparatus and navigation system
US20100161658A1 (en)* | 2004-12-31 | 2010-06-24 | Kimmo Hamynen | Displaying Network Objects in Mobile Devices Based on Geolocation
US7561958B2 (en)* | 2005-04-28 | 2009-07-14 | Aisin Aw Co., Ltd. | Information providing system and navigation apparatus
US20060287815A1 (en)* | 2005-06-21 | 2006-12-21 | Mappick Technologies, LLC | Navigation system and method
US7933395B1 (en)* | 2005-06-27 | 2011-04-26 | Google Inc. | Virtual tour of user-defined paths in a geographic information system
US20070073478A1 (en)* | 2005-09-27 | 2007-03-29 | Yuji Funato | Point search apparatus and in-vehicle navigation apparatus
US20070106469A1 (en)* | 2005-11-09 | 2007-05-10 | Denso Corporation | Navigation system
US20070162942A1 (en)* | 2006-01-09 | 2007-07-12 | Kimmo Hamynen | Displaying network objects in mobile devices based on geolocation
JP2007206298A (en) | 2006-02-01 | 2007-08-16 | Xanavi Informatics Corp | In-vehicle map display apparatus
US20080040024A1 (en)* | 2006-08-10 | 2008-02-14 | Andrew De Silva | Method and apparatus of displaying three-dimensional arrival screen for navigation system
US20080187181A1 (en)* | 2007-02-06 | 2008-08-07 | Meadow William D | Methods and apparatus for generating a continuum of image data
US20090177383A1 (en)* | 2008-01-07 | 2009-07-09 | Simone Francine Tertoolen | Navigation device and method
US20090254268A1 (en)* | 2008-04-07 | 2009-10-08 | Microsoft Corporation | Computing navigation device with enhanced route directions view
US20090289937A1 (en)* | 2008-05-22 | 2009-11-26 | Microsoft Corporation | Multi-scale navigational visualization
US8649973B2 (en)* | 2008-06-09 | 2014-02-11 | Kabushiki Kaisha Kenwood | Guide display device and guide display method, and display device and method for switching display contents
US20110106432A1 (en)* | 2008-06-09 | 2011-05-05 | Kabushiki Kaisha Kenwood | Guide display device and guide display method, and display device and method for switching display contents
US20100073201A1 (en)* | 2008-09-24 | 2010-03-25 | Denso International America, Inc. | Car finder by cell phone
US20100123737A1 (en)* | 2008-11-19 | 2010-05-20 | Apple Inc. | Techniques for manipulating panoramas
US20100257195A1 (en)* | 2009-02-20 | 2010-10-07 | Nikon Corporation | Mobile information device, image pickup device, and information acquisition system
US20100215250A1 (en)* | 2009-02-24 | 2010-08-26 | Google Inc. | System and method of indicating transition between street level images
JP2010230551A (en) | 2009-03-27 | 2010-10-14 | Sony Corp | Navigation apparatus and navigation method
US20100250581A1 (en)* | 2009-03-31 | 2010-09-30 | Google Inc. | System and method of displaying images based on environmental conditions
US20100293173A1 (en)* | 2009-05-13 | 2010-11-18 | Charles Chapin | System and method of searching based on orientation
US20110313653A1 (en) | 2010-06-21 | 2011-12-22 | Research In Motion Limited | Method, Device and System for Presenting Navigational Information
US20130325481A1 (en)* | 2012-06-05 | 2013-12-05 | Apple Inc. | Voice instructions during navigation
US20150241235A1 (en)* | 2014-02-21 | 2015-08-27 | Volkswagen AG | Display of estimated time to arrival at upcoming personalized route waypoints

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Japanese Office Action issued in Japanese Patent Application No. 2014-186719, dated Aug. 25, 2015.
Japanese Office Action issued in Japanese Patent Application No. 2014-186719, dated Apr. 12, 2016.
Korean Office Action issued in corresponding Korean Patent Application No. 10-2013-0115060, dated Oct. 15, 2014.

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20150324389A1 (en)* | 2014-05-12 | 2015-11-12 | Naver Corporation | Method, system and recording medium for providing map service, and file distribution system
US11880417B2 (en)* | 2014-05-12 | 2024-01-23 | Naver Corporation | Method, system and recording medium for providing map service, and file distribution system
US10553113B2 (en) | 2018-06-18 | 2020-02-04 | Skip Transport, Inc. | Method and system for vehicle location

Also Published As

Publication number | Publication date
KR20150034997A (en) | 2015-04-06
US20150094955A1 (en) | 2015-04-02
JP2015068828A (en) | 2015-04-13

Similar Documents

Publication | Title
US9854395B2 (en) | Methods and systems for notifying user of destination by route guide
US12332060B2 (en) | Localizing transportation requests utilizing an image based transportation request interface
EP2769333B1 (en) | Video based pedestrian traffic estimation
US9080877B2 (en) | Customizing destination images while reaching towards a desired task
EP3647725B1 (en) | Real-scene navigation method and apparatus, device, and computer-readable storage medium
US10739153B2 (en) | Auxiliary navigational assistance
CN111161008A (en) | AR/VR/MR ride sharing assistant
US9696174B2 (en) | System and method for providing surrounding area search result
US20130231857A1 (en) | Method and apparatus for triggering conveyance of guidance information
US20150134850A1 (en) | Method and apparatus for probe-based routing
CN108088450A (en) | Navigation method and device
US10852149B2 (en) | Navigation data processing system, apparatus and computer readable medium
WO2015139788A1 (en) | Method and apparatus for providing sharing of navigation route and guidance information among devices
CN111127929A (en) | Arrival reminder method, device, terminal and storage medium
CN102984654B (en) | Mobile terminal group tracking service method based on internet of things resource sharing platform
CN108267142A (en) | Navigation display method and system based on an address card, and an in-vehicle device
KR101421411B1 (en) | Method for providing map information, system thereof, terminal thereof and apparatus thereof
CN103644921B (en) | Method and apparatus for realizing street view display
KR20230153117A (en) | Method and apparatus for providing vehicle status information
KR20150015836A (en) | System for providing travel information based on cloud and providing method thereof
CN113739800B (en) | Navigation guidance method and computer program product
US10740612B2 (en) | Location determination
US9816828B2 (en) | Dynamic attraction inference in route recommendations
KR20140030626A (en) | Method for managing data traffic, system thereof, apparatus thereof and terminal thereof
US20230175851A1 (en) | Data-analysis-based navigation system assistance

Legal Events

Date | Code | Title | Description

AS | Assignment
Owner name: NAVER CORPORATION, KOREA, REPUBLIC OF
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, YOON SHICK;PARK, MIN SIK;KIM, MIN OH;SIGNING DATES FROM 20140806 TO 20140807;REEL/FRAME:033769/0696

STCF | Information on status: patent grant
Free format text: PATENTED CASE

MAFP | Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 4

MAFP | Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8

