INTERACTIVE GEOSPATIAL MAP

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Provisional Patent Application No. 61/820,608, filed on May 7, 2013, and titled "INTERACTIVE DATA OBJECT MAP," which is hereby incorporated by reference herein in its entirety.
TECHNICAL FIELD
The present disclosure relates to systems and techniques for geographical data integration, analysis, and visualization. More specifically, the present disclosure relates to interactive maps including data objects.
BACKGROUND
Interactive geographical maps, such as web-based mapping service applications and Geographical Information Systems (GIS), are available from a number of providers. Such maps generally comprise satellite images or generic base layers overlaid by roads. Users of such systems may generally search for and view locations of a small number of landmarks, and determine directions from one location to another. In some interactive graphical maps, 3D terrain and/or 3D buildings may be visible in the interface.
SUMMARY
The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.
The systems, methods, and devices of the present disclosure may provide, among other features, high-performance, interactive geospatial and/or data object map capabilities in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. In various embodiments, an interactive geospatial map system (also referred to as an interactive data object map system) may enable rapid and deep analysis of various objects, features, and/or metadata by the user. In some embodiments, a layer ontology may be displayed to the user. In various embodiments, when the user rolls a selection cursor over an object/feature, an outline of the object/feature is displayed. Selection of an object/feature may cause display of metadata associated with that object/feature. In various embodiments, the interactive data object map system may automatically generate feature/object lists and/or histograms based on selections made by the user. Various aspects of the present disclosure may enable the user to perform geosearches, generate heatmaps, and/or perform keyword searches, among other actions.
In a first aspect, there is provided a computer system comprising: a computer readable medium storing software modules including computer executable instructions; and one or more hardware processors in communication with the computer readable medium and an electronic data structure, the electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata, the one or more hardware processors being configured to execute a user interface module of the software modules in order to: display an interactive map on an electronic display of the computer system; include on the interactive map one or more features or objects, wherein the features or objects are selectable by a user of the computer system, and wherein the features or objects are accessed from the electronic data structure; receive a first input from the user selecting a plurality of the included features or objects; and in response to the first input, access, from the electronic data structure, metadata associated with respective selected features or objects; determine one or more metadata categories associated with at least one of the accessed metadata; and for each of the determined metadata categories: generate one or more histograms for respective metadata values or value ranges, each of the histograms including a visual indicator indicating a quantity of the selected plurality of features or objects included on the interactive map having the respective metadata value or value range; and display the one or more histograms on the electronic display.

The features or objects may comprise vector data.
The features or objects may comprise at least one of roads, terrain, lakes, rivers, vegetation, utilities, street lights, railroads, hotels or motels, schools, hospitals, buildings or structures, regions, transportation objects, entities, events, or documents.

[0008A] In a second aspect, there is provided a computer-implemented method comprising operating a computer system comprising one or more hardware processors to:
    display an interactive map on an electronic display of the computer system;
    access one or more features or objects from an electronic data structure, the electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata;
    include on the interactive map the one or more features or objects, wherein the features or objects are selectable by a user of the computer system;
    receive a first input from the user selecting a plurality of the included features or objects; and
    in response to the first input,
        access, from the electronic data structure, metadata associated with respective selected features or objects;
        determine one or more metadata categories associated with at least one of the accessed metadata; and
        for each of the determined metadata categories:
            generate one or more histograms for respective metadata values or value ranges, each of the histograms including a visual indicator indicating a quantity of the selected plurality of features or objects included on the interactive map having the respective metadata value or value range; and
            display the one or more histograms on the electronic display.

[0008B] In a third aspect, there is provided a non-transitory computer-readable medium comprising instructions for execution by a computer system comprising one or more hardware processors in order to cause the computer system to perform the method in accordance with the second aspect.

The metadata associated with the features or objects may comprise at least one of a location, a city, a county, a state, a country, an address, a district, a grade level, a phone number, a speed, a width, or other related attributes.
The features or objects may be selectable by a user using a mouse and/or a touch interface.
Each histogram of the one or more histograms may be specific to a particular metadata category.
Each histogram of the one or more histograms may comprise a list of items of metadata specific to the particular metadata category of the histogram, wherein the list of items is organized in descending order from an item having the largest number of related objects or features to an item having the smallest number of related objects or features.
The one or more histograms displayed on the electronic display may be displayed so as to partially overlay the displayed interactive map.
The one or more hardware processors may be further configured to execute the user interface module in order to: receive a second input from the user selecting a second one or more features or objects from the one or more histograms; and in response to the second input, update the interactive map to display the second one or more features or objects on the display; and highlight the second one or more features or objects on the interactive map.
Updating the interactive map may comprise panning and/or zooming.
Highlighting the second one or more features may comprise at least one of outlining, changing color, bolding, or changing contrast.
The one or more hardware processors may be further configured to execute the user interface module in order to: receive a third input from the user selecting a drill-down group of features or objects from the one or more histograms; and in response to the third input, drill down on the selected drill-down group of features or objects by: accessing the metadata associated with each of the features or objects of the selected drill-down group; determining one or more drill-down metadata categories based on the accessed metadata associated with each of the features or objects of the selected drill-down group; organizing the features or objects of the selected drill-down group into one or more drill-down histograms based on the determined drill-down metadata categories and the accessed metadata associated with each of the features or objects of the selected drill-down group; and displaying on the interactive map the one or more drill-down histograms.
The one or more hardware processors may be further configured to execute the user interface module in order to enable the user to further drill down into the one or more drill-down histograms.
The one or more hardware processors may be further configured to execute the user interface module in order to: receive a feature or object hover-over input from the user; and in response to receiving the hover-over input, highlight, on the electronic display, metadata associated with the particular hovered-over feature or object to the user.
The one or more hardware processors may be further configured to execute the user interface module in order to: receive a feature or object selection input from the user; and in response to receiving the selection input, display, on the electronic display, metadata associated with the particular selected feature or object to the user.
Also described is a computer system comprising: an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map tiles accessed from the electronic data structure, the map tiles each comprising an image composed of one or more vector layers; include on the interactive map a plurality of features or objects accessed from the electronic data structure, the features or objects being selectable by a user, each of the features or objects including associated metadata; receive an input from a user including at least one of a zoom action, a pan action, a feature or object selection, a layer selection, a geosearch, a heatmap, and a keyword search; and in response to the input from the user: request, from a server, updated map tiles, the updated map tiles being updated according to the input from the user; receive the updated map tiles from the server; and update the interactive map with the updated map tiles.
The one or more vector layers may comprise at least one of a regions layer, a buildings/structures layer, a terrain layer, a transportation layer, or a utilities/infrastructure layer.
Each of the one or more vector layers may comprise one or more sub-vector layers.
Also described is a computer system comprising: a computer readable medium storing software modules including computer executable instructions; and one or more hardware processors in communication with the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map layers; determine a list of available map layers; organize the list of available map layers according to a hierarchical layer ontology, wherein like map layers are grouped together; and display on the interactive map the hierarchical layer ontology, wherein the user may select one or more of the displayed layers, and wherein each of the available map layers is associated with one or more feature or object types.
The map layers may comprise at least one of vector layers and base layers.

[0025A] As used herein, except where the context requires otherwise, the term "comprise" and variations of the term, such as "comprising", "comprises" and "comprised", are not intended to exclude further components, integers or steps.
BRIEF DESCRIPTION OF THE DRAWINGS
The following aspects of the disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings.
Figure 1 illustrates a sample user interface of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 2A illustrates a sample user interface of the interactive data object map system in which map layers are displayed to a user, according to an embodiment of the present disclosure.

Figure 2B illustrates an example map layer ontology, according to an embodiment of the present disclosure.

Figure 2C illustrates a sample user interface of the interactive data object map system in which various objects are displayed, according to an embodiment of the present disclosure.

Figure 3A illustrates a sample user interface of the interactive data object map system in which objects are selected, according to an embodiment of the present disclosure.

Figures 3B-3G illustrate sample user interfaces of the interactive data object map system in which objects are selected and a histogram is displayed, according to embodiments of the present disclosure.

Figures 3H-3I illustrate sample user interfaces of the interactive data object map system in which objects are selected and a list of objects is displayed, according to embodiments of the present disclosure.

Figures 3J-3K illustrate sample user interfaces of the interactive data object map system in which objects are outlined when hovered over, according to embodiments of the present disclosure.

Figures 4A-4D illustrate sample user interfaces of the interactive data object map system in which a radius geosearch is displayed, according to embodiments of the present disclosure.

Figures 5A-5D illustrate sample user interfaces of the interactive data object map system in which a heatmap is displayed, according to embodiments of the present disclosure.

Figures 5E-5F illustrate sample user interfaces of the interactive data object map system in which a shape-based geosearch is displayed, according to embodiments of the present disclosure.

Figure 5G illustrates a sample user interface of the interactive data object map system in which a keyword object search is displayed, according to an embodiment of the present disclosure.

Figure 5H illustrates an example of a UTF grid of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 6A shows a flow diagram depicting illustrative client-side operations of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 6B shows a flow diagram depicting illustrative client-side metadata retrieval of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 7A shows a flow diagram depicting illustrative server-side operations of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 7B shows a flow diagram depicting illustrative server-side layer composition of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 8A illustrates one embodiment of a database system using an ontology.

Figure 8B illustrates one embodiment of a system for creating data in a data store using a dynamic ontology.

Figure 8C illustrates a sample user interface using relationships described in a data store using a dynamic ontology.

Figure 8D illustrates a computer system with which certain methods discussed herein may be implemented.
DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Overview
In general, a high-performance, interactive data object map system (or "map system") is disclosed in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. The interactive data object map system allows for rapid and deep analysis of various objects, features, and/or metadata by the user. For example, millions of data objects and/or features may be simultaneously viewed and selected by the user on the map interface. A layer ontology may be displayed to the user that allows the user to select and view particular layers. In various embodiments, when the user rolls a selection cursor over an object/feature (and/or otherwise selects the object/feature), an outline of the object/feature is displayed. Selection of an object/feature may cause display of metadata associated with that object/feature.
In an embodiment, the user may rapidly zoom in and out and/or move and pan around the map interface to variously see more or less detail, and more or fewer objects. In various embodiments, the interactive data object map system may automatically generate feature/object lists and/or histograms based on selections made by the user. In various embodiments, the user may perform geosearches (based on any selections and/or drawn shapes), generate heatmaps, and/or perform keyword searches, among other actions as described below.
In an embodiment, the interactive data object map system includes server-side computer components and/or client-side computer components. The client-side components may implement, for example, displaying map tiles, showing object outlines, allowing the user to draw shapes, and/or allowing the user to select objects/features, among other actions. The server-side components may implement, for example, composition of layers into map tiles, caching of composed map tiles and/or layers, and/or providing object/feature metadata, among other actions. Such functions may be distributed in any other manner. In an embodiment, object/feature outlines and/or highlighting are accomplished on the client side through the use of a UTF grid.
Definitions
In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.
Ontology: A hierarchical arrangement and/or grouping of data according to similarities and differences. The present disclosure describes two ontologies. The first relates to the arrangement of vector layers consisting of map and object data as used by the interactive data object map system (as described below with reference to Figures 2A-2B). The second relates to the storage and arrangement of data objects in one or more databases (as described below with reference to Figures 8A-8C). For example, the stored data may comprise definitions for object types and property types for data in a database, and how objects and properties may be related.
Database: A broad term for any data structure for storing and/or organizing data, including, but not limited to, relational databases (an Oracle database, a MySQL database, etc.), spreadsheets, XML files, and text files, among others.
Data Object, Object, or Feature: A data container for information representing specific things in the world that have a number of definable properties. For example, a data object can represent an entity such as a person, a place, an organization, a market instrument, or other noun. A data object can represent an event that happens at a point in time or for a duration. A data object can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object may be associated with a unique identifier that uniquely identifies the data object. The object's attributes (e.g., metadata about the object) may be represented in one or more properties. For the purposes of the present disclosure, the terms "feature," "data object," and "object" may be used interchangeably to refer to items displayed on the map interface of the interactive data object map system, and/or otherwise accessible to the user through the interactive data object map system. Features/objects may generally include, but are not limited to, roads, terrain (such as hills, mountains, rivers, and vegetation, among others), street lights (which may be represented by a streetlight icon), railroads, hotels/motels (which may be represented by a bed icon), schools (which may be represented by a parent-child icon), hospitals, other types of buildings or structures, regions, transportation objects, and other types of entities, events, and documents, among others. Objects displayed on the map interface generally comprise vector data, although other types of data may also be displayed. Objects generally have associated metadata and/or properties.
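By way of illustration only, a data object of this kind might be modeled on the client side as in the following minimal sketch; the interface and field names are assumptions made for this example, not the system's actual schema:

```typescript
// Illustrative sketch of a data object/feature as described above.
// Field names are assumptions for illustration, not the actual schema.
interface DataObject {
  id: string;             // unique identifier for the object
  objectType: string;     // e.g., "Person", "Event", "Document", "Road"
  properties: Property[]; // the object's metadata
}

interface Property {
  type: string;           // property type, e.g., "address", "speed"
  value: string | number; // a property value; a property may have several
}

// Example: a school feature carrying the metadata items mentioned below.
const school: DataObject = {
  id: "school-123",
  objectType: "School",
  properties: [
    { type: "address", value: "123 S. Orange Street" },
    { type: "district", value: "509c" },
    { type: "gradeLevel", value: "K-6" },
    { type: "phoneNumber", value: "800-0000" },
  ],
};
```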
Object Type: Type of a data object (e.g., Person, Event, or Document). Object types may be defined by an ontology and may be modified or updated to include additional object types. An object definition (e.g., in an ontology) may include how the object is related to other objects, such as being a sub-object type of another object type (e.g., an agent may be a sub-object type of a person object type), and the properties the object type may have.
Properties: Also referred to as "metadata," includes attributes of a data object/feature. At a minimum, each property/metadata of a data object has a type (such as a property type) and a value or values. Properties/metadata associated with features/objects may include any information relevant to that feature/object. For example, metadata associated with a school object may include an address (for example, 123 S. Orange Street), a district (for example, 509c), a grade level (for example, K-6), and/or a phone number (for example, 800-0000), among other items of metadata. In another example, metadata associated with a road object may include a speed (for example, 25 mph), a width (for example, 2 lanes), and/or a county (for example, Arlington), among other items of metadata.
Property Type: The data type of a property, such as a string, an integer, or a double. Property types may include complex property types, such as a series of data values associated with timed ticks (e.g., a time series), etc.
Property Value: The value associated with a property, which is of the type indicated in the property type associated with the property. A property may have multiple values.
Link: A connection between two data objects, based on, for example, a relationship, an event, and/or matching properties. Links may be directional, such as one representing a payment from person A to B, or bidirectional.
Link Set: Set of multiple links that are shared between two or more data objects.

Description of the Figures
Embodiments of the disclosure will now be described with reference to the accompanying Figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure. Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the embodiments of the disclosure herein described.
Figure 1 illustrates a sample user interface of the interactive data object map system, according to an embodiment of the present disclosure. The user interface includes a map interface 100, a selection button/icon 102, a shape button/icon 104, a layers button/icon 106, a geosearch button/icon 108, a heat map button/icon 110, a search box 112, a feature information box 114, a coordinates information box 116, map scale information 118, zoom selectors 120, and highlighted features 122. The functionality of the interactive data object map system may be implemented in one or more computer modules and/or processors, as is described below with reference to Figure 8D.
The map interface 100 of Figure 1 is composed of multiple map tiles. The map tiles are generally composed of multiple layers of geographical, vector, and/or other types of data. Vector data layers (also referred to as vector layers) may include associated and/or linked data objects/features. In an embodiment, vector layers are composed of data objects/features. The various data objects and/or features associated with a particular vector layer may be displayed to the user when that particular vector layer is activated. For example, a transportation vector layer may include road, railroad, and bike path objects and/or features that may be displayed to the user when the transportation layer is selected. The layers used to compose the map tiles and the map interface 100 may vary based on, for example, whether a user has selected features displayed in the map interface 100, and/or the particular layers a user has selected for display. In an embodiment, composition of map tiles is accomplished by server-side components of the interactive data object map system. In an embodiment, composed map tiles may be cached by the server-side components to speed up map tile delivery to client-side components. The map tiles may then be transmitted to the client-side components of the interactive data object map system where they are composed into the map interface 100.
In general, the user interface of Figure 1 is displayed on an electronic display viewable by a user of the interactive data object map system. The user of the interactive data object map system may interact with the user interface of Figure 1 by, for example, touching the display when the display is touch-enabled and/or using a mouse pointer to click on the various elements of the user interface.
The map interface 100 includes various highlighted features 122 and feature icons. For example, the map interface 100 includes roads, buildings and structures, utilities, lakes, rivers, vegetation, and railroads, among other features.
The user may interact with the map interface 100 by, for example, rolling over and/or clicking on various features. In one embodiment, rolling over and/or placing the mouse pointer over a feature causes the feature to be outlined and/or otherwise highlighted. Additionally, the name of the feature and/or other information about the feature may be shown in the feature information box 114.
The user of the map system may interact with the user interface of Figure 1 by scrolling or panning up, down, and/or side to side; zooming in or out; selecting features; drawing shapes; selecting layers; performing a geosearch; generating a heat map; and/or performing a keyword search; among other actions as are described below. Various user actions may reveal more or less map detail, and/or more or fewer features/objects.
Figure 2A illustrates a sample user interface of the map system in which map layers are displayed to a user, according to an embodiment of the present disclosure. In the user interface of Figure 2A, the user has selected the layers button 106, revealing the layers window 202. The layers window 202 includes a list of base layers, vector layers, and user layers. The base layers include, for example, overhead imagery, topographic, blank (Mercator), base map, aviation, and blank (unprojected). The vector layers include general categories such as, for example, regions, buildings/structures, terrain, transportation, and utilities/infrastructure. While no user layers are included in the user interface of Figure 2A, user layers may be added by the user of the map system, as is described below.
In an embodiment, the user may select one or more of the base layers which may be used during composition of the map tiles. For example, selection of the overhead imagery base layer will produce map tiles in which the underlying map tile imagery is made up of recent aerial imagery. Similarly, selection of the topographic base layer will produce map tiles in which the underlying map tile imagery includes topographic map imagery.
Further, in an embodiment, the user may select one or more of the vector layers which may be used during composition of the map tiles. For example, selecting the transportation layer results in transportation-related objects and/or features being displayed on the map tiles. Transportation-related features may include, for example, roads, railroads, street signs, and/or street lights, among others. Examples of transportation-related features may be seen in the user interface of Figure 2A where various roads, railroads, and street light icons are displayed.
In an embodiment, the user of the map system may create and save map layers. These saved map layers may be listed as user layers in the layers window 202.
Figure 2B illustrates an example map layer ontology, according to an embodiment of the present disclosure. As mentioned above with reference to Figure 2A, the list of vector layers in the layers window 202 may include general categories/layers such as regions, buildings/structures, terrain, transportation, and utilities/infrastructure. The vector layers available in the map system may be further organized into an ontology, or hierarchical arrangement. For example, as shown in the vector layers window 206, the buildings/structures category 208 may be further subdivided into layers including structures, government, medical, education, and commercial. The terrain category 210 may include vegetation and/or water/hydrography layers. The utilities/infrastructure category may include fire and/or storage/draining.
In an embodiment, the user of the map system may select one or more of the layers and/or sub-layers of the layer ontology. As shown in Figure 2B, the user has deselected the vegetation sub-layer, and all of the utilities/infrastructure layers. Selecting and deselecting vector layers, or toggling vector layers on and off, may cause the vector objects and/or features associated with those layers to be displayed or not displayed in the map interface. For example, when the user selects the transportation category/layer, road objects associated with the transportation layer may be displayed on the map interface. Likewise, when a user deselects the transportation category/layer, road objects associated with the transportation layer may be removed from the map interface.
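By way of illustration, toggling a vector layer on or off might translate into a request for re-composed map tiles along the lines of the following minimal sketch; the function names and the tile URL scheme are assumptions for this example, not the system's actual interface:

```typescript
// Illustrative sketch: maintain the set of active vector layers and request
// re-composed map tiles whenever the selection changes. Names are assumptions.
const activeLayers = new Set<string>(["transportation", "buildings/structures"]);

function toggleLayer(layerId: string): void {
  if (activeLayers.has(layerId)) {
    activeLayers.delete(layerId); // deselect: associated features disappear
  } else {
    activeLayers.add(layerId);    // select: associated features appear
  }
  refreshTiles([...activeLayers]);
}

function refreshTiles(layers: string[]): void {
  const query = encodeURIComponent(layers.join(","));
  // A hypothetical request per visible tile, e.g.:
  //   fetch(`/tiles/${z}/${x}/${y}.png?layers=${query}`)
  console.log(`requesting tiles composed from layers: ${query}`);
}
```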
In an embodiment, additional hierarchical levels of layers may be displayed to the user. For example, the vector layers window 206 may include sub-sub-layers (for example, the education sub-layer may be divided into elementary schools, secondary schools, and post-secondary schools). Alternatively, fewer hierarchical levels may be displayed to the user.
In an embodiment, each of the vector layers shown in the vector layers window 206 may be made up of many layers of map vector data. In this embodiment, the map system may advantageously generate a simplified layer ontology, such as the one shown in 206. The simplified layer ontology allows the user to easily select layers of interest from a reduced number of layers, rather than a large number of discrete layers. As described above, vector layers may contain data regarding associated features and/or objects. Thus, features visible in the map interface correspond to the currently active/selected layers. In an embodiment, the layer ontology may have an arbitrary depth.
Figure 2C illustrates a sample user interface of the map system in which various objects are displayed, according to an embodiment of the present disclosure. The user interface of Figure 2C includes a map interface 214, an outlined feature 216, and a feature information box 114 indicating that the outlined feature 216 is called "Union Park." Various features/objects may be seen in the map interface 214 including, for example, roads, buildings, terrain, street lights (represented by a streetlight icon), railroads, hotels/motels (represented by a bed icon), and schools (represented by a parent-child icon), among other features.
Figure 3A illustrates a sample user interface of the map system in which objects are selected, according to an embodiment of the present disclosure. The user interface of Figure 3A includes a highlighted user selection rectangle 302. The highlighted user selection rectangle 302 illustrates the user actively selecting a particular region of the map interface so as to select the features/objects that fall within the bounds of that rectangle. In an embodiment, visible features may be selected by the user, while features that are not currently visible are not selectable. For example, features related to layers that are not currently active are not selected when the user performs a selection. In another embodiment, even features that are not visible in a selected area may be selected.
Figures 3B-3C illustrate sample user interfaces of the map system in which objects are selected and a feature histogram 304 is displayed in a selection window, according to embodiments of the present disclosure. The selected objects/features of Figure 3B (including roads 310 and other features 312) may have been selected via the highlighted user selection rectangle 302 of Figure 3A. Selected features are indicated by highlighting and/or altered colors on the map tiles making up the map interface.
Feature histogram 304 is shown in a selection window included in the user interface of Figure 3B. The histogram 304 shows a categorized histogram of all objects/features selected by the user in the map interface. The histogram divides the features into common buckets and/or categories based on related metadata (also referred to as metadata categories). For example, at 306, "Belongs to Layer" indicates that the following histogram includes all selected features organized by layer category. In this example there are over 70,000 selected buildings/structures features, over 40,000 selected facility features, and over 6,000 selected road features, among others. Further, the feature histogram 304 includes histograms of the selected objects organized by account and acreage. In various embodiments, the map system may select histogram categories and/or metadata categories based on, for example, the features selected and/or types of features selected, among others. Any other categorization of selected features may be displayed in the histograms of the feature histogram 304.
In an embodiment, the user of the map system may select a subset of the selected features for further analysis and/or histogram generation. For example, the user may select a subset comprising selected objects belonging to the road category by, for example, clicking on the roads item 308. This selection may result in "drilling down" to histograms of that subset of features, as shown in Figure 3C. Thus, a drill-down group of features/objects (for example, the subset of features/objects) may be used by the map system to determine new drill-down metadata categories, or buckets of related metadata. At 314 in Figure 3C, the arrow icon indicates that, of the originally selected 124,172 features, the feature histogram now shows an analysis of the 6,724 features belonging to the road category (see item 316). The feature histogram window of Figure 3C thus shows a new set of histograms organized by layer, address, addressed, and agency, among others. The user may thus "drill down" and "drill up" through the selected features via the displayed histograms.
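To make the bucketing concrete, the following minimal sketch groups selected features into per-category histograms and then drills into a clicked histogram item. It builds on the illustrative DataObject interface above and is an assumption about one possible approach, not the system's actual implementation:

```typescript
// Illustrative sketch: bucket selected features by metadata category, then
// count the features having each metadata value within that category.
function computeHistograms(
  selected: DataObject[],
): Map<string, Map<string | number, number>> {
  const histograms = new Map<string, Map<string | number, number>>();
  for (const obj of selected) {
    for (const prop of obj.properties) {
      let buckets = histograms.get(prop.type);
      if (!buckets) {
        buckets = new Map();
        histograms.set(prop.type, buckets);
      }
      buckets.set(prop.value, (buckets.get(prop.value) ?? 0) + 1);
    }
  }
  return histograms;
}

// "Drilling down" keeps only the features matching the clicked histogram item
// (e.g., category "speed", value 25), after which histograms are recomputed.
function drillDown(
  selected: DataObject[],
  category: string,
  value: string | number,
): DataObject[] {
  return selected.filter((obj) =>
    obj.properties.some((p) => p.type === category && p.value === value),
  );
}

// Usage: histograms for a drill-down group, e.g., selected roads with speed 25.
// const roads25 = drillDown(selectedFeatures, "speed", 25);
// const subHistograms = computeHistograms(roads25);
```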
In an embodiment, items selected in the feature histogram are correspondingly highlighted in the map interface of the map system. For example, in the map interface of Figure 3B, the user has selected the roads in the histogram at 308. Corresponding features (in this example, roads) are thus highlighted in the map interface (as shown at 310).
Figures 3D-3G illustrate additional example user interfaces of the map system in which objects are selected from a histogram and correspondingly highlighted in the map interface, according to embodiments of the present disclosure. In Figures 3D-3F, in the selection window, the user is viewing a histogram of all selected roads organized according to the road speed limit. In Figure 3D, the user has selected (at 318) roads with speed limits of 55 and 65. The corresponding road features are highlighted in the map interface at, for example, 320. In Figure 3E, the user has selected (at 322) roads with speed limits of 45, 40, 55, and 65. The corresponding road features are highlighted in the map interface at, for example, 324. In Figure 3F, the user has selected (at 326) roads with speed limits of 25. The corresponding road features are highlighted in the map interface at, for example, 328. In Figure 3G, the user may "drill down" into the histogram by, for example, right-clicking on an item and selecting "Remove other objects in histogram" (330).
Figures 3H and 3I illustrate sample user interfaces of the map system in which objects are selected and a list of selected objects 332 is displayed in the selection window, according to embodiments of the present disclosure. With reference to Figure 3H, the list of features 332 indicates that the user has drilled down further into the selected features of Figure 3G by selecting a subset of selected features consisting of only roads with speed limits of 20. Thus, the subset of the example of Figure 3H includes the 163 features that are roads with speed limits of 20. The user has additionally selected to view the list of features 332 in the selection window (rather than the feature histogram). The list of features 332 lists each individual feature that is included in the currently selected subset. For example, the list includes S Central Av 334, among others.
In Figure 3I, the user has selected feature Hamilton St at 336. In an embodiment, when a feature is selected from the list of features, the map interface automatically zooms to the location of that feature. The user may select the feature from the list of features by clicking on the name of the feature and/or the displayed thumbnail. In an embodiment, the map interface only zooms to the feature when the user clicks on, and/or selects, the thumbnail associated with the feature. In the example of Figure 3I, the map interface is automatically zoomed to the location of the selected Hamilton St, and the selected feature is highlighted (338). Additionally, the name of the selected feature is shown in the feature information box 114. In an embodiment, the name of the selected feature is shown in the feature information box 114 when the user hovers the cursor over the thumbnail associated with the feature in the list of features. In an embodiment, the selected feature may be any other type of object, and may be outlined or otherwise highlighted when selected.

In various embodiments, the user of the map system may select either the list of features, or the feature histogram, of the selection window to view information about the selected features.
Figures 3J-3K illustrate sample user interfaces of the map system in which objects are outlined when hovered over, according to embodiments of the present disclosure. In Figure 3J, the user is hovering over a building feature with the mouse cursor. The feature being hovered over is automatically outlined (340). Additionally, the name of the feature is displayed in the feature information box 114. In Figure 3K, the user is hovering over a shelter feature with the mouse cursor. The feature being hovered over is automatically outlined (342), and the name of the feature is displayed in the feature information box 114. The user of the map system may, at any time, highlight and/or outline any feature/object by rolling over, hovering over, selecting, and/or touching that feature/object in the map interface.
In various embodiments, the user may select a feature in order to view a feature information window. The feature information window may include, for example, metadata associated with the selected feature. For example, the user may select a building feature, resulting in a display of information associated with that building feature such as the building size, the building name, and/or the building address or location, among others. Metadata associated with features/objects may include any information relevant to that feature/object. For example, metadata associated with a school may include an address (for example, 123 S. Orange Street), a district (for example, 509c), a grade level (for example, K-6), and/or a phone number (for example, 800-0000), among other items of metadata. In an embodiment, a history of the object, changes made to the object, and/or user notes related to the object, among other items, may be displayed. In an embodiment, a user may edit metadata associated with a selected feature.
Figures 4A-4D illustrate sample user interfaces of the map system in which a radius geosearch is displayed, according to embodiments of the present disclosure. In Figure 4A, the user has selected the shape button 104 and is drawing a circle selection 404 on the map interface by first selecting a center and then a radius. Shape window 402 indicates the coordinates of the center of the circle selection, as well as the radius of the circle selection. In various embodiments, any type of polygon or other shape may be drawn on the map interface to select features.
In Figure 4B, the user has selected the geosearch button 108 so as to perform a geosearch within the selection circle 408. In an embodiment, a geosearch comprises a search through one or more databases of data objects, and metadata associated with those data objects, for any objects that meet the criteria of the geosearch. For example, a geosearch may search for any objects with geographic metadata and/or properties that indicate the object may be geographically within, for example, selection circle 408. A geosearch within a selected circle may be referred to as a radius search. Geosearch window 406 indicates various items of information related to the radius search, and includes various parameters that may be adjusted by the user. For example, the geosearch window 406 includes a search area slider that the user may slide to increase or decrease the radius of the selection circle 408. The user may also indicate a time range for the geosearch. In an embodiment, objects/features shown and/or searchable in the map system may include a time component and/or time metadata. Thus, for example, the user of the map system may specify a date or time period, resulting in the display of any objects/features with associated time metadata, for example, falling within the specified time period. In various embodiments, associated time metadata may indicate, for example, a time the feature was created, a time the feature was added to a database of features, a time the feature was previously added to a vector layer, a time the feature was last accessed by the map system and/or a user, a time the feature was built, and/or any combination of the foregoing. Alternatively, the user may select and/or search for objects/features within particular time periods, as shown in Figure 4B. The geosearch window 406 also allows the user to specify the types of objects to be searched, for example, entities, events, and/or documents, among others.
In an embodiment, the user of the map system may perform a search by clicking and/or touching a search button. The map system may then perform a search of an object database for any objects matching the criteria specified in the geosearch. For example, in the example of Figure 4B the map system will search for any objects with associated location information that falls within the selection circle 408. Objects searched by the map system may include objects other than those shown on the map interface. For example, in an embodiment the map system may access one or more databases of objects (and object metadata) that may be unrelated to the features currently shown in the map interface, or features related to the currently selected vector layers. The databases accessed may include databases external to any database storing data associated with the map system. Any objects found in the geosearch may then be made available to the user (as shown in Figure 4B), and the user may be given the option of adding the objects to a new layer in the map interface (as shown in the geosearch information window 406).
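As a concrete illustration of the radius search described above, the following sketch filters a collection of objects by great-circle distance from the selected center. The haversine formula is a standard choice for this; the field names and the `locate` accessor are illustrative assumptions rather than the system's actual query machinery:

```typescript
// Illustrative sketch of a radius geosearch: keep objects whose location lies
// within `radiusMeters` of the selected center, using the haversine formula.
interface GeoPoint {
  lat: number; // degrees
  lon: number; // degrees
}

const EARTH_RADIUS_M = 6_371_000;

function haversineMeters(a: GeoPoint, b: GeoPoint): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// `locate` extracts an object's location from its metadata (assumed shape);
// objects without geographic metadata simply do not match.
function radiusSearch<T>(
  objects: T[],
  locate: (obj: T) => GeoPoint | undefined,
  center: GeoPoint,
  radiusMeters: number,
): T[] {
  return objects.filter((obj) => {
    const p = locate(obj);
    return p !== undefined && haversineMeters(center, p) <= radiusMeters;
  });
}
```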
Figure 4C shows objects added to the map interface following the geosearch in Figure 4B. The search results are also shown in the feature histogram 410. In this example the returned objects include various entities and events. Figure 4D shows the user has selected, in the feature histogram, all search result objects with related metadata indicating a drug law violation. Those selected objects are additionally highlighted in the map interface of Figure 4D. In another example, geosearch may be used to determine, for example, that many crimes are concentrated in a downtown area of a city, while DUIs are more common in areas with slow roads.
Figures 5A-5D illustrate sample user interfaces of the map system in which a heatmap is displayed, according to embodiments of the present disclosure. In Figure 5A, the user has selected the heatmap button 110 so as to create a heatmap 504 based on the objects selected in Figure 4D. A heatmap information window 502 is displayed in which the user may specify various parameters related to the generation of the heatmap. For example, referring now to Figure 5B, the user may adjust a radius (506) of the circular heatmap related to each selected object, an opacity (508) of the heatmap, a scale of the heatmap, and an auto scale setting. In Figure 5B, the user has decreased the opacity of the generated heatmap and zoomed in on the map interface so as to more clearly view various objects and the underlying map tiles.
Figure 5C shows the user selecting various objects and/or features, using the rectangle selection tool, while the heatmap is displayed, such as to view information regarding the features in a histogram. Figure 5D shows the selected objects, selected in Figure 5C, now highlighted (512).
In the map system, a heatmap may be generated on any object type, and/or on multiple object types. In an embodiment, different heatmap radii may be set for different object types. For example, the user may generate a heatmap in which streetlights have a 20 m radius, while hospitals have a 500 m radius. In an embodiment, the heatmap may be generated based on arbitrary shapes. For example, rather than a circular-based heatmap, the heatmap may be rectangular-based or ellipse-based. In an embodiment, the heatmap may be generated based on error ellipses and/or tolerance ellipses. A heatmap based on error ellipses may be advantageous when the relevant objects have associated error regions. For example, when a location of an object is uncertain, or multiple datapoints associated with an object are available, an error ellipse may help the user determine the actual location of the object.
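One simple way such a circular heatmap could be rendered on the client is sketched below, assuming an HTML canvas overlay and per-object-type radii echoing the 20 m / 500 m example; the rendering approach and names are illustrative, as the disclosure does not specify how the heatmap is composited:

```typescript
// Illustrative sketch: render a circular heatmap on a canvas overlay by
// drawing one radial gradient per object, with a radius per object type.
interface HeatPoint {
  x: number;          // pixel position in the map viewport
  y: number;
  objectType: string; // e.g., "streetlight", "hospital"
}

// Per-type radii in pixels (assumed already converted from meters at the
// current zoom level), echoing the 20 m / 500 m example above.
const radiusByType: Record<string, number> = { streetlight: 20, hospital: 500 };

function drawHeatmap(
  ctx: CanvasRenderingContext2D,
  points: HeatPoint[],
  opacity: number, // the user-adjustable opacity setting (0..1)
): void {
  ctx.globalAlpha = opacity;
  for (const p of points) {
    const r = radiusByType[p.objectType] ?? 50; // fallback radius (assumption)
    const gradient = ctx.createRadialGradient(p.x, p.y, 0, p.x, p.y, r);
    gradient.addColorStop(0, "rgba(255, 0, 0, 0.8)"); // hot at the center
    gradient.addColorStop(1, "rgba(255, 0, 0, 0)");   // fades to transparent
    ctx.fillStyle = gradient;
    ctx.fillRect(p.x - r, p.y - r, 2 * r, 2 * r);
  }
  ctx.globalAlpha = 1;
}
```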
Figures 5E-5F illustrate sample user interfaces of the map system in which a shape-based geosearch is displayed, according to embodiments of the present disclosure. In Figure 5E, the user has selected the shape button 104, and a shape information window 514 is shown. In the user interface of Figure 5E the user has drawn lines 518; however, any shapes may be drawn on the map interface. Information related to the drawn lines 518 is displayed in the shape information window 514. For example, at 516 the starting points, distance, and azimuth related to each line are displayed. Further, a total distance from the start to the end of the line is shown.
Figure 5F shows a geosearch performed on the line shape drawn in Figure 5E. Geosearch information window 520 indicates a search area 522, a time range 524, and an object type 526, as described above with reference to Figure 4B. The search area is indicated on the map interface by the highlighted area 528 along the drawn line. The geosearch may be performed, and results may be shown, in a manner similar to that described above with reference to Figures 4B-4D. For example, geosearch along a path may be used to determine points of interest along that path.
Figure 5G illustrates a sample user interface of the map system in which a keyword object search is displayed, according to an embodiment of the present disclosure. The user may type words, keywords, numbers, and/or geographic coordinates, among others, into the search box 112. In Figure 5G, the user has typed Bank (530). As the user types, the map system automatically searches for objects and/or features that match the information typed. Matching may be performed based on object data and/or metadata. Search results are displayed as shown at 532 in Figure 5G. In the example, a list of banks (bank features) is shown. The user may then select from the list shown, at which point the map system automatically zooms to the selected feature and indicates the selected feature with an arrow 534. In various embodiments, the selected feature may be indicated by highlighting, outlining, and/or any other type of indicator. In an embodiment, the search box 112 may be linked to a gazetteer so as to enable simple word searches for particular geographic locations. For example, a search for a city name, New York, may be linked with the geographic coordinates of the city, taking the user directly to that location on the map interface.
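The incremental matching just described might look like the following sketch, which checks object names and metadata values and falls back to a gazetteer lookup; the data shapes and the toy gazetteer entries are assumptions for illustration:

```typescript
// Illustrative sketch of the as-you-type keyword search: match the query
// against object names and metadata values, then consult a gazetteer.
interface SearchResult {
  kind: "feature" | "place";
  label: string;
  lat: number;
  lon: number;
}

// A gazetteer mapping place names to coordinates (toy example data).
const gazetteer = new Map<string, { lat: number; lon: number }>([
  ["new york", { lat: 40.7128, lon: -74.006 }],
]);

function keywordSearch(
  query: string,
  objects: { name: string; lat: number; lon: number; metadata: string[] }[],
): SearchResult[] {
  const q = query.trim().toLowerCase();
  if (q.length === 0) return [];

  const results: SearchResult[] = objects
    .filter(
      (o) =>
        o.name.toLowerCase().includes(q) ||
        o.metadata.some((m) => m.toLowerCase().includes(q)),
    )
    .map((o) => ({ kind: "feature" as const, label: o.name, lat: o.lat, lon: o.lon }));

  const place = gazetteer.get(q);
  if (place) {
    results.push({ kind: "place", label: query, ...place });
  }
  return results;
}
```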
Figure 5H illustrates an example of a UTF grid of the map system, according to an embodiment of the present disclosure. In an embodiment, the UTF grid enables feature outlining and/or highlighting of many objects with client-side components. In one embodiment, each map tile (or image) of the map interface includes an associated textual UTF (UCS Transformation Format) grid. In Figure 5H, an example map tile 526 is shown next to an associated example UTF grid 538. In this example, the map tile and associated UTF grid are generated by the server-side components and sent to the client-side components. In the UTF grid, each character represents a pixel in the map tile image, and each character indicates what feature is associated with the pixel. Each character in the UTF grid may additionally be associated with a feature identifier which may be used to request metadata associated with that feature.
Contiguous regions of characters in the UTF grid indicate the bounds of a particular feature, and may be used by the client-side components to provide the feature highlighting and/or outlining. For example, when a user hovers a mouse pointer over a feature on a map tile, the map system determines the character and portion of the UTF grid associated with the pixel hovered over, draws a feature outline based on the UTF grid, and may additionally access metadata associated with the feature based on the feature identifier associated with the feature. In an embodiment, the UTF grid is sent to the client-side components in a JSON (JavaScript Object Notation) format.
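A client-side lookup against such a grid might resemble the following sketch. It assumes the widely used UTFGrid JSON layout (a `grid` array of row strings, a `keys` array, and a `data` map, at a 4x4-pixel resolution); the disclosure does not specify these details, so treat them as assumptions:

```typescript
// Illustrative sketch of a client-side UTF grid lookup, assuming the common
// UTFGrid JSON layout: grid rows of characters, a keys array, and a data map.
interface UtfGrid {
  grid: string[];                // one string per grid row
  keys: string[];                // decoded character index -> key
  data: Record<string, unknown>; // key -> feature identifier/metadata
}

const GRID_RESOLUTION = 4; // assumed: one grid cell per 4x4 block of pixels

// Reverse the UTFGrid character encoding (which skips '"' and '\\').
function decodeChar(ch: string): number {
  let code = ch.charCodeAt(0);
  if (code >= 93) code -= 1;
  if (code >= 35) code -= 1;
  return code - 32;
}

// Given a pixel position within the tile, return the feature data (if any)
// under that pixel -- e.g., the feature identifier used to fetch metadata.
function featureAtPixel(grid: UtfGrid, px: number, py: number): unknown {
  const row = grid.grid[Math.floor(py / GRID_RESOLUTION)];
  if (!row) return undefined;
  const ch = row[Math.floor(px / GRID_RESOLUTION)];
  if (!ch) return undefined;
  const key = grid.keys[decodeChar(ch)];
  return key ? grid.data[key] : undefined; // an empty key means no feature
}
```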
Figure 6A shows a flow diagram depicting illustrative client-side operations of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 6A. In an embodiment, one or more blocks in Figure 6A may be performed by client-side components of the map system, for example, computer system 800 (described below in reference to Figure 8D).
At block 602, the map system provides a user interface (for example, the user interface of Figure 1) to the user. As described above and below, the user interface may be provided to the user through any electronic device, such as a desktop computer, a laptop computer, a mobile smartphone, and/or a tablet, among others. At block 604, an input is received from the user of the map system. For example, the user may use a mouse to roll over and/or click on an item of the user interface, or the user may touch the display of the interface (in the example of a touch screen device).
Inputs received from the user may include, for example, hovering over, rolling over, and/or touching an object in the user interface (606); filling out a text field (614); drawing a shape in the user interface (608); and/or drawing a selection box and/or shape in the user interface (610); among other actions or inputs as described above.

At block 612, any of inputs 606, 614, 608, and 610 may cause the map system to perform client-side actions to update the user interface. For example, hovering over an object (606) may cause the client-side components of the map system to access the UTF grid, determine the boundaries of the object, and draw an outline around the hovered-over object. In another example, filling out a text field (614) may include the user inputting data into the map system. In this example, the user may input geographic coordinates, metadata, and/or other types of data to the map system. These actions may result in, for example, the client-side components of the map system storing the inputted data and/or taking an action based on the inputted data. For example, the user inputting coordinates may result in the map interface being updated to display the inputted information, such as an inputted name overlaying a particular object. In yet another example, the actions/inputs of drawing a shape (608) and/or drawing a selection (610) may cause the client-side components of the map system to update the user interface with colored and/or highlighted shapes (see, for example, Figure 3A).
In an embodiment, one or more blocks in Figure 6A may be performed by server-side components of the map system, for example, server 830 (described below in reference to Figure 8D).
Figure 6B shows a flow diagram depicting illustrative client-side metadata retrieval of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 6B. In an embodiment, one or more blocks in Figure 6B may be performed by client-side components of the map system, for example, computer system 800.
At block 620, the client-side components of the map system detect that the user is hovering over and/or touching an object in the user interface. At block 622, and as described above, the client-side components may access the UTF grid to determine the feature identifier and object boundaries associated with the hovered-over object. Then, at block 624, the client-side components may render the feature shape on the image or map interface. The feature shape may be rendered as an outline and/or other highlighting.
At block 636, the client-side components detect whether the user has selected the object. Objects may be selected, for example, if the user clicks on the object and/or touches the object. If the user has selected the object, then at block 628, the client-side components query the server-side components to retrieve metadata associated with the selected object. In an embodiment, querying of the server-side components may include transmitting the feature identifier associated with the selected object to the server, the server retrieving from a database the relevant metadata, and the server transmitting the retrieved metadata back to the client-side components.
At block 630, the metadata is received by the client-side components and displayed to the user. For example, the metadata associated with the selected object may be displayed to the user in the user interface in a dedicated metadata window, among other possibilities.
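The query-and-display round trip of blocks 628-630 might be sketched as follows; the endpoint path, response shape, and element id are hypothetical, introduced only for illustration:

```typescript
// Illustrative sketch of blocks 628-630: send the feature identifier decoded
// from the UTF grid to the server, then display the metadata it returns.
// The endpoint `/api/features/:id/metadata` is a hypothetical path.
async function fetchAndShowMetadata(featureId: string): Promise<void> {
  const response = await fetch(
    `/api/features/${encodeURIComponent(featureId)}/metadata`,
  );
  if (!response.ok) {
    throw new Error(`metadata request failed: ${response.status}`);
  }
  const metadata: Record<string, string> = await response.json();

  // Display in a dedicated metadata window (assumed to exist in the DOM).
  const windowEl = document.getElementById("metadata-window");
  if (windowEl) {
    windowEl.textContent = Object.entries(metadata)
      .map(([key, value]) => `${key}: ${value}`)
      .join("\n");
  }
}
```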
In an embodiment, one or more blocks in Figure 6B may be performed by server-side components of the map system, for example, server 830.
Figure 7A shows a flow diagram depicting illustrative server-side operations of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 7A. In an embodiment, one or more blocks in Figure 7A may be performed by server-side components of the map system, for example, server 830.
Server-side operations of the map system may include composing and updating the map tiles that make up the map interface. For example, when the user changes the selection of the base layer and/or one or more of the vector layers, the map tiles are re-composed and updated in the map interface to reflect the user's selection. Selection of objects resulting in highlighting of those objects may also involve re-composition of the map tiles. Further, UTF grids may be generated by the server-side components for each map tile composed.

At block 702, the user interface is provided to the user. At block 704, an input from the user is received. Inputs received from the user that may result in server-side operations may include, for example, an object selection (706), a change in layer selection (708), a geosearch (710), generating a heatmap (712), searching from the search box (714), and/or panning or zooming the map interface, among others.
At block 716, the client-side components of the map system may query the server-side components in response to any of inputs 706, 708, 710, 712, and 714 from the user. The server-side components then update and re-compose the map tiles and UTF grids of the map interface in accordance with the user input (as described below in reference to Figure 7B), and transmit those updated map tiles and UTF grids back to the client-side components.
At block 718, the client-side components receive the updated map tile information from the server, and at block 720 the user interface is updated with the received information.
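A minimal sketch of this round trip is shown below; the /update endpoint, the payload shape, and the MapUI class are assumptions introduced for illustration, not a disclosed API.

    import json
    import urllib.request


    class MapUI:
        """Stub client view holding the current tiles and UTF grids."""

        def __init__(self):
            self.tiles, self.utf_grids = {}, {}

        def refresh(self) -> None:
            print(f"redrawing {len(self.tiles)} tiles")     # block 720


    def handle_input(ui: MapUI, server: str, user_input: dict) -> None:
        """Blocks 704-720: forward an input (selection, layer change, geosearch,
        heatmap, or text search) to the server and apply the re-composed result."""
        request = urllib.request.Request(
            f"{server}/update",                             # block 716: query server
            data=json.dumps(user_input).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as resp:
            update = json.load(resp)                        # updated tiles + UTF grids
        ui.tiles.update(update["tiles"])                    # block 718
        ui.utf_grids.update(update["utf_grids"])
        ui.refresh()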
In an embodiment, additional information and/or data, in addition to updated map tiles, may be transmitted to the client-side components from the server-side components. For example, object metadata may be transmitted in response to a user selecting an object.
In an embodiment, one or more blocks in Figure 7A may be performed by client-side components of the map system, for example, computer system 800.
Figure 7B shows a flow diagram depicting illustrative server-side layer composition of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 7B. In an embodiment, one or more blocks in Figure 7B may be performed by server-side components of the map system, for example, server 830.
At block 730, a query is received by the server-side components from the client-side components. Such a query may originate, for example, at block 716 of Figure 7A. At block 732, the server-side components determine the map tile composition based on the query. For example, if the user has selected an object or group of objects, the map tiles containing those objects may be updated to include highlighted objects. In another example, if the user has changed the layer selection, the map tiles may be updated to include only those layers that are currently selected. In the example of Figure 7B, the layers currently selected are determined, and the layers are composed and/or rendered into the map tiles. In another example, if the user has performed a geosearch and selected to add the search result objects to the map interface, the map tiles are updated to include those search result objects. In yet another example, when the user has generated a heatmap, the map tiles are updated to show the generated heatmap. In another example, if the user searches via the search box, the selected objects may be highlighted in the re-composed map tiles. In another example, when the user pans and/or zooms in the map interface, the map tiles are updated to reflect the new view selected by the user. In all cases, an updated UTF grid may also be generated for each composed map tile.
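The decision of block 732 may be sketched, for illustration, as a simple dispatch over the query type; the query fields and the plan structure shown here are invented for the example.

    def determine_tile_composition(query: dict, selected_layers: list) -> dict:
        """Block 732: build a composition plan for the tiles affected by a query."""
        plan = {"layers": list(selected_layers), "highlights": [], "overlays": []}
        kind = query["kind"]
        if kind == "object_selection":          # highlight the selected objects
            plan["highlights"] = query["feature_ids"]
        elif kind == "layer_change":            # include only currently selected layers
            plan["layers"] = query["layers"]
        elif kind == "geosearch":               # add search-result objects to the tiles
            plan["overlays"].append(("search_results", query["result_ids"]))
        elif kind == "heatmap":                 # show the generated heatmap
            plan["overlays"].append(("heatmap", query["heatmap_id"]))
        elif kind == "search_box":              # highlight objects matching the search
            plan["highlights"] = query["matched_ids"]
        elif kind == "pan_zoom":                # re-render tiles for the new viewport
            plan["viewport"] = query["viewport"]
        return plan                             # an updated UTF grid accompanies each tile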
At block 734, the map system determines whether the layers necessary to compose the requested map tiles are cached. For example, when a layer is selected by the user, that layer may be composed by the map system and placed in a memory of the server-side components for future retrieval. Caching of composed layers may obviate the need for recomposing those layers later, which advantageously may save time and/or processing power.
If the required layers are cached, then at block 740 the layers are composed into the requested map tiles and, at block 742, transmitted to the client-side components.
When the required layers are not cached, at block 736, the server-side components calculate and/or compose the requested layer and/or layers, and may then, at block 738, optionally cache the newly composed layers for future retrieval. Then, at blocks 740 and 742, the layers are composed into map tiles and provided to the client-side components.

In an embodiment, entire map tiles may be cached by the server-side components. In an embodiment, the size and/or quality of the map tiles that make up the map interface may be selected and/or dynamically selected based on at least one of: the bandwidth available for transmitting the map tiles to the client-side components, the size of the map interface, and/or the complexity of the layer composition, among other factors. In an embodiment, the map tiles comprise images, for example, in one or more of the following formats: PNG, GIF, JPEG, TIFF, BMP, and/or any other type of appropriate image format.
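For illustration, the cache-aware composition of blocks 734 through 742 may be sketched as follows, using Python's functools.lru_cache as a stand-in for the memory of the server-side components; the function names and tile encoding are assumptions.

    from functools import lru_cache


    @lru_cache(maxsize=4096)                    # blocks 734/738: the layer cache
    def compose_layer(layer: str, x: int, y: int, zoom: int) -> bytes:
        """Block 736: render one layer for one tile (stubbed as a byte string)."""
        return f"<{layer} layer for tile {x}/{y} at z{zoom}>".encode()


    def compose_tile(layers: tuple, x: int, y: int, zoom: int) -> bytes:
        """Blocks 734-742: stack cached or freshly composed layers into a map tile.
        A cached layer is returned immediately (block 734 -> 740); otherwise it is
        composed and cached for future retrieval (blocks 736-738)."""
        return b"|".join(compose_layer(layer, x, y, zoom) for layer in layers)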
In an embodiment, the layer and object data composed into layers and map tiles comprises vector data. The vector data (for example, object data) may include associated metadata, as described above. In an embodiment, the vector, layer, and/or object data and associated metadata may originate from one or more databases and/or electronic data stores.
In an embodiment, one or more blocks in Figure 7B may be performed by client-side components of the map system, for example, computer system 800.
In an embodiment, the map system may display more than 50 million selectable features to a user simultaneously. In an embodiment, the map system may support tens or hundreds of concurrent users accessing the same map and object data. In an embodiment, map and object data used by the map system may be mirrored and/or spread across multiple computers, servers, and/or server-side components.
In an embodiment, rather than updating the map tiles to reflect a selection by the user of one or more objects, the map system may show an approximation of the selection to the user based on client-side processing.
In an embodiment, a user may drag and drop files, for example, vector data and/or vector layers, onto the user interface of the map system, causing the map system to automatically render the file in the map interface.
In an embodiment, icons and/or styles associated with various objects in the map interface may be updated and/or changed by the user. For example, the styles of the various objects may be specified in or by a style data file. The style data file may be formatted according to a particular format or standard readable by the map system. In an embodiment, the style data file is formatted according to the JSON format standard. The user may thus change the look of the objects and shapes rendered in the map interface of the map system by changing the style data file. The style data file may further define the looks of objects and terrain (among other items and data) at various zoom levels.
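By way of example only, such a style data file might resemble the following; the key names (stroke, fill, widthByZoom, and so on) are invented for illustration, as the disclosure specifies only that the file may follow the JSON format standard.

    import json

    STYLE_JSON = """
    {
      "road":     {"stroke": "#c8a000", "label": true,
                   "widthByZoom": {"10": 1, "14": 3}},
      "building": {"fill": "#b0b0b0", "render": "faux-3d"}
    }
    """

    styles = json.loads(STYLE_JSON)
    print(styles["road"]["widthByZoom"]["14"])   # -> 3 (wider roads when zoomed in)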
In an embodiment, objects, notes, metadata, and/or other types of data may be added to the map system by the user through the user interface. In an embodiment, user-added information may be shared between multiple users of the map system. In an embodiment, a user of the map system may add annotations and shapes to the map interface that may be saved and shared with other users. In an embodiment, a user of the map system may share a selection of objects with one or more other users.
In an embodiment, the user interface of the map system may include a timeline window. The timeline window may enable the user to view objects and layers specific to particular moments in time and/or time periods. In an embodiment, the user may view tolerance ellipses overlaid on the map interface indicating the likely position of an object across a particular time period.
In an embodiment, the map system may include elevation profiling. Elevation profiling may allow a user of the system to determine the elevation along a path on the map interface, to perform a viewshed analysis (determining the objects and/or terrain viewable from a particular location), and to perform a reverse-viewshed analysis (determining, for a particular location, the objects and/or terrain from which that location may be viewed), among other analyses.
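For illustration, elevation profiling along a path may be sketched as sampling an elevation surface at evenly spaced points between two coordinates; the elevation() function below is a stub standing in for whatever terrain data the map system uses, and is not part of the disclosure.

    import math


    def elevation(x: float, y: float) -> float:
        """Stubbed terrain surface; replace with real elevation data."""
        return 100.0 + 20.0 * math.sin(x / 50.0) * math.cos(y / 50.0)


    def elevation_profile(start, end, samples=100):
        """Elevations at evenly spaced points along the path from start to end."""
        (x0, y0), (x1, y1) = start, end
        return [
            elevation(x0 + (x1 - x0) * t / samples, y0 + (y1 - y0) * t / samples)
            for t in range(samples + 1)
        ]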
In an embodiment, vector data, object data, metadata, and/or other types of data may be prepared before they are entered into or accessed by the map system. For example, the data may be converted from one format to another, may be crawled for common items of metadata, and/or may be prepared for application of a style file or style information, among other actions. In an embodiment, a layer ontology may be automatically generated based on a group of data. In an embodiment, the map system may access common data sources available on the Internet, for example, road data available from openstreetmap.org.
In an embodiment, roads shown in the map interface are labeled with their names, and buildings are rendered in faux-3D to indicate the building heights. In an embodiment, Blue Force Tracking may be integrated into the map system as a layer with the characteristics of both a static vector layer and a dynamic selection layer. A Blue Force layer may enable the use of the map system for live operational analysis. In an embodiment, the map system may quickly render detailed choropleths or heatmaps with minimal data transfer. For example, the system may render a choropleth with a property value on the individual shapes of the properties themselves, rather than aggregating this information at a county or zip code level.
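A per-shape choropleth of this kind may be sketched, for illustration, as a mapping from each parcel's own property value to a color, with no county- or zip-code-level aggregation; the color ramp and data shape are assumptions.

    def choropleth_color(value: float, lo: float, hi: float) -> str:
        """Map a property value onto a light-to-dark red ramp."""
        t = 0.0 if hi == lo else (value - lo) / (hi - lo)
        shade = int(255 - 200 * t)               # darker = higher value
        return f"#ff{shade:02x}{shade:02x}"


    def style_parcels(parcels: dict) -> dict:
        """parcels maps a feature id to its own property value."""
        lo, hi = min(parcels.values()), max(parcels.values())
        return {fid: choropleth_color(v, lo, hi) for fid, v in parcels.items()}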
Advantageously, the map system displays many items of data, objects, features, and/or layers in a single map interface. A user may easily interact with features on the map and gather information by hovering over or selecting them, even though those features may not be labeled. The user may select features, may “drill down” on a particular type of feature (for example, roads), may view features through histograms, may use histograms to determine common characteristics (for example, determine the most common speed limit), and/or may determine correlations among features (for example, see that slower speed limit areas are centered around schools). Further, the map system may be useful in many different situations. For example, the system may be useful to operational planners and/or disaster relief personnel.
Additionally, the map system accomplishes at least three core ideas: providing a robust and fast back-end (server-side) renderer, keeping data on the back-end, and transferring only the data necessary for interactivity. In one embodiment, the primary function of the server-side components is rendering map tiles. The server is capable of drawing very detailed maps with a variety of styles that can be based on vector metadata. Rendered map tiles for a vector layer are cached, and several of these layer tiles are drawn on top of one another to produce the final tile that is sent to the client-side browser. Map tile rendering is fast enough for displaying dynamic tiles for selection and highlight to the user. Server-side operations allow for dynamic selection of very large numbers of features, calculation of histograms, determining the number of items shown and/or selected, and drawing the selection, for example. Further, the heatmap may include large numbers of points without incurring the cost of transferring those points to the client-side browser. Additionally, transferring only as much data as necessary for interactivity enables quick server rendering of dynamic selections and vector layers. On the other hand, highlighting hovered-over features may be performed client-side nearly instantaneously, providing useful feedback that enhances the interactivity of the map system. In an embodiment, to avoid transferring too much geometric data, the geometries of objects (in the map tiles and UTF grid) are down-sampled depending on how far the user is zoomed in to the map interface. Thus, map tiles may be rendered and presented to a user of the map system in a dynamic and useable manner.
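For illustration, zoom-dependent down-sampling may be sketched as follows; the tolerance formula and the simple distance-based decimation are assumptions chosen for brevity rather than the disclosed algorithm.

    def simplification_tolerance(zoom: int, base: float = 0.01) -> float:
        """Halve the tolerance with each zoom level (finer detail zoomed in)."""
        return base / (2 ** zoom)


    def downsample(coords: list, zoom: int) -> list:
        """Drop consecutive points closer together than the zoom's tolerance."""
        tol = simplification_tolerance(zoom)
        kept = [coords[0]]
        for x, y in coords[1:]:
            px, py = kept[-1]
            if abs(x - px) + abs(y - py) >= tol:
                kept.append((x, y))
        if kept[-1] != coords[-1]:
            kept.append(coords[-1])              # always keep the endpoint
        return kept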
Object Centric Data Model

To provide a framework for the following discussion of specific systems and methods described above and below, an example database system 1210 using an ontology 1205 will now be described. This description is provided for the purpose of providing an example and is not intended to limit the techniques to the example data model, the example database system, or the example database system’s use of an ontology to represent information.
In one embodiment, a body of data is conceptually structured according to an object-centric data model represented by ontology 1205. The conceptual data model is independent of any particular database used for durably storing one or more database(s) 1209 based on the ontology 1205. For example, each object of the conceptual data model may correspond to one or more rows in a relational database, an entry in a Lightweight Directory Access Protocol (LDAP) database, or any combination of one or more databases.
Figure 8A illustrates an object-centric conceptual data model according to an embodiment. An ontology 1205, as noted above, may include stored information providing a data model for storage of data in the database 1209. The ontology 1205 may be defined by one or more object types, which may each be associated with one or more property types. At the highest level of abstraction, data object 1201 is a container for information representing things in the world. For example, data object 1201 can represent an entity such as a person, a place, an organization, a market instrument, or other noun. Data object 1201 can represent an event that happens at a point in time or for a duration. Data object 1201 can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object 1201 is associated with a unique identifier that uniquely identifies the data object within the database system.
Different types of data objects may have different property types. For example, a “Person” data object might have an “Eye Color” property type and an “Event” data object might have a “Date” property type. Each property 1203 as represented by data in the database system 1210 may have a property type defined by the ontology 1205 used by the database 1209.
Objects may be instantiated in the database 1209 in accordance with the corresponding object definition for the particular object in the ontology 1205. For example, a specific monetary payment (e.g., an object of type “event”) of US$30.00 (e.g., a property of type “currency”) taking place on 3/27/2009 (e.g., a property of type “date”) may be stored in the database 1209 as an event object with associated currency and date properties as defined within the ontology 1205.
The data objects defined in the ontology 1205 may support property multiplicity. In particular, a data object 1201 may be allowed to have more than one property 1203 of the same property type. For example, a “Person” data object might have multiple “Address” properties or multiple “Name” properties.
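For illustration, the object-centric model of Figure 8A, including property multiplicity, may be sketched with Python dataclasses; the class names are illustrative stand-ins, not the disclosed implementation.

    import uuid
    from dataclasses import dataclass, field


    @dataclass
    class Property:                              # property 1203
        property_type: str                       # defined by the ontology 1205
        value: object


    @dataclass
    class DataObject:                            # data object 1201
        object_type: str                         # e.g. "Person", "Event"
        properties: list = field(default_factory=list)
        # unique identifier within the database system
        object_id: str = field(default_factory=lambda: str(uuid.uuid4()))

        def add_property(self, ptype: str, value: object) -> None:
            # Multiplicity: more than one property of the same type is allowed.
            self.properties.append(Property(ptype, value))


    person = DataObject("Person")
    person.add_property("Name", "A. Smith")
    person.add_property("Name", "Alexandra Smith")   # a second "Name" property
    person.add_property("Address", "12 Elm St.")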
Each link 1202 represents a connection between two data objects 1201. In one embodiment, the connection is either through a relationship, an event, or through matching properties. A relationship connection may be asymmetrical or symmetrical. For example, “Person” data object A may be connected to “Person” data object B by a “Child Of” relationship (where “Person” data object B has an asymmetric “Parent Of” relationship to “Person” data object A), a “Kin Of” symmetric relationship to “Person” data object C, and an asymmetric “Member Of” relationship to “Organization” data object X. The type of relationship between two data objects may vary depending on the types of the data objects. For example, “Person” data object A may have an “Appears In” relationship with “Document” data object Y or have a “Participate In” relationship with “Event” data object E. As an example of an event connection, two “Person” data objects may be connected by an “Airline Flight” data object representing a particular airline flight if they traveled together on that flight, or by a “Meeting” data object representing a particular meeting if they both attended that meeting. In one embodiment, when two data objects are connected by an event, they are also connected by relationships, in which each data object has a specific relationship to the event, such as, for example, an “Appears In” relationship.
As an example of a matching properties connection, two “Person” data objects representing a brother and a sister may both have an “Address” property that indicates where they live. If the brother and the sister live in the same home, then their “Address” properties likely contain similar, if not identical, property values. In one embodiment, a link between two data objects may be established based on similar or matching properties (e.g., property types and/or property values) of the data objects. These are just some examples of the types of connections that may be represented by a link; other types of connections may be represented, and embodiments are not limited to any particular types of connections between data objects. For example, a document might contain references to two different objects, such as a payment (one object) and a person (a second object). A link between these two objects may represent a connection between these two entities through their co-occurrence within the same document.
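A link of this kind may be sketched, for illustration, as follows; the Link class and the co-occurrence helper are hypothetical names introduced for the example.

    from dataclasses import dataclass


    @dataclass
    class Link:                                  # link 1202
        link_type: str                           # e.g. "Spouse Of", "Appears In"
        source_id: str
        target_id: str
        symmetric: bool = False


    def co_occurrence_links(document_id: str, referenced_ids: list) -> list:
        """Link every object referenced in a document to that document."""
        return [Link("Appears In", oid, document_id) for oid in referenced_ids]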
Each data object 1201 can have multiple links with another data object 1201 to form a link set 1204. For example, two “Person” data objects representing a husband and a wife could be linked through a “Spouse Of” relationship, a matching “Address” property, and one or more matching “Event” properties (e.g., a wedding). Each link 1202 as represented by data in a database may have a link type defined by the database ontology used by the database.

Figure 8B is a block diagram illustrating exemplary components and data that may be used in identifying and storing data according to an ontology.
In this example, the ontology may be configured, and data in the data model populated, by a system of parsers and ontology configuration tools. In the embodiment of Figure 8B, input data 1300 is provided to parser 1302. The input data may comprise data from one or more sources. For example, an institution may have one or more databases with information on credit card transactions, rental cars, and people. The databases may contain a variety of related information and attributes about each type of data, such as a “date” for a credit card transaction, an address for a person, and a date for when a rental car is rented. The parser 1302 is able to read a variety of source input data types and determine which type of data it is reading.
In accordance with the discussion above, the example ontology 1205 comprises stored information providing the data model of data stored in database 1209, and the ontology is defined by one or more object types 1310, one or more property types 1316, and one or more link types 1330. Based on information determined by the parser 1302 or other mapping of source input information to object type, one or more data objects 1201 may be instantiated in the database 1209 based on respective determined object types 1310, and each of the objects 1201 has one or more properties 1203 that are instantiated based on property types 1316.
Two data objects 1201 may be connected by one or more links 1202 that may be instantiated based on link types 1330. The property types 1316 each may comprise one or more data types 1318, such as a string, number, etc. Property types 1316 may be instantiated based on a base property type 1320. For example, a base property type 1320 may be “Locations” and a property type 1316 may be “Home.”

In an embodiment, a user of the system uses an object type editor 1324 to create and/or modify the object types 1310 and define attributes of the object types. In an embodiment, a user of the system uses a property type editor 1326 to create and/or modify the property types 1316 and define attributes of the property types. In an embodiment, a user of the system uses a link type editor 1328 to create the link types 1330. Alternatively, other programs, processes, or programmatic controls may be used to create link types and property types and define attributes, and using editors is not required.
In an embodiment, creating a property type 1316 using the property type editor 1326 involves defining at least one parser definition using a parser editor 1322. A parser definition comprises metadata that informs parser 1302 how to parse input data 1300 to determine whether values in the input data can be assigned to the property type 1316 that is associated with the parser definition. In an embodiment, each parser definition may comprise a regular expression parser 1304A or a code module parser 1304B. In other embodiments, other kinds of parser definitions may be provided using scripts or other programmatic elements.
Once defined, both a regular expression parser 1304A and a code module parser 1304B can provide input to parser 1302 to control parsing of input data 1300.
Using the data types defined in the ontology, input data 1300 may be parsed by the parser 1302 to determine which object type 1310 should receive data from a record created from the input data, and which property types 1316 should be assigned to data from individual field values in the input data. Based on the object-property mapping 1301, the parser 1302 selects one of the parser definitions that is associated with a property type in the input data. The parser parses an input data field using the selected parser definition, resulting in creation of new or modified data 1303. The new or modified data 1303 is added to the database 1209 according to ontology 1205 by storing values of the new or modified data in a property of the specified property type. As a result, input data 1300 having varying format or syntax can be created in database 1209. The ontology 1205 may be modified at any time using object type editor 1324, property type editor 1326, and link type editor 1328, or under program control without human use of an editor. Parser editor 1322 enables creating multiple parser definitions that can successfully parse input data 1300 having varying format or syntax and determine which property types should be used to transform input data 1300 into new or modified input data 1303.
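For illustration, a regular expression parser definition 1304A may be sketched as metadata pairing a property type with a pattern; the patterns below (a date and a currency amount, echoing the monetary payment example above) are illustrative assumptions rather than disclosed parser definitions.

    import re
    from dataclasses import dataclass


    @dataclass
    class RegexParserDefinition:                 # regular expression parser 1304A
        property_type: str
        pattern: re.Pattern

        def try_parse(self, raw: str):
            """Return the extracted value if the input matches, else None."""
            match = self.pattern.fullmatch(raw.strip())
            return match.group(0) if match else None


    date_parser = RegexParserDefinition(
        "date", re.compile(r"\d{1,2}/\d{1,2}/\d{4}")
    )
    currency_parser = RegexParserDefinition(
        "currency", re.compile(r"US\$\d+(?:\.\d{2})?")
    )

    for parser in (date_parser, currency_parser):
        print(parser.property_type, parser.try_parse("3/27/2009"))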
The properties, objects, and links (e.g., relationships) between the objects can be visualized using a graphical user interface (GUI). For example, Figure 8C displays a user interface showing a graph representation 1403 of relationships (including relationships and/or links 1404, 1405, 1406, 1407, 1408, 1409, 1410, 1411, 1412, and 1413) between the data objects (including data objects 1421, 1422, 1423, 1424, 1425, 1426, 1427, 1428, and 1429) that are represented as nodes in the example of Figure 8C. In this embodiment, the data objects include person objects 1421, 1422, 1423, 1424, 1425, and 1426; a flight object 1427; a financial account 1428; and a computer object 1429. In this example, each person node (associated with person data objects), flight node (associated with flight data objects), financial account node (associated with financial account data objects), and computer node (associated with computer data objects) may have relationships and/or links with any of the other nodes through, for example, other objects such as payment objects.
For example, in Figure 8C, relationship 1404 is based on a payment associated with the individuals indicated in person data objects 1421 and 1423. The link 1404 represents these shared payments (for example, the individual associated with data object 1421 may have paid the individual associated with data object 1423 on three occasions). The relationship is further indicated by the common relationship between person data objects 1421 and 1423 and financial account data object 1428. For example, link 1411 indicates that person data object 1421 transferred money into financial account data object 1428, while person data object 1423 transferred money out of financial account data object 1428. In another example, the relationships between person data objects 1424 and 1425 and flight data object 1427 are indicated by links 1406, 1409, and 1410. In this example, person data objects 1424 and 1425 have a common address and were passengers on the same flight data object 1427. In an embodiment, further details related to the relationships between the various objects may be displayed. For example, links 1411 and 1412 may, in some embodiments, indicate the timing of the respective money transfers. In another example, the time of the flight associated with the flight data object 1427 may be shown.
Relationships between data objects may be stored as links, or, in some embodiments, as properties, where a relationship may be detected between the properties. In some cases, as stated above, the links may be directional. For example, a payment link may have a direction associated with the payment, where one person object is the receiver of a payment, and another person object is the payer of the payment.
In various embodiments, data objects may further include geographical metadata and/or links. Such geographical metadata may be accessed by the interactive data object map system for displaying objects and features on the map interface (as described above).
In addition to visually showing relationships between the data objects, the user interface may allow various other manipulations. For example, the objects within database 1209 may be searched using a search interface 1450 (e.g., text string matching of object properties), inspected (e.g., properties and associated data viewed), filtered (e.g., narrowing the universe of objects into sets and subsets by properties or relationships), and statistically aggregated (e.g., numerically summarized based on summarization criteria), among other operations and visualizations. Additionally, as described above, objects within database 1209 may be searched, accessed, and implemented in the map interface of the interactive data object map system via, for example, a geosearch and/or radius search.
Implementation Mechanisms

According to an embodiment, the interactive data object map system and other methods and techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices, or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.
Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.
For example, Figure 8D is a block diagram that illustrates a computer system 800 upon which the various systems and methods discussed herein may be implemented. Computer system 800 includes a bus 802 or other communication mechanism for communicating information, and a hardware processor, or multiple processors, 804 coupled with bus 802 for processing information. Hardware processor(s) 804 may be, for example, one or more general purpose microprocessors.
Computer system 800 also includes a main memory 806, such as a random access memory (RAM), cache, and/or other dynamic storage devices, coupled to bus 802 for storing information and instructions to be executed by processor 804. Main memory 806 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 804. Such instructions, when stored in storage media accessible to processor 804, render computer system 800 into a special-purpose machine that is customized to perform the operations specified in the instructions.
Computer system 800 further includes a read only memory (ROM) 808 or other static storage device coupled to bus 802 for storing static information and instructions for processor 804. A storage device 810, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 802 for storing information and instructions.

Computer system 800 may be coupled via bus 802 to a display 812, such as a cathode ray tube (CRT), LCD display, or touch screen display, for displaying information to a computer user and/or receiving input from the user. An input device 814, including alphanumeric and other keys, is coupled to bus 802 for communicating information and command selections to processor 804.
Another type of user input device is cursor control 816, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 804 and for controlling cursor movement on display 812.
This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.
Computer system 800 may include a user interface module, and/or various other types of modules to implement a GUI, a map interface, and the various other aspects of the interactive data object map system. The modules may be stored in a mass storage device as executable software code that is executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C, or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python.
It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression, or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.

Computer system 800 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 800 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 800 in response to processor(s) 804 executing one or more sequences of one or more modules and/or instructions contained in main memory 806. Such instructions may be read into main memory 806 from another storage medium, such as storage device 810. Execution of the sequences of instructions contained in main memory 806 causes processor(s) 804 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 810. Volatile media includes dynamic memory, such as main memory 806. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.
Non-transitory media is distinct from, but may be used in conjunction with, transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 802. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 804 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions and/or modules into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 800 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 802. Bus 802 carries the data to main memory 806, from which processor 804 retrieves and executes the instructions. The instructions received by main memory 806 may optionally be stored on storage device 810 either before or after execution by processor 804.
Computer system 800 also includes a communication interface 818 coupled to bus 802. Communication interface 818 provides a two-way data communication coupling to a network link 820 that is connected to a local network 822. For example, communication interface 818 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line.
As another example, communication interface 818 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 818 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
Network link 820 typically provides data communication through one or more networks to other data devices. For example, network link 820 may provide a connection through local network 822 to a host computer 824 or to data equipment operated by an Internet Service Provider (ISP) 826. ISP 826 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 828. Local network 822 and Internet 828 both use electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 820 and through communication interface 818, which carry the digital data to and from computer system 800, are example forms of transmission media.
Computer system 800 can send messages and receive data, including program code, through the network(s), network link 820, and communication interface 818. In the Internet example, a server 830 might transmit a requested code for an application program through Internet 828, ISP 826, local network 822, and communication interface 818. Server-side components of the interactive data object map system described above (for example, with reference to Figures 7A and 7B) may be implemented in the server 830. For example, the server 830 may compose map layers and tiles, and transmit those map tiles to the computer system 800.
The computer system 800, on the other hand, may implement the client-side components of the map system as described above (for example, with reference to Figures 6A and 6B). For example, the computer system may receive map tiles and/or other code that may be executed by processor 804 as it is received, and/or stored in storage device 810 or other non-volatile storage for later execution.
The computer system 800 may further compose the map interface from the map tiles, display the map interface to the user, generate object outlines and provide other functionality, and/or receive input from the user.
In an embodiment, the map system may be accessible by the user through a web-based viewer, such as a web browser. In this embodiment, the map interface may be generated by the server 830 and/or the computer system 800 and transmitted to the web browser of the user. The user may then interact with the map interface through the web browser. In an embodiment, the computer system 800 may comprise a mobile electronic device, such as a cell phone, smartphone, and/or tablet. The map system may be accessible by the user through such a mobile electronic device, among other types of electronic devices.
Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry.
The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner.
Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment.
Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached Figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, or executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.