CROSS-REFERENCE TO RELATED APPLICATION AND CLAIM OF PRIORITY

This application is related to and claims priority under 35 U.S.C. § 120 from U.S. Provisional Application No. 62/615,310 entitled “Data Acquisition, Bundling and Processing” filed on Jan. 9, 2018, U.S. Provisional Application No. 62/612,959 entitled “Self-Configuring Modular Surface Sensors Analytics System” filed on Jan. 2, 2018, U.S. Provisional Application No. 62/646,537 entitled “System and Method for Smart Building Control Using Multidimensional Presence Sensor Arrays” filed on Mar. 22, 2018, and U.S. Provisional Application No. 62/644,130 entitled “System and Method for Smart Building Control Using Directional Occupancy Sensors,” filed on Mar. 16, 2018, the disclosures of which are incorporated by reference herein in their entireties.
TECHNICAL FIELD

This disclosure relates generally to sensors and control systems for physical spaces. More specifically, this disclosure relates to a system and method for smart building control using multidimensional presence sensor arrays.
BACKGROUND

“Smart Buildings,” or buildings comprising physical spaces whose environmental control systems, such as lights, HVAC systems, and physical features (for example, ceiling fans or window shades), operate, at least in part, based on control inputs generated by the computerized application of predetermined rules to sensor data, offer tremendous promise in terms of improving how humans use physical spaces. For example, truly intelligent control of heating and lighting systems offers the possibility of significant improvements in energy efficiency beyond those attainable through passive structural improvements such as better insulation. However, a “smart building” is only as “smart” as the ability of its sensors to provide accurate and meaningful inputs to the algorithms controlling the parameters of the building's physical spaces. Embodiments according to this disclosure address technical problems associated with generating “smart” control inputs for environmental control systems.
SUMMARY

This disclosure provides a system and method for smart building control using multidimensional presence sensor arrays.
In a first embodiment, a method of operating a master control device includes obtaining, at an input-output interface, first measurement data for a zone of a physical space, the first measurement data based on signals from a first group of presence sensors covering the zone of the physical space, obtaining, at the input-output interface, second measurement data for the zone of the physical space, the second measurement data based on signals from a second group of presence sensors covering the zone of the physical space, and identifying, based on at least one of the first or second measurement data, one or more moving objects within the zone of the physical space. The method further includes associating, based on the first and second measurement data, each of the one or more moving objects with an object class, determining, for each of the one or more moving objects, a track within a coordinate system for the physical space, and outputting, via the input-output interface of the master control device, a signal associated with the one or more determined tracks.
In a second embodiment, a master control device includes an input-output interface, a processor and a memory containing instructions, which when executed by the processor, cause the master control device to obtain, at the input-output interface, first measurement data for a zone of a physical space, the first measurement data based on signals from a first group of presence sensors covering the zone of the physical space, to obtain, at the input-output interface, second measurement data for the zone of the physical space, the second measurement data based on signals from a second group of presence sensors covering the zone of the physical space, and to identify, based on at least one of the first or second measurement data, one or more moving objects within the zone of the physical space. The instructions, when executed by the processor, further cause the master control device to associate, based on the first and second measurement data, each of the one or more moving objects with an object class, to determine, for each of the one or more moving objects, a track within a coordinate system for the physical space; and to output, via the input-output interface of the master control device, a signal associated with the one or more determined tracks.
In a third embodiment, a computer program product includes program code, which when executed by a processor, causes a master control device to obtain, at an input-output interface, first measurement data for a zone of a physical space, the first measurement data based on signals from a first group of presence sensors covering the zone of the physical space, to obtain, at the input-output interface, second measurement data for the zone of the physical space, the second measurement data based on signals from a second group of presence sensors covering the zone of the physical space, and to identify, based on at least one of the first or second measurement data, one or more moving objects within the zone of the physical space. The program code, when executed by the processor, further causes the master control device to associate, based on the first and second measurement data, each of the one or more moving objects with an object class, to determine, for each of the one or more moving objects, a track within a coordinate system for the physical space; and to output, via the input-output interface of the master control device, a signal associated with the one or more determined tracks.
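By way of illustration only, the method recited in the embodiments above may be sketched as follows. The sketch is not part of the claimed subject matter; the names (MovingObject, determine_tracks, classify), the one-meter association radius, and the pressure-based classification rule are assumptions introduced solely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class MovingObject:
    object_id: int
    object_class: str                          # e.g. "adult" or "child" (assumed classes)
    track: list = field(default_factory=list)  # (t, x, y) points in the space's coordinates

def classify(detections):
    """Toy classifier: peak pressure above a threshold suggests an adult."""
    peak = max(p for _, _, _, p in detections)
    return "adult" if peak > 50.0 else "child"

def determine_tracks(first_data, second_data):
    """Fuse detections (t, x, y, pressure) from two presence sensor groups
    covering one zone into classified tracks, greedily associating each
    detection with the nearest previously seen object."""
    merged = sorted(first_data + second_data)        # order by timestamp
    clusters = []                                    # one detection list per moving object
    for det in merged:
        _, x, y, _ = det
        for dets in clusters:
            _, ox, oy, _ = dets[-1]
            if (x - ox) ** 2 + (y - oy) ** 2 <= 1.0:  # within 1 m of the last fix
                dets.append(det)
                break
        else:
            clusters.append([det])                   # start a new object
    return [MovingObject(i, classify(d), [(t, x, y) for t, x, y, _ in d])
            for i, d in enumerate(clusters)]
```

The returned tracks could then be serialized and output via the input-output interface as the signal associated with the determined tracks.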
Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document. The term “couple” and its derivatives refer to any direct or indirect communication between two or more elements, whether or not those elements are in physical contact with one another. The terms “transmit,” “receive,” and “communicate,” as well as derivatives thereof, encompass both direct and indirect communication. The terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation. The term “or” is inclusive, meaning and/or. The phrase “associated with,” as well as derivatives thereof, means to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. The term “controller” means any device, system or part thereof that controls at least one operation. Such a controller may be implemented in hardware or a combination of hardware and software and/or firmware. The functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
Moreover, various functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer readable program code and embodied in a computer readable medium. The terms “application” and “program” refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer readable program code. The phrase “computer readable program code” includes any type of computer code, including source code, object code, and executable code. The phrase “computer readable medium” includes any type of medium capable of being accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), or any other type of memory. A “non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals. A non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
Definitions for other certain words and phrases are provided throughout this patent document. Those of ordinary skill in the art should understand that in many if not most instances, such definitions apply to prior as well as future uses of such defined words and phrases.
BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of this disclosure and its advantages, reference is now made to the following description, taken in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates a network context for implementing a system and method for smart building control according to embodiments of this disclosure;
FIG. 2 illustrates a network and processing context for implementing a system and method for smart building control according to embodiments of this disclosure;
FIG. 3 illustrates aspects of a resistive mat presence sensor according to embodiments of this disclosure;
FIG. 4 illustrates aspects of a floor-mounted presence sensor according to embodiments of this disclosure;
FIG. 5 illustrates a master control device according to embodiments of this disclosure;
FIG. 6 illustrates operations of a method of determining tracks associated with moving occupants of a physical space according to embodiments of this disclosure;
FIG. 7 illustrates operations of a Kalman filter according to embodiments of this disclosure;
FIGS. 8A-8I illustrate aspects of a method for determining tracks from presence sensor data according to embodiments of this disclosure;
FIG. 9 illustrates aspects of an implementation of a smart building control system utilizing multidimensional presence sensor arrays according to embodiments of the instant disclosure;
FIG. 10 illustrates a presence sensor housed in a lightbulb according to embodiments of this disclosure;
FIG. 11 illustrates operations of a method for smart building control using multidimensional presence sensors according to embodiments of the present disclosure; and
FIGS. 12A-12G illustrate aspects of a method for determining tracks from multidimensional presence sensors according to embodiments of this disclosure.
DETAILED DESCRIPTION

FIGS. 1 through 12G, discussed below, and the various embodiments used to describe the principles of this disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure.
Embodiments as disclosed herein relate to systems and methods for smart building control using multidimensional presence sensor arrays. The advent of the internet of things (“IoT”) and the development of physical spaces whose environmental control systems (for example, lights and HVAC systems) can be controlled using a broad spectrum of sensor data collected within the physical space present many opportunities to make buildings “smarter,” in the sense of being attuned with, and responsive to, the needs and priorities of the buildings' human occupants. Effective integration of sensor technology and machine intelligence for processing and understanding the sensor data presents opportunities for meaningful improvements across a wide range of building functionalities. For example, such integration can improve the efficiency of a building (for example, by focusing heating and cooling resources on the regions of a building that have the most people), improve a building's safety (for example, by performing footstep analysis to identify when an occupant of a building has fallen or stopped walking under circumstances suggesting concern), and extend the life cycle of a building (for example, by collecting data as to loading and use stress over a building's lifespan).
Realizing the full potential of a “smart building” to learn about its occupants and control itself in response to, and in anticipation of, its occupants' needs requires collecting data regarding the building's utilization from sources that remain constant across the building's lifecycle and that capture all, or almost all, of the relevant occupant usage data.
The floor of a building is one example of a source of relevant occupant data for the entirety of the building's life. The ceiling is another example of such a source. Walls can be knocked down and moved over the course of a building's lifetime, but the floor and ceiling generally remain structural constants. Barring unforeseeable changes in human locomotion, humans can be expected to generate measurable interactions with buildings through their footsteps on buildings' floors. By the same token, the ceiling provides a vantage point for sensor data that complements data obtained at the floor. Embodiments according to the present disclosure help realize the potential of the “smart building” by providing, amongst other things, control inputs for a building's environmental control systems based on occupants' interaction with building surfaces, including, without limitation, floors.
FIG. 1 illustrates an example of a network context 100 for implementing a system and method for smart building control according to some embodiments of this disclosure. The embodiment of the network context 100 shown in FIG. 1 is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 1, a network context 100 according to certain embodiments of this disclosure includes a master control device 105 (sometimes referred to as a gateway), one or more routers 110a, 110b, 110c, 110d, a client device 115 providing a user interface, a plurality of end devices 120a-j in a physical space, and one or more appliances or features of a physical space receiving control signals (for example, HVAC system 125) from master control device 105.
According to certain embodiments, master control device 105 is embodied on a low power processing platform, such as a development board running an ARM CORTEX™ processor. Alternatively, master control device 105 may be implemented on a larger computing platform, such as a notebook computer, a server computer, or a tablet comprising a memory, a processor, an input-output interface, an analog-to-digital converter, and send and receive circuitry that includes a network interface and supports multiple communication protocols, including, without limitation, Wi-Fi on the 900 MHz, 2.4 GHz and 5.0 GHz bands. According to further embodiments, master control device 105 also supports communications using the Zigbee protocol and AES-128 encryption between devices in the network, including, without limitation, routers 110a-110d, end devices 120a-j, client device 115 and HVAC system 125.
As will be described in greater detail herein, the memory of master control device 105 contains instructions, which when executed by the processor, cause the master control device to receive signals from end devices 120a-j, determine tracks associated with moving occupants of a physical space based on the received signals, and output signals for controlling appliances and features of the physical space based on the determined tracks.
While in the non-limiting example shown in FIG. 1, master control device 105 is shown as embodied on a single, physical computing platform (such as a server or notebook), which is communicatively connected to other actors within network context 100 using various wireless communication protocols, numerous other embodiments are possible and within the intended scope of this disclosure. For example, the operations carried out by master control device 105 in the embodiment shown in FIG. 1 can, in other embodiments, be performed on multiple machines, or by a different machine within network context 100, such as client device 115 or one of end devices 120a-j. Additionally, according to some embodiments, master control device 105 may be embodied on one or more virtual machines.
According to some embodiments, each router of routers 110a-110d is a wireless router providing a Wi-Fi link between master control device 105 and each of end devices 120a-120j. In the non-limiting example shown in FIG. 1, each of routers 110a-110d supports communications using, without limitation, the Zigbee, Bluetooth, Bluetooth Low Energy (BLE) and Wi-Fi communication protocols in the 900 MHz, 2.4 GHz and 5.0 GHz bands. Alternatively, in other embodiments, routers 110a-110d connect to one or more devices within network context 100 over a wired connection and communicate using wired communication protocols, such as Ethernet networking protocols. Additionally, each of routers 110a-110d may be connected to one another, as shown in FIG. 1, to form a mesh network.
According to various embodiments, client device 115 is a smartphone providing a user interface for, without limitation, receiving information regarding determined tracks in the physical space, providing visualizations of determined tracks in the physical space, and controlling the transmission of control signals from master control device 105 to appliances and devices in the physical space (such as HVAC system 125) based on tracks determined by master control device 105.
In the non-limiting example shown in FIG. 1, each end device of end devices 120a-120j comprises a floor mounted presence sensor capable of collecting floor contact data from within the physical space at predetermined intervals. According to some embodiments, the predetermined intervals at which floor contact data is collected correspond to a scan rate that can be configured at master control device 105 or via a user interface of client device 115. Further, according to some embodiments, each end device of end devices 120a-120j is embodied on a low-power general computing device such as a development board powered by an energy efficient processor, such as the INTEL ATOM™ processor. According to some embodiments, the presence sensor is a membrane switch, resistive sensor, piezoelectric sensor or capacitive sensor that, when contacted, produces or changes an electrical signal, from which a value along one or more coordinate axes assigned to the physical space can be mapped. According to some embodiments (for example, embodiments using membrane switches or certain capacitive sensors), the presence sensors of end devices 120a-120j detect the presence or absence of contact with the floor. According to other embodiments (for example, with the resistive sensor shown in FIG. 3), the presence sensors of end devices 120a-120j produce an electric signal correlating to a pressure applied to the sensor. In certain embodiments, each of end devices 120a-120j also includes an analog-to-digital converter (“A/D”) to digitize the electrical signals. Further, end devices 120a-120j may include a memory, a processor and send and receive circuitry to provide the electrical signals from the presence sensors, or digitizations thereof, to routers 110a-110d or master control device 105.
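By way of illustration only, the mapping from a digitized sensor reading to values along the coordinate axes assigned to the physical space might be sketched as follows. The tile size, contact threshold, and function names below are assumptions introduced for illustration and do not appear in the disclosure.

```python
TILE_SIZE_M = 0.61          # ~2 ft square sensor tile (assumed)
CONTACT_THRESHOLD = 12      # ADC counts above the quiescent level (assumed)

def scan_to_contacts(adc_samples, tile_origin):
    """Convert one scan of a tile's sensing grid into contact coordinates.

    adc_samples: 2-D list of digitized readings, one per grid intersection.
    tile_origin: (x, y) of the tile's corner in the space's coordinate system.
    Returns (x, y, reading) triples for intersections registering contact.
    """
    ox, oy = tile_origin
    rows, cols = len(adc_samples), len(adc_samples[0])
    contacts = []
    for r in range(rows):
        for c in range(cols):
            reading = adc_samples[r][c]
            if reading > CONTACT_THRESHOLD:
                # Map the grid intersection to the physical coordinate axes.
                x = ox + (c + 0.5) * TILE_SIZE_M / cols
                y = oy + (r + 0.5) * TILE_SIZE_M / rows
                contacts.append((x, y, reading))
    return contacts
```

An end device could run such a conversion at its configured scan rate before forwarding the results toward the master control device.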
According to some embodiments, the send and receive circuitry of end devices 120a-120j includes a network interface supporting one or more wired or wireless communication protocols, including, without limitation, Ethernet, ZIGBEE, Wi-Fi, BLUETOOTH and BLUETOOTH Low Energy (BLE).
Additionally, according to certain embodiments, the presence sensors of each of end devices 120a-120j may, either by themselves, or under the control of master control device 105, form a self-configuring array of sensors, such as described in U.S. Provisional Patent Application No. 62/612,959, which is incorporated by reference herein in its entirety.
According to certain embodiments, HVAC system 125 is a “smart” HVAC device, such as one of the component devices of the Carrier Comfort Network system. According to other embodiments, HVAC system 125 is a conventional HVAC device that has been retrofitted with a networked controller capable of receiving control inputs from master control device 105. Skilled artisans will appreciate that HVAC system 125 is merely illustrative, and not limitative, of the kinds of devices that can be controlled in response to inputs from master control device 105. Other devices of a “smart building” whose operation can be controlled or adjusted based on signals from master control device 105 include, without limitation, IoT devices such as lights, window shades, room cleaning robots, windows, automatic doors, media systems, and security systems.
FIG. 2 illustrates an example of a network context 200 for implementing a system and method for smart building control according to certain embodiments of this disclosure. The embodiment of the network context 200 shown in FIG. 2 is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 2, the network context 200 includes one or more mat controllers 205a, 205b and 205c, an API suite 210, a trigger controller 220, job workers 225a-225d, a database 230 and a network 235.
According to certain embodiments, each of mat controllers 205a-205c is connected to a presence sensor in a physical space. In some embodiments, each of mat controllers 205a-205c is a mat controller such as described in U.S. Provisional Patent Application No. 62/615,310, the contents of which are incorporated by reference herein in their entirety. According to some embodiments, each of mat controllers 205a-205c is an end device, such as one of end devices 120a-120j described with reference to FIG. 1 herein. Mat controllers 205a-205c generate floor contact data from presence sensors in a physical space and transmit the generated floor contact data to API suite 210. In some embodiments, data from mat controllers 205a-205c is provided to API suite 210 as a continuous stream. In the non-limiting example shown in FIG. 2, mat controllers 205a-205c provide the generated floor contact data to API suite 210 via the internet. Other embodiments, wherein mat controllers 205a-205c employ other mechanisms, such as a bus or Ethernet connection, to provide the generated floor data to API suite 210 are possible and within the intended scope of this disclosure.
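Purely by way of illustration, a floor contact event streamed from a mat controller to the API suite might be serialized as follows. The field names and the JSON encoding are assumptions introduced for illustration; the disclosure does not specify a wire format.

```python
import json

def make_contact_message(controller_id, sensor_id, t_start, duration_s, value):
    """Serialize one floor-contact event for streaming to the API suite."""
    return json.dumps({
        "controller": controller_id,   # which mat controller sent the event
        "sensor": sensor_id,           # which sensor on the mat was pressed
        "t_start": t_start,            # when the press began (epoch seconds)
        "duration": duration_s,        # how long the press lasted (seconds)
        "value": value,                # digitized pressure reading
    })
```

Such records could be posted over the internet, a bus, or an Ethernet connection, consistent with the transport alternatives described above.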
According to some embodiments, API suite 210 is embodied on a server computer connected via the internet to each of mat controllers 205a-205c. According to other embodiments, API suite 210 is embodied on a master control device, such as master control device 105 shown in FIG. 1 of this disclosure. In the non-limiting example shown in FIG. 2, API suite 210 comprises a Data Application Programming Interface (API) 215a, an Events API 215b and a Status API 215c.
In some embodiments, Data API 215a is an API for receiving and recording mat data from each of mat controllers 205a-205c. Mat events include, for example, raw or minimally processed data from the mat controllers, such as the time and date a particular sensor was pressed and the duration of the period during which the sensor was pressed. According to certain embodiments, Data API 215a stores the received mat events in a database such as database 230. In the non-limiting example shown in FIG. 2, some or all of the mat events are received by API suite 210 as a stream of event data from mat controllers 205a-205c, and Data API 215a operates in conjunction with trigger controller 220 to generate and pass along triggers breaking the stream of mat event data into discrete portions for further analysis.
According to various embodiments, Events API 215b receives data from mat controllers 205a-205c and generates lower-level records of instantaneous contacts where a sensor on the mat is pressed and released.
In the non-limiting example shown in FIG. 2, Status API 215c receives data from each of mat controllers 205a-205c and generates records of the operational health (for example, CPU and memory usage, processor temperature, and whether all of the sensors from which a mat controller receives inputs are operational) of each of mat controllers 205a-205c. According to certain embodiments, Status API 215c stores the generated records of the mat controllers' operational health in database 230.
According to some embodiments, trigger controller 220 operates to orchestrate the processing and analysis of data received from mat controllers 205a-205c. In addition to working with data API 215a to define and set boundaries in the data stream from mat controllers 205a-205c to break the received data stream into tractably sized and logically defined “chunks” for processing, trigger controller 220 also sends triggers to job workers 225a-225c to perform processing and analysis tasks. The triggers comprise identifiers uniquely identifying each data processing job to be assigned to a job worker. In the non-limiting example shown in FIG. 2, the identifiers comprise: 1.) a sensor identifier (or an identifier otherwise uniquely identifying the location of contact); 2.) a time boundary start identifying the time at which the mat went from an idle state (for example, a completely open circuit, or, in the case of certain resistive sensors, a baseline or quiescent current level) to an active state (a closed circuit, or a current greater than the baseline or quiescent level); and 3.) a time boundary end defining the time at which the mat returned to the idle state.
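The trigger identifiers described above may be sketched, for illustration only, as follows. The type and function names and the quiescent-level comparison are assumptions; the disclosure does not prescribe a data structure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Trigger:
    sensor_id: str     # uniquely identifies the contact location
    t_start: float     # time of the idle-to-active transition
    t_end: float       # time of the return to the idle state

def find_triggers(sensor_id, samples, quiescent=0.0):
    """Extract triggers from time-ordered (time, current) samples for one sensor."""
    triggers, t_start = [], None
    for t, current in samples:
        active = current > quiescent
        if active and t_start is None:
            t_start = t                       # mat left the idle state
        elif not active and t_start is not None:
            triggers.append(Trigger(sensor_id, t_start, t))
            t_start = None                    # mat returned to idle
    return triggers
```

Each resulting Trigger delimits one "chunk" of the data stream that a job worker could then process independently.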
In some embodiments, each of job workers 225a-225c corresponds to an instance of a process performed at a computing platform (for example, master control device 105 in FIG. 1) for determining tracks and performing an analysis of the tracks. Instances of processes may be added or subtracted depending on the number of events or possible events received by API suite 210 as part of the data stream from mat controllers 205a-205c. According to certain embodiments, job workers 225a-225c perform an analysis of the data received from mat controllers 205a-205c, the analysis having, in some embodiments, two stages. A first stage comprises deriving paths, or tracks, from mat impression data. A second stage comprises characterizing those paths according to certain criteria to, inter alia, provide metrics to an online dashboard (in some embodiments, provided by a UI on a client device, such as client device 115 in FIG. 1) and to generate control signals for devices (such as HVAC systems, lights, and IoT appliances) controlling operational parameters of a physical space where the mat impressions were recorded.
In the non-limiting example shown in FIG. 2, job workers 225a-225c perform the constituent processes of certain methods for analyzing mat impressions to generate paths, or tracks. According to certain embodiments, a method comprises the operations of obtaining impression data from database 230, cleaning the obtained impression data and reconstructing paths using the cleaned data. In some embodiments, cleaning the data includes removing extraneous sensor data, removing gaps between impressions caused by sensor noise, removing long impressions caused by objects placed on mats or by defective sensors, and sorting impressions by start time to produce sorted impressions. According to certain embodiments, job workers 225a-225c perform processes for reconstructing paths by implementing algorithms that first cluster impressions that overlap in time or are spatially adjacent. Next, the clustered data is searched, and pairs of impressions that start or end within a few milliseconds of one another are combined into footsteps. Footsteps are then further analyzed and linked together to create paths.
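The cleaning and path reconstruction stages described above may be sketched as follows, for illustration only. The thresholds (maximum impression duration, pairing window, stride distance) and all names are assumptions, and the pairing step is simplified to pairing by start time alone.

```python
MAX_IMPRESSION_S = 5.0     # drop impressions from parked objects / bad sensors (assumed)
PAIR_WINDOW_S = 0.05       # "within a few milliseconds" pairing window (assumed)
MAX_STRIDE_M = 1.2         # max distance linking consecutive footsteps (assumed)

def clean(impressions):
    """impressions: (t_start, t_end, x, y). Drop long ones, sort by start time."""
    kept = [i for i in impressions if i[1] - i[0] <= MAX_IMPRESSION_S]
    return sorted(kept)

def pair_footsteps(impressions):
    """Combine impressions whose start times fall within the pairing window."""
    footsteps, used = [], set()
    for i, a in enumerate(impressions):
        if i in used:
            continue
        step = [a]
        for j in range(i + 1, len(impressions)):
            if j not in used and abs(impressions[j][0] - a[0]) <= PAIR_WINDOW_S:
                step.append(impressions[j])
                used.add(j)
        # A footstep: earliest start time and mean position of its impressions.
        t = min(s[0] for s in step)
        x = sum(s[2] for s in step) / len(step)
        y = sum(s[3] for s in step) / len(step)
        footsteps.append((t, x, y))
    return footsteps

def link_paths(footsteps):
    """Link time-ordered footsteps within stride distance into paths."""
    paths = []
    for step in sorted(footsteps):
        for path in paths:
            _, px, py = path[-1]
            if ((step[1] - px) ** 2 + (step[2] - py) ** 2) ** 0.5 <= MAX_STRIDE_M:
                path.append(step)
                break
        else:
            paths.append([step])
    return paths
```

A job worker could chain these stages as `link_paths(pair_footsteps(clean(impressions)))` for each chunk delimited by a trigger.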
According to certain embodiments, database 230 provides a repository of raw and processed mat impression data, as well as data relating to the health and status of each of mat controllers 205a-205c. In the non-limiting example shown in FIG. 2, database 230 is embodied on a server machine communicatively connected to the computing platforms providing API suite 210 and trigger controller 220, and upon which job workers 225a-225c execute. According to other embodiments, database 230 is embodied on a cloud computing platform.
In the non-limiting example shown in FIG. 2, the computing platforms providing trigger controller 220 and database 230 are communicatively connected to one or more network(s) 235. According to embodiments, network 235 comprises any network suitable for distributing mat data, determined paths and control signals based on determined paths, including, without limitation, the internet or a local network (for example, an intranet) of a smart building.
Presence sensors utilizing a variety of sensing technologies, such as membrane switches, pressure sensors and capacitive sensors, to identify instances of contact with a floor are within the contemplated scope of this disclosure. FIG. 3 illustrates aspects of a resistive mat presence sensor 300 according to certain embodiments of the present disclosure. The embodiment of the resistive mat presence sensor 300 shown in FIG. 3 is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 3, a cross section showing the layers of a resistive mat presence sensor 300 is provided. According to some embodiments, the resistance to the passage of electrical current through the mat varies in response to contact pressure. From these changes in resistance, values corresponding to the pressure and location of the contact may be determined. In some embodiments, resistive mat presence sensor 300 may comprise a modified carpet or vinyl floor tile, and have dimensions of approximately 2′×2′.
According to certain embodiments, resistive mat presence sensor 300 is installed or disposed directly on a floor, with graphic layer 305 comprising the top-most layer relative to the floor. In some embodiments, graphic layer 305 comprises a layer of artwork applied to presence sensor 300 prior to installation. Graphic layer 305 can variously be applied by screen printing or as a thermal film.
According to certain embodiments, a first structural layer 310 sits below graphic layer 305 and comprises one or more layers of durable material capable of flexing at least a few thousandths of an inch in response to footsteps or other sources of contact pressure. In some embodiments, first structural layer 310 may be made of carpet, vinyl or laminate material.
According to some embodiments, first conductive layer 315 sits below structural layer 310. According to some embodiments, first conductive layer 315 includes conductive traces or wires oriented along a first axis of a coordinate system. The conductive traces or wires of first conductive layer 315 are, in some embodiments, copper or silver conductive ink wires screen printed onto either first structural layer 310 or resistive layer 320. In other embodiments, the conductive traces or wires of first conductive layer 315 are metal foil tape or conductive thread embedded in structural layer 310. In the non-limiting example shown in FIG. 3, the wires or traces included in first conductive layer 315 are capable of being energized at low voltages (for example, approximately 5 volts). In the non-limiting example shown in FIG. 3, connection points to a first sensor layer of another presence sensor or to a mat controller are provided at the edge of each presence sensor 300.
In various embodiments, a resistive layer 320 sits below conductive layer 315. As shown in the non-limiting example of FIG. 3, resistive layer 320 comprises a thin layer of resistive material whose resistive properties change under pressure. For example, resistive layer 320 may be formed using a carbon-impregnated polyethylene film.
In the non-limiting example shown in FIG. 3, a second conductive layer 325 sits below resistive layer 320. According to certain embodiments, second conductive layer 325 is constructed similarly to first conductive layer 315, except that the wires or conductive traces of second conductive layer 325 are oriented along a second axis, such that when presence sensor 300 is viewed from above, there are one or more points of intersection between the wires of first conductive layer 315 and second conductive layer 325. According to some embodiments, pressure applied to presence sensor 300 completes an electrical circuit between a sensor box (for example, mat controller 225a shown in FIG. 2 or master control device 105 shown in FIG. 1) and the presence sensor, allowing a pressure-dependent current to flow through resistive layer 320 at a point of intersection between the wires of first conductive layer 315 and second conductive layer 325.
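The pressure-dependent readout at a trace intersection can be illustrated with a simple voltage-divider calculation. This is a minimal sketch, not part of the disclosure: the ~5 V supply matches the example above, while the reference resistor value and function names are hypothetical.

```python
# Illustrative sketch: estimating a resistive mat's crosspoint resistance
# from a voltage-divider reading. All names and values are hypothetical.

V_SUPPLY = 5.0        # drive voltage on the energized row trace (~5 V, per the example)
R_FIXED = 10_000.0    # hypothetical fixed reference resistor in the divider (ohms)

def crosspoint_resistance(v_measured: float) -> float:
    """Resistance of the mat at a row/column intersection, computed from
    the voltage measured across the fixed divider resistor."""
    if v_measured <= 0.0:
        return float("inf")   # no contact: effectively an open circuit
    # Divider relation: v_measured = V_SUPPLY * R_FIXED / (R_FIXED + R_mat)
    return R_FIXED * (V_SUPPLY - v_measured) / v_measured

# Harder presses lower the mat's resistance, raising the measured voltage.
light_press = crosspoint_resistance(1.0)   # 40 kOhm
hard_press = crosspoint_resistance(4.0)    # 2.5 kOhm
```

Because the divider maps resistance monotonically to voltage, a single analog measurement per intersection suffices to estimate applied pressure.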
In some embodiments, a second structural layer 330 resides beneath second conductive layer 325. In the non-limiting example shown in FIG. 3, second structural layer 330 comprises a layer of rubber or a similar material to keep presence sensor 300 from sliding during installation and to provide a stable substrate to which an adhesive, such as glue backing layer 335, can be applied without interfering with the wires of second conductive layer 325.
The foregoing description is purely illustrative, and variations thereon are contemplated as being within the intended scope of this disclosure. For example, in some embodiments, presence sensors according to this disclosure may omit certain layers, such as glue backing layer 335 and graphic layer 305 described in the non-limiting example shown in FIG. 3.
According to some embodiments, a glue backing layer 335 comprises the bottom-most layer of presence sensor 300. In the non-limiting example shown in FIG. 3, glue backing layer 335 comprises a film of a floor tile glue, such as Roberts 6300 pressure sensitive carpet adhesive.
FIG. 4 illustrates aspects of a floor mounted presence sensor according to various embodiments of this disclosure. The embodiment of the floor mounted presence sensor 400 shown in FIG. 4 is for illustration only. Other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 4, a resistive mat presence sensor 400 has a plurality of conductive traces, including the traces numbered 405a and 405b, along a first axis, which, in this example, correspond to conductive traces in a first conductive layer (for example, conductive layer 315 in FIG. 3) of a resistive mat presence sensor. Further, resistive mat presence sensor 400 has a plurality of conductive traces, including the traces numbered 410a and 410b, along a second axis, which, in this example, correspond to conductive traces in a second conductive layer (for example, conductive layer 325 in FIG. 3) of a resistive mat presence sensor. Each of the conductive traces connects separately to an end device. In this case, the end device is a mat controller 415 (for example, mat controller 205a shown in FIG. 2). Other embodiments, in which the end device is, for example, end device 120a shown in FIG. 1 or master control device 105 shown in FIG. 1, are possible and within the scope of this disclosure.
In the non-limiting example shown in FIG. 4, presence sensor 400 is shown as connecting directly with mat controller 415. In other embodiments, presence sensor 400 connects to mat controller 415 through one or more additional presence sensors.
According to certain embodiments, the alignment and spacing of the conductive traces of the presence sensor correspond to the spatial increments of a coordinate system for a physical space in which the presence sensor is installed. For example, in some cases, the conductive wires are disposed within the conductive layers of the presence sensor at intervals of approximately three inches or less, as such spacing provides a high resolution representation of the occupancy and traffic within the physical space.
According to certain embodiments, when pressure is applied (such as by a footstep) to the presence sensor, the resistive mat is compressed such that the electrical resistance between a trace in one layer of the resistive mat and a trace in another layer of the resistive mat is reduced, and a signal corresponding to the difference in electrical current from a baseline or quiescent value is observed (such as by an ammeter or voltmeter in mat controller 415) in the traces brought into proximity by the footstep. By identifying the traces of the presence sensor through which the difference in current is measured, a value in a coordinate space corresponding to the location where the pressure was applied to the presence sensor can be mapped. Additionally, a value for the pressure applied to the mat at a given interval may be determined based on the size of the signal.
In the non-limiting example shown in FIG. 4, an end device (for example, mat controller 415 or master control device 105 shown in FIG. 1) “scans” the voltages or currents observed at each of the terminals where traces of the presence sensors connect to the end device at predetermined intervals. Accordingly, a plurality of signals corresponding to the measured voltages or currents at each of the terminals at known times are recorded and passed to an input-output interface of the end device. According to some embodiments, a scan rate of approximately 100-200 Hertz (Hz), wherein the time between scans is on the order of 5-10 milliseconds (ms), is appropriate for capturing footstep data at a level of temporal granularity from which the directionality of footsteps can be determined. Faster and slower scan rates are possible and within the contemplated scope of this disclosure.
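A scan of this kind can be sketched as follows. This is an illustrative simplification, not the disclosure's implementation: the `read_crosspoint` callable and the small 2x2 mat are hypothetical stand-ins for the end device's I/O hardware.

```python
def scan_mat(read_crosspoint, n_rows, n_cols, timestamp):
    """One full 'scan' of a presence sensor: sample every row/column trace
    intersection once and tag each reading with its trace indices and the
    scan time, yielding (time, row, col, value) records."""
    return [(timestamp, r, c, read_crosspoint(r, c))
            for r in range(n_rows) for c in range(n_cols)]

# A 200 Hz schedule repeats this every 5 ms. Here, two scans of a
# hypothetical 2x2 mat using a dummy reader that always returns 0.0:
readings = [scan_mat(lambda r, c: 0.0, 2, 2, t) for t in (0.000, 0.005)]
```

Each scan produces one timestamped record per trace intersection, which is the raw material for the coordinate-mapping and track-finding steps described later.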
While in the non-limiting example shown in FIG. 4, traces 405a-b and 410a-b of presence sensor 400 are depicted as comprising part of a rectilinear coordinate system having uniformly sized spatial increments, the present disclosure is not so limited. Other embodiments are possible, such as embodiments in which one or more layers of traces are curved or fan shaped and define a radial coordinate system. Such embodiments may be advantageous for curving spaces, such as running tracks, velodromes or curved hallways. Additionally, in some embodiments, such as physical spaces that have defined spectator areas and performance areas (for example, a basketball court or a stage), it may be advantageous that the coordinate system have a finer spatial resolution in certain areas (such as the playing or performance area) and a coarser spatial resolution in other areas, such as hallways or concession stand areas.
FIG. 5 illustrates a master control device 500 according to certain embodiments of this disclosure. The embodiment of the master control device 500 shown in FIG. 5 is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 5, master control device 500 is embodied on a standalone computing platform (for example, master control device 105 in FIG. 1) connected, via a network, to a series of end devices (for example, 120a-120j in FIG. 1, mat controller 205a in FIG. 2). In other embodiments, master control device 105 connects directly to, and receives raw signals from, one or more presence sensors (for example, presence sensor 300 in FIG. 3 or presence sensor 400 in FIG. 4).
According to certain embodiments, master control device 500 includes one or more input/output interfaces (I/O) 505. In the non-limiting example shown in FIG. 5, I/O interface 505 provides terminals that connect to each of the various conductive traces of the presence sensors deployed in a physical space. Further, in systems where membrane switches or pressure sensing mats are used as presence sensors, I/O interface 505 electrifies certain traces (for example, the traces contained in a first conductive layer, such as conductive layer 315 in FIG. 3) and provides a ground or reference value for certain other traces (for example, the traces contained in a second conductive layer, such as conductive layer 325 in FIG. 3). Additionally, I/O interface 505 also measures current flows or voltage drops associated with occupant presence events, such as a person's foot squashing a membrane switch to complete a circuit, or compressing a resistive mat, causing a change in a current flow across certain traces. In some embodiments, I/O interface 505 amplifies or performs an analog cleanup (such as high or low pass filtering) of the raw signals from the presence sensors in the physical space in preparation for further processing.
In some embodiments, master control device 500 includes an analog-to-digital converter (“ADC”) 510. In embodiments where the presence sensors in the physical space output an analog signal (such as in the case of resistive mats), ADC 510 digitizes the analog signals. Further, in some embodiments, ADC 510 augments the converted signal with metadata identifying, for example, the trace(s) from which the converted signal was received, and time data associated with the signal. In this way, the various signals from presence sensors can be associated with touch events occurring in a coordinate system for the physical space at defined times. While in the non-limiting example shown in FIG. 5, ADC 510 is shown as a separate component of master control device 500, the present disclosure is not so limited, and embodiments in which the ADC 510 is part of, for example, I/O interface 505 or processor 515 are contemplated as being within the scope of this disclosure.
In various embodiments, master control device 500 further comprises a processor 515. In the non-limiting example shown in FIG. 5, processor 515 is a low-energy microcontroller, such as the ATMEGA328P by Atmel Corporation. According to other embodiments, processor 515 is the processor provided in other processing platforms, such as the processors provided by tablets, notebook or server computers.
In the non-limiting example shown in FIG. 5, master control device 500 includes a memory 520. According to certain embodiments, memory 520 is a non-transitory memory containing program code to implement, for example, APIs 525, networking functionality and the algorithms for generating and analyzing tracks described herein.
Additionally, according to certain embodiments, master control device 500 includes one or more Application Programming Interfaces (APIs) 525. In the non-limiting example shown in FIG. 5, APIs 525 include APIs for determining and assigning break points in one or more streams of presence sensor data and defining data sets for further processing. Additionally, in the non-limiting example shown in FIG. 5, APIs 525 include APIs for interfacing with a job scheduler (for example, trigger controller 220 in FIG. 2) for assigning batches of data to processes for analysis and determination of tracks. According to some embodiments, APIs 525 include APIs for interfacing with one or more reporting or control applications provided on a client device (for example, client device 115 in FIG. 1). Still further, in some embodiments, APIs 525 include APIs for storing and retrieving presence sensor data in one or more remote data stores (for example, database 230 in FIG. 2).
According to some embodiments, master control device 500 includes send and receive circuitry 530, which supports communication between master control device 500 and other devices in a network context in which smart building control is being implemented according to embodiments of this disclosure. In the non-limiting example shown in FIG. 5, send and receive circuitry 530 includes circuitry 535 for sending and receiving data using Wi-Fi, including, without limitation, at 900 MHz, 2.4 GHz and 5.0 GHz. Additionally, send and receive circuitry 530 includes circuitry, such as Ethernet circuitry 540, for sending and receiving data (for example, presence sensor data) over a wired connection. In some embodiments, send and receive circuitry 530 further comprises circuitry for sending and receiving data using other wired or wireless communication protocols, such as Bluetooth Low Energy or Zigbee circuitry.
Additionally, according to certain embodiments, send and receive circuitry 530 includes a network interface 550, which operates to interconnect master control device 500 with one or more networks. Network interface 550 may, depending on embodiments, have a network address expressed as a node ID, a port number or an IP address. According to certain embodiments, network interface 550 is implemented as hardware, such as by a network interface card (NIC). Alternatively, network interface 550 may be implemented as software, such as by an instance of the java.net.NetworkInterface class. Additionally, according to some embodiments, network interface 550 supports communications over multiple protocols, such as TCP/IP as well as wireless protocols, such as 3G or BLUETOOTH.
FIG. 6 illustrates operations of a method 600 for determining tracks associated with moving occupants of a physical space according to various embodiments of this disclosure. While the flow chart depicts a series of sequential steps, unless explicitly stated, no inference should be drawn from that sequence regarding specific order of performance, performance of steps or portions thereof serially rather than concurrently or in an overlapping manner, or performance of the steps depicted exclusively without the occurrence of intervening or intermediate steps.
In the non-limiting example shown in FIG. 6, the operations of method 600 are carried out by “job workers” or processes orchestrated by a gateway or master control device (for example, master control device 500 in FIG. 5). Other embodiments are possible, including embodiments in which the described operations are performed across a variety of machines, including physical and virtual computing platforms.
According to some embodiments, method 600 includes operation 605, wherein a first plurality of electrical signals is received by an input/output interface (for example, I/O interface 505 in FIG. 5) of a master control device from presence sensors (for example, a self-configuring array of presence sensors, such as certain embodiments of end devices 120a-120j in FIG. 1) in a physical space under analysis. While not required, in some embodiments, the first plurality of electrical signals is received at multiple points in time, based on several scans of the presence sensors in the physical space by the master control device. Further, in the non-limiting example shown in FIG. 6, as part of operation 605, the received analog electrical signals may be digitized (for example, by ADC 510 in FIG. 5) and stored in a memory (for example, memory 520 in FIG. 5 or database 230 in FIG. 2).
In some embodiments, method 600 includes operation 610, wherein the master control device generates background sensor values. As part of operation 610, the master control device maps the presence sensor signals received at operation 605 to sensor values in a coordinate system for the physical space (for example, the grid type coordinate system 800 in FIG. 8). In some cases, each trace of the presence sensor corresponds to a value on a coordinate axis for the physical space, and each intersection of traces corresponds to a “pixel” having a location in the physical space. The mapping of coordinate values comprises pairing the traces from which each signal of the first plurality of electrical signals was received to identify a “pixel,” or location in the physical space, associated with the received presence sensor signals.
In the non-limiting example shown in FIG. 6, background sensor values mapped to the coordinate system for the physical space are generated in one of at least two ways. In one set of embodiments, the first plurality of electrical signals is received over a time known to be a period of low activity in the physical space (for example, in cases where the physical space is a store, when the store is closed). In such cases, the sensor values collected during periods of inactivity may be assumed to be generated by furniture and other static actors in the space and comprise the background sensor values for the physical space. In another set of embodiments, the master control device categorizes the sensor values as “fast” and “slow” and maintains a running estimate of “foreground” and “background” sensor values by fitting two normal distributions to each pixel with “fast” and “slow” responses.
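The second, running-estimate approach can be sketched per pixel with a pair of exponential moving averages. This is an illustrative simplification, assuming simple averages stand in for the fitted normal distributions the disclosure describes; the class name and smoothing constants are hypothetical.

```python
class PixelBackground:
    """Running 'fast'/'slow' estimates for a single pixel, sketching the
    foreground/background separation described above. Smoothing constants
    are illustrative."""

    def __init__(self, alpha_fast=0.5, alpha_slow=0.01):
        self.fast = 0.0   # tracks transient (foreground) activity
        self.slow = 0.0   # tracks static loads: furniture, sensor drift
        self.a_fast = alpha_fast
        self.a_slow = alpha_slow

    def update(self, reading: float) -> float:
        """Fold in one scan's reading; return the background-subtracted
        (foreground) estimate for this pixel."""
        self.fast += self.a_fast * (reading - self.fast)
        self.slow += self.a_slow * (reading - self.slow)
        return self.fast - self.slow
```

Under a constant load (for example, a chair leg), both averages converge and the foreground estimate decays to zero; a sudden footstep moves the fast average far more than the slow one, producing a large foreground value.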
According to various embodiments, method 600 includes operation 615, wherein the master control device receives a second plurality of electrical signals comprising presence sensor signals at multiple points in time, such as presence sensor signals received from two or more “scans” of the presence sensors by the master control device. As in operation 605, the second plurality of electrical signals may include an analog component that is digitized (for example, by ADC 510 in FIG. 5) and stored in a memory (for example, memory 520 in FIG. 5 or database 230 in FIG. 2).
In some embodiments, method 600 includes operation 620, wherein the master control device generates, based on the second plurality of electrical signals from the presence sensors, sensor values mapped to “pixels” within the coordinate system and points in time. For example, a first sensor value generated in operation 620 may be of the general form: (time=10.01 s, x=2, y=4, Ground Pressure=30 lb/in2), and a second sensor value generated in operation 620 may be of the general form (time=10.02 s, x=2, y=4, Ground Pressure=15 lb/in2). In another embodiment, a first sensor value generated in operation 620 may be expressed as a string of the general form: (053104061), wherein the first four digits “0531” correspond to a time value, the fifth and sixth digits (“04”) correspond to an angle in a radial coordinate system, the seventh and eighth digits (“06”) correspond to a distance in the radial coordinate system, and the last digit (“1”) corresponds to the measured state (for example, “on” or “off”) of the presence sensor. Skilled artisans will appreciate that the foregoing examples of sensor values are purely illustrative, and other representations of location, time and presence sensor values are possible and within the intended scope of this disclosure.
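As an illustration, the string-encoded form described above can be decoded with a few string slices. The field widths follow the example; the function name is hypothetical.

```python
def parse_sensor_value(s: str) -> dict:
    """Decode a 9-digit sensor string of the form described above:
    digits 1-4 = time value, digits 5-6 = angle (radial coordinate system),
    digits 7-8 = distance, digit 9 = measured on/off state."""
    return {
        "time": int(s[0:4]),
        "angle": int(s[4:6]),
        "distance": int(s[6:8]),
        "state": bool(int(s[8])),
    }

decoded = parse_sensor_value("053104061")
```

For the example string "053104061", this yields time 531, angle 4, distance 6, and an "on" state.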
In the non-limiting example shown in FIG. 6, method 600 is shown as including operation 625, wherein background sensor values (for example, the sensor values generated at operation 610 shown in FIG. 6) are subtracted from the sensor values generated at operation 620 to produce measurement data associated with the activities of the mobile occupants in the physical space. By subtracting out the background sensor values caused by, for example, furniture placed in the physical space after installation of presence sensors, or damaged presence sensors, the master control device can obtain an unimpeded view of activity within the physical space.
According to some embodiments, method 600 includes operation 630, wherein the master control device associates measurement data (for example, the measurement data generated in operation 625) with one or more moving objects belonging to an object class. In the non-limiting example shown in FIG. 6, the density of traces (and spatial resolution) of the presence sensor is such that the sensor value at each pixel in the coordinate system can be examined in the context of neighboring sensors and time windows to classify the activity associated with the measurement data.
In certain embodiments, the master control device implements a classification algorithm that operates on assumptions about the moving actors in the physical space. For example, in some embodiments, it is an operational assumption that footsteps form, persist on timescales on the order of one or two seconds, and then disappear. As a further example, it is an operational assumption that wheels (such as from wheelchairs, bicycles, carts and the like) roll across a surface in a continuous motion. Working from predetermined rules, which in some embodiments are based on operational assumptions, the measurement data can be associated with moving objects belonging to predefined object classes. In some embodiments, a tracker, corresponding to the location of the moving object in time, is assigned to the moving object based on the measurement data. Further, according to some embodiments, trackers move along tracks, which may be determined paths in a network of nodes in the coordinate system for the physical space.
In a non-limiting example, presence sensors are deployed in a physical space at a density that supports a spatial resolution of approximately 3 inches, and the master control device is configured to scan the presence sensors at intervals of approximately 5 ms (corresponding to a scan rate of 200 Hz). In this example, measurement data for a first point in the coordinate system correlating to a high applied pressure (for example, 200 psi) is generated for a time t=0. Over the course of the next 200 ms, the measurement data shows a decrease in applied pressure at the first point, and a moderate increase in pressure (for example, 20 psi) at one or more points adjacent to the first point. Applying predetermined rules, the master control device associates the generated measurement data with the footstep of a person wearing high heeled shoes and moving generally along a line passing through the first point and the one or more adjacent points.
In another, non-limiting example, with the same scan rate and spatial resolution, at a first time, t=0, measurement data corresponding to a uniform applied pressure at five evenly spaced points in the coordinate system is generated. Over the course of the next five seconds, the measurement data shows five similarly spaced points of contact having approximately the same applied pressure values. Applying predetermined rules, the master control device associates the generated measurement data with the motion of an office chair on five caster wheels moving across the floor.
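A toy version of such rule-based classification, applied to the pressure time series at a single contact point, might look as follows. The thresholds and function name are illustrative, not taken from the disclosure; they merely encode the operational assumptions above (footsteps form, persist for a second or two, then release; wheels and static loads press continuously).

```python
def classify_contact(pressures, scan_dt=0.005, max_step_s=2.0):
    """Classify one contact point's pressure time series using the
    operational assumptions above. `pressures` is one reading per scan
    (scan_dt seconds apart, 5 ms at a 200 Hz scan rate). A contact that
    persists for at most ~2 s and then releases is treated as a footstep;
    anything still pressing at the end of the window is 'continuous'
    (e.g., a rolling wheel or static load)."""
    active = [p > 0 for p in pressures]
    duration = sum(active) * scan_dt     # total time the point was loaded
    released = bool(active) and not active[-1]
    if released and duration <= max_step_s:
        return "footstep"
    return "continuous"
```

A one-second press followed by release classifies as a footstep, while an unbroken press over the whole window classifies as continuous contact.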
In some embodiments, method 600 includes operation 635, wherein the master control device identifies, based on the measurement data, a first node corresponding to a determined location of the moving object (for example, the moving object associated with an object class described with reference to operation 630). In the non-limiting example shown in FIG. 6, a node corresponds to a single value within the coordinate system corresponding to the location, at a given time, of a moving object in the physical space. In many cases, certain moving objects of interest in the physical space (for example, humans wearing shoes) contact the presence sensors at intermittent points in time at non-contiguous points of contact within the physical space. In such cases, nodes, or single points corresponding to the location of the actor, provide an analytical convenience and useful representation of the location associated with multiple pieces of measurement data.
According to some embodiments, a first node corresponding to a determined location of the moving object may be determined by applying a naïve clustering algorithm that clusters measurement data within a specified radius of a tracker and determines a node (such as by calculating a centroid associated with the measurement data) based on the measurement data within the cluster. In some cases, the specified radius is on the order of three feet.
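The naïve clustering step can be sketched as follows, assuming coordinates are expressed in inches so that the three-foot radius becomes 36; the function name is hypothetical.

```python
import math

def cluster_node(tracker, points, radius=36.0):
    """Naive clustering sketch: keep measurement points within `radius`
    inches of the tracker (three feet = 36 in, per the example above) and
    return their centroid as the node. Returns None when no measurement
    data falls inside the radius."""
    near = [p for p in points if math.dist(tracker, p) <= radius]
    if not near:
        return None
    n = len(near)
    return (sum(x for x, _ in near) / n, sum(y for _, y in near) / n)
```

For a tracker at the origin with nearby impressions at (1, 1) and (2, 2) and a distant outlier, the node lands at the centroid (1.5, 1.5) and the outlier is ignored.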
In other embodiments, the first node is determined using another clustering algorithm, such as one of the clustering algorithms provided in the scikit-learn library's sklearn.cluster module. Examples of clustering algorithms suitable for generating the first node include, without limitation, K-Means clustering and Affinity Propagation clustering.
In some embodiments, nodes may be assigned retroactively, based on the application of predetermined rules. For example, in cases where measurement data belonging to a first instance of a moving object class (for example, a footstep associated with a person wearing high-heeled shoes) is observed, a node may be assigned to the nearest door, based on a predetermined rule requiring that occupants of the physical space enter and exit via the doors.
According to various embodiments, method 600 includes operation 640, wherein the master control device generates, based on the measurement data at multiple time points, a track linking the first node (for example, the node determined during operation 635) with another node in the coordinate system for the physical space. In some embodiments, the generation of nodes is based on the application of a recursive algorithm to the measurement data, to smooth out the paths between nodes and to mitigate the effects of noise in the data. In the non-limiting example shown in FIG. 6, recursive algorithms for generating nodes may incorporate a predict/update step where an occupant's predicted location is used to update which footsteps are assigned to a tracker associated with the occupant. In one illustrative embodiment, up to two footsteps are assigned to each tracker. In some embodiments, nodes are generated by implementing a recursive estimation algorithm, such as a Kalman fitter (for example, the Kalman fitter described in FIG. 7).
In the non-limiting example shown in FIG. 6, the generated nodes are connected together in a network to form tracks associated with the path of moving objects and occupants of the physical space. According to some embodiments, the nodes are connected using a network algorithm (for example, the NetworkX package for Python) that generates a graph of nodes and edges connecting the nodes. In the non-limiting example shown in FIG. 6, after finding footsteps (and, where appropriate, wheels or other sources of impression data), these nodes are connected using the network algorithm. Further, to mitigate potential pileup effects, the network links or “edges” are pruned according to distance and time-based penalty terms to find unique tracks through the coordinate system associated with the physical space. In some cases, where there is ambiguity from pileup, track overlap can be represented by increasing the weight of the edges and by allowing tracks to merge and split.
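The edge-pruning idea can be sketched in pure Python. Although the text mentions NetworkX, a self-contained greedy version illustrates the distance and time penalty terms; all thresholds, weights and the function name are hypothetical.

```python
import math

def link_tracks(nodes, max_dist=24.0, max_dt=1.0, w_dist=1.0, w_time=0.5):
    """Greedy sketch of track formation. `nodes` is a list of (t, x, y)
    tuples. For each node, candidate edges to later nodes are pruned when
    the distance or time gap exceeds a threshold, and the surviving edge
    with the lowest combined distance/time penalty is kept."""
    edges = []
    for i, (t1, x1, y1) in enumerate(nodes):
        best = None
        for j, (t2, x2, y2) in enumerate(nodes):
            dt = t2 - t1
            if dt <= 0 or dt > max_dt:
                continue                      # prune: wrong order or too stale
            d = math.hypot(x2 - x1, y2 - y1)
            if d > max_dist:
                continue                      # prune: implausibly long stride
            penalty = w_dist * d + w_time * dt
            if best is None or penalty < best[1]:
                best = (j, penalty)
        if best is not None:
            edges.append((i, best[0]))
    return edges
```

Three nodes stepping along a line link into one track, while a simultaneous far-away node is pruned out rather than being spliced into the wrong occupant's path.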
In the non-limiting example shown in FIG. 6, method 600 is shown as including operation 645, wherein a signal associated with the determined track is outputted. According to some embodiments, the output signal may be a running tally of the number of determined tracks in the room, which corresponds generally to the number of occupants in the room. According to other embodiments, the output signal may comprise a plot of the determined tracks at a given time point, or a map of “hot spots” of high human traffic in the physical space. According to still other embodiments, the signal outputted at operation 645 is a control signal for an electrical appliance or other feature of the physical space (e.g., a window shade, door or lock) whose operation can be controlled based, at least in part, on a signal from a master control device according to various embodiments of this disclosure. For example, in one embodiment, the determined tracks may show the occupants of a physical space moving towards a particular region of the space (for example, near a television or screen showing a news item or sporting event of broad interest), and the master control device may output a control signal to the HVAC system (for example, HVAC system 125 shown in FIG. 1) increasing the power of the HVAC system in a particular region of the room.
FIG. 7 illustrates operations of a Kalman fitter 700 according to certain embodiments of this disclosure. While the flow chart depicts a series of sequential steps, unless explicitly stated, no inference should be drawn from that sequence regarding specific order of performance, performance of steps or portions thereof serially rather than concurrently or in an overlapping manner, or performance of the steps depicted exclusively without the occurrence of intervening or intermediate steps. The Kalman fitter 700 described with reference to the non-limiting example shown in FIG. 7 is one example of an algorithm for generating nodes encompassed by this disclosure. In some embodiments, Kalman fitter 700 provides the benefit of managing noise from the sensors and determining less “jittery” tracks associated with moving objects within the physical space.
According to some embodiments, Kalman fitter 700 is a recursive estimation algorithm and includes operation 705, wherein a master control device (for example, master control device 105 in FIG. 1) assigns a tracker to a moving object belonging to a determined object class. In some embodiments of this disclosure, a tracker corresponds to a point coordinate for a person, object or other moving entity of interest that contacts presence sensors at multiple points (for example, a mail cart on casters) or discontinuous intervals (for example, a walking human).
In some embodiments, Kalman fitter 700 includes operation 710, wherein the master control device receives measurement data (for example, a set of clustered impression data points corresponding to one or more possible directions of motion for the moving object that is being tracked) corresponding to the state of the moving object at a first time, T1. Information regarding the state of the moving object at first time T1 can include, without limitation, information as to the moving object's location, apparent direction of motion and apparent rate of motion. In some embodiments, the information as to the moving object's location, apparent direction and rate of motion is determined based on footstep and stride analysis of presence sensor data assumed by the master control device to be footsteps. In other embodiments, the measurement data corresponding to the state of the moving object at a time T1 comprises only the moving object's location within the physical space.
In some embodiments, Kalman fitter 700 is a recursive estimation process, and operation 710 marks the start of a loop repeated for a period relevant to the operation of one or more environmental control systems of a physical space, or of other analytical interest (for example, the interval beginning when a tracker associated with a human being in the physical space is assigned, and ending when the human being is determined to have departed the physical space, such as by leaving the room).
In the non-limiting example shown in FIG. 7, Kalman fitter 700 includes operation 715, wherein the master control unit predicts, based on the measurement data corresponding to the state of the moving object at time T1, measurement data corresponding to the state of the moving object at a subsequent time, T2. As part of operation 715, the master control device may also determine an uncertainty value associated with the predicted measurement data at time T2. In certain embodiments, the uncertainty associated with the predicted measurement data corresponding to the state of the moving object at time T2 may be expressed as, or determined from, an uncertainty matrix associated with the measurement data.
According to certain embodiments, Kalman fitter 700 includes operation 720, wherein the master control device receives measurement data corresponding to the state of the moving object at time T2. In the non-limiting example shown in FIG. 7, the values of measurement data received as part of operation 720 correspond to fields of measurement data received at operation 710 and predicted at operation 715.
In some embodiments, Kalman fitter 700 further includes operation 725, wherein the master control device updates the measurement data corresponding to the moving object at time T2 based on the predicted measurement data corresponding to the state of the moving object at time T2. In certain embodiments, the updating of the recorded measurement data at time T2 based on the predicted measurement data for time T2 comprises taking a weighted average of the values of the recorded measurement data with the predicted values of the measurement data at time T2. In the non-limiting example shown in FIG. 7, the relative weights of the recorded and predicted values of the measurement data are determined based on the uncertainty value or uncertainty matrix associated with the predicted value at operation 715. As noted elsewhere in this disclosure, in some embodiments, Kalman fitter 700 implements a recursive estimation method. According to such embodiments, after operation 725, the method returns to operation 710, using the updated values of the measurement data corresponding to the moving object at time T2 as an initial value for a subsequent prediction.
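The predict/update cycle of operations 715-725 can be sketched as a scalar recursion. This is a minimal sketch under simplifying assumptions, not the disclosure's implementation: a full tracker would carry a position/velocity state and an uncertainty matrix, and the noise values here are illustrative.

```python
def kalman_step(x, p, z, q=0.01, r=1.0):
    """One predict/update cycle of a scalar Kalman-style estimator.
    x: current estimate; p: its variance; z: new measurement;
    q, r: illustrative process and measurement noise values."""
    # Predict (operation 715): carry the state forward; uncertainty grows
    x_pred, p_pred = x, p + q
    # Update (operation 725): weighted average of prediction and measurement,
    # with weights set by the relative uncertainties (the gain k)
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

Iterating this step over a stream of noisy position measurements pulls the estimate toward the measurements while damping scan-to-scan jitter, which is the "less jittery tracks" benefit noted above.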
FIGS. 8A-8I illustrate aspects of a method for determining tracks based on presence data according to certain embodiments of this disclosure. FIGS. 8A-8I illustrate activity in a coordinate system corresponding to a person entering a room and walking through the room, and how certain embodiments according to this disclosure determine a track corresponding to the person's motion into and through the room. Specifically, FIGS. 8A-8I depict activity in a coordinate system for the physical space (e.g., a room), beginning with an "empty" (noise and background presence sensor values) coordinate system for the physical space, followed by the detection of presence sensor data at an initial time, the assignment of a tracker, the detection of additional presence sensor data at a subsequent time, and the determination of tracks connecting nodes within the coordinate system for the physical space.
FIG. 8A illustrates a coordinate system 800 for a physical space at an initial time. The embodiment of the coordinate system 800 shown in FIG. 8A is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 8A, the axes of coordinate system 800 are based on the direction of the traces in two separate layers (for example, layers 315 and 325 shown in FIG. 3) of conductive mat presence sensors installed in the physical space. According to certain embodiments, coordinate system 800 provides a representation of the physical space after the "background" presence sensor values caused by furniture, noise and other factors have been subtracted out (for example, by performing operation 625 shown in FIG. 6).
FIG. 8B illustrates activity in the coordinate system 800 for the physical space at a time subsequent to the time shown in FIG. 8A. The embodiment of the coordinate system 800 shown in FIG. 8B is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 8B, a person has just entered the physical space and made her first footstep in the room. Measurement data 805 corresponding to electrical signals generated at one or more presence sensors in the physical space has been mapped to a location in the coordinate system 800 for the physical space. In this particular example, the measurement data 805 is represented as a shaded region, indicating that electrical signals were generated by presence sensors in the shaded region. Other representations of measurement data are possible, and include, without limitation, dots corresponding to overlap points between traces in layers of a resistive mat through which a current or potential change was detected.
FIG. 8C illustrates activity in the coordinate system 800 for the physical space subsequent to mapping measurement data 805 to a location in coordinate system 800. The embodiment of the coordinate system 800 shown in FIG. 8C is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 8C, measurement data 805 has been associated with a moving object belonging to an object class (in this particular example, a walking human), and a tracker 810 has been assigned to the moving object. In FIG. 8C, tracker 810 corresponds to a single point in the coordinate system (the single point is shown as a black dot within a dotted line included to help distinguish the tracker from other entities in coordinate system 800).
FIG. 8D illustrates activity in the coordinate system 800 for the physical space subsequent to assigning a tracker to the human moving in the physical space. The embodiment of the coordinate system 800 shown in FIG. 8D is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 8D, the initial position of the tracker in the coordinate system 800 has been designated as a first node 815 and the start of a new track for the tracker assigned to the human moving in the physical space. Additionally, a master control device (for example, master control device 105 in FIG. 1) connected to the presence sensors in the physical space implements a Kalman filter (for example, Kalman filter 700 described with reference to FIG. 7) and predicts the location of the tracker at a subsequent time, T2. In this particular example, the predicted position of the tracker at subsequent time T2 is shown by unshaded circle 820.
In some embodiments, the recursion rate of a Kalman filter is the same as the rate at which a master control device scans for electrical signals from presence sensors. In other embodiments, for example, where moving objects' interactions (such as footsteps) occur over intervals that are significantly longer than the scan interval, the recursion rate of a Kalman filter may be lower than the scan rate for the presence sensors.
FIG. 8E illustrates activity in the coordinate system 800 for the physical space at time T2. At time T2, additional measurement data 825 associated with the tracked human has been received and mapped to a location within the coordinate system 800 for the physical space. The embodiment of the coordinate system 800 shown in FIG. 8E is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
FIG. 8F illustrates activity in the coordinate system 800 for the physical space at a time subsequent to time T2. The embodiment of the coordinate system 800 shown in FIG. 8F is for illustration only and other embodiments could be used without departing from the scope of the present disclosure. FIG. 8F depicts that tracker 810 has moved to a second node corresponding to a position for the tracked human determined based on the predicted position 820 of the tracked human at time T2 and the measurement data 825 received at time T2. In the non-limiting example shown in FIG. 8F, the location of the second node to which tracker 810 has been moved is determined based on a weighted average of the predicted position 820 and measurement data 825, wherein the weighting is based, at least in part, on an uncertainty value determined for predicted position 820.
According to certain embodiments, the master control device performs a determination as to whether the newly determined position of tracker 810 satisfies one or more predetermined conditions, such as expected changes in time or distance between nodes, or conditions indicating possible pileups of nodes or tracks. If the predetermined conditions are determined to have been satisfied, the master control device creates track 830 connecting the first and second nodes.
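As a non-limiting illustration, the gating check described above might be sketched as follows. The specific thresholds (`MAX_STEP_DISTANCE`, `MIN_NODE_SEPARATION`) are hypothetical values assumed for this sketch, not parameters specified by this disclosure.

```python
import math

# Hypothetical gating check: a new node extends a track only if the step
# between nodes is physically plausible. Thresholds are assumptions chosen
# for illustration (units are arbitrary coordinate-system units).
MAX_STEP_DISTANCE = 1.5     # largest plausible step between recursions
MIN_NODE_SEPARATION = 0.05  # reject near-duplicate ("pileup") nodes

def should_connect(prev_node, new_node):
    """Return True if a track segment should connect the two nodes."""
    d = math.dist(prev_node, new_node)
    return MIN_NODE_SEPARATION <= d <= MAX_STEP_DISTANCE

track = [(0.0, 0.0)]          # first node
candidate = (0.6, 0.4)        # newly determined tracker position
if should_connect(track[-1], candidate):
    track.append(candidate)   # create the segment connecting the nodes
```

A step of zero (a pileup) or an implausibly long jump would fail the check, and no track segment would be created.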
FIG. 8G illustrates activity in the coordinate system 800 for the physical space at the start of a new recursion of the Kalman filter, in which the predicted location 835 of the moving human in the physical space at a new subsequent time T3 is determined based on the position of tracker 810 at time T2. The embodiment of the coordinate system 800 shown in FIG. 8G is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
FIG. 8H illustrates activity in the coordinate system 800 for the physical space at time T3. The embodiment of the coordinate system 800 shown in FIG. 8H is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
At time T3, the master control device receives additional measurement data 840 from presence sensors and maps the additional measurement data 840 to a location within the coordinate system 800 for the physical space. Additionally, the master control device applies a clustering algorithm (for example, one of the clustering algorithms described with reference to operation 635 in FIG. 6) that clusters measurement data 825 and 840 based on the physical and temporal proximity of the measurement data, and assigns a point coordinate for the clustered measurement data 845. For the purposes of implementing the Kalman filter, the point coordinate for the clustered measurement data 845 is the measurement data for time T3.
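The proximity-based clustering step can be sketched as follows. This is a non-limiting illustration only: the greedy single-pass grouping and the thresholds (`max_dist`, `max_dt`) are assumptions, not the specific clustering algorithm of operation 635.

```python
# Sketch of proximity clustering: group measurement points that are close in
# both space and time, then reduce each cluster to one point coordinate that
# serves as the measurement for the filter. Thresholds are assumptions.

def cluster_points(points, max_dist=0.5, max_dt=1.0):
    """points: list of (x, y, t). Greedy single-pass grouping by proximity."""
    clusters = []
    for x, y, t in points:
        for c in clusters:
            cx, cy, ct = c[-1]
            near_in_time = abs(t - ct) <= max_dt
            near_in_space = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 <= max_dist
            if near_in_time and near_in_space:
                c.append((x, y, t))
                break
        else:
            clusters.append([(x, y, t)])  # start a new cluster
    return clusters

def centroid(cluster):
    """Point coordinate representing the cluster."""
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)

clusters = cluster_points([(0.0, 0.0, 0.0), (0.2, 0.1, 0.3), (5.0, 5.0, 0.2)])
```

Here two nearby footstep readings merge into one cluster whose centroid becomes the measurement, while the distant third point forms its own cluster.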
FIG. 8I illustrates activity in the coordinate system 800 for the physical space at a time subsequent to time T3. The embodiment of the coordinate system 800 shown in FIG. 8I is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 8I, the tracker moves to a new node determined based on a weighted average of the predicted location of the moving human at time T3 and the clustered measurement data. Further, the master control device performs a determination as to whether the newly determined position of tracker 810 satisfies one or more predetermined conditions, such as expected changes in time or distance between nodes, or conditions indicating possible pileups of nodes or tracks. If the predetermined conditions are determined to have been satisfied, the master control device creates track 850 connecting the second and third nodes.
According to certain embodiments, the method described with reference to FIGS. 8A-8I recurs until a terminal condition, such as a determination that the tracked human has left the physical space, is satisfied. Further, in some embodiments, the master control device outputs the determined tracks, data derived from the determined tracks, or control signals (such as turning a light on or off) based on the determined tracks.
FIG. 9 illustrates aspects of an implementation 900 of a smart building control system using multidimensional presence sensors according to certain embodiments of the present disclosure. The embodiment of the implementation 900 shown in FIG. 9 is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
Referring to the non-limiting example shown in FIG. 9, implementation 900 comprises one or more presence sensors 905 situated in a first spatial dimension of a physical space (in this case, floor 910), and one or more presence sensors 915 situated in a second spatial dimension of the physical space (in this case, mounted above floor 910), communicatively connected to a gateway, or master control device 920. Presence sensors 905 and 915 are configured to generate measurement data based on the activity of objects 925 within the physical space.
In some embodiments, the operation of master control device 920 is enhanced when master control device 920 receives presence sensor data from more than one vantage point, or dimension, of the physical space. For example, ceiling mounted presence sensors may, by virtue of their location in the physical space and the technologies that can be employed in a sensor not subject to foot traffic, be better able to discriminate between living occupants of a physical space and inanimate objects moving in the space. By the same token, floor mounted presence sensors may, by virtue of their location and construction, be able to collect user impression data (for example, footsteps and wheel prints) at a high level of spatial resolution.
According to certain embodiments, the control of a smart building may be enhanced by using occupant movement data collected across multiple dimensions of a physical space to more accurately associate classes to objects moving within the physical space. As a non-limiting example, consider a person operating a wheelchair. From just the perspective of a floor mounted presence sensor, such a person may not be reliably distinguishable from other wheeled objects presenting a similar footprint (for example, a heavily laden file cart). From just the perspective of a ceiling mounted sensor, the person's use of a wheelchair may not be apparent. Given the expanding heterogeneity of actors moving in a physical space, which can be expected to include autonomous vehicles and the like, improvements in the granularity with which the classes of moving objects in a room can be identified translate into improvements in the operation of a "smart building." Put differently, a building is smarter when it can assign one set of control inputs (for example, turning the air conditioning up) in response to a person in a wheelchair entering a room, and another set of control inputs (for example, turning the air conditioning down) in response to an autonomous vehicle having a footprint similar to a wheelchair moving within the same room.
FIG. 10 illustrates a presence sensor suitable for use in an above-the-floor dimension of a physical space. The embodiment of the presence sensor shown in FIG. 10 is for illustration only and other embodiments could be used without departing from the scope of the present disclosure.
In the non-limiting example shown in FIG. 10, the presence sensor is housed in a lightbulb 1000. Other embodiments are possible, and presence sensors suitable for above-ground use may variously be housed in ceiling speakers or ceiling fans, or deployed as standalone sensors. While housing sensors in lightbulbs offers clear benefits in terms of ease of installation and providing power for an above-ground presence sensor, other embodiments are possible and within the contemplated scope of this disclosure.
According to certain embodiments, light emitting element 1005 is a filament or light emitting diode suitable for converting electrical current into visible light broadcast across the physical space.
In some embodiments, embedded sensor 1010 is an electronic sensor powered from the same current source as light emitting element 1005, which is capable of detecting the presence of moving objects within a predefined space. Further, embedded sensor 1010 is, in certain embodiments, configured to distinguish between living and inanimate objects. According to certain embodiments, embedded sensor 1010 utilizes one or more of the following object detection technologies: RF emission, thermal imaging or sonar.
In the non-limiting example shown in FIG. 10, wireless module 1015 is a wireless communication interface between embedded sensor 1010 and a gateway or master control device (for example, master control device 920 in FIG. 9). In some embodiments, wireless module 1015 is powered from the same current source as light emitting element 1005. According to certain embodiments, wireless module 1015 communicates with master control device 920 via one or more of the following wireless communication protocols: ZigBee, Bluetooth, Bluetooth Low Energy, or Wi-Fi.
FIG. 11 describes operations of a method 1100 for smart building control according to certain embodiments of this disclosure. While the flow chart depicts a series of sequential steps, unless explicitly stated, no inference should be drawn from that sequence regarding specific order of performance, performance of steps or portions thereof serially rather than concurrently or in an overlapping manner, or performance of the steps depicted exclusively without the occurrence of intervening or intermediate steps. In the non-limiting example shown in FIG. 11, operations of method 1100 are carried out by "job workers," or processes orchestrated by a gateway or master control device (for example, master control device 500 in FIG. 5, or master control device 920 in FIG. 9). Other embodiments are possible, including embodiments in which the described operations are performed across a variety of machines, including physical and virtual computing platforms.
According to some embodiments, method 1100 comprises operation 1105, wherein the master control device obtains first measurement data for a zone of a physical space based on signals from a first group of sensors. In the non-limiting example shown in FIG. 11, the first group of physical sensors is disposed in a first dimension, or perspective, of the physical space. In this example, the first group of sensors comprises resistive mat presence sensors (for example, sensor 300 in FIG. 3), and the measurement data comprises data culled from a stream of event-related signals (for example, data based on changes in current associated with feet and wheels compressing the sensor at mappable locations, such as the measurement data obtained at operation 625 in FIG. 6).
As used herein to describe the non-limiting example of method 1100, the term "zone" encompasses a region in a coordinate system for the physical space covered by a specific subset of sensors in a first dimension of the physical space (for example, the floor), and a specific subset of sensors in a second dimension of the physical space. In some embodiments, the sensors in both dimensions of the physical space have equivalent spatial resolutions, and the coordinate system for the physical space may be applied from the perspective of either dimension of the physical space. In other embodiments, sensors in one dimension may have a more granular spatial resolution (for example, presence sensor 300 in FIG. 3, which, in some embodiments, has a spatial resolution of at least 2″×2″), while sensors in another dimension may have a coarser spatial resolution (for example, ceiling mounted thermal imaging sensors, which may perceive objects in the physical space as warm or cold "blobs"). In such cases, the coordinate system for the physical space may be based off of the first dimension, and the zone serves as an analytical construct to identify regions where shared coverage between the heterogeneous floor and ceiling sensors is possible.
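One non-limiting way to realize the "zone" construct computationally is a simple coordinate-to-zone mapping. The rectangular, uniformly sized zones assumed here (`ZONE_WIDTH`, `ZONE_HEIGHT`) are hypothetical choices for illustration; actual zone geometry would depend on sensor placement.

```python
# Hypothetical sketch of the "zone" construct: map a fine-grained floor
# coordinate to the zone shared by a floor-sensor subset and a ceiling-sensor
# subset. Uniform rectangular zones are an assumption for illustration.

ZONE_WIDTH = 4.0   # assumed zone size, in floor-coordinate units
ZONE_HEIGHT = 4.0

def zone_for(x, y):
    """Return the (column, row) index of the zone containing a coordinate."""
    return (int(x // ZONE_WIDTH), int(y // ZONE_HEIGHT))

# A fine-grained floor contact and a coarse thermal "blob" reported anywhere
# in the same zone can then be compared at the zone level.
floor_zone = zone_for(5.2, 1.7)
```

The zone index, rather than the raw coordinate, is what the heterogeneous floor and ceiling measurements have in common.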
In some embodiments, at operation 1110, the master control device obtains second measurement data for the zone of the physical space based on signals from a second group of presence sensors. In the non-limiting example shown in FIG. 11, the first and second groups of presence sensors are disposed in different dimensions of the physical space (for example, the first group of presence sensors is situated in the floor, while the second group of presence sensors is situated in the ceiling or suspended therefrom).
As noted above, according to certain embodiments, the presence sensors within the physical space are heterogeneous, with the presence sensors of the first group being responsive to different motion events than the sensors of the second group, and the sensors within groups potentially differing in their performance characteristics (for example, spatial resolution and coverage area).
In the non-limiting example shown in FIG. 11, the second group of presence sensors are thermal sensors (for example, lightbulb 1000 in FIG. 10, wherein the embedded sensor is an infrared (IR) sensor), and the second measurement data obtained at operation 1110 comprises information as to the motion of exothermic objects (for example, people and animals) in the zone.
According to certain embodiments, at operation 1115, the master control device identifies one or more moving objects within the physical space. In one embodiment, objects within the zone of the physical space may be identified based on measurement data from one dimension of the room (for example, objects may be identified by clustering sets of floor contact events). In other embodiments, and where the spatial resolution of the first group and second group of presence sensors supports doing so (for example, in embodiments where the first group of presence sensors are resistive floor mats and the second group of presence sensors are ceiling or wall mounted digital video cameras), the identification of moving objects within the physical space may be performed using measurement data from multiple groups of presence sensors.
In various embodiments according to this disclosure, at operation 1120, the master control device associates each of the one or more identified moving objects within the physical space with an instance of an object class. According to certain embodiments, instances of object classes may comprise a top-level genus classification, with one or more species or sub-genus classifications. Further, in some embodiments, one or more features may be recognized from the first and second measurement data, and the master control device determines the object class(es) most probably associated with the measurement data.
For example, in embodiments in which the presence sensors in the first dimension of the physical space are adapted to measuring the pressure and location of floor contact, and the presence sensors in the second dimension of the physical space are adapted to measuring heat, at operation 1120, the master control device may express the association of the first and second measurement data with an object class as shown below:
TABLE 1

| First Measurement Data | Second Measurement Data                    | Object Class - Genus | Object Class - Species | Probability |
| (+20° F.)              | 4 contact events/200 pounds total pressure | Exotherm             | Human in wheelchair    | 67%         |
| (−1° F.)               | 4 contact events/200 pounds total pressure | Inanimate            | File cart              | 34%         |
| (+23° F.)              | 4 contact events/45 pounds total pressure  | Exotherm             | Canine                 | 75%         |
According to certain embodiments, predetermined rules or models are applied to the first and second measurement data to identify one or more object classes to which the moving object in the room belongs. In the non-limiting example shown in Table 1 above, for each moving object in the zone, the first measurement data comprises thermal sensor data from heat sensors housed in lightbulbs. In this particular example, the first measurement data is expressed as the temperature of the moving object relative to an ambient or background temperature. For example, "+20° F." indicates first measurement data showing a moving object having a surface temperature 20 degrees higher than the background or room temperature. In this particular example, the second measurement data is taken from pressure sensors in the floor of the physical space, and represents a total pressure value across clustered floor contact events. For example, a human in a wheelchair having two main wheels and two smaller, castered wheels at the front would register four contact events (e.g., one event per wheel) from which a total pressure applied to the floor can be determined. Applying predetermined rules to the first and second measurement data, one or more object classes can be associated with the moving object. In the example of Table 1, at least two classes are associated with each moving object. Moving objects are assigned to a value in a first, genus-level classification, such as "exotherm" or "inanimate." Additionally, moving objects are assigned to a value in a second, species-level classification, such as "file cart" or "canine." Additionally, in the non-limiting example of Table 1, as part of operation 1120, the master control device calculates a certainty probability associated with the object class(es) assigned to the moving object. In some embodiments, the certainty probability is used for retraining and refining classification models used to associate moving objects with object classes.
According to some embodiments, predetermined rules may be able to determine associations between moving objects that would otherwise be separately tracked. For example, a model could associate a canine closely following the same human around a physical space as a service dog.
In some embodiments, the predetermined rules applied to the first and second measurement data are manually determined (for example, where the first measurement data shows especially high surface temperatures and the second measurement data shows contact events fitting a given profile, the moving object is determined to be a dog, which is lighter than a human but has a higher body temperature). In other embodiments, the predetermined rules can be developed by training a model (for example, a classification algorithm) on a large data set.
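A manually determined rule set in the spirit of Table 1 might be sketched as follows. This is a non-limiting illustration: the temperature and pressure thresholds, and the probability values attached to each rule, are assumptions for this sketch rather than values prescribed by this disclosure.

```python
# Hypothetical manually determined rules in the spirit of Table 1: classify a
# moving object from a thermal delta (first measurement data) and clustered
# floor-contact data (second measurement data). All thresholds and
# probabilities below are illustrative assumptions.

def classify(temp_delta_f, contact_events, total_pressure_lb):
    """Return (genus, species, certainty probability) for a moving object."""
    if temp_delta_f < 2.0:
        # Near-ambient surface temperature suggests an inanimate object.
        return ("Inanimate", "File cart", 0.34)
    if contact_events == 4 and total_pressure_lb < 100:
        # Light, warm, four-point contact profile suggests a dog.
        return ("Exotherm", "Canine", 0.75)
    if contact_events == 4:
        # Heavier warm object on four wheels suggests a wheelchair user.
        return ("Exotherm", "Human in wheelchair", 0.67)
    return ("Exotherm", "Human walking", 0.50)

genus, species, p = classify(20.0, 4, 200.0)
```

In practice, such hand-written thresholds would be replaced or refined by a trained classification model, with the certainty probability fed back for retraining as described above.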
According to certain embodiments, at operation 1125, the master control device determines, for each moving object, a track within a coordinate system for the physical space. In the non-limiting example shown in FIG. 11, the track is determined using the coordinate system defined by the group of presence sensors with the highest spatial resolution (for example, coordinate system 800 in FIG. 8A). However, in other embodiments, the track may be determined in multiple coordinate systems, or in the coordinate system with the lower spatial resolution.
In the non-limiting example shown in FIG. 11, at operation 1130, the master control device outputs, via an input-output interface, a signal (for example, the signal output in operation 645 in FIG. 6) associated with the one or more determined tracks. According to certain embodiments, the signal output at operation 1130 comprises at least one of a control signal for an electrical or electronic appliance in the physical space (for example, a light or a climate control device, such as a fan or air conditioner), or an updated track showing the associated object class and current and/or historical position of the moving objects in the physical space.
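For illustration, mapping a determined track's object class to a control signal might look like the following sketch. The class names and the air-conditioning rules are hypothetical, echoing the wheelchair and autonomous-vehicle example given earlier in this disclosure.

```python
# Hypothetical mapping from a tracked object's class to a control signal.
# The class labels and rules are assumptions used only for illustration.

def control_signal(object_class):
    """Return a control signal (as a dict) for an appliance in the space."""
    if object_class == "Human in wheelchair":
        return {"hvac": "increase_cooling"}   # e.g., turn air conditioning up
    if object_class == "Autonomous vehicle":
        return {"hvac": "decrease_cooling"}   # e.g., turn air conditioning down
    return {}  # no control action for other classes

signal = control_signal("Human in wheelchair")
```

The empty dict for unrecognized classes reflects the filtering behavior described later, where tracks of some objects (e.g., an office chair) generate no output signal.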
FIGS. 12A-12G illustrate aspects of a method for determining tracks from multidimensional presence sensors according to certain embodiments of this disclosure. The embodiments of the method for determining tracks shown in FIGS. 12A-12G are for illustration only and other embodiments could be used without departing from the scope of the present disclosure. FIGS. 12A-12G illustrate a zone in which three moving objects are detected and associated with object classes based on first and second measurement data, and tracks associated with the movement of each object in a coordinate system of the physical space are determined.
FIG. 12A illustrates a coordinate system 1200 for a physical space prior to the detection of any moving objects in the space. In the non-limiting example shown in FIG. 12A, the axes of coordinate system 1200 are based on the direction of traces in two separate layers (for example, layers 315 and 325 shown in FIG. 3) of resistive mat presence sensors installed in the physical space. In this explanatory example, the resistive mat presence sensors comprise a first group of presence sensors in a first dimension of the physical space.
According to certain embodiments, coordinate system 1200 provides a representation, in one dimension of the space, of the physical space after the "background" presence sensor values caused by furniture, noise and other factors have been subtracted out (for example, by performing operation 625 shown in FIG. 6). In this non-limiting example, zone boundaries 1205a and 1205b define four zones, or regions of the physical space where measurement data from groups of presence sensors are obtained and used to generate output signals from the master control device.
FIG. 12B illustrates a second group of presence sensors 1210 in a second dimension of the physical space. In this explanatory example, the presence sensors are thermal imaging sensors housed in lightbulbs. The location of each presence sensor 1215 and its area of coverage 1220 are shown relative to zone boundaries 1205a and 1205b. In some embodiments, if the entirety of the portion of the physical space represented by the coordinate system is covered by a group of presence sensors in one dimension, there is no requirement that the presence sensors in another dimension of the physical space fully cover the coordinate system. In this non-limiting example, the second group of presence sensors does not (and is not required to) cover the entirety of the physical space. Further, as discussed elsewhere in this disclosure, presence sensors within a group of presence sensors can be heterogeneous. For example, in FIG. 12B, the coverage area of presence sensor 1225 is smaller than coverage area 1220 for presence sensor 1215.
FIG. 12C illustrates the superposition of the second group of presence sensors relative to coordinate system 1200. In some embodiments, the second group of presence sensors is positioned according to regular intervals of the coordinate system. In other embodiments, such as where the second group of presence sensors is retrofitted into existing features of the physical space (for example, existing light sockets), it may not be possible to position the second group of presence sensors according to regular intervals of the coordinate system. By performing the association of moving objects within the physical space based on measurement data from the first and second groups of presence sensors at the zone level, rather than the coordinate system level, challenges associated with representing data from the second group of presence sensors in the coordinate system may be avoided.
FIG. 12D illustrates the superposition of a second group of presence sensors relative to coordinate system 1200, along with three identified moving objects 1235a, 1235b and 1235c. As shown in FIG. 12D, moving object 1235a is in the coverage area of presence sensor 1215, while moving objects 1235b-c are in the coverage area of presence sensor 1240. At the moment shown in FIG. 12D, the master control device is receiving first and second measurement data from the first and second groups of sensors, but has not yet associated any of moving objects 1235a-c with an object class. In this illustrative example, first and second measurement data is used to associate each of moving objects 1235a-c with an object class.
FIG. 12E illustrates, from a different vantage point, the moment shown in FIG. 12D. As shown in FIG. 12E, each of moving objects 1235a-c is in contact with a floor 1245 in which the first group of presence sensors is embedded. Additionally, moving objects 1235b-c are in the coverage zone of presence sensor 1240, while moving object 1235a is in the coverage zone of presence sensor 1215. By collecting data regarding each of moving objects 1235a-c from two vantage points (in this non-limiting example, the floor and the ceiling), the master control device may more readily confirm that moving objects 1235a and 1235b are walking humans, and that moving object 1235c is an office chair (as opposed to a human in a wheelchair, or another object presenting analogous contact information to the presence sensors in floor 1245).
FIG. 12F illustrates a plot of each of moving objects 1235a-c in coordinate system 1200 after each moving object has been associated with an object class. According to certain embodiments, the master control device continues to implement a zone-based tracking of moving objects using presence sensors in multiple dimensions of the physical space. According to other embodiments, to save computational resources, once moving objects have been associated with a class of object based on presence sensors from multiple dimensions of a physical space, the master control device tracks the objects using only one group of presence sensors. Both embodiments are possible and within the intended scope of this disclosure.
FIG. 12G illustrates tracks determined for moving objects in coordinate system 1200 at a moment subsequent to the moment shown in FIG. 12F. According to certain embodiments, each of tracks 1250 and 1255 may be determined using methods described in this disclosure (for example, operation 640 in FIG. 6). According to certain embodiments, associating moving objects with object classes can provide a filtering function with regard to the objects for which tracks are determined and used as the basis of output signals. In this particular example, no track was determined for the office chair (moving object 1235c), as the movement of a wheeled office chair was not relevant to the control of any of the electrical or electronic systems in the physical space. In other embodiments (for example, warehouses or mailrooms), inanimate moving objects' activities may be relevant to the control of systems in the "smart building."
None of the description in this application should be read as implying that any particular element, step, or function is an essential element that must be included in the claim scope. The scope of patented subject matter is defined only by the claims. Moreover, none of the claims is intended to invoke 35 U.S.C. § 112(f) unless the exact words “means for” are followed by a participle.