US20200364583A1 - IoT sensor network artificial intelligence warning, control and monitoring systems and methods

IoT sensor network artificial intelligence warning, control and monitoring systems and methods

Info

Publication number
US20200364583A1
Authority
US
United States
Prior art keywords
sensor
remote sensor
station
signals
expert system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/412,383
Inventor
Robert D. Pedersen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US16/412,383
Publication of US20200364583A1
Priority to US17/183,635 (US11244230B2)
Priority to US17/555,038 (US11734578B2)
Priority to US18/217,510 (US12112272B2)
Priority to US18/822,021 (US20240419982A1)
Legal status: Abandoned

Abstract

A first Internet of Things (IoT) sensor network remote sensor station and network comprising a sensor network parameter processing, warning and control system with an electronic, specifically programmed, specialized sensor network communication computer machine including electronic artificial intelligence expert system processing with transceivers and/or electrical or optical connections for communicating with IoT sensors and other different followed or following IoT remote sensor stations. Monitor units collect information from and communicate with remote sensor stations and further analyze collected information with artificial intelligence expert system processing comprising expert input with multiple propositional expert system instructions defining multiple ranges of sensor variables. Further artificial intelligence expert system decision making provides Multiple-Input/Multiple-Output (MIMO) expert system operation with dispatch of electronic and/or optical communication warnings and/or corrective action control signals addressing MIMO urgent and composite degrees of concern. Further artificial intelligence expert system processing comprises one or more of expert system processing, speech recognition, natural language processing, image analysis, fuzzy logic and/or neural network analysis. In some embodiments, hierarchical expert or fuzzy system, neural network and/or adaptive feedback control implementations are disclosed, resulting in substantial simplification of computational requirements.

Description

    BACKGROUND OF THE INVENTION
  • The Internet of Things (IoT) is a network of physical devices or objects (“things”) monitored and/or controlled by distributed sensors, controllers, processors and storage devices interconnected by the Internet. The physical devices or objects may include, for example: materials, objects, persons, areas, terrestrial or airborne vehicles, appliances, manufacturing or process tools, environments, pipelines, power generation and/or delivery systems, telecommunications equipment, processors and/or storage devices, or other devices or objects for which collected information and/or automated control is important for considerations such as safety, personal health or well-being, security, operational efficiency, information exchange, data processing and data storage.
  • The importance and magnitude of the IoT cannot be overstated. It has been estimated that the number of devices connected to the IoT may exceed 20 billion by 2020. The total annual revenues for vendors of hardware, software and IoT solutions have been estimated to exceed $470B by 2020 (see, for example, Louis Columbus, “Roundup of Internet of Things Forecasts and Market Estimates,” Forbes, Nov. 27, 2016). Efficient management and control of such massive networks is of critical importance. This invention addresses improved performance and operation of such IoT systems and methods, providing Artificial Intelligence (AI) integrated and comprehensive overall network operational monitoring systems and methods. Network sensors, controllers, telecommunication network resources and processing and data storage resources are included in the systems and methods of this invention.
  • A critical concern is management of the massive amounts of data collected from billions of sensors implemented throughout the IoT. Modern technology is being employed to amass this data in distributed computer and data storage systems including “cloud” based systems. The massive databases being assembled are often referred to as “Big Data.”
  • Big Data has been defined as voluminous and complex data sets. Traditional data-processing application software is often inadequate to deal with Big Data. Challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy and data source access. Managing and making efficient use of such Big Data is a challenge to system designers.
  • One aspect of the present invention is to provide such efficient Big Data management and use systems and methods based on artificial intelligence, expert systems, fuzzy logic and hierarchical and adaptive expert and fuzzy system implementations. More particularly, the systems and methods disclosed herein provide efficient and powerful derivation of IoT warning and control signals directed to managing IoT network faults and potentially dangerous situations. It is also important that the information provided be readily understandable and presented in a non-confusing format. In many cases, such readily understandable presentation may be critical to control of, and appropriate response to, the IoT situations being monitored.
  • While managing Big Data collected on centralized servers is important, the rapid advance in distributed processing can also alleviate centralized processing requirements. The present invention discloses artificial intelligence systems and methods implemented on a distributed network basis for performing local processing operations while also providing access to Big Data and cloud-based processing and data storage.
  • Sensors available to monitor operations in IoT networks include, for example, audio sensors, image sensors, medical sensors, location sensors, process control sensors and equipment sensors, magnetic sensors, micro-switches, proximity sensors, RFID (Radio Frequency Identification) devices, touch-sensitive devices, force sensors such as strain gauges, optical sensors, infrared sensors, ultraviolet sensors, taste sensors, and environmental sensors including, for example, temperature sensors, humidity sensors, wind sensors and gas sensors.
  • An additional important consideration for proper operation of the IoT is the reliability of the backbone telecommunications network and remote data processing and storage facilities. Failure or congestion from overloading of these resources may negatively impact proper operation of the IoT, resulting in, for example, loss of important information or lack of proper response to critical situations.
  • In addition to the above described sensor technology, important advances in various other technologies are available today to implement more powerful systems and methods for monitoring and/or control of IoT physical devices, situations and telecommunication network performance. Such technologies include, for example: advanced microprocessors; digital control; display technology; cloud computing and storage; computer technology; data storage software and programming languages; advanced radio signal transmission and reception including Wi-Fi and Bluetooth; near field communication (NFC); satellite communications; advanced telecommunications network technology including fiber optic transmission, switching and routing control systems and specialized antenna systems; drones; robotics and BOTs; audio signal processing and acoustical beamforming; image signal generation and transmission; image signal analysis; speech recognition; speech-to-text conversion; text-to-speech conversion; natural language processing; electronic location determination; artificial intelligence; expert systems; fuzzy logic; neural networks; statistical signal analysis; network graph theory; and modern control systems and theory. It is important that such monitoring systems and methods be accurate and simple to control and operate.
  • Monitoring systems and methods are used today to observe activities at remote locations. Such prior art monitoring systems and methods make use of remote monitoring units or sensors strategically placed in the area to be monitored. Remote sensors may include microphones, motion sensors, image sensors, location sensors, environmental sensors, medical sensors, equipment operational sensors, and the like. The signals from those sensors are transmitted to network monitoring stations.
  • Exemplary prior art activity monitoring systems and methods and selected technologies include the following:
  • C. W. Anderson, “Activity monitor,” U.S. Pat. No. 8,743,200, HiPass Design, Jun. 3, 2014, describing, inter alia, a system for monitoring a location using a sensor system and detecting and responding to “interesting events.” The system may detect events based on video processing of a substantially live sequence of images from a video camera or using other sensors. Embodiments with smart phones and wireless networks are described.
  • Wen and Tran, “Patient Monitoring Apparatus,” U.S. Pat. Nos. 7,420,472 and 7,502,498, Oct. 16, 2005 and Mar. 10, 2009, describing, inter alia, a system using one or more cameras to generate a 3D model of a person and to generate alarms based on dangerous situations determined from that model. Fuzzy logic is mentioned. “Once trained, the data received by the server 20 can be appropriately scaled and processed by the statistical analyzer. In addition to statistical analyzers, the server 20 can process vital signs using rule-based inference engines, fuzzy logic, as well as conventional if-then logic. Additionally, the server can process vital signs using Hidden Markov Models (HMMs), dynamic time warping, or template matching, among others.”
  • Ryley et al., “Wireless motion sensor using infrared illuminator and camera integrated with wireless telephone,” U.S. Pat. No. 7,339,608, VTech Telecommunications, Mar. 4, 2008, describing, inter alia, a monitor system with a camera using visible or infrared radiation and a cordless radio transceiver for transmission of video and image signals and alerts.
  • Karen Fitzgerald et al., “Two-Way Communication Baby Monitor with a Soothing Unit,” U.S. Pat. No. 6,759,961, Mattel, Jul. 6, 2004, describing, inter alia, a system with baby monitoring and parent units for communicating audible sounds between the baby and parent units and having audible sounds for soothing of the baby.
  • Karen Fitzgerald et al., “Baby monitor with a soothing unit,” U.S. Pat. No. 7,049,968, Mattel, May 23, 2006, describing, inter alia, a system with baby monitoring and parent units for communicating audible sounds between the baby and parent units and having audible sounds for soothing of the baby.
  • Marc R. Matsen et al., “Methods and systems for monitoring structures and systems,” U.S. Pat. No. 7,705,725, Boeing, Apr. 27, 2010, describing, inter alia, methods and systems for structural and component health monitoring of an object such as a physical aircraft, with a plurality of sensor systems positioned about an object to be monitored and a processing system communicatively coupled to at least one of said plurality of sensor systems, the processing system including expert systems, neural networks, and artificial intelligence technologies.
  • Bao Tran, “Mesh network personal emergency response appliance,” U.S. Pat. No. 7,733,224, Jun. 8, 2010, describing, inter alia, a monitoring system including one or more wireless nodes forming a wireless mesh network; a user activity sensor including a wireless mesh transceiver adapted to communicate with the one or more wireless nodes using the wireless mesh network; and a digital monitoring agent coupled to the wireless transceiver through the wireless mesh network to request assistance from a third party based on the user activity sensor. The digital monitoring agent comprises one of: a Hidden Markov Model (HMM) recognizer, a dynamic time warp (DTW) recognizer, a neural network, a fuzzy logic engine, a Bayesian network, an expert system or a rule-driven system.
  • Edward K. Y. Jung et al., “Occurrence data detection and storage for mote networks,” U.S. Pat. No. 8,275,824, The Invention Science Fund, Sep. 25, 2012, describing, inter alia, systems and processes for detecting and storing occurrence data using mote networks.
  • Artificial intelligence with pattern recognition may include data or image processing and vision using fuzzy logic, artificial neural networks, genetic algorithms, rough sets, and wavelets.
  • Matthias W. Rath et al., “Method and system for real time visualization of individual health condition on a mobile device,” U.S. Pat. No. 9,101,334, Aug. 11, 2015, describing, inter alia, a method and technology to display 3D graphical output for a user using body sensor data and personal medical data in real time with expert Q&As, “what if” scenarios and future emulation all in one artificial intelligence expert system.
  • M. Toy, “Systems and methods for managing a network,” U.S. Pat. No. 9,215,181, Comcast Cable, Dec. 15, 2015, describing, inter alia, systems and methods for managing congestion in a network. The use of expert systems, fuzzy logic and neural networks is mentioned. “The methods and systems can employ Artificial Intelligence techniques such as machine learning and iterative learning. Examples of such techniques include, but are not limited to, expert systems, case-based reasoning, Bayesian networks, behavior-based AI, neural networks, fuzzy systems, evolutionary computation (e.g. genetic algorithms), swarm intelligence (e.g. ant algorithms), and hybrid intelligent systems (e.g. expert inference rules generated through a neural network or production rules from statistical learning).”
  • C. M. Chou et al., “Network operating system resource coordination,” U.S. Pat. No. 9,807,640, Taiwan Semiconductor, Oct. 31, 2017, describing, inter alia, network device coordination schemes to allocate resources, configure transmission policies, and assign users to utilize resources. “Inference can also refer to techniques employed for composing higher-level events from a set of events and/or data. Such inference results in the construction of new events or actions from a set of observed events and/or stored event data, whether the events are correlated in close temporal proximity, and whether the events and data come from one or several event and data sources.” The use of expert systems, fuzzy logic and neural networks is mentioned. “Various classification schemes and/or systems (e.g., support vector machines, neural networks, expert systems, Bayesian belief networks, fuzzy logic, and data fusion engines) can be employed in connection with performing automatic and/or inferred action in connection with the disclosed subject matter.”
  • Yiu L. Lee, U.S. Publication No. 2014-0126356, patent application Ser. No. 13/669,039, Nov. 6, 2012, Comcast Cable, “Intelligent Network,” describing, inter alia, determining a plurality of services to be provided over a first communication path to a destination, determining a select service of the plurality of services to be provided over a failover path to the destination, detecting a failure of the first communication path, and routing the select service over the failover path in response to the failure of the first communication path. The use of expert systems, fuzzy logic and neural networks is mentioned. “The methods and systems can employ Artificial Intelligence techniques such as machine learning and iterative learning. Examples of such techniques include, but are not limited to, expert systems, case-based reasoning, Bayesian networks, behavior-based AI, neural networks, fuzzy systems, evolutionary computation (e.g. genetic algorithms), swarm intelligence (e.g. ant algorithms), and hybrid intelligent systems (e.g. expert inference rules generated through a neural network or production rules from statistical learning).”
  • Robert C. Streijl, “Enhanced network congestion application programming interface,” AT&T Intellectual Property, U.S. Publication No. 2016-0135077, patent application Ser. No. 14/534,499, Nov. 6, 2014, describing, inter alia, systems and methods that receive network load data that provides indication of a utilization level extant in a wireless cellular network. The use of expert systems, fuzzy logic and neural networks is mentioned. “Any suitable scheme (e.g., neural networks, expert systems, Bayesian belief networks, support vector machines (SVMs), Hidden Markov Models (HMMs), fuzzy logic, data fusion, etc.) can be used by detection engine 102 to provide an appropriate prediction to associate with the stability factor.”
  • J. B. Dowdall, “Methods and systems for pedestrian avoidance,” U.S. Pat. No. 9,336,436, May 10, 2016, describing, inter alia, an autonomous vehicle configured to avoid pedestrians using hierarchical cylindrical features.
  • J. Zu and P. Morton, “Methods and systems for pedestrian avoidance using Lidar,” U.S. Pat. No. 9,315,192, Apr. 19, 2016, describing an autonomous vehicle configured to avoid pedestrians using hierarchical cylindrical features and Lidar or other sensors.
  • J. Benesty et al., “Microphone Array Signal Processing,” Springer, 2008, is a text treating fundamentals of microphone arrays and beamforming technologies.
  • M. Brandstein and D. Ward, “Microphone Arrays,” Springer, 2001, is a text dealing with microphone array signal processing techniques and applications.
  • Ali O. Abid Noor, “Adaptive Noise Cancellation—Innovative Approaches,” Lambert Academic Publishing, 2012, is a text describing noise cancellation systems based on optimized subband adaptive filtering.
  • J. C. Giarratano et al., “Expert Systems,” Thomson Course Technology, Boston, Mass., 2005, is a text dealing with knowledge representation, reasoning modeling and expert system design.
  • Chen, C. H., “Fuzzy Logic and Neural Network Handbook,” McGraw-Hill, New York, 1996.
  • Cox, C., “The Fuzzy Systems Handbook,” Academic Press Inc., 1994.
  • Earl Cox, “Fuzzy Fundamentals,” IEEE Spectrum, October 1992, pages 58-61. A technical paper describing, inter alia, basic concepts of fuzzy logic and its applications.
  • G. V. S. Raju et al., “Hierarchical Fuzzy Control,” Int. J. Control, 1991, V. 54, No. 5, pages 1201-1216. A technical paper describing, inter alia, use of hierarchical fuzzy logic with rules structured in a hierarchical way to reduce the number of required rules from an exponential function of the number of system variables to a linear function of those variables.
  • G. V. S. Raju, “Adaptive Hierarchical Fuzzy Controller,” IEEE Transactions on Systems, Man and Cybernetics, V. 23, No. 4, pages 973-980, July/August 1993. A technical paper describing, inter alia, use of a supervisory rule set to adjust the parameters of a hierarchical rule-based fuzzy controller to improve performance.
  • Li-Xin Wang, “Analysis and Design of Hierarchical Fuzzy Systems,” IEEE Transactions on Fuzzy Systems, V. 7, No. 5, October 1999, pages 617-624. A technical paper describing, inter alia, derivation of a gradient descent algorithm for tuning the parameters of a hierarchical fuzzy system to match input-output pairs.
  • Di Wang, Xiao-Jun Zeng and John A. Keane, “A Survey of Hierarchical Fuzzy Systems (Invited Paper),” International Journal of Computational Cognition, V. 4, No. 1, 2006, pages 18-29. A technical paper providing a survey of fuzzy hierarchical systems.
  • S. Bolognani and M. Zigliotto, “Hardware and Software Effective Configurations for Multi-Input Fuzzy Logic Controllers,” IEEE Transactions on Fuzzy Systems, V. 6, No. 1, February 1998, pages 173-179. A technical paper describing, inter alia, approaches to simplification of multiple-input fuzzy logic controllers with either a hierarchical or parallel structure.
  • F. Cheong and R. Lai, “Designing a Hierarchical Fuzzy Controller Using Differential Evolution,” IEEE International Fuzzy Systems Conference Proceedings, Seoul, Korea, August 22-25, 1999, pages 1-277 to 1-282. A technical paper describing, inter alia, a method for automatic design of hierarchical fuzzy logic controllers.
  • Elike Hodo et al., “Threat analysis of IoT networks using artificial neural network intrusion detection system,” International Symposium on Networks, Computers and Communications (ISNCC), May 2016. A technical paper describing, inter alia, a threat analysis of the IoT and the use of an Artificial Neural Network (ANN) to combat these threats.
  • Xiaoyu Sun et al., “Low-VDD Operation of SRAM Synaptic Array for Implementing Ternary Neural Network,” IEEE Transactions on Very Large Scale Integration (VLSI) Systems, V. 25, No. 10, October 2017, pages 262-265. A technical paper describing, inter alia, a low-power design of a static random access memory (SRAM) synaptic array for implementing a low-precision ternary neural network.
  • E. De Coninck et al., “Distributed Neural Networks for Internet of Things: The Big-Little Approach,” from the book Internet of Things—IoT Infrastructures: Second International Summit, IoT 360°, Rome, Italy, Oct. 27-29, 2015, pp. 484-492. A technical paper describing, inter alia, an application area in the Internet of Things (IoT) where a massive amount of sensor data has to be classified and where the need to overcome variable latency issues imposes a major drawback for neural networks. The paper describes a proposed Big-Little architecture with deep neural networks used in the IoT.
  • F. Chung and J. Duan, “On Multistage Fuzzy Neural Network Modeling,” IEEE Transactions on Fuzzy Systems, Vol. 8, No. 2, April 2000, pages 125-142. A technical paper addressing, inter alia, input selection for multistage hierarchical AI network models and proposing efficient methods of selection.
  • M. Chi et al., “Big Data for Remote Sensing: Challenges and Opportunities,” IEEE Proceedings, Vol. 104, No. 11, November 2016, pages 2207-2219.
  • M. Frustaci et al., “Evaluating Critical Security Issues of the IoT World: Present and future challenges,” IEEE Internet of Things Journal, August 2018, pages 2483-2495.
  • However, such prior systems and methods fail to take full advantage of modern AI expert system, fuzzy logic, neural network and hierarchical system information processing technology to provide a comprehensive assessment of sensor data, telecommunication network status, and/or potentially dangerous or unacceptable situations or conditions. What is needed are improved monitoring systems and methods that analyze and integrate information from multiple network sensors including physical device sensors, situation sensors, distributed sensors in remote locations and telecommunication network problem sensors to generate integrated, understandable and non-confusing assessments for presentation to monitoring personnel and/or control systems.
  • SUMMARY OF INVENTION
  • Various embodiments for improved monitoring systems and methods are disclosed in the present invention. In one aspect of this invention, a first Internet of Things (IoT) sensor network remote sensor station comprises, without limitation, a sensor network parameter processing, warning and control system with at least one electronic, specifically programmed, specialized sensor network communication computer machine including electronic artificial intelligence expert system processing and further comprising a non-transient memory having at least one portion for storing data and at least one portion for storing particular computer executable program code; at least one processor for executing the particular program code stored in the memory; and one or more transceivers and/or electrical or optical connections for communicating with IoT (Internet of Things) sensors that generate electrical or optical parameter signals derived from sensor inputs from objects or situations being monitored.
  • Some embodiments further comprise one or more other different followed or following Internet of Things (IoT) sensor network remote sensor stations sharing common interests with said first IoT sensor network remote sensor station comprising one or more other electronic, specifically programmed, specialized sensor network communication computer machines for monitoring other such electrical or optical sensor parameter signals derived from different sensor inputs from IoT objects or situations being monitored.
  • Some embodiments further comprise one or more monitor units connected to, collecting information from and communicating with said first remote sensor station and further analyzing such collected information from remote sensor stations.
  • Furthermore, in some embodiments, the particular program code may be configured to perform artificial intelligence expert system operations upon execution, including artificial intelligence expert system processing based on expert input defining multiple expert system logic propositional instructions and multiple ranges of sensor variables, with artificial intelligence expert system processing analysis of multiple sensor signal inputs and generation of multiple control outputs with urgent and/or integrated composite degrees of concern based on said expert system propositional instruction evaluation of multiple input sensor parameters.
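By way of illustration only, the following Python sketch shows one way such propositional expert system instructions over expert-defined ranges of sensor variables might be evaluated to produce urgent and composite degrees of concern. It is a minimal sketch, not the patented implementation; all variable names, ranges and combining rules are assumptions.

```python
# A minimal sketch, assuming hypothetical sensor names, ranges and weights,
# of propositional expert system instructions that map expert-defined ranges
# of sensor variables to urgent and composite degrees of concern.

# Expert-defined ranges for each sensor variable: (low, high, concern in 0..1).
RANGES = {
    "temperature_C":  [(0, 35, 0.0), (35, 45, 0.5), (45, 200, 1.0)],
    "smoke_ppm":      [(0, 50, 0.0), (50, 150, 0.6), (150, 10000, 1.0)],
    "heart_rate_bpm": [(50, 100, 0.0), (100, 140, 0.4), (140, 250, 1.0)],
}

def degree_of_concern(variable: str, value: float) -> float:
    """Map one sensor reading to its expert-assigned degree of concern."""
    for low, high, concern in RANGES[variable]:
        if low <= value < high:
            return concern
    return 1.0  # readings outside all defined ranges are maximally concerning

def evaluate(readings: dict) -> tuple:
    """Return (urgent flag, composite degree of concern) for one sensor scan."""
    concerns = {v: degree_of_concern(v, x) for v, x in readings.items()}
    urgent = any(c >= 1.0 for c in concerns.values())   # any rule fired hard
    composite = sum(concerns.values()) / len(concerns)  # simple average
    return urgent, composite

print(evaluate({"temperature_C": 47.0, "smoke_ppm": 80.0, "heart_rate_bpm": 95.0}))
# -> (True, 0.53...): the temperature rule alone raises an urgent concern
```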
  • In some embodiments, the artificial intelligence expert system processing further comprises hierarchical Multiple-Input/Multiple-Output (MIMO) operation wherein the number of said expert system logic propositional instructions is a linear function of the number of variables, and wherein said hierarchical MIMO operations provide inputs to successive hierarchical control levels based at least in part on the importance of said inputs and on feedback indicative of output signal sensitivity to said inputs, with artificial intelligence expert system control of dispatch of electronic or optical communication warnings and/or corrective action to address MIMO urgent concerns and/or composite degrees of concern of said sensor network objects or situations based on urgent concerns and rankings of said expert system composite degrees of concern.
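The claimed linear growth in rule count can be made concrete with a small sketch (an illustration under assumed numbers, not the patent's method): with n input variables of m expert-defined ranges each, a flat rule table needs m**n propositional rules, while a chain of two-input hierarchical stages needs only (n - 1) * m**2.

```python
# A minimal sketch, assuming n sensor variables with m expert-defined ranges
# each, of why hierarchical chaining keeps the propositional rule count linear.
# This illustrates the scaling claim only, not the patented controller.

def flat_rule_count(n_vars: int, m_ranges: int) -> int:
    """A single flat rule table must cover every combination of ranges."""
    return m_ranges ** n_vars

def hierarchical_rule_count(n_vars: int, m_ranges: int) -> int:
    """A chain of two-input stages needs (n - 1) tables of m * m rules."""
    return (n_vars - 1) * m_ranges ** 2

def chain(stage, inputs):
    """Hierarchical evaluation: s1 = f(x1, x2), s2 = f(s1, x3), and so on."""
    acc = inputs[0]
    for x in inputs[1:]:
        acc = stage(acc, x)
    return acc

for n in (2, 4, 6, 8):
    print(n, flat_rule_count(n, 3), hierarchical_rule_count(n, 3))
# n=8 with 3 ranges: 6561 flat rules versus 63 hierarchical rules.

# Example hierarchical combiner: 'worst concern wins' at every stage.
print(chain(max, [0.2, 0.6, 0.1, 0.9]))  # -> 0.9
```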
  • In some embodiments, the artificial intelligence expert system processing comprises, without limitation, one or more of expert system processing and analysis of said first remote sensor station sensor input signals, acoustic signal processing, speech recognition, natural language processing, image processing, fuzzy logic, statistical analysis, mathematical analysis and/or neural network analysis.
  • In some embodiments, the first sensor network remote sensor station sensor signals include, without limitation, a combination of one or more of audio, image, medical, process, material, manufacturing equipment, environmental, transportation, location, pipeline, power system, radiation, vehicle, computer, processor, data storage, cloud processing, cloud data storage, drone, threat, mote, BOT, robot, telecommunication network, cyberattack, malicious hacking or other followed remote sensor station monitoring signals.
  • In some embodiments, the MIMO artificial intelligence expert system controller of the first sensor network remote sensor station is a fuzzy logic controller.
  • In some embodiments, the hierarchical MIMO artificial intelligence expert system controller of the first sensor network remote sensor station is a fuzzy logic controller.
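As a hedged illustration of what one small fuzzy stage of such a controller could look like, the following Mamdani-style sketch fuzzifies two sensor inputs, fires four rules and defuzzifies to a single degree of concern. Membership shapes, labels and rule outputs are assumptions, not taken from the patent.

```python
# Hypothetical two-input fuzzy stage: triangular memberships, min-AND rules,
# weighted-average defuzzification. All shapes and weights are illustrative.

def tri(x, a, b, c):
    """Triangular membership: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_concern(temp_c: float, smoke_ppm: float) -> float:
    # Fuzzify the two crisp sensor inputs.
    temp_hot   = tri(temp_c, 30, 45, 60)
    temp_ok    = tri(temp_c, -10, 20, 40)
    smoke_high = tri(smoke_ppm, 40, 150, 400)
    smoke_low  = tri(smoke_ppm, -1, 0, 80)

    # Rule strengths (AND = min), each mapped to an output concern level 0..1.
    rules = [
        (min(temp_hot, smoke_high), 1.0),  # hot AND smoky -> severe concern
        (min(temp_hot, smoke_low),  0.5),  # hot alone -> moderate concern
        (min(temp_ok,  smoke_high), 0.7),  # smoky alone -> elevated concern
        (min(temp_ok,  smoke_low),  0.0),  # nominal -> no concern
    ]
    # Weighted-average (centroid-style) defuzzification.
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(round(fuzzy_concern(48.0, 120.0), 3))  # hot and smoky -> 1.0
```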
  • In some embodiments, the artificial intelligence expert system propositional expert system instructions are based on priorities or importance of selected expert-defined object or situation monitored parameters.
  • In some embodiments, at least one of said propositional expert system instruction priorities is based on selected combinations of object or situation parameters.
  • In some embodiments, the artificial intelligence expert system of the first sensor network remote sensor station further comprises access of said remote sensor station to internet cloud storage and processing units.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station sensor inputs may vary with time.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station parameter analysis further comprises time series analysis of time variable sensor input data.
  • In some embodiments, the first sensor network remote sensor station time series analysis includes regression analysis of time-varying sensor signal parameter values.
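A minimal sketch of such regression analysis follows, assuming ordinary least squares over timestamped sensor samples with an illustrative limit and projection horizon; none of the numbers come from the patent.

```python
# A sketch, under assumed data, of trend regression over time-varying sensor
# samples: fit a least-squares line and flag a concern when the extrapolated
# value would cross an expert-defined limit within a projection horizon.

def fit_line(ts, ys):
    """Ordinary least squares fit y = a + b*t."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    return my - b * mt, b

def trend_concern(ts, ys, limit, horizon):
    """Return (concern flag, projected value at the end of the horizon)."""
    a, b = fit_line(ts, ys)
    projected = a + b * (ts[-1] + horizon)
    return projected >= limit, projected

# Temperature drifting upward: one sample per minute over ten minutes.
times = list(range(10))
temps = [30.0, 30.4, 31.1, 31.5, 32.2, 32.8, 33.1, 33.9, 34.4, 35.0]
print(trend_concern(times, temps, limit=40.0, horizon=10))
# -> (True, 40.56): the fitted trend crosses 40 within ten more samples
```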
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station electronic, specifically programmed, specialized sensor network communication computer machine communicates with other network nodes to monitor connected telecommunication network elements, subnetworks or networks for failures or performance issues impacting said first remote sensor station.
  • In some embodiments, one or more of said transceivers of the artificial intelligence expert system first sensor network remote sensor station may communicate with a terrestrial or airborne vehicle. See, e.g., FIGS. 2A and 2B and ¶¶ 124-126 and 134.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station is implemented in a terrestrial or airborne vehicle. See, e.g., FIGS. 2A and 2B and ¶¶ 124-126, 134.
  • In some embodiments, one or more of said transceivers of the artificial intelligence expert system first sensor network remote sensor station may communicate with a drone. See, e.g., FIGS. 2A and 2B and ¶¶ 124 and 125.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station is implemented in a drone. See, e.g., FIGS. 2A and 2B and ¶ 125.
  • In some embodiments, one or more of said transceivers of the first sensor network remote sensor station may communicate with a robot. See, e.g., FIG. 1 and ¶¶ 106 and 109.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station is implemented in a robot. See, e.g., FIG. 1 and ¶ 109.
  • In some embodiments, one or more transceivers of the artificial intelligence expert system first sensor network remote sensor station may communicate with a BOT. See, e.g., FIG. 1 and ¶¶ 106 and 109.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station is implemented in a BOT. See, e.g., FIG. 1 and ¶ 109.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station transceivers may communicate with a mote. See, e.g., FIG. 1 and ¶¶ 106 and 108.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station is implemented in a mote. See, e.g., FIG. 1 and ¶ 108.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station monitored objects or situations comprise one or more persons. See, e.g., FIGS. 1, 2A and 2B and ¶¶ 106, 128, 200 and 228.
  • In some embodiments of the artificial intelligence expert system first sensor network remote sensor station, a monitored person is an infant, child, invalid, medical patient, elderly or special needs person. See, e.g., FIGS. 1, 2A and 2B and ¶ 106.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station transmits background audio signals to be broadcast in the area of said person. See, e.g., FIGS. 3, 5, 6, 8, 9 and 10A and ¶¶ 142, 164-166, 170-175, 184, 186 and 200.
  • In some embodiments, the transmitted background audio signals are removed from or attenuated in signals transmitted to connected monitor units to minimize annoying or unnecessary signals received and/or heard at said monitoring unit while still transmitting audio signals from the monitored object or person. See, e.g., FIGS. 4-6 and ¶¶ 164-166 and 170-177.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station transmits periodic keep-alive signals to a connected monitor unit to assure users that the remote sensor station is operating correctly. See, e.g., FIGS. 3-6, 8 and 10A and ¶¶ 142, 164-168, 173-177, 184, 186 and 200.
  • In some embodiments, the first sensor network remote sensor station sensor signals include a combination of at least one telecommunication network sensor input combined with other sensor signal inputs. See, e.g., FIG. 2B and ¶¶ 130-132.
  • In some embodiments, the at least one telecommunication network sensor input is a telecommunication link sensor. See, e.g., FIGS. 2B, 9, 10B, 11, 18 and 21 and ¶¶ 132, 195, 196, 199, 216-220, 229, 256 and 263.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station comprises at least one telecommunication network router sensor. See, e.g., FIG. 2B and ¶ 219.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station comprises at least one telecommunication switching system sensor. See, e.g., FIG. 10B and ¶ 218.
  • In some embodiments, the artificial intelligence expert system first sensor network remote sensor station comprises at least one telecommunication modem sensor. See, e.g., FIG. 10B and ¶¶ 217 and 218.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • While the invention is amenable to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. The inventions of this disclosure are better understood in conjunction with these drawings and detailed description of the preferred embodiments. The various hardware and software elements used to carry out the inventions are illustrated in these drawings in the form of diagrams, flowcharts and descriptive tables setting forth aspects of the operations of the invention.
  • FIG. 1 illustrates, without limitation, an exemplary sensor monitor network system comprising one or more remote sensor stations, one or more monitoring units and one or more network monitoring centers of this invention.
  • FIG. 2A illustrates, without limitation, an exemplary sensor monitor network of this invention.
  • FIG. 2B illustrates, without limitation, exemplary sensor monitor network faults or concerns monitored in this invention.
  • FIG. 3 illustrates, without limitation, an exemplary remote sensor station of this invention.
  • FIG. 4 illustrates, without limitation, an exemplary sensor network monitoring unit of this invention.
  • FIG. 5 illustrates, without limitation, exemplary background audio and keep-alive audio signals of the type used in this invention.
  • FIG. 6 illustrates, without limitation, exemplary composite audio signals of the type used in this invention.
  • FIG. 7 illustrates, without limitation, exemplary acoustic spatial beamforming for noise cancellation of the type used in this invention.
  • FIG. 8 illustrates, without limitation, exemplary sensor network monitor controls of the type used in this invention.
  • FIG. 9 illustrates, without limitation, an exemplary sensor monitor system flowchart.
  • FIG. 10A illustrates, without limitation, additional exemplary sensor signal analysis of FIG. 9.
  • FIG. 10B illustrates, without limitation, still further additional exemplary sensor signal analysis of FIG. 9.
  • FIG. 11 illustrates, without limitation, an exemplary expert system sensor signal analysis and ranking of this invention.
  • FIG. 12A illustrates, without limitation, exemplary symmetric audio-video (AV) expert system rules of this invention.
  • FIG. 12B illustrates, without limitation, exemplary asymmetric audio-video (AV) expert system rules of this invention.
  • FIG. 13 illustrates, without limitation, exemplary fuzzy logic relationships of this invention.
  • FIG. 14 illustrates, without limitation, exemplary MIMO hierarchical expert system operation.
  • FIG. 15 illustrates, without limitation, exemplary hierarchical AV-Medical (AVM) sensor expert system rules with medical IoT sensor signal analysis of this invention.
  • FIG. 16 illustrates, without limitation, exemplary hierarchical AVM-Process (AVMP) sensor expert rules with process IoT sensor signal analysis of this invention.
  • FIG. 17 illustrates, without limitation, exemplary hierarchical AVMP-Following (AVMPF) sensor expert rules with followed IoT remote sensor station signal sensor analysis of this invention.
  • FIG. 18 illustrates, without limitation, exemplary hierarchical AVMPF-Telecommunications (AVMPFT) sensor expert rules with IoT telecommunication signal sensor analysis of this invention.
  • FIG. 19 illustrates, without limitation, an exemplary MIMO hierarchical composite warning and control index calculation of this invention.
  • FIG. 20 illustrates, without limitation, an exemplary neural network of this invention.
  • FIG. 21 illustrates, without limitation, an exemplary sensor network analysis diagram.
  • FIG. 22 illustrates, without limitation, an exemplary sensor network fuzzy logic inference engine of this invention.
  • DETAILED DESCRIPTION
  • The above figures are better understood in connection with the following detailed description of the preferred embodiments. Although the embodiments are described in considerable detail below, numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following be interpreted to embrace all such variations and modifications.
  • FIG. 1 depicts the IoT monitor system 100 of this invention. The IoT monitor system 100 comprises the remote sensor station 107 used to monitor the subject or object 101 and/or its surroundings or defined areas, the monitor unit 102 collecting and processing/analyzing information from one or more remote monitor stations 107, and the network monitor center 118 communicating with one or more monitor units 102 and/or remote monitor stations 107 with further information processing and analysis of data and information gathered by IoT sensors distributed throughout the monitor system 100.
  • Example objects or subjects 101 being monitored may include, without limitation, a person, group of people, physical object or objects, or an area under surveillance. For example, a person being monitored may be an infant, invalid, medical patient or special needs person, senior citizen, child, prisoner, store customer, criminal, intruder, police officer, pedestrian, gathering or crowd, or other person or persons requiring monitoring or surveillance. Example objects being monitored may include, without limitation, manufacturing equipment, transportation equipment, robotic or other material or workpiece processing equipment, BOTs, products, product inventory, terrestrial or airborne vehicles, terrestrial vehicle traffic, pipelines, utility generation and/or delivery equipment and systems, valuable assets, and computer and/or data processing equipment. Example telecommunication equipment may include, without limitation, transmission, switching, routing, data storage, and telecommunication system and data processing and control equipment and systems. Example object-subject sensors 114 may include, without limitation, audio sensors, video sensors, movement sensors, environmental sensors, medical sensors, traffic sensors, pedestrian sensors, BOT sensors, robot sensors, mote sensors, process sensors, location sensors, security sensors or other sensors used to monitor objects or subjects for activities or events of interest. IoT network security concerns include network cyberattacks and malicious hacking of network connected devices and sensors, telecommunication facilities, data processing and storage facilities, and information collected, stored, transmitted and processed in the IoT network. The ubiquitous deployment of billions of network sensors and IoT connection of such sensors in a world-wide web presents new threat and vulnerability realities and concerns. Security violations may affect local and wider network operations, requiring evaluation and response beyond isolated limited concerns at a particular attack location. Cyberattacks may occur at IoT network application layers, information transmission layers, and data processing layers including internet IoT cloud processing and data storage facilities.
  • The remote sensor station 107 receives, analyzes, and processes data and information from and communicates with the object/subject 101 sensors 114. In addition, without limitation, the remote sensor station 107 may include additional sensors 113. Example sensors 113 may include, without limitation, audio sensors such as microphone array 109, image sensors 110, process sensors and/or environmental sensors. Sensors 113 may be used for further monitoring of the object or subject 101, or areas or activities around that object or subject. The remote sensor station 107 analyzes object or subject 101 sensor 114 inputs and the sensor 113 inputs for detection of various situations that may present danger or operational issues to the object or subject 101. Audio signals from the object or situation being monitored may be broadcast from remote sensor station 107 speaker 111.
  • In some embodiments, the sensors may be configured as a mote or small electronic assembly or a robot capable of performing processing, gathering sensory information and communicating with other connected nodes in the network. The network monitor unit 102 receives sensor signals from the remote sensor station 107 monitoring the object or subject 101 and surrounding area activities.
  • In some embodiments the remote sensor station may be implemented in a robot, a BOT or a mote. In some embodiments the remote sensors may be implemented in a robot, a BOT or a mote.
  • The object or subject remote sensor station 107 may include a microphone array 109 comprising one or more microphones used to listen to sounds from the object or subject 101 and the surroundings. In some embodiments the microphone array 109 may be replaced by a single omnidirectional or directional microphone for listening to the sounds. Advantageously, in some embodiments, microphone array 109 permits the use of microphone beamforming for enhancing sounds emanating from the subject 101 while reducing sounds received from directions other than those produced by the object or subject. As explained more completely below, microphone beamforming makes use of beamforming software to create acoustical beams with a main lobe directed at the object or subject that is most sensitive to sounds originating from the object or subject while attenuating or perhaps essentially eliminating sounds arriving at the microphone array from different directions. For example, depending on the configuration, the beamforming microphone array 109 may be designed to be less sensitive to the broadcast audible signals from speaker 111 and more sensitive to sounds emanating from the subject 101. In some embodiments of this invention, it may be desirable to include operator controls that may permit the user of the subject remote sensor station 107 to enable or disable microphone array beamforming depending on user preference. For example, some users may want to hear more clearly all sounds generated by the object or subject 101 or from other sources in the object or subject's surroundings at selected times or under selected circumstances.
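For illustration, a delay-and-sum beamformer of the general kind described above might be sketched as follows; the array geometry, sampling rate and test signal are assumptions rather than the patent's design.

```python
# Illustrative delay-and-sum beamformer for a uniform linear microphone array.
# It sketches the beamforming idea described above; the geometry, sampling
# rate and test signal are assumptions, not the patent's design.
import numpy as np

def delay_and_sum(signals: np.ndarray, spacing_m: float, steer_deg: float,
                  fs: int, c: float = 343.0) -> np.ndarray:
    """signals: (n_mics, n_samples). Steer the main lobe toward steer_deg."""
    n_mics, _ = signals.shape
    out = np.zeros(signals.shape[1])
    for m in range(n_mics):
        # Plane-wave arrival delay at mic m for a source at steer_deg, in seconds.
        tau = m * spacing_m * np.sin(np.radians(steer_deg)) / c
        out += np.roll(signals[m], -int(round(tau * fs)))  # align, then sum
    return out / n_mics

# Usage: 4 mics 5 cm apart at 16 kHz, source 30 degrees off the array axis.
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
mics = np.stack([np.roll(tone, int(round(m * 0.05 * np.sin(np.radians(30)) / 343 * fs)))
                 for m in range(4)])
enhanced = delay_and_sum(mics, 0.05, 30.0, fs)  # coherent sum boosts the source
```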
  • The object or subject remote sensor station 107 also may include video and/or infrared (IR) cameras and an IR illuminator 110 as shown in FIG. 1. In addition, the remote sensor station 107 may include still image cameras (not shown). The remote sensor station 107 may also include image analysis software for analyzing images from the cameras 110 to determine particular activities of the object or subject 101 and/or activities in the areas being monitored. Such images may reveal activity of the object or subject 101 such as particular movements even though the object or subject 101 may not be making audible sounds. In addition, dangerous situations absent audible sounds from the object or subject 101 may be detected with the cameras or other image collection devices 110 including, for example, situations that may indicate a risk of the object or subject 101 being injured by dangerous activities.
  • While standard video or still camera technology may be useful in monitoring object or subject 101 activities in well-lighted environments, such monitoring with subdued lighting or even in the dark may not be possible with standard cameras. The use of an infrared IR camera and infrared IR illuminator as depicted in FIG. 1 permits visual monitoring even in such difficult lighting situations. The IR illuminator may bathe the area with infrared radiation, or structured IR illumination may be used to simplify image analysis for determination of activities of the object or subject 101. Such structured lighting may include patterns comprising multiple dots of light in various arrangements, circular patterns of infrared lighting, straight lines of infrared lighting or other patterns most suitable for the chosen environment and purpose. IR detectors 110 may also be used as motion detectors including such use in security systems.
  • The remote monitor station 107 may also include additional controls useful for selecting operational modes of the remote sensor station 107. In some embodiments, sensors 114 and/or 113 may be used for sensing of other conditions such as unacceptable temperature ranges, air pollutants, dangerous gases, fires or smoke. The additional controls may be used to turn the remote sensor station 107 on or off, adjust the volume from speaker 111, select or deselect microphone array 109 beamforming, and select appropriate video, still image, or infrared visual monitoring of the object or subject 101 and the surrounding area.
  • In some embodiments the remote sensor station 107 may also include a display (not shown) for displaying captured video, still or infrared images from the cameras 110. In some embodiments the display may also be used for touchscreen control of the subject remote sensor station 107 as described further below.
  • The remote sensor station 107 may include a radio transceiver such as a Bluetooth, Wi-Fi or cellular telephone transceiver or other RF telecommunication capability known to those of skill in the art for communicating with other elements of the network monitor system 100 as depicted in FIG. 1. The antenna 108 may be used for such RF communication of audible, visual and/or control and status information.
  • The network monitoring system 100 of FIG. 1 may also include monitor unit 102 for remotely monitoring activities of the subject 101. The monitoring unit 102 includes an antenna 106 for receiving RF signals from the remote sensor station 107. The network monitor unit 102 also includes an RF transceiver compatible with the transceiver of the remote sensor station for receiving RF signals from that unit. For example, the RF transceiver for the monitor unit 102 may include Bluetooth, Wi-Fi, cellular telephone transceiver or other RF, fiber optic or wired connections for communicating with other elements of the monitor system 100 as depicted in FIG. 1.
  • The remote monitor unit 102 may further include a display 105 for displaying video images received from the remote sensor station 107 of FIG. 1. The display 105 may also be a touchscreen display for display of control icons to simplify overall operation as described further below.
  • Remote network monitor unit 102 may also include speaker 103 used to broadcast audio signals received from the remote sensor station 107 and used for listening to audible sounds from the object or subject 101 and the surrounding area. As explained further below, the remote network monitor unit 102 may also include video and/or audio processing software to further enhance received signals to improve observations of object or subject 101 activities.
  • The remote monitor unit 102 may also include controls 104 for controlling monitoring of object or subject activities. Such controls may turn the network monitor unit 102 on and off, adjust speaker 103 volumes, adjust display 105 parameters, and/or select operational modes including further control for reduction of background noise in the received audio signal.
  • As depicted in FIG. 1, some embodiments may include RF connections via antenna 106 or landline connections via wire, cable or fiber optic links to network monitor center 118. The network monitor center 118 may receive periodic status update reports from the network monitor unit 102 and/or the object or remote sensor station 107 for recording, analysis, reporting and/or preparation of history files for access by users. The monitor center 118 may also receive emergency alert signals from the monitor unit 102 and/or the remote sensor station 107 indicating emergency situations requiring immediate attention. In response to such emergency alert signals, the subject monitor center 118 may dispatch emergency personnel, contact responsible personnel, sound alarms or otherwise provide or initiate appropriate emergency assistance. The monitor center 118 may contain communications and computer equipment for use by specialized personnel trained to respond appropriately to reported emergency situations. Communications may be by wireline, cable, fiber-optics and/or via radio signals with antenna 119.
  • As also indicated in FIG. 1, the monitoring system 100 may communicate with the cellular telephone 115 for the purpose of providing information concerning the status of object or subject 101 and alarm signals indicating dangerous situations. In addition, any and all audio/video signals may be transmitted using cellular telephone frequencies, Wi-Fi, Bluetooth or other telecommunication signals to or from the monitor unit 102, the remote sensor station 107 or the monitor station 118 of FIG. 1, or to the cellular telephone 115. The cellular telephone 115 also includes one or more antennae 117 for such communications. The cellular telephone 115 also includes touchscreen display 116 and other controls as may be implemented depending upon the cellular telephone design. It is to be understood that the cellular telephone 115 may also be a tablet computer or laptop computer or other similar portable telecommunication device. In some embodiments the remote sensor station 107 may in fact be implemented using cellular telephone or portable computer or tablet devices.
  • In addition, any of the monitor unit 102, remote sensor station 107 or monitor center 118 may transmit email or text messages to designated user World Wide Web addresses such as personal computers or other designated recipients of information concerning the object or subject 101 being monitored. Such communications may make use of SMS (short message service) or other messaging telecommunications with unsolicited messages concerning the object or subject 101 being “pushed” to the address destination without first being requested from that destination. Such messages enable rapid alerting of designated destinations concerning the status of the object or subject 101. Other social media communication systems such as Twitter and Facebook may also be used to communicate status information concerning the subject 101 on a real-time or near real-time basis.
  • In some embodiments, the monitor system 100 may also communicate with cloud computing 120 as shown in FIG. 1. Cloud computing 120 may comprise computing, data processing and data storage equipment useful to supplement the computing, data processing, and data storage equipment of the monitor system 100 of FIG. 1. Cloud computing 120 may comprise extensive database information descriptive of local, wide area, or even global IoT network performance parameters useful in evaluating situations impacting the object or subject 101.
  • FIG. 2A illustrates an exemplary sensor monitor network 200 of the type used in some embodiments of the present invention. The sensor monitor network 200 may comprise exemplary star subnetworks 201 and 205 and/or exemplary mesh subnetworks 206 interconnected with each other and with the monitor center 207. The monitor center 207 of FIG. 2A corresponds to the monitor center 118 of FIG. 1. The remote sensor stations 203 of FIG. 2A correspond to the remote sensor stations 107 of FIG. 1. The monitor unit 202 of FIG. 2A corresponds to the monitor unit 102 of FIG. 1. As shown in FIG. 2A, multiple remote sensor stations 203 may be connected to a monitor unit 202 in star subnetworks such as 201 and 205 or exemplary mesh subnetworks such as 206. Communication links 204 interconnect remote sensor stations 203 to the respective monitor units 202. Communication links 204 may be wireline, fiber optic or RF radio links as appropriate for a given implementation. Communication links 208 in turn interconnect the monitor units 202 with the monitor center 207. The individual subnetworks 201, 205 and 206 communicate with the monitor center 207 and with each other through the monitor center 207 and/or via direct connections such as shown in link 209 or via cloud computing 212. The communication connections between the exemplary sensor network remote sensor stations, monitor units and the monitor center may also be established via drones such as drone 210 or other airborne vehicles (not shown) using radio-link RF communications.
  • Airborne drone 210 may be used to relay signals between remote sensor stations 203 and monitor units 202 of FIG. 2A. The drone 210 may also relay signals to and from the monitor center 207 and/or the monitor center 118 of FIG. 1. In addition, the drone 210 may itself contain sensors such as cameras, temperature, wind, rain, snow, humidity or other sensors suitable for drone implementation. Drone 210 may also contain signal analysis software for proper operation with RF link protocols, noise reduction and/or data compression as appropriate for particular applications. In some embodiments, drones such as drone 210 may have the functionality of a remote monitor station such as remote monitor station 203.
Terrestrial vehicle 211 may also communicate with the remote sensor station 203 via radio links as indicated in FIG. 2A. Vehicle 211 may also communicate directly with the monitor unit 202 or network monitor center 207. In some embodiments vehicle 211 may be a driverless vehicle. Driverless vehicles communicate with other vehicles in their immediate vicinity to avoid collisions or other issues resulting from traffic congestion. As indicated in FIG. 2A, a driverless vehicle 211 may be alerted to the presence of pedestrian 213 for the purpose of avoiding hitting that person with vehicle 211. Detection of the presence of pedestrian 213 may be via radar, sonar, lidar or via a radio link. The pedestrian may carry location determination capability such as GPS to determine physical location information. That information may be transmitted in a format that permits vehicles such as vehicle 211 to determine if a collision with the pedestrian may occur, permitting collision avoidance maneuvers or actions to be taken. Warning signals may also be transmitted from vehicle 211 to pedestrian 213, bringing potentially dangerous situations to the pedestrian's attention. In some embodiments, the vehicle 211 may transmit information on its location, speed and direction of travel. Upon receiving that information, the pedestrian 213 may be alerted of potential collision danger via a wireless warning device carried by that pedestrian. In some embodiments, all of the above described pedestrian communication, location determination, processing, signal generation and warning may be implemented in a cellular telephone or other portable wireless electronic device carried by the pedestrian.
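By way of illustration only, the following minimal sketch shows one way such a collision check might be computed from exchanged position and velocity estimates. The coordinate format, function names and alert thresholds are hypothetical assumptions, not values prescribed by this disclosure.

    import math

    def time_to_closest_approach(p_v, v_v, p_p, v_p):
        """Relative-motion closest approach between vehicle and pedestrian.
        p_* are (x, y) positions in meters, v_* are (vx, vy) velocities in m/s."""
        rx, ry = p_p[0] - p_v[0], p_p[1] - p_v[1]      # relative position
        vx, vy = v_p[0] - v_v[0], v_p[1] - v_v[1]      # relative velocity
        vv = vx * vx + vy * vy
        if vv == 0.0:
            return 0.0, math.hypot(rx, ry)             # no relative motion
        t = max(0.0, -(rx * vx + ry * vy) / vv)        # time of minimum separation
        d = math.hypot(rx + vx * t, ry + vy * t)       # separation at that time
        return t, d

    t, d = time_to_closest_approach((0, 0), (15, 0), (60, 2), (0, -1.2))
    if d < 3.0 and t < 5.0:                            # hypothetical alert thresholds
        print(f"collision warning: {d:.1f} m separation in {t:.1f} s")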
FIG. 2B depicts the exemplary sensor network of FIG. 2A together with exemplary performance or sensor-detected issues affecting different elements of that network. Exemplary network element failures or sensor-detected issues are illustrated inside dotted circles in FIG. 2B.
For example, issue 214 concerns one of the remote sensor stations 203 of FIG. 2B. The issue 214 may concern complete or partial failure of a remote sensor station 203. The issue 214 may also reflect the outputs of sensor units being monitored by the remote sensor station 203. For example, such sensor outputs may be indicative of alarm signals received from audio or video sensors. Other possibilities include alarm signals received from medical sensors, environmental sensors or other sensors used to report the status of objects or situations being monitored by the remote sensor station 203.
In a similar manner, issue 215 may concern complete or partial failure of the monitor unit 202. Issue 215 may also reflect a reported failure or concerns detected by the multiple remote sensor stations 203 connected to monitor unit 202.
Issue 216 concerns the failure or other issues encountered on a telecommunication link 208 connecting monitor units 202 to the monitor center 207. Such issues may include transmission link equipment problems, excessive error rates of the transmission link, link data congestion concerns caused by overloading of the link 208 or propagation issues such as multipath or RF interference.
Issue 217 may concern complete or partial failure of monitor center 207. Issue 217 may indicate concerns about the capacity of monitor center 207 to provide required processing and network control. Issue 217 may also concern wider network failures or issues detected by the network monitor center 207.
Issue 218 concerns the failure or other issues encountered on telecommunication links interconnecting monitor units in separate subnetworks as illustrated in FIG. 2B. Here again, such issues may include transmission link equipment problems, excessive error rates of the transmission link, link data congestion concerns caused by overloading of the link or propagation issues such as multipath or RF interference.
Issue 219 reflects concerns about an entire subnetwork such as the exemplary subnetwork 206. The issue 219 concerns may range from total loss of that portion of the sensor network to less drastic issues involving performance of the subnetwork. An example may be the loss of several but not necessarily all elements of mesh subnetwork 206. Another example may be reported sensor issues indicating subnetwork-wide problems.
Issue 220 relates to reported operational concerns for the vehicle 211. Such issues may include, for example, driver issues, equipment issues, roadway issues, traffic issues, emergency accident or police issues, etc. In some embodiments, a complete remote sensor station may be implemented in vehicle 211. Issue 222 relates to reported operational issues for the communication link between the vehicle 211 and pedestrian 213.
Issue 221 concerns the operational status of a drone. Such concerns may reflect loss of or intermittent connections with ground-based equipment, overload, low power concerns, equipment malfunctions, etc. As noted above, in some embodiments a complete remote sensor station may be implemented in a drone.
This invention presents overall integrated artificial intelligence (AI) systems and methods to analyze, prioritize and respond to all of the above network concerns with comprehensive network-wide solutions based on relative issue criticality with AI decision making. Monitored sensor signals may include audio, image, medical, process, material, manufacturing equipment, environmental, transportation, location, pipeline, power system, radiation, vehicle, computer, processor, data storage, cloud processing, cloud data storage, drone, threat, mote, BOT, robot, telecommunication network or other followed sensor station monitoring signals.
FIG. 3 provides a more detailed exemplary configuration diagram for the remote sensor station 107 of FIG. 1, labeled remote sensor station 300 in FIG. 3. The remote sensor station 300 is controlled by processor 301 which may be a microprocessor, computer or digital controller of the type well known to those of skill in the art. The processor 301 is connected to memory 317 for storage of programs and data for the remote sensor station 300. The processor 301 may include, without limitation, conventional processor or digital controller capabilities as well as more specialized processing capabilities implemented for example with digital signal processing (DSP).
The memory 317 may comprise, without limitation, solid-state random-access memory (RAM), solid-state read-only memory (ROM), and/or optical memory or mechanical, optical or solid-state disk drive memory. The processor 301 also includes a power supply 321 which may comprise, without limitation, batteries, AC/DC power converters, solar cells or other green energy power sources. The processor 301 may also include input/output devices 320 comprising without limitation USB ports, the Common Flash memory Interface (CFI) approved by JEDEC and other standard memory unit interfaces. The processor 301 also includes a connection to a speaker 315 for broadcasting audio signals from the remote sensor station 300.
Database access capability 318 of FIG. 3 may be implemented separately or as part of the system software operating on processor 301 for use in accessing system parameters, control information, status information, history, audio recordings, video recordings, image recordings, operational information, contact information, internet addresses, telephone numbers, received messages, alarm signals and/or other information used in the operation of the remote sensor station 300.
As further shown in FIG. 3, the processor 301 may include controls 316 integrated with processor 301 for on/off control, controlling microphone sensitivity, speaker volumes, camera operations and/or other operational features of the sensor station 300.
As also shown in FIG. 3, the remote sensor station 300 may also include a directional microphone array 302 used to provide audio inputs from a subject being monitored as well as other audio signals from the area surrounding the subject. The directional microphone array 302 may be designed to operate as a beamforming array with directional sound pickup forming a main lobe in the direction of the subject. The main lobe is most sensitive to sounds emanating in the reduced area covered by that lobe and is less sensitive to sounds emanating from other directions. In this way the directional microphone array 302 provides a noise or interference reduction capability wherein the primary audio signals picked up by the microphone array 302 are from the direction of the subject being monitored. The directional microphone array 302 may operate with beamforming software executing on processor 301 as discussed more completely below. In addition to the directional microphone array 302, but not illustrated in FIG. 3, single microphones may also be used in the present invention. Single microphones with pressure-gradient, directional acoustic sensitivity and/or omnidirectional microphones may be used as part of the remote sensor station 300. Microphones based on variable capacitor technologies, electromagnetic induction technologies, fiber-optic technologies, variable resistance technologies, piezoelectric technologies and including MEMS (Micro-Electrical-Mechanical System) technologies may be used in the present invention.
The acoustical beamforming software may further comprise acoustical beamformer circuitry for receiving audio signals from each of the multiple microphones comprising the directional microphone array; analog-to-digital conversion circuitry for converting the received audio signals from each of the multiple microphones to digital signals; individual transversal filters for filtering each individual digital microphone signal, the individual transversal filters further comprising adaptive filter weights for each of the transversal filter tapped delay line signals; and additional circuitry for summing the outputs of the adaptively weighted tapped delay line signals, producing an audio output signal most sensitive in the direction of the main lobe of the sensitivity pattern of the acoustical beamforming circuitry. In this way, the acoustical beamformer signal sensitivity pattern may be adaptively varied with respect to the direction of the main beam pattern lobe and the null locations within that pattern and/or sidelobe structure.
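As a minimal illustration of the delay-and-sum principle underlying such a beamformer, the sketch below uses fixed uniform weights in place of the adaptive transversal-filter weights described above; the array geometry, sample rate and names are hypothetical.

    import numpy as np

    def delay_and_sum(mic_signals, mic_positions, look_dir, fs, c=343.0):
        """Align and sum microphone channels so the array is most sensitive
        to sounds arriving from look_dir (a 2-D unit vector toward the source).
        mic_signals: array (num_mics, num_samples); mic_positions: (num_mics, 2) in m."""
        u = np.asarray(look_dir, float)
        u /= np.linalg.norm(u)
        early = mic_positions @ u / c                  # seconds each mic hears the wave early
        delays = np.round((early - early.min()) * fs).astype(int)
        n = mic_signals.shape[1]
        out = np.zeros(n)
        for sig, k in zip(mic_signals, delays):
            out[k:] += sig[: n - k]                    # delay each channel into alignment
        return out / len(mic_signals)

An adaptive implementation would replace the single integer delay per channel with a tapped delay line whose weights are updated, for example by a least-mean-squares rule, to steer nulls toward interferers.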
As further shown in FIG. 3, the remote sensor monitoring unit 300 may also include, without limitation, a background acoustic signal generator 303. For example, a white noise generator may be used to generate white noise to be broadcast via speaker 315 depending on the subject requirements. The white noise generator is only meant to be representative of acoustic signal sources that may be used for this purpose. Other possibilities include pink noise, announcements, audio messages, soothing music generators or the like. In some embodiments it may be desirable to avoid transmission of such background acoustic signals picked up by microphones 302 to monitor unit 102 of FIG. 1. At the same time, it may be desirable to ensure that the communication link such as link 204 of FIG. 2A is operational. As further explained below, this can be achieved by including known keep-alive signals in the transmission to monitor unit 102 and suppressing the unwanted acoustic signal for transmission purposes to monitor unit 102.
As also shown in FIG. 3, the remote sensor monitoring unit 300 may include an optical camera 304 which may be a video camera and/or a camera for capturing still images under control of the processor 301. The optical camera 304 may be used to monitor activities of the subject as directed by processor 301 under program control.
In addition to the optical camera 304, the remote sensor monitoring unit 300 may also include an infrared (IR) illuminator 305 operating under control of the processor 301. The IR illuminator 305 may be used to bathe the subject and surrounding area with infrared illumination including the possible use of structured infrared light. Such structured light may include, for example, multiple individual dots of infrared light, vertical and/or horizontal raster scans of infrared light or circular scans of infrared light. Infrared light is not visible to the human eye and has the advantage that it may be used in light or dark environments to create images or determine motions, movements or activities of the subject being monitored.
In addition to the IR illuminator 305, the remote sensor monitoring unit 300 may include an infrared (IR) camera 306 to detect reflected IR illumination from the subject and other objects in the illuminated field. The IR camera 306 may be used to generate image information for analysis by the artificial intelligence image analysis capability 314 discussed below. The goal of such image analysis may include the detection of dangerous situations or circumstances encountered by the object or subject being monitored.
As also depicted in FIG. 3, the remote sensor monitoring unit 300 may include environmental sensors 307 to further monitor and detect dangerous situations that may present themselves in the area of the object or subject being monitored. Such sensors may be used to detect, for example, unacceptable temperature ranges, humidity, air pollutants, dangerous gases, fires or smoke.
The various sensors including but not limited to microphones, optical cameras, infrared cameras, infrared sensors and environmental sensors of FIG. 3 may all produce signals for transmission to the remote sensor station 107 and/or the monitor unit 102 of FIG. 1. Various embodiments of the remote sensor station 300 may implement radio RF transceivers for such communications using, without limitation, one or more of the transceiver implementations illustrated in FIG. 3. Possible transceivers include, without limitation, one or more RF antennas and transceivers 312 designed for video, audio and/or data transmission and/or reception. Transceiver 312 may use, for example, VHF, UHF, cellular telephone, Bluetooth, Wi-Fi or other signal transmission spectra. Modulation formats may include amplitude modulation (AM), single sideband (SSB) modulation, frequency modulation (FM), phase modulation (PM) or other modulation schemes known to those of skill in the art. Radio transceivers with appropriate antennas may also include a Bluetooth transceiver 310, a Wi-Fi transceiver 311 and/or the cellular telephone transceiver 309.
The remote sensor monitoring unit 300 Bluetooth, Wi-Fi, cellular or other RF transceivers of FIG. 3 may be used to send and/or receive signals to/from the sensors 114 of the object or subject 101 as illustrated in FIG. 1. Such remote sensors may include audio, image, environmental, medical, process, equipment or other sensors to further determine and capture information concerning the object or subject 101 of FIG. 1. The medical sensors 308 of FIG. 3 may be implemented as part of the sensors 114 of FIG. 1 with RF or physical wire, cable or fiber optic connections between the sensors 114 and the remote sensor station 107 of FIG. 1.
Medical sensors 308 implemented as part of the subject 101 sensors 114 may include sensors for subject temperature, blood pressure, pulse rate or other sensors designed to monitor critical medical parameters for the subject 101 of FIG. 1.
The remote sensor monitoring unit 300 of FIG. 3 may also include, without limitation, a GPS (Global Positioning System) unit 313 for an accurate determination of the object or subject monitoring position. Other positioning technology not shown may include determining positions based on cellular telephone tower triangulation, Wi-Fi signal location technology or other location determination capabilities known to those of skill in the art. The precise location of the remote sensor station 300 may be important, for example, when reporting alarm conditions to the monitor unit 102 or monitor center 118 of FIG. 1. Including such location determination capability permits the remote sensor station 300 to be moved from location to location or taken on trips and used, for example, at homes, hotels, warehouses, factories, businesses or in other such locations or accommodations.
The remote sensor monitoring unit 300 of FIG. 3 may also include a time/clock unit 319 for providing accurate time information. The time/clock unit 319 may operate off the power supply unit 321 or use separate batteries or a power supply to ensure accurate time information. Accurate time information may be used, for example, to control operations of the object or subject sensor station microphones, acoustic signal generators, cameras, environmental sensors and radio transceivers discussed above. For example, it may be desirable to initiate or terminate operation of the acoustic signal generators, cameras and/or other sensors under program control at specific times as determined by control parameter settings. It may also be important to transmit accurate time and/or date information with alarm signals transmitted to the monitor unit 102 and/or monitor center 118 of FIG. 1.
Timing signals broadcast using AM, FM, shortwave radio, Internet Network Time Protocol (NTP) servers as well as atomic clocks in satellite navigation systems may be used in the present invention. For example, WWV is the call sign of the United States National Institute of Standards and Technology's (NIST) HF ("shortwave") radio station in Fort Collins, Colo. WWV continuously transmits official U.S. Government frequency and time signals on 2.5, 5, 10, 15 and 20 MHz. These carrier frequencies and time signals are controlled by local atomic clocks traceable to NIST's primary standard in Boulder, Colo. These and other available time transfer methods may be used in the subject monitoring system and method of this invention.
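As one illustration of such time transfer, the following minimal sketch queries an NTP server; it assumes the third-party ntplib package and a reachable public server pool, neither of which is specified by this disclosure.

    import ntplib                      # third-party package: pip install ntplib
    from datetime import datetime, timezone

    client = ntplib.NTPClient()
    response = client.request("pool.ntp.org", version=3)   # public NTP server pool
    utc = datetime.fromtimestamp(response.tx_time, tz=timezone.utc)
    print("NTP time:", utc.isoformat(), "local clock offset (s):", response.offset)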
Also depicted in FIG. 3 is artificial intelligence capability 314 incorporating advanced signal processing for analysis of the various sensor signals, evaluation of varying conditions and situations concerning the observed subject, and integration of the results of such observation, analysis and evaluation into comprehensible, usable outputs for system control and/or alerting users of the monitoring system of this invention. Such artificial intelligence capability 314 may include, without limitation, image analysis, noise reduction, speech recognition, text-to-speech and speech-to-text conversion, natural language processing, expert system analysis, fuzzy logic analysis and/or neural network analysis as discussed in more detail below.
For example, as discussed further below, image analysis may be used to detect changes in the object field being viewed, recognition of particular objects, facial recognition, recognition of movements or changes in configuration of the object or subject being viewed and the like. Noise reduction may include time and/or frequency domain analysis of received video and/or acoustic signals to remove or reduce the effects of background signal interference and noise to improve signal analysis results. Speech recognition may be used to recognize particular spoken words, commands or phrases. Text-to-speech and/or speech-to-text conversion may be used to convert between textual and spoken or auditory messages including commands, alerts and/or informative communications concerning objects being monitored by the sensor monitoring network of this invention. Natural language processing may be used for automatic contextual understanding of input signals or messages to facilitate automatic response to those signals or messages.
FIG. 4 provides a more detailed exemplary configuration diagram for the sensor network monitor unit 102 of FIG. 1, labeled sensor network monitor unit 400 in FIG. 4. The sensor network monitor unit 400 is controlled by processor 401 which may be a computer, processor, microprocessor or digital controller of the types well known to those of skill in the art. The processor 401 is connected to memory 412 for storage of programs and data for the sensor network monitor unit 400. The processor 401 may include without limitation conventional processor capabilities as well as more specialized processing capabilities implemented for example with digital signal processing (DSP) or other well-known computer technology. The memory 412 may comprise without limitation solid-state random-access memory (RAM), solid-state read-only memory (ROM), and/or optical memory or mechanical, optical or solid-state disk drive memory. The processor 401 also includes a power supply 416 which may comprise without limitation batteries, AC/DC power converters, solar cells or other green energy power sources. The processor 401 may also include input/output device capabilities 415 comprising without limitation USB ports, the Common Flash memory Interface (CFI) approved by JEDEC and other standard unit interfaces. The processor 401 also includes a connection to a speaker 411 for broadcasting audio signals from the sensor network monitor unit 400 or other sources as needed.
Also, like the remote sensor station 300 of FIG. 3, the sensor network monitor unit 400 of FIG. 4 may comprise artificial intelligence capability 409. Artificial intelligence 409 may include, for example and without limitation, image analysis, noise reduction, speech recognition, natural language processing, expert system analysis, fuzzy logic and/or neural networks. Including this capability in the remote sensor station 300 of FIG. 3 may complement that capability also contained in the sensor network monitor unit 400 of FIG. 4. In some embodiments of this invention such artificial intelligence capability may be implemented in only one, or in both, of the remote sensor station 300 and the sensor network monitor unit 400.
In some embodiments, as described above for the FIG. 3 remote sensor station 300, artificial intelligence 409 of the monitor unit 400 of FIG. 4 may comprise image analysis used to detect changes in the object field being viewed, recognition of particular objects, facial recognition, recognition of movements or changes in configuration of the object or subject being viewed and the like.
Artificial intelligence 409 of the monitor unit 400 of FIG. 4 may further comprise noise reduction capability. In some embodiments, adaptive filtering may be implemented in the time or frequency domain. Such filtering is capable of isolating audible human speech signals from background white noise, pink noise or music signals and periodic signals such as the above described control tone frequencies. For example, an adaptive filter with frequency domain signal analysis may comprise Fast Fourier Transform (FFT) analysis to separate the received signal into frequency subbands or bins, capability for analyzing said frequency subbands or bins to isolate received noise signals from audio voice signals in the frequency domain, and capability for reducing noise signal levels in the respective frequency subbands or bins. The adaptive filter may then combine the subbands or bins with reduced noise signal levels and use Inverse Fast Fourier Transform (IFFT) analysis to generate time domain signals representing the desired audio voice signals for broadcasting via the monitor unit speaker. These and other techniques known to those skilled in the art may be used to separate the desired subject audio signals from the background noise, interference or music picked up by the microphones 109 and broadcast to the monitor unit 102 of FIG. 1.
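A minimal sketch of the FFT/IFFT subband approach just described, using simple spectral subtraction; frame sizes, the assumption that the leading frames are noise-only, and all names are illustrative rather than prescribed.

    import numpy as np

    def spectral_subtract(x, frame=512, hop=256, noise_frames=10):
        """Frequency-domain noise reduction: estimate a noise floor per FFT bin
        from the first few frames, subtract it, and resynthesize by overlap-add."""
        win = np.hanning(frame)
        n_hops = (len(x) - frame) // hop
        # Noise magnitude estimate per bin (assumes leading frames contain noise only).
        noise_mag = np.mean(
            [np.abs(np.fft.rfft(win * x[i*hop:i*hop+frame])) for i in range(noise_frames)],
            axis=0)
        y = np.zeros(len(x))
        for i in range(n_hops):
            spec = np.fft.rfft(win * x[i*hop:i*hop+frame])
            mag = np.maximum(np.abs(spec) - noise_mag, 0.0)   # subtract the noise floor
            spec = mag * np.exp(1j * np.angle(spec))          # keep the noisy phase
            y[i*hop:i*hop+frame] += np.fft.irfft(spec, n=frame)
        return y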
Artificial intelligence 409 of the monitor unit 400 of FIG. 4 may further comprise expert system analysis for derivation of control signals based on evaluation of received sensor signals with comparison to expert control or output rules. Such expert system analysis is a form of artificial intelligence implemented to emulate the results of human reasoning based on observed conditions. In some embodiments, fuzzy logic, an extension of expert system analysis, may be used wherein allowance is made for uncertainty in variable values, with ranges of values defined to accommodate such uncertainties. Ranges for particular variables may in fact overlap depending on expert-defined fuzzy logic rules or particular implementations. Expert system output control or message information is derived to further emulate human reasoning based on concerns, issues or uncertainties in observed conditions. In some embodiments, depending on the number of variables and expert system rules defining variable relationships, the total number of such relationships may grow exponentially and complicate expert system analysis as explained further below. In some embodiments, hierarchical expert system control may be used to offset such exponential growth in control and complexity as explained further below.
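The following minimal sketch illustrates the fuzzy logic idea of overlapping expert-defined ranges and a single propositional rule; the variables, breakpoints and rule are hypothetical examples, not values taken from this disclosure.

    def tri(x, a, b, c):
        """Triangular membership function rising from a to a peak at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    # Overlapping ranges for two monitored variables (hypothetical breakpoints).
    temp_high = lambda t: tri(t, 30.0, 40.0, 50.0)       # degrees C
    smoke_high = lambda s: tri(s, 20.0, 80.0, 140.0)     # ppm

    def fire_concern(t, s):
        """Rule: IF temperature is high AND smoke is high THEN concern is high.
        The fuzzy AND is taken as the minimum of the two memberships."""
        return min(temp_high(t), smoke_high(s))

    print(fire_concern(42.0, 60.0))   # degree of concern in [0, 1]

A hierarchical arrangement would feed such intermediate concern values into a smaller set of higher-level rules rather than enumerating every combination of raw variables, which is one way the exponential rule growth noted above can be contained.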
Artificial intelligence 409 of FIG. 4 may also comprise neural networks. Neural networks are yet another artificial intelligence tool for rapid evaluation of various combinations of inputs and derivation of outputs. Neural networks are based on networks of interconnected nodes emulating the structure and interconnection of neurons in the human brain. Such neural networks are trained to respond in particular ways to particular combinations of inputs.
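For illustration, a minimal forward pass through one hidden layer of such a network; the weights here are random placeholders, whereas a deployed network would be trained on labeled sensor data.

    import numpy as np

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)    # input layer: 4 sensor features
    W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)    # output layer: 3 concern classes

    def forward(x):
        """One hidden layer with ReLU activation, softmax output over classes."""
        h = np.maximum(0.0, x @ W1 + b1)
        z = h @ W2 + b2
        e = np.exp(z - z.max())
        return e / e.sum()

    print(forward(np.array([0.7, 0.2, 0.9, 0.1])))   # class probabilities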
Database access capability 413 of FIG. 4 may be implemented separately or as part of the system software operating on processor 401 for use in accessing system parameters, control information, status information, history, audio recordings, video recordings, image recordings, operational information, contact information, internet addresses, telephone numbers, received messages, alarm signals and/or other information used in the operation of the sensor network monitor unit 400.
As also shown in FIG. 4, the processor 401 may operate with display 402 for displaying images, control information or messages received by the sensor network monitor unit 400. Controls 403 are also integrated with processor 401 for on/off control, speaker volumes, transceiver operations and/or other operational features of the sensor network monitor unit 400.
In addition, as also shown in FIG. 4, sensor network monitor unit 400 may include RF antenna and transceiver 404 compatible with the RF antennas and transceivers 309/312 of the remote sensor station 300 depicted in FIG. 3 and/or the monitor center 118 of FIG. 1. Similarly, sensor network monitor unit 400 may include, without limitation, a Bluetooth transceiver 405, Wi-Fi transceiver 406 and/or cell phone transceiver 407 for communication with the remote sensor station 300 of FIG. 3 and/or the monitor center 118 of FIG. 1.
Also shown in FIG. 4 is noise and/or interference signal reduction capability 408. As explained in more detail below, the present invention includes the capability to reduce or eliminate background interference and/or noise from signals received from the remote sensor station 300 of FIG. 3. The remote sensor station 300 transmits audible sounds made by the object or subject being monitored. In addition, as explained above, the remote sensor station 300 may also broadcast audible sounds such as white noise, pink noise or music into the environment of the object or subject. As also explained above, the remote sensor station 300 may include directional microphones or directional microphone arrays designed to primarily respond to audible sounds in the direction of the subject. Nonetheless, additional background signals such as noise, music, etc. from the broadcast audible sounds may be picked up to some degree by the remote sensor station 300 microphones 302. Other noise or interference signals generated in the area occupied by the object or subject may also be present in the signal transmitted to the sensor network monitor unit 400. Receiving these additional noise and/or interference sources may be confusing, annoying and/or distracting to users of the sensor network monitor unit 400. It is the purpose of the signal reduction capability 408 of FIG. 4 to substantially reduce or eliminate such extraneous noise from the broadcast by the speaker 411 of FIG. 4 while at the same time not reducing or eliminating desired audible sounds generated by the subject or otherwise originating from the surroundings of the subject.
In one embodiment a frequency tone or digital control signal or the like, referred to herein as a "keep alive" signal, may be added to the noise signal that is transmitted from the remote sensor station 107 of FIG. 1. Such "keep alive" signals are added during periods of time when no audio signals other than the added background signals are detected from the subject or subject surroundings being monitored. The "keep alive" signal is not broadcast by the remote sensor station 107 into the area occupied by the subject. Rather it is added as a control signal to the signal transmitted from the remote sensor station 107 to the monitor unit 102. Users of the monitor unit 102 may elect: (1) to always hear all signals transmitted from the remote sensor station 107 to the monitor unit 102, including the added background noise or music signals, any transmitted control tone and, of course, any audible signals from the object or subject being monitored; (2) to not hear the added background signals but to always hear the transmitted keep-alive control tone or signals and, again of course, any audible signals from the subject or subject's surroundings being monitored; or (3) to only hear audible signals from the subject being monitored.
Users may choose option (1) above when it is desired to hear all sounds from the area occupied by the object or subject. Users may choose option (2) or (3) above when it is desired not to hear the background noise signals, which can be very distracting or annoying, but at the same time to receive a periodic keep-alive control tone for reassurance that the subject monitoring system is actually operating correctly. When choosing option (1) or (2), controls may adjust the time period between the "keep alive" signals to a time period acceptable to the user. Such time period or interval adjustments may be made at the monitor unit 102 and/or the remote sensor station 107. The volume of "keep alive" control tones broadcast via speaker 103 of FIG. 1 may also be adjusted.
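A minimal sketch of this transmit-side tone insertion and the three receive-side listening options; the sample rate, tone frequency, burst length and names are hypothetical placeholders.

    import numpy as np

    FS = 16000                       # hypothetical sample rate, Hz
    TONE_HZ = 1000                   # hypothetical keep-alive tone frequency
    BURST_S = 0.25                   # tone burst length, seconds

    def keepalive_burst():
        """Short sinusoidal control tone inserted into the transmitted signal."""
        t = np.arange(int(BURST_S * FS)) / FS
        return 0.1 * np.sin(2 * np.pi * TONE_HZ * t)

    def transmit_frame(subject_audio, background, quiet):
        """Transmit side: append the keep-alive tone only when no subject audio
        is detected, so the link is verifiably alive while the area is quiet."""
        frame = subject_audio + background
        return np.concatenate([frame, keepalive_burst()]) if quiet else frame

    def receiver_mix(subject_audio, background, tone, option):
        """Receive side: the three user listening options described above
        (all input arrays are assumed to be the same length)."""
        if option == 1:
            return subject_audio + background + tone   # hear everything
        if option == 2:
            return subject_audio + tone                # suppress background only
        return subject_audio                           # option (3): subject only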
Like the remote sensor station 300 of FIG. 3, the sensor network monitor unit 400 of FIG. 4 may also include a GPS receiver unit 410 for an accurate determination of the monitoring unit position. Other positioning technology not shown may include determining positions based on cellular telephone tower triangulation, Wi-Fi signal location technology or other location determination capabilities known to those of skill in the art. The precise location of the sensor network monitor unit 400 may be important, for example, when reporting alarm conditions to the monitor center 118 of FIG. 1. Including such location determination capability permits the sensor network monitor unit 400 to be moved from location to location or taken on trips and used, for example, at hotels or in other accommodations.
In addition, like the remote sensor station 300 of FIG. 3, the sensor network monitor unit 400 of FIG. 4 may also include a time/clock unit 414 for providing accurate time information as described above for the remote sensor station 300 of FIG. 3. The time/clock unit 414 may operate off the power supply unit 416 or separate batteries or a power supply to ensure accurate time information. Accurate time information may be used, for example, to control operations of the sensor network monitor unit 400 as discussed above. For example, it may be desirable to initiate or terminate operation of the keep-alive signal under program control at specific times as determined by control parameter settings. It may also be important to transmit accurate time and/or date information with alarm signals and/or status signals transmitted to the network monitor center 118 of FIG. 1.
In addition, like the remote sensor station 300 of FIG. 3, the sensor network monitor unit 400 of FIG. 4 includes input/output ports 415 for access to the sensor network monitor unit 400. As is the case for FIG. 3, these ports may include capabilities comprising, without limitation, USB ports, the Common Flash memory Interface (CFI) approved by JEDEC and other standard unit interfaces.
FIG. 5 illustrates exemplary audio signals 500 of a type used in some embodiments of the subject monitoring system and method of this invention. Noise or other background signals 501 may be generated by the monitor unit 102 or the remote sensor station 107 of FIG. 1 and broadcast by speaker 111 to be heard by subject 101. White noise is a random signal with a uniform frequency spectrum over a wide range of frequencies. It is an electronically produced sound that may be somewhat similar to the sound of steady rain. In some embodiments pink noise 502 may be used in place of the white noise 501. Pink noise is a random signal within the audible frequency range whose amplitude decreases as frequency increases. In other embodiments different audible signals may be used, for example, to calm the subject, including different types of noise signals, music, a familiar person's voice or other audible signals pleasing to the subject 101.
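A minimal sketch of how such background signals might be synthesized digitally; the 1/f spectral shaping shown is one common way to approximate pink noise, and the sample rate and names are illustrative.

    import numpy as np

    def white_noise(n):
        """Flat-spectrum random signal, scaled to unit peak."""
        x = np.random.randn(n)
        return x / np.max(np.abs(x))

    def pink_noise(n):
        """Shape white noise to a 1/f power spectrum so amplitude falls
        as frequency rises, approximating the pink noise described above."""
        spec = np.fft.rfft(np.random.randn(n))
        f = np.fft.rfftfreq(n)
        f[0] = f[1]                      # avoid dividing by zero at DC
        spec /= np.sqrt(f)               # power proportional to 1/f
        x = np.fft.irfft(spec, n=n)
        return x / np.max(np.abs(x))

    fs = 16000                           # hypothetical sample rate
    background = pink_noise(10 * fs)     # ten seconds of broadcast background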
Audio signal 503 is generated by subject 101 and may be speech signals, shouting or other audible signals. As can be seen from FIG. 5, the frequency domain representation of audio signal 503 is quite different from the background signals 501 or 502. Such differences in the time and frequency domains permit noise reduction software to separate, reduce or eliminate audio noise signals 501 or 502 from the audio speech signal 503.
It is to be understood that the microphone array 109 or other microphone implementations of the remote sensor station 107 of FIG. 1 may detect sounds in the area being monitored including sounds from signals such as white noise 501, pink noise 502, other noise sources such as music or a soothing voice, and other audio signals 503 from the subject 101.
In some embodiments, and as explained further below, the monitor unit 102 may broadcast via speaker 103 the total received signal from the remote sensor station 107. In other cases, the user may elect to have the remote sensor station suppress the noise or background component 501 or 502 or the like in the received signal using noise or interference reduction software as explained further below. However, the user may still want to receive a "keep alive" or other control signal such as the periodic sinusoidal signal 504 or the like for providing reassurance to the user that the remote sensor station 107 is functioning properly. In this sense, the periodic sinusoidal or control signal may be considered a "keep alive" signal used to inform the user that the monitoring system of the present invention is indeed operational even though no background noise or audible signals from the subject 101 are being heard. Suppressing the noise component of the received signal before broadcasting over the speaker 103 of the monitor unit 102 will result in a less confusing or annoying signal to be received by the user of the subject monitoring system and method of this invention. Also, the control capabilities 112 of the remote sensor station 107 will permit adjustment of the time between the audible sinusoid or other received "keep alive" signals. Such timing control signals may also be transmitted from the monitor unit 102 to the remote sensor station 107 to select the time interval between "keep alive" signals.
The exemplary sinusoidal waveform 504 is depicted as comprising a sine wave signal of different or varying frequencies. In some embodiments, a single sinusoidal or other predictable signal waveform may be used instead of the multiple frequency sinusoidal waveform 504. In other embodiments, a digital control signal may be used in place of the sinusoidal signal 504 of FIG. 5. The periodic sinusoidal or other control signal inserted into the transmitted signal from the remote sensor station 107 to the monitor unit 102 may be used in various ways. In addition to the keep alive function, other control or message information may be encoded into these signals.
FIG. 6 illustrates, without limitation, exemplary composite signals 600 of the type described above. The exemplary composite signal 601 comprises, for example, a white noise signal such as white noise signal 501 of FIG. 5 with periodic insertion of the multi-frequency sinusoidal signal 504 of FIG. 5. FIG. 6 also depicts at 602 combining the pink noise signal 502 of FIG. 5 with a single frequency sinusoidal signal as shown. As discussed above, when such signals are transmitted from the remote sensor station 107 to the monitor unit 102 of FIG. 1, the user may elect at the monitor unit 102 using controls 104 to suppress the noise signals in 601, 602 or similarly constituted signals, while only hearing the sinusoidal or other "keep alive" signals. In addition, as discussed above, the periodicity of the audible sinusoidal signals may be adjusted using the controls 112 of the remote sensor station 107. In this way the user can eliminate the annoying audible noise signals while at the same time receiving periodic tones for reassurance that the subject monitor system and method are working properly. In yet another possible configuration, the user may elect to listen to all signals including the noise signals, the periodic sinusoidal signals, subject audible signals and all other audible signals that may be detected by the microphones 109 of the remote sensor station 107 of FIG. 1. In yet another configuration the user may elect to only hear audible speech signals such as signals 503 of FIG. 5. It is clear from the above that the system and methods of the present invention provide maximum flexibility to the user in choosing which audible signals to hear while monitoring the subject 101.
FIG. 7 illustrates in more detail the operations 700 involving the microphone array 109 of the remote sensor station 107 shown in FIG. 1. The microphone array 109 is indicated as microphone array 703 in FIG. 7. The microphone array 703 is used in the implementation of the beamforming and noise cancellation capability 700 of FIG. 7. The microphone array operates with beamforming software and/or hardware 704 to form a directional acoustical beam 705 primarily sensitive to audible signals emanating in the area covered by such a directional beam 705. For example, configuring the beamforming capability to be directed toward the subject 701 will result in primarily picking up audible signals in the direction of subject 701, while being less sensitive to other audible signals such as noise 702 illustrated in FIG. 7. In this way audible signals that may mask desired audible sounds from the subject 701 or otherwise confuse the user of the subject monitor system 100 of FIG. 1 can be at least partially excluded from the detected audio signal. In some embodiments, the beamforming noise cancellation system may be capable of automatically directing the beam toward the audible sounds from the subject 701. In other embodiments, the beam may be manually adjusted or be directed by physical placement of the remote sensor station 107 and the subject 701.
As further illustrated in FIG. 7, the beamforming noise cancellation capability 700 may also include additional signal processing capability 706. This signal processing capability may include further noise reduction for extraneous or leakage noise sources outside the main beam lobe. In addition, this further signal processing may be used to add the sinusoidal or other control "keep alive" signals 504 of FIG. 5 to the noise signals to result in composite signals such as signals 601 and 602 of FIG. 6. As explained above, these additional sinusoidal or other control signals may be used to selectively eliminate background noise signals transmitted from the speaker 103 of the monitor unit 102 of FIG. 1, thereby reducing or eliminating background noise that the user may find annoying or confusing when monitoring the subject's activities.
As further illustrated in FIG. 7, the composite audio output signal may be passed to transmitter 707 and antenna 708 for transmission to the monitor unit 102 and/or the monitor center 118 of FIG. 1. As described above, this transmitter may be implemented, without limitation, with a Bluetooth, Wi-Fi, cellular telephone or other appropriate signal transceiver for communications.
Also illustrated in FIG. 7 are control signals including beamforming control 709, signal processing control 710 and transmitter control 711. These control signals may be used to select particular operations of the acoustic beamforming, signal processing and transceiver capabilities. It is to be understood that all of the beamforming, signal processing and control may be implemented separately or integrated with the processor control of FIGS. 3 and/or 4.
FIG. 8 illustrates, without limitation, exemplary controls 800 for the subject monitoring system and methods of the present invention. These controls may be implemented in a variety of ways familiar to those of skill in the art. For example, dials and/or switches may be employed to select specific control options. In some embodiments, control options may be shown on the display 105 of monitor unit 102 as illustrated in FIG. 1 and/or other displays not shown on the remote sensor station 107. It is to be understood that, in different embodiments of this invention, the exemplary control options of FIG. 8 may be implemented in both the monitor unit 102 and the remote sensor station 107, or be implemented in just one of the monitor unit 102 and the remote sensor station 107, or those control options may be distributed with some being implemented in the monitor unit 102 while others are implemented in the remote sensor station 107 of FIG. 1. That is to say, remote control capabilities may be distributed between the monitor unit 102 and the remote sensor station 107 using the communication capabilities of those units. In some embodiments, control options may also be executed by remote processors including cloud processor and storage configurations. In some embodiments control processing may be implemented on a distributed basis between the monitor unit 102, the remote sensor station 107 and cloud computing such as cloud computing 120 in FIG. 1. Distributed processing may also extend to the network monitor center 118 of FIG. 1. Results of such distributed processing may be displayed on display 801 and/or other displays associated with distributed processing configurations.
Exemplary display 801 is illustrated in FIG. 8 with indicated display icons 802. The display 801 may be controlled from external controls such as a keyboard or mouse pointer arrangement. In some embodiments, the display 801 may also be implemented, without limitation, as an interactive, touchscreen display with display options selected by the touch of a finger and/or an appropriate stylus depending on the implementation of the touchscreen display. Possible interactive display technologies include, without limitation, capacitive touchscreens, electromagnetic radiation sensitive touchscreens, optical sensitive touchscreens and pressure sensitive touchscreens. Touchscreen sensors may be implemented as an integral part of the display 801 with control electronics integrated into that display. Alternatively, sensors located around the periphery of the display 801 may be used to determine the XY coordinates or positions of a finger or stylus used to select particular icons.
FIG. 8 also provides, without limitation, exemplary control options for the monitoring system and methods of the present invention. Nine exemplary high-level controls are indicated which may be accessed, for example, through the nine indicated icons 802 of the display 801. In addition to control of the display 801, eight high-level icons permit user selection of the particular features to be controlled, with exemplary features including, without limitation, audio signals, video signals, medical signals, process signals, followed IoT sensor signals, transceivers, telecommunication network alerts and artificial intelligence settings. Selecting any one of these control options may result in the dynamic changing of the interactive control screen 801 to display another level of control options. An example of such operation is illustrated in FIG. 8 wherein selection of the top-level icon 803 results in opening of three additional icons 804. Additional levels of control may also be implemented wherein selecting one of the icons 804 will result in opening yet another level of control icons or objects.
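One simple way to represent such a hierarchy of control icons is a nested mapping that is walked one level at a time as icons are selected; the menu labels below are illustrative placeholders drawn loosely from the figure description, not a prescribed layout.

    # Hypothetical hierarchical control menu; each selection opens the next level.
    MENU = {
        "audio signals": {
            "background sound": {"white noise": None, "pink noise": None},
            "beamforming": {"enable": None, "disable": None, "beam width": None},
            "transmit options": {"all": None, "with keep-alive": None, "subject only": None},
        },
        "video signals": {"full motion": None, "still images": None, "infrared": None},
    }

    def open_icon(menu, *path):
        """Return the sub-icons revealed by selecting each icon along path."""
        node = menu
        for icon in path:
            node = node[icon]
        return sorted(node) if node else []

    print(open_icon(MENU, "audio signals"))                   # next level of icons
    print(open_icon(MENU, "audio signals", "beamforming"))    # deeper level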
For example, selecting the high-level "display" icon may open additional icons for further control of the display including options, for example and without limitation, for screen configuration, brightness and/or zoom control. Selecting the high-level "display" icon may also, for example, open additional icons for control of the display of video signals used to capture activities of the subject 101 and/or activities in the area surrounding the subject 101 of FIG. 1.
Selecting the high-level audio signal icon may open additional levels of control for selecting, for example, white noise or pink noise as illustrated in FIG. 8 or other audible signals not shown. Additional control options may include time periods during which the background white noise, pink noise or other suitable sounds are to be broadcast from the remote sensor station 107 into the area surrounding the subject 101. Other control options may include keep alive signal parameters, the volume of the background sounds and/or other sound control parameters.
As also illustrated in FIG. 8, selecting the high-level audio signals icon may give access to further lower-level icons to control acoustical beamforming operating with the microphone array 109 of the remote sensor station 107 of FIG. 1. Beamforming options may include, without limitation, enabling acoustical beamforming, disabling acoustical beamforming, beam configuration, beam width, and/or selectively directing the main beam in selected directions depending upon the physical configuration of the subject environment and the object or subject 101 relative to the remote sensor station 107 of FIG. 1. While acoustic beamforming serves to reduce pickup of extraneous noise by the microphone array, depending on the situation, leakage noise from other sources not located in the main lobe of the acoustic beam may also be present. For this reason, it may be desirable to implement yet further noise reduction prior to transmission to the monitor unit 102 of FIG. 1. Control options for this purpose may include enabling or disabling such additional noise reduction and/or selecting particular parameters to be used in further noise reduction algorithms.
Selecting the high-level "audio signal" icon in FIG. 8 may also provide user control over the signal to be transmitted from the remote sensor station 107 to the monitor unit 102 of FIG. 1 of the present invention. Several options for such signals may exist including those listed in FIG. 8. In one exemplary option, the transmitted signal includes the background noise or other accompanying sounds plus any subject audio signals detected by the microphones 109 of FIG. 1. A second exemplary option transmits the same signals and also includes the periodic "keep alive" signal such as the "keep alive" sinusoidal signals depicted in FIGS. 5 and 6 and described above. As described above, such "keep-alive" signals may be used by the remote sensor station 107 of FIG. 1 to control the broadcast of selected signals via speaker 111 of the remote sensor station 107. As also discussed above, this capability permits suppressing or eliminating confusing or unwanted background signals from those audible signals transmitted to the monitor unit 102 while still ensuring that the subject monitoring system is operating properly and while still permitting transmission of audible sounds from the object or subject 101. Yet a third signal selection option indicated in FIG. 8 permits transmission of only the keep alive sinusoidal signals of FIG. 6 together with any object or subject 101 audible sounds. This option suppresses the background noise signals from the signal actually transmitted from the remote sensor station 107 to the remote monitor unit 102 of FIG. 1. In yet another option only detected audible sounds from the object or subject 101 may be broadcast by the remote sensor station 107 speaker.
Video signal controls indicated in FIG. 8 may include controls for selecting full-motion video, still images or the use of infrared imaging. Particular image analysis software may also be selected for the corresponding image capture.
As further indicated in FIG. 8, in some embodiments one of the control icons of the control display 801 may be dedicated to the analysis of medical sensors for persons being monitored. Such sensors may include, for example and without limitation, temperature sensors, blood pressure sensors, cardiac sensors and/or oxygen sensors. Medical sensors may, for example and without limitation, be worn by the persons being monitored, implemented in the person's clothing or implanted in the persons being monitored.
As also indicated in FIG. 8, in some embodiments a control icon for process monitoring may be implemented in the display 801. Such processes may include, for example and without limitation, manufacturing processes, material flow processes and logistic control processes involving movement of materials or products or the transportation needs for such movement.
In some embodiments it may be desirable or even important to be made aware of monitoring and control results for other remote sensor stations as indicated in the sensor monitoring network of FIG. 2A. In an aspect of this invention such monitoring of the remote sensor stations may be implemented by designating certain other remote sensor stations to be "followed" by an individual remote sensor station. As indicated in FIG. 8, such following may include audio/video/medical/process alerts from other remote sensor stations. Other shared alerts may include, without limitation, alerts for weather, traffic, crowds, pipeline status, utility power systems, emergency alerts and even terrorist alerts. In this way a given remote sensor station may be made aware of situations of concern at other remote sensor stations. These other remote sensor stations may be in close proximity to the following remote sensor station. In addition, such "following" may be used to track results from other remote sensor stations not necessarily in close proximity to the following remote sensor station but still important in the evaluation of overall situations of concern.
Selecting the high-level signal transceiver icon of FIG. 8 provides user control of selected radio frequency or other transmission system capabilities for communications between the monitor unit 102, the remote sensor station 107 and/or the monitor center 118 as illustrated in FIG. 1. As indicated in FIG. 8, selection of the signal transceiver icon may cause opening of additional icons on the interactive displays allowing, without limitation, configuration of system communications for use of Bluetooth, Wi-Fi or cellular technology. For example, without limitation, such communications with the monitor center 118 may be connected through cellular telephone, microwave, fiber-optic link, cable communications, wired connections or other appropriate telecommunications media. Also, the monitor unit 102 may communicate through Wi-Fi or Bluetooth connections to a local router for connection to broader telecommunication networks. In some cases, it may be appropriate to also specify data rates, transmission times, transmission formats or other telecommunication system parameters for operative connection to the chosen telecommunications media.
Selecting the high-level artificial intelligence (AI) icon permits managing the use of the various AI options including parameter selections, signal processing operations, and control and/or warning signal generation as discussed above and in greater detail below.
In the various embodiments discussed above, selecting a high-level icon in FIG. 8 enables user selection and enablement of particular alarm conditions for the generation of alarm signals. Selecting particular icons may open additional icon selections for configuration of particular alarm situations. As shown in FIG. 8, possibilities include, without limitation, audio alarms, video alarms, medical alarms, process alarms, followed IoT alarms and telecommunication network alarms. For example, alarm conditions for monitoring subject people may include movements, extended silence, medical conditions, presence of an intruder, undesired presence of pets or animals, and unacceptable environmental conditions such as out-of-range temperature or the presence of dangerous gases, smoke or fire. Medical alert conditions may include, without limitation, lack of response from the subject, subject temperature, blood pressure, oxygen levels or pulse rate. In some embodiments, the subject may wear a medical monitoring device not shown, such as an arm or leg bracelet, to monitor critical subject medical parameters.
As also discussed above, transmitted alarm signals may also include specific location information such as, without limitation, the GPS location of the object or subject 101. Including this capability permits the monitor unit 102 and/or the remote sensor station 107 to be moved from place to place or even carried on a trip to a distant location while still being operative to transmit alarm signals to/from the monitor center 118 or other appropriate locations, with those alarm signals including the current location of the object or subject situation needing attention. Knowing the location can be important in derivation of information or warning messages depending on the situations at particular locations.
As further described below, FIGS. 9, 10A and 10B illustrate, without limitation, an exemplary flowchart of operations for the exemplary networks of FIGS. 1, 2A and 2B as discussed above. In some embodiments the operations of FIGS. 9, 10A and 10B may be distributed between the remote sensor station 107, the monitor unit 102, the network monitor center 118 or other accessible distributed cloud or processing capabilities of FIG. 1. The operation may be automatically or manually initiated at the start block 901. The control setup and initiate block 902 analyzes and initializes the various control inputs from the users of the subject monitoring system of this invention. Such control inputs may include, without limitation, those control operations described in FIG. 8 above.
Having initiated the subject monitor unit 102, control is passed to one or more of the exemplary sensor input blocks: audio sensor inputs 903, image sensor inputs 904, medical sensor inputs 905, process sensor inputs 906, followed remote sensor station inputs 907 and/or telecommunication network sensor inputs 908. It is to be understood that some embodiments may have a subset comprising one or more but not necessarily all of these sensor inputs 903, 904, 905, 906, 907 and 908. Sensor inputs of FIG. 9 may include detection of network cyberattacks and malicious hacking of network connected devices and sensors, telecommunication facilities, data processing and storage facilities, and information collected, stored, transmitted and processed in the IoT network. Also, other sensor inputs not specifically included in FIG. 9 may be used in some embodiments without departing from the teachings of this disclosure.
More detailed exemplary identification of the possible capabilities of each of the sensor inputs 903-908 is provided via flowchart connectors A-F, identified as 909, 910, 911, 912, 913 and 914 respectively, in FIGS. 9, 10A and 10B. Based on analysis of the identified sensor inputs, as described in more detail below, exemplary expert system warning indices are derived at 915, categorizing each as presenting very low, low, medium, high or very high danger or concern for each of the monitored subjects or objects as discussed above. Based on these categorizations, artificial intelligence expert system or fuzzy logic analysis 916 provides comprehensive composite derivation of appropriate warning signals informing users of where the most urgent problems may exist and directing corrective actions according to the relative degrees of danger for each monitored subject or object as described above. Exemplary artificial intelligence analyses are described via connector G 917 of FIG. 9 at FIGS. 12-19 as indicated.
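A minimal sketch of how such five-level warning indices might be derived per sensor and combined into a composite concern; the thresholds, sensor names and the max-combining rule are illustrative assumptions, not values from this disclosure.

    LEVELS = ("very low", "low", "medium", "high", "very high")

    def warning_index(value, breakpoints):
        """Map a sensor reading onto the five concern levels using four
        ascending expert-defined breakpoints separating the five ranges."""
        for i, b in enumerate(breakpoints):
            if value < b:
                return i
        return 4

    # Hypothetical per-sensor breakpoints and current readings.
    indices = {
        "temperature_C": warning_index(39.2, (36.0, 37.5, 38.5, 40.0)),
        "pulse_bpm":     warning_index(95,   (50, 60, 100, 130)),
        "smoke_ppm":     warning_index(12,   (5, 15, 50, 150)),
    }
    composite = max(indices.values())    # most urgent input dominates the composite
    print({k: LEVELS[v] for k, v in indices.items()}, "->", LEVELS[composite])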
  • Based on the above artificial intelligence analysis, warningalarm decisions918 are made. If no alarms are indicated, control is returned to control block902 viapath920. If alarm signals indicate the need for corrective actions, those alarms are transmitted at919 and control returned to control block902 viapath920.
  • Analysis of exemplaryaudio sensor inputs903,image sensor inputs904 andmedical sensor inputs905 via connectors A909,B910 andC911 ofFIG. 9 are shown in more detail in analysis diagram1000 ofFIG. 10A. The connectors A1001,B1007 andC1011 correspond to connectors A909,B910 andC911 respectfully ofFIG. 9. Similarly, analysis of exemplaryprocess sensor inputs906, followedIOT sensor inputs907 and telecommunicationnetwork sensor inputs908 viaconnectors D912,E913 andF914 ofFIG. 9 are shown in more detail in the continuation of analysis diagram1000 ofFIG. 10B. Theconnectors D1016,E1018 andF1020 correspond toconnectors D912,E913 andF914 respectfully ofFIG. 9. It is to be understood that the analysis capabilities depicted inFIGS. 10A and 10B are exemplary. Other embodiments may have a subset of the illustrated capabilities ofFIGS. 10A and 10B or other capabilities not shown without departing from the teachings of this disclosure.
  • Referring now to Background Audio Processing 1002, as explained above and illustrated in FIGS. 5 and 6, the audio sensor signal analysis may include background audio processing with broadcast of audible sounds such as white noise, pink noise, music or other selected sounds into the environment of the object or subject being monitored. These background sounds may be picked up by sensor microphones and transmitted back to the sensor stations. In some embodiments or operations, it may be desirable to suppress these known background sound signals prior to listening to or recording sounds from the monitored subject or object. The Background Audio Processing 1002 may provide such audio signal filtering to remove these retransmitted background signals. As also explained above and illustrated in FIGS. 5 and 6, when such known background signals are removed, it may be desirable to transmit periodic “keep alive” signals from the subject or object monitors to ensure that the units are operating correctly despite such selective background signal suppression. For example, this capability may be especially beneficial when monitoring infants or other persons in need of special care.
  • As explained above and illustrated in FIG. 7, Acoustic Beamforming 1003 may be implemented in some embodiments with a microphone array comprising one or more microphones used to listen to sounds from the object or subject being monitored. Arrays permit the use of microphone beamforming in the direction of sounds emanating from the subject while reducing sounds received from other directions. Microphone arrays use beamforming software to create acoustical beams with a main lobe directed at the object or subject, attenuating or perhaps essentially eliminating sounds arriving at the microphone array from different directions. In some embodiments, acoustic beamforming 1003 makes use of analog-to-digital converters with transversal filters for filtering individual digital microphone signals, with adaptive filter weights permitting adaptive variation of the acoustic beam pattern and side lobe structure.
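  • By way of non-limiting illustration, the following is a minimal Python sketch of delay-and-sum beamforming of the kind described above; the array size, the precomputed integer steering delays and the function names are illustrative assumptions rather than part of the disclosed system.

```python
import numpy as np

def delay_and_sum(mic_signals, delays_samples, weights=None):
    """Steer an acoustic beam by delaying and summing microphone channels.

    mic_signals: 2-D array, one row per microphone.
    delays_samples: integer steering delay (in samples) per microphone.
    weights: optional adaptive per-microphone weights (uniform if omitted).
    """
    n_mics, n_samples = mic_signals.shape
    weights = np.ones(n_mics) / n_mics if weights is None else weights
    output = np.zeros(n_samples)
    for m in range(n_mics):
        # Shift each channel so sounds from the look direction align in time.
        output += weights[m] * np.roll(mic_signals[m], -delays_samples[m])
    return output

# Hypothetical 4-microphone array with assumed steering delays.
rng = np.random.default_rng(0)
mics = rng.standard_normal((4, 16000))
beam = delay_and_sum(mics, delays_samples=[0, 2, 4, 6])
```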
  • As also explained above, in some embodiments acoustic noise reduction 1004 may comprise time- and/or frequency-domain noise reduction adaptive filtering, including the use of Fourier transform analysis with the frequency domain divided into sub-bands for frequency selective signal evaluation.
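  • A minimal, non-limiting sketch of such sub-band frequency-domain noise reduction follows; the band count, the gain floor and the separate noise-estimate input are illustrative assumptions.

```python
import numpy as np

def subband_noise_reduction(signal, noise_estimate, n_bands=32):
    """Attenuate frequency sub-bands in proportion to their estimated noise."""
    spectrum = np.fft.rfft(signal)
    noise_spec = np.abs(np.fft.rfft(noise_estimate))
    for band in np.array_split(np.arange(len(spectrum)), n_bands):
        # Per-band spectral-subtraction gain, floored to avoid over-suppression.
        gain = np.clip(1.0 - noise_spec[band].mean() /
                       (np.abs(spectrum[band]).mean() + 1e-12), 0.1, 1.0)
        spectrum[band] *= gain
    return np.fft.irfft(spectrum, n=len(signal))
```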
  • As also explained above, in some embodiments speech recognition 1005 may be used to recognize particular spoken words, commands or phrases. Natural language processing 1006 may be used for automatic contextual understanding of input signals or messages to facilitate automatic response to those signals or messages.
  • As also explained above, image sensors 904 of FIG. 9 may comprise, for example, video, still image, or infrared visual monitoring of the object or subject 101 and/or the surrounding area. Exemplary image analysis capability is depicted in FIG. 10A via connector 910 of FIG. 9 and corresponding connector 1007 of FIG. 10A.
  • Analysis of image sensor signals may include image filtering/enhancement 1008 in the time and/or frequency domain depending on the application. Time domain filtering may be used to recognize time varying situations in the received image sensor signals, such as movements of the object or subject being monitored or new image features. Frequency domain filtering, including the use of Fourier transforms or Fast Fourier Transforms (FFTs), may be used to analyze and capture frequency domain characteristics of individual images, including gradient changes in image intensity or contrast that may be compared from image to image to assist in ascertaining changes in the image content. Two-dimensional frequency domain filtering of portions or all of an image may be used. Time domain and/or frequency domain filtering may be used to improve image quality for further analysis.
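  • As a hedged illustration of such two-dimensional frequency-domain filtering, the sketch below applies an ideal high-pass mask so that gradient changes in intensity or contrast stand out; the cutoff value and function name are assumptions.

```python
import numpy as np

def highpass_image(image, cutoff=0.1):
    """Suppress low spatial frequencies of an image via a 2-D FFT mask."""
    f = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = image.shape
    y, x = np.ogrid[:rows, :cols]
    # Normalized radial distance from the center of the shifted spectrum.
    radius = np.hypot((y - rows / 2) / rows, (x - cols / 2) / cols)
    f[radius < cutoff] = 0  # zero out the low-frequency region
    return np.real(np.fft.ifft2(np.fft.ifftshift(f)))

img = np.ones((64, 64)); img[20:40, 20:40] = 5.0
edges = highpass_image(img)  # emphasizes the block's edges
```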
  • Image pattern recognition and/or feature extraction 1009 may be used to discover particular patterns or features in the captured images. Such pattern recognition and/or feature extraction may include capabilities such as facial recognition or recognition of particular patterns expected to be found in the image, and alerts when such patterns are not found.
  • Image/tile feature comparisons 1010 may be used for individual images, successive images or frames of video images to monitor for changes in the captured image which may indicate concerns about image content. In some embodiments, images may be segmented into multiple tiles, each representing a subset of a total image, with comparison of tile content from image to image to monitor for changes in each tile. Such changes may indicate potential problems or concerns about the subject or object being monitored.
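  • The tile comparison just described might be sketched as follows; the tile size and the mean-absolute-difference score are illustrative assumptions.

```python
import numpy as np

def tile_change_scores(prev_frame, curr_frame, tile=32):
    """Segment two images into tiles and score each tile by mean change."""
    rows, cols = prev_frame.shape
    scores = {}
    for r in range(0, rows - tile + 1, tile):
        for c in range(0, cols - tile + 1, tile):
            diff = np.abs(curr_frame[r:r+tile, c:c+tile].astype(float) -
                          prev_frame[r:r+tile, c:c+tile].astype(float))
            scores[(r, c)] = diff.mean()
    # Tiles whose score exceeds a chosen threshold flag possible concerns.
    return scores
```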
  • As also explained above, medical sensors 905 of FIG. 9 may be analyzed via connector 911, represented as connector C 1011 in FIG. 10A. Such medical sensor analysis may comprise temperature sensor analysis 1012, blood pressure sensor analysis 1013, pulse sensor analysis 1014, oxygen sensor analysis 1015 or analysis of other medical sensors used to monitor the medical condition of a subject. Other medical sensors may include sensors implanted in the body or tissues of a subject and used for tracking selected medical parameters or bodily functions of the subject.
  • Sensor signal analysis 1000 of FIG. 10A is continued in FIG. 10B. Analysis of process sensor inputs 906 of FIG. 9 is provided by connector D 912 at corresponding connector D 1016 of FIG. 10B, listing exemplary process sensor analysis operations 1017. As indicated at 1017, process sensor analysis may include inputs from process equipment sensors. Such process equipment may include, without limitation, manufacturing equipment, robotic equipment, BOTS, production line monitoring equipment, transportation equipment including air, ground and rail transportation, chemical processing equipment, drones, computer processing equipment, data storage equipment, or any equipment used to facilitate process operations.
  • As also indicated in the process sensor analysis 1017, input sensor data indicating the status of process materials necessary for the execution of particular processes may be collected for analysis. Such data may reflect process material parameters such as material availability, quantity, quality, alternatives, location or other parameters descriptive of the materials necessary to carry out a given process.
  • As also indicated in the process sensor analysis 1017, input sensor data indicating process maintenance requirements, including maintenance schedules, required downtimes, maintenance results, and projected maintenance issues based on process equipment status monitoring, may be collected for analysis. Such status monitoring may indicate, for example and without limitation, operational parameters indicative of future performance issues that may arise in the execution of the process.
  • As also indicated in the process sensor analysis 1017, input sensor data indicating process schedules, including precedent requirements for execution of particular processes and materials required for that execution, may be collected for analysis. Such scheduling may impact other operations throughout the Internet of Things being monitored. Informing monitoring units and controllers throughout the network may greatly facilitate efficient execution of distributed operations in that network.
  • As further indicated in FIG. 10B, followed IoT sensor monitor unit inputs 907 from FIG. 9 may be analyzed via connector E 913 of FIG. 9, represented by corresponding connector E 1018 of FIG. 10B. In some embodiments, it may be important for individual Remote Sensor Stations 107 and/or Monitor Units 102 of FIG. 1 to be made aware of the status or results from other Remote Sensor Stations or Monitor Units located throughout the network. In this case, individual Remote Sensor Stations or Monitor Units may elect to “follow” or be informed of the monitoring results of those other Remote Sensor Stations or Monitor Units. Factors that may enter into such decisions to “follow” other stations or units may include, without limitation, proximity to those other stations or units, criticality of operation of those other stations or units, concerns for dangerous conditions being detected at those other stations or units and/or dependence on the operation or status of other objects or sensors being monitored at those other units. In this way, the overall operations of the present IoT monitoring system may be extended to include distributed interdependent or critical operations being monitored and analyzed throughout the network.
  • As indicated at 1019, parameters and/or analysis results from other “followed” remote sensor stations or monitor units may include results from audio sensors, image sensors, medical sensors, process sensors and/or telecommunication network sensors as discussed above. Reporting from such “followed” stations and units may include an evaluation of the overall station or unit monitoring signal status and an indication of any dangerous or other concerns arising from such an evaluation.
  • As further indicated at 1019, multiple other potential situations, including environmental concerns such as weather, critical traffic or crowd concerns, critical equipment failures such as failures involving pipelines or public utilities, or police reported criminal activities including terrorist alerts, may be widely reported throughout the network to all remote sensor stations and monitoring units that have chosen to follow such concerns. In some cases, such concerns may be widely reported to remote sensor stations or monitoring units without those stations or units registering to receive warnings for such concerns.
  • As further indicated in FIG. 10B, telecommunication network sensor inputs 908 from FIG. 9 may be passed via connector F 914 to corresponding connector F 1020 of FIG. 10B. Exemplary sensor network communication elements, including selected telecommunication elements and potential telecommunication network faults, are depicted in FIGS. 2A and 2B as discussed above. As shown at 1021 of FIG. 10B, telecommunication network sensors may include individual link sensors including transmitters, receivers, repeaters, antennas, signal amplifiers, link power sources, physical wireline or optical cables and other resources involved in implementing a particular telecommunications link. In addition, as indicated at 1021, telecommunication sensors may provide signal quality measurements for individual links, including signal power, signal-to-noise ratio, multipath interference issues, adjacent channel interference, data error rates or other parameters indicative of the signal quality on respective telecommunication links.
  • As also indicated in FIG. 10B, individual link modems may be monitored with modem sensors providing information on operational parameters for the modems for various modulation formats including AM, FM, PM, signal multiplexing formats and multiple other parameters. Problems or issues involving the signal modulation can be detected and reported for further evaluation.
  • In addition, as indicated at 1021, beyond signal link quality and/or modem sensor monitoring parameters, switching node parameters may also be monitored using sensors to indicate the operational capability of such switching nodes, including failures and traffic congestion.
  • Modern communication networks make use of data routers to route individual data packets to desired destinations. Such routers are used at both the network level and at the periphery of the network, including commonly used Wi-Fi routers and routers in cellular networks. Monitoring of such routers is also indicated at 1021 of FIG. 10B.
  • As also indicated at 1021 of FIG. 10B and illustrated at 219 in FIG. 2B, the monitoring systems of this network may include detection of failures of telecommunication subnetworks. Such a failure will clearly affect all sensor nodes in the failed subnetwork. In addition, failures of a particular subnetwork may impact IoT sensors and monitoring nodes not directly connected to that failed subnetwork. For example, a particular subnetwork may correspond to resources necessary to rectify problems in other areas of a total network. A global failure of that subnetwork could impact objects or situations located in other parts of the total network. In another example, failures in a subnetwork may correspond to power outages in particular areas. Such power outages may impact advisable activities in other areas of the total network. In yet another example, failures corresponding to release of dangerous gases, liquids or other materials in one subnetwork may impact other areas of the total network. Failure of the subnetwork communications may interrupt communications with other portions of the total network needed to report the dangerous situation.
  • In some embodiments, the various sensor measurements discussed above may vary with time. Time series analysis may be used to maintain history files, with analytic evaluation of those files to determine parameter ranges, changes in parameter values, statistical analysis of successive parameter values, regression analysis, predictions, or other analytic evaluation of the time varying sensor inputs. Without limitation, time varying sensor measurements may be collected in cloud storage facilities, with history files and access to big data analytics provided for the analysis of such data files. Based on such analysis, machine learning based on statistical trends in parameter values may lead to improved performance.
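  • By way of illustration only, a minimal sketch of such time series evaluation follows, assuming a simple rolling window with least-squares regression; the window length and one-step-ahead prediction are illustrative choices.

```python
import numpy as np

def trend_analysis(history, window=10):
    """Summarize the tail of a sensor history file and fit a linear trend."""
    recent = np.asarray(history[-window:], dtype=float)
    t = np.arange(len(recent))
    slope, intercept = np.polyfit(t, recent, 1)   # least-squares regression
    prediction = slope * len(recent) + intercept  # one-step-ahead estimate
    return {"mean": recent.mean(), "std": recent.std(),
            "slope": slope, "next_value": prediction}

# A steadily rising danger index produces a positive slope and a
# prediction above the last observed value.
print(trend_analysis([0.2, 0.22, 0.25, 0.3, 0.36, 0.41, 0.45, 0.52, 0.58, 0.66]))
```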
  • It is to be further understood that while the sensor signal analyses of FIGS. 10A and 10B are shown as serial operations, such signal analysis operations may be carried out selectively, in parallel, or on a distributed basis without departing from the teachings of this invention. It is also to be understood that other and/or additional sensor signal analysis operations not specifically set forth in FIGS. 9, 10A and 10B above may be employed without departing from the teachings of this invention.
  • FIG. 11 depicts, without limitation, an exemplary expert mapping 1100 of sensor signal analysis results to an exemplary expert sensor rating scale 1101. Example rankings 1102 include derived parameters for audio sensor signals, video sensor signals, medical sensor signals, process sensor signals, followed IoT remote sensor monitor unit sensor signals, and telecommunication network signals. It is to be understood that other sensor parameters may be used in addition to or in place of the exemplary parameters of FIG. 11 without departing from the teachings of this invention. Derived signal parameters are mapped, for example, onto a scale from 0 to 1. Mappings of 0 correspond to derived parameters that present absolutely no danger or concern to the object or situation being monitored. Mappings of 1 correspond to derived parameters that indicate the maximum danger or concern to the object or situation being monitored. Mappings may be defined by experts. As further explained below and illustrated on the exemplary sensor ranking scale 1101 of FIG. 11, the degrees of danger between 0 and 1 may be partitioned into defined ranges corresponding to very low danger, low danger, medium danger, high danger, or very high danger. Such partitioning may be useful in simplifying reporting of results or in implementing artificial intelligence expert system analysis and/or fuzzy logic analysis and the derivation of overall danger warning and/or control signals.
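  • Assuming, purely for illustration, equal-width partitions of the 0-to-1 scale (the actual boundaries would be expert defined), the mapping into the five ranges might be sketched as:

```python
def danger_category(index):
    """Map a normalized 0-1 danger index into the five defined ranges.
    Equal-width partitions are an assumption, not the expert's choice."""
    for upper, label in [(0.2, "very low"), (0.4, "low"), (0.6, "medium"),
                         (0.8, "high"), (1.0, "very high")]:
        if index <= upper:
            return label
    raise ValueError("index must lie in [0, 1]")

assert danger_category(0.30) == "low"
assert danger_category(0.85) == "very high"
```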
  • Returning now to the exemplary listing of parameters at 1102 of FIG. 11, audio speech signals may be parsed for selected keywords, with each ranked on a scale from 0 to 1. Particular rankings may depend upon the subject being monitored. A crying, screaming infant may be ranked higher on the scale than other audible signals from the infant, such as harmless babbling. At the same time, an infant continually asking for “mommy” may be ranked higher than harmless babble but lower than crying and screaming. Spoken words from older persons may be ranked according to the urgency such words convey. Words or phrases such as robbery, gun, don't shoot, help, I have fallen, heart attack, danger or gun shots would receive a higher ranking than phrases or words expressing less urgency. In addition to speech recognition for keywords, natural language processing may be used to derive complete phrases or sentences from the detected audio signals. Modern natural language processing ascertains the meaning of such complete phrases or sentences, useful in understanding concerns or dangerous situations being described by a speaker. Here again, expert defined rankings from 0 to 1 on the scale 1101 may be made.
  • In a similar manner, results from image, video or infrared signal processing 1007 may be ranked on a scale from 0 to 1 as shown in FIG. 11. Changes in the image fields may indicate a range of dangerous situations or situations of concern. Any movement in a field-of-view that is supposed to be stationary may be a concern. Image analysis may include definition of smaller sections of an image, defined as tiles, with comparison of tiles from one image to the next for changes. Motion in the image field being monitored outside the area where the subject is located may indicate dangerous situations. Detection of rapidly changing movements of individuals in the field-of-view may indicate a conflict or even fighting. Facial recognition may also be used to generate appropriate warning signals on the defined scale of FIG. 11.
  • As further indicated in FIG. 11, medical sensor signal analysis 1011 may be used to monitor parameters such as the subject's temperature, blood pressure, pulse, oxygen levels or other important medical parameters. Here again the individual parameters may be ranked on a scale from 0 to 1 depending on their deviation from expected normal readings. Medical sensors may be attached to the subject, implanted in a subject, integrated into the subject's clothing or otherwise worn or placed in defined proximity to the subject for monitoring purposes.
  • As further indicated in FIG. 11, process sensor signal analysis 1017 may be used to monitor the state of a defined process. An example would be monitoring of a manufacturing process with sensors used to monitor manufacturing equipment, materials used in the manufacturing process, situations requiring maintenance, automated robotic equipment and/or monitoring of the entire process with comparisons to required manufacturing schedules. Other processes that may be monitored include, for example, chemical processes, transportation processes, computer processes including data processing and storage systems, and interrelated activities of personnel in the execution of particular processes.
  • In some embodiments, it may be desirable for a given remote sensor station 107 in FIG. 1 to communicate with or follow results from other different remote sensor stations as indicated at 1019. The results from those other different remote sensor stations may indicate a dangerous situation or situations of concern for the given remote sensor station. For example, an emergency detected at a different remote sensor station in defined proximity to a first remote sensor station may serve as an alert calling for action at that first remote sensor station. Also, for example, any of the above audio, image, medical or process sensor parameters at the remote sensor station may give rise to such an alert. Other situations detected at other remote sensor stations that may be cause for concern include, but are not necessarily limited to, bad weather, traffic congestion, crowds, gas or water or petroleum or chemical pipeline failures, utility system failures including electric utilities, police emergencies or terrorist alerts and the like. In some cases, these alerts from other remote sensor stations may be more critical when the remote sensor stations are in close proximity to one another.
  • As also indicated at 1102 of FIG. 11, telecommunication network sensor signal analysis may include multiple communication parameters as discussed above. FIG. 11 identifies, without limitation, multiple such parameters including communication link error rates, signal-to-noise ratios, traffic congestion delays, lack of telecommunication system response, reported link outages, reported processing node outages, reported storage outages and reported subnetwork failures as discussed above. Here again, expert input may be used to map such telecommunication issues onto the expert sensor ranking scale 1101.
  • FIG. 12A depicts in matrix form artificial intelligence expert system relationships 1200 between two selected parameters resulting from analysis of audio sensor inputs and video sensor inputs that may be used in some embodiments of the present invention. As indicated in FIG. 12A, the ranges for audio and video parameters are divided into exemplary subranges corresponding to very low, low, medium, high and very high as shown in FIG. 11. For each combination of such values for the two parameters being considered in FIG. 12A, an artificial intelligence expert decision rating is provided, indicating a degree of danger index for each combination. These danger indicators are provided by audio and video parameter analysis experts and are part of an artificial intelligence expert system database. As indicated in FIG. 12A, the combined audio/video danger indicators may be defined by such experts as, for example, being very low, low, medium, high, and very high. In the exemplary embodiment depicted in FIG. 12A, twenty-five such expert system defined rules are shown.
  • The combined warning/control matrix 1200 of FIG. 12A forms the basis of an artificial intelligence expert system. For example, each of the results indicated in FIG. 12A may be expressed in propositional calculus logic form, for example, as follows:
  • 1. If audio danger is low and the video danger is high, then the combined warning and control index is high.
  • 2. If audio danger is very high and the video danger is very low, then the combined warning and control index is very high.
  • 3. If audio danger is very low and video danger is medium, then the combined warning and control index entry is medium.
  • Clearly 25 such logical statements exist for the entries in FIG. 12A. For each such logical statement, a combined warning and control index for the given combination may be determined by the expert system of the present invention. The combined warning and control index may be displayed on the display 801 of FIG. 8 in various forms including text messages or flashing alerts, with audible messages from the speakers of FIGS. 3 and/or 4, or with a combination of such visual or audible alerts.
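  • A non-limiting sketch of such a propositional rule base as a lookup table follows; beyond the three example rules above, the matrix entries are hypothetical expert choices, shown here with a symmetric take-the-more-severe pattern.

```python
LEVELS = ["very low", "low", "medium", "high", "very high"]

# Hypothetical symmetric rule matrix in the spirit of FIG. 12A:
# rows index the audio danger level, columns the video danger level.
RULES = [
    ["very low",  "low",       "medium",    "high",      "very high"],
    ["low",       "low",       "medium",    "high",      "very high"],
    ["medium",    "medium",    "medium",    "high",      "very high"],
    ["high",      "high",      "high",      "high",      "very high"],
    ["very high", "very high", "very high", "very high", "very high"],
]

def combined_index(audio_level, video_level):
    """Evaluate one of the 25 propositional rules by table lookup."""
    return RULES[LEVELS.index(audio_level)][LEVELS.index(video_level)]

assert combined_index("low", "high") == "high"                  # rule 1
assert combined_index("very high", "very low") == "very high"   # rule 2
assert combined_index("very low", "medium") == "medium"         # rule 3
```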
  • The artificial intelligence expert system matrix of FIG. 12A is symmetric with respect to the two variables, audio and video. That is to say, every row of the matrix is identical to the corresponding column, with the result that the matrix entries are symmetric about each of the two diagonals. The result is that equal weight is given to each of the two audio and video variables.
  • It is also to be understood that in some embodiments it may be desirable to establish priorities between sensor variables. For example, it may be important to prioritize certain medical sensor parameters over audio and/or video sensor parameters. One way of favoring one variable over another is to add an expert defined value to the result obtained from the expert sensor ranking scale of FIG. 11. For example, if the calculation for ranking a variable resulted in a value of 0.3 in FIG. 11, adding 0.1 to that value would increase the danger warning for that particular variable from “low” to “medium.” The subsequent expert system or fuzzy logic analysis would then favor the results derived from the modified sensor values over the results from unmodified sensors.
  • In still other embodiments, where it is desirable to prioritize or give more weight to a selected parameter, an unbalanced expert defined matrix may be used. One such exemplary unbalanced matrix is shown in FIG. 12B. For example, in the case where the audio signal analysis indicates a very low danger and the video signal analysis indicates a high danger, the combined warning and control index is set at medium as indicated in FIG. 12B. In this case, the very low audio warning and control index is given priority or increased weight. Even though the video warning and control index is high, the artificial intelligence expert system analysis results in a lower danger warning and control index of medium. If both audio and image parameters are high, the combination is an output warning index of very high, indicating that the combination of two high inputs is interpreted as a very high warning condition. The matrix of FIG. 12B is defined by an expert and reflects the expert's opinion concerning the importance of individual variables and combinations of those variables.
  • In other embodiments of the present invention, the above described decision-making process may be augmented with the use of fuzzy logic. It is clear from the above discussion that the audio and video parameter values will be variables with certain ranges of uncertainty. For example, in the analysis of FIG. 11, hard boundaries are set between the different ranges of very low, low, medium, high and very high. These hard boundaries do not actually exist in the real world. Human decision making is more nuanced and not subject to such binary decisions based on defined limits or boundaries. In some embodiments, analyses that provide for a more gradual transition between defined ranges are more appropriate. As described below, artificial intelligence expert systems using fuzzy logic are particularly well-suited for deriving control rules for variables with such uncertainty. It is to be understood that artificial intelligence expert system derivations may be implemented without fuzzy logic as described above; the use of the above described expert defined propositional logic rules may be sufficient for some embodiments. That said, fuzzy logic has found expanded use in the development of sophisticated control systems. With this technology, complex requirements may be implemented in relatively simple, easily managed and inexpensive controllers. It is a simplified method of representing analog processes on a digital computer. It has been successfully applied in a myriad of applications such as flight control systems, camera systems, antilock brake systems, washing machines, elevator controllers, hot-water heaters, decision analysis, and stock trading programs.
  • With fuzzy logic control, statements are written in the form of the propositional logic statements illustrated above. These statements represent somewhat imprecise ideas reflecting the states of the variables. The variable ranges for audio and video parameters may be “fuzzified” as fuzzy logic variables extending over the defined overlapping ranges as shown, for example, in FIG. 13 at 1300. Fuzzy logic systems make use of “fuzzifiers” that convert input variables into their fuzzy representations. “Defuzzifiers” convert the output of the fuzzy logic process into “crisp” numerical values that may be used in system control. It is to be understood that while the exemplary fuzzy logic analysis of FIG. 13 is based on the illustrated triangular membership functions, other overlapping membership functions may be used, such as Gaussian, exponential or other functions selected for particular applications.
  • For example, the graph 1301 of FIG. 13 illustrates such a possible “fuzzification” for the audio warning and control index variable with overlapping ranges indicated in the figure. In this example, on a scale of 0 to 1.0, the normalized audio warning and control index from FIG. 11 is rated at 0.85. As illustrated in FIG. 13, an audio warning and control index rating of 0.85 results in a degree of membership (DOM) of 0.70 in the membership class “high.” In this particular example, the warning and control index rating of 0.85 does not result in membership in any other of the possible membership classes.
  • In a similar way, in the graph 1302 of FIG. 13, “fuzzification” of the video warning and control index variable is illustrated. On a scale of 0 to 1.0, a normalized video warning and control index value of 0.45 results in a DOM of 0.6 in the video “medium” membership class and 0.15 in the “low” membership class. These DOM values may in turn be used in the fuzzy logic implementation to derive a defined, “crisp” numerical value for a combined warning and control index.
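  • A minimal Python sketch of triangular membership evaluation follows; the breakpoints of the “high” set are illustrative assumptions, so the exact DOM (0.70 in the worked example above) depends on the expert-defined ranges.

```python
def triangular_dom(x, left, peak, right):
    """Degree of membership (DOM) in a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)   # rising edge
    return (right - x) / (right - peak)     # falling edge

# With these assumed breakpoints a 0.85 rating is partially "high".
print(triangular_dom(0.85, left=0.55, peak=0.75, right=0.95))  # -> 0.5
```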
  • In the above example of FIG. 13, the composite warning and control index depends on the degrees of membership of the audio signal analysis “or” the video signal analysis. The disjunctive relation “or” corresponds to the logical union of the two sets corresponding to the audio and video variables. In this case the appropriate DOM is the maximum DOM for each of the sets at the specified time. This is expressed algebraically as follows:

  • (A ∪ B)(x) = max(A(x), B(x)) for all x ∈ X
  • Premises connected by an “AND” relationship are combined by taking the minimum DOM for the intersection values. This is expressed algebraically as follows:

  • (A ∩ B)(x) = min(A(x), B(x)) for all x ∈ X
  • The disjunctive relation “or” requires the use of the maximum value of the respective DOMs. These values may be used to defuzzify the warning and control index degree of membership. As shown at 1303 of FIG. 13, fuzzy ranges for the warning and control index may be defined in a similar manner to the audio and video variables. A numerical “crisp” value for the warning and control index can now be derived using defuzzification procedures. As shown in FIG. 13, the DOM ranges for the warning and control index are capped at values corresponding to the above analysis for the DOMs of the audio and video variables. The final “crisp” numerical value of the warning and control index may, for example, be calculated based on the centroid of the geometric figure for the DOM ranges of the graph 1303 of FIG. 13. This calculation may be carried out by dividing the geometric figure of FIG. 13 into sub-areas Ai with individual centroids xi using the following formula:
  • x_c = (Σ_{i=1}^{n} x_i A_i) / (Σ_{i=1}^{n} A_i)
  • The result of such a calculation is shown in FIG. 13, yielding a warning and control index numerical value of about 0.6. Note that this result is less than the warning and control index for the audio signals alone and more than the result for the video signals alone. Fuzzy logic produces a result in between these extremes, reflecting the fuzzy transitions from one designated range to another.
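  • For illustration, the max-based aggregation and centroid defuzzification might be sketched as follows; the membership breakpoints are assumptions, with caps taken loosely from the worked example (“medium” capped at 0.6, “high” capped at 0.70).

```python
import numpy as np

def tri(x, a, b, c):
    """Vectorized triangular membership function over grid x."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0)

def defuzzify_centroid(x, capped_doms):
    """Crisp output: centroid of the union of output sets capped at their DOMs."""
    aggregate = np.max(capped_doms, axis=0)  # fuzzy OR -> pointwise maximum
    return np.sum(x * aggregate) / np.sum(aggregate)

x = np.linspace(0, 1, 101)
medium = np.minimum(tri(x, 0.3, 0.5, 0.7), 0.6)    # capped "medium" set
high = np.minimum(tri(x, 0.55, 0.75, 0.95), 0.70)  # capped "high" set
print(defuzzify_centroid(x, np.vstack([medium, high])))  # roughly 0.6
```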
  • While, for simplicity, the above example dealt with only two variables, the audio signal analysis results and the video signal analysis results, the method described above may be expanded to more than two variables, including multi-dimensional expert system analysis and multi-dimensional fuzzy logic analysis. Multi-dimensional fuzzy logic may be applied to the example parameter combinations of FIGS. 14-17 discussed below.
  • For example, while the above example is limited to two variables, audio and video, for some embodiments additional tables may clearly be constructed to include other important variables in the decision process. Multidimensional tables may be constructed with more than two variables to reflect additional indices. Exemplary other parameters may include the results of analysis for medical, process, followed IoT sensor monitor units and telecommunication network analysis as described above. For the case of two variables, audio and video, with 5 defined subranges of variable values, 25 possible combinations exist. As the number of variables increases, the number of possible combinations increases exponentially. For example, with 6 such variables and 5 ranges of values for each variable, the number of possible combinations increases to 5^6 = 15,625 possible outcomes. Modern processing and memory systems make artificial intelligence expert system analysis of such a large volume of possibilities manageable. In one embodiment of this invention, this approach is used with a reduced rule base in which the subset of all possible rules that do not apply to a given analysis is, perhaps, eliminated.
  • However, a simpler artificial intelligence expert system implementation is clearly desirable. In another embodiment of this invention, a hierarchical artificial intelligence expert and/or fuzzy logic system is disclosed that reduces the growth of the inference rule database with the addition of more variables from exponential to linear. Hierarchical fuzzy system designs are discussed, for example, in the G. Raju, L. Wang and D. Wang references cited above in the identification of prior art in this patent. In addition, the hierarchical systems and methods of this invention implement MIMO (Multiple-Input/Multiple-Output) operations with intermediate evaluation of dangerous situations, permitting response to such situations in addition to providing evaluation of the levels of concern or dangerous situations for the combination of considered variables. In some embodiments, adaptive feedback control is provided to further improve hierarchical system control and processing of input signals.
  • An exemplary hierarchical MIMO adaptive operation 1400 is illustrated in FIG. 14 for a system with 4 input variables 1401: X1, X2, X3 and X4. In this case, the maximum number of required AI expert system rules is reduced from 5^4 = 625 to 3×5^2 = 75. More generally, for a hierarchical design with n input variables and m possible values per variable, the number of required AI expert system rules is (n−1)·m^2. Thus, the number of required rules is a linear function of the number of variables, as opposed to the exponential increase m^n in the non-hierarchical case. As another example, for a system with 6 input variables and 5 subset ranges of values for each variable, the number of required AI expert system rules is reduced from 5^6 = 15,625 to 5×5^2 = 125. The result is a significant decrease in design and implementation complexity of the system. The advantage of such a reduction in a system with a large number of variables and variable ranges is clear. In some embodiments of this invention with many network wide variables and variable ranges, a substantial reduction in complexity can be achieved.
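  • The rule-count comparison can be checked with a few lines of Python (illustrative only):

```python
def rule_count(n_vars, m_levels, hierarchical):
    """Inference rules needed: m**n flat versus (n-1)*m**2 hierarchical."""
    return (n_vars - 1) * m_levels**2 if hierarchical else m_levels**n_vars

for n in (2, 4, 6):
    print(n, rule_count(n, 5, False), rule_count(n, 5, True))
# 2 -> 25 vs 25; 4 -> 625 vs 75; 6 -> 15625 vs 125:
# linear rather than exponential growth in the number of variables.
```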
  • In FIG. 14, the inputs 1401 are processed at block 1402 using, for example, input signal processing 1403 and the signal ranking methods of FIG. 11. Signal processing may include signal filtering, noise reduction, analog-to-digital conversion, audio signal processing, speech recognition, natural language processing, video signal processing, image analysis, signal time series analysis and statistical signal analysis as discussed above. Analysis of additional signals from other followed remote sensor stations and/or monitor units and/or network monitor centers and network telecommunication failures as indicated in FIGS. 1 and 2 and discussed above may also be included in the signal evaluations of blocks 1402 and 1403 of FIG. 14. The individual processed inputs 1401 may be mapped onto a scale from 0 to 1.0, with zero indicating no danger or concern for an individual parameter and 1.0 indicating maximum danger or concern for that parameter. As indicated in FIG. 11 and at 1402 of FIG. 14, the individual input parameters X1, X2, X3 and X4 may also be mapped into the ranges very low (VL), low (L), medium (M), high (H), or very high (VH) according to their numerical rankings representing the degree of danger or concern presented by the parameter values. The resulting processed output signals 1404 from blocks 1402 and 1403 are designated X̄1, X̄2, X̄3 and X̄4 in FIG. 14.
  • As shown in the embodiment of FIG. 14, the outputs X̄1, X̄2, X̄3 and X̄4, designated 1404, from the signal processing and adaptive ranking of blocks 1402 and 1403 are mapped to signals Y1, Y2, Y3 and Y4 for further processing by the hierarchical adaptive expert system controller. The outputs of the hierarchical controller may be considered an approximation to an output derived using a complete non-hierarchical expert system controller as described above. In some embodiments of this invention, the results obtained with the hierarchical controller may be improved with selective control of the routing of input signals to hierarchical control levels. For example, the specific routing of the signals X̄1, X̄2, X̄3 and X̄4 to the signals Y1, Y2, Y3 and Y4 may be adaptively changed depending on input signal importance, output sensitivity to particular input signals or other parameter relationships. (See, for example, Di Wang et al. and F. Chung cited above.) In some embodiments, it may be desirable to adaptively apply inputs with more specific information first and inputs with less specific information later in the hierarchical network of FIG. 14. For example, input signals with the largest values may be deemed more important than input variables with smaller values. In other embodiments, particular input parameters may be considered more important than other input variables. For example, in some embodiments, medical signals may be deemed more important than other signals. Also, the input signal of most importance may change with time. For example, time series analysis of particular signal inputs may indicate trends of concern for particular inputs. The output parameters Y1 and Y2 are passed to AI expert system analysis block 1406 to determine the degree of danger or concern Z1 represented by the particular combination of Y1 and Y2.
  • With the hierarchical analysis of some embodiments of this invention, the output parameter Z1 is also passed to AI expert system analysis 1407, together with the output parameter Y3, for analysis and determination of the degree of danger or concern Z2 represented by this combination. In this way, the output of AI expert system analysis 1406 is used as an input to AI expert system analysis 1407 in a hierarchical manner for successive computations.
  • In the same hierarchical manner, the output Z2 of AI expert system analysis 1407 is passed as input to AI expert system analysis 1408, together with the output Y4 from the input processing and ranking block 1402, for determination of the degree of danger or concern Z3 represented by the combination of the variable Y4 and the output Z2.
  • The input variables X1, X2, X3 and X4 are represented collectively by 1401 in FIG. 14. The output variables Y1, Y2, Y3 and Y4 are represented collectively by 1405. The results Z1, Z2 and Z3 of the AI expert system analyses 1406, 1407 and 1408 are represented collectively by 1409 in FIG. 14. The combination of variables 1405 and 1409 is fed to block 1410 for further signal analysis and issuance of appropriate warning/control signals. In this way, the warning/control signals may be indicative of individual concerns or dangers arising from the output variables 1404 as well as the results of the AI expert system analyses 1406, 1407 and 1408 as illustrated in FIG. 14. The output Z3 includes results based on all four input variables 1401. In addition, in some embodiments, further feedback control signals 1411 may be used to provide further control of the selection of signal ranking 1404, depending on the sensitivity of output signals to the selection or ordering of signals 1404 for analysis as indicated in FIG. 14. The hierarchical system and method of FIG. 14 is a MIMO (Multiple-Input/Multiple-Output) hierarchical AI expert system with 4 inputs and 7 outputs.
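  • A minimal sketch of the FIG. 14 cascade follows, assuming a generic pairwise rule lookup; the “take the more severe level” placeholder stands in for the expert-defined matrices and is purely an assumption.

```python
LEVELS = ["very low", "low", "medium", "high", "very high"]

def combine(a, b):
    # Placeholder pairwise rule: take the more severe of the two levels.
    return max(a, b, key=LEVELS.index)

def hierarchical_mimo(y):
    """Chain pairwise evaluations: Z1 = f(Y1, Y2), Z2 = f(Z1, Y3),
    Z3 = f(Z2, Y4). Each intermediate Zi is itself a usable output."""
    z = [combine(y[0], y[1])]
    for y_next in y[2:]:
        z.append(combine(z[-1], y_next))
    return z  # (Z1, Z2, Z3); together with Y1-Y4 these form the 7 outputs

print(hierarchical_mimo(["low", "medium", "very high", "low"]))
# -> ['medium', 'very high', 'very high']
```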
  • Returning now to the two-input audio/video system of FIGS. 12A, 12B and 13, in another embodiment of this invention a third variable may be added to the audio/video (AV) analysis using the above outlined hierarchical MIMO AI adaptive expert system analysis of FIG. 14. For example, FIG. 15 adds the medical variable, resulting in an AVM analysis. In this example, a non-symmetric expert system matrix 1500 as discussed above is used. Depending on the expert input, other matrices may also be more appropriate. As discussed above, non-symmetric matrices permit favoring certain variables or variable combinations over others. For example, in the matrix of FIG. 15, the medical parameter is given more weight than the audio or video parameters. The result of the expert system analysis of FIG. 15 is an AVM (audio/video/medical) combination warning and control index output. The variable relationships of FIG. 15 may also be processed using fuzzy logic as shown in FIG. 13.
  • In the same way, the derived process signal value may be added to the hierarchical adaptive expert system analysis as shown in the matrix 1600 of FIG. 16. The result of this analysis is a combined audio/video/medical/process (AVMP) warning and/or control index output as indicated in FIG. 16. The same modifications for emphasizing certain variables over others and the fuzzy logic formulations described above can be applied to the AVMP combination calculation. Here again, the variable relationships of FIG. 16 may also be processed using fuzzy logic as shown in FIG. 13.
  • In the same way, the derived “followed” remote sensor station signal value may be added to the hierarchical adaptive expert system analysis as shown in the matrix 1700 of FIG. 17. The result of this analysis is a combined audio/video/medical/process/followed (AVMPF) warning and/or control signal output as indicated in FIG. 17. The same modifications for emphasizing certain variables over others and the fuzzy logic formulations described above can be applied to the AVMPF calculation.
  • In the same way, derived “telecommunication network” remote sensor station signal values may be added to the hierarchical expert system analysis as shown in the matrix 1800 of FIG. 18. The result of this analysis is a combined audio/video/medical/process/followed/telecommunication (AVMPFT) warning and/or control index output as indicated in FIG. 18. The same modifications for emphasizing certain variables over others and the fuzzy logic formulations described above can be applied to the AVMPFT calculation.
  • The above described operations may involve the input of multiple exemplary parameters, such as the audio, medical, process, followed remote sensor station, and telecommunication sensor signals, and may in turn result in the output of a single composite warning and control signal based on the combination of the input sensor signals. Such systems are sometimes referred to as MISO systems, with multiple inputs and a single output. Systems with multiple inputs and multiple outputs are sometimes referred to as MIMO systems. In some embodiments of the present invention, MIMO operation permits generation of multiple control outputs based on multiple sensor signal inputs as described above.
  • FIG. 19 depicts one such possible hierarchical MIMO embodiment 1900 of the present invention. In this embodiment, multiple outputs may be generated based on multiple calculated results indicating the requirements for urgent responses to multiple sensor signal inputs. The flowchart of FIG. 19 provides for multiple intermediate outputs depending on the results that may develop in the processing, analysis and evaluation of the input data.
  • Here again, the flowchart of FIG. 19 is a continuation of the flowchart of FIG. 9 via connector G 1921. At block 1901, audio/video (AV) sensor signals are input for evaluation. Block 1902 computes an AV warning and control index as outlined, for example, in FIGS. 12A, 12B and 13. Control is then passed to the urgent decision block 1903, where the decision is made as to whether the audio signals alone, the video signals alone or the computed AV index itself require the issuance of a warning or system control signal. If a warning signal is to be issued at this point, control is passed to the issue warning block 1904 for generation and dispatch of that signal. Once that warning is issued, or in the event that no urgent warning signal is necessary, control is passed to the input medical sensor network data block 1905.
  • The received medical sensor data is in turn passed to block 1906 for computation of the audio/video/medical (AVM) warning and control index using, for example, the expert system matrix of FIG. 15 or fuzzy logic calculations as indicated in FIG. 9 at block 916. Having computed the AVM warning and control index at block 1906, control is passed to the urgent decision block 1907, where the decision is made as to whether the medical signals alone or the computed AVM index itself require the issuance of a warning or system control signal. If a warning or control signal is to be issued at this point, control is passed to the issue warning block 1908 for generation and dispatch of that signal. Once that warning is issued, or in the event that no urgent warning signal is necessary, control is passed to the input process sensor network data block 1909.
  • The received process sensor data is in turn passed to block 1910 for computation of the audio/video/medical/process (AVMP) warning and control index using, for example, the expert system matrix of FIG. 16 or fuzzy logic calculations as indicated in FIG. 9 at block 916. Having computed the AVMP warning and control index at block 1910, control is passed to the urgent decision block 1911, where the decision is made as to whether the process signals alone or the computed AVMP index itself require the issuance of a warning or system control signal. If a warning signal is to be issued at this point, control is passed to the issue warning block 1912 for generation and dispatch of that signal. Once that warning is issued, or in the event that no urgent warning signal is necessary, control is passed to the input followed sensor network block 1913 for data inputs from other remote sensor stations of interest as described above.
  • The received “followed” sensor data is in turn passed to block 1914 for computation of the audio/video/medical/process/followed (AVMPF) warning and control index using, for example, the expert system matrix of FIG. 17 or fuzzy logic calculations as indicated in FIG. 9 at block 916. Having computed the AVMPF warning and control index at block 1914, control is passed to the urgent decision block 1915, where the decision is made as to whether the “followed” signals alone or the computed AVMPF index itself require the issuance of a warning or system control signal. If a warning signal is to be issued at this point, control is passed to the issue warning block 1916 for generation and dispatch of that signal. Once that warning is issued, or in the event that no urgent warning signal is necessary, control is passed to the input telecommunication sensor network data block 1917 for data inputs from telecommunication network components and/or telecommunication subnetworks as discussed above.
  • The received telecommunication network sensor data is in turn passed to block 1918 for computation of the audio/video/medical/process/followed/telecommunication (AVMPFT) warning and control index using, for example, the expert system matrix of FIG. 18 or fuzzy logic calculations as indicated in FIG. 9 at block 916. Having computed the AVMPFT warning and control index at block 1918, control is passed to the urgent decision block 1919, where the decision is made as to whether the telecommunication signals alone or the computed AVMPFT index itself require the issuance of a warning or system control signal. If a warning signal is to be issued at this point, control is passed to the issue warning block 1920 for generation and dispatch of that signal. Once that warning is issued, or in the event that no urgent warning signal is necessary, control is passed back to block 916 of FIG. 9.
  • It is clear from the above discussion of the flowchart of FIG. 19 that multiple warning signals may be issued based on the analysis outlined in the flowchart. With multiple input signals and multiple output signals, the disclosed system operates as a hierarchical multi-variable MIMO sensor network warning and/or control system.
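  • By way of non-limiting illustration, the staged flow of FIG. 19 might be sketched as below; the helper functions, the averaging combination and the 0.8 urgency threshold are all illustrative assumptions rather than the disclosed expert system computations.

```python
URGENT = 0.8  # illustrative urgency threshold

def compute_index(prev, inputs):
    """Stand-in for the expert system or fuzzy combination at each stage."""
    stage_value = sum(inputs) / len(inputs)
    return stage_value if prev is None else max(prev, stage_value)

def issue_warning(stage, value):
    print(f"warning from stage {stage}: index {value:.2f}")

def staged_pipeline(stages):
    """Walk the ordered stages (AV, AVM, AVMP, AVMPF, AVMPFT): compute each
    composite index, issue an intermediate warning whenever the new inputs
    or the running composite look urgent, then continue to the next stage.
    A composite that stays high keeps flagging the remaining stages."""
    composite = None
    for name, inputs in stages:
        composite = compute_index(composite, inputs)
        if max(inputs) >= URGENT or composite >= URGENT:
            issue_warning(name, composite)
    return composite

staged_pipeline([("AV", [0.3, 0.4]), ("AVM", [0.9]), ("AVMP", [0.2]),
                 ("AVMPF", [0.1]), ("AVMPFT", [0.3])])
```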
  • FIG. 20 is an exemplary artificial neural network of the type useful in some embodiments of this invention. Artificial neural networks are in part modeled after operations in the biological neural network of the human brain. It has been estimated that the biological neural network of the human brain contains roughly 100 billion neurons. Biological neural network neurons interact and communicate with one another via interconnecting axons and dendrites. Biological neurons respond to a weighted combination of input signals with comparison of the sum to activation thresholds. A single biological brain neuron may receive thousands of input signals at once that undergo the summation process to determine if the message gets passed along to other neurons in the biological network.
  • Artificial neural networks as shown in FIG. 20 are based on a simplified model compared to the actual biological neural network of the brain. Nonetheless, artificial neural networks are proving useful, for example, in problems encountered in pattern recognition, facial recognition, prediction, process modeling and analysis, medical diagnostics and dynamic load scheduling. Referring to FIG. 20, the artificial neural network 2000 receives inputs 2006 from external sensors. The nodes 2004 of the artificial neural network 2000 correspond approximately to neurons in the human brain and are organized with an input layer 2001 and output layer 2003 interconnected by hidden layers 2002. The nodes are interconnected by neural network connections 2005. Weighted sums of input signals may be compared to threshold levels within the neural network nodes to produce output signals for transmission to subsequent nodes or outputs 2007. For example, in image recognition problems the inputs 2006 correspond to signals from image sensors. The output signals 2007 will indicate whether or not the image being observed corresponds to a desired image. The internal weights and summing operations are configured during a training process corresponding to the actual desired result. Multiple training methodologies have been proposed, including the use of backward chaining feedback arrangements of the output signals to adjust the weights of the artificial neural network summing operations to achieve the proper result if the desired image is presented. Artificial neural networks may thus be characterized as containing adaptive weights along paths between neurons that can be tuned by a learning algorithm from observed data using optimization techniques in order to improve the model. (See, for example, https://www.innoarchitech.com/artificial-intelligence-deep-learning-neural-networks-explained!)
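  • A minimal sketch of the weighted-sum-and-activation operation of such a network follows, with randomly initialized (untrained) weights purely for illustration; layer sizes are assumptions.

```python
import numpy as np

def forward(x, layers):
    """Propagate inputs through layers of weighted sums plus a smooth
    threshold (sigmoid activation), as in the node model described above."""
    for W, b in layers:
        x = 1.0 / (1.0 + np.exp(-(W @ x + b)))  # per-node activation
    return x

rng = np.random.default_rng(1)
layers = [(rng.standard_normal((8, 4)), np.zeros(8)),   # input -> hidden
          (rng.standard_normal((3, 8)), np.zeros(3))]   # hidden -> output
print(forward(rng.standard_normal(4), layers))  # three output signals
```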
  • FIG. 21 depicts, without limitation, an exemplary sensor network neural network expert system analysis 2100 in accordance with the systems and methods of the present invention. Audio, image, medical, process, material, manufacturing equipment, environmental, transportation, location, pipeline, power system, radiation, vehicle, computer, processor, data storage, cloud processing, cloud data storage, drone, threat, mote, BOT, robot, telecommunication network or other followed remote sensor station monitoring signals comprise the sensor network inputs 2101. In some instances, the signal inputs are processed by artificial neural networks 2102 as illustrated in FIG. 21. The outputs from the neural networks along with the additional sensor signal inputs are processed by the sensor signal analysis/neural network analysis/ranking operations 2103. The processing 2103 accesses expert system rules 2104 for derivation of system warning and control signals as discussed above. In some embodiments, fuzzy logic inference rules may also be accessed to provide fuzzy logic warning and control signals as discussed above. In some embodiments, hierarchical adaptive MIMO control may be implemented as described above.
  • FIG. 22 illustrates in more detail exemplary fuzzy logic system operation execution 2200 useful in the systems and methods of this invention. As shown in FIG. 22, the operations of the fuzzy logic inference engine 2201 include access to the artificial intelligence expert system knowledge base 2205, which may include the fuzzy logic rules discussed above. The fuzzy logic operations include the fuzzifier 2202 used to establish degrees of membership (DOMs) as discussed above. The outputs of fuzzifier 2202 are fed to the fuzzy logic processing element 2203. Defuzzifier 2204 provides crisp numerical outputs for the warning and control index 2206 as discussed above.
  • The systems and methods described above provide network wide IoT warning and control signals. MIMO embodiment operation, including input analysis from sensors directly connected to a given remote sensor station as well as inputs from different remote sensor stations designated as being followed by a given sensor station, is disclosed. Embodiments based on artificial intelligence expert system analysis, fuzzy logic analysis and the use of neural networks are included in the systems and methods set forth in this invention. Expert system and fuzzy logic implementations with hierarchical control and/or adaptive feedback are disclosed.
  • It should be understood that the drawings and detailed descriptions are not intended to limit the invention to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the present invention as defined by the appended claims.

Claims (34)

1. A first Internet of Things (IoT) sensor network remote sensor station comprising:
a sensor network parameter processing, warning and control system with at least one electronic, specifically programmed, specialized sensor network communication computer machine including electronic artificial intelligence expert system processing and further comprising:
a non-transient memory having at least one portion for storing data and at least one portion for storing particular computer executable program code;
at least one processor for executing the particular program code stored in the memory;
one or more transceivers and/or electrical or optical connections for communicating with:
IoT (Internet of Things) sensors that generate electrical or optical parameter signals derived from sensor inputs from objects or situations being monitored;
one or more other different followed or following Internet of Things (IoT) sensor network remote sensor stations sharing common interests with said first IoT sensor network remote sensor station comprising one or more other electronic, specifically programmed, specialized sensor network communication computer machines for monitoring other such electrical or optical sensor parameter signals derived from different sensor inputs from IoT objects or situations being monitored;
one or more monitor units connected to, collecting information from and communicating with said first remote sensor station and further analyzing such collected information from remote sensor stations;
wherein the particular program code is configured to perform at least the following artificial intelligence expert system operations upon execution:
artificial intelligence expert system processing based on expert input defining multiple expert system logic propositional instructions and multiple ranges of sensor variables;
artificial intelligence expert system processing analysis of multiple sensor signal inputs and generation of multiple control outputs with urgent and/or integrated composite degree of concerns based on said expert system propositional instruction evaluation of multiple input sensor parameters;
artificial intelligence expert system processing further comprising hierarchical Multiple-Input/Multiple-Output (MIMO) operation wherein the number of said expert system logic propositional instructions is a linear function of the number of variables and wherein said hierarchical MIMO operations provide inputs to successive hierarchical control levels based at least in part on importance of said inputs and feedback indicative of output signal sensitivity to said inputs;
artificial intelligence expert system control of dispatch of electronic or optical communication warnings and/or corrective action to address MIMO urgent concerns and/or composite degrees of concern of said sensor network objects or situations based on urgent concerns and/or expert system composite degrees of concern; and
wherein said artificial intelligence further processing comprises, without limitation, one or more of expert system processing and analysis of said first remote sensor station sensor input signals, acoustic signal processing, speech recognition, natural language processing, image processing, fuzzy logic, statistical analysis, mathematical analysis and/or neural network analysis.
2. The first sensor network remote sensor station of claim 1 wherein said sensor signals include a combination of one or more of audio, image, medical, process, material, manufacturing equipment, environmental, transportation, location, pipeline, power system, radiation, vehicle, computer, processor, data storage, cloud processing, cloud data storage, drone, threat, mote, BOT, robot, telecommunication network, cyberattack, malicious hacking or other followed remote sensor station monitoring signals.
3. The first sensor network remote sensor station of claim 1 wherein said MIMO artificial intelligence expert system controller is a fuzzy logic controller.
4. The first sensor network remote sensor station of claim 1 wherein said hierarchical MIMO artificial intelligence expert system controller is a fuzzy logic controller.
5. The first sensor network remote sensor station of claim 1 wherein said propositional expert system instructions are based on priorities or importance of selected expert-defined monitored parameters of the object or situation.
6. The first sensor network remote sensor station of claim 2 wherein at least one of said propositional expert system instruction priorities is based on selected combinations of object or situation parameters.
7. The first sensor network remote sensor station of claim 1 wherein said artificial intelligence processing further comprises neural network processing with backward chaining from computed results to improve future computational results.
8. The first sensor network remote sensor station of claim 1 further comprising access of said remote sensor station to Internet cloud storage and processing units.
9. The first sensor network remote sensor station of claim 1 wherein sensor inputs may vary with time.
10. The first sensor network remote sensor station of claim 9 wherein said parameter analysis further comprises time series analysis of time-variable sensor input data.
11. The first sensor network remote sensor station of claim 10 wherein said time series analysis includes regression analysis of time-varying sensor signal parameter values.
12. The first sensor network remote sensor station of claim 1 wherein said electronic, specifically programmed, specialized sensor network communication computer machine communicates with other network nodes to monitor connected telecommunication network elements, subnetworks or networks for failures or performance issues impacting said first remote sensor station.
13. The first sensor network remote sensor station of claim 1 wherein one or more of said transceivers may communicate with a terrestrial or air-borne vehicle.
14. The first sensor network remote sensor station of claim 1 wherein said first sensor network remote sensor station is implemented in a terrestrial or air-borne vehicle.
15. The first sensor network remote sensor station of claim 1 wherein one or more of said transceivers may communicate with a drone.
16. The first sensor network remote sensor station of claim 1 wherein said first sensor network remote sensor station is implemented in a drone.
17. The first sensor network remote sensor station of claim 1 wherein one or more of said transceivers may communicate with a robot.
18. The first sensor network remote sensor station of claim 1 wherein said first sensor network remote sensor station is implemented in a robot.
19. The first sensor network remote sensor station of claim 1 wherein one or more of said transceivers may communicate with a BOT.
20. The first sensor network remote sensor station of claim 1 wherein said first sensor network remote sensor station is implemented in a BOT.
21. The first sensor network remote sensor station of claim 1 wherein one or more of said transceivers may communicate with a mote.
22. The first sensor network remote sensor station of claim 1 wherein said first sensor network remote sensor station is implemented in a mote.
23. The first sensor network remote sensor station of claim 1 wherein said monitored Internet of Things (IoT) objects or situations comprise one or more persons.
24. The first sensor network remote sensor station of claim 23 wherein a monitored person is an infant, child, invalid, medical patient, elderly or special needs person.
25. The first sensor network remote sensor station of claim 24 wherein said first sensor network remote sensor station transmits background audio signals to be broadcast in the area of said person.
26. The first sensor network remote sensor station of claim 25 wherein said transmitted background audio signals are removed from or attenuated in signals transmitted to connected monitor units to minimize annoying or unnecessary signals received and/or heard at said monitor unit while still transmitting audio signals from the monitored object or person.
27. The first sensor network remote sensor station of claim 26 wherein said first sensor network remote sensor station transmits periodic keep-alive signals to a connected monitor unit to assure users that the remote sensor station is operating correctly.
28. The first sensor network remote sensor station of claim 2 wherein said sensor signals include at least one telecommunication network sensor input combined with other sensor signal inputs.
29. The first sensor network remote sensor station of claim 28 wherein said at least one telecommunication network sensor input is a telecommunication link sensor.
30. The first sensor network remote sensor station of claim 28 wherein said at least one telecommunication network sensor input is a telecommunication router sensor.
31. The first sensor network remote sensor station of claim 28 wherein said at least one telecommunication network sensor input is a telecommunication switching system sensor.
32. The first sensor network remote sensor station of claim 28 wherein said at least one telecommunication network sensor input is a telecommunication modem sensor.
33. A first Internet of Things (IoT) sensor network remote sensor station comprising:
a sensor network parameter processing, warning and control system with at least one electronic, specifically programmed, specialized sensor network communication computer machine including electronic artificial intelligence expert system processing and further comprising:
a non-transient memory having at least one portion for storing data and at least one portion for storing particular computer executable program code;
at least one processor for executing the particular program code stored in the memory;
one or more transceivers and/or electrical or optical connections for communicating with IoT (Internet of Things) sensors that generate electrical or optical parameter signals derived from sensor inputs from objects or situations being monitored including one or more telecommunication sensors;
wherein the particular program code is configured to perform at least the following artificial intelligence expert system operations upon execution:
artificial intelligence expert system processing based on expert input defining multiple expert system logic propositional instructions and multiple ranges of sensor variables;
artificial intelligence expert system hierarchical Multiple-Input/Multiple-Output (MIMO) processing of multiple sensor signal inputs and generation of multiple control outputs with urgent and/or integrated composite degrees of concern based on said expert system propositional instruction evaluation of multiple input sensor parameters;
artificial intelligence expert system control of dispatch of electronic or optical communication warnings and/or corrective action to address MIMO urgent concerns and/or composite degrees of concern of said sensor network objects or situations based on urgent concerns and rankings of said expert system composite degrees of concern; and
wherein the number of said expert system logic propositional instructions is a linear function of the number of variables; and
wherein said artificial intelligence processing further comprises, without limitation, one or more of expert system processing and analysis of said first remote sensor station sensor input signals, acoustic signal processing, speech recognition, natural language processing, image processing, fuzzy logic, statistical analysis, mathematical analysis and/or neural network analysis.
34. The first sensor network remote sensor station of claim 33 wherein, in addition to one or more telecommunication sensors, said sensor signals include a combination of one or more of audio, image, medical, process, material, manufacturing equipment, environmental, transportation, location, pipeline, power system, radiation, vehicle, computer, processor, data storage, cloud processing, cloud data storage, drone, threat, mote, BOT, robot, cyberattack, malicious hacking or other followed remote sensor station monitoring signals.
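
Illustrative Implementation Sketches

The claims above recite the processing steps in functional terms; the short Python sketches that follow illustrate one plausible form each step could take. All identifiers, thresholds, weights and data in these sketches are hypothetical assumptions for illustration and are not taken from the specification or claims.

A minimal propositional expert system of the kind recited in claim 1: expert-defined rules test ranges of sensor variables, urgent rules trigger immediate warning dispatch, and the weighted sum of fired rules yields an integrated composite degree of concern. Sensor names, thresholds and weights are assumptions.

```python
# Minimal sketch only; sensors, thresholds and weights are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str                                  # expert-assigned rule label
    fires: Callable[[Dict[str, float]], bool]  # propositional test over sensor ranges
    weight: float                              # contribution to composite concern
    urgent: bool = False                       # True -> immediate warning dispatch

# A small fixed set of rules per variable keeps the total rule count
# linear in the number of monitored variables.
RULES: List[Rule] = [
    Rule("temp_high",  lambda s: 75.0 < s["temp_c"] <= 95.0, weight=0.6),
    Rule("temp_crit",  lambda s: s["temp_c"] > 95.0,         weight=1.0, urgent=True),
    Rule("link_poor",  lambda s: s["link_quality"] < 0.2,    weight=0.8, urgent=True),
    Rule("audio_loud", lambda s: s["audio_db"] > 85.0,       weight=0.4),
]

def evaluate(sensors: Dict[str, float]) -> Dict[str, object]:
    fired = [r for r in RULES if r.fires(sensors)]
    composite = min(1.0, sum(r.weight for r in fired))  # integrated composite concern
    urgent = [r.name for r in fired if r.urgent]        # urgent concerns to dispatch
    return {"composite_concern": composite, "urgent": urgent}

print(evaluate({"temp_c": 97.0, "link_quality": 0.9, "audio_db": 60.0}))
# -> {'composite_concern': 1.0, 'urgent': ['temp_crit']}
```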
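Claims 1 and 33 recite hierarchical MIMO operation in which the number of propositional instructions is a linear function of the number of variables. One way to see why: if each of n variables is covered by a fixed number of single-variable rules and only a scalar subsystem concern propagates upward to the next control level, total rule count grows as O(n) rather than as the combinatorial size of an exhaustive multi-variable rule table. A sketch under an assumed grouping of sensors into subsystems, with assumed importance weights:

```python
# Hierarchical composition sketch; the grouping and weights are assumptions.

def level_one(concerns: dict) -> float:
    # Per-subsystem evaluation; here simply the worst normalized concern.
    return max(concerns.values())

def supervisory(concerns: list, weights: list) -> float:
    # Importance-weighted combination at the next hierarchical control level.
    return sum(w * c for w, c in zip(weights, concerns)) / sum(weights)

environment = level_one({"temp": 0.9, "smoke": 0.2})      # subsystem 1
communications = level_one({"link": 0.1, "router": 0.0})  # subsystem 2
print(supervisory([environment, communications], [0.7, 0.3]))  # -> 0.66
```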
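Claims 3 and 4 recite that the (hierarchical) MIMO expert system controller may be a fuzzy logic controller. A minimal Mamdani-style sketch, assuming hypothetical triangular membership breakpoints and output levels; in practice the membership functions would come from expert input:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_concern(temp_c, smoke_ppm):
    temp_hot = tri(temp_c, 40.0, 70.0, 100.0)         # fuzzify input 1
    smoke_high = tri(smoke_ppm, 100.0, 300.0, 500.0)  # fuzzify input 2
    # Each rule pairs an antecedent strength with a crisp output level.
    rules = [(temp_hot, 0.7), (smoke_high, 1.0)]
    num = sum(mu * level for mu, level in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0  # weighted-average defuzzification

print(fuzzy_concern(temp_c=80.0, smoke_ppm=250.0))  # ~0.86
```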
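Claim 7 recites neural network processing with backward chaining from computed results to improve future computational results. One common reading is gradient-style weight correction from observed outcomes; the sketch below trains a single-neuron concern predictor by gradient descent on synthetic, hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 3))                                  # normalized sensor features
y = (X @ np.array([0.5, 0.3, 0.2]) > 0.5).astype(float)  # hypothetical outcomes

w, b, lr = np.zeros(3), 0.0, 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # forward pass: predicted concern
    w -= lr * X.T @ (p - y) / len(y)        # backward pass: weight corrections
    b -= lr * float(np.mean(p - y))
print(f"training accuracy: {np.mean((p > 0.5) == (y > 0.5)):.2f}")
```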
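Claims 9 through 11 recite time series and regression analysis of time-varying sensor inputs. A least-squares trend sketch on hypothetical temperature samples: fit a line to the series and extrapolate to estimate when the parameter will cross an assumed warning threshold.

```python
import numpy as np

t = np.array([0.0, 60.0, 120.0, 180.0, 240.0])   # seconds since first sample
temp = np.array([61.0, 63.5, 65.2, 68.1, 70.4])  # degrees C (hypothetical)
slope, intercept = np.polyfit(t, temp, 1)        # regression on time-varying data

THRESHOLD_C = 75.0
if slope > 0:
    t_cross = (THRESHOLD_C - intercept) / slope
    print(f"warming {slope * 60:.2f} C/min; threshold in ~{t_cross - t[-1]:.0f} s")
```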
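Claims 25 through 27 recite removing or attenuating the station's own broadcast background audio from the signal relayed to monitor units, and sending periodic keep-alive signals. Because the station knows the background audio it broadcasts, that signal can serve as the reference for an adaptive canceller; a least-mean-squares (LMS) sketch follows. Filter length, step size, the keep-alive interval and the caller-supplied `send` transport function are all assumptions.

```python
import time
import numpy as np

def cancel_background(mic: np.ndarray, background: np.ndarray,
                      taps: int = 32, mu: float = 0.01) -> np.ndarray:
    """Remove the known broadcast audio from float microphone samples."""
    w = np.zeros(taps)                    # adaptive filter weights
    residual = np.zeros_like(mic)
    for n in range(taps, len(mic)):
        x = background[n - taps:n][::-1]  # recent reference samples
        e = mic[n] - w @ x                # mic minus estimated background
        w += 2.0 * mu * e * x             # LMS weight update
        residual[n] = e                   # monitored audio, background removed
    return residual

def keep_alive_loop(send, interval_s: float = 30.0) -> None:
    while True:  # periodic message reassuring the monitor the station is healthy
        send({"type": "keep_alive", "ts": time.time()})
        time.sleep(interval_s)
```
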
US16/412,383 | Priority: 2019-05-14 | Filed: 2019-05-14 | IoT sensor network artificial intelligence warning, control and monitoring systems and methods | Abandoned | US20200364583A1 (en)

Priority Applications (5)

Application Number | Publication | Priority Date | Filing Date | Title
US16/412,383 | US20200364583A1 (en) | 2019-05-14 | 2019-05-14 | IoT sensor network artificial intelligence warning, control and monitoring systems and methods
US17/183,635 | US11244230B2 (en) | 2019-05-14 | 2021-02-24 | Internet of things (IoT) big data artificial intelligence expert system information management and control systems and methods
US17/555,038 | US11734578B2 (en) | 2019-05-14 | 2021-12-17 | Internet of things (IoT) big data artificial intelligence expert system information management and control systems and methods
US18/217,510 | US12112272B2 (en) | 2019-05-14 | 2023-06-30 | Internet of things (IOT) big data artificial intelligence expert system information management and control systems and methods
US18/822,021 | US20240419982A1 (en) | 2019-05-14 | 2024-08-30 | Internet of things (IOT) big data artificial intelligence expert system information management and control systems and methods

Applications Claiming Priority (1)

Application Number | Publication | Priority Date | Filing Date | Title
US16/412,383 | US20200364583A1 (en) | 2019-05-14 | 2019-05-14 | IoT sensor network artificial intelligence warning, control and monitoring systems and methods

Related Child Applications (2)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US17/183,635 | Continuation-In-Part | US11244230B2 (en) | 2019-05-14 | 2021-02-24 | Internet of things (IoT) big data artificial intelligence expert system information management and control systems and methods
US17/183,635 | Continuation | US11244230B2 (en) | 2019-05-14 | 2021-02-24 | Internet of things (IoT) big data artificial intelligence expert system information management and control systems and methods

Publications (1)

Publication Number | Publication Date
US20200364583A1 (en) | 2020-11-19

Family ID: 73228681

Family Applications (5)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US16/412,383 | Abandoned | US20200364583A1 (en) | 2019-05-14 | 2019-05-14 | IoT sensor network artificial intelligence warning, control and monitoring systems and methods
US17/183,635 | Active | US11244230B2 (en) | 2019-05-14 | 2021-02-24 | Internet of things (IoT) big data artificial intelligence expert system information management and control systems and methods
US17/555,038 | Active (expires 2039-06-29) | US11734578B2 (en) | 2019-05-14 | 2021-12-17 | Internet of things (IoT) big data artificial intelligence expert system information management and control systems and methods
US18/217,510 | Active | US12112272B2 (en) | 2019-05-14 | 2023-06-30 | Internet of things (IOT) big data artificial intelligence expert system information management and control systems and methods
US18/822,021 | Pending | US20240419982A1 (en) | 2019-05-14 | 2024-08-30 | Internet of things (IOT) big data artificial intelligence expert system information management and control systems and methods
Country Status (1)

Country: US (5 publications)

Also Published As

Publication Number | Publication Date
US20240419982A1 (en) | 2024-12-19
US11244230B2 (en) | 2022-02-08
US20220108190A1 (en) | 2022-04-07
US20210201165A1 (en) | 2021-07-01
US11734578B2 (en) | 2023-08-22
US12112272B2 (en) | 2024-10-08
US20240256905A1 (en) | 2024-08-01

Legal Events

Code | Title | Description
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

