CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application Ser. No. 62/381,781, filed Aug. 31, 2016, and titled “METHOD AND SYSTEM FOR WAREHOUSE INVENTORY MANAGEMENT USING DRONES,” which is incorporated herein by reference in its entirety.
BACKGROUND

Today, the globalized supply chain ships countless goods made around the world to willing buyers. Most manufactured items pass through a warehouse at some point before sale. Many warehouse inventory management systems already use handheld label scanners to update and track items. However, warehouse inventory management still faces many challenges due to human error in scanning and updating inventory information. Even with inventory management software, managers and workers frequently do not know where specific items are, whether duplicates exist, whether items have been lost or damaged, or how to address shrinkage (e.g., items taken or replaced without updating the system). In addition, many tools and machines used today pose hazards to human workers who check inventory, such as falls from ladders, injuries from pallet movers or forklifts, and slips from liquid spills or leaks. Errors in inventory information can lead to costly under- or overstock for the warehouse company.
Since many warehouses have predictable layouts and repetitive work, there have been some attempts to use robotic machines to automate warehouse inventory management tasks. Robotic arms help with carton removal and automated packing. Wheeled ground robots follow painted paths on open warehouse floors with wide aisles to move pallets and cartons. However, ground robots and robotic arms move in only two dimensions and cannot adjust for or see individual cases and packages at different heights in warehouses whose aisles are sometimes stacked from floor to ceiling. Even if connected to inventory management systems, such machines are sometimes unable to efficiently provide a complete picture of warehouse inventory to warehouse managers.
SUMMARY

The following presents a general summary of aspects of the present disclosure. This summary is not intended to limit the scope of the present disclosure in any way, but simply provides a general overview and context for the more detailed description that follows.
Aspects of this disclosure relate to a system that employs aerial drones for inventory management. Implementing indoor drones in real-world warehouses is more complicated than simply attaching a barcode scanner to a drone: it involves technologies for indoor navigation, solutions to routing problems, and approaches to aligning a scanning sensor with inventory labels. In embodiments, the system includes at least one aerial drone with an optical sensor, an indoor positioning system, and a controller on the aerial drone. The controller is communicatively coupled to the optical sensor and the indoor positioning system. The controller is configured to localize and navigate the aerial drone within a facility based on one or more signals from the indoor positioning system. The controller is further configured to detect identifiers attached to respective inventory items via the optical sensor and to store information associated with the detected identifiers in an onboard memory. The controller may be further configured to transmit the information associated with the detected identifiers to a warehouse management system.
Aspects of this disclosure also relate to a method for inventory management using aerial drones. The method employs at least one aerial drone with an optical sensor and an indoor positioning system on the aerial drone. In implementations, the method includes: localizing and navigating the aerial drone within a facility based on one or more signals from the indoor positioning system; detecting identifiers attached to respective inventory items via the optical sensor; and storing information associated with the detected identifiers in an onboard memory of the aerial drone. In implementations, the information associated with the detected identifiers is then transmitted (e.g., in real time or near real time) to a warehouse management system.
BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items. Various embodiments or examples (“examples”) of the present disclosure are disclosed in the following detailed description and the accompanying drawings. The drawings are not necessarily to scale. In general, operations of disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims.
FIG. 1A is an illustration of an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 1B is an illustration of an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 1C is an illustration of an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 1D is an illustration of an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 1E is an illustration of an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 1F is a block diagram illustrating electronics for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 2A is an illustration of a propeller for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 2B is an illustration of a propeller for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 2C is an illustration of a propeller for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 3A is an illustration of a landing gear footing for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 3B is an illustration of a landing gear footing for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 4A is an illustration of an aerial drone with a landing gear including horizontal bars for interfacing with a landing surface, in accordance with an example embodiment of the present disclosure.
FIG. 4B is an illustration of an aerial drone with a landing gear including feet/nubs for interfacing with a landing surface, in accordance with an example embodiment of the present disclosure.
FIG. 4C is an illustration of an aerial drone with a landing gear including raised points (e.g., downward facing conical or pyramid-like elements) for interfacing with a landing surface, in accordance with an example embodiment of the present disclosure.
FIG. 4D is an illustration of an aerial drone with a landing gear including feet/nubs extending from the aerial drone's motors for interfacing with a landing surface, in accordance with an example embodiment of the present disclosure.
FIG. 4E is an illustration of an aerial drone with a cage-like landing gear for interfacing with a landing surface, in accordance with an example embodiment of the present disclosure.
FIG. 5A is an illustration of a one-dimensional optical sensor for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 5B is an illustration of a one-dimensional optical sensor for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 5C is an illustration of a two-dimensional optical sensor for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 5D is an illustration of a two-dimensional optical sensor for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 6A is an illustration of an image-based optical sensor for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 6B is an illustration of an identifier having one or more elements detectable by an image-based optical sensor for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 6C is an illustration of an image-based optical sensor for an aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 7A is an illustration of an aerial drone with an optical sensor configured to scan identifiers at a first height based on a flight path of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 7B is an illustration of an aerial drone with an optical sensor configured to scan identifiers at a first height based on a flight path of the aerial drone, where the optical sensor misses an identifier located at a second height different from the first height, in accordance with an example embodiment of the present disclosure.
FIG. 8A is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, in accordance with an example embodiment of the present disclosure.
FIG. 8B is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to follow a flight path based on image data from the camera, in accordance with an example embodiment of the present disclosure.
FIG. 9A is an illustration of an optical sensor for an aerial drone, wherein the optical sensor is actuatable along or about a first axis, in accordance with an example embodiment of the present disclosure.
FIG. 9B is an illustration of an optical sensor for an aerial drone, wherein the optical sensor is actuatable along or about a first and a second axis, in accordance with an example embodiment of the present disclosure.
FIG. 10A is an illustration of an aerial drone with an optical sensor that is actuatable along or about at least one axis (e.g., along or about a first and a second axis) to detect identifiers at a plurality of different scanning heights, in accordance with an example embodiment of the present disclosure.
FIG. 10B is an illustration of an aerial drone with a plurality of optical sensors oriented at a plurality of different respective scanning heights, in accordance with an example embodiment of the present disclosure.
FIG. 11A is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 11B is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, where the optical sensor fails to detect an identifier when the aerial drone does not maintain an alignment between the optical sensor and the identifier for a sufficient time period, in accordance with an example embodiment of the present disclosure.
FIG. 11C is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is configured to maintain an alignment between the optical sensor and a first identifier for a predetermined time period or until the first identifier is recognized prior to the aerial drone moving on to scan a second identifier, in accordance with an example embodiment of the present disclosure.
FIG. 12A is an illustration of an aerial drone with an optical sensor mounted to an upper surface of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 12B is an illustration of an aerial drone with an optical sensor mounted to a structure including a raised platform on an upper surface of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 12C is an illustration of an optical sensor for an aerial drone, wherein the optical sensor is actuatable along or about a first axis, in accordance with an example embodiment of the present disclosure.
FIG. 12D is an illustration of an optical sensor for an aerial drone, wherein the optical sensor is actuatable along or about a first and a second axis, in accordance with an example embodiment of the present disclosure.
FIG. 12E is an illustration of an aerial drone with an optical sensor mounted to a platform that protrudes from the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 12F is an illustration of an aerial drone with an optical sensor mounted at least partially within a structure that defines a body of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 12G is an illustration of an aerial drone with an optical sensor mounted to a lower surface of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 12H is an illustration of an aerial drone with an optical sensor on a gimbal mounted to a lower surface of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 13A is an illustration of an optical sensor configuration for an aerial drone, wherein the optical sensor is coupled to a controller and a battery by separate data and power cables, in accordance with an example embodiment of the present disclosure.
FIG. 13B is an illustration of an optical sensor configuration for an aerial drone, wherein the optical sensor is coupled to a controller by separate data and power cables, in accordance with an example embodiment of the present disclosure.
FIG. 13C is an illustration of an optical sensor configuration for an aerial drone, wherein the optical sensor is coupled to a controller by a combined data and power cable, in accordance with an example embodiment of the present disclosure.
FIG. 14 is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the flight path comprises a stop-and-go flight path, in accordance with an example embodiment of the present disclosure.
FIG. 15 is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the flight path causes the aerial drone to scan identifiers of inventory items located on one side of each aisle, in accordance with an example embodiment of the present disclosure.
FIG. 16 is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the flight path causes the aerial drone to scan identifiers of inventory items located on one side of each aisle, where the aerial drone rotates after reaching an endpoint in order to scan identifiers of inventory items located on another (e.g., opposite) side of each aisle, in accordance with an example embodiment of the present disclosure.
FIG. 17 is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the flight path causes the aerial drone to scan identifiers of inventory items located in a subset of the aisles, in accordance with an example embodiment of the present disclosure.
FIG. 18 is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the flight path causes the aerial drone to scan an identifier of an inventory item at a selected position within a selected aisle, in accordance with an example embodiment of the present disclosure.
FIG. 19 is an illustration of an aerial drone with an optical sensor and at least a second (oppositely facing) optical sensor configured to simultaneously or substantially simultaneously scan identifiers located on opposing sides of an aisle based on a flight path of the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 20 is an illustration of a system including an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone and a device having a user interface for receiving a flight path input for an aerial drone, wherein the flight path input comprises a distance for the aerial drone to travel before stopping or turning around, in accordance with an example embodiment of the present disclosure.
FIG. 21A is an illustration of a system including an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is configured to detect a recognizable portion (e.g., an end) of an aisle before stopping or changing direction, in accordance with an example embodiment of the present disclosure.
FIG. 21B is an illustration of a system including an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is configured to detect a portion (e.g., an end) of an aisle before stopping or changing direction, wherein the portion of the aisle is detected based upon one or more identifiers disposed upon or near the portion of the aisle, such as using image processing, computer vision, and/or machine learning techniques, in accordance with an example embodiment of the present disclosure.
FIG. 22A is an illustration of a system including an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is configured to detect a marker located in proximity to (e.g., at or near) a portion (e.g., an end) of an aisle before stopping or changing direction, wherein the marker includes a mobile device (e.g., a smartphone, a tablet, etc.), in accordance with an example embodiment of the present disclosure.
FIG. 22B is an illustration of a system including an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is configured to detect a marker located in proximity to (e.g., at or near) a portion (e.g., an end) of an aisle before stopping or changing direction, wherein the marker includes a recognizable object (e.g., a pylon, flag, colored/patterned fiducial marker, indicator light, etc.), in accordance with an example embodiment of the present disclosure. Such an object may be identified visually or by wireless signals transmitted to the drone.
FIG. 22C is an illustration of a system including an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is configured to detect a marker located in proximity to (e.g., at or near) a portion (e.g., an end) of an aisle before stopping or changing direction, wherein the marker includes a wireless transmitter or transceiver, in accordance with an example embodiment of the present disclosure.
FIG. 23 is a block diagram illustrating control/processor blocks for an aerial drone, including navigation, scanning, and/or identifier detection processor(s), in accordance with an example embodiment of the present disclosure.
FIG. 24 is an illustration of a system including an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein a position of the aerial drone is detected based on a triangulation algorithm using signals transmitted to the aerial drone by a plurality of wireless transmitters or transceivers, in accordance with an example embodiment of the present disclosure.
FIG. 25A is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein a position of the aerial drone is detected based on a monocular camera-based positioning system, such as an IDS UEye global shutter camera or any other such monocular camera, in accordance with an example embodiment of the present disclosure.
FIG. 25B is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein a position of the aerial drone is detected based on a stereoscopic camera-based positioning system, such as an Intel Realsense, Microsoft Kinect, DJI Guidance, or any other such stereoscopic camera system, in accordance with an example embodiment of the present disclosure.
FIG. 25C is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein a position of the aerial drone is detected based on a multiple monocular or stereoscopic camera-based positioning system, in accordance with an example embodiment of the present disclosure.
FIG. 25D is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein a position of the aerial drone is detected based on a light detection and ranging (LIDAR) positioning system, such as the Velodyne PUCK or any other such LIDAR system, in accordance with an example embodiment of the present disclosure.
FIG. 26A is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to detect an identifier with the optical sensor, capture an image of the identifier with the camera, and perform an image processing and/or machine learning algorithm on the captured image of the identifier, wherein the optical sensor and the camera are communicatively coupled to a graphics processor, in accordance with an example embodiment of the present disclosure.
FIG. 26B is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to detect an identifier with the optical sensor, capture an image of the identifier with the camera, and perform an image processing and/or machine learning algorithm on the captured image of the identifier, wherein the camera is communicatively coupled to a graphics processor and the optical sensor is communicatively coupled to a controller, in accordance with an example embodiment of the present disclosure.
FIG. 27 is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is in communication with a device, the device configured to receive and process information associated with the identifiers detected by the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 28 is an illustration of an aerial drone with an optical sensor configured to scan identifiers based on a flight path of the aerial drone, wherein the aerial drone is tethered to a portable device, such as a 4-wheel ground robot with onboard graphics processing units, the portable device configured to receive and process information associated with the identifiers detected by the aerial drone, in accordance with an example embodiment of the present disclosure.
FIG. 29A is a block diagram illustrating a warehouse management system (WMS) (sometimes referred to as an enterprise resource planning system (ERP)) that is configured to communicate with an aerial drone, such as the aerial drone in any of the embodiments illustrated by FIGS. 1A through 28, in accordance with an example embodiment of the present disclosure.
FIG. 29B is a block diagram illustrating a WMS in communication with an aerial drone, such as the aerial drone in any of the embodiments illustrated by FIGS. 1A through 28, in accordance with an example embodiment of the present disclosure.
FIG. 29C is a table of values populated by a WMS, the values corresponding to identifiers of inventory items detected by an aerial drone, such as the aerial drone in any of the embodiments illustrated by FIGS. 1A through 28, in accordance with an example embodiment of the present disclosure.
FIG. 30A is a graphical user interface generated by a WMS based on information associated with identifiers of inventory items detected by an aerial drone, such as the aerial drone in any of the embodiments illustrated by FIGS. 1A through 28, wherein the graphical user interface includes a mapping of the inventory items, in accordance with an example embodiment of the present disclosure.
FIG. 30B is a graphical user interface generated by a WMS based on information associated with identifiers of inventory items detected by an aerial drone, such as the aerial drone in any of the embodiments illustrated by FIGS. 1A through 28, wherein the graphical user interface includes a mapping of the inventory items, and in response to receiving a selection of an inventory item of the mapped inventory items, the graphical user interface displays information corresponding to the selected inventory item based on the information received by the WMS from the aerial drone, in accordance with an example embodiment of the present disclosure.
DETAILED DESCRIPTION

Overview

The present disclosure relates to an inventory management system (e.g., a warehouse inventory management system) that employs at least one aerial drone to scan identifiers of inventory items stored within a storage facility (e.g., a warehouse), a manufacturing facility, and/or a shopping facility, or the like. The system includes at least one aerial drone with an optical sensor (e.g., a laser scanner, photodetector array, camera, any combination thereof, or the like), an indoor positioning system (e.g., a triangulation based indoor positioning system, a light detection and ranging based indoor positioning system, or an indoor positioning system based on camera or LIDAR sensor systems coupled with a processor running simultaneous localization and mapping or visual-inertial odometry algorithms), and a controller on the aerial drone. The controller is communicatively coupled to the optical sensor and the indoor positioning system. The controller is configured to localize and navigate the aerial drone within a facility based on one or more signals from the indoor positioning system. The controller is further configured to detect identifiers attached to respective inventory items via the optical sensor and to store information associated with the detected identifiers in an onboard memory.
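By way of a non-limiting illustration, the following minimal sketch (in Python) shows the general control flow described above: localize from the indoor positioning system, navigate, detect identifiers with the optical sensor, and store the results in onboard memory. The interface names (positioning, optical_sensor, flight) are hypothetical placeholders rather than a defined API.

    from dataclasses import dataclass, field

    @dataclass
    class ScanRecord:
        identifier: str            # e.g., a decoded barcode payload
        position: tuple            # (x, y, z) position where the identifier was read

    @dataclass
    class DroneController:
        onboard_memory: list = field(default_factory=list)

        def run_mission(self, positioning, optical_sensor, flight, waypoints):
            # Fly the waypoint list, attempting a detection at each stop.
            for waypoint in waypoints:
                position = positioning.estimate_position()     # localize indoors
                flight.fly_to(waypoint, current=position)      # navigate
                identifier = optical_sensor.read_identifier()  # attempt a detection
                if identifier is not None:
                    self.onboard_memory.append(ScanRecord(identifier, waypoint))
            return self.onboard_memory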
The controller can be configured to implement a flight path or several flight paths for the aerial drone. For example, the controller can implement a static flight path (e.g., a fully predetermined flight path through a storage facility) or a dynamic flight path (e.g., a flight path that at least partially changes based on one or more inputs (e.g., user inputs, detected position, detected markers/reference points, detected identifiers, etc.)).
In an example where the controller implements a dynamic flight path, the system can include a camera or multiple cameras (in addition to the optical sensor) on the aerial drone. The camera can have a wider field of view than the field of view of the optical sensor, which may also be a camera in some implementations. The controller may be configured to capture image data for a plurality of inventory items (e.g., an image, multiple images, or video footage of several adjacent inventory items) via the camera. The controller may be further configured to detect locations of the identifiers for the plurality of inventory items based on the image data, using image processing, computer vision, machine learning, and/or other algorithms, and configured to generate a flight path for the aerial drone based on the detected locations of the identifiers in order to cause the optical sensor to align with and detect respective ones of the identifiers. For example, the flight path generated by the controller may take into account differences in height of a first identifier of a first inventory item relative to a second identifier of a second inventory item that is adjacent to the first inventory item. The controller can also be configured to update the flight path based on detected differences in orientation, horizontal position (e.g., left, right, or center placement of the identifier on a respective inventory item), and so forth.
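By way of a non-limiting illustration, a dynamic flight path of the kind described above might be generated as in the following sketch, which assumes the camera pipeline has already produced along-aisle/height coordinates for each detected identifier; the coordinate convention and the standoff parameter are illustrative assumptions.

    # x = distance along the aisle (m), z = label height (m); `standoff` is a
    # hypothetical horizontal offset keeping the drone at scanning distance.
    def generate_flight_path(identifier_locations, standoff=1.0):
        """Order detections along the aisle and emit one waypoint per label,
        matching each label's height so the optical sensor aligns with it."""
        waypoints = []
        for x, z in sorted(identifier_locations):      # sweep down the aisle
            waypoints.append((x, standoff, z))         # (along-aisle, offset, height)
        return waypoints

    # Example: labels at different heights on adjacent inventory items.
    path = generate_flight_path([(2.0, 1.5), (1.0, 0.6), (3.0, 2.2)])
    # -> [(1.0, 1.0, 0.6), (2.0, 1.0, 1.5), (3.0, 1.0, 2.2)]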
The system may include at least one actuator coupled to the optical sensor. For example, the system may include one, two, or possibly three or more actuators configured to actuate the optical sensor along or about at least one axis (or two axes (e.g., x and y) or three axes (e.g., x, y, and z)) in order to cause the optical sensor to align with and detect respective ones of the identifiers. In this regard, the controller can be configured to cause the actuator to reposition the optical sensor in addition to or instead of repositioning the aerial drone itself. Alternatively or additionally, the system can include a plurality of optical sensors having differing orientations (e.g., aimed at different heights when the aerial drone is in proximity to an inventory item) so that at least one of the optical sensors is capable of detecting an identifier regardless of its position on the inventory item.
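By way of a non-limiting illustration, the following sketch shows one way a controller might compute pan and tilt commands for such actuators from a known offset between the optical sensor and an identifier; the body-frame convention is an assumption for the example.

    import math

    def aim_sensor(dx, dy, dz):
        """Return (pan, tilt) in degrees to point the sensor at an identifier
        offset by dx (forward), dy (left), dz (up) from the sensor."""
        pan = math.degrees(math.atan2(dy, dx))                   # rotation about vertical axis
        tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # rotation about lateral axis
        return pan, tilt

    # Example: a label 2 m ahead and 0.5 m above the sensor.
    print(aim_sensor(2.0, 0.0, 0.5))  # pan ~0 degrees, tilt ~14 degrees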
In some embodiments, the controller is configured to implement a stop-and-go flight path to detect identifiers attached to respective inventory items via the optical sensor. For example, the controller can be configured to detect a first identifier of a first inventory item via the optical sensor. The controller is then configured to cause the aerial drone to maintain an alignment between the optical sensor and the first identifier for a predetermined time period or until the first identifier is recognized (e.g., until the detected identifier is successfully correlated with an identifier from a list of stored identifiers and/or until a threshold data set for the inventory item can be determined/derived from the detected identifier). The controller may be configured to cause the aerial drone to align the optical sensor with a second identifier of a second inventory item after the predetermined time period or after the first identifier is recognized.
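By way of a non-limiting illustration, the dwell logic described above might be implemented as in the following sketch, where try_read and known_identifiers are hypothetical stand-ins for the optical sensor's decode attempt and the stored identifier list.

    import time

    def dwell_until_recognized(try_read, known_identifiers, timeout_s=2.0):
        """Hold alignment until the identifier is recognized or a timeout elapses."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            candidate = try_read()                    # one decode attempt
            if candidate is not None and candidate in known_identifiers:
                return candidate                      # recognized: move on early
            time.sleep(0.05)                          # brief pause between attempts
        return None                                   # timeout: move on anyway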
The aerial drone may be configured to scan identifiers for inventory items located on both sides (e.g., on opposing, inward facing sides) of an aisle. For example, the controller may be configured to cause the aerial drone to follow a zig-zag flight path such that the optical sensor detects identifiers of inventory items located on one side of each aisle of a plurality of aisles prior to reaching an end of the plurality of aisles. The aerial drone can then turn around (e.g., rotate about 180 degrees) and perform the same flight path in an opposite direction in order to scan identifiers of the inventory items located on the other side of each aisle of the plurality of aisles. In another example implementation, the aerial drone has at least a second optical sensor on the aerial drone. The second optical sensor can be oriented such that it faces an opposite direction relative to the optical sensor (e.g., the first optical sensor and the second optical sensor generally face away from one another). The controller can be configured to implement a flight path down an aisle, wherein the first optical sensor and the second optical sensor are configured to align with and detect identifiers of inventory items located on opposing sides of the aisle prior to reaching an end of the aisle. The first optical sensor and the second optical sensor may be configured to perform detections simultaneously, at least partially in parallel, or immediately after one another.
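By way of a non-limiting illustration, the following sketch generates waypoints for the out-and-back aisle sweep described above; the aisle geometry values and heading convention are illustrative assumptions.

    def aisle_sweep(num_aisles, aisle_length, aisle_spacing):
        """Return (x, y, heading_deg) waypoints: a serpentine pass through each
        aisle facing one side, then the reversed path facing the other side."""
        forward = []
        for i in range(num_aisles):
            y_start, y_end = (0.0, aisle_length) if i % 2 == 0 else (aisle_length, 0.0)
            x = i * aisle_spacing
            forward.append((x, y_start, 90.0))   # sensor faces one shelf side
            forward.append((x, y_end, 90.0))
        # Reverse the path with the heading rotated 180 degrees for the far side.
        backward = [(x, y, 270.0) for (x, y, _) in reversed(forward)]
        return forward + backward

    for wp in aisle_sweep(num_aisles=3, aisle_length=20.0, aisle_spacing=3.0):
        print(wp)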
The system can employ markers to indicate respective endpoints of aisles and/or other reference points. For example, a marker can comprise a mobile device (e.g., a smartphone, a tablet, etc.) configured to display a visual indicator or transmit a wireless signal that is detectable by the aerial drone (e.g., using the optical sensor or another sensor, wireless transceiver, or the like). In another example implementation, a marker can comprise a recognizable object (e.g., a pylon, flag, colored/patterned fiducial marker, indicator light, etc.). In another example implementation, a marker can comprise a wireless transmitter or transceiver (e.g., RFID tag, Bluetooth beacon, WiFi or ZigBee transmitter/transceiver, ultra-wideband (UWB) transmitter/transceiver, radio frequency (RF) transmitter/transceiver, or the like). Any number or combination of markers can be implemented throughout the system.
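By way of a non-limiting illustration, a controller might normalize such heterogeneous markers into navigation reference points as in the following sketch; the marker identifiers and registry contents are purely illustrative.

    # Hypothetical registry mapping detected marker IDs (from a camera, RFID
    # reader, or wireless receiver) to the reference point each one signals.
    MARKER_REGISTRY = {
        "beacon-07":  {"type": "ble_beacon", "meaning": "aisle_3_end"},
        "fiducial-2": {"type": "fiducial",   "meaning": "aisle_1_start"},
        "tablet-A":   {"type": "mobile_qr",  "meaning": "charging_station"},
    }

    def handle_marker(marker_id):
        """Return the navigation event a detected marker signals, e.g.,
        'aisle_3_end' tells the controller to stop or change direction."""
        entry = MARKER_REGISTRY.get(marker_id)
        return entry["meaning"] if entry else None

    print(handle_marker("beacon-07"))  # -> "aisle_3_end"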
In some embodiments, the aerial drone has an indoor positioning system communicatively coupled to the controller. For example, the positioning system can include a camera-based positioning system, a triangulation based (e.g., laser or RF) positioning system, a light detection and ranging (LIDAR) positioning system, a camera-based simultaneous localization and mapping (SLAM) positioning system, inertial tracking system, or the like, and any combination thereof. The controller can be configured to determine a position of the aerial drone based on one or more signals from the positioning system. The controller may be further configured to associate the determined position with a detected identifier. For example, the controller can be configured to store respective positions for the detected identifiers. The controller can also be configured to determine the flight path for the aerial drone based upon the determined position of the aerial drone and/or a determined position of the aerial drone relative to one or more markers or other reference points.
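By way of a non-limiting illustration, the following sketch shows a least-squares multilateration of the kind a triangulation-based positioning system might perform, given known transmitter locations and measured ranges (all values assumed).

    import numpy as np

    def multilaterate(beacons, ranges):
        """Estimate (x, y) from three or more beacons at known (x, y) positions
        with measured ranges, by linearizing against the first beacon."""
        (x0, y0), r0 = beacons[0], ranges[0]
        A, b = [], []
        for (xi, yi), ri in zip(beacons[1:], ranges[1:]):
            A.append([2 * (xi - x0), 2 * (yi - y0)])
            b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
        solution, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
        return tuple(solution)

    beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
    ranges = [7.07, 7.07, 7.07]            # drone near (5.0, 5.0)
    print(multilaterate(beacons, ranges))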
The controller and associated circuitry/components (e.g., a graphics processor or the like) can be configured to perform an image processing algorithm on an image of an identifier and/or text, symbols, drawings, or pictures associated with the identifier to implement machine learning or computer vision functionalities. For example, the controller can be configured to detect the identifier and capture an image of an identifier with the optical sensor and/or a camera on the aerial drone. The controller can then perform an image processing algorithm on the image to detect at least one recognizable feature of the identifier and/or text, symbols, drawings, or pictures associated with the identifier (e.g., using a processor of the controller and/or a graphics processor communicatively coupled to the controller).
The aerial drone can be configured to communicate with a warehouse management system (WMS) that stores inventory data for the storage facility. In embodiments, the WMS may include, but is not limited to, an onsite computer/server, a network of onsite computers/servers, a remote computer/server, a network of remote computers/servers, a cloud computing network, a network accessible by one or more mobile devices, or any combination of the foregoing. The controller may be configured to transmit information associated with the detected identifiers to the WMS. The WMS can have an onsite user interface and/or can be configured to transmit information for display via a user interface of a connected device (e.g., a computer, mobile device, or the like). In some embodiments, the WMS is configured to generate a graphical user interface (e.g., for display via the user interface of the WMS, or the user interface of a connected device). The graphical user interface generated by the WMS can include a mapping of a plurality of inventory items. The graphical user interface can be configured to receive user inputs (e.g., data entries, selections, etc.) via an I/O device (e.g., keyboard, mouse, touch panel, microphone (e.g., for voice commands), and the like). In response to receiving a selection of an inventory item of the plurality of mapped inventory items, the WMS may be configured to cause the graphical user interface to display information corresponding to the selected inventory item based on information received from the aerial drone.
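By way of a non-limiting illustration, the following sketch transmits a batch of scan records to a WMS over HTTP; the endpoint URL and payload schema are illustrative assumptions rather than a defined WMS interface.

    import json
    import urllib.request

    def report_to_wms(records, url="http://wms.example.local/api/scans"):
        """POST a batch of detected identifiers (with positions/timestamps)
        to the warehouse management system."""
        payload = json.dumps({"scans": records}).encode("utf-8")
        request = urllib.request.Request(
            url, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(request) as response:
            return response.status  # e.g., 200 on success

    # Example batch (values illustrative):
    # report_to_wms([{"identifier": "0123456789012", "x": 2.0, "z": 1.5,
    #                 "timestamp": "2016-08-31T12:00:00Z"}])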
Example Implementations

FIGS. 1A through 1E illustrate several types of aerial drones 100 that can be employed by a warehouse inventory management system, in accordance with various embodiments of this disclosure. For example, the aerial drone 100 can be, but is not limited to, a blimp (e.g., as shown in FIG. 1A), a quadcopter with upward and downward facing propellers (e.g., as shown in FIG. 1B), which may also be referred to as an octocopter because it has eight propellers, a quadcopter with upward facing propellers (e.g., as shown in FIG. 1C), a quadcopter with downward facing propellers (e.g., as shown in FIG. 1D), a hexacopter (e.g., as shown in FIG. 1E), or the like. Examples of propeller types are shown in FIGS. 2A through 2C (e.g., a propeller 200 with two fins 202 shown in FIG. 2A, a propeller 200 with four fins 202 shown in FIG. 2B, a propeller 200 with three fins 202 shown in FIG. 2C). Examples of landing gear footings 300 are shown in FIGS. 3A and 3B (e.g., with a deformable (cushion-like) or non-deformable ball 302 shown in FIG. 3A, with a deformable (cushion-like) or non-deformable cylindrical footing 304 shown in FIG. 3B). Examples of landing gear configurations are shown in FIGS. 4A through 4E, in particular: FIG. 4A shows an example embodiment of an aerial drone with a landing gear 400 including horizontal bars for interfacing with a landing surface (e.g., ground, raised platform, building structure, shelf, etc.); FIG. 4B shows an example embodiment of an aerial drone with a landing gear 402 including feet/nubs for interfacing with a landing surface; FIG. 4C shows an example embodiment of an aerial drone with a landing gear 404 including raised points (e.g., downward facing conical or pyramid-like elements) for interfacing with a landing surface; FIG. 4D shows an example embodiment of an aerial drone with a landing gear 406 including feet/nubs extending from the aerial drone's motors (e.g., below propellers 200) for interfacing with a landing surface; and FIG. 4E shows an example embodiment of an aerial drone with a cage-like landing gear 408 for interfacing with a landing surface. The foregoing embodiments are provided by way of example, and it is contemplated that any aerial drone configuration having any number/type of propellers, landing gear, etc., can be implemented without departing from the scope of this disclosure.
Various components that can be coupled to, integrated within a structure of, or otherwise onboard the aerial drone 100 are illustrated in FIG. 1F. In embodiments, the aerial drone 100 has at least one controller (e.g., main/central controller 102 and/or flight controller 110). For example, the main controller 102 can be configured to provide communication and processing functionality for the aerial drone 100, while the flight controller 110 is configured to receive instructions from the main controller 102 and drive one or more motors 112 accordingly. The aerial drone 100 may have a number of motors 112 coupled to respective propellers 114. In another embodiment, the main controller 102 can implement flight controller 110 operations and drive the motors 112 directly, or the flight controller 110 can comprise the main controller 102, or vice versa.
Controller 102 (and/or flight controller 110) can include a processor 104, a memory 106, and a communications interface 108. The processor 104 provides processing functionality for the controller 102/drone 100 (or components thereof) and can include any number of microprocessors, digital signal processors, micro-controllers, circuitry, field programmable gate array (FPGA) or other processing systems, and resident or external memory for storing data, executable code, and other information accessed or generated by the controller 102/drone 100. The processor 104 can execute one or more software programs embodied in a non-transitory computer readable medium that implement techniques described herein. The processor 104 is not limited by the materials from which it is formed or the processing mechanisms employed therein and, as such, can be implemented via semiconductor(s) and/or transistors (e.g., using electronic integrated circuit (IC) components), and so forth.
The memory 106 can be an example of tangible, computer-readable storage medium that provides storage functionality to store various data and/or program code associated with operation of the controller 102/drone 100, such as software programs and/or code segments, or other data to instruct the processor 104, and possibly other components of the controller 102/drone 100, to perform the functionality described herein. Thus, the memory 106 can store data, such as a program of instructions (e.g., software module(s)) for operating the controller 102/drone 100 (including its components), and so forth. It should be noted that while a single memory 106 is described, a wide variety of types and combinations of memory (e.g., tangible, non-transitory memory) can be employed. The memory 106 can be integral with the processor 104, can comprise stand-alone memory, or can be a combination of both.
Some examples of the memory 106 can include removable and non-removable memory components, such as random-access memory (RAM), read-only memory (ROM), flash memory (e.g., a secure digital (SD) memory card, a mini-SD memory card, and/or a micro-SD memory card), magnetic memory, optical memory, universal serial bus (USB) memory devices, hard disk memory, external memory, and so forth. In implementations, the controller 102/drone 100 and/or the memory 106 can include removable integrated circuit card (ICC) memory, such as memory provided by a subscriber identity module (SIM) card, a universal subscriber identity module (USIM) card, a universal integrated circuit card (UICC), and so on.
The communications interface 108 can be operatively configured to communicate with components of the controller 102/drone 100. For example, the communications interface 108 can be configured to retrieve data from storage in the controller 102/drone 100, transmit data for storage in the controller 102/drone 100, and so forth. The communications interface 108 can also be communicatively coupled with the processor 104 to facilitate data transfer between components of the controller 102/drone 100 and the processor 104. It should be noted that while the communications interface 108 is described as a component of a controller 102/drone 100, one or more components of the communications interface 108 can be implemented as external components communicatively coupled to the controller 102/drone 100 via a wired and/or wireless connection. The controller 102/drone 100 can also be configured to connect to one or more input/output (I/O) devices via the communications interface 108 and/or via direct or indirect communicative coupling with the processor 104. In an embodiment shown in FIG. 1F, the controller 102 is communicatively coupled to at least one optical sensor 116 (e.g., a laser scanner, photodetector array, camera, any combination thereof, or the like) on the drone 100. In some embodiments, the drone 100 further includes a camera 118 (e.g., a camera having a wider field of view than the optical sensor 116), one or more additional sensors 120 (e.g., temperature sensors, inertial sensors, altitude detectors, LIDAR devices, laser depth sensors, radar/sonar devices, wireless receivers/transceivers, RFID detectors, etc.), an indoor position determining system 122 (e.g., a camera vision based SLAM positioning system employing one or more monocular cameras, one or more stereoscopic cameras, one or more laser depth sensors, one or more LIDAR devices, laser and/or ultrasonic rangefinders, an inertial sensor based positioning system, an RF/WiFi/Bluetooth triangulation based sensor system, or the like), a graphics processor 124 (e.g., to provide processing functionality for the indoor positioning system 122, and/or to implement optical character recognition (OCR), machine learning, computer vision, or any other image processing algorithm(s)), any combination thereof, and so forth. The controller 102 can be configured to utilize sensor inputs to detect identifiers on inventory items and/or other information (e.g., contextual information (e.g., location of an inventory item, time, temperature, humidity, pressure, etc.) or product information (e.g., label information for the inventory item, expiration information, production date, environmental tolerances, quantity, size/volume, product weight (if printed on the inventory item), etc.)), to navigate the drone 100 (e.g., by avoiding obstacles, detecting reference points, updating a dynamic flight path for the drone 100), and to stabilize and/or localize its position.
The communications interface 108 and/or the processor 104 can be configured to communicate with a variety of different networks, such as near-field communication (NFC) networks; a wide-area cellular telephone network, such as a cellular network, a 3G cellular network, a 4G cellular network, or a global system for mobile communications (GSM) network; a wireless computer communications network, such as a WiFi network (e.g., a wireless local area network (WLAN) operated using IEEE 802.11 network standards); an ad-hoc wireless network; an internet; the Internet; a wide area network (WAN); a local area network (LAN); a personal area network (PAN) (e.g., a wireless personal area network (WPAN) operated using IEEE 802.15 network standards); a public telephone network; an extranet; an intranet; and so on. However, this list is provided by way of example only and is not meant to limit the present disclosure. Further, the communications interface 108 can be configured to communicate with a single network or multiple networks across different access points. In an embodiment, a communications interface 108 can transmit information from the controller 102/drone 100 to an external device (e.g., a mobile device, a computer connected to a network, cloud storage, a server, etc.). For example, as shown in FIGS. 29A through 29C and further described below, the communications interface 108 may be configured to transmit information from the controller 102/drone 100 to a warehouse management system (WMS) 2900 (sometimes referred to as an enterprise resource planning (ERP) system) for storing and/or updating information based on the information transmitted by the controller 102/drone 100. In another embodiment, a communications interface 108 can receive information from an external device (e.g., a mobile device, a computer connected to a network, a cloud computing/storage network, etc.). For example, the communications interface 108 may be further configured to receive information from the WMS 2900 (e.g., requests for data, control or flight path information, etc.).
The aerial drone 100 includes at least one optical sensor 116 configured to detect identifiers on inventory items (e.g., labeling information, such as, but not limited to, shipping labels, packaging labels, text, images, barcodes, combinations thereof, and the like). Examples of inventory items include warehouse objects, such as, but not limited to, boxes, pallets, cartons, packages, and cases; although other labeling information may be located on warehouse structures, such as aisles, shelves, signs, floors, paths, and so forth. In example embodiments shown in FIGS. 5A through 5D, the optical sensor 116 may include an optical scanner 500 (e.g., a laser scanner or other light-based scanner). FIG. 5A shows a one-dimensional scanner 500 configured to scan an identifier 504 (e.g., a barcode) on an inventory item 502. As shown in FIG. 5B, the one-dimensional scanner 500 must have a scanning orientation that corresponds to the orientation of the identifier 504 (e.g., both in portrait or both in landscape orientation); otherwise the one-dimensional scanner 500 is unable to recognize the identifier 504. In another embodiment shown in FIGS. 5C and 5D, the scanner 500 is a two-dimensional scanner 500. The two-dimensional scanner 500 can successfully detect the identifier 504 regardless of the orientation or tilt angle of the identifier 504. In this regard, employing a multi-dimensional (e.g., two or more dimension) scanner 500 can be advantageous.
Referring now to FIGS. 6A through 6C, identifiers printed on inventory items 602 can include patterned elements 604 (e.g., one-dimensional barcodes, two-dimensional codes such as QR codes, or the like) as shown in FIG. 6A, printed symbols or alphanumeric characters 606 (e.g., numbers and letters) as shown in FIG. 6C, or a combination thereof (e.g., as shown in FIG. 6B). In some embodiments, the optical sensor 116 can include an image-based sensor 600 (e.g., a camera or a scanning array of photodetectors) that is configured to capture an image of the identifier (e.g., patterned element 604 and/or alphanumeric character/symbol 606) on the inventory item 602. The controller 102 can be configured to perform an image processing algorithm on the image (e.g., an OCR algorithm) to recognize the identifier 604/606 and/or derive information from the detected identifier, product information, and so forth.
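By way of a non-limiting illustration, an OCR step of the kind described above might look like the following sketch, assuming the OpenCV (cv2) and pytesseract packages are available; the file name is illustrative.

    import cv2
    import pytesseract

    def read_label_text(image_path):
        """Binarize a label image and extract its printed characters via OCR."""
        image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        # Otsu thresholding improves OCR contrast on printed labels.
        _, binary = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return pytesseract.image_to_string(binary).strip()

    # Example (file name illustrative): text = read_label_text("label_0042.png")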
FIGS. 7A and 7B demonstrate a problem that may be encountered when the aerial drone 100 is scanning identifiers 704 of inventory items 702 in a storage facility 700. As shown in FIG. 7A, the aerial drone 100 can be configured to scan (e.g., with optical sensor 116) identifiers 704 at a first height based on a flight path of the aerial drone 100. However, as shown in FIG. 7B, the aerial drone 100 may miss an identifier 704 on a subsequent inventory item 702 if the identifier is positioned at a different height than the first identifier. That is, the flight path of the aerial drone 100 might not account for differences in positioning of identifiers 704 on inventory items 702, and as a result, some identifiers 704 may not be detected.
FIGS. 8A and 8B show an embodiment of the aerial drone 100 that accounts for differences in positioning of identifiers 804 on inventory items 802. For example, FIG. 8A shows a storage facility 800 where inventory items 802 have identifiers 804 located at different respective heights. The aerial drone 100 can optionally include a camera 118 (e.g., as shown in FIG. 1F) that has a wider field of view than the field of view of the optical sensor 116. The controller 102 can be configured to capture image data for a plurality of inventory items 802 (e.g., an image, multiple images, or video footage of several adjacent inventory items 802) via the camera 118. The controller 102 can be further configured to detect locations (e.g., x, y, and/or z coordinates) of the identifiers 804 for the plurality of inventory items 802 based on the image data and configured to generate a flight path 808 (which may be an updated version of an original flight path 806) for the aerial drone based on the detected locations of the identifiers 804 in order to cause the optical sensor 116 to align with and detect respective ones of the identifiers 804 (e.g., as shown in FIG. 8B). For example, the flight path 808 generated by the controller 102 may take into account differences in height of a first identifier of a first inventory item relative to a second identifier of a second inventory item that is adjacent to the first inventory item. The controller can also be configured to update the flight path 806/808 based on detected differences in orientation, horizontal position (e.g., left, right, or center placement of the identifier 804 on a respective inventory item 802), and so forth.
The aerial drone 100 can also be configured to account for differences in the positioning of identifiers on respective inventory items by employing at least one actuatable optical sensor (e.g., such as the actuatable sensor 900 shown in FIG. 9A or 9B). For example, the optical sensor 116 can include an actuatable optical sensor 900 having at least one actuator (e.g., actuator 904 and/or actuator 906) and a mechanical mount 902 that attaches the actuator (e.g., actuator 904 and/or actuator 906) to the optical sensor (e.g., scanner 500). Examples of an actuator can include, but are not limited to, a servo, stepper motor, linear actuator, electromagnetic actuator, or the like. The actuatable optical sensor 900 can include one actuator 904 (e.g., as shown in FIG. 9A), two actuators 904 and 906 (e.g., as shown in FIG. 9B), or possibly three or more actuators configured to actuate the optical sensor 900 along or about at least one axis (or two axes (e.g., x and y) or three axes (e.g., x, y, and z)) in order to cause the optical sensor 900 to align with and detect respective ones of the identifiers. In an example implementation, actuator 906 is a one-directional motor, such as a stepper motor or a servomotor, and actuator 904 is a multi-directional motor, such as a stepper motor or servomotor, oriented in a perpendicular direction from actuator 906 so as to give the actuatable optical sensor 900 an additional axis of motion. The controller 102 may be configured to cause the actuator (actuator 904 and/or 906) to reposition the optical sensor 900 in addition to or instead of repositioning the aerial drone 100 itself. For example, as shown in FIG. 10A, the aerial drone 100 can maintain a low flight path 1006 (e.g., at a predetermined and/or static height) through a storage facility 1000 and can be configured to detect identifiers 1004 of inventory items 1002 that are higher than identifiers 1010 of other inventory items 1008 by actuating the optical sensor 116 (e.g., actuatable optical sensor 900) of the aerial drone 100. As shown in FIG. 10B, the aerial drone 100 can alternatively or additionally include a plurality of optical sensors 116 having differing orientations (e.g., aimed at different heights when the aerial drone is in proximity to an inventory item 1002) so that at least one of the optical sensors 116 is capable of detecting an identifier 1004 regardless of its position on the inventory item 1002. In this regard, a first optical sensor 116 on the aerial drone 100 can be configured to detect an identifier 1010 at a first height on a respective inventory item 1008 and another optical sensor 116 on the aerial drone 100 can be configured to detect an identifier 1004 at a second height on a respective inventory item 1002, where the second height is greater than the first height.
FIGS. 11A and 11B demonstrate a problem that may be encountered when the aerial drone 100 is scanning identifiers 1104 of inventory items 1102 in a storage facility 1100. As shown in FIG. 11A, the aerial drone 100 can be configured to scan (e.g., with optical sensor 116) identifiers 1104 based on a flight path 1106 of the aerial drone 100. However, as shown in FIG. 11B, the aerial drone 100 may miss an identifier 1104 on an inventory item 1102 if the identifier 1104 cannot be recognized (e.g., the scanned identifier 1104 does not register) before the aerial drone 100 moves on to scan the next inventory item. In some embodiments, the controller 102 is configured to implement a flight path 1106 with a speed that is not greater than a maximum speed at which the optical sensor 116 can scan the identifier 1104, or the controller 102 may be configured to cause the aerial drone 100 to fly at the reduced speed when the aerial drone 100 is in proximity to an identifier 1104 and/or when the optical sensor 116 is used to detect the identifier 1104. In other embodiments, the controller 102 can be configured to implement a stop-and-go flight path 1106 (e.g., as shown in FIG. 11C) to detect identifiers (e.g., identifiers 1104 and 1112) attached to respective inventory items (e.g., inventory items 1102 and 1110) via the optical sensor 116. For example, the controller 102 can be configured to detect a first identifier 1104 of a first inventory item 1102 via the optical sensor 116. The controller 102 is then configured to cause the aerial drone 100 to maintain an alignment between the optical sensor 116 and the first identifier 1104 (e.g., by maintaining the current position of the aerial drone 100) for a predetermined time period or until the first identifier 1104 is recognized (e.g., until the detected identifier 1104 is successfully correlated with an identifier from a list of stored identifiers and/or until a threshold data set for the inventory item 1102 can be determined/derived from the detected identifier 1104). The controller 102 may be configured to cause the aerial drone 100 to fly to a second inventory item 1110 and align the optical sensor 116 with a second identifier 1112 of the second inventory item 1110 after the predetermined time period or after the first identifier 1104 is recognized.
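By way of a non-limiting illustration, the speed constraint described above can be estimated as in the following sketch; the field-of-view width, read time, and safety margin are assumed values.

    def max_scan_speed(fov_width_m, read_time_s, margin=0.8):
        """Maximum along-aisle speed (m/s) that keeps a label inside the
        sensor's field of view for at least one full read, with a margin."""
        return margin * fov_width_m / read_time_s

    # Example: a 0.5 m field of view and a 0.2 s read allow up to 2.0 m/s.
    print(max_scan_speed(fov_width_m=0.5, read_time_s=0.2))  # -> 2.0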
In some embodiments, the optical sensor 116 includes a camera having a global shutter to reduce image blur from flying by an identifier 1104 too quickly. A global shutter camera may be used to instantaneously capture an image of an identifier 1104 with less image blur than a rolling shutter camera that captures image pixels sequentially, for example. Thus, the aerial drone 100 can employ an optical sensor 116 with a global shutter to improve readability of captured images of identifiers 1104, which may be especially useful in implementations where the controller 102 performs OCR analysis on the image.
The optical sensor 116 can be coupled to the aerial drone 100, integrated within a structure of the aerial drone 100, or otherwise disposed upon the aerial drone 100 in many ways. For example, the optical sensor 116 can include the optical sensor 1200 implemented on the aerial drone 100 in any of the configurations shown in FIGS. 12A through 12H. For example, FIG. 12A shows an embodiment of the aerial drone 100 with the optical sensor 1200 mounted to an upper surface of the aerial drone 100; FIG. 12B shows an embodiment of the aerial drone 100 with the optical sensor 1200 mounted to a mounting structure 1202 (e.g., a raised platform) on an upper surface of the aerial drone 100; FIG. 12E shows an embodiment of the aerial drone 100 with the optical sensor 1200 mounted to a mounting structure 1202 (e.g., a protruding platform/shelf) that protrudes from the aerial drone 100; FIG. 12F shows an embodiment of the aerial drone 100 with the optical sensor 1200 mounted at least partially within a mounting structure 1202 that defines a body portion of, or an opening in a body portion of, the aerial drone 100; FIG. 12G shows an embodiment of the aerial drone 100 with the optical sensor 1200 mounted to a lower surface of the aerial drone 100; and FIG. 12H shows an embodiment of the aerial drone 100 with the optical sensor 1200 coupled to a mounting structure 1202 (e.g., a gimbal) that suspends the optical sensor 1200 from a lower surface of the aerial drone 100. In embodiments (e.g., as shown in FIGS. 12C and 12D), the optical sensor 1200 can include at least one actuator (e.g., actuator 1204 and/or actuator 1206) configured to rotate or slide the optical sensor 1200 in two or more directions. For example, the actuators 1204 and 1206 can include servos, stepper motors, linear actuators, electromagnetic actuators, or the like. In some embodiments, the optical sensor 1200 may include one actuator 1204 (e.g., as shown in FIG. 12C), two actuators 1204 and 1206 (e.g., as shown in FIG. 12D), or possibly three or more actuators configured to actuate the optical sensor 1200 along or about at least one axis (or two axes (e.g., x and y), or three axes (e.g., x, y, and z)) in order to cause the optical sensor 1200 to align with and detect identifiers on inventory items (e.g., as described above).
FIGS. 13A through 13C show various embodiments of an optical sensor 116 and/or camera 118 configuration for an aerial drone 100. For example, FIG. 13A shows a component assembly 1300 where an optical sensor 1304 (e.g., optical sensor 116) is coupled to a controller 1302 (e.g., controller 102) with a data cable 1303 and coupled to a power supply 1306 (e.g., battery or generator) with a power cable 1305. FIG. 13B shows another example implementation where the optical sensor 1304 is coupled to the controller 1302 with a data cable and a power cable 1305 (e.g., where the controller 1302 includes power distribution circuitry and/or a built-in power supply). FIG. 13C shows another example implementation where the optical sensor 1304 is coupled to the controller 1302 with a combined data and power cable 1307 (e.g., a Power over Ethernet (POE) connection, USB connection, or the like).
The controller 102 can be configured to implement a flight path or several flight paths for the aerial drone. For example, the controller 102 can implement a static flight path (e.g., a fully predetermined flight path through a storage facility) or a dynamic flight path (e.g., a flight path that at least partially changes based on one or more inputs, such as user inputs, detected position, detected markers/reference points, detected identifiers, etc.).
In an implementation shown in FIG. 14, the controller 102 is configured to implement a stop-and-go flight path 1409 for the aerial drone 100. For example, the aerial drone 100 can fly through a storage facility 1400 while scanning identifiers (e.g., identifier 1404, . . . , identifier 1408, etc.) on inventory items (e.g., inventory item 1402, . . . , inventory item 1406, etc.). The controller 102 can be configured to cause the aerial drone 100 to stop at a first position 1410 (e.g., remain at a constant position or at a nearly constant position (e.g., within a restricted range of motion)) and maintain an alignment between the optical sensor 116 and the first identifier 1404 for a predetermined time period or until the identifier 1404 is recognized (e.g., until the detected identifier 1404 is successfully correlated with an identifier from a list of stored identifiers and/or until a threshold data set for the inventory item 1402 can be determined/derived from the detected identifier 1404). The controller 102 may be configured to cause the aerial drone 100 to fly to a second position 1412, a third position 1414, and so on while scanning identifiers for respective inventory items at each of the positions.
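At the mission level, the stop-and-go pattern of FIG. 14 amounts to iterating over scan positions and pausing at each. The sketch below assumes hypothetical fly_to()/scan_once() interfaces and illustrative coordinates; it is not the claimed routing algorithm.

    import time

    # example scan positions (x, y, z) in meters, akin to positions 1410/1412/1414
    POSITIONS = [(2.0, 1.0, 1.5), (2.0, 3.0, 1.5), (2.0, 5.0, 1.5)]

    def run_mission(drone, scanner, positions=POSITIONS, dwell_s=3.0):
        """Visit each scan position; retry the scan until it registers or
        the time budget is spent, recording None for a missed scan."""
        results = []
        for pos in positions:
            drone.fly_to(pos)                  # move to the next scan position
            deadline = time.monotonic() + dwell_s
            code = None
            while code is None and time.monotonic() < deadline:
                code = scanner.scan_once()     # retry until recognized or timeout
            results.append((pos, code))
        return results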
There are several manners by which the aerial drone 100 can be configured to scan identifiers for inventory items located on both sides (e.g., on opposing, inward facing sides) of an aisle. For example, in FIGS. 15 and 16, the controller may be configured to cause the aerial drone to follow a zig-zag flight path (e.g., flight path 1502/1602) through a storage facility (e.g., storage facility 1500/1600) such that the optical sensor 116 detects identifiers 1506 of inventory items 1504 located on one side of each aisle of a plurality of aisles prior to reaching an end of the plurality of aisles. Then, as shown in FIG. 16, the controller 102 can be configured to cause the aerial drone 100 to turn around (e.g., a rotation 1606 of about 180 degrees) and perform the same flight path 1602 in an opposite direction in order to scan identifiers 1606 of the inventory items 1604 located on the other side of each aisle of the plurality of aisles. In another example implementation shown in FIG. 17, the controller 102 is configured to cause the aerial drone 100 to follow a flight path 1702 that causes the aerial drone 100 to scan identifiers 1706 of inventory items 1704 located in a subset of the aisles of the storage facility 1700. In another implementation shown in FIG. 18, the controller 102 is configured to cause the aerial drone 100 to follow a flight path 1802 that causes the aerial drone 100 to travel to a particular (e.g., user selected or program selected) inventory item 1804 and scan an identifier 1806 on the selected inventory item 1804 within a storage facility 1800. For example, the aerial drone 100 may be dispatched to a selected position within a selected aisle (e.g., using column and row selection, or the like). In another example implementation shown in FIG. 19, the aerial drone 100 includes at least two optical sensors 116 (e.g., a first optical sensor and a second optical sensor, with the second optical sensor oriented such that it faces an opposite direction relative to the first optical sensor; in other words, at least two optical sensors 116 that generally face away from one another). The controller 102 can be configured to implement a flight path down an aisle of a storage facility 1900 that causes the first optical sensor and the second optical sensor to align with and detect identifiers (e.g., identifiers 1904 and 1908) of inventory items (e.g., inventory items 1902 and 1906) located on opposing sides of the aisle prior to reaching an end of the aisle. The controller 102 may be configured to detect identifiers with the at least two optical sensors 116 simultaneously, at least partially in parallel, or immediately after one another.
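An illustrative waypoint generator for the zig-zag pattern of FIGS. 15 and 16 is sketched below; the aisle geometry parameters are assumptions, and a real planner would also account for obstacles and battery limits. Traversing alternate aisles in alternating directions keeps a single fixed-facing sensor pointed at the same side throughout the outbound sweep; reversing the route covers the other side.

    def zigzag_route(num_aisles, aisle_length_m, aisle_pitch_m, height_m):
        """Return (x, y, z) waypoints that sweep each aisle in alternating
        directions, scanning one side of every aisle per pass."""
        route = []
        for i in range(num_aisles):
            x = i * aisle_pitch_m  # lateral position of aisle i
            ys = (0.0, aisle_length_m) if i % 2 == 0 else (aisle_length_m, 0.0)
            route.append((x, ys[0], height_m))
            route.append((x, ys[1], height_m))
        return route

    outbound = zigzag_route(4, 20.0, 3.0, 1.5)
    inbound = list(reversed(outbound))  # turn ~180 degrees, retrace for the other side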
The warehouse inventory management system can employ one or more techniques to identify reference points (e.g., endpoints) of aisles or other structures within a storage facility. In an example implementation shown in FIG. 20, the aerial drone 100 is in communication with a user device 2000 (e.g., a mobile device, notebook computer, desktop computer, etc.). For example, the controller 102 can receive communications from the user device 2000 via the communications interface 108. In an embodiment, the user device 2000 is configured to receive a user input 2002 including a distance for the aerial drone to travel. The user device 2000 may further receive a user selection 2004 to initiate drone operation. In response, the flight path information is communicated to the controller 102, and the controller 102 can be configured to cause the aerial drone 100 to follow a flight path 2006 that extends a distance 2008 that is based on (e.g., equal to) the user input 2002. For example, the aerial drone 100 may travel the distance 2008 before stopping or turning around within a storage facility.
In another implementation shown in FIG. 21A, the controller 102 is configured to detect a recognizable portion 2108 (e.g., an end) of an aisle before stopping or changing direction. For example, the controller 102 can be configured to employ computer vision to recognize image features that correspond to a reference point (e.g., endpoint) of a shelf or other structure within a storage facility 2100, or use non-feature-based approaches in image processing, computer vision, and/or machine learning for the same task. In some embodiments, the controller 102 relies on a camera 118 in addition to the optical sensor 116 to detect the recognizable portion of the aisle, and the optical sensor 116 is used to detect identifiers 2104 on inventory items 2102. In other embodiments, the optical sensor 116 (e.g., a camera) is used to detect the identifiers 2104 as well as the recognizable portions 2108. The aerial drone 100 may be configured to follow a flight path 2106 until the recognizable portion 2108 is detected, and then the controller 102 can cause the aerial drone 100 to stop, turn around, or follow a new flight path or an updated version of the flight path 2106. In some implementations, the reference points are tagged with identifiers that can be detected by the optical sensor 116 and/or the camera 118. For example, as shown in FIG. 21B, the aisles can have identifiers 2110 at the ends of the aisles (or at other reference points within the aisles). The aisles can also have identifiers 2112 located at vertical reference points 2114 (e.g., to indicate different shelves/pallets) within the aisles. In this regard, the controller 102 can be configured to cause the aerial drone 100 to travel to selected shelf locations within an aisle and/or determine when the aerial drone 100 has scanned a top shelf of the aisle (e.g., finished scanning all identifiers 2104 of inventory items 2102 within the aisle).
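As a minimal sketch, assuming the reference-point identifiers of FIG. 21B are ArUco-style fiducial tags and that OpenCV 4.7+ with the aruco module is available, endpoint detection could look like the following; the tag IDs reserved for aisle ends are invented for illustration.

    import cv2

    AISLE_END_IDS = {17, 18}  # assumed tag IDs reserved for aisle endpoints

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    def aisle_end_visible(frame_bgr):
        """Return True if any endpoint marker is detected in the frame."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        corners, ids, _rejected = detector.detectMarkers(gray)
        if ids is None:
            return False
        return any(int(i) in AISLE_END_IDS for i in ids.flatten())

When aisle_end_visible() returns True, the controller would stop, turn around, or replan as described above.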
In some implementations (e.g., as shown in FIGS. 22A through 22C), the warehouse inventory management system can employ markers to indicate respective endpoints of aisles and/or other reference points within a storage facility 2200. For example, a marker can comprise a mobile device 2202 (e.g., a smartphone, a tablet, etc.) configured to display a visual indicator or transmit a wireless signal that is detectable by the aerial drone 100 (e.g., using the optical sensor 116 or another sensor, wireless transceiver, or the like). In another example implementation, a marker can comprise a recognizable object 2204 (e.g., a pylon, flag, colored/patterned fiducial marker, indicator light, etc.). In another example implementation, a marker can comprise a wireless transmitter or transceiver 2206 (e.g., RFID tag, Bluetooth beacon, WiFi or ZigBee transmitter/transceiver, ultra-wideband (UWB) transmitter/transceiver, radio frequency (RF) transmitter/transceiver, or the like). Any number or combination of markers can be implemented throughout the system.
FIG. 23 is a block diagram illustrating a control system 2300 configuration for the aerial drone 100, in accordance with an embodiment of the present disclosure. For example, the control system 2300 can include a flight controller 2302 (e.g., controller 110 and/or controller 102), a navigation processor 2304 (e.g., controller 102 and/or graphics processor 124), a barcode detection processor 2306 (e.g., controller 102 and/or graphics processor 124), and a scanner processor 2308 (e.g., controller 102 and/or graphics processor 124). The flight controller 2302 is configured to handle low-level commands (e.g., control signals) for the motors 112. The navigation processor 2304, barcode detection processor 2306, and/or scanner processor 2308 may be implemented by the controller 102 and/or the graphics processor 124 to provide processing for the indoor positioning system 122, optical sensor(s) 116, camera 118, and/or additional sensor(s) 120 for identifier recognition, OCR and other computer vision/machine learning, and/or localization, navigation, and stabilization processes for navigating the aerial drone within a storage facility.
In some embodiments, the aerial drone 100 has an indoor positioning system 122 communicatively coupled to the controller 102. For example, the indoor positioning system 122 can include an optical flow camera-based positioning system, a triangulation-based (e.g., laser or RF) positioning system, a light detection and ranging (LIDAR) or camera-based simultaneous localization and mapping (SLAM) positioning system, a laser or ultrasonic rangefinder-based positioning system, an inertial tracking system, or the like, or any combination thereof. The controller 102 can be configured to determine a position of the aerial drone 100 based on one or more signals from the indoor positioning system 122. The controller 102 may be further configured to associate the determined position with a detected identifier. For example, the controller 102 can be configured to store respective positions for the detected identifiers. The controller 102 can also be configured to determine the flight path for the aerial drone 100 based upon the determined position of the aerial drone 100 and/or a determined position of the aerial drone 100 relative to one or more markers or other reference points.
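One possible shape (an assumption, not a prescribed format) for associating a detected identifier with the position reported by the indoor positioning system 122 is a simple position-tagged record kept in onboard memory:

    from dataclasses import dataclass
    import time

    @dataclass
    class ScanRecord:
        identifier: str   # decoded barcode/QR/label value
        x: float          # drone position at scan time (facility frame)
        y: float
        z: float          # z approximates the shelf height of the identifier
        timestamp: float  # epoch seconds

    def record_scan(storage, identifier, position):
        """Append a position-tagged scan to onboard memory (a plain list here)."""
        x, y, z = position
        storage.append(ScanRecord(identifier, x, y, z, time.time()))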
In an example implementation shown in FIG. 24, the indoor positioning system 122 can include at least one receiver or transceiver configured to detect signals from a plurality of transmitters 2402 (e.g., RF transmitters, Bluetooth beacons, WiFi transmitters, ZigBee transmitters, UWB transmitters, LEDs or other light emitters, or other active transmitters) within a storage facility. The controller 102 can be configured to determine a position of the aerial drone 100 by triangulating signals received from the plurality of transmitters 2402. In some embodiments, the controller 102 utilizes a graphics processor 124 or another auxiliary processor to perform the triangulation.
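A hedged sketch of one such computation follows: distance-based multilateration solved by linear least squares, obtained by subtracting the first range equation from the others to linearize the system. The beacon coordinates and ranges are illustrative, and real RSSI/time-of-flight ranging would be noisier.

    import numpy as np

    def trilaterate(beacons, dists):
        """beacons: (n, 3) known transmitter positions; dists: (n,) ranges.
        Solves 2*(p_i - p_0) . x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2."""
        p = np.asarray(beacons, dtype=float)
        d = np.asarray(dists, dtype=float)
        A = 2.0 * (p[1:] - p[0])
        b = (d[0] ** 2 - d[1:] ** 2) + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos  # estimated (x, y, z) of the drone

    # four non-coplanar beacons; ranges measured from a point near (5, 5, 2)
    beacons = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (10, 10, 6)]
    print(trilaterate(beacons, [7.348, 7.348, 7.348, 8.124]))  # ~ [5, 5, 2]

Note that at least one beacon should be out of plane with the others; otherwise the height component is ill-determined.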
In example implementations shown in FIGS. 25A through 25D, the indoor positioning system 122 can include cameras and/or light sensors to determine a position of the aerial drone 100 based on SLAM, visual-inertial, and/or LIDAR-fused algorithms that are performed by the controller 102 and/or graphics processor 124. For example, FIG. 25A shows an embodiment of the aerial drone 100 where the indoor positioning system 122 includes a monocular camera 2500 for use with a SLAM, visual-inertial, and/or LIDAR-fused positioning system; FIG. 25B shows an embodiment of the aerial drone 100 where the indoor positioning system 122 includes a stereoscopic camera 2502 for use with a SLAM, visual-inertial, and/or LIDAR-fused positioning system; FIG. 25C shows an embodiment of the aerial drone 100 where the indoor positioning system 122 includes a plurality of monocular cameras 2500 for use with a SLAM, visual-inertial, and/or LIDAR-fused positioning system; and FIG. 25D shows an embodiment of the aerial drone 100 where the indoor positioning system 122 includes a LIDAR device (e.g., Velodyne PUCK, or the like). In some implementations, the indoor positioning system 122 may additionally or alternatively include, but is not limited to, distance sensors (e.g., laser or ultrasonic differential or depth sensors, sonar or radar distance sensors, etc.), inertial sensors (e.g., accelerometers, gyroscopes, etc.), or the like.
The controller 102 and associated circuitry/components (e.g., a graphics processor 124 or the like) can be configured to perform an image processing algorithm on an image of an identifier and/or text, symbols, drawings, or pictures associated with the identifier to implement machine learning or computer vision functionalities. For example, the controller 102 can be configured to detect the identifier and capture an image of the identifier with the optical sensor 116 and/or a camera 118 on the aerial drone. The controller 102 can then perform an image processing algorithm on the image to detect at least one recognizable feature of the identifier and/or text, symbols, drawings, or pictures associated with the identifier (e.g., using a processor 104 of the controller 102, a graphics processor 124 communicatively coupled to the controller, and/or another auxiliary processor having a higher speed and/or more processing cores than the controller 102).
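For the text-reading path, a minimal sketch is given below, assuming the pytesseract/Tesseract OCR stack is available onboard or on the auxiliary processor; the preprocessing thresholds are assumptions rather than tuned values.

    import cv2
    import pytesseract

    def read_label_text(image_path):
        """Binarize a captured label crop and run OCR on it."""
        img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        img = cv2.GaussianBlur(img, (3, 3), 0)  # suppress sensor noise
        _thr, binary = cv2.threshold(img, 0, 255,
                                     cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return pytesseract.image_to_string(binary).strip()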
In order to detect identifiers (e.g., barcodes, QR codes, text, symbols, images, etc.), the aerial drone 100 must be able to align the optical sensor 116 with the identifier. In some embodiments, the aerial drone 100 can employ a wide field of view camera (e.g., camera 118) to collect image data, determine positioning of at least one identifier based upon the image data, and utilize the positioning information to align the optical sensor 116 with the identifier. For example, the controller 102 can be configured to adjust the drone's flight path or trajectory based upon the positioning information derived from the image data. The controller 102 may employ various machine learning approaches, as discussed above. For example, the controller 102 can employ Haar Cascade algorithms, Neural Network algorithms, You Only Look Once (YOLO) algorithms, or the like. The controller 102 can also employ various computer vision approaches, such as, but not limited to, color segmentation algorithms, line segmentation algorithms, and so forth.
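One well-known computer-vision recipe for coarse barcode localization is sketched below as an assumption, not as the claimed algorithm: barcodes produce strong gradients in one direction, so gradient filtering plus morphological closing isolates the region, and its offset from the image center can serve as the steering error for the drone or the actuated sensor.

    import cv2

    def locate_barcode(frame_bgr):
        """Return the normalized (dx, dy) offset of the most likely barcode
        region from the image center, or None if nothing is found."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        grad_x = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=-1)  # Scharr kernels
        grad_y = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=-1)
        grad = cv2.convertScaleAbs(cv2.subtract(grad_x, grad_y))
        blurred = cv2.blur(grad, (9, 9))
        _t, thresh = cv2.threshold(blurred, 225, 255, cv2.THRESH_BINARY)
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (21, 7))
        closed = cv2.morphologyEx(thresh, cv2.MORPH_CLOSE, kernel)
        contours, _h = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                        cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        H, W = gray.shape
        # offsets usable as left/right and up/down steering error signals
        return ((x + w / 2 - W / 2) / W, (y + h / 2 - H / 2) / H)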
In the embodiments shown in FIGS. 26A and 26B, the aerial drone 100 includes a wide field of view camera 2602 (e.g., camera 118) in addition to an optical sensor 2604 (e.g., optical sensor 116). The aerial drone 100 can also include a dedicated graphics processor 2600 (e.g., graphics processor 124) that processes image data collected by the camera 2602. In the embodiment shown in FIG. 26A, the graphics processor 2600 is configured to process image data collected by the camera 2602 in addition to scan data collected by the optical sensor 2604. In the embodiment shown in FIG. 26B, the graphics processor 2600 is configured to process image data collected by the camera 2602, and another processor 2608 (e.g., controller 102) is configured to process scan data collected by the optical sensor 2604.
FIGS. 27 and 28 show embodiments of the aerial drone 100 where at least a portion of the image data and/or scan data processing is performed by another device that is communicatively coupled to the aerial drone 100. For example, as shown in FIG. 27, the aerial drone 100 can be configured to transmit image data collected by the camera 118 to another device 2700 (e.g., mobile device, notebook computer, desktop computer, server, WMS, etc.). The device 2700 can be configured to perform one or more image processing algorithms on the image data and can be further configured to transmit information (e.g., positioning information, control instructions, etc.) to the aerial drone 100 based upon the image data. In another embodiment shown in FIG. 28, the aerial drone 100 can be tethered (e.g., via a communicative coupling) to a portable device 2800 (e.g., a terrestrial robot that follows the aerial drone 100 and/or a vehicle/cart pulled by the aerial drone 100), where the portable device 2800 can be configured to perform one or more image processing algorithms on the image data and can be further configured to transmit information (e.g., positioning information, control instructions, etc.) to the aerial drone 100 based upon the image data. In some embodiments, the portable device 2800 can also be configured to supply power to the aerial drone 100.
Referring now to FIGS. 29A through 29C, the aerial drone can be configured to communicate with a warehouse management system (WMS) 2900 that stores inventory data for the storage facility. In embodiments, the WMS 2900 may include, but is not limited to, an onsite computer/server, a network of onsite computers/servers, a remote computer/server, a network of remote computers/servers, a cloud computing network, a network accessible by one or more mobile devices, or any combination of the foregoing. As shown in FIG. 29A, the WMS 2900 can include at least one processor 2902, a memory 2904, and a communications interface 2906 (e.g., for communicating with the aerial drone 100, user devices, and so forth). Examples of a processor, memory, and communications interface are described above (e.g., with reference to processor 104, memory 106, and communications interface 108). The WMS 2900 can also include a user interface 2908 (e.g., a display, touch panel, I/O device(s), etc.) for presenting information and receiving user inputs/selections. In some embodiments, the WMS 2900 is configured to present information via the user interface 2908 (e.g., by displaying a graphical user interface), and/or the WMS 2900 can provide access to a graphical user interface that is generated by the WMS 2900 (e.g., the WMS 2900 can be accessed via a browser or app running on a user device (e.g., mobile device, computer, etc.)).
The controller 102 may be configured to transmit information associated with the detected identifiers to the WMS 2900. The WMS 2900 can have an onsite user interface (e.g., user interface 2908) and/or can be configured to transmit information for display via a user interface of a connected (e.g., wired or wirelessly connected) user device (e.g., a computer, mobile device, or the like). FIG. 29C shows an example of a table that can be displayed via the graphical user interface generated by the WMS 2900 and/or exported to an Excel file or the like. The table shown in FIG. 29C includes values (e.g., A1, A2, A3, B1, C, . . . ) populated by the WMS 2900 based on the identifiers of inventory items and/or other information (e.g., time, date, location, sensor info (e.g., altitude, temperature, humidity, etc.), and so forth) detected by the aerial drone 100. As shown in FIGS. 30A and 30B, in some embodiments, the graphical user interface generated by the WMS 2900 can include a mapping 3000 of a plurality of inventory items 3002. For example, the mapping 3000 can correspond to an aisle selection 3001 input by the user. The graphical user interface can be configured to receive user inputs (e.g., data entries, selections, etc.) via an I/O device (e.g., keyboard, mouse, touch panel, microphone (e.g., for voice commands), and the like). In response to receiving a selection of an inventory item 3002 (e.g., via cursor 3004, touch input, verbal command, text input, or the like), the WMS 2900 may be configured to cause the graphical user interface to display information corresponding to the selected inventory item 3002 based on information received from the aerial drone 100. For example, as shown in FIG. 30B, the graphical user interface may display a window 3006 adjacent to or at least partially on top of the mapping 3000. The graphical user interface can be configured to display (e.g., in the window 3006) an image 3008 of the inventory item 3002 and/or an image 3008 of the identifier on the inventory item 3002 that was detected by the aerial drone 100. The graphical user interface can also be configured to display product information 3010, such as, but not limited to, a reference value (e.g., SKU number, serial number, or other product label), time and/or date, last user information, location, sensor info (e.g., altitude, temperature, humidity, etc.), or any combination thereof.
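Because the disclosure leaves the wire format open, the following transport sketch is purely hypothetical: the endpoint URL and JSON schema are invented for illustration of how position-tagged scans might be reported to the WMS 2900.

    import json
    import urllib.request

    WMS_URL = "http://wms.example.local/api/scans"  # assumed endpoint

    def report_scan(identifier, position, timestamp):
        """POST one scan record to the WMS; returns True on HTTP 200."""
        payload = {
            "identifier": identifier,   # decoded label value (e.g., SKU)
            "position": {"x": position[0], "y": position[1], "z": position[2]},
            "timestamp": timestamp,     # epoch seconds at scan time
        }
        req = urllib.request.Request(
            WMS_URL, data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"}, method="POST")
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status == 200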
In some embodiments, the wireless connection utilized by the warehouse inventory management system may be configured to transmit data to and receive data from the drone 100, such as image, video, depth measurement, distance measurement, position and orientation, flight time, command, three-dimensional reconstruction, processed label data, and/or other data. In one non-limiting configuration, the data may be transmitted through the wireless connection to an external processor, including a local processor such as a drone ground station, a laptop, a personal computer, a smartphone, a tablet, or other such processors. In another non-limiting configuration, the data may be transmitted through the wireless connection to a cloud for processing, such as cloud processing platforms provided by Amazon Web Services, Google Cloud, Microsoft Azure, IBM SmartCloud, and other such cloud computing platforms. In another non-limiting configuration, sensor data collection, processing of label data, and 3D reconstruction could all be completed on the processor on the drone, the output of which is sent to an external processor via a wireless connection. The wireless connection utilized by the warehouse inventory management system may be or may include an internet connection configured over a Wi-Fi network, a cellular network, a satellite internet network, or other internet service network. Alternatively, the wireless connection may be or include another wireless connection protocol. Furthermore, the wireless connection may be configured as a private local area wireless network for communication with the drone and/or other devices.
In some embodiments, the external processor may contain software for the user control interface system. The user control interface system may include, but is not limited to, a three-dimensional model generated from the sensor data sent by the drone, a GUI connected to and/or a part of the data storage system, a map of the warehouse and located item(s), and commands for future drone actions. The three-dimensional model may be created through photogrammetry, laser scan point cloud, stereo camera point cloud, or other appropriate techniques. In one non-limiting example, the user control interface system software runs on a processor external to the drone (a local processor or processors on the cloud). This user control interface system can be separate from and interact with inventory management software, or alternatively it could be bundled together to be a part of the inventory management software. The GUI connected to and/or a part of the data storage system may be connected to and/or a part of inventory management software and may connect processed label data with specific items in the software. In one non-limiting example, the GUI connected to and/or a part of the data storage system may comprise information such as item number, bar code number, item name, order number, shipping status, storage status, location in warehouse, timestamp, bar code image, package image, item image, real-time video stream, or other appropriate information. Moreover, the user control interface system may also contain a map of the interior of the warehouse, comprising a two- or three-dimensional model of the interior layout of the warehouse. The map may contain information such as aisles, rows, pallets, packages, items, and other information.
Furthermore, application software and/or control algorithms may be loaded and/or stored on the external processor, which may be used to control the drone 100 over the wireless connection. Additionally or alternatively, the application software and control algorithms may be stored and located on the internet and accessible by the user control interface system and the drone 100. Moreover, the user control interface system may have the ability to access and execute other software over the wireless connection. In some embodiments, the software may be configurable and modular, and a user may be able to configure the software to direct the drone to perform a task or a plurality of tasks as needed. For example, the user control interface system may contain commands for the drone 100, possibly given by a user through the user control interface system or automated by programming, which may be sent over the wireless network to be executed by the drone. These commands may be represented in the form of clickable buttons, key presses, touchscreen key presses, digital or physical joysticks, and other representations. They may give instructions to the drone to fly to a certain location in the warehouse (such as by using a map of the warehouse and/or by altering its roll/pitch/yaw/throttle), take off, land, fly to another item in the list of items stored in the data storage system, hover, scan an item, otherwise collect data about an item, a shelf, or the warehouse, update a 3D map, collect and/or transport an item as payload, or other such instructions.
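A sketch of one way such a command set could be dispatched follows; the message shape and handler names are placeholders for the drone's actual flight and scanning routines, not a defined protocol of this disclosure.

    def make_dispatcher(drone):
        """Build a dispatch function mapping command names to drone actions."""
        commands = {
            "take_off":  lambda args: drone.take_off(),
            "land":      lambda args: drone.land(),
            "hover":     lambda args: drone.hold_position(),
            "fly_to":    lambda args: drone.fly_to(tuple(args["position"])),
            "scan_item": lambda args: drone.scan_item(args["item_id"]),
        }

        def dispatch(message):
            """message: dict like {"cmd": "fly_to", "args": {"position": [x, y, z]}}"""
            handler = commands.get(message.get("cmd"))
            if handler is None:
                raise ValueError("unknown command: %r" % message.get("cmd"))
            return handler(message.get("args", {}))

        return dispatch

A table-driven dispatcher like this keeps the command vocabulary configurable and modular, matching the configurability described above.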
In some embodiments, the commands can be provided by the user in real time on a command-by-command basis to control the drone. In some embodiments, one or more sequences of commands can be entered by the user in real time to cause the drone to subsequently execute a sequence of discrete actions for performing a task or mission. In some embodiments, one or more sequences of commands can be entered by the user prior to drone take-off to provide an automated flight plan and/or mission profile for the drone. It will be apparent in view of this disclosure that any command, commands, command sequences, automated flight plans, or automated mission profiles can be configured for using a single drone to complete a task or mission or for using multiple drones to complete a task or mission. For example, in some embodiments, a plurality of drones can be assigned to work in concert to perform a comprehensive warehouse inventory, wherein each drone can inventory a single shelf, rack, etc. before returning to a base station to recharge.
In some embodiments, the drone 100 may be constructed having a frame/body, a single rotor/propeller or a plurality of rotors/propellers, and one or more landing structures/gears. The frame/body may provide support for the rotors/propellers, which may be fixedly attached to and positioned above the frame/body. However, other positions for the rotors/propellers in relation to the frame/body are possible. In addition, in one non-limiting example, the drone 100 may be configured with four rotors/propellers. However, other numbers of rotors/propellers are possible, such as one, two, six, eight, or any other suitable number of rotors/propellers. Additionally, one or more landing structures/gears may be attached to the frame/body, and the one or more landing structures may be arranged to position the drone 100 in an upright position when the drone 100 is in an inactive, idle, or rest position.
In some embodiments, the drone 100 may be directed to land or otherwise come to rest at a designated home position when the drone 100 is not being used. The designated home position can be any location given by a user of the drone 100 to serve as the designated home position. Alternatively or additionally, the designated home position may be a structure such as a platform, a box, or other known structure.
During operation, the plurality of rotors/propellers may be configured to allow the drone 100 to fly, hover in a fixed location, or otherwise move around an area. Moreover, the drone 100 may require a certain amount of power to operate the plurality of rotors/propellers and other components of the drone 100. In some embodiments, the drone 100 may receive power from a battery pack or other such power storage device. The battery pack may be integrated into and/or mounted onto the frame/body of the drone 100. However, other locations for the battery pack are possible. During periods of rest or inactivity, the battery pack may need to be recharged to ensure an adequate supply of power for drone operation. In one non-limiting example, a battery charger may be incorporated within the designated home position. For example, the battery charger may be configured as an inductive charger that sends electromagnetic energy through inductive coupling to an electronic device, and the energy may be stored in the battery pack for later use. While the battery charger described here is an inductive charger, any other known types of battery chargers may be used. Moreover, the designated home position may have a wall plug that plugs into a standard wall electrical socket to provide an electricity source for the designated home position and the battery charger.
In addition to the battery pack, the drone 100 may carry other parts, such as sensor units, which may include a camera, stereo camera, laser depth sensor, LIDAR, and/or other sensors. In one non-limiting example, the sensor unit may be configured to have sensors facing the front, back, left, and right of the drone 100. However, other configurations of sensor units are possible, such as facing front only, facing the four directions plus downward-facing, facing the four directions plus downward- and upward-facing, facing four diagonal corners, and other suitable configurations. The drone 100 may also carry an on-board processor unit, which may include CPUs, GPUs, flight controllers, and other processors and microprocessors. This processor unit may contain other electronics, such as IMUs, Wi-Fi devices, other wireless protocol devices, GPS, altimeters, ultrasonic sensors, data storage devices, and/or other electronics.
The user control interface system may run on a device such as a smartphone, a personal computer or laptop, a tablet computer, or any other such device that is capable of connecting to the wireless connection. In some embodiments, the wireless connection may be or include an internet connection. The operator may view the data from the user control interface system on the device, or on a different device connected to the first device, and may use the user control interface system to send commands through the wireless connection to be executed by the drone 100.
In some implementations, a drone 100 may capture data with its on-board sensors. This data may be processed on-board the drone 100 itself. The processed data may then be sent via a wireless connection such as the internet to one or multiple end devices, to cloud processors, and/or be used by the drone 100 itself for purposes including but not limited to localization, stabilization, and mapping.
The end device may comprise a laptop or desktop computer, smartphone, tablet device, drone base station, drone controller, smartwatch, wall-mounted computing device, or any other such suitable end device. With the received data, the end device may update the information running on its software, such as a GUI. This information may include pictures, videos, barcode scans, parsed text, timestamps, location data, and/or other suitable information.
External processors such as cloud processors may receive unprocessed data sent directly from the drone 100 and/or processed data. In some embodiments, a user control interface system runs on one cloud processor and processes the processed and/or unprocessed data sent by the drone 100. In one non-limiting configuration, the output of the processing by the user control interface system may be sent to an inventory management system, which may run on another cloud processor. Other suitable configurations include the user control interface system and inventory management system running together on one cloud processor, the systems running on a local non-cloud processor, or the systems being bundled together as one software package. The inventory management system may use the data output from the user control interface system to take actions to update and reconcile entries, actions that may include updating item location data, removing duplicate data, adding a timestamp, updating a status of an item, and/or other suitable actions. The inventory management system may send data to the user control interface system, which may take actions to update and reconcile its data. The user control interface system may send data to one or more end devices. This may prompt an end device to update the information running on its software, such as the GUI. This information may include pictures, videos, barcode scans, parsed text, timestamps, location data, status of order, status of item, quantity of item, the need to re-order, and/or other suitable information.
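A hedged sketch of those reconciliation actions follows: collapse duplicate scans of the same identifier, keep the freshest location, and timestamp the update. The record shape mirrors the illustrative scan payloads used earlier and is an assumption.

    def reconcile(inventory, scans):
        """inventory: dict mapping identifier -> item record; scans: list of
        dicts with 'identifier', 'position', 'timestamp'. Returns the
        updated inventory."""
        latest = {}
        for scan in scans:  # deduplicate: keep only the newest scan per identifier
            key = scan["identifier"]
            if key not in latest or scan["timestamp"] > latest[key]["timestamp"]:
                latest[key] = scan
        for key, scan in latest.items():
            item = inventory.setdefault(key, {"identifier": key})
            item["location"] = scan["position"]    # update item location data
            item["last_seen"] = scan["timestamp"]  # add a timestamp
            item["status"] = "verified"            # update the status of the item
        return inventory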
An operator may input commands to an end device. These commands may be input through voice command, physical keyboard, digital keyboard, mouse, touchscreen, joystick, buttons, and/or any other suitable input methods. In one non-limiting configuration, commands may be transmitted through a wireless connection from the end device to cloud processors, such as the processor running a user control interface system. The user control interface system may process the commands, then relay the commands through the wireless connection to the drone 100.
Although specific examples of the configurations of devices, data processing, data transmission, and software location are included herein, any of the data processing operations may be performed on any of: a drone 100, multiple drones 100, a base station, an inventory management system (e.g., WMS 2900), a local or cloud-based processor, and/or devices (e.g., user device 2000, device 2700, and/or device 2800) connected to any one or more of the items in this list, or any combination of the foregoing devices. In one non-limiting example, instead of being located on processors on the cloud, a user control interface system and/or an inventory management system may exist on one or more local non-cloud processors. In another non-limiting example, all sensor data processing could be done entirely on the drone 100. In another non-limiting configuration, when operators input command data to an end device, the end device transmits the commands directly to the drone 100 or the inventory management system, which then may or may not transmit data to the user control interface system.
Generally, any of the functions described herein can be implemented using hardware (e.g., fixed logic circuitry such as integrated circuits), software, firmware, manual processing, or a combination thereof. Thus, the blocks discussed in the above disclosure generally represent hardware (e.g., fixed logic circuitry such as integrated circuits), software, firmware, or a combination thereof. In the instance of a hardware configuration, the various blocks discussed in the above disclosure may be implemented as integrated circuits along with other functionality. Such integrated circuits may include all of the functions of a given block, system, or circuit, or a portion of the functions of the block, system, or circuit. Further, elements of the blocks, systems, or circuits may be implemented across multiple integrated circuits. Such integrated circuits may comprise various integrated circuits, including, but not necessarily limited to: a monolithic integrated circuit, a flip chip integrated circuit, a multichip module integrated circuit, and/or a mixed signal integrated circuit. In the instance of a software implementation, the various blocks discussed in the above disclosure represent executable instructions (e.g., software modules) that perform specified tasks when executed on a processor (e.g., processor104). These executable instructions can be stored in one or more tangible computer readable media. In some such instances, the entire system, block, or circuit may be implemented using its software or firmware equivalent. In other instances, one part of a given system, block, or circuit may be implemented in software or firmware, while other parts are implemented in hardware.
It is to be understood that the present application is defined by the appended claims. Although embodiments of the present application have been illustrated and described herein, it is apparent that various modifications may be made by those skilled in the art without departing from the scope and spirit of this disclosure.