CROSS-REFERENCE TO RELATED APPLICATIONS
The present application is a non-provisional of and claims priority to U.S. Provisional Patent Application No. 62/581,280, filed Nov. 3, 2017, entitled “SEMI-AUTONOMOUS TARGETING OF REMOTELY OPERATED WEAPONS.” The entire contents of provisional application No. 62/581,280 are incorporated herein by reference for all purposes.
BACKGROUND
1. Field of the Invention
This disclosure generally relates to autonomous and semi-autonomous motorized weapons systems. More specifically, the present disclosure relates to hardware- and software-based techniques for efficient operation of motorized weapons systems, via improvements in target identification and selection, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment.
2. Description of Related Art
Within the context of motorized weapons systems, the concept of a “kill chain” refers to the sequence of actions performed between the first detection of potential targets and the elimination of the targets. The sequence of actions within a kill chain generally may include the following: (1) Find—identifying and locating a target, (2) Fix or Track—determining the accurate location of the target, (3) Target—time-critical targeting, including predicting where the target may pop up, (4) Engage—firing on the target, and (5) Assess—determining whether or not the target has been hit and/or eliminated.
Conventional weapon systems may include various components for achieving the above steps of a kill chain, including cameras and sensors to identify targets, display screens and controls (e.g., joysticks) to allow an operator to identify targets and aim the weapon, and a variety of weapons that may be fired at the target. Such systems may include “fully autonomous” weapons systems, which are capable of targeting and firing without any intervention by a human operator, “semi-autonomous” weapons systems, which may use automated software target tracking tools but still rely on a human operator for target selection and firing commands, “supervised autonomous” weapons systems, which may be granted permission to react to threats autonomously, and/or manual weapon systems that are operated entirely by the human operator.
Typically, conventional weapons systems rely on an “operator-centric” approach to perform the actions in the kill chain sequence. Such systems often prioritize the interface and environment provided to the human operator. First, the human operator may be put in a safe environment, and the operator's eyesight may be improved using broad-spectrum and high-resolution options. The weapon may be stabilized against motion and vibration, to allow the operator to find and track the target via a joystick and cursor or similar interface. After these steps, image recognition software may be used to attempt to recognize the target that has been selected and tracked by the operator, and trajectory adjustments may be applied. Such systems and processes may result in a number of technical problems and inefficiencies, including difficulties in targeting and tracking when the operator is in a moving vehicle, difficulties in selecting and identifying targets, inefficiencies in selecting follow-on targets, and reliance on operator-based assessment and correction of weapon targeting and firing.
BRIEF SUMMARY
Techniques described herein relate to hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques. Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapon is aimed, and/or operator interface components such as operator controls and a target display device. In some embodiments, such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, where the boundary areas are determined by comparing the likelihood of the weapon hitting the target when aimed at the boundary to predetermined likelihood thresholds. Such embodiments may be further configured to engage the motor of the motorized weapon system, with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and, during the movement of the mount toward the target position, to periodically determine whether the weapon is aimed at a position within the boundary area surrounding the target point. When determining, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator, whereas when it is determined, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon. Finally, the semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators, via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems. Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence. Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
The various techniques described herein further include combinations of autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems, dynamic target tracking of both primary and secondary targets including target movement predictions and weapon/projectile characteristics, autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input, simplified user interfaces and operator controls, and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, thereby providing increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a depiction of a motorized weapon system, in accordance with one or more embodiments of the present invention.
FIG. 2 is a block diagram illustrating an example component architecture of a motorized weapon system, in accordance with one or more embodiments of the present invention.
FIGS. 3A-3C are illustrative drawings depicting the mounting and application of a motorized weapon system in accordance with one or more embodiments of the present invention, within different engagement environments.
FIG. 4 is a flowchart illustrating an example process of using a motorized weapon system to engage one or more targets, in accordance with certain embodiments of the present invention.
FIG. 5 is an example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
FIG. 6 is another example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
FIG. 7 is a flowchart illustrating an example process of disabling or enabling a firing mechanism of a motorized weapon system during engagement of the motor to move the weapon, in accordance with certain embodiments of the present invention.
FIGS. 8A and 8B are example screens of a user interface displayed to an operator of a motorized weapon system during engagement of the motor to move the weapon toward a target point, in accordance with certain embodiments of the present invention.
FIG. 9 is a schematic illustration of a computer system configured to perform techniques in accordance with certain embodiments of the present invention.
In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
DETAILED DESCRIPTION
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of various embodiments of the present invention. It will be apparent, however, to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details. In other instances, well-known structures and devices are shown in block diagram form.
The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth in the appended claims.
Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
The term “computer-readable medium” includes, but is not limited to, non-transitory media such as portable or fixed storage devices, optical storage devices, and various other mediums capable of storing, containing or carrying instruction(s) and/or data. A code segment or computer-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
Furthermore, embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a computer-readable medium. A processor(s) may perform the necessary tasks.
Various techniques (e.g., methods, systems, computing devices, non-transitory computer-readable storage memory storing a plurality of instructions executable by one or more processors, etc.) are described herein for hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques. Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapon is aimed, and/or operator interface components such as operator controls and a target display device. In some embodiments, such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, where the boundary areas are determined by comparing the likelihood of the weapon hitting the target when aimed at the boundary to predetermined likelihood thresholds. Such embodiments may be further configured to engage the motor of the motorized weapon system, with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and, during the movement of the mount toward the target position, to periodically determine whether the weapon is aimed at a position within the boundary area surrounding the target point. When determining, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator, whereas when it is determined, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon. Finally, the semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators, via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems. Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence. Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
The various techniques described herein further include combinations of autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems, dynamic target tracking of both primary and secondary targets including target movement predictions and weapon/projectile characteristics, autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input, simplified user interfaces and operator controls for operating the semi-autonomous motorized weapon systems, and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, thereby providing increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
With reference now to FIG. 1, a depiction of an illustrative motorized weapon system 100 is shown. As shown in this example, weapon system 100 may include a weapon 110 with ammunition feed 115, a gimbal mount 120, and a camera/sensor unit 125. Additionally, in this example, the weapon system 100 includes a base/housing 130, which contains and obscures additional components of the system 100, including the motor, servos, targeting system, processing and memory components, communications system, firing controls, and various other components described herein.
In some embodiments, weapon system 100 may be a remotely operated weapon station (ROWS), including stabilization and auto-targeting technology. The targeting system of weapon system 100 may be configured to perform rapid target selection and acquisition, and to provide increased hit probabilities. Weapon system 100 may be compatible with many different types of weapon 110 and different corresponding types of ammunition, and as discussed below, the operation of the targeting system and other components of the weapon system 100 may depend on knowledge of which type of weapon 110 and ammunition is currently in use. As discussed in more detail below, weapon system 100 may be fully integrated, with auto-targeting capabilities and/or remote operation. Weapon system 100 also may be capable of being mounted to various different types of platforms, including tripods, buildings, ground vehicles (e.g., trucks, tanks, cars, jeeps), all-terrain vehicles (ATVs), utility task vehicles (UTVs), boats, fixed-wing aircraft, helicopters, and drones. As described in further detail below, various embodiments of weapon systems 100 may include capabilities for automatic target detection, selection, and re-selection, active stabilization, automatic ballistic solutions, target tagging, and/or continuous target tracking.
As noted above, weapon 110 may be any type of gun, armament, or ordnance, including without limitation, off-the-shelf firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other undisclosed devices. The weapon 110 may be attached to the weapon system 100 using a 2-axis or 3-axis mechanical gimbal mount 120, capable of controlling azimuth and yaw, elevation and pitch, and possibly cant and roll. A closed loop servomotor within the weapon system 100 may be configured to drive the gimbal to an identified target. A firing mechanism within the weapon system may be configured to fire the weapon 110, either electronically or by manually pulling the trigger, in response to a firing command from a human operator and/or additional firing instructions received from a targeting/firing component of the weapon system 100.
Camera/sensor unit 125 may include an array of various different sensors configured to collect data at the weapon system 100, and to transmit the sensor/image data back to the internal software systems of the weapon system 100 (e.g., targeting system/component, firing control, ballistics engine) and/or to a display device for outputting to an operator. Cameras/sensors within the sensor unit 125 may include, for example, cameras sensitive in various spectrums such as visible and infrared (IR), for day and night visibility, as well as rangefinders (e.g., LIDAR, RADAR, ultrasonic, etc.) to determine distance to target. Additional sensors within the sensor unit 125 may include rate gyros (e.g., MEMS or fiber optic gyros), which may be used to stabilize the weapon 110 within the mount 120. Magnetometers and accelerometers also may be included within the weapon system 100, and may be used for canceling gyro drift. Accelerometers also may be used to detect and respond to vehicle accelerations (i.e., when the weapon system 100 is mounted on a vehicle), and vibrations caused by vehicle movement and/or terrain and weather. Sensors 125 also may include wind speed sensors, including hot-wire, laser/LIDAR, sonic and other types of anemometers. Additionally, as described below, a global positioning system (GPS) receiver or other positioning devices may be included within the sensor unit 125, in order to determine the weapon location, heading, and velocity to compute firing solutions, and for use in situations where external target coordinates are provided. It should also be understood that for each of the cameras and/or sensors described above and elsewhere herein, the cameras/sensors may be housed within the sensor unit 125, positioned elsewhere in the weapon system 100, installed on a structure or vehicle on which the weapon system 100 is mounted, or installed at a separate remote location and configured to transmit wireless sensor data back to the weapon system 100.
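The gyro-drift cancellation described above is commonly realized with a complementary filter that blends short-term gyro integration with an absolute attitude reference derived from the accelerometers. The sketch below is illustrative only and is not part of the disclosed system; the blend factor and the sensor-reading arguments are hypothetical placeholders.

```python
import math

GYRO_WEIGHT = 0.98  # hypothetical blend factor; would be tuned per sensor package


def fuse_pitch(prev_pitch_deg, gyro_rate_dps, accel_x_g, accel_z_g, dt_s):
    """Blend a gyro-integrated pitch estimate with an accelerometer-derived
    absolute pitch to cancel slow gyro drift (complementary filter sketch)."""
    # Integrating the rate gyro is accurate short-term but drifts over time.
    gyro_pitch = prev_pitch_deg + gyro_rate_dps * dt_s
    # Pitch from the gravity vector is noisy short-term but drift-free.
    accel_pitch = math.degrees(math.atan2(accel_x_g, accel_z_g))
    return GYRO_WEIGHT * gyro_pitch + (1.0 - GYRO_WEIGHT) * accel_pitch
```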
Referring now to FIG. 2, a block diagram is shown illustrating various components and systems, and the computing/communication architecture within a motorized weapon system. In this example, weapon system 200 may correspond to the same weapon system 100 discussed above, and/or other variations of weapon systems described herein. As in the example above, weapon system 200 includes a weapon 225, mount 230, motor 235, and a camera/sensor unit 240. Weapon system 200 also includes a targeting/firing system 210, described below in more detail, which may be implemented in hardware, software, or a combination of hardware and software. Additionally, weapon system 200 may include operator-facing components, including controls 245 and a display screen 250.
As indicated by the arrows shown in the diagram of weapon system 200, the targeting/firing system 210 may be configured to control and drive the motor 235 to a particular target point, and to initiate firing of the weapon 225. The camera/sensor unit 240 may collect image and sensor data, and transmit that data back to the targeting/firing system 210 for use in target detection, selection, and tracking functionality. In some cases, image and sensor data may be transmitted directly from the sensor unit 240 to the display 250 for rendering/use in an operator user interface. The targeting/firing system 210 also may transmit various targeting data to the display device 250 for presentation to the operator, and may receive from the operator firing commands and/or other control commands via the operator controls 245.
In some embodiments, all components of a weapon system 200 may be co-located and installed together as a single integrated system. For instance, weapon systems 200 may include turrets or platform-mounted guns which include the weapon/motor 225-235, camera/sensor unit 240, targeting/firing system 210, as well as the operator controls 245 and display 250. However, in other embodiments, some or all of the components of a weapon system 200 may be non-integrated and located remotely from the others. For example, in some cases the weapon/motor 225-235 and a subset of the sensors/cameras 240 may be located near the potential targets, while the targeting/firing system 210 and operator interface components 245-250 may be in a distant remote location. Certain sensors 240 may be located at or near the weapon 225 (e.g., to measure distance to target, current location, weapon movement and vibration, wind and weather conditions, etc.), while other sensors 240 may be positioned at or near the target and/or at other angles to the target, while still other sensors or cameras 240 may be remotely located (e.g., drone-based cameras, satellite imagery, etc.). In embodiments in which certain components of a weapon system 200 are located remotely from others, each of the components may include network transceivers and interfaces configured for secure network communication, including components for data encryption and transmission over public or private computer networks, satellite transmission systems, and/or secure short-range wireless communications, etc.
The targeting/firing system 210 may receive input data from various data sources, and analyze the data to identify, select, and prioritize targets, actuate the motor 235, dynamically track targets, generate firing solutions, and control firing of the weapon 225. In order to perform these functions, the targeting/firing system 210 may receive data from one or more cameras/sensor units 240, including a GPS unit 211. The sensor data may include images of targets and potential targets, distance/range data, heat or infrared data, audio data, vehicle or weapon location data, vehicle or weapon movement and vibration data, wind and weather condition data, and any other sensor data described herein. Additionally, one or more data stores may store system configuration and operation data, including a rules data store 213 and a profiles data store 214. The rules data store 213 may include, for example, target identification rules, target selection/priority rules, firing rules, and other rules of engagement, each of which may depend on the particular operation, the current location of the weapon system 200, the individual operator, etc. The profiles data store 214 may include, for example, individual user profiles with user preferences and parameters, weapon profiles, and/or ballistic profiles that may include specifications for individual weapon types and ammunition types that may be used to calculate maximum ranges and targeting solutions. Additionally, one or more communication modules 212 within the targeting/firing system 210 may be used to receive commands and other data from the current operator and/or from a separate command center. As discussed below, commands received from a command center or other higher-level authority may be used to control the target selection and rules of engagement for particular operations. Communication modules 212 also may be used to receive or retrieve sensor data from remote sensor systems, including satellite data, image data from remote cameras, target GPS data, weather data, etc. The targeting/firing system 210 may include various components (e.g., targeting component 220) configured to receive and analyze the various data to perform targeting functions, including subcomponents for target detection 221, target selection 222, target tracking 223, and firing control 215, among others.
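As one hypothetical illustration of how records in the rules data store 213 and profiles data store 214 might be organized, the sketch below defines a minimal ballistic profile and rules-of-engagement entry. All field names and default values are assumptions for illustration and are not part of the disclosure.

```python
from dataclasses import dataclass, field


@dataclass
class BallisticProfile:
    # Hypothetical profile record; a real profile would carry full ballistic tables.
    weapon_type: str
    ammunition_type: str
    muzzle_velocity_mps: float
    drag_coefficient: float
    max_effective_range_m: float


@dataclass
class EngagementRules:
    # Hypothetical rules-of-engagement record keyed by operation/location/operator.
    operation_id: str
    allowed_target_types: set = field(default_factory=set)
    require_operator_confirmation: bool = True
    min_identification_confidence: float = 0.9
    min_hit_probability: float = 0.7
```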
The operator controls 245 and display screen 250 may correspond to the input/output interface between the human operator and the weapon system 200. As noted above, certain weapons systems 200 may be fully autonomous, or may operate in a supervised autonomous mode, in which case the operator controls 245 and display screen 250 need not be present. Additionally, the operator controls 245 and display screen 250 may be remotely located in some embodiments, allowing the operators to control the weapon system 200 from a separate location that may be a few feet away or across the globe. The display device 250 may receive and output various user interface views to the operator, including views described below for identifying and highlighting targets, obscuring non-targets, rendering target points, weapon trajectories, and confidence ranges, and providing various additional sensor readings to the operator. The operator controls 245 may allow the operator to identify, select, and mark targets, and to fire the weapon 225. As shown in this example, the operator controls 245 may include a fire button 246 (to fire the weapon 225), and a “next target” button 247 to instruct the targeting component 220 to re-select the next priority target. In certain embodiments, the operator controls might include only these two buttons, and need not include a joystick for aiming, tracking, etc.
Referring briefly to FIGS. 3A-3C, these drawings illustrate the operation of motorized weapons systems on three different vehicle-based mounting platforms. In the example of FIG. 3A, a motorized weapon system is mounted on a stationary or moving vehicle 306. The remote weapon system 304 holds the firearm 305, and various sensors may be installed in the frame of reference of the firearm 305, in the frame of reference of the gimballed remote control, and/or in the frame of reference of the vehicle 306. In these examples, the field of view 307 is represented by dotted lines. A crosshair 301 shows the current projected point of impact. In each of FIGS. 3A-3C, the crosshair 301 is not yet on target, and it may be assumed that the motor is engaged driving the firearm to the target position, or that the operator has not yet confirmed the target. The targeting system in these examples shows a primary target 302 identified by a double-dashed box, and a secondary target, which has been identified but not yet targeted, is shown within a single dashed box 303. FIG. 3B shows a similar set of components, but in this case, the scenario is a maritime use with an armed boat 306 as the vehicle. FIG. 3C shows yet another scenario in which the vehicle 306 is a helicopter. FIG. 3C also illustrates that the system may identify multiple secondary targets 303 within the field of view 307.
Referring now to FIG. 4, a flow diagram is shown illustrating a process by which a motorized weapon system may identify, target, engage, and fire on one or more targets. As described below, the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225-235, one or more sensor units 240, operator interface components 245-250, and/or various remote and external systems. However, it should be understood that the process steps described herein, such as target identification and prioritization, dynamic target tracking, semi-autonomous target selection, motor actuation and firing control/locking capabilities, and the like, need not be limited to the specific systems and hardware implementations described above in FIGS. 1-3, but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
In step 401, the components of the motorized weapon system 200 may identify and verify one or more targets, using sensor units 240 and/or additional data sources. In some embodiments, the identification and/or verification of targets may be performed fully autonomously by the system 200. For example, image data from cameras and sensor data from other sensors 240 (e.g., range to target data, heat data, audio, etc.) may be used to identify one or more targets within the range and proximity of the weapon system 200. In some cases, data from additional sources may be used as well, including imagery or sensor data from remote sensor or imaging systems (e.g., other weapons systems 200, fixed cameras, drones, satellites, etc.). For example, if sensor unit 240 does not include a rangefinder and/or if exact range to target data is not available, the targeting/firing system 210 may be configured to calculate approximate range data using passive ranging techniques. For example, heights of known objects (or presumed heights) may be used to calculate the distance of those objects from the weapon system 200. Additional sources of target data also may be received via communication modules 212, which may include the GPS coordinates of targets, or bearing to targets, received from a command center. Such image data and other sensor data received from additional data sources may be used by the targeting/firing system 210 to triangulate or confirm a target's location, or verify the identity of a target, etc.
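The passive-ranging approach mentioned above, estimating distance from the presumed height of a known object, follows directly from the pinhole-camera relation. A minimal sketch with hypothetical function and parameter names:

```python
def estimate_range_m(known_height_m, apparent_height_px, focal_length_px):
    """Passive range estimate from a presumed object height (pinhole model):
    range = (true height x focal length in pixels) / apparent height in pixels."""
    if apparent_height_px <= 0:
        raise ValueError("apparent height must be positive")
    return known_height_m * focal_length_px / apparent_height_px


# e.g. a vehicle presumed to be 1.8 m tall spanning 40 px with a 2000 px focal
# length yields an approximate range of 90 m.
```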
As used herein, target identification and target verification refer to related but separate techniques. Target identification (or target detection) refers to the analysis of camera images, sensor data, etc., to detect objects and identify the detected objects as potential targets for the weapon system 200 (e.g., vehicles, structures, weapons, individuals, etc.), rather than generally non-target objects such as rocks, trees, hills, shadows, and the like. Target verification (or target confirmation) refers to additional analyses of the same images/sensor data, and/or additional sources of images/sensor data, to determine whether or not the identified potential target should be selected for targeting by the weapon system 200. Target verification techniques may be based on the configuration of the system, the priorities of the particular mission, etc. For example, target verification techniques for vehicles may include identifying the size of a vehicle target (e.g., based on image analysis, target range, heat signatures from engines, etc.), the vehicle type (e.g., based on image analysis, and comparisons to a database 214 of target/non-target images), the presence of weapons on a target or proximate to a target, etc. For example, the size, shape, color, movement, audio and heat signatures of a vehicle may be analyzed to determine whether that vehicle is a drone, helicopter, aircraft, boat, tank, truck, jeep, or car, whether the target is a military or civilian vehicle, the number of individuals and/or weapons on the vehicle, and the like, all of which may be used by a rules database 213 to determine whether the vehicle is a target or non-target. Target verification also may include identifying particular insignia on targets, and for human targets, facial recognition and/or biometric recognition to confirm the identity of the target.
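The separation between target identification and target verification described above can be sketched as a two-stage pipeline in which a detector proposes candidate objects and a verifier scores each candidate. The detector and verifier callables, the attribute names, and the confidence thresholds below are hypothetical placeholders, not disclosed components.

```python
def identify_and_verify(frame, detector, verifier, allowed_types,
                        min_detect_conf=0.8, min_verify_conf=0.9):
    """Two-stage sketch: detection proposes potential targets; verification
    decides whether each one should actually be selected for targeting."""
    selected = []
    for obj in detector(frame):                 # hypothetical: labeled boxes with confidence
        if obj.confidence < min_detect_conf:
            continue                            # not confidently identified as a potential target
        if obj.label not in allowed_types:
            continue                            # identified, but not an allowed target class
        verify_score = verifier(frame, obj)     # insignia, size, weapons, facial match, etc.
        if verify_score >= min_verify_conf:
            selected.append((obj, verify_score))
    return selected
```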
In some cases, both target identification and target verification in step 401 may be performed fully autonomously by the weapon system 200, using the techniques described above. In other cases, target identification and/or verification may include semi-autonomous or manual steps. For example, the rules of engagement for particular operations may require that each target be visually confirmed by a human operator. Such visual confirmation may be performed by the operator, as described in steps 406-407 below. Additionally or alternatively, the visual confirmation may be received from a different user, such as a commanding officer at a remote command center or other authorized user. In such cases, the weapon system 200 may be configured to transmit imagery and other sensor data to one or more remote locations, and then to receive the instructions identifying the potential target as a selected target or a non-target, from the remote authorized user/command center via a communication module 212. These remote visual confirmation techniques may be entirely transparent with respect to the operator of the weapon system 200 in some cases; that is, if a target is not selected/confirmed by a remote authorized user, then that target might not ever be rendered or selected via the operator display device and/or might not be selectable by the operator during steps 406-407.
As noted above, both target identification and target selection in step 401 may be based on sets of rules received via a rules database 213 or other sources. Target selection rules may be based on target type (e.g., types of vehicles, individuals (if any), and structures, etc.), target size, target distance, the presence and types of weapons on a target, the uniform/insignia on a target, and the like. Additional rules may relate to the probability that the target has been accurately identified (e.g., level of confidence of facial recognition, vehicle type identification, insignia recognition, etc.), the probability that the weapon system 200 will be able to hit the selected target (e.g., based on target distance, target movement, weapon and ammunition type, wind and weather conditions, etc.), and/or the presence of potential collateral damage that may occur if the target is fired upon (e.g., based on detection of friendly and non-targets in the proximity of the identified target). Different sets of rules may be applied for different operators, different weapons 225 and ammunition types, different times, and/or different physical locations for the engagement. For instance, while one set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 for an engagement with a particular operator, at a particular date and time, using a particular weapon/ammunition type, in a particular country/region of the engagement, having particular lighting or weather conditions, and so on, an entirely different set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 if one or more of these variables (e.g., operator, time, weapon or ammunition type, engagement location or environmental conditions, etc.) changes.
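One plausible way to realize the context-dependent rule selection described above is a keyed lookup that falls back from the most specific match (operator, weapon, region) to broader defaults. The key structure and fallback order in this sketch are assumptions for illustration only.

```python
def select_rule_set(rule_sets, operator_id, weapon_type, region, default_rules):
    """Pick the most specific matching rule set for the current engagement
    context; keys are tried from most to least specific (hypothetical scheme)."""
    for key in (
        (operator_id, weapon_type, region),   # fully specific match
        (None, weapon_type, region),          # any operator, this weapon and region
        (None, None, region),                 # any operator and weapon, this region
    ):
        if key in rule_sets:
            return rule_sets[key]
    return default_rules
```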
In step 402, for scenarios in which multiple targets have been identified and selected in step 401, the targeting/firing system 210 of the motorized weapon system 200 may be configured to prioritize the multiple targets, thereby determining a firing order. As with the techniques for target identification and selection described above, target prioritization techniques similarly may be based on imagery and sensor data, as well as on sets of operational rules that may apply to operators, weapons, locations, etc. Examples of target prioritization rules may include, without limitation, rules that prioritize vehicles over human targets, certain types of vehicles over other types of vehicles, armored vehicles over non-armored vehicles, armed targets over non-armed targets, uniformed/insignia targets over non-uniformed/non-insignia targets, close targets over far targets, advancing targets over stationary or retreating targets, higher confidence targets (i.e., higher probability of the weapon being able to hit the target) over lower confidence targets, targets firing weapons over targets not firing weapons, and/or any combination of these criteria. In some examples, the targeting/firing system 210 may evaluate the current target distance and trajectory of all advancing and armed targets (e.g., missiles, drones, ground vehicles, and individuals, etc.), in order to prioritize the targets in the order in which they would first reach the current position (or future position) of the weapon system 200. These target prioritization rules also may include rules determining how particular types of targets may be targeted. For example, such rules may include the desired point of impact for a particular target type (e.g., the engine of a boat, the center of mass of an individual, etc.).
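The prioritization rules listed above could be folded into a single per-target score that determines the firing order. The attribute names and weights in this sketch are illustrative assumptions rather than disclosed values.

```python
def priority_score(target, weights):
    """Combine prioritization criteria into one score (higher = engage sooner).
    Target attributes and weight keys are hypothetical."""
    score = 0.0
    score += weights["armed"] * (1.0 if target.is_armed else 0.0)
    score += weights["advancing"] * (1.0 if target.is_advancing else 0.0)
    score += weights["hit_probability"] * target.hit_probability
    score -= weights["range"] * target.range_m   # closer targets rank higher
    return score


def prioritize(targets, weights):
    """Return the selected targets in descending priority order."""
    return sorted(targets, key=lambda t: priority_score(t, weights), reverse=True)
```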
Additionally, different sets of rules or algorithms may be applied for prioritizing targets, depending on the current operator, current location, current date/time, and/or based on predefined operation-specific rules of engagement. Further, rules or algorithms for prioritization may be based on or adjusted in view of current conditions, such as the current amount of ammunition of the weapon system 200 (e.g., lower ammunition circumstances may cause prioritization of the most valuable/important targets first), the current wind or weather conditions (e.g., in which closer and/or higher confidence targets may be prioritized), or based on nearby friendly or non-hostile targets (e.g., in which closer and/or higher confidence targets may be prioritized). Additionally, certain prioritizing algorithms may adjust the priorities of a set of targets to reduce and/or minimize the lag time between successive firings of the weapon, for instance, by prioritizing a set of nearby targets successively in the priority rank order, in order to reduce the firing latency time required to drive the weapon 225 through the sequence of targets.
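The latency-reduction adjustment just described, engaging nearby targets consecutively to reduce gimbal travel, could be approximated with a greedy nearest-neighbor pass over an already prioritized list, for example among targets of comparable priority. This is only one possible sketch, with a hypothetical angular_distance helper.

```python
def reduce_slew_time(ordered_targets, angular_distance):
    """Greedy nearest-neighbor reordering: starting from the top-priority target,
    repeatedly pick the remaining target requiring the smallest gimbal slew.
    `angular_distance(a, b)` is a hypothetical helper returning degrees of travel."""
    if not ordered_targets:
        return []
    remaining = list(ordered_targets)
    sequence = [remaining.pop(0)]          # keep the highest-priority target first
    while remaining:
        last = sequence[-1]
        nearest = min(remaining, key=lambda t: angular_distance(last, t))
        remaining.remove(nearest)
        sequence.append(nearest)
    return sequence
```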
In various embodiments, operators may be permitted to switch on-the-fly between different rules or algorithms for target selection and prioritization. Such switching capabilities may be based on the rank and/or authorization level of the operator, and in some cases may require that a request for approval be transmitted from the weapons system 200 to a high-level user at a remote command center.
Referring briefly to FIG. 5, a display screen is shown displaying an example user interface 500 that may be generated by a motorized weapon system 200 during engagement of a set of targets. In this example, a plurality of targets have been identified and selected within the range and proximity of the weapon system 200. The targets have been prioritized to select a primary target 501, several secondary targets 502, and several non-targets 503 (e.g., friendly or non-hostile vehicles or individuals). In this example, the primary target 501 is indicated with a double dotted line, the secondary targets 502 are indicated with a single dotted line, and the non-targets have no lines. It should be understood that different types of user interface indicators may be used in other embodiments, such as a green border (or other color) for the primary target 501, and a different color for secondary targets 502. In some examples, the secondary targets 502 might not be indicated at all on the user interface 500, until a secondary target 502 becomes the primary target 501. In other examples, only N number of the secondary targets 502 might be identified on user interface 500, such as only the next highest priority target 502, or the two next highest priority targets, etc. Additionally, non-targets 503 may be entirely obscured or blocked out, so as not to distract the operator. Crosshairs 505 are also displayed in this example, representing the point at which the weapon 225 of the weapon system 200 is currently aimed.
Finally, example user interface 500 includes two operator controls: a fire button 510 to allow the user to fire the weapon 225, and a next button 515 to allow the user to select the next target in the priority list. In this example, fire button 510 is shaded, indicating that the weapon 225 cannot currently be fired. As described below in more detail, this may represent a feature in which the operator's firing control mechanism 246 is disabled whenever the weapon 225 is not currently aimed at a selected target. However, it will be noted that the next button 515 is enabled in this example, indicating that the next mechanism 247 that allows the operator to change the primary target 501 to the next highest priority target 502 in the priority list may be enabled even when the crosshairs 505 are not yet positioned on the primary target 501.
The kill chain sequence may continue by performing the functionality of steps 403-410 in a continuous loop for each of the targets selected in step 401, and in the priority order of the target prioritization performed in step 402. Therefore, the first iteration of steps 403-410 may be performed for the highest priority target, the second iteration of steps 403-410 may be performed for the second highest priority target, and so on.
In step 404, for the current highest priority target in the prioritization list, the targeting/firing system 210 may perform a dynamic tracking technique to determine a firing solution for that target. A firing solution refers to a precise firing position for the weapon (e.g., an azimuth/horizontal angle and altitude/elevation angle) and a precise firing time calculated by the targeting/firing system 210 to hit the primary target. For stationary targets, target tracking need not be performed, and the firing solution may be computed based on a number of factors, including the target distance and target bearing from the weapon 225, the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions).
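For a stationary target, the elevation component of a firing solution can be illustrated with the drag-free projectile relation sin(2θ) = gR/v²; a real solution would also fold in drag, wind, and relative target elevation as noted above. A minimal sketch:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def elevation_angle_rad(range_m, muzzle_velocity_mps):
    """Drag-free elevation angle for a level shot: sin(2*theta) = g*R / v^2.
    This is a simplification; fielded solutions would include drag and wind."""
    arg = G * range_m / muzzle_velocity_mps ** 2
    if arg > 1.0:
        raise ValueError("target beyond maximum drag-free range")
    return 0.5 * math.asin(arg)


# e.g. an 800 m shot at 850 m/s muzzle velocity needs roughly 0.31 degrees of elevation.
```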
When the target is moving and/or anticipated to be moving, dynamic target tracking may be required to generate a firing solution, introducing additional variables which may increase the complexity and uncertainty of the firing solution calculation. Initially, dynamic target tracking may involve calculating the anticipated direction and velocity of the target. In some embodiments, the targeting/firing system 210 may assume that the primary target will continue along its current course with the same velocity and direction. If the target is currently moving along a curved path, and/or is currently accelerating or decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration pattern, and may extrapolate out based on those variables. Further, in some embodiments, the targeting/firing system 210 may anticipate future changes in course or speed, based on factors such as upcoming obstructions in the target's path, curves in roads, previous flight patterns, etc.
In addition to dynamically tracking the target in order to anticipate the future position of the target, the determination of a firing solution for a moving target also may take into account the anticipated time to drive the motor 235 so that the weapon is positioned at the correct firing point, and the anticipated amount of time between the firing command and when the projectile/ammunition will reach the target. The time to drive the motor 235 may be calculated based on the distance the gun is to be driven, the speed of the motor, and/or the weight of the weapon 225. The amount of time between receiving a firing command and when the projectile/ammunition will reach the target may be based on the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, etc. Additionally, in some cases, an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) also may be included in the firing solution calculation.
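Because the target keeps moving while the mount slews, the operator decides, and the projectile flies, the aim point can be found with a simple fixed-point iteration over the total lead time. The sketch below assumes constant target velocity and uses hypothetical helper functions for slew time and time of flight; it is illustrative only.

```python
def firing_solution(target_pos, target_vel, slew_time_fn, time_of_flight_fn,
                    operator_delay_s=0.0, iterations=5):
    """Iteratively solve for the aim point: extrapolate the target by the lead
    time, where lead time = mount slew time + operator delay + projectile time
    of flight, all of which depend on the aim point itself. The helper functions
    are hypothetical: slew_time_fn(aim) -> s, time_of_flight_fn(aim) -> s."""
    aim = tuple(target_pos)
    for _ in range(iterations):
        lead_time = slew_time_fn(aim) + operator_delay_s + time_of_flight_fn(aim)
        # Constant-velocity extrapolation of the target over the lead time.
        aim = tuple(p + v * lead_time for p, v in zip(target_pos, target_vel))
    return aim
```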
Referring briefly to FIG. 6, another example user interface 600 is shown that may be generated by a motorized weapon system 200 during engagement of one or more targets. In this example, only a single primary target 601 is shown, and the targeting/firing system 210 has assessed that the target 601 is moving toward the lower-right direction of the interface 600. Based on the factors discussed above, namely (a) the anticipated movement of the target 601, (b) the time required to engage the motor 235 and drive the weapon to the firing point, and (c) the time for the projectile/ammunition to be fired and reach the target, the targeting/firing system 210 may calculate the firing solution. In this example, the crosshairs 605 represent the point at which the weapon 225 is currently aimed, the point 606 represents the desired point of impact on the target 601, and point 607 represents the firing solution determined by the targeting/firing system 210. As shown in user interface 600, the motor 235 is currently re-positioning the weapon toward the firing solution point 607, and the firing solution computation has taken into account the time to reposition the weapon 225 and the projectile time-to-target. Potentially, the firing solution computation also may take into account a short time delay to fire the weapon, and/or an anticipated operator decision time delay.
Further, example interface 600 also includes three operator controls: a fire button 610, a next button 615, and a safe button 620. As discussed above, the fire button 610 allows the operator to fire the weapon 225, but in some cases might be enabled only after the weapon 225 has reached the firing solution point 607. The next button 615 allows the operator not to fire the weapon 225 at the primary target 601, but instead to re-select the next highest priority target in the priority list. In this example, the primary target 601 may be moved to the back of the priority list or elsewhere in the priority list, based on the operator's selection of the next control 615. Finally, the safe button 620 allows the operator to mark the currently selected primary target 601 as a friendly or non-target object, thereby removing it from the set of selected targets determined in step 401 and the priority list of step 402. Thus, after an operator has marked a target using the safe mechanism 620, it may not be selected again by the targeting/firing system 210, at least during the current engagement by the current weapon system 200. In some embodiments, the configuration settings of the targeting/firing system 210 may determine that a target marked as safe by an operator during one engagement might thereafter be excluded from target selection/prioritization in future engagements. Additionally or alternatively, the weapon system 200 may transmit data identifying any targets marked as safe to other weapons systems 200 in the same general location, so that those other weapons systems 200 may automatically remove the target marked as safe from their target selection/prioritization lists as well.
Although step 404 was described above as performed for only a single target (i.e., the current highest priority target), in some embodiments, the targeting/firing system 210 may continuously perform dynamic tracking for all targets selected/prioritized in steps 401-402. In such cases, by performing dynamic tracking on the selected secondary target(s) before the completion of the firing sequence 403-410 for the primary target, the targeting/firing system 210 may more quickly and efficiently determine the firing solution for the next primary target as soon as the firing sequence 403-410 is completed for the first primary target. Additionally, while dynamically tracking a plurality of secondary target(s), the targeting/firing system 210 may potentially re-order the prioritization sequence determined in step 402, for example, based on movement of the secondary targets and/or based on newly received data about one or more of the secondary targets (e.g., improved verification information, additional threat information, etc.).
In step 405, the targeting/firing system 210 may engage the motor 235 to drive the orientation of the weapon 225 toward the firing solution determined for the primary target in step 404. Thus, referring again to FIG. 6, the motor 235 may be engaged to aim the weapon 225 from its currently aimed position 605 to the determined firing solution point 607. It may be noted from this example (a) that the weapon 225 may be driven not toward the current position point of the target 606, but instead to the future position point 607, and (b) that the motor 235 may be engaged and the weapon 225 may be driven to this point by the targeting/firing system 210 in a fully autonomous manner, before any action has been taken by the operator to view, select, mark, or engage this target.
In step 406, the targeting/firing system 210 may generate and transmit a user interface to be rendered for the operator via one or more display devices 250. As discussed above, the human operator may be located at the weapon system 200 or remote to the weapon system 200, in which case the user interface may be transmitted via the communication module 212 over one or more secure computer networks, wireless networks, satellite networks, etc. In various embodiments, the user interface provided in step 406 may correspond to user interfaces 500 and/or 600 discussed above, although several variations may be implemented in different embodiments. For instance, as noted above, the primary target 501 may be marked by a particular scheme that is different from the secondary targets and from non-targets. In some cases, the user interface may automatically zoom in on the primary target (as in screen 600) to allow the operator the best possible visual of the target. Additionally or alternatively, secondary targets and/or non-targets may be blocked out, hidden, or otherwise obscured to prevent confusion or distraction by the operator. Further, in different embodiments, each of the various different target points discussed above (e.g., crosshairs 605 representing the current weapon aiming point, the current target position point 606, and/or firing solution target point 607) may or may not be rendered within the user interface, and/or may be shown in different colors, using different graphics and icons, etc. Finally, the user interface generated and rendered in step 406 may include additional components such as side menus, overlays, and the like, to convey any relevant sensor information about the target or the firing environment. Examples of such sensor information that may be included in the operator user interface include the target type, the verified target name/identifier (if known) and the confidence level of the verified name/identifier, distance to target, current wind and weather conditions, current status of weapon 225 and ammunition supply, number of other secondary targets, etc.
In step 407, the targeting/firing system 210 may receive engagement instructions from the operator, via operator controls 245. As illustrated in FIG. 5, in some embodiments, the operator controls might only include two buttons: a fire button and a next button. Or, as illustrated in FIG. 6, the operator controls might include only three buttons: a fire button, a next button, and a safe button. Although any number of different/additional operator controls may be included in other embodiments (e.g., mouse/joystick for aiming, manual override, target selection controls, etc.), there are certain technical advantages associated with a limited interface such as the two-button or three-button interfaces shown in 500-600, including simplification of the operator interface, reduction of real-time operator errors, increased speed to weapon firing, etc.
Additionally, as noted above during the discussion of dynamic target tracking, there may be a time delay between steps 406 and 407, for target analysis, evaluation, and decision-making by the operator. During this time delay, the dynamic tracking may continue for the primary target as well as the secondary targets selected by the targeting/firing system 210. Thus, while the operator deliberates on whether or not to fire on a target between steps 406 and 407, for moving targets and/or other circumstances (e.g., a detected change in the wind), the firing solution may be updated during this time delay and the motor 235 may be continuously engaged so that the weapon 225 is continuously aimed at the most recent firing solution target point. Additionally, for excess delays or deliberations between steps 406 and 407, the target identification, selection, and prioritization techniques discussed above in steps 401 and 402 may be updated, automatically and entirely transparently to the operator, to re-select and re-prioritize the targets based on new imagery, sensor data, and other relevant data received during the time delay between steps 406-407.
After receiving the firing/engagement instructions from the operator in step 407, the targeting/firing system 210 may perform the received instructions in steps 408-410. In this example, similar to that shown in FIG. 6, there are only three possible operator instructions with respect to the primary target shown in the user interface: fire on the target (step 408), do not fire on the target and proceed to the next target (step 409), and do not fire on the target and mark the target as a non-target (step 410). As discussed above, the fire command (408) is an operator instruction to fire the weapon 225, and in some cases might be enabled only after the weapon 225 has reached the firing solution target point. When the operator selects the fire button 246 (or other fire command) in step 408, the targeting/firing system 210 may initiate firing of the weapon 225, and then return to perform steps 403-410 for the next highest priority target. Additionally, in some embodiments, the targeting/firing system 210 may be configured to evaluate the accuracy of the projectile fired in step 408, and may perform a real-time automatic correction in the targeting algorithm based on the accuracy evaluation. For example, upon firing a shot in step 408, the targeting/firing system 210 may be configured to activate one or more cameras or sensors from sensor units 240 (which may be local or remote), to detect the landing time and location of the projectile. Additional sensors such as audio sensors, heat sensors, etc., also may be used to determine where the projectile hit/landed. The projectile landing/hit data may be compared to the firing solution/target point data that was determined by the targeting/firing system 210 prior to firing the projectile. If the shot was off target by an amount greater than a predetermined accuracy threshold, then the targeting/firing system 210 may be configured to adjust its targeting algorithm in real-time, so that the updated algorithm may be used in the next iteration of steps 403-410. Additionally, if the shot was off target by a sufficient amount that the target was missed, then the targeting/firing system 210 may be further configured to re-insert the previously fired upon target back into the priority list of selected targets.
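The real-time accuracy correction described above can be sketched as tracking the observed miss vector and nudging a persistent aim-bias offset that subsequent firing solutions subtract. The threshold and gain constants below are assumptions for illustration, not disclosed values.

```python
ACCURACY_THRESHOLD_M = 1.0   # hypothetical: misses larger than this trigger correction
CORRECTION_GAIN = 0.5        # hypothetical damping so one bad reading cannot over-correct


def update_aim_bias(aim_bias, intended_point, observed_impact):
    """Compare the observed impact location to the intended point of impact and
    nudge a persistent aim-bias correction for later firing solutions."""
    miss = tuple(o - i for o, i in zip(observed_impact, intended_point))
    if max(abs(m) for m in miss) <= ACCURACY_THRESHOLD_M:
        return aim_bias                      # within tolerance: leave the algorithm unchanged
    return tuple(b + CORRECTION_GAIN * m for b, m in zip(aim_bias, miss))
```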
The next command (step 409) is an operator instruction not to fire the weapon 225 at the target, but to retain the target within the set of selected targets/target priority list, and then to re-select the next highest priority target in the priority list. In various examples, a next command in step 409 may cause the target to be placed at the back of the priority list of selected targets, or may cause the target to be placed immediately after the next highest priority target in the priority list. Finally, a safe command (step 410) is an operator instruction to mark the target as a friendly or non-target object, thereby removing it from the set of selected targets and target priority list. Thus, after step 410, the target may not be selected again by the targeting/firing system 210, during at least the current engagement by the current weapon system 200. As noted above, in some embodiments, a target marked as safe during step 410 during an engagement at one weapon system 200 also might be excluded from target selection in future engagements of the weapon system 200, and/or during current and future engagements at different weapons systems 200.
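For illustration only, the effect of the fire, next, and safe commands on the target priority list might be modeled as in the following sketch; the class name, the deque-based list, and the choice of where a deferred target is re-inserted are assumptions, not requirements of the system described above.

```python
from collections import deque

class TargetQueue:
    """Sketch of the selected-target priority list manipulated by the
    fire / next / safe operator commands (steps 408-410)."""

    def __init__(self, targets, next_to_back=True):
        self.queue = deque(targets)          # highest priority at the left
        self.safe_ids = set()                # targets marked as non-targets
        self.next_to_back = next_to_back

    def current(self):
        return self.queue[0] if self.queue else None

    def fire(self):
        # After firing, move on to the next highest-priority target.
        return self.queue.popleft() if self.queue else None

    def next(self):
        # Keep the target in the list, but defer it.
        if not self.queue:
            return None
        deferred = self.queue.popleft()
        if self.next_to_back or len(self.queue) < 2:
            self.queue.append(deferred)
        else:
            self.queue.insert(1, deferred)   # immediately after the new primary
        return self.current()

    def safe(self):
        # Mark as friendly/non-target and exclude from further selection.
        if not self.queue:
            return None
        marked = self.queue.popleft()
        self.safe_ids.add(marked)
        return self.current()


q = TargetQueue(["boat-2", "boat-5", "truck-1"])
q.next()    # defer boat-2
q.safe()    # mark boat-5 as a non-target
print(q.current(), list(q.queue), q.safe_ids)
```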
Thus, the various techniques discussed above with reference to FIG. 4, including without limitation: (a) autonomous target selection, prioritization, and re-selection by the targeting/firing system 210, (b) dynamic target tracking of both the primary target and secondary targets that takes into account target movement, weapon/projectile characteristics, etc., (c) autonomous actuation of the motor to automatically orient the weapon toward the primary target before receiving any operator input, (d) a simplified user interface and operator controls, and (e) enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, alone and in combination, provide increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
As mentioned above, certain aspects of the present disclosure relate to techniques for disabling and re-enabling an operator firing control (e.g., 246), during the period of time when the motor 235 of a motorized weapon system 200 is engaged and the weapon 225 is being positioned and oriented toward a determined target point for firing. The process of engaging the motor 235 of the weapon system 200 to position the weapon 225 to fire on a particular target point may take anywhere from a fraction of a second to several seconds, depending on factors such as the motor size and speed, gun size and weight, angular distance to be traveled, etc. During the time period when the motor 235 is engaged in positioning the weapon 225, the projected point of impact of a projectile fired from the weapon 225 may become closer and closer to the target point, and similarly, the likelihood of hitting the target may increase continuously until a maximum likelihood is reached when the projected point of impact of the weapon 225 (e.g., marked by crosshairs 505, 605, etc.) is directly on the determined firing solution target point. Because many unknown variables may exist during the weapon firing process (e.g., exact target distance and bearing, exact muzzle velocity and aerodynamic drag of the projectile, future target movement, exact wind and air pressure conditions, exact weapon vibration, and so on), the probability of hitting the target might never be 100%. However, when the likelihood of hitting the target is determined to be sufficiently high, e.g., above a predetermined likelihood threshold, then the targeting/firing system 210 may be configured to enable firing of the weapon 225 (and/or automatically fire the weapon 225).
Accordingly, in some embodiments, the targeting/firing system 210 may be configured to determine if/when the predetermined likelihood threshold for hitting the target is reached during the time period when the motor 235 is engaged in positioning the weapon 225, but before the crosshairs 505 are directly on the target (i.e., before the projected point of impact of the weapon 225 is directly on the determined firing solution target point). In such embodiments, the targeting/firing system 210 may be configured to disable the operator firing mechanism 246 when the current likelihood of hitting the target is below the predetermined likelihood threshold, based on the position/orientation of the weapon 225 and other factors. The operator firing mechanism 246 then may be re-enabled in response to the targeting/firing system 210 determining that the current likelihood of hitting the target is above the predetermined likelihood threshold. These aspects are described below in more detail with reference to FIGS. 7-8.
Referring now to FIG. 7, a flow diagram is shown illustrating a process of disabling and/or re-enabling the firing mechanism of a motorized weapon system while the motor is engaged to move the weapon to a target point. As described below, the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225-235, one or more sensor units 240, operator interface components 245-250, and/or various remote and external systems. However, it should be understood that process steps described herein, such as determination of likelihood thresholds for hitting targets, and corresponding boundary areas for motorized weapons systems, need not be limited to the specific systems and hardware implementations described above in FIGS. 1-3, but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
In step 701, a motorized weapon system 200 has identified and selected a particular target, and determines a firing solution and/or target point for the selected target. Thus, step 701 may be similar or identical to step 404 discussed above. As noted above, one or both of the target and the weapon system 200 may potentially be moving during this process. When both the target and the weapon 225 are stationary, target tracking need not be performed, and the firing solution target point may be computed based on factors including the target distance, target bearing from the weapon 225, muzzle velocity of the weapon 225, aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions). However, when one or both of the selected target and the weapon 225 are moving and/or are anticipated to be moving, dynamic target tracking may be required to generate a firing solution, and additional variables may increase the complexity and uncertainty of the firing solution calculation. For example, dynamic target tracking may be used to determine the current velocity and direction of travel of both the weapon system 200 and the target, and that data may be used to calculate the anticipated velocity and direction of travel of both in the near future. In some cases, the targeting/firing system 210 may assume that both the weapon system 200 and the target will continue along their current course with the same velocity and direction, and if either is currently moving along a curved path and/or is currently accelerating/decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration in the near future. As noted above, when performing dynamic tracking on a moving target, the determination of a firing solution (e.g., predicted future coordinates at a future firing time) also may take into account the anticipated time to engage the motor 235 to position and orient the weapon at the correct firing point, as well as the anticipated time lag for the fired projectile to reach the target. Additionally, in some cases, the targeting/firing system 210 may build in an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) which may be included in the firing solution calculations for moving targets.
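As a rough illustration of the firing-solution calculation for a moving target, the sketch below assumes a constant-velocity target, a single average projectile speed in place of a full drag model, and fixed motor-slew and operator-reaction delays; the function name, parameters, and numeric values are illustrative only and do not represent the disclosed calculation.

```python
import math

def firing_solution(target_pos, target_vel, weapon_pos, avg_projectile_speed,
                    motor_delay_s=0.5, operator_delay_s=1.0, iterations=10):
    """Sketch of a firing-solution target point for a constant-velocity target.

    Positions and velocities are 2-D (x, y) in metres and metres/second in a
    local frame centred on the weapon system. The total lead time is the assumed
    operator reaction time plus motor slew time plus projectile time of flight,
    solved by simple fixed-point iteration."""
    time_of_flight = 0.0
    aim = target_pos
    for _ in range(iterations):
        lead_time = operator_delay_s + motor_delay_s + time_of_flight
        aim = (target_pos[0] + target_vel[0] * lead_time,
               target_pos[1] + target_vel[1] * lead_time)
        rng = math.dist(weapon_pos, aim)
        time_of_flight = rng / avg_projectile_speed
    return aim


# Target 800 m away, crossing at 10 m/s; projectile averages 700 m/s.
print(firing_solution((800.0, 0.0), (0.0, 10.0), (0.0, 0.0), 700.0))
```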
In step 702, the targeting/firing system 210 of the motorized weapon system 200 may determine a boundary area surrounding the target point determined in step 701. In some examples, the boundary area may be referred to as a “confidence lock” boundary, because as discussed below, the firing mechanism may be disabled when the projected point of impact of the weapon is outside of this area. From the perspective of the weapon system 200, the boundary area may be a circle or other two-dimensional closed shape surrounding the target point. A simple example of a circular boundary area 807 is shown in FIGS. 8A-8B, discussed in more detail below. The boundaries of the area may correspond to a predetermined likelihood threshold of hitting the target and need not be any particular shape. That is, when the projected point of impact of the weapon 225 is directly on any point of the boundary of the area, the likelihood of the weapon 225 hitting the target may be calculated as a probability P, which may be the same for every point on the boundary of the area and is also the same as a predetermined likelihood threshold set by the targeting/firing system 210. Thus, for any shot taken when the weapon crosshairs are outside of the boundary area, the likelihood of hitting the target is less than P, and for any shot taken when the weapon crosshairs are inside of the boundary area, the likelihood of hitting the target is greater than P.
In some embodiments, the boundary area may be circular, as shown in FIGS. 8A-8B. Circular boundaries may generally apply when the determined probability P is the probability of hitting the target point. However, if the determined probability P is the probability of hitting any point on the target, then the boundary area may be target-shaped (e.g., a larger vehicle-shaped boundary surrounding the target vehicle, a larger person-shaped boundary surrounding the target person, etc.). When either the target or the weapon system 200 is currently moving, the boundary area may assume a more elongated shape in the direction of the movement, to account for the additional targeting uncertainties caused by the movement of the weapon system 200 or target. For example, for a horizontally moving target vehicle and/or horizontally moving weapon system, the boundary area may be shaped like a horizontally-elongated circle (or horizontally-elongated vehicle shape). In any of these examples, the boundary area may be defined in terms of angular coordinates (e.g., azimuth and altitude) from the perspective of the weapon 225.
The size of the boundary area determined in step 702 may be based on any combination of factors that may introduce uncertainty in the point of impact calculation of the weapon 225 with respect to the target. For instance, the size of the boundary area (e.g., in terms of angular degrees or coordinates) may be based on one or more of the target size, the distance between the weapon 225 and the target, the general accuracy and precision data for the type of weapon 225 and ammunition, and other factors such as wind, the vibration level of the weapon 225 during movement by the motor, and current movement of the weapon system 200 and/or the target. In scenarios where there is a high degree of confidence in the predictive accuracy of the weapon's crosshairs, the boundary area may be relatively small. In contrast, in scenarios of greater uncertainty in the relevant variables, where the confidence level in the predictive accuracy of the weapon's crosshairs is lower, the boundary area may be relatively large.
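One possible way to derive such a boundary from a likelihood threshold is sketched below: the angular shot error is modeled (as an assumption, not a requirement of this disclosure) as a circular Gaussian, the hit probability at a given aim offset is estimated by Monte Carlo sampling, and the boundary radius is the largest offset at which that probability still meets the predetermined threshold.

```python
import math
import random

def hit_probability(offset_mrad, target_radius_mrad, sigma_mrad, n=10000, seed=1):
    """Monte Carlo estimate of the probability that a shot aimed offset_mrad away
    from the centre of a circular target lands within the target, given Gaussian
    angular dispersion sigma_mrad. Purely an assumed error model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = offset_mrad + rng.gauss(0.0, sigma_mrad)
        y = rng.gauss(0.0, sigma_mrad)
        if math.hypot(x, y) <= target_radius_mrad:
            hits += 1
    return hits / n

def confidence_lock_radius(target_radius_mrad, sigma_mrad, threshold=0.9):
    """Find (by bisection) the aim-point offset at which the hit probability
    falls to the predetermined likelihood threshold; that offset is the radius
    of a circular 'confidence lock' boundary area."""
    lo, hi = 0.0, target_radius_mrad + 6.0 * sigma_mrad
    for _ in range(25):
        mid = 0.5 * (lo + hi)
        if hit_probability(mid, target_radius_mrad, sigma_mrad) >= threshold:
            lo = mid
        else:
            hi = mid
    return lo


# Example: 5 mrad target, 1 mrad total dispersion, 90% likelihood threshold.
print(round(confidence_lock_radius(5.0, 1.0, 0.9), 2), "mrad")
```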
In step 703, the targeting/firing system 210 engages the motor 235 to position and orient the weapon 225 toward the target point identified in step 701. Thus, step 703 may be similar or identical to step 405, discussed above. For example, referring back to FIG. 6, if the target 601 is stationary, then the engagement of the motor 235 may drive the position and orientation of the weapon 225 so that its predicted point of impact coincides with the stationary target point 606. If the target 601 is moving, then the engagement of the motor 235 may drive the position and orientation of the weapon 225 toward a separate predicted future target point (e.g., 607) determined by a firing solution calculation based on predicted target movement and anticipated time delays until firing and impact.
In step 704, at a particular point in time when the motor 235 is engaged and the weapon 225 is moving, the targeting/firing system 210 may compute the projected point of impact if a projectile were fired from the weapon 225 at that time. The projected point of impact corresponds to the calculation of the crosshairs (e.g., 505 and 605) discussed above and shown in FIGS. 5 and 6. The calculation of the projected point of impact may be based on the specifications of the weapon system 200 and/or collected sensor data, such as the current position and orientation of the gun, the distance to the target and bearing of the target from the weapon 225, the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile to be fired, the current wind and weather conditions, and gravity (which may vary based on the current elevation).
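The following is a deliberately simplified sketch of such a projected point-of-impact calculation; the single average-speed drag factor, the flat-fire drop and drift formulas, and the parameter names are assumptions made for illustration and are not the disclosed ballistic model.

```python
import math

def projected_point_of_impact(gun_azimuth_deg, gun_elevation_deg, range_m,
                              muzzle_velocity_ms, drag_factor=0.9,
                              crosswind_ms=0.0, g=9.81):
    """Very simplified sketch of a projected point of impact, expressed as the
    azimuth/elevation (degrees) at which the shot is expected to arrive at the
    target range. drag_factor is an assumed average retained fraction of the
    muzzle velocity over the flight."""
    avg_speed = muzzle_velocity_ms * drag_factor
    time_of_flight = range_m / avg_speed

    # Gravity drop and crosswind drift at the target range, converted to angles.
    drop_m = 0.5 * g * time_of_flight ** 2
    drift_m = crosswind_ms * time_of_flight
    drop_deg = math.degrees(math.atan2(drop_m, range_m))
    drift_deg = math.degrees(math.atan2(drift_m, range_m))

    return gun_azimuth_deg + drift_deg, gun_elevation_deg - drop_deg


# 1,000 m shot at 850 m/s muzzle velocity with a 3 m/s crosswind.
print(projected_point_of_impact(45.0, 2.0, 1000.0, 850.0, crosswind_ms=3.0))
```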
In step 705, the targeting/firing system 210 may compare the projected point of impact computed in step 704 to the “confidence lock” boundary area defined in step 702. This may be a straightforward comparison of angular coordinates from the perspective of the weapon 225. If the current point of impact of the weapon 225 is projected to fall outside of the defined boundary area (705: No), then in step 706 the targeting/firing system 210 may disable the operator firing mechanism 246, thereby preventing the weapon 225 from being fired. However, if the current point of impact of the weapon 225 is projected to fall within the defined boundary area (705: Yes), then in step 707 the targeting/firing system 210 may enable (or re-enable) the operator firing mechanism 246, thereby allowing the operator to fire the weapon 225.
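Steps 705-707 can be illustrated with the short sketch below, which compares the projected point of impact to an elliptical (or circular) boundary in angular coordinates and toggles the firing mechanism accordingly; the function names, the elliptical inside/outside test, and the fire-button object are illustrative assumptions.

```python
def inside_boundary(impact_az_deg, impact_el_deg, target_az_deg, target_el_deg,
                    semi_axis_az_deg, semi_axis_el_deg):
    """Sketch of the step-705 comparison: is the projected point of impact inside
    an elliptical 'confidence lock' boundary (a circle when both semi-axes are
    equal), expressed in angular coordinates from the weapon's perspective?"""
    da = (impact_az_deg - target_az_deg) / semi_axis_az_deg
    de = (impact_el_deg - target_el_deg) / semi_axis_el_deg
    return da * da + de * de <= 1.0

def update_fire_control(fire_button, impact, target, boundary):
    """Steps 706-707: disable the operator firing mechanism outside the boundary,
    re-enable it inside. fire_button is any object with an 'enabled' attribute."""
    fire_button.enabled = inside_boundary(impact[0], impact[1],
                                          target[0], target[1],
                                          boundary[0], boundary[1])
    return fire_button.enabled

class FireButton:
    enabled = False


btn = FireButton()
print(update_fire_control(btn, impact=(45.30, 2.10), target=(45.00, 2.00),
                          boundary=(0.50, 0.25)))   # True: within the boundary
```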
In some embodiments, after the operator firing mechanism 246 has been re-enabled in step 707, and the operator fires on the target, the targeting/firing system 210 may be configured to perform a rapid post-firing command movement of the weapon 225 in order to further improve shot confidence. For instance, after the operator pushes the enabled firing mechanism 246, rather than immediately firing the weapon 225, the targeting/firing system 210 in some cases may engage the motor 235 for a short amount of time (e.g., 50 ms, 100 ms, 200 ms, etc.), in response to a determination that the corresponding small weapon movement may significantly increase shot confidence. These short post-firing command movements may be performed in the case of moving targets and/or moving weapon systems 200, in the event of a sudden change in the trajectory of the target, to correct for a lag in operator reaction time, and/or as part of a firing burst to increase hit probability.
Referring briefly to FIGS. 8A and 8B, two example user interface screens 800 are shown, during a process of engaging the motor 235 of a motorized weapon system 200 to position and orient the weapon 225 at a selected target point 806. In these examples, a circular “confidence lock” boundary area 807 has been defined by the targeting/firing system 210, outside of which firing of the weapon 225 is to be disabled. As shown in FIG. 8A, when the projected point of impact 805 of the weapon 225 falls outside of the boundary area 807, the operator may be unable to fire the weapon 225 (as indicated by the shaded fire button 810). In FIG. 8B, the motor 235 has now oriented the weapon 225 closer to the target point 806, and the projected point of impact 805 now falls within the boundary area 807. Therefore, as shown in FIG. 8B, the fire button is now re-enabled, allowing the weapon 225 to be fired by the user. It is further noted in this example that the next button 815 and the safe button 820, which are discussed above in reference to FIGS. 5-6, are active and enabled regardless of the current orientation of the weapon 225.
As further shown in FIG. 7, the functionality of steps 704-707 may be performed multiple times while the motor 235 is engaged and the weapon 225 is moving toward the target point. In some embodiments, the targeting/firing system 210 may perform steps 704-707 on a continuous loop at all times while the motor 235 is engaged, or in some cases even when the motor 235 is not engaged. Additionally or alternatively, the targeting/firing system 210 may be configured to initiate an instance of steps 704-707 in accordance with a schedule (e.g., every 100 ms, 200 ms, 500 ms, etc.).
As mentioned above, these steps may be performed periodically or continuously even when the motor 235 is not moving and the crosshairs 805 are fixed on the target point 806. In these scenarios, a new event such as a change in movement of the target 801 or the weapon system 200, an object obscuring the target 801, and/or new sensor readings (e.g., a change in wind conditions) may temporarily cause the probability level of the weapon 225 hitting the target to drop below the predetermined likelihood threshold and out of the confidence lock boundary area 807, requiring a minor adjustment via the motor 235 or other corrective action by the weapon system 200.
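A scheduled version of this loop might look like the following sketch, in which steps 704-707 are re-run at a fixed period; the callables, the stub wiring, and the period are placeholders standing in for the system components described above and are assumptions, not the disclosed implementation.

```python
import time

def run_confidence_lock_loop(compute_impact, inside_boundary, set_fire_enabled,
                             period_s=0.1, max_iterations=50):
    """Sketch of the periodic steps 704-707 check: re-compute the projected point
    of impact on a fixed schedule and enable/disable the firing mechanism.
    The callables would be supplied by the surrounding system; here they are stubs."""
    for _ in range(max_iterations):
        impact = compute_impact()
        set_fire_enabled(inside_boundary(impact))
        time.sleep(period_s)


# Stub wiring for illustration only.
state = {"enabled": False, "offset": 1.0}

def fake_impact():
    state["offset"] *= 0.7          # the motor closes on the target point
    return state["offset"]

run_confidence_lock_loop(fake_impact,
                         inside_boundary=lambda off: off < 0.25,
                         set_fire_enabled=lambda e: state.update(enabled=e),
                         period_s=0.0, max_iterations=10)
print(state["enabled"])
```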
Using similar techniques to those discussed above in reference to FIG. 7, certain embodiments of a motorized weapon system 200 may implement a minimum confidence threshold for target selection and/or prioritization. In some cases, this minimum confidence threshold may be a separate determination from the level of confidence computed by the system 200 for identifying or verifying a target. Rather, this minimum confidence threshold may refer to the level of confidence that the weapon system 200 is able to hit the identified target. For example, if an identified and verified target is too far away from the weapon system 200, is moving too fast or too erratically, is too small, or is not within a sufficiently direct line-of-sight of the weapon 225, then the targeting/firing system 210 may determine that the confidence level that the weapon system 200 will hit the target is not sufficiently high to fire the weapon 225. Environmental conditions such as wind or weather conditions, lighting conditions, and/or other objects potentially obscuring the target object also may lower the confidence level computed by the targeting/firing system 210 for hitting the target. In such embodiments, when the confidence level computed by the targeting/firing system 210 falls below the predetermined threshold for a target, that target may be automatically deprioritized so that it is not selectable by the operator (or selectable only via manual override). However, the targeting/firing system 210 may continue to monitor and dynamically track the low-confidence target, and may re-enable target selection and firing capabilities on that target as soon as the confidence level of hitting the target returns to above the minimum confidence threshold. The minimum confidence threshold is another operation-specific variable that may be altered based on the operation, the particular operator, the location, and other factors.
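A minimal sketch of this minimum-confidence gate, assuming each tracked target carries a hit-confidence estimate maintained elsewhere in the system, is shown below; the field names, threshold value, and target identifiers are illustrative.

```python
def selectable_targets(tracked_targets, min_confidence=0.6):
    """Sketch of the minimum-confidence gate for target selection: targets whose
    estimated hit confidence is below the threshold are deprioritized (kept under
    track but not offered to the operator), and become selectable again as soon
    as their confidence recovers. Each target dict carries an assumed
    'hit_confidence' field maintained by the tracking/firing-solution code."""
    selectable, deprioritized = [], []
    for target in tracked_targets:
        if target["hit_confidence"] >= min_confidence:
            selectable.append(target)
        else:
            deprioritized.append(target)
    # Highest-confidence targets first among those the operator may select.
    selectable.sort(key=lambda t: t["hit_confidence"], reverse=True)
    return selectable, deprioritized


tracked = [{"id": "skiff-1", "hit_confidence": 0.91},
           {"id": "skiff-2", "hit_confidence": 0.42},   # too far / too fast for now
           {"id": "truck-3", "hit_confidence": 0.77}]
ok, held = selectable_targets(tracked)
print([t["id"] for t in ok], [t["id"] for t in held])
```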
In some embodiments, over the course of a particular operation (or multiple operations at or near the same location), the firing/targeting system 210 may continuously assess and evaluate its targeting accuracy, which may result in the system 210 increasing or decreasing the confidence levels it had previously computed for one or more selected targets. As an example, if a first target is initially determined to be too small and too far away to have a sufficiently high confidence level for firing on the target, the firing/targeting system 210 may instead select a number of closer targets and may fire on those targets. Then, by analyzing the firing trajectories and the accuracy of hitting the closer targets, the firing/targeting system 210 may be better able to evaluate the range, lighting, wind conditions, and the like, so that the confidence level for hitting the first target may then be increased based on the accuracy feedback from the closer targets.
As demonstrated in the above examples, a motorized weapon system 200 may be weapon-agnostic, in that a weapon system 200 may support many different types or models of weapons 225, including various firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other undisclosed devices. Further, the targeting/firing system 210 may store weapon profiles in data store 214 and/or weapon-specific rules in data store 213 that allow the weapon system 200 to perform the techniques discussed herein in a similar or identical manner regardless of the current weapon type. In some embodiments, the targeting/firing system 210, sensor units 240, and the operator interface 245-250 may function identically regardless of the type of motor 235, mount 230, and weapon 225 integrated into the system 200. Because systems 200 having different types of weapons 225, mounts 230, and/or motors 235 may perform differently in some respects (e.g., the time required to re-position and re-orient the weapon 225, the maximum range of the weapon, the type, size, and speed of projectiles fired, etc.), the targeting/firing system 210 may be configured to initially determine these weapon-specific data factors, and adjust the techniques described herein to provide a uniform operator experience.
For instance, the targeting/firing system 210 of a first weapon system 200 may automatically select targets based on the firing range of the weapon 225 installed on that system 200, whereas a different system 200 might select more or fewer targets as a result of having a weapon 225 with a different range. In another example, a first weapon system 200 may prioritize a set of selected targets taking into account the speed of the motor 235 on that system 200, whereas a different system 200 might prioritize the same set of targets differently as a result of having a different motor speed. As yet another example, different sensor units 240 having different numbers, types, and/or qualities of cameras and other sensors may result in different sets of input provided to the targeting/firing systems 210. As a result, a first weapon system 200 may have sufficient data to select and verify a target with high confidence, while a second weapon system 200 with different cameras/sensors 240 might not select the same target because it could not verify the target with a sufficient confidence level. In all of these examples, the different behaviors of the weapon systems 200, resulting from different weapons 225, mounts 230, motors 235, and/or sensor units 240, may be entirely transparent to the operator. In some cases, operators of weapons systems 200 need not ever know what weapon 225 they are firing, and the entire operator interface may function identically regardless of the particular weapon, motor, mount, or sensor unit. These similarities may apply to the operator interface with respect to the kill chain sequence described in reference to FIG. 4, the enabling/disabling of the operator's firing mechanism based on the confidence lock boundary area described in reference to FIG. 7, the related technique of enforcing a minimum confidence threshold for targeting/firing discussed above, and all other techniques described herein.
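One way such weapon-specific data factors might be represented and applied is sketched below; the profile fields, weapon names, and numeric values are invented for illustration and do not describe any particular weapon 225, mount 230, or motor 235.

```python
from dataclasses import dataclass

@dataclass
class WeaponProfile:
    """Assumed shape of a per-weapon profile record (e.g., in data store 214);
    the field names and example values are invented for illustration."""
    name: str
    max_range_m: float
    slew_rate_deg_s: float

def filter_targets_for_weapon(targets, profile):
    """Same selection logic for every weapon type: only the profile data changes."""
    return [t for t in targets if t["range_m"] <= profile.max_range_m]

def estimated_slew_time_s(angular_distance_deg, profile):
    """Used when prioritizing targets: a faster motor/mount means shorter slews."""
    return angular_distance_deg / profile.slew_rate_deg_s


weapon_a = WeaponProfile("weapon-A", max_range_m=1500.0, slew_rate_deg_s=60.0)
weapon_b = WeaponProfile("weapon-B", max_range_m=2500.0, slew_rate_deg_s=40.0)

targets = [{"id": "t1", "range_m": 1200.0}, {"id": "t2", "range_m": 2000.0}]
print([t["id"] for t in filter_targets_for_weapon(targets, weapon_a)],
      [t["id"] for t in filter_targets_for_weapon(targets, weapon_b)],
      round(estimated_slew_time_s(90.0, weapon_a), 2),
      round(estimated_slew_time_s(90.0, weapon_b), 2))
```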
Additional techniques applicable to the above examples include the implementation of operation-specific rules of engagement that may be retrieved/received and enforced by the targeting/firing system 210. As discussed above, specific rules of engagement and/or operational parameters for the motorized weapon system may include different requirements or parameters for target identification and selection, different minimum confidence thresholds for firing the weapon 225, different target prioritization algorithms, and so on. In some embodiments, the motorized weapon system 200 may be configured to receive a set of operation-specific rules of engagement from a remote command center via a secure communication channel, and to store and apply those operation-specific rules during the appropriate operation. As noted above, specific rules of engagement and/or sets of operational parameters may be associated with specific operators, operator rank, engagement location (e.g., country, region, etc.), and the like. In some embodiments, operators having sufficient rank and/or authorization levels may be permitted to manually override certain rules of engagement and/or operational parameters of the weapon system 200, and to apply the operator's own preferred rules/parameters in their place. Additionally or alternatively, such overrides may require outside approval, and thus upon receiving a rule/parameter override request from the operator, the weapon system may be configured to transmit a secure request for override approval to a remote command center.
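For illustration, operation-specific rules of engagement might be represented as a simple configuration record such as the following; the field names, default values, and rank-based override check are assumptions made for this sketch and are not a required format.

```python
from dataclasses import dataclass

@dataclass
class RulesOfEngagement:
    """Assumed structure for an operation-specific rules-of-engagement record
    received from a remote command center; the field names are illustrative only."""
    operation_id: str
    min_fire_confidence: float = 0.9
    warning_shot_required: bool = False
    override_rank_required: int = 5   # minimum operator rank allowed to override locally

def can_override_locally(operator_rank, roe):
    """Operators below the required rank must route override requests for
    outside approval (e.g., a secure request to the remote command center)."""
    return operator_rank >= roe.override_rank_required


roe = RulesOfEngagement(operation_id="OP-EXAMPLE-01", warning_shot_required=True)
print(can_override_locally(operator_rank=4, roe=roe))   # False: remote approval needed
```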
In several examples above, the target points for selected targets, including stationary and moving targets, are computed based on a desired point of impact location on the target (e.g., an engine of a boat or vehicle, the center of mass of an individual, etc.). However, in some embodiments, the targeting/firing system 210 may be configured with warning shot capabilities in which the desired point of impact location is not on the target. For instance, the rules of engagement enforced by the targeting/firing system 210 for a particular operation may dictate that only warning shots are to be fired at a particular selected target. Alternatively, such rules may dictate that at least one initial warning shot is to be fired at a selected target before an attempt is made to hit the target. In some cases, the operator controls 245 also may include a warning shot mode that can be activated by the operator, independent of the rules of engagement of the operation, to allow the operator to independently fire one or more warning shots on any selected target.
When the targeting/firing system 210 is configured to operate in a warning shot mode, the firing solution may be adjusted to ensure that the projectiles fired by the weapon 225 will miss the target. In some embodiments, the targeting/firing system 210 may determine the preferred location of a desired warning shot based on the type and size of the target (e.g., the number and position of warning shots for human targets may be different than for vehicle targets), the orientation and/or the direction of movement of the target (e.g., it may be desirable to fire a warning shot directly in front of the target), and so on.
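A simple sketch of a warning-shot aim-point adjustment, assuming the shot is offset ahead of the target along its direction of travel and using invented per-type standoff distances, is shown below; none of these values are specified by this disclosure.

```python
import math

def warning_shot_point(target_point, target_heading_deg, target_type,
                       lead_distance_m=None):
    """Sketch of a warning-shot aim point: place the shot a safe distance ahead
    of the target, along its direction of travel. The per-type standoff
    distances are invented example values."""
    if lead_distance_m is None:
        lead_distance_m = {"person": 10.0, "vessel": 30.0, "vehicle": 25.0}.get(
            target_type, 30.0)
    heading_rad = math.radians(target_heading_deg)
    return (target_point[0] + lead_distance_m * math.cos(heading_rad),
            target_point[1] + lead_distance_m * math.sin(heading_rad))


# Warning shot 30 m ahead of a vessel heading along the +x axis at (500, 200).
print(warning_shot_point((500.0, 200.0), 0.0, "vessel"))
```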
Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
Furthermore, embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in a memory. Memory may be implemented within the processor or external to the processor. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
Moreover, as disclosed herein, the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information. The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.
A computer system as illustrated in FIG. 9 may be incorporated as part of the previously described systems, such as to execute the client interface, perform the functionality of orchestration systems and/or datacenters, etc. FIG. 9 provides a schematic illustration of one embodiment of a computer system 900 that can perform various steps of the methods provided by various embodiments. It should be noted that FIG. 9 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 9, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
The computer system 900 is shown comprising hardware elements that can be electrically coupled via a bus 905 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 910, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like); one or more input devices 915, which can include without limitation a mouse, a keyboard, a remote control, and/or the like; and one or more output devices 920, which can include without limitation a display device, a printer, and/or the like.
The computer system 900 may further include (and/or be in communication with) one or more non-transitory storage devices 925, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
The computer system 900 might also include a communications subsystem 930, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication device, etc.), and/or the like. The communications subsystem 930 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein. In many embodiments, the computer system 900 will further comprise a working memory 935, which can include a RAM or ROM device, as described above.
The computer system 900 also can comprise software elements, shown as being currently located within the working memory 935, including an operating system 940, device drivers, executable libraries, and/or other code, such as one or more application programs 945, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
A set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the non-transitory storage device(s) 925 described above. In some cases, the storage medium might be incorporated within a computer system, such as computer system 900. In other embodiments, the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer system 900, and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 900 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
As mentioned above, in one aspect, some embodiments may employ a computer system (such as the computer system 900) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 900 in response to processor 910 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 940 and/or other code, such as an application program 945) contained in the working memory 935. Such instructions may be read into the working memory 935 from another computer-readable medium, such as one or more of the non-transitory storage device(s) 925. Merely by way of example, execution of the sequences of instructions contained in the working memory 935 might cause the processor(s) 910 to perform one or more procedures of the methods described herein.
The terms “machine-readable medium,” “computer-readable storage medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. These media may be non-transitory. In an embodiment implemented using the computer system 900, various computer-readable media might be involved in providing instructions/code to processor(s) 910 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media include, for example, optical and/or magnetic disks, such as the non-transitory storage device(s) 925. Volatile media include, without limitation, dynamic memory, such as the working memory 935.
Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, any other physical medium with patterns of marks, a RAM, a PROM, EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 910 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 900.
The communications subsystem 930 (and/or components thereof) generally will receive signals, and the bus 905 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 935, from which the processor(s) 910 retrieves and executes the instructions. The instructions received by the working memory 935 may optionally be stored on a non-transitory storage device 925 either before or after execution by the processor(s) 910.
It should further be understood that the components of computer system 900 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 900 may be similarly distributed. As such, computer system 900 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 900 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered.