This application is a divisional application of the parent application filed on August 16, 2013, with application number 201380044058.X, entitled "Flying camera with rope assembly for positioning and interaction".
Disclosure of Invention
In accordance with the present invention, the limitations of previous methods for aerial imaging have been substantially reduced or eliminated. In particular, it is an object of the present invention to provide improved systems and methods for position and attitude stabilization of flying bodies (volitant bodies), including flying cameras, using a tether assembly. Furthermore, it is an object of the invention to provide improved user interaction for a flying body comprising a flying camera.
In the context of the present invention, such user interaction typically enables a user's intent to be translated into an action or a change in the operation of a flying body. This is usually achieved by a specific physical action of the user (e.g., pulling a string) or by a specific physical action of the flying body (e.g., pulling back on the same string). In the context of the present invention, user interactions include, but are not limited to: a force applied by the user or by the flying body or both, user activation of a switch or button, a visual or audible signal from the user or from the flying body, or pre-coded sequences and patterns thereof.
According to a first aspect of the present invention, there is provided an apparatus comprising: a flying body comprising at least one actuator; a control unit for controlling the actuator; and a mechanical device for operatively connecting the flying body to a reference point remote from the flying body.
The flying body preferably comprises means for flying.
Preferably, at least one actuator is configured to drive the means for flying.
Preferably, the flying body further comprises a camera.
The operable connection may be of a mechanical nature.
Preferably, the mechanical device mechanically connects the flying body to a reference point.
The mechanical device may be operatively connected to the control unit.
The apparatus may comprise an evaluation unit operable to provide data indicative of at least one of: (a) an attitude, or (b) a position, of the flying body relative to the reference point, and wherein the control unit is configured to control the at least one actuator based on the data.
In the present application, pose means the complete definition of the rotation of a rigid body with respect to an inertial coordinate system, i.e., the orientation of said body in all three dimensions. An exemplary inertial coordinate system is the commonly used local gravity-aligned north-east-down coordinate system.
Preferably, the evaluation unit is comprised in the mechanical device. The data provided by the evaluation unit is preferably based on mechanical forces applied to the mechanical device.
Preferably, the evaluation unit is positioned on the flying body.
The apparatus may further include: a sensing unit operable to provide data representative of mechanical force applied to the mechanical device to the evaluation unit. Preferably, the mechanical device comprises the sensing unit.
Preferably, the sensing unit is positioned on the flying body.
The sensing unit may be mechanically connected to the flying body.
The apparatus may include: a force sensor for determining the mechanical force, operatively connected to the sensing unit. Preferably, the mechanical device comprises the force sensor.
Preferably, the force sensor is positioned on the flying body.
The apparatus may include a sensor for providing data representative of at least one of: acceleration, attitude, or rate of rotation of the flying body, the sensor being operatively connected to the sensing unit. The apparatus may comprise a sensor for providing data indicative of the position of the flying body relative to the reference point, the sensor being operatively connected to the sensing unit.
The apparatus may comprise a memory unit operatively connected to the evaluation unit, in which first data relating to properties of the mechanical device and second data relating to properties of the flying body are stored, wherein the evaluation unit is configured to perform an evaluation using at least one of the first or second data, in order to provide the data representing at least one of: (a) an attitude, or (b) a position, of the flying body relative to the reference point.
The apparatus may further comprise active safety means and/or passive safety means.
The apparatus may further comprise active user interaction means and/or passive user interaction means.
Preferably, the mechanical device defines a user interaction means.
According to another aspect of the invention there is provided the use of any of the above-mentioned apparatuses as a communication channel based on mechanical forces applied to the mechanical device.
According to another aspect of the present invention there is provided the use of any of the above-mentioned apparatuses for aerial imaging.
According to another aspect of the present invention there is provided a method for operating any of the above-mentioned apparatuses, the method comprising the steps of: controlling the at least one actuator to cause the flying body to fly away from the reference point.
Preferably, the step of controlling the at least one actuator is accomplished using mechanical means.
The method may comprise the additional steps of:
using the evaluation unit to provide data representing at least one of: (a) an attitude, or (b) a position, of the flying body relative to the reference point; and
providing the data to the control unit, and wherein the control unit performs control of the at least one actuator based on the result of the data evaluation.
The method may comprise the steps of:
sensing a mechanical force applied to the mechanical device; providing data representative of mechanical forces applied to the mechanical device to the evaluation unit, and wherein the evaluation unit performs the step of: using the data representative of mechanical forces applied to the mechanical device to provide data representative of at least one of: (a) an attitude, or (b) a position, of the flying body relative to the reference point.
The method may further comprise the steps of:
recording first data relating to properties of said mechanical device, and
recording second data relating to properties of said flying body, and
performing an evaluation at an evaluation unit using at least one of the first or second data in order to provide the data representing at least one of: (a) an attitude, or (b) a position, of the flying body relative to the reference point.
The method may further comprise the steps of:
sensing a mechanical force applied to the mechanical device, and
evaluating the sensed mechanical force to provide an evaluation result, and
providing the evaluation result to the control unit, wherein the step of controlling the at least one actuator is performed based on the evaluation result.
Preferably, the mechanical force is applied to the mechanical device at the reference point.
Preferably, the evaluation result may include data representing at least one of: a magnitude of a force, a change in magnitude of a force, a direction of a force, and/or a sequence of forces.
At least one of the evaluating or controlling steps may comprise the step of: a user interacting with the mechanical device.
Preferably, the step of the user interacting with the mechanical device comprises: the user applying one or more forces to the mechanical device, or the user reducing a force applied to the mechanical device.
The step of sensing the mechanical force may comprise sensing at least one of: direction, magnitude, or timing of the mechanical force.
The method may comprise the steps of:
communicating with the reference point by applying one or more forces to the mechanical device, wherein the at least one actuator is controlled by a control unit to cause the flying body to apply the one or more forces to the mechanical device.
The step of controlling the at least one actuator may comprise at least one of: performing an emergency maneuver, performing an active maneuver, detecting continuous user input, or detecting concurrent user input.
According to another aspect of the invention, there is provided a method, implemented in an evaluation unit, of calculating at least one of the attitude or the relative position of a flying body with respect to a reference point. The method may include a first step of receiving first data from an inertial sensor mounted on the flying body. Optionally, the method may comprise a second step of retrieving second data from a memory unit comprising previous data or calculations. The method may comprise the step of using the first and/or second data to: (a) predict the instantaneous movement of the flying body, and (b) calculate the approximate position of the flying body relative to a reference point and/or the attitude of the flying body. Optionally, prior knowledge about the dynamics of the device may be used to improve this prediction and calculation. Additional sensors, such as a rope length sensor, pressure sensor, distance sensor, or other sensors, may be used to further enhance the measurements in a computationally rigorous manner. The method may comprise the step of mathematically combining the instantaneous motion prediction and the approximate measurement to produce at least one of an estimate of the flying body's attitude and/or of its position relative to a reference point.
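As a concrete illustration of this aspect, the following minimal Python sketch shows one way an evaluation unit could combine an inertial motion prediction with an approximate rope-based measurement via a simple complementary filter. The class layout, the fixed blend gain, and the direct use of the gyro rate as the rope-angle rate are illustrative assumptions, not the patented implementation.

```python
import math

class EvaluationUnit:
    """Minimal sketch: fuse an inertial prediction of the rope angle with an
    approximate rope-based measurement via a complementary filter."""

    def __init__(self, rope_length, blend_gain=0.02):
        self.rope_length = rope_length  # assumed known (e.g., from memory unit)
        self.alpha = 0.0                # estimated rope angle (rad)
        self.blend_gain = blend_gain    # fixed complementary-filter gain

    def predict(self, gyro_rate, dt):
        # First step: propagate the previous estimate using inertial data
        # (using the body rate directly as the rope-angle rate is a
        # simplification; prior knowledge of the dynamics could refine this).
        self.alpha += gyro_rate * dt

    def correct(self, alpha_measured):
        # Blend in the approximate measurement (e.g., derived from a rope
        # angle or force sensor) to bound the drift of the prediction.
        self.alpha += self.blend_gain * (alpha_measured - self.alpha)

    def relative_position(self):
        # 2D trigonometric relation between rope length and rope angle.
        x = self.rope_length * math.sin(self.alpha)
        z = self.rope_length * math.cos(self.alpha)
        return x, z
```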
The above evaluation unit provides technical advantages for particular embodiments of the present invention, including allowing for the elimination of dependencies on any external infrastructure, such as visible or wireless electromagnetic beacons or GPS systems.
In addition, the evaluation unit in combination with the mechanical device enables further performance enhancing features, such as continuous calibration of actuators or flight parameters or estimation of external forces (such as wind).
According to another aspect of the present invention there is provided a method for stable or controlled flight of a flying body, wherein the flying body comprises a control unit operatively connected to the flying body and at least one actuator configured to drive a means for flying. The method may include receiving, at the control unit, data from the evaluation unit, wherein the data specifies at least one of: the attitude of the flying body, the instantaneous motion of the flying body, or the position of the flying body relative to a reference point. The method may include operating the control unit to access the memory unit to retrieve stored information. Preferably, the retrieved information may include actuator calibrations, flight parameters, predefined control behaviors, or trajectories. The method may comprise the steps of operating the control unit to calculate appropriate commands for the at least one actuator and sending the commands to the actuator(s). Furthermore, the control unit may return the actuator commands to the evaluation unit.
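A minimal sketch of one such control-unit cycle, assuming dictionary-valued inputs and a simple proportional law (all names and gains are hypothetical):

```python
def control_step(state, params, target):
    """One control-unit cycle: given the evaluation unit's state estimate
    and information retrieved from the memory unit, compute actuator
    commands (here a simple proportional law on position error)."""
    thrust = params["hover_thrust"] + params["kp_z"] * (target["z"] - state["z"])
    tilt = params["kp_x"] * (target["x"] - state["x"])
    # The commands are sent to the actuators and may also be returned to
    # the evaluation unit to improve its motion prediction.
    return {"thrust": thrust, "tilt": tilt}

# Example: hold position 1 m to the side of and 2 m above the reference point.
commands = control_step(
    state={"x": 0.8, "z": 1.9},
    params={"hover_thrust": 9.81, "kp_z": 2.0, "kp_x": 0.5},
    target={"x": 1.0, "z": 2.0},
)
```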
Preferably, the flying body is a flying body of an apparatus according to any one of the above-mentioned apparatuses.
Combining the method implemented in the control unit with the methods implemented in the evaluation, sensing, and/or memory units, and with the constraints imposed by the mechanical device, provides technical advantages. These advantages include favorable system stability, better robustness against various influences (such as wind gusts), and better robustness against rapid movement of the reference point. The combination allows the flying body to maintain controlled flight even under challenging conditions, such as when attached to a fast-moving reference point (such as a skier, a boat, or a car).
According to another aspect of the present invention there is provided a method for interacting with a flying body, wherein the flying body comprises:
a control unit operatively connected to the flying body;
and mechanical means for operatively connecting said flying body to a reference point remote from said flying body, said method comprising the steps of:
receiving, at the control unit, data relating to at least one of: the direction, magnitude, or timing of a force applied to the mechanical device,
wherein the control unit performs the step of: controlling the flying body such that the flying body performs a predefined maneuver, wherein the predefined maneuver is associated with the received data. Preferably, the flying body is a flying body of an apparatus according to any one of the above-mentioned apparatuses.
According to yet another aspect of the present invention there is provided a method for interacting with a flying body, wherein the flying body comprises:
a control unit operatively connected to the flying body;
an evaluation unit operatively connected to the flying body; and
mechanical apparatus for operatively connecting said flying body to a reference point remote from said flying body, said method comprising the steps of:
receiving, at the evaluation unit, data relating to at least one of: the direction, magnitude, or timing of a force applied to the mechanical device, wherein
the evaluation unit evaluates the data, in combination with data from the memory unit, to computationally detect a specific sequence or pattern of at least one of the direction, magnitude, or timing of the force.
The evaluation unit may simultaneously command the control unit to alter its operation in order to provide tactile feedback to the user, e.g. during user interaction.
The evaluation result of the evaluation unit may be transmitted to the control unit for optionally performing at least one of the following: a change in the internal state of the flying body, execution of a specific action, or a change in the current operation mode. Preferably, the flying body is a flying body of an apparatus according to any one of the above-mentioned apparatuses.
Combining the evaluation unit, the control unit, and the mechanical device thereby allows a controlled, user-interactive flight of the flying body without the need for radio communication or complex configuration and programming. The flying body can be intuitively launched and operated for simple tasks, such as aerial photography, by natural user gestures applied to the rope. Furthermore, the combination enables a new communication channel by which the flying body can convey information back to the user, for example by applying a particular pattern or sequence of forces on the rope under particular conditions, such as a particular battery level or the completion of a given task (such as a panoramic photography survey).
A technical advantage of particular embodiments of the present invention may even allow inexperienced users of all ages to safely capture images from a wide range of viewpoints without the need for specialized support elements. For example, the present invention may allow minimizing or eliminating the risks inherent in current aerial imaging due to collisions, mechanical or electrical failures, electronic failures, operator error, or harmful environmental conditions (such as wind or turbulence).
Other technical advantages of particular embodiments of the present invention may allow for easier operation in a wide variety of operating conditions and environments. This may even allow inexperienced users, young users, or automated/semi-automated/user-controlled base stations to perform tasks currently performed by experienced human pilots with manned and unmanned aircraft. The need for human pilots severely limits the cost effectiveness, possible operating conditions, and flight durability of aircraft in many applications. For example, even an experienced human pilot cannot ensure safe and efficient control in many real-world operating conditions (including wind and turbulence).
Still other technical advantages of particular embodiments of the present invention may allow it to be tailored to the specific needs of various applications in various contexts. Exemplary applications include: community hobbyist platforms, such as DIY drones; research platforms for groups that actively research flight platforms or use them as part of their curriculum; military applications with various requirements, such as survivability, energy autonomy, detectability, or operation under extreme conditions (weather, lighting conditions, pollution); toys, such as small flying vehicles; entertainment, including stage performances choreographed to music and light settings, or theater performances requiring interaction with actors; leisure uses similar to kite flying; industrial or public service applications (e.g., surveillance and monitoring of industrial sites, photogrammetry, surveys); professional aerial photography or cinematography; or inspection and monitoring of civil infrastructure, which may involve dangerous or repetitive tasks. In particular, certain technical advantages allow the present invention to be equipped with a wide variety of sensors. For example, infrared sensors allow embodiments to detect dry patches in an orchard or to monitor crops.
Other technical advantages of the present invention will be readily apparent to one skilled in the art from the following figures, descriptions, and claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some, or none of the enumerated advantages.
Detailed Description
For the purposes of promoting an understanding of the invention, it is described hereinafter with reference to a series of specific exemplary embodiments. It will be appreciated, however, that the principles underlying the invention are not limited to these specific embodiments. Rather, these principles can be combined with many systems to position, stabilize, interact with, and control a flying body using a mechanical device.
Furthermore, various features of the present invention are described in the context of embodiments thereof as a flying camera and a sample application of aerial imaging. These features are equally applicable to other types of flying bodies, other sensors, and other applications. Thus, certain aspects of the embodiments described hereinafter should in no way be viewed as limiting the applicability of the invention to these flying bodies, these sensors, or any particular application.
Overview and exemplary usage patterns
FIG. 1 illustrates an example of a typical usage scenario for the invention described herein. User 108 holds the user end of the mechanical device (here a simple rope assembly, consisting of a single rope 106). The rope 106 is connected to a flying body (here, the flying camera 102) that uses a typical helicopter configuration to lift and stabilize itself: the propulsion/stabilization system 104 comprises a main, swashplate-equipped rotary wing for lift and lateral stabilization and a small auxiliary tail rotor for yaw stabilization.
The tether 106 enables the flying camera 102 to perform precise positioning and stabilization with respect to a reference point 126 (here, proximate the user 108). Further, the tether 106 enables the user 108 to communicate with the flying camera 102 (and vice versa). Such user interaction may, for example, include pulling the cord in a desired direction. This interaction may be accomplished, for example, by using directional force sensors 124 mounted on the flight camera 102 and attached to the tether 106 and a flight module 128 (including, for example, a sensing unit, an evaluation unit, a memory unit, or a control unit) attached to the body of the flight camera 102.
For convenience in explaining the interaction patterns with the flying camera 102, the application in FIG. 1 is understood to be aerial imaging: that is, the user's end goal is to collect images from a desired viewpoint (typically from an elevated position), as implemented by the flying camera 102 equipped with a camera 110.
Positioning and stabilization
The present invention substantially reduces or eliminates the problem of recovering the attitude and position of the flying camera 102 relative to the global coordinate system 114 and the reference point 126, respectively. For clarity, the following explanation discusses the two-dimensional (2D) case. However, the invention extends to and can be used for three spatial dimensions (3D).
The primary forces acting on the center of mass 122 of the flying camera 102 are the gravitational force F_g, the thrust force F_p generated by the rotary wing, and the rope force F_s exerted by the user 108 pulling on the rope. Angle α 116 denotes the angle between the user 108 and the flying camera 102. Angle β 118 denotes the angle between the gravity vector F_g and the "up" direction z_b of the flying camera 102.
On a typical flying body (such as an aircraft) that is not attached to the ground, inertial sensors cannot provide absolute measurements of the angles α 116 and β 118 to first order. Second-order effects such as aerodynamic drag may enable such measurements under some conditions, but provide no angular information in hovering flight. The present invention allows this limitation to be overcome.
This can be achieved as follows. The flying camera 102 is equipped with sensors (e.g., a magnetometer, an accelerometer, a gyroscope) providing data indicative of at least one of the acceleration, attitude, or rate of rotation of the flying camera 102, and is connected to the fixed coordinate system 114 using the rope 106. Assuming a taut rope 106, accurate sensor measurements, and known geometry of the flying camera 102, one can construct a mathematical model of the flying setup. Specifically, one exploits the following knowledge: the forces 120 acting on the center of mass 122 of the flying camera 102 and the mechanical constraint imposed by the taut rope 106 generate an algebraic relationship between α 116, β 118, F_p, and F_g (as shown in FIG. 1). The resulting accelerometer measurements a_x and a_z, aligned with the body reference coordinate system (x_b, z_b), are:
a_x = −F_s · sin(α − β)
a_z = F_p − F_s · cos(α − β)
Because the rope is taut, the forces along the rope direction must balance the centripetal acceleration l·α̇², where l is the rope length.
The above mathematical model describes the measured specific forces acting on the body as a function of α 116, β 118, F_p, and F_g. The mathematical model is then inverted, in closed form or numerically, for example by sampling the model at regular intervals. The resulting indirect observation of the F_g vector by the inertial sensors provides an absolute measurement of the gravity vector and, therefore, of the angle β 118. In other words, the above method allows the evaluation unit to determine the gravity direction based on the physical constraints imposed by the rope and on the inertial sensors, which in turn allows the control unit to stabilize the attitude of the flying camera 102 and to keep it airborne and controllable.
In one approach, we assume the above-mentioned centripetal term is negligible. Given the nominal F_p and the accelerometer measurements a_x and a_z, we can recover the tension F_s and the two angles:

F_s = √(a_x² + (F_p − a_z)²)
β = α − asin(−a_x / F_s)
Note that there is a sign ambiguity in this calculation; the evaluation unit may resolve this ambiguity by referring to previous estimates of α and β, and also by using other sensors (e.g., angular rate or rope angle sensors) to provide an a priori estimate at a given time.
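Under these simplifications, the recovery can be sketched in a few lines of Python. The sketch below is illustrative only: it additionally assumes near-hover flight (so that the specific force approximates the negated gravity direction in the body frame, yielding β directly), and the function name and disambiguation strategy are assumptions rather than the patented implementation.

```python
import math

F_G = 9.81  # gravitational specific force F_g (mass-normalized), assumed known

def recover_tension_and_angles(a_x, a_z, f_p, alpha_prev):
    """Recover the rope tension F_s and the angles alpha/beta from body-frame
    accelerometer readings, assuming a taut rope, a negligible centripetal
    term, and near-hover flight (illustrative simplifications)."""
    # Tension follows algebraically from the two measurement equations:
    #   a_x = -F_s * sin(alpha - beta),  a_z = F_p - F_s * cos(alpha - beta)
    f_s = math.hypot(a_x, f_p - a_z)

    # Near hover, the specific force approximates the negated gravity
    # direction in the body frame, yielding an absolute measurement of beta.
    beta = math.asin(max(-1.0, min(1.0, a_x / F_G)))

    if f_s < 1e-6:            # slack rope: model assumptions are invalid
        return f_s, alpha_prev, beta

    # Rope-to-body angle; asin leaves a sign ambiguity.
    delta = math.asin(max(-1.0, min(1.0, -a_x / f_s)))
    alpha = beta + delta

    # Resolve the asin ambiguity by preferring the candidate closest to the
    # previous estimate (rate or rope-angle sensors could also be used).
    alpha_mirror = beta + math.copysign(math.pi, delta) - delta
    if abs(alpha_mirror - alpha_prev) < abs(alpha - alpha_prev):
        alpha = alpha_mirror
    return f_s, alpha, beta
```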
When combined with additional information, such as the rope length, the altitude from a barometric altimeter, or the aircraft attitude, this technique enables the evaluation unit to recover the relative position between the reference point 126 and the flying camera 102 by exploiting a simple trigonometric relationship between length and angle (see FIG. 1 for the 2D case). This may be used by the control unit, for example, for controlling the flying camera 102, aiming the camera 110, or a combination thereof.
The improvements of the present invention allow for more precise positioning if a force or direction sensor 124, attaching the rope 106 to the flying camera 102, is installed. In particular, such a sensor is able to provide data to be used by the evaluation unit in order to provide both a more reliable estimation of α 116 and β 118 and improved user interaction with the flight platform. More specifically, if the height of the flying camera 102 or the length of the rope 106 is known (e.g., stored in a memory unit), or is unknown but remains fixed, these angle estimates may be used by the evaluation unit to generate attitude and/or position estimates. In particular, the position and/or pose may be partial (e.g., a position on a circle or along a line in a particular direction; a direction in a plane or a tilt along a single axis) or a full 3D position and/or pose. These estimates may allow the control unit to actively stabilize the attitude and/or position of the flying camera 102 relative to the reference point 126. This is achieved by using the trigonometric relationship of length and angle in 3D (see FIG. 1 for the 2D case).
Further, the pose estimation techniques disclosed herein may be combined with other measurements such as from a satellite-based global positioning system, a beacon-based position and attitude measurement system, or others. For example, data from additional yaw attitude sensors (such as magnetometers) mounted on the flight camera 102 may be used to allow the evaluation unit to generate full 3D position and attitude estimates. This is achieved by using a trigonometric relationship of length and angle, including the angle of the aircraft attitude relative to the user or another reference coordinate system, such as 3D GPS coordinates. This may be particularly useful in applications that require the flying camera to face a given direction independent of the user's movement.
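As a concrete illustration of this trigonometric relationship, a minimal sketch follows (a hypothetical helper, assuming a taut rope of known length and a yaw angle from, e.g., a magnetometer):

```python
import math

def relative_position_3d(rope_length, alpha, yaw):
    """3D position of the flying camera relative to the reference point,
    treating the taut rope as the radius of a sphere: alpha is the rope's
    inclination from vertical, yaw the heading angle (e.g., from the
    magnetometer)."""
    horizontal = rope_length * math.sin(alpha)
    return (horizontal * math.cos(yaw),     # x in the reference frame
            horizontal * math.sin(yaw),     # y in the reference frame
            rope_length * math.cos(alpha))  # z: height above the reference point
```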
The method described above is not limited to a particular flight configuration, such as the helicopter configuration shown in fig. 1. The analysis resulting in the restoration of the angles α 116 and β 118 may be shown to have good reliability under appropriate operating conditions, meaning that the described invention is still usable under wind conditions or when the reference point 126 is moving (such as being held by a moving person or attached to a moving vehicle or to a moving base). In addition, although sensing is explained with reference to inertial sensors, the methods described above are equally applicable when using other sensors that provide data indicative of attitude or position. Furthermore, the method is not limited to the use of a single rope and many other mechanical devices are possible.
Fig. 2 shows a block diagram of an exemplary control method that may be used to stabilize the flying camera 102. During operation of the control method, a numerical method ("state estimator" 204, typically implemented in an evaluation unit) is used to form an estimate of the state of the flying camera 102 from measurements of sensors that provide data indicative of at least one of: the acceleration, attitude, or rotation rate of the flying camera 102; optionally, a force sensor at the rope attachment point 124 provides measurements of the force acting on it. Further, a memory unit may be used. Depending on the particular requirements and use cases, methods known in the art that can be used to form the estimate include one or more of a Luenberger observer, a Kalman filter, an extended Kalman filter, an unscented Kalman filter, and a particle filter, and combinations thereof. Suitable on-board sensor combinations (typically processed in the evaluation unit) may also provide reliability against invalid model assumptions, such as a temporarily slack rope, by providing a valid short-term estimate and allowing emergency procedures to be activated if necessary.
Based on the state estimate 204, the feedback control methods 206, 208 (typically implemented in a control unit) provide control inputs to at least one actuator 210, and depending on the embodiment of the invention, the actuator 210 may be comprised of a rotary wing, a swash plate, a control surface, or other device for applying force and/or torque to the flight camera 102. The feedback control is designed to control the position 206 and attitude 208 as described by the body coordinate system 112, and may be composed of parallel or cascaded feedback control loops. Depending on the particular requirements and use cases, methods that may be used to calculate flight actuator control signals include linear or non-linear state feedback, model predictive control, and fuzzy control.
As an exemplary embodiment, consider an aircraft operating in a vertical 2D plane that receives a nominal overall thrust F_p and a desired angular rate. One possible control law for maintaining a desired rope angle α* is a cascade controller, in which the desired rope angular acceleration

α̈* = −(2·ζ_s/τ_s)·α̇ − (1/τ_s²)·(α − α*)

is first calculated, where τ_s and ζ_s are tuning factors corresponding to the desired time constant and the desired damping of the closed-loop rope-angle system. The desired rope angular acceleration is then transformed into the desired aircraft angle β*, which in turn generates the commanded angular rate

β̇* = (β* − β)/τ_v

where τ_v is likewise a tuning parameter.
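A minimal Python sketch of such a cascade law, based on the relations reconstructed above (the gains, signs, and the simplified model inversion in the middle step are illustrative assumptions):

```python
import math

def cascade_rope_angle_control(alpha, alpha_rate, alpha_des, beta, f_p,
                               tau_s=0.8, zeta_s=0.9, tau_v=0.3, rope_len=5.0):
    """Cascade law sketch: rope-angle error -> desired rope angular
    acceleration -> desired aircraft angle -> commanded angular rate."""
    # Outer loop: shape the closed-loop rope angle as a second-order system
    # with time constant tau_s and damping zeta_s.
    alpha_acc_des = (-(2.0 * zeta_s / tau_s) * alpha_rate
                     - (1.0 / tau_s ** 2) * (alpha - alpha_des))

    # Middle step: invert a simplified tangential pendulum model to get the
    # aircraft tilt producing that acceleration (gravity's tangential
    # component is neglected in this simplified inversion).
    beta_des = alpha + math.asin(
        max(-1.0, min(1.0, rope_len * alpha_acc_des / f_p)))

    # Inner loop: first-order convergence of the aircraft angle with time
    # constant tau_v; the result is the commanded angular rate.
    return (beta_des - beta) / tau_v
```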
The state estimate 204 (specifically, the recovered F_s value) may also be used as a communication channel or to detect user interaction with the flying camera. The user command detection 212 matches the state information (e.g., aircraft in flight, aircraft taking a picture) against predefined or computationally learned interaction patterns (e.g., three quick pulls on the rope, or two strong pulls followed by a long pull sideways). Based on the detected pattern, the feedback control methods 206, 208 (typically implemented in the control unit) are provided with data representing the command given by the user 108. The commands are also forwarded to the camera control system 214, thus allowing the user 108 to independently control the behavior of the flying camera 102 and of the camera 110. The user interaction system implemented in the user command detection system 212 is further detailed in the following sections.
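As an illustration, a detector for the "three quick pulls" pattern mentioned above could be sketched as follows; the threshold, timing values, and function name are illustrative assumptions:

```python
def detect_triple_pull(tension_samples, dt, threshold=2.0, window=1.5):
    """Detect 'three quick pulls': three rising crossings of a tension
    threshold within a short time window.

    tension_samples -- recovered F_s values, one per control cycle
    dt              -- sampling period in seconds
    """
    crossing_times = [
        i * dt
        for i in range(1, len(tension_samples))
        if tension_samples[i - 1] < threshold <= tension_samples[i]
    ]
    # Any three consecutive crossings inside the window count as a match.
    return any(crossing_times[i + 2] - crossing_times[i] <= window
               for i in range(len(crossing_times) - 2))
```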
FIG. 3 shows a block diagram of flight module 128 and portions thereof, including control unit 302, evaluation unit 304, sensing unit 306, and memory unit 308. The flight module receives sensor information as input and provides data to the actuators.
Depending on the specific requirements and use cases, a plurality of evaluation units, sensing units, memory units and control units may be used. Similarly, multiple steps (e.g., both steps related to data representing position and/or pose and steps related to data representing user interaction) may be performed in a single unit.
The sensing units are used to process sensor information. For example, they may process information received from sensors such as accelerometers, gyroscopes, magnetometers, barometers, thermometers, hygrometers, bumpers, chemical sensors, electromagnetic sensors, or microphones (all not shown). For example, a sensing unit may process the information to extract data about the forces caused by the mechanical connection of the flying body (such as a flying camera) to a reference point. Such forces may be caused, for example, by the aircraft straining against the rope, by a user tugging on the rope, or by disturbances such as those caused by wind or a motor failure. The sensing unit may also be used to detect patterns in data representing mechanical forces applied to a mechanical device (such as a rope) in order to implement a communication channel using the mechanical device. In addition, the sensing unit may also process information from one or more cameras 110, 608 on the flying camera 102.
The evaluation units are used to evaluate data. For example, they may evaluate data representing relative or absolute position, in particular data from GPS sensors, visual odometry/SLAM systems, retro-reflective positioning systems, laser range finders, WiFi positioning systems, barometric altimeters and variometers, or ultrasonic sensors (all not shown). For example, they may use the data representative of mechanical forces provided by the sensing unit to infer data representative of the position and attitude of a flying body (such as a flying camera) relative to a reference point.
Memory units are used to store data. For example, they may be used to store data regarding past sensor readings, operating conditions, or user commands, as well as properties of the flying camera 102 and the rope 106.
The control unit is used to control the actuator. For example, they may allow for active (and passive) self-stabilization by generating control outputs for flight actuators (e.g., the air propulsion system 104) as well as other actuators (e.g., the camera 110 zoom/pan/tilt motors). These control outputs may be generated independently of the data representing the position and the pose provided by the evaluation unit.
User interaction
Direct measurement, via the optional force sensor 124, of the force F_s applied by the rope 106 may enable the evaluation unit 304 to detect user interaction with the flying camera 102 via physical forces applied to the rope 106. Optionally, to allow communication from the flying camera 102 to the user, the control unit 302 may be configured to allow user interaction by controlling at least one actuator of the flying camera 102.
Fig. 4 shows a flow diagram of an exemplary user interaction process with the flying camera 102 for a sample application of aerial imaging. After the power switch is turned on, the flying camera 102 performs a self-check 402. This check is designed to ensure proper, safe operation of all elements and actuators of the flying camera 102 before it is flown. For example, it may include checks that all components have been properly initialized, that the flying camera responds to rope commands, or that the battery level is sufficient for flight. In addition, self-check 402 may include a calibration step, such as positioning the device on a horizontal surface to calibrate internal inertial sensors or adjust the white balance of the attached camera 110. Self-check 402 may also include other checks, such as suspending the device by the rope 106 to determine whether it has the proper weight and weight balance with respect to its center of mass 122 and rope attachment point 124, or to determine whether the position of the camera 110 or battery pack (not shown) needs to be adjusted.
After a successful self-check 402, the flying camera is ready for user release. Depending on its size, weight, passive and active safety features, or its sensing capability, user release 404 may be performed in various ways. For example, a user may first hold the flying camera 102 above his head, then command it to spin up its propellers 104 by shaking it twice (typically detected by the evaluation unit 304), and then release the handle 602 with the integrated safety switch 604. In other cases, the flying camera 102 may be activated or operated from a base station (not shown) placed on the ground or held in the user's hand, or it may be activated by simply throwing it more than two meters into the air and allowing it to stabilize automatically.
The flight camera 102 then enters a hover mode 406 in which the control unit 302 stabilizes its position by sending appropriate commands to the actuators in dependence on the data received from the evaluation unit 304. Stabilization may be relative to a reference point, a line, air, ground, or a combination thereof.
In the exemplary flow diagram shown in fig. 4, the flying camera is then ready to perform maneuvers based on user commands. Such a maneuver may be triggered by detecting the representative data using the sensing unit 306 and processing the data in the evaluation unit 304, optionally together with data from the memory unit 308.
A maneuver is a sequence of desired aircraft positions, attitudes, and actions, and is typically stored in the memory unit 308. A maneuver is typically performed by sending appropriate commands to the actuators in dependence on the data received from the evaluation unit 304. Simple examples include "take a picture", "move in the direction of the pull", or "land". However, maneuvers may also comprise more complex routines, such as "take a panoramic photograph", which involves controlling the aircraft through a sequence of positions and taking a sequence of images. Unlike the example shown in FIG. 4, a maneuver may also comprise operations performed entirely without user input. For example, a user may select a setting on the user interface that causes the flying camera to perform a maneuver involving: spontaneous takeoff, flying to a set point relative to reference point 126, taking a series of photographs, and finally returning to the takeoff position for spontaneous landing.
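As an illustration, one simple way such a maneuver could be stored in the memory unit is as an ordered list of setpoints and actions; the structure and field names below are illustrative assumptions, not the patent's format:

```python
# A "take panoramic photograph" maneuver: an ordered sequence of desired
# positions relative to the reference point, yaw attitudes, and actions.
PANORAMA_MANEUVER = [
    {"position": (0.0, 0.0, 5.0), "yaw_deg": 0,   "action": "take_photo"},
    {"position": (0.0, 0.0, 5.0), "yaw_deg": 90,  "action": "take_photo"},
    {"position": (0.0, 0.0, 5.0), "yaw_deg": 180, "action": "take_photo"},
    {"position": (0.0, 0.0, 5.0), "yaw_deg": 270, "action": "take_photo"},
    {"position": (0.0, 0.0, 0.0), "yaw_deg": 0,   "action": "land"},
]

def execute_maneuver(maneuver, goto, act):
    """Step through a stored maneuver: fly to each setpoint, then perform
    the associated action (goto/act are callbacks supplied by the control
    and camera systems)."""
    for step in maneuver:
        goto(step["position"], step["yaw_deg"])
        act(step["action"])

# Example wiring with placeholder callbacks:
# execute_maneuver(PANORAMA_MANEUVER,
#                  goto=lambda pos, yaw: print("fly to", pos, "yaw", yaw),
#                  act=lambda action: print("perform", action))
```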
Once the user issues a landing command 410, which is typically detected by the evaluation unit 304, the control unit 302 commands the actuators to cause the flying camera 102 to execute the landing maneuver 412 and exit the procedure.
Fig. 5 shows an exemplary embodiment of a simple state machine (typically implemented in the evaluation unit 304) for user interaction with the flying camera 102. While the flying camera 102 is hovering, the user 108 on the ground may move the reference point 126 laterally, causing the flying camera 102 to perform a lateral displacement maneuver 502. The user may also release or retract the rope to change the altitude 504 of the flying camera. In addition, the user may use a signal such as a short tug 508 on the rope (typically detected by the evaluation unit 304 based on data from the at least one sensing unit 306) to trigger a specific action, such as taking a photograph or video. Finally, the user may use a signal such as a sideways tug to tell the flying camera 102 to reorient 506. For example, the tug signal may be detected by the evaluation unit 304, based on data provided by the sensing unit 306 and optionally on data provided by the memory unit 308, causing the control unit 302 to send commands to the actuators such that the reorientation takes place. Similarly, the aircraft may use signals to communicate with the user; for example, instead of the reorientation in the example above, the control unit 302 may send commands to the actuators that cause a communication signal (such as a tug signal) to be sent to the user.
The user interaction input typically detected by evaluation unit 304 may include any combination of force direction, magnitude, and timing applied to cord 106 (e.g., at its end or at reference point 126). It may be measured via a force sensor at the rope attachment point 124 or via other sensors (e.g., accelerometer, rate gyroscope). It may be identified by the user command detection method 212. This arrangement allows for a rich and intuitive user interface, significantly extending existing interfaces, such as those known from mouse clicks (which do not account for direction and magnitude) or mouse gestures (which do not account for magnitude).
Additionally, the flying camera 102 can provide a communication channel based on mechanical force (e.g., haptic feedback through mechanical actuation) for communication with a base station at the reference point or for user interaction via the rope. This may be used for two-way communication, for example to confirm to the reference point that a command has been recognized and correctly entered. To implement such a communication channel, the on-board controller of the flying camera may use the propulsion system to, for example, command a series of small jerks to convey a particular haptic message to the user through the tensioned rope. As another example, a flying camera can perform a continuous series of upward jerks in a clearly perceptible time-force pattern to indicate a warning or error condition, such as a low battery or full on-board memory. The rope may thus be used as a communication channel, based on mechanical forces applied to the rope, for sending user commands from the user to the flying camera or user feedback from the flying camera to the user.
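As an illustration, a warning such as "low battery" could be encoded as a time-force pattern of upward thrust jerks roughly as follows (the amplitudes, durations, and callback name are illustrative assumptions):

```python
import time

# Each entry: (thrust offset as a fraction of hover thrust, duration in s).
# Three short upward jerks separated by pauses.
LOW_BATTERY_PATTERN = [(0.15, 0.2), (0.0, 0.3)] * 3

def send_haptic_message(pattern, set_thrust_offset):
    """Play a time-force pattern through the propulsion system so that the
    user feels it as a series of tugs on the taut rope."""
    for offset, duration in pattern:
        set_thrust_offset(offset)   # commanded via the control unit
        time.sleep(duration)        # a real controller would schedule this
    set_thrust_offset(0.0)          # return to nominal hover thrust
```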
The flying camera can also provide visual feedback as a response to user commands, via a light display or by intentionally performing clearly understandable maneuvers, thereby providing the user with a means of learning the necessary commands and the contexts in which they can be used. Both haptic and visual feedback are key components of an intuitive user interface. Depending on the size and proximity of the flying camera, such clearly understandable maneuvers can be further refined by adding appropriate auditory cues, further improving the reliability of the user interaction. The haptic, visual, and auditory feedback may be generated using the primary actuators of the flying camera (e.g., its motors), additional actuators (e.g., vibrators, linear actuators, lights, speakers), or a combination of both.
Maneuvers may be active or passive. An active maneuver is triggered by the user command detection 212 and handled by the flying camera's control method (FIG. 2). For example, the flying camera 102 may immediately perform a single, fully controlled rotation following a user command. Passive maneuvers are purely the result of user interaction exploiting the physical dynamics determined by the flying camera, the rope assembly, and its feedback control system. They arise from the stable behavior of the aircraft and use the reliability of the standard state estimation and control behavior to change the state of the flying camera. For example, the user may release additional rope and thereby allow the flying camera to fly higher.
The maneuver may treat the user input as binary or continuous. For example, a change in the magnitude of the detected force on the cord may trigger a maneuver only when it exceeds a threshold. Continuous user input may also be used as a parameter. For example, the magnitude of the force on the string may be used as an indication of the desired amount of lateral movement or the number of photographs to be taken.
Maneuvers may be performed sequentially or in parallel. For example, the flying camera may not accept further commands until a sequence of photographs suitable for synthesizing a panoramic image is completed. Or the user may issue a lateral movement command 502 and simultaneously retract the rope, thereby causing the flying camera to spiral in toward the user.
It should be noted that while the above description lists, by way of example only, a small number of user interaction scenarios with the exemplary embodiment, the possible user input commands via the rope, the possible flying camera responses via the rope, and the possible maneuvers of the flying camera produce a large number of possible input-output mappings and user interaction scenarios. Given the benefit of the present disclosure, one of ordinary skill in the art will be able to design suitable communication schemes involving user input or various communication channels (including those based on mechanical force), for a wide variety of applications, between a wide variety of entities (including users, base stations, etc.), and for a wide variety of flying bodies other than flying cameras.
Further exemplary embodiments
FIG. 6 illustrates another exemplary embodiment of a flying camera 102. The embodiment of fig. 6 is equipped with various active and passive safety features, including: a handle 602 for easy takeoff/retrieval; a safety switch 604 mounted on the handle for rapid, safe takeoff; and a light but rigid shroud 606 surrounding the propulsion system 104.
Various other safety features may be employed depending on the requirements and use case. In particular, the flying camera 102 may be equipped with additional passive safety features (e.g., non-conductive rope assemblies, predetermined breaking points, crash-safe propellers, redundant components) and additional active safety features (e.g., autonomous obstacle avoidance, automatic stabilization, autonomous takeoff/landing, audible warnings, visual error indicators, dead-man switches, panic switches, backup flight control routines (an automatic/autonomous "copilot")) (all not shown) in order to reduce the risk to the operator, to the environment, and to the flying camera 102.
The embodiment of fig. 6 is equipped with a propulsion system comprising two coaxial contra-rotating propellers. One or both of the propellers may be equipped with a mechanism such as a swashplate or another device that produces a controlled torque (such as a control moment gyroscope) to produce the required force or torque to control the flying camera 102.
In addition, various other propulsion systems may be employed depending on the requirements and use case. Specifically, the flying camera 102 may be equipped with: ailerons; passively stabilized rotor assemblies, such as those described in U.S. patent 7494320; or a quadcopter, hexacopter, or other multi-rotor configuration (all not shown) to increase efficiency and redundancy.
The embodiment of fig. 6 shows a stereo camera setup 608 for depth-enabled imaging or 3D telepresence applications. The cameras typically mounted on the flying camera 102 may be any type of sensor capable of receiving electromagnetic radiation, including active sensors (e.g., structured light and projection systems).
In the embodiment of FIG. 6, the flight module 128 is positioned on the flight camera 102. However, depending on the requirements and use case, some or all of its components (including the control unit, the sensing unit, the evaluation unit and the memory unit) may also be located elsewhere, including at the reference point 126.
For some embodiments, additional functionality, such as wireless communication, on-board image processing, or battery heating circuitry (all not shown), may be added as desired. In addition, other properties of the rope 106 may be utilized, such as using it as a fiber optic communication medium, using it as a bend sensor, using it to raise other types of sensors, or using it as a sensor for measuring properties (such as strain or torsion).
While the above description contains many specificities, these should not be construed as limiting the scope of the embodiments, but as merely providing illustrations of some of the several embodiments. For example, lift may be achieved using a variety of devices, including propellers, wings, ducts (ducts), and blades; the device may be assembled in a number of configurations, including fixed wing, rotating wing, and vibrating wing machines; the cords may have various elasticities or include sensing elements. The scope of the embodiments should, therefore, be determined by the appended claims and their legal equivalents, rather than by the examples given.
Reference numerals
102-flying camera
104-air propulsion system
106-rope
108-user
110-camera
112-body coordinate system
114-global coordinate system
116-Alpha (angle between user and flying camera)
118-Beta (angle between flying camera and gravity)
120-forces acting on flying camera
122-center of mass of flying camera
124-force sensor/rope attachment point
126-reference point
128-flight module
202-inertial sensor
204-State estimation
206-position control
208-attitude control
210-flight actuator
212-user Command detection
214-Camera control
216-memory
218-Camera actuator
302-control Unit
304-evaluation Unit
306-sensing unit
308-memory unit
402-perform self-check
404-user Release
406-hover
408-user commands
410-landing command
412-landing maneuver
500-hover
502-lateral movement
504-Up/Down movement
506-reorientation
508-taking a photograph
602-handle
604-safety switch
606-protective cover
608-stereo camera.