Detailed Description
The embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. The embodiments described are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort fall within the scope of the present application.
It should be noted that the terms "comprising" and "having" and any variations thereof in the embodiments and drawings of the present application are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to the listed steps or elements, but may include other steps or elements not listed or inherent to such process, method, system, article, or apparatus.
The embodiments of the present application disclose a voice-based flight device control method and apparatus, a vehicle, and a storage medium, which can control a flying device based on voice and simplify the control operation of the flying device. Details are described below.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of a flight device control method according to an embodiment. As shown in fig. 1, a wireless communication connection may exist between the vehicle 10 and the flying device 20.
The flying device 20 may include an unmanned aerial vehicle, a model airplane, etc., and is not particularly limited.
The vehicle 10 may send control instructions to the flying device 20 over the wireless communication connection, and the flying device 20 may perform corresponding flying actions, including but not limited to ascending, descending, and rotating, according to the received control instructions, so that the vehicle 10 can be used to control the flying device 20.
The flying device 20 may carry a camera device, which may include a visible light camera, an infrared camera, a fisheye camera, or the like, but is not limited thereto. The flying device 20 may generate a series of flight record data during flight, including but not limited to images captured by the camera device and movement data, such as heading angle and flying speed, detected by sensors of the flying device 20.
The flying device 20 may transmit the flight record data back to the vehicle 10 based on the communication connection with the vehicle 10, and the vehicle 10 may analyze or present the flight record data.
Referring to fig. 2, fig. 2 is a flow chart of a voice-based flight device control method according to an embodiment of the present application. As shown in fig. 2, the method may include the following steps:
210. The vehicle generates a first flight instruction set in response to a first voice instruction.
The vehicle may include a voice acquisition module, such as a microphone. The voice acquisition module may acquire the first voice instruction input by a user and transmit the acquired voice signal to the vehicle for processing.
The first voice instruction may be a voice instruction for instructing the flying device to take off, and may include, for example, a voice instruction containing a "take off" keyword, such as "drone, take off" or "take off", which is not particularly limited.
Optionally, the vehicle may detect its current gear information before performing step 210. When the current gear of the vehicle is detected to meet a preset condition, step 210 is performed and the first flight instruction set is generated in response to the first voice instruction, so as to improve safety when the flying device is used with the vehicle.
The preset condition may include: the vehicle is in the park gear (P gear), but is not limited thereto.
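By way of illustration only, this gating can be sketched as a simple precondition check; the gear values, keyword test, and function name below are assumptions rather than the actual implementation:

```python
# Minimal sketch: only respond to a take-off voice instruction while the vehicle is parked.
ALLOWED_GEARS = {"P"}  # preset condition: park gear

def should_generate_first_instruction_set(current_gear: str, voice_text: str) -> bool:
    """Return True when step 210 may be executed for the given voice input."""
    is_parked = current_gear in ALLOWED_GEARS
    wants_takeoff = "take off" in voice_text.lower()   # simple keyword-based intent check
    return is_parked and wants_takeoff

print(should_generate_first_instruction_set("P", "Drone, take off"))  # True
print(should_generate_first_instruction_set("D", "Drone, take off"))  # False
```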
The vehicle may receive the first voice instruction acquired by the voice acquisition module and parse it to identify the intent of the first voice instruction. When the intent of the first voice instruction is identified as instructing the flying device to take off, the vehicle may generate the first flight instruction set according to a preset take-off control script.
The first flight instruction set may include one or more first target control instructions, each of which may be used to instruct the flying device to perform a corresponding operation.
For example, the first target control instructions in the first flight instruction set may include: a take-off instruction, a lift-off instruction, and an adjustment instruction, but are not limited thereto. Wherein:
The take-off instruction may be used to instruct the flying device to perform take-off preparation actions, such as, but not limited to, calibrating the compass, inertial measurement unit (IMU), and positioning system of the flying device.
The lift-off instruction may be used to instruct the flying device to lift off, i.e., to increase its flying altitude. Each lift-off instruction may include the destination coordinate to be reached by the lift-off operation, or the height by which the flying device is to rise, which is not particularly limited.
The adjustment instruction may be used to instruct the flying device to adjust the shooting angle of the camera device.
In some embodiments, the camera device mounted on the flying device may be fixedly connected to the flying device, and the adjustment instruction may be used to instruct the flying device to adjust its heading angle, so that the shooting angle of the camera device is adjusted by adjusting the heading of the flying device.
In other embodiments, the camera device carried by the flying device may be movably connected to the flying device and may rotate relative to it, and the adjustment instruction may be used to instruct the flying device to rotate the camera device, so that the shooting angle of the camera device can be adjusted while the heading of the flying device remains unchanged.
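As an illustration of how such a first flight instruction set might be organized, the following sketch uses a simple ordered data structure; the field names, the preset height default, and the camera pitch value are assumptions, not the actual on-wire format:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ControlInstruction:
    kind: str                                  # "takeoff", "liftoff", or "adjust"
    target_height_m: Optional[float] = None    # used by lift-off instructions
    camera_pitch_deg: Optional[float] = None   # used by adjustment instructions

@dataclass
class FlightInstructionSet:
    instructions: List[ControlInstruction] = field(default_factory=list)

def generate_first_flight_instruction_set(preset_height_m: float = 25.0) -> FlightInstructionSet:
    """Build the take-off / lift-off / adjust sequence from a take-off control script."""
    return FlightInstructionSet(instructions=[
        ControlInstruction(kind="takeoff"),                         # calibrate compass, IMU, positioning system
        ControlInstruction(kind="liftoff", target_height_m=preset_height_m),
        ControlInstruction(kind="adjust", camera_pitch_deg=-90.0),  # bring the vehicle into the shot
    ])
```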
220. The vehicle sends the first flight instruction set to the flying device.
The first flight instruction set may include a plurality of first target control instructions, which the vehicle may send to the flying device one by one according to a sequence and at time intervals. Alternatively, the vehicle may send the entire first flight instruction set to the flying device at one time, and the flying device executes each first target control instruction in the set one by one according to the sequence and time intervals, which is not particularly limited.
In one example, one implementation of step 220 described above may include the steps of:
The vehicle sends the take-off instruction to the flying device, so that the flying device performs the take-off preparation actions as instructed by the take-off instruction.
When the vehicle receives the preparation completion notification sent by the flying device, it sends the lift-off instruction to the flying device, so that the flying device rises to a preset height (also referred to herein as the first height) according to the lift-off instruction. The preset height may be set according to actual service requirements, for example by referring to the height difference between the flying device and the vehicle at which the vehicle is best captured. By way of example, the first height may be set to 25 meters, 20 meters, 10 meters, etc., but is not limited thereto.
When the flying device has risen to the first height, the vehicle sends the adjustment instruction to the flying device, so that the flying device adjusts the shooting angle of the camera device and the vehicle falls within the shooting range of the camera device.
That is, the vehicle may send the first target control instructions in the first flight instruction set to the flying device one by one, so that the flying device can automatically take off to a suitable position without manual adjustment.
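A rough sketch of this one-by-one sending logic is shown below; `link` is a placeholder for the wireless connection and is assumed to expose `send(instruction)` and a blocking `wait_for(event, timeout)`, so none of these names come from the actual implementation:

```python
def send_first_set_step_by_step(link, takeoff, liftoff, adjust, timeout_s=30.0):
    """Send take-off, wait for readiness, lift off to the first height, then adjust the camera."""
    link.send(takeoff)
    link.wait_for("preparation_complete", timeout=timeout_s)   # preparation completion notification

    link.send(liftoff)                                         # rise to the preset (first) height
    link.wait_for("reached_first_height", timeout=timeout_s)

    link.send(adjust)                                          # bring the vehicle into the shooting range
```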
230. The flying device performs the operations of taking off and adjusting the shooting angle of the camera device as instructed by each first target control instruction included in the first flight instruction set, until the vehicle is within the shooting range of the camera device.
The vehicle being within the shooting range of the camera device may mean that part or all of the vehicle body is within the shooting range, and the camera device may capture the vehicle from a top view, a front view, a side view, or the like, which is not particularly limited. The front view may mean that the lens of the camera device faces the head of the vehicle.
240. The flying device sends a take-off success notification to the vehicle.
The flying device may send the take-off success notification to the vehicle after performing the corresponding operations as instructed by the first flight instruction set.
250. When the vehicle receives the take-off success notification, it acquires a second voice instruction.
When the vehicle receives the take-off success notification, it may acquire the second voice instruction input by the user through the voice acquisition module. The second voice instruction may be a voice instruction instructing the flying device to perform any operation, which is not particularly limited.
That is, in the embodiments of the present application, the user may first trigger the vehicle, through the first voice instruction, to control the flying device to take off to a position where the vehicle can be captured, and may then further trigger the vehicle, through the second voice instruction, to control the flight of the flying device, the shooting angle of the camera device mounted on it, and so on.
In a usage scenario of a vehicle-mounted flying device, part of the operations the user wants the flying device to perform may relate to the vehicle. For example, the user may wish the drone to follow or fly around the vehicle, or may wish to capture an image of the vehicle through the camera device of the flying device.
Therefore, in the embodiments of the present application, the vehicle first controls the flying device, based on the first voice instruction, to take off to a more reasonable position (one from which the vehicle can be captured), and then controls the flying device, based on the second voice instruction, to perform the next operation. This relatively reduces the number of voice instructions the user needs to input and further simplifies the control operation of the flying device.
260. The vehicle generates a second flight instruction set in response to the second voice instruction.
The vehicle may parse the second voice instruction to identify its intent. The second voice instruction may include, but is not limited to, the following three types:
1) The second voice instruction includes a single action keyword and no custom parameter, for example, "land", "drone, return home", "drone, take a photo", "record video", "fly up", "fly down", "fly forward", "look down", etc. Here, "land", "return home", "take a photo", "fly up", "fly forward", "look down", etc. may be the single action keywords included in the second voice instruction.
2) The second voice instruction includes a single action keyword together with a custom parameter, for example, "fly forward 10 meters", "fly up 2 meters", "raise the drone camera by 5 degrees", "drone, look down 10 degrees", etc. Here, "10 meters", "2 meters", "5 degrees", etc. may be the custom parameters included in the second voice instruction.
3) The second voice instruction includes a combined action keyword, for example, "soar into the sky", "drone dance", etc.
Lists of the combined action keywords and the single action keywords may be preset, so that the vehicle can accurately distinguish the single action keywords and combined action keywords included in the second voice instruction.
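Purely for illustration, distinguishing the three types could be sketched as a lookup against the preset keyword lists; the keyword strings below are English renderings assumed for the sketch, and the simple regular expression stands in for the actual speech-understanding result:

```python
import re

# Assumed (translated) keyword lists; in practice they are preset per service requirement.
SINGLE_ACTION_KEYWORDS = {"land", "return home", "take a photo", "record video",
                          "fly up", "fly down", "fly forward", "look down", "follow"}
COMBINED_ACTION_KEYWORDS = {"soar into the sky", "dance"}

def classify_second_instruction(text: str) -> str:
    """Classify a second voice instruction into the three types described above."""
    lowered = text.lower()
    if any(k in lowered for k in COMBINED_ACTION_KEYWORDS):
        return "type 3: combined action keyword"
    has_custom_parameter = re.search(r"\d+(\.\d+)?\s*(meter|metre|degree)", lowered) is not None
    if any(k in lowered for k in SINGLE_ACTION_KEYWORDS):
        return ("type 2: single action keyword with custom parameter" if has_custom_parameter
                else "type 1: single action keyword only")
    return "unknown"

print(classify_second_instruction("fly forward"))               # type 1
print(classify_second_instruction("fly forward 10 meters"))     # type 2
print(classify_second_instruction("drone, soar into the sky"))  # type 3
```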
The vehicle may generate, based on different instruction generation strategies, second flight instruction sets corresponding to the different types of second voice instructions, and a second flight instruction set may include one or more second target control instructions.
In one embodiment, the instruction generation strategy corresponding to the foregoing type 1) may include:
The vehicle generates a second target control instruction according to the preset control instruction corresponding to the single action keyword included in the second voice instruction and the default parameter corresponding to that keyword, so as to obtain a second flight instruction set including the second target control instruction.
The correspondence between each single action keyword and its preset control instruction and default parameter may be set according to actual service requirements; the preset control instructions or default parameters corresponding to different single action keywords may be the same or different, which is not particularly limited. Optionally, the vehicle's parsing of voice instructions may support generalization, so that different single action keywords expressing the same action intent correspond to the same preset control instruction.
For example, the single action keywords "fly up" and "up" may correspond to the same preset control instruction, namely a control instruction for instructing the flying device to increase its flying height. The single action keyword "right" may correspond to a control instruction instructing the flying device to fly to the right of a preset reference direction, which differs from the control instruction corresponding to "up".
For example, the default parameters corresponding to the single action keywords "fly up" and "right" may be the same, e.g., a flight distance of 1 meter, while the default parameter corresponding to the single action keyword "fly down" may be a flight distance of 1.5 meters, different from that of "fly up".
Note that the default parameter corresponding to a single action keyword may be a specific value or may be null. In some embodiments, one or more control methods may be encapsulated inside the flying device, and the vehicle may invoke these encapsulated control methods so that the flying device performs the corresponding operation. For example, if a control method for following the vehicle is encapsulated in the flying device, the preset control instruction corresponding to the single action keyword "follow" acquired by the vehicle may be an instruction instructing the flying device to invoke that method, and the corresponding default parameter may be null.
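By way of illustration only, the type-1 strategy above can be pictured as a keyword lookup; the table contents, instruction names, and the `build_type1_instruction` helper are assumptions for the sketch, not the actual implementation:

```python
from typing import Optional, Tuple

# Assumed correspondence tables; in practice they are configured per service requirement.
PRESET_INSTRUCTIONS = {
    "fly up": "RAISE_ALTITUDE",
    "up": "RAISE_ALTITUDE",          # generalization: same intent, same preset instruction
    "right": "FLY_RIGHT",
    "fly down": "LOWER_ALTITUDE",
    "follow": "CALL_FOLLOW_METHOD",  # invokes a method encapsulated inside the flying device
}
DEFAULT_PARAMETERS = {
    "fly up": 1.0, "up": 1.0, "right": 1.0, "fly down": 1.5,
    "follow": None,                  # null default: the encapsulated method needs no parameter
}

def build_type1_instruction(keyword: str) -> Tuple[str, Optional[float]]:
    """Second target control instruction = preset control instruction + default parameter."""
    return PRESET_INSTRUCTIONS[keyword], DEFAULT_PARAMETERS[keyword]

print(build_type1_instruction("fly up"))   # ('RAISE_ALTITUDE', 1.0)
print(build_type1_instruction("follow"))   # ('CALL_FOLLOW_METHOD', None)
```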
In one embodiment, the instruction generation strategy corresponding to the foregoing type 2) may include:
The vehicle generates a second target control instruction according to the preset control instruction corresponding to the single action keyword and the custom parameter, so as to obtain a second flight instruction set including the second target control instruction.
The correspondence between each single action keyword and its preset control instruction may be set according to actual service requirements and is not described again here. Since the custom parameter is input by the user through the second voice instruction, the second target control instruction generated based on the custom parameter allows the vehicle to control the flying device to perform the corresponding operation according to the target indicated by that parameter.
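As a rough sketch of the type-2 strategy under the same assumptions as above, the default parameter is simply replaced by the custom parameter extracted from the utterance; the regular expression here again stands in for the actual speech-understanding result:

```python
import re
from typing import Optional, Tuple

def build_type2_instruction(preset_instruction: str, utterance: str) -> Tuple[str, Optional[float]]:
    """Attach the user-supplied custom parameter to the preset control instruction."""
    match = re.search(r"(\d+(?:\.\d+)?)\s*(?:meter|metre|degree)s?", utterance.lower())
    custom_value = float(match.group(1)) if match else None
    return preset_instruction, custom_value

print(build_type2_instruction("FLY_FORWARD", "fly forward 10 meters"))  # ('FLY_FORWARD', 10.0)
```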
In one embodiment, the instruction generation strategy corresponding to the foregoing type 3) may include:
The vehicle generates the second flight instruction set according to the control script corresponding to the combined action keyword, and the plurality of second target control instructions included in the second flight instruction set are arranged according to the sequence and time intervals in the control script.
The correspondence between each combined action keyword and its control script may be set according to actual service requirements, and the control scripts corresponding to different combined action keywords may be the same or different, which is not particularly limited.
The control script may include a plurality of preset control instructions arranged in a preset sequence with preset time intervals. When the flying device executes the corresponding operations according to the control script, a corresponding combined action effect can be achieved.
For example, the control script corresponding to the combined action keyword "soar into the sky" may include four preset control instructions arranged in the following order: a take-off instruction, a lift-off instruction for raising the flying height to 100 meters, an adjustment instruction for tilting the shooting angle of the camera device 90 degrees downwards, and an end instruction for executing a 10-second ending action. The lift-off instruction follows the take-off instruction after a 5-second interval; the adjustment instruction follows the lift-off instruction after a 30-second interval; and there is no time interval requirement between the adjustment instruction and the end instruction.
The control script may be written in advance according to service requirements, or may be user-defined.
In some embodiments, the user may select one or more preset control instructions from a preset control instruction library. The vehicle may detect each preset control instruction selected by the user and generate a user-defined control script according to the instruction order and interval times set by the user. The vehicle may then bind a user-defined voice instruction to the user-defined control script. After binding, if the second voice instruction acquired by the vehicle is the user-defined voice instruction, the vehicle may generate the second flight instruction set according to the user-defined control script, thereby controlling the flying device to perform the user-defined actions, making the flying device more convenient to control and more engaging to use.
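A control script of this kind might be represented, purely as an illustrative sketch, as an ordered list of steps with their intervals; the step names and the binding dictionary below are assumptions that mirror the "soar into the sky" example above:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ScriptedStep:
    instruction: str   # preset control instruction name
    delay_s: float     # seconds to wait after the previous step (0 = no interval requirement)

# Assumed script for the combined action keyword "soar into the sky".
SOAR_SCRIPT: List[ScriptedStep] = [
    ScriptedStep("TAKEOFF", 0.0),
    ScriptedStep("LIFTOFF_TO_100M", 5.0),
    ScriptedStep("PITCH_CAMERA_DOWN_90_DEG", 30.0),
    ScriptedStep("ENDING_ACTION_10S", 0.0),
]

# A user-defined script can be bound to a user-defined voice instruction in the same way.
SCRIPT_BINDINGS = {"soar into the sky": SOAR_SCRIPT}

def build_second_set_from_script(combined_keyword: str) -> List[ScriptedStep]:
    """Type 3: the second flight instruction set follows the script's order and intervals."""
    return list(SCRIPT_BINDINGS.get(combined_keyword, []))
```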
270. The vehicle sends the second flight instruction set to the flying device.
In some embodiments, the second flight instruction set may include a plurality of second target control instructions arranged according to a sequence and time intervals.
Thus, when sending the second flight instruction set to the flying device, the vehicle may send the second target control instructions one by one according to the sequence and time intervals. Alternatively, the vehicle may send the entire second flight instruction set to the flying device at one time, and the flying device executes each second target control instruction in the set one by one according to the sequence and time intervals, which is not particularly limited.
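For the first of these two options, a minimal sketch of the sending loop might look as follows; `link.send` is a placeholder for the wireless connection and the `(instruction, delay)` pairs correspond to the scripted steps, all of which are assumptions:

```python
import time

def send_second_set_with_intervals(link, steps):
    """Send the second target control instructions one by one, honoring the scripted
    time intervals. `steps` is a list of (instruction, delay_seconds) pairs."""
    for instruction, delay_s in steps:
        if delay_s > 0:
            time.sleep(delay_s)   # wait the scripted interval after the previous instruction
        link.send(instruction)    # transmit over the wireless connection to the flying device
```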
280. The flying device performs the corresponding operations as instructed by each second target control instruction included in the second flight instruction set.
It can be seen that, in the foregoing embodiment, the vehicle may collect the voice instruction input by the user, parse it, and generate the corresponding flight instruction set to control the flying device to perform the corresponding operations. The user can therefore control the flying device by voice, without manual adjustment, which simplifies the control operation of the flying device.
Secondly, the vehicle first controls the flying device, based on the first voice instruction, to take off to a more reasonable position (one from which the vehicle can be captured), and then controls the flying device, based on the second voice instruction, to perform the next operation, which relatively reduces the number of voice instructions the user needs to input and further simplifies the control operation of the flying device.
For example, after the vehicle controls the flying device, according to the first voice instruction, to take off and adjust the shooting angle until the vehicle can be captured, the received second voice instruction may be an instruction instructing the flying device to follow the vehicle. The flying device may adopt a vision-based following method, adjusting its flying direction, speed, and the like as the vehicle moves, so as to keep the vehicle in the shot. Because the flying device has already taken off to a more reasonable position, the vehicle can directly control the flying device to follow it based on the second voice instruction, without the shooting angle of the camera device having to be adjusted through additional voice instructions.
In addition, the vehicle first controls the flying device, based on the first voice instruction, to take off to a more reasonable position (one from which the vehicle can be captured), and then controls the flying device, based on the second voice instruction, to perform the next operation, which can provide a better user experience in some scenarios.
For example, after the vehicle controls the flying device, according to the first voice instruction, to take off and adjust the shooting angle until the vehicle can be captured, if the received second voice instruction instructs the flying device to ascend to a higher altitude (e.g., 300 meters) in a short time while continuously recording video, the camera device can keep capturing the vehicle during the ascent, and the recorded video, combined with the effect of rapidly rising to a high altitude, can present the visual effect of the flying device soaring into the sky.
In one embodiment, after controlling the flying device through the first flight instruction set to take off to a position where the vehicle can be captured, the vehicle may further receive images captured by the camera device of the flying device and output the received images on the in-vehicle display.
The in-vehicle display may include a central control screen, a rearview-mirror display, or the like, and is not particularly limited. A communication link may exist between the vehicle and the in-vehicle display, and the vehicle may transmit the received images to the in-vehicle display so that the in-vehicle display can output them.
The user can view the image captured by the camera device of the flying device on the in-vehicle display and issue a second voice instruction based on the image. For example, if the user sees in the image that there is an obstacle behind the vehicle, the user may input a second voice instruction instructing the camera device to adjust its shooting angle to aim at the obstacle behind. That is, the user may input the second voice instruction based on the image on the in-vehicle display, achieving a what-you-see-is-what-you-get user experience.
In one embodiment, the image captured by the camera device may include, in addition to the vehicle, buildings around the vehicle. These buildings may be landmarks of interest to some users. After the vehicle receives the image from the flying device and outputs it on the in-vehicle display, the user may input a second voice instruction of the aforementioned type 2) to control the flying device to fly toward a landmark of interest.
Referring to fig. 3, fig. 3 is a flowchart of a method for generating a second target control instruction by a vehicle according to an embodiment. The steps shown in fig. 3 may be one embodiment of step 260 described above.
As shown in fig. 3, the following steps may be included:
310. The vehicle identifies, from the second voice instruction, a landmark name appearing in the image.
The second voice instruction of the aforementioned type 2) includes a single action keyword and a custom parameter, and the custom parameter may include a landmark name in the image.
For example, the image output on the in-vehicle display includes the Guangzhou Tower in addition to the vehicle. The second voice instruction input by the user may be "drone, fly to the Guangzhou Tower", in which "Guangzhou Tower" may be the landmark name in the image.
320. The vehicle obtains a first position coordinate matching the landmark name.
The vehicle may search, according to the identified landmark name, for candidate locations matching the landmark name, and obtain the position coordinate of a candidate location from the map as the first position coordinate matching the landmark name.
If the landmark name is common, such as a hospital, a school, or a residential building, the vehicle may find multiple candidate locations; if the landmark name is unique, such as the Guangzhou Tower, the vehicle may find a single candidate location.
Optionally, if two or more candidate locations are found, the vehicle may mark each candidate location on the map output on the in-vehicle display. The user may select a target location from the candidate locations through a third voice instruction, and the vehicle may, in response to the third voice instruction, obtain the position coordinate of the target location from the map as the first position coordinate, so that the flying device can accurately fly to the vicinity of the landmark the user is interested in.
330. The vehicle generates a second target control instruction according to the preset control instruction corresponding to the single action keyword in the second voice instruction and the first position coordinate.
The second target control instruction generated by the vehicle includes the first position coordinate of the destination that the flying device is to reach. Therefore, after receiving the second flight instruction set including this second target control instruction, the flying device may plan a flight route according to the first position coordinate and fly to the destination corresponding to it.
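The flow of steps 310 to 330 can be summarized in the following illustrative sketch, in which the landmark lookup table, its coordinates, and the instruction name stand in for a real map search service and are assumptions only:

```python
from typing import Optional

# Assumed landmark database; a real system would query a map service instead.
LANDMARK_DB = {"guangzhou tower": (23.1066, 113.3245)}   # name -> (latitude, longitude)

def build_fly_to_landmark_instruction(second_voice_text: str) -> Optional[dict]:
    """Identify the landmark name, look up the matching first position coordinate,
    and attach it to an assumed 'fly to position' preset control instruction."""
    text = second_voice_text.lower()
    for name, coordinate in LANDMARK_DB.items():
        if name in text:
            return {"instruction": "FLY_TO_POSITION", "first_position_coordinate": coordinate}
    # Ambiguous or unknown names would be resolved on the map via a third voice instruction.
    return None

print(build_fly_to_landmark_instruction("drone, fly to the Guangzhou Tower"))
```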
It can be seen that, in the foregoing embodiment, after controlling the flying device to take off, the vehicle may receive the image captured by the camera device and output it on the in-vehicle display. When the user sees that the image includes a landmark of interest, the user may control the flying device, through the second voice instruction, to fly to the vicinity of that landmark.
In one embodiment, the second voice command of the foregoing type 2) entered by the user may further include relative position information of the flying device with respect to the vehicle.
Referring to fig. 4, fig. 4 is a flowchart of another method for generating a second target control instruction by a vehicle according to an embodiment. The steps shown in fig. 4 may be one embodiment of step 260 described above.
As shown in fig. 4, the following steps may be included:
410. The vehicle identifies, from the second voice instruction, the relative position information of the flying device with respect to the vehicle.
The second voice instruction of the aforementioned type 2) includes a single action keyword and a custom parameter, and the custom parameter may include the relative position information of the flying device with respect to the vehicle.
For example, the second voice instruction input by the user may include "drone, fly 500 meters ahead of the vehicle", "drone, fly 200 meters to the south", and so on.
Here, "500 meters ahead of the vehicle" and "200 meters to the south" may be the relative position information of the flying device with respect to the vehicle, where "ahead" and "south" are the relative directions and "500 meters" and "200 meters" are the relative distances.
420. The vehicle acquires its vehicle position coordinate.
The vehicle may position itself through a positioning module, such as a satellite positioning system or an inertial navigation system, to obtain its current position coordinate.
430. The vehicle calculates a second position coordinate according to the vehicle position coordinate and the relative direction and relative distance included in the relative position information.
440. The vehicle generates a second target control instruction according to the preset control instruction corresponding to the single action keyword in the second voice instruction and the second position coordinate.
The second target control instruction generated by the vehicle includes the second position coordinate that the flying device is to reach. Therefore, after receiving the second flight instruction set including this second target control instruction, the flying device may plan a flight route according to the second position coordinate and fly to the destination corresponding to it.
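Purely as an illustration of step 430, the sketch below computes an offset coordinate from the vehicle position, a compass-style relative direction, and a relative distance, using a small-distance flat-earth approximation; directions such as "ahead" would additionally require the vehicle's heading, and all names and the sample coordinate are assumptions:

```python
import math

EARTH_RADIUS_M = 6_371_000.0
BEARINGS_DEG = {"north": 0.0, "east": 90.0, "south": 180.0, "west": 270.0}

def second_position_coordinate(lat_deg: float, lon_deg: float,
                               direction: str, distance_m: float) -> tuple:
    """Offset the vehicle position by the relative direction and distance (flat-earth
    approximation, adequate for distances of a few hundred meters)."""
    bearing = math.radians(BEARINGS_DEG[direction])
    d_lat = (distance_m * math.cos(bearing)) / EARTH_RADIUS_M
    d_lon = (distance_m * math.sin(bearing)) / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(d_lat), lon_deg + math.degrees(d_lon)

# "drone, fly 200 meters to the south", with the vehicle at an assumed position:
print(second_position_coordinate(23.1291, 113.2644, "south", 200.0))
```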
It can be seen that, in the foregoing embodiment, when the flight destination has no clear name or landmark, the user can input, through the second voice instruction, the relative position information of the flying device with respect to the vehicle. The vehicle may then calculate the second position coordinate based on the relative position information to control the flying device to reach the flight destination desired by the user.
Referring to fig. 5, fig. 5 illustrates a voice-based flight device control apparatus according to an embodiment, which may be applied to any of the foregoing vehicles. As shown in fig. 5, the voice-based flight device control apparatus 500 may include: a first control module 510, an acquisition module 520, and a second control module 530.
The first control module 510 is configured to generate a first flight instruction set in response to the first voice instruction, and send the first flight instruction set to the flight device, so that the flight device executes operations of taking off and adjusting the shooting angle of the camera device according to the instructions of each first target control instruction included in the first flight instruction set until the vehicle is within the shooting range of the camera device.
The acquisition module 520 is configured to acquire a second voice command when receiving a takeoff success notification sent by the flight device;
The second control module 530 is configured to generate a second flight instruction set in response to the second voice instruction, and send the second flight instruction set to the flight device, so that the flight device performs a corresponding operation according to the instructions of each second target control instruction included in the second flight instruction set.
In one embodiment, the first flight instruction set may include: a take-off instruction, a lift-off instruction, and an adjustment instruction. The first control module 510 may include: a first generation unit and a first communication unit.
The first generation unit may be configured to generate the first flight instruction set in response to the first voice instruction.
The first communication unit may be configured to send the take-off instruction to the flying device, so that the flying device performs the take-off preparation actions as instructed by the take-off instruction; send, when the preparation completion notification sent by the flying device is received, the lift-off instruction to the flying device, so that the flying device rises to the preset height according to the lift-off instruction; and send, when the flying device has risen to the first height, the adjustment instruction to the flying device, so that the flying device adjusts the shooting angle of the camera device and the vehicle falls within the shooting range of the camera device.
In one embodiment, the second control module 530 may include: a second generation unit and a second communication unit.
The second generation unit is configured to: when the second voice instruction includes a single action keyword and no custom parameter, generate a second target control instruction according to the preset control instruction corresponding to the single action keyword and the default parameter corresponding to that keyword, so as to obtain a second flight instruction set including the second target control instruction; or
when the second voice instruction includes a single action keyword and a custom parameter, generate a second target control instruction according to the preset control instruction corresponding to the single action keyword and the custom parameter, so as to obtain a second flight instruction set including the second target control instruction; or
when the second voice instruction includes a combined action keyword, generate a second flight instruction set according to the control script corresponding to the combined action keyword, the second flight instruction set including a plurality of second target control instructions arranged according to the sequence and time intervals in the control script.
In one implementation, the voice-based flight device control apparatus 500 may further include an output module.
The output module is configured to receive, after the first control module 510 generates the first flight instruction set in response to the first voice instruction, the image captured by the camera device of the flying device, and output the image on the in-vehicle display.
The custom parameter in the second voice instruction includes: a landmark name in the image.
The second generation unit may be further configured to obtain the first position coordinate matching the landmark name, and generate a second target control instruction according to the preset control instruction corresponding to the single action keyword and the first position coordinate.
In one embodiment, the second generation unit may be further configured to search for candidate locations matching the landmark name; mark, when the number of candidate locations is two or more, each candidate location on the map output on the in-vehicle display; and obtain, in response to a third voice instruction for selecting a target location from the candidate locations, the position coordinate of the target location from the map as the first position coordinate.
In one embodiment, the custom parameter in the second voice instruction includes: relative position information of the flying device with respect to the vehicle.
The second generation unit may be further configured to obtain the vehicle position coordinate of the vehicle; calculate a second position coordinate according to the vehicle position coordinate and the relative direction and relative distance included in the relative position information; and generate a second target control instruction according to the preset control instruction corresponding to the single action keyword and the second position coordinate.
In one embodiment, the first control module 510 is further configured to generate the first flight instruction set in response to the first voice instruction when it is detected that the current gear information of the vehicle meets the preset condition.
Therefore, by implementing the voice-based flight device control apparatus disclosed in the foregoing embodiments, the voice instruction input by the user can be collected and parsed, and the corresponding flight instruction set can be generated to control the flying device to perform the corresponding operations, so that the user can control the flying device by voice without manual adjustment, which simplifies the control operation of the flying device.
Secondly, the vehicle first controls the flying device, based on the first voice instruction, to take off to a more reasonable position (one from which the vehicle can be captured), and then controls the flying device, based on the second voice instruction, to perform the next operation, which relatively reduces the number of voice instructions the user needs to input. This further simplifies the control operation of the flying device and can provide a better user experience in some scenarios.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a vehicle according to an embodiment of the application. As shown in fig. 6, the vehicle 600 may include:
a memory 610 storing executable program code;
a processor 620 coupled to the memory 610;
The processor 620 invokes executable program code stored in the memory 610 to perform any of the voice-based flight device control methods disclosed in embodiments of the present application.
It should be noted that, the vehicle shown in fig. 6 may further include components not shown, such as a power supply, an input key, a camera, a speaker, a display screen, an RF circuit, a Wi-Fi module, a bluetooth module, and a sensor, which are not described in detail in this embodiment.
The embodiments of the present application disclose a computer-readable storage medium storing a computer program, where the computer program causes a computer to execute any of the voice-based flight device control methods disclosed in the embodiments of the present application.
The embodiments of the present application disclose a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform any of the voice-based flight device control methods disclosed in the embodiments of the present application.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Those skilled in the art will also appreciate that the embodiments described in the specification are alternative embodiments and that the acts and modules referred to are not necessarily required for the present application.
In the various embodiments of the present application, it should be understood that the sequence numbers of the foregoing processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-accessible memory. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like, and in particular a processor in a computer device) to execute some or all of the steps of the methods of the various embodiments of the present application.
Those of ordinary skill in the art will appreciate that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing associated hardware. The program may be stored in a computer-readable storage medium, including a read-only memory (ROM), a random access memory (RAM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a one-time programmable read-only memory (OTPROM), an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk memory, a magnetic disk memory, a tape memory, or any other medium that can be used to carry or store data.
The voice-based flight device control method and apparatus, vehicle, and storage medium disclosed in the embodiments of the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope according to the idea of the present application. In view of the above, the content of this description should not be construed as limiting the present application.