Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and it is obvious to those skilled in the art that the present application may be applied to other similar situations according to the drawings without inventive effort. It should be understood that these exemplary embodiments are presented merely to enable those skilled in the relevant art to better understand and practice the invention and are not intended to limit the scope of the invention in any way. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit," and/or "module" as used herein are one way of distinguishing between different components, elements, parts, portions, or assemblies at different levels. However, if other words can achieve the same purpose, they may be replaced by other expressions.
As used in this application and in the claims, the terms "a," "an," and/or "the" are not specific to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, as a method or apparatus may also include other steps or elements.
Although the present application makes various references to certain modules or units in a system according to embodiments of the present application, any number of different modules or units may be used and run on clients and/or servers. The modules are merely illustrative, and different aspects of the systems and methods may use different modules.
Flowcharts are used in this application to describe the operations performed by systems according to embodiments of the present application. It should be appreciated that the preceding or following operations are not necessarily performed precisely in order. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Fig. 1 is a schematic diagram of an application scenario 100 of a speech recognition-based auxiliary cooking system according to some embodiments of the present application.
As shown in fig. 1, the application scenario 100 may include a processing device 110, a network 120, a user terminal 130, and a storage device 140.
In some embodiments, the processing device 110 may be used to process information and/or data related to auxiliary cooking. For example, the processing device 110 may be used to recognize user voice instructions. In some embodiments, the processing device 110 may be local or remote. For example, the processing device 110 may access information and/or material stored in the user terminal 130 and the storage device 140 via the network 120. In some embodiments, processing device 110 may be directly connected to user terminal 130 and storage device 140 to access information and/or material stored therein. In some embodiments, the processing device 110 may execute on a cloud platform. In some embodiments, processing device 110 may include a processor 210, and processor 210 may include one or more sub-processors (e.g., a single-core processing device or a multi-core processing device). By way of example only, the processor 210 may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), the like, or any combination thereof.
The network 120 may facilitate the exchange of data and/or information in the application scenario 100. In some embodiments, one or more components in the application scenario 100 (e.g., the processing device 110, the user terminal 130, and the storage device 140) may send data and/or information to other components in the application scenario 100 over the network 120. For example, the processing device 110 may send the target recipe to the user terminal 130 via the network 120. In some embodiments, network 120 may be any type of wired or wireless network. For example, the network 120 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an internal network, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, and the like, or any combination thereof.
The user terminal 130 is a terminal device used by a user. In some embodiments, the user terminal 130 may obtain information or data related to auxiliary cooking. For example, the user terminal 130 may obtain a user recipe selection instruction, obtain the target recipe from the processing device 110 via the network 120 based on the user recipe selection instruction, and display the target recipe. For another example, the user terminal 130 may obtain a device binding instruction and establish a communication connection with at least one intelligent kitchen device based on the device binding instruction. In some embodiments, the user terminal 130 may include a mobile device, a tablet computer, a notebook computer, or the like, or any combination thereof.
In some embodiments, the user terminal 130 may be configured with a speech recognition-based auxiliary cooking system for assisting the user in cooking, and the speech recognition-based auxiliary cooking system may be application software (an APP) on the user terminal 130.
In some embodiments, the speech recognition based auxiliary cooking system may include a recipe instruction acquisition module, a target recipe acquisition module, a binding instruction acquisition module, a device binding module, an auxiliary instruction acquisition module, and a speech auxiliary module.
The recipe instruction acquisition module may be configured to acquire a user recipe selection instruction.
The target recipe acquisition module may be configured to obtain a target recipe based on the user recipe selection instruction and display the target recipe, the target recipe including a plurality of operation steps.
The binding instruction acquisition module may be configured to acquire a device binding instruction.
The device binding module may be configured to establish a communication connection with at least one intelligent kitchen device based on the device binding instructions.
The auxiliary instruction acquisition module may be configured to acquire a voice auxiliary start instruction.
The voice assistance module may be configured to acquire, based on the voice assistance start instruction, a recipe voice package corresponding to the target recipe, wherein the recipe voice package includes voice files respectively corresponding to the plurality of operation steps of the target recipe. The voice assistance module may be further configured to repeatedly perform, based on the voice assistance start instruction, acquiring a user voice instruction and, based on the user voice instruction, broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation, until the user finishes cooking.
Further description of the recipe instruction acquisition module, the target recipe acquisition module, the binding instruction acquisition module, the device binding module, the auxiliary instruction acquisition module, and the voice assistance module may be found in fig. 3 and the related description, and is not repeated here.
In some embodiments, storage device 140 may be coupled to network 120 to enable communication with one or more components of the application scenario 100 (e.g., processing device 110, user terminal 130, etc.). One or more components of the application scenario 100 (e.g., processing device 110, user terminal 130, etc.) may access material or instructions stored in storage device 140 via network 120. In some embodiments, the storage device 140 may be directly connected to or in communication with one or more components in the application scenario 100 (e.g., the processing device 110, the user terminal 130). In some embodiments, the storage device 140 may be part of the processing device 110. In some embodiments, the processing device 110 may also be located in the user terminal 130.
It should be noted that the foregoing description is provided for illustrative purposes only and is not intended to limit the scope of the present application. Many variations and modifications will be apparent to those of ordinary skill in the art, given the benefit of this disclosure. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the storage device 140 may be a data storage device including a cloud computing platform, such as a public cloud, a private cloud, a community, a hybrid cloud, and the like. However, such changes and modifications do not depart from the scope of the present application.
FIG. 2 is an exemplary block diagram of a computing device, shown in accordance with some embodiments of the present application.
In some embodiments, the processing device 110 and/or the user terminal 130 may be implemented on the computing device 200. For example, the processing device 110 may implement and perform the work tasks disclosed herein on the computing device 200.
As shown in fig. 2, computing device 200 may include a processor 210, a read-only memory 220, a random access memory 230, a communication port 240, an input/output interface 250, and a hard disk 260.
Processor 210 may execute computing instructions (program code) and perform the functions of the processing device 110 described herein. The computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (which refer to particular functions described herein). For example, processor 210 may recognize a user voice instruction and determine the target operation corresponding to the instruction. In some embodiments, processor 210 may include a microcontroller, a microprocessor, a Reduced Instruction Set Computer (RISC), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction set Processor (ASIP), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a microcontroller unit, a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), an Advanced RISC Machine (ARM), a programmable logic device, any circuit or processor capable of executing one or more functions, or the like, or any combination thereof. For illustration only, the computing device 200 in fig. 2 depicts only one processor, but it should be noted that the computing device 200 in the present application may also include multiple processors.
The memory of computing device 200 (e.g., Read Only Memory (ROM) 220, Random Access Memory (RAM) 230, hard disk 260, etc.) may store data/information retrieved from any other component of the application scenario 100, for example, a target recipe retrieved from the storage device 140. Exemplary ROMs may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disk ROM (CD-ROM), Digital Versatile Disk ROM, and the like. Exemplary RAM may include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), and the like.
The input/output interface 250 may be used to input or output signals, data, or information. In some embodiments, input/output interface 250 may enable a user to interact with computing device 200. For example, a user may input a user recipe selection instruction to computing device 200 via input/output interface 250. In some embodiments, input/output interface 250 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, and the like, or any combination thereof. Exemplary output devices may include a display device, a speaker, a printer, a projector, and the like, or any combination thereof. Exemplary display devices may include a Liquid Crystal Display (LCD), a Light Emitting Diode (LED)-based display, a flat panel display, a curved display, a television device, a Cathode Ray Tube (CRT), and the like, or any combination thereof. The communication port 240 may be connected to a network for data communication. The connection may be a wired connection, a wireless connection, or a combination of both. The wired connection may include an electrical cable, an optical cable, a telephone line, or the like, or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, a mobile network (e.g., 3G, 4G, 5G, etc.), or the like, or any combination thereof. In some embodiments, the communication port 240 may be a standardized port, such as RS232, RS485, and the like. In some embodiments, communication port 240 may be a specially designed port.
For purposes of illustration only, computing device 200 depicts only one central processor and/or processor. However, it should be noted that the computing device 200 in this application may include multiple central processors and/or processors, and thus the operations and/or methods described in this application as being implemented by one central processor and/or processor may also be implemented by multiple central processors and/or processors, either collectively or independently. For example, the central processor and/or the processor of computing device 200 may perform steps A and B. In another example, steps A and B may also be performed by two different central processors and/or processors in computing device 200, in combination or separately (e.g., a first processor performing step A and a second processor performing step B, or the first and second processors collectively performing steps A and B).
Fig. 3 is an exemplary flowchart of a speech recognition-based auxiliary cooking method according to some embodiments of the present application. As shown in fig. 3, the speech recognition-based auxiliary cooking method may include the following steps. In some embodiments, the speech recognition-based auxiliary cooking method may be performed by the user terminal 130 or the computing device 200.
Step 310, a user recipe selection instruction is obtained. In some embodiments, step 310 may be performed by the recipe instruction acquisition module.
The user recipe selection instruction is used to characterize the target recipe selected by the user. In some embodiments, the user may input the user recipe selection instruction to the recipe instruction acquisition module by voice input, clicking on the screen, or the like. For example, icons of various recipes may be displayed on the display screen of the user terminal 130 or the computing device 200; the user may click on the icon of a certain recipe, and the user terminal 130 or the computing device 200 may automatically acquire the user recipe selection instruction. Illustratively, the user opens the APP, browses the recipes of the day, selects a favorite recipe as the target recipe, enters the tutorial card, and begins to prepare for cooking.
Step 320, a target recipe is obtained based on the user recipe selection instruction, and the target recipe is displayed. In some embodiments, step 320 may be performed by the target recipe acquisition module.
In some embodiments, the target recipe acquisition module may acquire the target recipe from the processing device 110, the user terminal 130, the storage device 140, or an external data source based on the user recipe selection instruction, wherein the target recipe includes a plurality of operational steps. After the target recipe is acquired, the target recipe acquisition module may display the target recipe on a display screen of the user terminal 130 or the computing device 200.
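As a non-limiting sketch of the data involved, a target recipe with its ordered operation steps might be represented as follows; all class and field names here are illustrative assumptions, not taken from the application:

```python
from dataclasses import dataclass, field

@dataclass
class OperationStep:
    """One operation step of a recipe (all names are illustrative only)."""
    index: int
    text: str           # instruction text displayed to the user
    voice_file: str     # voice file broadcast for this step

@dataclass
class Recipe:
    """A target recipe consisting of a plurality of operation steps."""
    recipe_id: str
    title: str
    steps: list = field(default_factory=list)

    def step_count(self) -> int:
        return len(self.steps)

# Build a tiny example recipe with two operation steps.
recipe = Recipe("r001", "Tomato Soup", [
    OperationStep(1, "Weigh 500 g of tomatoes.", "r001_step1.mp3"),
    OperationStep(2, "Simmer for 20 minutes.", "r001_step2.mp3"),
])
```

A structure of this kind could be serialized and fetched from the processing device 110 or an external data source, then rendered step by step on the display screen.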
Step 330, a device binding instruction is obtained. In some embodiments, step 330 may be performed by the binding instruction acquisition module.
The device binding instruction is used to characterize that the user needs the user terminal 130 or the computing device 200 to establish a communication connection with at least one intelligent kitchen device. The intelligent kitchen device may be, for example, a kitchen scale, a kitchen processing device, or the like, wherein the kitchen scale may be used to weigh ingredients and the kitchen processing device may be used to process ingredients. The kitchen processing device may include an oven, a cooker, a rice cooker, and the like. In some embodiments, the user may input the device binding instruction to the binding instruction acquisition module by voice input, clicking on the screen, or the like.
Step 340, a communication connection is established with at least one intelligent kitchen device based on the device binding instruction. In some embodiments, step 340 may be performed by the device binding module.
In some embodiments, establishing, by the device binding module, a communication connection with the at least one intelligent kitchen device based on the device binding instruction may include:
determining, based on the device binding instruction, whether the user terminal has been bound to at least one intelligent kitchen device;
if the user terminal has been bound to at least one intelligent kitchen device: acquiring and displaying the device information of the bound intelligent kitchen device and issuing a new-binding prompt, where the new-binding prompt may be a voice prompt or an identifier or text displayed on the display screen; determining, based on the user's feedback on the new-binding prompt, whether at least one additional intelligent kitchen device needs to be bound; and, if at least one additional intelligent kitchen device needs to be bound, establishing a communication connection with the at least one additional intelligent kitchen device via Bluetooth based on the user's feedback on the new-binding prompt. In some embodiments, the user may provide feedback on the new-binding prompt to the device binding module by voice input, clicking on the screen, or the like;
if the user terminal has not been bound to at least one intelligent kitchen device: issuing a binding prompt, where the binding prompt may be a voice prompt or an identifier or text displayed on the display screen; determining, based on the user's feedback on the binding prompt, whether at least one intelligent kitchen device needs to be bound; and, if at least one intelligent kitchen device needs to be bound, establishing a communication connection with the at least one intelligent kitchen device via Bluetooth based on the user's feedback on the binding prompt. In some embodiments, the user may provide feedback on the binding prompt to the device binding module by voice input, clicking on the screen, or the like.
In some embodiments, the device binding module detects whether an intelligent kitchen device has been bound. If not, the user clicks the plus sign in the APP tutorial to add a device. The module then detects whether Bluetooth is turned on; if not, it prompts the user to turn on Bluetooth. After Bluetooth is turned on, the module detects whether a device is connected; if not, it prompts the user to turn on the device. After the intelligent kitchen device is turned on, the APP searches via Bluetooth and automatically connects to the intelligent kitchen device. When the tutorial interface displays the icon of the intelligent kitchen device, the connection has succeeded.
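The detection sequence above can be sketched as a simple series of state checks. The state flags and prompt strings below are illustrative assumptions; an actual implementation would use the platform's Bluetooth APIs:

```python
def binding_checks(device_bound: bool, bluetooth_on: bool, device_connected: bool):
    """Return the ordered prompts the APP would issue during device binding.

    A simplified sketch of the binding flow: already-bound devices need no
    action; otherwise the user is prompted through each missing precondition.
    """
    prompts = []
    if device_bound:
        return prompts  # already bound, nothing to do
    if not bluetooth_on:
        prompts.append("please turn on Bluetooth")
    if not device_connected:
        prompts.append("please turn on the device")
    prompts.append("searching via Bluetooth and connecting automatically")
    return prompts

# Example: Bluetooth is off and no device is connected yet.
steps = binding_checks(device_bound=False, bluetooth_on=False, device_connected=False)
```

Once the final "searching" stage succeeds, the tutorial interface would display the device icon to signal a successful connection.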
Step 350, a voice assistance start instruction is obtained. In some embodiments, step 350 may be performed by the auxiliary instruction acquisition module.
The voice assistance start instruction is used to characterize that the user needs voice assistance from the user terminal 130 or the computing device 200 during the cooking process. In some embodiments, the user may input the voice assistance start instruction to the auxiliary instruction acquisition module by voice input, clicking on the screen, or the like.
In some embodiments, after obtaining the voice assistance start instruction, the user terminal 130 or the computing device 200 may perform operations based on the user's voice instructions.
Step 360, a recipe voice package corresponding to the target recipe is acquired based on the voice assistance start instruction, wherein the recipe voice package includes voice files respectively corresponding to the plurality of operation steps of the target recipe. In some embodiments, step 360 may be performed by the voice assistance module.
In some embodiments, the voice assistance module may acquire the recipe voice package corresponding to the target recipe from the processing device 110, the user terminal 130, the storage device 140, or an external data source based on the voice assistance start instruction.
Step 370, based on the voice assistance start instruction, the following is repeatedly performed until the user finishes cooking: acquiring a user voice instruction, and, based on the user voice instruction, voice-broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation. In some embodiments, step 370 may be performed by the voice assistance module.
In some embodiments, the user terminal 130 or computing device 200 may include a microphone that may be used to collect user voice instructions. The voice assistance module may perform voice recognition on the user voice instruction. For example, on iOS, the voice assistance module may rely on the microphone and the native Siri speech library for speech listening and recognition. After voice recognition, the user terminal 130 or the computing device 200 may extract keywords from the recognition result and match, based on the keywords, the operation corresponding to the user voice instruction.
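The keyword-to-operation matching described above might be sketched as a lookup over the recognized text. The keyword table and operation names below are hypothetical, not taken from the application; a real system would use the vocabulary of its own recipe voice package:

```python
# Map keywords extracted from a recognized utterance to the operation the
# system should perform. Table contents are illustrative assumptions.
COMMAND_KEYWORDS = {
    "replay": "broadcast_current_step",
    "next": "broadcast_next_step",
    "zero": "zero_kitchen_scale",
    "unit": "switch_weighing_unit",
}

def match_operation(recognized_text: str):
    """Return the first operation whose keyword appears in the recognized text."""
    lowered = recognized_text.lower()
    for keyword, operation in COMMAND_KEYWORDS.items():
        if keyword in lowered:
            return operation
    return None  # no keyword matched; the utterance is ignored
```

Utterances with no matching keyword yield `None`, so stray speech does not trigger an operation.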
In some embodiments, the voice assistance module voice-broadcasting the at least one voice file based on the user voice instruction may include:
replaying the voice file corresponding to the current operation step, or playing the voice file corresponding to the next operation step, based on the user voice instruction.
For example, before each operation step starts, the voice assistance module broadcasts the voice file corresponding to the current step; if the user says "replay once again," the voice assistance module broadcasts it again; if the user says "switch next," the voice assistance module automatically switches to the next operation step and broadcasts the voice file corresponding to that step.
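The replay/advance behavior can be sketched as a pointer into the list of voice files. The class name, trigger words, and the in-memory broadcast log are illustrative assumptions standing in for actual audio playback:

```python
class VoiceAssistant:
    """Tracks the current operation step and reacts to the two voice
    instructions described above (a sketch; names are assumptions)."""

    def __init__(self, voice_files):
        self.voice_files = voice_files
        self.current = 0          # index of the current operation step
        self.broadcast_log = []   # stands in for actual audio playback

    def _broadcast(self):
        self.broadcast_log.append(self.voice_files[self.current])

    def on_instruction(self, text: str):
        if "replay" in text:      # e.g. "replay once again"
            self._broadcast()
        elif "next" in text:      # e.g. "switch next"
            if self.current + 1 < len(self.voice_files):
                self.current += 1
            self._broadcast()

assistant = VoiceAssistant(["step1.mp3", "step2.mp3"])
assistant.on_instruction("replay")
assistant.on_instruction("next")
```

Note the bound check on "next": at the final step the assistant simply re-broadcasts rather than running past the end of the recipe.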
In some embodiments, the target operation that the voice assistance module controls the at least one intelligent kitchen device to perform based on the user voice instruction includes at least one of zeroing the kitchen scale, switching the weighing unit of the kitchen scale, controlling the processing temperature of the kitchen processing device, and controlling the operating time of the kitchen processing device.
For example, when the user says "switch next," the voice assistance module automatically switches to the next operation step, broadcasts the voice file corresponding to that step, and zeroes the kitchen scale. For another example, when the user says "switch the weighing unit to kg," the voice assistance module switches the weighing unit of the kitchen scale to kilograms.
In some embodiments, for the current operation step, the voice assistance module automatically sets the values required for cooking (e.g., at least one of the processing temperature of the kitchen processing device and the operating time of the kitchen processing device) by identifying keywords of the current operation step in the target recipe, converting the numerical values in the keywords into data, and transmitting the data to the kitchen processing device.
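Extracting such numerical settings from the text of a step might be sketched with simple pattern matching. The regular expressions, units, and key names are assumptions for illustration; a real implementation would follow the actual format of the recipe steps:

```python
import re

def extract_settings(step_text: str):
    """Extract a processing temperature and an operating time from the text
    of a recipe step (patterns and units are illustrative assumptions)."""
    settings = {}
    temp = re.search(r"(\d+)\s*(?:°C|degrees)", step_text)
    if temp:
        settings["temperature_c"] = int(temp.group(1))
    time = re.search(r"(\d+)\s*minutes?", step_text)
    if time:
        settings["time_min"] = int(time.group(1))
    return settings

settings = extract_settings("Bake at 180 °C for 25 minutes.")
```

The resulting values could then be transmitted to the kitchen processing device over the established Bluetooth connection.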
In some embodiments, a progress display module may further obtain the operation steps completed by the user, generate a progress bar based on the completed operation steps, and display the progress bar. For example, during cooking, the user follows the prompts, and the kitchen scale and the kitchen processing device may transmit the user's operation data to the user terminal 130 or the computing device 200, which may convert the operation data into a progress bar that displays the user's operation progress. When an operation is completed, the progress bar finishes loading and the step is shown as completed, and the user terminal 130 or the computing device 200 broadcasts the next step by voice. Each step is broadcast according to the user's progress until cooking is completed, which effectively reduces erroneous operations caused by the user's subjective judgment.
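Deriving a progress bar from the number of completed operation steps can be sketched as below; the textual rendering is a stand-in for the APP's graphical bar, and the function name and bar width are illustrative assumptions:

```python
def progress_bar(completed_steps: int, total_steps: int, width: int = 10):
    """Render a textual progress bar from the completed operation steps."""
    if total_steps <= 0:
        raise ValueError("total_steps must be positive")
    done = min(completed_steps, total_steps)   # clamp to the recipe length
    filled = round(width * done / total_steps)
    percent = round(100 * done / total_steps)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {percent}%"

# 3 of 4 operation steps completed:
bar = progress_bar(completed_steps=3, total_steps=4)
```

When the bar reaches 100%, the system would mark the step as completed and broadcast the next one.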
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the present application may occur to one skilled in the art. Such modifications, improvements, and adaptations are intended to be suggested by this application, and are therefore within the spirit and scope of the exemplary embodiments of this application.
Meanwhile, the present application uses specific words to describe embodiments of the present application. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present application. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present application may be combined as suitable.
Furthermore, those skilled in the art will appreciate that the various aspects of the invention are illustrated and described in the context of a number of patentable categories or circumstances, including any novel and useful procedures, machines, products, or materials, or any novel and useful modifications thereof. Accordingly, aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may take the form of a computer product, comprising computer-readable program code, embodied in one or more computer-readable media.
The computer storage medium may contain a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take on a variety of forms, including electro-magnetic, optical, etc., or any suitable combination thereof. A computer storage medium may be any computer readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or a combination of any of the foregoing.
The computer program code necessary for the operation of portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet), or via a service such as Software as a Service (SaaS) in a cloud computing environment.
Furthermore, the order in which elements and sequences are presented, the use of numbers or letters, or the use of other designations in this application is not intended to limit the order in which the processes and methods of the application are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present application. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation disclosed herein and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not intended to imply that the subject application requires more features than are presented in the claims. Indeed, claimed subject matter may lie in less than all features of a single disclosed embodiment.
In some embodiments, numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of embodiments are modified in some examples by the modifier "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows for a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought to be obtained by the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-retention method. Although the numerical ranges and parameters used to confirm the breadth of their ranges are approximations in some embodiments, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., cited in this application is hereby incorporated by reference in its entirety. Excluded are application history documents that are inconsistent with or conflict with the content of this application, as well as documents (currently or later appended to this application) that limit the broadest scope of the claims of this application. It is noted that, where the descriptions, definitions, and/or use of terms in the materials accompanying this application are inconsistent or in conflict with the content described herein, the descriptions, definitions, and/or use of terms in this application shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of this application. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present application may be considered in keeping with the teachings of the present application. Accordingly, embodiments of the present application are not limited to only the embodiments explicitly described and depicted herein.