Detailed Description
To help those skilled in the art better understand the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a cooking real-time guiding method according to a first embodiment of the present invention. The method is applicable to guiding a user to cook in real time during a cooking process and may be performed by a cooking real-time guiding device, which may be implemented in hardware and/or software and configured in an electronic device. The electronic device may be a server, such as a cloud server, or may be an integrated cooker; the following description takes the cloud server as an example. As shown in Fig. 1, the method includes:
S101, receiving a preset dish identifier set by a user on a terminal device, and determining a preset cooking menu according to the preset dish identifier.
In this embodiment, the terminal device includes an intelligent device operable by the user. The intelligent device may be a mobile terminal such as a mobile phone or a tablet computer, or a cooking device such as an integrated cooker or a functional box, where the functional box may be a steam box, an oven, or the like; this embodiment is not limited thereto. The preset cooking recipe may be stored in a cloud server, for example in a recipe database of the cloud server, and is used to guide the user in cooking the dish. The preset dish identifier is used to determine the preset cooking recipe and may be a dish name or a dish number, which is not limited in this embodiment.
Specifically, a plurality of preset dish identifiers are provided on the terminal device, and when the user needs cooking guidance, a preset dish identifier can be set on the terminal device. After receiving the preset dish identifier set by the user and sent by the terminal device, the cloud server matches a preset cooking recipe in the recipe database according to the preset dish identifier and guides the user to cook through the terminal device.
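The lookup in S101 can be sketched as a simple keyed match against a recipe database. All names here (`RECIPE_DB`, `lookup_recipe`, the sample dish identifiers) are illustrative assumptions, not part of the described system:

```python
# Hypothetical sketch: matching a preset dish identifier against a recipe
# database on the cloud server. The identifier may be a dish name or number.
RECIPE_DB = {
    "stir_fried_potato": {"dish_name": "Shredded potato",
                          "steps": ["heat oil", "add potato", "season"]},
    "steamed_fish": {"dish_name": "Steamed fish",
                     "steps": ["prepare fish", "steam 8 min", "add sauce"]},
}

def lookup_recipe(dish_id: str):
    """Return the preset cooking recipe matching the dish identifier, if any."""
    return RECIPE_DB.get(dish_id)

recipe = lookup_recipe("steamed_fish")
```

In practice the recipe database would live on the cloud server and be queried over the network; the dictionary here only stands in for that store.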
S102, acquiring user cooking information of a user in a cooking process.
In this embodiment, the cooking process covers the entire period from when the user turns on the heat to when the user turns it off. The user cooking information includes data related to the user's cooking process, such as cooking temperature, firepower, and food material weight. The cooking data may be collected by the terminal device and sent to the cloud server, and the cloud server determines the user cooking information from the cooking data uploaded by the terminal device.
Specifically, in order to know the user's situation during the actual cooking process, the user cooking information needs to be acquired in real time during cooking.
S103, judging whether the cooking information of the user is consistent with corresponding preset cooking information in a preset cooking menu.
In this embodiment, preset cooking information is stored in the cloud server for guiding the cooking operation of the user, for example, the cooking operation mode, the required fire power, and the like of the user can be guided.
Specifically, in order to provide targeted guidance for the user's cooking operation, it is necessary to judge whether the user cooking information matches the corresponding preset cooking information in the preset cooking recipe. A match indicates that the user cooking information is within the allowable range of the preset cooking information, that is, the information related to the user's cooking process basically conforms to the preset cooking information.
Optionally, if the information matches, the user can continue to be guided to cook according to the preset cooking recipe.
S104, if the user cooking information does not match, inputting the user cooking information into a cooking prediction model to obtain a predicted cooking recipe.
In this embodiment, the predicted cooking recipe is used to guide the user's subsequent cooking operation. The cooking prediction model predicts, from the user's current cooking information, the predicted cooking recipe needed to guide the user's subsequent cooking. The cooking prediction model may be trained on cooking experimental data and may include a first functional relation, a second functional relation, a third functional relation, and a fourth proportional relation corresponding to different sample food material weights under different cooking operation modes. The first functional relation relates the maturity of the sample food material, the cooking temperature curve, and the cooking operation duration; the second functional relation relates the oil smoke concentration, the weight of the sample cooking oil, the cooking temperature curve, and the cooking operation duration; the third functional relation relates the time for adding the sample seasoning and the maturity of the sample food material; and the fourth proportional relation relates the weight of the sample seasoning to the weight of the sample food material.
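One way to picture the four relations is as a bundle of callables keyed by cooking operation mode and sample food material weight. The functional forms below are placeholders invented for illustration; the text does not specify them:

```python
# Illustrative sketch of how the four relations of the cooking prediction
# model might be organized. All formulas are assumptions, not the trained model.
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class CookingRelations:
    # first relation: maturity = f1(temperature_curve, duration_s)
    maturity: Callable[[Sequence[float], float], float]
    # second relation: smoke_concentration = f2(oil_weight_g, temperature_curve, duration_s)
    smoke: Callable[[float, Sequence[float], float], float]
    # third relation: seasoning_add_time_s = f3(maturity)
    seasoning_time: Callable[[float], float]
    # fourth (proportional) relation: seasoning_weight = ratio * food_weight
    seasoning_ratio: float

# One hypothetical entry, e.g. for (mode="stir_fry", sample_weight=500 g):
relations = CookingRelations(
    maturity=lambda curve, t: min(1.0, sum(curve) / len(curve) / 200 * t / 300),
    smoke=lambda oil, curve, t: 0.001 * oil * max(curve) * t / 60,
    seasoning_time=lambda m: 120.0 if m < 0.5 else 60.0,
    seasoning_ratio=0.02,
)

# Fourth relation in use: seasoning weight for 500 g of food material.
seasoning_weight = relations.seasoning_ratio * 500
```

A trained model would fit these relations from the cooking experimental data per operation mode and sample weight; the lambdas above merely fix the shape of the interface.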
Specifically, when the user cooking information does not match the preset cooking information, continuing to operate according to the preset cooking recipe cannot achieve the intended effect. The user cooking information is therefore input into the cooking prediction model, and the predicted cooking recipe is determined from the current user cooking information, so that the recipe guiding the cooking is dynamically adjusted in real time according to the user's actual cooking situation.
S105, guiding the user to finish subsequent cooking according to the predicted cooking menu.
Specifically, after the predicted cooking menu is obtained, the user is guided to complete the subsequent cooking operation according to the predicted cooking menu.
According to the technical scheme provided by this embodiment of the invention, a preset dish identifier set by the user on the terminal device is received, and a preset cooking recipe is determined according to the preset dish identifier; user cooking information of the user during cooking is acquired; whether the user cooking information matches the corresponding preset cooking information in the preset cooking recipe is judged; if not, the user cooking information is input into a cooking prediction model to obtain a predicted cooking recipe; and the user is guided to complete the subsequent cooking according to the predicted cooking recipe. With this scheme, the user's subsequent cooking operation is guided in real time according to the user's current cooking state, and when the acquired user cooking information is inconsistent with the guiding recipe, the guiding recipe is modified according to the user cooking information to guide the user's subsequent cooking operation. This solves the problem that existing cooking guidance methods cannot provide targeted guidance for a person's cooking operations during cooking, which makes cooking difficult and lowers the success rate; the technical scheme of the present application provides targeted guidance for a person's cooking operations during cooking, effectively reducing the cooking difficulty and improving the cooking success rate.
In some embodiments, the cooking information includes: food material information, seasoning information, cooking operation information, firepower information, and smoke machine information. The food material information includes the food material kind, food material name, food material weight, food material processing mode, and food material maturity; the seasoning information includes the seasoning name and seasoning weight; the cooking operation information includes the cooking operation mode, cooking operation order, cooking operation duration, and cooking temperature curve; the firepower information includes the firepower gear and firepower time; and the smoke machine information includes the smoke machine gear and oil smoke concentration. With this scheme, more comprehensive cooking information can be obtained, laying a foundation for predicting the cooking recipe.
In this embodiment, the food material processing mode may include shredding, dicing, slicing, cutting, or chopping. The cooking operation modes may include adding, stir-frying, steaming, deep-frying, boiling, stewing, and the like. The cooking temperature curve is a temperature curve formed from the cooking time and the temperature generated during cooking.
Specifically, the cooking information may be user cooking information or preset cooking information. The user cooking information includes user food material information, user seasoning information, user cooking operation information, user firepower information, and user smoke machine information. The user food material information includes the user food material kind, user food material name, user food material weight, user food material processing mode, and user food material maturity; the user seasoning information includes the user seasoning name and user seasoning weight; the user cooking operation information includes the user cooking operation mode, user cooking operation order, user cooking operation duration, and user cooking temperature curve; the user firepower information includes the user firepower gear and user firepower time; and the user smoke machine information includes the user smoke machine gear and user oil smoke concentration. The preset cooking information includes preset food material information, preset seasoning information, preset cooking operation information, preset firepower information, and preset smoke machine information, each with the corresponding preset items: the preset food material kind, name, weight, processing mode, and maturity; the preset seasoning name and weight; the preset cooking operation mode, order, duration, and temperature curve; the preset firepower gear and time; and the preset smoke machine gear and oil smoke concentration.
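The parallel structure of user and preset cooking information suggests a shared record type. A minimal sketch using Python dataclasses follows; the field names are illustrative translations of the items listed above, not an API from the described system:

```python
# Hypothetical data model for one cooking-information record. The same
# structure can hold either user cooking information or preset cooking
# information, which is what makes the later item-by-item comparison possible.
from dataclasses import dataclass
from typing import List

@dataclass
class FoodMaterialInfo:
    kind: str            # e.g. "vegetable"
    name: str            # e.g. "potato"
    weight_g: float
    processing: str      # e.g. "shredded"
    maturity: float      # 0.0 (raw) .. 1.0 (fully cooked)

@dataclass
class SeasoningInfo:
    name: str
    weight_g: float

@dataclass
class CookingOperationInfo:
    mode: str                      # e.g. "stir_fry"
    order: int                     # position in the operation sequence
    duration_s: float
    temperature_curve: List[float] # sampled temperatures over time

@dataclass
class FireInfo:
    gear: int
    time_s: float

@dataclass
class HoodInfo:  # "smoke machine" information (range hood)
    gear: int
    smoke_concentration: float

@dataclass
class CookingInfo:
    food: FoodMaterialInfo
    seasoning: SeasoningInfo
    operation: CookingOperationInfo
    fire: FireInfo
    hood: HoodInfo
```

A `CookingInfo` built from sensor data plays the role of the user cooking information, while one loaded from the recipe database plays the role of the preset cooking information.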
In some embodiments, the cooking real-time guiding method further comprises: in response to a first cooking function request input by the user on the terminal device, controlling the terminal device to prompt the user, by image, video, or voice, to perform a cooking operation within a preset period before a key action time point; or, in response to a second cooking function request input by the user on the terminal device, controlling the terminal device to automatically adjust the firepower and the smoke machine while prompting the user to perform the cooking operation. With this scheme, the predicted cooking recipe is used to guide or assist the user in subsequent cooking.
In this embodiment, the first cooking function request includes a guiding request, and the second cooking function request includes an auxiliary request. The key actions include adding ingredients, stir-frying, and the like. The preset period is a period set in advance; for example, the preset period may be 8 s.
For example, when setting the preset dish identifier, the user may input a cooking function request on the terminal device as needed, for example a guiding request or an auxiliary request on a mobile terminal such as a mobile phone or tablet computer, or on the cooking device. If the user inputs the guiding request, after the predicted cooking recipe is determined, a guiding program instruction generated from the predicted cooking recipe is sent to the terminal device in response to the guiding request, and the terminal device is controlled to prompt the user, by image, video, or voice, to perform the cooking operation within 8 s before the time point corresponding to each key action. If the user inputs the auxiliary request, after the predicted cooking recipe is determined, an auxiliary program instruction generated from the predicted cooking recipe is sent to the terminal device in response to the auxiliary request; the terminal device is controlled to prompt the user in the same way within 8 s before the time point corresponding to each key action, and meanwhile the gas proportional valve is automatically adjusted according to the auxiliary program instruction to control the firepower, and the range hood is turned on or its gear adjusted at the specified time.
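The prompt-scheduling step above can be sketched as a small helper: given the key action time points of a predicted recipe, compute when the prompts should fire. The 8 s lead time follows the example in the text; the function name is an assumption:

```python
# Hedged sketch of prompt scheduling: issue each prompt a fixed lead time
# before its key action, clamping so a prompt is never scheduled before start.
LEAD_TIME_S = 8.0  # preset period before each key action (example value)

def prompt_times(key_action_times):
    """Return the moments (seconds from cooking start) to issue prompts."""
    return [max(0.0, t - LEAD_TIME_S) for t in key_action_times]

# Key actions at 90 s (add seasoning) and 150 s (stir-fry), for instance:
schedule = prompt_times([90.0, 150.0])
```

A real implementation would attach the prompt content (image, video, or voice) and, for the auxiliary request, the valve and hood adjustments, to each scheduled moment.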
Example 2
Fig. 2 is a flowchart of a cooking real-time guiding method according to a second embodiment of the present invention. This embodiment is optimized and expanded based on the above alternative embodiments and details how the user cooking information is obtained and how the cooking prediction model obtains the predicted cooking recipe from the user cooking information. As shown in Fig. 2, the method includes:
S201, receiving a preset dish identifier set by the user on the terminal device, and determining a preset cooking recipe according to the preset dish identifier.
S202, receiving user cooking data of the user; wherein the user cooking data includes: original images acquired by an image acquisition device on a target integrated cooker, an original cooking temperature detected by a temperature detection device on the target integrated cooker, and a user food material weight and a user seasoning weight measured by a weighing device on the target integrated cooker.
In this embodiment, the target integrated cooker is the integrated cooker the user is currently using to cook the dish. The user cooking data is the cooking data generated by the user during the cooking process. The image acquisition device captures original pictures or videos of the user's cooking operations and of the food materials in the pot during cooking; it may be a camera arranged above the air inlet of the target integrated cooker, facing the cooktop. The temperature detection device may be a temperature-sensing probe. The weighing device may be a high-precision electronic scale located at the bottom of the gas stove. This embodiment does not limit the specific types of the image acquisition device, temperature detection device, and weighing device.
Illustratively, receiving the user cooking data uploaded by the target integrated cooker includes receiving the original pictures or videos of the user's cooking operations and of the food materials in the pot taken by the camera, the original cooking temperature detected by the temperature-sensing probe, and the user food material weight and user seasoning weight measured by the high-precision electronic scale.
S203, processing the user cooking data to obtain the user cooking information.
Specifically, when the user cooking data is received, it can be processed to obtain the user cooking information. Optionally, processing the user cooking data includes: locating the objects in the original image and labeling their regions by a positioning algorithm to obtain a target labeled image; extracting target image features from the target labeled image by a convolutional neural network model; matching the target image features against preset image features to obtain object names; analyzing the object names to obtain the user food material name, the user seasoning name, and the user cooking operation mode, and determining the user food material kind according to the user food material name; and determining the user cooking temperature curve from the original cooking temperature.
The original picture contains objects such as the pot, food materials, seasoning bottles, cooking tools, and limbs. First, the image is preprocessed, for example by size adjustment and image enhancement; then a selective search algorithm locates the objects in the image and labels their regions; a convolutional neural network extracts shape, color, texture, and spatial features from the image; the object names are identified by feature matching; and the object names, such as potato shreds, soy sauce, a spatula, and hands, are analyzed. From the potato shreds the added user food material name can be obtained, from the soy sauce the user seasoning name, and from the spatula and hands the user cooking operation mode.
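The final analysis step, mapping recognized object names to food material, seasoning, and operation mode, can be sketched as below. The object vocabulary and the rule inferring the operation mode are illustrative assumptions; the detection and feature-matching stages (selective search plus a CNN) are assumed to have already produced the name list:

```python
# Hypothetical mapping from recognized object names to user cooking
# information. A real system would obtain `names` from the vision pipeline.
PRESET_OBJECTS = {
    "potato_shreds": "food",
    "soy_sauce": "seasoning",
    "spatula": "tool",
    "hand": "tool",
}

def analyze_object_names(names):
    info = {"food_names": [], "seasoning_names": [], "operation": None}
    for n in names:
        role = PRESET_OBJECTS.get(n)
        if role == "food":
            info["food_names"].append(n)
        elif role == "seasoning":
            info["seasoning_names"].append(n)
    # Illustrative rule: spatula plus hand in frame suggests stir-frying.
    if "spatula" in names and "hand" in names:
        info["operation"] = "stir_fry"
    return info

result = analyze_object_names(["potato_shreds", "soy_sauce", "spatula", "hand"])
```

The dictionary lookup stands in for the feature-matching step against preset image features; only the name-to-information analysis is shown here.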
Alternatively, in some embodiments, the user cooking information may be content directly input by the user according to his cooking experience.
S204, judging whether the user cooking information matches the corresponding preset cooking information in the preset cooking recipe.
Specifically, judging whether the user cooking information matches the corresponding preset cooking information in the preset cooking recipe includes judging whether the user food material information, user seasoning information, user cooking operation information, user firepower information, and user smoke machine information match, respectively, the preset food material information, preset seasoning information, preset cooking operation information, preset firepower information, and preset smoke machine information in the preset cooking information, where matching may mean that the difference is within a preset error range.
Optionally, the judging further comprises: judging whether the user food material kind in the user food material information is the same as the preset food material kind in the preset cooking information, where being the same includes both the kinds of food material and the number of kinds being the same, and the kinds include but are not limited to meat, vegetables, fruits, and the like; and/or judging whether the difference between the user food material weight and the preset food material weight in the preset cooking information is within a user error interval, where the user error interval includes a user threshold and a preset threshold; and/or judging whether the user food material processing mode in the user food material information is the same as the preset food material processing mode in the preset cooking information; and/or judging whether the user cooking operation mode is the same as the preset cooking operation mode in the preset cooking operation information; and/or judging whether the user cooking operation order and user cooking operation duration completed so far match the preset cooking operation order and preset cooking operation duration in the preset cooking information; and/or judging whether the difference between the user seasoning weight and the preset seasoning weight in the preset cooking information is within a preset error interval, where the preset error interval includes a third threshold and a fourth threshold; and/or judging whether the difference between the user food material maturity and the preset food material maturity in the preset cooking information is within a third error interval, where the third error interval includes a fifth threshold and a sixth threshold; and/or judging whether, for each common time period, the difference between the slopes of the user cooking temperature curve and of the preset temperature curve in the preset cooking information is within a fourth error interval, where the fourth error interval includes a seventh threshold and an eighth threshold.
If all the judging results are yes, the user cooking information matches the corresponding preset cooking information in the preset cooking recipe; if at least one judging result is no, the user cooking information does not match the corresponding preset cooking information in the preset cooking recipe.
The user error interval, the user threshold, the preset error interval, the third threshold, the fourth threshold, the third error interval, the fifth threshold, the sixth threshold, the fourth error interval, the seventh threshold and the eighth threshold mentioned in this embodiment are all set according to the actual situation, which is not limited by the embodiment of the present invention.
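The item-by-item compliance check can be sketched as follows. The field names and the numeric thresholds are illustrative assumptions; as the text states, the actual intervals are set according to the situation:

```python
# Hedged sketch of the S204 compliance check: categorical items must be
# equal; numeric items must differ from the preset by an amount inside an
# error interval. All threshold values below are placeholders.
def within(value, preset, low, high):
    """True if value - preset falls inside the interval [low, high]."""
    return low <= value - preset <= high

def complies(user, preset):
    checks = [
        user["food_kind"] == preset["food_kind"],
        within(user["food_weight"], preset["food_weight"], -50, 50),      # grams
        user["processing"] == preset["processing"],
        user["operation_mode"] == preset["operation_mode"],
        within(user["seasoning_weight"], preset["seasoning_weight"], -5, 5),
        within(user["maturity"], preset["maturity"], -0.1, 0.1),
    ]
    # Any single failed item means the information does not match.
    return all(checks)

user = {"food_kind": "vegetable", "food_weight": 480, "processing": "shredded",
        "operation_mode": "stir_fry", "seasoning_weight": 12, "maturity": 0.45}
preset = {"food_kind": "vegetable", "food_weight": 500, "processing": "shredded",
          "operation_mode": "stir_fry", "seasoning_weight": 10, "maturity": 0.5}
matched = complies(user, preset)  # every item is inside its interval here
```

The temperature-curve slope comparison would follow the same pattern, with `within` applied per time period to the slope difference.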
S205, if the user cooking information does not match, inputting the user food material information, the user seasoning information, and the user cooking operation information into the cooking prediction model.
Specifically, if any item of the user cooking information is judged to be inconsistent with the preset cooking information, it is determined that the user cooking information does not match the preset cooking information, and the user food material information, user seasoning information, and user cooking operation information in the user cooking information are input into the cooking prediction model for prediction.
S206, in the cooking prediction model, determining predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information according to the user food material information, user seasoning information, and user cooking operation information.
The predicted food material information comprises a predicted food material name and a predicted food material weight, the predicted seasoning information comprises a predicted seasoning name, a predicted seasoning weight, a predicted seasoning adding time and a predicted seasoning adding sequence, the predicted firepower information comprises a predicted firepower gear and a predicted firepower time, the predicted smoke machine information comprises a predicted smoke machine gear and a predicted oil smoke concentration, and the predicted cooking operation information comprises a residual cooking operation mode, a residual cooking operation sequence and a residual cooking operation duration.
Specifically, in the cooking prediction model, the predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information can be determined from the user food material information, user seasoning information, and user cooking operation information. Optionally: the predicted food material name and predicted food material weight to be added are determined according to the user food material kind and user food material weight in the user food material information; the predicted seasoning name and predicted seasoning weight to be added are determined according to the user food material weight in the user food material information and the user seasoning name and user seasoning weight in the user seasoning information; the predicted seasoning adding order and the user target time for reaching the target food material maturity are determined according to the user cooking operation mode in the user cooking operation information and the user food material maturity in the user food material information, where the user target time is used to determine the predicted seasoning adding time; a target temperature curve is determined according to the user food material maturity, the user cooking operation duration in the user cooking operation information, and the user cooking temperature curve, a firepower function of the target temperature curve is analyzed, and the predicted firepower gear and predicted firepower time required to reach the target temperature are determined; the predicted smoke machine gear and predicted oil smoke concentration are determined according to the user cooking-oil weight in the user seasoning information, the user cooking temperature curve, and the cooking operation duration; and the remaining cooking operation mode, remaining cooking operation order, and remaining cooking operation duration are determined according to the preset cooking recipe and the user cooking operations already completed.
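The last of these steps, deriving the remaining operations from the preset recipe and the operations already completed, can be sketched directly. The step names are placeholders:

```python
# Illustrative sketch: the remaining cooking operations are the preset steps
# the user has not yet performed, kept in the preset order.
def remaining_operations(preset_steps, completed):
    """Preset steps not yet completed, preserving the preset order."""
    done = set(completed)
    return [s for s in preset_steps if s not in done]

preset_steps = ["heat_oil", "add_food", "stir_fry", "add_seasoning", "plate"]
remaining = remaining_operations(preset_steps, ["heat_oil", "add_food"])
# remaining now holds the stir-fry, seasoning, and plating steps
```

The other determinations (seasoning weight, firepower gear, smoke machine gear) would come from the model's functional and proportional relations rather than a set difference, so they are not shown here.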
S207, generating the predicted cooking recipe according to the predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information.
Specifically, after the predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information are determined, they can be stored in the recipe database to form the predicted cooking recipe.
S208, guiding the user to complete the subsequent cooking work according to the predicted cooking recipe.
For example, if the user cooking operation mode at the current moment is stir-frying while the preset cooking operation mode is deep-frying, it is determined that the user cooking operation information does not match the preset cooking operation information. Continuing to cook according to the preset cooking recipe would then make it difficult to achieve the expected cooking effect, so the user cooking information at the current moment, including the user food material information, user seasoning information, and user cooking operation information, is input into the cooking prediction model to obtain the predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information, for example the subsequent firepower level, subsequent cooking duration, and subsequent cooking operations; the predicted cooking recipe is then generated to guide the user in the subsequent cooking work.
According to the technical scheme provided by this embodiment of the invention, the user cooking data acquired by the image acquisition device, temperature detection device, and weighing device are received in real time and processed to obtain the user cooking information, so that the user cooking information during cooking can be acquired accurately, laying a foundation for judging the user's current cooking operation. Whether the user cooking information matches the corresponding preset cooking information in the preset cooking recipe is then judged; when it does not, the user food material information, user seasoning information, and user cooking operation information in the user cooking information are input into the cooking prediction model to determine the predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information, from which the predicted cooking recipe is generated, and the user is guided to complete the subsequent cooking work according to the predicted cooking recipe. In this way, the user's subsequent cooking operation is guided in a targeted manner according to the user's current cooking state, effectively reducing the cooking difficulty, improving the cooking success rate, and greatly facilitating the user's life.
Example 3
Fig. 3 is a schematic structural view of a cooking real-time guiding device according to a third embodiment of the present invention. As shown in fig. 3, the apparatus includes: a first recipe determination module 31, an information acquisition module 32, an information judgment module 33, a second recipe determination module 34, and a cooking instruction module 35.
The first menu determining module 31 is configured to receive a preset dish identifier set by a user on the terminal device, and determine a preset cooking menu according to the preset dish identifier; an information acquisition module 32 for acquiring user cooking information of a user during cooking; an information judging module 33, configured to judge whether the user cooking information matches with corresponding preset cooking information in a preset cooking recipe; a second recipe determination module 34, configured to input the user cooking information into the cooking prediction model if the user cooking information does not match, so as to obtain a predicted cooking recipe; a cooking guidance module 35 for guiding the user to complete the subsequent cooking according to the predicted cooking recipe.
The technical scheme provided by the third embodiment of the invention solves the problems that the existing cooking guiding method cannot conduct targeted guidance according to the cooking operation of people in the cooking process, so that the cooking difficulty is high and the success rate is low.
Optionally, the cooking information includes:
food material information, seasoning information, cooking operation information, firepower information, and smoke machine information; wherein the food material information includes the food material kind, food material name, food material weight, food material processing mode, and food material maturity; the seasoning information includes the seasoning name and seasoning weight; the cooking operation information includes the cooking operation mode, cooking operation order, cooking operation duration, and cooking temperature curve; the firepower information includes the firepower gear and firepower time; and the smoke machine information includes the smoke machine gear and oil smoke concentration.
Optionally, the second recipe determination module 34 includes:
a model input unit for inputting the user food material information, user seasoning information, and user cooking operation information in the user cooking information into the cooking prediction model;
a model prediction unit for determining, in the cooking prediction model, predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information according to the user food material information, user seasoning information, and user cooking operation information; where the predicted food material information includes a predicted food material name and a predicted food material weight; the predicted seasoning information includes a predicted seasoning name, a predicted seasoning weight, a predicted seasoning adding time, and a predicted seasoning adding sequence; the predicted firepower information includes a predicted firepower gear and a predicted firepower time; the predicted smoke machine information includes a predicted smoke machine gear and a predicted smoke concentration; and the predicted cooking operation information includes a remaining cooking operation mode, a remaining cooking operation sequence, and a remaining cooking operation duration;
and a recipe generation unit for generating the predicted cooking recipe according to the predicted food material information, predicted seasoning information, predicted firepower information, predicted smoke machine information, and predicted cooking operation information.
Optionally, the model prediction unit includes at least one of:
a first determining subunit for determining a predicted food material type, a predicted food material name, and a predicted food material weight to be added according to the user food material type and the user food material weight in the user food material information;
a second determining subunit for determining a predicted seasoning name and a predicted seasoning weight to be added according to the user food material weight in the user food material information and the user seasoning name and user seasoning weight in the user seasoning information;
a third determining subunit for determining a predicted seasoning adding sequence and a user target time for reaching the target food material maturity according to the user cooking operation mode in the user cooking operation information and the user food material maturity in the user food material information, where the user target time is used for determining the predicted seasoning adding time;
a fourth determining subunit for determining a target temperature curve according to the user food material maturity, the user cooking operation duration, and the user cooking temperature curve in the user cooking information, performing thermal function analysis on the target temperature curve, and determining the predicted firepower gear and predicted firepower time required for reaching the target temperature value;
a fifth determining subunit for determining the predicted smoke machine gear and the predicted smoke concentration according to the user edible oil weight in the user seasoning information, the user cooking temperature curve, and the user cooking operation duration;
and a sixth determining subunit for determining the remaining cooking operation mode, remaining cooking operation sequence, and remaining cooking operation duration according to the preset cooking recipe and the user cooking operations already completed by the user.
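As an illustration of the second determining subunit, the predicted seasoning weight can be derived by scaling the preset seasoning weight by the ratio of the user's actual food material weight to the preset one. This proportional rule is an assumption made for the sketch; the embodiment only states which inputs the prediction depends on, not the formula.

```python
# Hedged sketch of the second determining subunit. The proportional
# scaling rule below is an illustrative assumption, not the
# embodiment's prescribed prediction formula.

def predict_seasoning(user_food_weight_g, preset_food_weight_g, preset_seasonings):
    """Return (seasoning name, predicted weight in grams) pairs to add."""
    ratio = user_food_weight_g / preset_food_weight_g
    return [(name, round(weight_g * ratio, 1)) for name, weight_g in preset_seasonings]
```

For example, with 600 g of food material against a preset 500 g, `predict_seasoning(600, 500, [("salt", 5.0), ("soy sauce", 15.0)])` scales both seasoning weights by 1.2.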
Optionally, the information judgment module 33 includes at least one of the following judging units, and determines that the user cooking information does not match the preset cooking information if any one of the following checks fails:
a first judging unit for judging whether the user food material type in the user food material information is the same as the preset food material type in the preset cooking information;
here, the food material types being the same means both that the food material categories are the same and that the number of food material categories is the same, where the food material categories include, but are not limited to, meat, vegetables, and fruits;
a second judging unit for judging whether the difference between the user food material weight and the preset food material weight in the preset cooking information lies in a first error interval, where the first error interval includes a first threshold and a second threshold;
a third judging unit for judging whether the user food material processing mode in the user food material information is the same as the preset food material processing mode in the preset cooking information;
a fourth judging unit for judging whether the user cooking operation mode is the same as the preset cooking operation mode in the preset cooking information;
a fifth judging unit for judging whether the user cooking operation sequence and the user cooking operation duration completed by the user conform to the preset cooking operation sequence and the preset cooking operation duration in the preset cooking information;
a sixth judging unit for judging whether the difference between the user seasoning weight and the preset seasoning weight in the preset cooking information lies in a second error interval, where the second error interval includes a third threshold and a fourth threshold;
a seventh judging unit for judging whether the difference between the user food material maturity and the preset food material maturity in the preset cooking information lies in a third error interval, where the third error interval includes a fifth threshold and a sixth threshold;
and an eighth judging unit for judging whether the differences between the slopes of the user cooking temperature curve and of the preset temperature curve in the preset cooking information at the same time points lie in a fourth error interval, where the fourth error interval includes a seventh threshold and an eighth threshold.
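The judging units above reduce to two kinds of checks: equality of categorical values, and a deviation from the preset value falling within an error interval. The sketch below illustrates a handful of those checks; the field names and interval bounds are assumptions, since the embodiment leaves the concrete threshold values unspecified.

```python
# Sketch of the matching checks of the information judgment module.
# Field names and interval bounds are illustrative assumptions.

def within_interval(user_value, preset_value, lower, upper):
    """True if the deviation from the preset value lies in [lower, upper]."""
    return lower <= user_value - preset_value <= upper

def info_matches(user, preset):
    # A mismatch in any single check means the user cooking information
    # does not match the preset cooking recipe as a whole.
    checks = [
        user["food_types"] == preset["food_types"],                                # first unit
        within_interval(user["food_weight_g"], preset["food_weight_g"], -50, 50),  # second unit
        user["processing"] == preset["processing"],                                # third unit
        within_interval(user["seasoning_g"], preset["seasoning_g"], -2, 2),        # sixth unit
        within_interval(user["maturity"], preset["maturity"], -0.1, 0.1),          # seventh unit
    ]
    return all(checks)

preset = {"food_types": {"meat"}, "food_weight_g": 500, "processing": "sliced",
          "seasoning_g": 5.0, "maturity": 0.8}
```

A single out-of-interval value, such as a food material weight 100 g over the preset, is enough to trigger the prediction branch.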
Optionally, the cooking real-time guiding device further includes a cooking data receiving module for receiving user cooking data of the user and processing the user cooking data to obtain the user cooking information, where the user cooking data includes: raw images acquired by an image acquisition device on the terminal device, raw cooking temperatures detected by a temperature detection device on the terminal device, and the user food material weight and user seasoning weight measured by a weighing device on the terminal device.
Optionally, processing the user cooking data to obtain user cooking information includes:
performing positioning and region labeling on objects in the raw image using a positioning algorithm to obtain a target annotation image;
extracting target image features from the target annotation image through a convolutional neural network model, and matching the target image features against preset image features to obtain object names;
analyzing the object names to obtain the user food material name, the user seasoning name, and the user cooking operation mode, and determining the user food material type according to the user food material name;
and determining the user cooking temperature curve from the raw cooking temperatures.
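The processing steps above can be outlined as follows. The positioning algorithm and the convolutional neural network are replaced by stand-ins for readability: an "image feature" is shown as a plain vector, and matching against preset image features is shown as a nearest-neighbour cosine-similarity lookup. All names and values are illustrative.

```python
import math

# Stand-in for the feature-matching step: in the embodiment the target
# image features would come from a convolutional neural network; here
# they are plain vectors compared by cosine similarity.

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def match_object(target_features, preset_library):
    """Return the object name whose preset features are closest to the target."""
    return max(preset_library,
               key=lambda name: cosine_similarity(target_features, preset_library[name]))

def temperature_curve(raw_temps, interval_s=1.0):
    """Turn raw temperature samples into a (time, temperature) curve."""
    return [(i * interval_s, t) for i, t in enumerate(raw_temps)]
```

The matched object names would then be analyzed into food material names, seasoning names, and cooking operation modes, as described above.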
Optionally, the cooking real-time guiding device further includes a response module configured to, in response to a first cooking function request input by the user on the terminal device, control the terminal device to prompt the user by image, video, or voice to perform a cooking operation within a preset period before a key action time point; or, in response to the first cooking function request input by the user on the terminal device, control the terminal device to automatically adjust the firepower and the smoke machine while prompting the user to perform the cooking operation.
The cooking real-time guiding device provided by the embodiment of the present invention can execute the cooking real-time guiding method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
Example 4
Fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. The electronic device is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or the computer program loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various specialized artificial intelligence (AI) computing chips, various processors running machine learning model algorithms, digital signal processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the cooking real-time guiding method.
In some embodiments, the cooking real-time guiding method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the cooking real-time guiding method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the cooking real-time guiding method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described herein may be realized in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and VPS (virtual private server) services.
It should be appreciated that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.