Disclosure of Invention
In view of the above technical problems, the present invention provides a pig weight estimation method, system, device and storage medium.
The technical solution adopted by the present invention to solve the above technical problems is as follows:
A pig weight estimation method, comprising:
acquiring a two-dimensional picture of a pig captured by a camera; and
inputting the camera parameters, the pig breed and the two-dimensional picture into a pre-trained neural network model to obtain an estimate of the pig's weight.
On the basis of the above technical solution, the present invention can be further improved as follows.
Further, the training method of the neural network model specifically includes:
constructing three-dimensional pig models of various postures and body shapes through three-dimensional modeling;
projecting each three-dimensional pig model from different viewing angles to obtain corresponding two-dimensional pictures, deriving a binary picture from each two-dimensional picture, and constructing corresponding data pairs of three-dimensional pig model and binary picture; and
taking each binary picture, the viewing height of the binary picture and the pig breed as the input of the neural network model, taking the weight of the pig corresponding to the three-dimensional pig model as the output, and training the neural network model until it converges.
Further, the camera parameter is the camera height.
In order to achieve the above object, the present invention further provides a pig weight estimation system, including:
a picture acquisition module, configured to acquire a two-dimensional picture of a pig captured by a camera; and
a weight estimation module, configured to input the camera parameters, the pig breed and the two-dimensional picture into a pre-trained neural network model to obtain an estimate of the pig's weight.
Further, the training method of the neural network model specifically includes:
constructing three-dimensional pig models of various postures and body shapes through three-dimensional modeling;
projecting each three-dimensional pig model from different viewing angles to obtain corresponding two-dimensional pictures, deriving a binary picture from each two-dimensional picture, and constructing corresponding data pairs of three-dimensional pig model and binary picture; and
taking each binary picture, the viewing height of the binary picture and the pig breed as the input of the neural network model, taking the weight of the pig corresponding to the three-dimensional pig model as the output, and training the neural network model until it converges.
Further, the camera parameter is the camera height.
The present invention also provides a terminal device, including:
a processor; and
a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method described above.
The invention also provides a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the above-described method.
The beneficial effects of the invention are as follows:
the pig weight estimation method provided by the invention uses machine learning to accurately estimate a pig's weight from a two-dimensional picture of the pig under different pig breeds and camera parameters, thereby improving pig farming efficiency.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
Fig. 1 is a flowchart of a pig weight estimation method according to an embodiment of the present invention.
As shown in fig. 1, the method includes:
110. Acquire a two-dimensional picture of the pig captured by the camera.
Specifically, a pig image acquisition system can be set up in which a downward-facing (top-view) camera photographs the pigs in the pen; an actual top-view image of a pig is shown in fig. 2.
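As a minimal illustrative sketch (not part of the invention), such a top-view frame could be grabbed with OpenCV; the device index and output file name below are assumptions.

```python
import cv2  # OpenCV is assumed to be available

CAMERA_INDEX = 0  # hypothetical index of the downward-facing camera above the pen

cap = cv2.VideoCapture(CAMERA_INDEX)
ok, frame = cap.read()  # frame is a BGR image of shape (height, width, 3)
cap.release()

if not ok:
    raise RuntimeError("Failed to read a frame from the top-view camera")
cv2.imwrite("pig_top_view.png", frame)  # save the two-dimensional picture of the pig
```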
120. Input the camera parameters, the pig breed and the two-dimensional picture into the pre-trained neural network model to obtain an estimate of the pig's weight.
Specifically, the training method of the neural network model includes:
a. constructing three-dimensional pig models of various postures and body shapes through three-dimensional modeling;
specifically, as shown in fig. 3, software such as Maya can be used to build the three-dimensional models from scanned three-dimensional data of real pigs, and the breed and weight of each pig are recorded at the same time to form the training data set;
b. projecting each three-dimensional pig model from different viewing angles to obtain corresponding two-dimensional pictures, deriving a binary picture from each two-dimensional picture, and constructing corresponding data pairs of three-dimensional pig model and binary picture, as shown in fig. 4;
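One possible way to realize this projection step, sketched below under simplifying assumptions (a dense vertex array, a camera looking straight down from a given viewing height, and an idealized pinhole model), is to project the mesh vertices of the three-dimensional pig model and mark the corresponding pixels; a full renderer would rasterize the mesh triangles and then binarize the rendered picture instead.

```python
import numpy as np

def silhouette_from_view(vertices, view_height=2.5, yaw_deg=0.0,
                         img_size=256, focal=300.0):
    """Project the vertices of a 3-D pig model, seen from a downward-facing
    camera at the given viewing height, into a binary picture (illustrative).

    vertices: (N, 3) array in metres, z axis pointing up from the floor.
    """
    yaw = np.deg2rad(yaw_deg)
    rot = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                    [np.sin(yaw),  np.cos(yaw), 0.0],
                    [0.0,          0.0,         1.0]])
    v = vertices @ rot.T                                 # rotate the model to change the viewing angle
    depth = np.clip(view_height - v[:, 2], 1e-3, None)   # distance from the camera to each vertex
    u = focal * v[:, 0] / depth + img_size / 2           # pinhole projection, x -> image column
    w = focal * v[:, 1] / depth + img_size / 2           # pinhole projection, y -> image row
    mask = np.zeros((img_size, img_size), dtype=np.uint8)
    cols = np.clip(u.astype(int), 0, img_size - 1)
    rows = np.clip(w.astype(int), 0, img_size - 1)
    mask[rows, cols] = 1                                 # binary picture of the projected model
    return mask
```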
Deriving a binary picture from a two-dimensional picture is prior art and is not described in detail in this embodiment.
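For completeness, one common prior-art route from a two-dimensional picture to a binary picture is thresholding followed by keeping the largest connected region; the OpenCV sketch below is illustrative only and assumes a simple scene in which the pig contrasts with the pen floor.

```python
import cv2
import numpy as np

def binarize_pig_image(bgr_image):
    """Convert a top-view photo into a binary pig silhouette (illustrative prior-art step)."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    # Otsu's method picks a global threshold separating the pig from the background.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Keep only the largest connected component, assumed to be the pig.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    if n > 1:
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
        binary = np.where(labels == largest, 255, 0).astype(np.uint8)
    return binary
```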
c. taking each binary picture, the viewing height of the binary picture and the pig breed as the input of the neural network model, taking the weight of the pig corresponding to the three-dimensional pig model as the output, and training the neural network model until it converges.
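As a non-authoritative sketch of step c (the network architecture, layer sizes and breed encoding below are illustrative assumptions, not specified by the invention), a small convolutional regressor in PyTorch can take the binary picture together with the viewing height and a breed index and output the estimated weight, trained with a mean-squared-error loss until convergence:

```python
import torch
import torch.nn as nn

class PigWeightNet(nn.Module):
    """Binary silhouette + viewing height + breed index -> estimated weight (kg)."""

    def __init__(self, num_breeds=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),      # -> (B, 64) image features
        )
        self.breed_emb = nn.Embedding(num_breeds, 8)     # learned breed encoding
        self.head = nn.Sequential(
            nn.Linear(64 + 8 + 1, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, mask, view_height, breed):
        x = self.features(mask)                          # (B, 64)
        b = self.breed_emb(breed)                        # (B, 8)
        h = view_height.unsqueeze(1)                     # (B, 1) viewing height
        return self.head(torch.cat([x, b, h], dim=1)).squeeze(1)

# Training-step sketch: masks (B, 1, H, W), heights (B,), breeds (B,), weights (B,)
model = PigWeightNet()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(masks, heights, breeds, weights):
    optim.zero_grad()
    pred = model(masks.float(), heights.float(), breeds.long())
    loss = loss_fn(pred, weights.float())
    loss.backward()
    optim.step()
    return loss.item()
```

Concatenating the viewing height and the breed embedding with the image features lets a single network cover different camera heights and breeds, which matches the idea of taking them as inputs in step c.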
After the neural network model has been trained by the above process, the height of the camera in the image acquisition system is taken as the viewing height and input into the trained neural network model together with the binary picture converted from the two-dimensional picture captured by the camera and the known pig breed, so as to obtain the estimated weight of the pig.
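Continuing the sketches above (this reuses the hypothetical binarize_pig_image and trained PigWeightNet; the camera height, breed index and file name are assumptions), inference on a newly captured picture could look like:

```python
import cv2
import torch

CAMERA_HEIGHT_M = 2.5   # height of the top-view camera, used as the viewing height (assumed)
BREED_INDEX = 3         # index of the known breed in the training encoding (assumed)

frame = cv2.imread("pig_top_view.png")
mask = binarize_pig_image(frame)                  # binary picture of the pig
mask = cv2.resize(mask, (256, 256))               # match the assumed training image size
mask_t = torch.from_numpy(mask / 255.0).float()[None, None]   # shape (1, 1, H, W)

model.eval()
with torch.no_grad():
    weight_kg = model(mask_t,
                      torch.tensor([CAMERA_HEIGHT_M]),
                      torch.tensor([BREED_INDEX])).item()
print(f"Estimated pig weight: {weight_kg:.1f} kg")
```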
The pig weight estimation method provided by this embodiment of the invention uses machine learning to accurately estimate a pig's weight from a two-dimensional picture of the pig under different pig breeds and camera parameters, thereby improving pig farming efficiency.
An embodiment of the present invention further provides a pig weight estimation system. The working principles of its functional modules have been explained in detail in the above description of the method, so repeated description is omitted below for brevity.
As shown in fig. 5, the system includes:
a picture acquisition module, configured to acquire a two-dimensional picture of a pig captured by a camera; and
a weight estimation module, configured to input the camera parameters, the pig breed and the two-dimensional picture into a pre-trained neural network model to obtain an estimate of the pig's weight.
Optionally, in this embodiment, the training method of the neural network model specifically includes:
constructing three-dimensional pig models of various postures and body shapes through three-dimensional modeling;
projecting each three-dimensional pig model from different viewing angles to obtain corresponding two-dimensional pictures, deriving a binary picture from each two-dimensional picture, and constructing corresponding data pairs of three-dimensional pig model and binary picture; and
taking each binary picture, the viewing height of the binary picture and the pig breed as the input of the neural network model, taking the weight of the pig corresponding to the three-dimensional pig model as the output, and training the neural network model until it converges.
Optionally, in this embodiment, the camera parameter is the camera height.
An embodiment of the present invention provides a terminal device, including:
a processor; and
a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method described above.
Embodiments of the present invention provide a non-transitory machine-readable storage medium having stored thereon executable code, which, when executed by a processor of an electronic device, causes the processor to perform the above-described method.
The reader should understand that in the description of this specification, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine and integrate the different embodiments or examples, and the features of different embodiments or examples, described in this specification without contradiction.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the modules and units in the above described system embodiment may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present invention that in essence contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.