Disclosure of Invention
The invention provides an automatic printing equipment tracing method based on deep learning, which is used for solving the defects in the prior art of rotation and scale change of the secret mark image caused by image acquisition operations and changes of the acquisition equipment, and of noise points and missing secret mark points caused by poor printing quality.
The invention provides an automatic printing equipment tracing method based on deep learning, which comprises the following steps:
step 1, acquiring N types of printing equipment data samples, and classifying and cutting the N types of printing equipment data samples to obtain a reference data set;
Step 2, performing image preprocessing on the reference data set to obtain a first sample data set, wherein the first sample data set is a dark-mark development black-and-white image data set of the N types of printing equipment data samples contained in the reference data set;
Step 3, based on one class of the N types of printing equipment data samples contained in the first sample data set, extracting a complete secret mark pattern set by using an intelligent learning method, acquiring neighborhood point pattern features, and further acquiring training set features to obtain a traceability classification reference data set;
And step 4, comparing elements in a test set with the traceability classification reference data set, and obtaining a source printing equipment traceability classification result by using the KNN nearest neighbor classification method.
Wherein, the step 1 further comprises:
step 11, acquiring the N-type printing equipment data samples, and automatically scanning the N-type printing equipment data samples based on TWAIN protocol to obtain N-type scanning sample images corresponding to the N-type printing equipment data samples;
And step 12, randomly cutting the N types of scanning sample images into k pieces to obtain N multiplied by k samples, and forming the reference data set.
Wherein, the step 2 further comprises:
step 21, obtaining the reference data set, and decomposing the reference data set according to the color channel to obtain a blue color channel image with the most obvious dark mark characteristics;
Step 22, performing inverse-white processing and binarization processing on the blue color channel image to obtain a black-and-white image of the developed dark mark;
step 24, performing morphological operation on the black-and-white image to obtain a clear dark mark image;
And step 25, based on the clear dark mark image, traversing each pixel in the image to remove noise, so as to obtain the first sample data set.
Wherein, the step 3 further comprises:
Step 31, obtaining one type of data in the N types of printing equipment data samples contained in the first sample data set, calculating the space coordinates of all the hidden marks in the one type of data, and storing coordinate data to obtain a hidden mark data file;
Step 32, based on the secret mark data file, performing scale normalization and rotation consistency on the secret mark local features by using a neighborhood point pattern NPP to obtain neighborhood point pattern features of the secret mark data file;
Step 33, obtaining repeated secret mark points by comparing the neighborhood point pattern features of the secret mark points in the secret mark data file, based on the secret mark data file and the corresponding neighborhood point pattern features;
Step 34, calculating the association relation of all the dark mark points by using a dark mark point association learning method based on a greedy strategy, based on the repeated dark mark points, and determining a maximum repeated dark mark point set as a dark mark pattern set, wherein the maximum repeated dark mark point set is a point set containing all the repeated dark mark points;
And 35, traversing the undetected data in the first sample data set due to the fact that the neighborhood points exceed the image range, calculating the association relation between the undetected data and other dark mark points, and supplementing the dark mark pattern set to obtain a complete dark mark pattern set.
Wherein, the step 35 further includes:
step 351, obtaining a complete secret mark mode set corresponding to the N types of printing equipment data samples;
Step 352, based on the complete pattern sets of the N types of printing device data samples, randomly selecting a part of the pattern sets of each type as a training set, and using the pattern sets of the remaining samples as a test set;
step 353, acquiring the neighborhood point pattern features in the training set, and storing the neighborhood point pattern features in a database to obtain the traceability classification reference data set.
Wherein, the step 4 further comprises:
step 41, acquiring the test set and the traceability classification reference data set;
Step 42, traversing the traceable classification reference data set for each element in the test set, obtaining a distance relation between the test set and the traceable classification reference data set by using a KNN nearest neighbor method, and judging a similarity result of each element in the test set and the traceable classification reference data set;
And step 43, performing classification according to the similarity results to obtain the source printing equipment traceability classification result.
Wherein, the step 42 further includes:
step 421, traversing the traceable classification reference dataset based on the test set;
Step 422, calculating the distance between each element in the test set and each element in the traceable classification reference data set by taking the test set as a reference, so as to obtain a first distance;
Step 423, calculating the distance between each element in the test set and each element in the traceable classification reference data set by taking the traceable classification reference data set as a reference to obtain a second distance;
Step 424, traversing the first distance and the second distance corresponding to each element in the test set, and calculating a difference between the first distance and the second distance to obtain the similarity result;
step 425, based on the similarity result, obtaining a plurality of elements in the traceability classification reference data set, which are nearest to the test set, and performing traceability classification on the test sample image by using a majority voting method to obtain a traceability classification result of the source printing device.
The invention also provides an automatic printing equipment traceability detection device based on deep learning, which is characterized by comprising:
The data acquisition module is used for acquiring N types of printing equipment data samples, classifying and cutting the N types of printing equipment data samples to obtain a reference data set;
The data preprocessing module is used for carrying out image preprocessing on the reference data set to obtain a first sample data set, wherein the first sample data set is a dark-mark development black-and-white image data set of the N types of printing equipment data samples contained in the reference data set;
The model training module is used for extracting a complete dark mark pattern set by using an intelligent learning method based on one class of the N types of printing equipment data samples contained in the first sample data set, obtaining neighborhood point pattern features, and further obtaining training set features to obtain a traceability classification reference data set;
And the data tracing module is used for taking the data which is not randomly selected in the model training module as a test set, comparing the elements in the test set with the traceability classification reference data set, and obtaining a source printing equipment traceability classification result by using the KNN nearest neighbor classification method.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the automated printing device tracing method based on deep learning according to any one of claims 1 to 7.
The invention also provides a computer program product comprising a computer program, characterized in that the computer program, when executed by a processor, implements the automated printing device tracing method based on deep learning according to any one of claims 1 to 7.
Specifically, the method and the device combine various image recognition strategies with the sample characteristics of printing equipment to form a complete model training and sample tracing strategy, and automatically acquire the related information of color laser printing equipment by utilizing computer pattern recognition and automatic comparison technologies to recognize and compare the trace and secret mark features of the color laser printing equipment, so that the recognition efficiency, comprehensiveness and accuracy are greatly improved.
By combining deep learning with pattern recognition, the method and the device of the invention raise the tracing precision to 98.53% and the overall identification accuracy to 95.62%, thereby realizing efficient tracing of printing equipment.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The tracing technology for printed documents plays a vital role in the field of court document inspection and crime fighting, and judges the brand and model of the source printing device by analyzing the characteristics of the printed document. At present, document tracing methods based on visual technology are mainly divided into passive tracing technology and active tracing technology.
The active tracing technology realizes tracing by identifying hidden fine image-text features embedded in the document during the printing process, or by embedding specific identification information in the printed document; by utilizing a specific technology or method to acquire the hidden fine image-text features embedded in the printed image-text content, the source printing equipment is traced.
Aiming at a machine identification code formed in the printing process of color laser printing equipment, the method effectively solves the problems of rotation and scale change of a secret mark image caused by image acquisition operation and acquisition equipment change, noise points and missing of the secret mark points caused by printing quality and the like. By introducing deep learning and pattern recognition technology, automatic and intelligent secret mark feature comparison is realized, so that the time consumption is less, the cost is low, and the tracing accuracy and efficiency are greatly improved.
Deep learning realizes high-precision identification and analysis of document images by constructing multi-layer neural network models, and improves the accuracy and efficiency of document inspection. The application of this technology not only reduces the influence of manual intervention and subjective judgment, but also provides a new technical means for tracing printed documents. The development of printed-document tracing technology, especially in automatic and intelligent secret mark feature comparison, provides powerful technical support for court document inspection and crime fighting, and has wide popularization value and application prospects.
Fig. 1 is a flow chart of the automatic printing equipment tracing method based on deep learning. As shown in Fig. 1, the method comprises the following steps. Step 1, obtaining N types of printing equipment data samples, and classifying and cutting the N types of printing equipment data samples to obtain a reference data set. Step 2, performing image preprocessing on the reference data set to obtain a first sample data set, wherein the first sample data set is a data set of developed dark mark black-and-white images of the N types of printing equipment data samples contained in the reference data set. Step 3, based on one class of the N types of printing equipment data samples contained in the first sample data set, extracting a complete dark mark pattern set by using an intelligent learning method, obtaining neighborhood point pattern features, and further obtaining training set features to obtain a traceability classification reference data set. Step 4, comparing elements in a test set with the traceability classification reference data set, and obtaining a source printing equipment traceability classification result by using the KNN nearest neighbor classification method.
Wherein, the step 1 further comprises:
step 11, acquiring the N-type printing equipment data samples, and automatically scanning the N-type printing equipment data samples based on TWAIN protocol to obtain N-type scanning sample images corresponding to the N-type printing equipment data samples;
And step 12, randomly cutting the N types of scanning sample images into k pieces to obtain N multiplied by k samples, and forming the reference data set.
The N types of printing equipment data samples are paper samples printed by different types of color laser printing equipment. The scanner device scans them at a fixed DPI, printed pages that are blurred, incomplete or color-distorted due to poor printing quality are removed, and finally M printed-page scanned images are acquired. For each of the N types of printing device data samples, each scanned image is randomly cropped into k pictures.
Preferably, under a scanning condition of 400DPI, the scanner device automatically scans a plurality of data samples of the N types of printing devices based on TWAIN protocol to obtain a plurality of scanned sample images, and the N types of scanned sample images are randomly cut into k pieces to obtain the reference data set, wherein k is more than or equal to 30.
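As a non-limiting illustration of step 12, the random cropping might be sketched as follows; the file paths, the 512×512 patch size and the PNG output are assumptions of this example only:

```python
import os
import random

import cv2  # OpenCV, used here only for image I/O

def crop_scan(scan_path, out_dir, device_label, k=30, patch=512):
    """Randomly crop one scanned sample image into k patches (sketch of step 12)."""
    img = cv2.imread(scan_path)                      # BGR scan of one printed page
    if img is None:
        raise FileNotFoundError(scan_path)
    h, w = img.shape[:2]
    os.makedirs(out_dir, exist_ok=True)
    for i in range(k):
        y = random.randint(0, h - patch)             # top-left corner of a random crop
        x = random.randint(0, w - patch)
        tile = img[y:y + patch, x:x + patch]
        cv2.imwrite(os.path.join(out_dir, f"{device_label}_{i:03d}.png"), tile)

# Example (hypothetical paths): k >= 30 crops per scan, giving the N x k reference data set
# crop_scan("scans/brandA_400dpi.png", "reference/brandA", "brandA", k=30)
```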
Wherein, the step 2 further comprises:
step 21, obtaining the reference data set, and decomposing the reference data set according to the color channel to obtain a blue color channel image with the most obvious dark mark characteristics;
Step 22, performing inverse-white processing and binarization processing on the blue color channel image to obtain a black-and-white image of the developed dark mark;
step 24, performing morphological operation on the black-and-white image to obtain a clear dark mark image;
And step 25, based on the clear dark mark image, traversing each pixel in the image to remove noise, so as to obtain the first sample data set.
Fig. 2 is a flowchart of the image preprocessing. As can be seen from Fig. 2, the present technical solution obtains the final dark mark image through channel decomposition, inverse-white processing, binarization, morphological operations, and denoising, so as to further form the first sample data set.
The decomposing of the reference data set according to the color channels comprises reading each scanned sample image containing dark mark content in the reference data set and decomposing it according to the color channels to obtain independent red, green and blue channel images.
Further, the blue color channel image, in which the dark mark features are the most obvious, is selected; each pixel of the image is traversed and its brightness value is subtracted from 255, and the dark marks in the image are highlighted through this inverse-white processing, so as to obtain an image of the developed dark marks.
Further, the image of the developed dark marks is binarized: after the image is read, each pixel is traversed and its brightness value is compared with a threshold, which is preferably set to 128; pixels with brightness values larger than the set threshold are set to 255 and those smaller than the set threshold are set to 0, so that the image is converted into a binary image with only black and white, and a black-and-white image of the developed dark marks is obtained.
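A minimal sketch of steps 21–22 as described above (blue-channel selection, inverse-white processing, binarization with the preferred threshold of 128); OpenCV, NumPy and the function name are assumptions of this illustration:

```python
import cv2
import numpy as np

def develop_marks(scan_bgr: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Return a black-and-white image in which the dark marks are developed."""
    blue = scan_bgr[:, :, 0]          # OpenCV stores images as B, G, R: channel 0 is blue
    inverted = 255 - blue             # inverse-white: subtract each brightness value from 255
    # Binarize: pixels above the threshold become 255 (white), the rest become 0 (black)
    bw = np.where(inverted > threshold, 255, 0).astype(np.uint8)
    return bw
```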
Further, the black-and-white image of the developed dark marks is subjected to morphological operations, including erosion and dilation, so as to obtain a dark mark image in which the dark marks are clearer.
The erosion operation traverses each pixel of the black-and-white dark mark image and, taking the current pixel as the center, determines a 3×3 rectangle as the structural element, namely the area formed by the current pixel and its 8 surrounding neighbor pixels. All pixel values covered by the structural element are compared, and the minimum value in the coverage area is assigned to the pixel at the center of the structural element. This shrinks the white areas in the image, reduces the size of the dark marks and makes them more compact, and further eliminates boundary burrs so that the shape of the dark marks becomes clearer.
The dilation operation traverses each pixel of the black-and-white dark mark image and, taking the current pixel as the center, determines a 3×3 rectangle as the structural element, namely the area formed by the current pixel and its 8 surrounding neighbor pixels. All pixel values covered by the structural element are compared, and the maximum value in the coverage area is assigned to the pixel at the center of the structural element. This expands the white areas in the image, making them more connected and protruding beyond the boundary, and eliminates holes and small breaks, which facilitates the subsequent detection of the dark marks.
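The erosion and dilation described above are standard 3×3 morphological operations; a possible sketch using OpenCV (the library choice is an assumption, the text does not prescribe one):

```python
import cv2
import numpy as np

def clean_marks(bw: np.ndarray) -> np.ndarray:
    """Erode then dilate the developed black-and-white mark image with a 3x3 structural element."""
    kernel = np.ones((3, 3), np.uint8)          # current pixel plus its 8 neighbours
    eroded = cv2.erode(bw, kernel)              # each pixel takes the minimum over the 3x3 window
    dilated = cv2.dilate(eroded, kernel)        # each pixel takes the maximum over the 3x3 window
    return dilated                              # erosion followed by dilation = morphological opening
```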
Further, the black-and-white image of the developed dark marks is denoised after the erosion and dilation operations: after the image is read, each pixel is traversed, the neighborhood around each pixel is checked, and the number of pixels occupied by the dark mark point is counted; if this number lies within a set threshold range, the pixel is judged to be a dark mark point, otherwise it is judged to be a noise point. Fig. 3 shows the denoised black-and-white dark mark images of printing equipment of different brands and models. The images of the reference data set are preprocessed through the operations of step 2 to obtain N×k developed dark mark black-and-white images, namely the first sample data set.
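A sketch of the noise-removal rule of step 25; the 5×5 neighbourhood window and the count range of 3 to 25 white pixels are illustrative assumptions, since the text only specifies "a set threshold range":

```python
import numpy as np

def remove_noise(bw: np.ndarray, win: int = 2, lo: int = 3, hi: int = 25) -> np.ndarray:
    """Keep a white pixel only if the number of white pixels in its neighbourhood lies in [lo, hi]."""
    h, w = bw.shape
    out = np.zeros_like(bw)
    ys, xs = np.nonzero(bw)                        # candidate mark pixels
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - win), min(h, y + win + 1)
        x0, x1 = max(0, x - win), min(w, x + win + 1)
        count = int(np.count_nonzero(bw[y0:y1, x0:x1]))
        if lo <= count <= hi:                      # plausible mark size: keep as a dark mark pixel
            out[y, x] = 255                        # otherwise it is treated as noise and dropped
    return out
```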
Wherein, the step 3 further comprises:
and step 31, obtaining one type of data in the N types of printing equipment data samples contained in the first sample data set, calculating the space coordinates of all the hidden marks in the one type of data, and storing coordinate data to obtain a hidden mark data file.
And step 32, carrying out scale normalization and rotation consistency on the local features of the secret marks by using a neighbor point pattern NPP based on the secret mark data file to obtain neighbor point pattern features of the secret mark data file.
And step 33, obtaining repeated secret mark points by comparing the neighborhood point pattern features of the secret mark points in the secret mark data file, based on the secret mark data file and the corresponding neighborhood point pattern features.
And step 34, calculating the association relation of all the dark mark points by using a dark mark point association learning method based on a greedy strategy, based on the repeated dark mark points, and determining a maximum repeated dark mark point set as a dark mark pattern set, wherein the maximum repeated dark mark point set is a point set containing all the repeated dark mark points.
And 35, traversing the undetected data in the first sample data set due to the fact that the neighborhood points exceed the image range, calculating the association relation between the undetected data and other dark mark points, and supplementing the dark mark pattern set to obtain a complete dark mark pattern set.
The calculating of the space coordinates of all the dark points in the data comprises determining the space coordinates of each dark point in the black-and-white image, calculating the space coordinates of all the dark points, and storing the coordinates of each dark point in a (x, y) coordinate pair format to obtain a dark data file.
The determining of the spatial coordinates of each dark mark point in the black-and-white image comprises: after obtaining the spatial coordinates pc = (xc, yc) of a certain dark mark point from the dark mark data file of a certain image, obtaining the dark mark point pc and its neighborhood point set, i.e. the 7 nearest neighbor dark mark points in the neighborhood centered on pc, by using the spatial distribution relation between these 7 nearest neighbor points and the corresponding dark mark point, and modeling each dark mark point in the image in this way to obtain the neighborhood point pattern of each dark mark point.
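Steps 31–32 (coordinate extraction and 7-nearest-neighbour modelling) could be sketched as follows; using connected-component centroids as the dark mark coordinates and a k-d tree for the neighbour search are assumptions of this illustration:

```python
import cv2
import numpy as np
from scipy.spatial import cKDTree

def mark_coordinates(bw: np.ndarray) -> np.ndarray:
    """Return the (x, y) centroid of every dark mark blob in the black-and-white image."""
    # 8-connectivity is the OpenCV default; label 0 is the background
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(bw)
    return centroids[1:]

def neighbourhoods(points: np.ndarray, k: int = 7) -> np.ndarray:
    """For each dark mark point p_c, return its k nearest neighbour dark mark points."""
    tree = cKDTree(points)
    # query k+1 neighbours because the nearest hit of a point is the point itself
    _, idx = tree.query(points, k=k + 1)
    return points[idx[:, 1:]]                 # shape (n_points, k, 2): the neighbourhood point sets
```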
The neighborhood point pattern (NPP) refers to a pattern recognition algorithm for feature extraction and data dimension reduction in image processing; based on the dark mark data file, scale normalization and rotation unification are performed on the local dark mark features based on the neighborhood point pattern, so as to realize scale invariance.
FIG. 4 is a schematic diagram of scale normalization. As shown in FIG. 4, for a certain dark mark point pc and its neighborhood point set, a scale normalization operation is performed on the neighborhood point pattern of the dark mark point, for example by dividing each neighbor distance di by max(dc), where max(dc) is the maximum of the distances from all nearest neighbor dark mark points to pc. The scale normalization operation increases the robustness of the neighborhood point pattern to scale changes, so that the NPP operator gives consistent features for dark mark images acquired by different image acquisition devices.
FIG. 5 is a schematic diagram of rotation normalization. As shown in FIG. 5, for a certain dark mark point pc and its neighborhood point set, a rotation unification operation is performed on the neighborhood point pattern of the dark mark point: each nearest neighbor dark mark point is described by its distance di to the center point pc and its included angle θi with the center point pc, and the angles are unified with respect to the included angle between pc and the nearest neighbor dark mark point farthest from pc. Through the rotation unification operation, the dark mark point set of each dark mark point's neighborhood space is calibrated in the angle space, so that the NPP operator has better robustness to rotation changes of the input image.
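Under the reading above (distances divided by max(dc), angles measured relative to the farthest nearest neighbour), a hedged sketch of the NPP scale and rotation normalization is:

```python
import numpy as np

def npp_feature(center: np.ndarray, neighbours: np.ndarray) -> np.ndarray:
    """Scale- and rotation-normalised neighbourhood point pattern of one dark mark point."""
    rel = neighbours - center                            # neighbours expressed relative to p_c
    d = np.hypot(rel[:, 0], rel[:, 1])                   # distances d_i to the centre point
    theta = np.arctan2(rel[:, 1], rel[:, 0])             # angles theta_i to the centre point
    d_norm = d / d.max()                                 # scale normalisation: divide by max(d_c)
    theta_ref = theta[np.argmax(d)]                      # angle of the farthest nearest neighbour
    theta_norm = np.mod(theta - theta_ref, 2 * np.pi)    # rotation unification
    return np.stack([d_norm, theta_norm], axis=1)        # 7 x 2 NPP feature
```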
The comparison of the neighborhood point pattern features of the dark mark points is based on the property that a dark mark point set of a specific pattern repeatedly appears according to a rule, so that the points in such a set share the same neighborhood point pattern features; the repeated dark mark points are therefore located by determining whether their neighborhood point pattern features are the same.
The determining of the neighborhood point pattern features specifically comprises: traversing each normalized pair of dark mark points pA and pB in the dark mark image together with their neighborhood point sets, and calculating the difference DNPP(pA, pB) between the spatial distributions of the other dark mark points in the neighborhood spaces of pA and pB, for example as the number of corresponding neighborhood points whose mutual distance exceeds a predefined distance threshold εNPP. If DNPP(pA, pB) = 0, the relative spatial positions of the 7 neighborhood points around pA and pB are the same, and pA and pB are two corresponding dark mark points occupying the same position in two repeated dark mark patterns; in this way, the repeated dark mark points with the same neighborhood point pattern features, and their neighborhood point sets, are obtained from the dark mark point image.
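A sketch of the NPP feature comparison just described; pairing the corresponding neighbourhood points by their normalised angle is an assumption of this illustration:

```python
import numpy as np

def d_npp(feat_a: np.ndarray, feat_b: np.ndarray, eps_npp: float = 0.1) -> int:
    """Number of neighbourhood points of p_A and p_B whose normalised positions differ by more than eps_npp."""
    def to_xy(feat):
        order = np.argsort(feat[:, 1])                     # pair neighbours by normalised angle
        d, t = feat[order, 0], feat[order, 1]
        return np.stack([d * np.cos(t), d * np.sin(t)], axis=1)
    diff = np.linalg.norm(to_xy(feat_a) - to_xy(feat_b), axis=1)
    return int(np.sum(diff > eps_npp))                     # D_NPP = 0 means identical neighbourhood layout

# Two dark mark points are treated as repetitions of the same pattern position when d_npp(...) == 0.
```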
The method further comprises associating related dark mark points together through a dark mark point association learning method based on a greedy strategy, so as to find a maximum repeated dark mark point set, namely a set containing as many dark mark points as possible.
Fig. 6 is a schematic drawing of the extraction of different types of dark mark point sets. The dark mark point association learning method comprises the following steps: extracting the repeated dark mark point sets Pi, i = 1, 2, ..., K, where the points in Pi are repeated dark mark points with the same neighborhood point pattern features, Mi is the number of repeated dark mark points in Pi, and K is the number of point sets; calculating, for each repeated dark mark point set Pi, the 8-direction relative position pattern (ρi, θi) of any two points, where ρi is the relative distance and θi is the relative angle; and, through (ρi, θi) and the occurrence frequency, predicting the dark mark points that should appear in the repeated set but do not appear or were not successfully detected, and removing the false dark mark points wrongly identified due to noise, so as to obtain the final repeated dark mark point sets.
Further, the 8 directions between any two points are determined from the position vector between the two points and comprise the 8 basic directions in the two-dimensional plane, specifically: 0° (horizontally to the right), 45° (upper right, diagonal), 90° (vertically upward), 135° (upper left, diagonal), 180° (horizontally to the left), 225° (lower left, diagonal), 270° (vertically downward), and 315° (lower right, diagonal).
Further, the dark mark pattern sets are initialized as empty sets. Any one of the repeated dark mark point sets is selected, and each dark mark point in it is associated with a corresponding dark mark pattern set; the center point of the associated dark mark points is calculated as the mean of their coordinates, where M is the number of dark mark points, so as to obtain the position of the pattern. Then, for each dark mark point among the unassociated repeated dark mark points, the distances to the points of each existing point set are calculated one by one, the nearest set and its dark mark points are associated with the corresponding dark mark pattern set, and the corresponding center point is updated.
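A compact sketch of the greedy association step described above: pattern sets are seeded from one repeated dark mark point set, and every remaining repeated dark mark point is attached to the pattern set with the nearest running center, which is then updated. The plain coordinate-array data layout is an assumption:

```python
import numpy as np

def greedy_associate(seed_points, other_points):
    """Greedy dark-mark association: one pattern set per seed point, remaining points join the nearest set."""
    patterns = [[np.asarray(p, float)] for p in seed_points]   # pattern sets seeded from one repeated set
    centres = [np.asarray(p, float) for p in seed_points]      # running centre point of each pattern set
    for p in (np.asarray(q, float) for q in other_points):
        j = int(np.argmin([np.linalg.norm(p - c) for c in centres]))
        patterns[j].append(p)                                  # associate with the nearest pattern set
        centres[j] = np.mean(patterns[j], axis=0)              # update the corresponding centre point
    return patterns, centres
```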
Further, for the cases in which the neighborhood points of dark mark points in the image were not detected, or in which part of the dark mark points were not recognized as repeated dark mark points because they are distributed at the edge of the picture and their neighborhood points exceed the image range, the dark mark pattern set is supplemented to obtain the complete dark mark pattern set.
The supplementing of the dark mark pattern set specifically comprises: traversing the points in the set Punrelated of dark mark points that have not yet been associated in the image, calculating one by one the distances between each such dark mark point and the center points, so as to obtain the nearest dark mark point, and predicting a repeated dark mark point set whose points have the same positional relationship relative to the center point as this nearest point. If more than a certain threshold number of unassociated dark mark points in Punrelated have the same positional relationship with the nearest point relative to the center point, each dark mark point in the predicted point set is associated with the dark mark pattern set and the corresponding center point is updated; if fewer than the threshold number of such unassociated dark mark points exist in Punrelated, the points in the predicted set are regarded as noise points and the set is updated accordingly. Finally, the complete dark mark pattern set extracted from the specific dark mark point image is obtained.
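The completion rule might be sketched roughly as follows; the tolerance and the support threshold are illustrative assumptions, and the offset test is only one possible reading of "the same positional relationship relative to the center point":

```python
import numpy as np

def complete_patterns(unrelated, centres, patterns, tol=3.0, min_support=3):
    """Attach unassociated dark mark points whose offset to a pattern centre is shared by enough points."""
    unrelated = [np.asarray(p, float) for p in unrelated]
    accepted = set()
    for i, p in enumerate(unrelated):
        if i in accepted:
            continue
        j = int(np.argmin([np.linalg.norm(p - c) for c in centres]))
        offset = p - centres[j]                                  # positional relation to the nearest centre
        # unassociated points that show (almost) the same offset to some pattern centre
        support = [k for k, q in enumerate(unrelated)
                   if min(np.linalg.norm((q - c) - offset) for c in centres) < tol]
        if len(support) >= min_support:                          # enough agreement: treat them as real marks
            for k in support:
                if k not in accepted:
                    patterns[j].append(unrelated[k])
                    accepted.add(k)
            centres[j] = np.mean(patterns[j], axis=0)            # update the corresponding centre point
    # points that were never accepted are regarded as noise and discarded
    return patterns, centres
```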
Wherein, the step 35 further includes:
step 351, obtaining a complete secret mark mode set corresponding to the N types of printing equipment data samples;
Step 352, based on the complete pattern sets of the N types of printing device data samples, randomly selecting a part of the pattern sets of each type as a training set, and using the pattern sets of the remaining samples as a test set;
step 353, acquiring the neighborhood point pattern features in the training set, and storing the neighborhood point pattern features in a database to obtain the traceability classification reference data set.
For each type of sample image in the N types of printing equipment data samples, preferably 10 sample images are randomly selected, and the corresponding complete dark mark pattern sets and neighborhood point pattern features are obtained from them as the training set; the dark mark pattern sets of the remaining sample images are used as the test set.
The traceability classification reference data set comprises the complete dark mark pattern sets and the neighborhood point pattern features.
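A small sketch of the split in steps 351–353, using the preferred choice of 10 randomly selected pattern sets per device class as the training set; the dictionary layout is an assumption:

```python
import random

def split_pattern_sets(pattern_sets_by_class: dict, n_train: int = 10, seed: int = 0):
    """pattern_sets_by_class: {device_label: [pattern_set, ...]} -> (reference training sets, test sets)."""
    rng = random.Random(seed)
    reference, test = {}, {}
    for label, sets in pattern_sets_by_class.items():
        shuffled = sets[:]
        rng.shuffle(shuffled)
        reference[label] = shuffled[:n_train]     # training pattern sets stored in the reference database
        test[label] = shuffled[n_train:]          # remaining pattern sets form the test set
    return reference, test
```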
Wherein, the step 4 further comprises:
step 41, acquiring the test set and the traceability classification reference data set;
Step 42, traversing the traceable classification reference data set for each element in the test set, obtaining a distance relation between the test set and the traceable classification reference data set by using a KNN nearest neighbor method, and judging a similarity result of each element in the test set and the traceable classification reference data set;
And step 43, performing classification according to the similarity results to obtain the source printing equipment traceability classification result.
The test set consists of the dark mark pattern sets extracted from the dark mark point images of the printed documents to be examined, and the traceability classification reference data set consists of the complete dark mark pattern sets and neighborhood point pattern features of the specific types of printing equipment. The distance relation comprises the distance between the test set and the reference data set calculated with the test set as the reference, and the distance between the test set and the reference data set calculated with the reference data set as the reference. The similarity result is the similarity between the test set and the reference data set, and the source printing equipment traceability classification result is the printing equipment tracing result determined according to the similarity result.
Wherein, the step 42 further includes:
step 421, traversing the traceable classification reference dataset based on the test set;
Step 422, calculating the distance between each element in the test set and each element in the traceable classification reference data set by taking the test set as a reference, so as to obtain a first distance;
Step 423, calculating the distance between each element in the test set and each element in the traceable classification reference data set by taking the traceable classification reference data set as a reference to obtain a second distance;
Step 424, traversing the first distance and the second distance corresponding to each element in the test set, and calculating a difference between the first distance and the second distance to obtain the similarity result;
step 425, based on the similarity result, obtaining a plurality of elements in the traceability classification reference data set, which are nearest to the test set, and performing traceability classification on the test sample image by using a majority voting method to obtain a traceability classification result of the source printing device.
The first distance is the distance between an element of the test set and an element of the traceability classification reference data set calculated with the test set as the reference. Specifically, for each dark mark in the test-sample dark mark pattern set, its distance to the reference dark mark pattern set is evaluated, where the distance between two dark marks is the distance between the spatial distributions of the other dark marks in their neighborhood spaces, εMIC is a preset distance threshold used to decide whether two dark marks match, and NA is the number of dark marks in the test-sample dark mark pattern set.
The second distance is the distance between the element of the traceability classification reference data set and the element of the test set calculated with the reference data set as the reference, and is calculated in the same way with the roles of the two sets exchanged.
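The two directed distances described above can be illustrated with the following sketch, in which each distance is taken as the fraction of dark marks in one pattern set whose best NPP match in the other set exceeds the threshold εMIC; this concrete form, the per-point distance and the feature layout are assumptions of the illustration:

```python
import numpy as np

def npp_point_distance(feat_a: np.ndarray, feat_b: np.ndarray) -> float:
    """Distance between the neighbourhood spatial distributions of two dark marks (7x2 NPP features)."""
    def to_xy(feat):
        order = np.argsort(feat[:, 1])                           # pair neighbours by normalised angle
        d, t = feat[order, 0], feat[order, 1]
        return np.stack([d * np.cos(t), d * np.sin(t)], axis=1)
    return float(np.mean(np.linalg.norm(to_xy(feat_a) - to_xy(feat_b), axis=1)))

def directed_distance(feats_a, feats_b, eps_mic: float = 0.1) -> float:
    """Share of dark marks in set A with no match in set B within eps_mic (A taken as the reference)."""
    unmatched = sum(1 for fa in feats_a
                    if min(npp_point_distance(fa, fb) for fb in feats_b) > eps_mic)
    return unmatched / len(feats_a)                              # normalised by N_A

# first distance  = directed_distance(test_feats, reference_feats)   (test set as the reference)
# second distance = directed_distance(reference_feats, test_feats)   (reference data set as the reference)
```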
The similarity result is specifically calculated as follows: the similarity between the dark mark pattern set of a test sample and the dark mark pattern set printed by a particular type of printing equipment is obtained from the first distance and the second distance; if the similarity satisfies a set threshold condition, the dark marks in the two pattern sets are judged to be similar.
The most adjacent elements are elements closest to the test set in the traceable classification reference data set, and preferably 5 elements closest to the test set are selected.
The majority voting specifically means that, according to the judgments of the nearest multiple elements, the category that occurs most frequently among these nearest elements is selected as the source printing equipment traceability classification result.
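Putting steps 421–425 together, a hedged sketch of the KNN decision with the 5 nearest elements and majority voting; the way the first and second distances are combined here is illustrative, and the directed set distance is passed in (e.g. the directed_distance sketch above):

```python
from collections import Counter

def knn_trace(test_feats, reference, distance_fn, k=5):
    """KNN majority-vote tracing sketch.

    reference:   list of (device_label, ref_feats) entries from the reference data set.
    distance_fn: directed set distance, e.g. the directed_distance sketch above.
    """
    scored = []
    for label, ref_feats in reference:
        d1 = distance_fn(test_feats, ref_feats)        # first distance: test set as the reference
        d2 = distance_fn(ref_feats, test_feats)        # second distance: reference data set as the reference
        scored.append((d1 + d2, label))                # illustrative combination of the two distances
    scored.sort()                                      # smallest combined distance = most similar
    votes = Counter(label for _, label in scored[:k])  # the k = 5 nearest elements vote
    return votes.most_common(1)[0][0]                  # majority class -> source printing device
```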
It can be seen that the invention integrates the deep learning technology and the pattern recognition technology, performs multidimensional data processing on sample image data acquired by the printing equipment, and realizes an accurate and universal classification strategy through detailed characteristic analysis and comparison. The application of the KNN algorithm further improves the tracing efficiency and accuracy of the printing equipment, and provides a high-efficiency and reliable technical means for trace identification of the printing equipment.
On the other hand, the invention also provides an automatic printing equipment traceability detection device based on deep learning, which is characterized by comprising:
The data acquisition module is used for acquiring N types of printing equipment data samples, classifying and cutting the N types of printing equipment data samples to obtain a reference data set;
The data preprocessing module is used for carrying out image preprocessing on the reference data set to obtain a first sample data set, wherein the first sample data set is a dark-mark development black-and-white image data set of the N types of printing equipment data samples contained in the reference data set;
The model training module is used for extracting a complete dark mark pattern set by using an intelligent learning method based on one class of the N types of printing equipment data samples contained in the first sample data set, obtaining neighborhood point pattern features, and further obtaining training set features to obtain a traceability classification reference data set;
And the data tracing module is used for taking the data which is not randomly selected in the model training module as a test set, comparing the elements in the test set with the traceability classification reference data set, and obtaining a source printing equipment traceability classification result by using the KNN nearest neighbor classification method.
In yet another aspect, the present invention further provides a non-transitory computer readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements an automated printing device tracing method based on deep learning.
The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the present invention without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
It should be noted that the above-mentioned embodiments are merely for illustrating the technical solution of the present invention, and not for limiting the same, and although the present invention has been described in detail with reference to the above-mentioned embodiments, it should be understood by those skilled in the art that the technical solution described in the above-mentioned embodiments may be modified or some technical features may be equivalently replaced, and these modifications or substitutions do not make the essence of the corresponding technical solution deviate from the spirit and scope of the technical solution of the embodiments of the present invention.