CN115860802A - Product value prediction method, device, computer equipment and storage medium - Google Patents

Product value prediction method, device, computer equipment and storage medium

Publication number
CN115860802A
Authority
CN
China
Prior art keywords
product value
value time
product
subsequence
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211701339.5A
Other languages
Chinese (zh)
Inventor
张华�
吴亚东
陈磊
孙科伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
ICBC Technology Co Ltd
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
ICBC Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC and ICBC Technology Co Ltd
Priority to CN202211701339.5A
Publication of CN115860802A
Legal status: Pending

Abstract

The application relates to a product value prediction method, a product value prediction device, computer equipment, a storage medium and a computer program product, and relates to the technical field of big data. The method comprises the following steps: performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences; acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network; through a first layer of attention mechanism network, performing fusion processing on target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes to obtain subsequence characteristic information of each product value time subsequence; and fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network to obtain the predicted value of the product to be predicted after the product value time sequence. By adopting the method, the accuracy of value prediction can be improved.

Description

Product value prediction method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of big data technologies, and in particular, to a method, an apparatus, a computer device, a storage medium, and a computer program product for product value prediction.
Background
With the development of big data technology, big data analysis is widely applied to the prediction of product value.
At present, the product value is generally predicted through a recurrent neural network. However, a conventional recurrent neural network has difficulty capturing the temporal relationship among input data and cannot recognize the noise information accompanying the input data. In product value data, the earlier and later value data often have a strong dependency; meanwhile, the product value itself is volatile and can carry considerable noise information. Therefore, the product value prediction method based on the conventional technology has low prediction accuracy.
Disclosure of Invention
In view of the above, it is necessary to provide a product value prediction method, apparatus, computer device, computer readable storage medium, and computer program product capable of improving prediction accuracy.
In a first aspect, the present application provides a method for product value prediction. The method comprises the following steps:
performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences;
acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model;
performing fusion processing on target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in the pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence;
and fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
In one embodiment, the fusing, by using a first layer attention mechanism network in the pre-trained value prediction model, target feature information of each product value time subsequence under a plurality of preset feature indexes to obtain subsequence feature information of each product value time subsequence, includes:
confirming the weight of each preset characteristic index through a first layer attention mechanism network in the pre-trained value prediction model;
and aiming at each product value time subsequence, performing fusion processing on target characteristic information of the product value time subsequence under a plurality of preset characteristics through the weight of each preset characteristic index to obtain subsequence characteristic information of the product value time subsequence.
In one embodiment, the obtaining the predicted value of the product to be predicted after the product value time series by fusing the subsequence feature information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model includes:
determining a weight of each of the product value time subsequences through a second layer attention mechanism network in the pre-trained value prediction model;
and fusing the subsequence characteristic information of each product value time subsequence to obtain the predicted value of the product to be predicted after the product value time sequence.
In one embodiment, the pre-trained value prediction model is trained by:
obtaining a sample product value time series of a sample product and an actual value of the sample product after the sample product value time series;
performing sliding window processing on the sample product value time sequences to obtain a plurality of sample product value time subsequences;
acquiring target characteristic information of each sample product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a value prediction model to be trained;
performing fusion processing on target characteristic information of each sample product value time subsequence under a plurality of preset characteristic indexes through a first layer of attention mechanism network in the value prediction model to be trained to obtain subsequence characteristic information of each sample product value time subsequence;
performing fusion processing on subsequence feature information of each sample product value time subsequence through a second layer attention mechanism network in the value prediction model to be trained to obtain the prediction value of the sample product after the sample product value time sequence;
and training the value prediction model to be trained according to the difference between the predicted value of the sample product after the sample product value time sequence and the actual value of the sample product after the sample product value time sequence to obtain a trained value prediction model serving as the pre-trained value prediction model.
In one embodiment, the obtaining, by a recurrent neural network in a pre-trained value prediction model, target feature information of each product value time subsequence under a plurality of preset feature indexes includes:
acquiring the mapped characteristic information of each product value time subsequence under a plurality of preset characteristic indexes;
for each product value time subsequence, inputting the mapped feature information of the product value time subsequence under a plurality of preset feature indexes into a recurrent neural network in the pre-trained value prediction model to obtain hidden state information corresponding to the mapped feature information of the product value time subsequence under the plurality of preset feature indexes;
and confirming hidden state information corresponding to the mapped characteristic information of the product value time subsequence under a plurality of preset characteristic indexes as target characteristic information of the product value time subsequence under the plurality of preset characteristic indexes.
In one embodiment, the obtaining of the mapped feature information of each product value time subsequence under a plurality of preset feature indexes includes:
acquiring initial characteristic information of each product value time subsequence under a plurality of preset characteristic indexes, and confirming the mapping relation of each preset characteristic index;
and aiming at each product value time subsequence, mapping the initial characteristic information of the product value time subsequence under a plurality of preset characteristic indexes according to the mapping relation of each preset characteristic index to obtain the mapped characteristic information of the product value time subsequence under the plurality of preset characteristic indexes.
In one embodiment, the preset characteristic indexes at least comprise an aggregation approximation index, a slope fitting index and a gradient index;
the obtaining of the initial characteristic information of each product value time subsequence under a plurality of preset characteristic indexes includes:
aiming at each product value time subsequence, dividing the product value time subsequence to obtain a plurality of product value time secondary subsequences with equal time length;
confirming the average value of each product value time secondary subsequence;
and carrying out fusion processing on the average value of each product value time secondary subsequence to obtain the aggregation approximation information of the product value time subsequences under the aggregation approximation index, wherein the aggregation approximation information is used as the initial characteristic information of the product value time subsequences under the aggregation approximation index.
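The aggregation approximation step above resembles Piecewise Aggregate Approximation (PAA): split the subsequence into equal-length pieces and keep each piece's mean. A minimal sketch under that assumption (segment count and data are illustrative, not taken from the patent):

```python
import numpy as np

def paa(subseq, n_segments):
    """Piecewise Aggregate Approximation: split the subsequence into
    n_segments (near-)equal-length pieces and return each piece's mean."""
    pieces = np.array_split(np.asarray(subseq, dtype=float), n_segments)
    return np.array([p.mean() for p in pieces])

x = np.array([1.0, 3.0, 2.0, 4.0, 6.0, 8.0])
features = paa(x, 3)  # means of [1,3], [2,4], [6,8] -> [2., 3., 7.]
```

The resulting vector serves as the initial characteristic information of the subsequence under the aggregation approximation index.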
In one embodiment, the obtaining initial characteristic information of each product value time subsequence under a plurality of preset characteristic indexes further includes:
aiming at each product value time subsequence, dividing the product value time subsequence to obtain a plurality of product value time three-level subsequences;
confirming gradient information of each product value time three-level subsequence; the gradient information includes horizontal gradient information in a horizontal direction and vertical gradient information in a vertical direction;
and obtaining gradient histogram information of the product value time subsequence under the gradient index according to the gradient information of each product value time three-level subsequence, and using the gradient histogram information as initial characteristic information of the product value time subsequence under the gradient index.
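The patent does not give formulas for the horizontal and vertical gradients of a one-dimensional value series. One plausible reading, sketched below, treats the unit time step as the horizontal gradient and the value change as the vertical gradient, then builds an orientation histogram weighted by gradient magnitude (HOG-style). All names, bin counts, and the gradient convention are illustrative assumptions:

```python
import numpy as np

def gradient_histogram(subseq, n_bins=8):
    """Sketch of a HOG-style descriptor for a 1-D value series: the step
    along time is taken as the horizontal gradient, the value change as
    the vertical gradient; orientations are histogrammed, weighted by
    gradient magnitude."""
    y = np.asarray(subseq, dtype=float)
    gy = np.diff(y)              # vertical gradient: value change per step
    gx = np.ones_like(gy)        # horizontal gradient: one unit time step
    angles = np.arctan2(gy, gx)  # orientation, in (-pi/2, pi/2)
    mags = np.hypot(gx, gy)      # gradient magnitude
    hist, _ = np.histogram(angles, bins=n_bins,
                           range=(-np.pi / 2, np.pi / 2), weights=mags)
    return hist

h = gradient_histogram([1.0, 2.0, 2.0, 1.0, 3.0])
```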
In one embodiment, after the sub-sequence feature information of each product value time sub-sequence is fused through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence, the method further includes:
acquiring the actual value of the product to be predicted after the product value time series;
confirming the loss value of the pre-trained value prediction model according to the predicted value of the product to be predicted after the product value time sequence and the actual value of the product to be predicted after the product value time sequence;
and under the condition that the loss value is larger than a preset threshold value, correcting the parameters of the pre-trained value prediction model according to the loss value.
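The post-prediction correction step above can be sketched as follows. The patent does not specify the loss function; mean squared error is assumed here, and the threshold value is illustrative:

```python
import numpy as np

def mse_loss(pred, actual):
    """Assumed loss: mean squared error between predicted and actual values."""
    return float(np.mean((np.asarray(pred) - np.asarray(actual)) ** 2))

loss = mse_loss([10.2], [10.0])   # ~0.04
threshold = 0.01                  # illustrative preset threshold
needs_update = loss > threshold   # True -> correct the model parameters
```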
In a second aspect, the application further provides a product value prediction device. The device comprises:
the sequence sliding window processing module is used for performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences;
the index information confirmation module is used for acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model;
the sequence information confirmation module is used for fusing target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in the pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence;
and the product value prediction module is used for carrying out fusion processing on the subsequence feature information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the following steps when executing the computer program:
performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences;
acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model;
performing fusion processing on target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in the pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence;
and fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences;
acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model;
performing fusion processing on target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in the pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence;
and fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprising a computer program which when executed by a processor performs the steps of:
performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences;
acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model;
performing fusion processing on target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in the pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence;
and fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
According to the product value prediction method, the product value prediction device, the computer equipment, the storage medium and the computer program product, firstly, the product value time sequence of a product to be predicted is subjected to sliding window processing to obtain a plurality of product value time subsequences; then, acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model; then, target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes is subjected to fusion processing through a first layer of attention mechanism network in a pre-trained value prediction model, and subsequence characteristic information of each product value time subsequence is obtained; and finally, fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in a pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence. Therefore, target characteristic information of a plurality of product value time subsequences obtained by sliding window processing under different preset characteristic indexes and the subsequence characteristic information of the product value time subsequences can be obtained based on a first layer attention mechanism network in a pre-trained value prediction model; and then, based on the subsequence characteristic information of the product value time subsequences and a second layer attention mechanism network in a pre-trained value prediction model, further obtaining the predicted value of the product to be predicted after the product value time sequence. 
On one hand, the product value is predicted based on the product value time sequence, and the dependency and correlation between the previous value data and the next value data are reserved; on the other hand, the characteristics of the product value time sequence under the multiple preset characteristic index scales are comprehensively considered, and the volatility and the noise in the product value data are eliminated; thereby improving the accuracy of product value prediction.
Drawings
FIG. 1 is a schematic flow chart diagram illustrating a method for product value prediction in one embodiment;
FIG. 2 is a schematic flow chart diagram illustrating the training steps of a pre-trained value prediction model in one embodiment;
FIG. 3 is a flow chart illustrating a method for predicting product value in another embodiment;
FIG. 4 is a diagram illustrating a multi-scale fusion-based hierarchical attention product value prediction method according to an embodiment;
FIG. 5 is a block diagram showing the structure of a product value predicting apparatus according to an embodiment;
FIG. 6 is a diagram of the internal structure of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit it.
In an exemplary embodiment, as shown in fig. 1, a product value prediction method is provided, which is exemplified by the application of the method to a server; it is understood that the method can also be applied to a terminal, and can also be applied to a system comprising a server and the terminal, and is realized through the interaction between the server and the terminal; the server can be implemented by an independent server or a server cluster consisting of a plurality of servers; the terminal may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, and the like. In this embodiment, the method includes the steps of:
and S102, performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences.
Wherein the product to be predicted comprises at least one of a virtual product and a physical product; virtual products may include, but are not limited to, financial products such as stocks, funds, etc.; the physical product can be any product which can be owned and has a practical shape, such as a stamp, a commemorative coin and the like. The product value refers to the price of the product to be predicted, and the product value time sequence is a sequence formed by price data of the product to be predicted in a certain period of time according to a certain time sequence.
Specifically, the server firstly obtains value data of a product to be predicted within a period of time, and then arranges the value data according to a time sequence to obtain a product value time sequence of the product to be predicted; and then, carrying out sliding window processing on the product value time sequence to obtain a plurality of sub-sequences under the window, wherein the sub-sequences are used as the product value time sub-sequences.
For example, assuming that the product value time sequence of the product to be predicted is X, the server divides X into a plurality of product value time subsequences by sliding window processing, i.e. X = {X_1, X_2, X_3, …, X_n}.
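The sliding-window split can be sketched as follows; the window length and stride are illustrative, as the patent does not fix them:

```python
import numpy as np

def sliding_windows(series, window, stride=1):
    """Split a value time series X into subsequences X_1..X_n of equal length,
    advancing the window by `stride` positions each time."""
    return [series[i:i + window]
            for i in range(0, len(series) - window + 1, stride)]

prices = np.arange(10, dtype=float)             # toy product value time series
subseqs = sliding_windows(prices, window=4, stride=2)
# 4 overlapping subsequences: [0..3], [2..5], [4..7], [6..9]
```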
And step S104, acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model.
The pre-trained value prediction model is a deep learning model obtained by training based on the product value time sequence of the sample product; the recurrent neural network in the pre-trained value prediction model adopts an LSTM network (long short-term memory network). The preset characteristic index is a characteristic index for analyzing and predicting the product value.
Specifically, the server inputs each product value time subsequence into a pre-trained value prediction model respectively, and hidden state information of each product value time subsequence about preset feature indexes of different dimensions is acquired through a recurrent neural network in the pre-trained value prediction model and is used as target feature information of each product value time subsequence under a plurality of preset feature indexes.
Suppose the server selects three preset characteristic indexes for product value prediction. Taking the acquisition for the first product value time subsequence X_1 as an illustration: the server determines, through the LSTM network in the pre-trained value prediction model, the three hidden state information {h_11, h_21, h_31} corresponding to the three preset characteristic indexes, and then uses these three hidden state information as the target characteristic information of X_1 under the plurality of preset characteristic indexes. The acquisition process of the target characteristic information of the remaining product value time subsequences is the same as the above.
And S106, fusing target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in a pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence.
The first layer attention mechanism network is used for confirming the attention corresponding to each preset characteristic index. The subsequence feature information is used for representing a product value time subsequence comprehensively considering a plurality of preset feature indexes.
Specifically, the server inputs target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes into a first layer attention mechanism network in a pre-trained value prediction model respectively, and confirms attention corresponding to each preset characteristic index through the first layer attention mechanism network; and then, fusing target characteristic information under each preset characteristic index according to the attention corresponding to each preset characteristic index aiming at each product value time subsequence to obtain subsequence characteristic information of the product value time subsequence.
Taking the confirmation of the subsequence characteristic information of the first product value time subsequence X_1 as an example: the server confirms the attention {α_1, α_2, α_3} corresponding to the three preset characteristic indexes through the first layer attention mechanism network, and then, according to this attention, fuses the three target characteristic information {h_11, h_21, h_31} of X_1 under the three preset characteristic indexes to obtain the subsequence characteristic information S_1 of X_1. The process of confirming the subsequence characteristic information of the other product value time subsequences is the same.
And S108, fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
Wherein, the second layer attention mechanism network is used for confirming the attention corresponding to each product value time subsequence.
Specifically, the server inputs the subsequence characteristic information of each product value time subsequence into a second layer attention mechanism network in the pre-trained value prediction model, and confirms the attention corresponding to each product value time subsequence through the second layer attention mechanism network; then, according to the attention corresponding to each product value time subsequence, the server fuses the subsequence characteristic information of each product value time subsequence to obtain the predicted value of the product to be predicted after the product value time sequence.
For example, through step S106 the server obtains the subsequence characteristic information {S_1, S_2, S_3, …, S_n} corresponding to the plurality of product value time subsequences {X_1, X_2, X_3, …, X_n}. Then, in step S108, the server confirms the attention {β_1, β_2, β_3, …, β_n} corresponding to each product value time subsequence according to the second layer attention mechanism network, and, according to this attention, fuses the subsequence characteristic information {S_1, S_2, S_3, …, S_n} to obtain the predicted value S of the product to be predicted after the product value time sequence.
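The two attention layers of steps S104-S108 can be sketched with numpy. This is a minimal illustration, not the patent's implementation: the learned LSTM hidden states and attention scores are replaced by random stand-ins, and the final linear head is an assumption:

```python
import numpy as np

def softmax(z):
    """Normalize raw scores into attention weights that sum to 1."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
n, m, d = 4, 3, 8                    # subsequences, preset indexes, hidden size
H = rng.normal(size=(n, m, d))       # h_ij: stand-in LSTM hidden states

# First layer: attention over the m preset characteristic indexes -> S_j.
alpha = softmax(rng.normal(size=m))  # stand-in for learned attention scores
S = np.einsum('i,nid->nd', alpha, H)  # S_j = sum_i alpha_i * h_ij

# Second layer: attention over the n subsequences -> fused representation.
beta = softmax(rng.normal(size=n))
fused = beta @ S                     # shape (d,)

# Assumed linear head mapping the fused vector to the scalar predicted value.
w = rng.normal(size=d)
predicted_value = float(fused @ w)
```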
In the product value prediction method, the server firstly carries out sliding window processing on a product value time sequence of a product to be predicted to obtain a plurality of product value time subsequences; then, the server acquires target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model; then, the server performs fusion processing on target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer of attention mechanism network in a pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence; and finally, the server performs fusion processing on the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in a pre-trained value prediction model to obtain the prediction value of the product to be predicted after the product value time sequence. Therefore, the server can obtain the subsequence characteristic information of the product value time subsequence based on the target characteristic information of a plurality of product value time subsequences obtained by sliding window processing under different preset characteristic indexes and a first layer attention mechanism network in a pre-trained value prediction model; and then, based on the subsequence characteristic information of the product value time subsequences and a second layer attention mechanism network in a pre-trained value prediction model, further obtaining the predicted value of the product to be predicted after the product value time sequence. 
On one hand, the product value is predicted based on the product value time sequence, and the dependency and correlation between the previous value data and the next value data are reserved; on the other hand, the characteristics of the product value time sequence under the multiple preset characteristic index scales are comprehensively considered, and the volatility and the noise in the product value data are eliminated; thereby improving the accuracy of product value prediction.
In an exemplary embodiment, in step S106, the target feature information of each product value time subsequence under a plurality of preset feature indexes is fused through a first layer attention mechanism network in a pre-trained value prediction model, so as to obtain subsequence feature information of each product value time subsequence, which specifically includes the following contents: confirming the weight of each preset characteristic index through a first layer attention mechanism network in a pre-trained value prediction model; and aiming at each product value time subsequence, performing fusion processing on target characteristic information of the product value time subsequence under a plurality of preset characteristics through the weight of each preset characteristic index to obtain subsequence characteristic information of the product value time subsequence.
The weight of each preset feature index represents the degree of importance of that preset feature index to the subsequence feature information of the product value time subsequence.
Specifically, the server determines the attention, that is, the weight, corresponding to each preset feature index through the first-layer attention mechanism network in the pre-trained value prediction model; then, for each product value time subsequence, it computes a weighted sum of the target feature information of the product value time subsequence under the preset feature indexes according to those weights, obtaining the subsequence feature information of the product value time subsequence.
For example, the server determines the weight α_i corresponding to each preset feature index through the first-layer attention mechanism network, and then computes a weighted sum of the target feature information h_ij of the product value time subsequence under the preset feature indexes according to the weights α_i, obtaining the subsequence feature information S_j of the j-th product value time subsequence:

S_j = Σ_{i=1}^{m} α_i · h_ij

where i is the serial number of a preset feature index, m is the total number of preset feature indexes, j is the serial number of a product value time subsequence, α_i is the weight corresponding to the i-th preset feature index, and h_ij is the target feature information of the j-th product value time subsequence under the i-th preset feature index.
In this embodiment, the server determines the weight corresponding to each preset feature index through the first-layer attention mechanism network and then performs weighted fusion of the target feature information under the preset feature indexes according to those weights to obtain the subsequence feature information of the product value time subsequence. The product value time subsequence can thus be comprehensively characterized by preset feature indexes at multiple scales, so that noise and volatility in the product value time subsequence are eliminated across multiple dimensions, their interference with prediction accuracy is avoided during prediction, and the accuracy of product value prediction is improved.
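The first-layer weighted fusion can be sketched as follows (NumPy; the attention scores are stand-in numbers, since in the trained model they are learned):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

# h[j, i]: target feature of subsequence j under preset index i (toy values)
h = np.array([[0.2, 0.5, 0.1],
              [0.4, 0.3, 0.6]])           # 2 subsequences, m = 3 indexes
scores = np.array([1.0, 2.0, 0.5])        # stand-in learned attention scores
alpha = softmax(scores)                   # weights alpha_i, summing to 1
S = h @ alpha                             # S_j = sum_i alpha_i * h_ij
```

Because the weights are softmax-normalized, each S_j is a convex combination of that subsequence's target features.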
In an exemplary embodiment, in step S108, fusing the subsequence feature information of each product value time subsequence through the second-layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted for the period following the product value time series specifically includes the following steps: determining the weight of each product value time subsequence through the second-layer attention mechanism network in the pre-trained value prediction model; and fusing the subsequence feature information of the product value time subsequences according to those weights to obtain the predicted value of the product to be predicted for the period following the product value time series.
The weight of each product value time subsequence represents the degree of importance of that product value time subsequence to the product value prediction result.
Specifically, since the product value time subsequences are temporally related and subsequences at different times influence the product value prediction result to different degrees, the weight corresponding to each product value time subsequence needs to be determined. The server determines the attention, that is, the weight, corresponding to each product value time subsequence through the second-layer attention mechanism network in the pre-trained value prediction model; then, according to those weights, it computes a weighted sum of the subsequence feature information of the product value time subsequences, obtaining the predicted value of the product to be predicted for the period following the product value time series, that is, the product value prediction result.
For example, the server determines the weight β_j corresponding to each product value time subsequence through the second-layer attention mechanism network, and then computes a weighted sum of the subsequence feature information S_j according to the weights β_j, obtaining the predicted value S of the product to be predicted for the period following the product value time series:

S = Σ_{j=1}^{n} β_j · S_j

where j is the serial number of a product value time subsequence, n is the total number of product value time subsequences, β_j is the weight corresponding to the j-th product value time subsequence, and S_j is the subsequence feature information of the j-th product value time subsequence.
In this embodiment, the server determines the weight corresponding to each product value time subsequence through the second-layer attention mechanism network and then performs weighted fusion of the subsequence feature information of the product value time subsequences according to those weights to obtain the predicted value of the product to be predicted for the period following the product value time series. The subsequence feature information of the plurality of product value time subsequences is thereby converted into the predicted value, realizing value prediction for the product to be predicted; during prediction, the temporal relationships within the value data are fully considered and the noise and volatility in the value data are eliminated, further improving the accuracy of value prediction.
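The second layer can be sketched the same way (NumPy; again with stand-in scores). Because the weights β_j form a convex combination, the fused prediction stays within the range of the subsequence features:

```python
import numpy as np

S_sub = np.array([0.37, 0.52, 0.44])            # subsequence features S_j (toy)
scores = np.array([0.1, 0.8, 0.3])              # stand-in temporal attention scores
beta = np.exp(scores) / np.exp(scores).sum()    # softmax over the n subsequences
prediction = float(beta @ S_sub)                # S = sum_j beta_j * S_j
```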
In an exemplary embodiment, as shown in fig. 2, the value prediction model trained in advance in step S104 is obtained by training in the following way:
step S202, a sample product value time series of the sample product and an actual value of the sample product after the sample product value time series are obtained.
Step S204, performing sliding window processing on the sample product value time series to obtain a plurality of sample product value time subsequences.
Step S206, obtaining the target feature information of each sample product value time subsequence under a plurality of preset feature indexes through the recurrent neural network in the value prediction model to be trained.
Step S208, fusing the target feature information of each sample product value time subsequence under the plurality of preset feature indexes through the first-layer attention mechanism network in the value prediction model to be trained to obtain the subsequence feature information of each sample product value time subsequence.
Step S210, fusing the subsequence feature information of each sample product value time subsequence through the second-layer attention mechanism network in the value prediction model to be trained to obtain the predicted value of the sample product for the period following the sample product value time series.
Step S212, training the value prediction model to be trained according to the difference between the prediction value of the sample product after the sample product value time sequence and the actual value of the sample product after the sample product value time sequence to obtain a trained value prediction model which is used as a pre-trained value prediction model.
The sample product refers to a product whose value change trend over a period of time is identical or similar to that of the product to be predicted.
Specifically, the server first obtains the sample product value time series of the sample product and the actual value of the sample product after the sample product value time series; it then performs sliding window processing on the sample product value time series to obtain a plurality of sample product value time subsequences, and trains the value prediction model on these subsequences. The specific training process is as follows. The server obtains the target feature information of each sample product value time subsequence under the plurality of preset feature indexes through the recurrent neural network in the value prediction model to be trained. Through the first-layer attention mechanism network in the value prediction model to be trained, it initializes the initial weight corresponding to each preset feature index and, for each sample product value time subsequence, computes a weighted sum of the target feature information of that subsequence under the preset feature indexes according to these initial weights, obtaining the subsequence feature information of the sample product value time subsequence. Through the second-layer attention mechanism network in the value prediction model to be trained, it initializes the initial weight of each sample product value time subsequence and computes a weighted sum of the subsequence feature information according to these initial weights, obtaining the predicted value of the sample product for the period following the sample product value time series. The server then compares this predicted value with the actual value of the sample product after the sample product value time series and determines the loss value of the value prediction model to be trained from the difference. Finally, when the loss value is greater than a preset threshold, the server adjusts the parameters in the model, such as the initial weights corresponding to the preset feature indexes and the initial weights of the sample product value time subsequences, and repeats the above process until the loss value is smaller than the preset threshold, at which point training is complete.
In this embodiment, the server trains the three networks in the value prediction model to be trained, namely the recurrent neural network, the first-layer attention mechanism network and the second-layer attention mechanism network, on the sample product value time series of the sample product, and judges whether training is complete by the loss value between the actual value and the predicted value of the sample product for the period following the sample product value time series. A value prediction model meeting the precision requirement can thereby be trained, providing a high-precision prediction model for the subsequent product value prediction process and improving the accuracy of value prediction.
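The threshold-driven loop above can be sketched end to end on toy data (NumPy). This is a deliberately reduced version: squared error stands in for the loss, only the first-layer attention scores are trained, and the second layer is collapsed to a uniform average, so it illustrates the control flow rather than the full model:

```python
import numpy as np

h = np.array([[0.2, 0.5, 0.8],      # toy target features: 3 subsequences
              [0.3, 0.4, 0.9],      # x 3 preset indexes (columns)
              [0.1, 0.6, 0.7]])
target = 0.7                        # actual value after the sample series
scores = np.zeros(3)                # first-layer attention scores to learn
threshold, lr = 1e-3, 2.0

for _ in range(2000):
    alpha = np.exp(scores) / np.exp(scores).sum()   # softmax weights
    pred = float((h @ alpha).mean())                # uniform second layer
    loss = (pred - target) ** 2
    if loss < threshold:                            # precision reached: stop
        break
    grad_pred = h.mean(axis=0)                      # d pred / d alpha
    grad_scores = alpha * (grad_pred - grad_pred @ alpha)  # softmax Jacobian
    scores -= lr * 2 * (pred - target) * grad_scores
```

The loop adjusts the attention scores only while the loss exceeds the threshold, mirroring the "adjust and repeat until the loss is below the preset threshold" logic of the embodiment.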
In an exemplary embodiment, in step S104, obtaining the target feature information of each product value time subsequence under a plurality of preset feature indexes through the recurrent neural network in the pre-trained value prediction model specifically includes the following: obtaining the mapped feature information of each product value time subsequence under the plurality of preset feature indexes; for each product value time subsequence, inputting its mapped feature information under the plurality of preset feature indexes into the recurrent neural network in the pre-trained value prediction model to obtain the hidden state information corresponding to that mapped feature information; and determining this hidden state information as the target feature information of the product value time subsequence under the plurality of preset feature indexes.
The hidden state information is the output of the recurrent neural network. The information it contains is most relevant to the mapped feature information, and it is a feature vector produced by the joint action of the current product value time subsequence and the historical product value time subsequences before it; for example, the hidden state information of the third product value time subsequence is a feature vector produced by the joint action of the first, second and third product value time subsequences.
Specifically, the server first obtains the mapped feature information of each product value time subsequence under each preset feature index. Then, for each product value time subsequence, it inputs the mapped feature information of that subsequence under each preset feature index into the recurrent neural network in the pre-trained value prediction model, and takes the hidden state information produced by the joint action of the mapped feature information of the subsequence and the mapped feature information of the historical product value time subsequences before it as the hidden state information corresponding to the mapped feature information of the subsequence under the plurality of preset feature indexes, thereby obtaining the target feature information of the product value time subsequence under the plurality of preset feature indexes.
Taking the determination of the target feature information of the first product value time subsequence under three preset feature indexes as an example: the server first obtains the mapped feature information of the first product value time subsequence under each preset feature index, {X_PAA, X_Slope, X_HOG-1D}, where X_PAA is the mapped feature information of the first product value time subsequence under the aggregation approximation index, X_Slope is its mapped feature information under the slope fitting index, and X_HOG-1D is its mapped feature information under the gradient index. The server then inputs {X_PAA, X_Slope, X_HOG-1D} into the recurrent neural network in the pre-trained value prediction model to obtain the hidden state information {h_PAA, h_Slope, h_HOG-1D} corresponding to the mapped feature information, and takes this hidden state information as the target feature information of the first product value time subsequence under the three preset feature indexes.
It should be noted that the embodiments of the present application list only three preset feature indexes and their specific meanings; in actual value prediction, any number and any type of preset feature indexes may be selected to characterize the product value time subsequence along multiple index dimensions.
In this embodiment, through the recurrent neural network in the pre-trained value prediction model, the server takes as the target feature information of the current product value time subsequence the hidden state information produced by the joint action of the mapped feature information of the current product value time subsequence and the mapped feature information of the historical product value time subsequences before it. The relationships between successive product value time subsequences can thus be fully considered, so that the product value is predicted on the premise of accounting for the sequential correlation of the data, improving the accuracy of value prediction.
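A toy vanilla RNN cell (NumPy, with random untrained weights standing in for the learned ones) makes the hidden-state dependency concrete: each hidden state mixes the current mapped feature vector with everything before it, so perturbing an early subsequence changes later states, while later subsequences cannot affect earlier states:

```python
import numpy as np

rng = np.random.default_rng(1)
d, hdim = 3, 4                                # 3 preset indexes, hidden size 4
Wx = 0.1 * rng.standard_normal((hdim, d))     # stand-in input weights
Wh = 0.1 * rng.standard_normal((hdim, hdim))  # stand-in recurrent weights

def rnn_hidden_states(X):
    """Hidden state h_j for each mapped feature vector X[j], in order."""
    h = np.zeros(hdim)
    states = []
    for x in X:
        h = np.tanh(Wx @ x + Wh @ h)   # h_j depends on x_j and on h_{j-1}
        states.append(h)
    return np.array(states)

X = rng.standard_normal((5, d))        # mapped features of 5 subsequences
H = rnn_hidden_states(X)
```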
In an exemplary embodiment, obtaining the mapped feature information of each product value time subsequence under a plurality of preset feature indexes specifically includes the following steps: obtaining the initial feature information of each product value time subsequence under the plurality of preset feature indexes, and determining the mapping relation of each preset feature index; and, for each product value time subsequence, mapping its initial feature information under the plurality of preset feature indexes according to the mapping relations of the preset feature indexes to obtain its mapped feature information under the plurality of preset feature indexes.
For example, for the slope fitting index, the mapped feature information for slopes in [0, 0.5] can be preset to 1, that for slopes in [0.5, 1] to 1, that for slopes in [1, 1.5] to 2, and so on. The mapping of each preset feature index mainly serves to map preset feature indexes of different dimensions into the same expression space for representation.
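A hypothetical sketch of such a mapping (NumPy `np.digitize`; the bin edges and the resulting integer tokens are illustrative, not the embodiment's actual mapping relation):

```python
import numpy as np

edges = np.array([0.0, 0.5, 1.0, 1.5])   # illustrative slope bin boundaries
slopes = np.array([0.2, 0.7, 1.2])       # initial feature info (fitted slopes)
tokens = np.digitize(slopes, edges)      # mapped feature info: one bucket id each
# indexes of different dimensions can be discretized into one shared token space
```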
Specifically, the server obtains the initial feature information of each product value time subsequence under the plurality of preset feature indexes and determines the mapping relation of each preset feature index; then, for each product value time subsequence, it maps each piece of initial feature information of the subsequence under each preset feature index in turn according to the corresponding mapping relation, obtaining the corresponding mapped feature information of the subsequence under each preset feature index.
Taking the determination of the mapped feature information of the first product value time subsequence as an example: the server first performs feature extraction of three different dimensions on the first product value time subsequence to obtain its initial feature information {x_PAA, x_Slope, x_HOG-1D} under the three preset feature indexes, namely the aggregation approximation index, the slope fitting index and the gradient index; it then obtains the mapping relations of these three preset feature indexes; and finally, according to the mapping relations, it maps x_PAA, x_Slope and x_HOG-1D respectively, obtaining the mapped feature information {X_PAA, X_Slope, X_HOG-1D} of the first product value time subsequence.
In this embodiment, the server maps the initial feature information of the product value time subsequence under each preset feature index according to the mapping relation corresponding to that preset feature index, obtaining the mapped feature information of the subsequence under each preset feature index. Initial feature information of different dimensions can thereby be mapped into a unified expression space for representation, facilitating the subsequent processing of the data by the pre-trained value prediction model.
In an exemplary embodiment, the plurality of preset feature indexes at least include an aggregation approximation index, a slope fitting index and a gradient index.
The step of obtaining the initial feature information of each product value time subsequence under a plurality of preset feature indexes specifically includes the following steps: for each product value time subsequence, dividing the subsequence to obtain a plurality of product value time secondary subsequences of equal duration; determining the average value of each product value time secondary subsequence; and fusing the average values of the product value time secondary subsequences to obtain the aggregation approximation information of the product value time subsequence under the aggregation approximation index, which serves as the initial feature information of the product value time subsequence under the aggregation approximation index.
Specifically, the server may use the PAA (Piecewise Aggregate Approximation) method to determine the initial feature information of the product value time subsequence under the aggregation approximation index. The specific process is as follows: the server divides each product value time subsequence again, splitting it into a plurality of product value time secondary subsequences of equal duration; taking one product value time subsequence as an example, it determines the average value of each product value time secondary subsequence in turn and represents that secondary subsequence by its average value; it then combines the average values of the secondary subsequences to represent the initial feature information of the product value time subsequence under the aggregation approximation index. That is, the product value time subsequence is divided evenly, and the vector of average values of the secondary subsequences approximates the original product value time subsequence.
In this embodiment, by re-dividing the product value time subsequence, the server can convert long sequence data into a number of short sequence segments, so that the change trends and correlations in the data are better captured, providing an analytical basis for the subsequent accurate prediction of the product value and further improving the accuracy of product value prediction.
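A minimal PAA sketch (NumPy; the segment count is illustrative):

```python
import numpy as np

def paa(subseq, n_segments):
    """Piecewise Aggregate Approximation: replace each equal-length
    secondary subsequence by its mean, approximating the original series."""
    segments = np.array_split(np.asarray(subseq, dtype=float), n_segments)
    return np.array([seg.mean() for seg in segments])

x = np.array([1.0, 3.0, 2.0, 4.0, 6.0, 8.0])   # one product value time subsequence
approx = paa(x, n_segments=3)                  # secondary-subsequence means
```

With equal-length segments, the mean of the PAA vector equals the mean of the original subsequence, so the overall level is preserved while short-term fluctuation is smoothed away.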
In an exemplary embodiment, the step of obtaining the initial feature information of each product value time subsequence under a plurality of preset feature indexes further specifically includes the following steps: for each product value time subsequence, dividing the subsequence to obtain a plurality of product value time tertiary subsequences; determining the gradient information of each product value time tertiary subsequence, where the gradient information includes horizontal gradient information in the horizontal direction and vertical gradient information in the vertical direction; and obtaining the gradient histogram information of the product value time subsequence under the gradient index from the gradient information of the tertiary subsequences, which serves as the initial feature information of the product value time subsequence under the gradient index.
Specifically, the server may use a Histogram of Oriented Gradients (HOG) to aggregate the gradients of the product value time subsequence in the horizontal and vertical directions into a one-dimensional gradient histogram, thereby determining the initial feature information of the product value time subsequence under the gradient index. The specific steps are as follows: the server divides each product value time subsequence again, splitting it into a plurality of product value time tertiary subsequences; then, taking one product value time subsequence as an example, it determines the horizontal gradient information of each tertiary subsequence in the horizontal direction and its vertical gradient information in the vertical direction; finally, it draws a histogram of oriented gradients from the horizontal and vertical gradient information, which serves as the initial feature information of the product value time subsequence under the gradient index.
In this embodiment, through the horizontal and vertical gradient information of the product value time tertiary subsequences, the server can grasp the overall gradient change trend of the product value time subsequence, so that the change trends and correlations in the data are better captured, providing an analytical basis for the subsequent accurate prediction of the product value and further improving the accuracy of product value prediction.
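A simplified 1-D HOG sketch (NumPy): the vertical gradient is the first difference of the values, the horizontal gradient is the unit time step, and the resulting orientations are aggregated into a histogram. The bin count is illustrative:

```python
import numpy as np

def hog_1d(subseq, n_bins=2):
    """Aggregate the gradient orientations of a 1-D series into a histogram."""
    dy = np.diff(np.asarray(subseq, dtype=float))  # vertical gradient
    dx = np.ones_like(dy)                          # horizontal gradient (step 1)
    angles = np.arctan2(dy, dx)                    # orientation of each step
    hist, _ = np.histogram(angles, bins=n_bins,
                           range=(-np.pi / 2, np.pi / 2))
    return hist

h = hog_1d([1.0, 2.0, 2.0, 1.0, 3.0])   # one product value time subsequence
# with 2 bins: one falling step vs. three non-falling steps
```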
In an exemplary embodiment, the server may also capture the slope change trend of each product value time subsequence over a relatively long period by least-squares fitting, thereby determining the initial feature information of each product value time subsequence under the slope fitting index.
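For the slope fitting index, a least-squares sketch (NumPy `np.polyfit` with degree 1; the data are toy values chosen to be exactly linear):

```python
import numpy as np

t = np.arange(5, dtype=float)                  # time axis of the subsequence
values = np.array([1.0, 3.0, 5.0, 7.0, 9.0])  # toy, exactly linear values
slope, intercept = np.polyfit(t, values, deg=1)
# the fitted slope is the subsequence's initial feature under this index
```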
In an exemplary embodiment, in step S108, after fusing the subsequence feature information of each product value time subsequence through the second-layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted for the period following the product value time series, the method further includes the following steps: obtaining the actual value of the product to be predicted after the product value time series; determining the loss value of the pre-trained value prediction model from the predicted value and the actual value of the product to be predicted after the product value time series; and, when the loss value is greater than a preset threshold, correcting the parameters of the pre-trained value prediction model according to the loss value.
The loss value of the pre-trained value prediction model is determined by a cross-entropy function between the predicted value of the product to be predicted for the period following the product value time series and the corresponding actual value; the preset threshold is a loss value threshold set according to the training precision requirement of the value prediction model.
Specifically, after obtaining the predicted value of the product to be predicted for the period following the product value time series, the server also obtains the corresponding actual value; it then determines the loss value of the pre-trained value prediction model from the cross entropy between the predicted value and the actual value; and, when the loss value is greater than the preset threshold, it corrects the parameters in the pre-trained value prediction model, such as the weights determined by the first-layer attention mechanism or the weights determined by the second-layer attention mechanism, until the loss value determined from the predicted and actual values obtained with the corrected model is smaller than the preset threshold.
In this embodiment, the server determines the loss value of the pre-trained value prediction model from the predicted and actual values of the product to be predicted, so that the trained value prediction model is trained again. The value prediction model is thus corrected based on actual prediction results, dynamically ensuring its prediction precision and further guaranteeing the accuracy of value prediction.
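One threshold-gated correction step can be sketched as follows (NumPy; squared error stands in for the cross-entropy loss named above, and all numbers are toy values):

```python
import numpy as np

predicted, actual = 0.42, 0.50      # model output vs. observed actual value
threshold, lr = 1e-3, 0.1           # precision requirement, step size
beta = np.array([0.2, 0.5, 0.3])    # current second-layer weights
S_sub = np.array([0.3, 0.4, 0.6])   # subsequence feature information

loss = (predicted - actual) ** 2
if loss > threshold:                # correct only when precision degrades
    beta = beta - lr * 2 * (predicted - actual) * S_sub  # gradient step
    beta = beta / beta.sum()        # keep the attention weights normalized
```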
In an exemplary embodiment, as shown in fig. 3, another product value prediction method is provided, which is described by taking the method as an example applied to a server, and includes the following steps:
step S301, training the value prediction model to be trained according to the sample product value time sequence of the sample product and the actual value of the sample product after the sample product value time sequence to obtain a pre-trained value prediction model.
Step S302, performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences.
Step S303, aiming at each product value time subsequence, acquiring initial characteristic information of the product value time subsequence under a plurality of preset characteristic indexes, and confirming the mapping relation of each preset characteristic index.
Step S304, according to the mapping relation of each preset characteristic index, mapping processing is carried out on the initial characteristic information of the product value time subsequence under a plurality of preset characteristic indexes, and the mapped characteristic information of the product value time subsequence under the plurality of preset characteristic indexes is obtained.
Step S305, inputting the mapped characteristic information of the product value time subsequence under a plurality of preset characteristic indexes into a recurrent neural network in a pre-trained value prediction model to obtain target characteristic information of the product value time subsequence under the plurality of preset characteristic indexes.
Step S306, determining the weight of each preset feature index through the first-layer attention mechanism network in the pre-trained value prediction model.
Step S307, for each product value time subsequence, fusing the target feature information of the product value time subsequence under the plurality of preset feature indexes according to the weights of the preset feature indexes to obtain the subsequence feature information of the product value time subsequence.
Step S308, determining the weight of each product value time subsequence through the second-layer attention mechanism network in the pre-trained value prediction model.
Step S309, fusing the subsequence feature information of each product value time subsequence according to the weights of the product value time subsequences to obtain the predicted value of the product to be predicted for the period following the product value time series.
Step S310, determining the loss value of the pre-trained value prediction model from the predicted value of the product to be predicted for the period following the product value time series and the corresponding actual value.
Step S311, when the loss value is greater than the preset threshold, correcting the parameters of the pre-trained value prediction model according to the loss value.
In this embodiment, the server can obtain the subsequence feature information of the product value time subsequences based on the target feature information, under different preset feature indexes, of the plurality of product value time subsequences obtained by sliding window processing and the weights of the preset feature indexes determined by the first-layer attention mechanism network in the pre-trained value prediction model; it can then obtain the predicted value of the product to be predicted for the period following the product value time series based on the subsequence feature information and the weights of the product value time subsequences determined by the second-layer attention mechanism network; and finally it corrects the pre-trained value prediction model based on the loss value between the predicted and actual values. On the one hand, predicting the product value based on the product value time series preserves the dependency and correlation between earlier and later value data; on the other hand, jointly considering the features of the product value time series at the scales of the plurality of preset feature indexes eliminates the volatility and noise in the product value data; in addition, the parameters in the pre-trained value prediction model are dynamically updated. The accuracy of product value prediction is thereby improved.
To illustrate the product value prediction method provided in the embodiments of the present application more clearly, the method is described in detail below with a specific embodiment. In an exemplary embodiment, the present application further provides a multi-scale-fusion hierarchical attention product value prediction method as shown in fig. 4. In this embodiment, the product value prediction method provided by the present application is described in detail in a stock price prediction scenario, and specifically includes the following steps:
Step 1: to relieve the noise and fluctuation in the stock price sequence, sliding window processing is applied to obtain stock price subsequences under a plurality of windows.
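As an illustration, the multi-scale sliding window of step 1 can be sketched as follows (a minimal sketch; the function name `sliding_windows` and the example window sizes are illustrative assumptions, not taken from the application):

```python
import numpy as np

def sliding_windows(prices, window_sizes):
    """Cut the price sequence into trailing subsequences, one per window
    size; longer windows capture longer-term trading behaviour."""
    prices = np.asarray(prices, dtype=float)
    return {w: prices[-w:] for w in window_sizes if w <= len(prices)}

series = [10.0, 10.2, 9.9, 10.4, 10.6, 10.3, 10.8, 11.0]
subsequences = sliding_windows(series, window_sizes=[3, 5, 8])
```

Each resulting subsequence then feeds the index computations of step 2.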
Step 2: for each stock price subsequence, compute three indexes that influence stock price fluctuation: an aggregation approximation index, a slope fitting index, and a gradient index.
The aggregation approximation index mitigates short-term stock price fluctuations: from long sequence data, shorter sequence data is generated whose trend approximates that of the long sequence. Specifically, a stock price subsequence is divided into l equal-length segments, and piecewise aggregate approximation (PAA) produces a new vector that approximates the original subsequence.
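A minimal sketch of the PAA computation described above (the segment count `l` and the sample prices are illustrative assumptions):

```python
import numpy as np

def paa(subsequence, l):
    """Piecewise Aggregate Approximation: represent the subsequence by
    the mean of each of l equal-length segments, keeping its trend."""
    x = np.asarray(subsequence, dtype=float)
    return np.array([seg.mean() for seg in np.array_split(x, l)])

prices = [10.0, 10.2, 10.4, 10.6, 11.0, 11.2, 11.4, 11.6]
approx = paa(prices, l=4)  # → [10.1, 10.5, 11.1, 11.5]
```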
The slope fitting index captures the trend of the sequence over a relatively long period, that is, a feature describing the time-shift invariance of the time series. Specifically, the slope of the time series over a given interval is obtained by least-squares fitting.
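The least-squares slope can be sketched as follows (the function name is an assumption; the fit is over the subsequence's integer time index):

```python
import numpy as np

def fitted_slope(subsequence):
    """Least-squares slope of the subsequence against its time index,
    summarising the trend over the interval."""
    y = np.asarray(subsequence, dtype=float)
    t = np.arange(len(y), dtype=float)
    slope, _intercept = np.polyfit(t, y, deg=1)
    return slope

s = fitted_slope([1.0, 2.0, 3.0, 4.0])  # → 1.0 for a perfectly linear rise
```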
The gradient index statistically aggregates the gradients in different directions into a histogram.
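The application does not give a formula for the gradient index; one plausible one-dimensional sketch, treating each time step as a unit horizontal move and the price change as the vertical move, is:

```python
import numpy as np

def gradient_histogram(subsequence, n_bins=4):
    """Aggregate the direction of each local price move into a
    normalised histogram of gradient angles."""
    y = np.asarray(subsequence, dtype=float)
    dy = np.diff(y)                 # vertical gradient (price change)
    dx = np.ones_like(dy)           # horizontal gradient (unit time step)
    angles = np.arctan2(dy, dx)     # direction of each move
    hist, _edges = np.histogram(angles, bins=n_bins,
                                range=(-np.pi / 2, np.pi / 2))
    return hist / hist.sum()

feat = gradient_histogram([10.0, 10.5, 10.2, 10.9, 11.0])
```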
Step 3: concatenate the three indexes of each stock price subsequence into a feature vector of the subsequence, and apply a mapping transformation to the feature vector to obtain the mapped feature vector of the stock price subsequence.
Step 4: input the mapped feature vector of each stock price subsequence into the LSTM network of the price prediction model to obtain the hidden states of the mapped feature vector, where each index corresponds to one hidden state.
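A self-contained sketch of how an LSTM turns a feature sequence into a hidden state (the weights here are random placeholders; a real model would use a trained LSTM from a deep learning framework):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_last_hidden(x_seq, W, U, b, hidden_size):
    """Run one LSTM layer over a feature sequence and return the final
    hidden state, used as the index's representation of the subsequence."""
    h = np.zeros(hidden_size)
    c = np.zeros(hidden_size)
    for x in x_seq:
        z = W @ x + U @ h + b               # stacked gate pre-activations
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)          # cell state update
        h = o * np.tanh(c)                  # new hidden state
    return h

rng = np.random.default_rng(0)
feat_dim, hidden_dim, steps = 3, 4, 5
W = rng.normal(size=(4 * hidden_dim, feat_dim))
U = rng.normal(size=(4 * hidden_dim, hidden_dim))
b = np.zeros(4 * hidden_dim)
hidden = lstm_last_hidden(rng.normal(size=(steps, feat_dim)), W, U, b, hidden_dim)
```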
Step 5: through the first-layer attention mechanism network of the price prediction model, compute a weighted sum of the three hidden states of each stock price subsequence to obtain the subsequence feature representation of that subsequence.
Step 6: through the second-layer attention mechanism network of the price prediction model, compute a weighted sum of the subsequence feature representations of all stock price subsequences to obtain the predicted value of the stock price.
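Steps 5 and 6 share the same weighted-sum pattern; a minimal sketch of both attention layers follows (the dot-product scoring and the random query vectors are illustrative assumptions about the attention form):

```python
import numpy as np

def attention_pool(vectors, query):
    """Score each vector against a query, softmax-normalise the scores,
    and return the weighted sum together with the weights."""
    V = np.asarray(vectors, dtype=float)
    scores = V @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ V, weights

rng = np.random.default_rng(1)
hidden_dim = 4
# First layer: fuse the three index hidden states of one subsequence.
index_states = rng.normal(size=(3, hidden_dim))
subseq_repr, w1 = attention_pool(index_states, rng.normal(size=hidden_dim))
# Second layer: fuse the representations of five subsequences.
subseq_reprs = rng.normal(size=(5, hidden_dim))
fused, w2 = attention_pool(subseq_reprs, rng.normal(size=hidden_dim))
```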
Step 7: optimize the price prediction model with the cross-entropy loss between the predicted value and the true value of the stock price.
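Since step 7 uses a cross-entropy loss, the prediction is presumably treated as a classification (e.g. price up, flat, or down); a sketch of the loss computation under that assumption:

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def cross_entropy(probs, true_class):
    """Cross-entropy between predicted class probabilities and the
    observed class; lower is better."""
    return -np.log(probs[true_class])

logits = np.array([2.0, 0.5, -1.0])   # model output for one sample
probs = softmax(logits)
loss = cross_entropy(probs, true_class=0)
```

The model parameters are then corrected by gradient descent on this loss, as in steps S310 and S311.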
In this embodiment, because the stock price is formed by trading activities over different time spans, a multi-scale sliding window is applied to the stock price sequence to extract multiple local descriptive features; these features are then fed into the LSTM network, and fusion across the different scales and the different stock price subsequences is performed by the attention mechanism networks.
It should be understood that, although the steps in the flowcharts related to the embodiments are shown in sequence as indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least a part of the steps in these flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and whose order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least a part of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides a product value prediction apparatus for implementing the above product value prediction method. The solution provided by the apparatus is similar to that described for the method, so for the specific limitations in one or more embodiments of the product value prediction apparatus below, reference may be made to the limitations of the product value prediction method above; details are not repeated here.
In an exemplary embodiment, as shown in fig. 5, there is provided a product value prediction apparatus including: a sequence sliding window processing module 502, an index information confirmation module 504, a sequence information confirmation module 506, and a product value prediction module 508, wherein:
The sequence sliding window processing module 502 is configured to perform sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences.
The index information confirmation module 504 is configured to obtain target feature information of each product value time subsequence under multiple preset feature indexes through a recurrent neural network in a pre-trained value prediction model.
The sequence information confirmation module 506 is configured to perform fusion processing on the target feature information of each product value time subsequence under the multiple preset feature indexes through a first-layer attention mechanism network in the pre-trained value prediction model, to obtain subsequence feature information of each product value time subsequence.
The product value prediction module 508 is configured to perform fusion processing on the subsequence feature information of each product value time subsequence through a second-layer attention mechanism network in the pre-trained value prediction model, to obtain a predicted value of the product to be predicted after the product value time sequence.
In an exemplary embodiment, the sequence information confirmation module 506 is further configured to determine the weight of each preset feature index through the first-layer attention mechanism network in the pre-trained value prediction model; and, for each product value time subsequence, fuse the target feature information of the product value time subsequence under the multiple preset feature indexes using the weight of each preset feature index, to obtain the subsequence feature information of the product value time subsequence.
In an exemplary embodiment, the product value prediction module 508 is further configured to determine the weight of each product value time subsequence through the second-layer attention mechanism network in the pre-trained value prediction model; and fuse the subsequence feature information of each product value time subsequence according to those weights, to obtain the predicted value of the product to be predicted after the product value time sequence.
In an exemplary embodiment, the product value prediction apparatus further includes a prediction model training module for obtaining a sample product value time series of the sample product and an actual value of the sample product after the sample product value time series; performing sliding window processing on the sample product value time sequence to obtain a plurality of sample product value time subsequences; acquiring target characteristic information of each sample product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a value prediction model to be trained; performing fusion processing on target characteristic information of each sample product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in a value prediction model to be trained to obtain subsequence characteristic information of each sample product value time subsequence; fusing subsequence characteristic information of each sample product value time subsequence through a second layer attention mechanism network in a value prediction model to be trained to obtain the prediction value of the sample product after the sample product value time sequence; and training the value prediction model to be trained according to the difference between the predicted value of the sample product after the sample product value time sequence and the actual value of the sample product after the sample product value time sequence to obtain a trained value prediction model which is used as a pre-trained value prediction model.
In an exemplary embodiment, the index information confirmation module 504 is further configured to obtain the mapped feature information of each product value time subsequence under a plurality of preset feature indexes; for each product value time subsequence, input the mapped feature information of the product value time subsequence under the plurality of preset feature indexes into the recurrent neural network in the pre-trained value prediction model, to obtain the hidden state information corresponding to the mapped feature information; and confirm the hidden state information corresponding to the mapped feature information of the product value time subsequence under the plurality of preset feature indexes as the target feature information of the product value time subsequence under the plurality of preset feature indexes.
In an exemplary embodiment, the index information confirmation module 504 is further configured to obtain initial feature information of each product value time subsequence under a plurality of preset feature indexes and confirm the mapping relationship of each preset feature index; and, for each product value time subsequence, map the initial feature information of the product value time subsequence under the plurality of preset feature indexes according to the mapping relationship of each preset feature index, to obtain the mapped feature information of the product value time subsequence under the plurality of preset feature indexes.
In an exemplary embodiment, the index information confirmation module 504 is further configured to divide each product value time subsequence into a plurality of product value time secondary subsequences of equal time length; confirm the average value of each product value time secondary subsequence; and fuse the average values of the product value time secondary subsequences to obtain the aggregation approximation information of the product value time subsequence under the aggregation approximation index, as the initial feature information of the product value time subsequence under the aggregation approximation index.
In an exemplary embodiment, the index information confirmation module 504 is further configured to divide each product value time subsequence into a plurality of product value time tertiary subsequences; confirm the gradient information of each product value time tertiary subsequence, the gradient information including horizontal gradient information in the horizontal direction and vertical gradient information in the vertical direction; and obtain, from the gradient information of each product value time tertiary subsequence, the gradient histogram information of the product value time subsequence under the gradient index, as the initial feature information of the product value time subsequence under the gradient index.
In an exemplary embodiment, the product value prediction apparatus further includes a prediction model modification module configured to obtain the actual value of the product to be predicted after the product value time sequence; confirm the loss value of the pre-trained value prediction model according to the predicted value of the product to be predicted after the product value time sequence and the actual value of the product to be predicted after the product value time sequence; and, when the loss value is greater than the preset threshold, correct the parameters of the pre-trained value prediction model according to the loss value.
The modules in the product value predicting apparatus may be implemented in whole or in part by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In an exemplary embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, an Input/Output interface (I/O for short), and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing the product value time series data. The input/output interface of the computer device is used for exchanging information between the processor and an external device. The communication interface of the computer device is used for connecting and communicating with an external terminal through a network. The computer program is executed by a processor to implement a product value prediction method.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In an exemplary embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an exemplary embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In an exemplary embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data need to comply with the relevant laws and regulations and standards of the relevant country and region.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include a Read-Only Memory (ROM), a magnetic tape, a floppy disk, a flash memory, an optical memory, a high-density embedded non-volatile memory, a Resistive Random Access Memory (ReRAM), a Magnetoresistive Random Access Memory (MRAM), a Ferroelectric Random Access Memory (FRAM), a Phase Change Memory (PCM), a graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases involved in the embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, etc., without limitation.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above examples express only several embodiments of the present application, and their description is specific and detailed, but should not be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these fall within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (13)

1. A method for predicting product value, the method comprising:
performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences;
acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model;
performing fusion processing on target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in the pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence;
and fusing the subsequence characteristic information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
2. The method according to claim 1, wherein the obtaining of the subsequence feature information of each product value time subsequence by fusing the target feature information of each product value time subsequence under a plurality of preset feature indexes through a first layer attention mechanism network in the pre-trained value prediction model comprises:
confirming the weight of each preset characteristic index through a first layer attention mechanism network in the pre-trained value prediction model;
and aiming at each product value time subsequence, performing fusion processing on target characteristic information of the product value time subsequence under a plurality of preset characteristics through the weight of each preset characteristic index to obtain subsequence characteristic information of the product value time subsequence.
3. The method according to claim 1, wherein the obtaining the predicted value of the product to be predicted after the product value time series by fusing the subsequence feature information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model comprises:
determining a weight of each of the product value time subsequences through a second layer attention mechanism network in the pre-trained value prediction model;
and fusing the subsequence characteristic information of each product value time subsequence to obtain the predicted value of the product to be predicted after the product value time sequence.
4. The method of claim 1, wherein the pre-trained value prediction model is trained by:
obtaining a sample product value time series of a sample product and an actual value of the sample product after the sample product value time series;
performing sliding window processing on the sample product value time sequence to obtain a plurality of sample product value time subsequences;
acquiring target characteristic information of each sample product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a value prediction model to be trained;
performing fusion processing on target characteristic information of each sample product value time subsequence under a plurality of preset characteristic indexes through a first layer attention mechanism network in the value prediction model to be trained to obtain subsequence characteristic information of each sample product value time subsequence;
fusing the subsequence characteristic information of each sample product value time subsequence through a second layer attention mechanism network in the value prediction model to be trained to obtain the predicted value of the sample product after the sample product value time sequence;
and training the value prediction model to be trained according to the difference between the predicted value of the sample product after the sample product value time sequence and the actual value of the sample product after the sample product value time sequence to obtain a trained value prediction model serving as the pre-trained value prediction model.
5. The method according to claim 1, wherein the obtaining of the target feature information of each product value time subsequence under a plurality of preset feature indexes through a recurrent neural network in a pre-trained value prediction model comprises:
acquiring the mapped characteristic information of each product value time subsequence under a plurality of preset characteristic indexes;
for each product value time subsequence, inputting the mapped feature information of the product value time subsequence under a plurality of preset feature indexes into a recurrent neural network in the pre-trained value prediction model to obtain hidden state information corresponding to the mapped feature information of the product value time subsequence under the plurality of preset feature indexes;
and confirming hidden state information corresponding to the mapped feature information of the product value time subsequence under a plurality of preset feature indexes as target feature information of the product value time subsequence under the plurality of preset feature indexes.
6. The method according to claim 5, wherein the obtaining of the mapped feature information of each product value time subsequence under a plurality of preset feature indicators comprises:
acquiring initial characteristic information of each product value time subsequence under a plurality of preset characteristic indexes, and confirming the mapping relation of each preset characteristic index;
and aiming at each product value time subsequence, mapping the initial characteristic information of the product value time subsequence under a plurality of preset characteristic indexes according to the mapping relation of each preset characteristic index to obtain the mapped characteristic information of the product value time subsequence under the plurality of preset characteristic indexes.
7. The method of claim 6, wherein the plurality of predetermined characteristic measures includes at least an aggregate approximation measure, a slope fitting measure, and a gradient measure;
the obtaining of the initial characteristic information of each product value time subsequence under a plurality of preset characteristic indexes includes:
aiming at each product value time subsequence, dividing the product value time subsequence to obtain a plurality of product value time secondary subsequences with equal time length;
confirming the average value of each product value time secondary subsequence;
and carrying out fusion processing on the average value of each product value time secondary subsequence to obtain the aggregation approximation information of the product value time subsequences under the aggregation approximation index, wherein the aggregation approximation information is used as the initial characteristic information of the product value time subsequences under the aggregation approximation index.
8. The method according to claim 6, wherein the obtaining of initial characteristic information of each product value time subsequence under a plurality of preset characteristic indexes further comprises:
aiming at each product value time subsequence, dividing the product value time subsequence to obtain a plurality of product value time three-level subsequences;
confirming gradient information of each product value time three-level subsequence; the gradient information includes horizontal gradient information in a horizontal direction and vertical gradient information in a vertical direction;
and obtaining gradient histogram information of the product value time subsequence under the gradient index according to the gradient information of each product value time three-level subsequence, and using the gradient histogram information as initial characteristic information of the product value time subsequence under the gradient index.
9. The method according to any one of claims 1 to 8, wherein after the sub-sequence feature information of each product value time sub-sequence is fused through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence, the method further comprises:
acquiring the actual value of the product to be predicted after the product value time series;
confirming the loss value of the pre-trained value prediction model according to the predicted value of the product to be predicted after the product value time sequence and the actual value of the product to be predicted after the product value time sequence;
and under the condition that the loss value is larger than a preset threshold value, correcting the parameters of the pre-trained value prediction model according to the loss value.
10. A product value prediction apparatus, characterized in that the apparatus comprises:
the sequence sliding window processing module is used for performing sliding window processing on the product value time sequence of the product to be predicted to obtain a plurality of product value time subsequences;
the index information confirmation module is used for acquiring target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a recurrent neural network in a pre-trained value prediction model;
the sequence information confirmation module is used for fusing target characteristic information of each product value time subsequence under a plurality of preset characteristic indexes through a first layer of attention mechanism network in the pre-trained value prediction model to obtain subsequence characteristic information of each product value time subsequence;
and the product value prediction module is used for carrying out fusion processing on the subsequence feature information of each product value time subsequence through a second layer attention mechanism network in the pre-trained value prediction model to obtain the predicted value of the product to be predicted after the product value time sequence.
11. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor realizes the steps of the method of any one of claims 1 to 9 when executing the computer program.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 9.
13. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 9 when executed by a processor.
CN202211701339.5A | 2022-12-29 | Product value prediction method, device, computer equipment and storage medium | Pending | CN115860802A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202211701339.5A (CN115860802A) | 2022-12-29 | 2022-12-29 | Product value prediction method, device, computer equipment and storage medium

Publications (1)

Publication Number | Publication Date
CN115860802A | 2023-03-28

Family

ID=85655673

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202211701339.5A (CN115860802A, Pending) | Product value prediction method, device, computer equipment and storage medium | 2022-12-29 | 2022-12-29

Country Status (1)

Country | Link
CN (1) | CN115860802A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN116739840A (en) * | 2023-08-14 | 2023-09-12 | 贵州优特云科技有限公司 | Travel package recommendation method, device and storage medium based on multi-target group optimization
CN116739840B (en) * | 2023-08-14 | 2023-11-14 | 贵州优特云科技有限公司 | Travel package recommendation method, device and storage medium based on multi-target group optimization

Similar Documents

Publication | Title
CN112418292B (en) | Image quality evaluation method, device, computer equipment and storage medium
CN111080360B (en) | Behavior prediction method, model training method, device, server and storage medium
CN110659744A (en) | Training event prediction model, and method and device for evaluating operation event
CN114662570A (en) | Business risk prediction method and device, computer equipment and storage medium
CN113255908A (en) | Method, neural network model and device for service prediction based on event sequence
CN119484065B (en) | Network security situation awareness method based on deep neural network
CN114170000A (en) | Credit card user risk category identification method, device, computer equipment and medium
CN114549174A (en) | User behavior prediction method and device, computer equipment and storage medium
CN115730125A (en) | Object identification method and device, computer equipment and storage medium
CN116755627A (en) | Spatial data storage method, spatial data storage device, computer equipment and storage medium
CN115860802A (en) | Product value prediction method, device, computer equipment and storage medium
CN115630221A (en) | Terminal application interface display data processing method and device and computer equipment
CN115204984A (en) | Business product push method, apparatus, computer equipment and storage medium
Li et al. | An alternating nonmonotone projected Barzilai–Borwein algorithm of nonnegative factorization of big matrices
CN117593090A (en) | Graph structure prediction model training method and related device based on multi-task learning
CN116976187A (en) | Modeling variable determining method, abnormal data prediction model construction method and device
CN115034225A (en) | Word processing method and device applied to medical field, electronic equipment and medium
US11989243B2 (en) | Ranking similar users based on values and personal journeys
CN116977049A (en) | Method, device, computer equipment and storage medium for processing resource application service
CN116703544A (en) | Resource interaction track prediction method and device, computer equipment and storage medium
CN119719994A (en) | Abnormal user identification method, device, computer equipment and storage medium
CN120045909A (en) | Resource allocation method, device, computer equipment and storage medium
CN116342242A (en) | Abnormality detection method, abnormality detection device, computer device, and storage medium
CN119474549A (en) | Method, device and computer equipment for predicting click probability of recommended information
CN116910604A (en) | User classification method, apparatus, computer device, storage medium, and program product

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
