CN110689164A - Prediction method and system for user reduction behavior - Google Patents

Prediction method and system for user reduction behavior

Info

Publication number
CN110689164A
Authority
CN
China
Prior art keywords
behavior
user
feature
network model
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910792568.4A
Other languages
Chinese (zh)
Other versions
CN110689164B (en)
Inventor
王艺林
王哲
苏建安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Advanced New Technologies Co Ltd
Advantageous New Technologies Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN201910792568.4A
Publication of CN110689164A
Application granted
Publication of CN110689164B
Legal status: Active (current)
Anticipated expiration

Abstract

The application discloses a method and a system for predicting user contract-termination (unsubscribe) behavior. An embodiment of the present specification provides a method for predicting a user's contract-termination behavior, including: extracting behavior-sequence features, and the remaining features other than the behavior-sequence features, from user data within a preset historical time period to generate training samples; inputting the training samples of the behavior-sequence features into a long short-term memory (LSTM) network model for training, and inputting the training samples of the remaining features into a deep neural network (DNN) model for training; combining the trained LSTM model and the DNN model into a prediction model, wherein the output of the LSTM model is connected to an input of the DNN model; and predicting the user's contract-termination behavior according to the prediction model.

Description

Prediction method and system for user reduction behavior
Technical Field
This specification relates to the field of internet technology.
Background
For some internet products, besides attracting new users to sign up, it is also necessary to predict in advance which existing users intend to terminate their contracts (unsubscribe) before they actually do, so that countermeasures (such as delivering targeted benefit advertisements) can be taken to re-activate and retain them. Most current termination-prediction systems rely on traditional machine learning models (such as LR and GBDT), but their prediction accuracy is not high enough.
Disclosure of Invention
This specification provides a method and a system for predicting user contract-termination behavior, which can more accurately identify the group of users most likely to terminate while avoiding a large amount of manual feature-engineering work.
The application discloses a method for predicting user contract-termination behavior, comprising the following steps:
extracting behavior-sequence features, and the remaining features other than the behavior-sequence features, from user data within a preset historical time period to generate training samples;
inputting the training samples of the behavior-sequence features into a long short-term memory (LSTM) network model for training, and inputting the training samples of the remaining features into a deep neural network (DNN) model for training;
combining the trained LSTM model and the DNN model into a prediction model, wherein the output of the LSTM model is connected to an input of the DNN model;
and predicting the user's contract-termination behavior according to the prediction model.
In a preferred embodiment, extracting the behavior-sequence features and the remaining features from the user data within the preset historical time period to generate the training samples further includes:
extracting the behavior-sequence features from the user data within the preset historical time period to obtain corresponding behavior-sequence data;
extracting the remaining features from the user data within the preset historical time period to obtain corresponding feature data, and, for the one or more feature groups under each feature dimension of the remaining features, preprocessing the feature data of each feature group to obtain an embedding vector corresponding to that feature group;
and obtaining the training samples from the behavior-sequence data and the embedding vectors corresponding to the feature groups.
In a preferred example, the behavior-sequence data includes at least one of: a sequence of transaction counts in the application store, a sequence of subscription states, and a sequence of termination counts.
In a preferred example, the remaining features include one or more of a basic-attribute feature dimension, a business-scenario feature dimension, and a security-attribute feature dimension;
the basic-attribute feature dimension includes one or more of a basic profile attribute feature group, a wealth attribute feature group, a location attribute feature group, a purchase attribute feature group, an interest-preference attribute feature group, a search-query attribute feature group, and a marketing-information-sensitivity attribute feature group;
the business-scenario feature dimension includes one or more of a feature group of visits to and comments on the subscription platform's official account within the payment platform, a feature group of demand for and interest in the application store, and a feature group of subscription activity on the subscription platform;
the security-attribute feature dimension includes one or more of a credit-service activation status feature group and a termination-count feature group.
In a preferred embodiment, after predicting the user's contract-termination behavior according to the prediction model, the method further includes:
sorting users from high to low by the termination probability in the prediction result, and selecting users to be targeted on the delivery platform according to the sorted result.
In a preferred example, the deep neural network model is a three-layer deep neural network model.
In a preferred example, each training sample carries a label indicating whether the user will terminate the contract.
The application also discloses a system for predicting user contract-termination behavior, comprising:
a sample generation module, configured to extract the behavior-sequence features and the remaining features from user data within a preset historical time period to generate training samples;
a model training module, configured to input the training samples of the behavior-sequence features into a long short-term memory (LSTM) network model for training and the training samples of the remaining features into a deep neural network (DNN) model for training;
and a prediction module, configured to combine the trained LSTM model and DNN model into a prediction model, wherein the output of the LSTM model is connected to an input of the DNN model, and to predict the user's contract-termination behavior according to the prediction model.
In a preferred embodiment, the sample generation module is further configured to extract the behavior-sequence features from user data within the preset historical time period to obtain corresponding behavior-sequence data; to extract the remaining features to obtain corresponding feature data and, for the one or more feature groups under each feature dimension of the remaining features, to preprocess the feature data of each group into a corresponding embedding vector; and to obtain the training samples from the behavior-sequence data and the embedding vectors of the feature groups.
In a preferred example, the behavior-sequence data includes at least one of: a sequence of transaction counts in the application store, a sequence of subscription states, and a sequence of termination counts.
In a preferred example, the remaining features include one or more of a basic-attribute feature dimension, a business-scenario feature dimension, and a security-attribute feature dimension;
the basic-attribute feature dimension includes one or more of a basic profile attribute feature group, a wealth attribute feature group, a location attribute feature group, a purchase attribute feature group, an interest-preference attribute feature group, a search-query attribute feature group, and a marketing-information-sensitivity attribute feature group;
the business-scenario feature dimension includes one or more of a feature group of visits to and comments on the subscription platform's official account within the payment platform, a feature group of demand for and interest in the application store, and a feature group of subscription activity on the subscription platform;
the security-attribute feature dimension includes a credit-service activation status feature group and/or a termination-count feature group.
In a preferred embodiment, the prediction module is further configured to sort users from high to low by the termination probability in the prediction result and to select users to be targeted on the delivery platform according to the sorted result.
In a preferred example, the deep neural network model is a three-layer deep neural network model.
In a preferred example, each training sample carries a label indicating whether the user will terminate the contract.
The application also discloses a system for predicting user contract-termination behavior, comprising:
a memory for storing computer-executable instructions; and
a processor for implementing the steps of the method described above when executing the computer-executable instructions.
The application also discloses a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the steps of the method described above.
In the embodiments of this specification, compared with traditional machine learning, combining the LSTM model with the DNN model reduces the amount of manual feature selection, makes full use of massive data to learn feature cross-relationships, and provides better fitting capability.
In addition, in the embodiments of this specification, when constructing the features, each feature's ability to discriminate between positive and negative classes and its coverage are comprehensively considered. For example, termination-related sequence features such as the user's historical terminations and consumption are extracted, and the remaining features are extracted along multiple dimensions, such as user and scenario, each of which contains multiple feature groups. This ensures the reliability and accuracy of predicting the group of users most likely to terminate.
Furthermore, because the memory mechanism of the LSTM model mines the inherent regularities of a time sequence, training it on the termination-related behavior sequences captures the influence of historical data on future predictions, fully reflects the temporal order of the behaviors, and improves the accuracy and reliability of the prediction; the LSTM output is used as one input of the DNN model. Meanwhile, an embedding vector generated for each group of the remaining features serves as the other input of the DNN model, so that the group of users most likely to terminate can be predicted more accurately, further improving the accuracy and reliability of the prediction.
A large number of technical features are described in this specification and distributed among various technical solutions; listing all possible combinations of these features (i.e., all technical solutions) would make the specification excessively long. To avoid this, the technical features disclosed in the summary above, in the embodiments and examples below, and in the drawings may be freely combined with one another to form new technical solutions (all of which should be regarded as described in this specification), unless such a combination is technically infeasible. For example, if feature A + B + C is disclosed in one example and feature A + B + D + E in another, where C and D are equivalent technical means for the same purpose of which only one would be used at a time, and E can technically be combined with C, then A + B + C + D should not be regarded as described (because it is not technically feasible), whereas A + B + C + E should be regarded as described.
Drawings
FIG. 1 is a flow chart of a method for predicting user contract-termination behavior according to the first embodiment of this specification;
FIG. 2 is a flow chart of a method for predicting user contract-termination behavior according to an example of the first embodiment of this specification;
FIG. 3 is a schematic structural diagram of a system for predicting user contract-termination behavior according to the second embodiment of this specification;
FIG. 4 is a schematic structural diagram of a system for predicting user contract-termination behavior according to an example of the second embodiment of this specification.
Detailed Description
In the following description, numerous technical details are set forth to give the reader a better understanding of the application. However, those skilled in the art will understand that the technical solutions claimed in this application can be implemented without some of these technical details, and with various changes and modifications based on the following embodiments.
Explanation of some terms:
DNN: Deep Neural Network.
LSTM: Long Short-Term Memory network.
UV: Unique Visitor.
ODPS: Open Data Processing Service, a big-data computing service, now renamed MaxCompute; a fast, fully managed TB/PB-scale data-warehouse solution.
Embodiments of the present description will be described in further detail below with reference to the accompanying drawings.
Currently, most termination-prediction models use traditional machine learning models to predict the intention to terminate. However, the prediction quality of a traditionally trained machine learning algorithm depends closely on the features extracted manually beforehand and on their number: obtaining a reasonably accurate prediction usually requires manually extracting a large number of features based on experience. On one hand, this labor cost is high; on the other hand, even after the extraction work is completed, features obtained purely from experience are often of only average reliability.
In view of the above problems, the first embodiment of this specification provides a method for predicting user contract-termination behavior that selects the group of users likely to terminate based on a deep learning model. Compared with defining the user group by rules, this improves recall; moreover, the better fitting capability usually achieved by deep learning models avoids a large amount of manual feature engineering, makes full use of massive data to learn feature cross-relationships, and improves the accuracy of the selected user group.
Specifically, FIG. 1 shows a method for predicting user contract-termination behavior according to an embodiment of the present application; as shown in FIG. 1, the method includes the following steps 102 to 108.
In step 102, the behavior-sequence features and the remaining features other than the behavior-sequence features are extracted from the user data within the preset historical time period to generate training samples.
It should be noted that the "preset historical time period" in step 102 may be any time period before the current time, for example, but not limited to, any period within the last year, such as the last few weeks or months.
Optionally, the "user data" in step 102 may be the original full user-behavior data or partial user-behavior data of a product platform. For example, for predicting users of a product under the Alibaba brand, the user data may be the full user-behavior data stored in ODPS. Alternatively, the user data may be user data stored on another big-data computing platform.
Preferably, each training sample carries a label indicating whether the user will terminate the contract. Specifically, the biggest difference between supervised and unsupervised learning is whether the input data is labeled; in this preferred embodiment the training samples are labeled, so supervised learning is used. For example, the label of a training sample may be whether the user terminated within the last week; data from the last month is used as the training set, amounting to five million samples, and the test set must be chosen from data outside that month because the test set needs to be disjoint from the training set. It should be noted that the details listed in this example are provided mainly for ease of understanding and are not intended to limit the scope of the present application.
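As a purely illustrative sketch of how such labeled, time-disjoint samples might be assembled: the code below labels each user by whether a termination event occurs in the week after an observation window, one plausible reading of the example above. The column names (user_id, event_date, terminated), the 30-day and 7-day windows, and the cutoff dates are assumptions, not the patent's data schema.

import pandas as pd

def build_labeled_samples(events: pd.DataFrame, cutoff: pd.Timestamp) -> pd.DataFrame:
    # Observation window: the month before the cutoff; label window: the week after it.
    obs_start = cutoff - pd.Timedelta(days=30)
    label_end = cutoff + pd.Timedelta(days=7)
    obs = events[(events["event_date"] >= obs_start) & (events["event_date"] < cutoff)]
    future = events[(events["event_date"] >= cutoff) & (events["event_date"] < label_end)]
    # label = 1 if the user has at least one termination event in the label window
    labels = future.groupby("user_id")["terminated"].max().rename("label")
    samples = obs.groupby("user_id").size().rename("n_events").to_frame()
    return samples.join(labels, how="left").fillna({"label": 0})

# Train and test sets use different cutoffs so they stay disjoint in time:
# train = build_labeled_samples(events, pd.Timestamp("2019-07-01"))
# test  = build_labeled_samples(events, pd.Timestamp("2019-08-01"))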
Optionally, the remaining features in step 102 include one or more of a basic-attribute feature dimension, a business-scenario feature dimension, and a security-attribute feature dimension. The basic-attribute feature dimension may include one or more of a basic profile attribute feature group, a wealth attribute feature group, a location attribute feature group, a purchase attribute feature group, an interest-preference attribute feature group, a search-query attribute feature group, and a marketing-information-sensitivity attribute feature group; the business-scenario feature dimension may include one or more of a feature group of visits to and comments on the subscription platform's official account within the payment platform, a feature group of demand for and interest in the application store, and a feature group of subscription activity on the subscription platform; the security-attribute feature dimension may include a credit-service activation status feature group and/or a termination-count feature group. Further, for example, the basic profile attribute feature group may include, but is not limited to: age, gender, constellation, occupation, education level, life stage, and so on. The wealth attribute feature group may include, but is not limited to: income bracket, purchasing-power bracket, probability of owning a home, probability of owning a car, and so on. The location attribute feature group may include, but is not limited to: place of birth, place of work, place of residence, home location, and so on. The application-store demand and interest feature group may include, but is not limited to: the user's historical transactions in the application store and the number of times the user followed or unfollowed the subscription platform's official account within the last k (k ≥ 1) days. The subscription-activity feature group of the subscription platform may include, but is not limited to: the number of logins to the subscription platform within the last m (m ≥ 1) days, the UV of applications used within the subscription platform's wallet within the last n (n ≥ 1) days, and so on. It should be noted that the details listed in this example are provided mainly for ease of understanding and are not intended to limit the scope of the present application.
Optionally, step 102 may further include the following sub-steps 202 to 206, as shown in FIG. 2. In step 202, the behavior-sequence features are extracted from the user data within the preset historical time period to obtain corresponding behavior-sequence data. In step 204, the remaining features are extracted from the user data within the preset historical time period to obtain corresponding feature data; for these features, one or more feature groups are provided under each feature dimension, and the feature data within each feature group is preprocessed to obtain an embedding vector corresponding to that feature group. Then, in step 206, the training samples are obtained from the behavior-sequence data and the embedding vectors corresponding to the feature groups. Steps 202 and 204 may be performed simultaneously or sequentially in any order. In this embodiment, during the construction of each feature, the feature's ability to discriminate between positive and negative classes and its coverage are comprehensively considered, which provides a good foundation for the subsequent model training and ensures the reliability and accuracy of predicting the group of users most likely to terminate.
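As a concrete illustration of the per-group embedding in step 204, the minimal sketch below builds one embedding vector per feature group with Keras, assuming each group has already been bucketized into integer ids. The group names, vocabulary sizes, and embedding dimensions in FEATURE_GROUPS are illustrative assumptions, not values specified by the patent.

import tensorflow as tf

# group name -> (vocabulary size, embedding dimension); values are made up for illustration
FEATURE_GROUPS = {
    "basic_profile": (1000, 8),
    "wealth": (100, 4),
    "subscription_activity": (500, 8),
}

def build_group_embeddings():
    inputs, embedded = {}, []
    for name, (vocab, dim) in FEATURE_GROUPS.items():
        inp = tf.keras.Input(shape=(1,), dtype="int32", name=name)
        emb = tf.keras.layers.Embedding(input_dim=vocab, output_dim=dim)(inp)
        embedded.append(tf.keras.layers.Flatten()(emb))
        inputs[name] = inp
    # One concatenated embedding vector per user, later fed to the DNN branch.
    return inputs, tf.keras.layers.Concatenate()(embedded)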
Further, in one embodiment, the behavior-sequence data involved in step 202 may include one or more behavior sequences related to termination and/or consumption. Optionally, the behavior-sequence data includes at least one of: a sequence of transaction counts in the application store, a sequence of subscription states, and a sequence of termination counts; but it is not limited thereto.
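For illustration only, the following sketch stacks the three example sequences into a single per-user array of shape (timesteps, channels) suitable as LSTM input; the 30-day length and daily granularity are assumptions.

import numpy as np

SEQ_LEN = 30  # assumed: one value per day over the last 30 days

def build_sequence_tensor(tx_counts, sub_states, term_counts):
    # Stack the three per-day sequences into shape (SEQ_LEN, 3) for the LSTM input.
    seq = np.stack([np.asarray(tx_counts, dtype="float32"),
                    np.asarray(sub_states, dtype="float32"),
                    np.asarray(term_counts, dtype="float32")], axis=-1)
    assert seq.shape == (SEQ_LEN, 3)
    return seq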
Then step 104 is entered: the training samples of the behavior-sequence features are input into the long short-term memory (LSTM) network model for training, and the training samples of the remaining features are input into the deep neural network (DNN) model for training. Because the memory mechanism of the LSTM model mines the inherent regularities of a time sequence, training it on the termination-related behavior sequences captures the influence of historical data on future predictions and fully reflects the temporal order of the behaviors, improving prediction accuracy and reliability; meanwhile, an embedding vector generated for each group of the remaining features serves as the other input of the DNN model, so that the group of users most likely to terminate can be predicted more accurately.
The deep neural network model is a multi-layer deep neural network model; preferably, a three-layer model is used.
Then step 106 is entered: the trained LSTM model and the DNN model are combined into a prediction model, wherein the output of the LSTM model is connected to an input of the DNN model. Compared with traditional machine learning, combining the LSTM model with the DNN model reduces the amount of manual feature selection, makes full use of massive data to learn feature cross-relationships, and provides better fitting capability, so that the group of users most likely to terminate can be predicted more accurately while a large amount of manual feature extraction is avoided.
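A minimal sketch of such a combined network in Keras is shown below: the LSTM output of the sequence branch is concatenated with the group-embedding vector and fed into a three-layer DNN whose sigmoid output is the termination probability. The layer sizes, input shapes, and end-to-end joint training of both branches are illustrative assumptions, not the patent's exact configuration.

import tensorflow as tf

def build_prediction_model(seq_len=30, seq_channels=3, emb_dim=64):
    # LSTM branch over the behavior-sequence features.
    seq_in = tf.keras.Input(shape=(seq_len, seq_channels), name="behavior_sequence")
    seq_vec = tf.keras.layers.LSTM(32)(seq_in)

    # Second input: the concatenated embedding vectors of the remaining feature groups.
    emb_in = tf.keras.Input(shape=(emb_dim,), name="group_embeddings")

    # The LSTM output is connected to the input side of the DNN.
    x = tf.keras.layers.Concatenate()([seq_vec, emb_in])
    for units in (128, 64, 32):  # three hidden layers, as in the preferred example
        x = tf.keras.layers.Dense(units, activation="relu")(x)
    out = tf.keras.layers.Dense(1, activation="sigmoid", name="termination_prob")(x)

    model = tf.keras.Model(inputs=[seq_in, emb_in], outputs=out)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC()])
    return model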
Thereafter, step 108 is entered: the user's contract-termination behavior is predicted according to the prediction model.
Optionally, step 108 may further include the following step: sorting users from high to low by the termination probability in the prediction result, and selecting users to be targeted on the delivery platform according to the sorted result. For example, the prediction model scores and ranks users based on their behavior-sequence features and the remaining features, and the top N million users in the ranking are extracted and delivered to the delivery platform. It should be noted that the details listed in this example are provided mainly for ease of understanding and are not intended to limit the scope of the present application.
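For illustration, the sketch below scores a batch of users with the trained model, sorts them by predicted termination probability in descending order, and keeps the top N. The input names mirror the model sketch above and, like the batch variables, are assumptions rather than interfaces defined by the patent.

import numpy as np

def select_top_users(model, seq_batch, emb_batch, user_ids, top_n=1_000_000):
    # Predicted termination probability for each candidate user.
    probs = model.predict({"behavior_sequence": seq_batch,
                           "group_embeddings": emb_batch}).ravel()
    order = np.argsort(-probs)  # indices sorted by probability, highest first
    top = order[:top_n]
    return [(user_ids[i], float(probs[i])) for i in top]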
The second embodiment of this specification provides a system for predicting user contract-termination behavior, whose structure is shown in FIG. 3. It includes a sample generation module, a model training module, and a prediction module, as follows:
The sample generation module is configured to extract the behavior-sequence features and the remaining features from user data within a preset historical time period to generate training samples.
The preset historical time period may be any time period before the current time, for example, but not limited to, any period within the last year, such as the last few weeks or months.
Optionally, the user data may be the original full user-behavior data or partial user-behavior data of a product platform. For example, for predicting users of a product under the Alibaba brand, the user data may be the full user-behavior data stored in ODPS. Alternatively, the user data may be user data stored on another big-data computing platform.
Preferably, the training samples carry a label. In one embodiment, the label is whether the user will terminate the contract.
Optionally, the remaining features may include one or more of a basic-attribute feature dimension, a business-scenario feature dimension, and a security-attribute feature dimension. The basic-attribute feature dimension may include one or more of a basic profile attribute feature group, a wealth attribute feature group, a location attribute feature group, a purchase attribute feature group, an interest-preference attribute feature group, a search-query attribute feature group, and a marketing-information-sensitivity attribute feature group; the business-scenario feature dimension may include one or more of a feature group of visits to and comments on the subscription platform's official account within the payment platform, a feature group of demand for and interest in the application store, and a feature group of subscription activity on the subscription platform; the security-attribute feature dimension includes a credit-service activation status feature group and/or a termination-count feature group.
Optionally, the sample generation module is configured to extract the behavior-sequence features from user data within the preset historical time period to obtain corresponding behavior-sequence data; to extract the remaining features to obtain corresponding feature data and, for the one or more feature groups under each feature dimension, to preprocess the feature data of each group into a corresponding embedding vector; and to obtain the training samples from the behavior-sequence data and the embedding vectors of the feature groups. In this embodiment, when selecting the features, each feature's ability to discriminate between positive and negative classes and its coverage are comprehensively considered, ensuring the reliability and accuracy of predicting the group of users most likely to terminate.
In one embodiment, the behavior-sequence data may include one or more behavior sequences related to termination and/or consumption. Optionally, the behavior-sequence data includes at least one of: a sequence of transaction counts in the application store, a sequence of subscription states, and a sequence of termination counts; but it is not limited thereto.
The model training module contains the long short-term memory network model and the deep neural network model, and is configured to input the training samples of the behavior-sequence features into the LSTM model for training and the training samples of the remaining features into the DNN model for training. Because the memory mechanism of the LSTM model mines the inherent regularities of a time sequence, training it on the termination-related behavior sequences captures the influence of historical data on future predictions and fully reflects the temporal order of the behaviors; the LSTM output is used as one input of the DNN model, improving prediction accuracy and reliability. Meanwhile, an embedding vector generated for each group of the remaining features serves as the other input of the DNN model, so that the group of users most likely to terminate can be predicted more accurately.
The deep neural network model is a multi-layer deep neural network model; preferably, a three-layer model is used.
The prediction module is configured to combine the trained LSTM model and DNN model into a prediction model, wherein the output of the LSTM model is connected to an input of the DNN model, and to predict the user's contract-termination behavior according to the prediction model. Combining the LSTM model with the DNN model reduces the amount of manual feature selection, makes full use of massive data to learn feature cross-relationships, and provides better fitting capability.
Optionally, the prediction module is further configured to sort users from high to low by the termination probability in the prediction result and to select users to be targeted on the delivery platform according to the sorted result.
FIG. 4 is a schematic structural diagram of a system for predicting user contract-termination behavior according to an example of this specification. In this example, the system includes: a feature extraction module, configured to extract the behavior-sequence features from historical user-behavior data to obtain corresponding behavior-sequence data, and to extract the remaining features to obtain corresponding feature data; a training-sample generation module, configured to preprocess the feature data of each feature group under each feature dimension into a corresponding embedding vector and to obtain the training samples from the behavior-sequence data and the embedding vectors; a model training module, configured to input the training samples of the behavior-sequence features into the LSTM model for training and the training samples of the remaining features into the DNN model for training; a model prediction module, configured to score users with the prediction model based on the extracted behavior-sequence data and remaining feature data to obtain a prediction result; an ordering module, configured to sort the termination probabilities in the prediction result from high to low; a user-group selection module, configured to select the top N million users in the ranking; and a platform delivery module, configured to deliver the selected N million users to the delivery platform.
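A rough end-to-end sketch of how the modules of FIG. 4 could be wired together follows; every callable name is a hypothetical stand-in for the corresponding module, not an API defined by the patent.

from dataclasses import dataclass
from typing import Callable, List, Sequence, Tuple

@dataclass
class Score:
    user_id: str
    prob: float

def run_pipeline(
    extract_features: Callable[[object], Tuple[object, object]],   # feature extraction module
    make_samples: Callable[[object, object], object],              # training-sample generation module
    train: Callable[[object], object],                             # model training module
    score_users: Callable[[object, object, object], List[Score]],  # model prediction module
    deliver: Callable[[Sequence[str]], None],                      # platform delivery module
    raw_events: object,
    top_n: int = 1_000_000,
) -> None:
    seqs, others = extract_features(raw_events)
    model = train(make_samples(seqs, others))
    ranked = sorted(score_users(model, seqs, others),
                    key=lambda s: s.prob, reverse=True)            # ordering module
    deliver([s.user_id for s in ranked[:top_n]])                   # user-group selection + delivery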
The first embodiment is a method embodiment corresponding to the present embodiment, and the technical details in the first embodiment may be applied to the present embodiment, and the technical details in the present embodiment may also be applied to the first embodiment.
It should be noted that, as will be understood by those skilled in the art, the implementation functions of the modules shown in the embodiment of the prediction system for user reduction behavior described above can be understood by referring to the foregoing description of the prediction method for user reduction behavior. The functions of the modules shown in the embodiment of the prediction system for user reduction behavior described above may be implemented by a program (executable instructions) running on a processor, or may be implemented by specific logic circuits. The prediction system for the user's solution behavior in the embodiments of the present disclosure may also be stored in a computer-readable storage medium if it is implemented in the form of a software function module and sold or used as a stand-alone product. Based on such understanding, the technical solutions of the embodiments of the present specification may be essentially implemented or portions thereof contributing to the prior art may be embodied in the form of a software product stored in a storage medium, and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the method of the embodiments of the present specification. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present description are not limited to any specific combination of hardware and software.
Accordingly, the present specification embodiments also provide a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the method embodiments of the present specification. Computer-readable storage media, including both non-transitory and non-transitory, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable storage medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
In addition, the embodiments of this specification also provide a system for predicting user contract-termination behavior, comprising a memory for storing computer-executable instructions, and a processor; the processor is configured to implement the steps of the method embodiments described above when executing the computer-executable instructions in the memory.
In one embodiment, the computer-executable instructions may be used to: extract the behavior-sequence features and the remaining features from user data within a preset historical time period to generate training samples; input the training samples of the behavior-sequence features into the LSTM network model for training, and the training samples of the remaining features into the DNN model for training; combine the trained LSTM model and DNN model into a prediction model, wherein the output of the LSTM model is connected to an input of the DNN model; and predict the user's contract-termination behavior according to the prediction model.
In one embodiment, the Processor may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or the like. The aforementioned memory may be a read-only memory (ROM), a Random Access Memory (RAM), a Flash memory (Flash), a hard disk, or a solid state disk. The steps of the method disclosed in the embodiments of the present invention may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor. In one embodiment, the system for predicting user reduction behavior further comprises a bus and a communication interface. The processor, memory and communication interface are all interconnected by a bus. The communication interface may be a wireless communication interface or a wired communication interface for enabling the processor to communicate with other devices.
It is noted that, in the present patent application, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. In the present patent application, if it is mentioned that a certain action is executed according to a certain element, it means that the action is executed according to at least the element, and two cases are included: performing the action based only on the element, and performing the action based on the element and other elements. The expression of a plurality of, a plurality of and the like includes 2, 2 and more than 2, more than 2 and more than 2.
All documents mentioned in this specification are to be considered as being incorporated in their entirety into the disclosure of this specification so as to be subject to modification as necessary. It should be understood that the above description is only a preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of one or more embodiments of the present disclosure should be included in the scope of protection of one or more embodiments of the present disclosure.
In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.

Claims (16)

CN201910792568.4A | 2019-08-26 | 2019-08-26 | Prediction method and system for user offer behavior | Active | CN110689164B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201910792568.4A | 2019-08-26 | 2019-08-26 | Prediction method and system for user offer behavior (CN110689164B)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201910792568.4A | 2019-08-26 | 2019-08-26 | Prediction method and system for user offer behavior (CN110689164B)

Publications (2)

Publication Number | Publication Date
CN110689164A | 2020-01-14
CN110689164B (en) | 2023-04-28

Family

ID=69108584

Family Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201910792568.4A | 2019-08-26 | 2019-08-26 | Prediction method and system for user offer behavior (Active, granted as CN110689164B)

Country Status (1)

Country | Link
CN (1) | CN110689164B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP2002366732A (en) * | 2001-06-11 | 2002-12-20 | Sas Institute Japan Ltd | Customer maintenance supporting system with respect to member customer
CN107615313A (en) * | 2015-05-29 | 2018-01-19 | 日本电气株式会社 | The business activity servicing unit and business activity householder method predicted using contract cancellation
CN107615186A (en) * | 2015-08-13 | 2018-01-19 | 华为技术有限公司 | Method and device for model predictive control
CN109670569A (en) * | 2017-10-16 | 2019-04-23 | 优酷网络技术(北京)有限公司 | Neural net prediction method and device
CN108763319A (en) * | 2018-04-28 | 2018-11-06 | 中国科学院自动化研究所 | Merge the social robot detection method and system of user behavior and text message

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111275350A (en) * | 2020-02-08 | 2020-06-12 | 支付宝(杭州)信息技术有限公司 | Method and device for updating event evaluation model
CN111275350B (en) * | 2020-02-08 | 2021-06-04 | 支付宝(杭州)信息技术有限公司 | Method and device for updating event evaluation model
CN111898706A (en) * | 2020-08-24 | 2020-11-06 | 深圳市富之富信息科技有限公司 | Intelligent iterative deployment method and device of model, computer equipment and storage medium
CN112085541A (en) * | 2020-09-27 | 2020-12-15 | 中国建设银行股份有限公司 | User demand analysis method and device based on browsing consumption time series data
CN112085541B (en) * | 2020-09-27 | 2024-12-24 | 中国建设银行股份有限公司 | User demand analysis method and device based on browsing consumption time series data
CN113570204A (en) * | 2021-07-06 | 2021-10-29 | 北京淇瑀信息科技有限公司 | User behavior prediction method, system and computer equipment
CN113673620A (en) * | 2021-08-27 | 2021-11-19 | 工银科技有限公司 | Method, system, device, medium and program product for model generation
CN114022202A (en) * | 2021-11-03 | 2022-02-08 | 中南大学 | User churn prediction method and system based on deep learning
CN114154154A (en) * | 2021-12-03 | 2022-03-08 | 广东共链科技有限公司 | A smart contract fuzzing method and system based on long short-term memory network
CN114707699A (en) * | 2022-03-04 | 2022-07-05 | 易视腾科技股份有限公司 | Network television order unsubscribe prediction method based on machine learning

Also Published As

Publication number | Publication date
CN110689164B (en) | 2023-04-28

Similar Documents

Publication | Title
CN110689164A (en) | Prediction method and system for user reduction behavior
US11651381B2 (en) | Machine learning for marketing of branded consumer products
CN110400169A (en) | A kind of information-pushing method, device and equipment
CN114119137B (en) | Risk control method and apparatus
CN108416616A (en) | The sort method and device of complaints and denunciation classification
CN105225135B (en) | Potential customer identification method and device
CN113065911A (en) | Recommended information generation method, device, storage medium and electronic device
CN105308591A (en) | Dynamics of tie strength from social interaction
CN116976664A (en) | Risk merchant prediction method, system, computer and readable storage medium
Sobreiro et al. | A SLR on customer dropout prediction
CN112989182A (en) | Information processing method, information processing apparatus, information processing device, and storage medium
Liu et al. | Extracting, ranking, and evaluating quality features of web services through user review sentiment analysis
CN112084408B (en) | List data screening method, device, computer equipment and storage medium
CN111177657B (en) | Demand determining method, system, electronic device and storage medium
CN111860554B (en) | Risk monitoring method and device, storage medium and electronic equipment
KR102153790B1 (en) | Computing apparatus, method and computer readable storage medium for inspecting false offerings
US12340323B2 (en) | Systems and methods for curating online vehicle reservations to avoid electric vehicle battery depletion
CN116245658A (en) | Risk assessment method and terminal for identifying enterprise withholding tax
CN113255857B (en) | Risk detection method, device and equipment for graphic code
CN117493550A (en) | Training method of text classification model, text classification method and device
CN116630059A (en) | Loss prediction method, device, equipment and storage medium based on artificial intelligence
CN115660733A (en) | Sales prediction system and method based on artificial intelligence
Saanchay et al. | An approach for credit card churn prediction using gradient descent
Fitrianto et al. | Development of direct marketing strategy for banking industry: The use of a Chi-squared Automatic Interaction Detector (CHAID) in deposit subscription classification
CN111159397B (en) | Text classification method and device and server

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
TA01 | Transfer of patent application right

Effective date of registration: 2020-09-22

Address after: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman, British Islands

Applicant after: Advanced innovation technology Co.,Ltd.

Address before: P.O. Box 847, Fourth Floor, Capital Building, Grand Cayman, British Cayman Islands

Applicant before: Alibaba Group Holding Ltd.

Effective date of registration: 2020-09-22

Address after: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman, British Islands

Applicant after: Innovative advanced technology Co.,Ltd.

Address before: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman, British Islands

Applicant before: Advanced innovation technology Co.,Ltd.

GR01 | Patent grant
