CN109903100A - Customer churn prediction method, apparatus, and readable storage medium - Google Patents

Customer churn prediction method, apparatus, and readable storage medium

Info

Publication number
CN109903100A
Authority
CN
China
Prior art keywords
sample
feature samples
prediction model
feature
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910225076.7A
Other languages
Chinese (zh)
Inventor
苏杰
马志伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meng Yu Science And Technology Ltd Of Shenzhen
Original Assignee
Meng Yu Science And Technology Ltd Of Shenzhen
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meng Yu Science And Technology Ltd Of Shenzhen
Publication of CN109903100A
Legal status: Pending (current)

Abstract

The embodiments of the invention disclose a customer churn prediction method, an apparatus, and a readable storage medium. The method comprises: a device trains on a sample vector to obtain a first prediction model, generates an importance ranking of the multiple feature samples in the sample vector according to the first prediction model, and obtains cross features of the top k feature samples in the importance ranking; after updating the first prediction model according to the cross features and the sample vector to obtain the final prediction model, the device inputs second training features of a user to be predicted into the updated first prediction model to predict the time between the user's next login to the target application and the current login. With the embodiments of the present application, the accuracy of the prediction model can be improved and customer churn can be predicted.

Description

Customer churn prediction method, apparatus, and readable storage medium
Technical field
The present invention relates to the technical field of data processing, and in particular to a customer churn prediction method, apparatus, and readable storage medium.
Background technique
Many online services and online games suffer heavy user churn within the first few minutes or hours of use. To reduce churn, user loss can be predicted so that different strategies can be formulated for different users, improving their gaming experience.
Existing churn prediction methods mostly rely either on fluctuations of core indicators or on methods such as logistic regression and decision trees. Core indicators mainly refer to game duration, level failure rate, and the like: when they change sharply, the user is assumed to be about to churn. Logistic regression and decision trees mainly predict whether a user will churn from the user's historical behaviour. However, both approaches have narrow coverage and low prediction accuracy, so predicting customer churn more accurately remains a problem studied by those skilled in the art.
Summary of the invention
The embodiments of the invention disclose a customer churn prediction method, an apparatus, and a readable storage medium, which can predict customer churn and improve the accuracy of the prediction model.
In a first aspect, an embodiment of the invention provides a customer churn prediction method, the method comprising:
training on a sample vector to obtain a first prediction model, wherein the sample vector includes multiple feature samples, each feature sample in the multiple feature samples includes first training features and a user tag, the first training features are features extracted from raw data of a preset user, and the raw data include portrait data and behavioural data recorded while the target application is operated; the user tag describes the time between the preset user's next login to the target application and the current login; the first prediction model is used to rank the importance of the multiple feature samples;
generating, according to the first prediction model, an importance ranking of the multiple feature samples in the sample vector, and obtaining cross features of the top k feature samples in the importance ranking, the cross features being features obtained by performing mathematical operations on the top k feature samples;
updating the first prediction model according to the cross features and the sample vector;
extracting second training features from raw data of a user to be predicted within a preset period after the user logs in to the target application, and inputting the second training features into the updated first prediction model to predict the time between the user's next login to the target application and the current login.
In the above method, the device trains on the sample vector to obtain the first prediction model, generates the importance ranking of the multiple feature samples in the sample vector according to the first prediction model, obtains the cross features of the top k feature samples in the importance ranking, and updates the first prediction model according to the cross features and the sample vector to obtain the final prediction model, which is then used to predict the time between the next login of a user to be predicted and the current login. Training the model with cross features of the top k feature samples in the importance ranking broadens the coverage of important features, which improves the accuracy of the prediction model and makes customer churn prediction possible.
Based on the first aspect, in an optional implementation, training on the sample vector to obtain the first prediction model comprises:
obtaining the sample vector;
generating a training set according to the sample vector, and training on the training set to obtain the first prediction model; wherein the training set includes multiple feature samples, and each feature sample in the multiple feature samples is a feature sample in the sample vector.
In this implementation, the acquired sample vector is screened again, which improves the quality of the feature samples and thereby improves the accuracy of the model.
Based on the first aspect, in an optional implementation, generating the training set according to the sample vector comprises:
the sample vector includes positive samples and negative samples, a positive sample being a feature sample that contains a preset field and a negative sample being a feature sample that does not contain the preset field; if the ratio of positive samples to negative samples falls outside the preset range, the negative samples are down-sampled so that the ratio of positive samples to negative samples in the training set is within the preset range.
In this implementation, the ratio of positive to negative samples in the training set is controlled; a reasonable positive-to-negative ratio during model training improves the accuracy of the model.
Based on the first aspect, in an optional implementation, generating the importance ranking of the multiple feature samples according to the first prediction model comprises:
calculating, according to the prediction results of the first prediction model, the precision and recall of the multiple feature samples; the precision of each feature sample in the importance ranking is greater than a preset threshold, and the larger the recall, the higher the feature sample ranks in the importance ranking.
Based on the first aspect, in an optional implementation, the preset period is no more than two hours.
This implementation provides hour-level prediction: whether a user will churn can be predicted using only the data from the first two hours (or less) after the user logs in, so prediction results are delivered more efficiently and the device can more quickly provide personalised services suited to the user to be predicted.
In a second aspect, an embodiment of the invention provides a customer churn prediction apparatus, the apparatus comprising:
a training unit, configured to train on a sample vector to obtain a first prediction model, wherein the sample vector includes multiple feature samples, each feature sample in the multiple feature samples includes first training features and a user tag, the first training features are features extracted from raw data of a preset user, and the raw data include portrait data and behavioural data recorded while the target application is operated; the user tag describes the time between the preset user's next login to the target application and the current login; the first prediction model is used to rank the importance of the multiple feature samples;
an acquiring unit, configured to generate an importance ranking of the multiple feature samples in the sample vector according to the first prediction model, and to obtain cross features of the top k feature samples in the importance ranking, the cross features being features obtained by performing mathematical operations on the top k feature samples;
an updating unit, configured to update the first prediction model according to the cross features and the sample vector;
a predicting unit, configured to extract second training features from raw data of a user to be predicted within a preset period after the user logs in to the target application, and to input the second training features into the updated first prediction model to predict the time between the user's next login to the target application and the current login.
Based on the second aspect, in one implementation, the training unit includes:
an obtaining subunit, configured to obtain the sample vector;
a training subunit, configured to generate a training set according to the sample vector and to train on the training set to obtain the first prediction model; wherein the training set includes multiple feature samples, and each feature sample in the multiple feature samples is a feature sample in the sample vector.
Based on the second aspect, in one implementation, the sample vector includes positive samples and negative samples, a positive sample being a feature sample that contains a preset field and a negative sample being a feature sample that does not contain the preset field; the obtaining subunit further includes:
a sampling unit, configured to down-sample the negative samples if the ratio of positive samples to negative samples falls outside the preset range, so that the ratio of positive samples to negative samples in the training set is within the preset range.
Based on the second aspect, in one implementation, the acquiring unit further includes:
a computing unit, configured to calculate, according to the prediction results of the first prediction model, the precision and recall of the multiple feature samples; the precision of each feature sample in the importance ranking is greater than a preset threshold, and the larger the recall, the higher the feature sample ranks in the importance ranking.
Based on the second aspect, in one implementation, the preset period is no more than two hours.
It should be noted that, for the implementations of the second aspect and the corresponding beneficial effects, reference may be made to the descriptions of the first aspect and its implementations, which are not repeated here.
In a third aspect, an embodiment of the invention discloses a computer-readable storage medium that stores program instructions; when the program instructions are executed by a processor, the processor performs the method described in the first aspect or any possible implementation of the first aspect.
It should be noted that, for the implementations of the third aspect and the corresponding beneficial effects, reference may be made to the descriptions of the first aspect and its implementations, which are not repeated here.
Brief description of the drawings
In order to explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments or the background art are briefly described below.
Fig. 1 is a schematic structural diagram of a customer churn prediction device provided by an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a customer churn prediction method provided by an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a customer churn prediction apparatus provided by an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention are described below with reference to the accompanying drawings.
It should be appreciated that the terms used in this specification are merely for describing particular embodiments and are not intended to limit the application. A reference to "an embodiment" in this specification means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. Such phrases appearing in various places in the specification do not necessarily all refer to the same embodiment, nor are they separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments. The terms "device", "unit", "system", and the like used in this specification denote computer-related entities: hardware, firmware, a combination of hardware and software, software, or software in execution. For example, a device may be, but is not limited to, a processor, a data processing platform, a computing device, a computer, or two or more computers.
It should also be understood that the term "and/or" used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.
To better understand the customer churn prediction method, apparatus, and computer-readable storage medium provided by the embodiments of the present application, the device to which the customer churn prediction method of the embodiments applies is first described below:
Referring to Fig. 1, Fig. 1 is a schematic diagram of a device for the customer churn prediction method provided by an embodiment of this solution. The device 10 may include a processor 101, a memory 104, and a communication module 105, which may be interconnected through a bus 106. The memory 104 may be a high-speed random access memory (Random Access Memory, RAM) or a non-volatile memory (non-volatile memory), for example, at least one magnetic disk memory. Optionally, the memory 104 may also be at least one storage system located remotely from the processor 101. The memory 104 is used to store application program code, which may include an operating system, a network communication module, a user interface module, and a data processing program. The communication module 105 is used to exchange information with external devices and may include units for wireless, wired, or other communication modes. Optionally, the part used to implement the receiving function can be regarded as a receiving unit, and the part used to implement the sending function can be regarded as a sending unit; that is, the module includes a receiving unit and a sending unit. The processor 101 may also be called a processing unit, a processing board, a processing module, a processing apparatus, and so on; the processor may be a central processing unit (central processing unit, CPU), a network processor (network processor, NP), or a combination of a CPU and an NP. When the processor 101 calls the prediction program in the memory 104, the method shown in Fig. 2 is executed.
In a specific implementation, the customer churn prediction device 10 may be a mobile phone, a tablet computer, a personal digital assistant (Personal Digital Assistant, PDA), a mobile internet device (Mobile Internet Device, MID), a smart wearable device (such as a smart watch or smart bracelet), or any other device that users may use; the embodiments of the present application do not specifically limit this.
Optionally, the device may be one or more servers (multiple servers may constitute a server cluster); the servers run corresponding services that provide the customer churn prediction service, such as database services, data computation, and decision execution.
The customer churn prediction method of the invention is described below with reference to Fig. 2. As shown in Fig. 2, which is a schematic flowchart of a customer churn prediction method provided by an embodiment of the present invention, the method can be implemented on the device shown in Fig. 1 and may include, but is not limited to, the following steps:
Step S201: the device trains on a sample vector to obtain a first prediction model.
Specifically, after the device obtains the sample vector, it trains a gradient boosting decision tree (Gradient Boosting Decision Tree, GBDT) on the sample vector to obtain the first prediction model. The sample vector includes multiple feature samples; each feature sample includes first training features and a user tag. The first training features are features extracted from the raw data of a preset user, and the raw data include portrait data and behavioural data recorded while the target application is operated. The portrait data include the user's gender, age, region, terminal information, and so on; the behavioural data include the number of logins, online hours, the number of levels played, the most recent login time, and so on. The user tag describes the time between the preset user's next login to the target application and the current login. The first prediction model is used to rank the importance of the multiple feature samples.
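As a rough illustration of this training step, the sketch below uses scikit-learn's GradientBoostingRegressor as a stand-in for the GBDT mentioned above; the column names and sample values are invented for the example and are not taken from the patent.

```python
# Minimal sketch of step S201: fit a GBDT-style regressor on a sample vector
# whose label is the (hypothetical) number of hours until the next login.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

samples = pd.DataFrame({
    "gender":              [0, 1, 0, 1],        # portrait data (illustrative)
    "age":                 [21, 34, 28, 45],
    "login_count":         [3, 12, 1, 7],       # behavioural data (illustrative)
    "online_minutes":      [42, 310, 15, 120],
    "level_fail_rate":     [0.5, 0.1, 0.8, 0.3],
    "hours_to_next_login": [2.0, 0.5, 72.0, 6.0],  # user tag
})

X = samples.drop(columns=["hours_to_next_login"])
y = samples["hours_to_next_login"]

# First prediction model: a gradient boosting decision tree regressor.
first_model = GradientBoostingRegressor(n_estimators=100, max_depth=3)
first_model.fit(X, y)
```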
In one embodiment, after obtaining the sample vector, the device generates a training set from the sample vector and trains on the training set to obtain the first prediction model. The training set includes multiple feature samples, each of which is a feature sample from the sample vector; in other words, the device screens the multiple feature samples in the sample vector to obtain the training set. The screening can be based on the numbers of positive and negative samples: the sample vector includes positive samples and negative samples, a positive sample being a feature sample that contains a preset field and a negative sample being one that does not. If the ratio of positive to negative samples in the sample vector is below the preset range, the device can down-sample the negative samples, that is, keep one sample out of every few negative samples, so that the ratio of positive to negative samples in the training set falls within the preset range; if the ratio is above the preset range, the device can reduce the number of positive samples or increase the number of negative samples so that the ratio in the training set falls within the preset range. The preset range is generally set between 0.2 and 0.5. For example, if the sample vector includes the 20 feature samples M1, M2, M3, M4, ..., M20, where M1, M2, and M3 are positive samples and the remaining 17 are negative samples, the positive-to-negative ratio is 0.176, which is outside the preset range, so the negative samples are down-sampled. Sampling the negative samples at an interval of 2 gives M4, M6, M8, M10, M12, M14, M16, M18, M20; the positive-to-negative ratio is now 0.33, which is within the preset range, and the screening is complete. The feature samples in the training set are M1, M2, M3, M4, M6, M8, M10, M12, M14, M16, M18, M20. By screening the acquired sample vector again, this embodiment controls the ratio of positive to negative samples in the training set and improves the quality of the feature samples; a reasonable positive-to-negative ratio during model training improves the accuracy of the model.
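The down-sampling just described can be sketched as follows. The 0.2 lower bound and the sampling interval of 2 follow the worked example above; the helper function itself is an assumption, not code from the patent, and the opposite case (ratio above the range) is omitted.

```python
# Down-sampling sketch for step S201, following the M1..M20 worked example.
def downsample_negatives(positives, negatives, low=0.2, interval=2):
    ratio = len(positives) / len(negatives)
    if ratio < low:
        # Keep one negative sample out of every `interval` samples.
        negatives = negatives[::interval]
    # The case where the ratio is above the preset range (drop positives or
    # add negatives) is not shown here.
    return positives + negatives

positives = ["M1", "M2", "M3"]
negatives = [f"M{i}" for i in range(4, 21)]   # M4 .. M20
training_set = downsample_negatives(positives, negatives)
print(training_set)
# ['M1', 'M2', 'M3', 'M4', 'M6', 'M8', 'M10', 'M12', 'M14', 'M16', 'M18', 'M20']
```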
Step S202: the device generates the importance ranking of the multiple feature samples in the sample vector according to the first prediction model, and obtains the cross features of the top k feature samples in the importance ranking.
Specifically, after training the first prediction model on the sample vector, the device compares the output of the first prediction model with the user tags in the sample vector and calculates the precision and recall of the multiple feature samples in the sample vector. The precision of each feature sample in the importance ranking must be greater than a preset threshold, which is generally set between 0.8 and 0.9, and the larger the recall, the higher the feature sample ranks in the importance ranking; in other words, as long as a feature sample's precision exceeds the preset threshold, it is ranked by its recall. The device then obtains the top k feature samples in the importance ranking and performs mathematical operations on them to obtain the cross features. In a specific implementation, the operations include at least one of addition, subtraction, multiplication, and division, so there are at most four different cross features between two feature samples.
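A minimal sketch of this ranking rule is given below, assuming per-feature precision and recall values have already been computed; the numbers are invented so that the resulting order matches the example in the next paragraph.

```python
# Sketch of the ranking rule in step S202: keep feature samples whose precision
# exceeds the preset threshold, then order them by recall, largest first.
def rank_features(metrics, threshold=0.85):
    kept = {name: (p, r) for name, (p, r) in metrics.items() if p > threshold}
    return sorted(kept, key=lambda name: kept[name][1], reverse=True)

metrics = {  # feature sample -> (precision, recall); values are illustrative
    "M6": (0.92, 0.90), "M8": (0.90, 0.85), "M2": (0.88, 0.80),
    "M3": (0.87, 0.70), "M10": (0.86, 0.60), "M1": (0.90, 0.50),
    "M4": (0.86, 0.40), "M12": (0.60, 0.95),  # dropped: precision too low
}
ranking = rank_features(metrics)
print(ranking)  # ['M6', 'M8', 'M2', 'M3', 'M10', 'M1', 'M4']
```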
For example, if the feature samples in the training set are M1, M2, M3, M4, M6, M8, M10, M12, M14, M16, M18, M20, and the precision of the seven feature samples M1, M2, M3, M4, M6, M8, M10 exceeds the preset threshold, these seven feature samples are ranked by recall, with larger recall ranking higher; the ranking might be M6, M8, M2, M3, M10, M1, M4. With the default k = 3, mathematical operations are performed pairwise on the top 3 feature samples M6, M8, and M2 to obtain new cross features. The cross features of M6 and M8 can be c1 = M6 + M8, c2 = M6 - M8, c3 = M6 * M8, c4 = M6 / M8; the cross features between M6 and M2 and between M2 and M8 are obtained in the same way.
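The pairwise crossing of the top-k feature samples could look like the following sketch; the column names, the zero guard on division, and the use of pandas are illustrative assumptions rather than details from the patent.

```python
# Cross features for the top-k feature samples (k = 3 in the example above):
# every pair of columns yields up to four new features via +, -, *, /.
from itertools import combinations
import pandas as pd

def cross_features(frame, top_k):
    crossed = {}
    for a, b in combinations(top_k, 2):
        crossed[f"{a}_plus_{b}"]  = frame[a] + frame[b]
        crossed[f"{a}_minus_{b}"] = frame[a] - frame[b]
        crossed[f"{a}_times_{b}"] = frame[a] * frame[b]
        crossed[f"{a}_div_{b}"]   = frame[a] / frame[b].replace(0, 1)  # avoid /0
    return pd.DataFrame(crossed)

frame = pd.DataFrame({"M6": [1.0, 2.0], "M8": [3.0, 4.0], "M2": [5.0, 6.0]})
print(cross_features(frame, ["M6", "M8", "M2"]).columns.tolist())
# ['M6_plus_M8', 'M6_minus_M8', 'M6_times_M8', 'M6_div_M8', 'M6_plus_M2', ...]
```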
Step S203: the device updates the first prediction model according to the cross features and the sample vector.
Specifically, after obtaining the cross features, the device performs feature selection on them to obtain the optimal cross features. The optimal cross features may include multiple cross features, and the number of cross features to select can be chosen according to the actual situation. The device then updates the first prediction model according to the optimal cross features and the sample vector to obtain the final prediction model.
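One possible reading of this update step is sketched below; the patent does not name a feature-selection method, so SelectKBest with an F-test and a retrained GradientBoostingRegressor are used purely as illustrative choices.

```python
# Sketch of step S203: pick the strongest cross features and retrain on the
# widened sample vector.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.feature_selection import SelectKBest, f_regression

def update_model(X, crossed, y, n_cross=4):
    # Keep the n_cross best cross features (selector choice is an assumption).
    selector = SelectKBest(f_regression, k=min(n_cross, crossed.shape[1]))
    selector.fit(crossed, y)
    best = crossed.loc[:, selector.get_support()]
    # Widen the original features with the selected cross features and retrain.
    X_updated = pd.concat([X, best], axis=1)
    final_model = GradientBoostingRegressor(n_estimators=100, max_depth=3)
    final_model.fit(X_updated, y)
    return final_model, list(X_updated.columns)
```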
Step S204: the device extracts second training features from the raw data of the user to be predicted within the preset period after the user logs in to the target application, and inputs the second training features into the updated first prediction model.
Specifically, after the device obtains the final prediction model, it inputs the second training features extracted within the preset period after the user to be predicted logs in to the target application, in order to predict the time between the user's next login to the target application and the current login. The preset period is usually no more than two hours; in other words, within two hours of the user to be predicted logging in to the target application, the device obtains the user's behavioural data and portrait data in the target application, where the portrait data include the user's gender, age, region, terminal information, and so on, and the behavioural data include the number of logins, online hours, the number of levels played, the most recent login time, and so on. The device then extracts the second training features from the behavioural data and portrait data and inputs them into the final prediction model to predict the time between the user's next login to the target application and the current login. This embodiment provides hour-level prediction: whether a user will churn can be predicted using only the data from the first two hours (or less) after login, so prediction results are delivered more efficiently and the device can more quickly provide personalised services suited to the user to be predicted.
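A sketch of the prediction step follows, using the same illustrative feature names as the earlier sketches; how missing cross features are filled and how the raw data are aggregated within the two-hour window are assumptions, not details from the patent.

```python
# Sketch of step S204: gather at most two hours of data after login, build the
# second training features, and query the final model.
import pandas as pd

PRESET_PERIOD_HOURS = 2  # the text fixes the preset period at no more than two hours

def predict_hours_to_next_login(model, feature_columns, portrait, behaviour):
    # `portrait` and `behaviour` are dicts of raw data gathered within the
    # preset period; features the model expects but we lack are filled with 0.
    features = {**portrait, **behaviour}
    row = pd.DataFrame([features]).reindex(columns=feature_columns, fill_value=0)
    return float(model.predict(row)[0])

# Example call (final_model and columns as returned by the update step above):
# hours = predict_hours_to_next_login(final_model, columns,
#     portrait={"gender": 1, "age": 30},
#     behaviour={"login_count": 2, "online_minutes": 95, "level_fail_rate": 0.4})
```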
In the method shown in Fig. 2, the device trains on the sample vector to obtain the first prediction model, generates the importance ranking of the multiple feature samples in the sample vector according to the first prediction model, obtains the cross features of the top k feature samples in the importance ranking, and updates the first prediction model according to the cross features and the sample vector to obtain the final prediction model, which is used to predict the time between the next login of the user to be predicted and the current login. Training the model with cross features of the top k feature samples in the importance ranking broadens the coverage of important features, thereby improving the accuracy of the prediction model and enabling customer churn prediction.
To better implement the above solutions of the embodiments of the present invention, the present invention also provides a corresponding customer churn prediction apparatus, which is described in detail below with reference to the accompanying drawings:
As shown in Fig. 3, an embodiment of the present invention provides a schematic structural diagram of a customer churn prediction apparatus 30. The apparatus 30 may be one of the devices described above or a component (for example, a chip) in such a device. The customer churn prediction apparatus 30 may include a training unit 301, an acquiring unit 302, an updating unit 303, and a predicting unit 304, wherein:
the training unit 301 is configured to train on a sample vector to obtain a first prediction model, wherein the sample vector includes multiple feature samples, each feature sample in the multiple feature samples includes training features and a user tag, the training features are features extracted from raw data of a preset user, and the raw data include portrait data and behavioural data recorded while the target application is operated; the user tag describes the time between the preset user's next login to the target application and the current login; the first prediction model is used to rank the importance of the multiple feature samples;
the acquiring unit 302 is configured to generate an importance ranking of the multiple feature samples in the sample vector according to the first prediction model, and to obtain cross features of the top k feature samples in the importance ranking, the cross features being features obtained by performing mathematical operations on the top k feature samples;
the updating unit 303 is configured to update the first prediction model according to the cross features and the sample vector;
the predicting unit 304 is configured to extract second training features from raw data of a user to be predicted within a preset period after the user logs in to the target application, and to input the second training features into the updated first prediction model to predict the time between the user's next login to the target application and the current login.
The preset period is no more than two hours.
In one embodiment, the training unit 301 includes an obtaining subunit 305 and a training subunit 306, wherein the obtaining subunit 305 is configured to obtain the sample vector;
the training subunit 306 is configured to generate a training set according to the sample vector and to train on the training set to obtain the first prediction model; wherein the training set includes multiple feature samples, and each feature sample in the multiple feature samples is a feature sample in the sample vector.
In one embodiment, the sample vector includes positive samples and negative samples, a positive sample being a feature sample that contains a preset field and a negative sample being one that does not; the obtaining subunit 305 further includes:
a sampling unit 307, configured to down-sample the negative samples if the ratio of positive samples to negative samples falls outside the preset range, so that the ratio of positive samples to negative samples in the training set is within the preset range.
In one embodiment, the acquiring unit 302 further includes:
a computing unit 308, configured to calculate, according to the prediction results of the first prediction model, the precision and recall of the multiple feature samples; the precision of each feature sample in the importance ranking is greater than a preset threshold, and the larger the recall, the higher the feature sample ranks in the importance ranking.
It should be noted that, in the embodiments of the present application, the functions of the functional units of the apparatus described in Fig. 3 can be found in the related descriptions of steps S201 to S204 in the method embodiment described in Fig. 2, and are not repeated here.
Those of ordinary skill in the art will appreciate that all or part of the processes in the above method embodiments can be completed by a computer program instructing the relevant hardware. The program can be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), or the like.
In this application, the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the examples of the present invention.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The above descriptions are merely specific embodiments, but the protection scope of the present invention is not limited thereto. Any person familiar with the art can readily conceive of various equivalent modifications or replacements within the technical scope disclosed by the present invention, and these modifications or replacements shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
It should be understood that the sequence numbers of the above processes do not imply an order of execution in the various embodiments of the application; the execution order of each process should be determined by its function and internal logic, and does not constitute any limitation on the implementation process of the embodiments of the present invention. Although the application is described herein in conjunction with the embodiments, those skilled in the art can, in the course of prosecuting the claimed application, understand and realise other variations of the disclosed embodiments.

Claims (10)

CN201910225076.7A | Priority date 2018-12-25 | Filing date 2019-03-22 | Customer churn prediction method, apparatus, and readable storage medium | Pending | Published as CN109903100A (en)

Applications Claiming Priority (2)

Application Number | Priority Date
CN2018115964421 | 2018-12-25
CN201811596442 | 2018-12-25

Publications (1)

Publication Number | Publication Date
CN109903100A (en) | 2019-06-18

Family

ID=66953474

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status | Publication
CN201910225076.7A | Customer churn prediction method, apparatus, and readable storage medium | 2018-12-25 | 2019-03-22 | Pending | CN109903100A (en)

Country Status (1)

Country | Link
CN (1) | CN109903100A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20120233159A1 (en) * | 2011-03-10 | 2012-09-13 | International Business Machines Corporation | Hierarchical ranking of facial attributes
CN105005909A (en) * | 2015-06-17 | 2015-10-28 | 深圳市腾讯计算机系统有限公司 | Method and device for predicting lost users
CN107832581A (en) * | 2017-12-15 | 2018-03-23 | 百度在线网络技术(北京)有限公司 | Trend prediction method and device
CN108121795A (en) * | 2017-12-20 | 2018-06-05 | 北京奇虎科技有限公司 | User's behavior prediction method and device

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN110598845A (en) * | 2019-08-13 | 2019-12-20 | 中国平安人寿保险股份有限公司 | Data processing method, data processing device, computer equipment and storage medium
CN111803957A (en) * | 2020-07-17 | 2020-10-23 | 网易(杭州)网络有限公司 | Player prediction method and device for online game, computer equipment and medium
CN111803957B (en) * | 2020-07-17 | 2024-02-09 | 网易(杭州)网络有限公司 | Method, device, computer equipment and medium for predicting players of online games
CN111861588A (en) * | 2020-08-06 | 2020-10-30 | 网易(杭州)网络有限公司 | Training method of loss prediction model, player loss reason analysis method and player loss reason analysis device
CN111861588B (en) * | 2020-08-06 | 2023-10-31 | 网易(杭州)网络有限公司 | Training method of loss prediction model, player loss reason analysis method and player loss reason analysis device
CN112245934A (en) * | 2020-11-16 | 2021-01-22 | 腾讯科技(深圳)有限公司 | Data analysis method, device and equipment for virtual resources in virtual scene application
CN114663138A (en) * | 2022-03-15 | 2022-06-24 | 深圳依时货拉拉科技有限公司 | User churn prediction method, apparatus, computer equipment and storage medium
CN115018562A (en) * | 2022-07-06 | 2022-09-06 | 湖南草花互动科技股份公司 | User pre-churn prediction method, device and system
TWI881814B (en) * | 2024-04-28 | 2025-04-21 | 遊戲橘子數位科技股份有限公司 | Methods to find effective return user groups based on dynamic tags

Similar Documents

Publication | Title
CN109903100A (en) | Customer churn prediction method, apparatus, and readable storage medium
CN106250403A (en) | Customer loss Forecasting Methodology and device
CN110928993A (en) | User position prediction method and system based on deep cycle neural network
CN112183818A (en) | Recommendation probability prediction method and device, electronic equipment and storage medium
CN112311578B (en) | VNF scheduling method and device based on deep reinforcement learning
KR20190052143A (en) | Search for neural architecture
US11423307B2 (en) | Taxonomy construction via graph-based cross-domain knowledge transfer
CN105095279B (en) | File recommendation method and device
CN108182634A (en) | A kind of training method for borrowing or lending money prediction model, debt-credit Forecasting Methodology and device
CN113379042B (en) | Business prediction model training method and device for protecting data privacy
CN112232887A (en) | Data processing method and device, computer equipment and storage medium
CN113902131B (en) | An Update Method for Node Models Resisting Discrimination Propagation in Federated Learning
CN108133390A (en) | For predicting the method and apparatus of user behavior and computing device
CN118379138A (en) | Risk assessment method, risk assessment device, computer equipment and computer readable storage medium
CN111461188A (en) | Target service control method, device, computing equipment and storage medium
CN106803092B (en) | Method and device for determining standard problem data
KR102010031B1 (en) | Method and apparatus for predicting game indicator information
CN119558886A (en) | Model training methods, product recommendation methods, equipment, media and program products
CN109753708A (en) | A kind of payment amount prediction technique, device and readable storage medium storing program for executing
Mohammadi et al. | Machine learning assisted stochastic unit commitment: A feasibility study
CN117876091B (en) | Information transmission method, apparatus, electronic device, and computer-readable medium
CN115794323A (en) | Task scheduling method, device, server and storage medium
CN110489435B (en) | Data processing method and device based on artificial intelligence and electronic equipment
CN116361564B (en) | Super-parameter iteration training method, iteration training device and electronic equipment
CN117853180A (en) | Data pricing method, device, equipment and medium based on reinforcement learning

Legal Events

Code | Title | Description
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 2019-06-18

