CN107885544B - Application program control method, device, medium and electronic equipment - Google Patents

Application program control method, device, medium and electronic equipment

Info

Publication number
CN107885544B
CN107885544B (application CN201711044959.5A)
Authority
CN
China
Prior art keywords
application program
layer
input
calculation
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711044959.5A
Other languages
Chinese (zh)
Other versions
CN107885544A (en)
Inventor
梁昆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711044959.5A
Publication of CN107885544A
Priority to PCT/CN2018/110518 (WO2019085749A1)
Application granted
Publication of CN107885544B
Status: Active
Anticipated expiration

Abstract

The present application provides an application program control method, device, medium and electronic equipment. Historical feature information x_i is acquired and a training model is generated using a back propagation (BP) neural network algorithm. When an application program is detected entering the background, its current feature information s is brought into the training model to judge whether the application program needs to be closed, so that applications can be closed intelligently.

Description

Application program control method, device, medium and electronic equipment
Technical Field
The application relates to the field of electronic equipment terminals, in particular to an application program control method, device, medium and electronic equipment.
Background
End users use a large number of applications every day. After an application is pushed to the background, if it is not cleared in time it continues to occupy precious system memory and increases system power consumption. It is therefore desirable to provide an application management and control method, apparatus, medium, and electronic device.
Disclosure of Invention
The embodiment of the application provides an application program control method, an application program control device, an application program control medium and electronic equipment, so that an application program can be closed intelligently.
The embodiment of the application provides an application program control method, which is applied to electronic equipment and comprises the following steps:
obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set comprise historical feature information x_i of multiple dimensions of the application program;
Calculating a sample vector set by adopting a Back Propagation (BP) neural network algorithm to generate a training model;
when an application program enters a background, inputting the current characteristic information s of the application program into the training model for calculation; and
and judging whether the application program needs to be closed or not.
An embodiment of the present application further provides an application management and control apparatus, where the apparatus includes:
an obtaining module, configured to obtain a sample vector set of the application program, where a sample vector in the sample vector set includes historical feature information x_i of multiple dimensions of the application program;
The generating module is used for calculating the sample vector set by adopting a BP neural network algorithm to generate a training model;
the calculation module is used for inputting the current characteristic information s of the application program into the training model for calculation when the application program enters a background; and
and the judging module is used for judging whether the application program needs to be closed or not.
The embodiment of the application also provides a medium, wherein a plurality of instructions are stored in the medium, and the instructions are suitable for being loaded by a processor to execute the application management and control method.
An embodiment of the present application further provides an electronic device, where the electronic device includes a processor and a memory, the processor is electrically connected to the memory, the memory is used to store instructions and data, and the processor is used to execute the following steps:
obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set comprise historical feature information x_i of multiple dimensions of the application program;
Calculating a sample vector set by adopting a BP neural network algorithm to generate a training model;
when an application program enters a background, inputting the current characteristic information s of the application program into the training model for calculation; and
and judging whether the application program needs to be closed or not.
In the application program control method, device, medium and electronic equipment provided by the present application, historical feature information x_i is acquired and a training model is generated using a BP neural network algorithm. When the application program is detected entering the background, its current feature information s is brought into the training model to judge whether the application program needs to be closed, so that applications can be closed intelligently.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a system diagram of an application management and control apparatus according to an embodiment of the present disclosure.
Fig. 2 is a schematic view of an application scenario of an application management and control apparatus according to an embodiment of the present application.
Fig. 3 is a flowchart illustrating an application management and control method according to an embodiment of the present disclosure.
Fig. 4 is another schematic flowchart of an application management and control method according to an embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram of an apparatus according to an embodiment of the present disclosure.
Fig. 6 is another schematic structural diagram of an apparatus according to an embodiment of the present disclosure.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 8 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the present application, it is to be understood that the terms "center," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," and the like indicate orientations and positional relationships based on those shown in the drawings, and are used only for convenience and simplicity of description; they do not indicate or imply that the referenced devices or elements must have a particular orientation or be constructed and operated in a particular orientation, and are not to be construed as limiting the present application. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as meaning a fixed connection, a removable connection, or an integral connection; a mechanical connection, an electrical connection, or a communication connection; either directly or indirectly through intervening media, or an internal communication between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
In this application, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may comprise direct contact of the first and second features, or may comprise contact of the first and second features not directly but through another feature in between. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The following disclosure provides many different embodiments, or examples, for implementing different features of the application. In order to simplify the disclosure of the present application, specific example components and arrangements are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to the drawings, wherein like reference numbers refer to like elements throughout, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
While the principles of the application have been described in the foregoing text, it is not meant to be limiting and those of skill in the art will appreciate that various steps and operations described below may be implemented in hardware. The principles of the present application may be employed in numerous other general-purpose or special-purpose computing, communication environments or configurations.
The application program management and control method provided by the present application is mainly applied to electronic equipment, such as smart mobile electronic devices: a bracelet, a smartphone, a tablet computer based on the Apple or Android system, or a notebook computer based on a Windows or Linux system. It should be noted that the application program may be a chat application, a video application, a music application, a shopping application, a shared-bicycle application, or a mobile banking application.
Referring to fig. 1, fig. 1 is a system schematic diagram of an application management and control apparatus according to an embodiment of the present disclosure. The application program management and control device is mainly used for: obtaining historical feature information x_i of an application program from a database, calculating the historical feature information x_i through an algorithm to obtain a training model, then inputting the current feature information s of the application program into the training model for calculation, and judging whether the application program can be closed according to the calculation result, so as to control a preset application program, for example by closing or freezing it.
Specifically, please refer to fig. 2, which is a schematic view of an application scenario of the application management and control method according to an embodiment of the present application. In one embodiment, historical feature information x_i of an application program is obtained from a database and calculated through an algorithm to obtain a training model. When the application program control device detects that the application program enters the background of the electronic equipment, the current feature information s of the application program is input into the training model for calculation, and whether the application program can be closed is judged according to the calculation result. For example, the historical feature information x_i of application program a is obtained from the database and calculated through the algorithm to obtain the training model. When the control device detects that application program a enters the background of the electronic equipment, the current feature information s of application program a is input into the training model; the calculation result indicates that application program a can be closed, and it is closed. When the control device detects that application program b enters the background, the current feature information s of application program b is input into the training model; the calculation result indicates that application program b needs to be retained, and it is retained.
An execution subject of the application management and control method may be the application management and control apparatus provided in the embodiments of the present application, or an electronic device integrated with the application management and control apparatus, where the apparatus may be implemented in hardware or software.
Referring to fig. 3, fig. 3 is a flowchart illustrating an application management and control method according to an embodiment of the present disclosure. The application program management and control method provided by the embodiment of the application program is applied to the electronic equipment, and the specific flow can be as follows:
Step S101: obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set include historical feature information x_i of multiple dimensions of the application program.
The sample vector set of the application program is acquired from a sample database; the sample vectors in the sample vector set comprise historical feature information x_i of multiple dimensions of the application program.
Wherein, the characteristic information of the plurality of dimensions can refer to table 1.
TABLE 1 (reproduced as an image in the original; it lists 10 dimensions of characteristic information)
It should be noted that the 10-dimensional characteristic information shown in table 1 is only an example of the embodiments of the present application. The present application is not limited to it: one, or at least two, of the 10 dimensions may be used, and characteristic information of other dimensions may also be included, for example whether the device is currently charging, the current battery level, or whether WiFi is currently connected.
In one embodiment, historical feature information of 6 dimensions may be selected:
A. the time the application resides in the background;
B. whether the screen is bright, for example, bright recorded as 1 and off recorded as 0;
C. the total number of uses in the current week;
D. the total usage time in the current week;
E. whether WiFi is on, for example, on recorded as 1 and off recorded as 0; and
F. whether the device is currently charging, for example, charging recorded as 1 and not charging recorded as 0.
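As a concrete illustration, the six dimensions A-F above can be encoded into a sample vector. The function name and parameter names below are illustrative assumptions, not part of the patent:

```python
# Hypothetical encoding of the six feature dimensions (A-F) described
# above into one sample vector x_i. Raw units are assumptions.

def encode_features(background_seconds, screen_on, weekly_uses,
                    weekly_seconds, wifi_on, charging):
    """Return a 6-dimensional feature vector in the order A..F."""
    return [
        float(background_seconds),   # A: time resident in the background
        1.0 if screen_on else 0.0,   # B: screen bright -> 1, off -> 0
        float(weekly_uses),          # C: total uses this week
        float(weekly_seconds),       # D: total usage time this week
        1.0 if wifi_on else 0.0,     # E: WiFi on -> 1, off -> 0
        1.0 if charging else 0.0,    # F: charging -> 1, not -> 0
    ]
```

A vector like this, labeled with whether the user actually reopened the application, would form one sample in the sample vector set.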
Step S102: calculating the sample vector set by adopting a BP neural network algorithm to generate a training model.
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating an application management and control method according to an embodiment of the present disclosure. In one embodiment, the step S102 may include:
step S1021: defining a network structure; and
step S1022: and bringing the sample vector set into a network structure for calculation to obtain a training model.
In step S1021, the defining the network structure comprises:
Step S1021a: an input layer is set. The input layer comprises N nodes, where N equals the dimension of the historical feature information x_i.
The dimension of the historical feature information x_i is less than 10, and the number of nodes of the input layer is correspondingly less than 10, so as to simplify the calculation.
In one embodiment, the historical feature information x_i has 6 dimensions, so the input layer comprises 6 nodes.
Step S1021b, setting a hidden layer, where the hidden layer includes M nodes.
The hidden layer may comprise a plurality of hidden levels. The number of nodes of each hidden level is less than 10, so as to simplify the calculation.
In one embodiment, the hidden layer comprises a first hidden level, a second hidden level, and a third hidden level, which include 10, 5, and 5 nodes respectively.
Step S1021c: a classification layer is set. The classification layer adopts the softmax function:

p_k = e^(Z_k) / Σ_{j=1..C} e^(Z_j)

where p_k is the prediction probability value, Z_k is the kth intermediate value, Z_j is the jth intermediate value, and C is the number of predicted classes.
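A minimal sketch of the softmax function used by the classification layer (subtracting the maximum before exponentiating is a standard numerical-stability detail, not stated in the text):

```python
import math

def softmax(z):
    """p_k = e^(Z_k) / sum_j e^(Z_j) over C intermediate values z."""
    m = max(z)                            # stability shift; softmax is
    exps = [math.exp(v - m) for v in z]   # invariant to it
    s = sum(exps)
    return [e / s for e in exps]
```

For the two-class case of this patent (C = 2), the result is the probability pair [p1, p2].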
Step S1021d, setting an output layer, the output layer comprising 2 nodes.
Step S1021e: an activation function is set. The activation function adopts the sigmoid function:

f(x) = 1 / (1 + e^(-x))

where f(x) ranges from 0 to 1.
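The sigmoid activation can be written directly from the formula above:

```python
import math

def sigmoid(x):
    """f(x) = 1 / (1 + e^(-x)); output lies strictly between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))
```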
Step S1021f: a batch size is set; the batch size is A.
The batch size can be flexibly adjusted according to actual conditions, and may be 50 to 200.
In one embodiment, the batch size is 128.
In step S1021g, a learning rate is set, where the learning rate is B.
Wherein, the learning rate can be flexibly adjusted according to the actual situation. The learning rate may be 0.1-1.5.
In one embodiment, the learning rate is 0.9.
It should be noted that the sequence of the steps S1021a, S1021b, S1021c, S1021d, S1021e, S1021f, and S1021g can be flexibly adjusted.
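The network structure defined in steps S1021a-g can be summarized in code. The layer sizes, batch size, and learning rate follow the embodiments above; the weight-initialization scheme and function names are illustrative assumptions:

```python
import random

# Structure from steps S1021a-g: 6-node input layer, three hidden
# levels of 10, 5 and 5 nodes, 2-node output layer; sigmoid activation
# and softmax classification are defined separately in the text.
LAYER_SIZES = [6, 10, 5, 5, 2]
BATCH_SIZE = 128      # step S1021f, embodiment value
LEARNING_RATE = 0.9   # step S1021g, embodiment value

def init_weights(sizes, seed=0):
    """One (weight matrix, bias vector) pair per consecutive layer pair.
    Small uniform random initialization is an assumption for illustration."""
    rng = random.Random(seed)
    layers = []
    for n_in, n_out in zip(sizes, sizes[1:]):
        w = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
             for _ in range(n_out)]
        b = [0.0] * n_out
        layers.append((w, b))
    return layers
```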
In step S1022, the step of bringing the sample vector set into the network structure for calculation to obtain the training model may include:
step S1022a, the sample vector set is input to the input layer for calculation, so as to obtain an output value of the input layer.
Step S1022b, inputting the output value of the input layer into the hidden layer, and obtaining the output value of the hidden layer.
Wherein the output value of the input layer is the input value of the hidden layer.
In one embodiment, the hidden layer may include a plurality of hidden levels. The output value of the input layer is the input value of the first hidden level, the output value of the first hidden level is the input value of the second hidden level, the output value of the second hidden level is the input value of the third hidden level, and so on.
Step S1022c: the output value of the hidden layer is input into the classification layer for calculation to obtain the prediction probability value [p1 p2]^T.
Wherein the output value of the hidden layer is the input value of the classification layer.
In one embodiment, the hidden layer may include a plurality of hidden levels; the output value of the last hidden level is the input value of the classification layer.
Step S1022d: the prediction probability value is brought into the output layer for calculation to obtain the prediction result value y: when p1 is greater than p2, y = [1 0]^T; when p1 is less than or equal to p2, y = [0 1]^T.
Wherein the output value of the classification layer is the input value of the output layer.
Step S1022e, the network structure is corrected according to the prediction result value y, and a training model is obtained.
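Steps S1022a-d describe a standard forward pass. The sketch below assumes layers are given as (weight matrix, bias vector) pairs, applies the sigmoid activation at each hidden level, and applies softmax to the classification layer's two intermediate values; the correction of the network according to y in step S1022e (i.e., backpropagation training) is omitted:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def forward(layers, x):
    """Steps S1022a-c: each level's output is the next level's input;
    the final pair of intermediate values goes through softmax."""
    a = x
    for w, b in layers[:-1]:   # hidden levels with sigmoid activation
        a = [sigmoid(sum(wi * ai for wi, ai in zip(row, a)) + bi)
             for row, bi in zip(w, b)]
    w, b = layers[-1]          # classification layer: linear -> softmax
    z = [sum(wi * ai for wi, ai in zip(row, a)) + bi
         for row, bi in zip(w, b)]
    return softmax(z)

def predict(layers, x):
    """Step S1022d: y = [1, 0]^T if p1 > p2, else y = [0, 1]^T."""
    p1, p2 = forward(layers, x)
    return [1, 0] if p1 > p2 else [0, 1]
```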
Step S103: when the application program enters the background, the current feature information s of the application program is input into the training model for calculation.
Referring to fig. 4, in an embodiment, the step S103 may include:
step S1031: and collecting the current characteristic information s of the application program.
Acquiring the dimension of the current characteristic information s of the application program and the acquired historical characteristic information x of the application programiAre the same.
Step S1032: substituting the current feature information s into the training model for calculation.
The current feature information s is input into the training model for calculation to obtain the prediction probability value [p1' p2']^T of the classification layer: when p1' is greater than p2', y = [1 0]^T; when p1' is less than or equal to p2', y = [0 1]^T.
Step S104: judging whether the application program needs to be closed.
When y = [1 0]^T, it is determined that the application needs to be closed; when y = [0 1]^T, it is determined that the application needs to be retained.
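The decision rule of step S104 reduces to comparing the two components of y. The wrapper function and its return format below are hypothetical illustrations, not the patent's interface:

```python
def should_close(y):
    """Step S104: y = [1, 0]^T means close; y = [0, 1]^T means retain."""
    return y == [1, 0]

def manage_background_app(app_name, y):
    """Hypothetical wrapper: report the action to apply to a
    background application given the prediction result y."""
    return ("close", app_name) if should_close(y) else ("retain", app_name)
```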
In the application program control method provided by the present application, historical feature information x_i is acquired and a training model is generated using a BP neural network algorithm. When the application program is detected entering the background, its current feature information s is brought into the training model to judge whether the application program needs to be closed, so that applications can be closed intelligently.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an application management and control apparatus according to an embodiment of the present disclosure. The apparatus 30 includes an obtaining module 31, a generating module 32, a calculating module 33 and a judging module 34.
It should be noted that the application program may be a chat application program, a video application program, a music application program, a shopping application program, a shared bicycle application program, or a mobile banking application program.
The obtaining module 31 is configured to obtain a sample vector set of the application program, where the sample vectors in the sample vector set include historical feature information x_i of multiple dimensions of the application program.
The sample vector set of the application program is acquired from a sample database; the sample vectors comprise historical feature information x_i of multiple dimensions of the application program.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an application management and control apparatus according to an embodiment of the present disclosure. The apparatus 30 further comprises a detection module 35 for detecting that the application enters the background.
The device 30 may also include a storage module 36, which is used for storing the historical feature information x_i of the application program.
Wherein, the characteristic information of the plurality of dimensions can refer to table 2.
TABLE 2 (reproduced as an image in the original; it lists 10 dimensions of characteristic information)
It should be noted that the 10-dimensional characteristic information shown in table 2 is only an example of the embodiments of the present application. The present application is not limited to it: one, or at least two, of the 10 dimensions may be used, and characteristic information of other dimensions may also be included, for example whether the device is currently charging, the current battery level, or whether WiFi is currently connected.
In one embodiment, historical feature information of 6 dimensions may be selected:
A. the time the application resides in the background;
B. whether the screen is bright, for example, bright recorded as 1 and off recorded as 0;
C. the total number of uses in the current week;
D. the total usage time in the current week;
E. whether WiFi is on, for example, on recorded as 1 and off recorded as 0; and
F. whether the device is currently charging, for example, charging recorded as 1 and not charging recorded as 0.
The generating module 32 is configured to calculate the sample vector set by using a BP neural network algorithm to generate a training model.
The generating module 32 inputs the historical feature information x_i acquired by the obtaining module 31 into the BP neural network algorithm for training.
Referring to fig. 6, the generating module 32 includes a defining module 321 and a solving module 322.
The defining module 321 is configured to define a network structure.
The defining module 321 may include an input layer defining module 3211, a hidden layer defining module 3212, a classification layer defining module 3213, an output layer defining module 3214, an activation function defining module 3215, a batch size defining module 3216, and a learning rate defining module 3217.
The input layer defining module 3211 is configured to set an input layer; the input layer includes N nodes, where N equals the dimension of the historical feature information x_i.
The dimension of the historical feature information x_i is less than 10, and the number of nodes of the input layer is correspondingly less than 10, so as to simplify the calculation.
In one embodiment, the historical feature information x_i has 6 dimensions, so the input layer comprises 6 nodes.
The hidden layer defining module 3212 is configured to set a hidden layer, which includes M nodes.
The hidden layer may comprise a plurality of hidden levels. The number of nodes of each hidden level is less than 10, so as to simplify the calculation.
In one embodiment, the hidden layer comprises a first hidden level, a second hidden level, and a third hidden level, which include 10, 5, and 5 nodes respectively.
The classification layer defining module 3213 is configured to set a classification layer. The classification layer uses the softmax function:

p_k = e^(Z_k) / Σ_{j=1..C} e^(Z_j)

where p_k is the prediction probability value, Z_k is the kth intermediate value, Z_j is the jth intermediate value, and C is the number of predicted classes.
The output layer defining module 3214 is configured to set an output layer, where the output layer includes 2 nodes.
The activation function defining module 3215 is configured to set an activation function. The activation function is the sigmoid function:

f(x) = 1 / (1 + e^(-x))

where f(x) ranges from 0 to 1.
The batch size defining module 3216 is configured to set a batch size; the batch size is A.
The batch size can be flexibly adjusted according to actual conditions, and may be 50 to 200.
In one embodiment, the batch size is 128.
The learning rate defining module 3217 is configured to set a learning rate, where the learning rate is B.
Wherein, the learning rate can be flexibly adjusted according to the actual situation. The learning rate may be 0.1-1.5.
In one embodiment, the learning rate is 0.9.
It should be noted that the sequence in which the input layer defining module 3211 sets the input layer, the hidden layer defining module 3212 sets the hidden layer, the classification layer defining module 3213 sets the classification layer, the output layer defining module 3214 sets the output layer, the activation function defining module 3215 sets the activation function, the batch size defining module 3216 sets the batch size, and the learning rate defining module 3217 sets the learning rate can be flexibly adjusted.
The solving module 322 is configured to bring the sample vector set into the network structure for calculation to obtain a training model.
The solving module 322 may include a first solving module 3221, a second solving module 3222, a third solving module 3223, a fourth solving module 3224 and a correcting module 3225.
The first solving module 3221 is configured to input the sample vector set into the input layer for calculation to obtain the output value of the input layer.
The second solving module 3222 is configured to input the output value of the input layer into the hidden layer to obtain the output value of the hidden layer.
Wherein the output value of the input layer is the input value of the hidden layer.
In one embodiment, the hidden layer may include a plurality of hidden levels. The output value of the input layer is the input value of the first hidden level, the output value of the first hidden level is the input value of the second hidden level, the output value of the second hidden level is the input value of the third hidden level, and so on.
The third solving module 3223 is configured to input the output value of the hidden layer into the classification layer for calculation, so as to obtain the prediction probability value [p1 p2]^T.
Wherein the output value of the hidden layer is the input value of the classification layer.
The fourth solving module 3224 is configured to bring the prediction probability value into the output layer for calculation, so as to obtain a prediction result value y: when p1 is greater than p2, y = [1 0]^T; when p1 is less than or equal to p2, y = [0 1]^T.
Wherein the output value of the classification layer is the input value of the output layer.
The correcting module 3225 is configured to correct the network structure according to the prediction result value y, so as to obtain a training model.
The calculation module 33 is configured to input the current feature information s of the application program into the training model for calculation when the application program enters the background.
Referring to fig. 6, in an embodiment, the calculation module 33 may include an acquisition module 331 and an operation module 332.
The acquisition module 331 is configured to collect the current feature information s of the application program.
The dimension of the collected current feature information s of the application program is the same as that of the acquired historical feature information x_i of the application program.
The operation module 332 is used for substituting the current feature information s into the training model for calculation.
Inputting the current feature information s into the training model for calculation yields the prediction probability value [p1' p2']^T of the classification layer: when p1' is greater than p2', y = [1 0]^T; when p1' is less than or equal to p2', y = [0 1]^T.
In an embodiment, the acquisition module 331 is configured to collect the current feature information s at regular intervals according to a predetermined acquisition time and store it in the storage module 36; the acquisition module 331 is further configured to obtain the current feature information s corresponding to the time point at which the application program enters the background, and to pass it to the operation module 332 to be brought into the training model for calculation.
The determining module 34 is configured to determine whether the application needs to be closed.
When y = [1 0]^T, it is determined that the application needs to be closed; when y = [0 1]^T, it is determined that the application needs to be retained.
The apparatus 30 may further include a shutdown module 37 configured to shut down the application when it is determined that the application needs to be closed.
The apparatus for the application program management and control method provided by the present application acquires the historical feature information x_i, generates a training model by using a BP neural network algorithm, and, when detecting that an application program enters the background, brings the current feature information s of the application program into the training model to determine whether the application program needs to be closed, thereby closing applications intelligently.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 500 includes a processor 501 and a memory 502. The processor 501 is electrically connected to the memory 502.
The processor 501 is the control center of the electronic device 500. It connects the various parts of the whole electronic device 500 by using various interfaces and lines, and executes the various functions of the electronic device and processes data by running or loading an application program stored in the memory 502 and calling the data stored in the memory 502, thereby monitoring the whole electronic device 500.
In this embodiment, the processor 501 in the electronic device 500 loads instructions corresponding to the processes of one or more application programs into the memory 502 according to the following steps, and runs the application programs stored in the memory 502, so as to implement various functions:
obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set include historical feature information x_i of multiple dimensions of the application program;
Calculating a sample vector set by adopting a neural network algorithm to generate a training model;
when an application program enters a background, inputting the current characteristic information s of the application program into the training model for calculation; and
and judging whether the application program needs to be closed or not.
It should be noted that the application program may be a chat application program, a video application program, a music application program, a shopping application program, a shared bicycle application program, or a mobile banking application program.
A sample vector set of the application program is acquired from a sample database, wherein the sample vectors in the sample vector set include historical feature information x_i of multiple dimensions of the application program.
The feature information of the multiple dimensions can refer to Table 3.
[Table 3, rendered as images in the original document, lists the 10 dimensions of feature information.]
It should be noted that the 10-dimensional feature information shown in Table 3 is only one example in the embodiments of the present application. The present application is not limited to the 10 dimensions shown in Table 3: the feature information may be any one of them, or at least two of them, and may also include feature information of other dimensions, for example, whether the device is currently charging, the current battery level, or whether WiFi is currently connected.
In one embodiment, historical feature information may be selected for 6 dimensions:
A. the time that the application resides in the background;
B. whether the screen is bright, for example, the screen is bright and is marked as 1, and the screen is off and is marked as 0;
C. the total number of uses in the current week;
D. the total use time in the current week;
E. whether WiFi is on or not, for example, WiFi is on and is recorded as 1, WiFi is off and is recorded as 0; and
F. whether charging is currently in progress, e.g., currently charging, noted as 1, and not currently charging, noted as 0.
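As an illustration, these 6 dimensions could be packed into a sample vector as follows. This is a sketch under assumed names; the patent does not prescribe this particular encoding.

```python
# Hypothetical encoding of the 6 feature dimensions listed above into a
# sample vector x_i; the function and argument names are illustrative assumptions.
def encode_features(background_seconds, screen_on, weekly_uses,
                    weekly_seconds, wifi_on, charging):
    """Return the 6-dimensional feature vector [A, B, C, D, E, F]."""
    return [
        float(background_seconds),   # A. time the application resides in the background
        1.0 if screen_on else 0.0,   # B. screen bright = 1, screen off = 0
        float(weekly_uses),          # C. total number of uses this week
        float(weekly_seconds),       # D. total use time this week
        1.0 if wifi_on else 0.0,     # E. WiFi on = 1, WiFi off = 0
        1.0 if charging else 0.0,    # F. currently charging = 1, not charging = 0
    ]

x_i = encode_features(300, True, 42, 5400, True, False)
```

The input layer of the network then needs exactly one node per entry of this vector.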
In one embodiment, the processor 501 calculates the sample vector set by using a BP neural network algorithm; generating the training model further includes:
defining a network structure; and
and bringing the sample vector set into a network structure for calculation to obtain a training model.
Wherein the defining a network structure comprises:
setting an input layer, wherein the input layer includes N nodes, and the number of nodes of the input layer is the same as the dimension of the historical feature information x_i;
The dimension of the historical feature information x_i is less than 10, so the number of nodes of the input layer is less than 10, which simplifies the operation process.
In one embodiment, the historical feature information x_i has 6 dimensions, so the input layer includes 6 nodes.
Setting a hidden layer, wherein the hidden layer comprises M nodes.
The hidden layer may include a plurality of hidden hierarchies. The number of nodes of each hidden hierarchy is kept less than 10, so as to simplify the operation process.
In one embodiment, the hidden layer may include a first hidden hierarchy, a second hidden hierarchy, and a third hidden hierarchy. The first hidden hierarchy includes 10 nodes, the second hidden hierarchy includes 5 nodes, and the third hidden hierarchy includes 5 nodes.
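As a compact summary of this embodiment, the hidden-layer sizes can be written down directly. The variable names here are illustrative assumptions, not from the patent.

```python
# Illustrative summary of the hidden-layer embodiment described above.
hidden_sizes = [10, 5, 5]  # first, second and third hidden hierarchies
M = sum(hidden_sizes)      # total number of hidden-layer nodes M
```

With these sizes, the hidden layer contains M = 20 nodes in total.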
Setting a classification layer, wherein the classification layer adopts a softmax function:

p_K = e^(Z_K) / Σ_{j=1}^{C} e^(Z_j)

where p_K is the prediction probability value, Z_K is an intermediate value, C is the number of prediction classes, and e^(Z_j) is the j-th intermediate value term.
And setting an output layer, wherein the output layer comprises 2 nodes.
Setting an activation function, wherein the activation function adopts a sigmoid function:

f(x) = 1 / (1 + e^(-x))

where f(x) ranges from 0 to 1.
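For concreteness, the sigmoid activation and the softmax classification function described above can be sketched in plain Python. This is a minimal illustration, not the patent's implementation; the max-shift in softmax is a standard numerical-stability detail added here.

```python
import math

def sigmoid(x):
    # f(x) = 1 / (1 + e^(-x)); outputs lie strictly between 0 and 1
    return 1.0 / (1.0 + math.exp(-x))

def softmax(z):
    # p_k = e^(z_k) / sum_j e^(z_j); shifting by max(z) avoids overflow
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [v / total for v in exps]
```

With two classes (close / keep), softmax over the two classification-layer values yields [p1, p2]^T with p1 + p2 = 1.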
And setting a batch size, wherein the batch size is A.
The batch size A can be flexibly adjusted according to actual conditions and may range from 50 to 200.
In one embodiment, the batch size is 128.
And setting a learning rate, wherein the learning rate is B.
The learning rate B can be flexibly adjusted according to the actual situation and may range from 0.1 to 1.5.
In one embodiment, the learning rate is 0.9.
It should be noted that the order of setting the input layer, the hidden layer, the classification layer, the output layer, the activation function, the batch size, and the learning rate can be flexibly adjusted.
The step of bringing the sample vector set into a network structure for calculation to obtain a training model may include:
and inputting the sample vector set in the input layer for calculation to obtain an output value of the input layer.
And inputting the output value of the input layer into the hidden layer to obtain the output value of the hidden layer.
Wherein the output value of the input layer is the input value of the hidden layer.
In one embodiment, the hidden layer may include a plurality of hidden hierarchies. The output value of the input layer is the input value of the first hidden hierarchy, the output value of the first hidden hierarchy is the input value of the second hidden hierarchy, the output value of the second hidden hierarchy is the input value of the third hidden hierarchy, and so on.
Inputting the output value of the hidden layer into the classification layer for calculation yields the prediction probability value [p1 p2]^T.
Wherein the output value of the hidden layer is the input value of the classification layer.
In one embodiment, the hidden layer may include a plurality of hidden hierarchies, and the output value of the last hidden hierarchy is the input value of the classification layer.
Substituting the prediction probability value into the output layer for calculation yields a prediction result value y: when p1 is greater than p2, y = [1 0]^T; when p1 is less than or equal to p2, y = [0 1]^T.
Wherein the output value of the classification layer is the input value of the output layer.
And correcting the network structure according to the predicted result value y to obtain a training model.
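The forward pass described in the steps above (input layer, hidden hierarchies, classification layer, output decision) can be sketched as follows. This is a hedged illustration with randomly initialized weights; the weight-correction (back propagation) step that produces the trained model is only indicated in a comment, and all names are assumptions for illustration.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Activation function f(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def softmax(z):
    # Classification layer: p_k = e^(z_k) / sum_j e^(z_j)
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [v / s for v in exps]

def init_layer(n_in, n_out):
    # One fully connected layer: small random weights, zero biases.
    return {"w": [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
                  for _ in range(n_out)],
            "b": [0.0] * n_out}

def forward_layer(layer, inputs):
    # Each node applies the sigmoid activation to its weighted sum.
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(layer["w"], layer["b"])]

# Layer sizes from the embodiment above: 6 -> 10 -> 5 -> 5 -> 2.
sizes = [6, 10, 5, 5, 2]
layers = [init_layer(sizes[i], sizes[i + 1]) for i in range(len(sizes) - 1)]

def forward(sample):
    # The output of each layer is the input of the next hidden hierarchy;
    # the last layer's two values are normalized by softmax into [p1, p2]^T.
    values = sample
    for layer in layers:
        values = forward_layer(layer, values)
    return softmax(values)

p = forward([0.3, 1.0, 0.4, 0.6, 1.0, 0.0])
y = [1, 0] if p[0] > p[1] else [0, 1]
# Correcting the network (back-propagating the error between y and the
# sample's label, scaled by the learning rate B) would update each layer's
# weights; that update step is omitted from this sketch.
```

Repeating the forward pass and correction over batches of the sample vector set (batch size A) is what yields the training model.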
When the application program enters the background, the step of inputting the current characteristic information s of the application program into the training model for calculation comprises the following steps:
and collecting the current characteristic information s of the application program.
The dimension of the collected current feature information s of the application program is the same as that of the acquired historical feature information x_i of the application program.
And substituting the current characteristic information s into the training model for calculation.
Inputting the current feature information s into the training model for calculation yields the prediction probability value [p1' p2']^T of the classification layer: when p1' is greater than p2', y = [1 0]^T; when p1' is less than or equal to p2', y = [0 1]^T.
In the step of determining whether the application program needs to be closed: when y = [1 0]^T, it is determined that the application needs to be closed; when y = [0 1]^T, it is determined that the application needs to be retained.
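The closing rule in this step amounts to a comparison of the two class probabilities. This is a sketch; `should_close` is an illustrative name, not from the patent.

```python
def should_close(p1, p2):
    # y = [1 0]^T (close) when p1 > p2, otherwise y = [0 1]^T (keep).
    y = [1, 0] if p1 > p2 else [0, 1]
    return y == [1, 0]
```

A shutdown module would then close the background application exactly when `should_close` returns True.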
The memory 502 may be used to store applications and data. The memory 502 stores programs containing instructions executable by the processor. The programs may constitute various functional modules. The processor 501 executes various functional applications and data processing by running the programs stored in the memory 502.
In some embodiments, as shown in fig. 8, which is a schematic structural diagram of an electronic device provided in an embodiment of the present application, the electronic device 500 further comprises: a radio frequency circuit 503, a display 504, a control circuit 505, an input unit 506, an audio circuit 507, a sensor 508, and a power supply 509. The processor 501 is electrically connected to the radio frequency circuit 503, the display 504, the control circuit 505, the input unit 506, the audio circuit 507, the sensor 508, and the power supply 509.
The radio frequency circuit 503 is used for transmitting and receiving radio frequency signals so as to communicate with a server or other electronic devices through a wireless communication network.
The display 504 may be used to display information entered by or provided to the user, as well as various graphical user interfaces of the terminal, which may be composed of images, text, icons, video, and any combination thereof.
The control circuit 505 is electrically connected to the display 504 and is configured to control the display 504 to display information.
The input unit 506 may be used to receive input numbers, character information, or user characteristic information (e.g., a fingerprint), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The audio circuit 507 may provide an audio interface between the user and the terminal through a speaker and a microphone.
The sensor 508 is used to collect external environmental information. The sensors 508 may include one or more of an ambient light sensor, an acceleration sensor, a gyroscope, and the like.
The power supply 509 is used to power the various components of the electronic device 500. In some embodiments, the power supply 509 may be logically coupled to the processor 501 through a power management system, so that charging, discharging, and power consumption are managed through the power management system.
The electronic device provided by the present application acquires the historical feature information x_i, generates a training model by using a BP neural network algorithm, and, when detecting that an application program enters the background, brings the current feature information s of the application program into the training model to determine whether the application program needs to be closed, thereby closing applications intelligently.
The embodiment of the present invention further provides a medium, where multiple instructions are stored, and the instructions are suitable for being loaded by a processor to execute the application management and control method according to any one of the above embodiments.
The application program control method, device, medium and electronic device provided by the embodiment of the invention belong to the same concept, and the specific implementation process is detailed in the whole specification and is not described herein again.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
The application program management and control method, apparatus, medium, and electronic device provided by the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and embodiments of the present application, and the descriptions of the above embodiments are only intended to help understand the present application. For those skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. An application program management and control method, applied to an electronic device, wherein the method comprises the following steps:
obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set include historical feature information x_i of multiple dimensions of the application program;
defining a network structure, including setting an input layer, a hidden layer, a classification layer, an output layer, an activation function, a batch size and a learning rate, wherein the input layer includes N nodes, the number of nodes of the input layer is the same as the dimension of the historical feature information x_i, the hidden layer includes M nodes, the classification layer adopts a softmax function, the output layer includes 2 nodes, the activation function adopts a sigmoid function f(x) = 1 / (1 + e^(-x)) whose range is 0 to 1, the batch size is A, and the learning rate is B;
inputting the sample vector set into the input layer for calculation to obtain the output value of the input layer; inputting the output value of the input layer into the hidden layer to obtain the output value of the hidden layer; inputting the output value of the hidden layer into the classification layer for calculation to obtain a prediction probability value; substituting the prediction probability value into the output layer for calculation to obtain a prediction result value y; and correcting the network structure according to the prediction result value y to obtain a training model;
when the application program enters the background, inputting the current feature information s of the application program into the training model for calculation to obtain a calculation result; and
determining whether the application program needs to be closed according to the calculation result.
2. The application program management and control method according to claim 1, wherein in the step of inputting the current feature information s of the application program into the training model for calculation, the current feature information s is input into the training model to obtain the prediction probability value of the classification layer.
3. The application program management and control method according to claim 1, wherein the hidden layer includes a first hidden layer, a second hidden layer and a third hidden layer, and the number of nodes in each of the first, second and third hidden layers is less than 10.
4. An application program management and control apparatus, comprising:
an acquisition module, configured to acquire a sample vector set of the application program, wherein the sample vectors in the sample vector set include historical feature information x_i of multiple dimensions of the application program;
a generation module, configured to define a network structure, including setting an input layer, a hidden layer, a classification layer, an output layer, an activation function, a batch size and a learning rate, wherein the input layer includes N nodes, the number of nodes of the input layer is the same as the dimension of the historical feature information x_i, the hidden layer includes M nodes, the classification layer adopts a softmax function, the output layer includes 2 nodes, the activation function adopts a sigmoid function f(x) = 1 / (1 + e^(-x)) whose range is 0 to 1, the batch size is A, and the learning rate is B; and further configured to input the sample vector set into the input layer for calculation to obtain the output value of the input layer, input the output value of the input layer into the hidden layer to obtain the output value of the hidden layer, input the output value of the hidden layer into the classification layer for calculation to obtain a prediction probability value, substitute the prediction probability value into the output layer for calculation to obtain a prediction result value y, and correct the network structure according to the prediction result value y to obtain a training model;
a calculation module, configured to input the current feature information s of the application program into the training model for calculation when the application program enters the background, so as to obtain a calculation result; and
a determination module, configured to determine whether the application program needs to be closed according to the calculation result.
5. The application program management and control apparatus according to claim 4, wherein the calculation module is configured to, when the application program enters the background, input the current feature information s of the application program into the training model for calculation to obtain the prediction probability value of the classification layer.
6. The application program management and control apparatus according to claim 4, further comprising a detection module configured to detect that the application program enters the background.
7. The application program management and control apparatus according to claim 4, further comprising a storage module configured to store feature information of the application program.
8. The application program management and control apparatus according to claim 4, further comprising a shutdown module configured to close the application program when it is determined that the application program needs to be closed.
9. A medium storing a plurality of instructions, wherein the instructions are adapted to be loaded by a processor to execute the application program management and control method according to any one of claims 1 to 3.
10. An electronic device, comprising a processor and a memory, wherein the electronic device is electrically connected to the memory, the memory is configured to store instructions and data, and the processor is configured to execute the application program management and control method according to any one of claims 1 to 3.
CN201711044959.5A2017-10-312017-10-31Application program control method, device, medium and electronic equipmentActiveCN107885544B (en)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN201711044959.5A | 2017-10-31 | 2017-10-31 | Application program control method, device, medium and electronic equipment (CN107885544B)
PCT/CN2018/110518 | 2017-10-31 | 2018-10-16 | Application program control method and apparatus, medium, and electronic device (WO2019085749A1)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201711044959.5A | 2017-10-31 | 2017-10-31 | Application program control method, device, medium and electronic equipment (CN107885544B)

Publications (2)

Publication Number | Publication Date
CN107885544A | 2018-04-06
CN107885544B | 2020-04-10

Family

ID=61783058

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201711044959.5A | Application program control method, device, medium and electronic equipment (Active, CN107885544B) | 2017-10-31 | 2017-10-31

Country Status (2)

Country | Link
CN | CN107885544B (en)
WO | WO2019085749A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107885544B (en)* | 2017-10-31 | 2020-04-10 | Oppo广东移动通信有限公司 | Application program control method, device, medium and electronic equipment
CN109101326A (en)* | 2018-06-06 | 2018-12-28 | 三星电子(中国)研发中心 | A kind of background process management method and device
CN110286949A (en)* | 2019-06-27 | 2019-09-27 | 深圳市网心科技有限公司 | Process suspending method and related equipment based on reading and writing of physical host storage device
CN110286961A (en)* | 2019-06-27 | 2019-09-27 | 深圳市网心科技有限公司 | Process suspension method based on physical host processor and related equipment
CN110275760A (en)* | 2019-06-27 | 2019-09-24 | 深圳市网心科技有限公司 | Process suspending method based on virtual host processor and its related equipment

Citations (4)

Publication number | Priority date | Publication date | Assignee | Title
CN105389193A (en)* | 2015-12-25 | 2016-03-09 | 北京奇虎科技有限公司 | Accelerating processing method, device and system for application, and server
CN106648023A (en)* | 2016-10-02 | 2017-05-10 | 上海青橙实业有限公司 | Mobile terminal and power-saving method of mobile terminal based on neural network
CN107133094A (en)* | 2017-06-05 | 2017-09-05 | 努比亚技术有限公司 | Application management method, mobile terminal and computer-readable recording medium
CN107145215A (en)* | 2017-05-06 | 2017-09-08 | 维沃移动通信有限公司 | A kind of background application method for cleaning and mobile terminal

Family Cites Families (10)

Publication number | Priority date | Publication date | Assignee | Title
CN102306095B (en)* | 2011-07-21 | 2017-04-05 | 宇龙计算机通信科技(深圳)有限公司 | Application management method and terminal
KR20160091786A (en)* | 2015-01-26 | 2016-08-03 | 삼성전자주식회사 | Method and apparatus for managing user
US10572797B2 (en)* | 2015-10-27 | 2020-02-25 | Pusan National University Industry-University Cooperation Foundation | Apparatus and method for classifying home appliances based on power consumption using deep learning
CN106909447B (en)* | 2015-12-23 | 2019-11-15 | 北京金山安全软件有限公司 | Background application processing method and device and terminal
CN105718027B (en)* | 2016-01-20 | 2019-05-31 | 努比亚技术有限公司 | The management method and mobile terminal of background application
CN105808410B (en)* | 2016-03-29 | 2019-05-31 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment
CN106354836A (en)* | 2016-08-31 | 2017-01-25 | 南威软件股份有限公司 | Advertisement page prediction method and device
CN107608748B (en)* | 2017-09-30 | 2019-09-13 | Oppo广东移动通信有限公司 | Application program control method and device, storage medium and terminal equipment
CN107643948B (en)* | 2017-09-30 | 2020-06-02 | Oppo广东移动通信有限公司 | Application program control method, device, medium and electronic equipment
CN107885544B (en)* | 2017-10-31 | 2020-04-10 | Oppo广东移动通信有限公司 | Application program control method, device, medium and electronic equipment


Also Published As

Publication number | Publication date
WO2019085749A1 | 2019-05-09
CN107885544A | 2018-04-06

Similar Documents

Publication | Title
CN107643948B (en) | Application program control method, device, medium and electronic equipment
CN107885544B (en) | Application program control method, device, medium and electronic equipment
CN107608748B (en) | Application program control method and device, storage medium and terminal equipment
US11249645B2 (en) | Application management method, storage medium, and electronic apparatus
CN111797288B (en) | Data screening method, device, storage medium and electronic device
CN107632697B (en) | Application processing method and device, storage medium and electronic equipment
CN113284142A (en) | Image detection method, image detection device, computer-readable storage medium and computer equipment
CN111797861A (en) | Information processing method, device, storage medium and electronic device
CN111046742B (en) | Eye behavior detection method, device and storage medium
CN107402808B (en) | Process management method, device, storage medium and electronic equipment
US20200241483A1 (en) | Method and Device for Managing and Controlling Application, Medium, and Electronic Device
CN111797079A (en) | Data processing method, device, storage medium and electronic device
CN111984803A (en) | Multimedia resource processing method and device, computer equipment and storage medium
CN112948763B (en) | Piece quantity prediction method and device, electronic equipment and storage medium
CN107729144A (en) | Application control method, device, storage medium and electronic equipment
CN108829595A (en) | Test method, device, storage medium and electronic equipment
CN110838306A (en) | Voice signal detection method, computer storage medium and related equipment
CN112742026A (en) | Game control method, device, storage medium and electronic equipment
CN107861770B (en) | Application program control method and device, storage medium and terminal equipment
CN108875901A (en) | Neural network training method and generic object detection method, device and system
CN107766892B (en) | Application program control method and device, storage medium and terminal equipment
CN111797863A (en) | Model training method, data processing method, device, storage medium and device
CN107870791A (en) | Application management method, device, storage medium and electronic device
CN114897158A (en) | Training method of data processing model, data processing method, device and equipment
CN109066870B (en) | Charge management method, device, medium and electronic device applying the method

Legal Events

Code | Title | Description
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
CB02 | Change of applicant information | Address after: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong. Applicant after: OPPO Guangdong Mobile Communications Co., Ltd. Address before: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong. Applicant before: Guangdong OPPO Mobile Communications Co., Ltd.
GR01 | Patent grant |
