Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present disclosure, "a plurality" means at least two, e.g., two, three, etc., unless explicitly and specifically limited otherwise.
In view of the above technical problems in the related art, embodiments of the present disclosure provide a method for intent recognition to solve at least one or all of the above technical problems.
FIG. 1 shows a schematic diagram of an exemplary system architecture to which the methods for intent recognition of embodiments of the present disclosure may be applied; as shown in FIG. 1:
the system architecture may include a server 101, a network 102, and a client 103. Network 102 serves as a medium for providing communication links between clients 103 and server 101. Network 102 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The server 101 may be a server that provides various services, such as a background management server that provides support for performing recognition processing on data acquired from the client 103. The background management server may send a collection instruction to the client 103, receive and analyze the user information set to be identified returned by the client 103, and perform recognition on the data obtained by the analysis.
In some optional embodiments, the server 101 may receive a set of to-be-identified user information sent by a user terminal; the server 101 may analyze the user information set to be identified, obtain user service data, user terminal data and user behavior time data collected by the user terminal, and obtain a receiving time point for receiving the user information to be identified; the server 101 may perform intention recognition for the target intention according to the user service data, the user terminal data, the user behavior time data, and the reception time point, and obtain a recognition result value for the target intention; the server 101 may determine the degree to which the set of user information to be identified characterizes the intent of the object from the identification result value.
The client 103 may be a mobile terminal such as a mobile phone, a game console, a tablet computer, an electronic book reader, smart glasses, a smart home device, an AR (Augmented Reality) device, a VR (Virtual Reality) device, or the client 103 may also be a personal computer such as a laptop computer, a desktop computer, and the like.
The client 103 can receive the acquisition instruction sent by the server 101, acquire data on the terminal based on the acquisition instruction, and perform hash calculations on the terminal based on the acquisition instruction. As a result, when a terminal is used as a card maintenance terminal by a large number of telephone cards, the hash calculations consume the terminal's computing power, so that the card maintenance behavior incurs a cost.
It should be understood that the numbers of clients, networks and servers in FIG. 1 are only illustrative; the server 101 may be a physical server, a server cluster composed of a plurality of servers, or a cloud server, and there may be any number of clients, networks and servers according to actual needs.
Hereinafter, each step of the method for intention identification in the exemplary embodiment of the present disclosure will be described in more detail with reference to the accompanying drawings and embodiments.
FIG. 2 illustrates a flow diagram of a method for intent recognition, according to one embodiment of the present disclosure. The method provided by the embodiment of the present disclosure may be executed by a server or a client as shown in fig. 1, but the present disclosure is not limited thereto.
In the following description, the server 101 is used as the execution subject for illustration.
As shown in fig. 2, a method for intention identification provided by an embodiment of the present disclosure may include the following steps:
step S201, receiving a user information set to be identified sent by a user terminal;
step S203, analyzing the user information set to be identified, obtaining user service data, user terminal data and user behavior time data collected by a user terminal, and obtaining a receiving time point for receiving the user information to be identified;
step S205, according to the user service data, the user terminal data, the user behavior time data and the receiving time point, performing intention identification aiming at the target intention to obtain an identification result value of the target intention;
and step S207, determining the degree of representing the target intention by the user information set to be identified according to the identification result value.
According to the method for intent recognition, the user information set to be identified can be obtained from the user terminal and analyzed to obtain the user service data, the user terminal data and the user behavior time data. The degree to which the user information set to be identified represents the target intention can then be recognized using the multiple dimensions of the user service data, the user terminal data and the user behavior time data, thereby achieving a better target intention recognition effect.
Next, steps S201 to S207 of the method for intention identification in the present exemplary embodiment will be described in more detail with reference to fig. 2 and the embodiment.
Step S201, receiving a user information set to be identified sent by a user terminal.
The data used for generating the user information set to be identified can be collected locally by the user terminal, the user information set to be identified is generated from the collected data, and the user terminal then sends the user information set to be identified to the server.
In some embodiments, before step S201, the following steps may be further included: configuring an acquisition instruction, wherein an acquisition trigger condition can be configured in the acquisition instruction in advance; and sending an acquisition instruction to the user terminal so that the user terminal acquires user service data, user terminal data and user behavior time data according to the acquisition instruction when the user terminal triggers an acquisition triggering condition, and further generating a user information set to be identified.
Specifically, the acquisition trigger condition may be a specific trigger action, such as a "dialing action", or a specific time, such as "3 pm of each day", so that when the terminal detects that the acquisition trigger condition is satisfied, the terminal executes the acquisition instruction.
In some embodiments, the acquisition instruction further configures a target service data type to be acquired, a target terminal data type to be acquired, and a user behavior time acquisition strategy; the user service data is acquired by the user terminal according to the type of the target service data; the user terminal data is acquired by the user terminal according to the data type of the target terminal; the user behavior time data is obtained by the user terminal according to the user behavior time obtaining strategy.
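As an illustration of how such an acquisition instruction might be structured, the sketch below shows a hypothetical payload; every field name and value is an assumption for illustration only, not part of the disclosure:

```python
# Hypothetical acquisition-instruction payload; all field names and values
# here are illustrative assumptions, not part of the disclosure itself.
acquisition_instruction = {
    # acquisition trigger condition: a specific action or a specific time
    "trigger_condition": {"action": "dial", "daily_time": "15:00"},
    # target service data types to be collected on the terminal
    "target_service_data_types": ["card_registration_place", "package_type"],
    # target terminal data types to be collected on the terminal
    "target_terminal_data_types": ["imei", "battery_info", "sensor_count"],
    # user behavior time acquisition policy (offset period, hash settings)
    "behavior_time_policy": {
        "offset_period_hours": 2,
        "hash_algorithm": "SHA-256",
        "hash_rounds": 5,
    },
}
```

On receipt, the terminal would collect each configured data type and apply the behavior-time policy described in the steps that follow.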
Specifically, the user service data may be, for example, service information such as a user card number registration place, a minimum consumption service type, and a package service type, and the user terminal data may be, for example, power change information, application quantity information, and terminal category information on the terminal. The user service data and the user terminal data can be searched and obtained from the local storage space of the terminal.
Step S203, the user information set to be identified is analyzed to obtain the user service data, the user terminal data and the user behavior time data collected by the user terminal, and the receiving time point at which the user information set to be identified was received is obtained. After receiving the user information set to be identified sent by the terminal, the server can directly analyze it to obtain the data acquired by the terminal, and can record the receiving time point for use in the subsequent identification steps.
Step S205, according to the user service data, the user terminal data, the user behavior time data and the receiving time point, the intention recognition for the target intention is carried out, and the recognition result value of the target intention is obtained. In the embodiment of the disclosure, the target intention may be a card maintenance intention.
Fig. 3 shows a flowchart of obtaining a recognition result value in the method for intention recognition according to an embodiment of the disclosure, and as shown in fig. 3, step S205 in the embodiment of fig. 2 may further include the following steps S301 to S309 in some embodiments:
step S301, according to the user service data, performing service characteristic identification aiming at the target intention to obtain a service identification result value. In this embodiment, the identification may be performed from the service feature dimension.
In some embodiments, the user service data comprises at least one of: the user card number registration place, the lowest consumption service type, the package service type, the monthly rental service type, the average revenue per user (ARPU) value, the calling duration, the called duration, the monthly consumption value since network access, and the user attribution channel type. Step S301 may include: acquiring a card maintenance recognition model, and inputting the user service data into the card maintenance recognition model; obtaining the card maintenance intention probability output by the card maintenance recognition model; and taking the card maintenance intention probability as the service identification result value. The card maintenance recognition model is obtained by training according to historical service data of card maintenance users and historical service data of non-card-maintenance users.
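As a minimal sketch of how such a model could map user service data to a card maintenance intention probability, the following uses a hand-set logistic scoring function; the feature names and weights are illustrative assumptions, whereas the disclosure trains the model on historical service data:

```python
import math

# Illustrative feature weights only; in the disclosure the model is trained
# on historical service data of card maintenance and non-card-maintenance
# users rather than hand-set.
FEATURE_WEIGHTS = {
    "monthly_arpu": -0.8,        # normal spending lowers the probability
    "calling_minutes": -0.5,     # normal outbound calling lowers it
    "called_minutes": -0.3,
    "monthly_consumption": -0.6,
}
BIAS = 2.0  # an inactive card with no normal usage scores high

def card_maintenance_probability(service_data):
    """Map normalized service features (0..1) to a probability in (0, 1)."""
    z = BIAS + sum(w * service_data.get(name, 0.0)
                   for name, w in FEATURE_WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing
```

Under these assumed weights, a card showing no normal usage scores high while an actively used card scores lower; the resulting probability would serve as the service identification result value.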
The user service data can represent the service use state or condition of the telephone card user, and each information in the user service data can be used for judging whether the user normally uses the telephone card; in some practical applications, user service data of a large number of users can be obtained as training samples, a neural network model is trained according to the training samples, and then the trained neural network model is used for service feature recognition.
Step S303, carrying out terminal environment identification aiming at the target intention according to the user terminal data, and obtaining a terminal identification result value. In this embodiment, the identification may be performed from the terminal environment dimension.
In some embodiments, step S303 may include: determining the number of items of user terminal data that satisfy the card maintenance environment conditions, and determining the terminal identification result value according to that number. The card maintenance environment conditions include: no international mobile equipment identity (IMEI) exists on the terminal, no local number exists on the terminal, no subscriber identity card (SIM) number exists on the terminal, no brand type information exists on the terminal, no CPU information exists on the terminal, no Mac address information exists on the terminal, no battery level change information exists on the terminal, no baseband information exists on the terminal, control group information cannot be read on the terminal, process group information cannot be read on the terminal, no wlan driver information exists on the terminal, the wlan driver information on the terminal is abnormal, the number of sensors on the terminal is smaller than a sensor number threshold, and the number of pre-installed applications on the terminal is smaller than an application number threshold.
The user terminal data can be used to characterize the environment of the terminal. The terminal of a normal user does not satisfy the card maintenance environment conditions when using a phone card; for example, a terminal used by a normal user usually has an international mobile equipment identity and battery level changes. Therefore, the terminal identification result value can be determined by checking whether the user terminal data satisfies the card maintenance environment conditions and, if so, how many of the conditions are satisfied.
For example, the terminal identification result value may be set to 0 or 1, where 0 indicates no card maintenance intention and 1 indicates a card maintenance intention: if the number of satisfied card maintenance environment conditions is 0, the corresponding terminal identification result value is 0, and if the number is 1 or more, the corresponding terminal identification result value is 1. Alternatively, the terminal identification result value may take the values 0, 0.2, 0.4, 0.6, 0.8 and 1: the value is 0 if the number of satisfied card maintenance environment conditions is 0, 0.2 if the number is 1 or 2, 0.4 if the number is 3 or 4, and so on.
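The tiered mapping just described can be sketched as follows; the tier boundaries follow the 0/0.2/0.4 example above, and extending the pattern beyond the stated values is an assumption:

```python
def terminal_result_value(num_conditions_met: int) -> float:
    """Map the number of satisfied card maintenance environment conditions
    to a terminal identification result value: 0 -> 0, 1-2 -> 0.2,
    3-4 -> 0.4, and so on, capped at 1 (extension beyond 0.4 assumed)."""
    if num_conditions_met <= 0:
        return 0.0
    # every two additional satisfied conditions raise the tier by 0.2
    return min(1.0, 0.2 * ((num_conditions_met + 1) // 2))
```

The binary 0/1 variant described first is the special case of collapsing all non-zero counts into the value 1.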
And step S305, performing data authenticity identification according to the user behavior time data and the receiving time point to obtain a data identification result value.
In some embodiments, for the aforementioned acquisition instruction, an offset time period, a hash algorithm, and a hash calculation count may also be configured in the user behavior time acquisition policy of the acquisition instruction, so that the terminal may obtain the user behavior time data according to that policy. The user behavior time data may include: the offset time period, the behavior occurrence time point of a target behavior determined based on the offset time period, and the calculation time period taken to iteratively calculate specified data with the hash algorithm according to the hash calculation count.
The target behavior can be determined as follows: determine a first moment at which the acquisition trigger condition is triggered, take the moment that is the offset time period before the first moment as a second moment, and determine the latest user behavior before the second moment as the target behavior. The target behavior may include at least one of a mouse-click behavior, a keyboard-operation behavior, a gesture behavior, and a gyroscope behavior. The specified data may include at least one of the user service data, the user terminal data, and behavior log data of the target behavior.
For example, if the first time is 8:00 and the offset period is set to two hours, the time two hours before 8:00 is 6:00, so the second time is determined to be 6:00. The terminal then queries its local storage space for the target behavior last recorded before 6:00; if the occurrence time of that target behavior is 5:00, the behavior occurrence time point is determined to be 5:00. The log data of the target behavior may also be used as the specified data: the hash algorithm iteratively computes over the specified data according to the hash calculation count, and the time taken by the iterative computation, for example 2 seconds, is recorded as the calculation time period. Thus, "the offset period is two hours", "the behavior occurrence time point is 5:00" and "the calculation period is 2 seconds" can be taken as the user behavior time data.
In some practical applications, the hash algorithm and the hash calculation times can be adjusted and set according to practical situations; the hash algorithm may be regarded as an algorithm capable of consuming the terminal computing power, for example, the hash algorithm may be an SHA256 algorithm, an SHA512 algorithm, or the like, and the hash calculation number may be, for example, 5 times, 1000 times, 5000 times, or the like. The determination method of the calculation time period may be: and recording the time of one-time Hash calculation, and obtaining a calculation time period according to the product of the time of one-time Hash calculation and the Hash calculation times.
FIG. 4 is a schematic diagram illustrating the iterative computation in the method for intent recognition according to an embodiment of the present disclosure. As shown in FIG. 4, the five hash iterations proceed as follows:
the specified data may be obtained first;
the first calculation: acquire the current acquisition time, process the specified data and the current acquisition time with the SHA-256 hash algorithm to obtain the hash value Hash1 of the first hash calculation, and record the time spent on the calculation, namely a time slice t;
the second calculation: acquire the current acquisition time, process the specified data, the current acquisition time and Hash1 with the SHA-256 hash algorithm to obtain the hash value Hash2 of the second hash calculation, and record the time slice t;
the third calculation: acquire the current acquisition time, process the specified data, the current acquisition time and Hash2 with the SHA-256 hash algorithm to obtain the hash value Hash3 of the third hash calculation, and record the time slice t;
the fourth calculation: acquire the current acquisition time, process the specified data, the current acquisition time and Hash3 with the SHA-256 hash algorithm to obtain the hash value Hash4 of the fourth hash calculation, and record the time slice t;
the fifth calculation: acquire the current acquisition time, process the specified data, the current acquisition time and Hash4 with the SHA-256 hash algorithm to obtain the hash value Hash5 of the fifth hash calculation, and record the time slice t;
at this point, the iterative computation is complete.
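The five chained SHA-256 computations above can be sketched as follows; mixing in the current acquisition time and timing the loop follows the figure, while the exact byte-level concatenation is an assumption:

```python
import hashlib
import time

def chained_hash_timing(specified_data: bytes, rounds: int = 5):
    """Perform `rounds` chained SHA-256 calculations. Each round hashes the
    specified data, the current acquisition time, and the previous round's
    digest; returns (final_digest_hex, total_elapsed_seconds)."""
    digest = b""  # the first round has no previous digest
    start = time.perf_counter()
    for _ in range(rounds):
        now = str(time.time()).encode()  # current acquisition time
        digest = hashlib.sha256(specified_data + now + digest).digest()
    elapsed = time.perf_counter() - start  # the calculation time period
    return digest.hex(), elapsed
```

Because each digest feeds the next round, the final result cannot be produced without actually spending the computation after the behavior occurs.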
It can be seen that by implementing the method shown in FIG. 4, the previous hash calculation result is used as an input parameter of the next SHA-256 computation, and the calculation result can only be generated by performing a certain amount of hash calculation after the behavior occurs, which increases the cost of forgery.
Because a large number of telephone cards are usually applied on one card maintenance terminal in card maintenance behavior, each telephone card causes the card maintenance terminal to perform a large number of hash calculations, consuming its computing power. Therefore, when the information of the user information set to be identified is collected, configuring a hash algorithm and a hash calculation count in the acquisition instruction leaves the terminal of a normal user essentially unaffected while continuously consuming the computing power of a card maintenance terminal with card maintenance intention, thereby making card maintenance behavior costly.
In step S305, in one embodiment, identification may be performed from the dimension of whether the acquired data is forged. The steps in this embodiment may be used to determine whether the data used for identification in steps S301 and S303 has been forged by the terminal, so as to avoid inaccurate identification results caused by the terminal using forged data.
In some embodiments, step S305 may include: calculating to obtain a behavior prediction occurrence time point according to the receiving time point, the calculation time period and the offset time period; determining a difference value between the behavior prediction occurrence time point and the behavior occurrence time point, and taking the difference value as a behavior time difference; determining the real probability of the user behavior time data according to the relationship between the behavior time difference and the time difference threshold value, and taking the real probability as a data authenticity identification result; and determining a data identification result value according to the data authenticity identification result. The time difference threshold may be adjusted and set according to actual conditions, for example, the minimum threshold may be set to 0, and the maximum threshold may be set to 2 hours.
Specifically, suppose the offset time period T1 is two hours, the behavior occurrence time point t1 is 5:00, the calculation time period t2 is 1 minute, and the receiving time point T2 is 8:05. The behavior predicted occurrence time point t3 can then be calculated from the receiving time point, the calculation time period and the offset time period, for example using the formula t3 = T2 - t2 - T1, giving t3 = 6:04. The difference between the behavior predicted occurrence time point and the behavior occurrence time point, i.e., (t3 - t1), is then determined, giving a behavior time difference of 1 hour and 4 minutes. A time difference threshold is then acquired, and the real probability of the user behavior time data is determined according to the relationship between the behavior time difference and the time difference threshold; specifically, a probability mapping table corresponding to the time difference threshold may be obtained, which contains the correspondence between behavior time differences and real probabilities, so that the real probability corresponding to the behavior time difference can be looked up.
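The calculation in this example can be sketched directly with standard datetime arithmetic (the date is arbitrary; only the times of day matter):

```python
from datetime import datetime, timedelta

def predicted_behavior_time(receive_time, calc_period, offset_period):
    """t3 = T2 - t2 - T1: subtract the calculation time period and the
    offset time period from the receiving time point."""
    return receive_time - calc_period - offset_period

# Values from the example above (the date itself is arbitrary)
t1 = datetime(2024, 1, 1, 5, 0)    # behavior occurrence time point, 5:00
T2 = datetime(2024, 1, 1, 8, 5)    # receiving time point, 8:05
t3 = predicted_behavior_time(T2, timedelta(minutes=1), timedelta(hours=2))
behavior_time_difference = t3 - t1  # 1 hour and 4 minutes
```

The resulting behavior time difference would then be compared against the time difference threshold to look up the real probability.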
Step S307, obtaining the weight values corresponding to the service identification value, the terminal identification value, and the data identification value, respectively. The weight values corresponding to the service identification value, the terminal identification value and the data identification value can be preset, and the weight values can be adjusted and set according to actual conditions.
It should be noted that, in the present disclosure, the specific implementation sequence of steps S301 to S307 is not limited, and steps S301 to S307 only need to be completed before step S309; specifically, the steps in steps S301 to S307 may be executed simultaneously or sequentially, and may be adjusted according to actual situations.
Step S309, determining an identification result value according to the service identification value, the terminal identification value, the data identification value and each corresponding weight value.
Fig. 5 is a schematic diagram illustrating the determination of the recognition result value in the method for intention recognition according to an embodiment of the disclosure, and as shown in fig. 5, the processes of steps S301 to S309 may be:
first, card maintenance feature identification Y1 (i.e., service feature identification) yields a corresponding quantized value (the service identification value) P1, simulator identification Y2 (i.e., terminal environment identification) yields a corresponding quantized value (the terminal identification value) P2, and anti-counterfeiting identification Y3 (i.e., data authenticity identification) yields a corresponding quantized value (the data identification value) P3; the weights W1, W2 and W3 corresponding to the quantized values are then obtained, and the weighted calculation (P1 × W1 + P2 × W2 + P3 × W3) is performed to obtain the recognition result value. The higher the recognition result value, the greater the card maintenance intention represented by the user information set to be identified can be considered.
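The weighted combination can be sketched as below; the default weights are illustrative placeholders, since the disclosure only states that the weights are preset and adjustable:

```python
def recognition_result_value(p1, p2, p3, w1=0.4, w2=0.3, w3=0.3):
    """Combine the service (P1), terminal (P2) and data (P3) identification
    values with their weights: P1*W1 + P2*W2 + P3*W3. The default weights
    are illustrative assumptions, not values from the disclosure."""
    return p1 * w1 + p2 * w2 + p3 * w3
```

With weights summing to 1 and each identification value in [0, 1], the result value also stays in [0, 1], which suits a grade-table lookup.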
And step S207, determining the degree of representing the target intention by the user information set to be identified according to the identification result value.
A preset grade mapping table can be obtained, and whether the user information set to be identified represents the target intention, and to what degree, can be judged according to the correspondence between the recognition result value and the card maintenance intention grade in the grade mapping table.
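A grade mapping lookup of this kind might be sketched as follows; the thresholds and grade labels are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical grade mapping table: (lower bound of result value, grade)
GRADE_TABLE = [(0.8, "high"), (0.5, "medium"), (0.2, "low"), (0.0, "none")]

def intent_grade(result_value: float) -> str:
    """Return the card maintenance intention grade whose lower bound the
    recognition result value first meets (table scanned high to low)."""
    for lower_bound, grade in GRADE_TABLE:
        if result_value >= lower_bound:
            return grade
    return "none"  # negative or out-of-range values
```

The table is scanned from the highest tier down, so each result value maps to exactly one grade.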
Fig. 6 shows a flowchart of a method for intent recognition of one embodiment of the present disclosure, as shown in fig. 6, including:
Step (1), the user side triggers acquisition of user behavior data and terminal environment data, records the time of each behavior point and each environment parameter, and calculates the time offset of each piece of behavior data.
The user behavior can comprise any one or combination of a mouse-click behavior, a keyboard-operation behavior, a gesture behavior and a gyroscope behavior. The environment data refers to software and hardware information of the user terminal, and may include the IMEI number (international mobile equipment identity), the local number, the serial number of the SIM card (subscriber identity card), the phone brand, manufacturer and model, CPU information, the Mac address, battery information, baseband information, the number of sensors, the number of apps pre-installed by the user, and the like.
Step (2), the user terminal performs n SHA-256 hash calculations (for example, n = 5), where the calculation time of each SHA-256 hash is a time slice t.
Step (3), the results obtained in steps (1) and (2) are formed into a result set, and the result set is sent to the server.
Step (4), the server receives the result set and performs: 1. card maintenance feature identification: modeling with clustered big data; 2. simulator identification: judgment using environment parameter rules; 3. anti-counterfeiting identification: calculating the difference between the user behavior calculation time and the user behavior collection time, and judging whether the difference exceeds a preset time difference threshold.
For card maintenance feature identification (i.e., service feature identification), big data modeling can be adopted and data mining performed on the features; specifically, a big data clustering algorithm can be applied to features such as registration place, minimum consumption, package type, monthly rental, monthly ARPU value, calling duration, monthly consumption since network access, development channel, and activated IMEI, to judge whether card maintenance exists.
For simulator identification (i.e., terminal environment identification), whether the terminal is a simulator can be judged from the acquired software and hardware parameters of the user terminal; specifically, the user behavior is determined to be running on a simulator if any one of the following conditions is met: (1) the acquired IMEI number, local number or SIM card serial number of the phone is null; (2) the read phone brand information (brand, manufacturer, model) is null; (3) the CPU information is null; (4) the Mac address is empty; (5) the battery information does not change; (6) the baseband information is null; (7) the control group information cannot be read; (8) the process group information cannot be read; (9) the wlan driver is unknown or abnormal; (10) the number of sensors is 1; (11) fewer than 5 apps are pre-installed by the user.
For anti-counterfeiting identification (i.e., data authenticity identification), a user can be prevented from forging behavior data such as mouse, keyboard, gesture and gyroscope data, or device data such as IP, UA and device fingerprints, and machine posting, order brushing, advertisement posting and flash-sale sniping can be prevented. Specifically, the behavior predicted occurrence time point may be calculated from the acquired receiving time point, the calculation time period and the offset time period, and then compared with the behavior occurrence time point: on one hand, the behavior predicted occurrence time point should be later than the behavior occurrence time point; on the other hand, the period by which the behavior occurrence time point is earlier than the behavior predicted occurrence time point should not exceed the preset time difference threshold. The recognition result can thus be determined from the comparison result.
Step (5), a score (namely, the recognition result value) is obtained according to the weighted scoring model.
The scoring formula may be (P1 × W1 + P2 × W2 + P3 × W3); for the content of this step that is the same as in the embodiment shown in FIG. 5, reference may be made to the description of that embodiment, which is not repeated here.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Fig. 7 shows a block diagram of an apparatus 700 for intent recognition in a fifth embodiment of the present disclosure; as shown in fig. 7, the apparatus 700 includes:
a receiving module 701, configured to receive a user information set to be identified, where the user information set is sent by a user terminal;
the analysis module 702 is configured to analyze a user information set to be identified, obtain user service data, user terminal data, and user behavior time data collected by a user terminal, and obtain a receiving time point for receiving the user information to be identified;
the identification module 703 is configured to perform intention identification for the target intention according to the user service data, the user terminal data, the user behavior time data, and the receiving time point, and obtain an identification result value for the target intention;
and a determining module 704, configured to determine, according to the recognition result value, a degree to which the user information set to be recognized represents the target intention.
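The chaining of the four modules can be sketched structurally as follows. This is a hypothetical illustration: the class, method, and field names are inventions of this sketch, and the identification logic is a stand-in for the weighted scoring of the earlier embodiments.

```python
# Structural sketch of apparatus 700: receiving (701), analysis (702),
# identification (703), and determination (704). All names are hypothetical
# and the scoring logic is a placeholder.

from dataclasses import dataclass
import json
import time


@dataclass
class ParsedInfo:
    service_data: dict      # user service data
    terminal_data: dict     # user terminal data
    behavior_time: float    # user behavior time data
    receive_time: float     # receiving time point


class IntentRecognitionApparatus:
    def receive(self, raw: str):                               # module 701
        """Receive the user information set and record the receiving time."""
        return raw, time.time()

    def parse(self, raw: str, receive_time: float) -> ParsedInfo:  # module 702
        """Analyze the set into service, terminal, and behavior-time data."""
        data = json.loads(raw)
        return ParsedInfo(data["service"], data["terminal"],
                          data["behavior_time"], receive_time)

    def identify(self, info: ParsedInfo) -> float:             # module 703
        """Stand-in for the weighted scoring; returns a recognition value."""
        return 1.0 if info.terminal_data else 0.0

    def determine(self, score: float) -> str:                  # module 704
        """Map the recognition result value to a degree of target intent."""
        return "target intent" if score >= 0.5 else "not target intent"
```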
Other aspects of the embodiment of fig. 7 may be found in relation to other embodiments described above.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
FIG. 8 shows a block diagram of a computer device for intent recognition in an embodiment of the present disclosure. An electronic device 800 according to this embodiment of the invention is described below with reference to fig. 8. The electronic device 800 shown in fig. 8 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in fig. 8, the electronic device 800 is in the form of a general-purpose computing device. The components of the electronic device 800 may include, but are not limited to: at least one processing unit 810, at least one memory unit 820, and a bus 830 that couples the various system components, including the memory unit 820 and the processing unit 810.
The memory unit stores program code that is executable by the processing unit 810 to cause the processing unit 810 to perform the steps according to various exemplary embodiments of the present invention described in the "exemplary methods" section of this specification. For example, the processing unit 810 may perform the method shown in fig. 2.
The memory unit 820 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 8201 and/or a cache memory unit 8202, and may further include a read-only memory unit (ROM) 8203.
The memory unit 820 may also include a program/utility 8204 having a set (at least one) of program modules 8205, such program modules 8205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
Bus 830 may be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 900 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 800, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 850. Also, the electronic device 800 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network, such as the Internet) via the network adapter 860. As shown, the network adapter 860 communicates with the other modules of the electronic device 800 via the bus 830. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
As a program product for implementing the above method, a portable compact disc read-only memory (CD-ROM) containing the program code may be employed, and the program product may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.