CN111158829A - Operation rollback processing method and device - Google Patents

Operation rollback processing method and device

Info

Publication number
CN111158829A
Authority
CN
China
Prior art keywords
withdrawing
time
unit area
target
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911398682.5A
Other languages
Chinese (zh)
Inventor
金少博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
People's Happiness Co ltd
Original Assignee
Beijing Kingsoft Internet Security Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Kingsoft Internet Security Software Co Ltd
Priority to CN201911398682.5A
Publication of CN111158829A
Legal status: Pending

Abstract

The application provides an operation rollback processing method and device, wherein the method comprises the following steps: in the process of displaying continuously changing scene content on a screen, storing attribute change information corresponding to each unit area in the screen, wherein the attribute change information comprises: history time and history attribute information corresponding to the unit area identifier of each unit area; when a scene withdrawing event is monitored, analyzing the scene withdrawing event to determine target withdrawing time; inquiring attribute change information, and determining each target history attribute information corresponding to the target history time matched with the unit area identifier of each unit area and the target withdrawal time; and displaying the target content in each corresponding unit area according to each target history attribute information corresponding to the unit area identifier of each unit area. Therefore, the content displayed on the screen can be completely returned to the corresponding historical moment according to the operation returning processing, and the operation returning experience is improved.

Description

Operation rollback processing method and device
Technical Field
The present application relates to the field of vision processing technologies, and in particular, to an operation rollback processing method and apparatus.
Background
With the development of computer technology, various applications depending on computer technology have also been vigorously developed.
When a user performs operations through an application, operation rollback is often required because of operation errors and the like. In the related art, only the user's most recent operation is rolled back; this local rollback approach results in incomplete rollback in some scenes and degrades the user's rollback experience.
Disclosure of Invention
The application provides an operation rollback processing method and device, which are intended to solve the technical problem in the related art that rollback restores only part of the scene content, resulting in a poor rollback experience.
An embodiment of an aspect of the present application provides an operation rollback processing method, including: storing attribute change information corresponding to each unit area in a screen during a process of displaying continuously changing scene content on the screen, wherein the attribute change information includes: history time and history attribute information corresponding to the unit area identifier of each unit area; when a scene withdrawing event is monitored, analyzing the scene withdrawing event to determine target withdrawing time; inquiring the attribute change information, and determining each target history attribute information corresponding to the target history time matched with the unit area identifier of each unit area and the target withdrawal time; and displaying target content in each corresponding unit area according to each target historical attribute information corresponding to the unit area identifier of each unit area.
In addition, the operation rollback processing method according to the embodiment of the present application further includes the following additional technical features:
in a possible implementation manner of the embodiment of the present application, the storing attribute change information corresponding to each unit area in the screen includes: detecting whether a preset target control in the scene content is triggered or not; and if the target control is monitored to be triggered, acquiring current attribute information corresponding to the unit area identification of each unit area in the screen at the current moment, and storing the current attribute information into the attribute change information.
In a possible implementation manner of the embodiment of the present application, when a scene retraction event is monitored, analyzing the scene retraction event to determine a target retraction time includes: analyzing the scene withdrawing event to acquire a withdrawing requirement identifier and a withdrawing user identifier; determining withdrawal time according to the withdrawal requirement identifier; inquiring user registration information, and acquiring withdrawal upper limit time corresponding to the withdrawal user identification; judging whether the withdrawing time exceeds the withdrawing upper limit time or not, and if the withdrawing time exceeds the withdrawing upper limit time, determining the withdrawing upper limit time as the target withdrawing time; and if the withdrawing time does not exceed the withdrawing upper limit time, determining that the withdrawing time is the target withdrawing time.
In a possible implementation manner of the embodiment of the present application, when a scene retraction event is monitored, analyzing the scene retraction event to determine a target retraction time includes: when a plurality of scene withdrawing events are monitored simultaneously, analyzing each scene withdrawing event to obtain a plurality of withdrawing requirement identifications and a plurality of withdrawing user identifications; inquiring the level information, and acquiring a user level corresponding to each withdrawing user identifier; if all the user levels are the same, determining a plurality of withdrawing times corresponding to each withdrawing requirement identification, and determining the shortest withdrawing time in the withdrawing times as the target withdrawing time from the withdrawing times; and if all the user levels are different, determining the withdrawal time corresponding to the withdrawal requirement identifier of the user with the highest level as the target withdrawal time.
In a possible implementation manner of the embodiment of the present application, after the displaying the target content in the corresponding unit area, the method further includes: receiving a withdrawal cancellation event; judging whether the time difference between the receiving time of the withdrawing cancel event and the receiving time of the scene withdrawing event is within a preset range or not; if the time difference is within the preset range, displaying the content before receiving the scene retraction event in each corresponding unit area; and if the time difference exceeds the preset range, continuing to display the target content in the corresponding unit area.
An embodiment of an aspect of the present application provides an operation rollback processing apparatus, including: a storage module, configured to store attribute change information corresponding to each unit area in a screen in a process of displaying continuously changing scene content on the screen, where the attribute change information includes: history time and history attribute information corresponding to the unit area identifier of each unit area; the analysis module is used for analyzing the scene withdrawing event to determine target withdrawing time when the scene withdrawing event is monitored; the determining module is used for inquiring the attribute change information and determining each target historical attribute information corresponding to the target historical time matched with the unit area identifier of each unit area and the target withdrawing time; and the display module is used for displaying the target content in each corresponding unit area according to the target historical attribute information corresponding to the unit area identifier of each unit area.
In addition, the operation rollback processing device according to the embodiment of the present application further includes the following additional technical features:
in a possible implementation manner of the embodiment of the present application, the storage module is specifically configured to: detecting whether a preset target control in the scene content is triggered or not; and if the target control is monitored to be triggered, acquiring current attribute information corresponding to the unit area identification of each unit area in the screen at the current moment, and storing the current attribute information into the attribute change information.
In a possible implementation manner of the embodiment of the present application, the parsing module is specifically configured to: analyzing the scene withdrawing event to acquire a withdrawing requirement identifier and a withdrawing user identifier; determining withdrawal time according to the withdrawal requirement identifier; inquiring user registration information, and acquiring withdrawal upper limit time corresponding to the withdrawal user identification; judging whether the withdrawing time exceeds the withdrawing upper limit time or not, if so, determining that the withdrawing upper limit time is the target withdrawing time, and if not, determining that the withdrawing time is the target withdrawing time.
In a possible implementation manner of the embodiment of the present application, the parsing module is specifically configured to: when a plurality of scene withdrawing events are monitored simultaneously, analyzing each scene withdrawing event to obtain a plurality of withdrawing requirement identifications and a plurality of withdrawing user identifications; inquiring the level information, and acquiring a user level corresponding to each withdrawing user identifier; if all the user levels are the same, determining a plurality of withdrawing times corresponding to each withdrawing requirement identification, and determining the shortest withdrawing time in the withdrawing times as the target withdrawing time from the withdrawing times; and if all the user levels are different, determining the withdrawal time corresponding to the withdrawal requirement identifier of the user with the highest level as the target withdrawal time.
In a possible implementation manner of the embodiment of the present application, the method further includes: a receiving module, configured to receive a retraction canceling event; the judging module is used for judging whether the time difference between the receiving time of the withdrawing cancellation event and the receiving time of the scene withdrawing event is within a preset range or not; the display module is further configured to display, in the corresponding unit areas, content before the scene retraction event is received if it is known that the time difference is within the preset range; and the display module is further configured to continue to display the target content in the corresponding unit areas if the time difference is known to exceed the preset range.
Another embodiment of the present application provides an electronic device, including a processor and a memory; wherein the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to implement the operation rollback processing method according to the above embodiment.
Another embodiment of the present application provides a non-transitory computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the operation rollback processing method according to the above embodiment.
The technical scheme provided by the embodiment of the application at least has the following technical effects:
in the process of displaying continuously changing scene content on a screen, storing attribute change information corresponding to each unit area in the screen, wherein the attribute change information comprises: and when the scene retraction event is monitored, analyzing the scene retraction event to determine target retraction time, further inquiring attribute change information, determining each target history attribute information corresponding to the unit area identifier of each unit area and the target history time matched with the target retraction time, and finally displaying target content in each corresponding unit area according to each target history attribute information corresponding to the unit area identifier of each unit area. Therefore, the content displayed on the screen can be completely returned to the corresponding historical moment according to the operation returning processing, and the operation returning experience is improved.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow diagram of an operation rollback processing method according to one embodiment of the present application;
FIG. 2 is a schematic illustration of a unit area display according to one embodiment of the present application;
FIG. 3 is a flow diagram of an operation rollback processing method according to another embodiment of the present application;
FIG. 4 is a flow diagram of an operation rollback processing method according to yet another embodiment of the present application;
FIG. 5 is a schematic diagram of an operation rollback processing scenario according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an operation rollback processing apparatus according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an operation rollback processing apparatus according to another embodiment of the present application; and
FIG. 8 illustrates a block diagram of an exemplary electronic device suitable for implementing embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
An operation rollback processing method and apparatus according to an embodiment of the present application are described below with reference to the drawings.
Fig. 1 is a flowchart of an operation rollback processing method according to an embodiment of the present application. As shown in fig. 1, the method includes:
Step 101, in the process of displaying continuously changing scene content on a screen, storing attribute change information corresponding to each unit area in the screen, wherein the attribute change information includes: history time and history attribute information corresponding to the unit area identifier of each unit area.
It can be understood that, as shown in fig. 2, the screen in the embodiment of the present application is divided into a plurality of unit areas, and each unit area includes corresponding attribute information, where the attribute information includes display content and display parameters (display color, etc.) in the current unit area.
With continued reference to fig. 2, the attribute information of the unit area 1 is displayed as an apple, the color of the apple is red (indicated by black in the figure), and the background is white.
Specifically, in the process of displaying continuously changing scene content on the screen, for example while the UI of a game is changing or a video is playing, attribute change information corresponding to each unit area in the screen is stored. The attribute change information includes the history time and the history attribute information corresponding to the unit area identifier of each unit area, where the history time is either the display time of the current terminal device or the time counted by a timer started when the continuous change of the screen display begins, and the history attribute information corresponds to the attribute information at that time. The unit area identifier of each unit area may be a unit area number, a coordinate position, or the like.
It should be noted that, in different application scenarios, the manner of storing the attribute change information corresponding to each unit area in the screen is different, and the following example is given:
example one:
In this example, it is considered that in some scenes the scene content changes only after certain target controls are triggered; for example, in a game scene the scene changes only when the user triggers a corresponding game control. Therefore, it is detected whether a touch operation occurs on a preset target control in the scene content; if such an operation is monitored, the current attribute information corresponding to the unit area identifier of each unit area on the screen at the current time is acquired and stored into the attribute change information.
Example two:
in this example, the scene content is changed in real time, at this time, the attribute information under the unit area identifier corresponding to each frame is saved, and the current attribute information corresponding to the unit area identifier of each unit area in the scene content image of each frame is stored in the attribute change information.
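Both storage strategies in the examples above amount to appending time-stamped snapshots per unit area. A minimal Python sketch of this bookkeeping follows; all class, method, and field names are illustrative assumptions, since the patent prescribes no concrete data structure:

```python
class AttributeChangeStore:
    """Illustrative sketch of attribute change information storage.

    Each unit area identifier maps to an append-only, time-ordered
    list of (history_time, attribute_info) pairs.
    """

    def __init__(self):
        # unit_area_id -> [(history_time, attrs), ...]
        self._history = {}

    def record(self, unit_area_id, history_time, attrs):
        # Called per frame (example two) or whenever a preset target
        # control is triggered (example one).
        self._history.setdefault(unit_area_id, []).append((history_time, attrs))

    def snapshots(self, unit_area_id):
        # Return a copy so callers cannot mutate the stored history.
        return list(self._history.get(unit_area_id, []))
```

For example, recording two snapshots for one unit area, `store.record("area_1", 10.0, {"content": "apple", "color": "red"})` followed by a later record at time 20.0, leaves a two-entry time-ordered history for `"area_1"` that a later rollback can query.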
Step 102, when a scene withdrawing event is monitored, analyzing the scene withdrawing event to determine the target withdrawing time.
The scene retraction event may be triggered by a user through voice information or triggered by triggering a corresponding retraction control.
When the scene withdrawing event is monitored, the scene withdrawing event is analyzed to determine the target withdrawing time, where the target withdrawing time refers to the time corresponding to the historical scene content to be restored. In this embodiment, the scene withdrawing event carries the corresponding target withdrawing time, which may be determined from the duration or number of times the corresponding withdrawing control is triggered, or from a time keyword contained in the user's voice information.
In one embodiment of the present application, it is considered that in some applications different users have different withdrawal time limits in order to guarantee the rights of certain users; for example, in a game scene, to provide better service to paying users, non-paying users may be subject to a stricter withdrawal time limit.
To meet this requirement, in one embodiment of the present application, as shown in fig. 3, the step 102 includes:
Step 201, analyzing a scene retraction event to obtain a retraction requirement identifier and a retraction user identifier.
Step 202, determining the withdrawal time according to the withdrawal requirement identifier.
It can be understood that, in this embodiment, the retraction time is determined according to a retraction requirement identifier, where the retraction requirement identifier may be the aforementioned trigger time length or the number of times of the target control, and may also be a time length keyword in the voice control information.
In addition, the above-mentioned retraction user identifier may be obtained from fingerprint information detected when the user touches the display screen, from voiceprint information extracted from the user's voice information, or from the user's login account in the current scene.
Step 203, querying preset user registration information, and acquiring the withdrawal upper limit time corresponding to the withdrawal user identifier.
Specifically, user registration information is obtained in advance, the user registration information includes a corresponding relationship between a user identifier and the withdrawal upper limit time, preset user registration information is inquired, and the withdrawal upper limit time corresponding to the withdrawal user identifier is obtained.
Step 204, judging whether the withdrawing time exceeds the withdrawing upper limit time, and if the withdrawing time exceeds the withdrawing upper limit time, determining the withdrawing upper limit time as the target withdrawing time.
Step 205, if the withdrawing time does not exceed the withdrawing upper limit time, determining the withdrawing time as the target withdrawing time.
Specifically, whether the withdrawing time exceeds the withdrawing upper limit time is judged, if the withdrawing upper limit time is known to be exceeded, the withdrawing upper limit time is determined as the target withdrawing time, and if the withdrawing upper limit time is not known to be exceeded, the withdrawing time is determined as the target withdrawing time.
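The clamping logic of steps 201-205 can be sketched in a few lines. The registration table, user identifiers, and the choice of seconds as the time unit are illustrative assumptions, not part of the patent:

```python
# Hypothetical user registration information: withdrawal user
# identifier -> the upper limit (seconds) this user may roll back.
USER_REGISTRATION = {
    "paying_user": 300.0,
    "free_user": 30.0,
}


def determine_target_withdrawal_time(withdrawal_time, user_id):
    """Clamp the requested withdrawal time to the upper limit recorded
    for this user in the registration information (steps 203-205)."""
    upper_limit = USER_REGISTRATION[user_id]
    # Exceeding the limit -> the limit itself becomes the target time.
    return upper_limit if withdrawal_time > upper_limit else withdrawal_time
```

Under these assumptions, a free user requesting a 120-second rollback is clamped to 30 seconds, while a paying user's 120-second request passes through unchanged.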
In one embodiment of the present application, some scene contents are multi-user scenes, such as a multiplayer game scene, and thus, scene retraction events of multiple users may be received simultaneously.
Specifically, as shown in fig. 4, the step 102 includes:
Step 301, when a plurality of scene retraction events are monitored simultaneously, analyzing each scene retraction event to obtain a plurality of retraction requirement identifiers and a plurality of retraction user identifiers.
Specifically, after a plurality of scene retraction events are monitored simultaneously, each scene retraction event is analyzed to obtain a plurality of retraction requirement identifiers and a plurality of retraction user identifiers.
For the manner of obtaining the plurality of retraction requirement identifiers and the plurality of retraction user identifiers, reference may be made to the above embodiments.
Step 302, querying the level information, and obtaining a user level corresponding to each withdrawn user identifier.
It can be understood that user level information is pre-constructed, and the corresponding relationship between the user identifier and the user level is stored in the user level information, so that the preset level information is queried to obtain the user level corresponding to each withdrawn user identifier.
Step 303, if all the user levels are the same, determining a plurality of withdrawing times corresponding to each withdrawing requirement identifier, and determining the shortest withdrawing time in the withdrawing times as the target withdrawing time from the plurality of withdrawing times.
Specifically, if it is known that all the user levels are the same, a plurality of withdrawing times corresponding to each withdrawing requirement identifier are determined, and the shortest withdrawing time among the plurality of withdrawing times is determined as the target withdrawing time.
Step 304, if all the user levels are different, determining the withdrawal time corresponding to the withdrawal requirement identifier of the user with the highest level as the target withdrawal time.
Specifically, if it is known that all user levels are different, the withdrawal time corresponding to the withdrawal requirement identifier of the user with the highest level is determined as the target withdrawal time.
Of course, in the actual execution process, other manners may also be used to determine the target retraction time, for example, all the retraction times corresponding to the retraction requirement identifiers of the users are sent to the user with the highest user level, and the user with the highest user level selects and determines the target retraction time.
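The resolution rules of steps 301-304 reduce to a small function. A sketch in Python; the numeric user levels and the shape of the request and level tables are illustrative assumptions (the patent leaves how levels are encoded unspecified):

```python
def resolve_concurrent_withdrawals(requests, user_levels):
    """Pick one target withdrawal time from concurrent requests.

    requests:    withdrawal user identifier -> requested withdrawal time
    user_levels: user identifier -> numeric level (higher = more privileged)

    If every requesting user has the same level, the shortest
    withdrawal time wins (step 303); otherwise the highest-level
    user's request wins (step 304).
    """
    levels = {user_levels[uid] for uid in requests}
    if len(levels) == 1:
        return min(requests.values())
    top_user = max(requests, key=lambda uid: user_levels[uid])
    return requests[top_user]
```

For example, two same-level users requesting 10-second and 5-second rollbacks resolve to 5 seconds, whereas if the first user outranks the second, the 10-second request wins.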
Step 103, inquiring the attribute change information, and determining each target history attribute information corresponding to the target history time matched with the unit area identifier of each unit area and the target withdrawal time.
Specifically, the attribute change information is queried, and the target history attribute information corresponding to the target history time at which the unit area identifier of each unit area matches the target retraction time is determined, that is, the target history attribute information at the corresponding history time is obtained.
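This query is, in essence, a "latest snapshot not later than the target time" lookup per unit area. A minimal sketch, assuming (as an illustration, not per the patent) that the attribute change information is held as a plain dict mapping each unit area identifier to a time-ordered list of (history time, attributes) pairs:

```python
import bisect


def query_target_history_attributes(attribute_change_info, target_time):
    """For each unit area, pick the latest snapshot whose history
    time does not exceed the target withdrawal time (step 103)."""
    result = {}
    for area_id, snaps in attribute_change_info.items():
        times = [t for t, _ in snaps]
        # Index of the first snapshot strictly after target_time.
        i = bisect.bisect_right(times, target_time)
        if i:  # skip areas with no snapshot at or before target_time
            result[area_id] = snaps[i - 1][1]
    return result
```

With snapshots for `"area_1"` at times 10.0 and 20.0 and for `"area_2"` at 15.0, a target time of 12.0 returns only `"area_1"`'s first snapshot, while a target time of 20.0 returns the later snapshot for both areas.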
Step 104, displaying the target content in each corresponding unit area according to each target historical attribute information corresponding to the unit area identifier of each unit area.
Specifically, after the target historical attribute information is acquired, the target content is displayed in each corresponding unit area according to each target historical attribute information corresponding to the unit area identifier of each unit area, so that all the unit areas return to the historical display state, and the rollback experience of the user is improved.
Considering that in some application scenarios the user may wish to view the scene content at the historical time without immediately discarding the current operations, in an embodiment of the present application, as shown in fig. 5, after the target content to be displayed in each unit area is determined, a small window may be shown in the form of a preview image in a relevant area of the display screen to present the target content of each unit area; when the user double-clicks the small window, the screen returns to the scene content at the historical time.
In some scenarios a user may cancel a previous rollback operation in order to return to the current scene content. That is, in an embodiment of the present application, after the target content is displayed in the corresponding unit areas, a rollback cancellation event may also be received, where the cancellation event may be triggered by voice or by clicking a corresponding control. It is then judged whether the time difference between the receiving time of the cancellation event and the receiving time of the scene rollback event is within a preset range, where the preset range may be calibrated according to actual needs. If the time difference is within the preset range, the content displayed before the scene rollback event was received is shown again in the corresponding unit areas; if the time difference exceeds the preset range, the target content continues to be displayed in the corresponding unit areas. That is, in this embodiment the rollback cancellation can be executed only within the preset range; accordingly, the attribute information of the unit areas from the rollback time to the current time needs to be cached only within the preset range and can be deleted once that time has passed, relieving cache pressure.
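The cancellation check above comes down to a single comparison. A sketch, assuming (illustratively) that event timestamps and the preset range are expressed in seconds:

```python
def should_restore_pre_rollback_content(cancel_received_at,
                                        rollback_received_at,
                                        preset_range):
    """Return True if the rollback cancellation event arrived within
    `preset_range` seconds of the scene rollback event, i.e. the
    pre-rollback content should be shown again; otherwise the target
    content stays on screen, and cached attribute information outside
    the range can be dropped to relieve cache pressure."""
    return cancel_received_at - rollback_received_at <= preset_range
```

For instance, with a 10-second preset range, a cancellation 5 seconds after the rollback restores the pre-rollback content, while one 20 seconds later is ignored.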
To sum up, in the operation rollback processing method according to the embodiment of the present application, in the process of displaying continuously changing scene content on the screen, attribute change information corresponding to each unit area in the screen is stored, where the attribute change information includes: and when the scene retraction event is monitored, analyzing the scene retraction event to determine target retraction time, further inquiring attribute change information, determining each target history attribute information corresponding to the unit area identifier of each unit area and the target history time matched with the target retraction time, and finally displaying target content in each corresponding unit area according to each target history attribute information corresponding to the unit area identifier of each unit area. Therefore, the content displayed on the screen can be completely returned to the corresponding historical moment according to the operation returning processing, and the operation returning experience is improved.
In order to implement the above embodiments, the present application further provides an operation rollback processing apparatus. Fig. 6 is a schematic structural diagram of an operation rollback processing apparatus according to an embodiment of the present application. As shown in fig. 6, the operation rollback processing apparatus includes: a storage module 100, a parsing module 200, a determination module 300, and a display module 400, wherein,
a storage module 100, configured to store attribute change information corresponding to each unit area in a screen in a process of displaying continuously changing scene content on the screen, where the attribute change information includes: history time and history attribute information corresponding to the unit area identifier of each unit area;
the analysis module 200 is configured to, when a scene retraction event is monitored, analyze the scene retraction event to determine a target retraction time;
the determining module 300 is configured to query the attribute change information, and determine each target history attribute information corresponding to the target history time at which the unit area identifier of each unit area matches the target retraction time;
a display module 400, configured to display the target content in each corresponding unit area according to each target history attribute information corresponding to the unit area identifier of each unit area.
In an embodiment of the present application, the storage module 100 is specifically configured to:
detecting whether a target control preset in scene content is triggered;
and if the target control is monitored to be triggered, acquiring current attribute information corresponding to the unit area identification of each unit area in the screen at the current moment, and storing the current attribute information into attribute change information.
In an embodiment of the present application, the parsing module 200 is specifically configured to:
analyzing a scene withdrawing event to acquire a withdrawing demand identifier and a withdrawing user identifier;
determining withdrawal time according to the withdrawal requirement identifier;
inquiring user registration information, and acquiring withdrawal upper limit time corresponding to a withdrawal user identifier;
and judging whether the withdrawing time exceeds the withdrawing upper limit time, if so, determining the withdrawing upper limit time as the target withdrawing time, and if not, determining the withdrawing time as the target withdrawing time.
In an embodiment of the present application, the analysis module 200 is further specifically configured to:
when a plurality of scene withdrawing events are monitored simultaneously, analyzing each scene withdrawing event to obtain a plurality of withdrawal requirement identifiers and a plurality of withdrawing-user identifiers;
querying the level information to acquire the user level corresponding to each withdrawing-user identifier;
if all the user levels are the same, determining the withdrawal time corresponding to each withdrawal requirement identifier, and selecting the shortest of these withdrawal times as the target withdrawal time;
and if the user levels are not all the same, determining the withdrawal time corresponding to the withdrawal requirement identifier of the highest-level user as the target withdrawal time.
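The arbitration rule above can be sketched directly. The tuple shape of `events` and the `user_levels` mapping are illustrative assumptions, not structures named in the text.

```python
def resolve_concurrent_retractions(events, user_levels):
    """Pick one target withdrawal time from simultaneous retraction events.

    events:      list of (withdrawal_time, user_id) pairs
    user_levels: user_id -> level (higher number = higher level)
    """
    levels = {user_levels[user_id] for _, user_id in events}
    if len(levels) == 1:
        # All requesters share the same level: the shortest withdrawal wins.
        return min(t for t, _ in events)
    # Levels differ: the highest-level user's requested time wins.
    winning_time, _ = max(events, key=lambda e: user_levels[e[1]])
    return winning_time
```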
In an embodiment of the present application, as shown in FIG. 7, on the basis of FIG. 6, the apparatus further comprises a receiving module 500 and a determining module 600, wherein,
the receiving module 500 is configured to receive a retraction-canceling event;
the determining module 600 is configured to determine whether the time difference between the receiving time of the retraction-canceling event and the receiving time of the scene retraction event is within a preset range;
in this embodiment, the display module 400 is further configured to display, in each corresponding unit area, the content shown before the scene retraction event was received, if the time difference is within the preset range;
the display module 400 is further configured to continue displaying the target content in each corresponding unit area if the time difference is outside the preset range.
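The cancel-window check is a simple predicate. The 5-second default below is an illustrative assumption: the text only requires that the difference between the two receiving times fall within some preset range.

```python
def should_undo_retraction(cancel_received_at, retraction_received_at,
                           window=5.0):
    """Decide whether a retraction-canceling event restores the pre-rollback
    content (True) or leaves the rolled-back target content on screen (False).

    Times are timestamps in seconds; `window` is the preset range (assumed)."""
    return abs(cancel_received_at - retraction_received_at) <= window
```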
It should be noted that the foregoing explanation of the operation rollback processing method also applies to the operation rollback processing apparatus of the embodiment of the present application; its implementation principle is similar and will not be described again here.
To sum up, the operation rollback processing apparatus according to the embodiment of the present application stores, while continuously changing scene content is displayed on a screen, attribute change information corresponding to each unit area in the screen, where the attribute change information includes the history times and history attribute information corresponding to the unit area identifier of each unit area. When a scene retraction event is monitored, the apparatus parses the scene retraction event to determine a target retraction time, queries the attribute change information to determine, for the unit area identifier of each unit area, the target history attribute information corresponding to the target history time that matches the target retraction time, and finally displays the target content in each corresponding unit area according to that target history attribute information. Therefore, the content displayed on the screen can be fully returned to the corresponding historical moment by the operation rollback processing, improving the operation rollback experience.
In order to implement the foregoing embodiments, an embodiment of the present application further provides an electronic device, including a processor and a memory;
wherein the processor, by reading the executable program code stored in the memory and running the program corresponding to that code, implements the operation rollback processing method described in the foregoing embodiments.
FIG. 8 illustrates a block diagram of an exemplary electronic device suitable for implementing embodiments of the present application. The electronic device 12 shown in FIG. 8 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in FIG. 8, electronic device 12 is embodied in the form of a general-purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 28 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 8, and commonly referred to as a "hard drive"). Although not shown in FIG. 8, a disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a compact disc read-only memory (CD-ROM), a digital versatile disc read-only memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the application.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28. Such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may comprise an implementation of a network environment. Program modules 42 generally perform the functions and/or methodologies of the embodiments described herein.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., a network card, a modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, electronic device 12 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of electronic device 12 via the bus 18. It should be understood that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
The processing unit 16 executes programs stored in the system memory 28 to perform various functional applications and data processing, for example, implementing the methods mentioned in the foregoing embodiments.
In order to implement the foregoing embodiments, the present application also proposes a non-transitory computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the operation rollback processing method described in the foregoing embodiments.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.

Although embodiments of the present application have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present application; variations, modifications, substitutions, and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

CN201911398682.5A | Priority date: 2019-12-30 | Filing date: 2019-12-30 | Operation rollback processing method and device | Pending | Published as CN111158829A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201911398682.5A | 2019-12-30 | 2019-12-30 | Operation rollback processing method and device (en)


Publications (1)

Publication Number | Publication Date
CN111158829A (en) | 2020-05-15

Family

ID=70559619

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201911398682.5A (Pending; published as CN111158829A) | Operation rollback processing method and device | 2019-12-30 | 2019-12-30

Country Status (1)

Country | Link
CN (1) | CN111158829A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN113885342A (en)* | 2021-10-09 | 2022-01-04 | 青岛海尔科技有限公司 | Method and apparatus for scene execution rollback
CN114849241A (en)* | 2022-04-22 | 2022-08-05 | 网易(杭州)网络有限公司 | An information processing method, device, computer equipment and storage medium
WO2024093937A1 (en)* | 2022-10-31 | 2024-05-10 | 北京字跳网络技术有限公司 | Method and apparatus for viewing audio-visual content, device, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101040250A (en)* | 2004-11-12 | 2007-09-19 | 佳思腾软件公司 | Data processing device and data processing method
CN107050850A (en)* | 2017-05-18 | 2017-08-18 | 腾讯科技(深圳)有限公司 | The recording and back method of virtual scene, device and playback system
US20180004547A1 (en)* | 2016-06-30 | 2018-01-04 | Microsoft Technology Licensing, Llc. | Assistive technology notifications for relevant metadata changes in a document
CN108211350A (en)* | 2017-12-07 | 2018-06-29 | 网易(杭州)网络有限公司 | Information processing method, electronic equipment and storage medium
CN108228300A (en)* | 2018-01-02 | 2018-06-29 | 武汉斗鱼网络科技有限公司 | The method and device that a kind of control content refreshes in real time
WO2018130135A1 (en)* | 2017-01-13 | 2018-07-19 | 腾讯科技(深圳)有限公司 | Method and device for controlling way-finding of simulation object, and server
CN109218522A (en)* | 2018-08-24 | 2019-01-15 | 北京金山安全软件有限公司 | Function area processing method and device in application, electronic equipment and storage medium
CN110052030A (en)* | 2019-04-26 | 2019-07-26 | 腾讯科技(深圳)有限公司 | Vivid setting method, device and the storage medium of virtual role



Similar Documents

Publication | Title
CN108073519B (en) | Test case generation method and device
CN108009303B (en) | Search method and device based on voice recognition, electronic equipment and storage medium
CN111158829A (en) | Operation rollback processing method and device
CN106874520B (en) | Webpage loading method and device and electronic equipment
CN106998494B (en) | Video recording method and related device
CN109376256B (en) | Image searching method and device
CN110300327B (en) | Game client performance analysis method, device, terminal and storage medium
CN114461691B (en) | Control method and device of state machine, electronic equipment and storage medium
CN105183302B (en) | A kind of method and terminal of control application
CN109310354A (en) | The rendering method and device of ST event in electrocardiogram
CN104199693B (en) | Method, device and terminal for obtaining boot time
CN110265122A (en) | Image processing method, device, equipment and storage medium based on endoscopic system
EP3086205A1 (en) | Method and apparatus for identifying operation event
CN112597931A (en) | Screen state detection method and device, electronic equipment, server and storage medium
CN109857907B (en) | Video positioning method and device
CN113641286A (en) | Screen capturing method, electronic equipment and computer storage medium
CN109271228A (en) | Interface function recognition methods, device and the electronic equipment of application
CN108833830B (en) | Method and device for displaying monitoring picture in monitoring system
CN110286990B (en) | User interface display method, device, equipment and storage medium
CN110876086A (en) | Bullet screen generation adjusting method, device, equipment and storage medium
CN112613999A (en) | Screen state recognition method and device, electronic equipment, server and storage medium
CN111151002A (en) | Touch aiming method and device
CN111124109A (en) | Interactive mode selection method, intelligent terminal, equipment and storage medium
CN110543582A (en) | Image-based query method and device
CN116901866A (en) | Vehicle control method, device, electronic equipment, vehicle and storage medium

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
TA01 | Transfer of patent application right

Effective date of registration: 2022-08-16

Address after: Texas, USA
Applicant after: People's Happiness Co., Ltd.

Address before: 100085 East District, Second Floor, 33 Xiaoying West Road, Haidian District, Beijing
Applicant before: BEIJING KINGSOFT INTERNET SECURITY SOFTWARE Co., Ltd.

WD01 | Invention patent application deemed withdrawn after publication

Application publication date: 2020-05-15

