
Method and apparatus for testing applications

Info

Publication number
CN112445697B
CN112445697B
Authority
CN
China
Prior art keywords
user operation
test
log
buried point
result data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910825794.8A
Other languages
Chinese (zh)
Other versions
CN112445697A (en)
Inventor
谢俊英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Zhenshi Information Technology Co Ltd
Original Assignee
Beijing Jingdong Zhenshi Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Zhenshi Information Technology Co Ltd
Priority to CN201910825794.8A
Publication of CN112445697A
Application granted
Publication of CN112445697B
Legal status: Active
Anticipated expiration


Abstract

Embodiments of the present disclosure disclose methods and apparatus for testing applications. One embodiment of the method comprises the following steps: in response to detecting a user operation on the application, generating a buried point log corresponding to the user operation through a log buried point configured in the application in advance, wherein the buried point log includes actual result data corresponding to the user operation; determining, from a predetermined test case set, a test case matching the buried point log, wherein the test cases in the test case set include expected result data; and comparing, based on the buried point log and the test case matching the buried point log, the actual result data corresponding to the user operation with the expected result data to generate a test result. This embodiment realizes testing in the online real environment, generates no garbage data, and helps improve the coverage of automated testing.

Description

Method and apparatus for testing applications
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular, to a method and apparatus for testing applications.
Background
Most existing automated testing automates actual manual operations and depends on a test environment and a pre-release environment. In general, the test environment and pre-production environment involved in automated testing are not the real environment after the application goes online. After the application goes online, there may still be many untested functions, and whether these untested functions are abnormal is often difficult to predict in advance.
If the online real environment is used as the test environment and testers test with simulated data, garbage data may be generated during testing.
Disclosure of Invention
The present disclosure proposes methods and apparatus for testing applications.
In a first aspect, embodiments of the present disclosure provide a method for testing an application, the method comprising: in response to detecting a user operation on the application, generating a buried point log corresponding to the user operation through a log buried point configured in the application in advance, wherein the buried point log includes actual result data corresponding to the user operation; determining, from a predetermined test case set, a test case matching the buried point log, wherein the test cases in the test case set include expected result data; and comparing, based on the buried point log and the test case matching the buried point log, the actual result data corresponding to the user operation with the expected result data to generate a test result.
In some embodiments, the user operation is used to operate on data in a target database; and comparing, based on the buried point log and the test case matching the buried point log, the actual result data corresponding to the user operation with the expected result data to generate a test result includes: comparing the actual result data corresponding to the user operation with the expected result data to generate a comparison result; in response to the comparison result indicating that the actual result data corresponding to the user operation is inconsistent with the expected result data, determining whether abnormal log information for the user operation has been captured; in response to the abnormal log information being captured, generating a test result characterizing test case failure; in response to the comparison result indicating that the actual result data corresponding to the user operation is consistent with the expected result data, determining whether the data corresponding to the user operation stored in the target database is correct; in response to the data corresponding to the user operation stored in the target database being correct, generating a test result characterizing test case success; and in response to the data corresponding to the user operation stored in the target database being incorrect, generating a test result characterizing test case failure.
In some embodiments, the method further comprises: storing the buried point log in a log file; and cleaning the buried point logs stored in the log file according to the preset log cleaning time.
In some embodiments, determining test cases matching the buried point log from a predetermined set of test cases includes: from a predetermined set of test cases, a test case that includes the same key character as the buried point log is determined as a test case that matches the buried point log.
In some embodiments, the method further comprises: running the test cases in the test case set for which no matching buried point log was generated within a first time period, to obtain test results for the test cases for which no matching buried point log was generated.
In some embodiments, the method further comprises: and generating the test coverage rate corresponding to the preset second time period based on the test case set and the buried point log generated in the preset second time period.
In a second aspect, embodiments of the present disclosure provide an apparatus for testing an application, the apparatus comprising: a first generation unit configured to generate a buried point log corresponding to a user operation through a log buried point configured in the application in advance in response to detection of the user operation on the application, wherein the buried point log includes actual result data corresponding to the user operation; a determining unit configured to determine a test case matching the buried point log from a predetermined set of test cases, wherein the test cases in the set of test cases include expected result data; the comparison unit is configured to compare actual result data corresponding to user operation with expected result data based on the buried point log and the test cases matched with the buried point log, and generate a test result.
In some embodiments, the user operation is used to operate on data in a target database; and the comparison unit is further configured to: compare the actual result data corresponding to the user operation with the expected result data to generate a comparison result; in response to the comparison result indicating that the actual result data corresponding to the user operation is inconsistent with the expected result data, determine whether abnormal log information for the user operation has been captured; in response to the abnormal log information being captured, generate a test result characterizing test case failure; in response to the comparison result indicating that the actual result data corresponding to the user operation is consistent with the expected result data, determine whether the data corresponding to the user operation stored in the target database is correct; in response to the data being correct, generate a test result characterizing test case success; and in response to the data being incorrect, generate a test result characterizing test case failure.
In some embodiments, the apparatus further comprises: a storage unit configured to store the buried point log in a log file; the cleaning unit is configured to clean the buried point logs stored in the log file according to the preset log cleaning time.
In some embodiments, the determining unit is further configured to: from a predetermined set of test cases, a test case that includes the same key character as the buried point log is determined as a test case that matches the buried point log.
In some embodiments, the apparatus further comprises: a running unit configured to run the test cases in the test case set for which no matching buried point log was generated within the first time period, to obtain test results for the test cases for which no matching buried point log was generated.
In some embodiments, the apparatus further comprises: the second generation unit is configured to generate a test coverage rate corresponding to a preset second time period based on the test case set and the buried point log generated in the preset second time period.
In a third aspect, embodiments of the present disclosure provide an electronic device for testing applications, comprising: one or more processors; and a storage device having one or more programs stored thereon, which when executed by the one or more processors cause the one or more processors to implement a method as in any of the embodiments of the method for testing an application described above.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable medium for testing an application, having stored thereon a computer program which, when executed by a processor, implements a method as in any of the embodiments of the method for testing an application described above.
According to the method and apparatus for testing an application provided by embodiments of the present disclosure, in the case where a user operation on the application is detected, a buried point log corresponding to the user operation is generated through a log buried point configured in the application in advance, the buried point log including actual result data corresponding to the user operation. Then, a test case matching the buried point log is determined from a predetermined test case set, the test cases in the test case set including expected result data. Finally, based on the buried point log and the matching test case, the actual result data corresponding to the user operation is compared with the expected result data to generate a test result. Thus, testing can be realized in the online real environment, no garbage data is generated, and the coverage of automated testing is improved.
Drawings
Other features, objects and advantages of the present disclosure will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings:
FIG. 1 is an exemplary system architecture diagram in which an embodiment of the present disclosure may be applied;
FIG. 2 is a flow chart of one embodiment of a method for testing an application according to the present disclosure;
FIG. 3 is a schematic illustration of one application scenario of a method for testing an application according to the present disclosure;
FIG. 4 is a flow chart of yet another embodiment of a method for testing an application according to the present disclosure;
FIG. 5 is a schematic structural view of one embodiment of an apparatus for testing applications according to the present disclosure;
FIG. 6 is a schematic diagram of a computer system suitable for use with a server implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
FIG. 1 illustrates an exemplary system architecture 100 of an embodiment of a method for testing an application or an apparatus for testing an application to which embodiments of the present disclosure may be applied.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
A user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or transmit data and the like. For example, after a user operates a client application installed in the terminal devices 101, 102, 103, the terminal devices 101, 102, 103 may send an HTTP (HyperText Transfer Protocol) request corresponding to the operation to the server 105, so that the server 105 detects the user's operation on the application.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices having a display screen and supporting page browsing, including but not limited to smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above. They may be implemented as multiple pieces of software or software modules (e.g., software or software modules for providing distributed services) or as a single piece of software or software module. No specific limitation is made here.
The server 105 may be a server providing various services, such as a background server providing support for pages displayed on the terminal devices 101, 102, 103. The background server may analyze and process received data such as HTTP requests so as to support the running of the applications installed in the terminal devices 101, 102, and 103, and may further generate a buried point log corresponding to the user operation through the log buried points configured in the applications in advance, and generate a test result based on the buried point log and the test case matching the buried point log. As an example, the server 105 may be a cloud server or a physical server.
It should be noted that, the server may be hardware, or may be software. When the server is hardware, the server may be implemented as a distributed server cluster formed by a plurality of servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules (e.g., software or software modules for providing distributed services), or as a single software or software module. The present invention is not particularly limited herein.
It should also be noted that the method for testing an application provided by the embodiments of the present disclosure is typically performed by a server. Accordingly, the various parts (e.g., individual units, sub-units, modules, sub-modules) that the apparatus for testing applications includes are typically disposed in a server.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to fig. 2, a flow 200 of one embodiment of a method for testing an application according to the present disclosure is shown. The method for testing an application comprises the steps of:
In step 201, in response to detecting a user operation on the application, a buried point log corresponding to the user operation is generated through a log buried point configured in the application in advance.
In the present embodiment, in the case where a user operation on the application is detected, an execution body of the method for testing an application (for example, the server shown in fig. 1) may generate a buried point log corresponding to the user operation through a log buried point configured in the application in advance. The buried point log includes actual result data corresponding to the user operation.
The application may be a series of sets of computer data and instructions organized in a particular order, among others. For example, the application may be system software, application software (e.g., shopping software), web site application, and the like. The user operation may be an operation performed on the application by a user of the application through a terminal device in which the application is installed. For example, user operations may include, but are not limited to: a collect product operation, a buy product operation, etc.
It will be appreciated that the log buried points may be configured by a technician in the program code of the application during the development phase. After development is completed and the application is deployed online, the user may download the application to a user terminal and interact with the execution body through the user terminal in which the application is installed. During this interaction, user operations cause the program code of the application to run, and when a log buried point is configured in the executed program code, the execution body can generate the buried point log corresponding to the user operation through the log buried point configured in the application in advance.
As an example, if the user operation indicates an order payment, then in the event that the user's payment succeeds, the log buried point may be log.info("autotest: order xxxx payment success!"); in the event that the user's payment fails, the log buried point may be log.info("autotest: order xxxx payment failure!"). Thus, in the case of a successful payment, the execution body can generate the buried point log "order xxxx payment success!"; in the case of a payment failure, the buried point log "order xxxx payment failure!". The actual result data corresponding to the user operation included in the buried point log "order xxxx payment success!" may be "payment success"; the actual result data corresponding to the user operation included in the buried point log "order xxxx payment failure!" may be "payment failure".
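As an illustrative, non-limiting sketch (assuming log4j 1.x; the class, method, and order identifier are invented for illustration), such a log buried point might be written as follows:

    import org.apache.log4j.Logger;

    public class OrderPaymentService {
        private static final Logger log = Logger.getLogger(OrderPaymentService.class);

        // Hypothetical payment entry point; the buried point records the actual
        // result data ("payment success" / "payment failure") for the user operation.
        public void pay(String orderId) {
            boolean success = doPay(orderId); // actual payment logic, omitted here
            if (success) {
                log.info("autotest: order " + orderId + " payment success!");
            } else {
                log.info("autotest: order " + orderId + " payment failure!");
            }
        }

        private boolean doPay(String orderId) {
            return true; // placeholder for the real payment flow
        }
    }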
As an example, the above-described log buried points may include, but are not limited to, at least one of: log buried points for front-end (Web) function interfaces, log buried points for interface and Message Queue (MQ) code, exception log buried points, and the like. The log buried points of the front-end function interface can be used to collect statistics including, but not limited to, the following information: the touch rate of each screen, the turn-back rate of each screen, the stay time on each screen, and the like. The buried points for interface and message queue code can be implemented with log4j (an open-source logging package) by configuring a properties file that sets the destination, format, and the like of the log; in addition, the buried point logs corresponding to these buried points can be printed by date. When an exception log occurs within the buried point range of a configured test case, the abnormal log information can be captured.
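A minimal sketch of such a properties configuration, assuming log4j 1.x with a DailyRollingFileAppender so that the buried point logs are printed by date; the file path and patterns are assumptions:

    log4j.rootLogger=INFO, buriedPoint
    log4j.appender.buriedPoint=org.apache.log4j.DailyRollingFileAppender
    log4j.appender.buriedPoint.File=/var/log/app/buried-point.log
    log4j.appender.buriedPoint.DatePattern='.'yyyy-MM-dd
    log4j.appender.buriedPoint.layout=org.apache.log4j.PatternLayout
    log4j.appender.buriedPoint.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %p %m%n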
In practice, a plurality of log embedded points can be set for each function of the application, and in general, the greater the number of the log embedded points, the more embedded point logs can be obtained, so as to facilitate more comprehensive testing of the application through subsequent steps.
Step 202, determining a test case matched with the buried point log from a predetermined test case set.
In this embodiment, the execution body may determine a test case matching the buried point log from a predetermined test case set. Wherein the test cases in the test case set include expected result data. The desired result data may characterize a desired result of a user operation as predetermined by a technician. As an example, when a user operation indicates a payment order, the expected result data included in the test case corresponding to the user operation may be data indicating that payment was successful.
Here, the test case set may be set for each of a plurality of flows. Each flow may correspond to multiple test cases. As an example, if the application is shopping software, the test case set may be set for the following flow: a return process, a repair new process, a pick-up process, a payment process, a purchase process, and the like. Wherein each flow may implement one or more functions. Each function may set one or more test cases accordingly.
As an example, a technician can set a test case for each program statement, or can set a test case for each program branch in an application program, so that a more comprehensive test is realized, and the test coverage rate is improved.
For example, the predetermined test case set may be stored in a file, and the file may be stored in a pre-specified location. Thus, after the "java.io.IOException" and "java.io.RandomAccessFile" development packages are imported, the file can be read in real time from the stored location using the seek method of the RandomAccessFile class. Key characters of a test case are then parsed; for example, the key characters "autotest: sales outlet order xxxxxx ex-warehouse operation success!" identify a test case for the ex-warehouse operation. The execution body can then compare actual result data with expected result data by intercepting the key characters.
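A minimal sketch of such real-time reading, assuming the file path and the "autotest:" key-character prefix; the seek method lets each read continue from the offset where the previous read stopped:

    import java.io.IOException;
    import java.io.RandomAccessFile;

    public class BuriedPointLogTailer {
        private long lastOffset = 0;

        // Reads lines appended to the file since the previous call.
        public void readNewLines(String path) throws IOException {
            try (RandomAccessFile file = new RandomAccessFile(path, "r")) {
                file.seek(lastOffset); // continue where the last read stopped
                String line;
                while ((line = file.readLine()) != null) {
                    if (line.contains("autotest:")) {
                        handleKeyCharacters(line); // intercept key characters for comparison
                    }
                }
                lastOffset = file.getFilePointer(); // remember the new offset
            }
        }

        private void handleKeyCharacters(String line) {
            // match a test case and compare actual vs. expected result data (omitted)
        }
    }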
In some optional implementations of this embodiment, the foregoing execution body may execute the step 202 by:
from a predetermined set of test cases, a test case that includes the same key character as the buried point log is determined, so that the test case that includes the same key character as the buried point log is taken as a test case that matches the buried point log.
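A minimal sketch of this matching step, assuming a test case simply carries its key character string and expected result data (both fields are assumptions):

    import java.util.List;
    import java.util.Optional;

    class TestCase {
        final String keyCharacter;   // e.g. "order payment success"
        final String expectedResult; // expected result data
        TestCase(String keyCharacter, String expectedResult) {
            this.keyCharacter = keyCharacter;
            this.expectedResult = expectedResult;
        }
    }

    public class TestCaseMatcher {
        // Returns the first test case whose key character appears in the buried point log.
        static Optional<TestCase> match(String buriedPointLog, List<TestCase> testCases) {
            return testCases.stream()
                    .filter(tc -> buriedPointLog.contains(tc.keyCharacter))
                    .findFirst();
        }
    }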
Alternatively, the above-mentioned execution body may also execute the step 202 as follows:
When the technician configures the log buried points, an identifier of the matching test case is set in each log buried point. The test case indicated by the identifier included in the buried point log is then determined from the predetermined test case set, and that test case is taken as the test case matching the buried point log.
Step 203, comparing the actual result data corresponding to the user operation with the expected result data based on the buried point log and the test case matching the buried point log, and generating a test result.
In this embodiment, the execution body may compare the actual result data corresponding to the user operation with the expected result data based on the buried point log and the test case matching the buried point log, and generate a test result. The test result may be used to indicate whether the actual result data corresponding to the user operation is consistent with the expected result data.
Here, when the actual result data and the expected result data corresponding to the user operation indicate the same result, the actual result data and the expected result data are consistent.
Even when the actual result data and the expected result data are not exactly the same text, they may still be consistent. For example, if the actual result data is "order XXX out successful" and the expected result data is "out successful", the actual result data and the expected result data can be characterized as consistent.
Specifically, the execution body may compare actual result data included in the generated buried point log with expected result data included in the test case matched with the buried point log, thereby generating a test result.
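A minimal sketch of the comparison itself, assuming "consistent" means the actual result data contains the expected result data (as in the "order XXX out successful" example above); the fuller flow of fig. 4 additionally checks the database:

    public class ResultComparator {
        // Compares the actual result data from the buried point log with the expected
        // result data from the matched test case. A containment check also covers the
        // case where the two are consistent without being exactly the same text.
        static boolean consistent(String actualResultData, String expectedResultData) {
            return actualResultData.contains(expectedResultData);
        }
    }

For example, consistent("order XXX out successful", "out successful") returns true, so a test result characterizing test case success would be generated.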
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for testing an application according to the present embodiment. In the application scenario of fig. 3, the user performs a user operation 303 (the operation "favorite XX product" in fig. 3) on the application installed in the terminal device 301. Then, in the case where the user operation 303 on the application is detected, the server 302 generates a buried point log 306 corresponding to the user operation 303 through a log buried point 304 configured in the application in advance, where the buried point log 306 includes actual result data 308 corresponding to the user operation 303. Then, the server 302 determines a test case 307 matching the buried point log 306 from the predetermined test case set 305, where the test cases in the test case set 305 include expected result data. Finally, the server 302 compares the actual result data 308 corresponding to the user operation 303 with the expected result data 309 based on the buried point log 306 and the test case 307 matching the buried point log 306, and generates a test result 310.
Most existing automated testing automates actual manual operations and depends on a test environment and a pre-release environment. In general, the test environment and pre-production environment involved in existing automated testing are not the real environment after the application goes online. Thus, after the application goes online, there may still be many functions that have not been tested, and whether these untested functions are abnormal is difficult to predict in advance. If the online real environment is used as the test environment and testers test with simulated data, garbage data may be generated during testing.
According to the method for testing an application provided by the present embodiment, in the case where a user operation on the application is detected, a buried point log corresponding to the user operation is generated through a log buried point configured in the application in advance; then, based on the generated buried point log and the test case matching it, the actual result data corresponding to the user operation is compared with the expected result data to generate a test result. The application is thereby tested in the online real environment, which improves the coverage of automated testing. In addition, whereas the data used in the prior art are often simulated, the data used in embodiments of the present disclosure are generated by actual user operations and thus restore real usage; the test results are therefore more accurate, and the more a function is used, the more it is tested, which helps ensure the stability of frequently used functions. Moreover, in the prior art, when the online real environment is used as the test environment, testers often test with simulated data, and data to be stored in a database may be generated during the test; however, the database is usually intended to store real data generated by user operations, so data written to the database during such tests becomes garbage data. Embodiments of the present disclosure test the application solely through user operations during normal use, and the data generated in that process are real data; embodiments of the present disclosure therefore avoid generating garbage data during testing.
In some optional implementations of this embodiment, the foregoing execution body may further execute the following steps:
First, the buried point log is stored in a log file. The log file may be a file with log as a suffix, a file with txt as a suffix, or a file with other suffixes.
The buried point logs stored in the log file are then cleaned according to a preset log cleaning time. The preset log cleaning time may be a predetermined time at which logs are cleaned; the execution body can thus clean the buried point logs stored in the log file at that predetermined time. For example, the preset log cleaning time may be "0:00 every day", so that the execution body cleans the buried point logs stored in the log file at 0:00 every day. In addition, the preset log cleaning time may also indicate a time interval for cleaning the logs; in this way, the execution body can periodically clean the buried point logs stored in the log file.
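One way to sketch the periodic cleanup, assuming the "0:00 every day" schedule above, a java.util.concurrent scheduler, and a hypothetical log file path; truncating the file stands in for whatever cleaning policy is configured:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.time.Duration;
    import java.time.LocalDate;
    import java.time.LocalDateTime;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class BuriedPointLogCleaner {
        public static void scheduleDailyCleanup(Path logFile) {
            ScheduledExecutorService scheduler =
                    Executors.newSingleThreadScheduledExecutor();
            // delay until the next 0:00, then repeat every 24 hours
            long initialDelayMinutes = Duration.between(
                    LocalDateTime.now(),
                    LocalDate.now().plusDays(1).atStartOfDay()).toMinutes();
            scheduler.scheduleAtFixedRate(() -> {
                try {
                    Files.write(logFile, new byte[0]); // truncate the buried point log
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }, initialDelayMinutes, TimeUnit.DAYS.toMinutes(1), TimeUnit.MINUTES);
        }
    }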
In some optional implementations of this embodiment, the foregoing execution body may further execute the following steps:
Run the test cases in the test case set for which no matching buried point log was generated within the first time period, to obtain test results for those test cases. The first time period may be a predetermined time period, starting at any time and ending at any time after the start. In some cases, the end of the first time period may be the current time. As an example, the first time period may be the period from 0:00 to 24:00 of the current day.
It can be understood that, when a test result for the application within the first time period is needed, the test cases in the test case set for which no matching buried point log was generated within the first time period can be determined and run directly, yielding test results for the test cases that no user operation matched during the period. Compared with a scheme that obtains test results only through user operations, this optional implementation can therefore further improve test coverage, as the sketch below illustrates.
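A minimal sketch of this supplementary run, assuming each test case can report whether a matching buried point log appeared in the period and can be executed directly; both methods are assumptions:

    import java.util.List;

    interface RunnableTestCase {
        boolean matchedInPeriod(); // did a matching buried point log appear in the first time period?
        void run();                // execute the case directly to obtain a test result
    }

    public class SupplementaryRunner {
        // Runs the test cases that no user operation triggered during the first time period.
        static void runUnmatched(List<RunnableTestCase> testCases) {
            for (RunnableTestCase tc : testCases) {
                if (!tc.matchedInPeriod()) {
                    tc.run();
                }
            }
        }
    }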
In some optional implementations of this embodiment, the foregoing execution body may further execute the following steps:
And generating the test coverage rate corresponding to the preset second time period based on the test case set and the buried point log generated in the preset second time period.
As an example, the above-described execution subject may generate the test coverage corresponding to the preset second period of time in the following manner:
First, determine the number of test cases in the test case set that are matched by the buried point logs generated within the preset second time period.
Then, the ratio of the number of the matched test cases to the total number of the test cases in the test case set is determined as the test coverage corresponding to the preset second time period.
As yet another example, the execution subject may also generate the test coverage corresponding to the preset second period in the following manner:
First, among the buried point logs generated within the preset second time period, determine the number of buried point logs that match test cases in the test case set and characterize test case success.
Then, determine the ratio of that number of buried point logs to the total number of test cases in the test case set as the test coverage corresponding to the preset second time period.
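Both variants reduce to a simple ratio; a minimal sketch with the counts as assumed inputs:

    public class CoverageCalculator {
        // Test coverage for the second time period:
        // matched (or successful) test cases divided by the total number of test cases.
        static double coverage(long matchedCount, long totalTestCases) {
            if (totalTestCases == 0) {
                return 0.0;
            }
            return (double) matchedCount / totalTestCases;
        }
    }

For example, coverage(80, 100) yields a test coverage of 0.8 for the period.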
It can be appreciated that by calculating the test coverage, the completeness and effectiveness of testing can be measured, thereby helping ensure the reliability and stability of the online application.
With further reference to fig. 4, a flow 400 of yet another embodiment of a method for testing an application is shown. The flow 400 of the method for testing an application comprises the steps of:
In step 401, in response to detecting the user operation of the application, a buried point log corresponding to the user operation is generated through a log buried point configured in the application in advance. Thereafter, step 402 is performed.
In the present embodiment, in the case where a user operation on the application is detected, an execution body (e.g., the server shown in fig. 1) of the method for testing the application may generate a buried point log corresponding to the user operation through a log buried point configured in the application in advance. The buried point log includes actual result data corresponding to the user operation. The user operation is used to operate on data in the target database.
In this embodiment, step 401 is substantially identical to step 201 in the corresponding embodiment of fig. 2, and will not be described herein.
Step 402, determining a test case matched with the buried point log from a predetermined test case set. Thereafter, step 403 is performed.
In this embodiment, step 402 is substantially identical to step 202 in the corresponding embodiment of fig. 2, and will not be described herein.
Step 403, comparing the actual result data corresponding to the user operation with the expected result data to generate a comparison result. Thereafter, step 404 is performed.
In this embodiment, an execution body (for example, a server shown in fig. 1) of the method for testing an application may compare actual result data corresponding to a user operation with expected result data to generate a comparison result. The comparison result may be used to indicate whether the actual result data corresponding to the user operation is consistent with the expected result data.
Even when the actual result data and the expected result data are not exactly the same text, they may still be consistent. For example, if the actual result data is "order XXX out successful" and the expected result data is "out successful", the actual result data and the expected result data can be characterized as consistent.
Step 404, determining whether the comparison result indicates that the actual result data corresponding to the user operation is consistent with the expected result data. If yes, go to step 406; if not, go to step 405.
In this embodiment, the execution body may determine whether the comparison result indicates that the actual result data corresponding to the user operation is consistent with the expected result data.
Step 405 determines whether abnormal log information for the user operation is captured. After that, if the abnormality log information for the user operation is captured, step 407 is performed.
In this embodiment, when the comparison result indicates that the actual result data corresponding to the user operation is inconsistent with the expected result data, the execution body may further determine whether to capture abnormal log information for the user operation.
Here, the above-described abnormal log information may be information indicating that an exception log occurred during the test.
It will be appreciated that, typically, when an anomaly occurs during a user operation, the executing entity may capture anomaly log information for the user operation.
As an example, in the case where the comparison result indicates that the actual result data corresponding to the user operation is inconsistent with the desired result data, the execution subject may determine that an abnormality occurs in the user operation process, thereby capturing abnormality log information for indicating that the actual result data corresponding to the user operation is inconsistent with the desired result data.
As yet another example, the execution body may also determine that an abnormality occurred during the user operation in application scenarios other than the above example. For instance, if a program statement is terminated with a Chinese full-width semicolon "；" rather than an English semicolon ";", the execution body can determine that an abnormality occurred during the user operation, thereby capturing abnormal log information indicating that the program statement is terminated with a Chinese semicolon instead of an English semicolon.
In practice, a technician can determine, according to actual requirements, which kind of abnormal log information is generated in which scenario; the embodiments of the present disclosure do not enumerate these here.
Step 406, determining whether the data stored in the target database corresponding to the user operation is correct. Then, if the data stored in the target database and corresponding to the user operation is correct, executing step 408; if the data stored in the target database corresponding to the user operation is incorrect, step 407 is performed.
In this embodiment, when the comparison result indicates that the actual result data corresponding to the user operation is consistent with the expected result data, the execution subject may determine whether the data corresponding to the user operation stored in the target database is correct. The target database may be for storing data operated by a user operation.
Specifically, the execution body may compare the data actually stored in the target database corresponding to the user operation with the data expected to be stored in the target database corresponding to the user operation. If the comparison indicates that the two are consistent, the execution body can determine that the data corresponding to the user operation stored in the target database is correct; if the comparison indicates that the two are inconsistent, the execution body can determine that the data corresponding to the user operation stored in the target database is incorrect.
As an example, if a user operation is an operation for indicating a favorite product, the target database is used to store product information of the product that the user operation corresponds to. Then, after the user performs the user operation, the target database should be newly added with product information of the product collected by the user operation, relative to before the user performs the user operation. In this application scenario, the data stored in the target database corresponding to the user operation may be product information of a product collected by the user performing the user operation, and then, after the user performs the user operation, if product information of a product collected by the user performing the user operation is actually newly added in the target database, with respect to before the user performs the user operation, the execution subject may determine that the data stored in the target database corresponding to the user operation is correct; otherwise, the executing body may determine that the data corresponding to the user operation stored in the target database is incorrect.
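A minimal sketch of this database check for the favorites example, assuming a JDBC connection and a hypothetical "user_favorites" table; the table and column names are invented for illustration:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class TargetDatabaseChecker {
        // Checks whether the product favorited by the user operation was actually
        // stored in the target database.
        static boolean favoriteStored(Connection conn, long userId, long productId)
                throws SQLException {
            String sql = "SELECT COUNT(*) FROM user_favorites"
                    + " WHERE user_id = ? AND product_id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setLong(1, userId);
                ps.setLong(2, productId);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() && rs.getInt(1) > 0;
                }
            }
        }
    }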
Step 407, generating a test result representing the test case failure.
In this embodiment, in the case where the data corresponding to the user operation stored in the target database is incorrect, the execution subject may generate a test result that characterizes the test case failure.
In step 408, a test result is generated that characterizes the success of the test case.
In this embodiment, the execution body may generate a test result indicating success of the test case when the data corresponding to the user operation stored in the target database is correct.
Optionally, after the execution body generates a test result characterizing test case failure or a test result characterizing test case success, a test report may be generated, so that developers learn about the running condition of the application and can repair the current application or develop new functions of the application in time.
It should be noted that, in addition to the above, the present embodiment may further include the same or similar features and effects as those of the embodiment corresponding to fig. 2, which are not described herein.
As can be seen from fig. 4, the flow 400 of the method for testing an application in this embodiment may combine the comparison result obtained by comparing the actual result data corresponding to the user operation with the expected result data, and the correctness of the data corresponding to the user operation stored in the database, and test the application, thereby further improving the accuracy of the test result.
With further reference to fig. 5, as an implementation of the method shown in fig. 2 described above, the present disclosure provides an embodiment of an apparatus for testing an application, which corresponds to the method embodiment shown in fig. 2, which may include the same or corresponding features as the method embodiment shown in fig. 2, in addition to the features described below, and produces the same or corresponding effects as the method embodiment shown in fig. 2. The device can be applied to various electronic equipment.
As shown in fig. 5, the apparatus 500 for testing an application of the present embodiment includes: a first generation unit 501, a determination unit 502, and a comparison unit 503. The first generation unit 501 is configured to generate, in response to detection of a user operation on the application, a buried point log corresponding to the user operation through a log buried point configured in the application in advance, wherein the buried point log includes actual result data corresponding to the user operation; the determination unit 502 is configured to determine a test case matching the buried point log from a predetermined test case set, wherein the test cases in the test case set include expected result data; the comparison unit 503 is configured to compare the actual result data corresponding to the user operation with the expected result data based on the buried point log and the test case matching the buried point log, and generate a test result.
In the present embodiment, the first generation unit 501 of the apparatus 500 for testing an application may generate a buried point log corresponding to a user operation through a log buried point configured in the application in advance. The buried point log includes actual result data corresponding to the user operation.
In this embodiment, the determining unit 502 may determine a test case matching the buried point log from a predetermined test case set. Wherein the test cases in the test case set include expected result data. The desired result data may characterize a result of a user operation that is predetermined by the technician.
In this embodiment, the comparison unit 503 may compare the actual result data corresponding to the user operation with the expected result data based on the buried point log and the test case matched with the buried point log, and generate the test result. The test result may be used to indicate whether the actual result data and the expected result data corresponding to the user operation indicate the same result.
In some optional implementations of the present embodiment, the user operation is used to operate on data in the target database; and the comparison unit 503 may be further configured to: compare the actual result data corresponding to the user operation with the expected result data to generate a comparison result; in response to the comparison result indicating that the actual result data corresponding to the user operation is inconsistent with the expected result data, determine whether abnormal log information for the user operation has been captured; in response to the abnormal log information being captured, generate a test result characterizing test case failure; in response to the comparison result indicating that the actual result data corresponding to the user operation is consistent with the expected result data, determine whether the data corresponding to the user operation stored in the target database is correct; in response to the data being correct, generate a test result characterizing test case success; and in response to the data being incorrect, generate a test result characterizing test case failure.
In some optional implementations of this embodiment, the apparatus 500 further includes: a storage unit (not shown in the figure) configured to store the buried point log in a log file; a cleaning unit (not shown in the figure) configured to clean the buried point log stored in the log file according to a preset log cleaning time.
In some optional implementations of the present embodiment, the determining unit is further configured to: from a predetermined set of test cases, a test case that includes the same key character as the buried point log is determined as a test case that matches the buried point log.
In some optional implementations of this embodiment, the apparatus 500 further includes: and the running unit (not shown in the figure) is configured to run the test cases in which the matched buried point logs are not generated in the first time period in the test case set, so as to obtain the test results of the test cases in which the matched buried point logs are not generated.
In some optional implementations of this embodiment, the apparatus 500 further includes: a second generating unit (not shown in the figure) configured to generate a test coverage corresponding to a preset second period of time based on the test case set and the buried point log generated in the preset second period of time.
In the apparatus for testing an application provided by the foregoing embodiment of the present disclosure, in response to detecting a user operation on the application, the first generation unit 501 generates, through a log buried point configured in the application in advance, a buried point log corresponding to the user operation, where the buried point log includes actual result data corresponding to the user operation. The determination unit 502 then determines, from a predetermined test case set, a test case matching the buried point log, where the test cases in the test case set include expected result data. The comparison unit 503 then compares, based on the buried point log and the matching test case, the actual result data corresponding to the user operation with the expected result data to generate a test result. Testing of the application is thereby realized in the online real environment, which helps improve the coverage of automated testing. In addition, whereas the data used in the prior art are often simulated, the data used in embodiments of the present disclosure are generated by actual user operations and thus restore real usage; the test results are therefore more accurate, and the more a function is used, the more it is tested, which helps ensure the stability of frequently used functions. Moreover, in the prior art, when the online real environment is used as the test environment, testers often test with simulated data, and data to be stored in a database may be generated during the test; however, the database is usually intended to store real data generated by user operations, so data written to the database during such tests becomes garbage data. Embodiments of the present disclosure test the application solely through user operations during normal use, and the data generated in that process are real data; embodiments of the present disclosure therefore avoid generating garbage data during testing.
Referring now to fig. 6, a schematic diagram of an electronic device (e.g., server in fig. 1) 600 suitable for use in implementing embodiments of the present disclosure is shown. The server illustrated in fig. 6 is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present disclosure in any way.
As shown in fig. 6, the electronic device 600 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the electronic apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 shows an electronic device 600 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 6 may represent one device or a plurality of devices as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing means 601.
It should be noted that, the computer readable medium according to the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In an embodiment of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. Whereas in embodiments of the present disclosure, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device; or may be present alone without being incorporated into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: generating a buried point log corresponding to the user operation through a log buried point configured in the application in advance in response to the detection of the user operation of the application, wherein the buried point log comprises actual result data corresponding to the user operation; determining a test case matched with the buried point log from a predetermined test case set, wherein the test cases in the test case set comprise expected result data; based on the buried point log and the test cases matched with the buried point log, actual result data and expected result data corresponding to user operation are compared, and a test result is generated.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in one or more programming languages, including an object oriented programming language such as Java, smalltalk, C ++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments described in the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, for example described as: a processor including a first generation unit, a determination unit, and an alignment unit. In some cases, the names of these units do not limit the units themselves; for example, the first generation unit may also be described as "a unit that generates a buried point log corresponding to a user operation through a log buried point configured in advance in an application".
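For illustration, a skeletal Python rendering of such a processor and its three units might look as follows; the class and method names are assumptions of this sketch, not identifiers from the disclosure.

    class FirstGenerationUnit:
        """Generates the buried point log corresponding to a user operation."""
        def generate(self, user_operation):
            raise NotImplementedError


    class DeterminationUnit:
        """Determines the test case matching a buried point log."""
        def determine(self, buried_point_log, test_case_set):
            raise NotImplementedError


    class AlignmentUnit:
        """Compares actual and expected result data to produce a test result."""
        def align(self, buried_point_log, test_case):
            raise NotImplementedError


    class Processor:
        """Holds the three units, mirroring the composition described above."""
        def __init__(self):
            self.first_generation_unit = FirstGenerationUnit()
            self.determination_unit = DeterminationUnit()
            self.alignment_unit = AlignmentUnit()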
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention referred to in this disclosure is not limited to the specific combination of features described above, but also encompasses other embodiments in which the features described above, or their equivalents, are combined in any way without departing from the spirit of the invention, for example embodiments in which the features described above are replaced with (but not limited to) technical features having similar functions disclosed in the present disclosure.

Claims (10)

Based on the buried point log and the test case matching the buried point log, comparing the actual result data corresponding to the user operation with the expected result data to generate a test result comprises: comparing the actual result data corresponding to the user operation with the expected result data to generate a comparison result; in response to the comparison result indicating that the actual result data corresponding to the user operation is consistent with the expected result data, comparing actually stored data corresponding to the user operation in a target database with expected stored data corresponding to the user operation; in response to determining that the actually stored data corresponding to the user operation in the target database is consistent with the expected stored data corresponding to the user operation, generating a test result representing that the test case succeeds; and in response to determining that the actually stored data corresponding to the user operation in the target database is inconsistent with the expected stored data corresponding to the user operation, generating a test result representing that the test case fails.
The alignment unit is further configured to: compare the actual result data corresponding to the user operation with the expected result data to generate a comparison result; in response to the comparison result indicating that the actual result data corresponding to the user operation is consistent with the expected result data, compare actually stored data corresponding to the user operation in a target database with expected stored data corresponding to the user operation; in response to determining that the actually stored data corresponding to the user operation in the target database is consistent with the expected stored data corresponding to the user operation, generate a test result representing that the test case succeeds; and in response to determining that the actually stored data corresponding to the user operation in the target database is inconsistent with the expected stored data corresponding to the user operation, generate a test result representing that the test case fails.
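By way of illustration only, the following Python sketch mirrors the two-stage comparison recited above: result data first, then the data persisted in the target database. The function `query_stored_data`, the dictionary fields, and the behaviour on a stage-1 mismatch are assumptions of this sketch, not part of the claims.

    def compare_and_generate_result(buried_point_log, test_case, query_stored_data):
        """Two-stage check: result data, then stored data in the target database."""
        actual = buried_point_log["actual_result"]
        expected = test_case["expected_result"]

        # Stage 1: compare actual result data with expected result data.
        if actual != expected:
            # Assumed behaviour: a stage-1 mismatch also fails the test case;
            # the claim recites the stored-data comparison only for the
            # consistent branch.
            return {"case_id": test_case["id"], "status": "failure",
                    "reason": "result data mismatch"}

        # Stage 2: results agree, so verify persistence by comparing the
        # actually stored data in the target database with the expected
        # stored data for the same user operation.
        actually_stored = query_stored_data(buried_point_log["operation_id"])
        if actually_stored == test_case["expected_stored_data"]:
            return {"case_id": test_case["id"], "status": "success"}
        return {"case_id": test_case["id"], "status": "failure",
                "reason": "stored data mismatch"}

Separating the two stages lets a result-level mismatch be reported without touching the database, while a persistence-level mismatch is still caught when the visible results happen to agree.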
CN201910825794.8A (priority date 2019-09-03, filing date 2019-09-03) | Method and apparatus for testing applications | Active | granted as CN112445697B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201910825794.8A (CN112445697B) | 2019-09-03 | 2019-09-03 | Method and apparatus for testing applications

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201910825794.8A (CN112445697B) | 2019-09-03 | 2019-09-03 | Method and apparatus for testing applications

Publications (2)

Publication Number | Publication Date
CN112445697A (en) | 2021-03-05
CN112445697B (en) | 2024-08-16

Family

ID=74734265

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201910825794.8A (Active; granted as CN112445697B) | Method and apparatus for testing applications | 2019-09-03 | 2019-09-03

Country Status (1)

Country | Link
CN (1) | CN112445697B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN116820992B (en)* | 2023-07-05 | 2024-07-19 | 上海灵动微电子股份有限公司 | Automatic testing method and system for embedded software
CN119759754A (en)* | 2024-11-14 | 2025-04-04 | 北京罗克维尔斯科技有限公司 | Interactive system reliability test method and related device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN109960651A (en)* | 2019-02-13 | 2019-07-02 | 北京达佳互联信息技术有限公司 | Buried point test method, system, device and computer readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN105868256A (en)* | 2015-12-28 | 2016-08-17 | 乐视网信息技术(北京)股份有限公司 | Method and system for processing user behavior data
CN109145230A (en)* | 2017-06-15 | 2019-01-04 | 百度在线网络技术(北京)有限公司 | Information output method and device
CN107832216A (en)* | 2017-11-08 | 2018-03-23 | 无线生活(杭州)信息科技有限公司 | Buried point testing method and device
CN109032870A (en)* | 2018-08-03 | 2018-12-18 | 百度在线网络技术(北京)有限公司 | Method and apparatus for testing a device


Also Published As

Publication number | Publication date
CN112445697A (en) | 2021-03-05

Similar Documents

Publication | Title
CN111309632B (en) | Application program testing method and device, computer equipment and storage medium
US20130219220A1 | Generating a replayable testing script for iterative use in automated testing utility
CN113448854A (en) | Regression testing method and device
CN109542743B (en) | Log checking method and device, electronic equipment and computer readable storage medium
US11121912B2 | Method and apparatus for processing information
CN112445697B (en) | Method and apparatus for testing applications
CA3141546A1 (en) | Log-based testing method, device, and computer system
CN115705190A (en) | Method and device for determining dependence degree
CN113362173A (en) | Anti-duplication mechanism verification method, anti-duplication mechanism verification system, electronic equipment and storage medium
CN114064504B (en) | Detection method, device, medium and computing equipment for full-link stress testing data isolation
CN112084114B (en) | Method and apparatus for testing interfaces
CN114285774A (en) | Flow recording method and device, electronic equipment and storage medium
CN112306826B (en) | Method and device for processing information in terminal
CN112882948A (en) | Stability testing method, device and system for application and storage medium
CN116662193A (en) | Page testing method and device
CN113485902B (en) | Method, device, equipment and computer readable medium for testing service platform
CN113407229B (en) | Method and device for generating offline scripts
CN112749078A (en) | Buried point testing method and device
CN110765006A (en) | Flow testing method and device, computer readable storage medium and electronic device
CN109542921B (en) | Data checking method and device, electronic equipment and storage medium
CN111831530A (en) | Test method and apparatus
CN114157647B (en) | Tracking method and device for browsing web pages by user, electronic equipment and storage medium
CN113835995B (en) | Method and device for generating test cases
CN111371745B (en) | Method and apparatus for determining SSRF vulnerabilities
CN119149377A (en) | Offline loading test method, device, equipment, computer readable medium and product

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
