Disclosure of Invention
The embodiments of the present application provide an outbound test method, an outbound test apparatus, a computer device, and a computer-readable storage medium, which can effectively detect whether the automatic outbound flow in an automatic outbound process is normal.
In a first aspect, an embodiment of the present application provides an outbound test method, where the method includes:
acquiring an outbound test task for a target outbound scenario, wherein the outbound test task comprises configuration information;
obtaining a test sample corresponding to the outbound test task according to the configuration information, wherein the test sample corresponds to at least one outbound node in an automatic outbound flow of the target outbound scenario, and the test sample comprises a sample test statement and a sample reply corresponding to the sample test statement;
and obtaining, based on the sample test statement, a target reply for the sample test statement fed back by an outbound component, and comparing the sample reply with the target reply to obtain an outbound test result, wherein the outbound test result is used for representing whether the automatic outbound flow of the target outbound scenario is normal.
In one embodiment, the comparing the sample reply with the target reply to obtain the outbound test result includes:
determining a sample text corresponding to the sample reply and a target text corresponding to the target reply;
and detecting whether the sample text and the target text are the same or not, and acquiring the outbound test result according to the detection result.
In one embodiment, the acquiring the outbound test task for the target outbound scenario includes:
detecting a candidate test task for the target outbound scenario in a database according to a preset time period, and determining a task state label of the candidate test task;
and if the task state label is a first preset state label, taking the candidate test task as the outbound test task.
In one embodiment, after comparing the sample reply with the target reply to obtain the outbound test result, the method further includes:
and modifying the task state label of the outbound test task into a second preset state label in a database, wherein the second preset state label represents a different task state from the first preset state label.
In one embodiment, the method further comprises:
displaying a task interface, wherein the task interface comprises a task creation control;
and acquiring candidate configuration information for the candidate test task based on the task creation control, and generating the candidate test task in a database according to the candidate configuration information.
In one embodiment, after comparing the sample reply with the target reply to obtain the outbound test result, the method further includes:
and displaying the outbound test result corresponding to the outbound test task in a task interface.
In one embodiment, the method further comprises:
displaying a sample management page, wherein the sample management page comprises a sample creation control;
and acquiring the test sample based on the sample creation control, and storing the test sample in a database.
In one embodiment, the sample management page further includes a sample processing control, and after the storing the test sample in the database, the method further includes:
obtaining a processing instruction for the test sample based on the sample processing control;
and responding to the processing instruction, and performing preset processing on the test sample, wherein the preset processing comprises at least one of exporting the test sample to a preset storage location, deleting the test sample from the database, and displaying the test sample in the sample management page.
In a second aspect, an embodiment of the present application provides an outbound test apparatus, where the apparatus includes:
a first acquisition module, configured to acquire an outbound test task for a target outbound scenario, wherein the outbound test task comprises configuration information;
a second obtaining module, configured to obtain, according to the configuration information, a test sample corresponding to the outbound test task, where the test sample corresponds to at least one outbound node in an automatic outbound flow of the target outbound scenario, and the test sample includes a sample test statement and a sample reply corresponding to the sample test statement;
and a test module, configured to obtain, based on the sample test statement, a target reply for the sample test statement fed back by an outbound component, and compare the sample reply with the target reply to obtain an outbound test result, wherein the outbound test result is used for representing whether the automatic outbound flow of the target outbound scenario is normal.
In one embodiment, the test module includes:
the determining unit is used for determining a sample text corresponding to the sample reply and a target text corresponding to the target reply;
and the test unit is used for detecting whether the sample text is the same as the target text or not and acquiring the outbound test result according to the detection result.
In one embodiment, the first obtaining module is specifically configured to detect a candidate test task for the target outbound scenario in a database according to a preset time period, and determine a task state label of the candidate test task; and if the task state label is a first preset state label, taking the candidate test task as the outbound test task.
In one embodiment, the apparatus further comprises:
and the tag configuration module is used for modifying the task state tag of the outbound test task into a second preset state tag in a database, wherein the second preset state tag represents a different task state from the first preset state tag.
In one embodiment, the apparatus further comprises:
the first display module is used for displaying a task interface, and the task interface comprises a task creation control;
and the third acquisition module is used for acquiring candidate configuration information for the candidate test task based on the task creation control and generating the candidate test task in a database according to the candidate configuration information.
In one embodiment, the apparatus further comprises:
and the second display module is used for displaying the outbound test result corresponding to the outbound test task in a task interface.
In one embodiment, the apparatus further comprises:
the third display module is used for displaying a sample management page, and the sample management page comprises a sample creation control;
and the sample storage module is used for acquiring the test sample based on the sample creation control and storing the test sample in a database.
In one embodiment, the apparatus further comprises:
the instruction acquisition module is used for acquiring a processing instruction aiming at the test sample based on the sample processing control;
and the processing module is used for responding to the processing instruction and performing preset processing on the test sample, wherein the preset processing comprises at least one of exporting the test sample to a preset storage location, deleting the test sample from the database, and displaying the test sample in the sample management page.
In a third aspect, an embodiment of the present application provides a computer device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the method according to the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the method according to the first aspect as described above.
The beneficial effects brought by the technical solutions provided by the embodiments of the present application at least include the following:
An outbound test task for a target outbound scenario is acquired, where the outbound test task comprises configuration information. A test sample corresponding to the outbound test task is then acquired according to the configuration information, where the test sample corresponds to at least one outbound node in the automatic outbound flow of the target outbound scenario and comprises a sample test statement and a sample reply corresponding to the sample test statement, the sample reply being the standard reply corresponding to the sample test statement. On this basis, a target reply for the sample test statement fed back by the outbound component (such as an outbound robot) is obtained based on the sample test statement, and the sample reply is compared with the target reply to obtain an outbound test result. If the outbound test result is that the sample reply and the target reply are the same, it indicates that the outbound component can normally feed back the standard reply corresponding to the sample test statement, that is, the automatic outbound flow of the target outbound scenario is normal; if the outbound test result is that the sample reply and the target reply are different, it indicates that the outbound component cannot normally feed back the standard reply corresponding to the sample test statement and has fed back a wrong reply, that is, the automatic outbound flow of the target outbound scenario is abnormal. This realizes effective detection of whether the automatic outbound flow of the outbound component in the automatic outbound process is normal, which facilitates timely maintenance and repair when the automatic outbound flow is abnormal, thereby improving the outbound success rate of the outbound component.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The outbound test method, the outbound test apparatus, the computer device, and the computer-readable storage medium provided by the embodiments of the present application aim to solve the technical problem in the conventional technology that the automatic outbound flow of an outbound robot is often abnormal during the outbound process, resulting in a low outbound success rate. The following describes in detail the technical solutions of the present application and how they solve the above technical problem, with reference to the drawings and the embodiments. The following specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
The following describes technical solutions related to the embodiments of the present application with reference to a scenario in which the embodiments of the present application are applied.
Fig. 1-a is a schematic diagram of an implementation environment related to the outbound test method provided in the embodiments of the present application. As shown in fig. 1-a, the implementation environment may include a computer device 101. The computer device 101 may be a smartphone, a tablet, a personal computer, a laptop, an in-vehicle device, etc., and the outbound component may be deployed as a software component in the computer device 101.
In the implementation environment shown in fig. 1-a, the computer device 101 may acquire an outbound test task for a target outbound scenario, where the outbound test task includes configuration information. The computer device 101 may then acquire a test sample corresponding to the outbound test task according to the configuration information, where the test sample corresponds to at least one outbound node in the automatic outbound flow of the target outbound scenario and includes a sample test statement and a sample reply corresponding to the sample test statement. The computer device 101 may then obtain, based on the sample test statement, a target reply for the sample test statement fed back by the outbound component, and compare the sample reply with the target reply to obtain an outbound test result, where the outbound test result is used to characterize whether the automatic outbound flow of the target outbound scenario is normal.
Fig. 1-b is a schematic diagram of another implementation environment related to the outbound test method provided in the embodiments of the present application. As shown in fig. 1-b, the implementation environment may include a computer device 101 and an outbound device 102, and the computer device 101 and the outbound device 102 may communicate with each other via a wired or wireless network.
The computer device 101 may be a smartphone, a tablet computer, a personal computer, a notebook computer, a vehicle-mounted device, or the like; the outbound device 102 may be a chat robot, a session server, or the like, and the outbound component may be deployed in the outbound device 102 as a software component. Of course, the outbound component may also be a hardware component; for example, the outbound component may be the outbound device 102 itself.
In the implementation environment shown in fig. 1-b, the computer device 101 may acquire an outbound test task for a target outbound scenario, where the outbound test task includes configuration information. The computer device 101 can then obtain the test sample corresponding to the outbound test task according to the configuration information, where the test sample corresponds to at least one outbound node in the automatic outbound flow of the target outbound scenario and comprises a sample test statement and a sample reply corresponding to the sample test statement. The computer device 101 may then send the sample test statement to the outbound device 102, the outbound device 102 feeds back a target reply for the sample test statement to the computer device 101, and the computer device 101 compares the sample reply with the target reply to obtain an outbound test result, where the outbound test result is used to indicate whether the automatic outbound flow of the target outbound scenario is normal.
In one embodiment, as shown in fig. 2, there is provided an outbound test method, which is illustrated by way of example as applied to the computer device 101 in fig. 1-b, comprising the following steps:
Step 201, a computer device obtains an outbound test task for a target outbound scenario.
At present, automatic outbound is widely applied to different business scenarios in many industries. For example, in the field of financial technology, a financial institution can make collection calls to an outbound object through an outbound system; in the field of e-commerce, an e-commerce platform can make satisfaction return-visit calls to an outbound object through an outbound system; and so on. In the embodiments of the present application, the target outbound scenario may be any service scenario that adopts automatic outbound; for example, the target outbound scenario is a collection scenario in the field of financial technology.
In general, each outbound scenario has a corresponding automatic outbound flow, and the automatic outbound flow may include a plurality of outbound nodes. Each outbound node has a corresponding jump condition and a standard reply of the outbound component; that is, in the process of calling an outbound object based on the automatic outbound flow, if a dialog sentence of the outbound object meets the jump condition corresponding to an outbound node, the outbound component should, under normal conditions, reply to the outbound object with the standard reply corresponding to that outbound node.
For example, taking a collection scenario as an example, referring to fig. 3, fig. 3 is a schematic flow chart of the automatic outbound flow of an exemplary collection scenario. As shown in fig. 3, the plurality of outbound nodes of the automatic outbound flow may include "1.1 opening", "2.1 reconfirm identity", "3.1 explain the reason for the call", etc. Suppose that after the outbound node "2.1 reconfirm identity" reconfirms the identity of the outbound object, the dialog sentence of the outbound object satisfies the jump condition "is oneself" of the next outbound node "3.1 explain the reason for the call"; then the standard reply corresponding to the outbound node "3.1 explain the reason for the call" should be used to reply to the outbound object, the standard reply being, for example, "Hello, your credit card ending in 1 at Bank A is overdue; the bill amount is a yuan and the minimum repayment is b yuan. Please repay the debt before 8 o'clock today."
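The node structure described above can be sketched as a small data model. The sketch below is illustrative only and not part of the application; the class name, intent labels, and node contents are hypothetical assumptions.

```python
# Illustrative sketch (not part of the application): each outbound node
# carries a jump condition (modeled here as an intent label) and the
# standard reply the outbound component should give when that condition
# is met by the outbound object's dialog sentence.

from dataclasses import dataclass

@dataclass
class OutboundNode:
    node_id: str          # e.g. "3.1"
    name: str             # e.g. "explain the reason for the call"
    jump_condition: str   # hypothetical intent label, e.g. "is_oneself"
    standard_reply: str   # reply the component should feed back

def standard_reply_for(flow, intent):
    """Return the standard reply of the first node whose jump condition
    matches the recognized intent, or None if no node matches."""
    for node in flow:
        if node.jump_condition == intent:
            return node.standard_reply
    return None

flow = [
    OutboundNode("2.1", "reconfirm identity", "greeting_ack",
                 "May I confirm that I am speaking with the cardholder?"),
    OutboundNode("3.1", "explain the reason for the call", "is_oneself",
                 "Hello, your credit card ending in 1 at Bank A is overdue."),
]
```

Under these assumptions, a recognized intent of "is_oneself" selects node 3.1's standard reply, matching the example above.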
However, in practical applications, the intention or semantics contained in a dialog sentence of the outbound object are often recognized incorrectly, the jump conditions are often matched incorrectly, and so on, so that the outbound component cannot normally feed back the standard reply of the outbound node. That is, the outbound component feeds back wrong replies, which makes the automatic outbound flow abnormal and leads to a low outbound success rate of the outbound component.
In view of this, in the embodiments of the present application, an outbound test task may be configured for a target outbound scenario, and a computer device serving as an outbound management platform may, based on the outbound test task, automatically test whether the outbound component feeds back wrong replies at one or more outbound nodes in the automatic outbound flow of the target outbound scenario, that is, test whether the automatic outbound flow of the target outbound scenario is normal.
In a possible implementation manner, the outbound test task may be manually configured by a tester based on a task interface displayed by the computer device; of course, the outbound test task may also be automatically generated by the computer device based on a preset task configuration policy. The configuration manner of the outbound test task is not specifically limited here.
Step 202, the computer device obtains a test sample corresponding to the outbound test task according to the configuration information.
In the embodiment of the application, the outbound test task includes configuration information, and the configuration information may include a sample identifier of a test sample corresponding to the outbound test task, so that the computer device may pull the test sample corresponding to the sample identifier from the database according to the sample identifier.
The test sample corresponds to at least one outbound node in the automatic outbound flow of the target outbound scenario and includes a sample test statement and a sample reply corresponding to the sample test statement. That is, the test sample may include multiple sample test statements and sample replies: for each outbound node in the automatic outbound flow, the test sample may include a corresponding sample test statement and the sample reply corresponding to that sample test statement.
In a possible implementation manner, for an outbound node, the sample test statement of the outbound node included in the test sample may correspond to the jump condition of that outbound node. For example, continuing to refer to fig. 3, the jump condition of the outbound node "3.1 explain the reason for the call" is "is oneself", so the sample test statement corresponding to this outbound node may be "I am", "it's me", "I am XXX", and so on; that is, the intention or semantics contained in the sample test statement satisfy the jump condition corresponding to the outbound node. The sample reply corresponding to the sample test statement is the standard reply corresponding to the outbound node; the standard reply corresponding to the outbound node "3.1 explain the reason for the call" is, for example, "Hello, your credit card ending in 1 at Bank A is overdue; the bill amount is a yuan and the minimum repayment is b yuan. Please repay the debt before 8 o'clock today."
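A test sample of this kind could pair each sample test statement with the standard reply expected for it. The sketch below is a hypothetical illustration; the field names and the sample identifier are assumptions, not part of the application.

```python
# Hypothetical sketch of a test sample: one entry per sample test
# statement, each paired with the expected (standard) sample reply for
# the outbound node it targets. Field names are illustrative only.

test_sample = {
    "sample_id": "sample-001",   # identifier carried in the task's configuration information
    "node": "3.1 explain the reason for the call",
    "cases": [
        {"statement": "I am",
         "expected_reply": "Hello, your credit card ending in 1 at Bank A is overdue."},
        {"statement": "it's me",
         "expected_reply": "Hello, your credit card ending in 1 at Bank A is overdue."},
    ],
}

def sample_statements(sample):
    """Collect the sample test statements to send to the outbound component."""
    return [case["statement"] for case in sample["cases"]]
```

Each statement in `cases` satisfies the node's jump condition, and both map to the same standard reply, mirroring the example in the text.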
Step 203, the computer device obtains, based on the sample test statement, a target reply for the sample test statement fed back by the outbound component, and compares the sample reply with the target reply to obtain an outbound test result.
After obtaining the test sample, the computer device can obtain, based on each sample test statement, the target reply fed back by the outbound component for that sample test statement. For each sample test statement, the computer device can call a test interface of the outbound component to perform a round of testing, which yields the target reply fed back by the outbound component for the sample test statement.
The following describes a procedure for performing a round of testing on a computer device.
In a possible implementation manner, the computer device may send a test request carrying a sample test statement to the outbound device shown in fig. 1-b (the outbound component is deployed in the outbound device, or the outbound device itself is the outbound component); the outbound device recognizes the intention or semantics contained in the sample test statement, queries the target reply corresponding to the recognition result, and feeds the target reply back to the computer device.
In another possible implementation, the computer device may itself recognize the intention or semantics contained in the sample test statement to obtain a recognition result, and send a test request carrying the recognition result to the outbound device; the outbound device then queries the target reply corresponding to the recognition result and feeds the target reply back to the computer device.
After the computer device obtains the target reply fed back by the outbound component for the sample test statement, it compares the sample reply with the target reply for consistency to obtain an outbound test result, where the outbound test result is used to characterize whether the automatic outbound flow of the target outbound scenario is normal.
For each round of testing, if the outbound test result is that the sample reply and the target reply are the same, it indicates that the outbound component can normally feed back the standard reply corresponding to the sample test statement, that is, the automatic outbound flow of the target outbound scenario is normal; if the outbound test result of one or more rounds of testing is that the sample reply and the target reply are different, it indicates that the outbound component cannot fully and normally feed back the standard replies corresponding to the sample test statements and has fed back wrong replies, that is, the automatic outbound flow of the target outbound scenario is abnormal.
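The per-statement test rounds can be sketched as a simple loop, with the outbound component stubbed as a plain function. This is a sketch under stated assumptions: the application does not fix the component's interface, and the stub, names, and replies here are invented for illustration.

```python
# Sketch of the test rounds: every sample test statement is sent to the
# (stubbed) outbound component in its own round, the target reply is
# collected, and the flow is judged abnormal if any round's replies
# differ. The component interface is an assumption, not the application's.

def run_outbound_test(cases, outbound_component):
    """Run one round of testing per case; return per-round results and
    whether the automatic outbound flow appears normal overall."""
    results = []
    for case in cases:
        target_reply = outbound_component(case["statement"])
        results.append({
            "statement": case["statement"],
            "passed": case["expected_reply"] == target_reply,
        })
    flow_normal = all(r["passed"] for r in results)
    return results, flow_normal

# Stub component that answers correctly for one statement only.
known_replies = {"I am": "Hello, your bill is overdue."}
stub_component = lambda s: known_replies.get(s, "Sorry, I did not understand.")

cases = [
    {"statement": "I am", "expected_reply": "Hello, your bill is overdue."},
    {"statement": "who is this?", "expected_reply": "This is Bank A calling."},
]
results, flow_normal = run_outbound_test(cases, stub_component)
```

With this stub, the first round passes and the second fails, so the overall verdict is that the flow is abnormal.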
In the above embodiment, the outbound test task for the target outbound scenario is acquired, where the outbound test task includes configuration information. A test sample corresponding to the outbound test task is then obtained according to the configuration information, where the test sample corresponds to at least one outbound node in the automatic outbound flow of the target outbound scenario and includes a sample test statement and a sample reply corresponding to the sample test statement, the sample reply being the standard reply corresponding to the sample test statement. A target reply for the sample test statement fed back by the outbound component (e.g., an outbound robot) is then obtained based on the sample test statement, and the sample reply is compared with the target reply to obtain an outbound test result. If the outbound test result is that the sample reply and the target reply are the same, it indicates that the outbound component can normally feed back the standard reply corresponding to the sample test statement, that is, the automatic outbound flow of the target outbound scenario is normal; if the outbound test result is that the sample reply and the target reply are different, it indicates that the outbound component cannot normally feed back the standard reply corresponding to the sample test statement and has fed back a wrong reply, that is, the automatic outbound flow of the target outbound scenario is abnormal. In this way, whether the automatic outbound flow of the outbound component in the automatic outbound process is normal is effectively detected, which facilitates timely maintenance and repair when the automatic outbound flow is abnormal, thereby improving the outbound success rate of the outbound component.
In one embodiment, based on the embodiment shown in fig. 2 and referring to fig. 4, this embodiment relates to the process of how the computer device compares the sample reply with the target reply to obtain the outbound test result. As shown in fig. 4, the process includes steps 401 and 402:
In step 401, the computer device determines a sample text corresponding to the sample reply and a target text corresponding to the target reply.
In one possible implementation, the sample reply and the target reply may be in text form, so the computer device directly treats the sample reply as the sample text and the target reply as the target text. It should be noted that, when the outbound object is replied to based on a reply in text form, a reply speech may be obtained by performing speech synthesis on the text-form reply, and the outbound object is then replied to with the reply speech.
In another possible implementation, the sample reply and the target reply may also be in speech form, so the computer device performs speech recognition on the sample reply and the target reply respectively to obtain the sample text corresponding to the sample reply and the target text corresponding to the target reply.
Step 402, the computer device detects whether the sample text is the same as the target text, and obtains an outbound test result according to the detection result.
If the sample text is the same as the target text, it indicates that the sample reply is the same as the target reply, and the computer device determines that the outbound component can normally feed back the standard reply corresponding to the sample test statement, that is, the automatic outbound flow of the target outbound scenario is normal. If the sample text is different from the target text, it indicates that the sample reply is different from the target reply; the computer device determines that the outbound component cannot normally feed back the standard reply corresponding to the sample test statement and has fed back a wrong reply, that is, the automatic outbound flow of the target outbound scenario is abnormal. In this way, whether the automatic outbound flow of the outbound component in the automatic outbound process is normal is effectively detected.
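Steps 401 and 402 can be sketched as a minimal comparison function. Using text-form replies directly as the texts follows the embodiment; stripping surrounding whitespace before comparison is an added assumption for robustness, not something the application prescribes.

```python
# Minimal sketch of steps 401-402: reduce both replies to text, then
# detect whether the two texts are identical. Whitespace stripping is an
# assumption layered on top of the embodiment's exact-sameness check.

def compare_reply_texts(sample_text, target_text):
    """Return 'normal' if the texts match (component fed back the
    standard reply), otherwise 'abnormal' (wrong reply fed back)."""
    if sample_text.strip() == target_text.strip():
        return "normal"
    return "abnormal"
```

Speech-form replies would first be passed through speech recognition to obtain the two texts, as the embodiment describes.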
In one embodiment, based on the embodiment shown in fig. 2 and referring to fig. 5, the present embodiment relates to the process of how the computer device acquires the outbound test task for the target outbound scenario. As shown in fig. 5, step 201 includes steps 2011 and 2012:
Step 2011, the computer device detects a candidate test task for the target outbound scenario in a database according to a preset time period, and determines a task state label of the candidate test task.
The computer device can be provided with a task timer, and the task timer is used to trigger the computer device to periodically scan the database for candidate test tasks for the target outbound scenario. After the computer device scans out a candidate test task, it identifies the task state label of the candidate test task, where the task state label can be "task not executed", "task in execution", "task executed", or the like.
Step 2012, if the task state label is the first preset state label, the computer device takes the candidate test task as the outbound test task.
The first preset state label may represent that the candidate test task has not been executed by the computer device, and may be, for example, "task not executed". If the task state label of the candidate test task is the first preset state label, the computer device may take the candidate test task as the outbound test task.
As an implementation manner, after the computer device obtains a newly created candidate test task, the first preset state label may be added to the newly created candidate test task to mark, through the first preset state label, that it has not been executed by the computer device. In this way, the computer device can quickly distinguish the task states of test tasks through their task state labels, which helps the computer device quickly determine the outbound test tasks that have not been executed and improves the speed at which the computer device acquires outbound test tasks, thereby improving the overall efficiency of the outbound test.
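The timer-driven pickup of unexecuted tasks, together with the later label update of step 204, can be sketched with an in-memory list standing in for the database. The table layout and the exact label strings are illustrative assumptions, not part of the application.

```python
# Sketch of steps 2011-2012 plus step 204, with an in-memory list as a
# stand-in for the database: on each timer tick, pick up the candidate
# tasks whose task state label is the first preset label, and after
# executing them flip the label to the second preset label.

NOT_EXECUTED = "task not executed"   # first preset state label (assumed wording)
EXECUTED = "task executed"           # second preset state label (assumed wording)

def poll_once(task_table, scenario):
    """One timer tick: return the ids of tasks picked up for execution
    and mark them as executed (step 204)."""
    picked = []
    for task in task_table:
        if task["scenario"] == scenario and task["state"] == NOT_EXECUTED:
            picked.append(task["task_id"])
            task["state"] = EXECUTED
    return picked

table = [
    {"task_id": 1, "scenario": "collection", "state": NOT_EXECUTED},
    {"task_id": 2, "scenario": "collection", "state": EXECUTED},
]
first_tick = poll_once(table, "collection")   # picks up task 1 only
second_tick = poll_once(table, "collection")  # nothing left to pick up
```

Because already-executed tasks carry the second preset label, repeated ticks never pick the same task up twice, which is the point of the label scheme.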
In an embodiment, based on the embodiment shown in fig. 5 and referring to fig. 6, in the outbound test method of this embodiment, after step 203 the method further includes step 204:
step 204, the computer device modifies the task state label of the outbound test task into a second preset state label in the database.
The second preset state label and the first preset state label represent different task states.
The computer device obtains, based on the sample test statement, the target reply fed back by the outbound component for the sample test statement, and compares the sample reply with the target reply. After the outbound test result is obtained, that is, after the computer device finishes executing the outbound test task, the computer device modifies the task state label of the outbound test task in the database to the second preset state label. The second preset state label may represent that the outbound test task has been executed by the computer device, and may be, for example, "task executed".
In this way, for the test tasks in the database, the computer device can quickly distinguish their task states through the task state labels, which facilitates convenient classified management of test tasks in different task states.
In one embodiment, based on the embodiment shown in fig. 5 and with reference to fig. 7, this embodiment is directed to the process of how the computer device generates a candidate test task. As shown in fig. 7, the process includes steps 2051 and 2052:
Step 2051, the computer device displays a task interface.
In the embodiments of the present application, when a candidate test task needs to be newly added, the computer device can display a task interface, where the task interface comprises a task creation control.
Referring to FIG. 8, FIG. 8 is a diagram of an exemplary task interface, and the task creation control can be, for example, a button of "New task" displayed in the upper right corner of the task interface.
Step 2052, the computer device obtains candidate configuration information for the candidate test task based on the task creation control, and generates the candidate test task in the database according to the candidate configuration information.
After an operation and maintenance person clicks the task creation control, the person may select a test sample already saved by the computer device and input candidate configuration information; the candidate configuration information may be the sample identifier of the test sample corresponding to the candidate test task. The computer device then generates a test task record in the database based on the candidate configuration information. The test task record may include the candidate configuration information and the task state label of the candidate test task (here the task state label is the first preset state label, for example "task not executed"), and this test task record is the candidate test task.
Therefore, when the computer device needs to execute a pending outbound test task, it detects, in the database according to the preset time period, candidate test tasks whose task state label is the first preset state label, and uses such a candidate test task as the outbound test task to carry out the outbound test process.
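The detection step can be illustrated with a minimal sketch; the query, the table layout, and the scene identifiers are hypothetical, and in practice a scheduler (for example a timer or cron job) would invoke the detection function once per preset time period:

```python
import sqlite3

FIRST_LABEL = "task not executed"  # assumed wording of the first preset state label

def detect_outbound_test_tasks(conn, scene_id):
    """Detect candidate test tasks for the target outbound scene whose
    task state label is the first preset state label."""
    rows = conn.execute(
        "SELECT id FROM test_tasks WHERE scene_id = ? AND state_label = ?",
        (scene_id, FIRST_LABEL),
    ).fetchall()
    return [row[0] for row in rows]

# In-memory demonstration with fabricated task records
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE test_tasks (id INTEGER PRIMARY KEY, scene_id TEXT, state_label TEXT)"
)
conn.executemany(
    "INSERT INTO test_tasks (id, scene_id, state_label) VALUES (?, ?, ?)",
    [
        (1, "scene-a", FIRST_LABEL),      # not yet executed: should be detected
        (2, "scene-a", "task executed"),  # already executed: skipped
        (3, "scene-b", FIRST_LABEL),      # different target outbound scene
    ],
)
print(detect_outbound_test_tasks(conn, "scene-a"))
```

Each task id returned by the detection function would then be taken as an outbound test task and executed.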
In one embodiment, based on the embodiment shown in fig. 2, referring to fig. 9, this embodiment relates to a process for a computer device to display the result of an outbound test. As shown in fig. 9, the outbound test method of this embodiment further includes step 206:
Step 206, the computer device displays the outbound test result corresponding to the outbound test task in the task interface.
The outbound test result may indicate that the automatic outbound flow of the target outbound scene is normal, or that it is abnormal; the computer device can display the outbound test result of the outbound test task in the task interface for the operation and maintenance personnel to check.
For example, referring to fig. 8, the computer device may display the outbound test result "abnormal" corresponding to the outbound test task "session batch-20210615202539" in the task interface shown in fig. 8, so that the operation and maintenance staff may visually check the outbound test result.
Optionally, the computer device may also display, in the task interface, information such as the completion time of the outbound test task and a control for triggering re-execution of the outbound test task, so that the operation and maintenance staff can obtain more comprehensive information about the outbound test task and perform the outbound test more flexibly.
Optionally, after obtaining the outbound test result, the computer device may further compare and analyze the outbound test result against a preset expected sample output, classify and count the analysis results, and upload the resulting statistics to a file server for download. The computer device can also display a statistics export control in the task interface for the operation and maintenance personnel to export the statistics. The statistics may include, for each outbound node in the automatic outbound flow of the target outbound scene, whether the node jumps incorrectly, whether the intent or semantic recognition of the sample test statement corresponding to the node is wrong, an analysis of the test result corresponding to the node, and an overall analysis of whether the automatic outbound flow is normal.
Therefore, the computer device can visually display the outbound test result through the task interface, so that the operation and maintenance personnel learn the outbound test result in time; this helps maintain and repair the automatic outbound flow promptly when it is abnormal, and improves the outbound success rate of the outbound component.
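As one possible way to produce the per-node statistics mentioned above, the following sketch tallies pass/fail counts per outbound node and derives the overall flow state; the node identifiers and results are fabricated for illustration:

```python
from collections import defaultdict

# Hypothetical per-turn results: (outbound node id, whether the sample text
# matched the target text for that turn)
node_results = [
    ("node-1", True),
    ("node-1", True),
    ("node-2", False),  # e.g. a wrong jump or a misrecognized intent
    ("node-3", True),
]

stats = defaultdict(lambda: {"pass": 0, "fail": 0})
for node_id, passed in node_results:
    stats[node_id]["pass" if passed else "fail"] += 1

# The automatic outbound flow is normal only if no node recorded a failure
flow_normal = all(s["fail"] == 0 for s in stats.values())
print(stats["node-2"]["fail"], flow_normal)
```

The resulting per-node counts could then be exported via the statistics export control or uploaded to the file server.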
In one embodiment, based on the embodiment shown in fig. 2 and referring to fig. 10, this embodiment relates to the process by which a computer device obtains a test sample. As shown in fig. 10, the process includes step 2071 and step 2072:
step 2071, the computer device displays a sample management page.
In the embodiment of the application, the computer device can display a sample management page that includes a sample creation control, through which the operation and maintenance personnel can input test samples.
Illustratively, referring to fig. 11, fig. 11 is a schematic diagram of an exemplary sample management page, which includes a sample creation control, such as a "create sample set" button shown in the upper right corner of fig. 11, and the operation and maintenance personnel can create test samples for each outbound scenario by clicking on the sample creation control.
The test sample of each outbound scenario may take the form of a sample set, which may contain a plurality of sample test statements and a plurality of sample replies. Referring to fig. 12, fig. 12 is a schematic diagram of an exemplary test sample corresponding to an outbound scenario. For each outbound node in the automatic outbound flow of the outbound scenario, an operation and maintenance person may configure at least one sample test statement for the node and a sample reply corresponding to each sample test statement; one sample test statement and its corresponding sample reply form one dialog turn.
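One possible in-memory representation of such a sample set, with dialog turns grouped per outbound node, is sketched below; the class and field names are illustrative, not prescribed by this disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class DialogTurn:
    test_statement: str  # sample test statement sent toward the outbound component
    sample_reply: str    # expected sample reply for that statement

@dataclass
class SampleSet:
    scene_id: str
    # outbound node id -> dialog turns configured for that node
    turns_by_node: dict = field(default_factory=dict)

# Fabricated example: one dialog turn configured for a hypothetical node
sample = SampleSet(scene_id="scene-a")
sample.turns_by_node["node-1"] = [
    DialogTurn("Hello, is this a good time to talk?", "Yes, please go ahead."),
]
print(len(sample.turns_by_node["node-1"]))
```

Serialized under a sample identifier, such a structure matches the configuration information that the outbound test task carries.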
Step 2072, the computer device obtains the test sample based on the sample creation control and stores the test sample in a database.
After the operation and maintenance personnel configure a test sample via the sample creation control, the computer device can store the test sample in the database. The computer device can also configure a sample identifier for the test sample, so that, while executing an outbound test task, it can obtain the test sample corresponding to that task according to the task's configuration information and carry out the outbound test.
In a possible implementation, the sample management page further includes a sample processing control. With continued reference to fig. 10, after step 2072 the outbound test method of this embodiment further includes step 2073 and step 2074:
step 2073, the computer device obtains a processing instruction for the test sample based on the sample processing control.
With continued reference to fig. 12, in the embodiment of the present application, the sample processing controls may include an "export sample" control, a "view" control, and a "delete" control shown in fig. 12, and the operation and maintenance person may click any sample processing control to input a processing instruction for the test sample.
Step 2074, the computer device performs preset processing on the test sample in response to the processing instruction.
After receiving the processing instruction, the computer device performs the preset processing on the test sample as the instruction directs, where the preset processing includes at least one of exporting the test sample to a preset storage location, deleting the test sample from the database, and displaying the test sample in the sample management page.
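A minimal dispatch over the three preset processing options might look as follows; the in-memory stores and the instruction names ("export", "view", "delete") are assumptions for illustration:

```python
# Hypothetical stand-ins for the database and the preset storage location
saved_samples = {"sample-1": {"scene_id": "scene-a"}}
exported = {}

def process_sample(instruction, sample_id):
    """Dispatch the preset processing named by a sample processing control."""
    if instruction == "export":      # export to a preset storage location
        exported[sample_id] = saved_samples[sample_id]
    elif instruction == "delete":    # delete from the database
        saved_samples.pop(sample_id, None)
    elif instruction == "view":      # display in the sample management page
        return saved_samples.get(sample_id)
    return None

process_sample("export", "sample-1")
process_sample("delete", "sample-1")
print("sample-1" in exported, "sample-1" in saved_samples)
```

Each branch corresponds to one of the sample processing controls ("export sample", "delete", "view") shown in fig. 12.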
Therefore, through the sample management page, test samples for different outbound scenes can be conveniently created, deleted, and otherwise managed, so that the computer device can perform the automatic outbound test for each outbound scene with its corresponding test sample.
In practical applications, the automatic outbound flow of an outbound scene often changes with changing service requirements. By updating the outbound test task promptly after the automatic outbound flow changes, the computer device can execute the outbound test task to perform the outbound test automatically, so that the automatic outbound flow is maintained and repaired in time when it is abnormal, and the outbound success rate of the outbound component is improved.
In one embodiment, an outbound test method is provided, which may be applied in the implementation environment shown in fig. 1-b, and includes:
step A1, the computer device displays a task interface, wherein the task interface comprises a task creation control.
Step A2, the computer device acquires candidate configuration information for the candidate test task based on the task creation control, and generates the candidate test task in the database according to the candidate configuration information.
Step A3, the computer device detects the candidate test task aiming at the target outbound scene in the database according to the preset time period, and determines the task state label of the candidate test task.
Step A4, if the task state label is the first preset state label, the computer device takes the candidate test task as the outbound test task, where the outbound test task includes the configuration information.
Step A5, the computer device obtains a test sample corresponding to the outbound test task according to the configuration information, wherein the test sample corresponds to at least one outbound node in the automatic outbound flow of the target outbound scenario, and the test sample comprises a sample test statement and a sample reply corresponding to the sample test statement.
Step A6, the computer device sends a test request, together with the sample test statement, to the outbound device (an outbound component is deployed on the outbound device, or the outbound device itself is the outbound component); the outbound device recognizes the intent or semantics contained in the sample test statement, queries the target reply corresponding to the recognition result, and feeds the target reply back to the computer device.
Alternatively, the computer device recognizes the intent or semantics contained in the sample test statement to obtain a recognition result, carries the recognition result in the test request sent to the outbound device, and the outbound device queries the target reply corresponding to the recognition result and feeds it back to the computer device.
Step A7, the computer device determines a sample text corresponding to the sample reply and a target text corresponding to the target reply.
Step A8, the computer device detects whether the sample text and the target text are the same and obtains an outbound test result according to the detection result, where the outbound test result is used to characterize whether the automatic outbound flow of the target outbound scene is normal.
Step A9, the computer device modifies the task state label of the outbound test task into a second preset state label in the database, where the second preset state label represents a task state different from that of the first preset state label.
Step A10, the computer device displays the outbound test result corresponding to the outbound test task in the task interface.
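Steps A6 to A8 can be sketched end to end as follows; the intent table, the recognition stub, and the sample texts are all illustrative assumptions standing in for the real outbound component:

```python
# Hypothetical mapping from recognition result to target reply
REPLY_TABLE = {"confirm": "Great, your order is confirmed."}

def recognize_intent(statement):
    """Stand-in for the real intent/semantic recognition (step A6)."""
    return "confirm" if "yes" in statement.lower() else "unknown"

def outbound_component(statement):
    """The outbound device queries the target reply matching the
    recognition result and feeds it back (step A6)."""
    return REPLY_TABLE.get(recognize_intent(statement), "")

def run_turn(test_statement, sample_reply):
    """Steps A7/A8: compare the sample text with the target text and
    derive the outbound test result for this dialog turn."""
    target_reply = outbound_component(test_statement)
    return "normal" if sample_reply == target_reply else "abnormal"

print(run_turn("Yes, that works for me.", "Great, your order is confirmed."))
```

A full test run would iterate such a comparison over every dialog turn of the sample set and aggregate the per-turn results into the outbound test result.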
It should be understood that, although the steps in the above flowcharts are displayed in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps are not performed in a strict order and may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or in alternation with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 13, there is provided an outbound test device comprising:
a first obtaining module 10, configured to obtain an outbound test task for a target outbound scenario, where the outbound test task includes configuration information;
a second obtaining module 20, configured to obtain, according to the configuration information, a test sample corresponding to the outbound test task, where the test sample corresponds to at least one outbound node in an automatic outbound flow of the target outbound scenario, and the test sample includes a sample test statement and a sample reply corresponding to the sample test statement;
the test module 30 is configured to obtain, based on the sample test statement, a target reply fed back by the outbound component for the sample test statement, and compare the sample reply with the target reply to obtain an outbound test result, where the outbound test result is used to characterize whether the automatic outbound flow of the target outbound scenario is normal.
In one embodiment, the test module 30 includes:
the determining unit is used for determining a sample text corresponding to the sample reply and a target text corresponding to the target reply;
and the test unit is used for detecting whether the sample text is the same as the target text or not and acquiring the outbound test result according to the detection result.
In one embodiment, the first obtaining module 10 is specifically configured to detect, according to a preset time period, a candidate test task for the target outbound scenario in a database, and determine a task state label of the candidate test task; and if the task state label is a first preset state label, take the candidate test task as the outbound test task.
In one embodiment, the apparatus further comprises:
and the tag configuration module is used for modifying the task state tag of the outbound test task into a second preset state tag in a database, wherein the second preset state tag represents a different task state from the first preset state tag.
In one embodiment, the apparatus further comprises:
the first display module is used for displaying a task interface, and the task interface comprises a task creation control;
and the third acquisition module is used for acquiring candidate configuration information aiming at the candidate test task based on the task creation control and generating the candidate test task in a database according to the candidate configuration information.
In one embodiment, the apparatus further comprises:
and the second display module is used for displaying the outbound test result corresponding to the outbound test task in a task interface.
In one embodiment, the apparatus further comprises:
the third display module is used for displaying a sample management page, and the sample management page comprises a sample creation control;
and the sample storage module is used for obtaining the test sample based on the sample creation control and storing the test sample in a database.
In one embodiment, the apparatus further comprises:
the instruction acquisition module is used for acquiring a processing instruction aiming at the test sample based on the sample processing control;
and the processing module is used for performing preset processing on the test sample in response to the processing instruction, wherein the preset processing comprises at least one of exporting the test sample to a preset storage location, deleting the test sample from a database, and displaying the test sample in the sample management page.
The outbound test device provided in this embodiment can implement the outbound test method described above; its implementation principle and technical effect are similar and are not described herein again. For the specific definition of the outbound test device, reference may be made to the above definition of the outbound test method. The modules in the outbound test device may be implemented wholly or partially by software, hardware, or a combination thereof. The modules can be embedded in hardware form in, or independent of, a processor in the computer device, or stored in software form in a memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is also provided; the computer device may be a terminal, and its internal structure diagram may be as shown in fig. 14. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for their operation. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless communication can be realized through WIFI, an operator network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements an outbound test method. The display screen of the computer device can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device can be a touch layer covering the display screen, a key, trackball, or touchpad arranged on the housing of the computer device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the structure shown in fig. 14 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring an outbound test task aiming at a target outbound scene, wherein the outbound test task comprises configuration information;
obtaining a test sample corresponding to the outbound test task according to the configuration information, wherein the test sample corresponds to at least one outbound node in an automatic outbound flow of the target outbound scene, and the test sample comprises a sample test statement and a sample reply corresponding to the sample test statement;
and obtaining a target answer aiming at the sample test statement fed back by the outbound component based on the sample test statement, and comparing the sample answer with the target answer to obtain an outbound test result, wherein the outbound test result is used for representing whether an automatic outbound flow of the target outbound scene is normal or not.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
determining a sample text corresponding to the sample answer and a target text corresponding to the target answer;
and detecting whether the sample text and the target text are the same or not, and acquiring the outbound test result according to the detection result.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
detecting a candidate test task aiming at the target outbound scene in a database according to a preset time period, and determining a task state label of the candidate test task;
and if the task state label is a first preset state label, taking the candidate test task as the outbound test task.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
and modifying the task state label of the outbound test task into a second preset state label in a database, wherein the second preset state label represents a different task state from the first preset state label.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
displaying a task interface, wherein the task interface comprises a task creating control;
and acquiring candidate configuration information aiming at the candidate test task based on the task creating control, and generating the candidate test task in a database according to the candidate configuration information.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
and displaying the outbound test result corresponding to the outbound test task in a task interface.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
displaying a sample management page, wherein the sample management page comprises a sample creating control;
and acquiring the test sample based on the sample creating control, and storing the test sample in a database.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
obtaining processing instructions for the test sample based on the sample processing control;
and in response to the processing instruction, performing preset processing on the test sample, wherein the preset processing comprises at least one of exporting the test sample to a preset storage location, deleting the test sample from a database, and displaying the test sample in the sample management page.
It will be understood by those skilled in the art that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing related hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring an outbound test task aiming at a target outbound scene, wherein the outbound test task comprises configuration information;
obtaining a test sample corresponding to the outbound test task according to the configuration information, wherein the test sample corresponds to at least one outbound node in an automatic outbound flow of the target outbound scene, and the test sample comprises a sample test statement and a sample reply corresponding to the sample test statement;
and obtaining a target answer aiming at the sample test statement fed back by the outbound component based on the sample test statement, and comparing the sample answer with the target answer to obtain an outbound test result, wherein the outbound test result is used for representing whether an automatic outbound flow of the target outbound scene is normal or not.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining a sample text corresponding to the sample answer and a target text corresponding to the target answer;
and detecting whether the sample text and the target text are the same or not, and acquiring the outbound test result according to the detection result.
In one embodiment, the computer program when executed by the processor further performs the steps of:
detecting a candidate test task aiming at the target outbound scene in a database according to a preset time period, and determining a task state label of the candidate test task;
and if the task state label is a first preset state label, taking the candidate test task as the outbound test task.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and modifying the task state label of the outbound test task into a second preset state label in a database, wherein the second preset state label represents a different task state from the first preset state label.
In one embodiment, the computer program when executed by the processor further performs the steps of:
displaying a task interface, wherein the task interface comprises a task creating control;
and acquiring candidate configuration information aiming at the candidate test task based on the task creating control, and generating the candidate test task in a database according to the candidate configuration information.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and displaying the outbound test result corresponding to the outbound test task in a task interface.
In one embodiment, the computer program when executed by the processor further performs the steps of:
displaying a sample management page, wherein the sample management page comprises a sample creating control;
and acquiring the test sample based on the sample creating control, and storing the test sample in a database.
In one embodiment, the computer program when executed by the processor further performs the steps of:
obtaining processing instructions for the test sample based on the sample processing control;
and in response to the processing instruction, performing preset processing on the test sample, wherein the preset processing comprises at least one of exporting the test sample to a preset storage location, deleting the test sample from a database, and displaying the test sample in the sample management page.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features involves no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their descriptions are specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and improvements without departing from the inventive concept, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.