CN115514843A - Alarm method and device, equipment and storage medium - Google Patents

Alarm method and device, equipment and storage medium

Info

Publication number
CN115514843A
CN115514843A
Authority
CN
China
Prior art keywords
user
interaction
alarm
information
strategy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211153356.XA
Other languages
Chinese (zh)
Other versions
CN115514843B (en)
Inventor
王秋瑾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Wingtech Information Technology Co Ltd
Original Assignee
Shanghai Wingtech Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Wingtech Information Technology Co Ltd
Priority to CN202211153356.XA
Publication of CN115514843A
Application granted
Publication of CN115514843B
Status: Active
Anticipated expiration

Abstract

The invention discloses an alarm method, an alarm device, alarm equipment and a storage medium. The method comprises the following steps: receiving an alarm triggering instruction; responding to the alarm triggering instruction and acquiring the current interaction state of a user, wherein the interaction state is used for indicating whether the user has at least one interaction capability, and the interaction capabilities comprise listening, speaking, reading and writing; determining a current interaction strategy matched with the current interaction state, wherein the current interaction state indicates that the at least one interaction capability possessed by the user comprises the interaction capability required by the current interaction strategy, and the current interaction strategy comprises an information receiving strategy and an information sending strategy in the alarm process; and sending alarm information according to the information sending strategy, and receiving information according to the information receiving strategy. In this way, the alarm mode that best matches the current state of the user can be determined, which improves alarm efficiency and achieves the aim of quick alarming.

Description

Alarm method and device, equipment and storage medium
Technical Field
Embodiments of the present application relate to terminal interaction technology, and relate to, but are not limited to, an alarm method, an alarm device, alarm equipment, and a storage medium.
Background
With the rapid development of communication technology, the application scenarios of mobile terminals are more and more extensive. For example, the mobile terminal may be applied in an emergency alarm scenario, and when a user needs help, the user may call or send a short message to alarm in various ways to seek help.
Because there are many alarm modes, if one alarm mode does not succeed, another alarm mode must be used to alarm again, and switching multiple times leads to low alarm efficiency.
Disclosure of Invention
In view of this, embodiments of the present application provide an alarm method, an alarm device, alarm equipment, and a storage medium, which can determine the alarm manner that best matches the current state of the user, improving alarm efficiency and achieving the goal of fast alarming. The alarm method, device, equipment, and storage medium are implemented as follows:
the alarm method provided by the embodiment of the application comprises the following steps:
receiving an alarm triggering instruction;
responding to the alarm triggering instruction, and acquiring the current interaction state of a user, wherein the interaction state is used for indicating whether the user has at least one interaction capability, and the interaction capability comprises listening, speaking, reading and writing;
determining a current interaction strategy matched with the current interaction state, wherein the current interaction state indicates that at least one interaction capacity possessed by the user comprises the interaction capacity required to be used by the current interaction strategy, and the current interaction strategy comprises an information receiving strategy and an information sending strategy in an alarm process;
and sending alarm information according to the information sending strategy, and receiving information according to the information receiving strategy.
In some embodiments, the obtaining the current interaction state of the user includes:
acquiring an interactive capacity detection flow corresponding to the user, and sending out interactive information according to the interactive capacity detection flow;
and determining the current interaction state according to whether the feedback of the user for the interaction information is received.
In some embodiments, the acquiring an interaction capability detection procedure corresponding to the user includes:
acquiring user information of the user, wherein the user information is used for indicating the preference degree of the user to at least one interactive capability;
and determining the interaction capability detection flow according to the user information, wherein an interaction capability with a higher preference degree is placed earlier in the detection sequence of the interaction capability detection flow.
In some embodiments, the obtaining the current interaction state of the user includes:
acquiring a target image, wherein the target image comprises the user or the environment where the user is located;
determining the optimal interaction mode of the user based on the image;
and determining the current interaction state of the user according to the optimal interaction mode.
In some embodiments, the determining the current interaction policy that matches the current interaction state comprises:
determining a first interactive capability for information reception and a second interactive capability for information transmission from more than two interactive capabilities possessed by the user;
and determining the information receiving strategy according to the first interactive capability, and determining the information sending strategy according to the second interactive capability.
In some embodiments, the sending alarm information according to the information sending policy and receiving information according to the information receiving policy includes:
after an alarm instruction is sent to an alarm platform, receiving interaction information sent by the alarm platform;
converting the interactive information according to the information receiving strategy and the information sending strategy to output the converted interactive information;
and receiving feedback information of the user aiming at the converted interactive information, and sending the feedback information to the alarm platform to finish alarm.
In some embodiments, the current interaction state is used to indicate that the user does not have any of the at least one interaction capability, the current interaction policy is a policy for interacting with an alarm assistant corresponding to the alarm platform, and the sending an alarm message according to the information sending policy and receiving a message according to the information receiving policy include:
and after an alarm instruction is sent to the alarm platform, waking up an alarm assistant plug-in corresponding to the alarm platform, and finishing alarming through the alarm assistant plug-in.
The alarm device provided by the embodiment of the present application includes:
the receiving module is used for receiving an alarm triggering instruction;
the acquisition module is used for responding to the alarm triggering instruction and acquiring the current interaction state of a user, wherein the interaction state is used for indicating whether the user has at least one interaction capability, and the interaction capability comprises listening, speaking, reading and writing;
a determining module, configured to determine a current interaction policy that is matched with the current interaction state, where the current interaction state indicates that at least one interaction capability that the user has includes an interaction capability that the current interaction policy needs to use, and the current interaction policy includes an information receiving policy and an information sending policy in an alarm process;
and the execution module is used for sending alarm information according to the information sending strategy and receiving information according to the information receiving strategy.
The computer device provided by the embodiment of the present application includes a memory and a processor, where the memory stores a computer program that can run on the processor, and the processor implements the method described in the embodiment of the present application when executing the program.
The computer-readable storage medium provided by the embodiment of the present application stores thereon a computer program, and the computer program, when executed by a processor, implements the method provided by the embodiment of the present application.
According to the alarm method, the alarm device, the computer equipment and the computer-readable storage medium provided by the embodiments of the present application, after an alarm triggering instruction is received, a current interaction state of a user is acquired in response to the alarm triggering instruction, where the current interaction state is used to indicate whether the user has at least one of a plurality of interaction capabilities including listening, speaking, reading and writing. A current interaction strategy matched with the current interaction state is then determined, where the current interaction state indicates that the at least one interaction capability possessed by the user includes the interaction capability required by the current interaction strategy, and the current interaction strategy includes an information receiving strategy and an information sending strategy in the alarm process. Alarm information is then sent according to the information sending strategy, and information is received according to the information receiving strategy.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram of an alarm scenario provided in an embodiment of the present application;
fig. 2 is a block diagram of an example of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic flow chart of an implementation of the alarm method according to the embodiment of the present application;
fig. 4 is a schematic flowchart of another implementation of the alarm method according to the embodiment of the present application;
fig. 5 is a block flow diagram of an interaction capability detection process provided in an embodiment of the present application;
fig. 6 is a schematic diagram of an example of a display interface of an interaction capability detection process according to an embodiment of the present application;
fig. 7 is a schematic diagram of another example of a display interface of an interaction capability detection process according to an embodiment of the present application;
fig. 8 is a schematic diagram of another example of a display interface of an interaction capability detection process according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of an alarm device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, specific technical solutions of the present application will be described in further detail below with reference to the accompanying drawings in the embodiments of the present application. The following examples are intended to illustrate the present application but are not intended to limit the scope of the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
It should be noted that the terms "first/second/third" are used herein only to distinguish similar or different objects and do not denote a particular order or importance of the objects. It should be understood that, where permissible, "first/second/third" may be interchanged in a particular order or sequence so that the embodiments of the present application described herein can be implemented in an order other than that shown or described herein.
The rapid development of communication technology and mobile terminals brings convenience to life of people, and the mobile terminals can provide help for users in different scenes. As an example, in the field of emergency alarm, when a user needs help, the user may alarm the emergency alarm platform through the mobile terminal, for example, the user may send alarm information to the emergency alarm platform by using various information interaction methods such as voice, text, video, and the like.
Taking the example in which the user dials 120 for help: after the user dials the 120 call, the 120 alarm platform performs information interaction with the user. For example, the assistant of the alarm platform asks for the user's address and physical symptoms, the user answers the assistant by voice, and the assistant carries out the subsequent alarm process according to the user's voice reply. When the user's listening and speaking abilities are impaired, or the user's current state does not allow making a sound, the alarm platform determines that the current alarm process is finished because the user does not reply, and the alarm fails. In this case, the user may call again or may alarm again for help by other means, such as a short-message alarm. Therefore, the alarm modes in the related art suffer from low alarm efficiency.
In view of this, the present application provides an alarm method, which is applied to an electronic device, and the electronic device may be various types of devices with information processing capability in the implementation process. For example, the electronic device may include a personal computer, a notebook computer, a palm top computer, a server, or the like; the electronic device may also be a mobile terminal, for example, the mobile terminal may include a mobile phone, a vehicle-mounted computer, a tablet computer, a projector, or the like. The functions implemented by the method can be implemented by calling program code by a processor in an electronic device, which at least comprises a processor and a storage medium.
Referring to fig. 1, a schematic diagram of an alarm scenario provided in an embodiment of the present application is shown. As shown in fig. 1, the alarm scenario includes an electronic device 101 and an alarm platform 102, where the electronic device 101 may be a terminal such as a mobile phone, a computer, or a smart watch, or may be a device in such a terminal, for example, a device for assisting an alarm; the alarm platform 102 may be a server, and the electronic device 101 may establish a communication connection with the alarm platform 102 to complete an alarm through information interaction with the alarm platform 102.
When the electronic device 101 is a device for assisting an alarm in a terminal, please refer to fig. 2, which is a block diagram of an example of the electronic device 101. As shown in fig. 2, the electronic device 101 includes a voice acquisition unit 1011, a voice processing unit 1012, a touch display unit 1013, a text processing unit 1014, and a communication unit 1015, where the voice acquisition unit 1011 is configured to acquire a voice of a user, the voice processing unit 1012 is configured to perform voice recognition, semantic analysis, voice synthesis, encoding processing, and the like on the acquired voice of the user or the voice transmitted by the alarm platform 102, the touch display unit 1013 is configured to output display contents and detect a touch operation of the user, the text processing unit 1014 is configured to recognize text information input by the user at the touch display unit 1013 and perform semantic analysis, recognition, and the like on the text information, and the communication unit 1015 is configured to communicate with the alarm platform 102.
It should be noted that the electronic device 101 may be an independent apparatus, or may be a system composed of a plurality of functional modules, and the plurality of functional modules may be deployed in one or more terminals; the specific form of the electronic device 101 is not limited herein.
Hereinafter, an alarm method provided in an embodiment of the present application is described in detail with reference to the accompanying drawings.
Fig. 3 is a schematic flow chart illustrating an implementation of the alarm method according to the embodiment of the present application. As shown in fig. 3, the method may include the following steps 301 to 304:
step 301, receiving an alarm triggering instruction.
In the embodiment of the present application, the alarm triggering instruction may be obtained from a target operation of the user on the electronic device, for example, an operation of clicking a designated number in the dial pad interface of the electronic device, an operation of dialing a designated number, or an operation of pressing physical keys of the electronic device in a preset sequence. The alarm triggering instruction may also be automatically generated by the electronic device according to detected user information or environment information. For example, the electronic device may periodically start a camera to acquire images including the user, determine the body state of the user through image recognition, and automatically generate the alarm triggering instruction if it is recognized that the user is lying on the ground and the current body state of the user is poor. Alternatively, the electronic device may collect environment information of the user, for example, collect an audio signal of the environment where the user is located and identify whether the audio signal includes content asking for help; if content such as "save me" or "run" is obtained from the audio signal, it is determined that the user is in a dangerous environment, and an alarm triggering instruction is automatically generated. Alternatively, the alarm triggering instruction may be generated according to a reminder set on the electronic device; for example, the user may set an alarm clock, and when the time indicated by the alarm clock arrives, the alarm triggering instruction is automatically generated.
Of course, the generation manner of the alarm triggering instruction may include, but is not limited to, the above examples, and therefore, it is not described here, and those skilled in the art may set the alarm triggering instruction according to actual use requirements.
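As a minimal illustrative sketch (not part of the patent disclosure), the following Python fragment shows how the automatic generation of an alarm triggering instruction from recognized ambient audio could look; the keyword list, the transcribe_ambient_audio stub, and the instruction format are assumptions made here for illustration.
```python
from typing import Optional

# Assumed keyword list; a real implementation would use the platform's
# speech-recognition service rather than this stand-in function.
HELP_KEYWORDS = {"save me", "help", "run"}

def transcribe_ambient_audio() -> str:
    """Stand-in for speech recognition on the collected ambient audio signal."""
    return "someone is shouting: help, run!"

def maybe_generate_trigger() -> Optional[dict]:
    """Return an alarm triggering instruction if help-seeking content is heard."""
    text = transcribe_ambient_audio().lower()
    if any(keyword in text for keyword in HELP_KEYWORDS):
        # The instruction format below is an assumption for illustration only.
        return {"type": "alarm_trigger", "source": "ambient_audio", "evidence": text}
    return None

if __name__ == "__main__":
    print(maybe_generate_trigger())
```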
Step 302, responding to the alarm triggering instruction, and acquiring the current interaction state of the user.
In this embodiment of the application, the interaction state is used to indicate whether the user has at least one interaction capability, where an interaction capability may be an ability to interact with the outside world through a limb or body part of the user. As an example, the interaction capabilities include the four capabilities of listening, speaking, reading, and writing, but are not limited thereto. For example, the interaction capabilities may include the ability of the user to interact with the electronic device through eye movement, where the content selected by the user is determined by detecting the focal position of the user's gaze; or they may include the ability of the user to interact with the electronic device through sign language, where the interaction is realized by collecting and recognizing the user's sign-language gestures. Of course, other interaction capabilities may also be included and are not enumerated here.
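One possible way to model the interaction state described above is as the set of capabilities the user currently has; the data model below is an assumed sketch, not a required implementation of the embodiment.
```python
from enum import Enum, auto

class Capability(Enum):
    LISTEN = auto()
    SPEAK = auto()
    READ = auto()
    WRITE = auto()          # writing / text input
    EYE_TRACKING = auto()   # optional extension mentioned above
    SIGN_LANGUAGE = auto()  # optional extension mentioned above

# The current interaction state is the set of capabilities the user can use
# right now; an empty set means the user currently has none of them.
current_state = frozenset({Capability.LISTEN, Capability.SPEAK})

print(Capability.READ in current_state)   # False
print(len(current_state) == 0)            # False: the user has some capability
```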
Step 303, determining a current interaction policy matching the current interaction state.
In the embodiment of the application, the interaction policy may be understood as indicating the interaction manner used between the alarm platform and the user during the alarm process, and the current interaction policy includes an information receiving policy and an information sending policy in the alarm process. For example, if the alarm platform needs to obtain the user's name, the information sending policy determines which interaction manner is used to send the user's name: the name may be obtained by collecting the user's voice, or from text input by the user on the electronic device. The information receiving policy determines which interaction manner is used to receive information: for example, if the alarm platform asks the user for an address, the asking voice may be played directly, or the asking voice may be recognized to obtain the corresponding asking text, which is then displayed on the electronic device. Which interaction manner should be selected needs to be determined according to the current interaction state of the user.
In the embodiment of the application, the current interaction policy matched with the current interaction state meets the following condition: the current interaction state indicates that the at least one interaction capability possessed by the user includes the interaction capability required to be used by the current interaction policy. It can be understood that the interaction capability required by the current interaction policy must be one the user currently possesses. For example, if the user has the interaction capabilities of listening and speaking, the interaction capability used in the current interaction policy may be listening, speaking, or both, but no other interaction capability may be used.
After the current interaction state of the user is determined, the current interaction policy is determined according to that interaction state. For example, if the current state indicates that the user has the interaction capabilities of speaking and reading, the current interaction policy may be determined accordingly, for example: the interaction information output by the alarm platform is displayed on the electronic device in a form the user can read, and for the interaction information fed back by the user, the electronic device instructs the user to speak to the electronic device.
Or, the electronic device may also preset a corresponding relationship between the interaction state and the interaction policy, and when the current interaction state of the user is determined, the current interaction policy is determined according to the corresponding relationship.
Of course, other specific implementations may be included, and are not described herein.
And step 304, sending alarm information according to the information sending strategy, and receiving information according to the information receiving strategy.
After the interaction strategy corresponding to the user is determined, information interaction is performed with the alarm platform according to the determined interaction strategy to complete the alarm process. Following the above example, the current interaction strategy is: the interaction information output by the alarm platform is displayed on the electronic device in a form the user can read, and for feedback the electronic device instructs the user to speak to it. After the electronic device receives the user-information acquisition request sent by the alarm platform, the requested information is displayed on the electronic device in the form of a dialog box, and each input box is shown in a voice-input state. When the cursor is in the first input box, the voice input function is started automatically, the user's voice for the first input box is collected, and the voice is converted into text to fill in the box. When the user finishes filling in the first input box, the cursor automatically jumps to the next input box, voice input is started again, and the user's voice for that box is collected and filled in automatically, until all input boxes are completed. The filled-in contents are then reported to the alarm platform. All the interaction steps required by the alarm platform are completed in this way, and the alarm is completed.
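The dialog-box flow described above can be sketched as a loop that walks through the input boxes in order, collects a spoken answer for each, converts it to text, and finally reports the filled-in form; the listen_and_transcribe helper and the field names below are hypothetical.
```python
from typing import Callable, Dict, List

def fill_form_by_voice(fields: List[str],
                       listen_and_transcribe: Callable[[str], str]) -> Dict[str, str]:
    """Walk through the input boxes in order, filling each one from speech.

    `listen_and_transcribe` stands in for starting voice input for the focused
    box and converting the captured speech to text.
    """
    filled = {}
    for field in fields:                      # cursor moves box by box
        filled[field] = listen_and_transcribe(field)
    return filled                             # reported to the alarm platform

# Usage example with a fake transcriber (assumed field names):
answers = {"name": "Zhang San", "address": "No. 1 Example Road", "symptoms": "chest pain"}
form = fill_form_by_voice(list(answers), lambda field: answers[field])
print(form)
```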
In the above technical solution, the alarm is performed based on an interaction strategy matched with the current interaction state of the user; that is, the interaction required during the alarm process can be completed by the user in the current state. This avoids the situation in which the interaction in the alarm process cannot be completed because of the user's limited interaction capability and the alarm mode has to be switched, and therefore the alarm efficiency can be improved.
As can be seen from the above, whether the current interaction state of the user can be accurately obtained is a key factor in whether the method in the embodiment of the present application can effectively improve alarm efficiency, and there are many ways to obtain the current interaction state of the user.
Fig. 4 is a flowchart of another example of an alarm method provided in an embodiment of the present application. As shown in fig. 4, the method may include the following steps 401 to 405:
step 401, receiving an alarm triggering instruction.
Step 401 is similar to step 301 in the embodiment shown in fig. 3, and is not described again here.
Step 402, responding to the alarm triggering instruction, acquiring an interaction capability detection flow corresponding to the user, sending interaction information according to the interaction capability detection flow, and determining the current interaction state according to whether feedback of the user for the interaction information is received.
In the embodiment of the present application, the interactive capability of the user is obtained through an interactive capability detection process, and the interactive capability detection process may be preset.
Please refer to fig. 5, which is a schematic diagram of an interaction capability detection flow according to an embodiment of the present application. The interaction capability detection flow shown in fig. 5 detects a set of preset interaction capabilities, namely the five capabilities of listening, speaking, reading, key clicking, and text input. After receiving the alarm triggering instruction, the electronic device detects the user's interaction capabilities according to the interaction capability detection flow shown in fig. 5.
First, preset voice confirmation information A is played through the loudspeaker of the electronic device, and preset text confirmation information B is displayed on the display screen of the electronic device. If a voice A' fed back by the user is detected within a preset time length and matches the voice confirmation information A, it indicates that the user has the interaction capabilities of listening and speaking. As an example, the voice confirmation information A may be content for the user to repeat aloud, for example "please repeat the following content: ..."; if the voice A' matches the content to be repeated, the voice A' matches the voice confirmation information A, otherwise it does not. As another example, the voice confirmation information A may be a spoken question and answer, for example a question whose expected answer is "Wu"; if the voice A' is "Wu", the voice A' matches the voice confirmation information A, otherwise it does not. Of course, the specific content of the voice confirmation information A may be set according to the actual usage situation, and is not limited herein.
If the user's feedback voice A' is not detected within the preset time length, but a voice B' given in response to the text confirmation information B is detected, it is determined that the user has the interaction capabilities of speaking and reading but does not have the interaction capability of listening. As an example, referring to fig. 6, the text confirmation information B may be "please say the color of the flower in the picture below and confirm whether you can hear", and four touch keys, "red", "green", "audible", and "inaudible", are presented on the display screen. If the voice B' is "the flower is red, I cannot hear", it indicates that the user has the interaction capabilities of speaking and reading and does not have the interaction capability of listening.
And if the feedback of the user to the voice confirmation information A and the text confirmation information B is not detected within the preset time length, confirming that the user does not have any interaction capacity.
If the user's feedback voice A' is not detected within the preset time length but a key-click feedback operation by the user on the text confirmation information B is detected, it is confirmed that the user has the capabilities of reading and key clicking. In addition, whether the user also has the listening interaction capability can be determined from which option the user clicks, for example "audible" or "inaudible": if a click on the "audible" touch key is detected, it is determined that the user has the listening interaction capability, and if a click on the "inaudible" touch key is detected, it is determined that the user does not.
After the above detection steps are completed, another display interface is displayed on the display screen of the electronic device, as shown in fig. 7. In this display interface, an input box is presented and the user is instructed to enter information as text; for example, the user may be instructed to enter physical conditions or a name as text. The specific information to be entered may be set according to actual usage and is not limited herein; in fig. 7, entering a surname is taken as the example.
And if the character input operation of the user is detected within the preset time length, determining that the user has the interactive capability of character input, otherwise, determining that the user does not have the interactive capability of character input.
At this point, the whole interactive capability detection process is completed.
It should be noted that the interactive ability detection process shown in fig. 5 is only one example, and those skilled in the art may set more processes to detect more interactive abilities, which is not necessarily described herein.
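As a rough, non-authoritative approximation of the fig. 5 flow, the sketch below plays or displays a prompt at each stage, waits up to a timeout for the expected kind of feedback, and records the corresponding capabilities; the probe helpers are stubs standing in for real audio, display, touch, and text-input handling, and the branch structure is simplified from the description above.
```python
from typing import Callable, Optional, Set

def detect_capabilities(
    wait_for_speech: Callable[[float], Optional[str]],    # stub: user's speech or None
    wait_for_key_click: Callable[[float], Optional[str]],  # stub: clicked option or None
    wait_for_text_input: Callable[[float], Optional[str]], # stub: typed text or None
    timeout: float = 10.0,
) -> Set[str]:
    """Very rough approximation of the fig. 5 flow (assumed structure)."""
    capabilities: Set[str] = set()

    # Play voice confirmation A and show text confirmation B, then wait.
    speech = wait_for_speech(timeout)
    if speech == "expected reply to A":          # matched voice A': can listen and speak
        capabilities |= {"listen", "speak"}
    elif speech is not None:                     # spoke a reply to the displayed text B
        capabilities |= {"speak", "read"}
    else:
        clicked = wait_for_key_click(timeout)    # e.g. "red" / "audible" / "inaudible"
        if clicked is not None:
            capabilities |= {"read", "key_click"}
            if clicked == "audible":
                capabilities.add("listen")

    # Second screen: ask for text input (fig. 7).
    if wait_for_text_input(timeout) is not None:
        capabilities.add("text_input")
    return capabilities

# Usage with trivial stubs: a user who can read, click keys and type, but not hear.
caps = detect_capabilities(lambda t: None, lambda t: "inaudible", lambda t: "Wu")
print(sorted(caps))  # ['key_click', 'read', 'text_input']
```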
In the interaction capability detection flow shown in fig. 5, detection is performed in a preset order; for example, the user's capabilities of listening, speaking, reading, and key clicking are detected first, and then the user's capability of text input is detected. However, the interaction capabilities possessed by each user, or their preferences for using them, differ; for example, user X prefers the listening and speaking interaction capabilities, while user Y prefers the text-input interaction capability. In this case, if a fixed interaction capability detection flow is used for different users, the detection time may be too long or the detection result inaccurate.
As an example, before performing the interaction capability detection flow, user information of the user is first obtained, where the user information is used to indicate the user's preference degree for at least one interaction capability; the interaction capability detection flow is then determined according to the user information, with interaction capabilities of higher preference degree placed earlier in the detection sequence of the flow. That is, whichever interaction capability the user prefers to use is detected first.
In the embodiment of the present application, the user information may be the user's age, since the preferences of different age groups for different interaction capabilities may differ. For example, a user aged 10-20 may prefer the text-input interaction capability, while a user aged 50-60 may prefer the listening and speaking interaction capabilities, so the order of detecting the text-input capability and the listening and speaking capabilities may be adjusted for users of different age groups, for example placing the detection of text input before the detection of listening and speaking for younger users and after it for older users. Alternatively, the user information may be the user's historical operation data on the electronic device: if the historical operation data indicates that the user spends more time on calls and voice functions than on inputting text with the virtual keyboard of the electronic device, the user is considered to prefer the listening and speaking interaction capabilities; otherwise, the user is considered to prefer the text-input interaction capability. Of course, the specific determination method varies with the user information and is not described here.
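One way to order the detection flow by preference, as described above, is to score each capability from the user information and sort; the scoring rules below (an age band and a comparison of voice versus keyboard usage durations) are illustrative assumptions.
```python
from typing import Dict, List

def order_detection(capabilities: List[str], user_info: Dict) -> List[str]:
    """Put the capabilities the user is assumed to prefer at the front.

    The scoring heuristics are assumptions for illustration: younger users are
    scored toward text input, and longer historical call/voice usage is scored
    toward listening and speaking.
    """
    def score(cap: str) -> float:
        s = 0.0
        if cap == "text_input" and user_info.get("age", 0) < 30:
            s += 1.0
        if cap in ("listen", "speak"):
            usage = user_info.get("usage_seconds", {})
            if usage.get("voice", 0) > usage.get("keyboard", 0):
                s += 1.0
        return s

    return sorted(capabilities, key=score, reverse=True)

caps = ["listen", "speak", "read", "key_click", "text_input"]
print(order_detection(caps, {"age": 55, "usage_seconds": {"voice": 3600, "keyboard": 600}}))
# -> ['listen', 'speak', 'read', 'key_click', 'text_input']
```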
Step 403, determining a current interaction policy matched with the current interaction state.
After determining the current interaction state of the user, further determining a current interaction strategy. Because the interaction strategy comprises receiving information of the alarm platform and sending feedback information to the alarm platform, in the embodiment of the application, the current interaction strategy comprises an information receiving strategy and an information sending strategy in the alarm process.
As an example, the current interaction state may indicate that the user has only one interaction capability, in which case the information receiving policy and the information sending policy in the current interaction policy are both policies that interact using that single capability. For example, if the current interaction state of the user indicates that the user has only the interaction capability of reading, the current interaction policy is to display the information sent by the alarm platform in the form of text and a selection box, detect the user's eye-movement data to determine the user's gaze point, determine the feedback information accordingly, and send the feedback information to the alarm platform.
As another example, the current interaction state may indicate that the user has two interaction capabilities, the information receiving policy in the current interaction policy may select a policy for interacting with one of the interaction capabilities, and the information sending policy may select a policy for interacting with the other interaction capability. For example, if the current interaction state of the user is used to indicate that the user has an interaction ability of listening and speaking, the current interaction policy is to output information sent by the alarm platform in a voice manner, and to use the collected voice of the user as feedback information and send the feedback information to the alarm platform.
As another example, if the current interaction state indicates that the user has more than two interaction capabilities, then determining a current interaction policy that matches the current interaction state includes: determining a first interaction capacity for information reception and a second interaction capacity for information transmission from more than two interaction capacities possessed by the user; and determining the information receiving strategy according to the first interactive capacity, and determining the information sending strategy according to the second interactive capacity. For example, the current interaction state of the user is used to indicate that the user has the interaction capabilities of listening, speaking and reading, historical usage data of the user on the electronic device may be further obtained, usage preferences of the user for various interaction capabilities are determined according to the historical usage data, a specific determination manner is similar to the above corresponding contents, and details are not repeated here.
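When the user has more than two interaction capabilities, the first (receiving) capability and the second (sending) capability can be chosen by preference, for example as in the sketch below; the split into receive-suitable and send-suitable capabilities and the preference weights are assumptions for illustration.
```python
from typing import Dict, Iterable, Tuple

# Assumed split: which capabilities can serve the receiving side and which
# can serve the sending side of the interaction.
RECEIVE_SIDE = {"listen", "read"}
SEND_SIDE = {"speak", "text_input", "key_click"}

def choose_capabilities(user_caps: Iterable[str],
                        preference: Dict[str, float]) -> Tuple[str, str]:
    """Pick the preferred receiving capability and sending capability."""
    caps = set(user_caps)
    first = max(caps & RECEIVE_SIDE, key=lambda c: preference.get(c, 0.0))
    second = max(caps & SEND_SIDE, key=lambda c: preference.get(c, 0.0))
    return first, second

# A user who can listen, speak and read, and historically prefers voice:
print(choose_capabilities({"listen", "speak", "read"},
                          {"listen": 0.9, "speak": 0.8, "read": 0.3}))
# -> ('listen', 'speak')
```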
In the embodiment of the application, the electronic device may store in advance a corresponding relationship between a plurality of interaction capabilities and an interaction policy. Please refer to table 1. In table 1, the interaction capabilities include five interaction capabilities of listening, speaking, reading, inputting text, and clicking a key, and each interaction policy includes a policy of receiving information and a policy of sending information, where "√" indicates that the interaction capability is present, and "xx" indicates that the interaction capability is absent. The 'direct listening' means that a user directly listens to information sent by the alarm platform by means of the hearing organ of the user. The 'direct speaking' means that a user directly speaks by depending on a self-sounding organ, and the electronic equipment interacts with the alarm platform according to the collected user voice. The 'voice-to-text display' means that voice information sent by the alarm platform is converted into text display by recognizing voice. The step of inputting information in characters and outputting after synthesizing voice refers to the step that a user inputs character information through electronic equipment in the modes of handwriting, inputting by an input method and the like, obtains the input character information, carries out voice synthesis on the character information to obtain voice information corresponding to the character information, and interacts with an alarm platform according to the voice information. The step of giving a response option, detecting key confirmation operation and outputting information corresponding to the key confirmation operation after synthesizing voice is carried out, which means that information sent by an alarm platform is converted into text information, semantic analysis is carried out to generate key answer options which can be directly selected by a user, the key confirmation operation is detected, the key confirmation operation is subjected to voice synthesis to obtain voice information, and interaction with the alarm platform is carried out according to the voice information.
TABLE 1
(The contents of Table 1 are provided as an image in the original publication and are not reproduced here.)
The correspondence shown in table 1 maps interaction capabilities to interaction strategies; after the user's interaction capabilities are determined, the matching interaction strategy must still be looked up in table 1 according to those capabilities, and when there are many capabilities or many possible combinations, determining the matching strategy may take a long time. Therefore, in another example, combinations of the preset interaction capabilities may be formed in advance and a corresponding interaction strategy generated for each combination, giving the correspondence shown in table 2. As shown in table 2, the identifiers in table 2 correspond to the various interaction capability combinations, so only the correspondence between identifiers and interaction strategies needs to be stored. After the user's current interaction capabilities are obtained, the combination they belong to is determined, the corresponding target identifier is found, and the corresponding interaction strategy is obtained by looking up that identifier in table 2, which improves lookup efficiency.
TABLE 2
(The contents of Table 2 are provided as an image in the original publication and are not reproduced here.)
In the embodiment of the present application, the corresponding relationship shown in table 1 or table 2 is not limited, and a person skilled in the art may determine the content stored in the corresponding relationship according to an actual use requirement.
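The identifier idea of table 2 can be realized by encoding each capability combination as a single lookup key, for example a frozen set (a bitmask would work equally well); the policy entries below are placeholders, since the actual contents of table 2 are published only as images.
```python
from typing import Dict, FrozenSet

# Placeholder policy table keyed by capability combination; the real contents
# of table 2 are only available as images in the publication.
POLICY_BY_COMBINATION: Dict[FrozenSet[str], Dict[str, str]] = {
    frozenset({"listen", "speak"}): {
        "receive": "play platform messages as voice",
        "send": "forward collected user speech",
    },
    frozenset({"read", "text_input"}): {
        "receive": "convert platform voice to displayed text",
        "send": "synthesize typed text into voice",
    },
    frozenset(): {
        "receive": "hand over to alarm assistant",
        "send": "hand over to alarm assistant",
    },
}

def lookup_policy(user_caps) -> Dict[str, str]:
    key = frozenset(user_caps)
    # Fall back to the assistant policy if the exact combination is not stored.
    return POLICY_BY_COMBINATION.get(key, POLICY_BY_COMBINATION[frozenset()])

print(lookup_policy({"read", "text_input"}))
```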
And step 404, establishing an alarm channel, sending alarm information according to the information sending strategy, and receiving information according to the information receiving strategy.
After determining the interaction strategy corresponding to the user, sending an alarm instruction to the alarm platform, for example, calling a number corresponding to the alarm platform, establishing an alarm channel between the electronic device and the alarm platform, and then receiving the interaction information sent by the alarm platform. The interactive information can be voice information sent by a worker or an intelligent robot corresponding to the alarm platform, and then the electronic equipment converts the interactive information according to the information receiving strategy and the information sending strategy so as to output the converted interactive information, receives feedback information of the user aiming at the converted interactive information, and sends the feedback information to the alarm platform.
As an example, suppose the information receiving strategy in the current interaction strategy is to display information sent by the alarm platform in the form of text and a selection box, and the information sending strategy is to use the collected voice of the user as feedback information and send it to the alarm platform. Then, when the alarm platform sends a voice asking for the user's name, the electronic device recognizes the voice and outputs its content using a preset text template. Referring to fig. 8, the text content corresponding to the voice is displayed in the display interface, an input box is displayed under the text content, and an icon for voice collection is displayed to indicate that the user should reply by voice; the collected voice of the user is then fed back to the alarm platform to complete one interaction. If there are multiple subsequent interactions, each item of interaction information is processed in a similar manner, which is not repeated here.
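One round of the conversion described above (receive the platform's voice, render it according to the receiving strategy, collect the user's feedback according to the sending strategy, and return it) might look like the sketch below; speech_to_text, display, and collect_voice are stand-ins for real recognition, display, and recording components.
```python
from typing import Callable

def interaction_round(platform_voice: bytes,
                      speech_to_text: Callable[[bytes], str],
                      display: Callable[[str], None],
                      collect_voice: Callable[[], bytes]) -> bytes:
    """Handle one question from the alarm platform under an assumed strategy:
    receiving strategy = show the question as text, sending strategy = reply by voice."""
    question_text = speech_to_text(platform_voice)   # convert per receiving strategy
    display(question_text)                           # e.g. the fig. 8 style interface
    reply = collect_voice()                          # per sending strategy
    return reply                                     # sent back to the alarm platform

# Usage with trivial stand-ins:
reply = interaction_round(
    b"<voice: may I ask your name?>",
    speech_to_text=lambda audio: "May I ask your name?",
    display=print,
    collect_voice=lambda: b"<voice: my name is Zhang San>",
)
print(reply)
```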
Step 405, releasing the alarm channel and ending the alarm.
After the user completes all the interaction processes with the alarm platform, the alarm platform can actively interrupt the connection with the electronic equipment, or the electronic equipment can actively interrupt the connection with the alarm platform after not receiving the interaction information sent by the alarm platform for a long time, thereby finishing the alarm.
In the above embodiments, the current interaction state of the user is determined according to the interaction capability detection flow; in another embodiment, the current interaction state may also be determined by image acquisition. As an example, after the electronic device detects the alarm triggering instruction, it may start an image acquisition apparatus to capture a target image, where the target image includes the user or the environment where the user is located, then determine the optimal interaction mode of the user based on the target image, and determine the current interaction state of the user according to the optimal interaction mode.
For example, when the target image acquired by the electronic device shows the user lying on the ground with both hands covering the stomach, it is inferred that the user is currently not suited to watching the screen and clicking buttons, so the user's current optimal interaction mode is the voice interaction mode; the interaction capabilities required by the voice interaction mode include listening and speaking, so it is determined that the current interaction state of the user is having the listening and speaking interaction capabilities.
For another example, when the target image acquired by the electronic device is an image of the user at a concert, it is inferred that the user is in a noisy scene and insensitive to sound, so the optimal interaction mode of the user is determined to be the text interaction mode; the interaction capabilities required by the text interaction mode include reading and text input, so it is determined that the current interaction state of the user is having the reading and text-input interaction capabilities.
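A highly simplified sketch of the image-based path: a scene classifier (stubbed here) yields a label, the label maps to an assumed optimal interaction mode, and the mode maps to the capabilities that mode requires; the labels and both mappings are illustrative assumptions, not taken from the patent.
```python
# Assumed mapping from a recognized scene to the optimal interaction mode,
# and from the mode to the capabilities that mode requires.
MODE_BY_SCENE = {
    "user_lying_down": "voice",      # not suited to watching the screen / tapping
    "noisy_concert": "text",         # insensitive to sound in a noisy scene
}
CAPS_BY_MODE = {
    "voice": {"listen", "speak"},
    "text": {"read", "text_input"},
}

def classify_scene(image_bytes: bytes) -> str:
    """Stand-in for a real image-recognition model."""
    return "noisy_concert"

def interaction_state_from_image(image_bytes: bytes) -> set:
    mode = MODE_BY_SCENE.get(classify_scene(image_bytes), "voice")
    return CAPS_BY_MODE[mode]

print(sorted(interaction_state_from_image(b"<jpeg bytes>")))  # ['read', 'text_input']
```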
Of course, the obtaining of the current interaction state of the user may also include other manners, for example, the current interaction state of the user may be determined through historical alarm data of the user, which is not described in the embodiments of the present application.
In addition, the alarm in the above embodiments is performed on the premise that the user has at least one of the plurality of interaction capabilities. In some cases the user does not have any interaction capability, for example the user is already in a coma. In this case, the current interaction policy is a policy in which an alarm assistant corresponding to the alarm platform performs the interaction, and the sending of alarm information according to the information sending policy and the receiving of information according to the information receiving policy include:
and after an alarm instruction is sent to the alarm platform, waking up an alarm assistant plug-in corresponding to the alarm platform, and finishing alarm through the alarm assistant plug-in.
It can be understood that, in this case, the alarm assistant, which may be a piece of software or a plug-in installed in the electronic device, takes over the alarm flow and responds to the interaction information from the alarm platform. The alarm assistant may store in advance basic information corresponding to the user, such as name, address, and physical health condition, so that when the alarm assistant takes over, it can complete the alarm flow according to the pre-stored information.
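When the user has no usable interaction capability, the alarm assistant plug-in could answer the platform's questions from the pre-stored basic information, roughly as in the sketch below; the profile fields and the naive keyword matching are assumptions for illustration.
```python
# Pre-stored basic information the assistant can fall back on (assumed fields).
USER_PROFILE = {
    "name": "Zhang San",
    "address": "No. 1 Example Road",
    "health": "history of heart disease",
}

def assistant_reply(platform_question: str) -> str:
    """Answer a platform question from the stored profile using naive keyword matching."""
    q = platform_question.lower()
    if "name" in q:
        return USER_PROFILE["name"]
    if "address" in q or "where" in q:
        return USER_PROFILE["address"]
    if "symptom" in q or "health" in q:
        return USER_PROFILE["health"]
    return "The user is currently unable to respond; please dispatch help."

for question in ["May I ask your name?", "What is your address?", "Any symptoms?"]:
    print(assistant_reply(question))
```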
It should be understood that although the various steps in the flowcharts of figs. 3-4 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in figs. 3-4 may include multiple sub-steps or multiple stages that are not necessarily performed at the same time, but may be performed at different times; the order of performance of the sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Based on the foregoing embodiments, the present application provides an alarm device, which includes modules and units included in the modules, and can be implemented by a processor; of course, the implementation can also be realized through a specific logic circuit; in implementation, the processor may be a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 9 is a schematic structural diagram of an alarm device provided in an embodiment of the present application, and as shown in fig. 9, the alarm device 900 includes a receiving module 901, an obtaining module 902, a determining module 903, and an executing module 904, where:
a receiving module 901, configured to receive an alarm triggering instruction;
an obtaining module 902, configured to obtain, in response to the alarm triggering instruction, a current interaction state of a user, where the interaction state is used to indicate whether the user has at least one interaction capability, and the interaction capability includes listening, speaking, reading, and writing;
a determining module 903, configured to determine a current interaction policy that is matched with the current interaction state, where the current interaction state indicates that at least one interaction capability that the user has includes an interaction capability that the current interaction policy needs to use, and the current interaction policy includes an information receiving policy and an information sending policy in an alarm process;
and the execution module 904 is configured to send alarm information according to the information sending policy, and receive information according to the information receiving policy.
In some embodiments, the obtaining module 902 is specifically configured to:
acquiring an interactive capacity detection flow corresponding to the user, and sending out interactive information according to the interactive capacity detection flow;
and determining the current interaction state according to whether the feedback of the user for the interaction information is received.
In some embodiments, the obtaining module 902 is specifically configured to:
acquiring user information of the user, wherein the user information is used for indicating the preference degree of the user for at least one interactive capability;
and determining the interaction capability detection flow according to the user information, wherein an interaction capability with a higher preference degree is placed earlier in the detection sequence of the interaction capability detection flow.
In some embodiments, the obtaining module 902 is specifically configured to:
acquiring a target image, wherein the target image comprises the user or the environment where the user is located;
determining the optimal interaction mode of the user based on the image;
and determining the current interaction state of the user according to the optimal interaction mode.
In some embodiments, the current interaction state indicates that the user has more than two interaction capabilities, and the determining module 903 is specifically configured to:
determining a first interaction capacity for information reception and a second interaction capacity for information transmission from more than two interaction capacities possessed by the user;
and determining the information receiving strategy according to the first interactive capability, and determining the information sending strategy according to the second interactive capability.
In some embodiments, the execution module 904 is specifically configured to:
after an alarm instruction is sent to an alarm platform, receiving interactive information sent by the alarm platform;
converting the interactive information according to the information receiving strategy and the information sending strategy to output the converted interactive information;
and receiving feedback information of the user aiming at the converted interactive information, and sending the feedback information to the alarm platform to finish alarm.
In some embodiments, the current interaction state is used to indicate that the user does not have any of the at least one interaction capability, the current interaction policy is a policy for interacting with an alarm assistant corresponding to the alarm platform, and the executing module 904 is specifically configured to:
and after an alarm instruction is sent to the alarm platform, waking up an alarm assistant plug-in corresponding to the alarm platform, and finishing alarm through the alarm assistant plug-in.
In the embodiment of the application, the alarm is performed based on an interaction strategy matched with the current interaction state of the user; that is, the interaction required during the alarm process can be completed by the user in the current state. This avoids the situation in which the interaction in the alarm process cannot be completed because of the user's limited interaction capability and the alarm mode has to be switched, and therefore the alarm efficiency can be improved.
The above description of the apparatus embodiments, similar to the above description of the method embodiments, has similar beneficial effects as the method embodiments. For technical details not disclosed in the embodiments of the apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
It should be noted that, in the embodiment of the present application, the division of the module by the alarm device shown in fig. 9 is schematic, and is only a logic function division, and another division manner may be provided in actual implementation. In addition, functional units in the embodiments of the present application may be integrated into one processing unit, may exist alone physically, or may be integrated into one unit by two or more units. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit. Or may be implemented in a combination of software and hardware.
It should be noted that, in the embodiment of the present application, if the method described above is implemented in the form of a software functional module and sold or used as a standalone product, it may also be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application or portions thereof that contribute to the related art may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes several instructions for causing an electronic device to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
An embodiment of the present application provides a computer device, which may be a terminal, and an internal structure diagram of the computer device may be as shown in fig. 10. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement the steps of the alarm method provided in the above embodiments.
Embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps in the methods provided in the above embodiments.
Embodiments of the present application provide a computer program product containing instructions, which when executed on a computer, cause the computer to perform the steps of the method provided by the above method embodiments.
It will be appreciated by those skilled in the art that the configuration shown in fig. 10 is a block diagram of only part of the configuration related to the present application and does not limit the computer device to which the present application is applied; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, the alarm apparatus provided herein may be implemented in the form of a computer program that is executable on a computer device such as that shown in fig. 10. The memory of the computer device may store the program modules constituting the alarm apparatus, such as the receiving module, the obtaining module, the determining module, and the executing module shown in fig. 9. The computer program constituted by these program modules causes the processor to execute the steps of the alarm method of the embodiments of the present application described in this specification.
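To make the module split concrete, the following minimal Python sketch arranges the four modules named above; the method bodies, data types and fallback rules are placeholders assumed for illustration rather than the disclosed implementation.

```python
# Minimal sketch of the four program modules of fig. 9; the bodies and the
# capability fallback rules are assumptions made for illustration.

class AlarmDevice:
    def receive_trigger(self) -> str:
        # Receiving module: accepts the alarm triggering instruction.
        return "ALARM_TRIGGER"

    def obtain_interaction_state(self) -> set:
        # Obtaining module: detects which interaction capabilities
        # (listening, speaking, reading, writing) the user currently has.
        return {"listen", "write"}

    def determine_policy(self, state: set) -> dict:
        # Determining module: picks receiving/sending policies that rely
        # only on capabilities the user actually has.
        receive = "listen" if "listen" in state else "read"
        send = "speak" if "speak" in state else "write"
        return {"receive": receive, "send": send}

    def execute(self, policy: dict) -> None:
        # Executing module: sends alarm information per the sending policy
        # and receives information per the receiving policy.
        print(f"alarm sent via {policy['send']}; replies delivered via {policy['receive']}")


device = AlarmDevice()
if device.receive_trigger() == "ALARM_TRIGGER":
    state = device.obtain_interaction_state()
    device.execute(device.determine_policy(state))
```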
In an embodiment, a computer device is provided, comprising a memory storing a computer program and a processor implementing the steps of any one of the alert methods provided in the above embodiments when the processor executes the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out the steps of any one of the alert methods provided in the above embodiments.
Here, it should be noted that: the above description of the storage medium and device embodiments is similar to the description of the method embodiments above, and these embodiments have beneficial effects similar to those of the method embodiments. For technical details not disclosed in the storage medium and device embodiments of the present application, reference is made to the description of the method embodiments of the present application for understanding.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" or "some embodiments" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment", "in an embodiment" or "in some embodiments" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not imply an execution order; the execution order of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation of the embodiments of the present application. The serial numbers of the embodiments of the present application are merely for description and do not represent the relative merits of the embodiments.
The foregoing description of the various embodiments is intended to highlight the differences between the embodiments; the same or similar parts may be referred to each other and, for brevity, are not described again herein.
The term "and/or" herein is merely an association relationship describing an associated object, and means that three relationships may exist, for example, object a and/or object B, may mean: the object A exists alone, the object A and the object B exist simultaneously, and the object B exists alone.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments are merely illustrative; for example, the division of the modules is only a division by logical function, and other divisions may be used in practice, for example: multiple modules or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or modules may be electrical, mechanical, or in other forms.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional modules in the embodiments of the present application may be integrated into one processing unit, or each module may be separately regarded as one unit, or two or more modules may be integrated into one unit; the integrated module can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media that can store program code, such as a removable memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application or portions thereof that contribute to the related art may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes several instructions for causing an electronic device to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided herein may be combined in any combination to arrive at a new method or apparatus embodiment without conflict.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions that can be readily conceived by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An alarm method, characterized in that the method comprises:
receiving an alarm triggering instruction;
responding to the alarm triggering instruction, and acquiring a current interaction state of a user, wherein the interaction state is used for indicating whether the user has at least one interaction capability, and the interaction capability comprises listening, speaking, reading and writing;
determining a current interaction strategy matched with the current interaction state, wherein the current interaction state indicates that at least one interaction capacity possessed by the user comprises an interaction capacity required to be used by the current interaction strategy, and the current interaction strategy comprises an information receiving strategy and an information sending strategy in an alarm process;
and sending alarm information according to the information sending strategy, and receiving information according to the information receiving strategy.
2. The method of claim 1, wherein the obtaining a current interaction state of the user comprises:
acquiring an interactive capacity detection flow corresponding to the user, and sending out interactive information according to the interactive capacity detection flow;
and determining the current interaction state according to whether the feedback of the user for the interaction information is received.
3. The method according to claim 2, wherein the obtaining of the interaction capability detection procedure corresponding to the user comprises:
acquiring user information of the user, wherein the user information is used for indicating the preference degree of the user for at least one interactive capability;
and determining the interactive capacity detection flow according to the user information, wherein the higher the preference degree of an interactive capacity, the earlier that interactive capacity is detected in the interactive capacity detection flow.
4. The method of claim 1, wherein the obtaining the current interaction state of the user comprises:
acquiring a target image, wherein the target image comprises the user or the environment where the user is located;
determining the optimal interaction mode of the user based on the target image;
and determining the current interaction state of the user according to the optimal interaction mode.
5. The method of claim 1, wherein the current interaction state indicates that the user has more than two interaction capabilities, and wherein the determining a current interaction policy that matches the current interaction state comprises:
determining a first interaction capacity for information reception and a second interaction capacity for information transmission from more than two interaction capacities possessed by the user;
and determining the information receiving strategy according to the first interactive capability, and determining the information sending strategy according to the second interactive capability.
6. The method of claim 1, wherein sending alarm information according to the information sending policy and receiving information according to the information receiving policy comprises:
after an alarm instruction is sent to an alarm platform, receiving interactive information sent by the alarm platform;
converting the interactive information according to the information receiving strategy and the information sending strategy so as to output the converted interactive information;
and receiving feedback information of the user for the converted interactive information, and sending the feedback information to the alarm platform to complete the alarm.
7. The method of claim 1, wherein the current interaction state is used for indicating that the user does not have any of the at least one interaction capability, the current interaction policy is a policy for interacting with an alarm assistant corresponding to the alarm platform, and the sending of the alarm information according to the information sending policy and the receiving of the information according to the information receiving policy comprise:
and after an alarm instruction is sent to the alarm platform, waking up an alarm assistant plug-in corresponding to the alarm platform, and completing the alarm through the alarm assistant plug-in.
8. An alarm device, characterized in that the device comprises:
the receiving module is used for receiving an alarm triggering instruction;
the acquisition module is used for responding to the alarm triggering instruction and acquiring the current interaction state of a user, wherein the interaction state is used for indicating whether the user has at least one interaction capability, and the interaction capability comprises listening, speaking, reading and writing;
a determining module, configured to determine a current interaction policy that is matched with the current interaction state, where the current interaction state indicates that at least one interaction capability that the user has includes an interaction capability that the current interaction policy needs to use, and the current interaction policy includes an information receiving policy and an information sending policy in an alarm process;
and the execution module is used for sending alarm information according to the information sending strategy and receiving information according to the information receiving strategy.
9. A computer device comprising a memory and a processor, said memory storing a computer program operable on the processor, wherein the processor when executing said program performs the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
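Purely as a hedged illustration of the detection flow recited in claims 2 and 3 (and not as part of the claims themselves), the following Python sketch probes the user's interaction capabilities in order of preference; the probe stub, the preference scores and the simulated feedback are assumptions.

```python
# Hypothetical sketch of the interaction-capability detection flow of
# claims 2 and 3; the probe stub and preference scores are assumptions.

def probe(capability: str) -> bool:
    # Placeholder: emit interactive information for this capability
    # (e.g. play a prompt for "listen", show a prompt for "read") and
    # report whether the user's feedback was received in time.
    simulated_feedback = {"listen": False, "speak": False, "read": True, "write": True}
    return simulated_feedback[capability]


def detection_flow(preferences: dict) -> list:
    # Claim 3: capabilities with a higher preference degree are probed earlier.
    return sorted(preferences, key=preferences.get, reverse=True)


def detect_interaction_state(preferences: dict) -> set:
    # Claim 2: send interactive information according to the detection flow
    # and determine the current state from whether feedback is received.
    return {capability for capability in detection_flow(preferences) if probe(capability)}


user_preferences = {"listen": 0.9, "read": 0.7, "speak": 0.5, "write": 0.3}
print(detect_interaction_state(user_preferences))  # e.g. {'read', 'write'}
```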
Application CN202211153356.XA, priority date 2022-09-19, filing date 2022-09-19: Alarm method and device, equipment and storage medium. Status: Active. Granted publication: CN115514843B (en).

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202211153356.XA / CN115514843B (en) | 2022-09-19 | 2022-09-19 | Alarm method and device, equipment and storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202211153356.XA / CN115514843B (en) | 2022-09-19 | 2022-09-19 | Alarm method and device, equipment and storage medium

Publications (2)

Publication Number | Publication Date
CN115514843A (en) | 2022-12-23
CN115514843B (en) | 2024-06-21

Family

ID=84503053

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202211153356.XA (Active; granted as CN115514843B (en)) | Alarm method and device, equipment and storage medium | 2022-09-19 | 2022-09-19

Country Status (1)

Country | Link
CN (1) | CN115514843B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107483745A (en) * | 2017-09-14 | 2017-12-15 | 维沃移动通信有限公司 | A mobile terminal information processing method, mobile terminal and server
WO2019052021A1 (en) * | 2017-09-13 | 2019-03-21 | 新丝绸之路科技有限公司 | Alarm method, alarm processing method, electronic device, and computer storage medium
CN109697827A (en) * | 2018-12-29 | 2019-04-30 | 出门问问信息科技有限公司 | Intelligent alarm method, device, equipment and storage medium
CN109725725A (en) * | 2018-12-29 | 2019-05-07 | 出门问问信息科技有限公司 | The information processing method of smart-interactive terminal and smart-interactive terminal
CN112383665A (en) * | 2020-11-06 | 2021-02-19 | 闻泰通讯股份有限公司 | Information processing method, information processing device, computer equipment and storage medium

Also Published As

Publication number | Publication date
CN115514843B (en) | 2024-06-21

Similar Documents

Publication | Publication Date | Title
US20220277752A1 (en)Voice interaction method and related apparatus
CN108847214B (en)Voice processing method, client, device, terminal, server and storage medium
US11705120B2 (en)Electronic device for providing graphic data based on voice and operating method thereof
KR20200017249A (en)Apparatus and method for providing feedback for confirming intent of a user in an electronic device
CN113380275B (en)Voice processing method and device, intelligent equipment and storage medium
US11200899B2 (en)Voice processing method, apparatus and device
US9854439B2 (en)Device and method for authenticating a user of a voice user interface and selectively managing incoming communications
KR102443636B1 (en)Electronic device and method for providing information related to phone number
CN106775561B (en)Question intercepting method and device and intelligent equipment
KR102629796B1 (en)An electronic device supporting improved speech recognition
CN111522524B (en)Presentation control method and device based on conference robot, storage medium and terminal
US20210311893A1 (en)Systems and methods for pairing devices using visual recognition
US20180286388A1 (en)Conference support system, conference support method, program for conference support device, and program for terminal
KR20180076830A (en)Audio device and method for controlling the same
JP7087804B2 (en) Communication support device, communication support system and communication method
CN115514843A (en)Alarm method and device, equipment and storage medium
CN114222302B (en)Calling method and device for abnormal call, electronic equipment and storage medium
CN116860913A (en)Voice interaction method, device, equipment and storage medium
US11900929B2 (en)Electronic apparatus providing voice-based interface and method for controlling the same
CN112863511B (en)Signal processing method, device and storage medium
CN111813486B (en) Page display method, device, electronic device and storage medium
CN110602325B (en)Voice recommendation method and device for terminal
JP7139839B2 (en) Information processing device, information processing method and program
CN106559541A (en)voice data processing method and device
CN116743909B (en)Information reminding method, electronic equipment and readable medium

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
