CN106302137A - Group chat message processing apparatus and method - Google Patents

Group chat message processing apparatus and method

Info

Publication number
CN106302137A
CN106302137A
Authority
CN
China
Prior art keywords
contact
target contact
mobile terminal
group chat
icon
Prior art date
2016-10-31
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610929486.6A
Other languages
Chinese (zh)
Inventor
杜国威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2016-10-31
Filing date
2016-10-31
Publication date
2017-01-04
Application filed by Nubia Technology Co Ltd
Priority to CN201610929486.6A
Publication of CN106302137A
Status: Pending (current)

Abstract

The invention discloses a group chat message processing apparatus. The apparatus includes: a receiving unit for receiving a first operation; an acquiring unit for acquiring, based on the first operation, the selected contact icon; a determining unit for detecting whether the selected contact icon is dragged and determining the contact corresponding to a contact icon dragged to a designated area as the target contact; and a control unit for, after the target contact is determined, sending the collected voice information to the mobile terminal where the target contact is located and notifying that mobile terminal to remind the target contact, in a first mode, that there is voice information directed at the target contact. The invention also discloses a group chat message processing method. The technical solution of the invention improves the input efficiency of voice information and, at the same time, enables the target object to quickly learn that there is voice information directed at it, improving the user experience.

Description

Group chat message processing device and method
Technical Field
The present invention relates to information processing technologies, and in particular, to a group chat message processing apparatus and method.
Background
With the rapid development of communication technology, various instant messaging applications play a vital role in our work and life. In particular, as the number of people using instant messaging applications increases, group chat is more and more favored by users. Many instant messaging applications now support group chat, which allows users interested in a common topic to gather together for information interaction and sharing. In the process of using such applications, the inventor found that the group chat voice function of some instant messaging applications has the following problem: when a user sends voice information in a group chat interface and wants certain people to pay special attention to it, the user must, after sending the voice information, additionally remind those people through a commonly used reminding method such as '@xxx'.
However, this way of reminding a target object in the group to listen to the voice message has clear disadvantages: the operation is complicated, because the voice message must be input first and a separate reminder that the target object will notice must be input afterwards, so the input efficiency is low; moreover, the extra reminder output makes the group chat interface less clean and concise. In addition, in the prior art the target object in the group chat has to keep waiting and checking the progress of the group chat interface from time to time, which imposes a higher usage cost on the user and leads to a poorer experience.
Disclosure of Invention
In view of this, the present invention is expected to provide a group chat message processing apparatus and method, which can improve the input efficiency of voice information, and at the same time, can enable a target object to quickly know the voice information for the target object, thereby improving the user experience.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
the invention provides a group chat message processing device, which is applied to a mobile terminal and comprises:
a receiving unit configured to receive a first operation; the first operation is a selection operation of a contact icon displayed in a group chat interface;
the acquisition unit is used for acquiring the selected contact icon based on the first operation;
the determining unit is used for detecting whether the selected contact icon is dragged or not and determining the contact corresponding to the contact icon dragged to the designated area as the target contact;
and the control unit is used for, after the target contact is determined, sending the collected voice information to the mobile terminal where the target contact is located, and informing the mobile terminal where the target contact is located to remind the target contact, in a first mode, that there is voice information directed at the target contact.
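For orientation only, the following is a minimal Kotlin sketch of how the four units described above could be decomposed in software; every interface, type, and parameter name here is an illustrative assumption and not something defined by the patent.

```kotlin
// Illustrative decomposition of the apparatus into its four units (all names are assumptions).
data class ContactIcon(val contactId: String, var x: Float, var y: Float)
data class VoiceMessage(val senderId: String, val audio: ByteArray)

interface ReceivingUnit {
    // Receives the first operation: a selection of a contact icon in the group chat interface.
    fun onFirstOperation(icon: ContactIcon)
}

interface AcquiringUnit {
    // Yields the contact icon selected by the first operation.
    fun selectedIcon(): ContactIcon?
}

interface DeterminingUnit {
    // Detects whether the selected icon was dragged into the designated area and,
    // if so, records the corresponding contact as a target contact.
    fun onDragFinished(icon: ContactIcon): Boolean
    fun targetContacts(): Set<String>
}

interface ControlUnit {
    // After the target contacts are determined, sends the collected voice message to their
    // terminals and asks those terminals to remind them in the first mode.
    fun dispatch(message: VoiceMessage, targets: Set<String>)
}
```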
In the foregoing scheme, optionally, the control unit is further configured to: and when the collected voice information is sent to the mobile terminal where the target contact person is located, the collected voice information is also sent to the mobile terminal where the non-target contact person in the group corresponding to the group chat interface is located.
In the foregoing solution, optionally, in the group chat interface of the target contact and the group chat interface of the non-target contact, a reminding message for reminding the target contact is not displayed.
In the foregoing scheme, optionally, the determining unit is further configured to:
judging whether the selected contact icon moves from the first position to the second position; the first position is the initial position of the contact icon, and the second position is the final position of the contact icon;
if the first position is different from the second position, judging that the selected contact icon is dragged;
and if the second position does not fall into the designated area, automatically jumping the selected contact icon from the second position to the first position.
In the foregoing scheme, optionally, the first mode includes one or more of the following:
ringing according to a preset ring, vibrating, changing the color of an indicator light or flashing according to a preset frequency, and prompting by voice.
The invention also provides a group chat message processing method, which is applied to the mobile terminal, and in the process that the mobile terminal displays the group chat interface of the first application, the method comprises the following steps:
receiving a first operation; the first operation is a selection operation of a contact icon displayed in a group chat interface;
acquiring the selected contact icon based on the first operation;
detecting whether the selected contact icon is dragged or not, and determining a contact corresponding to the contact icon dragged to the designated area as a target contact;
after the determination of the target contact is completed, sending the collected voice information to the mobile terminal where the target contact is located, and informing the mobile terminal where the target contact is located to remind the target contact, in a first mode, that there is voice information directed at the target contact.
In the foregoing scheme, optionally, when the collected voice information is sent to the mobile terminal where the target contact is located, the method further includes:
and sending the collected voice information to a mobile terminal where non-target contacts in a group corresponding to the group chat interface are located.
In the foregoing solution, optionally, in the group chat interface of the target contact and the group chat interface of the non-target contact, a reminding message for reminding the target contact is not displayed.
In the above scheme, optionally, detecting whether the selected contact icon is dragged, and determining a contact corresponding to the contact icon dragged to the designated area as the target contact includes:
judging whether the selected contact icon moves from the first position to the second position; the first position is the initial position of the contact icon, and the second position is the final position of the contact icon;
if the first position is different from the second position, judging that the selected contact icon is dragged;
and if the second position does not fall into the designated area, automatically jumping the selected contact icon from the second position to the first position.
In the foregoing scheme, optionally, the first mode includes one or more of the following:
ringing according to a preset ring, vibrating, changing the color of an indicator light or flashing according to a preset frequency, and prompting by voice.
With the group chat message processing device and method provided by the invention, a first operation is received while the mobile terminal displays the group chat interface of a first application, the first operation being a selection operation on a contact icon displayed in the group chat interface; the selected contact icon is acquired based on the first operation; whether the selected contact icon is dragged is detected, and the contact corresponding to a contact icon dragged to the designated area is determined as a target contact; after the target contact is determined, the collected voice information is sent to the mobile terminal where the target contact is located, and that mobile terminal is informed to remind the target contact, in a first mode, that there is voice information directed at the target contact. In this way, the target contact is determined by dragging the target contact's icon to the designated area, so the sender can conveniently select target contacts in the group and, after inputting the voice information, does not need to enter additional reminders such as text or pictures, which improves the input efficiency of the voice information and the sender's user experience. Meanwhile, the target contact is reminded by his own mobile terminal that there is voice information directed at him, so the target object can learn in time whether the group chat contains voice information directed at him, which improves the receiver's user experience.
Drawings
Fig. 1 is a schematic hardware configuration diagram of an alternative mobile terminal implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
fig. 3 is a schematic flow chart illustrating an implementation of a group chat message processing method according to an embodiment of the present invention;
fig. 4(a) is a schematic diagram of a group chat interface when a first operation is not received according to an embodiment of the present invention; fig. 4(b) is a schematic diagram of dragging a target contact icon to a designated area according to an embodiment of the present invention; fig. 4(c) is a schematic diagram illustrating a target contact icon dragged to a designated area according to an embodiment of the present invention; fig. 4(d) is a schematic diagram illustrating that after a target contact is determined, a new target contact icon is dragged to the designated area again according to the embodiment of the present invention; fig. 4(e) is a schematic diagram illustrating that after a target contact is determined, a new target contact icon is dragged to a designated area according to an embodiment of the present invention;
fig. 5(a) is a schematic diagram of a group chat interface of a mobile terminal held by a sender after the sender sends a voice message to a target contact in the group chat interface according to an embodiment of the present invention; fig. 5(b) is a schematic diagram of a group chat interface of a mobile terminal held by a target contact after a sender sends a voice message to the target contact in the group chat interface according to an embodiment of the present invention; fig. 5(c) is a schematic diagram of a group chat interface of a mobile terminal held by a non-target contact in a group after a sender sends a voice message to the target contact in the group chat interface according to an embodiment of the present invention;
fig. 6(a) is another schematic diagram illustrating a group chat interface of a mobile terminal held by a target contact after a sender sends a voice message to the target contact in the group chat interface; fig. 6(b) is another schematic diagram illustrating a group chat interface of a mobile terminal held by a non-target contact in a group after a sender sends a voice message to the target contact in the group chat interface;
fig. 7 is a schematic structural diagram of a group chat message processing apparatus according to an embodiment of the present invention.
Detailed Description
So that the manner in which the features and aspects of the embodiments of the present invention can be understood in detail, a more particular description of the embodiments of the invention, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings.
A terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the description of the embodiments of the present invention, and have no specific meaning in themselves. Thus, "module", "component" or "unit" may be used mixedly.
The terminal may be implemented in various forms. For example, the terminal described in the embodiments of the present invention may include terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a navigation device, and the like, and fixed terminals such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiment of the present invention can be applied to a fixed type terminal in addition to elements particularly used for moving purposes.
Fig. 1 is a schematic hardware configuration of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 100 may include an audio/video (A/V) input unit 120, a user input unit 130, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, and the like. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The A/V input unit 120 is used to receive an audio or video signal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium), and two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification Module may store various information for authenticating a User using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The Display unit 151 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT-LCD), an Organic Light-Emitting Diode (OLED) Display, a flexible Display, a three-dimensional (3D) Display, and the like. Some of these displays may be configured to be transparent to allow a user to see from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a Transparent Organic Light Emitting Diode (TOLED) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user of it. By providing such a tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs or the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, etc.) that has been output or is to be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The Memory 160 may include at least one type of storage medium including a flash Memory, a hard disk, a multimedia card, a card-type Memory (e.g., SD or DX Memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic Memory, a magnetic disk, an optical disk, etc. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to now, the mobile terminal has been described in terms of its functions. Hereinafter, a slide-type mobile terminal among various types of mobile terminals, such as a folder-type, bar-type, swing-type, slide-type mobile terminal, and the like, will be described as an example for the sake of brevity. Accordingly, the present invention can be applied to any type of mobile terminal, and is not limited to a slide type mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to an embodiment of the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by such communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, a Base Station Controller (BSC) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSC 275, which may be coupled to the base station 270 via a backhaul. The backhaul may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular frequency spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terminology. In such a case, the term "base station" may be used to refer collectively to a single BSC 275 and at least one BS 270. The base stations may also be referred to as "cells". Alternatively, each sector of a particular BS 270 may be referred to as a cell site.
As shown in fig. 2, a Broadcast Transmitter (BT) 295 transmits a Broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is generally configured to cooperate with satellites 300 to obtain desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Based on the above mobile terminal hardware structure and communication system, in order to better serve users, improve the input efficiency of voice information, and enable the target object to quickly know the voice information for the target object, the embodiments of the method of the present invention are provided.
Example one
Fig. 3 is a schematic flow chart illustrating an implementation of a group chat message processing method according to an embodiment of the present invention, where the group chat message processing method in this example is applied to a mobile terminal, and as shown in fig. 3, the group chat message processing method mainly includes the following steps:
step 301: receiving a first operation; wherein the first operation is a selection operation for a contact icon displayed in a group chat interface.
Here, a first application is provided on the mobile terminal, wherein the first application is an application that can support group chat. For example, the first application may be an instant messaging application.
Here, the contact icon is an identifier that can characterize a contact. For example, the contact icon may be an icon of an avatar for the contact.
In an optional implementation manner, the first operation is received in the process that the mobile terminal displays the group chat interface of the first application.
Here, the first operation may be a long press operation, a single click operation, a double click operation, or the like.
Step 302: and acquiring the selected contact icon based on the first operation.
In an optional embodiment, the obtaining the selected contact icon based on the first operation includes:
in the process that the mobile terminal displays the group chat interface of the first application, after the first operation is received, the selected contact icon is set to be in the second state from the first state.
Here, the first state refers to a fixed state in which the contact icon displayed in the group chat interface is not allowed to be operated in a normal case, and the operation includes a move or drag operation.
Here, the second state refers to that the contact icon presented in the group chat interface is in a movable state after the first operation is received.
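As a concrete illustration of this state change, here is a small Kotlin sketch, assuming a long press is used as the first operation; the enum, class, and method names are hypothetical and used only for illustration.

```kotlin
// Hypothetical sketch: the first operation (e.g. a long press) switches the selected
// contact icon from the fixed "first state" to the movable "second state".
enum class IconState { FIXED, MOVABLE }   // first state / second state

class GroupChatIcon(val contactId: String) {
    var state: IconState = IconState.FIXED
        private set

    fun onFirstOperation() {          // long press, single click, double click, ...
        state = IconState.MOVABLE     // the icon may now be moved or dragged
    }

    fun releaseSelection() {
        state = IconState.FIXED       // back to the normal, non-draggable state
    }
}

fun main() {
    val icon = GroupChatIcon("contact-C")
    icon.onFirstOperation()
    println("${icon.contactId} is now ${icon.state}")   // contact-C is now MOVABLE
}
```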
Step 303: and detecting whether the selected contact icon is dragged or not, and determining the contact corresponding to the contact icon dragged to the designated area as the target contact.
Optionally, the designated area is an area on the group chat interface related to an area where a voice key icon is located.
The designated area may be the same as the area where the voice key icon is located; the designated area may also be smaller than the area where the voice key icon is located, such as being located within the area where the voice key icon is located and being a part of the area where the voice key icon is located; the designated area may also be larger than the area where the voice key icon is located, such as covering the area where the voice key icon is located and being larger than the area where the voice key icon is located.
Here, the shape of the designated area is not limited, and for example, the designated area may be a circle, a triangle, a rectangle, an irregular shape, or the like.
For example, the designated area is an area occupied by a voice key icon on the group chat interface.
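One possible reading of the relationship between the designated area and the voice key icon is sketched below in Kotlin: the designated area is derived from the voice key's bounds and may be identical to, smaller than, or larger than it. The rectangle type and the margin parameter are assumptions made for illustration only.

```kotlin
// Hypothetical sketch: derive the designated area from the voice key icon's bounds.
// A positive margin enlarges the area beyond the icon, a negative margin shrinks it,
// and zero makes the designated area identical to the icon's own area.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

fun designatedArea(voiceKeyBounds: Rect, margin: Float = 0f) = Rect(
    voiceKeyBounds.left - margin,
    voiceKeyBounds.top - margin,
    voiceKeyBounds.right + margin,
    voiceKeyBounds.bottom + margin
)
```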
In an optional embodiment, the detecting whether the selected contact icon is dragged, and determining a contact corresponding to the contact icon dragged to the designated area as the target contact includes:
judging whether the selected contact icon moves from the first position to the second position; the first position is the initial position of the contact icon, and the second position is the final position of the contact icon;
if the first position is different from the second position, the selected contact person icon is judged to be dragged, and if the second position falls into a designated area, the selected contact person icon is determined to be a target contact person; if the second position does not fall into the designated area, the selected contact icon automatically jumps back to the first position from the second position;
if the first position is the same as the second position, judging that the selected contact icon is not dragged, and directly determining the selected contact icon as a non-target contact.
Therefore, if the selected contact icon is dragged but not dragged to the designated area, it automatically jumps back from the second position to the first position, giving the sender intuitive feedback so that the sender can clearly tell whether the target contact was selected successfully.
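A hedged Kotlin sketch of the drag handling just described, assuming the first and second positions are reported as screen coordinates; the class and method names are illustrative, not part of the patent.

```kotlin
// Hypothetical drag-end handling: compare the icon's initial position (first position)
// with its final position (second position); if it was dragged into the designated
// area the contact becomes a target contact, otherwise the icon jumps back.
data class Point(val x: Float, val y: Float)
data class Area(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

class DragResolver(private val designatedArea: Area) {
    val targetContacts = mutableSetOf<String>()

    /** Returns the position the icon should end up at. */
    fun onDragEnd(contactId: String, first: Point, second: Point): Point {
        if (first == second) return first              // not dragged: non-target contact
        return if (designatedArea.contains(second)) {
            targetContacts += contactId                // dragged into the designated area
            second
        } else {
            first                                      // automatically jump back to the first position
        }
    }
}
```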
In an optional embodiment, after determining the contact corresponding to the contact icon dragged to the designated area as the target contact, the method further includes:
presenting a voice input prompt key in a group chat interface of a sender mobile terminal; the voice input prompt key is used for starting a voice acquisition function when receiving a trigger operation.
In an optional embodiment, the method further comprises:
establishing a correlation between the operation of dragging the contact icon to the designated area and a voice key in a group chat interface;
and when the operation of dragging the contact icon to the designated area is detected, judging to trigger a voice key, and presenting a voice input prompt key in a group chat interface of the sender mobile terminal.
In this way, no dedicated user operation for triggering the voice key needs to be received: once the operation of dragging the contact icon to the designated area is detected, the voice key is triggered automatically and the voice input prompt key is presented, which simplifies the user's operation steps. After the sender determines the target contact in the group chat interface, the sender can directly tap the voice input prompt key to input voice without triggering the voice key first.
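The association described above might be modelled as follows; this is a sketch under the assumption that dropping an icon in the designated area is the only event bound to the voice key, and all names are hypothetical.

```kotlin
// Hypothetical sketch of the association: completing a drag into the designated area
// counts as triggering the voice key, so the voice input prompt key is presented
// without any separate user operation on the voice key itself.
class VoiceInputPrompt {
    var visible = false
        private set

    fun present() { visible = true }      // tapping it would start voice acquisition
}

class VoiceKeyBinding(private val prompt: VoiceInputPrompt) {
    fun onIconDroppedInDesignatedArea(contactId: String, targets: MutableSet<String>) {
        targets += contactId              // the dragged contact becomes a target contact
        prompt.present()                  // the voice key is considered triggered automatically
    }
}
```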
Step 304: after the target contact is determined, sending the collected voice information to the mobile terminal where the target contact is located, and informing the mobile terminal where the target contact is located to remind the target contact, in a first mode, that there is voice information directed at the target contact.
Here, the determined target contact may be one person or a plurality of persons in the group.
In a specific embodiment, selection of target contacts is determined to be finished when:
no further first operation is received within a preset time; or
an operation triggering the voice input prompt key is detected, where the voice input prompt key is used to start the voice acquisition function when it receives a trigger operation.
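A small Kotlin sketch of these two completion conditions, assuming the preset time is measured from the most recent first operation; the names and the timing source are illustrative assumptions.

```kotlin
// Hypothetical sketch of the two completion conditions: either no further first
// operation arrives within a preset time, or the voice input prompt key is triggered.
class TargetSelectionSession(private val presetTimeMillis: Long) {
    private var lastFirstOperationAt = System.currentTimeMillis()
    private var promptKeyTriggered = false

    fun onFirstOperation() { lastFirstOperationAt = System.currentTimeMillis() }
    fun onVoiceInputPromptTriggered() { promptKeyTriggered = true }

    /** Selection of target contacts is considered finished when either condition holds. */
    fun isFinished(now: Long = System.currentTimeMillis()): Boolean =
        promptKeyTriggered || now - lastFirstOperationAt >= presetTimeMillis
}
```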
For example, suppose there are 4 people in the group, identified as A, B, C and D, and D is the person who intends to send a voice message to a target contact in the group. Fig. 4(a) is a schematic diagram of the group chat interface before the first operation is received; as can be seen from fig. 4(a), this is the group chat interface displayed by the mobile terminal held by D, and D has not yet selected a target contact. Fig. 4(b) illustrates the target contact icon being dragged to the designated area; as can be seen from fig. 4(b), D intends C to be the target contact and drags C's contact icon to the designated area, that is, to where the voice key is located. Fig. 4(c) illustrates the state after the target contact icon has been dragged to the designated area; as can be seen from fig. 4(c), the system determines C as the target contact, and the text input box in the group chat interface is changed into a voice input prompt key. Fig. 4(d) illustrates a new target contact icon being dragged to the designated area again after a target contact has already been determined; as can be seen from fig. 4(d), after C is determined as the target contact, D also intends B to be a target contact and drags B's contact icon to the designated area, that is, to where the voice key is located. Fig. 4(e), corresponding to fig. 4(c), illustrates the state after the new target contact icon has been dragged to the designated area; as can be seen from fig. 4(e), the system determines both C and B as target contacts, and the voice input prompt key already present in the group chat interface is unchanged.
In an optional implementation manner, when the collected voice information is sent to the mobile terminal where the target contact is located, the method further includes:
and sending the collected voice information to a mobile terminal where a non-target contact person in a group corresponding to the group chat interface is located, but not informing the mobile terminal where the non-target contact person is located to remind the non-target contact person of the voice information aiming at the target contact person in a first mode.
That is, whether for the target contact or for a non-target contact, the voice message presented in the group chat interface of the first application displayed by each mobile terminal in the group is the same. After the sender sends the voice message, no reminder message such as '@XXX' needs to be sent in the group chat interface to remind the target contact, which simplifies the user's input operation and improves the input efficiency of the voice message; in addition, this avoids the problem that a reminded person who receives several voice messages sent by different people at the same time has to search through them for the one directed at him.
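The dispatch behaviour described here (and the alternative described further below, in which non-target contacts do not receive the message at all) could be sketched as follows; the transport interface, flag names, and parameters are assumptions for illustration only.

```kotlin
// Hypothetical dispatch sketch: every group member can receive the voice message, but only
// the terminals of target contacts are asked to remind their users in the first mode; the
// sendToNonTargets flag also covers the alternative of not delivering to non-targets at all.
data class OutgoingVoice(val audio: ByteArray, val remindInFirstMode: Boolean)

interface TerminalChannel {                       // transport abstraction (assumption)
    fun deliver(contactId: String, message: OutgoingVoice)
}

fun dispatchGroupVoice(
    channel: TerminalChannel,
    groupMembers: Set<String>,
    targets: Set<String>,
    audio: ByteArray,
    sendToNonTargets: Boolean = true
) {
    for (member in groupMembers) {
        val isTarget = member in targets
        if (isTarget || sendToNonTargets) {
            // No "@xxx" text is added to the chat; the reminder is carried as a flag instead.
            channel.deliver(member, OutgoingVoice(audio, remindInFirstMode = isTarget))
        }
    }
}
```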
In a specific optional implementation manner, in the group chat interface of the target contact and the group chat interface of the non-target contact, a reminding message for reminding the target contact is not displayed. The reminding message for reminding the target contact person comprises words, characters or pictures and the like.
In a specific optional implementation manner, in the group chat interface of the target contact and the group chat interface of the non-target contact, the message display contents of the voice message are the same, and a reminding message for reminding the target contact is not displayed.
In a specific optional implementation manner, in the group chat interface of the target contact and the group chat interface of the non-target contact, the message display contents of the voice message are different, the voice message is only displayed on the group chat interface of the target contact, and a reminding message for reminding the target contact is not displayed.
Here, "display content" refers to what is presented on the group chat interface; in the latter case, the voice message does not appear at all on the group chat interface of a non-target contact.
Continuing with the example of fig. 4, there are 4 people in the group, identified as A, B, C and D; D is the person who intends to send a voice message to a target contact in the group, and D determines only C as the target contact. Fig. 5(a) is a schematic diagram of the group chat interface of the mobile terminal held by the sender after the sender sends the voice message to the target contact in the group chat interface; as can be seen from fig. 5(a), this is the group chat interface displayed by the mobile terminal held by D, and D no longer needs to input a reminder message such as "@C". Fig. 5(b) is a schematic diagram of the group chat interface of the mobile terminal held by the target contact after the sender sends the voice message; as shown in fig. 5(b), the group chat interface contains only the voice message and no "@C" reminder message, but the target contact C learns, from the reminder given by his mobile terminal in the first mode (for example, ringing), that the voice message sent by D is directed at him. Similarly, after the sender sends the voice message, the group chat interface of a mobile terminal held by a non-target contact in the group is similar to that in fig. 5(b); as shown in fig. 5(c), which is the group chat interface displayed by the mobile terminal held by A, there is no "@C" reminder message, and the non-target contact does not know that the voice message sent by D is directed at a target contact. In this way, the group chat interface remains simple and tidy.
In another optional implementation manner, when the collected voice information is sent to the mobile terminal where the target contact is located, the method further includes:
and not sending the collected voice information to the mobile terminal where the non-target contact persons in the group corresponding to the group chat interface are located.
Therefore, voice messages aiming at corresponding target contacts are selectively released into the group chat interface, so that the privacy can be better respected and protected; it is also possible to avoid non-target contacts being disturbed by voice messages directed only to target contacts.
Continuing with the example of fig. 5, there are 4 people in the group, identified as A, B, C and D; D is the person who intends to send a voice message to a target contact in the group, and D determines only C as the target contact. Fig. 6(a) is a schematic diagram of the group chat interface of the mobile terminal held by the target contact after the sender sends the voice message to the target contact in the group chat interface; as shown in fig. 6(a), there is a new voice message from D in the group chat interface and no "@C" reminder message, but the target contact C learns, from the reminder given by his mobile terminal in the first mode (for example, ringing), that the voice message sent by D is directed at him. Fig. 6(b) is a schematic diagram of the group chat interface of a mobile terminal held by a non-target contact in the group after the sender sends the voice message to the target contact; as shown in fig. 6(b), in the group chat interface displayed by the mobile terminals held by A and B, who are not target contacts, there is no new voice message directed at the target contact C. In this way, the privacy of the sender and the target contact is protected, and non-target contacts are spared the useless effort of listening to a voice message intended only for the target contact.
In the foregoing scheme, optionally, the first mode includes one or more of the following:
ringing according to a preset ring, vibrating, changing the color of an indicator light or flashing according to a preset frequency, and prompting by voice.
Here, this embodiment does not limit the first mode; any mode that enables the target contact to know that a newly received voice message in the group chat interface is a voice message from the sender directed at the target contact may be used as the first mode.
Of course, it should be noted that, for different recipient users, different first modes may be set according to usage habits.
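On the receiving side, the first mode could be modelled as a per-user set of reminder actions, as in the following sketch; the enum values mirror the list above, while the class name and the println placeholders are illustrative assumptions standing in for real ringtone, vibration, and LED control.

```kotlin
// Hypothetical receiver-side sketch: the first mode is one or more of ringing, vibrating,
// changing/flashing an indicator light, or a voice prompt, chosen per the user's habits.
enum class FirstMode { RING, VIBRATE, INDICATOR_LIGHT, VOICE_PROMPT }

class TargetReminder(private val preferredModes: Set<FirstMode>) {
    fun onVoiceForMe(senderId: String) {
        for (mode in preferredModes) {
            when (mode) {
                FirstMode.RING            -> println("ring with preset ringtone for $senderId")
                FirstMode.VIBRATE         -> println("vibrate")
                FirstMode.INDICATOR_LIGHT -> println("change colour / flash LED at preset frequency")
                FirstMode.VOICE_PROMPT    -> println("play voice prompt")
            }
        }
    }
}

fun main() {
    TargetReminder(setOf(FirstMode.RING, FirstMode.VIBRATE)).onVoiceForMe("D")
}
```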
In this embodiment, a first operation is received while the mobile terminal displays the group chat interface of a first application, the first operation being a selection operation on a contact icon displayed in the group chat interface; the selected contact icon is acquired based on the first operation; whether the selected contact icon is dragged is detected, and the contact corresponding to a contact icon dragged to the designated area is determined as a target contact; after the target contact is determined, the collected voice information is sent to the mobile terminal where the target contact is located, and that mobile terminal is informed to remind the target contact, in a first mode, that there is voice information directed at the target contact. In this way, the target contact is determined by dragging the target contact's icon to the designated area, so the sender can conveniently select target contacts in the group and, after inputting the voice information, does not need to enter additional reminders such as text or pictures, which improves the input efficiency of the voice information and the sender's user experience. Meanwhile, the target contact is reminded by his own mobile terminal that there is voice information directed at him, so the target object can learn in time whether the group chat contains voice information directed at him, which improves the receiver's user experience.
Example two
Fig. 7 is a schematic structural diagram of a group chat message processing apparatus according to an embodiment of the present invention, applied to a mobile terminal; as shown in fig. 7, the apparatus includes a receiving unit 71, an obtaining unit 72, a determining unit 73, and a control unit 74, wherein:
a receiving unit 71 for receiving a first operation; the first operation is a selection operation of a contact icon displayed in a group chat interface;
an obtaining unit 72, configured to obtain the selected contact icon based on the first operation;
a determining unit 73, configured to detect whether the selected contact icon is dragged, and determine a contact corresponding to the contact icon dragged to the designated area as a target contact;
and a control unit 74, configured to, after the target contact is determined, send the collected voice information to the mobile terminal where the target contact is located and notify that mobile terminal to remind the target contact, in a first mode, that there is voice information directed at the target contact.
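To show how the units of fig. 7 could cooperate end to end, here is a self-contained Kotlin sketch; it folds the four units into one class for brevity, and all names, types, and callbacks are illustrative assumptions rather than the apparatus itself.

```kotlin
// Hypothetical end-to-end wiring: receiving unit 71 takes the first operation, obtaining
// unit 72 keeps the selected icon, determining unit 73 resolves targets from drags into
// the designated area, and control unit 74 dispatches the voice message with a
// first-mode reminder for targets only.
data class Icon(val contactId: String)

class GroupChatMessageProcessor(private val designatedContains: (Float, Float) -> Boolean) {
    private var selected: Icon? = null                 // obtaining unit 72 state
    private val targets = mutableSetOf<String>()       // determining unit 73 state

    fun receiveFirstOperation(icon: Icon) {            // receiving unit 71
        selected = icon
    }

    fun onDragEnd(endX: Float, endY: Float) {          // determining unit 73
        val icon = selected ?: return
        if (designatedContains(endX, endY)) targets += icon.contactId
        selected = null
    }

    fun sendVoice(                                     // control unit 74
        audio: ByteArray,
        group: Set<String>,
        deliver: (to: String, audio: ByteArray, remind: Boolean) -> Unit
    ) {
        for (member in group) deliver(member, audio, member in targets)
        targets.clear()
    }
}
```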
In an embodiment, the control unit 74 is further configured to: and when the collected voice information is sent to the mobile terminal of the target contact person, the collected voice information is also sent to the mobile terminal of the non-target contact person in the group corresponding to the group chat interface, but the mobile terminal of the non-target contact person is not informed to remind the non-target contact person of the voice information aiming at the target contact person in a first mode.
In another embodiment, the control unit 74 is further configured to: and when the collected voice information is sent to the mobile terminal where the target contact person is located, the collected voice information is not sent to the mobile terminal where the non-target contact person in the group corresponding to the group chat interface is located.
In one embodiment, a reminding message for reminding the target contact person is not displayed in the group chat interface of the target contact person and the group chat interface of the non-target contact person. The reminding message for reminding the target contact person comprises words, characters or pictures and the like.
In an embodiment, in the group chat interface of the target contact and the group chat interface of the non-target contact, the message display contents of the voice message are the same, and a reminding message for reminding the target contact is not displayed.
In an embodiment, in the group chat interface of the target contact and the group chat interface of the non-target contact, the message display content of the voice message is different, the voice message is only displayed on the group chat interface of the target contact, and a reminding message for reminding the target contact is not displayed.
In an embodiment, the determining unit 73 is further configured to:
judging whether the selected contact icon moves from the first position to the second position; the first position is the initial position of the contact icon, and the second position is the final position of the contact icon;
if the first position is different from the second position, the selected contact person icon is judged to be dragged, and if the second position falls into a designated area, the selected contact person icon is determined to be a target contact person; if the second position does not fall into the designated area, the selected contact icon automatically jumps back to the first position from the second position;
if the first position is the same as the second position, judging that the selected contact icon is not dragged, and directly determining the selected contact icon as a non-target contact.
Therefore, if the selected contact icon is dragged but not dragged to the designated area, it automatically jumps back from the second position to the first position, giving the sender intuitive feedback so that the sender can clearly tell whether the target contact was selected successfully.
In an alternative embodiment, the control unit 74 is further configured to:
after determining a contact corresponding to the contact icon dragged to the designated area as a target contact, presenting a voice input prompt key in a group chat interface of the sender mobile terminal; the voice input prompt key is used for starting a voice acquisition function when receiving a trigger operation.
In an alternative embodiment, the apparatus further comprises:
the establishing unit 75 is configured to establish a correlation between an operation of dragging the contact icon to the designated area and a voice key in the group chat interface;
the control unit 75 is further configured to detect an operation of dragging the contact icon to the designated area, determine to trigger a voice key, and present a voice input prompt key in a group chat interface of the sender mobile terminal.
The group chat message processing device can be arranged in the mobile terminal.
In practical applications, the specific structures of the receiving unit 71, the obtaining unit 72, the determining unit 73, the controlling unit 74, and the establishing unit 75 may all correspond to a processor. The specific structure of the processor may be a Central Processing Unit (CPU), a Micro Controller Unit (MCU), a Digital Signal Processor (DSP), a Programmable Logic Controller (PLC), or other electronic components or a collection of electronic components having a Processing function. The processor includes executable codes, the executable codes are stored in a storage medium, the processor can be connected with the storage medium through a communication interface such as a bus, and when the corresponding functions of specific units are executed, the executable codes are read from the storage medium and executed. The portion of the storage medium used to store the executable code is preferably a non-transitory storage medium.
The receiving unit 71, the obtaining unit 72, the determining unit 73, the controlling unit 74 and the establishing unit 75 may be integrated to correspond to the same processor, or respectively correspond to different processors; when integrated to correspond to the same processor, the processor performs time division processing of the corresponding functions of the receiving unit 71, the obtaining unit 72, the determining unit 73, the controlling unit 74, and the establishing unit 75.
The group chat message processing device can improve the input efficiency of voice information, and meanwhile, a target object can quickly know the voice information aiming at the target object, so that the use experience of a user is improved.
It should be understood by those skilled in the art that the functions of the units in the group chat message processing apparatus according to the embodiment of the present invention may be realized by an analog circuit that implements the functions described in the embodiment of the present invention, or by running software that executes the functions described in the embodiment of the present invention on an intelligent terminal.
The present invention further discloses a mobile terminal, which includes the group chat processing apparatus shown in fig. 7, and is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description covers only preferred embodiments of the present invention and is not intended to limit its scope. Any equivalent structural or process modification made using the contents of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, falls within the scope of the present invention.

Claims (10)

CN201610929486.6A | 2016-10-31 | 2016-10-31 | Group chat message processing apparatus and method | Pending | CN106302137A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201610929486.6A | CN106302137A (en) | 2016-10-31 | 2016-10-31 | Group chat message processing apparatus and method

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201610929486.6A | CN106302137A (en) | 2016-10-31 | 2016-10-31 | Group chat message processing apparatus and method

Publications (1)

Publication Number | Publication Date
CN106302137A | 2017-01-04

Family

ID=57720626

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201610929486.6A | Pending | CN106302137A (en) | 2016-10-31 | 2016-10-31

Country Status (1)

Country | Link
CN (1) | CN106302137A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN102567367A (en)* | 2010-12-25 | 2012-07-11 | 上海量明科技发展有限公司 | Method and system for setting correlations in communication interface
CN105204748A (en)* | 2014-06-27 | 2015-12-30 | 阿里巴巴集团控股有限公司 | Terminal interaction method and device
CN104184887A (en)* | 2014-07-29 | 2014-12-03 | 小米科技有限责任公司 | Message prompting method and device and terminal equipment
CN105376141A (en)* | 2015-11-24 | 2016-03-02 | 阿里巴巴集团控股有限公司 | Instant communication message processing method and device
CN105897561A (en)* | 2016-05-26 | 2016-08-24 | 努比亚技术有限公司 | Message sending method, device and system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107146621A (en)* | 2017-03-31 | 2017-09-08 | 北京奇艺世纪科技有限公司 | A kind of voice message originator method for instant messaging, apparatus and system
CN107357484A (en)* | 2017-06-29 | 2017-11-17 | 维沃移动通信有限公司 | One kind prompting mark adding method, terminal and computer-readable recording medium
CN107592416A (en)* | 2017-08-31 | 2018-01-16 | 努比亚技术有限公司 | Method for sending voice message, terminal and computer-readable recording medium
CN107592416B (en)* | 2017-08-31 | 2020-11-17 | 努比亚技术有限公司 | Voice message transmitting method, terminal and computer readable storage medium
CN107911556A (en)* | 2017-11-27 | 2018-04-13 | 珠海市魅族科技有限公司 | Contact person's based reminding method, device, computer installation and computer-readable recording medium
CN109766156A (en)* | 2018-12-24 | 2019-05-17 | 维沃移动通信有限公司 | A session creation method and terminal device
US12028476B2 (en) | 2018-12-24 | 2024-07-02 | Vivo Mobile Communication Co., Ltd. | Conversation creating method and terminal device
CN109873751A (en)* | 2019-01-11 | 2019-06-11 | 珠海格力电器股份有限公司 | Group chat voice information processing method and device, storage medium and server
CN109873751B (en)* | 2019-01-11 | 2020-10-09 | 珠海格力电器股份有限公司 | Group chat voice information processing method and device, storage medium and server
CN110177041A (en)* | 2019-05-31 | 2019-08-27 | 网易(杭州)网络有限公司 | The sending method and device of voice messaging, storage medium, electronic device
CN110177041B (en)* | 2019-05-31 | 2022-05-03 | 网易(杭州)网络有限公司 | Voice information sending method and device, storage medium and electronic device

Similar Documents

Publication | Publication Date | Title
CN106302137A (en) | Group chat message processing apparatus and method
CN105487802B (en) | Screen projection management method, device and system
CN105303398B (en) | Information display method and system
CN106547439B (en) | Method and device for processing message
CN106210328A (en) | Information display device and method
CN105955613B (en) | A kind of control method and device
CN106776017B (en) | Device and method for cleaning application memory and garbage
CN107705120A (en) | A kind of barcode scanning payment mechanism and method
CN105554386A (en) | Mobile terminal and camera shooting control method thereof
CN105490928A (en) | Mobile terminal and multitask processing method thereof
CN104881223A (en) | Method and device for achieving application operation
CN106406621B (en) | A kind of mobile terminal and its method for handling touch control operation
CN105187663B (en) | Automatically the method and terminal conversed
CN106713121A (en) | Device and method for acquiring instant message state information
CN105245724A (en) | Mobile terminal and incoming call processing method thereof
CN106774860B (en) | Intelligent eliminating method and system for information corner mark
CN106020693B (en) | Desktop page entry method and electronic device
CN104978140B (en) | A kind of information processing method and mobile terminal
CN105224177B (en) | A kind of mobile terminal and application icon redraw method
CN106095790A (en) | A kind of mobile terminal and control method thereof
CN106302134A (en) | A kind of message playing device and method
CN106453826A (en) | Quick reply method and apparatus, and terminal
CN105357188A (en) | Method and server for realizing WIFI connection and mobile terminal
CN105138235A (en) | Picture processing apparatus and method
CN104850351A (en) | Method and device for frameless interaction

Legal Events

Date | Code | Title | Description
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
RJ01 | Rejection of invention patent application after publication | Application publication date: 2017-01-04
RJ01 | Rejection of invention patent application after publication
