US9173074B2 - Personal hub presence and response - Google Patents

Personal hub presence and response

Info

Publication number
US9173074B2
US9173074B2 (application US 13/686,899; US201213686899A)
Authority
US
United States
Prior art keywords
message
recipient
mobile device
received message
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US13/686,899
Other versions
US20130316746A1 (en)
Inventor
Brian F. MILLER
Jose Menendez
Rohit Sauhta
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc
Priority to US 13/686,899 (US9173074B2)
Assigned to QUALCOMM INCORPORATED. Assignment of assignors' interest (see document for details). Assignors: SAUHTA, ROHIT; MENENDEZ, JOSE; MILLER, BRIAN F.
Priority to KR1020147036239A (KR101622506B1)
Priority to EP13722921.7A (EP2856782B1)
Priority to PCT/US2013/038745 (WO2013180873A2)
Priority to CN201380027418.5A (CN104335612B)
Publication of US20130316746A1
Application granted
Publication of US9173074B2
Legal status: Expired - Fee Related
Adjusted expiration

Abstract

Methods, devices, and systems for transmitting convenient messages to a recipient for rendering based on the recipient's device availabilities. A recipient's mobile device may be connected to a personal hub and/or earpiece devices configured to render various incoming communications, such as audio messages and visual messages. The incoming messages may be delivered to the recipient's mobile device and other connected devices that may render the contents of the incoming messages. A delivery confirmation message that describes the receipt and use of incoming messages may be generated and returned to a sender's computing device. In an embodiment, the recipient's devices may generate status information for describing the status of devices to a sender's computing device. In an embodiment, the sender's computing device may generate and transmit outgoing messages formatted based on the received status information and including metadata that instructs the recipient's devices to render message content in particular manners.

Description

RELATED APPLICATIONS
The present non-provisional patent application claims the benefit of priority to U.S. Provisional Application No. 61/652,229 entitled “Personal Hub Presence and Response,” and filed May 27, 2012, the entire contents of which are hereby incorporated by reference.
BACKGROUND
Mobile devices, such as mobile phones, smart phones, tablets, or laptops, may have various types of user notifications including audio, visual, and haptic (e.g., vibration) notifications. These devices may also include various modes combining or restricting some of these notifications (e.g., a smart phone set to vibrate may not ring or a laptop allowing a pop up reminder may not sound an alarm when sound is muted). Current mobile devices may allow users to control notification modes (e.g., silencing a ringing phone that may be interrupting a meeting or movie). However, mobile device users may call or text a friend, family member, or co-worker without actually knowing if their message was received, seen, or ignored. Mobile device users can be frustrated by having no contextual or definitive confirmation of the receipt and ingestion of their messages. Typically, users must wait for a response from the recipient or try sending redundant messages to expedite their communication attempts. Message recipients may be inconvenienced by multiple messages and the social discomfort of being unable to easily describe their availability to those attempting to contact them.
SUMMARY
Methods, devices, and systems enable various embodiments for transmitting convenient messages to a recipient for rendering based on the recipient's device availabilities. In particular, a recipient may utilize a mobile device with a connected personal hub, wireless earpieces, and/or other associated devices to determine and transmit their status information to a sender's computing device and/or a server for storage. The status information may include status indicators, such as whether a device is activated, currently in a voice call or some other form of use, mute, or in silent mode. In an embodiment, the recipient's computing device may record and transmit information back to the sender's computing device describing the receipt and delivery of messages. In an embodiment, the sender's computing device may use status information describing the status of the recipient devices to generate messages that the recipient's personal hub and wireless earpieces may render in convenient ways for the recipient. In an embodiment, the recipient's personal hub may detect input from the recipient in response to rendering simple response options of a message received from a sender's computing device and generate a response message based on the detected input.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments of the invention, and together with the general description given above and the detailed description given below, serve to explain the features of the invention.
FIG. 1 is a communication system block diagram of a network suitable for use with the various embodiments.
FIG. 2 is a process flow diagram illustrating an embodiment method for sending messages and receiving the recipient's status information.
FIG. 3 is a process flow diagram illustrating an embodiment method for directly pulling a recipient's status information and sending messages based on the status information.
FIG. 4 is a process flow diagram illustrating an embodiment method for retrieving a recipient's status information from a server and sending messages based on the status information.
FIG. 5 is a data structure diagram illustrating potential elements of a presence update message.
FIG. 6 is a process flow diagram illustrating an embodiment method for generating and formatting a message based on the recipient's status information.
FIG. 7 is a process flow diagram illustrating an embodiment method for a recipient receiving, rendering and responding to a message formatted based on the user's status information.
FIG. 8 is a component diagram of an example mobile device suitable for use with the various embodiments.
FIG. 9 is a component diagram of another example mobile device suitable for use with the various embodiments.
FIG. 10 is a component diagram of a personal hub suitable for use with the various embodiments.
FIGS. 11A-11B are component diagrams of wireless earpieces suitable for use with the various embodiments.
DETAILED DESCRIPTION
The various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the invention or the claims.
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.
The term “mobile device” is used herein to refer to any one or all of mobile computing devices, such as cellular telephones, smart phones, personal or mobile multi-media players, personal data assistants (PDAs), laptop computers, tablet computers, smart books, palm-top computers, wireless electronic mail receivers, multimedia Internet enabled cellular telephones, wireless gaming controllers, and similar personal electronic devices which include a programmable processor, memory, and circuitry for sending and/or receiving voice and data calls, sending and/or receiving messages (e.g., short message service (SMS) messages, e-mails, application notifications such as Facebook® post notifications and/or game high score change notifications, etc.), sending and/or receiving warnings (e.g., low battery warnings, loss of network connectivity warnings, etc.), and/or sending and/or receiving reminders (e.g., calendar reminders, task reminders, etc.).
The term “personal hub” is used herein to refer to any device that may be worn or carried by a user and may interact with a variety of mobile devices and/or earpieces. In an embodiment, a personal hub may be configured to be worn by a user around the user's wrist in a manner similar to that of a wrist watch. In alternative embodiments, a personal hub may be a badge, tag, bracelet, patch, belt buckle, medallion, pen, key chain, or any other device worn or carried by a user. In an embodiment, a personal hub may wirelessly communicate with a user's mobile device(s), such as a smart phone, and a wireless earpiece worn by the user. The personal hub may wirelessly communicate with the user's mobile device(s) to enable the user to operate the user's mobile device(s) remotely. In an embodiment, a personal hub system may comprise a personal hub and one or two wireless earpieces worn by the user, together enabling a personal hub system user to utilize a user's mobile device(s) remotely.
As used herein, the term “sender” refers to a person sending a message using a computing device, such as a smart phone, a tablet device, a laptop computer, or any other device capable of transmitting SMS text messages, emails, audio data, visual data (e.g., images), or any other such electronic communications. Senders may also employ a personal hub and wireless earpieces when sending messages, although this is not necessary. As used herein, the term “recipient” refers to a person using a mobile device plus a personal hub and wireless earpieces to receive and reply to incoming messages from a sender, such as audio messages, visual messages, voice calls, text based messages, social media messages, and/or application reminders.
As used herein, the term “whisper message” may be used to refer to an audio clip message that may be delivered to a mobile device for delivery via a speaker, such as a wireless earpiece described herein. In general, whisper messages may be communications in which selected individuals (e.g., individuals on a user-defined privileged list) may transmit an audio clip that when received is immediately played in the recipient user's ears like a whisper. Whisper messages may be sent in addition to SMS, MMS and email messages. A sender wishing to send a whisper message may record the message, such as by speaking a brief verbal message into their mobile device (which may be running an application for creating whisper messages such as YagattaTalk®), and then press a send key or icon. The audio clip data may be transmitted through a wireless network (e.g., either a cellular data network or a WiFi network depending upon current connectivity) and delivered to the intended recipient's mobile device. The recipient's mobile device may confirm that the sender is on a privileged list, and if so, immediately play the audio clip through one or both of the ear pieces. In this manner, privileged senders can deliver a personal audio message to a recipient with the message being played immediately (or in response to a user action) in the recipient's ear just like a private whisper.
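For illustration only (code of this kind does not appear in the patent), the following Python sketch shows one way a recipient's device might gate an incoming whisper message on a privileged list and play it immediately through connected earpieces. The sender identifiers, the earpiece objects, and their play() method are hypothetical.

```python
# Minimal sketch, assuming a user-defined privileged list and earpiece objects
# that expose a play() method. Illustrative only; not the patent's implementation.

PRIVILEGED_SENDERS = {"+15551234567", "jose@example.com"}  # user-defined privileged list

def handle_whisper_message(sender_id: str, audio_clip: bytes, earpieces: list) -> str:
    """Decide how to treat an incoming whisper audio clip."""
    if sender_id not in PRIVILEGED_SENDERS:
        return "stored"            # not privileged: keep it as an ordinary message
    if not earpieces:
        return "queued"            # no earpiece connected: defer playback
    for earpiece in earpieces:     # privileged and earpiece available: play at once
        earpiece.play(audio_clip)
    return "played"
```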
The various embodiments provide methods, devices, and systems for transmitting convenient messages to a recipient for rendering based on the recipient's device availabilities. In general, a mobile device may be paired with a personal hub, a wireless earpiece, and various other communication devices configured to render incoming messages, such as a voice call (e.g., PSTN call, VOIP call, cellular call, etc), text based message (e.g., SMS, e-mail), social media message (e.g., Facebook® notification, Tweet®, etc), whisper message (e.g., a recorded voice message from a YagattaTalk® user), and/or application reminder (e.g., E-bay® auction notification, remote based calendar application reminder, etc). In response to receiving the incoming communication, the mobile device may render the received communication. For example, the mobile device may display a text message, play an audio message, or render a visual message. Alternatively, the mobile device may relay the received communication for rendering by the personal hub, wireless earpieces, and/or any other connected devices (e.g., a laptop, etc.). For example, an incoming message may be rendered as audio within a connected wireless earpiece device.
Based on the receipt and use of a message received from a sender, the recipient's mobile device may generate a delivery confirmation message for transmission back to the sender. The delivery confirmation message may contain timestamp information, identities of the various recipient devices (e.g., personal hub, wireless earpiece, etc.) to which the message (or its contents) was transmitted, descriptions of the manner in which the received message was displayed, played, or otherwise rendered, and any recipient interactions with the message (e.g., an acknowledgement, play selection, etc.). For example, the delivery confirmation message may report that the sender's audio message was received by the recipient's personal hub and played through connected wireless earpieces. Providing the sender with information regarding whether, and the manner in which, a message was delivered to the recipient may be useful in situations where delivery of the message is important to the sender or the recipient but the recipient is otherwise not able to immediately respond.
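As a non-authoritative illustration of the delivery confirmation described above, the sketch below assembles a confirmation record with a timestamp, the identities of the rendering devices, the rendering manner, and any recipient interactions. The field names and JSON encoding are assumptions, not the patent's format.

```python
# Hedged sketch of the kind of delivery confirmation record described above.
import json, time

def build_delivery_confirmation(message_id: str, rendered_on: list,
                                render_manner: str, interactions: list) -> str:
    """Summarize where and how a received message was rendered."""
    confirmation = {
        "message_id": message_id,
        "received_at": time.time(),              # timestamp of receipt
        "rendered_on": rendered_on,              # e.g. ["personal_hub", "right_earpiece"]
        "render_manner": render_manner,          # e.g. "audio_played", "text_displayed"
        "recipient_interactions": interactions,  # e.g. ["acknowledged"]
    }
    return json.dumps(confirmation)              # payload returned to the sender's device
```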
Additionally, a recipient's mobile device (e.g., smart phone, tablet, etc.), personal hub, and/or wireless earpieces may generate status information that describes the status of the recipient's devices. For the purposes of this disclosure, status information may be data that reflects the activities and/or availability of the recipient as well as his/her personal hub and/or wireless earpiece devices. For example, status information may include software settings (e.g., phone display lock, screen saver, etc.), information describing the privileged relationship between a sender and the recipient, a list of devices wirelessly connected to the recipient's mobile device or personal hub, information about the recipient's recent activities with the devices (e.g., recent input on touch screen unit or apps recently used), and data from sensors contained within the recipient's personal hub and/or wireless earpiece devices (e.g., temperature from sensors in the wireless earpieces, motion sensors in the wireless earpieces and/or personal hub, touch data from touch sensors in the wireless earpieces and/or personal hub, etc.). Status information may include information that may be relayed from the recipient's mobile device, personal hub and/or wireless earpiece devices that enables a sender or sender's computing device to determine the availability and best modes for communicating with the recipient, effectively providing a virtual presence of the recipient to the sender.
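The following sketch illustrates, under assumed device APIs (silent_mode, recent_apps(), read_accelerometer(), read_temperature(), and so on), how such status information might be assembled by polling the mobile device, personal hub, and earpieces. It is a sketch of the idea, not the patent's implementation.

```python
# Illustrative sketch of assembling status information from connected devices.
# The device objects and their attributes are hypothetical stand-ins.
def collect_status_information(mobile_device, personal_hub=None, earpieces=()):
    status = {
        "silent_mode": mobile_device.silent_mode,
        "display_locked": mobile_device.display_locked,
        "connected_devices": [d.name for d in
                              ([personal_hub] if personal_hub else []) + list(earpieces)],
        "recent_apps": mobile_device.recent_apps(),
    }
    if personal_hub is not None:
        status["hub_sensors"] = {
            "motion": personal_hub.read_accelerometer(),
            "skin_temperature": personal_hub.read_temperature(),
        }
    # Crude "is the earpiece on an ear?" heuristic based on temperature readings.
    status["earpiece_worn"] = [e.read_temperature() > 30.0 for e in earpieces]
    return status
```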
A sender's computing device may transmit a message to the recipient's mobile device requesting the recipient's status information. The sender's computing device may receive a message describing the recipient's status information (e.g., privilege status between the sender and the recipient, a list of devices connected to the recipient's mobile device, information about recent activities of devices connected to the recipient's mobile device, sensor data, etc.) and may interpret the recipient's availability to receive substantive messages from the sender's computing device. For example, the sender's computing device may interpret status information to describe the recipient as in a meeting or on a jog. In an embodiment, the recipient's mobile device may periodically determine status information and transmit the information to a server for storage and distribution to the sender's computing device.
Based on the status information, the sender's computing device may recommend certain formatting, message templates, delivery methods, message types (e.g., email, audio messages, visual messages, SMS text message, whisper message, etc.) or other configurations for the sender's substantive message to the recipient's mobile device. In an embodiment, the sender's computing device may use software that utilizes a whisper application (or “app”) that may display the intended recipient's status information, recommend a message format or mode (e.g., whisper message, SMS, email, phone call, etc.), receive sender inputs, and create messages for transmission to and presentation on a recipient's mobile device, personal hub and/or wireless earpieces. The whisper app may format messages based on the recipient's status information by using metadata or other data-encoding techniques such that particular content may be rendered by an appropriate one or more of the recipient's devices. For example, if the status information indicates the recipient is currently using the personal hub and wearing at least one wireless earpiece, the whisper app may prompt the sender to speak a whisper message, capture the sender's speech, and create an audio message to be transmitted to the recipient's mobile device for rendering through one or both wireless earpieces.
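A minimal sketch of such format selection is shown below. The status fields and the mapping from status to message type are invented for illustration and are not the patent's logic.

```python
# Sketch: recommend a message format from simplified recipient status fields.
def recommend_message_format(status: dict) -> str:
    if status.get("earpiece_worn") and status.get("hub_active"):
        return "whisper"        # audio can be played privately in the recipient's ear
    if status.get("silent_mode") and status.get("hub_active"):
        return "sms"            # short text can be shown quietly on the hub display
    if status.get("in_voice_call"):
        return "email"          # defer to a non-interrupting channel
    return "voice_call"         # recipient appears free to talk
```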
In an embodiment, the recipient's devices may decode and present to the recipient received messages, receive recipient inputs (e.g., taps or speech) for responding, and format response messages so as to provide a convenient response opportunity based on the recipient's status information. In an embodiment, the sender's computing device may utilize a whisper app to generate a message that includes simple response options (e.g., “yes,” “no,” “option one or option two,” etc.) from which the recipient may easily indicate a selection without having to type or dictate a response. For example, an audio message may include an interrogatory (or question) with answer options ‘yes’ and ‘no’. A recipient's mobile device, personal hub, and wireless earpieces may decode metadata within the sender's message to obtain instructions for rendering the message contents (e.g., formatting of rendered message contents, which connected device should render various message contents, etc.). As an example, a message presented to the recipient aurally may receive an answer input in the form of a touch on one side of the personal hub device display (or button) to indicate a “yes” response and a touch on the other side of the display to indicate a “no” response. Additionally, a user may tap and hold the “yes” or “no” buttons to indicate their answer and record a whisper to provide further explanation. As another example, a message presented to the recipient aurally may direct audio explaining a first option to the right wireless earpiece to indicate that the recipient can select the first option by touching the right earpiece, and direct audio explaining a second option to the left wireless earpiece to indicate that the recipient can select the second option by touching the left earpiece. Additionally, a user may tap and hold an earpiece to indicate their answer and record a whisper to provide further explanation.
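The sketch below illustrates this metadata-directed rendering and the mapping of simple touch inputs to answer selections. The metadata keys, device objects, and event names are hypothetical.

```python
# Sketch of metadata-directed rendering and simple-response mapping.
def render_with_response_options(message: dict, hub, left_earpiece, right_earpiece):
    meta = message.get("metadata", {})
    if meta.get("render_on") == "earpieces":
        # Explain option one in the right ear and option two in the left ear.
        right_earpiece.play(message["option_one_audio"])
        left_earpiece.play(message["option_two_audio"])
    else:
        hub.display(message.get("text", ""))

def interpret_touch(event: dict) -> str:
    """Map a simple touch event to an answer selection."""
    mapping = {
        "right_earpiece_tap": "option_one",   # e.g. "yes"
        "left_earpiece_tap": "option_two",    # e.g. "no"
        "hub_right_side_tap": "yes",
        "hub_left_side_tap": "no",
    }
    return mapping.get(event.get("source"), "no_selection")
```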
In an embodiment, the recipient personal hub and other connected devices may detect recipient inputs from sensors (e.g., touch sensors configured to detect touch interactions, accelerometers configured to detect a tap, etc.) or interactions with a graphical user interface unit, and may generate response messages based on the received formatted sender message. For example, based on touch sensor and/or accelerometer data, the personal hub may determine that the recipient user tapped the right wireless earpiece in response to the playback of selected audio (e.g., ‘yes’) in that ear. In an embodiment, the personal hub or recipient's mobile device may generate a response message based on the recipient's simple response options.
FIG. 1 illustrates a network system 100 suitable for use with the various embodiments employing wired and/or wireless data links. The network system 100 may include multiple devices, such as a personal hub 102, an earpiece 104, and two mobile devices, such as a smart phone mobile device 106 and laptop computer 108. In an embodiment, the personal hub 102, earpiece 104, smart phone mobile device 106, and laptop computer 108 may be configured to exchange data over either or both wired and wireless data links.
The personal hub 102 and the earpiece 104 may exchange data via a wireless data link 130 and/or a wired data link 164. As an example, the wireless data link 130 between the personal hub 102 and the earpiece 104 may be a Bluetooth® connection. Similarly, the personal hub 102 and the smart phone mobile device 106 may exchange data via a wireless data link 128 and/or a wired data link 176, and the personal hub 102 and the laptop computer 108 may exchange data via a wireless data link 132 and/or a wired data link 174. The smart phone mobile device 106 and the earpiece 104 may exchange data via a wireless data link 134 and/or wired data link 168. The smart phone mobile device 106 and the laptop computer 108 may exchange data directly with each other via a wireless data link 126 and/or wired data link 162. The earpiece 104 and the laptop computer 108 may exchange data directly with each other via a wireless data link 136 and/or wired data link 166. In various embodiments, the wireless data links 126, 128, 130, 132, 134, and 136 may be Bluetooth®, Zigbee®, Peanut®, or RF data links. In the various embodiments, the wired data links 162, 164, 168, 166, 174, and 176 may be cable connections, such as a USB cable, a FireWire® cable, or standard audio analog or digital cables with suitable connectors at each end.
Additionally, the smart phone mobile device 106 and the laptop computer 108 may be configured to connect to the Internet 116 via wireless connections 122 and 124, respectively, which may be established with a wireless access point 112, such as a Wi-Fi access point. The wireless access point 112 may be connected to the Internet 116. In this manner, data may be exchanged between the smart phone mobile device 106, the laptop computer 108, and other Internet 116 connected devices by methods well known in the art. Additionally, the smart phone mobile device 106 and a cellular tower or base station 110 may exchange data via a cellular connection 120, including CDMA, TDMA, GSM, PCS, G-3, G-4, LTE, or any other type of connection. The cellular tower or base station 110 may be in communication with a router 114 which may connect to the Internet 116. In this manner, data (e.g., voice calls, text messages, e-mails, etc.) may be exchanged between the smart phone mobile device 106 and other devices by methods well known in the art.
In the various embodiment methods 200, 300, 400, 600, and 700 discussed below, the personal hub 102, the earpiece 104, and the smart phone mobile device 106 may communicate with each other using any combination of wired or wireless connections. For example, the smart phone mobile device 106 may have a wired data link 176 to the personal hub 102 and a wireless data link 134 to the earpiece 104. In other words, in embodiments employing wireless data links 126, 128, 130, 132, 134, and/or 136, wired data links 162, 176, 164, 174, 168, and/or 166 may be substituted. With such substitutions, the operations of methods 200, 300, 400, 600, and 700 discussed below may be performed in substantially similar manners over wired data links or combinations of wired and wireless data links.
In an embodiment, the earpiece 104 may be connected to the smart phone mobile device 106 via a wired data link 168, and the smart phone mobile device 106 may be connected to the personal hub 102 via a wireless data link 128. In this configuration, messages may be handled wirelessly between the smart phone mobile device 106 and the personal hub 102, while audio signals may be handled via the wired connection between the smart phone mobile device 106 and the earpiece 104. The wired data link 168 between the smart phone mobile device 106 and the earpiece 104 may reduce the processing required to generate, send, and/or receive wireless signals, which may conserve battery power on the smart phone mobile device 106 and earpiece 104.
In another embodiment, the personal hub 102 may be connected to the smart phone mobile device 106 and/or laptop computer 108 via wireless data links 128 and/or 132, respectively, and the personal hub 102 may be connected to the earpiece 104 by a wired data link 164. In a further embodiment, the earpiece 104 may be connected to the smart phone mobile device 106 and/or laptop computer 108 via wireless data links 134 and/or 136, respectively. In this configuration, the personal hub 102 may send/receive information to/from the earpiece 104 via a wired connection while the earpiece 104 may send/receive information to/from the smart phone mobile device 106 and/or laptop computer 108 via a wireless connection.
In an additional embodiment, the smart phone mobile device 106 may be connected to the laptop computer 108 via a wired data link 162, and the smart phone mobile device 106 may be connected to the personal hub 102 and earpiece 104 via wireless data links 128 and 134, respectively. In this configuration, the laptop computer 108 may send/receive information to/from the smart phone mobile device 106 via a wired connection, and the smart phone mobile device 106 may send/receive information to/from the personal hub 102 and/or earpiece 104 via wireless connections. As an example, while the smart phone mobile device 106 is connected to the laptop computer 108 via a USB connection for charging, the smart phone mobile device 106 may receive a reminder via the USB connection and may wirelessly send an alert message associated with the reminder to the personal hub 102.
The private message delivery mechanisms enabled by the personal hub device (e.g., quiet vibration and small display) and wireless earpieces (which generate sound heard only by the recipient) permit users to receive messages in situations where normal smartphone operation is inappropriate or not permitted. To avoid being disturbed by unimportant messages, users may designate certain individuals authorized to send messages that are delivered by such private message delivery mechanisms even when the user's mobile device is in mute. Such designated individuals may be included in a privileged list. If desired, those on the user's list may have the privilege to designate their messages for immediate delivery when the recipient is wearing the communication devices. Individuals on a user's privileged list may be authorized to receive the user's current status information.
The personal hub 102 may exchange data with the user's mobile devices, such as a smart phone mobile device 106 and laptop computer 108, and act as a message control center or handy message notification display. For example, the personal hub 102 and the smart phone mobile device 106 may exchange data via a wireless data link 128, and the personal hub 102 and the laptop computer 108 may exchange data via a wireless data link 132.
The personal hub 102 may serve as a wearable interface for any of a number of user computing devices, particularly mobile devices that are within close range of the user. For example, the personal hub 102 may serve as a message notification and management center enabling a user to monitor and react to various types of messages received on the user's laptop computer 108. Also, the personal hub 102 may serve a similar function for messages received on the smart phone mobile device 106. The personal hub 102 may also perform the same functions for both the user's laptop computer 108 and smart phone mobile device 106 when both devices are in communication range. For ease of description, the descriptions of the various embodiments refer to interactions with and operations performed by the user's “mobile device,” which is intended to encompass any computing device communicating with the user's personal hub 102, including but not limited to a smart phone mobile device 106, a laptop computer 108, or any other mobile device.
FIGS. 2-4 illustrate various embodiment methods for a recipient's mobile device communicating data that describes conditions during receiving, relaying, and rendering messages received from a communication system as described above. In particular, the recipient's mobile device may be configured to communicate delivery confirmation of received messages by obtaining instructions for rendering a received message on the recipient's mobile device from the received message, generating a delivery confirmation message reporting whether the received message was delivered and, when the received message was delivered, a manner in which the received message was delivered, and transmitting the delivery confirmation message to a sender of the received message. In another embodiment, reporting the manner in which the received message was delivered may include reporting information describing at least one of an identity of a device that is associated with the recipient's mobile device, a first indicator of whether message contents of the received message were successfully rendered by the device associated with the recipient's mobile device, and a second indicator of whether message contents of the received message were queued for rendering by the device associated with the recipient's mobile device. In an embodiment, the received message may be one of an email, a whisper message, an SMS text message, an audio message, a visual message, a video call, a telephone call, and a message formatted for use with whisper software. In another embodiment, obtaining instructions for rendering a received message on the recipient's mobile device from the received message may include decoding the received message to obtain metadata indicating a device on which the sender desires the received message to be rendered and at least one of sound and visual message contents, determining whether the device indicated in the metadata is connected to the recipient's mobile device, and providing the at least one of sound and visual message contents to the device indicated in the metadata when that device is connected to the recipient's mobile device. In another embodiment, the method may further include receiving input data from at least one of the recipient's mobile device and a device connected to the recipient's mobile device in response to the received message, and generating a response message based on the received input data. In another embodiment, the received input data may be at least one of an input on a graphical user interface and sensor data, wherein sensor data includes data received from a sensor selected from a group consisting of a touch sensor, an accelerometer, a gyroscope, and a magnetometer.
FIG. 2 illustrates an embodiment method 200 for sending messages and receiving status information from devices employed by a recipient. In block 210, a recipient's mobile device 106 may be initialized and connected to recipient devices. For example, the mobile device 106 may be a smart phone that exchanges short-range radio signals with a personal hub 102, which may exchange short-range radio signals with at least one wireless earpiece.
In block 215, a sender's computing device 201 may select the recipient from a list of contacts. For example, the sender's computing device 201 may display a stored list of contacts, such as in an address book database, and receive selection input data from the sender, such as by detecting the sender's touch on the area of the sender's computing device 201 touch screen that represents a contact name. The sender's computing device 201 may be configured to store and relate contact names with phone numbers, fax numbers, email addresses, and other information for establishing communications and/or transmitting messages. In an embodiment, the contact information may be used by the sender's computing device 201 to determine destination addresses for SMS text messages, whisper messages, visual messages, video calls, audio telephone calls, data messages, emails, and any other form of communication. In an embodiment, if the sender's computing device 201 maintains multiple contact addresses for the selected recipient, the sender's computing device 201 may prompt the sender to supply further inputs to determine a method of communication with the selected recipient. For example, the sender's computing device 201 may respond to the selection of the recipient contact by prompting the sender to further select whether to generate an email, an SMS text message, a whisper message, an audio message, a visual message, a video call, or establish a conventional vocal conversation (e.g., a telephone call).
In an embodiment, the sender's computing device 201 may store and execute software applications or routines that perform the communication services or protocols described above. For example, the sender's computing device 201 may store and execute a whisper app, with which the sender's computing device 201 may receive sender input that directs the app to create a whisper message. In an embodiment, the sender's computing device 201 may have an address book, a priority list, and/or a privileged list which stores the contact information for other devices (e.g., mobile devices 106 and personal hubs 102) and which may identify individuals (i.e., contacts) who have included the sender in their privileged lists. For example, the sender's computing device 201 may receive a message indicating that the sender has been added to a recipient's privileged list and note that information in the contacts database as someone to whom the sender can send whisper message transmissions. In another embodiment, the sender's computing device 201 may instead contain a database that stores device contact information to which the sender's computing device 201 may transmit privileged transmissions, such as whisper messages. In an embodiment, the sender's computing device 201 may transmit but not receive privileged communications to a particular recipient device.
In block 220, the sender's computing device 201 may create a message for the recipient that was selected in block 215. For example, the sender's computing device 201 may enter an email message creation routine in response to the sender selecting the recipient's email address. The sender's computing device 201 may receive input, such as speech, text, and/or selection of displayed options, and generate a complete message configured to be transmitted to the recipient's destination address as defined by the selected recipient information. In block 225, the sender's computing device 201 may transmit the message to the recipient, for example, to the recipient's mobile device 106.
In block 230, the recipient's mobile device 106 may receive the message from the sender's computing device 201. A mail server or cellular network messaging server may push the message to the recipient's mobile device 106. Alternatively, the recipient's mobile device 106 may receive an alert that the message is available for download from a remote server.
In block 235, the recipient's mobile device 106 may query the status of connected devices. For example, the recipient's mobile device 106 may obtain status information from the recipient's mobile device, personal hub, and/or wireless earpieces. In particular, the recipient's mobile device may determine device settings by polling the devices to determine configuration settings, such as activity modes and operating parameters. For example, the recipient's mobile device 106 may determine whether the recipient's mobile device 106 is in silent mode. Other such device settings the recipient's mobile device 106 detects may include whether the phone lock is engaged (e.g., an authentication routine is executing that may require user input or a password), or whether a vibration mode is engaged. The recipient's mobile device 106 may also determine whether there are active connections with the personal hub 102, nearby laptop computers, and other wireless devices. For example, the recipient's mobile device 106 may determine whether there are any active data links between the recipient's mobile device 106 and the personal hub 102, such as Bluetooth® pairings/links. In an embodiment, the initialization operations in block 210 may be recorded and used by the recipient's mobile device 106 to indicate status information of the recipient at a given time.
The recipient's mobile device 106 may also determine the status of the recipient by polling the sensors within the personal hub 102. In an embodiment, the recipient's mobile device 106 may query the personal hub 102 to determine movement, acceleration, and other recipient activities based on data collected by sensors. For example, the personal hub 102 may report to the recipient's mobile device 106 that a temperature sensor positioned at the bottom of the personal hub 102 (e.g., in contact with the recipient's wrist) recently measured body temperature, or the same temperature as measured by temperature sensors contained within the wireless earpieces. As another example, the personal hub 102 may report that measurements taken from a temperature sensor, gyroscope, magnetometer, and/or an accelerometer located within a wireless earpiece were static for a period of time. In an embodiment, the recipient's mobile device 106 may contain various sensors and may gather and store sensor measurements that may be used in determining the status of the recipient's mobile device 106. For example, the recipient's mobile device 106 may contain a gyroscope unit which measures motion activity.
In an embodiment, the recipient's mobile device 106, personal hub 102, and any other devices may store status indicators (i.e., codes) that describe the individual devices' configurations or settings. For example, the personal hub 102 may store a code which describes whether the personal hub has active Bluetooth® connections to other devices, is in a sleep or silent mode, or is currently rendering message data. In an embodiment, the various recipient devices may store status indicators representing the status of any other device or all recipient devices.
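As one possible encoding (not specified by the patent), such status indicators could be represented as bit flags so that a single code summarizes a device's configuration. The flag names and values below are assumptions for illustration.

```python
# Illustrative bit-flag encoding of per-device status indicators.
from enum import IntFlag

class DeviceStatus(IntFlag):
    BLUETOOTH_CONNECTED = 0x01
    SILENT_MODE         = 0x02
    SLEEP_MODE          = 0x04
    RENDERING_MESSAGE   = 0x08

def describe(status: DeviceStatus) -> list:
    """List the names of the flags set in a status code."""
    return [flag.name for flag in DeviceStatus if flag in status]

# Example: a personal hub that is paired and currently playing a message.
hub_status = DeviceStatus.BLUETOOTH_CONNECTED | DeviceStatus.RENDERING_MESSAGE
print(describe(hub_status))  # ['BLUETOOTH_CONNECTED', 'RENDERING_MESSAGE']
```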
In optional block 238, the recipient's mobile device 106 may generate a presence update message. The presence update message may be based on the determined status of the recipient's mobile device 106, the personal hub 102, and/or other devices associated with the recipient. The contents of an embodiment of a presence update message are described below with reference to FIG. 5.
In an embodiment, the recipient's mobile device 106 may create the presence update message in the same form or delivery method as received from the sender's computing device 201 in block 230. For example, if the sender's computing device 201 transmitted an email to the recipient, the recipient's mobile device 106 may generate an email presence update message for transmission to the sender. In an embodiment, the contents and method of transmission of the presence update message may be determined by user configurations stored within a file on the recipient's mobile device 106. For example, the recipient may configure the recipient's mobile device 106 to only send presence update messages as SMS text messages. In an embodiment, the recipient may store configurations which prioritize various methods of transmission for the presence update messages and/or use contact information stored within the recipient's mobile device 106 to determine available methods of transmission for various senders. In an embodiment, the presence update message may contain a natural language summary of the recipient's status based on the determined status information. For example, the presence update message may contain a text segment which indicates that the recipient is asleep based on static motion data and regular temperature readings.
In optional block 240, the recipient's mobile device 106 may transmit the presence update message to the sender, for instance to the sender's computing device 201. In an embodiment, the recipient's mobile device 106 may only transmit the presence update message if the sender's computing device 201 (or its user) has been pre-approved, authenticated, or is a member of a privileged list stored on the recipient's mobile device 106 or personal hub 102. In an embodiment, the recipient's mobile device 106 may prompt the recipient to authorize a transmission of the presence update message. In another embodiment, the presence update message may further contain an indication that confirms that the sender is within the recipient's privileged list.
In optional block 245, the sender's computing device 201 may receive the presence update message from the recipient, for instance from the recipient's mobile device 106. In an embodiment, if the sender's computing device 201 executes a whisper application (or app), the sender's computing device 201 may execute routines to parse the presence update message and identify certain formatting information for rendering the presence update message. For example, the presence update message may contain header information which describes categorizations of the recipient's status and the status of the recipient's personal hub (e.g., whether it is currently worn and recently used) and the wireless earpieces (e.g., whether they are currently on the recipient's ears). In an embodiment, the sender's computing device 201 may detect information within the presence update message indicating that the sender is within the recipient's privileged list. For example, there may be a code that indicates the sender has a favored status with the recipient. In an embodiment, the sender's computing device 201 may render the presence update message as text, a graphical display, audio, or a combination of these.
In block 250, the sender's message, which was received in block 230, may be rendered on one or more devices (such as the recipient's mobile device 106, personal hub 102, and other devices connected to the mobile device 106) based on metadata. For example, the recipient's mobile device 106 may display a downloaded email message from the sender. As another example, the personal hub 102 may play an audio file attached to the sender's message through a wireless earpiece. In various embodiments, the sender's message may contain metadata indicating rendering instructions which the recipient's mobile device 106 may obtain by decoding and/or parsing the sender's message. In an embodiment, the sender's computing device 201 may utilize a whisper app to generate such instructions. The application and/or use of metadata within messages is described in detail below with reference to FIGS. 6-7.
In block 252, the recipient's mobile device 106 may generate a delivery confirmation message. The delivery confirmation message may be based on the manner in which the sender's message was presented to the user as well as any use of the message by the recipient's devices, such as the recipient's mobile device 106 or personal hub 102. The recipient's mobile device 106 may monitor and record every access, modification, and exchange related to the sender's message, creating a data trail of the message through the recipient's devices. For example, the recipient's mobile device 106 may record whether the recipient sees, reads, or hears a message, as well as whether the recipient saves a draft response to a whisper message or an SMS text message. As another example, the recipient's mobile device 106 may record the identity of the output devices (e.g., the personal hub 102, earpieces, etc.) that render a received whisper message or SMS text message. In an embodiment, the recipient's mobile device 106 may log the timestamp information for each received message, the time and identity of the destination device (or output device) for any transmission of the message or its associated content between the recipient's mobile device 106, the personal hub 102, and any other connected wireless devices, and the software on the respective devices which rendered, displayed, performed, or otherwise accessed the message and/or its associated content. For example, the recipient's mobile device 106 may log that the sender's message was delivered to the recipient's mobile device 106 from a remote message server at a particular time and that it was transmitted to the personal hub 102, where it was displayed as text on the display unit. The delivery confirmation message may also report the manner in which the received message was delivered by including an indicator of whether message contents were successfully rendered by output devices and/or an indicator of whether message contents of the received message were queued for rendering by output devices.
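The sketch below illustrates the kind of per-message data trail such logging could produce; the event names and record layout are assumptions rather than the patent's format.

```python
# Sketch of a per-message data trail: each receipt, relay, render, or user
# interaction is appended with a timestamp and the device involved.
import time

def log_message_event(trail: list, message_id: str, device: str, event: str):
    trail.append({
        "message_id": message_id,
        "device": device,     # e.g. "mobile_device", "personal_hub", "left_earpiece"
        "event": event,       # e.g. "received", "relayed", "displayed", "played", "acknowledged"
        "timestamp": time.time(),
    })

# Example trail for a whisper message relayed to the hub and played in an earpiece.
trail = []
log_message_event(trail, "msg-42", "mobile_device", "received")
log_message_event(trail, "msg-42", "personal_hub", "relayed")
log_message_event(trail, "msg-42", "right_earpiece", "played")
```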
In an embodiment, the delivery confirmation message may contain information indicating the manner in which the message was delivered to the recipient including the output devices that displayed or played the message, such as the recipient's mobile device, earpiece or personal hub. For example, the delivery confirmation message may state that the sender's audio message was played through two wireless earpiece devices. In an embodiment, the delivery confirmation message may contain similar information as the presence update message.
In an embodiment, either the recipient's mobile device 106 or the sender's computing device 201 may process delivery confirmation information to determine the likelihood that the recipient accessed or played the sender's message. For example, the recipient's mobile device 106 may evaluate logged timestamps, access reports, and device status to interpret whether the recipient played an audio file. As another example, the sender's computing device 201 may receive and process the delivery confirmation message to determine the likelihood. The determined likelihood may be expressed within the delivery confirmation message as a probability, a natural language summary, or any other manner of representing the evaluation by the recipient's mobile device 106. In an embodiment, the likelihood evaluation may be conducted by a whisper app running on either of the devices 106, 201.
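A rough illustration of such a likelihood evaluation appears below. The weights and log fields are invented for the sketch and would need to be tuned against real device logs.

```python
# Illustrative heuristic for estimating whether the recipient accessed a message.
def estimate_access_likelihood(log: dict) -> float:
    score = 0.0
    if log.get("rendered_on_earpiece"):
        score += 0.5                   # audio was played into an earpiece
    if log.get("earpiece_worn_at_render"):
        score += 0.3                   # temperature/motion suggested the earpiece was worn
    if log.get("user_interaction"):
        score += 0.2                   # tap, acknowledgement, or draft response observed
    return min(score, 1.0)
```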
In block 255, the recipient's mobile device 106 may transmit the delivery confirmation message to the sender's computing device 201. In block 260, the sender's computing device 201 may receive the delivery confirmation message, which was transmitted in block 255. In an embodiment, the sender's computing device 201 may process and render the contents of the delivery confirmation message as graphical indicators in a messaging application. For example, the sender's computing device 201 may receive and process the delivery confirmation message and draw a prompt on the display unit informing the sender that the sender's message was delivered and opened by the recipient, including displaying an icon of the device that rendered the message.
FIG. 3 illustrates an embodiment method 300 for directly pulling a recipient's status information and sending messages based on the status information. The method 300 is similar to the method 200 described above; however, here a sender's computing device 201 may retrieve status information describing the status of a recipient's devices and generate messages having formatting based on the status information. The operations in blocks 210-215 are as described above with reference to FIG. 2.
In block 305, the sender's computing device 201 may transmit a presence request message to the recipient who was selected in block 215. In an embodiment, the presence request message may include a predefined system call, an API command, or any other software command that a recipient's mobile device 106 may interpret as a request to query devices. In an embodiment, the presence request message may simply include email, SMS text, whisper message content, or audio content that indicates the sender's desire to obtain the recipient's status information. For example, the recipient's mobile device 106 may recognize a presence request in response to receiving an email from the sender that asks “what's your status?” In another embodiment, the sender's computing device 201 may transmit a presence request message after receiving sender input on the sender's computing device 201. For example, the sender's computing device 201 may send a presence request message in response to detecting a user click on a graphical user interface button beside the recipient's name in a contact list. In another embodiment, the sender's computing device 201 may execute background routines that periodically transmit presence request messages for some or all contacts within a stored contact list. For example, the sender's computing device 201 may automatically send presence request messages to some or all contacts stored within a whisper privileged list located on the sender's computing device 201. In another embodiment, the sender's computing device 201 may transmit a presence request message while the sender composes a message or immediately before sending a message.
In block 310, the recipient's mobile device 106 may receive the presence request message. The recipient's mobile device 106 may determine that the message is a presence request by evaluating any metadata (e.g., header information), by conducting text analysis, or by any other means of classifying received message data. In block 235, the recipient's mobile device 106 may query the status of connected devices. In block 238, the recipient's mobile device 106 may generate a presence update message. In block 240, the recipient's mobile device 106 may transmit the presence update message to the sender's computing device 201. In block 245, the sender's computing device 201 may receive the presence update message from the recipient, for instance from the recipient's mobile device 106.
In block 320, the sender's computing device 201 may generate a message based on the presence update message received from the recipient in block 245. For example, the sender's computing device 201 may generate a whisper message to the recipient based on the contents of the received presence update message. In an embodiment, a whisper application (or app) executing on the sender's computing device 201 may parse the presence update message to determine formatting techniques or other message configurations that may capitalize on the recipient's current status (e.g., does the recipient have wireless earpieces equipped, is the recipient's mobile device 106 in silent mode, etc.). FIG. 6 describes in detail how the sender's computing device 201 may use the status information in creating messages.
In block 225, the sender's computing device 201 may transmit the message to the recipient, for example, to the recipient's mobile device 106. In block 230, the recipient's mobile device 106 may receive the message from the sender's computing device 201. In block 250, the sender's message, which was received in block 230, may be rendered on one or more devices based on metadata. In block 252, the recipient's mobile device 106 may generate a delivery confirmation message. In block 255, the recipient's mobile device 106 may transmit the delivery confirmation message to the sender. In block 260, the sender's computing device 201 may receive the delivery confirmation message.
FIG. 4 illustrates an embodiment method 400 that employs a server 401 to store status information. Unlike in the operations in blocks 305-240 in FIG. 3, the sender's computing device 201 may not receive presence update information from the recipient's mobile device 106, but may instead receive that information from the server 401. In block 210, the recipient's mobile device 106 may be initialized and connected to recipient devices. In block 235, the recipient's mobile device 106 may query the status of connected devices. In block 238, the recipient's mobile device 106 may generate a presence update message. In block 410, the recipient's mobile device 106 may transmit the presence update message to the server 401. The recipient's mobile device 106 may repeatedly perform the operations in blocks 235-410 as an operational loop. In an embodiment, the recipient's mobile device 106 may transmit presence status update messages to the server 401 at a predefined frequency. For example, a regular report from the recipient's mobile device 106 may provide a real-time (or “heartbeat”) presence assessment that the server 401 may maintain. In an embodiment, the recipient's mobile device 106 may automatically transmit presence update messages to the contacts within a privileged contact list.
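The following sketch illustrates the heartbeat loop of blocks 235, 238, and 410 under assumed device methods (query_connected_devices, build_presence_update, post); the interval and transport are placeholders, not part of the patent.

```python
# Minimal sketch of a periodic "heartbeat" presence loop: query connected
# devices, build a presence update, and push it to the server on a fixed interval.
import time

def presence_heartbeat(mobile_device, server_url: str, interval_s: int = 60):
    while True:
        status = mobile_device.query_connected_devices()       # block 235
        update = mobile_device.build_presence_update(status)   # block 238
        mobile_device.post(server_url, update)                  # block 410
        time.sleep(interval_s)
```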
In block 415, the server 401 may store presence update information. For example, the server 401 may store presence update information indicated within presence update messages, which were transmitted by the recipient's mobile device 106 in block 410. In an embodiment, the server 401 may record status information within a relational database and may store subsets of the status information within the presence update message. In an embodiment, the server 401 may update database values in response to receiving the presence update message or, alternatively, may log some or all status information over time. In an embodiment, the server 401 may generate statistical information describing the status information over time. For example, the server 401 may calculate the percentage of time the recipient's mobile device 106 is in silent mode, or how often the recipient's mobile device 106 is connected to a personal hub 102. As another example, the server 401 may track motion data represented in the presence update message and estimate typical motion values based on time of day.
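The sketch below shows how such statistics could be computed over a stored history of presence updates; the record layout is assumed for illustration.

```python
# Illustrative server-side statistics over a history of presence updates.
def silent_mode_fraction(history: list) -> float:
    """Fraction of logged presence updates in which the device was in silent mode."""
    if not history:
        return 0.0
    silent = sum(1 for update in history if update.get("silent_mode"))
    return silent / len(history)

def hub_connected_fraction(history: list) -> float:
    """Fraction of logged presence updates reporting a connected personal hub."""
    if not history:
        return 0.0
    connected = sum(1 for u in history if "personal_hub" in u.get("connected_devices", []))
    return connected / len(history)
```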
The remainder of method 400 is similar to the operations in method 300. However, in blocks 420-430, the sender's computing device 201 requests and receives presence update messages from the server 401 instead of directly from the recipient's mobile device 106. In block 215, the sender's computing device 201 may select the recipient from a list of contacts. In block 420, the sender's computing device 201 may transmit a presence request message to the server 401. In block 425, the server 401 may receive the presence request message. In block 430, the server 401 may transmit a presence update message to the sender's computing device 201. In block 245′, the sender's computing device 201 may receive the presence update message from the server 401. In block 320, the sender's computing device 201 may generate a message based on the presence update message. In block 225, the sender's computing device 201 may transmit the message to the recipient, for example, to the recipient's mobile device 106.
In block 230, the recipient's mobile device 106 may receive the message from the sender's computing device 201. In block 250, the sender's message, which was received in block 230, may be rendered on one or more devices based on metadata. In block 252, the recipient's mobile device 106 may generate a delivery confirmation message. In block 255, the recipient's mobile device 106 may transmit the delivery confirmation message to the sender's computing device 201. In block 260, the sender's computing device 201 may receive the delivery confirmation message.
FIG. 5 illustrates an embodiment data structure within a presence update message 500. The presence update message 500 may be transmitted by the recipient's mobile device to a sender's computing device or a server maintaining up-to-date status information. The presence update message 500 may contain metadata 502, such as formatting information to assist in the rendering using a whisper app. In an embodiment, the metadata 502 may instruct the sender's computing device to display the status information using particular devices. For example, the metadata 502 may direct the smart phone to execute a text-to-speech conversion routine and generate an audio file that may be performed through the sender's personal hub and connected wireless earpiece. In an embodiment, the metadata 502 may provide whisper app instructions that direct the software running on the sender's computing device to adjust system or software variables based on the presence update message 500 contents. For example, the metadata 502 may instruct a whisper app to change default settings for generating new messages.
Thepresence update message500 may also contain numerous descriptions of the recipient's mobile device's status at the time thepresence update message500 was created, such asdisplay lock status504, a list of some or all apps installed506 on the recipient's smart phone, a list of the apps currently in use andusage information508 on the recipient's smart phone, and silent mode/mute status510. Thepresence update message500 may further include application data describing the recipient's use of particular applications. For example, thepresence update message500 may contain browser history reports, data entered into online forms, or cookie information.
Thepresence update message500 may contain data regarding the various devices associated with the recipient's mobile device, such as a list of some or all devices in a pair list512 (e.g., devices which have been Bluetooth® paired with the recipient's smart phone). In an embodiment, thepresence update message500 may list some or all of the devices with which the recipient's smart phone has ever paired. Thepresence update message500 may contain a list of the devices that are currently connected or active514, such as a connected personal hub and wireless earpiece. This list may contain information regarding devices that are connected via wired or wireless (e.g., short-range radio) connections. For example, thepresence update message500 may indicate whether wired headphones (or wired headsets or earpieces) are connected to the personal hub or other associated devices.
The various connections between the recipient's mobile device and the associated devices may be indicated in the presence update message 500. For example, the presence update message 500 may indicate that a smart phone is connected via a wire or short-range radio connection to a close personal computer (e.g., a laptop). In general, the connected status of the various devices may indicate the devices' availabilities. For example, when the personal hub is active and wirelessly paired or connected with the recipient's mobile device (e.g., smart phone), the personal hub may be available. In an embodiment, the presence update message 500 may indicate whether any of the associated devices (e.g., personal hub, wireless earpieces, wired headset, etc.) are connected to a power source (e.g., an AC adapter) and/or are being charged. The presence update message 500 may describe the power status of the batteries within the associated devices and may report if a battery within any associated device is "dead". For example, the presence update message 500 may indicate that the personal hub battery is almost out of power. Further information may include sensor data 516, such as temperature data and/or motion data. The sensor data 516 may be retrieved from sensor or measurement units located within the smart phone, personal hub, and wireless earpieces. For example, the recipient's mobile device may retrieve motion data from a connected wireless earpiece containing a gyroscope. As another example, the sensor data 516 may represent motion data from an accelerometer and body temperature measurements from a temperature sensor, both sensors contained within the personal hub. The sensor data 516 may also include temperature measurements from the wireless earpieces, which may be used to determine whether the wireless earpieces are being worn. In an embodiment, the presence update message 500 may indicate possible reasons why devices are not being worn, such as indicating that the devices are currently being powered (or charged) via a connection to a power source, are currently malfunctioning (e.g., report software error codes), or are redundant because other connected devices are being worn by the user (e.g., wireless earpieces are not worn because a wired headset is connected to the personal hub).
The presence update message 500 may also include data describing the recipient's recent user interaction history 518 with the recipient's mobile device. This recent user interaction history 518 data may include information about the recipient's mobile device's data usage, such as records of data uploaded/downloaded by the smart phone over a period of time, recently used applications, and inputs within various applications (e.g., tapping on a touch screen, pushing device buttons, etc.). For example, the presence update message 500 may include a report of the timestamps of the recent taps on the touch screen.
Thepresence update message500 may contain location (or position)information520 about the recipient, such as GPS coordinates retrieved from a GPS unit within the smart phone. In an embodiment, other data describing the location of the recipient or used to determine a location may be included in thepresence update message500, and may include cell phone tower information (e.g., access records and tower identification information), WiFi access information (e.g., router log information), and IP addresses of the various recipient devices. In an embodiment, the recipient's mobile device may store thepresence update message500 and append the message to other transmissions. For example, the smart phone may insert the information within thepresence update message500 into a delivery confirmation message.
In an embodiment, the presence update message 500 may include information describing the recipient's schedule or itinerary. For example, the recipient's mobile device may access calendar data, stored on the phone or in an accessible online account (e.g., Google calendar, Outlook calendar, etc.), and append the recipient's current activity to the presence update message 500. In an embodiment, the presence update message may contain information about the recipient's mobile device's current cellular network data rate, connectivity to WiFi networks, and any other indicators of data streaming or telephonic availability. In another embodiment, the presence update message 500 may contain information describing the sender's relationship or privileged status with respect to the recipient. The recipient's mobile device may determine the privileged status (e.g., the sender is within the recipient's privileged list, "known," "preferred," "liked," etc.) of the sender (or the sender's computing device) by comparing the sender's ID against data stored within a contact information database, social networking accounts/databases, or other sources that may contain information about the relationship between the recipient and the sender. For example, the presence update message 500 may include an indicator of whether or not the sender is on the recipient's privileged contact list and thereby able to create whisper messages that the recipient's personal hub may render immediately upon receipt. As another example, the presence update message 500 may describe the recipient's current disposition regarding the sender and include text such as, "Recipient is not taking your calls at the moment" or "You have been removed from Recipient's favorites list."
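As a rough illustration of how the fields described above might be grouped, the following Python sketch collects them into a single container. The field names and types are assumptions made for this example and do not correspond to any actual message encoding.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class PresenceUpdateMessage:
    """Illustrative container mirroring the presence fields discussed above."""
    metadata: Dict[str, str] = field(default_factory=dict)        # rendering hints for the sender's device
    display_locked: bool = True                                    # display lock status
    installed_apps: List[str] = field(default_factory=list)
    active_apps: Dict[str, float] = field(default_factory=dict)    # app name -> minutes of recent use
    silent_mode: bool = False
    paired_devices: List[str] = field(default_factory=list)        # e.g., Bluetooth pair list
    connected_devices: List[str] = field(default_factory=list)     # currently active connections
    sensor_data: Dict[str, float] = field(default_factory=dict)    # temperature, motion, etc.
    interaction_history: List[str] = field(default_factory=list)   # recent taps, app launches
    location: Optional[Dict[str, float]] = None                    # GPS coordinates, if available
    calendar_entry: Optional[str] = None                           # current activity from a calendar
    sender_privileged: bool = False                                 # whether this sender may send whisper messages

update = PresenceUpdateMessage(
    silent_mode=True,
    connected_devices=["personal_hub", "earpiece_left"],
    sensor_data={"hub_accelerometer_g": 0.02, "ear_temp_c": 36.4},
    location={"lat": 32.9, "lon": -117.2},
    sender_privileged=True,
)
print(update.connected_devices)
```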
FIGS. 6-7 illustrate various embodiment methods for generating an outgoing message that includes instructions for rendering message contents on a recipient's mobile device. In the various embodiments, the operations may be performed by various communication devices, such as the devices within the communication system as described above with reference toFIG. 1 (e.g., a smart phone) as well as other computing devices capable of generating messages for delivery to a recipient's mobile device, such as a laptop computer, a central server, and various other computing devices associated with a sender.
An embodiment method may include determining, based on received status information regarding the recipient's mobile device, availabilities of message rendering devices coupled to the recipient's mobile device, identifying a format and a message type for sending the outgoing message to the recipient's mobile device based on the determined availabilities of message rendering devices coupled to the recipient's mobile device, formatting the outgoing message according to the identified format and the identified message type, generating metadata that includes instructions for rendering the outgoing message based on the determined availabilities of message rendering devices coupled to the recipient's mobile device, and transmitting the outgoing message to the recipient's mobile device. In another embodiment, the status information regarding the recipient's mobile device may include at least one of information describing a privileged status, a list of devices connected to the recipient's mobile device, information about recent activities of devices connected to the recipient's mobile device, and sensor data. In another embodiment, the identified message type may be selected from a group that may include an email, a whisper message, an SMS text message, an audio message, a visual message, a video call, a telephone call, and a message formatted for processing with whisper service software. In another embodiment, the generated metadata that includes instructions for rendering the outgoing message may indicate a device coupled to the recipient's mobile device on which the outgoing message is to be rendered and whether the outgoing message is to be rendered audibly, visually, or tactilely. In another embodiment, the generated metadata that includes instructions for rendering the outgoing message may indicate that a first portion of the outgoing message is to be rendered on a first device connected to the recipient's mobile device and a second portion of the outgoing message is to be rendered on a second device connected to the recipient's mobile device. In another embodiment, the generated metadata that includes instructions for rendering the outgoing message may indicate that a first portion of the outgoing message is to be rendered audibly via a first device connected to the recipient's mobile device and a second portion of the outgoing message is to be rendered visually on a second device connected to the recipient's mobile device. In another embodiment, the generated metadata that includes instructions for rendering the outgoing message may indicate that a vibration motor should be activated when rendering the outgoing message. In another embodiment, identifying a format and a message type for sending the outgoing message to the recipient's mobile device may include determining whether the outgoing message includes an interrogatory posed by a sender, prompting the sender to provide a set of recipient responses related to the interrogatory when the outgoing message includes the interrogatory, and modifying the outgoing message to include the set of recipient responses based on an input received from the sender. In another embodiment, generating metadata that includes instructions for rendering the outgoing message based on the determined availabilities of message rendering devices coupled to the recipient's mobile device may include generating metadata that indicates that each response in the set of recipient responses is to be rendered differently.
In another embodiment, generating metadata that includes instructions for rendering the outgoing message based on the determined availabilities of message rendering devices coupled to the recipient's mobile device may include generating metadata that includes instructions for rendering the set of recipient responses using at least one of large text, small text, colored text, blinking text, animated text, sound rendered on a device coupled to the recipient's mobile device, an image rendered on a device coupled to the recipient's mobile device, and information defined within a configuration file.
FIG. 6 illustrates anembodiment method600 for a sender's computing device to create formatted messages based on a recipient's status information. As described above, in various embodiments the sender's computing device, such as a smart phone or laptop computer, may request presence update messages directly from the recipient's mobile device or a server maintaining status information for the recipient's mobile device. The sender's computing device may execute a whisper app (or whisper software services) and, based on status information indicated in presence update messages, may generate messages that direct the recipient's mobile device and associated devices (e.g., personal hub, wireless earpiece, etc.) to render the messages in a convenient and effective manner. In various embodiments, the whisper app may be installed and the operations inmethod600 may be performed by the sender's computing device or any other computing device wired or wirelessly connected to the phone, such as a personal hub, a laptop computer, a wireless earpiece, a tablet device, etc.
In block 602, the sender's computing device may receive a presence update message. For example, the sender's computing device may receive a presence update message that includes status information of devices currently connected to the recipient's smartphone. In block 604, the whisper app may determine the recipient's availabilities based on status information in the presence update message. In other words, the whisper app may create availability conclusions that indicate the availability of devices associated with the recipient to render message contents. For example, the whisper app, running on the processor of the sender's computing device, may interpret presence or status information indicated within the received presence update message and may determine availabilities of the recipient. Availabilities (or availability conclusions) may be codes, classifications, or categories that define the ability of the recipient to receive and interact with various forms of communications (i.e., status information). In particular, the availabilities may be based on the connectivity of the recipient's devices (e.g., personal hub, wireless earpieces, etc.). For example, the whisper app may conclude that the recipient is fully available for an interactive audio message, as the recipient's smart phone browser app is active and his personal hub and wireless earpiece are both connected. As another example, the whisper app may conclude that the recipient has limited availability, as his wireless earpiece microphone is configured to a mute setting, but his personal hub is active. As another example, the status information may indicate that the sender is within the recipient's privileged list and may transmit whisper messages.
The whisper app may parse the presence update message into discrete data elements and evaluate the variables singularly or in combination. For example, the whisper app may interpret motion data reported from an accelerometer within a personal hub of the recipient and conclude that the recipient is active. As another example, the whisper app may conclude that the recipient is asleep based on static motion data, a report that the personal hub and a wireless earpiece are equipped, an active screen saver application on the recipient's mobile device, and data use records that show no use for a long period. In an embodiment, the whisper app may determine availabilities (or availability conclusions) based on a rule-based system that may or may not use dependency relationships between various presence variables.
In an embodiment, the whisper app may weigh certain presence variables differently than others when determining availabilities. For example, motion data from the recipient's personal hub may be weighed as more important than the activity of a phone display lock. In an embodiment, the whisper app may utilize sender configuration files to determine the weight or importance of various presence variables. In an embodiment, the whisper app may utilize threshold variables and/or values to determine the recipient's availability. For example, the whisper app may only determine the recipient is fully available if motion data retrieved from the recipient's mobile device accelerometer represents moderate activity.
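A minimal sketch of such a weighted, threshold-based evaluation is shown below; the weights, variable names, and thresholds are illustrative assumptions rather than values taken from the description.

```python
def determine_availability(status, weights=None, threshold_full=0.7, threshold_limited=0.4):
    """Combine weighted presence variables into a coarse availability class."""
    weights = weights or {"hub_connected": 0.4, "earpiece_connected": 0.3,
                          "motion": 0.2, "screen_unlocked": 0.1}
    score = 0.0
    score += weights["hub_connected"] if status.get("hub_connected") else 0.0
    score += weights["earpiece_connected"] if status.get("earpiece_connected") else 0.0
    # Motion above a small threshold counts as "active" and is weighed more than the screen lock.
    score += weights["motion"] if status.get("motion_g", 0.0) > 0.05 else 0.0
    score += weights["screen_unlocked"] if not status.get("display_locked", True) else 0.0

    if score >= threshold_full:
        return "fully_available"
    if score >= threshold_limited:
        return "limited"
    return "unavailable"

print(determine_availability({"hub_connected": True, "earpiece_connected": True,
                              "motion_g": 0.3, "display_locked": True}))
# -> fully_available (0.4 + 0.3 + 0.2 = 0.9)
```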
In an embodiment, the presence update message may contain recipient availability conclusions conducted by whisper app services on the recipient's mobile device prior to transmission of the presence update message. For example, the recipient's mobile device may evaluate the recipient's various devices, determine presence variables, and conclude that the recipient has low availability as he is in a meeting.
Inblock606, the whisper app may identify a message type and format for outgoing messages based on the determined availabilities. For example, the whisper app may identify a certain message format and/or message type recommended to be used for messages to the recipient based on availability conclusions. In an embodiment, based on the availabilities, the whisper app may identify (or recommend) a particular type of transmission. For example, the whisper app may suggest that the recipient is most available to receive an SMS message as the recipient's personal hub is not currently connected to the recipient's smart phone. The whisper app may identify and/or recommend other message types, such as email, whisper messages, SMS text messages, audio messages, visual messages, video calls, telephone calls, and messages formatted for processing with whisper service software. The whisper app may also identify a recommended message format or content guidelines for transmissions based on the determined availabilities. For example, as the recipient has limited data availability due to a low reported data rate, the whisper app may identify that the sender should transmit a short text instead of an email with large attachments.
In an embodiment, the whisper app may render a graphical dialog window on the sender's computing device which displays identified message and format recommendations. In another embodiment, the whisper app may allow the sender to accept or reject recommendations. For example, the sender may choose not to follow the whisper app's recommended message type of a whisper message and may send an email message instead. In an embodiment, the whisper app may automatically impose formatting and message type limitations on the sender based on the determined availabilities. For example, after interpreting the recipient's availability as minimal due to a reported calendar entry of a meeting at the current time and a recipient status message of "Do Not Disturb," the whisper app may disallow the creation of a whisper audio message.
In an embodiment, if the sender desires to transmit a message including an interrogatory (or question) for the recipient, the whisper app may identify formatting for rendering the message as a selection list. For example, if the determined availabilities describe the recipient as unable to provide a long response, the whisper app may prompt the sender to create a simple question message with a set of responses from which the recipient may choose using his smart phone, personal hub, or other connected devices, such as wireless earpieces. In an embodiment, the whisper app may prompt the sender to input whether a transmission requires a binary response from the recipient (i.e., ‘yes’ or ‘no’). In an embodiment, the whisper app may prompt the sender to provide other formatting aspects to messages, such as optional text effects (e.g., small text, large text, blinking text, animated text, bold text, colored text, etc.), whether a response is audible (e.g., a sound response), and whether a response is to be rendered from a particular output device associated with the recipient's mobile device (e.g., a visual response displayed on the recipient's personal hub, an audible response rendered on an earpiece device, etc.). In an embodiment, the whisper app may prompt the sender to use a predefined or template format for the message (e.g., a format or other information defined within a configuration file or user preference file).
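One simple way to map availability conclusions and a few status flags to a recommended message type is sketched below; the categories, field names, and thresholds are assumptions chosen only to illustrate the idea.

```python
def recommend_message_type(availability, status):
    """Map an availability class plus a few status flags to a suggested message type."""
    if availability == "fully_available" and status.get("earpiece_connected"):
        return "whisper_audio"
    if availability == "limited" and status.get("hub_connected"):
        return "whisper_text"           # short text rendered on the personal hub display
    if status.get("data_rate_kbps", 0) < 100:
        return "sms"                    # low data rate: avoid large attachments
    return "email"

print(recommend_message_type("limited", {"hub_connected": True, "data_rate_kbps": 50}))
# -> whisper_text
```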
Inblock608, the whisper app may prompt the sender to enter message contents for an outgoing message to the recipient. For example, the sender may provide message contents, such as statements, comments, and interrogatories (or questions). In an embodiment, the whisper app may render a series of questions to the sender on the sender's computing device which prompts the sender to enter information. For example, the whisper app may display questions, such as “Do you want to make a statement or pose a question to the recipient?” The whisper app may render graphical user interface buttons with which the sender may interact to proceed through the app's prompts. In an embodiment, the whisper app may open new applications for message contents input by the sender based on responses to prompts. For example, the whisper app may indicate that the recipient is best available on email or SMS text, provide the sender with a GUI button for each, and open either an email composition form or SMS text form based on the sender's input.
In an embodiment, the whisper app may evaluate the message contents and determine the likelihood that the recipient will be available based on the determined availabilities (e.g., availability conclusions and status information). For example, the whisper app may determine a low likelihood that a long email having questions for the recipient to answer will be responded to based on status information which describes the recipient as driving a car. In an embodiment, the whisper app may indicate likelihood determinations to the sender and may display alternatives to increase the likelihood of response by the recipient. For example, the whisper app may prompt the sender to simplify existing text, change the mode of transmission (e.g., from email to SMS text message or whisper message), or change the format of the message.
In determination block 609, the whisper app may determine whether the message contents contain an interrogatory. In other words, the whisper app may determine whether the sender is posing a question within the message contents to be answered by the recipient. In an embodiment, the whisper app may detect question mark symbols or question phrasing using natural language processing. If the message contents contain an interrogatory (i.e., determination block 609="Yes"), in block 610, the whisper app may prompt the sender to enter simple response options for the interrogatory. In other words, the sender may be prompted to provide responses the recipient can choose from to answer the interrogatory. For example, if the sender wants to send the recipient a question "What do you want for dinner?," the whisper app may prompt the sender to input a set of possible responses, such as "Steak" and "Sushi."
In block 611, the whisper app may modify the message contents based on received input data from the sender. In other words, the whisper app may modify the outgoing message to include a set of possible recipient responses to the interrogatory received from the sender in response to prompting. For example, the sender may input text data that represents a simple response option (e.g., "yes," "no," "steak," "sushi," etc.). In an embodiment, the whisper app may also record instructions that may be executed by the recipient's mobile device or personal hub if the recipient selects a particular simple response option. For example, the whisper app may record in a metadata instruction that if the recipient selects a particular response, then the recipient's mobile device may transmit a particular response transmission. As a further example, the metadata may direct the recipient's mobile device running the whisper app to generate and transmit an SMS text message to the sender's computing device if the recipient's personal hub detects a particular recipient response. Once the response values for the interrogatory are received from the sender, the whisper app may continue with the operations in determination block 609. For example, the whisper app may determine whether another interrogatory is within the message contents and thus additional response values may be received.
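The following sketch illustrates one possible way to detect an interrogatory and attach sender-supplied simple response options, together with per-option instructions; the dictionary layout and action names are assumptions for illustration only.

```python
def add_response_options(message):
    """If the message body poses a question, attach sender-supplied simple responses."""
    if "?" not in message["body"]:          # a fuller implementation might use NLP instead
        return message
    # In an interactive app these options would come from prompting the sender.
    options = message.get("proposed_options", ["yes", "no"])
    message["metadata"]["response_options"] = [
        # Each option carries an instruction executed when the recipient selects it,
        # e.g., send an SMS back to the sender's computing device.
        {"label": opt, "on_select": {"action": "send_sms", "text": opt}}
        for opt in options
    ]
    return message

msg = {"body": "What do you want for dinner?",
       "proposed_options": ["Steak", "Sushi"],
       "metadata": {}}
print(add_response_options(msg)["metadata"]["response_options"][0]["label"])  # Steak
```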
If the message contents do not contain an interrogatory (i.e., determination block609=“No”) or if the sender has provided simple response options for interrogatories within the message, inblock612 the whisper app may generate metadata that includes instructions for rendering the message contents based on determined availabilities. In other words, the metadata may include instructions indicating how the recipient's mobile device and connected devices may render the sender's outgoing message. The metadata may be code or other information that is inserted within, intermixed, and/or surrounding message contents. In an embodiment, the metadata may be readable by other whisper app implementations (e.g., on the recipient's mobile device) and may direct a particular device to render the message contents according to the sender's specifications or the limitations defined by the determined availabilities. For example, metadata may indicate different simple response options related to an interrogatory and may include instructions to render each simple response option differently (e.g., one on the left, one on the right, one loud, one soft, etc.). As another example, the metadata may direct a left earpiece to render a particular portion of the message contents. In an embodiment, the metadata may indicate whether the sender's outgoing message and/or its message contents should be rendered by the recipient's devices audibly, visually, or tactilely. For example, the metadata may direct the whisper app running on the recipient's mobile device to play audio in a wireless earpiece. As another example, the metadata may cause an audible beep at the recipient's personal hub upon receipt of a whisper message, email, or an SMS message. As yet another example, the metadata in the outgoing message may direct the recipient's mobile device, personal hub, and/or other connected device to vibrate or generate other haptic feedback (e.g., a vibration motor may be activated).
In an embodiment, the generated metadata may include instructions that direct different devices connected to the recipient's mobile device to render different portions of the message contents. For example, the metadata may instruct a first portion to be rendered on a first device connected to the recipient's mobile device and a second portion to be rendered on a second device connected to the recipient's mobile device. In another embodiment, the metadata may instruct a device to render different portions of the message contents in different manners, as described further below.
In an embodiment, the generated metadata may include instructions that direct one (or more) devices connected to the recipient's mobile device to render portions of the message contents in different manners. For example, the metadata may instruct a wireless earpiece connected to the recipient's mobile device to render a first portion audibly and may also instruct a personal hub device to render a second portion visually. The metadata may further include formatting or rendering instructions, such as whether to render message contents as large text, small text, colored text, blinking text, animated text, sound, an image, and/or using information defined within a configuration file (e.g., a preferred format, etc.).
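A hypothetical metadata structure of this kind might look like the following sketch, in which each entry names a target device, the portion of the message it should render, and the rendering mode; the keys and values are illustrative assumptions.

```python
def build_rendering_metadata(availability):
    """Produce per-device rendering instructions for the outgoing message."""
    instructions = []
    if availability.get("earpiece_left"):
        instructions.append({"device": "earpiece_left", "portion": "question",
                             "mode": "audio", "repeat": 1})
    if availability.get("earpiece_right"):
        instructions.append({"device": "earpiece_right", "portion": "option_yes",
                             "mode": "audio"})
    if availability.get("personal_hub"):
        instructions.append({"device": "personal_hub", "portion": "option_no",
                             "mode": "visual", "style": {"text": "large", "blink": False}})
        instructions.append({"device": "personal_hub", "mode": "haptic"})  # vibrate on receipt
    return {"render": instructions}

print(build_rendering_metadata({"earpiece_left": True, "personal_hub": True}))
```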
Inblock613, the whisper app may format the outgoing message based on the identified message type and identified format. In an embodiment, the outgoing message may also be formatted using the generated metadata. Inblock614, the sender's computing device may transmit the formatted outgoing message to the recipient, for instance to the recipient's mobile device. In an embodiment, the metadata may describe the message contents with a category description (e.g., question, statement, audio, whisper message, etc.).
FIG. 7 illustrates an embodiment method 700 for a recipient's mobile device, personal hub, and wireless earpieces receiving, rendering, and responding to a message formatted based on the recipient's status information. The mobile device may be a laptop computer, smart phone, tablet device, or other similar computing device. In various embodiments, any of the mobile device, personal hub, or the wireless earpieces may perform the following operations for processing received whisper messages and may each execute software capable of processing whisper messages, such as a whisper app.
Inblock702, the recipient's mobile device may receive an incoming message from a sender, for instance from a sender's computing device. For example, via a cellular network, the recipient's mobile device may receive a whisper message, visual message, telephone call, or other audio message.
Inblock704, the recipient's mobile device may obtain instructions for rendering and/or delivering the received message. In particular, the recipient's mobile device may determine whether the received message includes metadata, such as generated by a whisper app or whisper software running on the sender's computing device. For example, the recipient's mobile device may decode, parse and evaluate header information or other encoding information within the received message to obtain metadata, formatting data, message contents, and/or rendering instructions.
The recipient's mobile device may obtain instructions (e.g., metadata) that indicate instructions for delivering message contents to devices connected to the recipient's mobile device (e.g., output devices such as wireless earpieces, etc.). For example, detected metadata may include instructions for the recipient's mobile device to forward audio to a connected output device (e.g., the recipient's wireless earpiece). As another example, metadata may instruct a certain portion of the message contents to be performed as audio in a right wireless earpiece connected to the recipient's mobile device and another portion to be rendered as visual information on a connected personal hub device. The received message may also contain instructions that indicate formatting instructions for various portions of the received message. For example, metadata may include instructions for the recipient's mobile device to forward text-to-speech audio from one part of the received message to the recipient's left wireless earpiece and text-to-speech audio from another part of the message to the right wireless earpiece. As another example, metadata may indicate that certain text should be rendered in a certain color or as an animation. In an embodiment, metadata may also contain instructions directing the recipient's mobile device, personal hub or wireless earpieces to replay audio a certain number of times, slow down audio playbacks, increase the volume in certain sections of audio playbacks, and other effects.
In an embodiment, the received message may not contain metadata. In such a case, the received message may be handled by the recipient's mobile device in conventional manners (e.g., leave SMS text messages as unread and stored on the smart phone, send audio messages to voice mail, etc.). In an embodiment, the recipient's mobile device may interpret the received message's message type, and contents, and deliver the received message to various output devices, such as the personal hub and wireless earpieces, for rendering without metadata. For example, the recipient's mobile device may determine that a received message which does not have metadata is a text message from a contact on the recipient's privileged list and may render the message as audio for playing through the wireless earpieces.
Indetermination block706, the recipient's mobile device may determine whether devices are connected and/or available. In particular, the recipient's mobile device may determine whether output devices indicated in obtained instructions or metadata of the received message are connected to the recipient's mobile device and/or available for receiving and rendering message contents. For example, when metadata of the received message indicates particular message contents are to be rendered as audio by a wireless earpiece, the recipient's mobile device may determine whether the wireless earpiece is wirelessly connected (or paired) with the recipient's mobile device. In an embodiment, the recipient's mobile device, the personal hub, or any other computing device employed by the recipient may detect whether output devices are connected based on operations similar to those for determining status information as described above with reference toFIG. 5. For example, the recipient's mobile device may determine whether the personal hub is active and wirelessly paired such that the personal hub is available for rendering message contents.
If devices are connected and/or available (i.e., determination block706=“Yes”), inblock707 the recipient's mobile device may provide message contents to the devices for rendering. The recipient's mobile device may provide, transfer, and/or otherwise deliver message contents based on the obtained instructions (e.g., metadata) within the received message. For example, sound or visual message contents may be provided (or delivered) to a right wireless earpiece for rendering based on metadata within the received message. As another example, an instruction for a motor to generate a vibration may be transmitted to the personal hub. In an embodiment, the personal hub may receive the contents and instructions for delivery to wireless earpieces. The personal hub may execute obtained instructions in ways that include transferring data to wireless earpieces for playback, rendering messages on the display unit of the personal hub, activating vibrations within the personal hub and/or wireless earpiece devices, polling sensor units, etc. In an embodiment, the instructions may direct the personal hub to transmit the data to the earpieces at particular moments. For example, the personal hub may schedule data transmissions to the left wireless earpiece before transmissions to the right earpiece. In an embodiment, the recipient's mobile device, the personal hub, or any other computing device employed by the recipient may provide message contents to a subset of the output devices indicated in the obtained instructions or metadata based on the devices' availability (or connectivity) at the time of receipt of the received message. For example, if metadata indicates message contents are to be rendered by two separate wireless earpieces and only one earpiece is connected to the recipient's mobile device, the recipient's mobile device may provide contents to the one connected wireless earpiece.
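The routing and fallback behavior described above might be sketched as follows; the device names and metadata layout are assumptions carried over from the earlier examples, not an actual implementation.

```python
def deliver_message(message, connected_devices, fallback_device="mobile_display"):
    """Route each portion of the message to the device named in its metadata,
    falling back to the mobile device's own display when a target is absent."""
    report = []
    for instr in message.get("metadata", {}).get("render", []):
        target = instr.get("device", fallback_device)
        if target not in connected_devices:
            target = fallback_device
        # In a real system this would hand the content to a device driver or a
        # short-range radio link; here we simply record the routing decision.
        report.append({"device": target,
                       "portion": instr.get("portion", "body"),
                       "mode": instr.get("mode", "visual"),
                       "delivered": True})
    return report

msg = {"metadata": {"render": [
    {"device": "earpiece_left", "portion": "question", "mode": "audio"},
    {"device": "personal_hub", "portion": "options", "mode": "visual"},
]}}
# Only the personal hub is connected, so the audio portion falls back to the phone display.
print(deliver_message(msg, connected_devices={"personal_hub"}))
```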
If the devices are not connected or available (i.e., determination block 706="No"), or if message contents have been provided to the devices, in optional block 255 the recipient's mobile device may transmit a delivery confirmation message to the sender's computing device as described above with reference to FIG. 2.
Indetermination block710, the recipient's mobile device may determine whether the received message requires a response. In an embodiment, metadata may indicate a category describing the received message as containing an interrogatory (or question) for the recipient to answer. In another embodiment, the recipient's mobile device may analyze text or audio data within the received message to determine whether an interrogatory is posed. For example, the recipient's mobile device may execute a pattern matching routine to find well-known interrogatory patterns or symbols, such as a question mark. If the recipient's mobile device determines that the received message did not require a response (i.e., determination block710=“No”), then themethod700 may continue with the operations inblock702.
If the recipient's mobile device determines that the received message did require a response (i.e., determination block 710="Yes"), in determination block 712 the recipient's mobile device may determine whether the interrogatory requires a simple response options choice. For example, the interrogatory may include a set of simple responses the sender indicated within the message that the recipient may choose between to respond. In other words, the recipient's mobile device may determine whether the message contents and metadata indicate appropriate ways to present the question to the recipient using the personal hub and wireless earpieces. The recipient's mobile device may analyze metadata within the received message to detect simple response options, such as 'yes' or 'no'. For example, the metadata may contain a code or identifying information that marks the message as having simple response options. In an embodiment, the recipient's mobile device may analyze the text or audio of the received message to determine whether the message contains simple response options.
If the recipient's mobile device determines the interrogatory requires a simple response options choice (i.e., determination block712=“Yes”), inblock714 the recipient's mobile device may present simple response options via graphical user interface or audible indicators. In an embodiment, the recipient's mobile device may transmit instructions based on metadata in the received message to the various connected devices (e.g., earpiece devices, personal hub, etc.) to render these simple response options to the recipient. Instructions for simple response options may include the personal hub displaying large text response options (e.g., ‘yes’/‘no’, “steak”/“sushi”, etc.), displaying differently colored response options, showing options as blinking or static text, playing response options in a particular wireless earpiece (e.g., ‘yes’ audio is played in the right wireless earpiece and ‘no’ audio is played in the left wireless earpiece), and the like. In an embodiment, the metadata may direct the recipient's mobile device to query recipient user configuration files to determine how to instruct the various devices to display or render simple response options. For example, the recipient may have a presets configuration file that informs the recipient's mobile device to direct all ‘yes’ response options to the recipient's left wireless earpiece and ‘no’ response options to the right earpiece.
In block 716, response input data may be received from the recipient. In other words, the personal hub, wireless earpieces, and other devices connected to the recipient's mobile device may receive input data from the recipient indicating a choice of presented simple response options. Response input data may include graphical user interface input (e.g., selections on a graphical user interface button, etc.) and/or sensor data received from a sensor within a device connected to the recipient's mobile device (e.g., an accelerometer, a touch sensor, a gyroscope, and/or a magnetometer within a wireless earpiece, etc.). For example, the left wireless earpiece may detect an abrupt motion measurement using an on-board gyroscope and wirelessly transmit this motion data to the personal hub, which may in turn recognize that the recipient tapped the left wireless earpiece to indicate selection of the response option played there. As another example, the personal hub may detect recipient touch input on the side of the display unit which corresponds to a response option. As another example, the recipient may provide different input responses by tapping on the left wireless earpiece, tapping on the right wireless earpiece, tapping on both wireless earpieces simultaneously, and tapping on either while tapping on the graphical user interface of the personal hub or smart phone. In an embodiment, the recipient's mobile device may determine a response by the user not providing input (e.g., tapping, swiping, talking, etc.) within a certain period of time. In another embodiment, wireless earpieces, the personal hub, and/or the recipient's mobile device may include microphones configured to receive audio inputs from the recipient. For example, when the recipient taps or strikes a wireless earpiece that includes a microphone, that microphone may detect the sound of the tap or strike and the wireless earpiece may interpret a corresponding microphone signal as an input (e.g., a tap input to select a simple option rendered through the corresponding wireless earpiece, etc.).
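As an illustration of how a tap detected by an earpiece gyroscope could be mapped to the response option rendered there, consider the following sketch; the threshold value, device labels, and option mapping are assumptions made for this example.

```python
TAP_THRESHOLD_DPS = 250.0   # assumed angular-rate spike, in degrees per second

def interpret_tap(samples, device, option_map):
    """Treat a short spike in gyroscope readings as a tap on that earpiece and
    map it to the response option rendered through that earpiece."""
    if max(abs(s) for s in samples) >= TAP_THRESHOLD_DPS:
        return option_map.get(device)
    return None

# Assume 'yes' was rendered in the right earpiece and 'no' in the left.
options = {"earpiece_left": "no", "earpiece_right": "yes"}
readings = [3.1, 4.0, 310.5, 12.2]                            # spike -> tap detected
print(interpret_tap(readings, "earpiece_left", options))       # no
print(interpret_tap([1.0, 2.0], "earpiece_right", options))    # None (no tap)
```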
In an embodiment, the recipient's mobile device may send an audio or text message (e.g., SMS, whisper message, etc.) to the sender's computing device, execute other software, or establish a phone conversation in response to the recipient not selecting a simple response option. For example, if the recipient's mobile device does not receive a simple response option within a certain period of time or if it detects audio input data indicating the simple response options are inadequate (e.g., the recipient says into the wireless earpiece microphone “call him!”), the recipient's mobile device may initiate a phone call or transmit a whisper message.
Inblock718, if the received message did not require a simple response option choice (i.e., determination block712=“No”) or the recipient provided input indicating a choice (or a selection) of a simple response option, the recipient's mobile device may receive additional response data from the recipient. For example, the recipient's mobile device may receive one or more of a graphical user interface or button input (e.g., typing words, numbers, etc.), speech-to-text input, and/or other information the recipient wants to include in a response to the sender. In an embodiment, the recipient's mobile device may receive audio input from the recipient for dictation to an email, whisper message, or SMS text message.
In the various embodiments, the smart phone, personal hub, and other connected devices may receive input data from the recipient in the form of movement data (e.g., tapping, striking, swiping, etc.), audio data (e.g., recipient verbally responds to a question), or entered text responses.
Inblock720, the recipient's mobile device may generate a response message based on response data. For example, the recipient's mobile device may package any received responses or input data (e.g., text message for the sender, etc.) into an outgoing response message to be sent to the sender's computing device. The response message may be formatted in a similar manner as the operations described above with reference toFIG. 6. Inblock722, the recipient's mobile device may transmit the response message to the sender's computing device.
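A simple sketch of packaging the recipient's inputs into a response message is shown below; the field names are illustrative assumptions and do not reflect any particular message format.

```python
def build_response_message(sender_id, selection=None, extra_text=None, audio_note=None):
    """Package the recipient's selected option and any additional input for the sender."""
    response = {"to": sender_id, "type": "whisper_response"}
    if selection is not None:
        response["selected_option"] = selection
    if extra_text:
        response["text"] = extra_text
    if audio_note:
        response["attachments"] = [{"kind": "audio", "data": audio_note}]
    return response

print(build_response_message("sender-123", selection="Sushi", extra_text="Around 7pm?"))
```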
The various embodiments may be implemented in any of a variety of mobile devices, an example of which is illustrated inFIG. 8. For example, themobile device800 may include aprocessor802 coupled tointernal memories804 and810.Internal memories804 and810 may be volatile or non-volatile memories, and may also be secure and/or encrypted memories, or unsecure and/or unencrypted memories, or any combination thereof. Theprocessor802 may also be coupled to atouch screen display806, such as a resistive-sensing touch screen, capacitive-sensing touch screen infrared sensing touch screen, or the like. Additionally, the display of themobile device800 need not have touch screen capability. Additionally, themobile device800 may have one ormore antenna808 for sending and receiving electromagnetic radiation that may be connected to a wireless data link and/orcellular telephone transceiver816 coupled to theprocessor802. Themobile device800 may also includephysical buttons812aand812bfor receiving user inputs. Themobile device800 may also include apower button818 for turning themobile device800 on and off. In various embodiments, themobile device800 may also include amicrophone850 connected to theprocessor802 to receive an audio input.
The various embodiments described above may also be implemented within a variety of personal computing devices, such as a laptop computer 910 as illustrated in FIG. 9. Many laptop computers include a touchpad touch surface 917 that serves as the computer's pointing device, and thus may receive drag, scroll, and flick gestures similar to those implemented on mobile devices equipped with a touch screen display and described above. A laptop computer 910 will typically include a processor 911 coupled to volatile memory 912 and a large capacity nonvolatile memory, such as a disk drive 913 or Flash memory. The laptop computer 910 may also include a floppy disc drive 914 and a compact disc (CD) drive 915 coupled to the processor 911. The laptop computer 910 may also include a number of connector ports coupled to the processor 911 for establishing data connections or receiving external memory devices, such as USB or FireWire® connector sockets, or other network connection circuits for coupling the processor 911 to a network. In a notebook configuration, the computer housing includes the touchpad touch surface 917, the keyboard 918, and the display 919, all coupled to the processor 911. Other configurations of the computing device may include a computer mouse or trackball coupled to the processor (e.g., via a USB input), as are well known, which may also be used in conjunction with the various embodiments.
The various embodiments described above may also be implemented within a variety of personal hubs, such as a wrist watch type personal hub 1000 as illustrated in FIG. 10. A personal hub 1000 may include a processor 1002 coupled to internal memories 1004 and 1006. Internal memories 1004 and 1006 may be volatile or non-volatile memories, and may also be secure and/or encrypted memories, or unsecure and/or unencrypted memories, or any combination thereof. The processor 1002 may also be coupled to a touch screen display 1020, such as a resistive-sensing touch screen, capacitive-sensing touch screen, infrared sensing touch screen, or the like. Additionally, the personal hub 1000 may have one or more antenna 1008 for sending and receiving electromagnetic radiation that may be connected to a wireless data link and coupled to the processor 1002. The personal hub 1000 may also include physical buttons 1022 and 1010 for receiving user inputs as well as a slide sensor 1018 for receiving user inputs. The personal hub 1000 may also include a vibratory motor 1021 coupled to the processor 1002 to enable the personal hub 1000 to vibrate. The personal hub 1000 may also include various environment sensors or a sensor pack which may include sensors, such as a temperature sensor 1014, an accelerometer 1012, a touch sensor 1015, and a gyroscope 1013 coupled to the processor 1002. In an embodiment, the personal hub 1000 may also include a microphone 1050 coupled to the processor 1002 to receive an audio input.
The various embodiments described above may also be implemented within a variety of wireless earpieces, such as awireless earpiece104 as illustrated inFIG. 11A. Awireless earpiece104 may include aprocessor1102 coupled tointernal memories1104 and1106.Internal memories1104 and1106 may be volatile or non-volatile memories, and may also be secure and/or encrypted memories, or unsecure and/or unencrypted memories, or any combination thereof. Thewireless earpiece104 may include aphysical button1114 for receiving user inputs. Additionally, thewireless earpiece104 may have one ormore antenna1112 for sending and receiving electromagnetic radiation that may be connected to a wireless data link and coupled to theprocessor1102. Thewireless earpiece104 may include aspeaker1108 coupled to theprocessor1102 and configured to generate an audio output. Thewireless earpiece104 may also include amicrophone1116 coupled to theprocessor1102 to receive an audio input. Thewireless earpiece104 may also include various environment sensors or a sensor pack which may include sensors, such as atemperature sensor1117, anaccelerometer1118, atouch sensor1120, and agyroscope1119 coupled to theprocessor1102.
FIG. 11B illustrates an alternative embodiment of thewireless earpiece1150 in which themicrophone1116 may be positioned within a main housing of thewireless earpiece1150 rather than on a microphone boom extending toward the user's mouth when thewireless earpiece1150 is worn. In an embodiment, themicrophone1116 may be a directional microphone, and the earpiece housing may be configured such that themicrophone1116 is pointed toward the mouth of the user when the user may be wearing thewireless earpiece1150. Further audio processing of microphone data may be used to further direct the microphone sensitivity in order to capture audio data from the user's mouth. In this manner, themicrophone1116 may receive audio input from the user of thewireless earpiece1150.
Theprocessors802,911,1002, and1102 may be any programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various embodiments described above. In some devices, multiple processors may be provided, such as one processor dedicated to wireless communication functions and one processor dedicated to running other applications. Typically, software applications may be stored in theinternal memory804,810,912,913,1004,1006,1104, and1106 before they are accessed and loaded into theprocessors802,911,1002, and1102. Theprocessors802,911,1002, and1102 may include internal memory sufficient to store the application software instructions. In many devices the internal memory may be a volatile or nonvolatile memory, such as flash memory, or a mixture of both. For the purposes of this description, a general reference to memory refers to memory accessible by theprocessors802,911,1002, and1102 including internal memory or removable memory plugged into the device and memory within theprocessor802,911,1002, and1102 themselves.
In further embodiments, the communications between a personal hub, smart phone, laptop, and/or earpiece may be transmitted over wired data links or combinations of wired and wireless data links. In these embodiments, an example of which is illustrated inFIG. 1 discussed below, two or more of the personal hub, smart phone, laptop, and earpiece may be connected by various cables instead of, or in addition to, wireless data links. In such embodiments, the personal hub may be worn or carried by a user and may interact with a variety of mobile devices and/or accessories through wired connections, or combinations of wired and wireless connections, such as wired headphones, wireless headsets, wired earpieces, etc.
The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the order of steps in the foregoing embodiments may be performed in any order. Words such as "thereafter," "then," "next," etc. are not intended to limit the order of the steps; these words are simply used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles "a," "an" or "the" is not to be construed as limiting the element to the singular.
The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
In one or more exemplary aspects, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module which may reside on a tangible, non-transitory computer-readable storage medium. Tangible, non-transitory computer-readable storage media may be any available media that may be accessed by a computer. By way of example, and not limitation, such non-transitory computer-readable media may comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of non-transitory computer-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a tangible, non-transitory machine readable medium and/or computer-readable medium, which may be incorporated into a computer program product.
The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.

Claims (20)

What is claimed is:
1. A method for communicating delivery confirmation information related to received messages by a recipient's mobile device, the method comprising:
receiving a message in the recipient's mobile device identifying a device coupled to the recipient's mobile device via a short-range wireless communication technology;
obtaining from the received message instructions for rendering the received message on at least one of the recipient's mobile device or the device coupled to the recipient's mobile device via the short-range wireless communication technology, wherein obtaining from the received message instructions for rendering the received message on at least one of the recipient's mobile device or the device coupled to the recipient's mobile device via the short-range wireless communication technology includes decoding the received message to obtain metadata indicating the device on which the sender desires the received message to be rendered and at least one of sound or visual message contents;
determining whether the device indicated in the metadata is coupled to the recipient's mobile device via the short-range wireless communication technology;
providing the at least one of sound or visual message contents to the device indicated in the metadata in response to determining that the device is coupled to the recipient's mobile device;
generating a delivery confirmation message reporting whether the received message was delivered and, if the received message was delivered, a manner in which the received message was delivered; and
transmitting the delivery confirmation message to a sender of the received message.
2. The method ofclaim 1, wherein reporting the manner in which the received message was delivered includes reporting information describing at least one of an identity of the device coupled to the recipient's mobile device, a first indicator of whether message contents of the received message were successfully rendered by the device coupled to the recipient's mobile device, or a second indicator of whether message contents of the received message were queued for rendering by the device coupled to the recipient's mobile device.
3. The method of claim 1, wherein receiving the message in the recipient's mobile device comprises receiving at least one of an email, a whisper message, an SMS text message, an audio message, a visual message, a video call, a telephone call, or a message formatted for use with whisper software.
4. The method of claim 1, further comprising:
receiving input data from at least one of the recipient's mobile device or the device coupled to the recipient's mobile device in response to delivering the received message; and
generating the delivery confirmation message based on the received input data.
5. The method of claim 4, wherein receiving input data comprises receiving sensor data from at least one of:
a touch sensor;
an accelerometer;
a gyroscope; or
a magnetometer.
6. A computing device comprising:
a memory; and
a processor coupled to the memory, wherein the processor is configured with processor-executable instructions to perform operations comprising:
receiving a message in the recipient's mobile device identifying a device coupled to the recipient's mobile device via a short-range wireless communication technology;
obtaining from the received message instructions for rendering the received message on at least one of the recipient's mobile device or the device coupled to the recipient's mobile device via the short-range wireless communication technology, wherein obtaining from the received message instructions for rendering the received message on at least one of the recipient's mobile device or the device coupled to the recipient's mobile device via the short-range wireless communication technology includes decoding the received message to obtain metadata indicating the device on which the sender desires the received message to be rendered and at least one of sound or visual message contents;
determining whether the device indicated in the metadata is coupled to the recipient's mobile device via the short-range wireless communication technology;
providing the at least one of sound or visual message contents to the device indicated in the metadata in response to determining that the device is coupled to the recipient's mobile device;
generating a delivery confirmation message reporting whether the received message was delivered and, if the received message was delivered, a manner in which the received message was delivered; and
transmitting the delivery confirmation message to a sender of the received message.
7. The computing device of claim 6, wherein the processor is configured with processor-executable instructions to perform operations such that reporting the manner in which the received message was delivered includes reporting information describing at least one of an identity of the device coupled to the computing device, a first indicator of whether message contents of the received message were successfully rendered by the device coupled to the computing device, or a second indicator of whether message contents of the received message were queued for rendering by the device coupled to the computing device.
8. The computing device of claim 6, wherein the processor is configured with processor-executable instructions to perform operations such that receiving the message in the computing device comprises receiving at least one of an email, a whisper message, an SMS text message, an audio message, a visual message, a video call, a telephone call, or a message formatted for use with whisper software.
9. The computing device of claim 6, wherein the processor is configured with processor-executable instructions to perform operations further comprising:
receiving input data from at least one of the computing device or the device coupled to the computing device in response to delivering the received message; and
generating the delivery confirmation message based on the received input data.
10. The computing device of claim 9, wherein the processor is configured with processor-executable instructions to perform operations such that receiving input data comprises receiving sensor data from at least one of:
a touch sensor;
an accelerometer;
a gyroscope; or
a magnetometer.
11. A non-transitory processor-readable storage medium having stored thereon processor-executable software instructions configured to cause a processor in a computing device to perform operations comprising:
receiving a message in the recipient's mobile device identifying a device coupled to the recipient's mobile device via a short-range wireless communication technology;
obtaining from the received message instructions for rendering the received message on at least one of the recipient's mobile device or the device coupled to the recipient's mobile device via the short-range wireless communication technology, wherein obtaining from the received message instructions for rendering the received message on at least one of the recipient's mobile device or the device coupled to the recipient's mobile device via the short-range wireless communication technology includes decoding the received message to obtain metadata indicating the device on which the sender desires the received message to be rendered and at least one of sound or visual message contents;
determining whether the device indicated in the metadata is coupled to the recipient's mobile device via the short-range wireless communication technology;
providing the at least one of sound or visual message contents to the device indicated in the metadata in response to determining that the device is coupled to the recipient's mobile device;
generating a delivery confirmation message reporting whether the received message was delivered and, if the received message was delivered, a manner in which the received message was delivered; and
transmitting the delivery confirmation message to a sender of the received message.
12. The non-transitory processor-readable storage medium of claim 11, wherein the stored processor-executable instructions are configured to cause a processor to perform operations such that reporting the manner in which the received message was delivered includes reporting information describing at least one of an identity of the device coupled to the computing device, a first indicator of whether message contents of the received message were successfully rendered by the device coupled to the computing device, or a second indicator of whether message contents of the received message were queued for rendering by the device coupled to the computing device.
13. The non-transitory processor-readable storage medium of claim 11, wherein the stored processor-executable instructions are configured to cause a processor to perform operations such that receiving the message in the computing device comprises receiving at least one of an email, a whisper message, an SMS text message, an audio message, a visual message, a video call, a telephone call, or a message formatted for use with whisper software.
14. The non-transitory processor-readable storage medium of claim 11, wherein the stored processor-executable instructions are configured to cause a processor to perform operations further comprising:
receiving input data from at least one of the computing device or the device coupled to the computing device in response to delivering the received message; and
generating the delivery confirmation message based on the received input data.
15. The non-transitory processor-readable storage medium of claim 14, wherein the stored processor-executable instructions are configured to cause a processor to perform operations such that receiving input data comprises receiving sensor data from at least one of:
a touch sensor;
an accelerometer;
a gyroscope; or
a magnetometer.
16. A computing device comprising:
means for receiving a message identifying a device coupled to the computing device via a short-range wireless communication technology;
means for obtaining from the received message instructions for rendering the received message on at least one of the computing device or the device coupled to the computing device via the short-range wireless communication technology, wherein means for obtaining from the received message instructions for rendering the received message on at least one of the recipient's mobile device or the device coupled to the recipient's mobile device via the short-range wireless communication technology includes means for decoding the received message to obtain metadata indicating the device on which the sender desires the received message to be rendered and at least one of sound or visual message contents;
means for determining whether the device indicated in the metadata is coupled to the recipient's mobile device via the short-range wireless communication technology;
means for providing the at least one of sound or visual message contents to the device indicated in the metadata in response to determining that the device is coupled to the recipient's mobile device;
means for generating a delivery confirmation message reporting whether the received message was delivered and, if the received message was delivered, a manner in which the received message was delivered; and
means for transmitting the delivery confirmation message to a sender of the received message.
17. The computing device of claim 16, wherein means for generating a delivery confirmation message reporting the manner in which the received message was delivered includes means for reporting information describing at least one of an identity of the device coupled to the computing device, a first indicator of whether message contents of the received message were successfully rendered by the device coupled to the computing device, or a second indicator of whether message contents of the received message were queued for rendering by the device coupled to the computing device.
18. The computing device of claim 16, wherein means for receiving the message comprises means for receiving at least one of an email, a whisper message, an SMS text message, an audio message, a visual message, a video call, a telephone call, or a message formatted for use with whisper software.
19. The computing device of claim 16, further comprising:
means for receiving input data from at least one of the computing device or a device coupled to the computing device in response to the received message; and
means for generating a response message based on the received input data.
20. The computing device of claim 19, wherein means for receiving input data comprises means for receiving sensor data from at least one of:
a touch sensor;
an accelerometer;
a gyroscope; or
a magnetometer.
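
Illustrative example (not part of the patent disclosure): the Python sketch below models the recipient-side flow recited in independent claims 1, 6, 11, and 16, namely decoding metadata from a received message to learn on which coupled device the sender wants it rendered, checking whether that device is currently coupled over the short-range wireless link, routing the sound or visual contents to it, and building a delivery confirmation to return to the sender. The class names, field names, and device identifiers are hypothetical assumptions made for this sketch, not terms drawn from the specification.

    # Hypothetical sketch of the claimed recipient-side flow; all names and fields are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class ReceivedMessage:
        sender: str
        target_device: str            # device on which the sender wants the message rendered
        audio_content: bytes | None   # "sound" contents, if any
        visual_content: str | None    # "visual" contents, if any

    @dataclass
    class RecipientMobileDevice:
        coupled_devices: set[str] = field(default_factory=set)   # e.g., devices paired over a short-range link
        rendered_log: list[str] = field(default_factory=list)

        def is_coupled(self, device_id: str) -> bool:
            # Stand-in for checking the short-range wireless link state.
            return device_id in self.coupled_devices

        def render_on(self, device_id: str, message: ReceivedMessage) -> None:
            # Stand-in for forwarding the sound/visual contents to the coupled device.
            kinds = [k for k, v in (("audio", message.audio_content),
                                    ("visual", message.visual_content)) if v]
            self.rendered_log.append(f"{device_id}: rendered {', '.join(kinds)}")

    def handle_incoming(mobile: RecipientMobileDevice, message: ReceivedMessage) -> dict:
        """Deliver the message per its metadata and build a delivery confirmation."""
        if mobile.is_coupled(message.target_device):
            mobile.render_on(message.target_device, message)
            confirmation = {"delivered": True,
                            "device": message.target_device,
                            "manner": "rendered on coupled device"}
        else:
            confirmation = {"delivered": False,
                            "device": message.target_device,
                            "manner": "indicated device not coupled"}
        # In the claimed method this confirmation is transmitted back to the sender.
        return confirmation

    if __name__ == "__main__":
        phone = RecipientMobileDevice(coupled_devices={"wrist hub", "earpiece"})
        msg = ReceivedMessage(sender="alice", target_device="earpiece",
                              audio_content=b"\x01\x02", visual_content=None)
        print(handle_incoming(phone, msg))   # reports delivery on the earpiece

A real implementation would sit on the platform's messaging and short-range wireless (e.g., Bluetooth) stacks, and would also cover the queued-for-rendering report of claims 2, 7, 12, and 17 and the sensor-derived input of claims 4-5, 9-10, 14-15, and 19-20, which this sketch omits.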
US13/686,899 | 2012-05-27 | 2012-11-27 | Personal hub presence and response | Expired - Fee Related | US9173074B2 (en)

Priority Applications (5)

Application Number | Priority Date | Filing Date | Title
US13/686,899 | US9173074B2 (en) | 2012-05-27 | 2012-11-27 | Personal hub presence and response
KR1020147036239A | KR101622506B1 (en) | 2012-05-27 | 2013-04-30 | Message presentation based on capabilities of a mobile device
EP13722921.7A | EP2856782B1 (en) | 2012-05-27 | 2013-04-30 | Message presentation based on capabilities of a mobile device
PCT/US2013/038745 | WO2013180873A2 (en) | 2012-05-27 | 2013-04-30 | Personal hub presence and response
CN201380027418.5A | CN104335612B (en) | 2012-05-27 | 2013-04-30 | Message presentation based on the capabilities of the mobile device

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US201261652229P | 2012-05-27 | 2012-05-27
US13/686,899 | US9173074B2 (en) | 2012-05-27 | 2012-11-27 | Personal hub presence and response

Publications (2)

Publication Number | Publication Date
US20130316746A1 (en) | 2013-11-28
US9173074B2 (en) | 2015-10-27

Family

ID=49622007

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US13/686,899 | Expired - Fee Related | US9173074B2 (en) | 2012-05-27 | 2012-11-27 | Personal hub presence and response

Country Status (5)

Country | Link
US (1) | US9173074B2 (en)
EP (1) | EP2856782B1 (en)
KR (1) | KR101622506B1 (en)
CN (1) | CN104335612B (en)
WO (1) | WO2013180873A2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN1642194A (en)* | 2004-01-17 | 2005-07-20 | 上海迪比特实业有限公司 | Method for processing call-in of mobile telephone set
CN101179520A (en)* | 2006-12-28 | 2008-05-14 | 腾讯科技(深圳)有限公司 | Method and system for sensing mail status
CN102014354B (en)* | 2010-11-09 | 2013-05-08 | 北京无限新锐网络科技有限公司 | Short message receipt system and method

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
EP0371609A2 (en) | 1988-11-29 | 1990-06-06 | International Business Machines Corporation | Electronic mail systems
US20020003468A1 (en)* | 1996-06-21 | 2002-01-10 | Toshiyuki Tsumura | Pager capable of EN bloc display of a set of messages
WO1998019438A1 (en) | 1996-10-29 | 1998-05-07 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and arrangement for handling of multimedia messages in a telecommunication system
US7016707B2 (en)* | 2000-06-21 | 2006-03-21 | Seiko Epson Corporation | Mobile telephone and radio communication device cooperatively processing incoming call
EP1202202A2 (en) | 2000-10-23 | 2002-05-02 | Hewlett-Packard Company | Validation and audit of e-media delivery
US20020132608A1 (en) | 2001-03-16 | 2002-09-19 | Masahito Shinohara | Transmission-origin mobile telephone capable of detecting the media a types and formats of a multimedia message that are receivable by destination mobile telephones in a multimedia communication system
US20050060381A1 (en) | 2002-07-01 | 2005-03-17 | H2F Media, Inc. | Adaptive electronic messaging
US20140256293A1 (en)* | 2003-12-08 | 2014-09-11 | Ipventure, Inc. | Adaptable communication techniques for electronic devices
US20050233758A1 (en) | 2004-04-16 | 2005-10-20 | Nokia Corporation | Method and apparatus to transfer recipient MMS capabilities to MMS originator
US20080040951A1 (en) | 2005-06-21 | 2008-02-21 | Lawrence Kates | System and method for wearable electronics
US7697941B2 (en) | 2005-08-02 | 2010-04-13 | Sony Ericsson Mobile Communications Ab | Updating presence in a wireless communications device
US20110056769A1 (en)* | 2006-02-17 | 2011-03-10 | Cameron International Corporation | Integrated lubrication module for compressors
US20070281762A1 (en)* | 2006-05-31 | 2007-12-06 | Motorola, Inc. | Signal routing to a communication accessory based on device activation
US7715873B1 (en) | 2006-06-23 | 2010-05-11 | Sprint Communications Company L.P. | Wearable accessories providing visual indicia of incoming events for wireless telecommunications device
US7702282B2 (en) | 2006-07-13 | 2010-04-20 | Sony Ericsoon Mobile Communications Ab | Conveying commands to a mobile terminal through body actions
US20080112567A1 (en) | 2006-11-06 | 2008-05-15 | Siegel Jeffrey M | Headset-derived real-time presence and communication systems and methods
US20090054039A1 (en) | 2007-02-21 | 2009-02-26 | Van Wijk Jacques | Methods and Systems for Presence-Based Filtering of Notifications of Newly-Received Personal Information Manager Data
US20100173655A1 (en)* | 2007-05-30 | 2010-07-08 | Sung-Yong Choi | Method of receiving confirmation of a short message service message and terminal apparatus for performing the same
US20140324437A1 (en)* | 2008-03-10 | 2014-10-30 | Lg Electronics Inc. | Communication device transforming text message into speech
US7996571B2 (en) | 2008-03-25 | 2011-08-09 | Nokia Corporation | Wireless coordination of apparatus interaction
US7996496B2 (en) | 2008-08-29 | 2011-08-09 | Sony Ericsson Mobile Communications Ab | Remote user interface in multiphone environment
US20100240345A1 (en) | 2009-03-17 | 2010-09-23 | Sony Ericsson Mobile Communications Ab | Event notifier device and headset
US20110059769A1 (en) | 2009-09-04 | 2011-03-10 | Brunolli Michael J | Remote phone manager
US20110316698A1 (en) | 2010-06-29 | 2011-12-29 | Nokia Corporation | Systems, methods, and apparatuses for providing adaptive user notifications
US20120044062A1 (en)* | 2010-08-22 | 2012-02-23 | Andrea Theresa Jersa | Wrist Wound Vibrating Device
US20120196629A1 (en)* | 2011-01-28 | 2012-08-02 | Protext Mobility, Inc. | Systems and methods for monitoring communications
US20130324194A1 (en)* | 2011-02-18 | 2013-12-05 | Microsoft Corporation | Automatic answering of a mobile phone
US20130225134A1 (en)* | 2012-02-28 | 2013-08-29 | Research In Motion Limited | Smart-phone answering service for handling incoming calls

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Burger E., et al., "Message Context for Internet Mail; rfc3458.txt", Jan. 1, 2003, pp. 1-17, XP015009241, ISSN: 0000-0003.
International Search Report and Written Opinion - PCT/US2013/038745, International Search Authority - European Patent Office, Dec. 18, 2013.
Partial International Search Report - PCT/US2013/038745 - ISA/EPO - Jul. 17, 2013.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20150358260A1 (en)* | 2014-06-09 | 2015-12-10 | Ca, Inc. | Dynamic buddy list management based on message content
US10356237B2 (en)* | 2016-02-29 | 2019-07-16 | Huawei Technologies Co., Ltd. | Mobile terminal, wearable device, and message transfer method

Also Published As

Publication number | Publication date
KR101622506B1 (en) | 2016-05-18
CN104335612A (en) | 2015-02-04
KR20150022897A (en) | 2015-03-04
WO2013180873A2 (en) | 2013-12-05
EP2856782A2 (en) | 2015-04-08
US20130316746A1 (en) | 2013-11-28
EP2856782B1 (en) | 2020-08-19
WO2013180873A3 (en) | 2014-01-30
CN104335612B (en) | 2019-02-05

Similar Documents

Publication | Publication Date | Title
US9173074B2 (en)Personal hub presence and response
US10602321B2 (en)Audio systems and methods
US9247525B2 (en)Systems and methods for providing notifications
US8948821B2 (en)Notification based on user context
US10652705B2 (en)Apparatus and method for processing call services in mobile terminal
US10205814B2 (en)Wireless earpiece with walkie-talkie functionality
US20140045463A1 (en)Wearable Communication Device
US20090248820A1 (en)Interactive unified access and control of mobile devices
CN105874443B (en)Method and apparatus for communicating between companion devices
CN110249612B (en)Call processing method and terminal
US10972614B2 (en)Systems and methods of audio notification upon state change

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:QUALCOMM INCORPORATED, CALIFORNIA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MILLER, BRIAN F.;MENENDEZ, JOSE;SAUHTA, ROHIT;SIGNING DATES FROM 20121107 TO 20121112;REEL/FRAME:029360/0149

STCF | Information on status: patent grant

Free format text:PATENTED CASE

FEPP | Fee payment procedure

Free format text:MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS | Lapse for failure to pay maintenance fees

Free format text:PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH | Information on status: patent discontinuation

Free format text:PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP | Lapsed due to failure to pay maintenance fee

Effective date:20191027

