Siri (/ˈsɪri/ SEER-ee; backronym: Speech Interpretation and Recognition Interface) is a digital assistant purchased, developed, and popularized by Apple Inc., which is included in the iOS, iPadOS, watchOS, macOS, tvOS, audioOS, and visionOS operating systems.[1][2] It uses voice queries, gesture-based control, focus-tracking, and a natural-language user interface to answer questions, make recommendations, and perform actions by delegating requests to a set of Internet services. With continued use, it adapts to users' individual language usage, searches, and preferences, returning individualized results.
Siri is a spin-off from a project developed by the SRI International Artificial Intelligence Center. Its speech recognition engine was provided by Nuance Communications, and it uses advanced machine learning technologies to function. Its original American, British, and Australian voice actors recorded their respective voices around 2005, unaware of the recordings' eventual usage. Siri was released as an app for iOS in February 2010. Two months later, Apple acquired it and integrated it into the iPhone 4S at its release on October 4, 2011, removing the separate app from the iOS App Store. Siri has since been an integral part of Apple's products, having been adapted into other hardware devices including newer iPhone models, iPad, iPod Touch, Mac, AirPods, Apple TV, HomePod, and Apple Vision Pro.
Siri supports a wide range of user commands, including performing phone actions, checking basic information, scheduling events and reminders, handling device settings, searching the Internet, navigating areas, finding information on entertainment, and engaging with iOS-integrated apps. With the release of iOS 10 in 2016, Apple opened up limited third-party access to Siri, including third-party messaging apps, as well as payment, ride-sharing, and Internet calling apps. With the release of iOS 11, Apple updated Siri's voice and added support for follow-up questions, language translation, and additional third-party actions. iOS 17 and iPadOS 17 enabled users to activate Siri by simply saying "Siri", while the previous command, "Hey Siri", is still supported. Siri was upgraded to use Apple Intelligence on iOS 18, iPadOS 18, and macOS Sequoia, and received a redesigned logo.
Siri's original release on the iPhone 4S in October 2011 received mixed reviews. It was praised for its voice recognition and contextual knowledge of user information, including calendar appointments, but was criticized for requiring stiff user commands and lacking flexibility. It was also criticized for lacking information on certain nearby places and for its inability to understand certain English accents. In 2016 and 2017, a number of media reports said that Siri lacked innovation, particularly against new competing voice assistants. The reports cited Siri's limited set of features, "bad" voice recognition, and undeveloped service integrations as causing trouble for Apple in the field of artificial intelligence and cloud-based services; the complaints were reportedly rooted in stifled development, caused by Apple's prioritization of user privacy and by executive power struggles within the company.[3] Its launch was also overshadowed by the death of Steve Jobs, which occurred one day after the launch.
The initial Siri prototype was implemented using the Active platform, a joint project between the Artificial Intelligence Center of SRI International and the Vrai Group at École Polytechnique Fédérale de Lausanne. The Active platform was the focus of the Ph.D. thesis of Didier Guzzoni, who joined Siri as its chief scientist.[11]
Siri was acquired by Apple Inc. in April 2010 under the direction of Steve Jobs.[12] Apple's first notion of a digital personal assistant appeared in a 1987 concept video, Knowledge Navigator.[13][14]
Siri has been updated with enhanced capabilities made possible by Apple Intelligence. In macOS Sequoia, iOS 18, and iPadOS 18, Siri features an updated user interface, improved natural language processing, and the option to interact via text by double-tapping the home bar, without having to enable the feature in the Accessibility menu on iOS and iPadOS. Apple Intelligence adds the ability for Siri to use personal context from device activities to make conversations more natural and fluid. Siri can give users device support and will gain larger app support via the Siri App Intents API. Using personal context, Siri will be able to deliver intelligence tailored to the user and their on-device information. For example, a user can say, "Play that podcast that Jamie recommended," and Siri will locate and play the episode without the user having to remember where it was mentioned. They could also ask, "When is Mom's flight landing?" and Siri will find the flight details and cross-reference them with real-time flight tracking to give an arrival time.[15][16] For more day-to-day interactions with Apple devices, Siri can now summarize messages in more apps than just Messages, such as Discord and Slack. According to users, this feature can be helpful but can also be inappropriate in certain situations. As a beta tester explained, this version of Siri with Apple Intelligence is still in the early development stages, so users shouldn't expect a vastly different experience.[17]
The original American voice of Siri was recorded in July 2005 by Susan Bennett, who was unaware it would eventually be used for the voice assistant.[18][19] A report from The Verge in September 2013 about voice actors, their work, and machine learning developments hinted that Allison Dufty was the voice behind Siri,[20][21] but this was disproven when Dufty wrote on her website that she was "absolutely, positively not the voice of Siri."[19] Citing growing pressure, Bennett revealed her role as Siri in October, and her claim was confirmed by Ed Primeau, an American audio forensics expert.[19] Apple has never acknowledged it.[19]
The original British male voice was provided by Jon Briggs, a former technology journalist who for 12 years narrated the hit BBC quiz show The Weakest Link.[18] After discovering he was Siri's voice by watching television, he first spoke about the role in November 2011. He acknowledged that the voice work was done "five or six years ago", and that he didn't know how the recordings would be used.[22][23]
In a joint interview of all three voice actors with The Guardian, Briggs said that "the original system was recorded for a US company called Scansoft, who were then bought by Nuance. Apple simply licensed it."[24]
For iOS 11, Apple auditioned hundreds of candidates to find new female voices, then recorded several hours of speech, including different personalities and expressions, to build a new text-to-speech voice based on deep learning technology.[25] In February 2022, Apple added Quinn, its first gender-neutral voice, as a fifth user option in the iOS 15.4 developer release.[26]
Siri was released as a stand-alone application for the iOS operating system in February 2010, and at the time, the developers were also intending to release Siri for Android and BlackBerry devices.[27] Two months later, Apple acquired Siri.[28][29][30] On October 4, 2011, Apple introduced the iPhone 4S with a beta version of Siri.[31][32] After the announcement, Apple removed the existing standalone Siri app from the App Store.[33] TechCrunch wrote that, though the Siri app supported the iPhone 4, its removal from the App Store might also have had a financial aspect for the company, in providing an incentive for customers to upgrade devices.[33] Third-party developer Steven Troughton-Smith, however, managed to port Siri to the iPhone 4, though without being able to communicate with Apple's servers.[34] A few days later, Troughton-Smith, working with an anonymous person nicknamed "Chpwn", managed to fully hack Siri, enabling its full functionality on iPhone 4 and iPod Touch devices.[35] Additionally, developers were able to successfully create and distribute legal ports of Siri to any device capable of running iOS 5, though a proxy server was required for Apple server interaction.[36]
Over the years, Apple has expanded the line of officially supported products, including newer iPhone models,[37] as well as iPad support in June 2012,[38] iPod Touch support in September 2012,[39] Apple TV support and the stand-alone Siri Remote in September 2015,[40] Mac and AirPods support in September 2016,[41][42] and HomePod support in February 2018.[43][44]
Siri also offers numerous pre-programmed responses to amusing questions. Such questions include "What is the meaning of life?", to which Siri may reply "All evidence to date suggests it's chocolate"; "Why am I here?", to which it may reply "I don't know. Frankly, I've wondered that myself"; and "Will you marry me?", to which it may respond "My End User Licensing Agreement does not cover marriage. My apologies."[49][50] Users can also make statements to Siri, such as "I am your father", to which it may reply "Nooooo!".
Siri was initially limited to female voices; in June 2013, Apple announced that it would gain a gender option, adding a male voice counterpart.[51]
In September 2014, Apple added the ability for users to speak "Hey Siri" to activate the assistant without physically handling the device.[52]
In September 2015, the "Hey Siri" feature was updated to include individualized voice recognition, a presumed effort to prevent non-owner activation.[53][54]
With the announcement of iOS 10 in June 2016, Apple opened up limited third-party developer access to Siri through a dedicated application programming interface (API). The API restricts the usage of Siri to engaging with third-party messaging apps, payment apps, ride-sharing apps, and Internet calling apps.[55][56]
In iOS 11, Siri is able to handle follow-up questions, supports language translation, and opens up to more third-party actions, including task management.[57][58] Additionally, users are able to type to Siri,[59] and a new, privacy-minded "on-device learning" technique improves Siri's suggestions by privately analyzing personal usage of different iOS applications.[60]
iOS 17 and iPadOS 17 allow users to simply say "Siri" to initiate Siri, and the virtual assistant now supports back-to-back requests, allowing users to issue multiple requests and conversations without reactivating it.[61] In the public beta versions of iOS 17, iPadOS 17, and macOS Sonoma, Apple added support for bilingual queries to Siri.[62]
Siri received mixed reviews during its beta release as an integrated part of the iPhone 4S in October 2011.
MG Siegler of TechCrunch wrote that Siri was "great," praising the potential Siri would have once it lost the beta tag:
The amount of times Siri hasn't been able to understand and execute my request is astonishingly low.... Just imagine what will happen when Apple partners with other services to expand Siri further. And imagine when they have an API that any developer can use. This really could alter the mobile landscape.[65]
[Siri] thinks for a few seconds, displays a beautifully formatted response and speaks in a calm female voice.... It's mind-blowing how inexact your utterances can be. Siri understands everything from, 'What's the weather going to be like in Tucson this weekend?' to 'Will I need an umbrella tonight?'... Once, I tried saying, 'Make an appointment with Patrick for Thursday at 3.' Siri responded, 'Note that you already have an all-day appointment about "Boston Trip" for this Thursday. Shall I schedule this anyway?' Unbelievable.[66]
Jacqui Cheng of Ars Technica wrote that Apple's claims of what Siri could do were bold, and the early demos "even bolder":
Though Siri shows real potential, these kinds of high expectations are bound to be disappointed.... Apple makes clear that the product is still in beta—an appropriate label, in our opinion.[67]
While praising its ability to "decipher our casual language" and deliver "very specific and accurate result," sometimes even providing additional information, Cheng noted and criticized its restrictions, particularly when the language moved away from "stiffer commands" into more human interactions. One example included the phrase "Send a text to Jason, Clint, Sam, and Lee saying we're having dinner at Silver Cloud," which Siri interpreted as sending a message to Jason only, containing the text "Clint Sam and Lee saying we're having dinner at Silver Cloud." She also noted a lack of proper editability, as saying "Edit message to say: We're at Silver Cloud and you should come find us," generated "Clint Sam and Lee saying we're having dinner at Silver Cloud to say we're at Silver Cloud and you should come find us."[67]
Google's executive chairman and former chief executive, Eric Schmidt, conceded that Siri could pose a competitive threat to the company's core search business.[68]
Natalie Kerris, a spokeswoman for Apple, told The New York Times:
Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn't always find what you want.... These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.[72]
In January 2016, Fast Company reported that, in then-recent months, Siri had begun to confuse the word "abortion" with "adoption", citing "health experts" who stated that the situation had "gotten worse." However, at the time of Fast Company's report, the situation had changed slightly, with Siri offering "a more comprehensive list of Planned Parenthood facilities", although "Adoption clinics continue to pop up, but near the bottom of the list."[73][74]
Siri has also not been well received by some English speakers with distinctive accents, including Scottish speakers[75] and Americans from Boston or the South.[76]
In March 2012, Frank M. Fazio filed a class-action lawsuit against Apple on behalf of people who bought the iPhone 4S and felt misled about the capabilities of Siri, alleging its failure to function as depicted in Apple's Siri commercials. Fazio filed the lawsuit in California and claimed that the iPhone 4S was merely a "more expensive iPhone 4" if Siri failed to function as advertised.[77][78] On July 22, 2013, U.S. District Judge Claudia Wilken in San Francisco dismissed the suit but said the plaintiffs could amend it at a later time. The reason given for dismissal was that the plaintiffs did not document enough misrepresentations by Apple for the trial to proceed.[79]
In June 2016, The Verge's Sean O'Kane wrote about the then-upcoming major iOS 10 updates, with a headline stating "Siri's big upgrades won't matter if it can't understand its users":
What Apple didn't talk about was solving Siri's biggest, most basic flaws: it's still not very good at voice recognition, and when it gets it right, the results are often clunky. And these problems look even worse when you consider that Apple now has full-fledged competitors in this space: Amazon's Alexa, Microsoft's Cortana, and Google's Assistant.[80]
Also writing for The Verge, Walt Mossberg had previously questioned Apple's efforts in cloud-based services, writing:[81]
...perhaps the biggest disappointment among Apple's cloud-based services is the one it needs most today, right now: Siri. Before Apple bought it, Siri was on the road to being a robust digital assistant that could do many things, and integrate with many services—even though it was being built by a startup with limited funds and people. After Apple bought Siri, the giant company seemed to treat it as a backwater, restricting it to doing only a few, slowly increasing number of tasks, like telling you the weather, sports scores, movie and restaurant listings, and controlling the device's functions. Its unhappy founders have left Apple to build a new AI service called Viv. And, on too many occasions, Siri either gets things wrong, doesn't know the answer, or can't verbalize it. Instead, it shows you a web search result, even when you're not in a position to read it.
In October 2016, Bloomberg reported that Apple had plans to unify the teams behind its various cloud-based services, including a single campus and reorganized cloud computing resources aimed at improving the processing of Siri's queries,[82] although another report from The Verge, in June 2017, once again called Siri's voice recognition "bad."[83]
In June 2017, The Wall Street Journal published an extensive report on the lack of innovation with Siri following competitors' advancement in the field of voice assistants. Noting that Apple workers' anxiety levels "went up a notch" on the announcement of Amazon's Alexa, the Journal wrote: "Today, Apple is playing catch-up in a product category it invented, increasing worries about whether the technology giant has lost some of its innovation edge." The report identified the primary causes as Apple's prioritization of user privacy, including its practice of retaining Siri searches for only six months under random identifiers, whereas Google and Amazon keep data until actively discarded by the user, and executive power struggles within Apple. Apple did not comment on the report, while Eddy Cue said: "Apple often uses generic data rather than user data to train its systems and has the ability to improve Siri's performance for individual users with information kept on their iPhones."[3][84]
In July 2019, an anonymous whistleblower, later revealed to be former Apple contractor Thomas le Bonniec, said that Siri regularly recorded some of its users' conversations even when it was not activated. The recordings were sent to Apple contractors who graded Siri's responses on a variety of factors. Among other things, the contractors regularly heard private conversations between doctors and patients, business and drug deals, and couples having sex. Apple did not disclose this in its privacy documentation and did not provide a way for its users to opt in or out.[85]
In August 2019, Apple apologized, halted the Siri grading program, and said that it plans to resume "later this fall when software updates are released to [its] users".[86] The company also announced "it would no longer listen to Siri recordings without your permission".[87] iOS 13.2, released in October 2019, introduced the ability to opt out of the grading program and to delete all the voice recordings that Apple has stored on its servers.[88] Users were given the choice of whether their audio data was received by Apple or not, with the ability to change their decision as often as they like. It was then made an opt-in program.
In May 2020, Thomas le Bonniec revealed himself as the whistleblower and sent a letter to European data protection regulators, calling on them to investigate Apple's "past and present" use of Siri recordings. He argued that, even though Apple had apologized, it never faced consequences for its years-long grading program.[89][90]
In December 2024, Apple agreed to a $95 million class-action settlement, compensating users of Siri-enabled devices from the past ten years. Additionally, Apple must confirm the deletion of Siri recordings made before 2019 (when the grading program became opt-in) and issue new guidance on how data is collected and how users can participate in efforts to improve Siri.[91]
Apple has introduced various accessibility features aimed at making its devices more inclusive for individuals with disabilities. The company provides users the opportunity to share feedback on accessibility features through email.[92] Some of the newer functionalities include Live Speech, Personal Voice, and Siri's recognition of atypical speech patterns.[93]
Accessibility features:
VoiceOver: This feature provides visual feedback for Siri responses, allowing users to engage with Siri through both visual and auditory channels.[94]
Voice-to-text and text-to-voice: Siri can transcribe spoken words into text, as well as read text typed by the user out loud.[95][96]
Text commands: Users can type what they want Siri to do.[97]
Personal voice: This allows users to create a synthesized voice that sounds like them.[98]
Siri, like many AI systems, can perpetuate gender and racial biases through its design and functionality. According to an article from The Conversation, Siri "reinforces the role of women as secondary and submissive to men" because its default is a soft, female voice.[99] Although Apple now offers a larger variety of voices with different accents and languages, this original choice perpetuated the idea of women servicing men. The article also explains how different settings of Siri's voice result in different responses, with the female voice programmed with more flirtatious statements than the male voice. Additionally, Siri may misinterpret certain accents or dialects, particularly those spoken by people from marginalized racial or ethnic backgrounds, making it less accessible to these groups. In an article for Scientific American, Claudia Lloreda explains that non-native English speakers have to "adapt our way of speaking to interact with speech-recognition technologies."[100] Furthermore, because Siri learns repeatedly from a predominantly Western user base, it may unintentionally reproduce a Western perspective, limiting representation and furthering biases in everyday interactions. Despite these issues, Siri also provides several benefits, especially for people with disabilities that would otherwise limit their ability to use technology and access the Internet.
The iOS version of Siri ships with a vulgar content filter; however, it is disabled by default and must be enabled by the user manually.[101]
In 2018, Ars Technica reported a new glitch that could be exploited by requesting that the definition of "mother" be read out loud. Siri would issue a response and ask the user if they would like to hear the next definition; when the user replied "yes," Siri would mention "mother" as being short for "motherfucker."[102] This resulted in multiple YouTube videos featuring the responses and how to trigger them. Apple fixed the issue silently. The content is picked up from third-party sources such as the Oxford English Dictionary and is not a message supplied by the corporation.[103]
^ a b c McKee, Heidi (2017). Professional Communication and Network Interaction: A Rhetorical and Ethical Approach. Routledge Studies in Rhetoric and Communication. London: Taylor and Francis. p. 167. ISBN 978-1-351-77077-4. OCLC 990411615. Retrieved December 1, 2018. "Siri's voices were recorded in 2005 by a company who then licensed the voices to Apple for use in Siri. The three main voices of Siri at original launch were Karen Jacobson (in Australia), Susan Bennett (in the United States), and Jon Briggs ..."
For a detailed article on the history of the organizations and technologies preceding the development of Siri, and their influence upon that application, see Bianca Bosker, 2013, "Siri Rising: The Inside Story Of Siri's Origins (And Why She Could Overshadow The iPhone)", in The Huffington Post (online), January 22, 2013 (updated January 24, 2013), accessed November 2, 2014.