The Apple Vision Pro is a mixed-reality headset developed by Apple. It was announced on June 5, 2023, at Apple's Worldwide Developers Conference (WWDC) and was released first in the United States, followed by other countries throughout 2024. Apple Vision Pro is Apple's first new major product category since the release of the Apple Watch in 2015.[8]
Apple markets Apple Vision Pro as a spatial computer in which digital media is integrated with the real world. Physical inputs such as motion gestures, eye tracking, and speech recognition can be used to interact with the system.[9] Apple has avoided describing the device as a virtual reality headset in its presentations and marketing.[10]
The device runs visionOS,[11] a mixed-reality operating system derived from iPadOS frameworks using a 3D user interface; it supports multitasking via windows that appear to float within the user's surroundings,[12] as seen by cameras built into the headset. A dial on the top of the headset can be used to mask the camera feed with a virtual environment to increase immersion. The OS supports avatars (officially called "Personas"), which are generated by scanning the user's face; a screen on the front of the headset displays a rendering of the avatar's eyes ("EyeSight"), which are used to indicate the user's level of immersion to bystanders and to assist in communication.[13]
On October 15, 2025, Apple announced an updated Apple Vision Pro featuring the M5 chip, which delivers improved performance, enhanced display rendering, extended battery life, and support for up to 120 Hz refresh rates. The updated model also introduced the Dual Knit Band, a redesigned headband option designed for improved comfort and fit.[14]
In May 2015, Apple acquired the German augmented reality (AR) company Metaio, originally spun off from Volkswagen.[15] That year, Apple hired Mike Rockwell from Dolby Laboratories. Rockwell formed a team called the Technology Development Group, which included Metaio co-founder Peter Meier and Apple Watch manager Fletcher Rothkopf. The team developed an AR demo in 2016 but was opposed by chief design officer Jony Ive and his team. Augmented reality and virtual reality (VR) expert and former NASA specialist Jeff Norris was hired in April 2017.[16][17] Rockwell's team helped deliver ARKit in 2017 with iOS 11. The team then sought to create a headset and worked with Ive's team; the decision to reveal the wearer's eyes through a front-facing eye display was well received by the industrial design team.[18]
The headset's development experienced a period of uncertainty with the departure of Ive in 2019. His successor, Evans Hankey, left the company in 2023.[19] Senior engineering manager Geoff Stahl, who reports to Rockwell, led the development of its visionOS operating system,[17][20] after previously working on games and graphics technology at Apple.[21] Apple's extended reality headset is meant as a bridge to future lightweight AR glasses, which are not yet technically feasible.[22][23] In November 2017, Apple acquired Canadian MR company Vrvana, founded by Bertrand Nepveu, for $30 million.[24][25] The Vrvana Totem was able to overlay fully opaque, true-color animations on top of the real world, rather than the ghost-like projections of other AR headsets, which cannot display the color black. It did so while avoiding the often-noticeable lag between the outside world and the camera feed shown to the wearer, while maintaining a 120-degree field of view at 90 Hz.[26] Vrvana's innovations, including IR illuminators and infrared cameras for spatial and hand tracking, were integral to the development of the headset.[27] According to leaker Wayne Ma, Apple originally planned to allow macOS software to be dragged from the display into the user's environment, but the idea was scrapped early on due to the limitations of being based on iPadOS; Ma also noted that the hand-tracking system was not precise enough for games. Workers discussed collaborations with brands such as Nike for working out with the headset, and others investigated face cushions better suited to sweaty, high-intensity workouts, but these plans were scrapped because of the battery pack and the fragile screen. A feature called "co-presence", a projection of a FaceTime user's full body, was also scrapped for unknown reasons.[28]
In May 2022, Apple's Board of Directors previewed the device.[29] The company began recruiting directors and creatives to develop content for the headset in June. One such director, Jon Favreau, was enlisted to bring the dinosaurs on his Apple TV+ show Prehistoric Planet to life.[30] By April, Apple was also attempting to attract developers to make software and services for the headset.[31] Apple filed over 5,000 patents for technologies which contributed to the development of Apple Vision Pro.[32]
Apple Vision Pro was announced at Apple's 2023 Worldwide Developers Conference (WWDC23) on June 5, 2023, with a planned launch in early 2024 in the United States at a starting price of US$3,499.[1][33]
On June 6, the day after the announcement, Apple acquired the AR headset startup Mira, whose technology is used at Super Nintendo World's Mario Kart ride. Mira also has a contract with the United States Air Force and Navy. Eleven of the company's employees were onboarded.[34]
On January 8, 2024, Apple announced that the release date of Apple Vision Pro in the United States would be on February 2, 2024.[35][9] Estimates of initial shipments ranged from 60,000 to 80,000 units.[36] Pre-orders began on January 19, 2024, at 5:00 a.m. PST[37] and the launch shipments sold out in 18 minutes.[38] Apple sold up to 200,000 units in the two-week pre-order period,[39] a majority of which were to be shipped five to seven weeks after launch day.[40]
It also became available for purchase in China, Hong Kong, Japan, and Singapore on June 28, 2024, in Australia, Canada, France, Germany, and the UK on July 12, 2024, and in South Korea and the UAE on November 15, 2024.[41][42]
The front of the headset covering the colored "EyeSight" display and cameras
Apple Vision Pro comprises approximately 300 components.[43] It has a curved laminated glass display on the front, an aluminum frame on its sides, a flexible cushion on the inside, and a removable, adjustable headband. The frame contains five sensors, six microphones, and 12 cameras. Through the lenses, users see two 3660 × 3200 pixel[3] 1.41-inch (3.6 cm) micro-OLED displays with a total of 23 megapixels, usually running at 90 FPS but able to adjust automatically to 96 or 100 FPS based on the content being shown. The user's eyes are tracked by a system of LEDs and infrared cameras, which form the basis of the device's iris scanner, Optic ID (used for authentication, like the iPhone's Face ID). Horizontally mounted motors adjust the lenses for individual eye positions to ensure clear, focused images that precisely track eye movements. Sensors such as accelerometers and gyroscopes track facial movements, minimizing discrepancies between the real world and the projected image.[43] Custom optical inserts, developed in partnership with Zeiss, are supported for users with prescription glasses and attach magnetically to the main lenses. The device's two speakers ("Audio pods") sit inside the headband in front of the user's ears and can also virtualize surround sound.[44][11][43] Two cooling fans about 4 cm (1.6 in) in diameter are placed near the eye positions to help with heat dissipation from high-speed data processing. An active noise control function counters distracting noises, including the fan sounds.[43] During the ordering process, users must scan their face using an iPhone or iPad with Face ID for fitting purposes; this can be done via the Apple Store app or at an Apple Store retail location.[45][46]
Apple Vision Pro uses the Apple M2 system on a chip. It is accompanied by a co-processor known as the Apple R1, which is used for real-time sensor input processing. The device can be purchased with three internal storage configurations: 256 GB, 512 GB, and 1 TB.[37] It is powered by an external battery pack that connects through a locking connector on the left side of the headband, twisting into place.[47][10] The battery pack connects to the headset using a 12-pin locking variant of the Lightning connector that can be removed with a SIM ejection tool.[48]
The user's face is scanned by the headset during setup to generate a persona, a realistic avatar used by OS features.[49] One such feature is "EyeSight", an outward-facing screen that displays the eyes of the user's persona. The eyes appear dimmed when the user is in AR and obscured when the user is fully immersed, indicating the user's level of environmental awareness. When someone else approaches or speaks, even if the user is fully immersed, EyeSight displays the persona's virtual eyes normally and makes the other person visible to the user.[47][50]
A digital crown dial on the headset controls the amount of virtual background occupying the user's field of view, ranging from a mixed-reality view, where apps and media appear to float in the user's real-world surroundings, to one that completely hides the user's surroundings. The dial can also be used to control the device's speaker volume.[51][47]
The Vision Pro travel case, shown here with the device and accessories inside
First-party consumer accessories for Apple Vision Pro include a US$199 travel case, Zeiss-manufactured lens inserts for users with vision prescriptions ($99 or $149 depending on the prescription),[52] a $199 light seal, and a $29 light seal cushion. The only official third-party accessory available at launch was a battery holder made by Belkin.[53][54][55]
A first-party developer adapter costing $299, which can only be purchased with a registered, paid Apple Developer account, replaces the right head-strap connection and adds a USB-C port for use by developers.[56][57][58] Code from diagnostic tools has revealed that the adapter is capable of interacting with Apple Vision Pro in a diagnostic mode.[59]
In November 2024, it was announced that Apple would sell a Belkin head strap for use with the Solo Knit Band.[60]
Apple Vision Pro runs visionOS (internally called xrOS before a last-minute change ahead of WWDC[61]), which is derived primarily from iPadOS core frameworks (including UIKit, SwiftUI, and ARKit), and MR-specific frameworks for foveated rendering and real-time interaction.[1][33]
The operating system uses a 3D user interface navigated via finger tracking, eye tracking, and speech recognition. Users can select an element by looking at it and pinching two fingers together, move it by moving their pinched fingers, and scroll by flicking their wrist. Apps are displayed in floating windows that can be arranged in 3D space. visionOS supports a virtual keyboard for text input, the Siri virtual assistant, and external Bluetooth peripherals including the Magic Keyboard, Magic Trackpad, and gamepads.[47][62] visionOS supports screen mirroring to other Apple devices using AirPlay.[63] It can also mirror the primary display of a macOS device via the "Mac Virtual Display" feature, and the Mac can then be controlled using peripherals paired with the headset.[63]
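The sketch below illustrates, under stated assumptions, how a minimal third-party app for this interface might be structured with the SwiftUI framework mentioned above, together with RealityKit for 3D content. The app name, view name, and the "DinoSpace" identifier are hypothetical, not drawn from Apple documentation; a standard button like the one shown responds to the system's look-and-pinch selection without any gesture-handling code.

```swift
import SwiftUI
import RealityKit

// Minimal sketch of a visionOS app. "HelloSpatialApp", "GreetingView",
// and the "DinoSpace" identifier are illustrative names only.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        // An ordinary SwiftUI window, presented by visionOS as a
        // floating window within the user's surroundings.
        WindowGroup {
            GreetingView()
        }

        // An optional fully immersive scene rendered with RealityKit.
        ImmersiveSpace(id: "DinoSpace") {
            RealityView { content in
                // Add 3D content; here, a simple sphere 20 cm in radius.
                content.add(ModelEntity(mesh: .generateSphere(radius: 0.2)))
            }
        }
    }
}

struct GreetingView: View {
    // System-provided action for opening an immersive space.
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        VStack(spacing: 20) {
            Text("Hello, visionOS")
            // A standard button; the user selects it by looking at it
            // and pinching two fingers together -- no gesture code needed.
            Button("Enter immersive space") {
                Task { _ = await openImmersiveSpace(id: "DinoSpace") }
            }
        }
        .padding(40)
    }
}
```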
visionOS supports vision apps from the App Store and is backward compatible with selected iOS and iPadOS apps; developers are allowed to opt out of visionOS compatibility.[64] Netflix, Spotify, and YouTube notably announced that they would not release visionOS apps at launch, nor support their iOS apps on the platform, and directed users to use their web versions in Safari.[65] Analysts suggested that this may have resulted from the companies' strained relationships with Apple over App Store policies such as mandatory 30% revenue sharing, including associated antitrust allegations.[66][67] In an interview, Netflix co-CEO Greg Peters stated that Apple Vision Pro was too niche for the company to support at this time, but that "we're always in discussions with Apple to try and figure that out".[68] A YouTube spokesperson later stated to The Verge that the service had plans to develop a visionOS app in the future.[69]
Before the official release of Apple Vision Pro, Samuel Axon of Ars Technica said that Apple Vision Pro was "truly something I had never seen before", noting the intuitiveness of its user interface in a choreographed demo given by Apple and praising a dinosaur tech demo for its immersiveness. Axon said that its displays were dim but "much better than other headsets I've used on this front, even if it still wasn't perfect", and that the personas looked "surreal" but conveyed body language better than a more stylized avatar (such as Animoji or Horizon Worlds).[46] He argued that Apple Vision Pro was not a virtual reality (VR) platform, nor a competitor to Meta Platforms's Quest (formerly Oculus) product line, due to its positioning as "primarily an AR device that just happens to have a few VR features" and not as a mass-market consumer product.[46] Media outlets observed that Meta had announced the Meta Quest 3 shortly before WWDC, seemingly in anticipation of Apple's announcement.[70][71][72] Following its release, Meta CEO Mark Zuckerberg stated that he had demoed the headset and liked its display resolution and eye tracking, but still believed the Quest 3 was the "better product" due to its lower price and Apple's "closed" ecosystem.[73]
Jay Peters of The Verge similarly noted that Apple did not present Apple Vision Pro as a VR platform or refer to the device as a headset, instead describing it as an AR device and "spatial computer", and that it demonstrated only non-VR games displayed in windows and controlled using an external gamepad, rather than fully immersive experiences such as games and social platforms that use motion controllers. He suggested that this positioning "leaves wiggle room for the likely future of this technology that looks nothing like a bulky VR headset: AR glasses".[74] App Store guidelines for visionOS similarly state that developers should refer to visionOS software as "spatial computing experiences" or "vision apps", and avoid terms such as "augmented reality" and "mixed reality".[75][76]
After its initial announcement, Apple Vision Pro was criticized for a cost seen as too high for the device to go mainstream;[77][78][79] its three priciest components are its camera and sensor array, its dual Apple silicon chips, and its twin 4K micro-OLED virtual reality displays. Apple is reportedly working on a cheaper model scheduled for release around the end of 2025, as well as a second-generation model with a faster processor.[80] Apple Vision Pro also faced criticism over its short battery life,[81] appearing distracting to others,[81] and its lack of HDMI input[82][83] and haptic feedback.[81]
A Vision Pro user shown with EyeSight on the front screen, in both 'eye' (top) and 'immersed' (bottom) modes
Apple Vision Pro received mixed to positive reviews. Nilay Patel of The Verge praised the headset's design as more premium and less "goofy"-looking than other existing VR headsets, felt that its displays were "generally incredible" in their sharpness and brightness, said it had the highest-quality video passthrough he had seen on an MR headset yet (even with a field of view narrower than the Meta Quest 3's), and found that its speakers had a "convincing" spatial audio effect. However, he felt that there was "so much technology in this thing that feels like magic when it works and frustrates you completely when it doesn't", citing examples such as the passthrough cameras (which "cannot overcome the inherent nature of cameras and displays"), eye and hand tracking that was "inconsistent" and "frustrating" to use (with parts of the visionOS interface demanding precision that the eye-tracking system could not meet), visionOS lacking a window-management tool similar to Exposé or Stage Manager, and the uncanny personas and EyeSight features (with the latter's visibility hampered by a dim, low-resolution display covered by reflective glass). Patel felt that Apple Vision Pro was meant to be a development kit for future AR glasses, as the device's current form was, from technological and philosophical standpoints, too limiting for Apple's ambitions, and "may have inadvertently revealed that some of these core ideas are actually dead ends."[84]
Joanna Stern of The Wall Street Journal echoed this sentiment, arguing that it was "the best mixed-reality headset I've ever tried" and "so much of what the Vision Pro can do feels sci-fi", but that "these companies know these aren't really the devices we want. They're all working toward building virtual experiences into something that looks more like a pair of regular eyeglasses. Until then, they're just messing with our heads."[85]
Reviews from buyers of Apple Vision Pro have been mixed. One person attempted to drive while using the device, which Apple warns against in the Apple Vision Pro user manual.[86] Other users posted videos of themselves using the device while walking, a feature not officially supported at launch.[87]
Some users have experimented with cooking while wearing the headset, which is not recommended by Apple; doing so lets them easily see step-by-step instructions while they cook.[88] Its use has also been documented as a potential tool in the operating room; additional use cases include education, productivity, sales, collaboration, and digital twins.[89][90]
On October 15, 2025, Apple introduced an updated Vision Pro model powered by the Apple M5 chip. Compared to the M2 chip used in the previous model, the M5 features more CPU cores, an improved GPU, and an improved Neural Engine. The R1 chip continues to handle sensor processing.[94]
The updated chip allows the Vision Pro to render 10 percent more pixels on the micro-OLED displays compared to the original model, resulting in sharper images and crisper text. The headset can also now increase its refresh rate up to 120 Hz (compared to a maximum of 100 Hz on the M2 model) for reduced motion blur.[14]
The M5 variant introduced a new "Dual Knit Band" as the standard headband option, designed to address many of the comfort-related complaints about the previous model. The band is similar to the Solo Knit Band of the previous model, but adds an upper head strap to offer more support to the user's head. Tungsten inserts were added to the lower strap to act as a counterweight for balance and stability.[14] The Dual Knit Band is also sold separately and is compatible with the previous-generation Vision Pro.
Pre-orders for the M5 Vision Pro began on October 15, 2025, in Australia, Canada, France, Germany, Hong Kong, Japan, the UAE, the UK, and the U.S., with China and Singapore following on October 17. The device became available in Apple Store locations on October 22, 2025. It is priced the same as the previous model, starting at $3,499 for the 256 GB configuration, with 512 GB and 1 TB configurations also available.[14]
The device is set to launch in South Korea and Taiwan on November 28, 2025.[95]
The user experience of the Apple Vision Pro centers on its visionOS interface, which relies on eye tracking, hand gestures, and voice input as the primary forms of interaction. Reviewers have noted that its spatial computing environment allows users to navigate apps and media in an immersive, multitasking-oriented workspace. While many assessments highlight the intuitiveness of its controls, some users have reported discomfort related to device weight, prolonged wear, and visual strain.[96]
The Apple Vision Pro includes a variety of accessibility settings intended to accommodate users with different physical and sensory abilities. These include features such as VoiceOver, Siri commands, Pointer Control, Zoom, Accessibility Reader, and Braille support.[97]
Vision Pro has been evaluated against the Derby Dozen heuristics list in comparison with the Meta Quest 3 to assess its accessibility and comfort for users. While both devices scored highly on the heuristics, evaluators noted that both struggled with weight on the user's face and cheekbones and with eye strain, potentially due to their eye-tracking capabilities.[96]
Ban, Masaharu; Eguchi, Ryosuke (February 28, 2024). Shiraishi, Takeshi; Tsunashima, Yuta; Nitta, Yuichi (eds.). "Vision Pro teardown". Nikkei Asia. Archived from the original on March 1, 2024.
Aros, M.; Arnold, H.; Chaparro, B. (2025). "Exploring Device Usability: A Heuristic Evaluation of the Apple Vision Pro and Meta Quest 3". In Stephanidis, C.; Ntoa, S.; Antona, M.; Salvendy, G. (eds.). HCI International 2025 Posters. Springer Nature Switzerland. pp. 3–12. https://doi.org/10.1007/978-3-031-94150-4_1