# arkit-depth-renderer
Displays the depth values received by the front-facing camera. The depth values are mapped onto a heat map and multiplied with the camera's color image. The resulting image is then used as the background of the augmented-reality SceneKit scene.
This example builds upon the official Creating Face-Based AR Experiences demo and is free to use. The original demo code is here.
- See additions to `ViewController.swift`.
- Depth frames can be accessed by any object that conforms to `ARSessionDelegate` using the `session(_ session: ARSession, didUpdate frame: ARFrame)` method.
- To align SceneKit rendering with the depth image, we use `frame.displayTransform` to get a matrix that properly aligns the projection.
- The raw frame data can be fed into Core Image for manipulation, for example with `CIImage(cvImageBuffer: depthBuffer)`.
- Once in Core Image, a temperature gradient can be applied, and the color image can be multiplied in using filters (see the Core Image docs).
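The delegate flow above can be sketched as follows. This is a minimal sketch, not the project's actual code: the class name, the use of `ARFaceTrackingConfiguration`, and the orientation/viewport values passed to `displayTransform` are assumptions.

```swift
import ARKit
import UIKit

// Minimal sketch of receiving depth frames via ARSessionDelegate.
// Names here are illustrative, not taken from the project.
final class DepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is nil on frames without a fresh depth map,
        // since depth arrives at a lower rate than color frames.
        guard let depth = frame.capturedDepthData else { return }
        let depthBuffer: CVPixelBuffer = depth.depthDataMap

        // displayTransform maps normalized image coordinates to normalized
        // view coordinates, aligning the projection with the rendered scene.
        let transform = frame.displayTransform(for: .portrait,
                                               viewportSize: view.bounds.size)
        _ = (depthBuffer, transform) // hand off to Core Image / SceneKit here
    }
}
```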
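The Core Image steps (temperature gradient, then multiply with the color image) might look like the sketch below. The specific filters, `CIFalseColor` and `CIMultiplyCompositing`, and the gradient colors are assumptions; the project's exact filter chain may differ.

```swift
import CoreImage

// Sketch: colorize a depth buffer and multiply it with the color image.
// Filter choices are assumptions, not the project's actual pipeline.
func heatMapComposite(depthBuffer: CVPixelBuffer,
                      colorBuffer: CVPixelBuffer,
                      context: CIContext) -> CGImage? {
    let depthImage = CIImage(cvImageBuffer: depthBuffer)
    let colorImage = CIImage(cvImageBuffer: colorBuffer)

    // Map depth values onto a false-color gradient.
    guard let falseColor = CIFilter(name: "CIFalseColor", parameters: [
        kCIInputImageKey: depthImage,
        "inputColor0": CIColor(red: 0, green: 0, blue: 1), // one end of gradient
        "inputColor1": CIColor(red: 1, green: 0, blue: 0), // other end
    ])?.outputImage else { return nil }

    // Multiply the heat map with the camera's color image.
    guard let composite = CIFilter(name: "CIMultiplyCompositing", parameters: [
        kCIInputImageKey: falseColor,
        kCIInputBackgroundImageKey: colorImage,
    ])?.outputImage else { return nil }

    return context.createCGImage(composite, from: colorImage.extent)
}
```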
Depth frames are not received as often as color frames. As a result, the camera feedback is not as smooth as a photo preview would normally be. I haven't found a way to change the camera configuration to increase the rate at which depth images are received.
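One place to look is the configuration's supported video formats, though (as of writing) none of them expose a separate rate for the depth stream. A sketch:

```swift
import ARKit

// Sketch: list the video formats ARKit offers for face tracking.
// None of these expose a depth-specific frame rate, which is why the
// depth stream's cadence can't simply be raised through configuration.
for format in ARFaceTrackingConfiguration.supportedVideoFormats {
    print("\(format.imageResolution) @ \(format.framesPerSecond) fps")
}
```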