VR driving 🚙 + eye tracking 👀 simulator based on CARLA for driving interaction research
Submission Video Demonstration (YouTube)
This project extends the Carla simulator to add virtual reality integration, a first-person maneuverable ego-vehicle, eye tracking support, and several immersion enhancements.
If you have questions, hopefully our F.A.Q. wiki page and issues page can answer some of them.
IMPORTANT: Currently DReyeVR only supports Carla version 0.9.13 with Unreal Engine 4.26.
Fully drivable virtual reality (VR) ego-vehicle with SteamVR integration (see EgoVehicle.h)
- SteamVR HMD head tracking (orientation & position)
- We have tested with the following devices:
| Device | VR Supported | Eye tracking | OS |
|---|---|---|---|
| HTC Vive Pro Eye | ✅ | ✅ | Windows, Linux |
| Quest 2 | ✅ | ❌ | Windows |
- While we haven't tested other headsets, they should still work for basic VR usage (not eye tracking) if supported by SteamVR.
- Eye tracking is currently ONLY supported on the HTC Vive Pro Eye since we use SRanipal for the eye-tracker SDK. We are happy to support more devices through contributions adding other SDKs.
- Vehicle controls:
- Generic keyboard WASD + mouse
- Support for a Logitech steering wheel with this open-source LogitechWheelPlugin
- Includes force-feedback with the steering wheel.
- We used a Logitech G923 Racing Wheel & Pedals
- A full list of supported devices can be found here, though we can't guarantee out-of-the-box functionality for devices we haven't tested.
- Realistic (and parameterizable) rear & side view mirrors
- WARNING: very performance intensive
- Vehicle dashboard:
- Speedometer (in miles-per-hour by default)
- Gear indicator
- Turn signals
- Dynamic steering wheel
- Adjustable parameters, responsive to steering input
- See our documentation on this here
- "Ego-centric" audio
- Responsive engine revving (throttle-based)
- Turn signal clicks
- Gear switching
- Collisions
- Fully compatible with the existing Carla PythonAPI and ScenarioRunner (see the PythonAPI sketch after this list)
- Minor modifications were made. See the Usage.md documentation.
- Fully compatible with the Carla Recorder and Replayer
- Including HMD pose/orientation & sensor reenactment
- Ability to hand off/take over control to/from Carla's AI wheeled vehicle controller
- Carla-based semantic segmentation camera (see Shaders/README.md)
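
Because the ego-vehicle appears as an ordinary Carla vehicle actor, a stock PythonAPI client can locate it and, for example, hand control over to Carla's AI controller. The snippet below is a minimal sketch against the plain Carla 0.9.13 PythonAPI; the role_name value used to identify the ego-vehicle is an assumption for illustration only (see Usage.md for how DReyeVR actually tags it).

```python
# Minimal sketch using the stock Carla 0.9.13 PythonAPI.
# Assumption (ours): the DReyeVR ego-vehicle is tagged with role_name "hero";
# check Usage.md / your config for the attribute DReyeVR actually uses.
import carla

client = carla.Client("127.0.0.1", 2000)
client.set_timeout(10.0)
world = client.get_world()

ego = None
for vehicle in world.get_actors().filter("vehicle.*"):
    if vehicle.attributes.get("role_name") == "hero":  # assumed tag
        ego = vehicle
        break

if ego is not None:
    # Hand off control to Carla's AI controller (take it back later with
    # set_autopilot(False)); the driver keeps seeing everything in VR.
    ego.set_autopilot(True)
```
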
The Carla-compatible ego-vehicle sensor (see EgoSensor.h) is an "invisible sensor" that tracks the following:
- Real-time eye tracking with the HTC Vive Pro Eye VR headset
- Eye tracker data includes:
- Timing information (based on the headset, world, and eye-tracker)
- 3D Eye gaze ray (left, right, & combined)
- 2D Pupil position (left & right)
- Pupil diameter (left & right)
- Eye Openness (left & right)
- Focus point in the world & hit actor information
- See DReyeVRData.h:EyeTracker for the complete list
- Eye reticle visualization in real time
- Real-time user inputs (throttle, steering, brake, turn signals, etc.)
- Image (screenshot) frame capture based on the camera
- Typically used in a Replay rather than in real time because it is highly performance intensive.
- Fully compatible with the LibCarla data serialization for streaming to a PythonAPI client (see LibCarla/Sensor; a subscription sketch follows this list)
- We have also tested and verified support for (rospy) ROS integration with our sensor data streams
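
Since the ego-sensor stream goes through LibCarla's standard serialization, a PythonAPI client can subscribe to it like any other Carla sensor. The sketch below is illustrative only: the "dreyevr" substring used to find the sensor and the assumption that the callback payload behaves like a regular Carla SensorData measurement are ours; see LibCarla/Sensor and the repo's PythonAPI examples for the real identifiers and fields.

```python
# Sketch of subscribing to the DReyeVR ego-sensor stream from a PythonAPI client.
# Assumptions (ours): the sensor's type_id contains "dreyevr" and the callback
# payload carries the eye-tracking data; consult LibCarla/Sensor for the actual
# blueprint id and data layout.
import carla

client = carla.Client("127.0.0.1", 2000)
client.set_timeout(10.0)
world = client.get_world()

ego_sensor = None
for sensor in world.get_actors().filter("sensor.*"):
    if "dreyevr" in sensor.type_id:  # assumed naming
        ego_sensor = sensor
        break

def on_data(data):
    # Print the frame number and timestamp that every Carla SensorData carries.
    print(f"frame={data.frame} timestamp={data.timestamp:.3f}")

if ego_sensor is not None:
    ego_sensor.listen(on_data)
```
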
- Custom DReyeVR config file for one-time runtime params. See DReyeVRConfig.ini and the illustrative excerpt after this list.
- Especially useful to change params without recompiling everything.
- Uses standard C++ I/O to read the file with minimal performance impact. See DReyeVRUtils.h.
- World ambient audio
- Birdsong, wind, smoke, etc. (see Docs/Sounds.md)
- Non-ego-centric audio (Engine revving from non-ego vehicles)
- Synchronized Replay with per-frame frame capture for post-hoc analysis (see Docs/Usage.md and the recorder/replay sketch after this list)
- Recorder/replayer media functions
- Added in-game keyboard commands for Play/Pause/Forward/Backward/etc.
- Static in-environment directional signs for natural navigation (see Docs/Signs.md)
- Adding weather to the Carla recorder/replayer/query (see this Carla PR)
- Custom dynamic 3D actors with full recording support (e.g. HUD indicators for direction, AR bounding boxes, visual targets, etc.). See CustomActor.md for more.
- (DEBUG ONLY) Foveated rendering for improved performance with gaze-aware (or fixed) variable rate shading
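
For a sense of what the config file looks like, here is a hypothetical excerpt; the section and parameter names below are made up for illustration, and the real ones live in DReyeVRConfig.ini.

```ini
; Hypothetical excerpt -- the actual sections and keys are defined in DReyeVRConfig.ini
[EgoVehicle]
SpeedometerInMPH=True    ; dashboard units (miles-per-hour by default)

[VariableRateShading]
Enabled=False            ; foveated rendering is debug-only
```
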
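Recording and replaying a session itself uses the standard Carla client calls; below is a minimal sketch with the stock 0.9.13 PythonAPI (the file name and timing arguments are placeholders). The DReyeVR-specific additions, such as HMD pose replay, in-game media keys, and per-frame capture, then operate on that replay (see Docs/Usage.md).

```python
# Record a session and replay it later -- stock Carla 0.9.13 recorder calls.
import carla

client = carla.Client("127.0.0.1", 2000)
client.set_timeout(10.0)

# Start recording (DReyeVR additionally logs HMD pose, eye tracking, etc.).
client.start_recorder("my_dreyevr_session.rec")
# ... drive around in VR ...
client.stop_recorder()

# Later: replay the whole file from t=0 for its full duration (duration=0),
# without re-attaching the spectator to a particular actor (follow_id=0).
client.replay_file("my_dreyevr_session.rec", 0.0, 0.0, 0)
```
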
See Docs/Install.md to:
- Install and build DReyeVR on top of a working Carla repository.
- Download the plugins DReyeVR requires for fancy features such as:
  - Eye tracking (SRanipal)
  - Steering wheel/pedals (Logitech)
- Set up a conda environment for the DReyeVR PythonAPI.
| OS | VR | Eye tracking | Audio | Keyboard+Mouse | Racing wheel | Foveated Rendering (Editor) |
|---|---|---|---|---|---|---|
| Windows | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Linux | ✅ | ❌ | ✅ | ✅ | ❌ | ❌ |
| MacOS | ❌ | ❌ | ✅ | ✅ | ❌ | ❌ |
- While Windows (10) is recommended for optimized VR support, all our work translates to Linux systems except for the eye tracking and hardware integration, which have Windows-only dependencies.
- Unfortunately the eye-tracking firmware is proprietary & does not work on Linux
- This is (currently) only supported on Windows because of some proprietary dependencies between the HTC SRanipal SDK and Tobii's SDK. Those interested in the Linux discussion for HTC's Vive Pro Eye tracking can follow the subject here (Vive), here (Vive), and here (Tobii).
- Additionally, the LogitechWheelPlugin we use currently only has Windows support, though it should be possible to use the G923 on Linux as per the Arch Wiki.
- Also, although MacOS is not officially supported by CARLA, we have development happening on an Apple Silicon machine and have active forks of CARLA + UE4.26 with MacOS 12+ support. Note that this is primarily for development, as it is the most limited system by far.
- See the F.A.Q. wiki for our Frequently Asked Questions wiki page.
- See Install.md to install and build DReyeVR.
- See Usage.md to learn how to use our provided DReyeVR features.
- See Development.md to get started with DReyeVR development and add new features.
- See Docs/Tutorials/ to view several DReyeVR tutorials, such as customizing the EgoVehicle, adding custom signs/props, and more.
If you use our work, please cite the corresponding paper:
```bibtex
@inproceedings{silvera2022dreyevr,
  title={DReyeVR: Democratizing Virtual Reality Driving Simulation for Behavioural \& Interaction Research},
  author={Silvera, Gustavo and Biswas, Abhijat and Admoni, Henny},
  booktitle={Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction},
  pages={639--643},
  year={2022}
}
```
- This project builds upon and extends the CARLA simulator
- This repo includes some code from CARLA: Computer Vision Center (CVC) at the Universitat Autonoma de Barcelona (UAB) & Intel Corporation.
- This repo includes some code from Hewlett-Packard Development Company, LP. See nvidia.ph. This is a modified diagnostic tool used during development.
- Custom DReyeVR code is distributed under the MIT License.
- Unreal Engine 4 follows its own license terms.
- Code used from other sources that is prefixed with a Copyright header belongs to those individuals/organizations.
- CARLA-specific licenses (and dependencies) are described on their GitHub