This work describes an immersive control system for the Spot robot by Boston Dynamics, designed to track the head movements of the operator wearing the Meta Quest 2, while also utilizing touch controller commands for locomotion control.
Authors:
- Ali Yousefi, ali.yousefi@edu.unige.it
- Zoe Betta, zoe.betta@edu.unige.it
- Giovanni Mottola, giovanni.mottola@unige.it
- Carmine Tommaso Recchiuto, carmine.recchiuto@dibris.unige.it
- Antonio Sgorbissa, antonio.sgorbissa@unige.it
©2024 RICE Lab - DIBRIS, University of Genova
If you use this work in an academic context, please cite:
@inproceedings{Yousefi2024zedoculusspot,
  title={Immersive control of a quadruped robot with Virtual Reality Eye-Wear},
  DOI={10.1109/ro-man60168.2024.10731469},
  author={Yousefi, Ali and Betta, Zoe and Mottola, Giovanni and Recchiuto, Carmine Tommaso and Sgorbissa, Antonio},
  booktitle={2024 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)},
  year={2024}
}
This repository provides a modified version of the software available on zed-oculus. Since the IMU and touch input data are required for this work, `main.cpp` is modified so that it reads the angular velocities and touch input data through the `ts.HeadPose.AngularVelocity`, `InputState.Thumbstick[ovrHand_Right]`, and `InputState.Thumbstick[ovrHand_Left]` attributes, and sends them to the process executed by `main.py`. Additionally, since the ZED camera is not connected to the user PC with a USB cable, `main.cpp` is also modified to open the ZED camera from a network stream, by changing the `init_parameters` values passed to `zed.open(init_parameters)`, following the method shown HERE.
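As a rough sketch of the streaming-input change: in the repository this is done in C++ inside `main.cpp`, but the same ZED SDK call is available from the Python bindings (`pyzed`), which are used below for brevity. The sender IP and port are placeholders, not values taken from this repository.

```python
# Minimal sketch: open a ZED camera from a network stream instead of USB.
# The actual change lives in main.cpp (C++); this uses the ZED Python API,
# and the sender IP/port below are placeholder assumptions.
import pyzed.sl as sl

init_parameters = sl.InitParameters()
# Read frames from the streaming sender running on the robot-side Jetson
init_parameters.set_from_stream("192.168.220.50", 30000)

zed = sl.Camera()
if zed.open(init_parameters) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the ZED camera from the stream")
```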
Furthermore, the `scripts` folder is added to this package, which contains the software developed for the head-tracking task and for locomotion with the joysticks. The script files are described as follows:
Script | Description |
---|---|
`main.py` | Executed by the `main.cpp` file. Uses the `SpotInterface` and `Controller` class methods for the head-tracking and locomotion tasks. |
`controller.py` | Provides a simple closed-loop controller based on the `simple-pid` Python module through the method `get_hmd_controls(setpoints)`. Additionally, computes the locomotion control signals from the touch input reference signals through the method `get_touch_controls(setpoints)`. |
`spot_interface.py` | Initializes the Lease, eStop, Power, RobotState, and RobotCommand clients. Provides the method for sending the control signals to the robot, `set_controls(controls, dt)`, and for receiving the robot's angular velocities, `get_body_vel()`. |
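As an illustrative sketch of how these pieces fit together (the class and method bodies are assumptions based on the descriptions above; the PID gains and command timing are placeholders, not the repository's tuned values):

```python
import time
from simple_pid import PID
from bosdyn.client.robot_command import RobotCommandBuilder

class Controller:
    """Hypothetical sketch of controller.py: one PID loop per controlled axis."""
    def __init__(self):
        # Placeholder gains; the repository's tuned values may differ
        self.yaw_pid = PID(1.0, 0.0, 0.05, setpoint=0.0)

    def get_hmd_controls(self, setpoint, measured_yaw_rate):
        # Track the HMD angular-velocity reference with the closed loop
        self.yaw_pid.setpoint = setpoint
        return self.yaw_pid(measured_yaw_rate)

def set_controls_sketch(command_client, controls, dt):
    """Hypothetical sketch of SpotInterface.set_controls(controls, dt)."""
    v_x, v_y, v_rot = controls
    cmd = RobotCommandBuilder.synchro_velocity_command(v_x=v_x, v_y=v_y, v_rot=v_rot)
    # Spot velocity commands need an end time; dt bounds the command's validity
    command_client.robot_command(cmd, end_time_secs=time.time() + dt)
```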
The system architecture for this work is shown as follows:
A Wi-Fi bridge can be implemented on the Raspberry Pi board following this tutorial. Once it is ready, the components of the local network can be configured with the following IP addresses:
Component | IP Address |
---|---|
Raspberry Pi 4 Model B | 192.168.220.1 (Server - Ethernet and Wireless) |
Jetson Nano | 192.168.220.50 (Client - Ethernet) |
OMEN PC | 10.42.0.210 (Client - Wireless) |
Spot Robot | 10.42.0.211 (Client - Wireless) |
For the purpose of this work, the IMU data generated by the HMD is transmitted to the `main.py` control process using a named pipe. Moreover, the stereo camera images are transmitted from the robot to the HMD using a socket, with the method of this tutorial. Sending the control signals and receiving the robot state data is done using gRPC.
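For illustration, a minimal sketch of the pipe-reading side is shown below. The pipe name and the comma-separated line format are assumptions made for this example, not the repository's actual protocol.

```python
# Hypothetical sketch: main.py consuming the IMU/touch samples that main.cpp
# writes to a Windows named pipe. Pipe name and message format are assumed.
PIPE_PATH = r"\\.\pipe\hmd_data"

with open(PIPE_PATH, "r") as pipe:
    for line in pipe:
        # e.g. "wx,wy,wz,lx,ly,rx,ry": head angular velocity + both thumbsticks
        wx, wy, wz, lx, ly, rx, ry = map(float, line.strip().split(","))
        # ...feed these into Controller.get_hmd_controls / get_touch_controls
```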
- Windows 64-bit
- Spot SDK
- simple-pid
- ZED SDK 3.x and its dependencies (CUDA). Last tested with ZED SDK 3.2.2
- Oculus SDK (1.17 or later)
- GLEW included in the ZED SDK dependencies folder
- SDL
Download the sample and follow the instructions below:
- Create a folder called "build" in the source folder
- Open cmake-gui and select the source and build folders
- Generate the Visual Studio Win64 solution
- Open the resulting solution and change configuration to Release. You may have to modify the path of the dependencies to match your configuration
- Build solution
- On the robot side (Linux/Jetson), build and run the streaming sender using the method shown HERE.
- On the user side (Windows), run `ZED_Stereo_Passthrough.exe` in a terminal as follows:
./ZED_Streaming_Receiver <ip:port>
Once it is executed, the stereo passthrough from the ZED camera to the Oculus starts. Moreover, it will automatically run the `main.py` script, which provides the control system and the interface with the robot.
For future work, we aim to address the limitations of the current study. This includes implementing a more efficient communication system to control the robot from greater distances, and ensuring a comparable field of view between the HMD and the tablet-based controller.