A minimal implementation of a bi-manual remote robotic teleoperation system using VR hand tracking and camera streaming.

{| class="wikitable"
! Feature
! Status
|-
| VR and browser visualization
| ✔ Completed
|-
| Bi-manual hand gesture control
| ✔ Completed
|-
| Camera streaming (mono + stereo)
| ✔ Completed
|-
| Inverse kinematics
| ✔ Completed
|-
| Meta Quest Pro HMD + NVIDIA® Jetson AGX Orin™ Developer Kit
| ✔ Completed
|-
| URDF robot model
| ✔ Completed
|-
| 3-DoF end effector control
| ✔ Completed
|-
| Debug 6-DoF end effector control
| ⬜ Pending
|-
| Resets to various default poses
| ⬜ Pending
|-
| Tested on real-world robot
| ⬜ Pending
|-
| Record & playback trajectories
| ⬜ Pending
|}
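
The stereo stream in the table above amounts to capturing two camera frames, joining them side by side, and encoding the pair for transport to the headset. The sketch below shows that idea with OpenCV; the device indices and helper name are hypothetical, not taken from this repository.

<pre>
import cv2

# Hypothetical device indices for the left/right cameras on the robot computer.
left = cv2.VideoCapture(0)
right = cv2.VideoCapture(1)

def grab_stereo_jpeg():
    """Capture one stereo pair and return it as a single JPEG-encoded image."""
    ok_l, frame_l = left.read()
    ok_r, frame_r = right.read()
    if not (ok_l and ok_r):
        return None
    # Assumes both cameras deliver frames of the same height.
    side_by_side = cv2.hconcat([frame_l, frame_r])  # left | right in one image
    ok, jpeg = cv2.imencode(".jpg", side_by_side)
    return jpeg.tobytes() if ok else None
</pre>
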
=== Setup ===
<pre>
git clone https://github.com/kscalelabs/teleop.git && cd teleop
conda create -y -n teleop python=3.8 && conda activate teleop
pip install -r requirements.txt
</pre>

=== Usage ===
* Start the server on the robot computer:
<pre>python demo_hands_stereo_ik3dof.py</pre>
* Start ngrok on the robot computer:
<pre>ngrok http 8012</pre>
* Open the browser app on the HMD and go to the ngrok URL.

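The demo script started above is not reproduced on this page, but the general shape of a Vuer hand-tracking server is small. The sketch below is a guess at that shape based on Vuer's published hand-tracking examples (the Hands schema and the HAND_MOVE event); it is not the contents of demo_hands_stereo_ik3dof.py, and names may differ between Vuer versions.

<pre>
from asyncio import sleep

from vuer import Vuer, VuerSession
from vuer.schemas import DefaultScene, Hands

app = Vuer()

@app.add_handler("HAND_MOVE")
async def on_hand_move(event, session):
    # event.value carries the tracked hand keypoints streamed from the HMD;
    # a teleop loop would turn these into end effector targets for IK.
    print(event.value)

@app.spawn(start=True)
async def main(session: VuerSession):
    session.set @ DefaultScene()          # scene rendered in the headset browser
    session.upsert @ Hands(stream=True)   # ask the headset to stream hand poses
    while True:
        await sleep(1.0)
</pre>

Opening the server through the ngrok HTTPS URL also satisfies the secure-origin requirement that browsers impose on WebXR hand tracking.
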
=== Dependencies ===
* Vuer is used for visualization
* PyBullet is used for inverse kinematics
* ngrok is used for networking
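
Of these, PyBullet does the kinematic work: given a target end effector position derived from the tracked hands, it returns joint angles for the arm. The sketch below shows position-only IK, which corresponds to the 3-DoF end effector control listed above; the URDF path, link index, and target are placeholders rather than values from this repository.

<pre>
import pybullet as p

p.connect(p.DIRECT)                                # headless physics client, no GUI
robot = p.loadURDF("robot.urdf", useFixedBase=True)

END_EFFECTOR_LINK = 6                              # placeholder link index
target_position = [0.3, 0.2, 0.5]                  # desired end effector xyz in metres

# Passing only a position (no orientation) solves position-only, i.e. 3-DoF, IK.
joint_angles = p.calculateInverseKinematics(robot, END_EFFECTOR_LINK, target_position)

# Drive the joints toward the IK solution; this assumes every joint in the
# URDF is movable, so the i-th IK value maps to joint index i.
for joint_index, angle in enumerate(joint_angles):
    p.setJointMotorControl2(robot, joint_index, p.POSITION_CONTROL, targetPosition=angle)
</pre>
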
=== Citation ===
<pre>
@misc{teleop-2024,
  title={Bi-Manual Remote Robotic Teleoperation},
  author={Hugo Ponte},
  year={2024},
  url={https://github.com/kscalelabs/teleop}
}
</pre>