
HEBI Robotic Arm VR Teleoperation

Author: Yu Hang He, Email: yuhangheedu@gmail.com

Date: Last modified on 12/21/20

Keywords: HEBI robotic arm, VR, Teleoperation, Unity, Mixed Reality Toolkit 2, C#, ROS, RealSense D435

This tutorial will demonstrate how to create a VR teleoperation system for the HEBI 6-DoF robotic arm. The system consists of a VR application programmed in Unity using the Mixed Reality Toolkit and a ROS program that controls the HEBI robotic arm. This tutorial takes approximately 5 hours to complete.

Motivation and Audience

This tutorial's motivation is to demonstrate how to create a VR teleoperation system for controlling a robotic manipulator. This tutorial assumes the reader has the following background and interests:

  • Familiar with forward and inverse kinematics of robotic manipulators
  • Familiar with HEBI robotic arm
  • Familiar with C# and ROS
  • Familiar with Unity and Mixed Reality Toolkit 2

The rest of this tutorial is presented as follows: importing the robot model and applying kinematics, VR teleoperation, setting up ROS communication, streaming the RealSense camera feed, and controlling the gripper.

Prerequisite

The author assumes that the reader has already set up the ROS environment for the HEBI arm on a separate Linux machine. If not, the reader should follow the instructions from HEBI Robotics and try the Arm Node example. You can also check out the tutorial by Jason Kreitz on the HEBI arm.

The author also assumes that the reader is familiar with Unity and the Mixed Reality Toolkit before starting this tutorial. Unity provides some very beginner-friendly tutorials to get you started: Unity Tutorials. You can also learn about Microsoft's Mixed Reality Toolkit from these tutorials.

Import and Apply Kinematic to Robot Model

1. Obtain a URDF of the manipulator

  • Convert the xacro file to URDF using:
  • rosrun xacro xacro --xacro-ns -o model.urdf model.xacro

2. Add BioIK resource to Unity project

3. Add ROS # resources to Unity project

4. Add URDF resources to Unity project

5. Import the robot model into Unity through URDF using ROS #

  • Move the model.urdf file outside of the hebi_description file structure

  • Select the model.urdf file within Unity's Project explorer
  • Select Assets → Import Robot from URDF
  • You should see the imported HEBI Arm in Unity

  • In the Urdf Robot properties of the imported model, make sure that the Is Kinematic option is enabled and the Use Gravity option is disabled (a scripted alternative is sketched below)
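
If you prefer to enforce these settings from code, the following minimal sketch does the same thing at runtime. The class name MakeKinematic is hypothetical; the script simply walks every Rigidbody under the imported model.

| MakeKinematic.cs (hypothetical)
using UnityEngine;
 
// Attach to the root of the imported URDF model to enforce the same
// inspector settings from code: every Rigidbody becomes kinematic
// (joints are driven by Bio IK, not physics) with gravity disabled.
public class MakeKinematic : MonoBehaviour
{
    private void Awake()
    {
        foreach (Rigidbody rb in GetComponentsInChildren<Rigidbody>())
        {
            rb.isKinematic = true;
            rb.useGravity = false;
        }
    }
}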

6. Add the Bio IK component to the imported manipulator GameObject

7. Specify joints in Bio IK

  • Select the component that corresponds to a joint and click Add Joint

  • Make sure that joint orientation and position are appropriate
  • Enable the X, Y, or Z Motion that aligns with the joint's axis of rotation

  • Repeat the same step for all rotational joints

8. Create a GameObject as a child of the last joint and name it End_effector

9. Add Position and Orientation objectives to the End_effector in Bio IK

10. Create a Sphere GameObject and name it Target

  • Drag the Target into the Target Transform field of both the Position and Orientation objectives

11. To test whether Bio IK is working properly, play the scene and move the Target around. The arm should follow the position and orientation of the Target (a test script is sketched below)
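
If a VR controller is not yet set up, a small script can stand in for hand motion. The sketch below is illustrative only (the name TargetOrbitTest and its parameters are made up): it sweeps the Target in a horizontal circle so you can watch the arm track it.

| TargetOrbitTest.cs (hypothetical)
using UnityEngine;
 
// Attach to the Target sphere and play the scene: the Target orbits its
// starting position and the arm should follow via Bio IK.
public class TargetOrbitTest : MonoBehaviour
{
    public float radius = 0.1f; // circle radius in meters
    public float speed = 0.5f;  // revolutions per second
 
    private Vector3 center;
 
    private void Start()
    {
        center = transform.position;
    }
 
    private void Update()
    {
        float t = Time.time * speed * 2f * Mathf.PI;
        transform.position = center + new Vector3(Mathf.Cos(t), 0f, Mathf.Sin(t)) * radius;
    }
}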

12. Apply the joint limits for each individual joint in Bio IK.

13. In Bio IK options, change Motion Type to Realistic

14. Set the home position of the gripper by editing the joint positions (a scripted alternative is sketched after this list)

  • The home position of the 6 joints in radians is [0, 1.57, 2.14, 0, 1.58, 0]
  • J2, for example, is set to 1.57
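
As a scripted alternative to editing each joint by hand, the sketch below drives every Bio IK joint to the home pose at startup. It assumes the Bio IK Motion API exposes a SetTargetValue method (mirroring the IsEnabled/GetCurrentValue calls used by the publisher script later in this tutorial) and that Bio IK works in degrees; the class name HomePose is made up.

| HomePose.cs (hypothetical)
using UnityEngine;
 
public class HomePose : MonoBehaviour
{
    // Home position of the 6 joints in radians, from the step above
    private readonly double[] homeRadians = { 0, 1.57, 2.14, 0, 1.58, 0 };
 
    private void Start()
    {
        var hebi = GetComponent<BioIK.BioIK>();
        int i = 0;
        foreach (BioIK.BioSegment segment in hebi.Segments)
        {
            if (segment.Joint == null) continue;
            // Bio IK works in degrees, so convert from radians
            double degrees = homeRadians[i++] * 180.0 / Mathf.PI;
            // Assumption: Motion exposes SetTargetValue alongside IsEnabled
            if (segment.Joint.X.IsEnabled()) segment.Joint.X.SetTargetValue(degrees);
            else if (segment.Joint.Y.IsEnabled()) segment.Joint.Y.SetTargetValue(degrees);
            else if (segment.Joint.Z.IsEnabled()) segment.Joint.Z.SetTargetValue(degrees);
        }
    }
}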

VR Teleoperation

1. Add MRTK to the Unity project

2. Follow the instructions to properly configure MRTK for the Unity project

3. Enable Virtual Reality Support

  • On the toolbar, click Edit → Project Settings → Player → XR Settings → Virtual Reality Supported
  • Play the scene and the camera should follow the headset

4. Set the Target to track the controller

  • Add Orbital component to the Target
  • In SolverHandler component, set Tracked Target Type to Controller Ray and Tracked Handness to Right
  • In Orbital component, set Local Offset to 0 for all axes

5. Assign custom input actions to controller buttons

  • Create a new configuration profile by selecting the MixedRealityToolkit GameObject in Unity's Hierarchy and inspecting its components
  • Make a clone of the default configuration profile and name it something else

  • Similarly, clone the default input system profile

  • Clone the default input actions profile
  • Under the Input Actions tab within the Input System profile, add a New Action
  • Scroll down the list of actions to find the newly added action. Edit its name to Manipulation and change the Axis Constraint to Digital

  • Under the Controllers tab in Input System, clone the controller mapping profile
  • Expand Controller Definitions. Go to Generic OpenVR Right Hand Controller and click on Edit Input Action Map
  • Set entry 10's Action to Manipulation and its KeyCode to Joystick Button 5 (this corresponds to button A on the right Index controller).

6. Start and stop controller tracking based on the input action

  • Select the Target GameObject, disable the SolverHandler and Orbital components, and add an InputActionHandler component
  • Set the Input Action to Manipulation and uncheck the Is Focus Required option

  • Under the On Input Action Started event in InputActionHandler, add two callback functions by clicking the plus sign in the bottom right corner

  • Drag the Orbital and SolverHandler components into the two callback slots
  • On the Function option, select the bool enabled property of the respective component. This enables the components when the input action is triggered (the input action was assigned to a button press on the controller). Make sure the checkbox is checked, which sets the bool to true

  • Repeat similar steps under On Input Action Ended to disable the components on button release (a code-based alternative is sketched below)
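
The same wiring can also be done in code instead of the inspector. The sketch below is one possible approach using MRTK's IMixedRealityInputActionHandler interface; the class name ManipulationToggle is made up, and matching the action by its Description string assumes the action was named Manipulation exactly as in step 5.

| ManipulationToggle.cs (hypothetical)
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;
 
public class ManipulationToggle : MonoBehaviour, IMixedRealityInputActionHandler
{
    [SerializeField] private SolverHandler solverHandler;
    [SerializeField] private Orbital orbital;
 
    private void OnEnable()
    {
        // Register globally so action events arrive without focus
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityInputActionHandler>(this);
    }
 
    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityInputActionHandler>(this);
    }
 
    public void OnActionStarted(BaseInputEventData eventData)
    {
        // Track the controller while the Manipulation action (button A) is held
        if (eventData.MixedRealityInputAction.Description == "Manipulation")
        {
            solverHandler.enabled = true;
            orbital.enabled = true;
        }
    }
 
    public void OnActionEnded(BaseInputEventData eventData)
    {
        // Stop tracking on release
        if (eventData.MixedRealityInputAction.Description == "Manipulation")
        {
            solverHandler.enabled = false;
            orbital.enabled = false;
        }
    }
}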

7. Move the Target to the initial position of the gripper

  • Apply the appropriate transforms in the Transform and SolverHandler components to match the orientation of the gripper at the start

8. Test the VR teleoperation

  • At this point, you should be able to control the virtual HEBI arm model using the right Index controller while holding its A button

Setup ROS Communication

1. Install rosbridge-suite via

 $ sudo apt-get install ros-melodic-rosbridge-server

2. Place the file_server package in the src folder of the catkin workspace, then build it by running

 $ catkin_make

3. Modify the IP address and port number in ros_sharp_communication.launch, located in file_server/launch

| ros_sharp_communication.launch
<launch>
 
	<arg name="port" default="9090" />
 
	<include file="$(find rosbridge_server)/launch/rosbridge_websocket.launch">
		<arg name="port" value="$(arg port)" />
		<arg name="address" value="192.168.201.45" />
	</include>
 
	<node name="file_server" pkg="file_server" type="file_server" output="screen" />
 
</launch>

4. After sourcing setup.bash, try running

 $ roslaunch file_server ros_sharp_communication.launch

If you run into an error about No module named builtins, try

 $ pip install future

5. If the rosbridge websocket launched successfully, a confirmation message should appear in the terminal.

6. Test the websocket connection from Unity

  • Add a RosConnector component
  • Modify the ROS Bridge Server Url with the correct IP address and port

7. On a successful connection, the Unity console should display a confirmation message

Publish JointTrajectory Message through ROS #

1. The HEBI ROS example code Arm Node accepts ROS messages of type trajectory_msgs/JointTrajectory published on the topic /joint_waypoints

2. The following is an example of the ROS JointTrajectory message that the example code can accept

rostopic pub /joint_waypoints trajectory_msgs/JointTrajectory "header:
  seq: 0
  stamp:
    secs: 0
    nsecs: 0
  frame_id: ''
joint_names:
- ''
points:
- positions: [0.51, 2.09439, 2.09439, 0.01, 1.5707963, 0.01]
  velocities: [0, 0, 0, 0, 0, 0]
  accelerations: [0, 0, 0, 0, 0, 0]
  effort: []
  time_from_start: {secs: 0, nsecs: 0}

3. Quoted from the HEBI C++ example: "Note: for the JointTrajectory messages, you can ignore the header and the 'names' fields, as well as the 'efforts'. You must fill in the 'positions', 'velocities', and 'accelerations' vectors for each waypoint, along with the desired time_from_start for each waypoint (these must be monotonically increasing)."

4. To simplify, each JointTrajectory message in this project is treated as a trajectory with only one point. The VR teleoperation system continuously updates this single waypoint at a high rate to achieve continuous motion.

5. Create a custom ROS # publisher that publishes JointTrajectory messages and updates the positions field with the joint angles from Bio IK.

| JointTrajectoryPublisher.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
 
namespace RosSharp.RosBridgeClient
{
    public class JointTrajectoryPublisher : UnityPublisher<MessageTypes.Trajectory.JointTrajectory>
    {
        private string FrameId = "Unity";
        private BioIK.BioIK hebi;
 
        private MessageTypes.Trajectory.JointTrajectory message;
 
        protected override void Start()
        {
            base.Start();
            hebi = this.gameObject.GetComponent<BioIK.BioIK>();
            InitializeMessage();
        }
 
        private void InitializeMessage()
        {
            message = new MessageTypes.Trajectory.JointTrajectory
            {
                header = new MessageTypes.Std.Header { frame_id = FrameId },
                points = new MessageTypes.Trajectory.JointTrajectoryPoint[1],
            };
            message.points[0] = new MessageTypes.Trajectory.JointTrajectoryPoint();
            message.points[0].positions = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].velocities = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].accelerations = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].effort = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].time_from_start = new MessageTypes.Std.Duration { nsecs = 20000000 }; // 20 ms per waypoint
        }
 
        private void FixedUpdate()
        {
            // Copy the joint values that Bio IK solved for into the
            // single-waypoint trajectory message, in segment order
            int i = 0;
            foreach (BioIK.BioSegment segment in hebi.Segments)
            {
                if (segment.Joint != null)
                {
                    // Each rotational joint has exactly one enabled motion axis
                    double angle = 0.0;
                    if (segment.Joint.X.IsEnabled())
                    {
                        angle = segment.Joint.X.GetCurrentValue();
                    }
                    else if (segment.Joint.Y.IsEnabled())
                    {
                        angle = segment.Joint.Y.GetCurrentValue();
                    }
                    else if (segment.Joint.Z.IsEnabled())
                    {
                        angle = segment.Joint.Z.GetCurrentValue();
                    }
                    // Bio IK reports degrees; the HEBI arm node expects radians
                    message.points[0].positions[i] = angle * Mathf.PI / 180;
                    i++;
                }
            }
            Publish(message);
        }
    }
}
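
A note on the timing: time_from_start is set to 20 ms, which matches Unity's default FixedUpdate interval of 0.02 s, so each waypoint is expected to be reached in roughly one publishing period. If you change the Fixed Timestep in Unity's Time settings, presumably this duration should be adjusted to match.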

6. Add the JointTrajectoryPublisher component to the HEBI arm model that contains the Bio IK component.

7. Test the system while running only the file_server on the Ubuntu machine that controls the HEBI arm:

$ rostopic echo /joint_waypoints

The ROS message should be continuously updating in the terminal

RealSense Camera Feed

1. Install the ROS wrapper for the Intel RealSense SDK following the instructions

2. Start camera node in ROS

 $ roslaunch realsense2_camera rs_camera.launch
 

3. Check that /camera messages are being published successfully

4. Add an MRTK Slate prefab to the project

  • Tutorials on the Slate prefab can be found here

5. Add an ImageSubscriber component alongside the RosConnector

  • In the Topic field, enter the ROS topic /camera/color/image_raw/compressed
  • Drag the ContentQuad sub-object of the Slate prefab into the Mesh Renderer field

6. The camera feed from the RealSense camera can now be seen on the Slate prefab in the VR scene

Gripper Control

1. Follow the last section of the HEBI arm tutorial to set up gripper control

  • Note that the gripper node must be running in conjunction with the arm node

2. Test gripper node

 $ rostopic pub /gripper_strength std_msgs/Float64 "data: 0.5"

3. Create a C# script in Unity to send the proper ROS message

| GripperStrengthPublisher.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit;
 
namespace RosSharp.RosBridgeClient
{
    public class GripperStrengthPublisher : UnityPublisher<MessageTypes.Std.Float64>
    {
        private MessageTypes.Std.Float64 message;
 
        protected override void Start()
        {
            base.Start();
            InitializeMessage();
        }
 
        private void InitializeMessage()
        {
            message = new MessageTypes.Std.Float64
            {
                data = 0.0f
            };
        }
 
        private void FixedUpdate()
        {
            // Listening to all controller inputs detected by MRTK
            foreach (var controller in CoreServices.InputSystem.DetectedControllers)
            {
                // Interactions for a controller is the list of inputs that this controller exposes
                foreach (MixedRealityInteractionMapping inputMapping in controller.Interactions)
                {
                    // Isolate Thumbstick signal
                    if (inputMapping.Description == "Trackpad-Thumbstick Position")
                    {
                        // Small deadzone
                        if (inputMapping.Vector2Data.y > 0.05f)
                        {
                            // Convert input signal to gripper control signal
                            message.data = inputMapping.Vector2Data.y * 0.5;
                            Publish(message);
                            break;
                        }
                    }
 
                }
            }
        }
    }
}
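
Note that this script publishes only while the thumbstick is pushed past the small deadzone, so releasing the stick leaves the last commanded gripper strength in place. If you want the gripper to relax on release, you could additionally publish a zero message when the thumbstick returns to center.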

4. Add GripperStrengthPublisher.cs as a component

  • Set /gripper_strength as the Topic

5. While the gripper node is running, test the code by moving the thumbstick on the Index controller up.

Demonstration

In this demonstration, I teleoperated the HEBI arm using HTC Vive to grab a 3D printed component.

Final Words

For questions, clarifications, etc., email yuhangheedu@gmail.com
