UBR-1 on MoveIt2

Yet another post on ROS2 and the UBR-1. Here are quick links to the previous posts:

Installation

MoveIt2 is still evolving, and the Debian builds seem to be quite out of date most of the time, so I would suggest doing a source install. I’ve had good results lately with the main branch building on both Iron and Jazzy. In addition to the moveit2 repository, I’m also building the ros2 branch of moveit_msgs and moveit_task_constructor from source.
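For reference, a source build looks roughly like this (just a sketch: the workspace path is arbitrary, and I'm assuming ROS 2 is already sourced and rosdep is initialized):

# Clone MoveIt2 and friends into a colcon workspace (paths are only an example)
mkdir -p ~/moveit_ws/src && cd ~/moveit_ws/src
git clone https://github.com/moveit/moveit2.git
git clone -b ros2 https://github.com/moveit/moveit_msgs.git
git clone -b ros2 https://github.com/moveit/moveit_task_constructor.git
cd ~/moveit_ws
rosdep install --from-paths src --ignore-src -y
colcon build --symlink-install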

MoveIt2 Setup Assistant

Everything in MoveIt really starts with the MoveIt Setup Assistant (MSA). Unfortunately, the MSA was one of the last components migrated to ROS 2 (at least partially because launch files are massively different from ROS 1). While the MSA is still a bit brittle, it is pretty well documented with a tutorial which I followed quite closely.
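If you're following along with the tutorial, the assistant itself is started from a launch file in the moveit_setup_assistant package (assuming it is part of your source build or installed some other way):

ros2 launch moveit_setup_assistant setup_assistant.launch.py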

The MSA basically exports a new ROS 2 package, typically called [robot_name]_moveit. You can find my ubr1_moveit export here.

This package includes a custom set of configuration and launch files for using MoveIt2 with the robot. Compared to the files generated in ROS 1, these end up being a bit harder to modify, since most of the actual "launch" logic lives in the moveit_configs_utils package.

For the UBR-1, I defined three planning groups:

After first exporting the MSA config, I had to make a few minor tweaks. The most notable is that I had to add acceleration limits to the joints, since the Time Optimal Parameterization now requires them. These values were added to the config/joint_limits.yaml file.
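For illustration, the entries in config/joint_limits.yaml look something like the following (the joint name and numbers here are placeholders, not the real UBR-1 limits):

joint_limits:
  arm_shoulder_pan_joint:
    has_velocity_limits: true
    max_velocity: 1.25
    has_acceleration_limits: true
    max_acceleration: 1.5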

I also had to manually update the collision matrix in the SRDF file to allow collision between the bellows and the torso/base. I’m not sure why the collision checker in the MSA didn’t pick up that these are always in collision, but I couldn’t plan without these collisions being allowed.
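Allowing those collisions comes down to a couple of disable_collisions entries in the SRDF, along these lines (the link names are illustrative, check your own URDF):

<disable_collisions link1="bellows_link" link2="torso_lift_link" reason="Always"/>
<disable_collisions link1="bellows_link" link2="base_link" reason="Always"/>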

Definitely commit any changes to your configuration before re-exporting with the MSA: in many cases it will overwrite or lose your changes, and git diff will help you track what was lost.

Once we have an export, we can test things using the demo launch file:

ros2 launch ubr1_moveit demo.launch.py

Planning worked well, but I was unable to execute any motion plans in the demo environment. It turned out that, because I had two groups using the arm, I needed to switch controllers using the rqt_controller_manager tool. This isn't an issue on the real robot, where I use robot_controllers instead of ros2_control, which automatically switches controllers when they receive an action goal.
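If you hit the same thing, that tool can be started with (assuming the rqt_controller_manager package from ros2_control is installed):

ros2 run rqt_controller_manager rqt_controller_manager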

One thing to keep in mind when loading the RViz2 plugin for MoveIt2: always use the RViz launch file generated by the MSA. Since there is no shared parameter server, this launch file properly populates the kinematics parameters for RViz2. Without those parameters, you won't be able to plan (and the interactive markers will be missing).
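In other words, start RViz with something like the following (moveit_rviz.launch.py being the file the MSA generates):

ros2 launch ubr1_moveit moveit_rviz.launch.py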

Along the way, I had to fix a few bugs:

I also found that you might have to set the DISPLAY variable in your environment if running headless.
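For instance, when connected over SSH, something like the following may be needed (the display number depends on your setup):

export DISPLAY=:0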

MoveIt2 Without ros2_control

Much of MoveIt2 assumes you are using ros2_control, but I'm not: I continue to use my alternative framework, robot_controllers. This is still supported in MoveIt through the simple controller manager, which does no controller switching but interfaces with the standard control_msgs actions expected for arms and grippers.
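The mapping from planning groups to those action servers lives in the controllers YAML that the MSA generates (moveit_controllers.yaml). A rough sketch, with placeholder controller and joint names rather than the actual UBR-1 ones:

moveit_controller_manager: moveit_simple_controller_manager/MoveItSimpleControllerManager

moveit_simple_controller_manager:
  controller_names:
    - arm_controller
    - gripper_controller

  arm_controller:
    type: FollowJointTrajectory
    action_ns: follow_joint_trajectory
    default: true
    joints:
      - arm_shoulder_pan_joint
      - arm_elbow_flex_joint
      - arm_wrist_roll_joint
      # ... remaining arm joints ...

  gripper_controller:
    type: GripperCommand
    action_ns: gripper_action
    default: true
    joints:
      - left_gripper_joint
      - right_gripper_joint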

Some Interesting Debugging

Once I had MoveIt2 set up to interact with the robot, I could plan motions and move the arm without issue, but I was unable to actually get MoveIt2 to control the gripper. It turned out that the action server implementation in my gripper driver wasn't entirely correct:

In theory, this would have been an easy issue to debug, but two things threw me off:

I ported this driver to ROS 2 quite some time ago; today, if you only need one action goal active at a time, the easy approach would be to use the simple_actions package. I did just that later on when calling my grasping perception for the pick and place demo.

Simple Grasping

Perception is really the weak part of the ROS ecosystem. There are numerous packages available for controls and planning, but not much for perception. Years ago I created a package, simple_grasping, which segments a point cloud into objects and support surfaces that can then be used for pick and place. Actually porting the code to ROS 2 was fairly straightforward.

One of the issues I'm seeing in Jazzy that I have not yet resolved is that, for large messages (such as images or point clouds), a reliable publisher connecting to a best-effort subscriber seems not to work: no QoS incompatibility is reported, but no messages get through. My workaround has been to add QoS overrides to a number of nodes. For instance, in simple_grasping, I have the following code:

// Would prefer to subscribe to head camera cloud as best effort
rclcpp::QoS points_qos(10);
points_qos.best_effort();

// But due to issues in Jazzy, allow overriding QoS at runtime
rclcpp::SubscriptionOptions sub_opts;
sub_opts.qos_overriding_options = rclcpp::QosOverridingOptions::with_default_policies();
cloud_sub_ = this->create_subscription<sensor_msgs::msg::PointCloud2>(
  "/head_camera/depth_registered/points",
  points_qos,
  std::bind(&BasicGraspingPerception::cloud_callback, this, _1),
  sub_opts);

(You can see the full set of changes here)

And then in my launch file, I can override the reliability of that subscription:

from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            name='basic_grasping_perception_node',
            package='simple_grasping',
            executable='basic_grasping_perception_node',
            parameters=[{
                'qos_overrides': {
                    '/head_camera/depth_registered/points': {
                        'subscription': {
                            'reliability': 'reliable',
                        }
                    }
                },
            }],
        ),
    ])

Pick and Place Demo

Not every feature from MoveIt in ROS 1 made it into MoveIt2. One of the big features that was dropped was the pick and place part of move_group. Users are instructed to use the MoveIt Task Constructor (MTC).

It seems to be a common theme in ROS 2 that things are designed to be much more capable, but then are much harder to use and also poorly documented. This theme holds up for pick and place. I ended up spending more time setting up an MTC pick and place node than all of the other steps above combined.

There is a single tutorial that shows how to do pick and place with a Franka arm. The tutorial uses several simple components inside MTC to do the grasp generation and place generation.

My implementation in the ubr1_demo package is based off the tutorial. The files can be roughly described as:

MTC has a nice visualization interface in rviz2 where you can inspect each step in the task. One note that caused me some difficulty: you need to keep the MTC task in scope or you lose the visualization in rviz2.
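Concretely, that means keeping the task as a member of the class that plans it, rather than a local variable, roughly like this sketch (createTask() here is a stand-in for whatever builds your stages):

#include <moveit/task_constructor/task.h>

namespace mtc = moveit::task_constructor;

class PickPlaceDemo
{
public:
  bool planAndPublish()
  {
    task_ = createTask();  // hypothetical helper that assembles the MTC stages
    if (!task_.plan(5 /* max solutions */))
      return false;
    // Publish the best solution so the rviz2 MTC panel can display it
    task_.introspection().publishSolution(*task_.solutions().front());
    return true;
  }

private:
  mtc::Task createTask();  // builds the stages; omitted here
  // Keeping the task as a member (not a local) preserves the rviz2 visualization
  mtc::Task task_;
};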

To actually execute the MTC plans, you need to also be running move_group with a modified launch file:

  from moveit_configs_utils import MoveItConfigsBuilder
  from moveit_configs_utils.launches import generate_move_group_launch

  def generate_launch_description():
      moveit_config = MoveItConfigsBuilder("ubr1", package_name="ubr1_moveit").to_moveit_configs()
      # Add the following line to load the MTC capability
      moveit_config.move_group_capabilities["capabilities"] = ["move_group/ExecuteTaskSolutionCapability"]
      return generate_move_group_launch(moveit_config)

The only part of MTC that I had some issues with was the Connect stage between the ready pose and the pre-grasp pose; some minor noise in the gripper position was causing issues. I found a closed issue where it was recommended to add the following to the rosconsole config file to get more debug output from those stages:

log4j.logger.ros.moveit_task_constructor.core.Connect=DEBUG
log4j.logger.ros.moveit_task_constructor.core.Connecting=DEBUG

Next Steps

Now that I have all the building blocks working for the UBR-1 on ROS 2, I want to create a more compelling demo. Stay tuned.