Preparing Ubuntu 18.04 for Real-time and ROS 2

I’ve been setting up ROS 2 for one of my recent robotics projects, which I expect to have strict deterministic requirements. Surprisingly, although Dashing Diademata is the first LTS release with Ubuntu 18.04 as a Tier 1 platform, there isn’t much documentation on getting your operating system ready for real-time work, so I thought I might as well document my process. As of writing this post, I am working with Ubuntu 18.04.2, which ships with kernel version 4.18.0-15-generic.

Download Compatible Kernel and Patch

The idea is to replace your Linux kernel with a version for which PREEMPT_RT is supported. According to the Linux Foundation page, the stable version as of this post is 4.19-rt. So, we want to obtain this stable patch, as well as the corresponding Linux kernel.

To keep things a bit more manageable, let’s first create a kernel folder in your home folder and download, extract, and patch everything in there. First, let’s download the latest patch.
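The commands I used were along these lines (the exact mirror path and file name are assumptions; check the kernel.org rt project directory for the file that is current when you read this):

```shell
# Create a working directory for the kernel sources and patches
mkdir -p ~/kernel
cd ~/kernel

# Download the rt patch that was latest at the time of writing
wget https://mirrors.edge.kernel.org/pub/linux/kernel/projects/rt/4.19/patch-4.19.50-rt22.patch.xz
```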

Edit: As of August 4, 2019, the above link no longer works as a new version has come out. To exactly replicate this post, run instead the following:
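Superseded patches are typically moved into the `older/` subdirectory of the rt project page, so the replicating command should look roughly like this (path is an assumption):

```shell
cd ~/kernel
# Older rt patches move from the top-level 4.19 directory into older/
wget https://mirrors.edge.kernel.org/pub/linux/kernel/projects/rt/4.19/older/patch-4.19.50-rt22.patch.xz
```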

Then, from the Linux kernel page, also download the corresponding kernel version.
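For the 4.19.50-rt22 patch, the matching kernel is 4.19.50; the download is along these lines:

```shell
cd ~/kernel
# Kernel version must match the version targeted by the rt patch
wget https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/linux-4.19.50.tar.xz
```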

Because recent Debian versions surprisingly do not include bison or flex in build-essential, you need to install those two to copy your old config file over.
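A sketch of the package installation (libssl-dev, libelf-dev, and libncurses-dev are my additions; they are commonly needed for kernel builds and for menuconfig on 18.04):

```shell
sudo apt-get update
sudo apt-get install build-essential bison flex libssl-dev libelf-dev libncurses-dev
```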

Once all the files have been downloaded and packages have been installed, let’s unpack the kernel and rename it so that it matches the patch name.
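Something like the following:

```shell
cd ~/kernel
tar xf linux-4.19.50.tar.xz
# Rename the source tree so its name matches the patched version
mv linux-4.19.50 linux-4.19.50-rt22
```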

Patch Kernel

Now, inside the kernel folder linux-4.19.50-rt22, patch the kernel.
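The standard way to apply an xz-compressed kernel patch is:

```shell
cd ~/kernel/linux-4.19.50-rt22
# -p1 strips the leading path component from the patch file paths
xzcat ../patch-4.19.50-rt22.patch.xz | patch -p1
```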

Next, we copy over the current OS configuration file, install some tools, and proceed to enable the preemptible kernel.
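A sketch of this step:

```shell
cd ~/kernel/linux-4.19.50-rt22
# Start from the configuration of the currently running kernel
cp /boot/config-$(uname -r) .config
# menuconfig needs ncurses
sudo apt-get install libncurses-dev
make menuconfig
```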

Now comes the step that has changed slightly from previous kernel versions, because the option to select the Fully Preemptible Kernel (RT) has been relocated. Under General setup there is a Preemption Model entry. Select Fully Preemptible Kernel (RT). (Note that this is NOT under Processor type and features, where older guides place it.)

Make and Install RT Kernel

Exit out of menuconfig and let’s make and install the kernel. Note that this process can take a while depending on the computer you are working on.
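The build-and-install sequence was along these lines:

```shell
# Build the kernel and modules (this can take a while)
make -j$(nproc)
# Install the modules, then the kernel itself
sudo make modules_install
sudo make install
```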

Once everything is done, reboot your computer. Your new kernel should show as 4.19.50-rt22.
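After rebooting, you can confirm the running kernel version:

```shell
uname -r
# should print 4.19.50-rt22
```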

How to install python-pcl pcl_visualization on Ubuntu

pcl_visualization is a handy visualization module if you mostly work with PCL’s PointT format. Unfortunately, for some reason, the visualizer cannot be built with the provided setup.py file as-is. Hence, you need to make some modifications to it so that it works with certain versions of VTK if you plan on using the visualizer on Ubuntu.

My setup is Ubuntu 16.04. Note that you may have to make some adjustments to the command lines if you are running a different version.

The idea is to uncomment certain parts of the “setup.py” file and modify a line so that pcl_visualization builds against the VTK version installed on your machine.

In my version of python-pcl, I uncommented line 561:

Included above:

Note that I used “vtk-6.2” instead of “vtk-5.8” because that was the VTK available when I observed the directory contents in /usr/include/.
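To check which VTK version is present on your machine (the directory name is what goes into setup.py):

```shell
# Lists the installed VTK include directory, e.g. /usr/include/vtk-6.2
ls -d /usr/include/vtk-*
```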

Lastly, I uncommented line 625:

For reference purposes, I did not install PCL explicitly, but used the one that came default with ROS Kinetic.

Intel Realsense D435 Setup on Ubuntu 16.04

I’ve been writing an Intel Realsense D435 vs ZED mini comparison post, but in a recent project I needed to set up the D435 on a fresh copy of Ubuntu 16.04 and thought that it might be useful to document the process here.

System Preparation

1. Before installing the latest SDK, we need to modify the kernel. Kernel versions 4.4.0-x, 4.8.0-x, 4.10.0-x, 4.13.0-x, or 4.15.0-x are supported as of this post. Check your Ubuntu installation kernel version by running the following command in your terminal:
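To check the kernel version:

```shell
uname -r
# e.g. 4.15.0-42-generic -- the 4.15.0-x prefix is what matters here
```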

2. Install some core packages that allow you to build librealsense.
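Based on the librealsense build instructions, this is roughly:

```shell
sudo apt-get update && sudo apt-get upgrade
# Core build dependencies for librealsense
sudo apt-get install git libssl-dev libusb-1.0-0-dev pkg-config
```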

3. I am on Ubuntu 16.04, so you also need to install:
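On Ubuntu 16.04 the librealsense instructions additionally call for GLFW:

```shell
sudo apt-get install libglfw3-dev
```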

4. Apply the Realsense permissions scripts located in the librealsense source directory. Again, note that this command needs to be run inside your librealsense folder!
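The udev rules script ships with the librealsense repository:

```shell
cd librealsense
# Copies the udev rules so the cameras are accessible without root
./scripts/setup_udev_rules.sh
```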

5. Finally, let’s download, patch, and build your custom kernel.
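For the Ubuntu LTS kernels listed above, librealsense provides a patching script (run from inside the librealsense folder):

```shell
# Downloads, patches, and rebuilds the relevant kernel modules
./scripts/patch-realsense-ubuntu-lts.sh
```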

If the above succeeds, you will see output showing that your videodev, uvcvideo, hid_sensor_accel_3d, and hid_sensor_gyro_3d modules have been replaced, confirming that the procedure worked.

Building the SDK

1. I’m going to assume you have CMake installed and your build toolchain is gcc-5. If you are not sure, run:
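To check both:

```shell
cmake --version
gcc --version
```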

2. Now, let’s run CMake! I am including the ‘Examples’ option and the ‘Python Bindings’ option since I’d assume you would want to see an example and many people use Python. Feel free to turn them off by removing them.
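A sketch of the configure step, run from the librealsense folder:

```shell
mkdir build && cd build
# Drop either -D flag if you don't want examples or Python bindings
cmake .. -DBUILD_EXAMPLES=true -DBUILD_PYTHON_BINDINGS=true
```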

3. Once CMake has generated the necessary files, let’s install it. In case you have already run ‘make’ in this folder previously, run the following for convenience:
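The clean-rebuild sequence from the librealsense instructions:

```shell
# Removes any previously installed copy before rebuilding
sudo make uninstall
make clean
make
sudo make install
```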

3.a Obviously, feel free to do a parallel compilation if your system allows it by replacing make with make -jX, where X is the number of parallel jobs.

4. Once the SDK is installed, run one of the examples to make sure that everything is installed well!
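For instance, the SDK’s graphical viewer (built as part of the tools) is a quick sanity check:

```shell
realsense-viewer
```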

 

Passing arguments to Python callback in rospy

Passing extra arguments to your callback function in rospy is actually super simple: rospy.Subscriber accepts a callback_args parameter, which is handed to the callback as its second argument.
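A minimal sketch (topic name and payload are made up for illustration):

```python
import rospy
from std_msgs.msg import String

def callback(msg, args):
    # args is whatever was passed as callback_args -- here, a tuple
    prefix, count = args
    rospy.loginfo("%s heard: %s (%d)", prefix, msg.data, count)

rospy.init_node("listener")
# The tuple passed as callback_args arrives as the callback's second argument
rospy.Subscriber("chatter", String, callback, callback_args=("listener", 1))
rospy.spin()
```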

Of course, rather than a tuple, a single object such as a string could also be passed directly.

Shell Ocean Discovery XPRIZE

I have always been fond of underwater robotics and think it might be the hardest environment for robots to perform in. We need to tackle a multitude of challenges, ranging from waterproofing hardware to communicating through a new medium. It obviously will not be done in a short span of time, and with drones dominating the headlines and locomotion on rough terrain still quite a distance away, I have really wondered when we would start seriously considering underwater as another environment.

In the wake of SpaceX successfully landing its Falcon 9, I actually found something just as exciting that someone like me really should have found out about much earlier.

So there is this cool new competition from XPRIZE called the Shell Ocean Discovery XPRIZE. The goal of this competition is to launch entries from shore or air into the competition area to produce:

  • a high-resolution bathymetric map
  • images of a specified object
  • identification of archaeological, biological, or geological features
  • tracking of a chemical or biological signal to its source (bonus)

Looking at these four tasks, the first thing that comes into my mind is, “Wow! DARPA-hard!”

This is going to be very interesting to continue to follow. Team registrations are due in June 2016, and regardless of the outcome of this competition come December 2018, the advances in underwater robotics will be massive.

The massive downside is:

Operating Costs:

Teams will be responsible for funding their own technology development costs.

Anyone care to find some funding so we can have a go at this? 😉

Gyro Rigidity and Drift

I am working with the Microstrain 3DM-GX3-25, and to better understand how to use it, I have been studying exactly how all of its components work together. A part I found difficult as a novice engineer was how the gyro and the accelerometer work together. A key to understanding their collaboration is the drift that the gyro experiences. Here is a super short YouTube clip I found that sums up why the gyro is only good in the short run.
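As an aside, the usual way to exploit this collaboration is to fuse the two sensors, for example with a complementary filter: the gyro is trusted over short timescales and the accelerometer corrects long-term drift. A minimal sketch (all numbers are made up for illustration):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (accurate short-term, but drifts)
    with the accelerometer-derived angle (noisy, but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: gyro claims 1 deg/s rotation while the accelerometer keeps
# reading 0 deg -- the accelerometer term bounds the accumulated drift
angle = 0.0
for _ in range(100):  # one second of updates at 100 Hz
    angle = complementary_filter(angle, gyro_rate=1.0, accel_angle=0.0, dt=0.01)
```

With pure gyro integration the estimate would grow without bound; here the accelerometer term caps it near alpha * gyro_rate * dt / (1 - alpha).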