Vertue descending into a manhole, positioned close to one side so that it could pass an obstruction below.

While at Redzone Robotics, I was part of the team that designed Vertue, a robot that inspects manholes. It has four 13 MP cameras as well as lighting equivalent to 800 watts of incandescent bulbs. In most situations it ran fully automatically: an operator started an inspection, and Vertue would descend, stop at the bottom of the manhole, and return, stopping when it reached the top. Some situations required manual operation, for example to reposition around an obstruction or to stop early because water was pouring down onto Vertue. The robot was waterproof, but our data suffered when the lens domes were covered in water droplets.

For this project, I designed the low-level microcontroller code that ran on the part that descended into the manhole. It synchronized all of the cameras, drove the lighting based on the exposure time and the requested brightness, generated a timestamp used to maintain sync in post-production, and read all of the sensors, whose readings were passed on to the CPU.
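To make the lighting-versus-exposure coupling concrete, here is a minimal sketch of the kind of lamp-timing calculation involved. The actual firmware ran on a microcontroller (and its interfaces are not shown here); `lamp_pulse`, its signature, and the centering strategy are all illustrative assumptions, not the shipped design.

```python
def lamp_pulse(exposure_us: int, brightness: float) -> tuple[int, int]:
    """Compute a lamp on-window for one frame.

    The lamps only need to be lit while the shutter is open. Full
    brightness keeps them on for the whole exposure; lower values
    shorten the on-window, centered inside the shutter-open period
    so illumination stays even across the rolling readout.

    Returns (delay_us, on_us): microseconds to wait after the shutter
    opens, and microseconds to hold the lamps on.
    """
    brightness = max(0.0, min(1.0, brightness))  # clamp to [0, 1]
    on_us = int(round(exposure_us * brightness))
    delay_us = (exposure_us - on_us) // 2
    return delay_us, on_us
```

For example, a 10 ms exposure at half brightness yields a 5 ms lamp pulse starting 2.5 ms after the shutter opens.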

On the CPU, I wrote a Linux-based program that used GStreamer to record the four cameras and also stream the bottom camera up to the operator's tablet. The recorder also adjusted lighting and exposure based on information from a custom GStreamer element that I wrote. The element was originally intended to split the imagery into two videos, one for frames captured with the lasers on and the other for frames captured with the lights on, but that taxed the video encoder too much because of the context switching needed to encode eight video streams. Ultimately, the element did two things: it timestamped the video both with the Linux epoch time and with the timestamp received from the microcontroller, and it computed a histogram of the imagery, which the recorder used to adjust exposure.
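As a sketch of how a histogram can drive exposure, the snippet below turns a 256-bin luma histogram into a multiplicative exposure correction. The target level, function name, and bare proportional control are assumptions for illustration; a production loop would also clamp, smooth, and ignore saturated bins.

```python
import numpy as np

def exposure_scale(hist: np.ndarray, target: float = 118.0) -> float:
    """Given a 256-bin luma histogram, return a factor to multiply
    the exposure time by so the mean luma moves toward `target`
    (roughly mid-grey). Pure proportional control, for illustration.
    """
    levels = np.arange(256)
    mean = (hist * levels).sum() / hist.sum()  # mean pixel value
    return target / max(mean, 1.0)  # guard against an all-black frame
```

A frame averaging luma 59 would get its exposure doubled; a frame already at 118 would be left alone.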

While software was my primary duty on this project, I also spent a lot of time working with our mechanical engineers to ensure that we were designing a usable product that could be machined, assembled, and maintained easily. As this was built during the 2020 pandemic and I was the only one with facilities for software, electronics, and machining, as well as space for testing, I did most of the troubleshooting, assembly, and testing of our early prototypes myself.

The images on this page were taken from Redzone's website because I am unsure if I am allowed to share pictures that I personally took of Vertue.

One of the final outputs from Vertue was a photogrammetrically reconstructed model of the inside of the manhole.

Another output of the processing pipeline was a stitched panorama of the inside of the manhole. At the time of writing, this was done as a purely 2D process, so there were stitching artifacts when Vertue was close to the wall, caused by parallax from the large spacing between the cameras.
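The parallax problem is easy to quantify with the pinhole disparity formula, shift = baseline × focal length / distance. The numbers below (baseline, focal length in pixels, distances) are invented for illustration and are not Vertue's actual geometry:

```python
def parallax_px(baseline_m: float, focal_px: float, distance_m: float) -> float:
    """Apparent shift, in pixels, of a wall feature between two
    cameras spaced baseline_m apart, viewed at distance_m with a
    lens of focal_px pixels focal length (pinhole model: b * f / Z).
    """
    return baseline_m * focal_px / distance_m

# Assumed 10 cm camera spacing and a 1000 px focal length:
far = parallax_px(0.1, 1000.0, 2.0)   # wall 2 m away -> 50 px shift
near = parallax_px(0.1, 1000.0, 0.25)  # wall 25 cm away -> 400 px shift
```

A 2D stitcher can hide a 50-pixel misalignment with blending, but an eight-fold larger shift near the wall produces visible seams, which is why the panoramas degraded at close range.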
