For the past few months, I have been developing an all-new autonomous steering system for the self-driving golf cart. I not only developed a new deep learning model but also completely modified the hardware design. If you have been following my blog, you might have seen posts related to these topics.
In this two-part series, I will show you the testing process for this new system. Part one discusses all of the new features and improvements, as well as the indoor test. Part two will be about more rigorous outdoor tests.
- Linear actuator: an actuator that creates motion in a straight line, in contrast to the circular motion of a conventional electric motor.
- ROS (robot operating system): a collection of software frameworks for robot software development. It provides services designed for hardware abstraction, low-level device control, implementation of commonly used functionality, message-passing between processes, and package management.
- ROS Nodes: a process that performs computations. Nodes are combined together into a graph and communicate with one another using streaming topics, RPC services, and the Parameter Server.
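The publish/subscribe pattern behind ROS nodes can be illustrated with a toy message bus in plain Python. This is only a sketch of the pattern, not the actual rospy API; `ToyBus` and the topic name are made up for illustration:

```python
from collections import defaultdict

class ToyBus:
    """A minimal stand-in for the ROS topic mechanism."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A node registers interest in a topic.
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscriber's callback fires when a message arrives.
        for callback in self.subscribers[topic]:
            callback(message)

# Two "nodes" communicating over a topic, as ROS nodes would.
bus = ToyBus()
received = []
bus.subscribe("/steering_cmd", received.append)
bus.publish("/steering_cmd", 0.25)
```

In real ROS, the master brokers these connections between separate processes, but the decoupled publisher/subscriber relationship is the same idea.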
- Arduino: an open-source microcontroller platform widely used in robotics and much more.
- Deep learning: a family of machine learning methods that is not built from task-specific algorithms. It is loosely inspired by information processing and communication patterns in biological nervous systems, though it differs from biological brains in many ways.
The New Hardware Design
Before we get started, let’s take a look at the new steering hardware. In the previous blog posts, I discussed the new design with the linear actuator. Here is what it looks like from the bottom up.
Here are some advantages of the new system:
- It’s more powerful! The linear actuator can create more than 500N of force. (It can almost lift up a person!)
- The actuator is mounted securely to the vehicle, making the steering more precise.
- The new system is hidden underneath the golf cart, making it unobtrusive.
This flowchart illustrates the interactions between the different ROS nodes. The deep learning node subscribes to the camera input. The Arduino constantly listens to the deep learning outputs and then executes those commands.
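As a rough sketch of what executing a steering command might involve, here is a hypothetical mapping from a normalized steering prediction to a linear-actuator setpoint. The ranges are illustrative assumptions, not the cart's real calibration:

```python
def steering_to_actuator(angle, stroke_mm=100.0):
    """Map a normalized steering angle in [-1, 1] to an actuator
    extension in millimeters, clamping out-of-range predictions.

    The 100 mm stroke and the linear mapping are assumptions for
    illustration; the real calibration depends on the steering
    linkage geometry.
    """
    angle = max(-1.0, min(1.0, angle))      # clamp the prediction
    return (angle + 1.0) / 2.0 * stroke_mm  # -1 -> 0 mm, +1 -> full stroke
```

For example, a centered prediction of `0.0` maps to the actuator's midpoint, `50.0` mm. Clamping matters here: a single bad model output should never drive the actuator past its physical limits.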
Here is a short video demonstrating the functionality of the actuator. If you want to learn more about the mechanical engineering behind the scenes, check out this post.
The New Deep Learning System
You might be wondering: how does the golf cart generate all of its steering commands? There are multiple approaches to this very interesting problem. As a passionate student of machine learning, I developed a deep learning system that allows the computer to predict the steering angle from images captured by a front-facing camera. This method is also known as behavioral cloning. In a previous post, I discussed my new approaches and results for the deep learning system.
In mid-2017, Google DeepMind published the Kinetics dataset, a novel video dataset for action recognition, along with a new model known as the Two-Stream Inflated 3D ConvNet (I3D).
This architecture has proven effective for video analysis, which makes it a great candidate for end-to-end behavioral cloning. The paper proposed three networks: one takes multiple frames of RGB images as input, one takes multiple frames of optical-flow results, and the third combines both streams. In my training experiments, the first (RGB-only) architecture offered the best balance of speed and accuracy, making it a viable solution for real-world applications. Here is an overview of the model.
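Because I3D takes a stack of consecutive frames rather than a single image, the inference node needs a sliding window over the camera stream. Here is a minimal sketch of that buffering logic; the class name and the window length are assumptions for illustration:

```python
from collections import deque

class FrameWindow:
    """Keep the most recent `size` frames as input for a 3D ConvNet."""
    def __init__(self, size=16):
        self.size = size
        self.frames = deque(maxlen=size)  # old frames drop off automatically

    def push(self, frame):
        self.frames.append(frame)

    def ready(self):
        # Only run a prediction once a full temporal window is available.
        return len(self.frames) == self.size

    def clip(self):
        # The current window, oldest frame first.
        return list(self.frames)

# Simulate six incoming frames through a window of four.
window = FrameWindow(size=4)
for i in range(6):
    window.push(i)
```

Using `deque(maxlen=...)` means each new frame automatically evicts the oldest one, so the node always predicts on the latest clip.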
Here is a short video demonstrating the robustness of behavioral cloning.
Testing it out…
I combined the new hardware and the new software into an all-new deep learning powered steering system. I created a ROS testing node for all of this. Here are the steps of the testing program.
- The ROS camera node simulates camera input by publishing a two-minute pre-recorded video.
- The deep learning model subscribes to those frames and simultaneously publishes predictions.
- The Arduino subscribes to the steering commands and executes them with the new steering hardware.
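The three steps above can be sketched as a single offline loop, with stubs standing in for the recorded video, the model, and the actuator. All of the names here are hypothetical; the real system runs these as separate ROS nodes:

```python
def run_offline_test(frames, predict, actuate):
    """Replay recorded frames through the model and forward each
    prediction to the actuator, mirroring the ROS pipeline."""
    commands = []
    for frame in frames:
        angle = predict(frame)   # deep learning node's role
        actuate(angle)           # Arduino node's role
        commands.append(angle)
    return commands

# Stubs: a fake "recording", a placeholder model, and a logging actuator.
frames = list(range(5))
predict = lambda f: 0.0          # stand-in for the I3D prediction
log = []
commands = run_offline_test(frames, predict, log.append)
```

The benefit of structuring the test this way is that any stage can be swapped out independently, just as ROS nodes can, without touching the rest of the pipeline.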
That’s it! The steering system is up and running. However, it’s far from perfect. I still need to test the range of motion of the linear actuator. More rigorous outdoor tests will be covered in the second part of this two-part series. Please like this post if you enjoyed it. Also, you can contact me at email@example.com. Thanks for stopping by!
If you are interested in learning more about the self-driving golf cart project, you might enjoy the following posts.
- Deep learning steering prediction
- Semantic segmentation
- Robot Operating System