Autonomous Warehouse Robotics!

In October of 2023 I had the opportunity to start a very interesting project: creating the software architecture for an autonomous, lights-out warehouse. One of the first topics to tackle was the development of an autonomous navigation software stack for a differential drive robot.

A freedom given to me was the ability to develop this architecture from scratch. The timelines and software tools available limited my ability to use the ROS 2 Nav2 package (in case you're curious, check it out! https://navigation.ros.org/), so I opted to implement this work using MATLAB and Simulink. As an autonomous systems engineer I like being able to compare and contrast all the tools available to me, which is why ROS 2 was considered. Instead I was given the unique opportunity to develop the entire localization and navigation stack on my own. Given this challenge, I began an interesting journey: how do I develop an autonomous warehouse robot?

Comparing Different Localization Architectures…

Based on my endless list of tasks, I began looking at different localization techniques, either seen during my graduate studies or from other academic sources. One technique, which a colleague of mine and I had worked on with a university collaborator, allowed us to use AprilTags (find out more info here: https://april.eecs.umich.edu/software/apriltag ) as reference landmarks in our surroundings. Initially this system proved to be extremely effective! If you're curious, this technique is called visual inertial navigation, and it uses camera and IMU data to estimate the pose (position and orientation) of a rigid body in space. The main obstacle I ran into was the transitions between zones with no AprilTags and zones with visible AprilTags. A Kalman estimator fused the kinematic model of a differential drive robot with the position information from the AprilTags; however, when large discontinuities were present I ran into pose estimation issues. My mobile robot would do a little jiggle in space and become lost in the world… not fun! This did not mean all hope was lost when it came to the AprilTags; they became really important later on in the system!
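To give a feel for the fusion step described above, here's a deliberately minimal one-dimensional Kalman filter sketch in Python (the actual project was built in MATLAB/Simulink, and the function names, noise values, and 1D simplification here are all my illustrative assumptions, not the real implementation). The predict step propagates position using the kinematic model's velocity; the update step blends in a landmark (AprilTag) position fix. You can also see why a large discontinuity in the measurement causes trouble: the correction jumps the estimate toward the new fix.

```python
import numpy as np

def kf_predict(x, P, v, dt, q=0.05):
    """Predict position from the kinematic model (constant velocity)."""
    x = x + v * dt   # motion model: integrate commanded velocity
    P = P + q        # process noise grows the uncertainty
    return x, P

def kf_update(x, P, z, r=0.02):
    """Correct the prediction with a landmark (AprilTag) position fix."""
    K = P / (P + r)         # Kalman gain: trust measurement vs. prediction
    x = x + K * (z - x)     # blend prediction and measurement
    P = (1.0 - K) * P       # uncertainty shrinks after a fix
    return x, P

# one predict/update cycle: start uncertain, drive forward, see a tag
x, P = 0.0, 1.0
x, P = kf_predict(x, P, v=1.0, dt=1.0)   # dead-reckon to ~1.0
x, P = kf_update(x, P, z=1.2)            # tag says we're at 1.2
```

The estimate lands between the dead-reckoned prediction and the tag measurement, weighted by the gain; a big jump in `z` after a tag-free stretch produces exactly the kind of "jiggle" mentioned above.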

Another popular technique for localization is called laser scan matching. Think of it like taking two sets of points in space (the patterns need to have corresponding/similar geometric properties) and back-calculating the translational and rotational differences between them. This isn't an iterative closest point (ICP) problem, which is computationally intensive. Instead, this is a grid-search type of optimization technique which looks at the points, finds the translation and rotation with the highest probability of being correct, and tells you, "Hey, I think this is how much you've moved and rotated!" Our prior work had only used one LiDAR laser scan to localize the system; we had yet to use more than one, until now! Based on the advice of the same colleague with whom I had worked on AprilTag localization, I discretised my large work area into submaps. This approach reminded me of road trips with my parents (back when maps of cities were huge books), with my parents flipping from page to page trying to identify what street to take next to get to our destination. This approach proved to be very effective and consistent! I was able to consistently localize my mobile robot and transition between different local maps.
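A toy version of the grid-search idea can be sketched in a few lines of Python (again, a sketch under my own assumptions, not the project's MATLAB code: the search ranges, step counts, and nearest-neighbour scoring are all simplifications of a real correlative scan matcher). Every candidate (dx, dy, dθ) on the grid is scored by how well it lines the new scan up with the reference points, and the best-scoring candidate wins:

```python
import numpy as np
from itertools import product

def grid_search_match(ref, scan, trans_range=0.2, rot_range=0.2, steps=9):
    """Exhaustively score a grid of candidate (dx, dy, dtheta) offsets
    that map `scan` onto `ref`, and return the best-scoring one."""
    best, best_score = (0.0, 0.0, 0.0), np.inf
    shifts = np.linspace(-trans_range, trans_range, steps)
    angles = np.linspace(-rot_range, rot_range, steps)
    for dx, dy, th in product(shifts, shifts, angles):
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        moved = scan @ R.T + np.array([dx, dy])
        # score: sum of each moved point's distance to its nearest ref point
        d = np.linalg.norm(moved[:, None, :] - ref[None, :, :], axis=2)
        score = d.min(axis=1).sum()
        if score < best_score:
            best, best_score = (dx, dy, th), score
    return best
```

Real implementations score against an occupancy grid rather than raw points and use multi-resolution grids to stay fast, but the "try every offset, keep the most probable one" structure is the same.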

How to move from A to B…

Robotics is a cool field, especially for those who want to explore the different tools that enable robots to perform tasks. I'm lucky I have the opportunity to work not just on the navigation and localization side of the project… I'm also the architect of the robot's behaviour and how it decides to travel around the world. Differential drive robots offer cool motion sequences; the one I implemented is called a sequential drive system. The idea behind this system is for the robot to orient itself from its current position toward its target position and then move in a straight line, adjusting the heading along the way. This technique is effective because it also allows me to define the orientation of the robot in an intuitive way, without worrying about the parabolic arc that a differential drive robot can trace when both linear and angular velocities are commanded at the same time.
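The turn-then-drive logic can be captured in a small Python sketch (the gains, tolerance, and function name are illustrative assumptions on my part; the project's controller lives in MATLAB/Simulink). The controller rotates in place until the heading error is small, then commands forward speed while continuing to trim the heading:

```python
import math

def sequential_drive(x, y, theta, gx, gy,
                     k_lin=0.5, k_ang=1.5, align_tol=0.1):
    """Sequential drive: rotate in place toward the goal first, then
    drive straight while correcting heading along the way.
    Returns (linear_velocity, angular_velocity)."""
    heading_to_goal = math.atan2(gy - y, gx - x)
    # wrap the heading error into [-pi, pi]
    err = math.atan2(math.sin(heading_to_goal - theta),
                     math.cos(heading_to_goal - theta))
    dist = math.hypot(gx - x, gy - y)
    if abs(err) > align_tol:
        return 0.0, k_ang * err          # phase 1: align only
    return k_lin * dist, k_ang * err     # phase 2: drive, trim heading
```

Because the forward velocity is zero until the robot is aligned, the path is a pure rotation followed by a near-straight line, avoiding the arcing motion you get when both velocities are commanded from the start.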

Motion is not the only name of the game. I also had to implement some sort of behaviour for the robot. We as humans are really good at generating different subtasks for the goals we want to accomplish. For robots, these subtasks/routines have to be programmed in. These routines are known as states (or system states), which when connected together make a finite state machine (if you're curious: https://en.wikipedia.org/wiki/Finite-state_machine). There is a huge area of research related to system behaviour and different architectures; in video games, for example, behaviour trees are used instead. They allow a programmer to define more complex scheduling of the actions and responses a character can make. If you've ever played a game with bad AI, it's probably because the behaviour system still needs some polishing hahaha. Behaviour trees are also used in mobile robotics, and in fact ROS uses them throughout the Nav2 package to schedule tasks. If I get the opportunity to upgrade and improve the autonomy stack of the warehouse robot, I will be integrating behaviour trees as well! For now, the state machine design has proven to be a readable and maintainable approach; however, it comes with drawbacks. The more states and transitions that are added, the trickier the web of states and transitions becomes to manage. To fully control the behaviour of the robot I wrote out the state machine using MATLAB functions; note there are other tools out there that help achieve the same goal.
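To make the finite-state-machine idea concrete, here's a minimal table-driven sketch in Python (the state and event names are invented for illustration; the real machine was written as MATLAB functions and has different states). Each (state, event) pair maps to a next state, and anything not in the table leaves the state unchanged:

```python
# transitions as a lookup table: (current state, event) -> next state
TRANSITIONS = {
    ("IDLE", "goal_received"): "ALIGN",
    ("ALIGN", "aligned"): "DRIVE",
    ("DRIVE", "goal_reached"): "IDLE",
    ("DRIVE", "obstacle"): "STOPPED",
    ("STOPPED", "path_clear"): "DRIVE",
}

def step(state, event):
    """Advance the machine one event; unknown events are ignored."""
    return TRANSITIONS.get((state, event), state)
```

You can already see the drawback mentioned above: every new behaviour multiplies the (state, event) pairs you have to reason about, which is exactly the bookkeeping a behaviour tree restructures.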

Hello Network

Using my prior experience with network architectures and tools, I implemented a website endpoint which allowed a remote user to send high-level commands to the autonomous robot. This culminated in a cool video showing the different systems coming together and letting the mobile robot navigate from point A to point B.
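As a rough illustration of that kind of endpoint (the route, JSON schema, port, and handler names here are all my own assumptions; the post doesn't describe the actual interface), a remote user could POST a goal as JSON and the server would validate it before handing it to the robot's state machine:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_command(body):
    """Validate a JSON command like {"cmd": "goto", "x": 1.0, "y": 2.0}."""
    msg = json.loads(body)
    if msg.get("cmd") != "goto":
        raise ValueError("unsupported command")
    return float(msg["x"]), float(msg["y"])

class CommandHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        try:
            goal = parse_command(self.rfile.read(length))
            # here the goal would be queued for the robot's state machine
            self.send_response(200)
            self.end_headers()
            self.wfile.write(f"goal accepted: {goal}".encode())
        except (ValueError, KeyError):
            self.send_response(400)  # reject malformed commands
            self.end_headers()

# to run: HTTPServer(("0.0.0.0", 8080), CommandHandler).serve_forever()
```

Validating commands at the boundary keeps bad requests from ever reaching the behaviour layer, which matters when the thing on the other end physically moves.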

This system could still use more updates, but this is what one of the trial runs looked like!
