Lab 13: Path Planning and Execution

Outro:

First of all, this is the last lab of the semester, and you've made it through. So to future Chidera: if you're reading this, you've successfully completed your Robotics Minor at Cornell and shown that you're equipped with the necessary skills to navigate the robotics space in the future. So let this website be a reminder that you're capable of working through some of the hardest problems you've ever faced, and will face, in the future. Anyways, back to regularly scheduled programming.

Materials

Objectives:

The objective of this lab was to have the robot navigate through a set of waypoints in the lab environment as quickly and accurately as possible. Below is the set of points (in the correct order) to hit in order to complete this lab.

  1. (-4, -3)
  2. (-2, -1)
  3. (1, -1)
  4. (2, -3)
  5. (5, -3)
  6. (5, -2)
  7. (5, 3)
  8. (0, 3)
  9. (0, 0)

Lab Tasks

High Level Program Flow

Next State

To make the given waypoint data compatible with our preexisting localization system, we had to convert the waypoints from feet to meters, since our localization belief is computed in meters, not feet, and we wanted to make sure there were no discrepancies in the computations. With that change, our waypoints in meters are:

  1. (-1.2192, -0.9144)
  2. (-0.6096, -0.3048)
  3. (0.3048, -0.3048)
  4. (0.6096, -0.9144)
  5. (1.524, -0.9144)
  6. (1.524, -0.6096)
  7. (1.524, 0.9144)
  8. (0, 0.9144)
  9. (0, 0)

The execution of this navigation relied heavily on Arduino and Python working in tandem, communicating about what was going on in the map. To make the program flow as efficient as possible, we decided to offload the heavy computation to the Python side on the computer to make the robot as fast as possible (after all, this class is called FAST ROBOTS), and let the Arduino worry only about the physical actuation of the motors based on the feedback from the Python end over Bluetooth. More concretely, we did the following:

Python -> Arduino: Send a Start Run command

Arduino -> Python: The robot spins 360 degrees and sends back the retrieved sensor values

Python -> Arduino: Find the belief, calculate, & send control inputs to the robot

Arduino -> Python: Based on the received control inputs, actuate as necessary and send back the retrieved sensor values

Python -> Arduino: Find the belief, calculate, & send control inputs to the robot

Repeat until the robot reaches its destination
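The "calculate" step in this loop boils down to a turn-then-drive controller: from the believed pose and the next waypoint, compute how much to rotate and how far to drive. A hedged sketch of that computation (the function names, the tolerance value, and the exact command format are illustrative, not the ones from our actual lab code):

```python
import math

# Assumed "close enough" radius around a waypoint, in meters.
GOAL_TOLERANCE_M = 0.15

def control_inputs(belief, goal):
    """Given a belief (x, y, theta in degrees) and a goal (x, y), return
    the turn angle (degrees, wrapped to [-180, 180)) and drive distance."""
    bx, by, btheta = belief
    gx, gy = goal
    heading = math.degrees(math.atan2(gy - by, gx - bx))
    turn = (heading - btheta + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    dist = math.hypot(gx - bx, gy - by)
    return turn, dist

def reached(belief, goal):
    """True when the believed position is within tolerance of the goal."""
    return math.hypot(goal[0] - belief[0], goal[1] - belief[1]) < GOAL_TOLERANCE_M
```

On each iteration, the Python side would compute these inputs from the latest belief and send the turn/distance pair over Bluetooth; the Arduino actuates and replies with fresh sensor data, closing the loop.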

Below is the Python script that takes care of the off-system computation, generating the beliefs and the control inputs for the robot.
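For context, the "find belief" step follows the grid Bayes filter from the earlier localization labs: multiply the prior over grid cells by the likelihood of the 360-degree ToF scan at each cell, then renormalize. A minimal pure-Python sketch of just that update (a flat list stands in for the real (x, y, theta) grid, and the values are illustrative):

```python
def update_belief(prior, likelihood):
    """One Bayes-filter observation update over a discretized pose grid."""
    posterior = [p * l for p, l in zip(prior, likelihood)]
    total = sum(posterior)
    # Guard against an all-zero posterior (no cell explains the scan).
    return [p / total for p in posterior] if total > 0 else prior

# Toy example: uniform prior over 5 cells, scan strongly favors cell 2.
prior = [0.2] * 5
likelihood = [1.0, 1.0, 8.0, 1.0, 1.0]
belief = update_belief(prior, likelihood)
best_cell = max(range(len(belief)), key=belief.__getitem__)
```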

Below is the Arduino sketch that tells the robot how to perform the localization spins, sensor data retrieval, and transmission. Simply put, this sketch takes care of localizing and sending ToF data over Bluetooth.

Upon integrating both programs, we were able to produce some good attempts at the path navigation. Displayed below are two videos: Attempt 1 and Attempt 2. Both successfully complete the task, but in Attempt 2 the robot gets stuck against the wall and needs a little nudge to get unstuck. Yet, despite an external input into the system, it is still able to handle the task and completes the navigation with flying colors.

Attempt 1:

Attempt 2:

What Sets Our Robot Apart

One of the strongest assets of our system and the algorithm we deploy on the robot is its robustness. On several occasions, as we were attempting the path, the robot exhibited great recovery even when it drastically overshot its goal or ended up in a corner so far from the target that we thought it would be almost impossible for the robot to recover from such a mistake. To demonstrate the robustness and reliability of our algorithm, we reversed the order of the waypoints and tasked the robot with essentially completing the designated task backwards. This shows that our algorithm is applicable to any set of waypoints, not just the ones we were given in the order they were given. See the reversed attempt video below.

Reversed Attempt:

Acknowledgements

To Ben and Anya:

Remember those moments when our robot would overshoot so many times that we thought it had quit on us, but then it figured out how to get back to its goal? I think those moments should be remembered, because they truly serve as a testament to the level of effort, thought, and planning we put into this lab that allowed the robot to do the "impossible." Those moments when we thought the robot was going to fail and it didn't highlight the fact that we shouldn't doubt the skills we've picked up over the course of this semester, because those same abilities are what we can use to create the next "impossible" in the years to come. So I just want to give a shoutout to my teammates Ben and Anya and let them know to always believe in themselves and trust that they are well equipped with the necessary skills to create a brand new tomorrow.

A big thanks to Kirsten and the teaching staff for supporting and encouraging us throughout the semester even when things got tough. I can't wait to see what the next gen of Fast Robot students produce under your guidance!

A big shoutout to my fellow students in this class for riding the wave; through the ups and downs, we made it through.

Speaking of Waves: Enjoy this Video