Lab 13: Path Planning and Execution

Outro:

First of all, this is the last lab of the semester and you've made it through. So to future Chidera: if you're reading this, you've successfully completed your Robotics Minor at Cornell and you've shown that you're now equipped with the necessary skills to navigate the robotics space in the future. So let this website be a reminder to you that you're capable of working through some of the hardest problems you've ever faced and will face in the future. Anyways, back to regularly scheduled programming.


Objectives:

The objective of this lab was to have the robot navigate through a set of waypoints in the environment as quickly and accurately as possible. Below is the set of points (in feet, in the correct order) to hit in order to complete this lab.

  1. (-4, -3)
  2. (-2, -1)
  3. (1, -1)
  4. (2, -3)
  5. (5, -3)
  6. (5, -2)
  7. (5, 3)
  8. (0, 3)
  9. (0, 0)

Lab Tasks

High Level Program Flow


To make the given waypoint data compatible with our preexisting localization system, we had to transform the waypoints from feet to meters, since our localization belief is computed in meters, not feet, and we wanted to make sure there were no discrepancies in the computations (a minimal conversion sketch follows the list). With that change, our waypoints in meters are:

  1. (-1.2192, -0.9144)
  2. (-0.6096, -0.3048)
  3. (0.3048, -0.3048)
  4. (0.6096, -0.9144)
  5. (1.524, -0.9144)
  6. (1.524, -0.6096)
  7. (1.524, 0.9144)
  8. (0, 0.9144)
  9. (0, 0)
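
Since this conversion shows up everywhere, a one-liner handles it. Here is a minimal sketch (the factor 0.3048 m/ft is exact by definition):

```python
FT_TO_M = 0.3048  # exact: 1 foot = 0.3048 meters

# Waypoints as given, in feet
wp_ft = [(-4, -3), (-2, -1), (1, -1), (2, -3), (5, -3),
         (5, -2), (5, 3), (0, 3), (0, 0)]

# Convert every coordinate to meters for the localization grid
wp_m = [(x * FT_TO_M, y * FT_TO_M) for x, y in wp_ft]
# -> [(-1.2192, -0.9144), (-0.6096, -0.3048), (0.3048, -0.3048), ...]
```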

The execution of this navigation was heavily based on Arduino and Python working in tandem to communicate about what was going on in the map. To make the program flow as efficient as possible, we decided to offload the heavy computation to the Python side on the computer to make the robot as fast as possible (after all, this is called FAST ROBOTS) and only let the Arduino worry about the physical actuation of the motors based on the feedback from the Python end over Bluetooth. More concretely, we did the following:

  1. Python -> Arduino: send a Start Run command.
  2. Arduino -> Python: the robot spins 360 degrees and sends back the retrieved sensor values.
  3. Python -> Arduino: find the belief, calculate the control inputs, and send them to the robot.
  4. Arduino -> Python: based on the received control inputs, actuate as necessary and send back the retrieved sensor values.
  5. Python -> Arduino: find the belief, calculate the control inputs, and send them to the robot.
  6. Repeat until the robot reaches its destination.
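
To make this handshake concrete, below is a hypothetical trace of one round trip. The numbers are made up for illustration, but the field layouts match the send_input() string format and the angle/ToF packets assembled by sendMapLog() in the code that follows:

```python
# Hypothetical trace of one round trip (values illustrative):
#
# Python -> Arduino : "431|315|45"      dist_mm | rot1_deg | rot2_deg
# Arduino -> Python : "20.3,512,488"    angle_deg, front ToF (mm), side ToF (mm)
# Arduino -> Python : "40.1,507,463"
# ...                                   one packet roughly every 20 degrees of spin
# Arduino -> Python : "358.9,1250,801"
#
# The Python callback treats any 3-field packet as localization data,
# converts the front ToF readings from mm to m, runs the Bayes filter
# update, and then sends the next "dist|rot1|rot2" command.
```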

Below is the Python script that takes care of the off-board computation: generating the beliefs and the control inputs for the robot.

# Setup
```python
%load_ext autoreload
%autoreload 2
import traceback
from notebook_utils import *
from Traj import *
import asyncio
import pathlib
import os
from utils import load_config_params
from localization_extras import Localization
# The imports below will only work if you copied the required ble-related
# python files into the notebooks directory
from ble import get_ble_controller
from base_ble import LOG
from cmd_types import CMD
import numpy as np
# Setup Logger
LOG = get_logger('demo_notebook.log')
LOG.propagate = False
# Init GUI and Commander
gui = GET_GUI()
cmdr = gui.launcher.commander
```
# Variables, Functions, Classes
```python
# Global variables #
log = []      # Other data
turnLog = []  # Data related to localization

# Waypoints in meters, each with a target heading in degrees
wp = [ [-1.2192,-0.9144,45], [-0.6096,-0.3048,0], [0.3048,-0.3048,300], [0.6096,-0.9144,0],
       [1.524,-0.9144,90], [1.524,-0.6096,90], [1.524,0.9144,180], [0,0.9144,270], [0,0,0] ]
wp.reverse()  # Reverse the order (as configured here, the robot runs the path backwards)
wp_idx = 1
done = 0               # Reached final waypoint
cur_goal = wp[wp_idx]  # Next waypoint to hit
end_goal = wp[8]       # Final waypoint

# Callback function #
def updateValue( uuid, value ):
    global log
    global turnLog
    received = value.decode('utf-8').split(',')
    # Localization data will be length 3
    if ( len(received) == 3 ):
        turnLog.append(received)
    else:
        log.append(received)

# Robot Class #
class RealRobot():
    def __init__(self, commander, ble):
        # Load world config
        self.world_config = os.path.join(str(pathlib.Path(os.getcwd()).parent), "config", "world.yaml")
        self.config_params = load_config_params(self.world_config)
        # Commander to communicate with the Plotter process
        # Used by the Localization module to plot odom and belief
        self.cmdr = commander
        self.ble = ble  # ArtemisBLEController to communicate with the Robot
        self.setup_notify()

    def setup_notify( self ):
        ble.start_notify( ble.uuid['RX_STRING'], updateValue )

    def stop_notify( self ):
        ble.stop_notify( ble.uuid['RX_STRING'] )

    def send_input( self, delta_dist, delta_rot1, delta_rot2 ):
        input_string = str( delta_dist ) + "|" + str( delta_rot1 ) + "|" + str( delta_rot2 )
        ble.send_command(CMD.SEND_THREE_FLOATS, input_string)

    def perform_observation_loop(self, rot_vel=120):
        global turnLog
        theta_arr = [ float(i[0]) for i in turnLog ]       # Obtain theta values
        dist_arr = [ float(i[1])*0.001 for i in turnLog ]  # Obtain dist values (mm -> m)
        turnLog = []  # Erase turnLog
        sensor_ranges = np.array(dist_arr)[np.newaxis].T   # Convert to column array
        sensor_bearings = np.array([0, 0, 0, 0, 0, 0, 0])[np.newaxis]  # Placeholder bearings (unused)
        return sensor_ranges, sensor_bearings
```
# Connect Bluetooth
```python
ble = get_ble_controller()
ble.connect()
```
# Initialize
```python
# Initialize RealRobot with a Commander object to communicate with the plotter process
# and the ArtemisBLEController object to communicate with the real robot
robot = RealRobot(cmdr, ble)
# Initialize mapper
# Requires a VirtualRobot object as a parameter
mapper = Mapper(robot)
# Initialize your BaseLocalization object
# Requires a RealRobot object and a Mapper object as parameters
loc = Localization(robot, mapper)
# Initialize Uniform Belief
loc.init_grid_beliefs()
```
# Actual Run
```python
robot.send_input( 1, 0, 0 )  # Dummy input to kick off the run

while ( not done ):
    await asyncio.sleep(0.0001)
    if ( len(turnLog) != 0 ):
        await asyncio.sleep(2)
        loc.get_observation_data()   # Get localization data
        loc.update_step()            # Update belief
        x, y, a = loc.get_belief()   # Get belief coordinates in meters
        x_end_diff = end_goal[0] - x
        y_end_diff = end_goal[1] - y
        # Reached last waypoint
        if ( np.abs( x_end_diff ) < 0.25 and np.abs( y_end_diff ) < 0.25 ):
            done = 1
        else:
            # Reached current goal
            if ( np.abs( cur_goal[0] - x ) < 0.35 and np.abs( cur_goal[1] - y ) < 0.35 ):
                wp_idx += 1
                cur_goal = wp[ wp_idx ]  # Update goal
            # Calculate control input
            delta_rot1 = mapper.normalize_angle( np.degrees( np.arctan2( cur_goal[1] - y, cur_goal[0] - x ) ) - a )
            delta_dist = np.sqrt( ( cur_goal[0] - x )**2 + ( cur_goal[1] - y )**2 )
            delta_rot2 = mapper.normalize_angle( cur_goal[2] - a - delta_rot1 )
            # Change negative angles to positive
            if ( delta_rot1 < 0 ):
                delta_rot1 = 360 + delta_rot1
            if ( delta_rot2 < 0 ):
                delta_rot2 = 360 + delta_rot2
            # Drift causes robot to go beyond intended distance, so decrease dist
            delta_dist = delta_dist * 2 / 3
            robot.send_input( delta_dist*1000, delta_rot1, delta_rot2 )  # dist sent in mm
```
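
To sanity-check the control math, here is one worked step under assumed values: suppose the belief comes back as (0, 0, 0°) and the next goal is the third waypoint, (0.3048, -0.3048) with a target heading of 300°. The normalize helper below assumes mapper.normalize_angle wraps angles to [-180, 180), matching how the script uses it:

```python
import numpy as np

def normalize(a):
    # Wrap an angle to [-180, 180); assumed to match mapper.normalize_angle
    return (a + 180) % 360 - 180

x, y, a = 0.0, 0.0, 0.0             # assumed belief: at the origin, facing 0 deg
gx, gy, ga = 0.3048, -0.3048, 300   # third waypoint and its target heading

delta_rot1 = normalize(np.degrees(np.arctan2(gy - y, gx - x)) - a)  # -45.0 deg
delta_dist = np.hypot(gx - x, gy - y)                               # ~0.431 m
delta_rot2 = normalize(ga - a - delta_rot1)                         # -15.0 deg

delta_rot1 %= 360    # 315.0 (negative angles wrapped positive)
delta_rot2 %= 360    # 345.0
delta_dist *= 2 / 3  # drift compensation: ~0.287 m

# robot.send_input(delta_dist * 1000, delta_rot1, delta_rot2)
# -> sends roughly "287.4|315.0|345.0" over BLE
```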

Below is the Arduino sketch that tells the robot how to perform localization spins, retrieve sensor data, and transmit it. Simply put, this sketch takes care of executing the commanded motions and sending ToF data over Bluetooth.

```cpp
// Excerpt from the main loop; globals (angle, CR, measurements[], Kp, ...) are declared elsewhere in the sketch.

//////DRIVE STRAIGHT//////
int beginTime = millis();
while (millis() - beginTime < 1300) motor_forward(60);
motor_brake();
beginTime = millis();
while (millis() - beginTime < 4000) motor_brake();  // hold the brake for 4 s

while (1) {
  /////DO A SCAN//////////
  int updateSuccess;
  angle = 0;
  endIntegral = micros();
  updateSuccess = myBot.updatePosition();
  while (central.connected() && CR == 1) {
    // Integrate gyro readings to track the rotation angle
    angle += abs(myBot.gX) * (float((micros() - endIntegral)) / 1000000);
    endIntegral = micros();
    updateSuccess = myBot.updatePosition();
    e = abs(myBot.gX) - 15;
    pwm_val = 105 - Kp * e;
    // motor_spin( limit_range(pwm_val), limit_range(pwm_val)+15 );
    motor_spin(100, 115);
    if (measurement >= 18) {  // reinitialize for next loop, send over turn log, and move on
      motor_stop();
      angle = 0;
      increment = 0;
      measurement = 0;
      sendMapLog();
      CR = 0;
      break;
    }
    if (angle - increment >= 20) {  // record a reading every 20 degrees
      struct turn t;
      t.predictedAngle = angle;
      t.frontTOF = myBot.front;
      t.sideTOF = myBot.side;
      measurements[measurement] = t;
      measurement += 1;
      increment = angle;
    }
  }
  /////DO A SCAN//////////

  ////////receive an instruction///////
  while (central.connected()) {
    if (rx_characteristic_string.written()) {
      Serial.println("handled");
      handle_command();  // parses the received "dist|rot1|rot2" control input
      break;
    }
  }
  ////////receive an instruction///////

  /////////TURN AN ANGLE//////
  angle = 0;
  endIntegral = micros();
  updateSuccess = myBot.updatePosition();
  // if (pi > 180) pi = 360 - pi;
  while (central.connected()) {
    angle += abs(myBot.gX) * (float((micros() - endIntegral)) / 1000000);
    endIntegral = micros();
    updateSuccess = myBot.updatePosition();
    if (angle >= int(pi)) {  // pi holds the commanded rotation angle
      angle = 0;  // reset for next time
      motor_stop();
      break;  // i is second angle
    }
    motor_spin(100, 115);  // just use last pwm value
  }
  /////////TURN AN ANGLE//////

  //////DRIVE STRAIGHT//////
  myBot.updatePosition();
  startingPoint = myBot.front;
  Serial.print("starting point is ");
  Serial.println(startingPoint);
  // Drive until the front ToF shows we have covered the commanded distance p
  while (startingPoint - myBot.front < int(p) && central.connected()) {
    myBot.updatePosition();
    driveStraight(myBot.gX);
    myBot.pright = rightWheel;
    myBot.pleft = leftWheel;
  }
  motor_stop();
  Serial.print("the last one was ");
  Serial.print(startingPoint - myBot.front);
  Serial.println("starting turn");
  beginTime = millis();
  while (central.connected()) {
    motor_brake();
    if (millis() - beginTime > 1000) break;
  }
  //////DRIVE STRAIGHT//////
  beginTime = millis();
  while (central.connected()) {
    motor_brake();
    if (millis() - beginTime > 1000) break;
  }
  CR = 1;  // ready for the next localization scan
}
```

Upon integrating both programs, we were able to produce some good attempts at the path navigation. Displayed below are two videos: Attempt 1 and Attempt 2. Both attempts successfully complete the task, but in Attempt 2 the robot gets stuck against the wall and needs a little nudge to get unstuck. Yet, despite this external input into the system, it is still able to handle the task very well and completes the navigation with flying colors.

Attempt 1:

Attempt 2:

What Sets Our Robot Apart

One of the strongest assets of our system and the algorithm we deploy on the robot is its robustness. On several occasions, as we were attempting the path, the robot exhibited great recovery even when it drastically overshot its goal or ended up in a corner so far from the target that we thought it would be almost impossible to recover from such a mistake. To demonstrate the robustness and reliability of our algorithm, we reversed the order of the waypoints and tasked the robot with essentially completing the designated task backwards. This shows that our algorithm is applicable to any set of waypoints, not just the ones we were given in the order they were given. See the reversed attempt video below.

Reversed Attempt:

Acknowledgements

To Ben and Anya:

Remember those moments when our robot would overshoot so many times and we thought it had quit on us, but then it figured out how to get back to its goal? I think those moments should be remembered because they truly serve as a testament to the level of effort, thought, and planning we put into this lab that allowed the robot to do the "impossible." Those moments when we thought the robot was going to fail and it didn't highlight the fact that we shouldn't doubt the skills we've picked up over the course of this semester, because those are the same abilities we can use to create the next "impossible" in the years to come. So I just want to give a shoutout to my teammates Ben and Anya and let them know to always believe in themselves and trust that they are well equipped with the necessary skills to create a brand new tomorrow.

A big thanks to Kirsten and the teaching staff for supporting and encouraging us throughout the semester even when things got tough. I can't wait to see what the next gen of Fast Robot students produce under your guidance!

A big shoutout to my fellow students in this class for riding the wave; through the ups and downs, we made it through.

Speaking of Waves: Enjoy this Video