Milestone 3: Robot Detection and Maze Navigation
Detecting Other Robots
To detect other robots, we implemented an IR sensing circuit that uses a phototransistor's amplified output to determine the proximity of another robot emitting infrared light. A flag called "pt" records whether the phototransistor output reads as a 0 or a 1; if it is 1, a robot has been detected and our robot makes a 180-degree turn.
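As a rough illustration of how such a flag could be set, the sketch below thresholds an analogRead() of the amplified phototransistor signal. The pin assignment (A3) and threshold value are placeholders for this example, not values taken from our actual circuit.

// Illustrative only: setting a "pt"-style flag from the amplified
// phototransistor output. Pin and threshold are assumed values.
const int PT_PIN = A3;         // assumed analog pin wired to the amplifier output
const int PT_THRESHOLD = 600;  // assumed ADC threshold (0-1023) for "robot detected"

int pt = 0;

void read_phototransistor() {
  int raw = analogRead(PT_PIN);       // read the amplified IR signal
  pt = (raw > PT_THRESHOLD) ? 1 : 0;  // 1 means another robot's IR emitter is nearby
}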
The command to turn 180 degrees upon detection of a robot can be seen in the code below.
if (pt == 1) {
  turn(TURN_RIGHT, DEG_180, left_line_val, mid_line_val, right_line_val);
}
The video below shows a demonstration of our robot detecting another robot. We used a stationary IR LED powered by a 9V battery instead of another robot.
Maze Navigation
The maze mapping algorithm uses an iterative depth-first search: the robot repeatedly moves to nodes that have not yet been visited until it reaches a logical or physical dead end, then retraces its steps until it finds another path that leads to unvisited nodes. We implement this by defining a two-dimensional m x n array of structs, where each struct stores node information such as the backtrack orientation, whether each of the node's neighbors has been visited, and whether the node itself has been visited. Globally, we also keep track of the robot's orientation and position within the maze and the total number of nodes that still have not been visited. We continue exploring the maze until that global count reaches 0, at which point we stop and turn the green LED on. The robot's orientation and position let us decide which way to turn and ensure we can retrace our steps correctly. With this method, each path is explored only once (no repeats), but we do not necessarily take the shortest path through the maze.
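As a concrete, simplified picture of that global state, it could be declared along these lines. The maze dimensions here are placeholders, and in practice nodes_to_visit would be initialized to the number of reachable intersections in the actual maze; node_t and ORIENTATION are the types used in the code shown below.

// Sketch of the global bookkeeping described above (dimensions are placeholders).
#define MAZE_W 9
#define MAZE_H 9

node_t maze[MAZE_W][MAZE_H];               // one struct per intersection in the maze
int robot_x = 0;                           // current node, x coordinate
int robot_y = 0;                           // current node, y coordinate
ORIENTATION robot_orientation = OR_NORTH;  // which way the robot is currently facing
int nodes_to_visit = MAZE_W * MAZE_H;      // unvisited nodes remaining; mapping is done at 0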
The code below shows how the maze mapping algorithm is implemented with the where_to_go() function. The definitions for the node struct and orientation and direction lookup tables are also shown below.
void where_to_go() {
  // print_robot_coordinates();
  if (maze[robot_x][robot_y].visited == false){
    update_node();
  }

  // stop at end of maze
  // no more nodes are unvisited
  if (nodes_to_visit == 0) {
    while(true) {
      servo_L.write(STOPPED);
      servo_R.write(STOPPED);
      digitalWrite(LED_GREEN, HIGH);
      digitalWrite(LED_BLUE, LOW);
      digitalWrite(LED_RED, LOW);
    }
  }

  // set default orientation to backtrack in case this node
  // has no neighbors that haven't been visited
  ORIENTATION path_or = maze[robot_x][robot_y].or_backtrack;

  // check neighbors array
  for(int o = OR_NORTH; o < OR_NUM; o++) {
    if (maze[robot_x][robot_y].neighbor[o] == false){
      // neighbor in orientation o hasn't been visited, we should go there
      path_or = (ORIENTATION) o;
      break;
    }
  }

  // path_or is the desired orientation to travel in
  DIRECTION dir = or_or[robot_orientation][path_or];

  // print which direction we are going
  // Serial.print("going in orientation ");
  // Serial.print(path_or);
  // Serial.print("\t we are turning ");
  // Serial.println(dir);

  // call turning function based off direction
  switch (dir) {
    case DIR_LEFT:
      turn(DIR_LEFT, DEG_90, left_line_val, mid_line_val, right_line_val);
      turn_executed = true;
      break;
    case DIR_FORWARD:
      // do nothing, line following should take us to next intersection
      break;
    case DIR_RIGHT:
      turn(DIR_RIGHT, DEG_90, left_line_val, mid_line_val, right_line_val);
      turn_executed = true;
      break;
    case DIR_BACKWARD:
      turn(DIR_LEFT, DEG_180, left_line_val, mid_line_val, right_line_val);
      turn_executed = true;
      break;
    default:
      Serial.println("DIRECTION OUT OF BOUNDS");
  }
}
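where_to_go() relies on update_node(), which is not reproduced here. The sketch below illustrates the kind of bookkeeping such a function would perform: mark the current node visited, record how to backtrack, and rule out neighbors that cannot or should not be explored. It is an illustration rather than our exact code; wall_in_orientation() is a hypothetical helper standing in for the actual wall-sensor check, and the north = +y, east = +x coordinate convention is assumed.

// Illustrative sketch of update_node(); wall_in_orientation() is a hypothetical
// helper, and the coordinate convention (north = +y, east = +x) is an assumption.
void update_node() {
  // assumed grid offsets for OR_NORTH, OR_EAST, OR_SOUTH, OR_WEST
  const int dx[OR_NUM] = { 0, 1, 0, -1 };
  const int dy[OR_NUM] = { 1, 0, -1, 0 };

  node_t *n = &maze[robot_x][robot_y];
  n->visited = true;
  nodes_to_visit--;  // one fewer unvisited node remaining

  // remember how to backtrack: the way we came in is directly behind us
  n->or_backtrack = or_dir[robot_orientation][DIR_BACKWARD];

  for (int o = OR_NORTH; o < OR_NUM; o++) {
    int nx = robot_x + dx[o];
    int ny = robot_y + dy[o];

    if (wall_in_orientation((ORIENTATION) o) ||
        nx < 0 || nx >= MAZE_W || ny < 0 || ny >= MAZE_H) {
      // walled-off or out-of-bounds neighbors are marked as if already visited
      // so where_to_go() never tries to travel there
      n->neighbor[o] = true;
    } else {
      if (maze[nx][ny].visited) {
        n->neighbor[o] = true;  // already-explored neighbor is also ruled out
      }
      // tell the reachable neighbor that this node has now been visited,
      // so it will not try to come back here except by backtracking
      ORIENTATION opposite = or_dir[o][DIR_BACKWARD];
      maze[nx][ny].neighbor[opposite] = true;
    }
  }
}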
Below is the struct definition for a node.
typedef struct{
  bool neighbor[OR_NUM] = {0, 0, 0, 0}; // keeps track of whether or not a neighbor has been visited, one entry for each possible neighbor orientation
  ORIENTATION or_backtrack;             // if we have nowhere else to go, we bail this way
  bool visited = false;                 // have we been to this node before?
} node_t;
Below are the definitions for the lookup tables used to determine orientation and direction.
// LUT to figure out which orientation we end up facing after turning in a given direction
ORIENTATION or_dir[OR_NUM][DIR_NUM] = {
  //                          L          F          R          B
  /* NORTH ORIENTATION */ {OR_WEST,  OR_NORTH, OR_EAST,  OR_SOUTH},
  /* EAST ORIENTATION  */ {OR_NORTH, OR_EAST,  OR_SOUTH, OR_WEST},
  /* SOUTH ORIENTATION */ {OR_EAST,  OR_SOUTH, OR_WEST,  OR_NORTH},
  /* WEST ORIENTATION  */ {OR_SOUTH, OR_WEST,  OR_NORTH, OR_EAST},
};

// LUT of sorts to figure out which direction to turn based on current vs. desired orientation
DIRECTION or_or[OR_NUM][OR_NUM] = {
  //                          N             E             S             W
  /* NORTH ORIENTATION */ {DIR_FORWARD,  DIR_RIGHT,    DIR_BACKWARD, DIR_LEFT},
  /* EAST ORIENTATION  */ {DIR_LEFT,     DIR_FORWARD,  DIR_RIGHT,    DIR_BACKWARD},
  /* SOUTH ORIENTATION */ {DIR_BACKWARD, DIR_LEFT,     DIR_FORWARD,  DIR_RIGHT},
  /* WEST ORIENTATION  */ {DIR_RIGHT,    DIR_BACKWARD, DIR_LEFT,     DIR_FORWARD},
};
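The or_dir table maps the current orientation and the direction just turned to the resulting orientation. The snippet below is a simplified illustration of how it could be used to keep robot_orientation and the robot's grid position up to date after each move; update_pose() is a hypothetical name, and the north = +y, east = +x convention is the same assumption as in the sketches above.

// Illustrative pose bookkeeping after a move: or_dir yields the new heading,
// and the assumed north = +y, east = +x convention advances the grid position
// by one node in that heading.
void update_pose(DIRECTION dir_taken) {
  robot_orientation = or_dir[robot_orientation][dir_taken];

  switch (robot_orientation) {
    case OR_NORTH: robot_y++; break;
    case OR_EAST:  robot_x++; break;
    case OR_SOUTH: robot_y--; break;
    case OR_WEST:  robot_x--; break;
    default: break;
  }
}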
The videos below show the robot implementing maze mapping on multiple mazes.