ECE 3400 Team 1 Milestone 1

Goals

We broke down the task into four sub-tasks.

  1. Make a driving robot (accomplished in lab 1)
  2. Have the robot detect lines
  3. Combine tasks 1 and 2 to allow line following
  4. Add plus-junction detection to enable figure 8 traversal

Task 1

We accomplished this task by reusing much of the lab 1 setup. Digital pins 9 and 10 still carry the servo control signals, but instead of simply proceeding at full speed, the robot now travels at roughly 10% of full speed when going straight. Turns still run at full speed, with one servo at full forward and the other at full reverse during a turn.
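A minimal sketch of this drive setup, assuming continuous-rotation servos driven through the Arduino Servo library; the exact write() values below are illustrative, not our calibrated constants:

#include <Servo.h>

Servo left_servo;   // signal on digital pin 9
Servo right_servo;  // signal on digital pin 10

void setup(){
    left_servo.attach(9);
    right_servo.attach(10);
}

// For continuous-rotation servos, write(90) is stop; offsets set speed and
// direction. The right servo is mounted mirrored, so its forward sense is
// reversed relative to the left servo.
void drive_straight_slow(){
    left_servo.write(99);   // a little above stop: roughly 10% forward
    right_servo.write(81);  // mirrored: a little below stop is forward
}

void pivot_left(){
    left_servo.write(0);    // full reverse on the left wheel
    right_servo.write(0);   // full forward on the mirrored right wheel
}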

Task 1 Helper Functions

We used helper routines to standardize left and right 90-degree turns. A left turn pseudo-code snippet is below.
void turn_left(){
    drive_ramp(MOTOR_STRAIGHT);               // center the robot's pivot point on the junction
    delay(125);
    drive_ramp(RIGHT_FORWARD_LEFT_BACKWARD);  // pivot through the 90-degree turn
    delay(630);
    drive_ramp(MOTOR_STRAIGHT);               // drive back out of the junction
    delay(125);
    drive_ramp(MOTOR_STOP);
}
drive_ramp is a helper function, written by Alex Coy, that ramps the motor speed up gradually rather than changing it instantly, which he felt would lead to smoother turns. It takes the desired motor state as its argument.
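One possible shape for drive_ramp, using the left_servo/right_servo objects from the sketch above and assuming the motor states are pairs of target servo values; the struct, constants, and step size here are illustrative rather than our actual code:

// Hypothetical representation of a motor state as a pair of servo targets.
struct MotorState { int left; int right; };

const MotorState MOTOR_STRAIGHT              = {99, 81};
const MotorState RIGHT_FORWARD_LEFT_BACKWARD = { 0,  0};
const MotorState MOTOR_STOP                  = {90, 90};

int current_left = 90;
int current_right = 90;

// Step each servo a few counts at a time toward its target so the robot
// does not jerk when the commanded speed changes abruptly.
void drive_ramp(MotorState target){
    while(current_left != target.left || current_right != target.right){
        current_left  += constrain(target.left  - current_left,  -5, 5);
        current_right += constrain(target.right - current_right, -5, 5);
        left_servo.write(current_left);
        right_servo.write(current_right);
        delay(10);  // small pause between steps spreads the ramp over time
    }
}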

Task 2: Line Sensors

The line sensors work by shining an infrared beam at the surface and measuring the strength of the reflection. White surfaces yield a low output voltage from the sensor, while dark surfaces yield a high voltage. For the time being, Ryan and Tyrone connected the line sensors to analog inputs A0 through A4. Joseph worked on a Schmitt trigger circuit in the hope that we can treat the line sensor outputs as digital signals at some point, but we did not implement the Schmitt trigger circuit for this milestone.
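A sketch of how the analog readings can be turned into black/white decisions; the pin array matches the wiring above, but the threshold value is illustrative and would need to be calibrated on the actual arena:

const int LINE_PINS[5] = {A0, A1, A2, A3, A4};
const int BLACK_THRESHOLD = 700;  // illustrative cutoff: dark reads high, white reads low

// Read one line sensor and report whether it currently sees a dark surface.
bool sensor_sees_black(int index){
    return analogRead(LINE_PINS[index]) > BLACK_THRESHOLD;
}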

Tasks 3 and 4: Line Following and Figure 8

Alex drew a simple line sensor holder in Autodesk Inventor for 3D printing. The holder spaces the line sensors 18 mm apart and centers them between the wheel planes, letting two sensors straddle the line for navigation while two more detect junctions so the robot knows when to turn. A rendering of the sensor holder is below.

[Rendering: line sensor holder]
Ryan and Tyrone worked on navigation code. A navigation pseudo-code snippet is the following:
while(1){
    poll_sensors;                                             // read all five line sensors
    if(outside_sensors_black){                                // outer sensors off the line: no junction here
        if(inside_sensors_black) both_servos_same_forward;    // centered on the line: drive straight
        else if(inside_right_white) right_slightly_slower;    // drifting off the line: steer back toward it
        else if(inside_left_white) left_slightly_slower;
    }else if(outside_sensors_white){                          // outer sensors over the crossing line: junction
        make_correct_turn;
    }
}

In the above code snippet, the robot polls the line sensors on every iteration of the loop; we found this polling rate fast enough to navigate successfully for now. Based on what the sensors report, the robot either corrects its course when it drifts off the line or, when it reaches a junction, makes the next turn in the figure 8 sequence (tracked with a counter variable in the actual code). The robot assumes that it starts in the bottom right corner of the 8 (or the top right corner of an infinity sign).
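As an illustration of that counter, make_correct_turn might look like the sketch below, assuming the figure 8 decomposes into four turns one way around the first loop and four turns the other way around the second; the actual sequence in navigation_figure8.ino depends on the starting corner and may differ:

int junction_count = 0;   // junctions handled so far in the figure 8

// Hypothetical sketch: pick the next turn from a repeating eight-turn cycle,
// using the turn_left/turn_right helpers from Task 1.
void make_correct_turn(){
    if(junction_count % 8 < 4) turn_left();
    else                       turn_right();
    junction_count++;
}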

Video proof of concept

This video shows a sample of our robot following a line and navigating junctions. The robot had an unexplained error at the end of the video (which is why it stopped), but the code instructs the robot to follow the figure 8 path forever.

Code References

Our full code is in our GitHub repository, in the file navigation_figure8.ino, currently located in the top directory of the repository. We used an article by Chowdhury, Khushi, and Rashid as a starting point for our algorithm: https://ijcjournal.org/index.php/InternationalJournalOfComputer/article/download/819/412