Ryan's ECE 4960 Site


Lab 9

This lab was dedicated to creating a map of the environment the robot is in. The map was built by spinning the robot in place at several marked spots in the environment and taking measurements from the robot's front Time of Flight (ToF) sensor, somewhat similar to how a spinning LiDAR sensor works.

Taking the Data

When I started this lab, my plan was to use angular speed control with a PID controller to spin the robot at a constant 20 degrees per second while continuously taking measurements from the front ToF sensor, along with a time stamp for each measurement. Then, during processing, I could use the angular velocity and the time stamps to figure out the angle of each ToF measurement. This first required me to combine the ToF and IMU sensor code, which was surprisingly time consuming since both sensors communicate over I2C. Once I finished that, I created this function, which carries out the strategy described above:


          int target_DPS = 20;         // target angular speed in degrees per second
          float proportional_gain = 1;

          // P controller on angular speed: spin at target_DPS while logging
          // ToF distances and time stamps for later angle reconstruction
          void botPID(){
            int current_distance = getDistanceSensor2();

            // update the IMU readings if new data is available
            if( myICM.dataReady() ){
              myICM.getAGMT();
            }
            float current_DPS = myICM.gyrZ();

            float error = target_DPS - current_DPS;
            float motor_value = 70 + proportional_gain * error; // 70 to handle the rotational deadband

            motor_spin_2(motor_value);

            // log the distance and time stamp for post-processing
            if(log_index < num_samples){
              time_stamps[log_index] = millis() / 1000.0;
              tof_2_values[log_index] = current_distance;
              log_index++;
            }

            Serial.print(current_DPS);
            Serial.print(",");
            Serial.print(current_distance);
            Serial.print(",");
            Serial.println(motor_value);
          }
        

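For reference, the post-processing I had planned for this strategy would have looked roughly like the sketch below, which integrates the logged gyro rates over the logged time stamps to recover the angle of each ToF reading. This is only a sketch: the array names gyro_dps and time_stamps are assumptions, and the gyro rates would have had to be logged alongside the distances rather than only printed over serial.


          import numpy as np

          # Sketch of the planned post-processing: integrate the logged gyro rate
          # (degrees per second) over the logged time stamps (seconds) to get the
          # heading angle of each ToF reading.
          def angles_from_gyro(gyro_dps, time_stamps):
              gyro_dps = np.asarray(gyro_dps, dtype=float)
              time_stamps = np.asarray(time_stamps, dtype=float)
              dt = np.diff(time_stamps, prepend=time_stamps[0])  # seconds between samples
              return np.cumsum(gyro_dps * dt)                    # degrees turned since the start
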
I then tried to run the code, which was when I ran into reliability problems with the ICM: it would constantly and randomly stop working in the middle of runs. This also happened when I ran the example code for the ICM, which indicated that the sensor itself was likely broken. This was possibly caused by the multiple hard collisions from Lab 8, by my attempt to resolder parts to redistribute the weight (also in Lab 8), or a combination of both. Since I was running low on time at this point, I spent a few minutes testing how accurate open-loop turning would be, to see if it could be an alternative to spending a lot of time soldering a new ICM onto the robot and continuing with the more complex strategy described above. Open-loop turning turned out to be surprisingly accurate, so given the little time I had left, I switched from PID to open-loop control of the rotations. Below is the code I used to perform this. Since open-loop turning depends on timing, and since the code cannot block if Bluetooth is to keep working, I implemented a state machine with time stamps to control the rotations and data collection.


          int sample_index = 0;
          int rotation_duration = 160; // how long to rotate between readings (ms), tune this
          int stop_duration = 250;     // how long to sit still in each stop state (ms)
          unsigned long data_time_stamp;
          int collect_data_state = 0; // 0 = stop1, 1 = stop2, 2 = rotate
          // stores in the data array defined way above
          void collect_data(){
            if(sample_index < num_samples){
              if(collect_data_state == 0){
                // first stop: let the robot settle after rotating
                motor_forward(0);

                if(millis() - data_time_stamp >= stop_duration){
                  collect_data_state = 1;
                  data_time_stamp = millis();
                }
              }else if(collect_data_state == 1){
                // second stop: take the ToF reading while stationary
                motor_forward(0);
                digitalWrite(19,HIGH); // LED on to indicate a reading is being taken
                if(millis() - data_time_stamp >= stop_duration){
                  data[sample_index] = getDistanceSensor2();
                  Serial.println(data[sample_index]);
                  sample_index++;
                  data_time_stamp = millis();
                  collect_data_state = 2;
                  digitalWrite(19,LOW);
                }
              }else if(collect_data_state == 2){
                // rotate by one increment, then go back to the first stop state
                motor_rotate(130);
                if(millis() - data_time_stamp >= rotation_duration){
                  data_time_stamp = millis();
                  collect_data_state = 0;
                }
              }
            }else{
              // all samples collected: stop the routine and reset the state machine
              robot_state = 2;
              sample_index = 0; // reset sample_index
              collect_data_state = 0;
            }
          }
        

Below is a video showing the measurement strategy in action and demonstrating that the robot turns roughly on axis:

Sanity Checking

I then performed a measurement run at each of the marked spots in the environment and, as a sanity check, plotted each run on a polar plot, which can be seen below:
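Since the open-loop routine takes a fixed number of readings per rotation, the angles for these polar plots can be assumed to be evenly spaced over 360 degrees. A minimal sketch of how one run could be plotted is shown below; the distance list and sample count here are placeholders, not the actual readings sent back over Bluetooth.


          import numpy as np
          import matplotlib.pyplot as plt

          # Sanity-check sketch: plot one run's ToF readings on a polar plot,
          # assuming the readings are evenly spaced over one full rotation.
          def polar_check(distances):
              angles = np.linspace(0, 2 * np.pi, len(distances), endpoint=False)
              ax = plt.subplot(projection='polar')
              ax.plot(angles, distances, 'o-')
              plt.show()

          polar_check([500] * 24)  # placeholder readings (mm), 24 samples per rotation
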

Merging and Plotting Readings

Next, I converted the angle and distance measurements at each marked spot into the inertial reference frame of the room and combined them into a consolidated map. To convert a measurement into x and y coordinates in the inertial reference frame, I used the transformation matrices from Lecture 2: the measured point P_1, expressed in the robot's reference frame, is rotated into the room frame by the rotation matrix R (built from the measurement angle) and then translated by the robot's position in the room, giving

P_0 = R * P_1 + [x_off]
                [y_off]

Since the distance measurements are taken directly in front of the robot, the point in the robot's frame is simply

P_1 = [distance]
      [   0    ]

I turned this transformation into a Python function with the help of the NumPy library, which I then used to plot the points:


          import numpy as np
          import matplotlib.pyplot as plt

          # measurements: ToF distances from one spot, angles: corresponding angles in radians,
          # (x_off, y_off): that spot's position in the room frame
          def plot(measurements, angles, x_off, y_off, color):
              for i in range(len(measurements)):
                  # rotation matrix from the robot frame into the room frame
                  R = np.array([[np.cos(angles[i]), -np.sin(angles[i])],
                                [np.sin(angles[i]),  np.cos(angles[i])]])
                  # the measured point in the robot's frame (directly in front of the robot)
                  P1 = np.array([[measurements[i]], [0]])
                  # rotate into the room frame
                  P0 = R.dot(P1)

                  # translate by the robot's position and plot
                  x = P0[0][0] + x_off
                  y = P0[1][0] + y_off

                  print(str(x) + "," + str(y))
                  plt.plot(x, y, 'o', color=color)
        

Using this Python function and the measured data, I plotted all of the Time of Flight sensor measurements, which turned out very nicely and seemed to match the environment closely:
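For completeness, the calls would look something like the sketch below, with one call per marked spot and that spot's (x, y) position in the room as the offsets. This assumes the imports and the plot function from the block above; the sample count, distance lists, and spot coordinates here are placeholders, not the actual measured values.


          num_samples = 24                          # placeholder: readings per rotation
          angles = np.linspace(0, 2 * np.pi, num_samples, endpoint=False)

          spot_a_distances = [500] * num_samples    # placeholder ToF readings (mm)
          spot_b_distances = [800] * num_samples    # placeholder ToF readings (mm)

          plot(spot_a_distances, angles, 0, 0, 'blue')       # spot at (0, 0) mm
          plot(spot_b_distances, angles, 1500, -900, 'red')  # placeholder spot position (mm)
          plt.axis('equal')
          plt.show()
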

Line-Based Map

Finally, so that the map can be used in the simulator, I drew lines showing where the walls and obstacles are based on this plot. I then turned the start and end points of these lines into lists that the simulator can use:


          wall_x = [-1600,-400,-400,350,350,1900,1900,-650,-650,-1600,-1600]
          wall_y = [-1300,-1300,-900,-900,-1300,-1300,1300,1300,0,0,-1300]

          box_x = [400,400,1200,1200,400]
          box_y = [550,-230,-230,550,550]
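As a quick check that these lines line up with the measurements, they can be drawn on the same axes as the scatter plot above. A minimal sketch of that check is shown below.


          import matplotlib.pyplot as plt

          wall_x = [-1600,-400,-400,350,350,1900,1900,-650,-650,-1600,-1600]
          wall_y = [-1300,-1300,-900,-900,-1300,-1300,1300,1300,0,0,-1300]
          box_x = [400,400,1200,1200,400]
          box_y = [550,-230,-230,550,550]

          # Draw the wall outline and the obstacle box; overlaying this on the
          # scatter plot shows how well the lines match the measurements.
          plt.plot(wall_x, wall_y, '-', color='black')
          plt.plot(box_x, box_y, '-', color='black')
          plt.axis('equal')
          plt.show()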