This project centers on building a virtual pet embedded system using a FRDM-KL46Z microcontroller board. Through the onboard switches and touch slider, the user interacts with the pet (modeled as a finite state machine), with animations displayed via Pygame.
Our project consists of two main code files: one uploaded to the board to run the input and FSM logic, and a Python script that takes serial input from the board and displays the series of images that make up our animations. The board takes user input via two onboard switches as well as the capacitive touch slider between them. A combination of interrupts, polling, and delays changes variables within our FSM, prompting different serial outputs via UART to the Python script. The Python script reads the serial port and displays the appropriate animations based on the FSM state.
Here is a video of us demonstrating the board interactions synced up with the Python Pygame window. Link here
In this section we go into detail about each main portion of our code. The board runs the main FSM and also serves as the user interface, while the Python file is used exclusively to play animations and sound. However, for the board and Python to interact correctly, we had to make design decisions about the structure of our UART communications.
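As a rough sketch of the message convention (the helper names and exact IDs below are illustrative, not the literal project code), each message is a single letter telling the Python script what to update, followed by a number:

```c
#include "fsl_debug_console.h"  /* SDK debug console; PRINTF is routed over UART */

/* One-letter prefix tells the Python script what to update:
 * "s<n>" selects an animation state, "b<n>" selects a background. */
static void send_state(int state) { PRINTF("s%d\r\n", state); }
static void send_background(int bg) { PRINTF("b%d\r\n", bg); }
```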
The FSM reacts to user inputs and keeps track of the pet's status. Our pet has five main stages: birth, idle, eat, play, and throw, with each stage corresponding to a specific set of animations. We also have an extra stage for the time-triggered background change. The user interacts with the pet through the capacitive slider and the onboard buttons, and timed interrupts change the background of the display to simulate the time of day. User input triggers the stage transitions, and each stage outputs serial messages to the Python script to change the animations and audio effects. For actions that have more than one possible animation, we added counters to switch between them.
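A minimal sketch of the state machine, with illustrative names and transitions (reusing the send_state() helper from the sketch above):

```c
typedef enum { ST_BIRTH, ST_IDLE, ST_EAT, ST_PLAY, ST_THROW } pet_state_t;

static pet_state_t state = ST_BIRTH;

/* One FSM step: choose the next state from the latest inputs.
 * swipe_dir is -1 (left), 0 (none), or +1 (right). */
static void fsm_step(int sw_eat, int sw_play, int swipe_dir) {
    switch (state) {
    case ST_IDLE:
        if (sw_eat)              state = ST_EAT;
        else if (sw_play)        state = ST_PLAY;
        else if (swipe_dir != 0) state = ST_THROW;
        break;
    default:
        state = ST_IDLE;   /* birth and one-shot actions return to idle */
        break;
    }
    send_state((int)state);   /* tell the Python script which animation to play */
}
```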
We made use of both switches on the board to enable two different sets of animations. Each is set up on Port C, via pins 3 and 12. Each switch is polled once before the FSM enters the state change function. Because presses are latched by the board, we do not need to poll continuously, which would cause large delays. Each switch updates a variable used by the FSM when deciding which state to go to next.
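A sketch of one way to set this up, assuming the CMSIS device header and that presses are latched via the port's interrupt status flags:

```c
#include "MKL46Z4.h"

#define SW1_PIN 3u    /* SW1 on PTC3  */
#define SW3_PIN 12u   /* SW3 on PTC12 */

void switches_init(void) {
    SIM->SCGC5 |= SIM_SCGC5_PORTC_MASK;   /* clock Port C */
    /* GPIO mux with internal pull-up; IRQC(0xA) latches a falling edge
     * in the ISF bit even if no interrupt handler is attached. */
    PORTC->PCR[SW1_PIN] = PORT_PCR_MUX(1) | PORT_PCR_PE_MASK |
                          PORT_PCR_PS_MASK | PORT_PCR_IRQC(0xA);
    PORTC->PCR[SW3_PIN] = PORT_PCR_MUX(1) | PORT_PCR_PE_MASK |
                          PORT_PCR_PS_MASK | PORT_PCR_IRQC(0xA);
    GPIOC->PDDR &= ~((1u << SW1_PIN) | (1u << SW3_PIN));   /* inputs */
}

/* Returns 1 if the switch was pressed since the last check, then clears the flag. */
int switch_was_pressed(uint32_t pin) {
    if (PORTC->ISFR & (1u << pin)) {
        PORTC->ISFR = (1u << pin);   /* write-1-to-clear */
        return 1;
    }
    return 0;
}
```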
The KL46Z microcontroller provides a capacitive touch slider interface for highly sensitive and robust touch sensing. For our purposes we set the module to capacitive sensing mode. By enabling the necessary registers in the TSI module and configuring the control registers for the detection range and offset, we operate the slider by triggering scans and reading the DATA register. To distinguish between swipe-left and swipe-right input, the check_slider() function samples 8 groups of data points, with each data point averaging four consecutive capacitive readings. We also set a cutoff value of 9999, so any slider reading above it is treated as an error. The averaging minimizes the effect of noise on the readings, and the grouping spreads our sampling out across the slider bar. Since the slider reads below 50 when untouched, we wait in a while loop until we see a valid reading; if the program spins in the loop for more than 75 cycles, we treat it as the user not touching the slider and exit the function. Each reading is compared to the previous one to decide whether the values are increasing or decreasing, and the results of the 7 comparisons are summed to decide whether the user swiped left or right; the judgment is returned as the function output. In the FSM this triggers a state transition from idle to throw, and the swipe direction decides whether the animation shows throwing left or right. This method does not guarantee a correct result every time, but it gives the user a more responsive experience than compute-intensive methods such as linear regression. We carefully tuned the delay between samples and the comparison cutoff to reach a success rate of roughly 80%.
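A condensed sketch of check_slider(); tsi_read() and delay_short() are stand-ins for the actual scan trigger and tuned delay:

```c
#include <stdint.h>

#define ERR_CUTOFF 9999   /* readings above this are treated as glitches */
#define TOUCH_MIN  50     /* the slider reads below ~50 when untouched */
#define SPIN_LIMIT 75     /* give up after this many empty polls */

extern uint32_t tsi_read(void);   /* hypothetical: one software-triggered TSI scan */
extern void delay_short(void);    /* hypothetical: tuned inter-sample delay */

/* Returns +1 or -1 for the two swipe directions, 0 for no/invalid touch.
 * (Which sign means "left" depends on the electrode orientation.) */
int check_slider(void) {
    uint32_t samples[8];
    int spins = 0;
    uint32_t r;

    /* Spin until a valid touch appears, bailing out after SPIN_LIMIT tries. */
    do {
        r = tsi_read();
        if (++spins > SPIN_LIMIT) return 0;
    } while (r < TOUCH_MIN || r > ERR_CUTOFF);

    /* Take 8 groups, each averaging four back-to-back readings. */
    for (int i = 0; i < 8; i++) {
        uint32_t sum = 0;
        for (int j = 0; j < 4; j++) sum += tsi_read();
        samples[i] = sum / 4;
        delay_short();
    }

    /* Sum the 7 pairwise comparisons; the sign gives the swipe direction. */
    int trend = 0;
    for (int i = 1; i < 8; i++)
        trend += (samples[i] > samples[i - 1]) ? 1 : -1;

    return (trend > 0) ? 1 : -1;
}
```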
When there is no user input, the FSM animates the pet jumping left and right in the idle stage. Our original implementation sent a trigger message to the Python script over the serial interface, sat in a long delay, and then checked for user inputs to decide the state transition. However, this approach wastes many compute cycles in the delay, and mismatched delays sometimes clogged the serial buffer with commands; from the user's perspective, their inputs were animated many iterations later. We therefore took advantage of the interrupt feature. Rather than waiting in a long delay before sending the next idle-stage command, we set up an interrupt routine using the periodic interrupt timer (PIT). Every five seconds, the PIT counts down to 0, triggers the ISR, and sends the idle animation command to the Python script if there is no user input. Otherwise, the FSM keeps checking for user inputs to decide whether to perform a state transition and animate the input. The ISR is also where the background changes: a counter advances on each interrupt, and every few interrupts the corresponding serial output line is printed.
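A sketch of the PIT setup and ISR, assuming a 24 MHz bus clock (the PIT is clocked off the bus clock) and illustrative message IDs and counter period:

```c
#include "MKL46Z4.h"
#include "fsl_debug_console.h"

#define BUS_CLOCK_HZ 24000000u   /* assumed bus clock frequency */
#define BG_PERIOD    6           /* interrupts between background changes (illustrative) */

volatile int user_input_pending = 0;   /* set by the input-handling code */

void pit_init(void) {
    SIM->SCGC6 |= SIM_SCGC6_PIT_MASK;                /* clock the PIT module */
    PIT->MCR = 0;                                    /* enable the timers */
    PIT->CHANNEL[0].LDVAL = 5u * BUS_CLOCK_HZ - 1u;  /* 5-second period */
    PIT->CHANNEL[0].TCTRL = PIT_TCTRL_TIE_MASK | PIT_TCTRL_TEN_MASK;
    NVIC_EnableIRQ(PIT_IRQn);
}

void PIT_IRQHandler(void) {
    static int bg_counter = 0, bg = 0;
    PIT->CHANNEL[0].TFLG = PIT_TFLG_TIF_MASK;        /* acknowledge the interrupt */

    if (!user_input_pending)
        PRINTF("s1\r\n");                            /* idle animation command */

    if (++bg_counter >= BG_PERIOD) {                 /* day/night cycle */
        bg_counter = 0;
        bg ^= 1;
        PRINTF("b%d\r\n", bg);
    }
}
```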
We utilize the board's two LEDs: the red LED indicates that a user input has been registered and will be performed, while the green LED toggles every time we enter the interrupt service routine for the idle-state animation, mainly for debugging purposes.
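For reference, a minimal LED helper (on the FRDM-KL46Z the green LED is on PTD5 and the red LED on PTE29, both active low):

```c
#include "MKL46Z4.h"

#define GREEN_PIN 5u    /* PTD5  */
#define RED_PIN   29u   /* PTE29 */

void leds_init(void) {
    SIM->SCGC5 |= SIM_SCGC5_PORTD_MASK | SIM_SCGC5_PORTE_MASK;
    PORTD->PCR[GREEN_PIN] = PORT_PCR_MUX(1);
    PORTE->PCR[RED_PIN]   = PORT_PCR_MUX(1);
    GPIOD->PDDR |= (1u << GREEN_PIN);   /* outputs */
    GPIOE->PDDR |= (1u << RED_PIN);
}

void green_toggle(void) { GPIOD->PTOR = (1u << GREEN_PIN); }
void red_on(void)       { GPIOE->PCOR = (1u << RED_PIN); }   /* active low */
void red_off(void)      { GPIOE->PSOR = (1u << RED_PIN); }
```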
The Python file reads a line from the serial port and updates display variables based on the string of numbers the board sends over. If the string starts with "s" we update the state; if it starts with "b" we update the background instead. Once we parse the number out of the string and save it to our state variable, we update the other variables (number of frames and sound). A loop then concatenates the variable numbers into filenames of the form state+frame+.png and bg+.png to display each frame after a short half-second delay. The window is wiped and the images are redrawn (background first, then frame overlaid on top) every cycle of the loop using Pygame's fill, blit, and display functions. Sound files are also selected based on the state variable and played before each animation loop, timed so that the sound and animation finish together.
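A simplified sketch of the display loop (the port name, window size, frame counts, and file names are illustrative assumptions):

```python
import pygame
import serial

ser = serial.Serial("COM3", 115200, timeout=0.1)   # port name is an assumption

pygame.init()
pygame.mixer.init()
screen = pygame.display.set_mode((480, 320))

NUM_FRAMES = {0: 4, 1: 2, 2: 6, 3: 6, 4: 8}   # frames per state (illustrative)
state, bg = 0, 0

while True:
    pygame.event.pump()                         # keep the window responsive
    line = ser.readline().decode(errors="ignore").strip()
    if not line:
        continue
    if line.startswith("s"):                    # e.g. "s3" -> switch to state 3
        state = int(line[1:])
        pygame.mixer.Sound(f"sound{state}.wav").play()
    elif line.startswith("b"):                  # e.g. "b1" -> switch to background 1
        bg = int(line[1:])
        continue

    # Play one animation cycle: background first, then the frame on top.
    background = pygame.image.load(f"bg{bg}.png")
    for frame in range(NUM_FRAMES.get(state, 1)):
        screen.fill((0, 0, 0))
        screen.blit(background, (0, 0))
        screen.blit(pygame.image.load(f"{state}{frame}.png"), (0, 0))
        pygame.display.flip()
        pygame.time.delay(500)                  # half-second per frame
```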
This project was tested extensively via repeated user interaction. Every input option was tested multiple times to verify that we received consistent results, using print statements in both the board code and the Python file. To ensure smooth incorporation into our main project file, the switch and touch slider files were written and tested separately. Within the Python file, each function (serial buffer, variable update, frame and sound update) was written and incorporated one at a time.
Testing the FSM and switch inputs involved using PRINTF statements and watching the Python terminal to see whether the correct states and variables were changing. Since we can send messages through the serial buffer and display them in the Python terminal, we used this method extensively to check state transitions and program flow. The onboard LEDs were also used to indicate whether we entered certain condition checks and registered the inputs. The touch slider was first implemented outside of the main project file and tested using the LCD; we set up the LCD screen to display each sampling output so that we could observe the slider's behavior in real time.
In the early stages of this project we focused on getting Pygame to display an image after reading data from the serial port, then updating it once another line was sent by the board. This was done using the "Hello World" SDK example on the board as a proof of concept. Once this functionality was deemed sufficient for our needs, we proceeded to write the rest of the C code for the board. Once we had the media files ready, testing was a matter of checking whether the Pygame window displayed the correct states. A good amount of time was also spent adjusting the delay and interrupt timing of both files to achieve smooth interaction between the board and Pygame: if we clogged the serial port with too many display instructions, the Python file could not keep up in real time, leading to a large disparity between user input and what was shown on the screen.
A variety of resources were used to complete this project. The serial communication examples were found outside of class, while the switch and touch slider modules were based on a combination of in-class examples and the online resources linked below. Visuals were made by editing and photobashing pre-existing sprites, and original sound effect files were created for this project.
Most code for the main file was written in person, pair-programming style. Switch integration was worked on together early on. Az292 worked on the touch slider input and interrupts, while tc575 worked on the Python file and images. Debugging and testing were done together in person, with files shared over GitHub. The website text was written together in a shared document before being placed into an HTML file.