The when and what of our project steps
During this first meeting, we added the EV3Dev extension to VS Code and tried to connect to Ghoul (our EV3 Brick). We were not able to connect to Ghoul over Bluetooth.
Still unable to connect over Bluetooth, we tried the other EV3 bricks and a variety of troubleshooting techniques. We also tried updating the bricks' firmware by flashing the latest image onto their SD cards. However, we were unable to get any EV3 to connect to VS Code.
Realizing that the EV3s would not work for this project, we began considering alternative systems, or even switching projects altogether. Deciding to stick with the project, we researched and compared different Arduino kits. While waiting on hardware, we briefly discussed the general structure of the code.
Having mostly hashed out the general logic behind the code, we created a Python prototype of the algorithm.
Settling on the Arduino Nano RP2040, we ordered the necessary parts, including infrared sensors for black-and-white line detection. Fortunately, another professor was happy to lend us an Arduino Nano 33 IoT with an attached chassis, motor driver and motors.
We started interacting with the Nano 33 IoT. We began by blinking an LED on the board and worked up to controlling the individual motors and connecting to the Nano via Bluetooth Low Energy. We then considered which pins were available for connecting the infrared sensors, the last remaining component to add.
We soldered a splitter wire that lets us power two infrared sensors simultaneously: it splits the VCC (+3 V) line to supply both sensors.
We attached the front wheel using our Mindstorms parts, removed the standoffs that prevented the chassis from moving, and secured the infrared sensors onto our Mindstorms parts. We connected the sensors to the Arduino and ran the test file, but we couldn't get the sensors working. We spent the rest of the meeting troubleshooting by changing pins and printing the readings from all of the Arduino's digital pins, but we still couldn't get the sensors to work.
We tested the IR sensors using an instrumentation board and found that the ones we had been using were unreliable, but two others were reliable, so we switched to those. We also discovered that the sensors are extremely direction-sensitive and work best when angled slightly downward and forward. We also began debugging the algorithm using the moderating test program.
We got the sensors working by securing them in a forward-facing position! We ran motor tests and calculated correctional constants, though we found that the robot's motion has a lot of variability. We suspect this variability comes from the heavy front metal ball, whose friction varies, so we designed and 3D printed a replacement part (it needs to be redesigned for next time because it's too bumpy). We worked on the program and got it to pass all of the test cases of the virtual version from 2021, showing that its logic is reliable! We also tested importing all of the libraries we need and running Bluetooth tests with the Arduino writing to the phone. Both were successful after some troubleshooting, and the program used 82% of the board's storage in total, so hopefully we'll be fine with the Nano's limited space.
We 3D printed and tweaked the measurements of the new plastic front wheel to make it compatible with the LEGO Mindstorms parts, thereby making it easily interchangeable with our current metal wheel. We glued the wheel and it seems to spin decently. If we decide that it has too much friction, we can try gluing the ball in place and dragging the plastic wheel, which should have low friction with the slippery painted wood surface of the board. We will test tomorrow when the lab is open again.
After a lot of troubleshooting, we got the logic running properly on the Arduino. NanoBot now correctly waits for scents sent over Bluetooth. Getting Bluetooth working was difficult because the board must maintain a connection for a long time, with both the user and the board reading from and writing to it. Once Bluetooth worked, we started correcting NanoBot's movement using the encoder library and using the infrared sensors to correct whenever we pass a white line. After a lot of debugging, the white-line correction became fairly reliable, though we may still want to tweak some constants. Using the encoder library with proportional control alone, we got pretty close to 90 degrees, but NanoBot couldn't reach full accuracy: when the angle is close to 90 degrees, the remaining error is small, so the correction from proportional control is very weak. Adding integral control gave us a fairly consistent angle; we just need to tweak the constants to hit 90 degrees.
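The effect described above can be seen in a toy simulation. This is not our Arduino code: the motor model, gains, and drag value are all invented for illustration. The point is that with a constant drag resisting the turn, proportional control alone settles short of the target, while an integral term keeps accumulating until the error is gone.

```python
def rotate_sim(kp, ki, target=90.0, drag=5.0, steps=2000, dt=0.02):
    """Simulate rotating toward `target` degrees against a constant drag.

    Toy model: turn rate = commanded power minus drag. All values here
    are made up for illustration, not measured from NanoBot.
    """
    angle, integral = 0.0, 0.0
    for _ in range(steps):
        error = target - angle
        integral += error * dt
        # Commanded power is kp*error + ki*integral; drag subtracts from it.
        angle += (kp * error + ki * integral - drag) * dt
    return angle

# Proportional only: stalls where kp*error equals the drag (5 degrees short).
p_only = rotate_sim(kp=1.0, ki=0.0)
# Proportional + integral: the integral term winds up and closes the gap.
with_pi = rotate_sim(kp=1.0, ki=0.5)
```

With these toy numbers, `p_only` settles near 85 degrees while `with_pi` converges to 90, which mirrors why we added the integral term.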
We changed the line following to approximate the angle based on the difference in time between when the first infrared sensor hits the line and when the second one does. Using the approximate angle error, it calls the rotate functions to correct it. We also adjusted the rotate functions to use a weighted integral control, which puts more weight on the error as time goes on, a change intended to speed up the angle correction. A smaller change, which introduced a couple of insidious bugs, was switching the 180-degree rotation to a single rotate(180) rather than rotating 90 degrees twice: each rotation carries some error, so one rotate(180) accumulates less total error than two rotate(90) calls. With all of these changes, NanoBot was able to solve many of the challenges (after a few tries each)!
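The timing idea above can be sketched with a little trigonometry. If the robot crosses a white line at an angle, the two IR sensors (mounted a known distance apart) trigger at slightly different times; the forward distance covered in that gap, over the sensor spacing, gives the tangent of the heading error. The spacing and speed below are assumed placeholder values, not measurements from NanoBot.

```python
import math

SENSOR_SPACING_CM = 5.0   # assumed lateral distance between the two IR sensors
SPEED_CM_PER_S = 20.0     # assumed forward speed while crossing the line

def heading_error_deg(t_first, t_second):
    """Estimate how far off-perpendicular the robot crossed the line.

    The robot travels forward during the gap between the two sensor
    triggers; that offset over the sensor spacing gives tan(angle).
    """
    forward_offset = SPEED_CM_PER_S * (t_second - t_first)
    return math.degrees(math.atan2(forward_offset, SENSOR_SPACING_CM))

# A perfectly square crossing (both sensors at once) gives zero error;
# a 0.1 s gap at these numbers gives atan(2/5), about 21.8 degrees.
square = heading_error_deg(0.0, 0.0)
skewed = heading_error_deg(0.0, 0.1)
```

A result like `skewed` would then be fed to the rotate functions with the sign chosen by which sensor fired first.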
We finished up the write-up website by adding a discussion, demo videos, and images, and by making other improvements.