2017 Day 4

Written by James Irwin. Posted in Competition

This morning we downloaded the newly trained network weights for our vision processing, and they appear to perform very well. We had our practice run in the afternoon, about an hour and a half before our first semifinal run. The practice run didn't go as smoothly as we had hoped due to a lack of coordination among the software team, but after working out those bugs we successfully hit a buoy in the dolphin pool.


Following this testing and some dry runs of our complete AI system, we felt ready and confident for our semifinal run. We performed the run with no major software or hardware issues and traveled through the gate on 3 of our 4 attempts. However, it appeared that our AI was not transitioning to searching for the buoys as it should have.

After our semifinal run we looked more closely at our log files and saw that our individual AI tasks (except for the first) all timed out without producing any output. We were puzzled at first, because we had just run the same software five minutes earlier without any issues. After some thought, we realized the key variable that had changed between our tests in the dolphin pool and our semifinal run: we had to run completely untethered during the semifinal. The way we had ROS (our communication framework) configured, it relied on DNS resolution between our various computers, including our NUC (the primary computer) and our Jetson (the vision processing computer). Our router had always performed this DNS resolution, and this was actually the first time since installing the Jetson that we had run our software without the tether (and therefore the router) attached, so we never noticed the lurking problem. The NUC couldn't send camera images to the Jetson, and likewise the Jetson couldn't send any vision processing results back to the AI running on the NUC. Without any information to operate on, the AI sat idle. Once we realized the problem, the fix was quickly implemented and verified.
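For anyone who hits the same ROS networking pitfall: the usual remedies are adding static /etc/hosts entries on each machine, or setting ROS_IP and ROS_MASTER_URI to raw IP addresses so no DNS lookup is needed at all. Below is a minimal sketch of a pre-dive check that would have caught the problem on the dock instead of mid-run; it uses only the Python standard library, and the hostnames "nuc" and "jetson" and the master address are hypothetical placeholders rather than our actual configuration.

```python
#!/usr/bin/env python
"""Pre-dive network check: make sure every machine in the ROS graph still
resolves and the master is reachable once the tether (and shore router)
is unplugged.

The hostnames and master address are hypothetical placeholders -- substitute
whatever appears in your ROS_MASTER_URI and launch files.
"""
import socket
import sys

PEERS = ["nuc", "jetson"]    # machines that exchange ROS topics
MASTER = ("nuc", 11311)      # roscore listens on port 11311 by default

failed = False

# 1. Every peer hostname must resolve without the router's DNS server.
for host in PEERS:
    try:
        addr = socket.gethostbyname(host)
        print("OK   %-8s resolves to %s" % (host, addr))
    except socket.gaierror as err:
        print("FAIL %-8s does not resolve (%s)" % (host, err))
        failed = True

# 2. The ROS master must actually accept TCP connections.
try:
    sock = socket.create_connection(MASTER, timeout=2.0)
    sock.close()
    print("OK   roscore reachable at %s:%d" % MASTER)
except (socket.error, socket.gaierror) as err:
    print("FAIL cannot reach roscore at %s:%d (%s)" % (MASTER + (err,)))
    failed = True

sys.exit(1 if failed else 0)
```

Running something like this before enabling the rest of the AI would turn a silent mid-run timeout into an obvious error message before the sub ever leaves the dock.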

Unfortunately, later in the evening we accidentally plugged the battery into the sub backwards, which damaged our power control electronics. While the sub can still turn on, the thrusters cannot be enabled. I made the same mistake back in June, and luckily the fix was fairly straightforward: we just needed to replace a single MOSFET and a diode. We're pretty confident the same parts fried again, so we'll be stopping at a local electronics shop tomorrow morning for replacement parts. Shortly after fixing it the last time, I designed some protection circuitry to prevent damage should things get plugged in backwards again, but I never implemented it. I'm definitely regretting that now. We expect to have the power control electronics fixed before our second semifinal run tomorrow (Saturday) at 3:45pm, though we may not get as much testing time as we would have liked. That's the way things tend to go at this competition; it's rare for any team not to encounter an unexpected setback like this at least once during the week.

Our goal for tomorrow is to pass through the start gate and hit at least one buoy, and ideally all of them!