Two years after Singapore, Team Minion arrived in Hawaii a different crew. The roster had turned over almost entirely — graduated seniors replaced by a new generation of ERAU engineers hungry for their first competition experience. But the institutional knowledge they inherited, encoded in design documents, software commits, and frank post-mortems, gave them a running start that the 2014 team never had.
The 2016 Maritime RobotX Challenge was held in December at the Hawaii Kai Marine area on Oahu — a protected inlet surrounded by volcanic ridgelines, with conditions that shifted from glassy calm to gusty whitecaps within the span of a single morning.
Rebuilding for 2016
The months between Singapore and Hawaii were not idle ones. The team’s approach to the 2016 campaign was methodical: audit every subsystem, identify the failure modes from 2014, and redesign accordingly.
Software Architecture Overhaul
The original autonomy stack had grown organically — a characteristic of first-iteration systems everywhere. By 2016 the team had reorganized the codebase around a cleaner Robot Operating System (ROS) node graph, with explicit separation between perception, planning, and control layers. This made debugging during on-water tests dramatically faster: engineers could isolate a misbehaving node without restarting the entire system.
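To make the layering concrete, here is a minimal sketch of what one node in such a graph can look like, written against the ROS 1 rospy API of that era. The topic names, message types, and 10 Hz replan rate are illustrative assumptions, not the team’s actual interface.

```python
#!/usr/bin/env python
# Sketch of a planning-layer node in a layered ROS graph.
# Topic names and message types are hypothetical examples.
import rospy
from geometry_msgs.msg import PoseStamped, PoseArray

class PlannerNode(object):
    def __init__(self):
        # Perception publishes detected obstacles; this layer never
        # touches raw sensor data directly.
        rospy.Subscriber("/perception/obstacles", PoseArray, self.on_obstacles)
        # Control consumes only a goal pose; it knows nothing about planning.
        self.goal_pub = rospy.Publisher("/control/goal_pose", PoseStamped,
                                        queue_size=1)
        self.obstacles = []

    def on_obstacles(self, msg):
        self.obstacles = msg.poses  # cache the latest perception snapshot

    def plan_step(self, event):
        goal = PoseStamped()
        goal.header.stamp = rospy.Time.now()
        goal.header.frame_id = "map"
        # ... path planning against self.obstacles would go here ...
        self.goal_pub.publish(goal)

if __name__ == "__main__":
    rospy.init_node("planner")
    node = PlannerNode()
    rospy.Timer(rospy.Duration(0.1), node.plan_step)  # replan at 10 Hz
    rospy.spin()
```

Because each layer communicates only over topics, a misbehaving node can be killed and relaunched in isolation while the rest of the graph keeps running, which is exactly what made on-water debugging faster.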
State machine logic for task execution was rebuilt from scratch using a behavior tree framework. Instead of hardcoded conditional chains, task behaviors were composable, testable in simulation, and recoverable from partial failure states — exactly the property that had been missing when the docking task fell apart in Singapore.
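The post doesn’t name the specific framework the team adopted, so the sketch below hand-rolls the core behavior tree pattern rather than reproducing their code: leaves are small, individually testable behaviors, and a sequence node that fails mid-task resets into a retryable state instead of wedging the whole system.

```python
# Minimal behavior tree: the pattern, not the team's implementation.
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3

class Sequence:
    """Ticks children in order; remembers its place across RUNNING
    ticks and resets to a retryable state on failure."""
    def __init__(self, *children):
        self.children = children
        self.current = 0

    def tick(self):
        while self.current < len(self.children):
            status = self.children[self.current].tick()
            if status == Status.RUNNING:
                return Status.RUNNING      # resume here on the next tick
            if status == Status.FAILURE:
                self.current = 0           # recoverable: retry from the top
                return Status.FAILURE
            self.current += 1              # SUCCESS: advance to next child
        self.current = 0
        return Status.SUCCESS

class Action:
    """Leaf node wrapping one small, testable behavior."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self):
        return self.fn()

# Hypothetical docking task composed from leaves; names are illustrative.
dock = Sequence(
    Action("find_dock",    lambda: Status.SUCCESS),
    Action("approach",     lambda: Status.SUCCESS),
    Action("hold_station", lambda: Status.SUCCESS),
    Action("exit_dock",    lambda: Status.SUCCESS),
)
print(dock.tick())  # Status.SUCCESS
```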
Sensor Improvements
The camera suite was upgraded to a wider field-of-view stereo pair with hardware-synchronized exposure control, eliminating the motion blur artifacts that had degraded buoy detection at speed. The LIDAR was repositioned on the mast to reduce self-occlusion from the hull structure — a small change that meaningfully expanded the vessel’s obstacle awareness envelope.
Detailed hardware decisions are documented in our boat overview, which covers sensor selection and placement rationale across all platform generations.
The Hawaii Competition Environment
Hawaii presented environmental challenges that Florida test ponds simply can’t replicate. Even inside a sheltered inlet, ocean swell introduces low-frequency motion that no amount of flat-water testing can prepare a platform for. The vessel’s IMU calibration routines had to be re-run each morning, because thermal expansion shifted the sensor biases over the course of the day.
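A morning bias calibration can be as simple as averaging a stationary capture, as in the sketch below; the stillness threshold and capture length are invented placeholders, not the team’s documented procedure.

```python
import numpy as np

def estimate_gyro_bias(samples, still_threshold=0.02):
    """Estimate per-axis gyro bias from a stationary capture.

    samples: (N, 3) array of rad/s readings taken with the vessel moored.
    Rejects the capture if residual motion (swell, wake) is too large.
    The threshold is an illustrative placeholder, not a tuned value.
    """
    samples = np.asarray(samples)
    if samples.std(axis=0).max() > still_threshold:
        raise ValueError("vessel not still enough for bias calibration")
    return samples.mean(axis=0)

# Morning routine: capture ~30 s of data at the dock, then subtract the
# returned bias from live readings for the rest of the session.
```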
Wind was the team’s persistent adversary. The volcanic terrain that makes Hawaii so striking also creates localized gusts that funnel through gaps in the ridgeline without warning. On two separate test runs the vessel was pushed off course mid-task by a 15-knot puff that the wind model hadn’t anticipated.
The team’s response was pragmatic: rather than try to fix the wind model days before competition, they tightened the station-keeping controller gains, accepting higher thruster activity in exchange for a firmer position hold. It worked.
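The trade-off is easy to see in a toy proportional-derivative position-hold law; the gain values below are placeholders for illustration, not the numbers actually flown in Hawaii.

```python
import numpy as np

def station_keeping_force(pos_error, vel, kp, kd):
    """Toy PD position-hold law: thrust command from position error.

    Raising kp stiffens the hold against gusts at the cost of more
    thruster activity; kd damps the resulting oscillation.
    """
    return kp * pos_error - kd * vel

# Illustrative "before" and "after" gain sets, not the flown values:
soft  = dict(kp=40.0, kd=25.0)
stiff = dict(kp=90.0, kd=55.0)

error = np.array([1.5, -0.4])  # meters off the hold point
vel   = np.array([0.2, 0.0])   # m/s drift induced by a gust
print(station_keeping_force(error, vel, **soft))   # gentler correction
print(station_keeping_force(error, vel, **stiff))  # harder, holds tighter
```

Doubling the proportional gain roughly doubles the corrective thrust for the same position error, which is exactly the extra thruster activity the team accepted.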
Task-by-Task Performance
Navigation channel: Minion cleared the buoy-gate course cleanly on the first qualifying attempt. The upgraded camera processing pipeline identified gate markers at longer range, giving the path planner more time to compute smooth approach trajectories. This was a clean win and a confidence builder for the team.
Obstacle field: The obstacle avoidance run was the most visually dramatic event of the competition week. Minion detected and avoided three dynamic obstacles — floating spheres tethered to slow-moving lines — using the LIDAR. One close pass had spectators gasping, but the vessel never made contact.
Docking task: The rebuilt station-keeping controller delivered. Minion entered the dock, held position for the required dwell time, and exited cleanly. The team erupted. Watching a system they had painstakingly tuned perform exactly as designed in a real-world competitive environment is a feeling that’s difficult to describe to anyone who hasn’t lived it.
Underwater acoustic pinger: The hydrophone array again demonstrated its value. Time-difference-of-arrival (TDOA) localization brought the vessel to within two meters of the pinger, close enough to trigger the proximity scoring threshold. This placed Team Minion among a small number of teams who successfully completed the acoustic task; a sketch of the TDOA math follows the task rundown.
Light buoy identification: The vision pipeline correctly classified the illuminated sequence on three of four attempts. The fourth failure was traced to direct solar glare overexposing the camera sensor; a hardware fix (a polarizing filter) was noted for future seasons. A sketch of the color-classification idea also appears below.
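For readers unfamiliar with the acoustic task, here is a toy version of TDOA multilateration: given hydrophone positions and arrival-time differences, a nonlinear least-squares solve recovers the pinger position. The array geometry, solver choice, and sound-speed constant are illustrative assumptions; the team’s actual pipeline isn’t documented in this post.

```python
import numpy as np
from scipy.optimize import least_squares

SOUND_SPEED = 1500.0  # m/s in seawater, approximate

def locate_pinger(hydrophones, tdoas, guess):
    """2-D pinger fix from time differences of arrival.

    hydrophones: (M, 2) sensor positions in meters (vessel frame).
    tdoas: (M-1,) arrival-time differences relative to hydrophone 0,
    in seconds. Geometry and solver here are illustrative only.
    """
    def residuals(p):
        ranges = np.linalg.norm(hydrophones - p, axis=1)
        return (ranges[1:] - ranges[0]) / SOUND_SPEED - tdoas

    return least_squares(residuals, guess).x

# Synthetic check: hull-scale array, pinger well ahead of the bow.
hp = np.array([[-1.5, 0.0], [1.5, 0.0], [-1.5, 2.0], [1.5, 2.0]])
pinger = np.array([10.0, 40.0])
ranges = np.linalg.norm(hp - pinger, axis=1)
tdoas = (ranges[1:] - ranges[0]) / SOUND_SPEED
print(locate_pinger(hp, tdoas, guess=np.array([0.0, 10.0])))  # ~[10, 40]
```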
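And for the light buoy, a minimal color-sequence classifier might look like the following. The HSV thresholds are invented for illustration (red’s hue wrap-around is ignored for brevity); note that glare-induced overexposure drives saturation toward zero, so an overexposed frame yields no classification at all, consistent with the failure the team observed.

```python
import cv2
import numpy as np

# Illustrative HSV bands; real thresholds would be tuned on-site.
COLOR_BANDS = {
    "red":   ((0, 120, 120), (10, 255, 255)),
    "green": ((45, 120, 120), (75, 255, 255)),
    "blue":  ((100, 120, 120), (130, 255, 255)),
}

def classify_light(frame_bgr):
    """Return the dominant lit color in a frame, or None if unlit/overexposed."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    counts = {name: int(cv2.inRange(hsv, np.array(lo), np.array(hi)).sum())
              for name, (lo, hi) in COLOR_BANDS.items()}
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else None

def decode_sequence(per_frame_labels):
    """Collapse consecutive duplicate frames into the flashed sequence."""
    seq = []
    for label in per_frame_labels:
        if label and (not seq or seq[-1] != label):
            seq.append(label)
    return seq  # e.g. ["red", "green", "blue"]
```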
People Make the Platform
Any honest competition recap has to acknowledge the human side of the story. The 2016 team started the season younger and less experienced than the 2014 crew had been. By December in Hawaii they had become something else: a tight, confident group of engineers who trusted each other under pressure.
The team captain that season coordinated across six technical subteams, managed the shipping logistics for a vessel crossing the Pacific, and still found time to personally rewrite the collision avoidance planner the week before the event when simulations revealed an edge-case failure. That kind of ownership is what separates competitive teams from also-rans.
What Hawaii Proved
The 2016 season validated the team’s rebuild strategy: systematic post-mortems, disciplined software architecture, and genuine hardware improvements translate directly into on-water performance. It also proved that the Team Minion culture (collaborative, technically rigorous, student-led) could survive a complete roster turnover and emerge stronger.
Hawaii 2016 wasn’t the end of the story. It was the moment the team understood that they could contend.