Close-Range Moving Object Tracking

Goal
Develop a system that determines the absolute position and heading of boats near our autonomous sailboat in a global coordinate system.
Motivation
To safely navigate and avoid collisions, we need to track the positions and movement directions of surrounding boats.
Available Resources
- Camera images with object bounding boxes generated by a fine-tuned YOLO model
- GPS
- Compass
- IMU
Requirements
- Continuously estimate the absolute position of nearby boats, ideally as a probability distribution around the most likely position (a minimal projection sketch follows this list).
- Determine the most likely heading of each detected boat. This could require an additional vision-based model trained to estimate direction based on bounding box content.
- Ensure the system performs reliably even in complex scenarios, e.g. when several boats are moving and crossing paths at the same time.
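
A minimal sketch of how one detection could be turned into a rough absolute position with an uncertainty estimate, assuming a pinhole camera model. The field of view, image width, assumed hull height, and noise figures below are placeholder assumptions, not calibrated values.

```python
# Minimal sketch: project one YOLO detection into a rough position relative
# to our boat, plus a coarse covariance. All constants are assumptions.
import numpy as np

HFOV_DEG = 60.0               # assumed horizontal field of view of the camera
IMG_WIDTH = 1920              # assumed image width in pixels
ASSUMED_HULL_HEIGHT_M = 2.0   # assumed hull height, used for a crude range estimate
FOCAL_PX = (IMG_WIDTH / 2) / np.tan(np.radians(HFOV_DEG / 2))

def detection_to_local_xy(bbox, own_heading_deg):
    """Return the (east, north) offset of a detection relative to our boat.

    bbox = (x_min, y_min, x_max, y_max) in pixels; own_heading_deg is our
    compass heading (0 deg = north, clockwise positive).
    """
    cx = 0.5 * (bbox[0] + bbox[2])
    box_h = bbox[3] - bbox[1]
    # Bearing of the detection relative to the camera's optical axis.
    rel_bearing_deg = np.degrees(np.arctan2(cx - IMG_WIDTH / 2, FOCAL_PX))
    # Crude pinhole range estimate from the apparent hull height.
    range_m = ASSUMED_HULL_HEIGHT_M * FOCAL_PX / max(box_h, 1.0)
    bearing = np.radians(own_heading_deg + rel_bearing_deg)
    return np.array([range_m * np.sin(bearing), range_m * np.cos(bearing)])

def detection_covariance(range_m):
    """Coarse 2x2 covariance in a (range, cross-range) frame aligned with the
    bearing; it still has to be rotated into east/north before fusion."""
    sigma_range = 0.15 * range_m             # assumed ~15 % relative range error
    sigma_cross = range_m * np.radians(3.0)  # assumed ~3 deg bearing/compass error
    return np.diag([sigma_range**2, sigma_cross**2])
```

Adding this local offset to our own GPS fix, converted into the same metric frame, yields the absolute position required above; the covariance expresses the growing uncertainty of far-away detections.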
Challenges & Considerations
- Tracking Over Time: Since object detection provides only frame-by-frame information, a suitable tracking method needs to be implemented. The task includes evaluating different approaches for associating detections over time and producing smooth trajectory estimates; a minimal association sketch follows this list.
- Uncertainty Handling: Sensor inaccuracies (GPS drift, compass errors, camera distortions) need to be accounted for when estimating positions and headings.
- Heading Estimation: No labeled heading data is available, so bounding box regions must be extracted from the full images (see the cropping sketch after this list) and suitable datasets or models must be found or created.
- Occlusions: Boats may partially or fully block each other in the camera view. A strategy for handling occlusions needs to be developed; one simple track-coasting option is sketched after this list.
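
One candidate for the tracking challenge is a SORT-style pipeline: a constant-velocity Kalman filter per boat plus Hungarian assignment between predicted track positions and new detections. This is only a sketch, assuming detections have already been projected into a global metric (x, y) frame; DT, GATE_M, and the noise matrices are placeholder values.

```python
# Minimal SORT-style sketch: constant-velocity Kalman tracks plus Hungarian
# matching of predicted track positions to new detections. Assumes detections
# are already (x, y) points in a global metric frame; constants are assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment

DT = 0.5        # assumed time between frames, seconds
GATE_M = 15.0   # assumed maximum matching distance, metres

class Track:
    def __init__(self, xy):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])   # state: [x, y, vx, vy]
        self.P = np.diag([4.0, 4.0, 2.0, 2.0])        # initial uncertainty (assumed)
        self.misses = 0                               # frames without a matched detection

    def predict(self):
        F = np.array([[1, 0, DT, 0],
                      [0, 1, 0, DT],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]])
        Q = np.diag([0.5, 0.5, 1.0, 1.0])             # process noise (assumed)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, R):
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0]])                  # we observe position only
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(4) - K @ H) @ self.P

def associate(tracks, detections):
    """Hungarian matching on predicted-position distance, gated by GATE_M."""
    if not tracks or not detections:
        return []
    cost = np.array([[np.linalg.norm(t.x[:2] - np.asarray(d)) for d in detections]
                     for t in tracks])
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < GATE_M]
```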
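
Building directly on the previous sketch (it reuses Track, associate, and numpy), one simple occlusion strategy is to let an unmatched track coast on its prediction for a limited number of frames before dropping it, so a boat that is temporarily hidden behind another keeps an estimate; MAX_MISSES and the measurement noise are again assumptions.

```python
# One simple occlusion strategy, continuing the previous sketch: unmatched
# tracks coast on their prediction for up to MAX_MISSES frames before removal.
MAX_MISSES = 10   # assumed number of frames a hidden boat is kept alive

def step(tracks, detections):
    """Advance all tracks by one frame given the new detections."""
    for t in tracks:
        t.predict()
    matches = associate(tracks, detections)
    matched_tracks = {r for r, _ in matches}
    matched_dets = {c for _, c in matches}
    for r, c in matches:
        tracks[r].update(np.asarray(detections[c]), R=np.diag([4.0, 4.0]))
        tracks[r].misses = 0
    for i, t in enumerate(tracks):
        if i not in matched_tracks:
            t.misses += 1                              # occluded or missed: coast
    tracks = [t for t in tracks if t.misses <= MAX_MISSES]
    tracks += [Track(np.asarray(detections[c]))        # start tracks for new boats
               for c in range(len(detections)) if c not in matched_dets]
    return tracks
```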
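
For the heading challenge, a practical first step is extracting padded crops of each detected boat from the full frames so that candidate direction-estimation models or datasets can be evaluated on them. The padding ratio and output size below are assumptions, and OpenCV is used only as one convenient option.

```python
# Minimal sketch: cut a padded, resized crop out of the full frame for each
# bounding box, as input for a later heading-estimation model. Constants are
# assumptions.
import cv2

PAD = 0.1              # assumed extra padding, as a fraction of the box size
OUT_SIZE = (224, 224)  # assumed crop size expected by a vision model

def crop_detection(frame, bbox):
    """Return a padded, resized crop for one (x_min, y_min, x_max, y_max) box."""
    h, w = frame.shape[:2]
    x0, y0, x1, y1 = bbox
    pad_x, pad_y = PAD * (x1 - x0), PAD * (y1 - y0)
    x0 = max(int(x0 - pad_x), 0)
    y0 = max(int(y0 - pad_y), 0)
    x1 = min(int(x1 + pad_x), w)
    y1 = min(int(y1 + pad_y), h)
    return cv2.resize(frame[y0:y1, x0:x1], OUT_SIZE)
```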
Simplifications
- All boats are assumed to be moving, meaning we do not yet classify them as anchored or stationary.
- Handling motion state estimation (e.g., detecting whether a boat is moving or stopped) is deferred to future work.