Project Report – UWB Indoor Autonomous Navigation
One-paragraph summary
I built an indoor autonomous navigation system that uses dual UWB tags for position, fused odometry for heading, and a custom A* → spline → Pure-Pursuit pipeline to reach user-selected goals on a 2D floorplan. The system fetches tag positions from the Iiwari Cloud API, integrates a pre-built AI obstacle-detection model over a Nokia 5G link, and visualizes navigation in real time through a lightweight web interface. In live trials, the robot reliably reached its goals with ±0.25 cm accuracy and automatically paused and resumed when obstacles appeared.
Objective
- Evaluate how 10 Hz UWB positioning can be leveraged for indoor autonomous navigation.
- Show that a simple floorplan-based planner (no LiDAR SLAM) is sufficient for safe indoor autonomy.
- Integrate supervision via an external AI model for pause/resume obstacle handling.
- Demonstrate usability with a live visualizer and real-world demo videos.
What I built (my contributions)
Pose acquisition (via API):
- Queried Iiwari Cloud API for tag status at 10 Hz.
- Front + back UWB tags → midpoint = robot (x,y).
- Heading (yaw) fused from /odometry/filtered (robot_localization).
- An internal transform reconciled the map's Y-down image coordinates with the ROS yaw convention (a minimal sketch of the pose step follows this list).
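A minimal sketch of the pose step, assuming a simple polling client; the endpoint URL and JSON field names (`x`, `y`) are illustrative placeholders, not Iiwari's actual API:

```python
import math
import requests  # simple synchronous polling client; the real system may differ

# Illustrative endpoint, not Iiwari's actual API path.
API = "https://example.iiwari.cloud/v1/tags/{tag_id}/location"

def fetch_tag(tag_id):
    """Fetch one tag's (x, y); the JSON field names are assumptions."""
    data = requests.get(API.format(tag_id=tag_id), timeout=0.5).json()
    return data["x"], data["y"]

def robot_pose(front_id, back_id):
    """Midpoint of the front/back tags gives position. The tag baseline
    also yields a UWB-only yaw estimate, though in practice yaw is fused
    from /odometry/filtered (robot_localization)."""
    fx, fy = fetch_tag(front_id)
    bx, by = fetch_tag(back_id)
    x, y = (fx + bx) / 2.0, (fy + by) / 2.0
    # The floorplan image is Y-down while ROS yaw is measured CCW in a
    # Y-up frame, so negate the Y component before taking atan2.
    yaw = math.atan2(-(fy - by), fx - bx)
    return x, y, yaw
```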
Planner & controller:
- A* on an occupancy grid from a floorplan PNG (black = obstacle).
- Pruning + Catmull-Rom smoothing for compact, drivable paths.
- Pure-Pursuit controller with:
- Automatic initial orientation alignment.
- Curvature-aware dynamic lookahead (see the sketch after this list).
- Multi-stage docking for precise goal convergence.
- Re-plan when deviation exceeds threshold.
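A minimal sketch of the curvature-aware lookahead and the standard Pure-Pursuit steering law; the gain names and clamping bounds are illustrative assumptions, not the project's tuned values:

```python
import math

def dynamic_lookahead(path, idx, base=0.8, k_curv=0.6, min_ld=0.3, max_ld=1.5):
    """Shrink the lookahead on tight curves, grow it on straights.
    Curvature is approximated from the turn angle between consecutive
    path segments; all gains here are illustrative."""
    if 0 < idx < len(path) - 1:
        (x0, y0), (x1, y1), (x2, y2) = path[idx - 1], path[idx], path[idx + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        turn = abs(math.atan2(math.sin(a2 - a1), math.cos(a2 - a1)))  # wrapped to [0, pi]
    else:
        turn = 0.0
    return max(min_ld, min(max_ld, base - k_curv * turn))

def pure_pursuit_omega(pose, target, v):
    """Classic Pure-Pursuit: angular velocity from the chord to the
    lookahead target. pose = (x, y, yaw) in the map frame, target = (x, y)."""
    x, y, yaw = pose
    dx, dy = target[0] - x, target[1] - y
    alpha = math.atan2(dy, dx) - yaw       # bearing to target in the robot frame
    ld = math.hypot(dx, dy)                # chord length to the target
    if ld < 1e-6:
        return 0.0
    return v * 2.0 * math.sin(alpha) / ld  # omega = v * kappa, kappa = 2 sin(alpha) / ld
```

Each control cycle the controller picks the smoothed waypoint roughly one lookahead ahead of the robot and feeds it to `pure_pursuit_omega`; re-planning is triggered when the deviation from the path exceeds the threshold.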
Supervision & control:
- TCP server for pause/resume commands (used by the AI model; sketched after this list).
- Safe integration with a Nokia 5G link for reliable, low-latency commands.
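A minimal sketch of the pause/resume server, assuming a line-based protocol with literal `PAUSE`/`RESUME` commands; the port and command strings are illustrative, not the project's actual protocol:

```python
import socket
import threading

paused = threading.Event()  # the control loop checks paused.is_set() each cycle

def command_server(host="0.0.0.0", port=5005):
    """Line-based TCP server: one command per line, PAUSE or RESUME."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    while True:
        conn, _ = srv.accept()
        with conn:
            for line in conn.makefile():
                cmd = line.strip().upper()
                if cmd == "PAUSE":
                    paused.set()      # controller halts on its next cycle
                elif cmd == "RESUME":
                    paused.clear()    # controller resumes the current path

if __name__ == "__main__":
    # In the real system this would run in a background thread alongside
    # the control loop, e.g.:
    # threading.Thread(target=command_server, daemon=True).start()
    command_server()
```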
Visualization (Flask + Canvas):
- Live map with robot pose, path, and goal.
- Path rendering with glow effects, curvature-based line thickness, direction arrows, and motion trails.
- Telemetry (speed, heading, latency) for operator confidence; a minimal endpoint sketch follows below.
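A minimal sketch of how the Flask side might expose telemetry to the Canvas front end; the route and field names are illustrative assumptions:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Shared state written by the navigation loop; field names are illustrative.
state = {"x": 0.0, "y": 0.0, "yaw": 0.0, "speed": 0.0,
         "latency_ms": 0.0, "path": [], "goal": None}

@app.route("/telemetry")
def telemetry():
    # Polled by the Canvas front end (e.g. every 100 ms) to redraw the
    # robot pose, path, goal, and telemetry readouts.
    return jsonify(state)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```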
Demonstrations
Normal Navigation (no dynamic obstacles)
- Map included chairs (left) and stairs (right) as obstacles (black areas).
- A* automatically avoided them, producing a safe route.
- Robot smoothly followed the planned path and reached its goal.
Navigation with Obstacle Avoidance
- Same setup, but when a person walked in front of the robot, the external AI (via 5G) paused it immediately.
- Once clear, the robot resumed motion and reached the goal.
- Includes a screen recording of the web visualizer showing pose, path, and stop/start events.
Results
- Accuracy: Goal convergence within ±0.25 cm.
- Stability: Smooth control loop despite 10 Hz UWB updates.
- Reliability: Planner respected static obstacles; controller tracked paths tightly.
- Supervision: External AI pause/resume worked seamlessly.
- Usability: Web interface made navigation transparent and operator-friendly.
Why this matters (Impact)
- End-to-end autonomy: sensing (API), planning (A*), control (Pure-Pursuit), visualization, supervision.
- System integration: bridged ROS, cloud APIs, planning, control, 5G, and AI.
- Vendor showcase: Nokia 5G as a robust low-latency backbone.
- HR-ready story: systems thinking, safety, and user-facing tools.
Future Work
- More powerful A*: finer grids/higher compute for smoother paths.
- Improved heading estimation with higher-grade IMU.
- Precision docking for charging/tasks with cm-level repeatability.
- Adaptive obstacle handling: move from pause/resume to dynamic re-planning.
Portfolio one-liner
Built an indoor autonomy stack that combines UWB positioning (via cloud API), a custom A* → spline → Pure-Pursuit pipeline, and AI obstacle handling over Nokia 5G, achieving ±0.25 cm precision and delivering live web-based visualization.