Remove UAV frame material documentation and update README with detailed project requirements. Refactor skills documentation to clarify modes of operation and enhance input specifications. Delete unused E2E test infrastructure template.

Oleksandr Bezdieniezhnykh
2026-03-18 16:40:50 +02:00
parent 3ab47526bd
commit d969bec3b6
27 changed files with 2551 additions and 203 deletions
@@ -0,0 +1,73 @@
# Question Decomposition
## Original Question
"Analyze completeness of the current solution. How mature is it?"
## Active Mode
**Mode B: Solution Assessment** — 5 solution drafts exist (`solution_draft01.md` through `solution_draft05.md`). Assessing the latest draft (05) for completeness and maturity.
## Question Type Classification
**Knowledge Organization + Problem Diagnosis**
- Knowledge Organization: systematically map what a complete GPS-denied nav system requires vs what is present
- Problem Diagnosis: identify gaps, weak points, and missing elements that reduce maturity
## Research Subject Boundary Definition
| Dimension | Boundary |
|-----------|----------|
| **Population** | Fixed-wing UAV GPS-denied visual navigation systems using visual odometry + satellite matching + IMU fusion |
| **Geography** | Eastern/southern Ukraine conflict zone operations |
| **Timeframe** | Current state of art (2024-2026), focusing on Jetson-class embedded deployment |
| **Level** | System-level architecture completeness — from sensor input to flight controller output |
## Problem Context Summary
The solution is a real-time GPS-denied visual navigation system for a custom 3.5m fixed-wing UAV:
- **Hardware**: Jetson Orin Nano Super (8GB), ADTI 20L V1 camera (0.7fps), Viewpro A40 Pro gimbal, Pixhawk 6x
- **Core pipeline**: cuVSLAM VO (0.7fps) → ESKF fusion → GPS_INPUT via pymavlink at 5-10Hz
- **Satellite correction**: LiteSAM/EfficientLoFTR/XFeat TRT FP16 on keyframes, async Stream B
- **5 draft iterations**: progressed from initial architecture → TRT migration → camera rate correction + UAV platform specs
- **Supporting docs**: tech_stack.md, security_analysis.md
## Decomposed Sub-Questions (Mode B)
### Functional Completeness
- **SQ-1**: What components does a mature GPS-denied visual navigation system require that are missing or under-specified in the current draft?
- **SQ-2**: How complete is the ESKF sensor fusion specification? (state vector, process model, measurement models, Q/R tuning, observability analysis)
- **SQ-3**: How does the system handle disconnected route segments (sharp turns with no overlap)? Is this adequately specified?
- **SQ-4**: What coordinate system transformations are needed (camera → body → NED → WGS84) and are they specified?
- **SQ-5**: How does the system handle initial localization (first frame + satellite matching bootstrap)?
- **SQ-6**: Is the re-localization request workflow (to ground station) sufficiently defined?
- **SQ-7**: How complete is the offline tile preparation pipeline (zoom levels, storage requirements, coverage calculation)?
- **SQ-8**: Is the object localization component sufficiently specified for operational use?
### Performance & Robustness
- **SQ-9**: What are the realistic drift characteristics of cuVSLAM at 0.7fps over long straight segments?
- **SQ-10**: How robust is satellite matching with Google Maps imagery in the operational area?
- **SQ-11**: What happens during extended periods with no satellite match (cloud cover on tiles, homogeneous terrain)?
- **SQ-12**: Is the 5-10Hz GPS_INPUT rate adequate for the flight controller's EKF?
### Maturity Assessment
- **SQ-13**: What is the Technology Readiness Level (TRL) of each component?
- **SQ-14**: What validation/testing has been done vs what is only planned?
- **SQ-15**: What operational procedures are missing (pre-flight checklist, in-flight monitoring, post-flight analysis)?
- **SQ-16**: Are there any inconsistencies between documents (tech_stack.md, security_analysis.md, solution_draft05.md)?
### Security
- **SQ-17**: Are there security gaps not covered by the existing security_analysis.md?
- **SQ-18**: How does the MAVLink GPS_INPUT message security work (spoofing of the GPS replacement itself)?
## Timeliness Sensitivity Assessment
- **Research Topic**: GPS-denied visual navigation system completeness and maturity
- **Sensitivity Level**: 🟡 Medium
- **Rationale**: Core algorithms (VO, ESKF, feature matching) are well-established. The hardware (Jetson Orin Nano Super) is relatively new but stable. cuVSLAM library updates arrive at a moderate pace. There are no rapidly changing AI/LLM dependencies.
- **Source Time Window**: 1-2 years
- **Priority official sources to consult**:
1. NVIDIA cuVSLAM / Isaac ROS documentation
2. PX4/ArduPilot MAVLink GPS_INPUT documentation
3. LiteSAM / EfficientLoFTR papers and repos
- **Key version information to verify**:
- cuVSLAM: PyCuVSLAM v15.0.0
- TensorRT: 10.3.0
- JetPack: 6.2.2
@@ -0,0 +1,166 @@
# Source Registry
## Source #1
- **Title**: ArduPilot AP_GPS_Params — GPS_RATE minimum 5Hz
- **Link**: https://github.com/ArduPilot/ardupilot/pull/15980
- **Tier**: L1
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: ArduPilot flight controller users
- **Research Boundary Match**: ✅ Full match
- **Summary**: ArduPilot enforces minimum 5Hz GPS update rate. GPS_RATE parameter description: "Lowering below 5Hz(default) is not allowed."
- **Related Sub-question**: SQ-12
## Source #2
- **Title**: MAVLink GPS_INPUT Message Definition
- **Link**: https://ardupilot.org/mavproxy/docs/modules/GPSInput.html
- **Tier**: L1
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: MAVLink developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: GPS_INPUT requires: lat, lon, alt, fix_type, hdop, vdop, horiz_accuracy, vert_accuracy, speed_accuracy, vn, ve, vd, time_usec, time_week, time_week_ms, satellites_visible, gps_id, ignore_flags. GPS_TYPE=14 for MAVLink GPS.
- **Related Sub-question**: SQ-6, SQ-12
## Source #3
- **Title**: pymavlink GPS_INPUT example (GPS_INPUT_pymavlink.py)
- **Link**: https://webperso.ensta.fr/lebars/Share/GPS_INPUT_pymavlink.py
- **Tier**: L3
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: pymavlink developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Working pymavlink example sending GPS_INPUT over serial at 10Hz with GPS time calculation from system time.
- **Related Sub-question**: SQ-6
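The GPS time calculation this example performs (system UTC → GPS week number + milliseconds into the week) can be sketched as follows; the 18 s GPS-UTC leap-second offset is an assumption that must be re-verified at deployment time:

```python
from datetime import datetime, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)
GPS_UTC_LEAP_SECONDS = 18  # GPS is ahead of UTC; correct since 2017, verify before use

def gps_time_from_system(now=None):
    """Convert system UTC time to (GPS week number, milliseconds into the week)."""
    now = now or datetime.now(timezone.utc)
    gps_seconds = (now - GPS_EPOCH).total_seconds() + GPS_UTC_LEAP_SECONDS
    week = int(gps_seconds // 604800)                # 7 * 24 * 3600 s per week
    time_week_ms = int((gps_seconds % 604800) * 1000)
    return week, time_week_ms
```

These two values feed the `time_week` / `time_week_ms` fields of GPS_INPUT directly.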
## Source #4
- **Title**: PyCuVSLAM API Reference (v15.0.0)
- **Link**: https://wiki.seeedstudio.com/pycuvslam_recomputer_robotics/
- **Tier**: L2
- **Publication Date**: 2026-03
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: cuVSLAM developers on Jetson
- **Research Boundary Match**: ✅ Full match
- **Summary**: cuVSLAM supports mono/stereo/inertial modes. Requires Camera model (fx,fy,cx,cy,distortion), ImuCalibration (noise density, random walk, frequency, T_imu_rig). Modes: Performance/Precision/Moderate. IMU fallback ~1s acceptable quality.
- **Related Sub-question**: SQ-1, SQ-5, SQ-9
## Source #5
- **Title**: ESKF Python implementation for fixed-wing UAV
- **Link**: https://github.com/ludvigls/ESKF
- **Tier**: L4
- **Publication Date**: 2023
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: ESKF implementers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Reference ESKF: 16-state vector (pos[3], vel[3], quat[4], acc_bias[3], gyro_bias[3]). Prediction with IMU at high rate. Update with GPS position/velocity. Tuning parameters: Q (process noise), R (measurement noise).
- **Related Sub-question**: SQ-2
## Source #6
- **Title**: ROS ESKF based on PX4/ecl — multi-sensor fusion
- **Link**: https://github.com/EliaTarasov/ESKF
- **Tier**: L4
- **Publication Date**: 2022
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: UAV ESKF implementers
- **Research Boundary Match**: ✅ Full match
- **Summary**: ESKF fusing GPS, Magnetometer, Vision Pose, Optical Flow, RangeFinder with IMU. Shows that vision pose and optical flow are separate measurement models, each with its own observation matrix and noise parameters.
- **Related Sub-question**: SQ-2
## Source #7
- **Title**: Visual-Inertial Odometry Scale Observability (Range-VIO)
- **Link**: https://arxiv.org/abs/2103.15215
- **Tier**: L1
- **Publication Date**: 2021
- **Timeliness Status**: ✅ Currently valid (fundamental research)
- **Target Audience**: VIO researchers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Monocular VIO cannot observe metric scale without accelerometer excitation (not constant velocity). A 1D range sensor makes scale observable. For our case, barometric altitude + known flight altitude provides this constraint.
- **Related Sub-question**: SQ-2, SQ-4
## Source #8
- **Title**: NaviLoc: Trajectory-Level Visual Localization for GNSS-Denied UAVs
- **Link**: https://www.mdpi.com/2504-446X/10/2/97
- **Tier**: L1
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: GPS-denied UAV navigation researchers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Trajectory-level optimization fusing VPR with VIO achieves 19.5m mean error at 50-150m altitude. Key insight: treating satellite matching as noisy measurement rather than ground truth, with trajectory-level optimization. Runs at 9 FPS on RPi 5.
- **Related Sub-question**: SQ-3, SQ-13
## Source #9
- **Title**: SatLoc-Fusion: Hierarchical Adaptive Fusion Framework
- **Link**: https://www.scilit.com/publications/e5cafaf875a49297a62b298a89d5572f
- **Tier**: L1
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: GPS-denied UAV researchers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Three-layer fusion: absolute geo-localization (DinoV2), relative VO (XFeat), optical flow velocity. Adaptive weighting based on confidence. Achieves <15m error, >90% trajectory coverage. 2Hz on 6 TFLOPS edge.
- **Related Sub-question**: SQ-3, SQ-10, SQ-13
## Source #10
- **Title**: Auterion GPS-Denied Workflow
- **Link**: https://docs.auterion.com/vehicle-operation/auterion-mission-control/useful-resources/operations/gps-denied-workflow
- **Tier**: L2
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: UAV operators
- **Research Boundary Match**: ⚠️ Partial overlap (multirotor focus, but procedures applicable)
- **Summary**: Pre-flight: manually set home position, reset heading/position, configure wind. In-flight: enable INS mode. Defines operational procedures for GPS-denied missions.
- **Related Sub-question**: SQ-15
## Source #11
- **Title**: PX4 GNSS-Degraded & Denied Flight (Dead-Reckoning)
- **Link**: https://docs.px4.io/main/en/advanced_config/gnss_degraded_or_denied_flight.html
- **Tier**: L1
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: PX4 users
- **Research Boundary Match**: ⚠️ Partial overlap (PX4-specific, but concepts apply to ArduPilot)
- **Summary**: GPS-denied requires redundant position/velocity sensors. Dead-reckoning mode for intermittent GNSS loss. Defines failsafe behaviors when GPS is lost.
- **Related Sub-question**: SQ-15
## Source #12
- **Title**: Google Maps Ukraine satellite imagery coverage
- **Link**: https://newsukraine.rbc.ua/news/google-maps-has-surprise-for-satellite-imagery-1727182380.html
- **Tier**: L3
- **Publication Date**: 2024-09
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: General public
- **Research Boundary Match**: ✅ Full match
- **Summary**: Google Maps improved imagery quality with Cloud Score+ AI. However, conflict zone imagery is intentionally older (>1 year). Ukrainian officials flagged security concerns about imagery revealing military positions.
- **Related Sub-question**: SQ-10
## Source #13
- **Title**: Jetson Orin Nano Super thermal behavior at 25W
- **Link**: https://edgeaistack.app/blog/jetson-orin-nano-power-consumption/
- **Tier**: L3
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: Jetson developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Thermal throttling at SoC junction >80°C. Sustained GPU at 25W: ~50-51°C reported. Active cooling required for >15W. Most production workloads 8-15W.
- **Related Sub-question**: SQ-11
## Source #14
- **Title**: Automated Image Matching for Satellite Images with Different GSDs
- **Link**: https://www.kjrs.org/journal/view.html?pn=related&uid=756&vmd=Full
- **Tier**: L1
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: Remote sensing researchers
- **Research Boundary Match**: ✅ Full match
- **Summary**: GSD mismatch between satellite and aerial images requires scale normalization via subsampling/super-resolution. Coarse-to-fine matching strategy effective. Scale-invariant features (SIFT, deep features) partially handle scale differences.
- **Related Sub-question**: SQ-7
## Source #15
- **Title**: Optimized VO and satellite image matching for UAVs (Istanbul Tech thesis)
- **Link**: https://polen.itu.edu.tr/items/1fe1e872-7cea-44d8-a8de-339e4587bee6
- **Tier**: L1
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: GPS-denied UAV researchers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Complete VO+satellite matching pipeline. Coordinate transforms: GPS → local NED for trajectory comparison. PnP solver for UAV pose from correspondences. Map retrieval using VO-estimated position to crop satellite tiles.
- **Related Sub-question**: SQ-4, SQ-5
@@ -0,0 +1,169 @@
# Fact Cards
## Fact #1 — ArduPilot minimum GPS rate is 5Hz
- **Statement**: ArduPilot enforces a hard minimum of 5Hz for GPS_INPUT updates. The GPS_RATE parameter description states: "Lowering below 5Hz(default) is not allowed." The EKF scales buffers based on this rate.
- **Source**: Source #1 (ArduPilot AP_GPS_Params)
- **Phase**: Assessment
- **Target Audience**: ArduPilot-based flight controllers
- **Confidence**: ✅ High
- **Related Dimension**: Flight controller integration completeness
## Fact #2 — GPS_INPUT requires velocity + accuracy + GPS time fields
- **Statement**: GPS_INPUT message requires not just lat/lon/alt, but also: vn/ve/vd velocity components, hdop/vdop, horiz_accuracy/vert_accuracy/speed_accuracy, fix_type, time_week/time_week_ms, satellites_visible, and ignore_flags bitmap. The solution draft05 mentions GPS_INPUT but does not specify how these fields are populated (especially velocity from ESKF, accuracy from covariance, GPS time conversion from system time).
- **Source**: Source #2 (MAVLink GPS_INPUT definition), Source #3 (pymavlink example)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Flight controller integration completeness
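A minimal sketch of the missing field-population spec, assuming the ESKF exposes position, NED velocity, and 3×3 position/velocity covariance blocks. The HDOP proxy, `fix_type=3`, and `satellites_visible=10` are placeholders, not values from the draft:

```python
import math

def build_gps_input_fields(lat_deg, lon_deg, alt_m, vel_ned, cov_pos, cov_vel,
                           time_usec, time_week, time_week_ms):
    """Map ESKF outputs onto GPS_INPUT fields (hypothetical mapping, sketch only).

    cov_pos / cov_vel: 3x3 position / velocity covariance blocks from the ESKF.
    """
    horiz_acc = math.sqrt(cov_pos[0][0] + cov_pos[1][1])  # 1-sigma horizontal, m
    vert_acc = math.sqrt(cov_pos[2][2])
    speed_acc = math.sqrt(cov_vel[0][0] + cov_vel[1][1] + cov_vel[2][2])
    return {
        "time_usec": time_usec,
        "gps_id": 0,
        "ignore_flags": 0,                    # all fields provided
        "time_week_ms": time_week_ms,
        "time_week": time_week,
        "fix_type": 3,                        # 3D fix; placeholder for confidence mapping
        "lat": int(lat_deg * 1e7),            # degE7
        "lon": int(lon_deg * 1e7),
        "alt": alt_m,                         # AMSL, metres
        "hdop": max(0.5, horiz_acc / 2.5),    # rough HDOP proxy (assumption)
        "vdop": max(0.5, vert_acc / 2.5),
        "vn": vel_ned[0], "ve": vel_ned[1], "vd": vel_ned[2],
        "speed_accuracy": speed_acc,
        "horiz_accuracy": horiz_acc,
        "vert_accuracy": vert_acc,
        "satellites_visible": 10,             # placeholder; FC EKF may require >=6
    }
```

The dict maps one-to-one onto `mavutil`'s `gps_input_send()` keyword arguments, which is where draft05 would plug it in.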
## Fact #3 — ESKF state vector needs explicit definition
- **Statement**: Standard ESKF for UAV VIO fusion uses 15-16 state error vector: δp[3], δv[3], δθ[3] (attitude error in so(3)), δba[3] (accel bias), δbg[3] (gyro bias), optionally δg[3] (gravity). The solution draft05 says "16-state vector" and "ESKF + buffers ~10MB" but never defines the actual state vector, process model (F, Q matrices), measurement models (H matrices for VO and satellite), or noise parameters.
- **Source**: Source #5 (ludvigls/ESKF), Source #6 (EliaTarasov/ESKF)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Sensor fusion completeness
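What "explicit definition" would look like at minimum is sketched below: the standard 15-state error-state layout from the cited references, with placeholder F/Q/H (the real process and measurement Jacobians are exactly what draft05 still has to derive):

```python
import numpy as np

# Hypothetical 15-state error-state layout (index slices into delta-x and P):
DP, DV, DTH = slice(0, 3), slice(3, 6), slice(6, 9)   # position, velocity, attitude error
DBA, DBG = slice(9, 12), slice(12, 15)                # accel bias, gyro bias

class EskfSketch:
    """Minimal error-state skeleton: nominal state + error covariance.

    F, Q, H, R are caller-supplied placeholders; deriving them is the open gap.
    """
    def __init__(self):
        self.p = np.zeros(3)                  # NED position, m
        self.v = np.zeros(3)                  # NED velocity, m/s
        self.q = np.array([1.0, 0, 0, 0])     # body-to-NED attitude quaternion
        self.ba = np.zeros(3)                 # accelerometer bias
        self.bg = np.zeros(3)                 # gyroscope bias
        self.P = np.eye(15) * 0.1             # error-state covariance

    def predict(self, F, Q):
        """Error-covariance propagation from the (undefined) process model."""
        self.P = F @ self.P @ F.T + Q

    def update(self, H, R, residual):
        """Standard EKF update on the error state, then injection into the nominal state."""
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        dx = K @ residual
        self.P = (np.eye(15) - K @ H) @ self.P
        self.p += dx[DP]
        self.v += dx[DV]
        # Attitude/bias injection omitted: dx[DTH] is an so(3) correction applied to q.
        return dx
```

VO pose and satellite position would each supply their own (H, R) pair to `update`, which is the separation Source #6 demonstrates.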
## Fact #4 — Monocular VIO has scale ambiguity without excitation
- **Statement**: Monocular visual-inertial odometry cannot observe metric scale during constant-velocity flight (zero accelerometer excitation). This is a fundamental observability limitation. The solution uses monocular cuVSLAM + IMU, and fixed-wing UAVs fly mostly at constant velocity. Scale must be provided externally — via known altitude (barometric + predefined mission altitude) or satellite matching absolute position.
- **Source**: Source #7 (Range-VIO scale observability paper)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Sensor fusion completeness, core algorithm correctness
## Fact #5 — cuVSLAM requires explicit camera calibration and IMU calibration
- **Statement**: PyCuVSLAM requires Camera(fx, fy, cx, cy, width, height) + Distortion model + ImuCalibration(gyroscope_noise_density, gyroscope_random_walk, accelerometer_noise_density, accelerometer_random_walk, frequency, T_imu_rig). The solution draft05 does not specify any camera calibration procedure, IMU noise parameters, or the T_imu_rig (IMU-to-camera) extrinsic transformation.
- **Source**: Source #4 (PyCuVSLAM docs)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Visual odometry completeness
## Fact #6 — cuVSLAM IMU fallback provides ~1s acceptable tracking
- **Statement**: When visual tracking fails (featureless terrain, darkness), cuVSLAM falls back to IMU-only integration which provides "approximately 1 second" of acceptable tracking quality before drift becomes unacceptable. After that, tracking is lost.
- **Source**: Source #4 (PyCuVSLAM/Isaac ROS docs)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Resilience, edge case handling
## Fact #7 — Disconnected route segments need satellite re-localization
- **Statement**: When a UAV makes a sharp turn and the next photos have no overlap with previous frames, cuVSLAM will lose tracking. The solution must re-localize using satellite imagery. The AC requires handling "more than 2 such disconnected segments" as a core strategy. Solution draft05 mentions this requirement but does not define the concrete re-localization algorithm (how satellite match triggers, how the new position is initialized in ESKF, how the map is connected to the previous segment).
- **Source**: Source #8 (NaviLoc), Source #9 (SatLoc-Fusion), AC requirements
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Functional completeness — disconnected segments
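The missing trigger logic could take the shape of a small state machine; the sketch below is hypothetical, with the 1.0 s IMU-fallback window and 3-failure threshold taken from Facts #6 and #19:

```python
from enum import Enum, auto

class NavState(Enum):
    TRACKING = auto()
    IMU_FALLBACK = auto()    # cuVSLAM lost features; ~1 s of usable dead-reckoning
    RELOCALIZING = auto()    # waiting on an absolute satellite match
    LOST = auto()            # repeated failures: request operator re-localization

def step(state, vo_ok, sat_match_ok, imu_fallback_s, fail_count):
    """One hypothetical transition of a tracking-loss state machine (sketch only)."""
    if vo_ok:
        return NavState.TRACKING, 0
    if state is NavState.TRACKING and imu_fallback_s <= 1.0:
        return NavState.IMU_FALLBACK, fail_count
    if sat_match_ok:
        # Absolute fix available: re-seed ESKF position, restart cuVSLAM map.
        return NavState.TRACKING, 0
    fail_count += 1
    if fail_count >= 3:
        return NavState.LOST, fail_count     # emit re-localization request
    return NavState.RELOCALIZING, fail_count
```

What the draft still owes on top of this: how the new absolute position is injected into the ESKF, and how (or whether) the new cuVSLAM map segment is linked to the previous one.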
## Fact #8 — Coordinate transformation chain is undefined
- **Statement**: The system needs a well-defined coordinate transformation chain: (1) pixel coordinates → camera frame (using intrinsics), (2) camera frame → body frame (camera mount extrinsics), (3) body frame → NED frame (using attitude from ESKF), (4) NED → WGS84 (using reference point). For satellite matching: geo-referenced tile coordinates → WGS84. For object localization: pixel + camera angle + altitude → ground point → WGS84. None of these transformations are explicitly defined in draft05.
- **Source**: Source #15 (Istanbul Tech thesis), Source #7
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Coordinate system completeness
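The final link in the chain, NED offsets back to WGS84 around a reference point, is the simplest to state and still absent; a flat-earth small-offset sketch (a full geodetic transform should replace it in implementation):

```python
import math

R_EARTH = 6378137.0  # WGS84 semi-major axis, m

def ned_to_wgs84(north, east, down, ref_lat_deg, ref_lon_deg, ref_alt_m):
    """Small-offset NED -> WGS84 around a reference point (flat-earth approximation).

    Adequate for the few-km offsets between satellite fixes; error grows with distance.
    """
    ref_lat = math.radians(ref_lat_deg)
    dlat = north / R_EARTH
    dlon = east / (R_EARTH * math.cos(ref_lat))
    return (ref_lat_deg + math.degrees(dlat),
            ref_lon_deg + math.degrees(dlon),
            ref_alt_m - down)                 # NED "down" is positive toward the ground
```

The earlier links (pixel → camera via intrinsics, camera → body via mount extrinsics, body → NED via ESKF attitude) are pure rotation/projection steps that the draft must pin down with concrete calibration values.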
## Fact #9 — GSD normalization required for satellite-aerial matching
- **Statement**: Camera GSD at 600m altitude with ADTI 20L V1 (16mm, APS-C) is ~15.9 cm/pixel. Google Maps zoom 19 ≈ 0.3 m/pixel, zoom 18 ≈ 0.6 m/pixel. The GSD ratio is ~2:1 to ~4:1 depending on zoom level and altitude. Draft05's "pre-resize" step in the offline pipeline is mentioned but not specified: what resolution? what zoom level? The matching model (LiteSAM/XFeat) input size must match appropriately.
- **Source**: Source #14 (GSD matching paper), solution_draft05 calculations
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Satellite matching completeness
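The GSD arithmetic behind these numbers, for reference; the ~4.24 µm pixel pitch is inferred from the fact card's 15.9 cm/px figure, not taken from the ADTI datasheet:

```python
def ground_sample_distance(altitude_m, focal_length_m, pixel_pitch_m):
    """GSD (m/pixel) by similar triangles: one pixel back-projected onto the ground."""
    return altitude_m * pixel_pitch_m / focal_length_m

# Reproducing the fact card's numbers:
cam_gsd = ground_sample_distance(600.0, 0.016, 4.24e-6)  # ~0.159 m/px at 600 m
tile_gsd_z19 = 0.3                                       # m/px, approx. at mid latitudes
scale_ratio = tile_gsd_z19 / cam_gsd                     # ~1.9x: camera must be downsampled
```

This is the ratio the unspecified "pre-resize" step has to normalize before the matcher sees both images.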
## Fact #10 — Google Maps imagery in conflict zones is intentionally outdated
- **Statement**: Google Maps deliberately serves older imagery (>1 year) for conflict zones in Ukraine. Ukrainian officials have flagged security concerns. The operational area (eastern/southern Ukraine) is directly in the conflict zone. Imagery may be 1-3+ years old, with seasonal differences (summer tiles vs winter flight, or vice versa). This is a HIGH-severity gap for satellite matching accuracy.
- **Source**: Source #12 (Google Maps Ukraine coverage)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Satellite imagery quality risk
## Fact #11 — No operational procedures defined
- **Statement**: Mature GPS-denied systems (Auterion, PX4) define: pre-flight checklist (set home position, verify sensors, verify tile coverage), in-flight monitoring procedures (what to watch, when to intervene), and post-flight analysis (compare estimated vs actual GPS on return). Solution draft05 has no operational procedures section.
- **Source**: Source #10 (Auterion), Source #11 (PX4)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system operators
- **Confidence**: ✅ High
- **Related Dimension**: Operational maturity
## Fact #12 — Object localization lacks implementation detail
- **Statement**: AC requires: "Other onboard AI systems can request GPS coordinates of objects detected by the AI camera." The solution says "trigonometric calculation using UAV GPS position, camera angle, zoom, altitude." But no API is defined, no coordinate math is shown, no handling of camera zoom/angle → ground projection is specified. The Viewpro A40 Pro gimbal angle and zoom parameters are not integrated.
- **Source**: Acceptance criteria, solution_draft05
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Object localization completeness
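The core of the undefined "trigonometric calculation" is a boresight-to-ground intersection; a flat-terrain sketch under assumed gimbal-angle conventions (the Viewpro A40 Pro's actual angle reporting must be checked):

```python
import math

def ground_point_offset(altitude_agl_m, pitch_deg, yaw_deg):
    """Flat-terrain ground intersection of the gimbal boresight (hypothetical sketch).

    pitch_deg: depression below horizontal (90 = nadir);
    yaw_deg: boresight azimuth relative to true north.
    Returns (north, east) offset in metres from the UAV's ground-track point.
    """
    if not 0 < pitch_deg <= 90:
        raise ValueError("boresight must point below the horizon")
    ground_range = altitude_agl_m / math.tan(math.radians(pitch_deg))
    return (ground_range * math.cos(math.radians(yaw_deg)),
            ground_range * math.sin(math.radians(yaw_deg)))
```

The offset would then go through the NED → WGS84 step; off-boresight pixels additionally need the camera intrinsics and current zoom, which is exactly the integration gap this fact flags.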
## Fact #13 — Tech stack document is inconsistent with draft05
- **Statement**: tech_stack.md says "camera @ ~3fps" in non-functional requirements. Draft05 corrected this to 0.7fps. tech_stack.md lists LiteSAM benchmark decision at 480px/640px/800px; draft05 uses 1280px. tech_stack.md doesn't mention EfficientLoFTR as fallback. These inconsistencies indicate the tech_stack.md was not updated after draft05 changes.
- **Source**: tech_stack.md, solution_draft05.md
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Document consistency, maturity
## Fact #14 — Confidence scoring is undefined in draft05
- **Statement**: Draft05 says "Confidence Scoring → GPS_INPUT Mapping — Unchanged from draft03" but draft05 is supposed to be self-contained. The actual confidence scoring logic (how VO confidence + satellite match confidence map to GPS_INPUT fix_type, hdop, horiz_accuracy) is never defined in the current draft. This is critical because ArduPilot's EKF uses these accuracy fields to weight the GPS data.
- **Source**: Source #2 (GPS_INPUT fields), solution_draft05
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Flight controller integration, confidence scoring
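A placeholder illustration of what the missing mapping must decide; every threshold and output value below is invented and would need tuning against the flight controller EKF's innovation gating:

```python
def confidence_to_gps_fields(vo_conf, sat_conf, drift_m):
    """Hypothetical mapping of fusion confidence onto GPS_INPUT quality fields."""
    if sat_conf > 0.8:                       # recent absolute satellite fix
        return {"fix_type": 3, "horiz_accuracy": max(5.0, drift_m), "hdop": 1.0}
    if vo_conf > 0.5:                        # dead-reckoning on VO only
        return {"fix_type": 3, "horiz_accuracy": max(15.0, drift_m), "hdop": 2.5}
    # No usable estimate: report no fix so the FC EKF rejects the data.
    return {"fix_type": 0, "horiz_accuracy": 9999.0, "hdop": 99.9}
```

The key design question the draft leaves open is whether degraded confidence should inflate the accuracy fields (letting the EKF down-weight gradually) or drop `fix_type` outright.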
## Fact #15 — Initial bootstrap sequence is incomplete
- **Statement**: Draft05 startup reads GLOBAL_POSITION_INT to get initial GPS position. But: (1) cuVSLAM needs its first frame + features to initialize — how is the first satellite match triggered? (2) ESKF needs initial state — position from GPS, but velocity? attitude? (3) How does the system know GPS is denied and should start sending GPS_INPUT? (4) Is there a handoff protocol from real GPS to GPS-denied system?
- **Source**: solution_draft05, Source #10 (Auterion procedures)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Startup and bootstrap completeness
## Fact #16 — No recovery from companion computer reboot
- **Statement**: AC requires: "On companion computer reboot mid-flight, the system should attempt to re-initialize from the flight controller's current IMU-extrapolated position." Draft05 does not address this scenario. The system needs: read current FC position estimate, re-initialize ESKF, reload TRT engines (~1-3s), start cuVSLAM with no prior map, trigger immediate satellite re-localization.
- **Source**: Acceptance criteria, solution_draft05
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Resilience, failsafe completeness
## Fact #17 — No position refinement mechanism
- **Statement**: AC states: "The system may refine previously calculated positions and send corrections to the flight controller as updated estimates." Draft05 does not define how this works. When a satellite match provides an absolute correction, do previously estimated positions get retroactively corrected? Is this communicated to the flight controller? How?
- **Source**: Acceptance criteria, solution_draft05
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ⚠️ Medium (AC is ambiguous about necessity)
- **Related Dimension**: Position refinement
## Fact #18 — Tile storage requirements not calculated
- **Statement**: The solution mentions "preload tiles ±2km" and "GeoHash-indexed directory" but never calculates: how many tiles are needed for a mission area, what storage space is required, what zoom levels to use, or how to handle the trade-off between tile coverage area and storage limit. At zoom 19 (~0.3m/pixel), each 256×256 tile covers ~77m × 77m. Covering a 200km flight path with ±2km buffer would require ~130,000 tiles (~2.5GB JPEG).
- **Source**: solution_draft05, tech_stack.md
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ⚠️ Medium (estimates based on standard tile sizes)
- **Related Dimension**: Offline preparation completeness
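The corridor estimate above can be reproduced with simple arithmetic; the ~20 KB average JPEG size is an assumption, not a measured value:

```python
import math

def tile_budget(path_km, buffer_km, tile_gsd_m=0.3, tile_px=256, jpeg_kb=20):
    """Tile count and storage (GB) for a flight corridor at one zoom level."""
    corridor_m2 = (path_km * 1000) * (2 * buffer_km * 1000)
    tile_side_m = tile_px * tile_gsd_m             # ~76.8 m per tile at zoom 19
    n_tiles = math.ceil(corridor_m2 / tile_side_m ** 2)
    return n_tiles, n_tiles * jpeg_kb / 1024 / 1024

n, gb = tile_budget(200, 2)   # 200 km path, +/-2 km buffer
```

A complete offline pipeline would run this per zoom level and compare the total against the Jetson's available storage, which is the missing trade-off analysis.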
## Fact #19 — 3 consecutive failed frames → re-localization request undefined
- **Statement**: AC requires: "If system cannot determine position of 3 consecutive frames by any means, send re-localization request to ground station operator via telemetry link." Draft05 does not define: (1) the re-localization request message format, (2) what "any means" includes (VO failed + satellite match failed + IMU drift exceeded threshold?), (3) how the operator response is received and applied, (4) what the system does while waiting.
- **Source**: Acceptance criteria, solution_draft05
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: Ground station integration, resilience
## Fact #20 — FastAPI endpoints mentioned but not defined
- **Statement**: Draft05 mentions FastAPI for "local IPC" but the REST API endpoints are only defined in security_analysis.md (POST /sessions, GET /sessions/{id}/stream, POST /sessions/{id}/anchor, DELETE /sessions/{id}). The solution draft itself doesn't specify the API contract, request/response schemas, or how other onboard systems interact.
- **Source**: solution_draft05, security_analysis.md
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ✅ High
- **Related Dimension**: API completeness
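The endpoint list from security_analysis.md can be restated as a contract; the schemas and store below are illustrative placeholders (the `accuracy_m` field in particular is an assumed addition), not the project's actual models:

```python
from dataclasses import dataclass, field
import uuid

# Endpoint contract as listed in security_analysis.md:
ENDPOINTS = {
    "POST /sessions": "create navigation session",
    "GET /sessions/{id}/stream": "SSE stream of position/confidence events",
    "POST /sessions/{id}/anchor": "operator-supplied re-localization hint",
    "DELETE /sessions/{id}": "terminate session",
}

@dataclass
class AnchorRequest:
    lat_deg: float
    lon_deg: float
    accuracy_m: float = 100.0   # how tightly to re-seed the ESKF (assumed field)

@dataclass
class SessionStore:
    sessions: dict = field(default_factory=dict)

    def create(self):
        sid = str(uuid.uuid4())
        self.sessions[sid] = {"anchors": []}
        return sid

    def anchor(self, sid, req: AnchorRequest):
        self.sessions[sid]["anchors"].append(req)
        return {"accepted": True, "session": sid}
```

In FastAPI the dataclasses would become Pydantic request models; the point of the sketch is that the request/response schemas, not the framework wiring, are what the solution draft must own.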
## Fact #21 — NaviLoc achieves 19.5m error with trajectory-level optimization
- **Statement**: Recent research (NaviLoc, 2025) shows that trajectory-level optimization treating satellite matching as noisy measurement achieves 19.5m mean error at 50-150m altitude, 16× better than per-frame matching. The solution's approach of per-keyframe satellite matching + ESKF correction is simpler but potentially less accurate than trajectory-level optimization.
- **Source**: Source #8 (NaviLoc)
- **Phase**: Assessment
- **Target Audience**: GPS-denied system
- **Confidence**: ⚠️ Medium (NaviLoc operates at lower altitude with higher overlap)
- **Related Dimension**: Algorithm maturity, accuracy potential
@@ -0,0 +1,63 @@
# Comparison Framework
## Selected Framework Type
Knowledge Organization + Problem Diagnosis — mapping completeness dimensions against solution state
## Completeness Dimensions
1. Core Pipeline Definition
2. Sensor Fusion (ESKF) Specification
3. Visual Odometry Configuration
4. Satellite Image Matching Pipeline
5. Coordinate System & Transformations
6. Flight Controller Integration (GPS_INPUT)
7. Disconnected Route Segment Handling
8. Startup, Bootstrap & Failsafe
9. Object Localization
10. Offline Preparation Pipeline
11. Ground Station Integration
12. Operational Procedures
13. API & Inter-system Communication
14. Document Consistency
15. Testing Coverage vs AC
## Maturity Scoring
| Score | Level | Description |
|-------|-------|-------------|
| 1 | Concept | Mentioned but not specified |
| 2 | Defined | Architecture-level description, component selected |
| 3 | Detailed | Implementation details, data flows, algorithms specified |
| 4 | Validated | Benchmarked, tested, edge cases handled |
| 5 | Production | Field-tested, operational procedures, monitoring |
## Completeness Assessment Matrix
| Dimension | Current State | Maturity | Key Gaps | Facts |
|-----------|--------------|----------|----------|-------|
| Core Pipeline | VO → ESKF → GPS_INPUT flow well-defined. Dual CUDA stream architecture. Camera rate corrected to 0.7fps. Time budgets calculated. | 3 | Self-contained — but references "unchanged from draft03" in several places instead of restating | — |
| ESKF Specification | "16-state vector", "ESKF + buffers ~10MB", "ESKF measurement update" | 1.5 | No state vector definition, no process model (F,Q), no measurement models (H for VO, H for satellite), no noise parameters, no scale observability analysis, no tuning strategy | #3, #4 |
| VO Configuration | cuVSLAM selected, 0.7fps feasibility analyzed, pyramid-LK range calculated, overlap >95% | 2.5 | No camera calibration procedure, no IMU calibration parameters (noise density, random walk), no T_imu_rig extrinsic, no cuVSLAM mode selection (Mono vs Inertial), no cuVSLAM initialization procedure | #5, #6 |
| Satellite Matching | LiteSAM/EfficientLoFTR/XFeat decision tree, TRT conversion workflow, async Stream B | 2.5 | No GSD normalization spec, no tile-to-camera scale matching, no matching confidence threshold, no geometric verification details beyond "RANSAC homography", no match-to-WGS84 conversion | #9, #14 |
| Coordinate System | WGS84 output mentioned. Camera footprint calculated. | 1 | No transformation chain defined (pixel→camera→body→NED→WGS84). No camera-to-body extrinsic. No reference point definition for NED. No handling of terrain elevation. | #8 |
| Flight Controller Integration | pymavlink, GPS_INPUT, 5-10Hz, UART | 2 | No GPS_INPUT field population spec (where do velocity, accuracy, hdop come from?). No fix_type mapping. No GPS time conversion. No ignore_flags. Confidence scoring undefined in current draft. | #1, #2, #14 |
| Disconnected Segments | Mentioned in AC, satellite matching acknowledged as solution | 1.5 | No algorithm for detecting tracking loss. No re-localization trigger. No position initialization after re-localization. No map discontinuity handling. | #7 |
| Startup & Failsafe | 12-step startup sequence. Engine load times. | 2 | No GPS-denied handoff protocol. No mid-flight reboot recovery. No "3 consecutive failed frames" handling. No operator re-localization workflow. | #15, #16, #19 |
| Object Localization | "Trigonometric calculation" mentioned | 1 | No math defined. No API endpoint. No Viewpro gimbal integration spec. No accuracy analysis. | #12 |
| Offline Preparation | Tile download → validate → pre-resize → store. TRT engine build. | 2 | No zoom level selection. No storage calculation. No coverage verification. No tile freshness check. No pre-flight validation tool. | #18 |
| Ground Station Integration | NAMED_VALUE_FLOAT at 1Hz for confidence/drift. Operator re-localization hint mentioned in AC. | 1.5 | Re-localization request/response undefined. Ground station display requirements undefined. Operator workflow undefined. | #19 |
| Operational Procedures | None defined | 0 | No pre-flight checklist. No in-flight monitoring guide. No post-flight analysis. No failure response procedures. | #11 |
| API & IPC | FastAPI mentioned for "local IPC" | 1.5 | Endpoints only in security_analysis.md, not in solution. No request/response schemas. No SSE event format. No object localization API. | #20 |
| Document Consistency | 3 documents (draft05, tech_stack, security) | — | tech_stack.md has 3fps (should be 0.7fps). LiteSAM resolution mismatch. EfficientLoFTR missing from tech_stack. | #13 |
| Testing vs AC | Tests cover TRT, cuVSLAM 0.7fps, shutter. | 2.5 | No explicit mapping of tests to AC items. Missing tests: disconnected segments, re-localization, 3-consecutive-failure, object localization, operator workflow, mid-flight reboot. | — |
## Overall Maturity Assessment
| Category | Avg Score | Assessment |
|----------|-----------|------------|
| Hardware/Platform | 3.5 | Well-researched: UAV specs, camera analysis, memory budget, thermal |
| Core Algorithms (VO, matching) | 2.5 | Component selection solid, but implementation specs missing |
| Sensor Fusion (ESKF) | 1.5 | Severely under-specified |
| System Integration | 1.5 | GPS_INPUT, coordinate transforms, API all incomplete |
| Operational Readiness | 0.5 | No operational procedures, no deployment pipeline |
| **Overall** | **~2.0** | **Architecture-level design, not implementation-ready** |
# Reasoning Chain
## Dimension 1: ESKF Sensor Fusion Specification
### Fact Confirmation
Per Fact #3, standard ESKF for UAV VIO uses 15-16 state error vector: δp[3], δv[3], δθ[3], δba[3], δbg[3]. Per Fact #4, monocular VIO cannot observe metric scale during constant-velocity flight (fundamental observability limitation). Per Fact #7, scale requires external constraint (altitude or satellite absolute position).
### Current State
Draft05 says "Custom ESKF (NumPy/SciPy)", "16-state vector", "ESKF measurement update ~1ms", "ESKF IMU prediction at 5-10Hz". But provides zero mathematical detail.
### Conclusion
The ESKF is the **most under-specified critical component**. Without defining:
- State vector and error state vector explicitly
- Process model (how IMU data propagates the state)
- VO measurement model (how cuVSLAM relative pose updates the filter)
- Satellite measurement model (how absolute position corrections are applied)
- How scale is maintained (altitude constraint? satellite corrections only?)
- Q and R matrices (at least initial values and tuning approach)
...the system cannot be implemented. The ESKF is the central hub connecting all sensors — its specification drives the entire data flow.
### Confidence: ✅ High
---
## Dimension 2: Flight Controller Integration (GPS_INPUT)
### Fact Confirmation
Per Fact #1, ArduPilot requires minimum 5Hz GPS_INPUT rate. Per Fact #2, GPS_INPUT has 15+ mandatory fields including velocity, accuracy, fix_type, GPS time. Per Fact #14, confidence scoring that maps internal state to GPS_INPUT accuracy fields is undefined.
### Current State
Draft05 specifies: pymavlink, GPS_INPUT, 5-10Hz, UART. This satisfies the rate requirement. But the message population is unspecified.
### Conclusion
The GPS_INPUT integration has the right architecture (5-10Hz, pymavlink, UART) but is missing the **data mapping layer**:
- `vn, ve, vd` must come from ESKF velocity estimate — requires ESKF to output velocity
- `horiz_accuracy, vert_accuracy` must come from ESKF covariance matrix (sqrt of position covariance diagonal)
- `hdop, vdop` need to be synthesized from accuracy values (hdop ≈ horiz_accuracy / expected_CEP_factor)
- `fix_type` must map from internal confidence (3=3D fix when satellite-anchored, 2=2D when VO-only?)
- `speed_accuracy` from ESKF velocity covariance
- GPS time (time_week, time_week_ms) requires conversion from system time to GPS epoch
- `satellites_visible` should be set to a constant (e.g., 10) to avoid triggering satellite-count failsafes
This is a tractable implementation detail but must be specified before coding.
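The mapping above can be sketched as follows. The GPS epoch constant and week arithmetic are standard; the field derivations (the /5.0 DOP divisor, the fix_type threshold, the constant `satellites_visible`) are the proposals from this analysis and would need validation against ArduPilot's GPS_INPUT handling — `gps_input_fields` is a hypothetical helper, not an existing API:

```python
import math

GPS_EPOCH_UNIX = 315964800       # 1980-01-06 00:00:00 UTC as a Unix timestamp
GPS_UTC_LEAP_SECONDS = 18        # GPS-UTC offset as of 2017; verify before use

def unix_to_gps_time(unix_s):
    """System clock -> (time_week, time_week_ms) for GPS_INPUT."""
    gps_s = unix_s - GPS_EPOCH_UNIX + GPS_UTC_LEAP_SECONDS
    week = int(gps_s // 604800)
    return week, int((gps_s - week * 604800) * 1000)

def gps_input_fields(p_ned, v_ned, P, origin, sat_match_age_s):
    """Map ESKF state (p_ned, v_ned) and covariance P (6x6 pos+vel block)
    to GPS_INPUT fields. Thresholds and DOP scaling are assumptions."""
    lat0, lon0, alt0 = origin
    R_E = 6378137.0
    lat = lat0 + math.degrees(p_ned[0] / R_E)
    lon = lon0 + math.degrees(p_ned[1] / (R_E * math.cos(math.radians(lat0))))
    horiz = math.sqrt(P[0][0] + P[1][1])
    vert = math.sqrt(P[2][2])
    return {
        "lat": int(lat * 1e7), "lon": int(lon * 1e7),   # degE7 per MAVLink
        "alt": alt0 - p_ned[2],                         # Down is negative up
        "vn": v_ned[0], "ve": v_ned[1], "vd": v_ned[2],
        "fix_type": 3 if sat_match_age_s < 30 else 2,   # 3D when anchored
        "hdop": horiz / 5.0, "vdop": vert / 5.0,        # synthesized DOPs
        "horiz_accuracy": horiz, "vert_accuracy": vert,
        "speed_accuracy": math.sqrt(P[3][3] + P[4][4] + P[5][5]),
        "satellites_visible": 10,   # constant; avoids sat-count failsafes
    }
```

The resulting dict would feed `mav.gps_input_send(...)` on the pymavlink connection at the 5-10Hz output loop.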
### Confidence: ✅ High
---
## Dimension 3: Coordinate System & Transformations
### Fact Confirmation
Per Fact #8, the system needs pixel → camera → body → NED → WGS84 chain. Per Fact #9, satellite tiles have different GSD than camera imagery. Per Source #15, similar systems define explicit coordinate transforms with PnP solvers.
### Current State
Draft05 calculates camera footprint and GSD but never defines the transformation chain. Object localization mentions "trigonometric calculation" without math.
### Conclusion
This is a **fundamental architectural gap**. Every position estimate flows through coordinate transforms. Without defining them:
- cuVSLAM outputs relative pose in camera frame — how is this converted to NED displacement?
- Satellite matching outputs pixel correspondences — how does homography → WGS84 position?
- Object localization needs camera ray → ground intersection — impossible without camera-to-body and body-to-NED transforms
- The camera is "not autostabilized" — so body frame attitude matters for ground projection
The fix requires defining: camera intrinsic matrix K, camera-to-body rotation T_cam_body, and the ESKF attitude estimate for body-to-NED.
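A minimal sketch of that chain, assuming flat terrain and taking the mount rotation as an input (the K values are this analysis's ADTI 20L V1 + 16mm estimate; the identity rotations used in examples are illustrative, not the real mount or attitude):

```python
import numpy as np

# Intrinsics from this analysis: fx = fy = 16mm * 5456px / 23.2mm ≈ 3763
K = np.array([[3763.0, 0.0, 2728.0],
              [0.0, 3763.0, 1816.0],
              [0.0, 0.0, 1.0]])

def pixel_to_ned_ground(u, v, K, R_cam_body, R_body_ned, p_ned):
    """Project pixel (u, v) onto the flat-terrain plane NED z = 0.

    R_cam_body is the fixed mount rotation, R_body_ned comes from the
    ESKF attitude, p_ned is the UAV position (z negative above origin).
    """
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_ned = R_body_ned @ R_cam_body @ ray_cam
    if ray_ned[2] <= 0:
        raise ValueError("ray does not point below the horizon")
    s = -p_ned[2] / ray_ned[2]          # scale factor to reach z = 0
    return p_ned + s * ray_ned
```

At 600m altitude the principal point maps straight down to the sub-UAV ground point, which gives a quick sanity check for any calibration.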
### Confidence: ✅ High
---
## Dimension 4: Disconnected Route Segments
### Fact Confirmation
Per Fact #7, AC explicitly requires handling disconnected segments as "core to the system." Per Fact #6, cuVSLAM IMU fallback gives ~1s before tracking loss. Per Source #8, trajectory-level optimization can handle segment connections.
### Current State
Draft05 acknowledges this in AC but the solution section says "sharp-turn frames are expected to fail VO and should be handled by satellite-based re-localization." No algorithm is specified.
### Conclusion
The solution needs a concrete **re-localization pipeline**:
1. Detect tracking loss (cuVSLAM returns tracking_lost state)
2. Continue ESKF with IMU-only prediction (high uncertainty growth)
3. Immediately trigger satellite matching on next available frame
4. If satellite match succeeds: reset ESKF position to matched position, reset cuVSLAM (or start new track)
5. If satellite match fails: retry on next frame, increment failure counter
6. If 3 consecutive failures: send re-localization request to ground station
7. When new segment starts: mark as disconnected, continue building trajectory
8. Optionally: if a later satellite match connects two segments to the same reference, merge them
This is not trivial but follows directly from the existing architecture. It's a missing algorithm, not a missing component.
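The steps above could take the shape of a small state machine. The `eskf`, `vo`, `sat_matcher`, and `gcs` objects are assumed interfaces wrapping the real components — not actual cuVSLAM or pymavlink APIs:

```python
from enum import Enum, auto

class NavState(Enum):
    TRACKING = auto()           # cuVSLAM VO healthy
    IMU_ONLY = auto()           # VO lost; ESKF predicting on IMU alone
    AWAITING_OPERATOR = auto()  # 3 failed matches; re-localization requested

class Relocalizer:
    """Sketch of steps 1-6 above; components are hypothetical interfaces."""
    MAX_FAILURES = 3

    def __init__(self, eskf, vo, sat_matcher, gcs):
        self.eskf, self.vo = eskf, vo
        self.sat_matcher, self.gcs = sat_matcher, gcs
        self.state = NavState.TRACKING
        self.failures = 0

    def on_frame(self, frame):
        if self.state is NavState.TRACKING and self.vo.tracking_lost():
            self.state = NavState.IMU_ONLY              # steps 1-2
        if self.state is NavState.TRACKING:
            return
        match = self.sat_matcher.match(frame)           # step 3: every frame
        if match is not None:                           # step 4
            self.eskf.reset_position(match.position, match.covariance)
            self.vo.restart()
            self.failures = 0
            self.state = NavState.TRACKING
            return
        self.failures += 1                              # step 5
        if (self.failures >= self.MAX_FAILURES
                and self.state is not NavState.AWAITING_OPERATOR):
            self.gcs.request_relocalization(self.eskf.position,
                                            self.eskf.covariance)  # step 6
            self.state = NavState.AWAITING_OPERATOR
```

Segment bookkeeping (steps 7-8) would hang off the TRACKING re-entry: each successful reset starts a new trajectory segment tagged with its anchoring satellite match.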
### Confidence: ✅ High
---
## Dimension 5: Startup Bootstrap & Failsafe
### Fact Confirmation
Per Fact #15, the bootstrap sequence has gaps (first satellite match, initial ESKF state, GPS-denied handoff). Per Fact #16, AC requires mid-flight reboot recovery. Per Fact #19, AC requires 3-consecutive-failure re-localization request.
### Current State
Draft05 has a 12-step startup sequence that covers the happy path. The failure paths and special cases are not addressed.
### Conclusion
Three failsafe scenarios need specification:
1. **GPS-denied handoff**: How does the system know to start? Options: (a) always running — takes over when GPS quality degrades, (b) operator command, (c) automatic GPS quality monitoring. The system should probably always be running in parallel and the FC uses the best available source.
2. **Mid-flight reboot**: Read FC position → init ESKF with high uncertainty → start cuVSLAM → immediate satellite match → within ~5s should have a position estimate. TRT engine load (1-3s) is the main startup cost.
3. **3 consecutive failures**: Define "failure" precisely (VO lost + satellite match failed + IMU-only drift > threshold). Send NAMED_VALUE_FLOAT or custom MAVLink message to ground station. Define operator response format.
### Confidence: ✅ High
---
## Dimension 6: Satellite Matching Pipeline Details
### Fact Confirmation
Per Fact #9, camera GSD is ~15.9 cm/pixel at 600m with ADTI+16mm lens. Satellite at zoom 19 is ~0.3 m/pixel. Per Fact #10, Google Maps imagery in Ukraine conflict zone is intentionally >1 year old. Per Fact #14, the solution says "pre-resize" but doesn't specify to what resolution.
### Current State
Draft05 has a solid model selection decision tree (LiteSAM → EfficientLoFTR → XFeat) and TRT conversion workflow. But the actual matching pipeline data flow is incomplete.
### Conclusion
The matching pipeline needs:
- **Input preparation**: Camera frame (5456×3632 at ~15.9 cm/pixel at 600m) → downsample to matcher input resolution (1280px for LiteSAM). Satellite tile at zoom 18 (~0.6 m/pixel) → no resize needed if using 256px tiles, or assemble 5×5 tile mosaic for coverage.
- **GSD matching**: Either downsample camera image to satellite GSD, or specify that the matcher handles multi-scale internally. LiteSAM was designed for satellite-aerial matching so it may handle this. XFeat is general-purpose and may need explicit scale normalization.
- **Tile selection**: Given ESKF position estimate + uncertainty, select the correct satellite tile(s). What if the position estimate has drifted and the wrong tile is selected? Need a search radius based on ESKF covariance.
- **Match → position**: Homography from RANSAC → decompose to get translation in satellite coordinate frame → convert to WGS84 using tile's geo-reference.
- **Seasonal/temporal mismatch**: Tiles could be from different seasons. Feature matching must be robust to appearance changes.
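The GSD arithmetic behind the input-preparation bullet can be sketched directly (pinhole model with the ADTI defaults from this analysis; 0.6 m/px is the nominal zoom-18 Web Mercator value, actual tile GSD varies with latitude):

```python
def camera_gsd(altitude_m, focal_mm=16.0, sensor_width_mm=23.2, width_px=5456):
    """Nadir ground sample distance in m/px (pinhole model, ADTI 20L V1 defaults)."""
    return altitude_m * sensor_width_mm / (focal_mm * width_px)

def matched_width_px(altitude_m, width_px=5456, sat_gsd_m=0.6):
    """Camera image width after downsampling so its GSD equals the tile GSD."""
    return round(width_px * camera_gsd(altitude_m) / sat_gsd_m)
```

At 600m this reproduces the ~15.9 cm/px figure and implies downsampling the 5456px frame to roughly 1450px for zoom-18 tiles — conveniently close to LiteSAM's 1280px input.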
### Confidence: ✅ High
---
## Dimension 7: Operational Maturity
### Fact Confirmation
Per Fact #11, mature systems (Auterion, PX4) define pre-flight checklists, in-flight monitoring, failure response procedures. Per Fact #13, documents are inconsistent (tech_stack.md still says 3fps).
### Current State
Zero operational procedures defined. Documents are partially inconsistent.
### Conclusion
This is expected at this stage of development (architecture/design phase). Operational procedures should come after implementation and initial testing. However, the document inconsistencies should be fixed now to avoid confusion during implementation. The tech_stack.md and solution_draft should be aligned.
### Confidence: ✅ High
---
## Dimension 8: Object Localization
### Fact Confirmation
Per Fact #12, AC requires other AI systems to request GPS coordinates of detected objects. The Viewpro A40 Pro gimbal has configurable angle and zoom.
### Current State
Draft05 says "trigonometric calculation using UAV GPS position, camera angle, zoom, altitude. Flat terrain assumed."
### Conclusion
The math is straightforward but needs specification:
- Input: pixel coordinates (u,v) in Viewpro image, gimbal angles (pan, tilt), zoom level, UAV position (from GPS-denied system), UAV altitude
- Process: (1) pixel → ray in camera frame using intrinsics + zoom, (2) camera frame → body frame using gimbal angles, (3) body frame → NED using UAV attitude, (4) ray-ground intersection assuming flat terrain at known altitude, (5) NED offset → WGS84
- Output: lat, lon of object + accuracy estimate (propagated from UAV position accuracy + gimbal angle uncertainty)
- API: FastAPI endpoint for other onboard systems to call
This is a 2-point complexity task but should be specified in the solution.
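A runnable sketch of steps 1-5, under simplifying assumptions: level UAV (body frame aligned with NED heading), OpenCV camera axes, pan about body Z, tilt about body Y (0° = horizon, -90° = nadir), spherical-earth offset for the final WGS84 step. The K matrix and angle conventions are illustrative, not the Viewpro A40 Pro's actual calibration:

```python
import math
import numpy as np

R_E = 6378137.0   # spherical-earth radius for the small-offset lat/lon step

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def locate_object(u, v, K, pan_deg, tilt_deg, uav_lat, uav_lon, agl_m):
    """Flat-terrain object localization from a gimbal camera (steps 1-5)."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # zero pan/tilt: camera forward (+Z) = body X, image right = body Y
    R_cam_body0 = np.array([[0.0, 0.0, 1.0],
                            [1.0, 0.0, 0.0],
                            [0.0, 1.0, 0.0]])
    R = rot_z(math.radians(pan_deg)) @ rot_y(math.radians(tilt_deg)) @ R_cam_body0
    ray_ned = R @ ray_cam
    if ray_ned[2] <= 0:
        return None                  # at or above the horizon: no intersection
    scale = agl_m / ray_ned[2]       # ray-ground intersection (flat terrain)
    north, east = scale * ray_ned[0], scale * ray_ned[1]
    lat = uav_lat + math.degrees(north / R_E)
    lon = uav_lon + math.degrees(east / (R_E * math.cos(math.radians(uav_lat))))
    return lat, lon
```

Accuracy propagation (UAV position covariance plus gimbal angle uncertainty through the same chain) would wrap this function; the FastAPI endpoint is then a thin layer over it.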
### Confidence: ✅ High
# Validation Log
## Validation Scenario 1: Normal Straight Flight (Happy Path)
### Expected Based on Conclusions
UAV flies straight at 70 km/h, 600m altitude. ADTI captures at 0.7fps. cuVSLAM processes each frame (~9ms), ESKF fuses VO + IMU. Every 5-10 frames, satellite matching provides absolute correction. GPS_INPUT sent at 5-10Hz.
### Actual Validation Results
The happy path is well-specified in draft05. Time budgets, memory budgets, overlap calculations all valid. The 5-10Hz ESKF IMU prediction fills gaps between 0.7fps camera frames. Satellite matching async on Stream B.
**GAP**: Even on straight flight, the GPS_INPUT message field population is undefined. Where does velocity come from? What fix_type is sent? What accuracy values?
---
## Validation Scenario 2: Sharp Turn with No Overlap
### Expected Based on Conclusions
UAV makes a 90° turn. Next frame has zero overlap with previous. cuVSLAM loses tracking. System falls back to IMU. ESKF uncertainty grows rapidly. Satellite matching on next frame provides re-localization.
### Actual Validation Results
**CRITICAL GAP**: No algorithm defined for this scenario. Questions:
1. How does the system detect cuVSLAM tracking loss? (cuVSLAM API presumably returns a tracking state)
2. During IMU-only phase, what is the ESKF prediction uncertainty growth rate? (~1-2m/s drift with consumer IMU)
3. When satellite match succeeds after the turn, how is ESKF re-initialized? (measurement update with very high innovation? or state reset?)
4. How is cuVSLAM re-initialized on the new heading? (new track from scratch? or cuVSLAM loop closure if it sees previous terrain?)
5. If the turn area is over featureless terrain (farmland), satellite matching may also fail — then what?
The AC says "sharp-turn frames should be within 200m drift and angle <70 degrees" — this bounds the problem but the solution doesn't address it algorithmically.
---
## Validation Scenario 3: Long Flight Over Uniform Terrain
### Expected Based on Conclusions
UAV flies 50km straight over large agricultural fields. cuVSLAM may struggle with low-texture terrain. Satellite matching is the only absolute correction source.
### Actual Validation Results
This scenario tests the limits of the design:
- cuVSLAM at 600m altitude sees ~577m × 870m footprint. If it's a single wheat field, features may be sparse. cuVSLAM falls back to IMU (~1s acceptable, then tracking lost).
- Satellite matching must carry the entire position estimation burden.
- At 0.7fps with keyframe every 5-10 frames: satellite match every 7-14s.
- Between matches, IMU-only drift at 70 km/h: ~14m/s × time × drift_rate. With consumer IMU: ~1-5m drift per second.
- Over 14s between matches: ~14-70m potential drift. AC requires <100m between anchors.
**GAP**: The system's behavior in this degraded mode needs explicit specification. Is this VO-failed + satellite-only + IMU acceptable? What's the accuracy?
---
## Validation Scenario 4: First Frame After GPS Denial
### Expected Based on Conclusions
GPS was working. Now it's denied/spoofed. System takes over.
### Actual Validation Results
**GAP**: No handoff protocol defined. Questions:
1. How does the system detect GPS denial? (it doesn't — it's assumed the operator knows)
2. The initial ESKF state comes from GLOBAL_POSITION_INT — but this might be the spoofed GPS. How to validate?
3. If GPS is being spoofed rather than denied, the initial position could be wrong by kilometers.
4. Draft05 assumes clean initial GPS. This is reasonable for the first version but should be acknowledged as a limitation.
---
## Validation Scenario 5: Mid-Flight Companion Computer Reboot
### Expected Based on Conclusions
Companion computer crashes and reboots. Flight controller continues flying on IMU dead reckoning.
### Actual Validation Results
**GAP**: AC requires recovery. Draft05 doesn't address it. Sequence should be:
1. Jetson boots (~30-60s depending on boot time)
2. GPS-Denied service starts (systemd)
3. Connect to FC, get current position (IMU-extrapolated, may have significant drift)
4. Load TRT engines (~1-3s each, total ~2-6s)
5. Start cuVSLAM (no prior map, fresh start)
6. Immediate satellite match to get absolute position
7. Total recovery time: ~35-70s. During this time, FC uses IMU-only. At 70 km/h: ~700-1400m of uncontrolled drift.
This is a real operational concern and should be documented even if the solution is "acknowledge the limitation."
---
## Counterexamples
- **NaviLoc achieves 19.5m at lower altitude**: Our system operates at 600-1000m with larger footprints and coarser GSD. The accuracy requirement (50m for 80%, 20m for 60%) is less demanding. The simpler ESKF approach may be adequate.
- **SatLoc-Fusion uses DinoV2 for place recognition**: Our system uses feature matching (LiteSAM/XFeat) which is more precise but less robust to appearance changes. DinoV2 is more robust to seasonal changes but gives coarser position.
## Review Checklist
- [x] Draft conclusions consistent with fact cards
- [x] No important dimensions missed
- [x] No over-extrapolation
- [ ] Issue: ESKF specification is too underspecified to validate accuracy claims
- [ ] Issue: Disconnected segment handling is critical AC and has no algorithm
- [ ] Issue: GPS_INPUT field mapping undocumented
- [ ] Issue: Object localization API undefined
## Conclusions Requiring Revision
None requiring reversal. All identified gaps are genuine and supported by facts.
# Solution Draft
## Assessment Findings
| Old Component Solution | Weak Point (functional/security/performance) | New Solution |
|------------------------|----------------------------------------------|-------------|
| ESKF described as "16-state vector, ~10MB" with no mathematical specification | **Functional**: No state vector, no process model (F,Q), no measurement models (H for VO, H for satellite), no noise parameters, no scale observability analysis. Impossible to implement or validate accuracy claims. | **Define complete ESKF specification**: 15-state error vector, IMU-driven prediction, dual measurement models (VO relative pose, satellite absolute position), initial Q/R values, scale constraint via altitude + satellite corrections. |
| GPS_INPUT at 5-10Hz via pymavlink — no field mapping | **Functional**: GPS_INPUT requires 15+ fields (velocity, accuracy, hdop, fix_type, GPS time). No specification of how ESKF state maps to these fields. ArduPilot requires minimum 5Hz. | **Define GPS_INPUT population spec**: velocity from ESKF, accuracy from covariance, fix_type from confidence tier, GPS time from system clock conversion, synthesized hdop/vdop. |
| Confidence scoring "unchanged from draft03" — not in draft05 | **Functional**: Draft05 is supposed to be self-contained. Confidence scoring determines GPS_INPUT accuracy fields and fix_type — directly affects how ArduPilot EKF weights the position data. | **Define confidence scoring inline**: 3 tiers (satellite-anchored, VO-tracked, IMU-only) mapping to fix_type + accuracy values. |
| Coordinate transformations not defined | **Functional**: No pixel→camera→body→NED→WGS84 chain. Camera is not autostabilized, so body attitude matters. Satellite match → WGS84 conversion undefined. Object localization impossible without these transforms. | **Define coordinate transformation chain**: camera intrinsics K, camera-to-body extrinsic T_cam_body, body-to-NED from ESKF attitude, NED origin at mission start point. |
| Disconnected route segments — "satellite re-localization" mentioned but no algorithm | **Functional**: AC requires handling as "core to the system." Multiple disconnected segments expected. No tracking-loss detection, no re-localization trigger, no ESKF re-initialization, no cuVSLAM restart procedure. | **Define re-localization pipeline**: detect cuVSLAM tracking loss → IMU-only ESKF prediction → trigger satellite match on every frame → on match success: ESKF position reset + cuVSLAM restart → on 3 consecutive failures: operator re-localization request. |
| No startup handoff from GPS to GPS-denied | **Functional**: System reads GLOBAL_POSITION_INT at startup but no protocol for when GPS is lost/spoofed vs system start. No validation of initial position. | **Define handoff protocol**: system runs continuously, FC receives both real GPS and GPS_INPUT. GPS-denied system always provides its estimate; FC selects best source. Initial position validated against first satellite match. |
| No mid-flight reboot recovery | **Functional**: AC requires: "re-initialize from flight controller's current IMU-extrapolated position." No procedure defined. Recovery time estimation missing. | **Define reboot recovery sequence**: read FC position → init ESKF with high uncertainty → load TRT engines → start cuVSLAM → immediate satellite match. Estimated recovery: ~35-70s. Document as known limitation. |
| 3-consecutive-failure re-localization request undefined | **Functional**: AC requires ground station re-localization request. No message format, no operator workflow, no system behavior while waiting. | **Define re-localization protocol**: detect 3 failures → send custom MAVLink message with last known position + uncertainty → operator provides approximate coordinates → system uses as ESKF measurement with high covariance. |
| Object localization — "trigonometric calculation" with no details | **Functional**: No math, no API, no Viewpro gimbal integration, no accuracy propagation. Other onboard systems cannot use this component as specified. | **Define object localization**: pixel→ray using Viewpro intrinsics + gimbal angles → body frame → NED → ray-ground intersection → WGS84. FastAPI endpoint: POST /objects/locate. Accuracy propagated from UAV position + gimbal uncertainty. |
| Satellite matching — GSD normalization and tile selection unspecified | **Functional**: Camera GSD ~15.9 cm/px at 600m vs satellite ~0.3 m/px at zoom 19. The "pre-resize" step is mentioned but not specified. Tile selection radius based on ESKF uncertainty not defined. | **Define GSD handling**: downsample camera frame to match satellite GSD. Define tile selection: ESKF position ± 3σ_horizontal → select tiles covering that area. Assemble tile mosaic for matching. |
| Satellite tile storage requirements not calculated | **Functional**: "±2km" preload mentioned but no storage estimate. At zoom 19: a 200km path with ±2km buffer requires ~130K tiles (~2.5GB). | **Calculate tile storage**: specify zoom level (18 preferred — 0.6m/px, 4× fewer tiles), estimate storage per mission profile, define maximum mission area by storage limit. |
| FastAPI endpoints not in solution draft | **Functional**: Endpoints only in security_analysis.md. No request/response schemas. No SSE event format. No object localization endpoint. | **Consolidate API spec in solution**: define all endpoints, SSE event schema, object localization endpoint. Reference security_analysis.md for auth. |
| cuVSLAM configuration missing (calibration, IMU params, mode) | **Functional**: No camera calibration procedure, no IMU noise parameters, no T_imu_rig extrinsic, no mode selection (Mono vs Inertial). | **Define cuVSLAM configuration**: use Inertial mode, specify required calibration data (camera intrinsics, distortion, IMU noise params from datasheet, T_imu_rig from physical measurement), define calibration procedure. |
| tech_stack.md inconsistent with draft05 | **Functional**: tech_stack.md says 3fps (should be 0.7fps), LiteSAM at 480px (should be 1280px), missing EfficientLoFTR. | **Flag for update**: tech_stack.md must be synchronized with draft05 corrections. Not addressed in this draft — separate task. |
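The tile-storage row above can be checked with a few lines. This uses the equatorial Web Mercator GSD (156543.03 / 2^zoom m/px for 256px tiles); `kb_per_tile` is an assumed average JPEG size, not a measurement:

```python
import math

def corridor_tiles(path_km, buffer_km, zoom, kb_per_tile=20.0):
    """Rough tile count and storage (GB) for a flight corridor."""
    gsd_m = 156543.03392 / 2 ** zoom        # equatorial Web Mercator m/px
    tile_side_km = 256.0 * gsd_m / 1000.0   # ground side of one tile
    area_km2 = path_km * 2.0 * buffer_km    # corridor footprint
    n = math.ceil(area_km2 / tile_side_km ** 2)
    return n, n * kb_per_tile / (1024.0 * 1024.0)
```

For a 200 km path with ±2 km buffer this gives roughly 137K tiles (~2.6 GB) at zoom 19 and about a quarter of that at zoom 18, consistent with the estimate in the table.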
## Overall Maturity Assessment
| Category | Maturity (1-5) | Assessment |
|----------|---------------|------------|
| Hardware & Platform Selection | 3.5 | UAV airframe, cameras, Jetson, batteries — well-researched with specs, weight budget, endurance calculations. Ready for procurement. |
| Core Algorithm Selection | 3.0 | cuVSLAM, LiteSAM/XFeat, ESKF — components selected with comparison tables, fallback chains, decision trees. Day-one benchmarks defined. |
| AI Inference Runtime | 3.5 | TRT Engine migration thoroughly analyzed. Conversion workflows, memory savings, performance estimates. Code wrapper provided. |
| Sensor Fusion (ESKF) | 1.5 | Mentioned but not specified. No implementable detail. Blocker for coding. |
| System Integration | 1.5 | GPS_INPUT, coordinate transforms, inter-component data flow — all under-specified. |
| Edge Cases & Resilience | 1.0 | Disconnected segments, reboot recovery, re-localization — acknowledged but no algorithms. |
| Operational Readiness | 0.5 | No pre-flight procedures, no in-flight monitoring, no failure response. |
| Security | 3.0 | Comprehensive threat model, OP-TEE analysis, LUKS, secure boot. Well-researched. |
| **Overall TRL** | **~2.5** | **Technology concept formulated + some component validation. Not implementation-ready.** |
The solution is at approximately **TRL 3** (proof of concept) for hardware/algorithm selection and **TRL 1-2** (basic concept) for system integration, ESKF, and operational procedures.
## Product Solution Description
A real-time GPS-denied visual navigation system for fixed-wing UAVs, running on a Jetson Orin Nano Super (8GB). All AI model inference uses native TensorRT Engine files. The system replaces the GPS module by sending MAVLink GPS_INPUT messages via pymavlink over UART at 5-10Hz.
Position is determined by fusing: (1) CUDA-accelerated visual odometry (cuVSLAM in Inertial mode) from ADTI 20L V1 at 0.7 fps sustained, (2) absolute position corrections from satellite image matching (LiteSAM or XFeat — TRT Engine FP16) using keyframes from the same ADTI image stream, and (3) IMU data from the flight controller via ESKF. Viewpro A40 Pro is reserved for AI object detection only.
The ESKF is the central state estimator with 15-state error vector. It fuses:
- **IMU prediction** at 5-10Hz (high-frequency pose propagation)
- **cuVSLAM VO measurement** at 0.7Hz (relative pose correction)
- **Satellite matching measurement** at ~0.07-0.14Hz (absolute position correction)
GPS_INPUT messages carry position, velocity, and accuracy derived from the ESKF state and covariance.
**Hard constraint**: ADTI 20L V1 shoots at 0.7 fps sustained (1430ms interval). Full VO+ESKF pipeline within 400ms per frame. Satellite matching async on keyframes (every 5-10 camera frames). GPS_INPUT at 5-10Hz (ESKF IMU prediction fills gaps between camera frames).
```
┌─────────────────────────────────────────────────────────────────────┐
│ OFFLINE (Before Flight) │
│ 1. Satellite Tiles → Download & Validate → Pre-resize → Store │
│ (Google Maps) (≥0.5m/px, <2yr) (matcher res) (GeoHash)│
│ 2. TRT Engine Build (one-time per model version): │
│ PyTorch model → reparameterize → ONNX export → trtexec --fp16 │
│ Output: litesam.engine, xfeat.engine │
│ 3. Camera + IMU calibration (one-time per hardware unit) │
│ 4. Copy tiles + engines + calibration to Jetson storage │
└─────────────────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────────────────┐
│ ONLINE (During Flight) │
│ │
│ STARTUP: │
│ 1. pymavlink → read GLOBAL_POSITION_INT → init ESKF state │
│ 2. Load TRT engines + allocate GPU buffers │
│ 3. Load camera calibration + IMU calibration │
│ 4. Start cuVSLAM (Inertial mode) with ADTI 20L V1 │
│ 5. Preload satellite tiles ±2km into RAM │
│ 6. First satellite match → validate initial position │
│ 7. Begin GPS_INPUT output loop at 5-10Hz │
│ │
│ EVERY CAMERA FRAME (0.7fps from ADTI 20L V1): │
│ ┌──────────────────────────────────────┐ │
│ │ ADTI 20L V1 → Downsample (CUDA) │ │
│ │ → cuVSLAM VO+IMU (~9ms) │ ← CUDA Stream A │
│ │ → ESKF VO measurement │ │
│ └──────────────────────────────────────┘ │
│ │
│ 5-10Hz CONTINUOUS (IMU-driven between camera frames): │
│ ┌──────────────────────────────────────┐ │
│ │ IMU data → ESKF prediction │ │
│ │ ESKF state → GPS_INPUT fields │ │
│ │ GPS_INPUT → Flight Controller (UART) │ │
│ └──────────────────────────────────────┘ │
│ │
│ KEYFRAMES (every 5-10 camera frames, async): │
│ ┌──────────────────────────────────────┐ │
│ │ Camera frame → GSD downsample │ │
│ │ Select satellite tile (ESKF pos±3σ) │ │
│ │ TRT inference (Stream B): LiteSAM/ │ │
│ │ XFeat → correspondences │ │
│ │ RANSAC → homography → WGS84 position │ │
│ │ ESKF satellite measurement update │──→ Position correction │
│ └──────────────────────────────────────┘ │
│ │
│ TRACKING LOSS (cuVSLAM fails — sharp turn / featureless): │
│ ┌──────────────────────────────────────┐ │
│ │ ESKF → IMU-only prediction (growing │ │
│ │ uncertainty) │ │
│ │ Satellite match on EVERY frame │ │
│ │ On match success → ESKF reset + │ │
│ │ cuVSLAM restart │ │
│ │ 3 consecutive failures → operator │ │
│ │ re-localization request │ │
│ └──────────────────────────────────────┘ │
│ │
│ TELEMETRY (1Hz): │
│ ┌──────────────────────────────────────┐ │
│ │ NAMED_VALUE_FLOAT: confidence, drift │──→ Ground Station │
│ └──────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────────────────┘
```
## Architecture
### Component: ESKF Sensor Fusion (NEW — previously unspecified)
**Error-State Kalman Filter** fusing IMU, visual odometry, and satellite matching.
**Nominal state vector** (propagated by IMU):
| State | Symbol | Size | Description |
|-------|--------|------|-------------|
| Position | p | 3 | NED position relative to mission origin (meters) |
| Velocity | v | 3 | NED velocity (m/s) |
| Attitude | q | 4 | Unit quaternion (body-to-NED rotation) |
| Accel bias | b_a | 3 | Accelerometer bias (m/s²) |
| Gyro bias | b_g | 3 | Gyroscope bias (rad/s) |
**Error-state vector** (estimated by ESKF): δx = [δp, δv, δθ, δb_a, δb_g]ᵀ ∈ ℝ¹⁵
where δθ ∈ so(3) is the 3D rotation error.
**Prediction step** (IMU at 5-10Hz from flight controller):
- Input: accelerometer a_m, gyroscope ω_m, dt
- Propagate nominal state: p += v·dt, v += (R(q)·(a_m - b_a) + g)·dt with g = [0, 0, 9.81]ᵀ in NED, q ← q ⊗ Exp((ω_m - b_g)·dt)
- Propagate error covariance: P = F·P·Fᵀ + Q
- F is the 15×15 error-state transition matrix (standard ESKF formulation)
- Q: process noise diagonal, initial values from IMU datasheet noise densities
**VO measurement update** (0.7Hz from cuVSLAM):
- cuVSLAM outputs relative pose: ΔR, Δt (camera frame)
- Transform to NED: Δp_ned = R_body_ned · T_cam_body · Δt
- Innovation: z = Δp_ned_measured - Δp_ned_predicted
- Observation matrix H_vo maps error state to relative position change
- R_vo: measurement noise, initial ~0.1-0.5m (from cuVSLAM precision at 600m+ altitude)
- Kalman update: K = P·Hᵀ·(H·P·Hᵀ + R)⁻¹, δx = K·z, P = (I - K·H)·P
**Satellite measurement update** (0.07-0.14Hz, async):
- Satellite matching outputs absolute position: lat_sat, lon_sat in WGS84
- Convert to NED relative to mission origin
- Innovation: z = p_satellite - p_predicted
- H_sat = [I₃, 0, 0, 0, 0] (directly observes position)
- R_sat: measurement noise, from matching confidence (~5-20m based on RANSAC inlier ratio)
- Provides absolute position correction — bounds drift accumulation
**Scale observability**:
- Monocular cuVSLAM has scale ambiguity during constant-velocity flight
- Scale is constrained by: (1) satellite matching absolute positions (primary), (2) known flight altitude from barometer + predefined mission altitude, (3) IMU accelerometer during maneuvers
- During long straight segments without satellite correction, scale drift is possible. Satellite corrections every ~7-14s re-anchor scale.
**Tuning approach**: Start with IMU datasheet noise values for Q. Start with conservative R values (high measurement noise). Tune on flight test data by comparing ESKF output to known GPS ground truth.
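The prediction/update mechanics above, reduced to a runnable skeleton. The F matrix here carries only the position-velocity coupling; the real 15-state F adds attitude and bias terms per the standard ESKF formulation, so this is a structural sketch, not the full filter:

```python
import numpy as np

class MiniESKF:
    """Minimal error-state skeleton: δx = [δp, δv, δθ, δb_a, δb_g]."""
    def __init__(self, q_diag, p0_diag):
        self.x = np.zeros(15)            # error state
        self.P = np.diag(p0_diag)        # error covariance
        self.Q = np.diag(q_diag)         # process noise (from IMU datasheet)

    def predict(self, dt):
        F = np.eye(15)
        F[0:3, 3:6] = dt * np.eye(3)     # δp_k+1 = δp_k + δv_k·dt
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_position(self, innovation, r_diag):
        """Absolute-position update, H_sat = [I3 0 0 0 0]."""
        H = np.zeros((3, 15)); H[:, 0:3] = np.eye(3)
        S = H @ self.P @ H.T + np.diag(r_diag)
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (innovation - H @ self.x)
        self.P = (np.eye(15) - K @ H) @ self.P
```

A satellite fix shrinks the position covariance, which is exactly what bounds drift between matches; the VO update differs only in its H (relative position change) and R.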
| Solution | Tools | Advantages | Limitations | Performance | Fit |
|----------|-------|-----------|-------------|------------|-----|
| Custom ESKF (Python/NumPy) | NumPy, SciPy | Full control, minimal dependencies, well-understood algorithm | Implementation effort, tuning required | <1ms per step | ✅ Selected |
| FilterPy ESKF | FilterPy v1.4.5 | Reference implementation, less code | Less flexible for multi-rate fusion | <1ms per step | ⚠️ Fallback |
### Component: Coordinate System & Transformations (NEW — previously undefined)
**Reference frames**:
- **Camera frame (C)**: origin at camera optical center, Z forward, X right, Y down (OpenCV convention)
- **Body frame (B)**: origin at UAV CG, X forward (nose), Y right (starboard), Z down
- **NED frame (N)**: North-East-Down, origin at mission start point
- **WGS84**: latitude, longitude, altitude (output format)
**Transformation chain**:
1. **Pixel → Camera ray**: p_cam = K⁻¹ · [u, v, 1]ᵀ where K = camera intrinsic matrix (ADTI 20L V1: fx, fy from 16mm lens + APS-C sensor)
2. **Camera → Body**: p_body = T_cam_body · p_cam where T_cam_body is the fixed mounting rotation (camera points nadir: 90° pitch rotation from body X-forward to camera Z-down)
3. **Body → NED**: p_ned = R_body_ned(q) · p_body where q is the ESKF quaternion attitude estimate
4. **NED → WGS84**: lat = lat_origin + (180/π) · p_north / R_earth, lon = lon_origin + (180/π) · p_east / (R_earth · cos(lat_origin)), where (lat_origin, lon_origin) is the mission start GPS position in degrees (the 180/π factor converts the radian offset to degrees)
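Step 4 as code — a spherical-Earth, small-offset sketch assuming lat/lon in degrees and NED offsets in meters (`R_EARTH` is the mean Earth radius; an ellipsoidal model would be marginally more accurate but is unnecessary at these offsets):

```python
import math

R_EARTH = 6_371_000.0  # mean Earth radius, meters

def ned_to_wgs84(lat_origin, lon_origin, p_north, p_east):
    """Convert a local NED offset (meters) to WGS84 degrees."""
    lat = lat_origin + math.degrees(p_north / R_EARTH)
    lon = lon_origin + math.degrees(
        p_east / (R_EARTH * math.cos(math.radians(lat_origin))))
    return lat, lon
```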
**Camera intrinsic matrix K** (ADTI 20L V1 + 16mm lens):
- Sensor: 23.2 × 15.4 mm, Resolution: 5456 × 3632
- fx = fy = focal_mm × width_px / sensor_width_mm = 16 × 5456 / 23.2 = 3763 pixels
- cx = 2728, cy = 1816 (sensor center)
- Distortion: Brown model (k1, k2, p1, p2 from calibration)
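The datasheet-derived K above as a checkable sketch. It assumes square pixels (fx = fy), matching the numbers above; a proper checkerboard calibration should replace these values:

```python
import numpy as np

def k_from_datasheet(focal_mm, sensor_w_mm, w_px, h_px):
    """Pinhole intrinsics from lens focal length and sensor geometry,
    assuming square pixels and principal point at the sensor center."""
    f = focal_mm * w_px / sensor_w_mm  # fx = fy in pixels
    return np.array([[f,   0.0, w_px / 2.0],
                     [0.0, f,   h_px / 2.0],
                     [0.0, 0.0, 1.0]])

K = k_from_datasheet(16.0, 23.2, 5456, 3632)  # fx ~ 3763 px, cx = 2728, cy = 1816
```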
**T_cam_body** (camera mount):
- Navigation camera is fixed, pointing nadir (downward), not autostabilized
- R_cam_body ≈ I for this mount: camera Z (optical axis) aligns with body +Z — both point down for a nadir camera in a Z-down body frame — and camera X with body X; any in-plane mount yaw adds an R_z term
- Translation: offset from CG to camera mount (measured during assembly, typically <0.3m)
**Satellite match → WGS84**:
- Feature correspondences between camera frame and geo-referenced satellite tile
- Homography H maps camera pixels to satellite tile pixels
- Satellite tile pixel → WGS84 via tile's known georeference (zoom level + tile x,y → lat,lon)
- Camera center projects to satellite pixel (cx_sat, cy_sat) via H
- Convert (cx_sat, cy_sat) to WGS84 using tile georeference
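The tile-georeference step, assuming standard Web Mercator (slippy map) tiles: `(px, py)` is the matched camera-center pixel inside a 256×256 tile (a hypothetical helper; a mosaic just adds per-tile offsets first):

```python
import math

def tile_pixel_to_wgs84(zoom, tile_x, tile_y, px, py, tile_size=256):
    """Standard slippy-map inverse: tile index + sub-tile pixel -> WGS84."""
    n = 2 ** zoom
    xf = (tile_x + px / tile_size) / n          # fractional x across the world
    yf = (tile_y + py / tile_size) / n          # fractional y across the world
    lon = xf * 360.0 - 180.0
    lat = math.degrees(math.atan(math.sinh(math.pi * (1.0 - 2.0 * yf))))
    return lat, lon
```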
### Component: GPS_INPUT Message Population (NEW — previously undefined)
| GPS_INPUT Field | Source | Computation |
|-----------------|--------|-------------|
| lat, lon | ESKF position (NED) | NED → WGS84 conversion using mission origin |
| alt | ESKF position (Down) + mission origin altitude | alt = alt_origin - p_down |
| vn, ve, vd | ESKF velocity state | Direct from ESKF v[0], v[1], v[2] |
| fix_type | Confidence tier | 3 (3D fix) when satellite-anchored or VO tracking (see tier table below). 2 (2D) when IMU-only. 0 (no fix) after 3+ consecutive failures |
| hdop | ESKF horizontal covariance | hdop = sqrt(P[0,0] + P[1,1]) / 5.0 (approximate CEP→HDOP mapping) |
| vdop | ESKF vertical covariance | vdop = sqrt(P[2,2]) / 5.0 |
| horiz_accuracy | ESKF horizontal covariance | horiz_accuracy = sqrt(P[0,0] + P[1,1]) meters |
| vert_accuracy | ESKF vertical covariance | vert_accuracy = sqrt(P[2,2]) meters |
| speed_accuracy | ESKF velocity covariance | speed_accuracy = sqrt(P[3,3] + P[4,4]) m/s |
| time_week, time_week_ms | System time | Convert Unix time to GPS time (GPS epoch = 1980-01-06; add the GPS−UTC leap-second offset, currently 18 s — GPS runs ahead of UTC) |
| satellites_visible | Constant | 10 (synthetic — prevents satellite-count failsafes in ArduPilot) |
| gps_id | Constant | 0 |
| ignore_flags | Constant | 0 (provide all fields) |
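A pymavlink sketch of the table above. Field order follows the common-dialect `gps_input_send` signature; `master` is an open `mavutil` connection, the accessor arguments are illustrative, and the leap-second constant must be kept current:

```python
import time

GPS_EPOCH_UNIX = 315_964_800   # 1980-01-06T00:00:00Z in Unix time
GPS_MINUS_UTC_S = 18           # GPS runs ahead of UTC (value as of 2025)

def gps_time_from_unix(unix_s: float):
    """Unix time -> (time_week, time_week_ms) for GPS_INPUT."""
    gps_s = unix_s - GPS_EPOCH_UNIX + GPS_MINUS_UTC_S
    return int(gps_s // 604_800), int((gps_s % 604_800) * 1000)

def send_gps_input(master, lat_deg, lon_deg, alt_m, vn, ve, vd,
                   fix_type, hdop, vdop, h_acc, v_acc, s_acc):
    """master: a pymavlink mavutil connection to the flight controller."""
    week, week_ms = gps_time_from_unix(time.time())
    master.mav.gps_input_send(
        int(time.time() * 1e6),    # time_usec
        0,                         # gps_id
        0,                         # ignore_flags: provide all fields
        week_ms, week, fix_type,
        int(lat_deg * 1e7),        # lat, degE7
        int(lon_deg * 1e7),        # lon, degE7
        alt_m,                     # alt, meters MSL
        hdop, vdop, vn, ve, vd,
        s_acc, h_acc, v_acc,       # speed / horiz / vert accuracy
        10,                        # satellites_visible (synthetic)
    )
```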
**Confidence tiers** mapping to GPS_INPUT:
| Tier | Condition | fix_type | horiz_accuracy | Rationale |
|------|-----------|----------|----------------|-----------|
| HIGH | Satellite match <30s ago, ESKF covariance < 400m² | 3 (3D fix) | From ESKF P (typically 5-20m) | Absolute position anchor recent |
| MEDIUM | cuVSLAM tracking OK, no recent satellite match | 3 (3D fix) | From ESKF P (typically 20-50m) | Relative tracking valid, drift growing |
| LOW | cuVSLAM lost, IMU-only | 2 (2D fix) | From ESKF P (50-200m+, growing) | Only IMU dead reckoning, rapid drift |
| FAILED | 3+ consecutive total failures | 0 (no fix) | 999.0 | System cannot determine position |
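The tier table as a pure function — a sketch in which the argument names are illustrative and the 400 m² covariance gate from the HIGH row appears explicitly:

```python
def fix_type_and_accuracy(sat_match_age_s, vo_tracking, eskf_h_acc_m,
                          consecutive_failures):
    """Map system state to (fix_type, horiz_accuracy) per the tier table."""
    if consecutive_failures >= 3:
        return 0, 999.0                                   # FAILED: no fix
    if not vo_tracking:
        return 2, eskf_h_acc_m                            # LOW: IMU-only
    if sat_match_age_s < 30.0 and eskf_h_acc_m ** 2 < 400.0:
        return 3, eskf_h_acc_m                            # HIGH: anchored
    return 3, eskf_h_acc_m                                # MEDIUM: drift growing
```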
### Component: Disconnected Route Segment Handling (NEW — previously undefined)
**Trigger**: cuVSLAM reports tracking_lost OR tracking confidence drops below threshold
**Algorithm**:
```
STATE: TRACKING_NORMAL
cuVSLAM provides relative pose
ESKF VO measurement updates at 0.7Hz
Satellite matching on keyframes (every 5-10 frames)
STATE: TRACKING_LOST (enter when cuVSLAM reports loss)
1. ESKF continues with IMU-only prediction (no VO updates)
→ uncertainty grows rapidly (~1-5 m/s drift with consumer IMU)
2. Switch satellite matching to EVERY frame (not just keyframes)
→ maximize chances of getting absolute correction
3. For each camera frame:
a. Attempt satellite match using ESKF predicted position ± 3σ for tile selection
b. If match succeeds (RANSAC inlier ratio > 30%):
→ ESKF measurement update with satellite position
→ Restart cuVSLAM with current frame as new origin
→ Transition to TRACKING_NORMAL
→ Reset failure counter
c. If match fails:
→ Increment failure_counter
→ Continue IMU-only ESKF prediction
4. If failure_counter >= 3:
→ Send re-localization request to ground station
→ GPS_INPUT fix_type = 0 (no fix), horiz_accuracy = 999.0
→ Continue attempting satellite matching on each frame
5. If operator sends re-localization hint (approximate lat,lon):
→ Use as ESKF measurement with high covariance (~500m)
→ Attempt satellite match in that area
→ On success: transition to TRACKING_NORMAL
STATE: SEGMENT_DISCONNECT
After re-localization following tracking loss:
→ New cuVSLAM track is independent of previous track
→ ESKF maintains global NED position continuity via satellite anchor
→ No need to "connect" segments at the cuVSLAM level
→ ESKF already handles this: satellite corrections keep global position consistent
```
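A minimal Python skeleton of this state machine. Class, enum, and action names are illustrative assumptions; real VO/matcher callbacks would replace the string actions:

```python
from enum import Enum, auto

class NavState(Enum):
    TRACKING_NORMAL = auto()
    TRACKING_LOST = auto()

class TrackingSupervisor:
    def __init__(self, max_failures=3):
        self.state = NavState.TRACKING_NORMAL
        self.failure_counter = 0
        self.max_failures = max_failures

    def on_vo_status(self, tracking_ok: bool):
        """Called with cuVSLAM's per-frame tracking status."""
        if not tracking_ok:
            self.state = NavState.TRACKING_LOST

    def on_satellite_match(self, inlier_ratio: float) -> str:
        """Returns the action the pipeline should take for this frame."""
        if self.state is NavState.TRACKING_NORMAL:
            return "fuse_keyframe_match"
        if inlier_ratio > 0.30:                  # match accepted
            self.state = NavState.TRACKING_NORMAL
            self.failure_counter = 0
            return "fuse_and_restart_vo"
        self.failure_counter += 1
        if self.failure_counter >= self.max_failures:
            return "request_relocalization"      # fix_type=0 handled upstream
        return "imu_only_predict"
```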
### Component: Satellite Image Matching Pipeline (UPDATED — added GSD + tile selection details)
**GSD normalization**:
- Camera GSD at 600m: ~15.9 cm/pixel (ADTI 20L V1 + 16mm)
- Satellite tile GSD at zoom 18: ~0.6 m/pixel (equatorial value; ground resolution scales with cos(latitude), ~0.4 m/px at 48°N — the figures below use the conservative 0.6)
- Scale ratio: ~3.8:1
- Downsample camera image to satellite GSD before matching: resize from 5456×3632 to ~1440×960 (matching zoom 18 GSD)
- This is close to LiteSAM's 1280px input — use 1280px; the minor residual GSD mismatch is acceptable for matching
**Tile selection**:
- Input: ESKF position estimate (lat, lon) + horizontal covariance σ_h
- Search radius: max(3·σ_h, 500m) — at least 500m to handle initial uncertainty
- Compute geohash for center position → load tiles covering the search area
- Assemble tile mosaic if needed (typically 2×2 to 4×4 tiles for adequate coverage)
- If ESKF uncertainty > 2km: tile selection unreliable, fall back to wider search or request operator input
**Tile storage calculation** (zoom 18 — 0.6 m/pixel):
- Each 256×256 tile covers ~153m × 153m
- Flight path 200km with ±2km buffer: area ≈ 200km × 4km = 800 km²
- Tiles needed: 800,000,000 / (153 × 153) ≈ 34,200 tiles
- Storage: ~10-15KB per JPEG tile → ~340-510 MB
- With zoom 19 overlap tiles for higher precision: ×4 = ~1.4-2.0 GB
- Recommended: zoom 18 primary + zoom 19 for ±500m along flight path → ~500-800 MB total
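The storage arithmetic above, reproduced as a checkable sketch (same round numbers; again, 0.6 m/px is the equatorial zoom-18 figure, so coverage at ~48°N is conservative):

```python
def tile_budget(area_km2, tile_m=153.0, kb_per_tile=12.5):
    """Tile count and approximate storage (GB) for a corridor of area_km2."""
    tiles = area_km2 * 1_000_000 / (tile_m * tile_m)   # 153 m square per tile
    return tiles, tiles * kb_per_tile / 1_000_000       # KB -> GB

tiles, gb = tile_budget(800.0)   # 200 km x 4 km corridor
# roughly 34,000 tiles and ~0.4 GB at ~12.5 KB/tile
```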
| Solution | Tools | Advantages | Limitations | Performance (est. Orin Nano Super TRT FP16) | Params | Fit |
|----------|-------|-----------|-------------|----------------------------------------------|--------|-----|
| LiteSAM (opt) TRT Engine FP16 @ 1280px | trtexec + tensorrt Python | Best satellite-aerial accuracy (RMSE@30=17.86m UAV-VisLoc), 6.31M params | MinGRU TRT export needs verification (LOW-MEDIUM risk) | Est. ~165-330ms | 6.31M | ✅ Primary |
| EfficientLoFTR TRT Engine FP16 | trtexec + tensorrt Python | Proven TRT path (Coarse_LoFTR_TRT). Semi-dense. CVPR 2024. | 2.4x more params than LiteSAM. | Est. ~200-400ms | 15.05M | ✅ Fallback if LiteSAM TRT fails |
| XFeat TRT Engine FP16 | trtexec + tensorrt Python | Fastest. Proven TRT implementation. | General-purpose, not designed for cross-view gap. | Est. ~50-100ms | <5M | ✅ Speed fallback |
### Component: cuVSLAM Configuration (NEW — previously undefined)
**Mode**: Inertial (mono camera + IMU)
**Camera configuration** (ADTI 20L V1 + 16mm lens):
- Model: Brown distortion
- fx = fy = 3763 px (16mm on 23.2mm sensor at 5456px width)
- cx = 2728 px, cy = 1816 px
- Distortion coefficients: from calibration (k1, k2, p1, p2)
- Border: 50px (ignore lens edge distortion)
**IMU configuration** (Pixhawk 6x IMU — ICM-42688-P):
- Gyroscope noise density: 3.0 × 10⁻³ °/s/√Hz
- Gyroscope random walk: 5.0 × 10⁻⁵ °/s²/√Hz
- Accelerometer noise density: 70 µg/√Hz
- Accelerometer random walk: ~2.0 × 10⁻³ m/s³/√Hz
- IMU frequency: 200 Hz (from flight controller via MAVLink)
- T_imu_rig: measured transformation from Pixhawk IMU to camera center (translation + rotation)
**cuVSLAM settings**:
- OdometryMode: INERTIAL
- MulticameraMode: PRECISION (favor accuracy over speed — we have 1430ms budget)
- Input resolution: downsample to 1280×852 (or 720p) for processing speed
- async_bundle_adjustment: True
**Initialization**:
- cuVSLAM initializes automatically when it receives the first camera frame + IMU data
- First few frames used for feature initialization and scale estimation
- First satellite match validates and corrects the initial position
**Calibration procedure** (one-time per hardware unit):
1. Camera intrinsics: checkerboard calibration with OpenCV (or use manufacturer data if available)
2. Camera-IMU extrinsic (T_imu_rig): Kalibr tool with checkerboard + IMU data
3. IMU noise parameters: Allan variance analysis or use datasheet values
4. Store calibration files on Jetson storage
### Component: AI Model Inference Runtime (UNCHANGED)
Native TRT Engine — optimal performance and memory on fixed NVIDIA hardware. See draft05 for full comparison table and conversion workflow.
### Component: Visual Odometry (UNCHANGED)
cuVSLAM in Inertial mode, fed by ADTI 20L V1 at 0.7 fps sustained. See draft05 for feasibility analysis at 0.7fps.
### Component: Flight Controller Integration (UPDATED — added GPS_INPUT field spec)
pymavlink over UART at 5-10Hz. GPS_INPUT field population defined above.
ArduPilot configuration:
- GPS1_TYPE = 14 (MAVLink)
- GPS_RATE = 5 (minimum, matching our 5-10Hz output)
- EK3_SRC1_POSXY = 3 (GPS), EK3_SRC1_VELXY = 3 (GPS) — EKF3 uses GPS_INPUT as its position/velocity source
### Component: Object Localization (NEW — previously undefined)
**Input**: pixel coordinates (u, v) in Viewpro A40 Pro image, current gimbal angles (pan_deg, tilt_deg), zoom factor, UAV position from GPS-denied system, UAV altitude
**Process**:
1. Pixel → camera ray: ray_cam = K_viewpro⁻¹(zoom) · [u, v, 1]ᵀ
2. Camera → gimbal frame: ray_gimbal = R_gimbal(pan, tilt) · ray_cam
3. Gimbal → body: ray_body = T_gimbal_body · ray_gimbal
4. Body → NED: ray_ned = R_body_ned(q) · ray_body
5. Ray-ground intersection: assuming flat terrain at UAV altitude h: t = -h / ray_ned[2], p_ground_ned = p_uav_ned + t · ray_ned
6. NED → WGS84: convert to lat, lon
**Output**: { lat, lon, accuracy_m, confidence }
- accuracy_m propagated from: UAV position accuracy (from ESKF) + gimbal angle uncertainty + altitude uncertainty
**API endpoint**: POST /objects/locate
- Request: { pixel_x, pixel_y, gimbal_pan_deg, gimbal_tilt_deg, zoom_factor }
- Response: { lat, lon, alt, accuracy_m, confidence, uav_position: {lat, lon, alt}, timestamp }
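Steps 1–5 above, sketched for flat terrain. The rotation `R_cam_to_ned` (composed from the gimbal, mounting, and attitude rotations) and the function name are assumptions for illustration:

```python
import numpy as np

def locate_object(u, v, K, R_cam_to_ned, p_uav_ned, ground_down=0.0):
    """Intersect the pixel ray with a flat ground plane at NED down =
    ground_down; returns the object's NED position."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # pixel -> camera ray
    ray_ned = R_cam_to_ned @ ray_cam                     # rotate into NED
    if ray_ned[2] <= 1e-6:                               # ray not pointing down
        raise ValueError("ray does not intersect the ground plane")
    t = (ground_down - p_uav_ned[2]) / ray_ned[2]        # scale to ground
    return p_uav_ned + t * ray_ned
```

With a nadir camera at 600 m (NED down = −600) the image-center ray lands directly below the UAV, as expected.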
### Component: Startup, Handoff & Failsafe (UPDATED — added handoff + reboot + re-localization)
**GPS-denied handoff protocol**:
- GPS-denied system runs continuously from companion computer boot
- Reads initial position from FC (GLOBAL_POSITION_INT) — this may be real GPS or last known
- First satellite match validates the initial position
- FC receives both real GPS (if available) and GPS_INPUT; FC EKF selects best source based on accuracy
- No explicit "switch" — the GPS-denied system is a secondary GPS source
**Startup sequence** (expanded from draft05):
1. Boot Jetson → start GPS-Denied service (systemd)
2. Connect to flight controller via pymavlink on UART
3. Wait for heartbeat
4. Initialize PyCUDA context
5. Load TRT engines: litesam.engine + xfeat.engine (~1-3s each)
6. Allocate GPU I/O buffers
7. Create CUDA streams: Stream A (cuVSLAM), Stream B (satellite matching)
8. Load camera calibration + IMU calibration files
9. Read GLOBAL_POSITION_INT → set mission origin (NED reference point) → init ESKF
10. Start cuVSLAM (Inertial mode) with ADTI 20L V1 camera stream
11. Preload satellite tiles within ±2km into RAM
12. Trigger first satellite match → validate initial position
13. Begin GPS_INPUT output loop at 5-10Hz
14. System ready
**Mid-flight reboot recovery**:
1. Jetson boots (~30-60s)
2. GPS-Denied service starts, connects to FC
3. Read GLOBAL_POSITION_INT (FC's current IMU-extrapolated position)
4. Init ESKF with this position + HIGH uncertainty covariance (σ = 200m)
5. Load TRT engines (~2-6s total)
6. Start cuVSLAM (fresh, no prior map)
7. Immediate satellite matching on first camera frame
8. On satellite match success: ESKF corrected, uncertainty drops
9. Estimated total recovery: ~35-70s
10. During recovery: FC uses IMU-only dead reckoning (at 70 km/h: ~700-1400m uncontrolled drift)
11. **Known limitation**: recovery time is dominated by Jetson boot time
**3-consecutive-failure re-localization**:
- Trigger: VO lost + satellite match failed × 3 consecutive camera frames
- Action: send re-localization request via MAVLink STATUSTEXT or custom message
- Message content: "RELOC_REQ: last_lat={lat} last_lon={lon} uncertainty={σ}m"
- Operator response: MAVLink COMMAND_LONG with approximate lat/lon
- System: use operator position as ESKF measurement with R = diag(500², 500², 100²) meters²
- System continues satellite matching with updated search area
- While waiting: GPS_INPUT fix_type=0, IMU-only ESKF prediction continues
### Component: Ground Station Telemetry (UPDATED — added re-localization)
MAVLink messages to ground station:
| Message | Rate | Content |
|---------|------|---------|
| NAMED_VALUE_FLOAT "gps_conf" | 1Hz | Confidence score (0.0-1.0) |
| NAMED_VALUE_FLOAT "gps_drift" | 1Hz | Estimated drift from last satellite anchor (meters) |
| NAMED_VALUE_FLOAT "gps_hacc" | 1Hz | Horizontal accuracy (meters, from ESKF) |
| STATUSTEXT | On event | "RELOC_REQ: ..." for re-localization request |
| STATUSTEXT | On event | Tracking loss / recovery notifications |
### Component: Thermal Management (UNCHANGED)
Same adaptive pipeline from draft05. Active cooling required at 25W. Throttling at 80°C SoC junction.
### Component: API & Inter-System Communication (NEW — consolidated)
FastAPI (Uvicorn) running locally on Jetson for inter-process communication with other onboard systems.
| Endpoint | Method | Purpose | Auth |
|----------|--------|---------|------|
| /sessions | POST | Start GPS-denied session | JWT |
| /sessions/{id}/stream | GET (SSE) | Real-time position + confidence stream | JWT |
| /sessions/{id}/anchor | POST | Operator re-localization hint | JWT |
| /sessions/{id} | DELETE | End session | JWT |
| /objects/locate | POST | Object GPS from pixel coordinates | JWT |
| /health | GET | System health + memory + thermal | None |
**SSE event schema** (1Hz):
```json
{
"type": "position",
"timestamp": "2026-03-17T12:00:00.000Z",
"lat": 48.123456,
"lon": 37.654321,
"alt": 600.0,
"accuracy_h": 15.2,
"accuracy_v": 8.1,
"confidence": "HIGH",
"drift_from_anchor": 12.5,
"vo_status": "tracking",
"last_satellite_match_age_s": 8.3
}
```
## UAV Platform
Unchanged from draft05. See draft05 for: airframe configuration (3.5m S-2 composite, 12.5kg AUW), flight performance (3.4h endurance at 50 km/h), camera specifications (ADTI 20L V1 + 16mm, Viewpro A40 Pro), ground coverage calculations.
## Speed Optimization Techniques
Unchanged from draft05. Key points: cuVSLAM ~9ms/frame, native TRT Engine (no ONNX RT), dual CUDA streams, 5-10Hz GPS_INPUT from ESKF IMU prediction.
## Processing Time Budget
Unchanged from draft05. VO frame: ~17-22ms. Satellite matching: ≤210ms async. Well within 1430ms frame interval.
## Memory Budget (Jetson Orin Nano Super, 8GB shared)
| Component | Memory | Notes |
|-----------|--------|-------|
| OS + runtime | ~1.5GB | JetPack 6.2 + Python |
| cuVSLAM | ~200-500MB | CUDA library + map |
| LiteSAM TRT engine | ~50-80MB | If LiteSAM fails: EfficientLoFTR ~100-150MB |
| XFeat TRT engine | ~30-50MB | |
| Preloaded satellite tiles | ~200MB | ±2km of flight plan |
| pymavlink + MAVLink | ~20MB | |
| FastAPI (local IPC) | ~50MB | |
| ESKF + buffers | ~10MB | |
| **Total** | **~2.1-2.9GB** | **26-36% of 8GB** |
## Key Risks and Mitigations
| Risk | Likelihood | Impact | Mitigation |
|------|-----------|--------|------------|
| LiteSAM MinGRU ops unsupported in TRT 10.3 | LOW-MEDIUM | LiteSAM TRT export fails | Day-one verification. Fallback: EfficientLoFTR TRT → XFeat TRT. |
| cuVSLAM fails on low-texture terrain at 0.7fps | HIGH | Frequent tracking loss | Satellite matching corrections bound drift. Re-localization pipeline handles tracking loss. IMU bridges short gaps. |
| Google Maps satellite quality in conflict zone | HIGH | Satellite matching fails, outdated imagery | Pre-flight tile validation. Consider alternative providers (Bing, Mapbox). Robust to seasonal appearance changes via feature-based matching. |
| ESKF scale drift during long constant-velocity segments | MEDIUM | Position error exceeds 100m between satellite anchors | Satellite corrections every 7-14s re-anchor. Altitude constraint from barometer. Monitor drift rate — if >50m between corrections, increase satellite matching frequency. |
| Monocular scale ambiguity | MEDIUM | Metric scale lost during constant-velocity flight | Satellite absolute corrections provide scale. Known altitude constrains vertical scale. IMU acceleration during turns provides observability. |
| AUW exceeds AT4125 recommended range | MEDIUM | Reduced endurance, motor thermal stress | 12.5 kg vs 8-10 kg recommended. Monitor motor temps. Weight optimization. |
| ADTI mechanical shutter lifespan | MEDIUM | Replacement needed periodically | ~8,800 actuations/flight at 0.7fps. Estimated 11-57 flights before replacement. Budget as consumable. |
| Mid-flight companion computer failure | LOW | ~35-70s position gap | Reboot recovery procedure defined. FC uses IMU dead reckoning during gap. Known limitation. |
| Thermal throttling on Jetson | MEDIUM | Satellite matching latency increases | Active cooling required. Monitor SoC temp. Throttling at 80°C. Our workload ~8-15W typical — well under 25W TDP. |
| Engine incompatibility after JetPack update | MEDIUM | Must rebuild engines | Include engine rebuild in update procedure. |
| TRT engine build OOM on 8GB | LOW | Cannot build on target | Models small (6.31M, <5M). Reduce --memPoolSize if needed. |
## Testing Strategy
### Integration / Functional Tests
- **ESKF correctness**: Feed recorded IMU + synthetic VO/satellite data → verify output matches reference ESKF implementation
- **GPS_INPUT field validation**: Send GPS_INPUT to SITL ArduPilot → verify EKF accepts and uses the data correctly
- **Coordinate transform chain**: Known GPS → NED → pixel → back to GPS — verify round-trip error <0.1m
- **Disconnected segment handling**: Simulate tracking loss → verify satellite re-localization triggers → verify cuVSLAM restarts → verify ESKF position continuity
- **3-consecutive-failure**: Simulate VO + satellite failures → verify re-localization request sent → verify operator hint accepted
- **Object localization**: Known object at known GPS → verify computed GPS matches within camera accuracy
- **Mid-flight reboot**: Kill GPS-denied process → restart → verify recovery within expected time → verify position accuracy after recovery
- **TRT engine load test**: Verify engines load successfully on Jetson
- **TRT inference correctness**: Compare TRT output vs PyTorch reference (max L1 error < 0.01)
- **CUDA Stream pipelining**: Verify Stream B satellite matching does not block Stream A VO
- **ADTI sustained capture rate**: Verify 0.7fps sustained >30 min without buffer overflow
- **Confidence tier transitions**: Verify fix_type and accuracy change correctly across HIGH → MEDIUM → LOW → FAILED transitions
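The coordinate round-trip check can be sketched as a self-contained pytest-style test (spherical-Earth approximation consistent with the transform chain; function bodies here are illustrative stand-ins for the real transform module):

```python
import math

R_EARTH = 6_371_000.0  # mean Earth radius, meters

def ned_to_wgs84(lat0, lon0, n, e):
    lat = lat0 + math.degrees(n / R_EARTH)
    lon = lon0 + math.degrees(e / (R_EARTH * math.cos(math.radians(lat0))))
    return lat, lon

def wgs84_to_ned(lat0, lon0, lat, lon):
    n = math.radians(lat - lat0) * R_EARTH
    e = math.radians(lon - lon0) * R_EARTH * math.cos(math.radians(lat0))
    return n, e

def test_round_trip():
    # forward then inverse must reproduce the NED offset within 0.1 m
    lat, lon = ned_to_wgs84(48.0, 37.0, 1234.5, -678.9)
    n, e = wgs84_to_ned(48.0, 37.0, lat, lon)
    assert abs(n - 1234.5) < 0.1 and abs(e + 678.9) < 0.1
```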
### Non-Functional Tests
- **End-to-end accuracy** (primary validation): Fly with real GPS recording → run GPS-denied system in parallel → compare estimated vs real positions → verify 80% within 50m, 60% within 20m
- **VO drift rate**: Measure cuVSLAM drift over 1km straight segment without satellite correction
- **Satellite matching accuracy**: Compare satellite-matched position vs real GPS at known locations
- **Processing time**: Verify end-to-end per-frame <400ms
- **Memory usage**: Monitor over 30-min session → verify <8GB, no leaks
- **Thermal**: Sustained 30-min run → verify no throttling
- **GPS_INPUT rate**: Verify consistent 5-10Hz delivery to FC
- **Tile storage**: Validate calculated storage matches actual for test mission area
- **MinGRU TRT compatibility** (day-one blocker): Clone LiteSAM → ONNX export → polygraphy → trtexec
- **Flight endurance**: Ground-test full system power draw against 267W estimate
## References
- ArduPilot GPS_RATE parameter: https://github.com/ArduPilot/ardupilot/pull/15980
- MAVLink GPS_INPUT message: https://ardupilot.org/mavproxy/docs/modules/GPSInput.html
- pymavlink GPS_INPUT example: https://webperso.ensta.fr/lebars/Share/GPS_INPUT_pymavlink.py
- ESKF reference (fixed-wing UAV): https://github.com/ludvigls/ESKF
- ROS ESKF multi-sensor: https://github.com/EliaTarasov/ESKF
- Range-VIO scale observability: https://arxiv.org/abs/2103.15215
- NaviLoc trajectory-level localization: https://www.mdpi.com/2504-446X/10/2/97
- SatLoc-Fusion hierarchical framework: https://www.scilit.com/publications/e5cafaf875a49297a62b298a89d5572f
- Auterion GPS-denied workflow: https://docs.auterion.com/vehicle-operation/auterion-mission-control/useful-resources/operations/gps-denied-workflow
- PX4 GNSS-denied flight: https://docs.px4.io/main/en/advanced_config/gnss_degraded_or_denied_flight.html
- ArduPilot GPS_INPUT advanced usage: https://discuss.ardupilot.org/t/advanced-usage-of-gps-type-mav-14/99406
- Google Maps Ukraine imagery: https://newsukraine.rbc.ua/news/google-maps-has-surprise-for-satellite-imagery-1727182380.html
- Jetson Orin Nano Super thermal: https://edgeaistack.app/blog/jetson-orin-nano-power-consumption/
- GSD matching research: https://www.kjrs.org/journal/view.html?pn=related&uid=756&vmd=Full
- VO+satellite matching pipeline: https://polen.itu.edu.tr/items/1fe1e872-7cea-44d8-a8de-339e4587bee6
- PyCuVSLAM docs: https://wiki.seeedstudio.com/pycuvslam_recomputer_robotics/
- Pixhawk 6x IMU (ICM-42688-P) datasheet: https://invensense.tdk.com/products/motion-tracking/6-axis/icm-42688-p/
- All references from solution_draft05.md
## Related Artifacts
- AC Assessment: `_docs/00_research/gps_denied_nav/00_ac_assessment.md`
- Completeness assessment research: `_docs/00_research/solution_completeness_assessment/`
- Previous research: `_docs/00_research/trt_engine_migration/`
- Tech stack evaluation: `_docs/01_solution/tech_stack.md` (needs sync with draft05 corrections)
- Security analysis: `_docs/01_solution/security_analysis.md`
- Previous draft: `_docs/01_solution/solution_draft05.md`