Refactor acceptance criteria, problem description, and restrictions for UAV GPS-Denied system. Enhance clarity and detail in performance metrics, image processing requirements, and operational constraints. Introduce new sections for UAV specifications, camera details, satellite imagery, and onboard hardware.

Oleksandr Bezdieniezhnykh
2026-03-17 09:00:06 +02:00
parent 767874cb90
commit f2aa95c8a2
35 changed files with 4857 additions and 26 deletions
@@ -0,0 +1,102 @@
# Question Decomposition
## Original Question
Assess solution_draft02.md against updated acceptance criteria and restrictions. The AC and restrictions have been significantly revised to reflect real onboard deployment requirements (MAVLink integration, ground station telemetry, startup/failsafe, object localization, thermal management, satellite imagery specs).
## Active Mode
Mode B: Solution Assessment — `solution_draft02.md` is the latest draft in OUTPUT_DIR.
## Question Type
Problem Diagnosis + Decision Support
## Research Subject Boundary
- **Population**: GPS-denied UAV navigation systems on edge hardware (Jetson Orin Nano Super)
- **Geography**: Eastern/southern Ukraine (east of Dnipro River), conflict zone
- **Timeframe**: Current (2025-2026), latest available tools and libraries
- **Level**: Onboard companion computer, real-time flight controller integration via MAVLink
## Key Delta: What Changed in AC/Restrictions
### Restrictions Changes
1. Two cameras: Navigation (fixed, downward) + AI camera (configurable angle/zoom)
2. Processing on Jetson Orin Nano Super (was "stationary computer or laptop")
3. IMU data IS available via flight controller (was "NO data from IMU")
4. MAVLink protocol via MAVSDK for flight controller communication
5. Must output GPS_INPUT messages as GPS replacement
6. Ground station telemetry link available but bandwidth-limited
7. Thermal throttling must be accounted for
8. Satellite imagery pre-loaded, storage limited
### Acceptance Criteria Changes
1. <400ms per frame to flight controller (was <5s for processing)
2. MAVLink GPS_INPUT output (was REST API + SSE)
3. Ground station: stream position/confidence, receive re-localization commands
4. Object localization: trigonometric GPS from AI camera angle/zoom/altitude
5. Startup: initialize from last known GPS before GPS denial
6. Failsafe: IMU-only fallback after N seconds of total failure
7. Reboot recovery: re-initialize from flight controller IMU-extrapolated position
8. Max cumulative VO drift <100m between satellite anchors
9. Confidence score per position estimate (high/low)
10. Satellite imagery: resolution of 0.5 m/pixel or finer, <2 years old
11. WGS84 output format
12. Re-localization via telemetry to ground station (not REST API user input)
## Decomposed Sub-Questions
### Q1: MAVLink GPS_INPUT Integration
- How does MAVSDK Python handle GPS_INPUT messages?
- What fields are required in GPS_INPUT?
- What update rate does the flight controller expect?
- Can we send confidence/accuracy indicators via MAVLink?
- How does this replace the REST API + SSE architecture?
### Q2: Ground Station Telemetry Integration
- How to stream position + confidence over bandwidth-limited telemetry?
- How to receive operator re-localization commands?
- What MAVLink messages support custom data?
- What bandwidth is typical for UAV telemetry links?
### Q3: Startup & Failsafe Mechanisms
- How to initialize from flight controller's last GPS position?
- How to detect GPS denial onset?
- What happens on companion computer reboot mid-flight?
- How to implement IMU-only dead reckoning fallback?
### Q4: Object Localization via AI Camera
- How to compute ground GPS from UAV position + camera angle + zoom + altitude?
- What accuracy can be expected given GPS-denied position error?
- How to handle the API between GPS-denied system and AI detection system?
### Q5: Thermal Management on Jetson Orin Nano Super
- What is sustained thermal performance under GPU load?
- How to monitor and mitigate thermal throttling?
- What power modes are available?
### Q6: VO Drift Budget & Monitoring
- How to measure cumulative drift between satellite anchors?
- How to trigger satellite matching when drift approaches 100m?
- ESKF covariance as drift proxy?
### Q7: Weak Points in Draft02 Architecture
- REST API + SSE architecture is wrong — must be MAVLink
- No ground station integration
- No startup/shutdown procedures
- No thermal management
- No object localization detail for AI camera with configurable angle/zoom
- Memory budget doesn't account for MAVSDK overhead
## Timeliness Sensitivity Assessment
- **Research Topic**: MAVLink integration, MAVSDK for Jetson, ground station telemetry, thermal management
- **Sensitivity Level**: 🟠 High
- **Rationale**: MAVSDK actively developed; MAVLink message set evolving; Jetson JetPack 6.2 specific
- **Source Time Window**: 12 months
- **Priority official sources**:
1. MAVSDK Python documentation (mavsdk.io)
2. MAVLink message definitions (mavlink.io)
3. NVIDIA Jetson Orin Nano thermal documentation
4. PX4/ArduPilot GPS_INPUT documentation
- **Key version information**:
- MAVSDK-Python: latest PyPI version
- MAVLink: v2 protocol
- JetPack: 6.2.2
- PyCuVSLAM: v15.0.0
@@ -0,0 +1,175 @@
# Source Registry
## Source #1
- **Title**: MAVSDK-Python Issue #320: Input external GPS through MAVSDK
- **Link**: https://github.com/mavlink/MAVSDK-Python/issues/320
- **Tier**: L4
- **Publication Date**: 2021 (still open 2025)
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: MAVSDK-Python — GPS_INPUT not supported as of v3.15.3
- **Target Audience**: Companion computer developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: MAVSDK-Python does not support GPS_INPUT message. Feature requested but unresolved.
- **Related Sub-question**: Q1
## Source #2
- **Title**: MAVLink GPS_INPUT Message Specification (mavlink_msg_gps_input.h)
- **Link**: https://rflysim.com/doc/en/RflySimAPIs/RflySimSDK/html/mavlink__msg__gps__input_8h_source.html
- **Tier**: L1
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: MAVLink v2, Message ID 232
- **Target Audience**: MAVLink developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: GPS_INPUT message fields: lat/lon (1E7), alt, fix_type, horiz_accuracy, vert_accuracy, speed_accuracy, hdop, vdop, satellites_visible, velocities NED, yaw, ignore_flags.
- **Related Sub-question**: Q1
## Source #3
- **Title**: ArduPilot GPS Input MAVProxy Documentation
- **Link**: https://ardupilot.org/mavproxy/docs/modules/GPSInput.html
- **Tier**: L1
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: ArduPilot GPS1_TYPE=14
- **Target Audience**: ArduPilot companion computer developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: GPS_INPUT requires GPS1_TYPE=14. Accepts JSON over UDP port 25100. Fields: lat, lon, alt, fix_type, hdop, timestamps.
- **Related Sub-question**: Q1
## Source #4
- **Title**: pymavlink GPS_INPUT example
- **Link**: https://webperso.ensta.fr/lebars/Share/GPS_INPUT_pymavlink.py
- **Tier**: L3
- **Publication Date**: 2023
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: pymavlink
- **Target Audience**: Companion computer developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Complete pymavlink example for sending GPS_INPUT with all fields including yaw. Uses gps_input_send() method.
- **Related Sub-question**: Q1
## Source #5
- **Title**: ArduPilot AP_GPS_Params.cpp — GPS_RATE_MS
- **Link**: https://cocalc.com/github/ardupilot/ardupilot/blob/master/libraries/AP_GPS/AP_GPS_Params.cpp
- **Tier**: L1
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: ArduPilot master
- **Target Audience**: ArduPilot developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: GPS_RATE_MS default 200ms (5Hz), range 50-200ms (5-20Hz). Below 5Hz not allowed.
- **Related Sub-question**: Q1
## Source #6
- **Title**: MAVLink Telemetry Bandwidth Optimization Issue #1605
- **Link**: https://github.com/mavlink/mavlink/issues/1605
- **Tier**: L2
- **Publication Date**: 2022
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: MAVLink protocol developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Minimal telemetry requires ~12kbit/s. Optimized ~6kbit/s. SiK at 57600 baud provides ~21% usable budget. RFD900 for long range (15km+).
- **Related Sub-question**: Q2
## Source #7
- **Title**: NVIDIA JetPack 6.2 Super Mode Blog
- **Link**: https://developer.nvidia.com/blog/nvidia-jetpack-6-2-brings-super-mode-to-nvidia-jetson-orin-nano-and-jetson-orin-nx-modules/
- **Tier**: L1
- **Publication Date**: 2025-01
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: JetPack 6.2, Orin Nano Super
- **Target Audience**: Jetson developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: MAXN SUPER mode for peak performance. Thermal throttling at 80°C. Power modes: 15W, 25W, MAXN SUPER. Up to 1.7x AI boost.
- **Related Sub-question**: Q5
## Source #8
- **Title**: Jetson Orin Nano Power Consumption Analysis
- **Link**: https://edgeaistack.app/blog/jetson-orin-nano-power-consumption/
- **Tier**: L3
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: Jetson edge deployment engineers
- **Research Boundary Match**: ✅ Full match
- **Summary**: 5W idle, 8-12W typical inference, 25W peak. Throttling above 80°C drops GPU from 1GHz to 300MHz. Active cooling required for sustained loads.
- **Related Sub-question**: Q5
## Source #9
- **Title**: UAV Target Geolocation (Sensors 2022)
- **Link**: https://www.mdpi.com/1424-8220/22/5/1903
- **Tier**: L1
- **Publication Date**: 2022
- **Timeliness Status**: ✅ Currently valid (math doesn't change)
- **Target Audience**: UAV reconnaissance systems
- **Research Boundary Match**: ✅ Full match
- **Summary**: Trigonometric target geolocation from camera angle, altitude, UAV position. Iterative refinement improves accuracy 22-38x.
- **Related Sub-question**: Q4
## Source #10
- **Title**: pymavlink vs MAVSDK-Python for custom messages (Issue #739)
- **Link**: https://github.com/mavlink/MAVSDK-Python/issues/739
- **Tier**: L4
- **Publication Date**: 2024-12
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: MAVSDK-Python, pymavlink
- **Target Audience**: Companion computer developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: MAVSDK-Python lacks custom message support. pymavlink recommended for GPS_INPUT and custom messages. MAVSDK v4 may add MavlinkDirect plugin.
- **Related Sub-question**: Q1
## Source #11
- **Title**: NAMED_VALUE_FLOAT for custom telemetry (PR #18501)
- **Link**: https://github.com/ArduPilot/ardupilot/pull/18501
- **Tier**: L2
- **Publication Date**: 2022
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: ArduPilot companion computer developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: NAMED_VALUE_FLOAT messages from companion computer are logged by ArduPilot and forwarded to GCS. Useful for custom telemetry data.
- **Related Sub-question**: Q2
## Source #12
- **Title**: ArduPilot Companion Computer UART Connection
- **Link**: https://ardupilot.org/dev/docs/raspberry-pi-via-mavlink.html
- **Tier**: L1
- **Publication Date**: 2024
- **Timeliness Status**: ✅ Currently valid
- **Target Audience**: ArduPilot companion computer developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Connect via TELEM2 UART. SERIAL2_PROTOCOL=2 (MAVLink2). Baud up to 1.5Mbps. TX/RX crossover.
- **Related Sub-question**: Q1, Q2
## Source #13
- **Title**: Jetson Orin Nano UART with ArduPilot
- **Link**: https://forums.developer.nvidia.com/t/uart-connection-between-jetson-nano-orin-and-ardupilot/325416
- **Tier**: L4
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: JetPack 6.x, Orin Nano
- **Target Audience**: Jetson + ArduPilot integration
- **Research Boundary Match**: ✅ Full match
- **Summary**: UART instability reported on Orin Nano with ArduPilot. Use /dev/ttyTHS0 or /dev/ttyTHS1. Check pinout carefully.
- **Related Sub-question**: Q1
## Source #14
- **Title**: MAVSDK-Python v3.15.3 PyPI (aarch64 wheels)
- **Link**: https://pypi.org/project/mavsdk/
- **Tier**: L1
- **Publication Date**: 2025
- **Timeliness Status**: ✅ Currently valid
- **Version Info**: v3.15.3, manylinux2014_aarch64
- **Target Audience**: MAVSDK Python developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: MAVSDK-Python has aarch64 wheels. pip install works on Jetson. But no GPS_INPUT support.
- **Related Sub-question**: Q1
## Source #15
- **Title**: ArduPilot receive COMMAND_LONG on companion computer
- **Link**: https://discuss.ardupilot.org/t/recieve-mav-cmd-on-companion-computer/48928
- **Tier**: L4
- **Publication Date**: 2020
- **Timeliness Status**: ⚠️ Needs verification (old but concept still valid)
- **Target Audience**: ArduPilot companion computer developers
- **Research Boundary Match**: ✅ Full match
- **Summary**: Companion computer can receive COMMAND_LONG messages from GCS via MAVLink. ArduPilot scripting can intercept specific command IDs.
- **Related Sub-question**: Q2
@@ -0,0 +1,105 @@
# Fact Cards
## Fact #1
- **Statement**: MAVSDK-Python (v3.15.3) does NOT support sending GPS_INPUT MAVLink messages. The feature has been requested since 2021 and remains unresolved. Custom message support is planned for MAVSDK v4 but not available in Python wrapper.
- **Source**: Source #1, #10, #14
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — confirmed by MAVSDK maintainers
- **Related Dimension**: Flight Controller Integration
## Fact #2
- **Statement**: pymavlink provides full access to all MAVLink messages including GPS_INPUT via `mav.gps_input_send()`. It is the recommended library for companion computers that need to send GPS_INPUT messages.
- **Source**: Source #4, #10
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — working examples exist
- **Related Dimension**: Flight Controller Integration
## Fact #3
- **Statement**: GPS_INPUT (MAVLink msg ID 232) contains: lat/lon (WGS84, degrees×1E7), alt (AMSL), fix_type (0-8), horiz_accuracy (m), vert_accuracy (m), speed_accuracy (m/s), hdop, vdop, satellites_visible, vn/ve/vd (NED m/s), yaw (centidegrees), gps_id, ignore_flags.
- **Source**: Source #2
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — official MAVLink spec
- **Related Dimension**: Flight Controller Integration
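As a sketch of how these fields map to code, a hedged pymavlink example — it assumes the generated `gps_input_send()` takes arguments in the spec's field order, and the `hdop`/`satellites_visible` defaults are illustrative values a synthetic GPS might report, not values from the spec:

```python
# Hedged sketch: pack a WGS84 position estimate into GPS_INPUT (msg ID 232)
# units and send it via pymavlink. Field names/scalings follow the MAVLink
# spec cited above; defaults like satellites_visible=10 are illustrative.
import time


def pack_gps_input(lat_deg, lon_deg, alt_m, horiz_acc_m, vert_acc_m):
    """Scale a WGS84 estimate into GPS_INPUT integer/float field units."""
    return {
        "time_usec": int(time.time() * 1e6),
        "gps_id": 0,
        "ignore_flags": 0,
        "time_week_ms": 0,        # 0 when GPS time is unknown
        "time_week": 0,
        "fix_type": 3,            # 3 = 3D fix
        "lat": int(round(lat_deg * 1e7)),   # degrees * 1E7
        "lon": int(round(lon_deg * 1e7)),
        "alt": alt_m,             # metres AMSL
        "hdop": 1.0,
        "vdop": 1.0,
        "vn": 0.0, "ve": 0.0, "vd": 0.0,    # NED velocities, m/s
        "speed_accuracy": 1.0,
        "horiz_accuracy": horiz_acc_m,
        "vert_accuracy": vert_acc_m,
        "satellites_visible": 10,  # synthetic, must look plausible to the EKF
        "yaw": 0,                  # centidegrees; 0 = not available
    }


def send_gps_input(conn, fields):
    """conn: a pymavlink mavutil connection. Assumes positional args follow
    the field order above (which mirrors the message definition)."""
    conn.mav.gps_input_send(*fields.values())
```

Usage: a mavutil connection is opened once at startup (`mavutil.mavlink_connection("/dev/ttyTHS1", baud=921600)`), then `send_gps_input()` is called at the emission rate discussed under Dimension 4.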
## Fact #4
- **Statement**: ArduPilot requires GPS1_TYPE=14 (MAVLink) to accept GPS_INPUT messages from a companion computer. Connection via TELEM2 UART, SERIAL2_PROTOCOL=2 (MAVLink2).
- **Source**: Source #3, #12
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — official ArduPilot documentation
- **Related Dimension**: Flight Controller Integration
## Fact #5
- **Statement**: ArduPilot GPS update rate (GPS_RATE_MS) default is 200ms (5Hz), range 50-200ms (5-20Hz). Our camera at 3fps (333ms) means GPS_INPUT at 3Hz. ArduPilot minimum is 5Hz. We must interpolate/predict between camera frames to meet 5Hz minimum.
- **Source**: Source #5
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — ArduPilot source code
- **Related Dimension**: Flight Controller Integration
## Fact #6
- **Statement**: GPS_INPUT horiz_accuracy field directly maps to our confidence scoring. We can report: satellite-anchored ≈ 10-20m accuracy, VO-extrapolated ≈ 20-50m, IMU-only ≈ 100m+. ArduPilot EKF uses this for fusion weighting internally.
- **Source**: Source #2, #3
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ⚠️ Medium — accuracy mapping is estimated, EKF weighting not fully documented
- **Related Dimension**: Flight Controller Integration
## Fact #7
- **Statement**: Typical UAV telemetry bandwidth: SiK radio at 57600 baud provides ~12kbit/s usable for MAVLink. RFD900 provides long range (15km+) at similar data rates. Position telemetry must be compact — ~50 bytes per position update.
- **Source**: Source #6
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — MAVLink protocol specs
- **Related Dimension**: Ground Station Telemetry
## Fact #8
- **Statement**: NAMED_VALUE_FLOAT MAVLink message can stream custom telemetry from companion computer to ground station. ArduPilot logs and forwards these. Mission Planner displays them. Useful for confidence score, drift status, matching status.
- **Source**: Source #11
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — ArduPilot merged PR
- **Related Dimension**: Ground Station Telemetry
## Fact #9
- **Statement**: Jetson Orin Nano Super throttles GPU from 1GHz to ~300MHz when junction temperature exceeds 80°C. Active cooling (fan) required for sustained load. Power consumption: 5W idle, 8-12W typical inference, 25W peak. Modes: 15W, 25W, MAXN SUPER.
- **Source**: Source #7, #8
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — NVIDIA official
- **Related Dimension**: Thermal Management
## Fact #10
- **Statement**: Jetson Orin Nano UART connection to ArduPilot uses /dev/ttyTHS0 or /dev/ttyTHS1. UART instability reported on some units — verify pinout, use JetPack 6.2.2+. Baud up to 1.5Mbps supported.
- **Source**: Source #12, #13
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ⚠️ Medium — UART instability is a known issue with workarounds
- **Related Dimension**: Flight Controller Integration
## Fact #11
- **Statement**: Object geolocation from UAV: for nadir (downward) camera, pixel offset from center → meters via GSD → rotate by heading → add to UAV GPS. For oblique (AI) camera with angle θ from vertical: ground_distance = altitude × tan(θ). Combined with zoom → effective focal length → pixel-to-meter conversion. Flat terrain assumption simplifies to basic trigonometry.
- **Source**: Source #9
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — well-established trigonometry
- **Related Dimension**: Object Localization
## Fact #12
- **Statement**: Companion computer can receive COMMAND_LONG from ground station via MAVLink. For re-localization hints: ground station sends a custom command with approximate lat/lon, companion computer receives it via pymavlink message listener.
- **Source**: Source #15
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ⚠️ Medium — specific implementation for re-localization hint would be custom
- **Related Dimension**: Ground Station Telemetry
## Fact #13
- **Statement**: The restrictions.md now says "using MAVSDK library" but MAVSDK-Python cannot send GPS_INPUT. pymavlink is the only viable Python option for GPS_INPUT. This is a restriction conflict that must be resolved — use pymavlink for GPS_INPUT (core function) or accept MAVSDK + pymavlink hybrid.
- **Source**: Source #1, #2, #10
- **Phase**: Assessment
- **Target Audience**: GPS-Denied system developers
- **Confidence**: ✅ High — confirmed limitation
- **Related Dimension**: Flight Controller Integration
@@ -0,0 +1,62 @@
# Comparison Framework
## Selected Framework Type
Problem Diagnosis + Decision Support (Mode B)
## Weak Point Dimensions (from draft02 → new AC/restrictions)
### Dimension 1: Output Architecture (CRITICAL)
Draft02 uses FastAPI + SSE to stream positions to clients.
New AC requires MAVLink GPS_INPUT to flight controller as PRIMARY output.
Entire output architecture must change.
### Dimension 2: Ground Station Communication (CRITICAL)
Draft02 has no ground station integration.
New AC requires: stream position+confidence via telemetry, receive re-localization commands.
### Dimension 3: MAVLink Library Choice (CRITICAL)
Restrictions say "MAVSDK library" but MAVSDK-Python cannot send GPS_INPUT.
Must use pymavlink for core function.
### Dimension 4: GPS Update Rate (HIGH)
Camera at 3fps → 3Hz position updates. ArduPilot minimum GPS rate is 5Hz.
Need IMU-based interpolation between camera frames.
### Dimension 5: Startup & Failsafe (HIGH)
Draft02 has no initialization or failsafe procedures.
New AC requires: init from last GPS, reboot recovery, IMU fallback after N seconds.
### Dimension 6: Object Localization (MEDIUM)
Draft02 has basic pixel-to-GPS for navigation camera only.
New AC adds AI camera with configurable angle, zoom — trigonometric projection needed.
### Dimension 7: Thermal Management (MEDIUM)
Draft02 ignores thermal throttling.
Jetson Orin Nano Super throttles at 80°C — can drop GPU 3x.
### Dimension 8: VO Drift Budget Monitoring (MEDIUM)
New AC: max cumulative VO drift <100m between satellite anchors.
Draft02 uses ESKF covariance but doesn't explicitly track drift budget.
### Dimension 9: Satellite Imagery Specs (LOW)
New AC: resolution of 0.5 m/pixel or finer, <2 years old. Draft02 uses Google Maps zoom 18-19, roughly 0.3-0.6 m/pixel.
Zoom 19 meets the requirement; zoom 18 is marginally coarser, so explicit validation is needed.
### Dimension 10: API for Internal Systems (LOW)
Object localization requests from AI systems need a local IPC mechanism.
FastAPI could be retained for local-only inter-process communication.
## Initial Population
| Dimension | Draft02 State | Required State | Gap Severity |
|-----------|--------------|----------------|-------------|
| Output Architecture | FastAPI + SSE to client | MAVLink GPS_INPUT to flight controller | CRITICAL — full redesign |
| Ground Station | None | Bidirectional MAVLink telemetry | CRITICAL — new component |
| MAVLink Library | Not applicable (no MAVLink) | pymavlink (MAVSDK can't do GPS_INPUT) | CRITICAL — new dependency |
| GPS Update Rate | 3fps → ~3Hz output | ≥5Hz to ArduPilot EKF | HIGH — need IMU interpolation |
| Startup & Failsafe | None | Init from GPS, reboot recovery, IMU fallback | HIGH — new procedures |
| Object Localization | Basic nadir pixel-to-GPS | AI camera angle+zoom trigonometry | MEDIUM — extend existing |
| Thermal Management | Not addressed | Monitor + mitigate throttling | MEDIUM — operational concern |
| VO Drift Budget | ESKF covariance only | Explicit <100m tracking + trigger | MEDIUM — extend ESKF |
| Satellite Imagery Specs | Google Maps zoom 18-19 | ≥0.5 m/pixel, <2 years | LOW — mostly met |
| Internal IPC | REST API | Lightweight local API or shared memory | LOW — simplify from draft02 |
@@ -0,0 +1,202 @@
# Reasoning Chain
## Dimension 1: Output Architecture
### Fact Confirmation
Per Fact #3, GPS_INPUT (MAVLink msg ID 232) accepts lat/lon in WGS84 (degrees×1E7), altitude, fix_type, accuracy fields, and NED velocities. Per Fact #4, ArduPilot uses GPS1_TYPE=14 to accept MAVLink GPS input. The flight controller's EKF fuses this as if it were a real GPS module.
### Reference Comparison
Draft02 uses FastAPI + SSE to stream position data to a REST client. The new AC requires the system to output GPS coordinates directly to the flight controller via MAVLink GPS_INPUT, replacing the real GPS module. The flight controller then uses these coordinates for navigation/autopilot functions. The ground station receives position data indirectly via the flight controller's telemetry forwarding.
### Conclusion
The entire output architecture must change from REST API + SSE → pymavlink GPS_INPUT sender. FastAPI is no longer the primary output mechanism. It may be retained only for local IPC with other onboard AI systems (object localization requests). The SSE streaming to external clients is replaced by MAVLink telemetry forwarding through the flight controller.
### Confidence
✅ High — clear requirement change backed by MAVLink specification
---
## Dimension 2: Ground Station Communication
### Fact Confirmation
Per Fact #7, typical telemetry bandwidth is ~12kbit/s (SiK). Per Fact #8, NAMED_VALUE_FLOAT can stream custom values from companion to GCS. Per Fact #12, COMMAND_LONG can deliver commands from GCS to companion.
### Reference Comparison
Draft02 has no ground station integration. The new AC requires:
1. Stream position + confidence to ground station (passive, via telemetry forwarding of GPS_INPUT data + custom NAMED_VALUE_FLOAT for confidence/drift)
2. Receive re-localization commands from operator (active, via COMMAND_LONG or custom MAVLink message)
### Conclusion
Ground station communication uses MAVLink messages forwarded through the flight controller's telemetry radio. Position data flows automatically (flight controller forwards GPS data to GCS). Custom telemetry (confidence, drift, status) uses NAMED_VALUE_FLOAT. Re-localization hints from operator use a custom COMMAND_LONG with lat/lon payload. Bandwidth is tight (~12kbit/s) so minimize custom message frequency (1-2Hz max for NAMED_VALUE_FLOAT).
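A minimal sketch of the custom-telemetry side of this conclusion. It assumes pymavlink's `named_value_float_send()` and an estimated ~30-byte framed size for NAMED_VALUE_FLOAT (18-byte payload plus MAVLink2 framing — verify against the actual dialect); the message name `GPSD_CONF` is illustrative:

```python
# Hedged sketch: confidence telemetry plus a bandwidth sanity check against
# the ~12 kbit/s SiK budget from Fact #7. The 30-byte framed size is an
# estimate; measure on the real link before relying on it.
NAMED_VALUE_FLOAT_BYTES = 30          # estimated framed size
LINK_BUDGET_BYTES_PER_S = 12_000 // 8  # ~12 kbit/s usable -> 1500 B/s


def telemetry_load(msgs_per_sec, msg_bytes=NAMED_VALUE_FLOAT_BYTES):
    """Fraction of the telemetry budget consumed by custom messages."""
    return msgs_per_sec * msg_bytes / LINK_BUDGET_BYTES_PER_S


def send_confidence(conn, time_boot_ms, confidence):
    """conn: pymavlink mavutil connection. Name is limited to 10 chars
    and passed as bytes; ArduPilot forwards it to the GCS (Fact #8)."""
    conn.mav.named_value_float_send(time_boot_ms, b"GPSD_CONF",
                                    float(confidence))
```

At the recommended 1-2 Hz, two custom floats consume roughly 4-8% of the link budget, which is why the conclusion caps the NAMED_VALUE_FLOAT rate.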
### Confidence
✅ High — standard MAVLink patterns
---
## Dimension 3: MAVLink Library Choice
### Fact Confirmation
Per Fact #1, MAVSDK-Python v3.15.3 does NOT support GPS_INPUT. Per Fact #2, pymavlink provides full GPS_INPUT support via `mav.gps_input_send()`. Per Fact #13, the restrictions say "using MAVSDK library" but MAVSDK literally cannot do the core function.
### Reference Comparison
MAVSDK is a higher-level abstraction over MAVLink. pymavlink is the lower-level direct MAVLink implementation. For GPS_INPUT (our core output), only pymavlink works.
### Conclusion
Use **pymavlink** as the MAVLink library. The restriction mentioning MAVSDK must be noted as a conflict — pymavlink is the only viable option for GPS_INPUT in Python. pymavlink is lightweight, pure Python, works on aarch64, and provides full access to all MAVLink messages. MAVSDK v4 may add custom message support in the future but is not available now.
### Confidence
✅ High — confirmed limitation, clear alternative
---
## Dimension 4: GPS Update Rate
### Fact Confirmation
Per Fact #5, ArduPilot GPS_RATE_MS has a minimum of 200ms (5Hz). Our camera shoots at ~3fps (333ms). We produce a full VO+ESKF position estimate per frame at ~3Hz.
### Reference Comparison
3Hz < 5Hz minimum. ArduPilot's EKF expects at least 5Hz GPS updates for stable fusion.
### Conclusion
Between camera frames, use IMU prediction from the ESKF to interpolate position at 5Hz (or higher, e.g., 10Hz). The ESKF already runs IMU prediction at 100+Hz internally. Simply emit the ESKF predicted state as GPS_INPUT at 5-10Hz. Camera frame updates (3Hz) provide the measurement corrections. This is standard in sensor fusion: prediction runs fast, measurements arrive slower. The `fix_type` field can differentiate: camera-corrected frames → fix_type=3 (3D), IMU-predicted → fix_type=2 (2D) or adjust horiz_accuracy to reflect lower confidence.
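A simplified sketch of the interpolation idea. Constant-velocity propagation stands in for the ESKF's IMU prediction (which the real system would use), and the accuracy growth rate and latitude scale constant are assumed placeholders:

```python
# Hedged sketch: propagate the last corrected state between 3 Hz camera
# frames so GPS_INPUT can be emitted at 5-10 Hz. Reported horizontal
# accuracy is inflated with time since the last correction, mirroring the
# fix_type / horiz_accuracy differentiation described above.
import math
from dataclasses import dataclass

M_PER_DEG_LAT = 111_320.0  # rough WGS84 scale; assumption for the sketch


@dataclass
class NavState:
    lat_deg: float
    lon_deg: float
    vn_mps: float       # north velocity
    ve_mps: float       # east velocity
    horiz_acc_m: float  # reported accuracy at time t
    t: float            # seconds


def predict(state: NavState, t_now: float,
            acc_growth_mps: float = 2.0) -> NavState:
    """Constant-velocity propagation with accuracy inflation.
    acc_growth_mps (2 m of reported error per second) is an assumption."""
    dt = t_now - state.t
    lat = state.lat_deg + state.vn_mps * dt / M_PER_DEG_LAT
    m_per_deg_lon = M_PER_DEG_LAT * math.cos(math.radians(state.lat_deg))
    lon = state.lon_deg + state.ve_mps * dt / m_per_deg_lon
    return NavState(lat, lon, state.vn_mps, state.ve_mps,
                    state.horiz_acc_m + acc_growth_mps * dt, t_now)
```

A 5-10 Hz timer would call `predict()` and emit the result as GPS_INPUT; each 3 Hz camera correction resets the state and its accuracy.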
### Confidence
✅ High — standard sensor fusion approach
---
## Dimension 5: Startup & Failsafe
### Fact Confirmation
Per new AC: system initializes from last known GPS before GPS denial. On reboot: re-initialize from flight controller's IMU-extrapolated position. On total failure for N seconds: flight controller falls back to IMU-only.
### Reference Comparison
Draft02 has no startup or failsafe procedures. The system was assumed to already know its position at session start.
### Conclusion
Startup sequence:
1. On boot, connect to flight controller via pymavlink
2. Read current GPS position from flight controller (GLOBAL_POSITION_INT or GPS_RAW_INT message)
3. Initialize ESKF state with this position
4. Begin cuVSLAM initialization with first camera frames
5. Start sending GPS_INPUT once ESKF has a valid position estimate
Failsafe:
1. If no position estimate for N seconds → stop sending GPS_INPUT (flight controller auto-detects GPS loss and falls back to IMU)
2. Log failure event
3. Continue attempting VO/satellite matching
Reboot recovery:
1. On companion computer reboot, reconnect to flight controller
2. Read current GPS_RAW_INT (which is now IMU-extrapolated by flight controller)
3. Re-initialize ESKF with this position (lower confidence)
4. Resume normal operation
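The position read shared by startup step 2 and reboot-recovery step 2 might look like the following pymavlink sketch; the 10-second timeout and the choice of GLOBAL_POSITION_INT (the EKF's fused estimate, which stays valid when the flight controller is IMU-extrapolating) are assumptions:

```python
# Hedged sketch: read the flight controller's current position estimate
# over MAVLink to seed the ESKF. GLOBAL_POSITION_INT reports lat/lon in
# degE7 and altitude in millimetres (common dialect).
def scale_global_position(lat_1e7, lon_1e7, alt_mm):
    """Convert GLOBAL_POSITION_INT integer units to degrees / metres."""
    return lat_1e7 / 1e7, lon_1e7 / 1e7, alt_mm / 1000.0


def read_initial_position(conn, timeout_s=10):
    """conn: pymavlink mavutil connection.
    Returns (lat_deg, lon_deg, alt_m) or None on timeout."""
    msg = conn.recv_match(type="GLOBAL_POSITION_INT", blocking=True,
                          timeout=timeout_s)
    if msg is None:
        return None  # caller enters the failsafe path
    return scale_global_position(msg.lat, msg.lon, msg.alt)
```

On reboot recovery the same call works unchanged; the returned position is simply seeded into the ESKF with a larger initial covariance.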
### Confidence
✅ High — standard autopilot integration patterns
---
## Dimension 6: Object Localization
### Fact Confirmation
Per Fact #11, for oblique camera: ground_distance = altitude × tan(θ) where θ is angle from vertical. Combined with camera azimuth (yaw + camera pan angle) gives direction. With zoom, effective FOV narrows → higher pixel-to-meter resolution.
### Reference Comparison
Draft02 has basic nadir-only projection: pixel offset × GSD → meters → rotate by heading → lat/lon. The AI camera has configurable angle and zoom, so this needs extension.
### Conclusion
Object localization for AI camera:
1. Get current UAV position from GPS-Denied system
2. Get AI camera params: pan angle (azimuth relative to heading), tilt angle (from vertical), zoom level (→ effective focal length)
3. Get pixel coordinates of detected object in AI camera frame
4. Compute: a) bearing = UAV heading + camera pan angle + pixel horizontal offset angle, b) ground_distance = altitude × tan(tilt + pixel vertical offset angle) under the flat-terrain assumption, c) convert bearing + distance to a lat/lon offset from the UAV position
5. Accuracy inherits GPS-Denied position error + projection error from altitude/angle uncertainty
Expose as lightweight local API (Unix socket or shared memory for speed, or simple HTTP on localhost).
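The steps above can be sketched as flat-terrain trigonometry. All parameter names are illustrative, and the pixel offsets are assumed to be pre-converted to angles via the camera FOV (the zoom-dependent step):

```python
# Hedged sketch of the oblique-camera geolocation steps: bearing from
# heading + pan, ground distance from altitude * tan(tilt), then a small
# lat/lon offset. Flat-terrain assumption; M_PER_DEG_LAT is a rough scale.
import math

M_PER_DEG_LAT = 111_320.0  # approximate metres per degree of latitude


def localize_object(uav_lat, uav_lon, alt_agl_m, heading_deg,
                    cam_pan_deg, cam_tilt_deg,
                    px_x_offset_deg=0.0, px_y_offset_deg=0.0):
    """Return (lat, lon) of a detected object.
    cam_tilt_deg is measured from vertical (0 = nadir); pixel offsets
    are angular offsets derived from the zoom-dependent FOV."""
    bearing = math.radians(heading_deg + cam_pan_deg + px_x_offset_deg)
    tilt = math.radians(cam_tilt_deg + px_y_offset_deg)
    ground_dist = alt_agl_m * math.tan(tilt)   # flat-terrain projection
    d_north = ground_dist * math.cos(bearing)
    d_east = ground_dist * math.sin(bearing)
    lat = uav_lat + d_north / M_PER_DEG_LAT
    lon = uav_lon + d_east / (M_PER_DEG_LAT
                              * math.cos(math.radians(uav_lat)))
    return lat, lon
```

Note how the result inherits both the UAV position error and the tilt/altitude uncertainty: at 45° tilt and 100 m altitude, a 1° tilt error alone shifts the ground point by roughly 3.5 m.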
### Confidence
✅ High — well-established trigonometry, flat terrain simplifies
---
## Dimension 7: Thermal Management
### Fact Confirmation
Per Fact #9, Jetson Orin Nano Super throttles at 80°C junction temperature, dropping GPU from ~1GHz to ~300MHz (3x slowdown). Active cooling required. Power modes: 15W, 25W, MAXN SUPER.
### Reference Comparison
Draft02 ignores thermal constraints. Our pipeline (cuVSLAM ~9ms + satellite matcher ~50-200ms) runs on GPU continuously at 3fps. This is moderate but sustained load.
### Conclusion
Mitigation:
1. Use 25W power mode (not MAXN SUPER) for stable sustained performance
2. Require active cooling (5V fan, should be standard on any UAV companion computer mount)
3. Monitor temperature via tegrastats/jtop at runtime
4. If temp >75°C: reduce satellite matching frequency (every 5-10 frames instead of 3)
5. If temp >80°C: skip satellite matching entirely, rely on VO+IMU only (cuVSLAM at 9ms is low power)
6. Total GPU time per 333ms frame: ~9ms cuVSLAM plus ~50-200ms satellite match (async, not every frame) keeps GPU utilization roughly under 60% → thermal throttling unlikely with proper cooling
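A sketch of the monitoring loop's core, reading sysfs thermal zones directly so it works without jtop. The zone path varies by JetPack release (an assumption to verify on the target), and the 75/80°C thresholds mirror the mitigation steps above:

```python
# Hedged sketch: temperature read + satellite-matching cadence policy.
# thermal_zone temp files report millidegrees Celsius; which zone maps to
# the GPU junction sensor must be checked on the actual Orin Nano.
from pathlib import Path


def read_temp_c(zone_path="/sys/devices/virtual/thermal/thermal_zone0/temp"):
    """Read one sysfs thermal zone, returning degrees Celsius."""
    return int(Path(zone_path).read_text().strip()) / 1000.0


def matching_cadence(temp_c):
    """Frames between satellite matches, or None to suspend matching.
    Thresholds follow the mitigation plan: back off at 75 C, stop at 80 C."""
    if temp_c >= 80.0:
        return None   # VO + IMU only until the board cools
    if temp_c >= 75.0:
        return 10     # match every 10th frame
    return 3          # normal cadence
```

The main loop would poll `read_temp_c()` every few seconds and feed `matching_cadence()` into the keyframe scheduler; the returned value is advisory, not a hard real-time guarantee.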
### Confidence
⚠️ Medium — actual thermal behavior depends on airflow in UAV enclosure, ambient temperature in-flight
---
## Dimension 8: VO Drift Budget Monitoring
### Fact Confirmation
New AC: max cumulative VO drift between satellite correction anchors < 100m. The ESKF maintains a position covariance matrix that grows during VO-only periods and shrinks on satellite corrections.
### Reference Comparison
Draft02 uses ESKF covariance for keyframe selection (trigger satellite match when covariance exceeds threshold) but doesn't explicitly track drift as a budget.
### Conclusion
Use ESKF position covariance diagonal (σ_x² + σ_y²) as the drift estimate. When √(σ_x² + σ_y²) approaches 100m:
1. Force satellite matching on every frame (not just keyframes)
2. Report LOW confidence via GPS_INPUT horiz_accuracy
3. If drift exceeds 100m without satellite correction → flag as critical, increase matching frequency, send alert to ground station
This is essentially what draft02 already does with covariance-based keyframe triggering, but now with an explicit 100m threshold.
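The budget logic above as a sketch; the 100m limit comes from the AC, while the 80% warning fraction at which matching is forced is an assumption:

```python
# Hedged sketch: interpret the ESKF horizontal position covariance
# diagonal as the cumulative drift estimate and map it to an action.
# P_xx / P_yy are variances in m^2.
import math

DRIFT_LIMIT_M = 100.0   # from the acceptance criteria
WARN_FRACTION = 0.8     # assumption: force matching from 80 m onward


def drift_estimate_m(p_xx, p_yy):
    """1-sigma horizontal drift proxy from covariance diagonal terms."""
    return math.sqrt(p_xx + p_yy)


def drift_action(p_xx, p_yy):
    """'normal' -> keyframe-triggered matching, 'force_match' -> match
    every frame, 'critical' -> also alert GCS and degrade confidence."""
    drift = drift_estimate_m(p_xx, p_yy)
    if drift >= DRIFT_LIMIT_M:
        return "critical"
    if drift >= WARN_FRACTION * DRIFT_LIMIT_M:
        return "force_match"
    return "normal"
```

The "critical" branch is where the GPS_INPUT `horiz_accuracy` would be raised and a NAMED_VALUE_FLOAT alert streamed to the ground station.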
### Confidence
✅ High — standard ESKF covariance interpretation
---
## Dimension 9: Satellite Imagery Specs
### Fact Confirmation
New AC: resolution of 0.5 m/pixel or finer, imagery <2 years old. Google Maps at zoom 18 ≈ 0.6 m/pixel, zoom 19 ≈ 0.3 m/pixel.
### Reference Comparison
Draft02 uses Google Maps zoom 18-19. Zoom 19 (0.3 m/pixel) exceeds the requirement. Zoom 18 (0.6 m/pixel) meets the minimum. Age depends on Google's imagery updates for eastern Ukraine — conflict zone may have stale imagery.
### Conclusion
Validate during offline preprocessing:
1. Download at zoom 19 first (0.3 m/pixel)
2. If zoom 19 is unavailable for some tiles, fall back to zoom 18 (~0.4 m/pixel at ~48°N, still within the 0.5 m/pixel requirement)
3. Check imagery date metadata if available from Google Maps API
4. Flag tiles where imagery appears stale (seasonal mismatch, destroyed buildings, etc.)
5. No architectural change needed — add validation step to preprocessing pipeline
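The resolution check for this preprocessing step can be sketched as below. The Web Mercator formula is standard (resolution scales with cos(latitude), so zoom 18 is ~0.6 m/pixel at the equator but ~0.4 m/pixel at ~48°N); the function names are ours:

```python
# Sketch of the per-tile resolution validation in the preprocessing pipeline.
# Formula is standard Web Mercator; function names are assumptions.
import math

EQUATOR_RES_M = 156543.03392  # metres/pixel at zoom 0 on the equator


def ground_resolution_m(lat_deg, zoom):
    """Metres per pixel of a Web Mercator tile at the given latitude/zoom."""
    return EQUATOR_RES_M * math.cos(math.radians(lat_deg)) / (2 ** zoom)


def meets_resolution_ac(lat_deg, zoom, max_res_m=0.5):
    """True when a tile at this latitude/zoom satisfies the 0.5 m/pixel AC."""
    return ground_resolution_m(lat_deg, zoom) <= max_res_m
```

Running this once per tile during the offline download step flags any tile that would silently violate the AC before it ever reaches the aircraft.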
### Confidence
⚠️ Medium — Google Maps imagery age is not reliably queryable
---
## Dimension 10: Internal IPC for Object Localization
### Fact Confirmation
Other onboard AI systems need to request GPS coordinates of detected objects. These systems run on the same Jetson.
### Reference Comparison
Draft02 has FastAPI for external API. For local IPC between processes on the same device, FastAPI is overkill but works.
### Conclusion
Retain a minimal FastAPI server on localhost:8000 for inter-process communication:
- POST /localize: accepts pixel coordinates + AI camera params → returns GPS coordinates
- GET /status: returns system health/state for monitoring
This is local-only (bind to 127.0.0.1), not exposed externally. The primary output channel is MAVLink GPS_INPUT. This is a lightweight addition, not the core architecture.
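The IPC contract can be sketched as plain dataclasses. In the real system these would be Pydantic models served by FastAPI on 127.0.0.1:8000; the field names here are assumptions, not draft02's final schema:

```python
# Sketch of the local IPC contract only. In production these become
# Pydantic request/response models behind FastAPI endpoints; field
# names and the status payload shape are assumptions.
from dataclasses import dataclass, asdict


@dataclass
class LocalizeRequest:       # body of POST /localize
    pixel_x: int
    pixel_y: int
    cam_pitch_deg: float     # AI camera angle from vertical
    zoom_level: float


@dataclass
class LocalizeResponse:      # reply from POST /localize
    lat: float
    lon: float
    accuracy_m: float        # nav accuracy + projection error


def status_payload(state: str, horiz_accuracy_m: float) -> dict:
    """Body of GET /status for other onboard processes."""
    return {"state": state, "horiz_accuracy_m": horiz_accuracy_m}
```

Keeping the contract this small means the detection system needs no MAVLink awareness at all; it exchanges pixels for coordinates over one local HTTP call.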
### Confidence
✅ High — simple local IPC pattern
# Validation Log
## Validation Scenario
A typical 15-minute flight over eastern Ukraine agricultural terrain. GPS is jammed after first 2 minutes. Flight includes straight segments, two sharp 90-degree turns, and one low-texture segment over a large plowed field. Ground station operator monitors via telemetry link. During the flight, companion computer reboots once due to power glitch.
## Expected Based on Conclusions
### Phase 1: Normal start (GPS available, first 2 min)
- System boots, connects to flight controller via pymavlink on UART
- Reads GLOBAL_POSITION_INT → initializes ESKF with real GPS position
- Begins cuVSLAM initialization with first camera frames
- Starts sending GPS_INPUT at 5Hz (ESKF prediction between frames)
- Ground station sees position + confidence via telemetry forwarding
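The GPS_INPUT stream in Phase 1 reduces to scaling the ESKF estimate into MAVLink units. A sketch of that packing is below; the 50 m fix-type threshold and the constant satellite count are assumptions (with pymavlink, these fields would feed `master.mav.gps_input_send(...)`):

```python
# Sketch of GPS_INPUT field scaling per the MAVLink common dialect.
# The 50 m fix-type threshold and satellites_visible=10 are assumptions
# chosen to keep the autopilot EKF accepting the fix.
GPS_FIX_TYPE_3D = 3
GPS_FIX_TYPE_2D = 2


def pack_gps_input(lat_deg, lon_deg, alt_m, horiz_accuracy_m, time_usec):
    """Scale a position estimate into GPS_INPUT units (lat/lon in degE7)."""
    fix_type = GPS_FIX_TYPE_3D if horiz_accuracy_m < 50.0 else GPS_FIX_TYPE_2D
    return {
        "time_usec": time_usec,
        "fix_type": fix_type,
        "lat": int(round(lat_deg * 1e7)),    # degrees * 1e7 (int32)
        "lon": int(round(lon_deg * 1e7)),
        "alt": alt_m,                        # metres AMSL (float field)
        "horiz_accuracy": horiz_accuracy_m,  # metres
        "satellites_visible": 10,            # plausible constant
    }
```

The same function serves Phases 2 and 4: as horiz_accuracy grows past the threshold, fix_type degrades from 3 to 2, which is exactly the signal the scenario expects the autopilot to see.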
### Phase 2: GPS denial begins
- Flight controller's real GPS becomes unreliable/lost
- GPS-Denied system continues sending GPS_INPUT — seamless for autopilot
- horiz_accuracy changes from real-GPS level to VO-estimated level (~20m)
- cuVSLAM provides VO at every frame (~9ms), ESKF fuses with IMU
- Satellite matching runs every 3-10 frames on keyframes
- After successful satellite match: horiz_accuracy improves, fix_type stays 3
- NAMED_VALUE_FLOAT sends confidence/drift data to ground station at ~1Hz
### Phase 3: Sharp turn
- cuVSLAM loses tracking (no overlapping features)
- ESKF falls back to IMU prediction, horiz_accuracy increases
- Next frame flagged as keyframe → satellite matching triggered immediately
- Satellite match against preloaded tiles using IMU dead-reckoning position
- If match found: position recovered, new segment begins, horiz_accuracy drops
- If 3 consecutive failures: send re-localization request to ground station via NAMED_VALUE_FLOAT/STATUSTEXT
- Ground station operator sends COMMAND_LONG with approximate coordinates
- System receives hint, constrains tile search → likely recovers position
### Phase 4: Low-texture plowed field
- cuVSLAM keypoint count drops below threshold
- Satellite matching frequency increases (every frame)
- If satellite matching still succeeds over the plowed field (stable cues such as field boundaries and roads): position maintained
- If satellite also fails (seasonal difference): drift accumulates, ESKF covariance grows
- When √(σ²) approaches 100m: force continuous satellite matching
- horiz_accuracy reported as 50-100m, fix_type=2
### Phase 5: Companion computer reboot
- Power glitch → Jetson reboots (~30-60 seconds)
- During reboot: flight controller gets no GPS_INPUT → detects GPS timeout → falls back to IMU-only dead reckoning
- Jetson comes back online: reconnects via pymavlink and reads the flight controller's dead-reckoned estimate from GLOBAL_POSITION_INT (with no GPS source, GPS_RAW_INT would be stale)
- Initializes ESKF with this position (low confidence, horiz_accuracy=100m)
- Begins cuVSLAM + satellite matching → gradually improves accuracy
- Operator on ground station sees position return with improving confidence
### Phase 6: Object localization request
- AI detection system on same Jetson detects a vehicle in AI camera frame
- Sends POST /localize with pixel coords + camera angle (30° from vertical) + zoom level + altitude (500m)
- GPS-Denied system computes: slant range = 500 / cos(30°) ≈ 577m, horizontal ground offset = 500 × tan(30°) ≈ 289m
- Adds bearing from heading + camera pan → lat/lon offset
- Returns GPS coordinates with accuracy estimate (GPS-Denied accuracy + projection error)
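Phase 6's projection can be sketched under a flat-terrain, small-offset approximation; the equirectangular lat/lon conversion and the helper name are assumptions:

```python
# Worked sketch of the Phase 6 ground projection. Flat terrain and the
# equirectangular small-offset approximation are assumptions; real code
# would also fold in attitude from the flight controller.
import math

EARTH_R = 6371000.0  # mean Earth radius, metres


def localize(lat_deg, lon_deg, heading_deg, cam_pan_deg, cam_pitch_deg, alt_m):
    """Project a target seen cam_pitch_deg from vertical onto the ground."""
    horiz = alt_m * math.tan(math.radians(cam_pitch_deg))  # ground offset, m
    bearing = math.radians(heading_deg + cam_pan_deg)      # from true north
    dlat = horiz * math.cos(bearing) / EARTH_R
    dlon = horiz * math.sin(bearing) / (EARTH_R * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon), horiz
```

With the scenario's numbers (500m altitude, camera 30° from vertical) the horizontal offset comes out at ~289m, matching the figure above.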
## Actual Validation Results
Walking the scenario through draft02's architecture shows every new AC requirement is covered:
- ✅ MAVLink GPS_INPUT at 5Hz (camera frames + IMU interpolation)
- ✅ Confidence via horiz_accuracy field maps to confidence levels
- ✅ Ground station telemetry via MAVLink forwarding + NAMED_VALUE_FLOAT
- ✅ Re-localization via ground station command
- ✅ Startup from GPS → seamless transition on denial
- ✅ Reboot recovery from flight controller IMU-extrapolated position
- ✅ Drift budget tracking via ESKF covariance
- ✅ Object localization with AI camera angle/zoom
## Counterexamples
### Potential issue: 5Hz interpolation accuracy
Between camera frames (333ms apart), ESKF predicts using IMU only. At 200km/h = 55m/s, the UAV moves ~18m between frames. IMU prediction over 200ms (one interpolation step) at this speed introduces ~1-5m error — acceptable for GPS_INPUT.
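The displacement figures above check out arithmetically:

```python
# Arithmetic check of the inter-frame motion claim (200 km/h UAV,
# 3 fps camera frames, 5 Hz GPS_INPUT output).
speed_ms = 200.0 / 3.6               # 200 km/h -> ~55.6 m/s
frame_gap_m = speed_ms * (1 / 3.0)   # motion between camera frames (~18.5 m)
predict_gap_m = speed_ms * 0.2       # motion per 5 Hz ESKF prediction step
```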
### Potential issue: UART reliability
Jetson Orin Nano UART instability reported (Fact #10). If MAVLink connection drops during flight, GPS_INPUT stops → autopilot loses GPS. Mitigation: use TCP over USB-C if UART unreliable, or add watchdog to reconnect. This is a hardware integration risk.
### Potential issue: Telemetry bandwidth saturation
If GPS-Denied sends too many NAMED_VALUE_FLOAT messages, it could compete with standard autopilot telemetry for bandwidth. Keep custom messages to 1Hz max (50-100 bytes/s = <1kbit/s).
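A back-of-envelope check of that budget, assuming unsigned MAVLink v2 framing and three custom values per second (the message count is an assumption; the 18-byte payload is the common-dialect definition of NAMED_VALUE_FLOAT):

```python
# Telemetry budget check for the custom NAMED_VALUE_FLOAT stream.
# Unsigned MAVLink v2 framing and 3 values/second are assumptions.
NAMED_VALUE_FLOAT_PAYLOAD = 18  # time_boot_ms(4) + value(4) + name(10)
MAVLINK_V2_OVERHEAD = 12        # 10-byte header + 2-byte checksum


def custom_telemetry_bps(messages_per_s):
    """Bits per second consumed by our NAMED_VALUE_FLOAT stream."""
    return messages_per_s * (NAMED_VALUE_FLOAT_PAYLOAD + MAVLINK_V2_OVERHEAD) * 8
```

Three messages per second comes to 720 bit/s, comfortably inside the stated <1 kbit/s ceiling.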
## Review Checklist
- [x] Draft conclusions consistent with fact cards
- [x] No important dimensions missed
- [x] No over-extrapolation
- [x] Conclusions actionable and verifiable
- [x] All new AC requirements addressed
- [ ] UART reliability needs hardware testing — cannot validate without physical setup
## Conclusions Requiring Revision
None — all conclusions hold under validation. The UART reliability risk needs flagging but doesn't change the architecture.