mirror of
https://github.com/azaion/gps-denied-onboard.git
synced 2026-04-23 04:26:36 +00:00
fix: post-audit — runtime bugs, functional gaps, docs, hardening
fix: post-audit — runtime bugs, functional gaps, docs, hardening

Phase A — Runtime bugs:
- SSE: add push_event() method to SSEEventStreamer (was missing, masked by mocks)
- MAVLink: satellites_visible=10 (was 0, triggers ArduPilot failsafe)
- MAVLink: horiz_accuracy=sqrt(P[0,0]+P[1,1]) per spec (was sqrt(avg))
- MAVLink: MEDIUM confidence → fix_type=3 per solution.md (was 2)

Phase B — Functional gaps:
- handle_user_fix() injects operator GPS into ESKF with noise=500m
- app.py uses create_vo_backend() factory (was hardcoded SequentialVO)
- ESKF: Mahalanobis gating on satellite updates (rejects outliers >5σ)
- ESKF: public accessors (position, quaternion, covariance, last_timestamp)
- Processor: no more private ESKF field access

Phase C — Documentation:
- README: correct API endpoints, CLI command, 40+ env vars documented
- Dockerfile: ENV prefixes match pydantic-settings (DB_, SATELLITE_, MAVLINK_)
- tech_stack.md marked ARCHIVED (contradicts solution.md)

Phase D — Hardening:
- JWT auth middleware (AUTH_ENABLED=false default, verify_token on /flights)
- TLS config env vars (AUTH_SSL_CERTFILE, AUTH_SSL_KEYFILE)
- SHA-256 tile manifest verification in SatelliteDataManager
- AuthConfig, ESKFSettings, MAVLinkConfig, SatelliteConfig in config.py

Also: conftest.py shared fixtures, download_tiles.py and convert_to_trt.py scripts, config wiring into app.py lifespan, config-driven ESKF, calculate_precise_angle fix.

Tests: 196 passed / 8 skipped. Ruff clean.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
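Phase B mentions Mahalanobis gating on satellite updates. The project's actual ESKF code is not part of this diff, but the technique can be sketched with a hypothetical numpy helper: the squared Mahalanobis distance of the innovation is compared against a chi-square gate (the README below lists `ESKF_MAHALANOBIS_THRESHOLD=16.27`, roughly the 99.9% chi-square quantile for 3 degrees of freedom).

```python
import numpy as np

# Hypothetical sketch — not the project's ESKF implementation.
def passes_mahalanobis_gate(innovation: np.ndarray, S: np.ndarray,
                            threshold: float = 16.27) -> bool:
    """Accept a measurement only if d² = yᵀ S⁻¹ y is below the gate threshold."""
    d2 = float(innovation @ np.linalg.solve(S, innovation))
    return d2 <= threshold

# With an identity innovation covariance, d² is just the squared norm:
print(passes_mahalanobis_gate(np.array([1.0, 2.0, 1.0]), np.eye(3)))  # → True (d² = 6)
print(passes_mahalanobis_gate(np.array([5.0, 0.0, 0.0]), np.eye(3)))  # → False (d² = 25)
```

Using `np.linalg.solve` instead of explicitly inverting `S` is the usual numerically safer choice for this test.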
@@ -46,9 +46,9 @@ COPY pyproject.toml .
 # Runtime environment
 ENV PYTHONPATH=/app/src \
-    GPS_DENIED_DB_PATH=/data/flights.db \
-    GPS_DENIED_TILE_DIR=/data/satellite_tiles \
-    GPS_DENIED_LOG_LEVEL=INFO
+    DB_URL=sqlite+aiosqlite:////data/flights.db \
+    SATELLITE_TILE_DIR=/data/satellite_tiles \
+    MAVLINK_CONNECTION=udp:127.0.0.1:14550
 
 # Data volume: database + satellite tiles
 VOLUME ["/data"]
@@ -77,8 +77,8 @@ pip install -e ".[dev]"
 ### Running
 
 ```bash
-# Direct run
-python -m gps_denied
+# Direct run
+uvicorn gps_denied.app:app --host 0.0.0.0 --port 8000
 
 # Docker
 docker compose up --build
@@ -89,12 +89,28 @@ docker compose up --build
 ### Environment variables
 
 ```env
-GPS_DENIED_DB_PATH=/data/flights.db
-GPS_DENIED_TILE_DIR=/data/satellite_tiles  # local z/x/y.png tiles
-GPS_DENIED_LOG_LEVEL=INFO
-MAVLINK_CONNECTION=serial:/dev/ttyTHS1:57600  # UART on Jetson
+# Core
+DB_URL=sqlite+aiosqlite:///./flight_data.db
+SATELLITE_TILE_DIR=.satellite_tiles
+MAVLINK_CONNECTION=serial:/dev/ttyTHS1:57600  # or tcp:host:port
+MAVLINK_OUTPUT_HZ=5.0
+MAVLINK_TELEMETRY_HZ=1.0
+
+# ESKF tuning (optional)
+ESKF_VO_POSITION_NOISE=0.3
+ESKF_SATELLITE_MAX_AGE=30.0
+ESKF_MAHALANOBIS_THRESHOLD=16.27
+
+# API
+API_HOST=127.0.0.1
+API_PORT=8000
+
+# Models
+MODEL_WEIGHTS_DIR=weights
 ```
 
+Full list: `src/gps_denied/config.py` (40+ parameters with prefixes `DB_`, `API_`, `TILES_`, `MODEL_`, `MAVLINK_`, `SATELLITE_`, `ESKF_`, `RECOVERY_`, `ROTATION_`).
+
 ---
 
 ## API
@@ -104,12 +120,14 @@ MAVLINK_CONNECTION=serial:/dev/ttyTHS1:57600  # UART on Jetson
 | `/health` | GET | Health check |
 | `/flights` | POST | Create flight |
 | `/flights/{id}` | GET | Flight details |
-| `/flights/{id}` | DELETE | Delete flight |
-| `/flights/{id}/images/batch` | POST | Image batch |
-| `/flights/{id}/fix` | POST | Operator GPS anchor |
-| `/flights/{id}/status` | GET | Processing status |
-| `/flights/{id}/events` | GET | SSE stream (position + confidence) |
-| `/flights/{id}/object-gps` | POST | Pixel → GPS (ray-ground projection) |
+| `/flights/{flight_id}` | DELETE | Delete flight |
+| `/flights/{flight_id}/images/batch` | POST | Image batch |
+| `/flights/{flight_id}/user-fix` | POST | Operator GPS anchor → ESKF update |
+| `/flights/{flight_id}/status` | GET | Processing status |
+| `/flights/{flight_id}/stream` | GET | SSE stream (position + confidence) |
+| `/flights/{flight_id}/frames/{frame_id}/object-to-gps` | POST | Pixel → GPS (ray-ground projection) |
+| `/flights/{flight_id}/waypoints/{waypoint_id}` | PUT | Update waypoint |
+| `/flights/{flight_id}/waypoints/batch` | PUT | Batch update waypoints |
 
 ---
 
@@ -1,4 +1,9 @@
-# Tech Stack Evaluation
+# ⚠️ ARCHIVED — See solution.md for the current specification
+
+> **Note**: This document contains outdated data (3fps instead of 0.7fps, LiteSAM 480px instead of 1280px).
+> Current specification: `_docs/01_solution/solution.md`
+
+# Tech Stack Evaluation (ARCHIVED)
 
 ## Requirements Summary
 
@@ -23,19 +23,17 @@ Scenarios:
 from __future__ import annotations
 
 import argparse
-import sys
-import time
-
-import numpy as np
-
 # Ensure src/ is on the path when running from the repo root
 import os
+import sys
+import time
+
 sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", "src"))
 
 from gps_denied.core.benchmark import AccuracyBenchmark, SyntheticTrajectory, SyntheticTrajectoryConfig
 from gps_denied.schemas import GPSPoint
 
 
 ORIGIN = GPSPoint(lat=49.0, lon=32.0)
 
 _SEP = "─" * 60
@@ -0,0 +1,170 @@
+#!/usr/bin/env python3
+"""TensorRT engine conversion script.
+
+Converts ONNX models to TensorRT FP16 .engine files for Jetson deployment.
+Wraps trtexec CLI (available on Jetson with JetPack installed).
+
+Usage:
+    # Convert a single model
+    python scripts/convert_to_trt.py --onnx weights/litesam.onnx --output /opt/engines/litesam.engine
+
+    # Convert all known models from weights/ to engines/
+    python scripts/convert_to_trt.py --all --onnx-dir weights/ --engine-dir /opt/engines/
+
+    # Dry run (check trtexec availability)
+    python scripts/convert_to_trt.py --check
+
+Requires:
+    - NVIDIA TensorRT (trtexec in PATH) — available on Jetson with JetPack
+    - NOT available on dev/CI machines — script exits cleanly with a message
+"""
+
+from __future__ import annotations
+
+import argparse
+import os
+import shutil
+import subprocess
+import sys
+
+# Known models and their expected ONNX filenames
+_MODELS = {
+    "superpoint": "superpoint.onnx",
+    "lightglue": "lightglue.onnx",
+    "xfeat": "xfeat.onnx",
+    "dinov2": "dinov2.onnx",
+    "litesam": "litesam.onnx",
+}
+
+
+def find_trtexec() -> str | None:
+    """Find trtexec binary in PATH or common Jetson locations."""
+    path = shutil.which("trtexec")
+    if path:
+        return path
+    # Common Jetson paths
+    for candidate in [
+        "/usr/src/tensorrt/bin/trtexec",
+        "/usr/local/cuda/bin/trtexec",
+    ]:
+        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
+            return candidate
+    return None
+
+
+def convert_onnx_to_engine(
+    onnx_path: str,
+    engine_path: str,
+    fp16: bool = True,
+    workspace_mb: int = 1024,
+    trtexec_path: str | None = None,
+) -> bool:
+    """Run trtexec to convert ONNX → TensorRT engine.
+
+    Returns True on success, False on failure.
+    """
+    trtexec = trtexec_path or find_trtexec()
+    if not trtexec:
+        print("ERROR: trtexec not found. Install TensorRT or run on Jetson with JetPack.")
+        return False
+
+    if not os.path.isfile(onnx_path):
+        print(f"ERROR: ONNX file not found: {onnx_path}")
+        return False
+
+    os.makedirs(os.path.dirname(engine_path) or ".", exist_ok=True)
+
+    cmd = [
+        trtexec,
+        f"--onnx={onnx_path}",
+        f"--saveEngine={engine_path}",
+        f"--workspace={workspace_mb}",
+    ]
+    if fp16:
+        cmd.append("--fp16")
+
+    print(f"  Converting: {onnx_path} → {engine_path}")
+    print(f"  Command: {' '.join(cmd)}")
+
+    try:
+        result = subprocess.run(cmd, capture_output=True, text=True, timeout=600)
+        if result.returncode == 0:
+            size_mb = os.path.getsize(engine_path) / (1024 * 1024)
+            print(f"  OK: {engine_path} ({size_mb:.1f} MB)")
+            return True
+        else:
+            print(f"  FAIL (exit {result.returncode})")
+            if result.stderr:
+                print(f"  stderr: {result.stderr[:500]}")
+            return False
+    except subprocess.TimeoutExpired:
+        print("  FAIL: trtexec timed out (>600s)")
+        return False
+    except FileNotFoundError:
+        print(f"  FAIL: trtexec not found at {trtexec}")
+        return False
+
+
+def main() -> int:
+    parser = argparse.ArgumentParser(description="Convert ONNX models to TensorRT FP16 engines")
+    parser.add_argument("--onnx", help="Input ONNX model path")
+    parser.add_argument("--output", help="Output .engine path")
+    parser.add_argument("--all", action="store_true", help="Convert all known models")
+    parser.add_argument("--onnx-dir", default="weights", help="Directory with ONNX files")
+    parser.add_argument("--engine-dir", default="/opt/engines", help="Output engine directory")
+    parser.add_argument("--no-fp16", action="store_true", help="Disable FP16 (use FP32)")
+    parser.add_argument("--workspace", type=int, default=1024, help="TRT workspace (MB)")
+    parser.add_argument("--check", action="store_true", help="Check trtexec availability only")
+    args = parser.parse_args()
+
+    # Check mode
+    if args.check:
+        trtexec = find_trtexec()
+        if trtexec:
+            print(f"trtexec found: {trtexec}")
+            return 0
+        else:
+            print("trtexec not found. Not on Jetson or TensorRT not installed.")
+            return 1
+
+    fp16 = not args.no_fp16
+
+    # Single model conversion
+    if args.onnx and args.output:
+        ok = convert_onnx_to_engine(args.onnx, args.output, fp16=fp16, workspace_mb=args.workspace)
+        return 0 if ok else 1
+
+    # Batch conversion
+    if args.all:
+        trtexec = find_trtexec()
+        if not trtexec:
+            print("trtexec not found. Conversion requires Jetson with JetPack installed.")
+            return 1
+
+        print(f"Converting all known models from {args.onnx_dir}/ → {args.engine_dir}/")
+        success = 0
+        fail = 0
+        for model_name, onnx_file in _MODELS.items():
+            onnx_path = os.path.join(args.onnx_dir, onnx_file)
+            engine_path = os.path.join(args.engine_dir, f"{model_name}.engine")
+            if not os.path.isfile(onnx_path):
+                print(f"  SKIP {model_name}: {onnx_path} not found")
+                continue
+            ok = convert_onnx_to_engine(
+                onnx_path, engine_path,
+                fp16=fp16, workspace_mb=args.workspace, trtexec_path=trtexec,
+            )
+            if ok:
+                success += 1
+            else:
+                fail += 1
+
+        print(f"\nDone: {success} converted, {fail} failed")
+        return 1 if fail > 0 else 0
+
+    parser.print_help()
+    return 1
+
+
+if __name__ == "__main__":
+    sys.exit(main())
@@ -0,0 +1,165 @@
+#!/usr/bin/env python3
+"""Satellite tile downloader for GPS-denied navigation.
+
+Downloads Web Mercator satellite tiles for a given bounding box and stores
+them in the standard z/x/y.png directory layout expected by SatelliteDataManager.
+
+Usage:
+    # Dry run (count tiles only)
+    python scripts/download_tiles.py --lat-min 48.5 --lat-max 49.5 --lon-min 31.5 --lon-max 32.5 --dry-run
+
+    # Download from OpenStreetMap (default, no API key)
+    python scripts/download_tiles.py --lat-min 48.5 --lat-max 49.5 --lon-min 31.5 --lon-max 32.5
+
+    # Custom zoom and output dir
+    python scripts/download_tiles.py --lat-min 49.0 --lat-max 49.1 --lon-min 32.0 --lon-max 32.1 \
+        --zoom 18 --output .satellite_tiles
+
+    # Google Maps provider (requires API key)
+    python scripts/download_tiles.py --lat-min 49.0 --lat-max 49.1 --lon-min 32.0 --lon-max 32.1 \
+        --provider google --api-key YOUR_KEY
+"""
+
+from __future__ import annotations
+
+import argparse
+import asyncio
+import os
+import sys
+import time
+
+sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", "src"))
+
+
+from gps_denied.schemas.satellite import TileCoords
+from gps_denied.utils.mercator import latlon_to_tile
+
+# Tile provider URL templates
+_PROVIDERS = {
+    "osm": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
+    "google": (
+        "https://maps.googleapis.com/maps/api/staticmap"
+        "?center={lat},{lon}&zoom={z}&size=256x256&maptype=satellite&key={api_key}"
+    ),
+}
+
+_USER_AGENT = "GPS-Denied-Onboard/1.0 (tile prefetch)"
+
+
+def compute_tile_range(
+    lat_min: float, lat_max: float, lon_min: float, lon_max: float, zoom: int
+) -> list[TileCoords]:
+    """Compute all tile coordinates within a lat/lon bounding box."""
+    nw = latlon_to_tile(lat_max, lon_min, zoom)  # NW corner
+    se = latlon_to_tile(lat_min, lon_max, zoom)  # SE corner
+    tiles: list[TileCoords] = []
+    for x in range(nw.x, se.x + 1):
+        for y in range(nw.y, se.y + 1):
+            tiles.append(TileCoords(x=x, y=y, zoom=zoom))
+    return tiles
+
+
+async def download_tiles(
+    tiles: list[TileCoords],
+    output_dir: str,
+    provider: str = "osm",
+    api_key: str = "",
+    max_concurrent: int = 4,
+    delay_s: float = 0.1,
+) -> tuple[int, int]:
+    """Download tiles concurrently. Returns (success_count, error_count)."""
+    try:
+        import httpx
+    except ImportError:
+        print("ERROR: httpx required. Install: pip install httpx")
+        return 0, len(tiles)
+
+    url_template = _PROVIDERS.get(provider, _PROVIDERS["osm"])
+    semaphore = asyncio.Semaphore(max_concurrent)
+    success = 0
+    errors = 0
+
+    async def fetch_one(client: httpx.AsyncClient, tc: TileCoords) -> bool:
+        nonlocal success, errors
+        out_path = os.path.join(output_dir, str(tc.zoom), str(tc.x), f"{tc.y}.png")
+        if os.path.isfile(out_path):
+            success += 1
+            return True
+
+        if provider == "google":
+            from gps_denied.utils.mercator import tile_to_latlon
+            center = tile_to_latlon(tc.x + 0.5, tc.y + 0.5, tc.zoom)
+            url = url_template.format(
+                z=tc.zoom, x=tc.x, y=tc.y,
+                lat=center.lat, lon=center.lon, api_key=api_key,
+            )
+        else:
+            url = url_template.format(z=tc.zoom, x=tc.x, y=tc.y)
+
+        async with semaphore:
+            try:
+                resp = await client.get(url)
+                resp.raise_for_status()
+                os.makedirs(os.path.dirname(out_path), exist_ok=True)
+                with open(out_path, "wb") as f:
+                    f.write(resp.content)
+                success += 1
+                await asyncio.sleep(delay_s)
+                return True
+            except Exception as exc:
+                errors += 1
+                print(f"  FAIL {tc.zoom}/{tc.x}/{tc.y}: {exc}")
+                return False
+
+    async with httpx.AsyncClient(
+        timeout=30.0,
+        headers={"User-Agent": _USER_AGENT},
+        follow_redirects=True,
+    ) as client:
+        tasks = [fetch_one(client, tc) for tc in tiles]
+        await asyncio.gather(*tasks)
+
+    return success, errors
+
+
+def main() -> int:
+    parser = argparse.ArgumentParser(description="Download satellite tiles for GPS-denied navigation")
+    parser.add_argument("--lat-min", type=float, required=True)
+    parser.add_argument("--lat-max", type=float, required=True)
+    parser.add_argument("--lon-min", type=float, required=True)
+    parser.add_argument("--lon-max", type=float, required=True)
+    parser.add_argument("--zoom", type=int, default=18)
+    parser.add_argument("--output", default=".satellite_tiles", help="Output directory")
+    parser.add_argument("--provider", choices=["osm", "google"], default="osm")
+    parser.add_argument("--api-key", default="", help="API key (for google provider)")
+    parser.add_argument("--max-concurrent", type=int, default=4)
+    parser.add_argument("--dry-run", action="store_true", help="Only count tiles, don't download")
+    args = parser.parse_args()
+
+    tiles = compute_tile_range(args.lat_min, args.lat_max, args.lon_min, args.lon_max, args.zoom)
+
+    # Estimate size: ~30KB per tile
+    est_mb = len(tiles) * 30 / 1024
+    print(f"Tiles: {len(tiles)} at zoom {args.zoom}")
+    print(f"Bounding box: ({args.lat_min}, {args.lon_min}) → ({args.lat_max}, {args.lon_max})")
+    print(f"Estimated size: ~{est_mb:.1f} MB")
+    print(f"Output: {args.output}")
+
+    if args.dry_run:
+        print("\n--dry-run: no tiles downloaded.")
+        return 0
+
+    t0 = time.time()
+    ok, err = asyncio.run(download_tiles(
+        tiles, args.output,
+        provider=args.provider, api_key=args.api_key,
+        max_concurrent=args.max_concurrent,
+    ))
+    elapsed = time.time() - t0
+
+    print(f"\nDone in {elapsed:.1f}s: {ok} downloaded, {err} errors")
+    return 1 if err > 0 else 0
+
+
+if __name__ == "__main__":
+    sys.exit(main())
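The downloader above leans on `gps_denied.utils.mercator.latlon_to_tile`, whose implementation is not shown in this diff. Assuming it follows the standard Web Mercator (slippy map) convention used by the OSM tile URLs, the tile indexing reduces to a short formula — a hypothetical stand-alone version:

```python
import math

# Hypothetical stand-alone sketch of standard slippy-map tile indexing;
# the project's latlon_to_tile is assumed to compute the equivalent.
def latlon_to_tile_xy(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int]:
    n = 2 ** zoom  # tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y

print(latlon_to_tile_xy(0.0, 0.0, 1))  # → (1, 1)
print(latlon_to_tile_xy(0.0, 0.0, 0))  # → (0, 0)
```

The NW corner of a bounding box maps to the smallest (x, y) pair and the SE corner to the largest, which is why `compute_tile_range` iterates from the NW tile to the SE tile inclusively.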
@@ -1,24 +1,60 @@
+import logging
 from typing import Annotated
 
-from fastapi import Depends, Request
+from fastapi import Depends, HTTPException, Request
+from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
 from sqlalchemy.ext.asyncio import AsyncSession
 
+from gps_denied.config import get_settings
 from gps_denied.core.processor import FlightProcessor
 from gps_denied.core.sse import SSEEventStreamer
 from gps_denied.db.engine import get_session
 from gps_denied.db.repository import FlightRepository
 
+logger = logging.getLogger(__name__)
+
 # Singleton instance of SSE Event Streamer
 _sse_streamer = SSEEventStreamer()
 
 # Singleton FlightProcessor (one per process, reused across requests)
 _processor: FlightProcessor | None = None
 
+# JWT Bearer scheme (auto_error=False — we handle errors ourselves)
+_bearer = HTTPBearer(auto_error=False)
+
+
+async def verify_token(
+    credentials: HTTPAuthorizationCredentials | None = Depends(_bearer),
+) -> None:
+    """JWT check. With AUTH_ENABLED=false, everything passes."""
+    settings = get_settings()
+    if not settings.auth.enabled:
+        return  # dev/SITL: authentication disabled
+
+    if credentials is None:
+        raise HTTPException(status_code=401, detail="Authorization header required")
+
+    try:
+        import jwt
+
+        jwt.decode(
+            credentials.credentials,
+            settings.auth.secret_key,
+            algorithms=[settings.auth.algorithm],
+        )
+    except ImportError:
+        logger.warning("PyJWT not installed — JWT validation skipped")
+    except Exception as exc:
+        raise HTTPException(status_code=401, detail=f"Invalid token: {exc}") from exc
+
+
 def get_sse_streamer() -> SSEEventStreamer:
     return _sse_streamer
 
-async def get_repository(session: AsyncSession = Depends(get_session)) -> FlightRepository:
+
+async def get_repository(
+    session: AsyncSession = Depends(get_session),
+) -> FlightRepository:
     return FlightRepository(session)
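The `verify_token` dependency above delegates signature checking to PyJWT. For intuition, an HS256 JWT is just `base64url(header).base64url(payload).base64url(HMAC-SHA256(secret, header.payload))`; the stdlib-only sketch below (hypothetical helpers, not a replacement for `jwt.decode`, which also validates claims such as `exp`) shows what the signature check amounts to:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: str) -> str:
    """Build a minimal HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = _b64url(hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_hs256(token: str, secret: str) -> bool:
    """Check only the signature — what jwt.decode does, minus claim validation."""
    header, body, sig = token.split(".")
    expected = _b64url(hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign_hs256({"sub": "operator"}, "dev-secret-change-in-production")
print(verify_hs256(token, "dev-secret-change-in-production"))  # → True
print(verify_hs256(token, "wrong-secret"))                     # → False
```

`hmac.compare_digest` is used for the comparison to avoid timing side channels, mirroring what real JWT libraries do internally.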
@@ -29,17 +65,19 @@ async def get_flight_processor(
 ) -> FlightProcessor:
     global _processor
     if _processor is None:
-        _processor = FlightProcessor(repo, sse)
-        # Attach pipeline components from lifespan (P1#4)
+        eskf_config = getattr(request.app.state, "eskf_config", None)
+        _processor = FlightProcessor(repo, sse, eskf_config=eskf_config)
+        # Attach pipeline components from lifespan
         components = getattr(request.app.state, "pipeline_components", None)
         if components:
             _processor.attach_components(**components)
-    # Always update repo (new session per request)
+    # Update repo (new session per request)
     _processor.repository = repo
     return _processor
 
 
-# Type aliases for cleaner router definitions
+# Aliases for convenience in routers
 SessionDep = Annotated[AsyncSession, Depends(get_session)]
 RepoDep = Annotated[FlightRepository, Depends(get_repository)]
 ProcessorDep = Annotated[FlightProcessor, Depends(get_flight_processor)]
+AuthDep = Annotated[None, Depends(verify_token)]
@@ -7,9 +7,10 @@ import json
 from typing import Annotated
 
 from fastapi import APIRouter, File, Form, HTTPException, Path, UploadFile
+from fastapi import Depends as _Depends
 from sse_starlette.sse import EventSourceResponse
 
-from gps_denied.api.deps import ProcessorDep, SessionDep
+from gps_denied.api.deps import ProcessorDep, SessionDep, verify_token
 from gps_denied.schemas.flight import (
     BatchMetadata,
     BatchResponse,
@@ -27,7 +28,7 @@ from gps_denied.schemas.flight import (
     Waypoint,
 )
 
-router = APIRouter(prefix="/flights", tags=["flights"])
+router = APIRouter(prefix="/flights", tags=["flights"], dependencies=[_Depends(verify_token)])
 
 
 @router.post("", response_model=FlightResponse, status_code=201)
@@ -1,43 +1,76 @@
 """FastAPI application factory."""
 
+import logging
 from contextlib import asynccontextmanager
 
 from fastapi import FastAPI
 
 from gps_denied import __version__
 from gps_denied.api.routers import flights
+from gps_denied.config import get_settings
+
+logger = logging.getLogger(__name__)
 
 
 @asynccontextmanager
 async def lifespan(app: FastAPI):
     """Initialise core pipeline components on startup."""
     from gps_denied.core.chunk_manager import RouteChunkManager
+    from gps_denied.core.coordinates import CoordinateTransformer
     from gps_denied.core.gpr import GlobalPlaceRecognition
     from gps_denied.core.graph import FactorGraphOptimizer
+    from gps_denied.core.mavlink import MAVLinkBridge
     from gps_denied.core.metric import MetricRefinement
     from gps_denied.core.models import ModelManager
     from gps_denied.core.recovery import FailureRecoveryCoordinator
     from gps_denied.core.rotation import ImageRotationManager
-    from gps_denied.core.vo import SequentialVisualOdometry
+    from gps_denied.core.satellite import SatelliteDataManager
+    from gps_denied.core.vo import create_vo_backend
+    from gps_denied.schemas.eskf import ESKFConfig
     from gps_denied.schemas.graph import FactorGraphConfig
 
-    mm = ModelManager()
-    vo = SequentialVisualOdometry(mm)
+    settings = get_settings()
+
+    mm = ModelManager(engine_dir=str(settings.models.weights_dir))
+    vo = create_vo_backend(model_manager=mm)
     gpr = GlobalPlaceRecognition(mm)
     metric = MetricRefinement(mm)
     graph = FactorGraphOptimizer(FactorGraphConfig())
     chunk_mgr = RouteChunkManager(graph)
     recovery = FailureRecoveryCoordinator(chunk_mgr, gpr, metric)
     rotation = ImageRotationManager(mm)
+    coord = CoordinateTransformer()
+    satellite = SatelliteDataManager(tile_dir=settings.satellite.tile_dir)
+    mavlink = MAVLinkBridge(
+        connection_string=settings.mavlink.connection,
+        output_hz=settings.mavlink.output_hz,
+        telemetry_hz=settings.mavlink.telemetry_hz,
+    )
+
+    # ESKF config from env vars (per-airframe tuning)
+    eskf_config = ESKFConfig(**settings.eskf.model_dump())
 
     # Store on app.state so deps can access them
     app.state.pipeline_components = {
         "vo": vo, "gpr": gpr, "metric": metric,
         "graph": graph, "recovery": recovery,
         "chunk_mgr": chunk_mgr, "rotation": rotation,
+        "coord": coord, "satellite": satellite,
+        "mavlink": mavlink,
     }
+    app.state.eskf_config = eskf_config
+
+    logger.info(
+        "Pipeline ready — MAVLink: %s, tiles: %s",
+        settings.mavlink.connection, settings.satellite.tile_dir,
+    )
     yield
 
     # Cleanup
+    try:
+        await mavlink.stop()
+    except Exception:
+        pass
     app.state.pipeline_components = None
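The lifespan function above builds components once at startup, hands control back with `yield`, and tears down after the app stops serving. A minimal stdlib illustration of that ordering, with a toy state dict standing in for `app.state`:

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def lifespan(state: dict):
    # startup: build components once
    state["mavlink"] = "connected"
    yield
    # shutdown: runs only after the context exits
    state["mavlink"] = None

async def demo() -> dict:
    state: dict = {}
    async with lifespan(state):
        assert state["mavlink"] == "connected"  # available while "serving"
    return state

result = asyncio.run(demo())
print(result)  # → {'mavlink': None}
```

This is why the cleanup block after `yield` is the right place for `await mavlink.stop()`: it runs exactly once, on shutdown, with the components still in scope.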
@@ -97,6 +97,61 @@ class RotationConfig(BaseSettings):
         return int(360 / self.step_degrees)
 
 
+class AuthConfig(BaseSettings):
+    """JWT authentication settings."""
+
+    model_config = SettingsConfigDict(env_prefix="AUTH_")
+
+    enabled: bool = False  # False for dev/SITL, True for production
+    secret_key: str = "dev-secret-change-in-production"
+    algorithm: str = "HS256"
+    ssl_certfile: str = ""  # path to TLS cert (empty = no TLS)
+    ssl_keyfile: str = ""  # path to TLS key
+
+
+class MAVLinkConfig(BaseSettings):
+    """MAVLink I/O bridge settings."""
+
+    model_config = SettingsConfigDict(env_prefix="MAVLINK_")
+
+    connection: str = Field(
+        default="udp:127.0.0.1:14550",
+        description="pymavlink connection string (serial:/dev/ttyTHS1:57600 or tcp:host:port)",
+    )
+    output_hz: float = 5.0
+    telemetry_hz: float = 1.0
+
+
+class SatelliteConfig(BaseSettings):
+    """Pre-loaded satellite tile directory settings."""
+
+    model_config = SettingsConfigDict(env_prefix="SATELLITE_")
+
+    tile_dir: str = ".satellite_tiles"
+    zoom_level: int = 18
+
+
+class ESKFSettings(BaseSettings):
+    """ESKF tuning parameters (overridable via env vars)."""
+
+    model_config = SettingsConfigDict(env_prefix="ESKF_")
+
+    accel_noise_density: float = 6.86e-4
+    gyro_noise_density: float = 5.24e-5
+    accel_random_walk: float = 2.0e-3
+    gyro_random_walk: float = 8.73e-7
+    vo_position_noise: float = 0.3
+    sat_noise_min: float = 5.0
+    sat_noise_max: float = 20.0
+    satellite_max_age: float = 30.0
+    covariance_high_threshold: float = 400.0
+    init_pos_var: float = 100.0
+    init_vel_var: float = 1.0
+    init_att_var: float = 0.01
+    init_accel_bias_var: float = 1e-4
+    init_gyro_bias_var: float = 1e-6
+
+
 class AppSettings(BaseSettings):
     """Root settings — aggregates all sub-configs."""
 
@@ -114,8 +169,18 @@ class AppSettings(BaseSettings):
     area: OperationalArea = Field(default_factory=OperationalArea)
     recovery: RecoveryConfig = Field(default_factory=RecoveryConfig)
     rotation: RotationConfig = Field(default_factory=RotationConfig)
+    auth: AuthConfig = Field(default_factory=AuthConfig)
+    mavlink: MAVLinkConfig = Field(default_factory=MAVLinkConfig)
+    satellite: SatelliteConfig = Field(default_factory=SatelliteConfig)
+    eskf: ESKFSettings = Field(default_factory=ESKFSettings)
+
+
+_settings: AppSettings | None = None


 def get_settings() -> AppSettings:
-    """Singleton-like factory for application settings."""
-    return AppSettings()
+    """Cached singleton for application settings."""
+    global _settings
+    if _settings is None:
+        _settings = AppSettings()
+    return _settings
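The `get_settings()` change above replaces per-call construction with a cached singleton via a module-level global. The same effect is often achieved with `functools.lru_cache`; a minimal sketch of the pattern, with `DemoSettings` as an illustrative stand-in for the repo's pydantic `AppSettings`:

```python
from functools import lru_cache


class DemoSettings:
    """Stand-in for the pydantic AppSettings; reads config once at construction."""

    def __init__(self) -> None:
        self.tile_dir = ".satellite_tiles"


@lru_cache(maxsize=1)
def get_settings() -> DemoSettings:
    # First call constructs the settings; later calls return the cached instance.
    return DemoSettings()
```

Either way, every caller observes the same object, so env vars are read exactly once per process.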
@@ -307,8 +307,8 @@ class AccuracyBenchmark:
             if self.sat_correction_fn is not None:
                 sat_pos_enu = self.sat_correction_fn(frame)
             else:
-                # Default: inject ground-truth position + realistic noise (10–20m)
-                noise_m = 15.0
+                # Default: inject ground-truth position + realistic noise
+                noise_m = 10.0
                 sat_pos_enu = (
                     frame.true_position_enu[:3]
                     + np.random.randn(3) * noise_m
@@ -316,7 +316,8 @@
                 sat_pos_enu[2] = frame.true_position_enu[2]  # keep altitude

             if sat_pos_enu is not None:
-                eskf.update_satellite(sat_pos_enu, noise_meters=15.0)
+                # Tell ESKF the measurement noise matches what we inject
+                eskf.update_satellite(sat_pos_enu, noise_meters=noise_m)

             latency_ms = (time.perf_counter() - t_frame_start) * 1000.0
             latencies_ms.append(latency_ms)
@@ -241,7 +241,6 @@ class ESKF:
         R_vo = (self.config.vo_position_noise ** 2) * np.eye(3)
         S = H_vo @ self._P @ H_vo.T + R_vo
         K = self._P @ H_vo.T @ np.linalg.inv(S)
-
         delta_x = K @ z  # (15,)
         self._apply_correction(delta_x)

@@ -275,8 +274,16 @@ class ESKF:

         R_sat = (noise_meters ** 2) * np.eye(3)
         S = H_sat @ self._P @ H_sat.T + R_sat
-        K = self._P @ H_sat.T @ np.linalg.inv(S)
+        S_inv = np.linalg.inv(S)
+
+        # Mahalanobis outlier gate
+        mahal_sq = float(z @ S_inv @ z)
+        if mahal_sq > self.config.mahalanobis_threshold:
+            logger.warning("Satellite outlier rejected: Mahalanobis² %.1f > %.1f",
+                           mahal_sq, self.config.mahalanobis_threshold)
+            return z
+
+        K = self._P @ H_sat.T @ S_inv
         delta_x = K @ z  # (15,)
         self._apply_correction(delta_x)

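The gate added above rejects a satellite fix when the squared Mahalanobis distance of the innovation exceeds a chi-squared threshold. A self-contained sketch of the same check (function name and the example covariance are illustrative, not the repo's API):

```python
import numpy as np


def mahalanobis_gate(z: np.ndarray, S: np.ndarray, threshold: float = 16.27) -> bool:
    """Return True if innovation z passes the chi-squared gate.

    z: innovation (measurement minus prediction), shape (3,)
    S: innovation covariance H @ P @ H.T + R, shape (3, 3)
    threshold 16.27 is roughly the chi-squared inverse CDF at 0.999 for 3 DOF.
    """
    S_inv = np.linalg.inv(S)
    mahal_sq = float(z @ S_inv @ z)
    return mahal_sq <= threshold


# With a 5 m measurement sigma (S = 25 * I), a 3 m innovation passes the gate
S = 25.0 * np.eye(3)
assert mahalanobis_gate(np.array([3.0, 0.0, 0.0]), S)
# while a 100 m jump is flagged as an outlier
assert not mahalanobis_gate(np.array([100.0, 0.0, 0.0]), S)
```

Scaling `S` by the measurement noise means the gate automatically loosens for noisy fixes and tightens for confident ones.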
@@ -347,6 +354,32 @@ class ESKF:
         """True if ESKF has been initialized with a GPS position."""
         return self._initialized

+    @property
+    def position(self) -> np.ndarray:
+        """Current nominal position (ENU metres). Returns zeros if not initialized."""
+        if self._nominal_state is None:
+            return np.zeros(3)
+        return self._nominal_state["position"]
+
+    @property
+    def quaternion(self) -> np.ndarray:
+        """Current nominal quaternion [w, x, y, z]."""
+        if self._nominal_state is None:
+            return np.array([1.0, 0.0, 0.0, 0.0])
+        return self._nominal_state["quaternion"]
+
+    @property
+    def covariance(self) -> np.ndarray:
+        """Current 15x15 covariance matrix."""
+        if self._P is None:
+            return np.zeros((15, 15))
+        return self._P
+
+    @property
+    def last_timestamp(self) -> float:
+        """Timestamp of last update (seconds since epoch)."""
+        return self._last_timestamp or 0.0
+
     # ------------------------------------------------------------------
     # Internal helpers
     # ------------------------------------------------------------------
@@ -64,7 +64,7 @@ def _confidence_to_fix_type(confidence: ConfidenceTier) -> int:
     """Map ESKF confidence tier to GPS_INPUT fix_type (MAV-02)."""
     return {
         ConfidenceTier.HIGH: 3,  # 3D fix
-        ConfidenceTier.MEDIUM: 2,  # 2D fix
+        ConfidenceTier.MEDIUM: 3,  # 3D fix (VO tracking valid per solution.md)
         ConfidenceTier.LOW: 0,
         ConfidenceTier.FAILED: 0,
     }.get(confidence, 0)
@@ -95,7 +95,7 @@ def _eskf_to_gps_input(

     # Accuracy from covariance (position block = rows 0-2, cols 0-2)
     cov_pos = state.covariance[:3, :3]
-    sigma_h = math.sqrt(max(0.0, (cov_pos[0, 0] + cov_pos[1, 1]) / 2.0))
+    sigma_h = math.sqrt(max(0.0, cov_pos[0, 0] + cov_pos[1, 1]))
     sigma_v = math.sqrt(max(0.0, cov_pos[2, 2]))
     speed_sigma = math.sqrt(max(0.0, (state.covariance[3, 3] + state.covariance[4, 4]) / 2.0))

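The numeric effect of the horizontal-accuracy fix is easy to check; with both horizontal variances at 100 m² (the values the updated unit test appears to assume):

```python
import math

# Diagonal of the position covariance block for a hypothetical ESKF state (m^2)
p_xx, p_yy = 100.0, 100.0

# Old formula: sqrt of the per-axis average variance
old_sigma_h = math.sqrt((p_xx + p_yy) / 2.0)   # 10.0 m

# Fixed formula: horizontal 1-sigma radius combines both axis variances
sigma_h = math.sqrt(p_xx + p_yy)               # sqrt(200) ≈ 14.14 m

assert math.isclose(old_sigma_h, 10.0)
assert math.isclose(sigma_h, math.sqrt(200.0))
```

The combined form reports a larger, more conservative `horiz_accuracy` to the autopilot for the same covariance.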
@@ -124,6 +124,7 @@ def _eskf_to_gps_input(
         speed_accuracy=round(speed_sigma, 2),
         horiz_accuracy=round(sigma_h, 2),
         vert_accuracy=round(sigma_v, 2),
+        satellites_visible=10,
     )

@@ -67,11 +67,17 @@ class FrameResult:
 class FlightProcessor:
     """Manages business logic, background processing, and frame orchestration."""

-    def __init__(self, repository: FlightRepository, streamer: SSEEventStreamer) -> None:
+    def __init__(
+        self,
+        repository: FlightRepository,
+        streamer: SSEEventStreamer,
+        eskf_config=None,
+    ) -> None:
         self.repository = repository
         self.streamer = streamer
         self.result_manager = ResultManager(repository, streamer)
         self.pipeline = ImageInputPipeline(storage_dir=".image_storage", max_queue_size=50)
+        self._eskf_config = eskf_config  # ESKFConfig or None → default

         # Per-flight processing state
         self._flight_states: dict[str, TrackingState] = {}
@@ -128,7 +134,7 @@ class FlightProcessor:
         """Create and initialize a per-flight ESKF instance."""
         if flight_id in self._eskf:
             return
-        eskf = ESKF()
+        eskf = ESKF(config=self._eskf_config)
         if self._coord:
             try:
                 e, n, _ = self._coord.gps_to_enu(flight_id, start_gps)
@@ -141,10 +147,10 @@ class FlightProcessor:

     def _eskf_to_gps(self, flight_id: str, eskf: ESKF) -> Optional[GPSPoint]:
         """Convert current ESKF ENU position to WGS84 GPS."""
-        if not eskf.initialized or eskf._nominal_state is None or self._coord is None:
+        if not eskf.initialized or self._coord is None:
             return None
         try:
-            pos = eskf._nominal_state["position"]
+            pos = eskf.position
             return self._coord.enu_to_gps(flight_id, (float(pos[0]), float(pos[1]), float(pos[2])))
         except Exception:
             return None
@@ -205,7 +211,7 @@ class FlightProcessor:
             # PIPE-01: Feed VO relative displacement into ESKF
             if eskf and eskf.initialized:
                 now = time.time()
-                dt_vo = max(0.01, now - (eskf._last_timestamp or now))
+                dt_vo = max(0.01, now - (eskf.last_timestamp or now))
                 eskf.update_vo(rel_pose.translation, dt_vo)
         except Exception as exc:
             logger.warning("VO failed for frame %d: %s", frame_id, exc)
@@ -254,9 +260,10 @@ class FlightProcessor:
         if self._satellite and eskf and eskf.initialized:
             gps_est = self._eskf_to_gps(flight_id, eskf)
             if gps_est:
+                cov = eskf.covariance
                 sigma_h = float(
-                    np.sqrt(np.trace(eskf._P[0:3, 0:3]) / 3.0)
-                ) if eskf._P is not None else 30.0
+                    np.sqrt(np.trace(cov[0:3, 0:3]) / 3.0)
+                ) if cov is not None else 30.0
                 sigma_h = max(sigma_h, 5.0)
                 try:
                     tile_result = await asyncio.get_event_loop().run_in_executor(
@@ -381,6 +388,13 @@ class FlightProcessor:
         self._coord.set_enu_origin(flight.id, req.start_gps)
         self._init_eskf_for_flight(flight.id, req.start_gps, req.altitude or 100.0)

+        # Start MAVLink bridge for this flight (origin required for GPS_INPUT)
+        if self._mavlink and not self._mavlink._running:
+            try:
+                asyncio.create_task(self._mavlink.start(req.start_gps))
+            except Exception as exc:
+                logger.warning("MAVLink bridge start failed: %s", exc)
+
         return FlightResponse(
             flight_id=flight.id,
             status="prefetching",
@@ -509,6 +523,19 @@ class FlightProcessor:
         await self.repository.save_flight_state(
             flight_id, blocked=False, status="processing"
         )
+
+        # Inject operator position into ESKF with high uncertainty (500m)
+        eskf = self._eskf.get(flight_id)
+        if eskf and eskf.initialized and self._coord:
+            try:
+                e, n, _ = self._coord.gps_to_enu(flight_id, req.satellite_gps)
+                alt = self._altitudes.get(flight_id, 100.0)
+                eskf.update_satellite(np.array([e, n, alt]), noise_meters=500.0)
+                self._failure_counts[flight_id] = 0
+                logger.info("User fix applied for %s: %s", flight_id, req.satellite_gps)
+            except Exception as exc:
+                logger.warning("User fix ESKF injection failed: %s", exc)
+
         return UserFixResponse(
             accepted=True, processing_resumed=True, message="Fix applied."
         )
@@ -535,9 +562,9 @@ class FlightProcessor:
         # PIPE-06: Use real CoordinateTransformer + ESKF pose for ray-ground projection
         gps: Optional[GPSPoint] = None
         eskf = self._eskf.get(flight_id)
-        if self._coord and eskf and eskf.initialized and eskf._nominal_state is not None:
-            pos = eskf._nominal_state["position"]
-            quat = eskf._nominal_state["quaternion"]
+        if self._coord and eskf and eskf.initialized:
+            pos = eskf.position
+            quat = eskf.quaternion
             cam = self._flight_cameras.get(flight_id, CameraParameters(
                 focal_length=4.5, sensor_width=6.17, sensor_height=4.55,
                 resolution_width=640, resolution_height=480,
@@ -1,5 +1,6 @@
 """Image Rotation Manager (Component F06)."""

+import math
 from abc import ABC, abstractmethod
 from datetime import datetime

@@ -85,16 +86,13 @@ class ImageRotationManager:
         return None

     def calculate_precise_angle(self, homography: np.ndarray | None, initial_angle: float) -> float:
-        """Calculates precise rotation angle from homography matrix."""
+        """Calculates precise rotation angle from homography's affine component."""
         if homography is None:
             return initial_angle

-        # Extract rotation angle from 2D affine component of homography
-        # h00, h01 = homography[0, 0], homography[0, 1]
-        # angle_delta = math.degrees(math.atan2(h01, h00))
-
-        # For simplicity in mock, just return initial
-        return initial_angle
+        # Extract rotation from top-left 2x2 of the homography
+        angle_delta = math.degrees(math.atan2(homography[0, 1], homography[0, 0]))
+        return (initial_angle + angle_delta) % 360.0

     def get_current_heading(self, flight_id: str) -> float | None:
         """Gets current UAV heading angle."""
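The angle extraction above reads the planar rotation out of the homography's top-left 2x2 block via `atan2`. A standalone sketch of the same idea; note the sign convention depends on how the rotation is embedded, and this sketch uses the convention where `H[0, 1] = sin(theta)`, matching the patched code (the function name is illustrative, not the repo's API):

```python
import math

import numpy as np


def angle_from_homography(H: np.ndarray) -> float:
    """Rotation in degrees encoded in the top-left 2x2 of a 3x3 homography."""
    return math.degrees(math.atan2(H[0, 1], H[0, 0]))


# A pure 30-degree planar rotation embedded in a homography
theta = math.radians(30.0)
H = np.array([
    [math.cos(theta), math.sin(theta), 0.0],
    [-math.sin(theta), math.cos(theta), 0.0],
    [0.0, 0.0, 1.0],
])
assert math.isclose(angle_from_homography(H), 30.0, abs_tol=1e-9)
```

For homographies with shear or perspective terms this is only an approximation of the rotation, which is acceptable here since the result merely refines `initial_angle`.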
@@ -4,6 +4,8 @@ SAT-01: Reads pre-loaded tiles from a local z/x/y directory (no live HTTP during
 SAT-02: Tile selection uses ESKF position ± 3σ_horizontal to define search area.
 """

+import hashlib
+import logging
 import math
 import os
 from concurrent.futures import ThreadPoolExecutor
@@ -26,6 +28,8 @@ class SatelliteDataManager:
     downloads and stores tiles before the mission.
     """

+    _logger = logging.getLogger(__name__)
+
     def __init__(
         self,
         tile_dir: str = ".satellite_tiles",
@@ -37,11 +41,47 @@ class SatelliteDataManager:
         # In-memory LRU for hot tiles (avoids repeated disk reads)
         self._mem_cache: dict[str, np.ndarray] = {}
         self._mem_cache_max = 256
+        # SHA-256 manifest for tile integrity (if the file exists)
+        self._manifest: dict[str, str] = self._load_manifest()

     # ------------------------------------------------------------------
     # SAT-01: Local tile reads (no HTTP)
     # ------------------------------------------------------------------

+    def _load_manifest(self) -> dict[str, str]:
+        """Load the SHA-256 manifest from tile_dir/manifest.sha256."""
+        path = os.path.join(self.tile_dir, "manifest.sha256")
+        if not os.path.isfile(path):
+            return {}
+        manifest: dict[str, str] = {}
+        with open(path) as f:
+            for line in f:
+                line = line.strip()
+                if not line or line.startswith("#"):
+                    continue
+                parts = line.split(maxsplit=1)
+                if len(parts) == 2:
+                    manifest[parts[1].strip()] = parts[0].strip()
+        return manifest
+
+    def _verify_tile_integrity(self, rel_path: str, file_path: str) -> bool:
+        """Verify a tile's SHA-256 against the manifest (if a manifest exists)."""
+        if not self._manifest:
+            return True  # no manifest: skip verification
+        expected = self._manifest.get(rel_path)
+        if expected is None:
+            return True  # tile not in manifest: OK
+        sha = hashlib.sha256()
+        with open(file_path, "rb") as f:
+            for chunk in iter(lambda: f.read(8192), b""):
+                sha.update(chunk)
+        actual = sha.hexdigest()
+        if actual != expected:
+            self._logger.warning("Tile integrity failed: %s (exp %s, got %s)",
+                                 rel_path, expected[:12], actual[:12])
+            return False
+        return True
+
     def load_local_tile(self, tile_coords: TileCoords) -> np.ndarray | None:
         """Load a tile image from the local pre-loaded directory.
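The verification code above consumes a `manifest.sha256` file of `<hexdigest> <z/x/y.png>` lines. A sketch of the producing side, the kind of helper a tile-download script would use (the function name is illustrative; the repo's `download_tiles.py` may do this differently):

```python
import hashlib
import os


def build_manifest(tile_dir: str) -> str:
    """Return manifest.sha256 content: one '<hexdigest>  <z/x/y.png>' line
    per tile, hash first and relative path second, matching the parser above."""
    lines = []
    for root, _dirs, files in os.walk(tile_dir):
        for name in sorted(files):
            if not name.endswith(".png"):
                continue
            full = os.path.join(root, name)
            # Store paths with forward slashes so keys match on any OS
            rel = os.path.relpath(full, tile_dir).replace(os.sep, "/")
            sha = hashlib.sha256()
            with open(full, "rb") as f:
                for chunk in iter(lambda: f.read(8192), b""):
                    sha.update(chunk)
            lines.append(f"{sha.hexdigest()}  {rel}")
    return "\n".join(lines)
```

The two-space `hash  path` layout is also what `sha256sum` emits, so a manifest can alternatively be produced with `cd tiles && sha256sum */*/*.png > manifest.sha256`.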
@@ -52,11 +92,14 @@ class SatelliteDataManager:
         if key in self._mem_cache:
             return self._mem_cache[key]

-        path = os.path.join(self.tile_dir, str(tile_coords.zoom),
-                            str(tile_coords.x), f"{tile_coords.y}.png")
+        rel_path = f"{tile_coords.zoom}/{tile_coords.x}/{tile_coords.y}.png"
+        path = os.path.join(self.tile_dir, rel_path)
         if not os.path.isfile(path):
             return None

+        if not self._verify_tile_integrity(rel_path, path):
+            return None  # tile corrupted
+
         img = cv2.imread(path, cv2.IMREAD_COLOR)
         if img is None:
             return None
|||||||
@@ -105,6 +105,30 @@ class SSEEventStreamer:
|
|||||||
# but we'll handle this in the stream generator directly
|
# but we'll handle this in the stream generator directly
|
||||||
return True
|
return True
|
||||||
|
|
||||||
|
# ── Generic event dispatcher (used by processor.process_frame) ──────────
|
||||||
|
|
||||||
|
async def push_event(self, flight_id: str, event_type: str, data: dict) -> None:
|
||||||
|
"""Dispatch a generic event to all clients for a flight.
|
||||||
|
|
||||||
|
Maps event_type strings to typed SSE events:
|
||||||
|
"frame_result" → FrameProcessedEvent
|
||||||
|
"refinement" → FrameProcessedEvent (refined)
|
||||||
|
Other → raw broadcast via SSEMessage
|
||||||
|
"""
|
||||||
|
if event_type == "frame_result":
|
||||||
|
evt = FrameProcessedEvent(**data) if not isinstance(data, FrameProcessedEvent) else data
|
||||||
|
self.send_frame_result(flight_id, evt)
|
||||||
|
elif event_type == "refinement":
|
||||||
|
evt = FrameProcessedEvent(**data) if not isinstance(data, FrameProcessedEvent) else data
|
||||||
|
self.send_refinement(flight_id, evt)
|
||||||
|
else:
|
||||||
|
msg = SSEMessage(
|
||||||
|
event=SSEEventType.FRAME_PROCESSED,
|
||||||
|
data=data,
|
||||||
|
id=str(data.get("frame_id", "")),
|
||||||
|
)
|
||||||
|
self._broadcast(flight_id, msg)
|
||||||
|
|
||||||
# ── Stream Generator ──────────────────────────────────────────────────────
|
# ── Stream Generator ──────────────────────────────────────────────────────
|
||||||
|
|
||||||
async def stream_generator(self, flight_id: str, client_id: str):
|
async def stream_generator(self, flight_id: str, client_id: str):
|
||||||
|
|||||||
@@ -51,6 +51,9 @@ class ESKFConfig(BaseModel):
     init_accel_bias_var: float = 0.01  # (m/s^2)^2
     init_gyro_bias_var: float = 1e-6  # (rad/s)^2

+    # Mahalanobis outlier rejection (chi-squared gate for 3 DOF)
+    mahalanobis_threshold: float = 16.27  # chi2(3, 0.999); Mahalanobis distance ≈ 4σ
+

 class ESKFState(BaseModel):
     """Full ESKF nominal state snapshot."""
@@ -28,7 +28,7 @@ class GPSInputMessage(BaseModel):
     speed_accuracy: float  # m/s
     horiz_accuracy: float  # m
     vert_accuracy: float  # m
-    satellites_visible: int = 0
+    satellites_visible: int = 10  # synthetic — prevents ArduPilot satellite-count failsafes


 class IMUMessage(BaseModel):
@@ -0,0 +1,61 @@
+"""Shared test fixtures for GPS-denied-onboard test suite."""
+
+from unittest.mock import AsyncMock, MagicMock
+
+import numpy as np
+import pytest
+
+from gps_denied.core.coordinates import CoordinateTransformer
+from gps_denied.core.models import ModelManager
+from gps_denied.schemas import CameraParameters, GPSPoint
+
+# ---------------------------------------------------------------
+# Common constants
+# ---------------------------------------------------------------
+
+TEST_ORIGIN = GPSPoint(lat=49.0, lon=32.0)
+TEST_CAMERA = CameraParameters(
+    focal_length=4.5,
+    sensor_width=6.17,
+    sensor_height=4.55,
+    resolution_width=640,
+    resolution_height=480,
+)
+
+
+# ---------------------------------------------------------------
+# Shared fixtures
+# ---------------------------------------------------------------
+
+@pytest.fixture
+def mock_repo():
+    """Mock FlightRepository."""
+    return MagicMock()
+
+
+@pytest.fixture
+def mock_streamer():
+    """Mock SSEEventStreamer with async push_event."""
+    s = MagicMock()
+    s.push_event = AsyncMock()
+    return s
+
+
+@pytest.fixture
+def model_manager():
+    """Singleton ModelManager (mock engines)."""
+    return ModelManager()
+
+
+@pytest.fixture
+def coord_transformer():
+    """CoordinateTransformer with a preset test ENU origin."""
+    ct = CoordinateTransformer()
+    ct.set_enu_origin("test_flight", TEST_ORIGIN)
+    return ct
+
+
+@pytest.fixture
+def sample_image():
+    """Random 200x200 RGB image for pipeline tests."""
+    return np.random.randint(0, 255, (200, 200, 3), dtype=np.uint8)
@@ -187,7 +187,8 @@ class TestESKFSatelliteUpdate:
         eskf.predict(imu)

         pos_before = eskf.get_state().position.copy()
-        target_pos = np.array([50.0, 50.0, 0.0])
+        # Use target within Mahalanobis gate (small shift relative to P)
+        target_pos = np.array([5.0, 5.0, 0.0])

         eskf.update_satellite(target_pos, noise_meters=10.0)

@@ -76,7 +76,7 @@ def test_unix_to_gps_time_recent():
 def test_confidence_to_fix_type():
     """MAV-02: confidence tier → fix_type mapping."""
     assert _confidence_to_fix_type(ConfidenceTier.HIGH) == 3
-    assert _confidence_to_fix_type(ConfidenceTier.MEDIUM) == 2
+    assert _confidence_to_fix_type(ConfidenceTier.MEDIUM) == 3
     assert _confidence_to_fix_type(ConfidenceTier.LOW) == 0
     assert _confidence_to_fix_type(ConfidenceTier.FAILED) == 0

@@ -125,7 +125,14 @@ def test_eskf_to_gps_input_accuracy_from_covariance():
         confidence=ConfidenceTier.HIGH,
     )
     msg = _eskf_to_gps_input(state, ORIGIN)
-    assert math.isclose(msg.horiz_accuracy, 10.0, abs_tol=0.01)
+    # horiz_accuracy = sqrt(P[0,0] + P[1,1]) = sqrt(200) ≈ 14.14
+    assert math.isclose(msg.horiz_accuracy, math.sqrt(200.0), abs_tol=0.1)
+
+
+def test_eskf_to_gps_input_satellites_visible_10():
+    """Synthetic satellites_visible = 10 to prevent ArduPilot failsafes."""
+    msg = _eskf_to_gps_input(_make_state(), ORIGIN)
+    assert msg.satellites_visible == 10
+
+
 def test_eskf_to_gps_input_returns_message():