detections/src/constants_inf.pxd
Oleksandr Bezdieniezhnykh 2149cd6c08 [AZ-180] Add Jetson Orin Nano support with INT8 TensorRT engine
- Dockerfile.jetson: JetPack 6.x L4T base image (aarch64), TensorRT and PyCUDA from apt
- requirements-jetson.txt: derived from requirements.txt, no pip tensorrt/pycuda
- docker-compose.jetson.yml: runtime: nvidia for NVIDIA Container Runtime
- tensorrt_engine.pyx: convert_from_source accepts an optional calib_cache_path; INT8 is used when the calibration cache is present, with FP16 as the fallback; get_engine_filename encodes a precision suffix so cached INT8 and FP16 engines never collide
- inference.pyx: init_ai looks up the INT8 engine first, then falls back to FP16; downloads the calibration cache before starting the conversion thread; passes the cache path through to convert_from_source
- constants_inf: add INT8_CALIB_CACHE_FILE constant
- Unit tests for AC-3 (INT8 flag set when cache provided) and AC-4 (FP16 when no cache)
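The precision-selection and engine-naming behavior described above can be sketched in plain Python (the real implementation is Cython in tensorrt_engine.pyx; the helper names `select_precision` and the `.{precision}.engine` filename pattern here are illustrative assumptions, not the repository's actual API):

```python
import os

def select_precision(calib_cache_path):
    """Pick engine precision: INT8 when a calibration cache file
    exists (AC-3), otherwise fall back to FP16 (AC-4)."""
    if calib_cache_path and os.path.isfile(calib_cache_path):
        return "int8"
    return "fp16"

def get_engine_filename(model_name, precision):
    """Encode the precision in the engine filename so INT8 and FP16
    engines are cached under distinct names on disk."""
    return f"{model_name}.{precision}.engine"
```

With this split, init_ai can probe the INT8 filename first and fall back to FP16, matching the lookup order described for inference.pyx.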

Made-with: Cursor
2026-04-02 07:12:45 +03:00

31 lines · 584 B · Cython

cdef str CONFIG_FILE
cdef str AI_ONNX_MODEL_FILE
cdef str INT8_CALIB_CACHE_FILE
cdef str CDN_CONFIG
cdef str MODELS_FOLDER
cdef int SMALL_SIZE_KB
cdef str SPLIT_SUFFIX
cdef double TILE_DUPLICATE_CONFIDENCE_THRESHOLD
cdef int METERS_IN_TILE
cdef log(str log_message)
cdef logerror(str error)
cdef format_time(long ms)
cdef class AnnotationClass:
    cdef public int id
    cdef public str name
    cdef public str color
    cdef public int max_object_size_meters
cdef dict[int, AnnotationClass] annotations_dict
cdef enum WeatherMode:
    Norm = 0
    Wint = 20
    Night = 40
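The AnnotationClass registry declared above can be mirrored in plain Python to show the intended shape of annotations_dict (a map from class id to class metadata). The field names match the .pxd; the example entries below are invented for illustration and are not taken from the real model configuration:

```python
class AnnotationClass:
    """Pure-Python stand-in for the cdef class in constants_inf.pxd."""
    def __init__(self, id, name, color, max_object_size_meters):
        self.id = id
        self.name = name
        self.color = color
        self.max_object_size_meters = max_object_size_meters

# Keyed by class id, matching cdef dict[int, AnnotationClass].
annotations_dict = {
    c.id: c
    for c in (
        AnnotationClass(0, "building", "#ff0000", 100),
        AnnotationClass(1, "vehicle", "#00ff00", 10),
    )
}
```

Keying the dict by `id` lets detection post-processing resolve a predicted class index to its display color and size limit in O(1).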