Changed directory structure and renamed applications

- autopilot -> drone_controller
- rtsp_ai_player -> ai_controller
- added top level qmake project file
- updated documentation
- moved small demo applications from tmp/ to misc/
Tuomas Järvinen
2024-10-19 14:44:34 +02:00
parent 54b7dc41ca
commit 45c19baa45
94 changed files with 149 additions and 204 deletions
+138 -48
@@ -1,79 +1,169 @@
# Azaion DroneController and AiController
An autonomous drone controller with AI inference support. The programs use the MAVSDK framework and the ArduPilot flight controller. Running the applications requires installing MAVSDK and compiling ArduPilot. They have been tested in Ubuntu 22.04 and 24.04 environments.
## Install necessary dependencies
```bash
sudo apt update
sudo apt install ccache git build-essential qmake6 qt6-base-dev
```
## Speed up the compilations
```bash
echo "export MAKEFLAGS=\"-j$(($(nproc)))\"" >> ~/.bashrc
echo "export PATH=/usr/lib/ccache:\$PATH" >> ~/.bashrc
```
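The lines above only take effect in new shells. To apply the same settings in the current session as well, they can be exported directly (a sketch, assuming ccache lives at the usual /usr/lib/ccache path):

```bash
# Apply the build speed-ups in the current shell without restarting it
export MAKEFLAGS="-j$(nproc)"          # parallel make across all CPU cores
export PATH=/usr/lib/ccache:$PATH      # prefer the ccache compiler wrappers
echo "$MAKEFLAGS"
```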
## Clone the source code. An SSH key must be added before cloning!
```bash
git clone --recursive git@github.com:azaion/autopilot.git
git submodule update --init --recursive
```
## Install MAVSDK for Ubuntu 20.04
```bash
wget https://github.com/mavlink/MAVSDK/releases/download/v2.12.10/libmavsdk-dev_2.12.10_ubuntu20.04_amd64.deb
sudo dpkg -i libmavsdk-dev_2.12.10_ubuntu20.04_amd64.deb
```
## Install ONNX Runtime for Ubuntu (not required for embedded platforms)
### With GPU inference
```bash
wget https://github.com/microsoft/onnxruntime/releases/download/v1.18.0/onnxruntime-linux-x64-gpu-cuda12-1.18.0.tgz
sudo tar xf onnxruntime-linux-x64-gpu-cuda12-1.18.0.tgz -C /opt/
sudo ln -s /opt/onnxruntime-linux-x64-gpu-cuda12-1.18.0 /opt/onnxruntime-linux-x64-1.18.0
```
### With CPU inference
```bash
wget https://github.com/microsoft/onnxruntime/releases/download/v1.18.0/onnxruntime-linux-x64-1.18.0.tgz
sudo tar xf onnxruntime-linux-x64-1.18.0.tgz -C /opt/
```
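Depending on how the applications link ONNX Runtime, the shared libraries under /opt may also need to be on the loader path at run time. A hedged sketch (the directory name matches the archive extracted above):

```bash
# Make the ONNX Runtime shared libraries visible to the dynamic loader
export LD_LIBRARY_PATH=/opt/onnxruntime-linux-x64-1.18.0/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
echo "$LD_LIBRARY_PATH"
```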
## Install MAVSDK for Ubuntu 22.04 or 24.04 PCs
```bash
wget https://github.com/mavlink/MAVSDK/releases/download/v2.12.10/libmavsdk-dev_2.12.10_ubuntu22.04_amd64.deb
sudo dpkg -i libmavsdk-dev_2.12.10_ubuntu22.04_amd64.deb
```
### Resolving Python problems
If ArduPilot fails with an error such as "The python version is too old" (expecting at least 3.6.9),
make sure Python 3 is installed (usually available as `python3`) and change the interpreter in
ardupilot/modules/waf/waf-light from `python` to `python3`.
## Build ArduPilot
```bash
git clone --recursive https://github.com/ArduPilot/ardupilot.git
cd ardupilot
./Tools/environment_install/install-prereqs-ubuntu.sh -y
. ~/.profile
./waf configure --board=sitl
./waf build
```
## Build and install OpenCV 4.10.0
```bash
sudo apt update
sudo apt install libgtk-3-dev libpng-dev cmake ffmpeg libavcodec-dev libavformat-dev libavfilter-dev
wget https://github.com/opencv/opencv/archive/refs/tags/4.10.0.zip
unzip 4.10.0.zip
cd opencv-4.10.0
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/opt/opencv-4.10.0 -DBUILD_opencv_world=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DBUILD_PERF_TESTS=OFF -DBUILD_TESTS=OFF ..
make -j8 && sudo make install
```
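Because OpenCV is installed under a non-default prefix (/opt/opencv-4.10.0), build tools may need help finding it. A sketch of the environment setup (paths assume the install prefix used above):

```bash
# Point pkg-config and the dynamic loader at the /opt OpenCV install
export PKG_CONFIG_PATH=/opt/opencv-4.10.0/lib/pkgconfig${PKG_CONFIG_PATH:+:$PKG_CONFIG_PATH}
export LD_LIBRARY_PATH=/opt/opencv-4.10.0/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
# Once OpenCV is installed there, this should report 4.10.0:
#   pkg-config --modversion opencv4
```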
## Install MAVSDK for embedded platforms
### Build and install MAVSDK from source if needed
```bash
sudo apt-get update
sudo apt-get install build-essential cmake git
git clone https://github.com/mavlink/MAVSDK.git
cd MAVSDK
git checkout tags/v2.12.10
git submodule update --init --recursive
cmake -Bbuild/default -DCMAKE_BUILD_TYPE=Release -H.
cmake --build build/default -j8
sudo cmake --build build/default --target install
```
## Build Azaion applications
### PC builds (use ONNX Runtime for AI inference)
```bash
qmake6 && make
```
### OPI5 builds (use the gimbal camera and RKNN libraries for AI inference)
```bash
qmake6 CONFIG+=opi5 CONFIG+=gimbal && make
```
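qmake also supports out-of-source builds, which keep generated Makefiles and object files out of the source tree. A sketch, assuming the commands are run from the repository root:

```bash
mkdir -p build && cd build
qmake6 CONFIG+=opi5 CONFIG+=gimbal .. && make -j"$(nproc)"
```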
## Run the Azaion drone controller with the ArduPilot simulator on an Ubuntu PC
### Launch the simulator in the ArduPilot directory
```bash
./Tools/autotest/sim_vehicle.py --map --console -v ArduCopter
```
### Launch the example application in a new terminal window once the simulator is ready (takes around 1 min)
```bash
./drone_controller/drone_controller ./drone_controller/mission.json quadcopter udp
```
## Run the Azaion AI controller on an Ubuntu PC
### Install ffmpeg and mediamtx
```bash
sudo apt update
sudo apt install ffmpeg
wget https://github.com/bluenviron/mediamtx/releases/download/v1.8.4/mediamtx_v1.8.4_linux_amd64.tar.gz
mkdir mediamtx
tar xf mediamtx_v1.8.4_linux_amd64.tar.gz -C mediamtx
```
### Launch the RTSP server if a real camera is not used
```bash
cd mediamtx
./mediamtx
```
### Playback of RTSP video stream when no real camera is used
```bash
ffmpeg -re -stream_loop -1 -i SOME_MP4_VIDEO_FILE -c copy -f rtsp rtsp://localhost:8554/live.stream
```
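If no MP4 file is at hand, ffmpeg can also publish a synthetic test pattern to the same RTSP path using its lavfi input device (a sketch; requires an ffmpeg build with libx264):

```bash
ffmpeg -re -f lavfi -i testsrc2=size=1280x720:rate=30 \
       -c:v libx264 -pix_fmt yuv420p -f rtsp rtsp://localhost:8554/live.stream
```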
### Test RTSP stream with ffplay
```bash
ffplay -rtsp_transport tcp rtsp://localhost:8554/live.stream
```
### Compile and run AI controller without camera support
Modify ./ai_controller/aiengineconfig.h and change the IP address of the RTSP source
```bash
qmake6 && make && ./ai_controller/ai_controller [ONNX_MODEL_FILE]
```
### Compile and run AI controller with camera support
Modify ./ai_controller/aiengineconfig.h and change the IP address of the RTSP source
```bash
qmake6 CONFIG+=gimbal && make && ./ai_controller/ai_controller [ONNX_MODEL_FILE]
```
## Run the Azaion AI controller on OPI5
### Compile and run AI controller without gimbal camera support
Modify ./ai_controller/aiengineconfig.h and change the IP address of the RTSP source
```bash
qmake6 CONFIG+=opi5 && make && ./ai_controller/ai_controller [RKNN_MODEL_FILE]
```
### Compile and run AI controller with gimbal camera support
Modify ./ai_controller/aiengineconfig.h and change the IP address of the RTSP source
```bash
qmake6 CONFIG+=opi5 CONFIG+=gimbal && make && ./ai_controller/ai_controller [RKNN_MODEL_FILE]
```


+11
@@ -0,0 +1,11 @@
TEMPLATE = subdirs
SUBDIRS += ai_controller \
           drone_controller
# Pass CONFIG values to subprojects
ai_controller.subdir = ai_controller
drone_controller.subdir = drone_controller
ai_controller.CONFIG += $$CONFIG
drone_controller.CONFIG += $$CONFIG
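Inside each subproject, CONFIG values such as `opi5` and `gimbal` can be tested with qmake scopes. A hypothetical excerpt (the defines are illustrative, not taken from the actual subproject files):

```qmake
# Hypothetical subproject excerpt: branch on the CONFIG values
opi5 {
    DEFINES += PLATFORM_OPI5      # illustrative define, not from the real .pro
}
gimbal {
    DEFINES += ENABLE_GIMBAL      # illustrative define, not from the real .pro
}
```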
-27
@@ -1,27 +0,0 @@
# MAVSDK
sudo apt-get update -y
sudo apt-get install build-essential cmake git python3-pip -y
git clone https://github.com/mavlink/MAVSDK.git
cd MAVSDK
git checkout tags/v2.9.1
git submodule update --init --recursive
cmake -Bbuild/default -DCMAKE_BUILD_TYPE=Release -H.
cmake --build build/default -j8
sudo cmake --build build/default --target install
cd ..
#Ardupilot
git clone --recursive https://github.com/ArduPilot/ardupilot.git
cd ardupilot
./Tools/environment_install/install-prereqs-ubuntu.sh -y
. ~/.profile
./waf configure --board=sitl
./waf build
#Autopilot
cd src && qmake && make
#Run
./ardupilot/Tools/autotest/sim_vehicle.py --map --console -v ArduCopter &
cd src
./autopilot mission.json
-129
@@ -1,129 +0,0 @@
# rtsp_ai_player
`rtsp_ai_player` is an application that listens to an RTSP stream, analyzes images with an AI model, and shows the results visually. It also controls a gimbal camera to zoom in on recognized objects. The application uses YOLOv8 AI models converted to the ONNX format.
### How to convert Azaion AI model file to ONNX format
```bash
yolo export model=azaion-2024-08-13.pt dynamic=False format=onnx imgsz=640,640
```
## How to use application locally on a Linux PC.
### Speed up compilations
```bash
echo "export MAKEFLAGS=\"-j8\"" >> ~/.bashrc
echo "export PATH=/usr/lib/ccache:\$PATH" >> ~/.bashrc
```
### Install OpenCV 4.10.0
```bash
sudo apt update
sudo apt install libgtk-3-dev libpng-dev cmake ffmpeg libavcodec-dev libavformat-dev libavfilter-dev
wget https://github.com/opencv/opencv/archive/refs/tags/4.10.0.zip
unzip 4.10.0.zip
cd opencv-4.10.0
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/opt/opencv-4.10.0 -DBUILD_opencv_world=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DBUILD_PERF_TESTS=OFF -DBUILD_TESTS=OFF ..
make -j8 && sudo make install
```
### Install ONNX runtime 1.18.0
```bash
wget https://github.com/microsoft/onnxruntime/releases/download/v1.18.0/onnxruntime-linux-x64-1.18.0.tgz
sudo tar xf onnxruntime-linux-x64-1.18.0.tgz -C /opt
```
### Install ffmpeg and mediamtx RTSP server:
```bash
sudo apt update
sudo apt install ffmpeg
# For Amd64-like platforms:
wget https://github.com/bluenviron/mediamtx/releases/download/v1.8.4/mediamtx_v1.8.4_linux_amd64.tar.gz
mkdir mediamtx
tar xf mediamtx_v1.8.4_linux_amd64.tar.gz -C mediamtx
# For OrangePi5 use this lib instead (since it's an ARM platform):
wget https://github.com/bluenviron/mediamtx/releases/download/v1.9.1/mediamtx_v1.9.1_linux_arm64v8.tar.gz
mkdir mediamtx
tar xf mediamtx_v1.9.1_linux_arm64v8.tar.gz -C mediamtx
```
### If you use video file from the local RTSP server:
```bash
cd mediamtx
./mediamtx
```
### Play Azaion mp4 video file from RTSP server ... :
```bash
ffmpeg -re -stream_loop -1 -i $HOME/azaion/videos/for_ai_cut.mp4 -c copy -f rtsp rtsp://localhost:8554/live.stream
```
### ... or play simple video file from RTSP server:
```bash
ffmpeg -re -stream_loop -1 -i $HOME/azaion/videos/table.mp4 -c copy -f rtsp rtsp://localhost:8554/live.stream
```
### Test RTSP streaming with ffplay:
```bash
ffplay -rtsp_transport tcp rtsp://localhost:8554/live.stream
```
### Compile and run rtsp_ai_player with YOLOv8 medium AI model:
```bash
cd autopilot/misc/rtsp_ai_player
qmake6 && make
./rtsp_ai_player ~/azaion/models/onnx/yolov8m.onnx
```
### Compile and run rtsp_ai_player with Azaion AI model:
```bash
cd autopilot/misc/rtsp_ai_player
qmake6 && make
./rtsp_ai_player ~/azaion/models/azaion/azaion-2024-08-13.onnx
```
### Compile and run rtsp_ai_player with YOLOv8 medium model and gimbal camera:
```bash
cd autopilot/misc/rtsp_ai_player
qmake6 CONFIG+=gimbal && make
./rtsp_ai_player ~/azaion/models/onnx/yolov8m.onnx
```
## How to use application on Orange PI 5.
### Install ffmpeg and mediamtx to Ubuntu PC:
```bash
sudo apt update
sudo apt install ffmpeg
wget https://github.com/bluenviron/mediamtx/releases/download/v1.8.4/mediamtx_v1.8.4_linux_amd64.tar.gz
mkdir mediamtx
tar xf mediamtx_v1.8.4_linux_amd64.tar.gz -C mediamtx
```
### Launch RTSP server in Ubuntu PC
```bash
cd mediamtx
./mediamtx
```
### Play RTSP video stream in Ubuntu PC
```bash
ffmpeg -re -stream_loop -1 -i SOME_MP4_VIDEO_FILE -c copy -f rtsp rtsp://localhost:8554/live.stream
```
### Test RTSP stream with ffplay in Ubuntu PC:
```bash
ffplay -rtsp_transport tcp rtsp://localhost:8554/live.stream
```
### Compile and launch RTSP AI PLAYER in Orange PI 5:
Modify autopilot/misc/rtsp_ai_player/aiengineconfig.h and change 192.168.168.91 to IP address of RTSP source
```bash
cd autopilot/misc/rtsp_ai_player
mkdir build
cd build
qmake6 CONFIG+=opi5 .. && make -j8 && ./rtsp_ai_player [RKNN_MODEL_FILE]
```