MAVLab Course2026 Updates (p1) (#3632)

* updated distance measurement script.
- automatically shows which ids are available
- filter out big jumps
- plot_summary.py shows distance over time with recording regions

* Adding config file for radiomaster pocket joystick ble/usb (#108)

* added radiomaster tx16s xml (#123)

Co-authored-by: Wiebe van der Knaap <wkvanderknaap@tudelft.nl>

* Fix joystick device argument parsed as single token in control panel sessions (#118)

The `-d 0` joystick device flag was passed as a single `flag` attribute,
causing the joystick program to receive it as one token instead of two
separate arguments. This prevented the device number from being recognized,
breaking joystick input in the Simulation - Gazebo and Flight UDP sessions.

Fixed by splitting into `<arg flag="-d" constant="0"/>`.
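The failure mode can be illustrated with a toy flag parser (hypothetical, not input2ivy's actual implementation): an exact-match lookup for `-d` never fires when the flag and its value arrive glued together as one argv token.

```python
def parse_device(argv):
    """Toy parser: find '-d' and return the following device number."""
    for i, tok in enumerate(argv):
        if tok == "-d" and i + 1 < len(argv):
            return int(argv[i + 1])
    return None

print(parse_device(["-d", "0"]))  # 0: two tokens, flag recognized
print(parse_device(["-d 0"]))     # None: one token, flag never matches
```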

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>

* fixed names of variables and resolution bugs, added documentation (#113)

Co-authored-by: macoman <macoman@student.tudelft.nl>

* Update Gazebo Models: Gate, Plants, Logo

* Added some (math) tests (#114)

* added a test for Paparazzi's math library's int sqrt function and int quaternion normalization function
* Keep essential tests

Reduced the number of tests planned from 9 to 6 and removed tests for int32_sqrt.

---------

Co-authored-by: LSSchef <l.s.scheffer@student.tudelft.nl>
Co-authored-by: AniketBehura <aniketbehura1023@gmail.com>
Co-authored-by: diaa <D.abbasi@student.tudelft.nl>

* Feat: readme update for submodule installation (#115)

* feat: readme update for submodule installation

* Rename README to README.md

---------

Co-authored-by: Christophe De Wagter <dewagter@gmail.com>

---------

Co-authored-by: robinferede <robinferede@tudelft.nl>
Co-authored-by: Robin Euger <robin.euger@gmail.com>
Co-authored-by: Wiebe van der Knaap <wkvanderknaap@tudelft.nl>
Co-authored-by: EAbbenhuis <113993394+EAbbenhuis@users.noreply.github.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: Mihai Coman <127535163+miki133@users.noreply.github.com>
Co-authored-by: macoman <macoman@student.tudelft.nl>
Co-authored-by: Swayam Kuckreja <110131770+swayamkuckreja@users.noreply.github.com>
Co-authored-by: LSSchef <l.s.scheffer@student.tudelft.nl>
Co-authored-by: AniketBehura <aniketbehura1023@gmail.com>
Co-authored-by: diaa <D.abbasi@student.tudelft.nl>
Co-authored-by: Douwe-Rijs <Douwe@standofl.nl>
This commit is contained in:
Christophe De Wagter
2026-04-09 15:37:28 +02:00
committed by GitHub
parent 84adc6f1e2
commit cf7c8b3797
11 changed files with 738 additions and 153 deletions
+62
@@ -0,0 +1,62 @@
<joystick>
<!--
RadioMaster TX16S (OpenTX/EdgeTX) — Paparazzi UAV joystick configuration
Axis mapping (verify with: ./sw/ground_segment/joystick/test_stick):
Axis 0 : Roll — right stick X
Axis 1 : Pitch — right stick Y
Axis 2 : Throttle — left stick Y
Axis 3 : Yaw — left stick X
Axis 5 : Mode switch — 3-position (SA/SB): low / mid / high
Axis 6 : Arm switch — 2-position (SF/SE): 0 = disarm, ~1024 = arm
TX16S switch axes output 0 / ~1024 / ~2047 (unsigned), which input2ivy
scales to -127 / 0 / +127 internally.
Pitch and throttle are inverted relative to Paparazzi convention — hence
the minus signs in the RC_UP field values below.
-->
<input>
<!-- deadband: eliminates stick drift at centre (range 0-127) -->
<!-- exponent: finer control near centre, faster at extremes (range 0-1.0) -->
<axis index="0" name="roll" deadband="10" exponent="0.5" limit="1.0" trim="0"/>
<axis index="1" name="pitch" deadband="10" exponent="0.5" limit="1.0" trim="0"/>
<axis index="2" name="throttle" deadband="5" exponent="0.0" limit="1.0" trim="0"/>
<axis index="3" name="yaw" deadband="10" exponent="0.5" limit="1.0" trim="0"/>
<axis index="5" name="mode_sw"/>
<axis index="6" name="arm_sw"/>
</input>
<variables>
<var name="mode" default="0"/>
</variables>
<messages period="0.05">
<message class="datalink" name="RC_UP" send_always="true">
<field name="channels"
value="roll;
-pitch;
yaw;
Fit(-throttle,-127,127,0,127);
Fit(mode_sw,-127,127,0,2)"/>
</message>
<!-- Arm switch: arm_sw > -60 = armed, arm_sw < -60 = disarmed/killed -->
<message class="ground" name="DL_SETTING"
on_event="arm_sw &gt; -60">
<field name="index" value="IndexOfSetting('autopilot.kill_throttle')"/>
<field name="value" value="0"/>
</message>
<message class="ground" name="DL_SETTING"
on_event="arm_sw &lt; -60">
<field name="index" value="IndexOfSetting('autopilot.kill_throttle')"/>
<field name="value" value="1"/>
</message>
</messages>
</joystick>
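The `Fit(...)` expressions and the arm threshold above can be read as a plain linear remap of the internal -127..127 axis range. A minimal sketch, assuming `Fit(x, in_min, in_max, out_min, out_max)` is linear interpolation (the actual input2ivy implementation may differ):

```python
def fit(x, in_min, in_max, out_min, out_max):
    # Linear remap of x from [in_min, in_max] to [out_min, out_max]
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min

# Throttle: invert, then map the stick range -127..127 onto 0..127
print(fit(-(-127), -127, 127, 0, 127))   # full throttle -> 127.0
# Mode switch: three detents -127 / 0 / +127 -> flight modes 0 / 1 / 2
print([fit(sw, -127, 127, 0, 2) for sw in (-127, 0, 127)])  # [0.0, 1.0, 2.0]
# Arm switch: same threshold as the DL_SETTING on_event conditions
print(["armed" if sw > -60 else "killed" for sw in (-127, 127)])  # ['killed', 'armed']
```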
+42
@@ -0,0 +1,42 @@
<!--
RadioMaster Pocket Transmitter - Used as Bluetooth Joystick
The RadioMaster Pocket is a versatile 32CH radio transmitter with Bluetooth capability.
This configuration assumes no modifications were made with jstest-gtk
In this file we will use it as a 6CH joystick to control a UAS.
- The left stick vertical axis will be used for throttle
- The right stick horizontal axis will be used for roll
- The right stick vertical axis will be used for pitch
- The left stick horizontal axis will be used for yaw
- The VRA axis will be used for arming (left top switch)
- The VRC axis will be used for mode switch (right top switch)
If you want to fly your UAS via the joystick add this to your session:
/home/username/paparazzi/sw/ground_segment/joystick/input2ivy -d 0 -ac yourairframe_name radiomaster_pocket_ble.xml
Use -d 1 instead of -d 0 if your laptop has a built-in accelerometer, which typically occupies joystick device 0
Stick directions follow the standard aerospace sign convention
-->
<joystick>
<input>
<axis index="4" name="LeftStickHorizontal"/>
<axis index="3" name="LeftStickVertical"/>
<axis index="0" name="RightStickHorizontal"/>
<axis index="1" name="RightStickVertical"/>
<!-- VRA is SA and VRC is SC switch -->
<axis index="2" name="VRA"/>
<axis index="6" name="VRC"/>
</input>
<!-- Follow the order of rc_datalink.h -->
<messages period="0.05">
<message class="datalink" name="RC_UP" send_always="true">
<field name="channels" value="RightStickHorizontal;-RightStickVertical;LeftStickHorizontal;Fit(LeftStickVertical,-127,127,0,127);VRC;VRA"/>
</message>
</messages>
</joystick>
+42
@@ -0,0 +1,42 @@
<!--
RadioMaster Pocket Transmitter - Used as USB Joystick
The RadioMaster Pocket is a versatile 32CH radio transmitter.
This configuration assumes no modifications were made with jstest-gtk
In this file we will use it as a 6CH joystick to control a UAS.
- The left stick vertical axis will be used for throttle
- The right stick horizontal axis will be used for roll
- The right stick vertical axis will be used for pitch
- The left stick horizontal axis will be used for yaw
- The VRA axis will be used for arming (left top switch)
- The VRC axis will be used for mode switch (right top switch)
If you want to fly your UAS via the joystick add this to your session:
/home/username/paparazzi/sw/ground_segment/joystick/input2ivy -d 0 -ac yourairframe_name radiomaster_pocket_usb.xml
Use -d 1 instead of -d 0 if your laptop has a built-in accelerometer, which typically occupies joystick device 0
Stick directions follow the standard aerospace sign convention
-->
<joystick>
<input>
<axis index="3" name="LeftStickHorizontal"/>
<axis index="2" name="LeftStickVertical"/>
<axis index="0" name="RightStickHorizontal"/>
<axis index="1" name="RightStickVertical"/>
<!-- VRA is SA and VRC is SC switch -->
<axis index="4" name="VRA"/>
<axis index="6" name="VRC"/>
</input>
<!-- Follow the order of rc_datalink.h -->
<messages period="0.05">
<message class="datalink" name="RC_UP" send_always="true">
<field name="channels" value="RightStickHorizontal;-RightStickVertical;LeftStickHorizontal;Fit(LeftStickVertical,-127,127,0,127);VRC;VRA"/>
</message>
</messages>
</joystick>
+98 -22
@@ -6,9 +6,9 @@
     Compile and control the settings of the Bebop front and bottom cameras.
   </description>
   <section name="Front camera" prefix="MT9F002_">
-    <define name="RESOLUTION" value="0" description="Preset image resolutions, 0 = VGA, 1 = 720p, 2 = 720p 4:3 aspect ratio, 3 = 1080p, 4 = 1080p 4:3 aspect ratio, 5 = 2048*2048, this is the max dimention allowable with isp"/>
+    <define name="RESOLUTION" value="0" description="Preset image resolutions, 0 = VGA, 1 = 720p, 2 = 720p 4:3 aspect ratio, 3 = 1080p, 4 = 1080p 4:3 aspect ratio, 5 = 2048*2048, this is the max dimension allowable with isp"/>
     <define name="OUTPUT_WIDTH" value="640" description="Image horizontal resolution used if MT9F002_RESOLUTION not defined"/>
-    <define name="OUTPUT_HEIGHT" value="640" description="Image vertical resolution used if MT9F002_RESOLUTION not defined"/>
+    <define name="OUTPUT_HEIGHT" value="480" description="Image vertical resolution used if MT9F002_RESOLUTION not defined"/>
     <define name="OFFSET_X" value="0" description="Signed fractional offset from centre of image of original sensor [-0.5,0.5]"/>
     <define name="OFFSET_Y" value="0" description="Signed fractional offset from centre of image of original sensor [-0.5,0.5]"/>
     <define name="ZOOM" value="1" description="Zoom factor of image"/>
@@ -18,8 +18,8 @@
     <define name="GAIN_GREEN2" value="2" description="Green gain"/>
     <define name="GAIN_RED" value="1.4" description="Red gain"/>
     <define name="GAIN_BLUE" value="2.7" description="Blue gain"/>
-    <define name="FOCAL_X" value="0.48686" description="Focal length in the x-direction in pixels"/>
-    <define name="OFFSET_X" value="0.48908" description="Focal length in the y-direction in pixels"/>
+    <define name="FOCAL_X" value="0.48686" description="Normalized focal length in x-direction (fraction of image width)"/>
+    <define name="FOCAL_Y" value="0.48908" description="Normalized focal length in y-direction (fraction of image width)"/>
     <define name="CENTER_X" value="0.51015" description="Center image coordinate in the x-direction"/>
     <define name="CENTER_Y" value="0.51015" description="Center image coordinate in the y-direction"/>
     <define name="DHANE_K" value="1.25" description="(Un)Distortion parameter for a fish-eye lens"/>
@@ -27,60 +27,136 @@
   <section name="Bottom camera" prefix="MT9V117_">
     <define name="TARGET_FPS" value="30" description="Desired frame rate"/>
-    <define name="FOCAL_X" value="0.48686" description="Focal length in the x-direction in pixels"/>
-    <define name="OFFSET_X" value="0.48908" description="Focal length in the y-direction in pixels"/>
+    <define name="FOCAL_X" value="0.48686" description="Normalized focal length in x-direction (fraction of image width)"/>
+    <define name="FOCAL_Y" value="0.48908" description="Normalized focal length in y-direction (fraction of image width)"/>
     <define name="CENTER_X" value="0.51015" description="Center image coordinate in the x-direction"/>
     <define name="CENTER_Y" value="0.51015" description="Center image coordinate in the y-direction"/>
     <define name="DHANE_K" value="1.25" description="(Un)Distortion parameter for a fish-eye lens"/>
   </section>
 </doc>
+<!-- ================= Camera runtime settings ================= -->
+<!-- These settings appear in the Paparazzi Ground Control Station
+     and allow adjusting camera parameters during flight. -->
 <settings>
-  <dl_settings NAME="Bebop camera control">
+  <!-- Top-level group shown in the settings tree -->
+  <dl_settings name="Bebop camera control">
+    <!-- Settings for the Bebop FRONT camera (MT9F002 sensor) -->
     <dl_settings name="Front camera">
-      <dl_setting var="mt9f002.set_zoom" min="1." step="0.05" max="2.5" shortname="zoom" param="MT9F002_ZOOM"/>
-      <dl_setting var="mt9f002.set_offset_x" min="-0.5" step="0.05" max="0.5" shortname="offset_x" param="MT9F002_OFFSET_X"/>
-      <dl_setting var="mt9f002.set_offset_y" min="-0.5" step="0.05" max="0.5" shortname="offset_y" param="MT9F002_OFFSET_Y"/>
-      <dl_setting var="mt9f002_send_resolution" min="1" step="1" max="1" values="SEND" shortname="update_resolution" module="boards/bebop/mt9f002" handler="setting_update_resolution"/>
-      <dl_setting var="mt9f002.gain_green1" min="1" step="0.1" max="60" shortname="green_1" param="MT9F002_GAIN_GREEN1"/>
-      <dl_setting var="mt9f002.gain_green2" min="1" step="0.1" max="60" shortname="green_2" param="MT9F002_GAIN_GREEN2"/>
-      <dl_setting var="mt9f002.gain_blue" min="1" step="0.1" max="60" shortname="blue" param="MT9F002_GAIN_BLUE"/>
-      <dl_setting var="mt9f002.gain_red" min="1" step="0.1" max="60" shortname="red" param="MT9F002_GAIN_RED"/>
-      <dl_setting var="mt9f002_send_color" min="1" step="1" max="1" values="SEND" shortname="update_color" module="boards/bebop/mt9f002" handler="setting_update_color"/>
-      <dl_setting var="mt9f002.target_exposure " min="0.1" step="0.1" max="80" shortname="exposure" param="MT9F002_TARGET_EXPOSURE"/>
-      <dl_setting var="mt9f002_send_exposure" min="1" step="1" max="1" values="SEND" shortname="update_exposure" module="boards/bebop/mt9f002" handler="setting_update_exposure"/>
+      <!-- Digital zoom factor applied by the ISP -->
+      <dl_setting var="mt9f002.set_zoom" min="1." step="0.05" max="2.5"
+                  shortname="zoom" param="MT9F002_ZOOM"/>
+      <!-- Horizontal crop offset relative to image center
+           Range [-0.5, 0.5] corresponds to fraction of sensor width -->
+      <dl_setting var="mt9f002.set_offset_x" min="-0.5" step="0.05" max="0.5"
+                  shortname="offset_x" param="MT9F002_OFFSET_X"/>
+      <!-- Vertical crop offset relative to image center -->
+      <dl_setting var="mt9f002.set_offset_y" min="-0.5" step="0.05" max="0.5"
+                  shortname="offset_y" param="MT9F002_OFFSET_Y"/>
+      <!-- Trigger command to send updated resolution to the camera driver -->
+      <dl_setting var="mt9f002_send_resolution" min="1" step="1" max="1"
+                  values="SEND" shortname="update_resolution"
+                  module="boards/bebop/mt9f002"
+                  handler="setting_update_resolution"/>
+      <!-- White balance gains for Bayer channels -->
+      <dl_setting var="mt9f002.gain_green1" min="1" step="0.1" max="60"
+                  shortname="green_1" param="MT9F002_GAIN_GREEN1"/>
+      <dl_setting var="mt9f002.gain_green2" min="1" step="0.1" max="60"
+                  shortname="green_2" param="MT9F002_GAIN_GREEN2"/>
+      <dl_setting var="mt9f002.gain_blue" min="1" step="0.1" max="60"
+                  shortname="blue" param="MT9F002_GAIN_BLUE"/>
+      <dl_setting var="mt9f002.gain_red" min="1" step="0.1" max="60"
+                  shortname="red" param="MT9F002_GAIN_RED"/>
+      <!-- Send updated color gain values to the camera hardware -->
+      <dl_setting var="mt9f002_send_color" min="1" step="1" max="1"
+                  values="SEND" shortname="update_color"
+                  module="boards/bebop/mt9f002"
+                  handler="setting_update_color"/>
+      <!-- Target exposure level used by the automatic exposure control -->
+      <dl_setting var="mt9f002.target_exposure" min="0.1" step="0.1" max="80"
+                  shortname="exposure" param="MT9F002_TARGET_EXPOSURE"/>
+      <!-- Apply the new exposure setting to the camera -->
+      <dl_setting var="mt9f002_send_exposure" min="1" step="1" max="1"
+                  values="SEND" shortname="update_exposure"
+                  module="boards/bebop/mt9f002"
+                  handler="setting_update_exposure"/>
     </dl_settings>
   </dl_settings>
 </settings>
+<!-- ================= Module dependencies ================= -->
+<!-- Requires I2C communication for the sensors and the
+     video processing thread for image streaming -->
 <dep>
   <depends>i2c,video_thread</depends>
 </dep>
+<!-- ================= Header files ================= -->
+<!-- Camera driver interfaces used by this module -->
 <header>
-  <file name="mt9v117.h"/>
-  <file name="mt9f002.h"/>
+  <file name="mt9v117.h"/> <!-- bottom camera sensor -->
+  <file name="mt9f002.h"/> <!-- front camera sensor -->
 </header>
+<!-- ================= Module initialization ================= -->
+<!-- Called during autopilot startup to initialize sensors -->
 <init fun="mt9v117_init(PTR(mt9v117))"/>
 <init fun="mt9f002_init(PTR(mt9f002))"/>
+<!-- ================= Build rules for autopilot target ================= -->
 <makefile target="ap">
+  <!-- Camera driver implementations -->
   <file name="mt9v117.c"/>
   <file name="mt9f002.c"/>
+  <!-- Bebop image signal processor (ISP) library -->
   <file name="libisp.c" dir="boards/bebop/isp"/>
+  <!-- Size of I2C communication buffer -->
   <define name="I2C_BUF_LEN" value="56"/>
+  <!-- Use Bebop I2C bus 0 -->
   <define name="USE_I2C0"/>
+  <!-- Also define for test builds -->
   <test>
     <define name="USE_I2C0"/>
   </test>
 </makefile>
+<!-- ================= Build rules for NPS simulator ================= -->
 <makefile target="nps">
+  <!-- Simulated camera drivers -->
   <file name="mt9v117_nps.c"/>
   <file name="mt9f002_nps.c"/>
+  <!-- Same ISP library used for simulation -->
   <file name="libisp.c" dir="boards/bebop/isp"/>
   <define name="I2C_BUF_LEN" value="56"/>
   <define name="USE_I2C0"/>
 </makefile>
 </module>
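The focal-length defines changed meaning from pixels to a normalized value. A small sketch of converting back to pixels, assuming the convention stated in the new descriptions (fraction of image width); the width here is just an example value:

```python
FOCAL_X_NORM = 0.48686  # fraction of image width (module default)
IMAGE_WIDTH = 640       # example output width in pixels

# Pixel focal length under the "fraction of image width" convention
focal_x_px = FOCAL_X_NORM * IMAGE_WIDTH
print(round(focal_x_px, 2))  # 311.59
```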
@@ -23,7 +23,7 @@
     <program name="Joystick" command="sw/ground_segment/joystick/input2ivy">
       <arg flag="-ac" constant="@AIRCRAFT"/>
       <arg flag="sm600.xml"/>
-      <arg flag="-d 0"/>
+      <arg flag="-d" constant="0"/>
     </program>
     <program name="Gazebo"/>
     <program name="RtpViewer">
@@ -65,7 +65,7 @@
     <program name="Joystick" command="sw/ground_segment/joystick/input2ivy">
       <arg flag="-ac" constant="@AIRCRAFT"/>
       <arg flag="sm600.xml"/>
-      <arg flag="-d 0"/>
+      <arg flag="-d" constant="0"/>
     </program>
     <!--program name="NatNet3">
       <arg flag="-ac 9999" constant="@AC_ID"/>
-1
@@ -1 +0,0 @@
-This directory contains libraries and tools in external git repositories used in paparazzi.
+214
@@ -0,0 +1,214 @@
# External Libraries and Tools
This directory contains git submodules referencing external libraries and tools used by Paparazzi. These modules provide essential functionality for communications, flight control, simulation, cryptography, and more.
## Table of Contents
- [Quick Start](#quick-start)
- [Installing All Modules](#installing-all-modules)
- [Installing Individual Modules](#installing-individual-modules)
- [Available Modules](#available-modules)
- [OpenCV Bebop Setup](#opencv-bebop-setup)
- [Troubleshooting](#troubleshooting)
---
## Quick Start
Initialize and build all external modules:
```bash
make -C sw/ext
```
This will check out and compile all dependencies automatically.
---
## Installing All Modules
To install and build all external modules at once:
```bash
make -C sw/ext
```
This runs the complete build chain for all modules that require compilation (chibios, fatfs, libsbp, TRICAL, hacl-c, key_generator, rustlink, ecl, matrix, mavlink, dronecan, and unifiedmocaprouter).
---
## Installing Individual Modules
To install or update only a specific module, run:
```bash
make -C sw/ext <module-name>
```
**Example:**
```bash
make -C sw/ext chibios
make -C sw/ext mavlink
```
Each module target will sync and update the git submodule, then build if required.
---
## Available Modules
See the main [Makefile in this directory](Makefile) for the complete list of modules and their build targets. Common modules include:
- **chibios** — Operating system for embedded systems
- **fatfs** — FAT filesystem library
- **mavlink** — MAVLink message definitions and code generation
- **pprzlink** — Paparazzi-specific communications protocol
- **libsbp** — Swift Binary Protocol for GNSS/RTK
- **TRICAL** — Magnetometer automatic calibration
- **opencv_bebop** — Computer vision library (see detailed instructions below)
- **dronecan** — DroneCAN protocol support
- **And many more...** — See [Makefile](Makefile) for details
---
## OpenCV Bebop Setup
This module provides advanced computer vision capabilities for Bebop drones.
### Prerequisites
This guide has been tested on **Ubuntu 22.04**. For Ubuntu 20.04 instructions, see: [`conf/modules/cv_opencvdemo.xml`](../../conf/modules/cv_opencvdemo.xml)
### Building OpenCV Bebop
From the paparazzi source directory:
```bash
cd sw/ext
make opencv_bebop
```
### Verify Installation
After a successful build, you should have these directories:
```
sw/ext/opencv_bebop/build_arm/
sw/ext/opencv_bebop/build_pc/
sw/ext/opencv_bebop/install_arm/
sw/ext/opencv_bebop/install_pc/
```
These contain compiled binaries for ARM (autopilot) and PC (simulation) targets.
### Configuring Your Airframe Module
To use OpenCV in your airframe, configure your module XML with build flags for both autopilot (`ap`) and simulation (`nps`) targets.
**Example airframe module configuration:**
```xml
<makefile target="ap">
<file name="opencv_example.cpp"/>
<file name="opencv_image_functions.cpp"/>
<flag name="CXXFLAGS" value="I$(PAPARAZZI_SRC)/sw/ext/opencv_bebop/install_arm/include/opencv4"/>
<flag name="LDFLAGS" value="L$(PAPARAZZI_HOME)/sw/ext/opencv_bebop/install_arm/lib"/>
<flag name="LDFLAGS" value="lopencv_world"/>
<flag name="LDFLAGS" value="L$(PAPARAZZI_HOME)/sw/ext/opencv_bebop/install_arm/lib/opencv4/3rdparty"/>
<flag name="LDFLAGS" value="llibprotobuf"/>
<flag name="LDFLAGS" value="llibjpeg-turbo"/>
<flag name="LDFLAGS" value="llibpng"/>
<flag name="LDFLAGS" value="llibtiff"/>
<flag name="LDFLAGS" value="llibopenjp2"/>
<flag name="LDFLAGS" value="lzlib"/>
<flag name="LDFLAGS" value="lade"/>
<flag name="LDFLAGS" value="ldl"/>
<flag name="LDFLAGS" value="lm"/>
<flag name="LDFLAGS" value="lpthread"/>
<flag name="LDFLAGS" value="lrt"/>
</makefile>
<makefile target="nps">
<file name="opencv_example.cpp"/>
<file name="opencv_image_functions.cpp"/>
<flag name="CXXFLAGS" value="I$(PAPARAZZI_SRC)/sw/ext/opencv_bebop/install_pc/include/opencv4"/>
<flag name="LDFLAGS" value="L$(PAPARAZZI_HOME)/sw/ext/opencv_bebop/install_pc/lib"/>
<flag name="LDFLAGS" value="lopencv_world"/>
<flag name="LDFLAGS" value="L$(PAPARAZZI_HOME)/sw/ext/opencv_bebop/install_pc/lib/opencv4/3rdparty"/>
<flag name="LDFLAGS" value="llibprotobuf"/>
<flag name="LDFLAGS" value="lade"/>
<flag name="LDFLAGS" value="L/usr/lib/x86_64-linux-gnu"/>
<flag name="LDFLAGS" value="ljpeg"/>
<flag name="LDFLAGS" value="lpng"/>
<flag name="LDFLAGS" value="ltiff"/>
<flag name="LDFLAGS" value="lopenjp2"/>
<flag name="LDFLAGS" value="L/usr/lib/x86_64-linux-gnu/hdf5/serial"/>
<flag name="LDFLAGS" value="lhdf5"/>
<flag name="LDFLAGS" value="lcrypto"/>
<flag name="LDFLAGS" value="lcurl"/>
<flag name="LDFLAGS" value="lpthread"/>
<flag name="LDFLAGS" value="lsz"/>
<flag name="LDFLAGS" value="lz"/>
<flag name="LDFLAGS" value="ldl"/>
<flag name="LDFLAGS" value="lm"/>
<flag name="LDFLAGS" value="lfreetype"/>
<flag name="LDFLAGS" value="lharfbuzz"/>
<flag name="LDFLAGS" value="lrt"/>
</makefile>
</module>
```
### Using OpenCV in C++ Code
Include the OpenCV headers in your C++ files:
```cpp
#include "opencv_image_functions.h"
using namespace std;
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>
using namespace cv;
```
You can then use provided wrapper functions like `grayscale_opencv_to_yuv422()` in your code:
```cpp
// Convert and process images
Mat image = imread("image.jpg");
Mat gray;
cvtColor(image, gray, COLOR_BGR2GRAY);
// ... your vision processing
```
### Team Consistency
**Important:** Ensure all team members use the same build environment and configuration. Inconsistent builds can cause compatibility issues during testing and deployment, and create a strong dependency on a single laptop or person.
---
## Troubleshooting
### Build Failures
- **Module not found:** Ensure you're running `make` from the paparazzi root or using `make -C sw/ext`
- **Dependency errors:** Check that required packages are installed (see individual module documentation)
- **Python build issues:** Some modules (mavlink, dronecan) require Python packages. See [Makefile](Makefile) for details on `MY_MAVLINKTOOLS` and `MY_DRONECANTOOLS`
### Submodule Issues
If a submodule is in a bad state, reinitialize it:
```bash
# Sync and reset a specific module
cd <paparazzi-root>
git submodule sync sw/ext/<module-name>
git submodule update --init --recursive sw/ext/<module-name>
# Or reset all modules
git submodule update --init --recursive
```
---
+193 -126
@@ -1,126 +1,193 @@
#!/usr/bin/env python3 #!/usr/bin/env python3
import time import time
import sys from datetime import datetime
import os import sys
import math import os
import argparse import math
import matplotlib.pyplot as plt import socket
import matplotlib.patches as mpatches import struct
import argparse
# if PAPARAZZI_HOME not set, then assume the tree containing this import matplotlib.pyplot as plt
# file is a reasonable substitute
PPRZ_HOME = os.getenv("PAPARAZZI_HOME", os.path.normpath(os.path.join(os.path.dirname(os.path.abspath(__file__)), '../../../'))) PPRZ_HOME = os.getenv("PAPARAZZI_HOME", os.path.normpath(os.path.join(os.path.dirname(os.path.abspath(__file__)), '../../../')))
sys.path.append(PPRZ_HOME + "/sw/ground_segment/python/natnet3.x") sys.path.append(PPRZ_HOME + "/sw/ground_segment/python/natnet3.x")
from NatNetClient import NatNetClient from NatNetClient import NatNetClient
recording = False
def discover_motive(multicast="239.255.42.99", data_port=1511, timeout=3):
def on_press(event): """Discover a Motive server by listening for NatNet multicast data."""
global recording sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
if event.key == 'r': sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
recording = not recording sock.settimeout(timeout)
if event.key == 'q': sock.bind(('', data_port))
plt.close()
# Join the NatNet multicast group
mreq = struct.pack('4sL', socket.inet_aton(multicast), socket.INADDR_ANY)
# This is a callback function that gets connected to the NatNet client. It is called once per rigid body per frame sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
def receiveRigidBodyFrame(rigidBodyList, stamp):
# print(rigidBodyList) try:
for (id, position, quat, valid) in rigidBodyList: _, addr = sock.recvfrom(4096)
# print( "Received frame for rigid body", id ) sock.close()
global pos_x, pos_y, pos_z return addr[0]
global track_id except socket.timeout:
if track_id and id != track_id: sock.close()
continue return None
# print( "Received frame for rigid body", id ) # Global state
pos_x = position[0] recording = False
pos_y = position[1] pos_x, pos_y, pos_z = 0.0, 0.0, 0.0
pos_z = position[2] track_id = None
frame_count = 0
seen_ids = set()
def main(args):
global track_id
track_id = args.id def on_press(event):
global recording
global pos_x, pos_y, pos_z if event.key == 'r':
pos_x, pos_y, pos_z = 0.0, 0.0, 0.0 recording = not recording
fig = plt.figure() if event.key == 'q':
plt.axis([-6, 6, -6, 6]) plt.close()
# add key press event
fig.canvas.mpl_connect('key_press_event', on_press) def receive_rigid_body_frame(rigid_body_data, stamp):
global pos_x, pos_y, pos_z, frame_count, seen_ids
# title frame_count += 1
plt.title('Press r to start/stop recording \n press q to quit') for rb in rigid_body_data.rigid_body_list:
seen_ids.add(rb.id_num)
# This will create a new NatNet client if track_id is not None and rb.id_num != track_id:
streamingClient = NatNetClient( continue
server=args.server, pos_x = rb.pos[0]
multicast=args.multicast, pos_y = rb.pos[1]
commandPort=args.commandPort, pos_z = rb.pos[2]
dataPort=args.dataPort,
rigidBodyListListener=receiveRigidBodyFrame,
version=(3,0,0,0)) def main(args):
# Start up the streaming client now that the callbacks are set up. global track_id
# This will run perpetually, and operate on a separate thread.
streamingClient.run() # Discover or use provided server IP
if args.server:
time.sleep(2) server_ip = args.server
print('Start tracking') print(f"Using provided server IP: {server_ip}")
if args.outputfile: else:
file = open(args.outputfile, 'w') print("Discovering Motive server on the network...")
file.write('time, distance, x, y, z, recording \n') server_ip = discover_motive()
if server_ip is None:
        print("ERROR: No Motive server found. Check that Motive is running and streaming.")
        print("You can also specify the IP manually with --server <ip>")
        return
    print(f"Found Motive server at {server_ip}")

    client = NatNetClient()
    client.server_ip_address = server_ip
    client.local_ip_address = "0.0.0.0"
    client.set_print_level(0)
    client.rigid_body_list_listener = receive_rigid_body_frame
    client.run()

    print("Waiting for rigid body data...")
    time.sleep(3)

    if frame_count == 0:
        print("ERROR: No data received. Check that Motive is streaming.")
        client.shutdown()
        return
    print(f"Receiving data: {frame_count} frames, rigid body IDs: {sorted(seen_ids)}")

    # Pick which rigid body to track
    if len(seen_ids) == 1:
        track_id = list(seen_ids)[0]
        print(f"Auto-selected ID {track_id}")
    else:
        while True:
            choice = input(f"Enter rigid body ID to track {sorted(seen_ids)}: ").strip()
            try:
                choice = int(choice)
                if choice in seen_ids:
                    track_id = choice
                    break
                print(f"ID {choice} not available.")
            except ValueError:
                print("Please enter a valid integer.")

    print(f"Tracking ID {track_id}. Press r to record, q to quit.")

    # Output file with timestamp and rigid body ID
    if args.outputfile:
        output_path = args.outputfile
    else:
        timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        output_path = f"dist_rb{track_id}_{timestamp}.csv"
    file = open(output_path, 'w')
    file.write('time, distance, x, y, z, recording\n')
    print(f"Writing data to {output_path}")

    # Setup plot
    fig = plt.figure()
    plt.axis([-6, 6, -6, 6])
    fig.canvas.mpl_connect('key_press_event', on_press)

    old_x = pos_x
    old_z = pos_z
    distance = 0
    glitch_count = 0
    start_time = time.time()
    pre_time = time.time()
    freq_count = 0
    freq_time = time.time()
    data_freq = 0.0

    while plt.fignum_exists(fig.number):
        h = math.hypot(pos_x - old_x, pos_z - old_z)
        if h > 1.0:
            # Glitch: jump too large, ignore but update position
            glitch_count += 1
            old_x = pos_x
            old_z = pos_z
        elif h > 0.10:
            if recording:
                distance += h
            old_x = pos_x
            old_z = pos_z

        # Track data frequency
        freq_count += 1
        now = time.time()
        if now - freq_time >= 1.0:
            data_freq = freq_count / (now - freq_time)
            freq_count = 0
            freq_time = now

        if now - pre_time > 0.1:
            current_time = now - start_time
            print(f'distance: {distance:.2f} m; time: {current_time:.2f} s; freq: {data_freq:.0f} Hz; recording: {recording}')
            pre_time = now
            data = f'{current_time}, {distance}, {pos_x}, {pos_y}, {pos_z}, {recording}\n'
            file.write(data)

        rec_str = "RECORDING" if recording else "NOT RECORDING"
        plt.title(f'Distance: {distance:.2f} m | {data_freq:.0f} Hz | {rec_str}\nPress r to record, q to quit')
        if recording:
            plt.plot(pos_z, pos_x, 'ro')
        else:
            plt.plot(pos_z, pos_x, 'bo')
        plt.pause(0.001)

    client.shutdown()
    file.close()
    print(f"Total distance: {distance:.2f} m ({glitch_count} glitches ignored)")
    print(f"Data saved to {output_path}")


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description="Distance counter using Optitrack NatNet stream")
    parser.add_argument('--server', default=None, help="Motive server IP (auto-discovers if not set)")
    parser.add_argument('--outputfile', type=str, default=None, help="CSV output file (default: dist_rb<id>_<timestamp>.csv)")
    args = parser.parse_args()
    main(args)
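The main loop above treats any step larger than 1.0 m as a tracking glitch (resync the reference position, count nothing) and only accumulates steps above 0.10 m, which suppresses standstill jitter. A minimal sketch of that update rule as a pure function, with the recording gate omitted for brevity (`update_distance` and its parameter names are illustrative, not part of the script):

```python
import math

def update_distance(distance, old, new, min_step=0.10, max_step=1.0):
    """One iteration of the jump filter used in the main loop (illustrative).

    Returns (distance, reference_position, was_glitch). Steps larger than
    max_step are treated as tracking glitches: the reference position is
    resynced but no distance is accumulated. Steps below min_step are
    ignored entirely to suppress standstill jitter.
    """
    h = math.hypot(new[0] - old[0], new[1] - old[1])
    if h > max_step:
        return distance, new, True       # glitch: resync, don't count
    if h > min_step:
        return distance + h, new, False  # normal step: accumulate
    return distance, old, False          # jitter: ignore

# Example: a 0.5 m step counts, a 3 m teleport does not
d, pos, glitch = update_distance(0.0, (0.0, 0.0), (0.5, 0.0))
d, pos, glitch = update_distance(d, pos, (3.5, 0.0))  # glitch is True, d unchanged
```

Factoring the filter out like this also makes the thresholds easy to unit-test, which the inline loop version does not allow.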
@@ -0,0 +1,65 @@
#!/usr/bin/env python3
"""Plot summary of a dist.py recording: distance over time, recording regions, totals."""
import csv
import argparse
import matplotlib.pyplot as plt
import matplotlib.patches as mpatches
def main(csvfile):
    times = []
    distances = []
    recording = []
    with open(csvfile) as f:
        reader = csv.reader(f)
        next(reader)  # skip header
        for row in reader:
            times.append(float(row[0]))
            distances.append(float(row[1]))
            recording.append(row[5].strip() == 'True')

    # Compute total distance and total time spent recording
    total_distance = distances[-1] if distances else 0.0
    total_rec_time = 0.0
    for i in range(1, len(times)):
        if recording[i]:
            total_rec_time += times[i] - times[i - 1]
    print(f"File: {csvfile}")
    print(f"Total distance: {total_distance:.2f} m")
    print(f"Total recording time: {total_rec_time:.1f} s")
    print(f"Total session time: {times[-1]:.1f} s")

    fig, ax = plt.subplots()
    ax.plot(times, distances, 'b-', linewidth=1.5)
    ax.set_xlabel('Time (s)')
    ax.set_ylabel('Distance (m)')
    ax.set_title(f'{csvfile}\nTotal distance: {total_distance:.2f} m | Recording time: {total_rec_time:.1f} s')

    # Shade recording regions
    in_region = False
    region_start = 0
    for i, rec in enumerate(recording):
        if rec and not in_region:
            region_start = times[i]
            in_region = True
        elif not rec and in_region:
            ax.axvspan(region_start, times[i], alpha=0.2, color='red')
            in_region = False
    if in_region:
        ax.axvspan(region_start, times[-1], alpha=0.2, color='red')
    rec_patch = mpatches.Patch(color='red', alpha=0.2, label='Recording active')
    ax.legend(handles=[rec_patch])
    ax.grid(True, alpha=0.3)
    plt.tight_layout()
    plt.show()


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description="Plot summary of a dist.py recording")
    parser.add_argument('csvfile', help="CSV file from dist.py")
    args = parser.parse_args()
    main(args.csvfile)
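The shading loop above walks the per-sample recording flags and converts runs of `True` into shaded time spans. The same logic can be written as a standalone helper that returns the intervals explicitly, which is easier to test than calls to `axvspan` (a sketch; `recording_intervals` is an illustrative name, not part of the script):

```python
def recording_intervals(times, recording):
    """Collapse per-sample recording flags into (start, end) time intervals,
    mirroring the axvspan shading loop in plot_summary.py (illustrative)."""
    intervals = []
    start = None
    for t, rec in zip(times, recording):
        if rec and start is None:
            start = t                      # recording just switched on
        elif not rec and start is not None:
            intervals.append((start, t))   # recording just switched off
            start = None
    if start is not None:                  # still recording at end of file
        intervals.append((start, times[-1]))
    return intervals

# Flags [F, T, T, F, T] sampled at t = 0..4
print(recording_intervals([0, 1, 2, 3, 4], [False, True, True, False, True]))
# → [(1, 3), (4, 4)]
```

A trailing open region is closed at the last timestamp, matching the `if in_region:` tail of the plotting loop.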
@@ -34,7 +34,7 @@
int main()
{
  note("running algebra math tests");
  plan(6); // number of tests in this file to run

  /* test int32_vect2_normalize */
  struct Int32Vect2 v = {2300, -4200};
@@ -57,6 +57,24 @@ int main()
  ok((fabs(quat_zxy.qi - 0.9266) < 0.01 && fabs(quat_zxy.qx - -0.2317) < 0.01 && fabs(quat_zxy.qy - 0.1165) < 0.01 && fabs(quat_zxy.qz - 0.2722) < 0.01),
     "float_quat_of_eulers_zxy(float_eulers_of_quat_zxy(0.9266, -0.2317, 0.1165, 0.2722)) returned [%f, %f, %f, %f]", quat_zxy.qi, quat_zxy.qx, quat_zxy.qy, quat_zxy.qz);

  /* test int32_quat_normalize */
  struct Int32Quat q = {32768, 0, 0, 0}; // 1.0 in Q15 format
  q.qx = 16384; // 0.5 in Q15
  q.qy = 0;
  q.qz = 0;
  int32_quat_normalize(&q);
  // After normalization the quaternion should be unit length (|q| = 1.0 in Q15).
  // For [1, 0.5, 0, 0], norm = sqrt(1^2 + 0.5^2) = sqrt(1.25) ≈ 1.118, so the
  // normalized result is ≈ [0.894, 0.447, 0, 0], i.e. in Q15:
  // 0.894 * 32768 ≈ 29300 and 0.447 * 32768 ≈ 14650.
  ok((q.qi >= 29295 && q.qi <= 29315) && (q.qx >= 14645 && q.qx <= 14655), // small margin for integer rounding
     "int32_quat_normalize([32768,16384,0,0]) returned [%d, %d, %d, %d] (expected approx [29300,14650,0,0] +/- 15)", q.qi, q.qx, q.qy, q.qz);

  /* test int32_sqrt */
  uint32_t sqrt_100 = int32_sqrt(100);
  ok(sqrt_100 == 10, "int32_sqrt(100) == %u (expected 10)", sqrt_100);
  uint32_t sqrt_12345 = int32_sqrt(12345);
  ok(sqrt_12345 >= 111 && sqrt_12345 <= 112, "int32_sqrt(12345) == %u (expected approx 111)", sqrt_12345);

  done_testing();
}
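The expected Q15 values hard-coded in the `int32_quat_normalize` test can be cross-checked against a short floating-point reference. The sketch below is not Paparazzi code; `Q15_ONE` and `q15_normalize_ref` are illustrative names:

```python
import math

Q15_ONE = 1 << 15  # 32768 represents 1.0 in Q15 fixed point

def q15_normalize_ref(q):
    """Floating-point reference for quaternion normalization in Q15,
    used to sanity-check the expected values in the C test (illustrative)."""
    norm = math.sqrt(sum(c * c for c in q))
    return [round(c / norm * Q15_ONE) for c in q]

# [1.0, 0.5, 0, 0] in Q15:
print(q15_normalize_ref([32768, 16384, 0, 0]))
# → [29309, 14654, 0, 0]
```

Both components land inside the ±15-count tolerance windows used by the `ok()` assertion, which is why the test accepts any integer rounding the fixed-point routine applies.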