Table of Contents

Simulink Quadcopter Simulation

Author: Yehyun Lee Email: leey93@unlv.nevada.edu
Date: Last modified on 04/21/2025
Keywords: quadcopter, simulink, simulation, motion capture


1. Overview

This tutorial explains how to set up and use a quadcopter simulation based on the Parrot Minidrone using Simulink and the Aerospace Blockset. We will start by running the simulation model, then deploy control algorithms to an actual Parrot Mambo drone, collect flight data using a motion capture system, and finally modify the simulation to adapt different drone specifications, such as a LEGO-based drone.

The base simulation model used in this tutorial is the Quadcopter Modeling and Simulation Based on Parrot Minidrone example (asbQuadcopter) from the Aerospace Blockset.

Throughout the tutorial, you will learn how to:

- Run and explore the quadcopter simulation model
- Deploy control algorithms to a real Parrot Mambo drone
- Collect and analyze flight data using a motion capture system
- Modify the simulation to match different drone specifications


2. Prerequisites

Before starting the simulation, ensure you have the following software and add-ons installed:

• Required Software

Software                   Purpose                             Download Link
MATLAB (R2022b or later)   Main platform for simulation        https://www.mathworks.com/products/matlab.html
Simulink                   Model-based design and simulation   Included with MATLAB installation

• Installing Add-Ons

To install the required add-ons:

  1. Open MATLAB.
  2. Go to Home → Add-Ons → Get Add-Ons.
  3. Search for and install each of the required add-ons listed in the Required Add-Ons table below.
  4. (Optional) Install the Simulink Support Package for Parrot Minidrones if you plan to deploy your algorithms to a real drone.
  5. (Optional) Install a supported C/C++ compiler (e.g., MinGW-w64) for hardware code generation.

You can check installed add-ons by navigating to Home → Environment → Add-Ons → Manage Add-Ons.

* Note: All required add-ons must be installed to successfully run the simulation. The Simulink Support Package and a C/C++ compiler are only necessary for real-world deployment, not for simulation.

- Required Add-Ons

Add-On Purpose
Aerospace Blockset Provides the quadcopter simulation example and essential aerospace modeling blocks
Optimization Toolbox Enables optimization algorithms used during control tuning and simulation
Simulink Control Design Supports the design and analysis of control systems in Simulink models
Signal Processing Toolbox Offers tools for analyzing and processing signals from sensors and controllers
Computer Vision Toolbox Provides functions for simulating and processing camera and vision-based inputs
Simulink 3D Animation Allows 3D visualization of the drone and its environment during simulation

- Optional for Deployment

Add-On Purpose
Simulink Support Package for Parrot Minidrones Enables deployment of algorithms to a real Parrot Mambo drone
C/C++ Compiler (e.g., MinGW-w64) Required for code generation and deployment to hardware




3. Setting Up the Simulator

Once the required software and add-ons are installed, follow these steps to open the simulation project.

1. Open MATLAB.
2. In the Command Window, type the following command and press Enter:

openProject('asbQuadcopter')

3. This will open the Quadcopter Modeling and Simulation Based on Parrot Minidrone project provided by the Aerospace Blockset.

If the command does not work, double-check that the Aerospace Blockset is correctly installed.




4. Understanding the Simulator Structure

The quadcopter simulation model is organized into modular subsystems, each handling specific aspects of the drone's behavior. The overall system is visualized as follows:

(i) Command Subsystem

The Command subsystem generates the desired motion commands for the drone, including roll, pitch, yaw, and thrust. Inputs can be manually specified using a signal builder, a joystick, or loaded from a predefined spreadsheet trajectory. This subsystem mainly serves as a user input interface and is typically not modified during standard simulations.
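As a rough illustration (outside Simulink), a step command like the one the signal builder produces can be sketched in Python. The function name and values are hypothetical; the model expresses altitude in an NED-style convention, so an altitude of 1.1 m appears as a Z command of -1.1.

```python
# Hypothetical sketch of a step Z command, like the Command subsystem's
# signal builder output. Names and values are illustrative only.
def step_command(t, t_step=1.0, z_before=0.0, z_after=-1.1):
    """Return the commanded Z position at time t (step at t_step)."""
    return z_after if t >= t_step else z_before

# Sample the command at 100 Hz for 2 seconds
dt = 0.01
z_cmd = [step_command(i * dt) for i in range(200)]
```

A spreadsheet trajectory plays the same role, replacing the function with a table of time-stamped setpoints.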

(ii) Environment Subsystem

The Environment subsystem simulates external environmental conditions that can affect the drone’s behavior, such as gravity, air temperature, air density, and magnetic fields.

Two modeling options are available:

- Constant environment: gravity, air temperature, air density, and the magnetic field are held at fixed values.
- Variable environment: these quantities vary with the drone's position and altitude using standard atmosphere and magnetic field models.

For standard simulation tasks, the constant environment model is often sufficient.

(iii) Sensors Subsystem

The Sensors subsystem provides measurements of the drone’s state, such as position, velocity, orientation, and landing detection based on visual data.

Two sensor configurations are available:

- Feedthrough sensors: pass the true simulated states directly to the controller (idealized sensing).
- Dynamic sensor models: add realistic effects such as noise and sensor dynamics to the measurements.

This allows the simulation to mimic either idealized or real-world sensing conditions.
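The difference between the two configurations can be sketched in a few lines of Python; the bias and noise values are assumptions for illustration, not the model's sensor parameters.

```python
import random

# Illustrative sketch (not the Simulink sensor blocks): the same true state
# passed through an ideal model versus a noisy one.
def ideal_sensor(true_value):
    # Feedthrough: the measurement equals the true state exactly
    return true_value

def noisy_sensor(true_value, bias=0.02, noise_std=0.05):
    # Add a constant bias and zero-mean Gaussian noise (assumed values)
    return true_value + bias + random.gauss(0.0, noise_std)

true_altitude = 1.10  # meters
print(ideal_sensor(true_altitude))   # exactly 1.10
print(noisy_sensor(true_altitude))   # close to 1.10, but perturbed
```

Controllers tuned only against ideal measurements often degrade on hardware, which is why the realistic configuration matters before deployment.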

(iv) Airframe Subsystem

The Airframe subsystem models the physical behavior of the drone based on forces and moments.
It offers two approaches:

- Nonlinear Airframe
The Nonlinear Airframe models the drone's physical behavior using the six-degrees-of-freedom (6DOF) equations of motion. It computes the key forces and moments acting on the drone, including gravity, rotor thrust, aerodynamic drag, and the resulting rotational dynamics, without linear approximations.

The Nonlinear Airframe enables realistic flight simulation, which is critical for PID tuning and performance validation. In this project, it was used to closely match simulation results with real-world experiments.
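To make the idea concrete, here is a minimal, hypothetical Python sketch of just the vertical (1-D) slice of these dynamics: Newton's second law integrated forward with gravity, thrust, and quadratic drag. The mass, thrust, and drag values are assumed for illustration and are not the project's parameters.

```python
# 1-D sketch of the idea behind the nonlinear airframe: integrate
# z'' = (thrust - m*g - drag) / m with quadratic drag. All numbers
# below are illustrative assumptions, not values from the model.

m = 0.063        # kg, roughly a Mambo-class minidrone (assumed)
g = 9.81         # m/s^2
c_d = 0.02       # quadratic drag coefficient (assumed)
dt = 0.001       # integration step, s

z, vz = 0.0, 0.0          # altitude (m, up positive) and vertical speed
thrust = 0.70             # N, slightly above hover thrust (m*g ~ 0.62 N)

for _ in range(1000):     # simulate 1 second
    drag = c_d * vz * abs(vz)            # drag opposes motion
    az = (thrust - m * g - drag) / m     # net vertical acceleration
    vz += az * dt
    z += vz * dt

print(f"altitude after 1 s: {z:.3f} m")
```

The full model solves the same balance for all six degrees of freedom (three translations, three rotations) with coupled aerodynamic and gyroscopic terms.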

- Linear Airframe
A linearized version of the nonlinear model around a steady flight condition, using precomputed A, B, C, and D state-space matrices. This simplifies controller design but is less accurate for aggressive maneuvers.

The Nonlinear Airframe is typically used for more realistic flight dynamics and was used as the base for subsequent modifications and tuning in this project.
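As an illustration of how a linear state-space model is stepped forward in time, the sketch below simulates a toy double integrator (altitude and vertical speed) with Euler integration. The A and B matrices here are illustrative stand-ins, not the precomputed matrices stored in the project.

```python
# Toy linear-airframe sketch: x' = A x + B u, integrated with Euler steps.
# State x = [altitude, vertical speed]; input u = vertical acceleration.
A = [[0.0, 1.0],
     [0.0, 0.0]]
B = [0.0, 1.0]
dt = 0.01

def step(x, u):
    # x_next = x + dt * (A x + B u)
    dx0 = A[0][0] * x[0] + A[0][1] * x[1] + B[0] * u
    dx1 = A[1][0] * x[0] + A[1][1] * x[1] + B[1] * u
    return [x[0] + dt * dx0, x[1] + dt * dx1]

x = [0.0, 0.0]
for _ in range(100):       # 1 s of a constant 1 m/s^2 command
    x = step(x, 1.0)
print(x)                   # speed reaches 1.0 m/s; altitude ~0.495 m
```

Because the model is linear, the same update applies at every operating point, which is what makes controller design straightforward but limits accuracy far from the linearization condition.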

(v) Flight Control System (FCS)

The Flight Control System (FCS) is responsible for stabilizing the drone, maintaining its attitude, and controlling its trajectory. It processes sensor data and command inputs to generate actuator signals in real time.

The FCS consists of three main components: a state estimator that fuses sensor measurements into an estimate of the drone's state, the controllers that compute the actuator commands, and crash-prediction logic that raises status flags when unsafe conditions are detected.

Inputs to the FCS include aircraft commands from the Command subsystem, sensor measurements from the Sensors subsystem, and image data from the Camera model. The outputs are actuator signals sent to the motors, along with status flags used for visualization and system monitoring.

In this project, while the structure of the FCS remained the same, tuning of the controller gains and adjustments to estimation parameters were performed to improve overall flight performance and better match real-world behavior.
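The controller-gain tuning mentioned above centers on PID loops. As a hedged sketch of the kind of loop involved, the example below closes a discrete PID controller around a toy double-integrator altitude plant; the gains and plant are illustrative only, not the project's tuned values.

```python
# Illustrative discrete PID loop (gains and plant are made up, not the
# project's tuned values).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

dt = 0.01
pid = PID(kp=8.0, ki=0.5, kd=4.0, dt=dt)
z, vz = 0.0, 0.0
for _ in range(1000):               # 10 seconds of closed-loop flight
    az = pid.update(1.1, z)         # command 1.1 m altitude
    vz += az * dt                   # toy plant: z'' = u
    z += vz * dt
print(f"final altitude: {z:.3f} m")
```

Raising kp speeds up the response, kd damps overshoot, and ki removes steady-state error; the same trade-offs drive the gain tuning performed on the real FCS.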

(vi) Visualization

To monitor and analyze the drone's behavior during simulation, the model provides multiple visualization modes. These modes display flight trajectories, orientation, control responses, and environment interaction, helping users better understand system performance. The available visualization options can be selected by setting the variable VSS_VISUALIZATION in the MATLAB workspace:

0: Scopes – Displays time-series plots of variables such as states and commands.
1: Workspace Logging – Saves selected signals to the workspace for custom plotting.
2: FlightGear – Uses the FlightGear simulator for 3D animation.
3: Airport Scene – Renders the drone in a virtual environment using Unreal Engine.
4: Apple Hill Scene – Similar to option 3 but uses a different 3D scene.

In this project, the Airport Scene (option 3) was used for real-time 3D animation. This allowed for an intuitive observation of the drone's behavior, including attitude, altitude, and response to control commands.




5. Validating with Motion Capture

In this section, we will apply the simulation to a real drone (Parrot Mambo) and collect flight data using a motion capture system. By doing this, we can verify whether the drone moves as intended based on the input commands given in Simulink.

5.1 Deploying the Model to the Parrot Mambo Drone

Before deploying the controller onto a real drone, ensure the following prerequisites are satisfied:

- The Simulink Support Package for Parrot Minidrones is installed.
- A supported C/C++ compiler (e.g., MinGW-w64) is installed for code generation.
- Your computer supports Bluetooth for pairing with the drone.

If everything is prepared, the next step is to configure the hardware connection.

Navigate to Home → Environment → Add-Ons → Manage Add-Ons in MATLAB. Locate the Simulink Support Package for Parrot Minidrones and click the gear icon (⚙) to launch the hardware setup window. Follow the on-screen instructions to download the appropriate firmware onto the drone and complete the Bluetooth pairing process. Once paired successfully, the system is ready for deployment.

Only the Flight Control System (FCS) subsystem needs to be deployed to the drone. To enable external input commands, modifications were made to the Input block inside the FCS subsystem. In our experiment, we applied a step input to the Z-axis command to observe altitude control, as shown in the figure above. Other inputs such as roll, pitch, and yaw were kept at zero. (These values can be modified depending on the desired maneuver.)

After setting up the input, navigate to the Hardware tab in Simulink. Ensure the Hardware Board is selected as “Parrot Mambo” (or your specific drone model). Click the Build, Deploy & Start button to begin the deployment process. The build log and progress can be monitored through the Diagnostic Viewer.

Once code generation and deployment are complete, two windows will appear, including the flight control interface used to operate the drone.

In the interface, set the Power Gain to 100% and click Start. The drone will take off and follow the pre-programmed input commands. After the flight is completed, the interface will show additional options to download the Flight Log and MAT File containing recorded flight data.

5.2 Collecting Motion Capture Data for Drone Flight

Now that the drone is flying, we want to measure its actual movement using the motion capture (Mocap) system.

- Preparing the Motion Capture System
Before collecting data, you need a working motion capture setup. Detailed setup procedures, including hardware calibration and scene creation, are already available in the OptiTrack Motion Capture System Setup Guide.

- Setting Up ROS to Collect Data
Once the motion capture environment is ready and the drone is flying, you can record its motion data in real-time using a Python script.

We will subscribe to the /vrpn_client/estimated_transform topic, which provides the drone's 3D position from the motion capture system.

Here’s the full step-by-step process:
1. Open a terminal and run:

roscore

2. Open another terminal and launch the VRPN client node:

rosrun vrpn_client_ros vrpn_client_node _server:=<Mocap_Server_IP> _object_name:=<Rigid_Body_Name>

example:
rosrun vrpn_client_ros vrpn_client_node _server:=10.69.96.170 _object_name:=lego_drone

3. Open a third terminal and navigate to the following directory:

cd ~/catkin_ws/src/beginner_tutorials/src/

4. Then run the Python script:

python3 mambo_drone.py

Python Script to Collect Motion Capture Data
Below is the complete script that collects the drone’s position and saves it into a .csv file.

#!/usr/bin/python3

import rospy
import csv
from geometry_msgs.msg import TransformStamped

# Define the callback function to process incoming messages
def callback(msg):
    # Extract the XYZ data from the TransformStamped message
    x = msg.transform.translation.x
    y = msg.transform.translation.y
    z = msg.transform.translation.z
    print(f"{x}, {y}, {z}")

    # Append the data to a CSV file (newline='' is the idiomatic csv usage)
    with open('transform_data.csv', mode='a', newline='') as file:
        writer = csv.writer(file)
        # Write the header if the file is empty
        if file.tell() == 0:
            writer.writerow(["Timestamp", "X", "Y", "Z"])
        # Write the data
        writer.writerow([rospy.get_time(), x, y, z])

def listener():
    # Initialize the ROS node
    rospy.init_node('transform_listener', anonymous=True)

    # Subscribe to the estimated transform topic
    rospy.Subscriber('/vrpn_client/estimated_transform', TransformStamped, callback)

    # Keep the node running
    rospy.spin()

if __name__ == '__main__':
    try:
        listener()
    except rospy.ROSInterruptException:
        pass
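After a flight, the recorded CSV can be post-processed for analysis. The helper below is an illustrative sketch (not part of the tutorial's scripts) that reads the transform_data.csv written by the listener above and returns the altitude range, using the column names from the script's header row.

```python
import csv

# Hypothetical post-processing helper: read transform_data.csv (written by
# the listener above) and compute the min/max Z position.
def altitude_range(path='transform_data.csv'):
    zs = []
    with open(path, newline='') as f:
        for row in csv.DictReader(f):
            zs.append(float(row['Z']))
    return min(zs), max(zs)

# Usage: zmin, zmax = altitude_range()
```

The same pattern extends to X/Y drift checks or plotting the trajectory with matplotlib.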

5.3 Validation through Mocap Experiment

After deploying the controller and executing flight tests in the motion capture environment, we validated the reliability of the Simulink quadcopter simulator by comparing the simulation outputs with the actual flight data.

The following video shows the experimental setup. The Parrot Mambo drone, loaded with the deployed controller, was flown in the motion capture space while the flight data was simultaneously collected using mocap tracking:

In this experiment, a step input was applied to the Z-axis command to observe the altitude control performance. Two different scenarios were tested:

Scenario 1: Default Drone Weight
A step input commanding a rise from approximately 0.6 meters to 1.1 meters was given. The comparison between the simulation and real flight data is shown below:

The blue solid and dashed lines represent the simulator and real-world Z positions, respectively. The results demonstrate that the real drone altitude closely followed the simulation prediction, validating the accuracy of the simulated model under nominal conditions.

Scenario 2: Additional Payload
In this test, a 5g weight was attached to the drone to assess the robustness of the controller and the simulator under additional load conditions. The results are shown below:

Despite the added weight, the actual drone flight trajectory remained highly consistent with the simulator's prediction, especially in the Z-axis control. This suggests that the Simulink-based model maintains its reliability even under moderate payload variations.

Overall, the close match between the simulated and actual flight data confirms the effectiveness and credibility of the developed quadcopter simulator. This validation provides confidence that the simulation environment can be used to predict real-world quadcopter behavior accurately.
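The kind of comparison described above can be quantified with a root-mean-square error between the simulated and measured Z trajectories sampled at the same instants. The sketch below illustrates the computation; the trajectory values are made-up examples, not the experiment's data.

```python
# Illustrative RMSE comparison between simulated and measured Z positions.
# The sample values are invented for demonstration.
def rmse(sim, real):
    assert len(sim) == len(real)
    return (sum((s - r) ** 2 for s, r in zip(sim, real)) / len(sim)) ** 0.5

sim_z  = [0.60, 0.80, 1.00, 1.10, 1.10]
real_z = [0.60, 0.78, 1.03, 1.08, 1.11]
print(f"Z-axis RMSE: {rmse(sim_z, real_z):.3f} m")   # -> Z-axis RMSE: 0.019 m
```

A small RMSE relative to the commanded step size is a simple numerical check that complements the visual overlay of the two curves.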