**Date:** Last modified on <04/21/2025>
\\
**Keywords:** quadcopter, simulink, simulation, motion capture
\\
\\
  * Deploy algorithms to a real Parrot Minidrone
  * Validate the results using motion capture data
\\
----
===== 5. Validating with Motion Capture =====

In this section, we will apply the simulation to a real drone (Parrot Mambo) and collect flight data using a motion capture system.
By doing this, we can verify whether the drone moves as intended based on the input commands given in Simulink.
\\
\\
==== 5.1 Deploying the Model to the Parrot Mambo Drone ====
  
The drone will take off and follow the pre-programmed input commands.
After the flight is completed, the interface will show additional options to download the Flight Log and MAT File containing recorded flight data.
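The downloaded MAT File can be inspected offline before comparing it against mocap data. Below is a minimal sketch, assuming SciPy is available; the file name and the signal name ''rt_yout'' are made-up examples, since the actual variable names depend on how logging was configured in Simulink — list the keys of the loaded dictionary to find yours.

```python
# Sketch: inspecting a flight-data MAT file with SciPy.
# 'flight_log_demo.mat' and the signal name 'rt_yout' are illustrative;
# real names depend on the Simulink logging configuration.
import numpy as np
from scipy.io import loadmat, savemat

# For illustration only: fabricate a small MAT file shaped like logged data.
savemat('flight_log_demo.mat', {'rt_yout': np.linspace(0.0, 1.1, 5)})

data = loadmat('flight_log_demo.mat')
signals = [k for k in data if not k.startswith('__')]  # skip MAT metadata keys
print(signals)
print(data['rt_yout'])  # logged samples as a NumPy array
```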
\\
\\
==== 5.2 Collecting Motion Capture Data for Drone Flight ====
  
Now that the drone is flying, we want to measure its actual movement using the motion capture (Mocap) system.
  
**Preparing the Motion Capture System**
\\ Before collecting data, you need a working motion capture setup.
Detailed setup procedures, including hardware calibration and scene creation, are already available in the [[optitrack_motion_capture_system_setup_guide|Optitrack Motion Capture System Setup Guide]].
  
  * Complete the basic calibration and scene setup following the motion capture tutorial.
  * Make sure that a rigid body for the drone is created. (In our example, the rigid body name is lego_drone, but you can choose any name as long as it matches later.)
  
**Setting Up ROS to Collect Data**
\\ Once the motion capture environment is ready and the drone is flying, you can record its motion data in real time using a Python script.
  
We will subscribe to the <color #00a2e8>/vrpn_client/estimated_transform</color> topic, which provides the drone's 3D position from the motion capture system.
  
Here’s the full step-by-step process:
\\ 1. Open a terminal and run:
  roscore
2. Open another terminal and launch the VRPN client node:
  example:
  rosrun vrpn_client_ros vrpn_client_node _server:=10.69.96.170 _object_name:=lego_drone

  * Make sure:
    * <Mocap_Server_IP> matches the IP address of the motion capture server computer.
4. Then run the Python script:
  python3 mambo_drone.py

{{ :simulator_part5_7.jpg?nolink&800 |}}
  
**Python Script to Collect Motion Capture Data**
\\ Below is the complete script that collects the drone’s position and saves it into a .csv file.
  #!/usr/bin/python3
  
  import rospy
  import csv
  from geometry_msgs.msg import TransformStamped
  
  # Define the callback function to process incoming messages
  def callback(msg):
      x = msg.transform.translation.x
      y = msg.transform.translation.y
      z = msg.transform.translation.z
      print(f"{x}, {y}, {z}")
  
      # Write the data to a CSV file
      with open('transform_data.csv', mode='a') as file:
          writer = csv.writer(file)
          # Write the data
          writer.writerow([rospy.get_time(), x, y, z])
  
  def listener():
      # Initialize the ROS node
      rospy.init_node('transform_listener', anonymous=True)
  
      # Subscribe to the estimated transform topic
      rospy.Subscriber('/vrpn_client/estimated_transform', TransformStamped, callback)
  
      # Keep the node running
      rospy.spin()
  
  if __name__ == '__main__':
      try:
          listener()
      except rospy.ROSInterruptException:
          pass

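After the flight, the resulting transform_data.csv can be read back for plotting or comparison with the simulation. Below is a minimal sketch using only the standard library; the sample rows written first are made-up values in the same [time, x, y, z] layout the script above produces.

```python
# Sketch: reading back the [ros_time, x, y, z] rows that the listener wrote.
import csv

# For illustration only: write a few made-up rows in the same layout.
with open('transform_data.csv', 'w', newline='') as f:
    csv.writer(f).writerows([
        [0.0, 0.10, 0.20, 0.60],
        [0.1, 0.10, 0.20, 0.85],
        [0.2, 0.10, 0.20, 1.10],
    ])

times, zs = [], []
with open('transform_data.csv', newline='') as f:
    for row in csv.reader(f):
        t, x, y, z = map(float, row)
        times.append(t)
        zs.append(z)

print(f"samples: {len(zs)}, final Z: {zs[-1]:.2f} m")
```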
==== 5.3 Validation through Mocap Experiment ====

After deploying the controller and executing flight tests in the motion capture environment, we validated the reliability of the Simulink quadcopter simulator by comparing the simulation outputs with the actual flight data.

The following video shows the experimental setup. The Parrot Mambo drone, loaded with the deployed controller, was flown in the motion capture space while the flight data was simultaneously collected using mocap tracking:
{{youtube>ya10C-6nGKU?large}}

In this experiment, a step input was applied to the Z-axis command to observe the altitude control performance. Two different scenarios were tested:

**Scenario 1: Default Drone Weight**
\\ A step input commanding a rise from approximately 0.6 meters to 1.1 meters was given. The comparison between the simulation and real flight data is shown below:
{{ :simulator_part5_8.jpg?nolink&800 |}}

The blue solid and dashed lines represent the simulator and real-world Z positions, respectively. The results demonstrate that the real drone's altitude closely followed the simulation prediction, validating the accuracy of the simulated model under nominal conditions.

**Scenario 2: Additional Payload**
\\ In this test, a 5 g weight was attached to the drone to assess the robustness of the controller and the simulator under additional load conditions. The results are shown below:

Despite the added weight, the actual drone flight trajectory remained highly consistent with the simulator's prediction, especially in the Z-axis control. This suggests that the Simulink-based model maintains its reliability even under moderate payload variations.

Overall, the close match between the simulated and actual flight data confirms the effectiveness and credibility of the developed quadcopter simulator. This validation provides confidence that the simulation environment can be used to predict real-world quadcopter behavior accurately.
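The visual comparison can also be quantified. As a sketch, a root-mean-square error between the simulated and measured Z traces reduces the match to a single number; the samples below are made-up values, and in practice the two traces would first be resampled onto a common time base.

```python
# Sketch: RMSE between simulated and mocap-measured Z positions.
# The two lists are illustrative samples, assumed already time-aligned.
import math

sim_z   = [0.60, 0.75, 0.95, 1.08, 1.10]  # simulated altitude (m)
mocap_z = [0.61, 0.73, 0.97, 1.06, 1.11]  # measured altitude (m)

rmse = math.sqrt(
    sum((s - m) ** 2 for s, m in zip(sim_z, mocap_z)) / len(sim_z)
)
print(f"Z-axis RMSE: {rmse:.4f} m")
```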
  
  
simulink_quadcopter_simulation.1745723747.txt.gz · Last modified: by yehyun