===== Image Based Visual Servoing =====

**Author:** Joao Matos Email: <[email protected]>
\\
**Date:** Last modified on 6/8/2016
\\
**Keywords:** Visual Servoing, Dynamixel SDK, Flea3 Camera, FlyCapture
\\

This tutorial will show how to configure and program the Dynamixel servos in C++ using Visual Studio and the library provided by Dynamixel. It will also show how to configure and program the Flea3 camera in C++ using Visual Studio and the FlyCapture library.
  
{{ ::dt2.jpg?direct |}}

The Addr column is useful when you need to know the address used to access some piece of information (later in this tutorial we will read and write information on the servo using these addresses). Also, the USB2Dynamixel can only communicate with Dynamixels that use the same baudrate, so before you start coding it is important to change the baudrate to 1000000 on every Dynamixel found by the Dynamixel Wizard.
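
For illustration, a minimal sketch of the constants this implies, assuming an MX-series control table (ADDR_MX_PRESENT_POSITION is the name used later in this tutorial; the other names, the protocol version and the COM port are assumptions to adapt to your setup):

<Code>
// Control table addresses taken from the Addr column (MX series)
#define ADDR_MX_GOAL_POSITION     30        // assumed name - address 30 is Goal Position
#define ADDR_MX_PRESENT_POSITION  36        // name used later in this tutorial - Present Position

// Communication settings - every servo and the USB2Dynamixel must share the same baudrate
#define PROTOCOL_VERSION          1.0       // MX servos use protocol 1.0
#define BAUDRATE                  1000000
#define DEVICENAME                "COM3"    // assumed port - check which COM port the USB2Dynamixel uses
</Code>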
  
----
  * Under Linker > General > Additional Library Directories include the directory: Program Files\Point Grey\lib64\vs2015
  * Under Linker > Input > Additional Dependencies include the library: FlyCapture2d_v140.lib
  * Go to Environment Variables and add two paths to the "Path" variable under System variables: Program Files\Point Grey\bin64 and Program Files\Point Grey\bin64\vs2015 (a quick test to confirm the configuration is sketched below)
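
If you want to check that the include paths and libraries above were picked up correctly, a minimal sketch like the one below (an assumption, not part of the original project) only needs FlyCapture2 to compile, link and run:

<Code>
#include <iostream>
#include "FlyCapture2.h"

using namespace FlyCapture2;

int main()
{
    // Count the cameras on the bus - if this builds and runs, the project is configured correctly
    BusManager busMgr;
    unsigned int numCameras = 0;

    Error error = busMgr.GetNumOfCameras(&numCameras);
    if (error != PGRERROR_OK)
    {
        std::cout << "Failed to query the bus for cameras" << std::endl;
        return -1;
    }

    std::cout << "Number of cameras detected: " << numCameras << std::endl;
    return 0;
}
</Code>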
  
  
This will open and configure the port to communicate with the USB2Dynamixel.
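
For reference, a startup of that kind typically looks like the minimal sketch below, assuming the Dynamixel SDK C++ classes PortHandler and PacketHandler and the constant names from the earlier sketch (DEVICENAME, BAUDRATE, PROTOCOL_VERSION):

<Code>
#include "dynamixel_sdk.h"

// Handlers for the serial port (USB2Dynamixel) and for building/parsing protocol packets
dynamixel::PortHandler   *portHandler   = dynamixel::PortHandler::getPortHandler(DEVICENAME);
dynamixel::PacketHandler *packetHandler = dynamixel::PacketHandler::getPacketHandler(PROTOCOL_VERSION);

// Open the port and set the baudrate shared by every servo
if (!portHandler->openPort())
{
    std::cout << "Failed to open the port" << std::endl;
    return 0;
}
if (!portHandler->setBaudRate(BAUDRATE))
{
    std::cout << "Failed to change the baudrate" << std::endl;
    return 0;
}
</Code>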
  
**To read something (get information) from any Dynamixel servo, the API is:**
  
<Code>
// Read 2 bytes from the servo with ID 1 - address 36 (ADDR_MX_PRESENT_POSITION) holds the present position
dxl_comm_result = packetHandler->read2ByteTxRx(portHandler, 1, ADDR_MX_PRESENT_POSITION, &dxl_present_position, &dxl_error);
</Code>
After the arrow we define whether we want to read or write. If we want to read, we use "read2ByteTxRx" after the packetHandler-> arrow. The first argument is already defined in our startup; the second argument is the Dynamixel ID (you can see which servo corresponds to which ID in the Dynamixel Wizard) - in this case we are reading information from the Dynamixel servo with ID1. The third argument is what you want to read; note that the variable "ADDR_MX_PRESENT_POSITION" is defined as 36, and if we check the Dynamixel address table, address 36 corresponds to the present position, so we are reading the present position from Dynamixel ID1. The fourth argument will hold the result of the reading - the present position - and the last argument will hold error information (depending on its value we can know whether the reading was successful or not).
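
As an illustration of that last point, a minimal sketch of how the return value and the error byte can be checked (COMM_SUCCESS comes from the Dynamixel SDK; the variable names and messages follow the example above):

<Code>
if (dxl_comm_result != COMM_SUCCESS)
{
    // No valid answer from the servo (wrong ID, wrong baudrate, cabling problem...)
    std::cout << "Failed to read the present position" << std::endl;
}
else if (dxl_error != 0)
{
    // The servo answered but flagged an error in its status packet
    std::cout << "Dynamixel error byte: " << (int)dxl_error << std::endl;
}
else
{
    std::cout << "Present position of ID1: " << dxl_present_position << std::endl;
}
</Code>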
  
**If we want to write something (send information), the idea is very similar to the reading:**
  
<Code>
// Write 2 bytes to the servo with ID 1 - address 30 (ADDR_MX_GOAL_POSITION) holds the goal position
dxl_comm_result = packetHandler->write2ByteTxRx(portHandler, 1, ADDR_MX_GOAL_POSITION, dxl_goal_position, &dxl_error);
</Code>
  
{{ ::ii9.jpg?direct |}}


----

===== How to get the Flea3 camera feed and convert it to OpenCV format =====

In order to work with the camera feed provided by the FlyCapture library, we need to convert the image to the Mat format - the format that OpenCV functions use.

To start the camera and get the feed we use the following code:

<Code>
// Variables to be used with the Flea3 camera
Error error;
Camera camera;
CameraInfo camInfo;

// Connect to the Flea3 camera
error = camera.Connect(0);
if (error != PGRERROR_OK)
{
    std::cout << "Failed to connect to camera" << std::endl;
    return false;
}

// Get the camera info and print it out
error = camera.GetCameraInfo(&camInfo);
if (error != PGRERROR_OK)
{
    std::cout << "Failed to get camera info from camera" << std::endl;
    return false;
}
std::cout << camInfo.vendorName << " "
          << camInfo.modelName << " "
          << camInfo.serialNumber << std::endl;

// Start capturing images
error = camera.StartCapture();
if (error == PGRERROR_ISOCH_BANDWIDTH_EXCEEDED)
{
    std::cout << "Bandwidth exceeded" << std::endl;
    return false;
}
else if (error != PGRERROR_OK)
{
    std::cout << "Failed to start image capture" << std::endl;
    return false;
}

std::cout << " Flea3 Camera Connected " << std::endl;
std::cout << "Press any key to continue " << std::endl;
cv::waitKey(300000000);
</Code>

This will start the Flea3 camera and check that the connection is working. To grab a frame from the camera and convert it to Mat type we use:

<Code>
// Get a raw image from the Flea3 camera
Image rawImage;
Error error = camera.RetrieveBuffer(&rawImage);

// Convert to BGR (the channel order OpenCV expects)
Image rgbImage;
rawImage.Convert(FlyCapture2::PIXEL_FORMAT_BGR, &rgbImage);

// Wrap the image buffer in an OpenCV Mat
unsigned int rowBytes = (double)rgbImage.GetReceivedDataSize() / (double)rgbImage.GetRows();
cv::Mat coloredimage = cv::Mat(rgbImage.GetRows(), rgbImage.GetCols(), CV_8UC3, rgbImage.GetData(), rowBytes);

// Resize the image to 640 x 480
cv::resize(coloredimage, coloredimage, cv::Size(640, 480));
</Code>

This code creates a Mat variable called coloredimage, with the BGR channel order that OpenCV expects, ready to be used by the OpenCV functions.
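
For example, the converted frame can be displayed directly with OpenCV (a minimal sketch; the window name is an assumption):

<Code>
// Show the converted frame in an OpenCV window and give the GUI 1 ms to refresh
cv::imshow("Flea3 feed", coloredimage);
cv::waitKey(1);
</Code>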

----

===== Integrating vision and control =====

The final part of this tutorial is to integrate the Dynamixel control with the vision provided by the Flea3 camera. It is important that you have all the libraries working in the same Visual Studio solution. There are many ways to do visual servoing, and it is up to the programmer to decide how to integrate the two.

My first code for the peg-in-hole task follows this pseudo-algorithm (a sketch of the vision steps follows the list):

  - Apply a circle detection function to detect the hole
  - Find the biggest circle detected
  - Get the radius and center point of the biggest circle detected
  - Calculate the distance from the hole to a desired setpoint
  - Based on the distance error, move the manipulator up/down/right/left
  - If the detected radius is bigger than a threshold, the peg is in the hole and the gripper must release it
  - If not, cycle again
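
A minimal sketch of steps 1 to 4, assuming OpenCV 3's HoughCircles running on the coloredimage frame from the previous section (the blur parameters, the Hough thresholds and the image-center setpoint are assumptions to be tuned):

<Code>
// Detect circles on the current frame and compute the pixel error to a setpoint
cv::Mat gray;
cv::cvtColor(coloredimage, gray, cv::COLOR_BGR2GRAY);
cv::GaussianBlur(gray, gray, cv::Size(9, 9), 2, 2);        // reduce noise before the Hough transform

std::vector<cv::Vec3f> circles;
cv::HoughCircles(gray, circles, cv::HOUGH_GRADIENT, 1, gray.rows / 8, 100, 30, 5, 200);

if (!circles.empty())
{
    // Find the biggest circle detected - assumed to be the hole
    int biggest = 0;
    for (size_t i = 1; i < circles.size(); i++)
        if (circles[i][2] > circles[biggest][2])
            biggest = (int)i;

    float x      = circles[biggest][0];
    float y      = circles[biggest][1];
    float radius = circles[biggest][2];

    // Pixel error between the hole center and the desired setpoint (center of the 640x480 frame)
    float error_x = x - 320.0f;
    float error_y = y - 240.0f;

    // error_x / error_y drive the manipulator left/right and up/down;
    // a radius above a threshold means the peg is inside the hole
}
</Code>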

The idea here is to make the UAV hover over the hole and go down slowly. As the UAV moves down towards the hole, the arm keeps the peg centered in the hole. When the detected radius is bigger than a threshold, the camera is very near to the hole and the peg should be inside it by this time, so the gripper must release the peg and the task is done.

This is a very simple control algorithm and it will be used for the preliminary tests with the MM-UAV. We need to see how much disturbance is created by the arm movement, and how to program a more robust hover controller for the UAV so it can hold a position next to the hole while moving down.

In my code, the joint angles for the motion are calculated so as to keep the end effector always in the vertical position, because if the end effector is even slightly offset from vertical, the peg will not enter the hole properly, causing a big disturbance on the UAV that can lead to a crash.
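
To illustrate that idea, a minimal sketch assuming a planar arm whose joints all rotate in the same vertical plane (the joint naming and sign convention are assumptions; the real code has to use the manipulator's own geometry):

<Code>
// On a planar arm the absolute orientation of the last link is the sum of the joint
// angles, so the wrist angle can always compensate the shoulder and elbow angles.
double wristAngleForVerticalEndEffector(double shoulderAngle, double elbowAngle)
{
    const double desiredOrientation = -1.5707963268;   // end effector pointing straight down (-pi/2 rad)
    return desiredOrientation - (shoulderAngle + elbowAngle);
}
</Code>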