drexel_darwin_slam
====== Simultaneous Localization and Mapping (SLAM) ======
===== Motivation =====

SLAM gives the robot information about its whereabouts and, more relevantly, the positions of the features of its surroundings.
This is essential to our project for identifying each rung during each step of the ladder ascent.
Refer [[http://
===== Learning =====

With reference to [[http://

==== Homework 1 ====

<code matlab>
R=0.01;
Q=0.00001;
...
...
</code>

Refer to [[http://

==== Homework 2 ====

I personally recommend
<code matlab>
...
...
...
...
</code>

==== Homework 4 ====

**slam.m**
<code matlab>
xest(1,k) = xest(1,k-1) + dt*Vc*cos(phi) - dt*Vc/
xest(2,k) = xest(2,k-1) + dt*Vc*sin(phi) + dt*Vc/
A(3,1)=0; A(3,2)=0; A(3,3)=1;
Pest = ??;
</code>
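The snippet deliberately leaves ''Pest = ??'' as an exercise. In the standard EKF prediction step the covariance is propagated through the motion-model Jacobian A and inflated by the process noise, Pest = A*P*A' + Q. A plain-C sketch of that product for the 3x3 vehicle block (treat this as one common form, not the homework's answer; names and the 3-state layout are assumptions based on the snippet):

```c
/* Propagate the 3x3 vehicle covariance through the motion-model
   Jacobian A, adding process noise Q:  Pest = A * P * A^T + Q.
   This is the standard EKF prediction; the wiki leaves Pest = ??
   open, so this is a hedged sketch, not the official solution. */
void predict_cov(const double A[3][3], const double P[3][3],
                 const double Q[3][3], double Pest[3][3])
{
    double AP[3][3];
    /* AP = A * P */
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            AP[i][j] = 0.0;
            for (int k = 0; k < 3; k++)
                AP[i][j] += A[i][k] * P[k][j];
        }
    /* Pest = AP * A^T + Q */
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            Pest[i][j] = Q[i][j];
            for (int k = 0; k < 3; k++)
                Pest[i][j] += AP[i][k] * A[j][k];
        }
}
```

With A set to the Jacobian built from the xest lines above, this fills the vehicle block of the predicted covariance.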
  * **new_state.m**
<code matlab>
...
...
...
...
</code>

**updateNew.m** and **updateExisting.m**

<code matlab>
...
...
...
...
</code>

==== Implementation ====

=== NXT + Ultrasonic Sensor ===

I will be implementing a pseudo "
I designed the robot such that the Ultrasonic Sensor is along the axle of the motor driving the wheels.

{{dylanw:2013-03-30_19.02.55.jpg}}\\
{{dylanw:2013-03-30_19.02.06.jpg}}\\
Pictures of the experimental setup.

{{dylanw:2013-04-03_08.01.01.jpg}}\\
This is the "
<code c homework4.c>
#define DIST 30
#define EX_TIME 60000//
CloseFile(angle_handle);
}
</code>
This is the "
Parts which are used for updating in the Extended Kalman Filter are commented out, since there is no update in this pseudo "

<code matlab>
%VicExperiment
clear all; clc;
axis([0 127 0 101.6]);
plot(mapOutline(1,:
</code>
This is the result in Matlab. The red line represents the experimental boundary, the green line represents the route taken by the robot WITHOUT corrections,

{{dylanw:slam1.jpg}}\\
This is a video of the experiment.
{{youtube>SDNhBZ4B8Mk?large}}\\
Thoughts:
  * This is not exactly a SLAM implementation, as the Ultrasonic sensor can only detect one obstacle at a time. Without detecting the positions of the old landmarks, there can be no updating and SLAM cannot be implemented.
  * The Ultrasonic sensor is fairly inaccurate.

Areas to improve:
  * Save data to arrays before writing to files. It will be much faster in recording data.
  * Put delays in between scanning and moving so that the equipment is stable before a measurement is taken and data is recorded.
  * Make the sensor detect an area of an obstacle instead of a point, and use it to do SLAM.
  * Use the NXT camera to pinpoint the positions of static obstacles and find the distance at each step of the robot motion.

=== NXT + Ultrasonic Sensor + NXT camera ===

The plan is to use the NXT camera to find the positions of landmarks as the camera scans 180 degrees in front of the robot.
The ultrasonic sensor will then detect the distance to that object if the landmark is detected at the center of the camera (the ultrasonic sensor is aligned to have the same orientation as the camera).
== Tutorial on NXTCamera ==

Refer to [[http://
Here are some things I want to add on:
  * The location to install the USB driver appears in "Other devices"
I am using Windows 7.
{{dylanw:
  * After installing it, at the same location, the words "USB Driver"
This is what the tutorial meant by "
  * Upon the 1st connection using the "
{{dylanw:
In the "
{{dylanw:
{{dylanw:
The picture below is a demonstration of the tutorial sample code.

The NXT screen shows the blobs of 2 different colours captured in one frame instance, based on the colormaps pre-configured using the "
{{dylanw:

== Trial 1 ==

This is the first trial on capturing landmarks on a 180-degree sweep of the front surrounding,
{{youtube>D-EGa5qmdC4?large}}\\
This is the nxc code for:
  * Turning the camera 180 degrees to the right at 10 degrees at a time, starting from 90 degrees to the left.
  * Getting the coordinates of the detected blob(s) to be stored in a file for processing in Matlab.
<code c>
#include "
NXTCam_SendCommand(camPort,
}
</code>
Things to take note of:
  * Perform **#
  * From example tutorial "
  * Using a sub() to process the results from the **NXTCam_GetBlobs** function will not work. Process the values inside the main loop itself.
  * Do not disable the camera until the end of the program; the camera cannot be enabled again once it is disabled in the runtime of the program.
  * PosRegSetAng() cannot be used once RotateMotor() is used.
  * The limit of NXT runtime for a program is 10 mins. The NXT switches off after that length of run time regardless of whether the program has completed or not.
Below are the collected figures from the program.

{{dylanw:coordinatesnareas.jpg}}\\
Explanation:
  * "
  * "
  * "

Going Further:
  * With the area value, I can filter out false blobs, which are supposedly small, leaving only the bigger blob which is the actual landmark.
  * The blobs will become small if the landmarks are too far; the area threshold which filters out the false noisy blobs must make sense with the distance threshold which filters out landmarks that are too far.
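The area filter described above can be sketched as a simple gate over the blob list. This is a hedged C illustration: the Blob corner fields mirror the kind of coordinates logged in the figure, and AREA_THRES is an assumed tuning constant, not a value from the wiki's code.

```c
/* Keep only blobs whose bounding-box area clears a threshold,
   discarding small false blobs so the bigger true landmark
   remains. AREA_THRES is an assumed tuning constant. */
#define AREA_THRES 50

typedef struct { int x1, y1, x2, y2; } Blob;

int blob_area(const Blob *b)
{
    return (b->x2 - b->x1) * (b->y2 - b->y1);
}

/* Compacts the array in place; returns the number of blobs kept. */
int filter_blobs(Blob *blobs, int n)
{
    int kept = 0;
    for (int i = 0; i < n; i++)
        if (blob_area(&blobs[i]) >= AREA_THRES)
            blobs[kept++] = blobs[i];
    return kept;
}
```

As the second bullet notes, AREA_THRES has to be tuned together with any distance cutoff, since far landmarks legitimately produce small blobs.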

== Trial 2 ==

Design of the robot is as shown in the picture below.

This is specially designed to eliminate the need for transforming the position of the sensor to the wheels; hence the vehicle can easily be modeled as a point in space.

{{dylanw:
The robot moves in a straight line for 5 steps.
If detected and the blob has an area big enough to be considered a real landmark, the distance from it to the ultrasonic sensor is recorded.

**Note**: Only up to 4 files can be written in an nxc program.

The nxc code for this trial is similar to the one above.

Here is the video. (I changed the white obstacle to black)
{{youtube>Q4xNY9fbCFA?large}}\\
The recorded files are displayed below with slight explanations. These files were recorded in a different trial from the one shown in the video above.

  * **blue.dat**

{{dylanw:blue.jpg}}\\

  * If a true blob is not detected in one scan, that is, there is no blue object in the scan, a default value of 255 is written into a line in the file.
  * If a true blob is detected, the angle and the distance at which it is detected are recorded.
  * If more than one blob is detected, the pairs of angle and distance values are recorded separately from the other pairs in a line.

  * **white.dat**

{{dylanw:white.jpg}}\\

  * Same explanation as the blue blob detection above.
  * Note that a black object is used instead.

  * **distFwdAngle.dat**

{{dylanw:
  * Each line represents the state of the robot at each step of its journey.
  * In each line, the value before the space is the change in orientation of the robot, while the value after is a state function of whether the robot moved forward or not.
Thoughts:
  * Angles made by the motor are not accurate; the motor does not return to the center after each scan.
  * True blobs are not detected.
== Trial 3 ==
  * Motor angles are accurate with the implementation of the PID controller. The tension in the cables connecting to the ports of the ultrasonic sensor and the camera gives an unwanted torque that turns the motor and makes its angle inaccurate. The PID controller seeks to give a counter torque to fix that error. This causes the vibration of the camera and sensor.
  * A lower EYE_TURN_ANGLE to cover more details in each scan increased the success of detecting obstacles, along with adjustments to the other variables.

Here are the results of the data collected in the experiment shown in the video.

{{dylanw:

{{youtube>

== Trial 4 ==

Change of experiment plan.

In addition, I will use a white background instead, as the capturing of blobs based on the designated colormap is not very effective. This picture shows the blobs detected when the same colormap is used to capture images at different angles using a green background. Using a white background and black landmarks might make the difference, since they are the extremes of the color spectrum.
{{dylanw:badoverlapping.jpg}}\\

The final experimental setup is shown in the picture below.

{{dylanw:trial4expsetup.jpg}}\\

The nxc code is shown below.

**NOTE**: There is no obstacle avoidance implemented for this experiment, as the maximum runtime of the NXT is limited (up to 10 minutes). The robot will make a 140-degree scan of the environment in front of it before taking a step forward. It takes at most 6 steps.

The **EYE** in the code refers to the motor, ultrasonic sensor and NXT camera assembly.
<code c>
#include "
NXTCam_SendCommand(camPort,
}
</code>
This is the full video at normal speed.

{{youtube>YjQGph-r7jo?large}}\\
This is a compilation of the '
{{dylanw:retrieveddata.jpg}}\\
  * Each line represents the 29 data points of all the scans done at each move step.
  * The last line is incomplete as it has reached the maximum run time of the NXT.
  * Detected blobs and their corresponding detected distances are highlighted in yellow.
  * There are no blobs detected on the right side for the 1st few steps because the blobs are too far and thus filtered out.
**Thoughts**:
  * Implementing SLAM is possible, as we can see the detected values are in clusters and they shift to the side as the robot moves forward.

The matlab code can be found here. **GITHUB**

The result of the code is shown below.

{{dylanw:trial4.jpg}}\\
  * The black dots represent the true positions of the obstacles.
  * The '
  * The green line is the true route of the robot from the odometry input data.
  * The red line is the estimated route after going through the EKF.

**Thoughts**:
  * Outlier obstacle detected.
  * The implemented EKF is fairly accurate for this number of steps.
  * Newly detected obstacles at each step are correctly matched to the nearby obstacles.

== Trial 5 ==

I tested out the algorithm on another experimental setup, as shown below.

{{dylanw:trial5expsetup.jpg}}\\
With an adjustment to the minimum distance threshold between landmarks to be classified as the same landmark, the results are shown below.
  * **min_dist**=11
{{dylanw:
  * **min_dist**=10
{{dylanw:
  * **min_dist**=8,9
{{dylanw:
  * **min_dist**=7
{{dylanw:
  * **min_dist**=6
{{dylanw:
Thoughts:
  * The number of observed landmarks increased as the **min_dist**
  * At a higher threshold value for **min_dist**, the number of observed landmarks fits the experimental setup, but there is a high possibility of wrongly associating truly distinct landmarks as the same observed landmark.
  * Inaccuracy is also due to the limitations of the equipment. Below are the observed landmarks at the very first scan of the experiment. One of the observed landmarks is an anomaly, circled in green. It may be due to overlap between the nearer landmark circled in red and the further one circled in blue during scanning. Adjustments to the various threshold values need to be made to improve the results.

{{dylanw:
=== Going Further ===

Adjustments to values like **RIGHT_THRES**, **LEFT_THRES**, **AREA_THRES** and **EYE_TURN_ANGLE** will improve the results. The optimum result will require a good balance of these values.
drexel_darwin_slam · Last modified: 2016/11/06 19:42 by dwallace