====== Simultaneous Localization and Mapping (SLAM) ======

===== Motivation =====

SLAM gives the robot information about its whereabouts and, more relevantly, the positions of the features in its surroundings.

This can be essential to our project for identifying each rung during each step of the ladder ascent.

Refer [[http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-412j-cognitive-robotics-spring-2005/projects/1aslam_blas_repo.pdf|here]] for more details on SLAM.
  
===== Learning =====

With reference to [[http://dasl.mem.drexel.edu/~billgreen/slam/slam.html|this webpage]] assigned by Prof Oh for me to work on, I did the homework assignments, and here is my code.

==== Homework 1 ====

<code matlab homework1.m>
R=0.01;
Q=0.00001;
...
...
</code>

Refer to [[http://studentdavestutorials.weebly.com/kalman-filter-with-matlab-code.html|here]] for an example of what the algorithm looks like for a simple Kalman filter problem.
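
For quick reference, below is a minimal sketch of the scalar Kalman filter loop for a toy problem like the one in the linked tutorial, reusing the R and Q values from my code above. The signal model (a noisy constant) is my own assumption for illustration.

<code matlab>
% Minimal scalar Kalman filter sketch. Assumed signal: a noisy constant.
R = 0.01;      % measurement noise covariance (same value as above)
Q = 0.00001;   % process noise covariance (same value as above)

truth = 1.5;                      % assumed true constant state
z = truth + sqrt(R)*randn(1,50);  % simulated noisy measurements

xest = 0;  % initial state estimate
P = 1;     % initial estimate covariance
for k = 1:length(z)
    P = P + Q;                     % predict: constant state, covariance grows
    K = P/(P + R);                 % Kalman gain
    xest = xest + K*(z(k) - xest); % update with measurement z(k)
    P = (1 - K)*P;                 % update covariance
end
</code>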
  
==== Homework 2 ====
  
I personally recommend [[http://elib.uni-stuttgart.de/opus/volltexte/2005/2183/pdf/kleinbauer.pdf|here]] for a better reference on Kalman and extended Kalman filters.
  
<code matlab homework2.m>
...
...
...
...
</code>
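
For orientation before Homework 4, here is a minimal, runnable sketch of the predict/update cycle on an assumed 1-D constant-velocity toy model. This toy model is linear, so F and H are constant matrices; the same structure applies in the extended Kalman filter, where F and H become the Jacobians of nonlinear motion and measurement models.

<code matlab>
% Predict/update cycle on an assumed 1-D constant-velocity model.
% State x = [position; velocity]; measurement z = position + noise.
dt = 0.1;
F = [1 dt; 0 1];               % motion model (its own Jacobian, being linear)
H = [1 0];                     % measurement model (likewise)
Q = 0.00001*eye(2);            % process noise covariance
R = 0.01;                      % measurement noise covariance

xest = [0; 1];  P = eye(2);
for k = 1:100
    z = 0.1*k + sqrt(R)*randn;         % simulated position measurement
    xpred = F*xest;                    % predict state
    Ppred = F*P*F' + Q;                % predict covariance
    K = Ppred*H'/(H*Ppred*H' + R);     % Kalman gain
    xest = xpred + K*(z - H*xpred);    % update state
    P = (eye(2) - K*H)*Ppred;          % update covariance
end
</code>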
  
==== Homework 4 ====

**slam.m**
<code matlab slam.m>
    xest(1,k) = xest(1,k-1) + dt*Vc*cos(phi) - dt*Vc/L*tan(alpha)*(a*sin(phi)+b*cos(phi));
    xest(2,k) = xest(2,k-1) + dt*Vc*sin(phi) + dt*Vc/L*tan(alpha)*(a*cos(phi)-b*sin(phi));
    A(3,1)=0; A(3,2)=0; A(3,3)=1;
    Pest = ??;
</code>
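
The ''Pest = ??;'' line above is left as an exercise. For reference only, the standard EKF covariance prediction that usually fills such a slot is shown below, with A the motion Jacobian assembled above and Q the process noise covariance; treat this as the textbook formula, not my submitted answer.

<code matlab>
% Textbook EKF covariance prediction, shown for reference:
% A is the motion Jacobian assembled above, Q the process noise covariance.
Pest = A*Pest*A' + Q;
</code>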
  
**new_state.m**
<code matlab new_state.m>
...
...
...
...
</code>
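
The body of **new_state.m** is elided above. Conceptually it augments the state vector and covariance with a newly observed landmark; below is a minimal sketch of that augmentation for an assumed range-bearing observation (r, theta). The names and values are illustrative, not taken from my file, and a full implementation would also fill in the robot-landmark cross-covariances via the observation Jacobians.

<code matlab>
% Sketch: augment the SLAM state with a landmark first seen at range r and
% bearing theta relative to the robot heading. Illustrative names only.
xest = [0; 0; 0];        % example state: robot pose [x; y; heading]
Pest = 0.01*eye(3);      % example covariance before augmentation
r = 2.0; theta = pi/6;   % assumed range-bearing observation
Pinit = 100;             % assumed large initial landmark variance

lx = xest(1) + r*cos(xest(3) + theta);   % landmark position in world frame
ly = xest(2) + r*sin(xest(3) + theta);
xest = [xest; lx; ly];                   % grow the state vector
n = length(xest);
Pnew = zeros(n);
Pnew(1:n-2,1:n-2) = Pest;                % keep the old covariance block
Pnew(n-1:n,n-1:n) = Pinit*eye(2);        % uncertain new landmark
Pest = Pnew;                             % (cross terms left at zero here)
</code>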
  
**updateNew.m** and **updateExisting.m**

<code matlab updatenewandexisting.m>
...
...
...
...
</code>
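
**updateNew.m** and **updateExisting.m** carry out the EKF measurement update for a landmark. As a rough sketch of that step, below is the range-bearing update for a simplified state of one robot pose plus one landmark; all names and values are illustrative, not taken from my files.

<code matlab>
% Sketch: EKF range-bearing update, state = [xr; yr; phi; lx; ly].
xest = [0; 0; 0; 2; 1];  Pest = eye(5);  % example state and covariance
R = diag([0.01 0.001]);                  % assumed measurement noise
z = [2.3; 0.5];                          % assumed [range; bearing] reading

dx = xest(4)-xest(1);  dy = xest(5)-xest(2);  q = dx^2 + dy^2;
zpred = [sqrt(q); atan2(dy,dx) - xest(3)];    % predicted range and bearing
H = [-dx/sqrt(q) -dy/sqrt(q)  0  dx/sqrt(q)  dy/sqrt(q);
      dy/q       -dx/q       -1 -dy/q        dx/q      ]; % measurement Jacobian
nu = z - zpred;
nu(2) = atan2(sin(nu(2)), cos(nu(2)));        % wrap the bearing innovation
S = H*Pest*H' + R;                            % innovation covariance
K = Pest*H'/S;                                % Kalman gain
xest = xest + K*nu;
Pest = (eye(5) - K*H)*Pest;
</code>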

==== Implementation ====

=== NXT + Ultrasonic Sensor ===

I will be implementing a pseudo "SLAM" on the NXT.
  
  
I designed the robot such that the Ultrasonic Sensor is along the axle of the motor driving the wheels.

{{dylanw:2013-03-30_19.02.55.jpg}}\\ 
{{dylanw:2013-03-30_19.02.06.jpg}}\\ 
  
Picture of the experimental setup.

{{dylanw:2013-04-03_08.01.01.jpg}}\\ 
  
 This is the "nxc" code for the whole procedure. This is the "nxc" code for the whole procedure.
-<syntaxhighlight lang="c">+ 
 +<code c homework4.c>
#define DIST 30
#define EX_TIME 60000 // experiment time
  CloseFile(angle_handle);
}
</code>
  
 This is the "matlab" code for data processing part. This is the "matlab" code for data processing part.
Parts which are used for updating in the Extended Kalman Filter are commented out since there is no update in this pseudo "SLAM" experiment.
  
<code matlab>
%VicExperiment
clear all; clc;
axis([0 127 0 101.6]);
plot(mapOutline(1,:),mapOutline(2,:),'red');
</code>
  
This is the result in Matlab. The red line represents the experimental boundary, the green line represents the route taken by the robot WITHOUT corrections, and the blue dots represent the detected obstacles at the respective part of the route.

{{dylanw:slam1.jpg}}\\ 
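
Each blue dot is obtained by projecting an ultrasonic range reading from the robot's dead-reckoned pose; a minimal sketch of that projection is below. The variable names and values are illustrative, not taken from my script.

<code matlab>
% Sketch: project a sonar range reading into the world frame from the
% robot's dead-reckoned pose. Illustrative names and values.
pose = [10; 20; pi/4];    % example [x; y; heading] from odometry
scanAngle = -pi/6;        % sensor angle relative to the robot heading
dist = 25;                % ultrasonic range reading

obsX = pose(1) + dist*cos(pose(3) + scanAngle);
obsY = pose(2) + dist*sin(pose(3) + scanAngle);
plot(obsX, obsY, 'b.');   % one blue obstacle dot
</code>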
  
This is a video of the experiment.
  
{{youtube>SDNhBZ4B8Mk?large}}\\ 
  
Thoughts:
  * This is not exactly a SLAM implementation, as the ultrasonic sensor can only detect one obstacle at a time. Without detecting the positions of old landmarks, there can be no updating, and SLAM cannot be implemented.
  * The ultrasonic sensor is fairly inaccurate.
  
Areas to improve:
  * Save data to arrays before writing to files; recording data will be much faster.
  * Put delays between scanning and moving so that the equipment is stable before a measurement is taken and recorded.
  * Make the sensor detect an area of an obstacle instead of a point, and use it to do SLAM.
  * Use the NXT camera to pinpoint the positions of static obstacles and find their distances at each step of the robot's motion.

=== NXT + Ultrasonic Sensor + NXT camera ===

The plan is to use the NXT camera to find the positions of landmarks as the camera scans 180 degrees in front of the robot.
  
The ultrasonic sensor will then measure the distance to that object if the landmark is detected and centered in the camera image (the ultrasonic sensor is aligned to have the same orientation as the camera).
  
== Tutorial on NXTCamera ==

Refer to [[http://dasl.mem.drexel.edu/wiki/index.php/LegoVision|here]] for installation instructions.
  
Here are some things I want to add on:
  * The location to install the USB driver appears under "Other devices" in the "Device Manager" window.
I am using Windows 7.
  
{{dylanw:deviceLocation.jpg}}\\ 
  
  * After installing it, the words "USB Driver" or something similar will appear at the same location (sorry, I did not take a screenshot of it).
This is what the tutorial meant by "install 2 USB drives".
  * Upon the 1st connection using the "NXTCam View" software, I had to configure the port of the new device. I chose Port 5.
  
{{dylanw:comPort.jpg}}\\ 
  
 In the "NXTCam View" software, I set the 1st colormap to the color of the blue pole and the 2nd colormap to the color of the white pole. In the "NXTCam View" software, I set the 1st colormap to the color of the blue pole and the 2nd colormap to the color of the white pole.
  
{{dylanw:blueColormap.jpg}}\\ 
{{dylanw:whiteColormap.jpg}}\\ 
  
The picture below is a demonstration of the tutorial sample code.
The NXT screen shows the blobs of 2 different colours captured in one frame instance based on the colormaps pre-configured using the "NXTCam View" software.
  
{{dylanw:tutorialNXTCamera.jpg}}\\ 

== Trial 1 ==

This is the first trial of capturing landmarks in a 180-degree sweep of the robot's front surroundings and writing the values into a file.
  
{{youtube>D-EGa5qmdC4?large}}\\ 
  
This is the nxc code for:
  * Turning the camera 180 degrees to the right, 10 degrees at a time, starting from 90 degrees to the left.
  * Getting the coordinates of the detected blob(s) to be stored in a file for processing in Matlab.

<code c>
 #include "D:\NXT\NXTCAM\nxtcamlib-default.nxc" #include "D:\NXT\NXTCAM\nxtcamlib-default.nxc"
  
NXTCam_SendCommand(camPort, CAMADDR, 'D'); // disable tracking
}
</code>

Things to take note of:
  * Perform **#include** on **only** one of the "nxcCam" files (i.e. **nxtcamlib.nxc**, **nxtcamlib-default.nxc** or **nxtcamlib-common.nxc**), otherwise there will be an error about the "same identifier" appearing in multiple files.
  * The example tutorial "camtest-v3.nxc" **#include**s **nxtcamlib-default.nxc**, where the **NXTCam_GetBlobs** function has 8 arguments. If you **#include** the **nxtcamlib.nxc** file instead, there are only 7 arguments.
  * Using a sub() to process the results from the **NXTCam_GetBlobs** function will not work. Process the values inside the main loop itself.
  * Do not disable the camera until the end of the program; the camera cannot be enabled again once it is disabled during the runtime of the program.
  * PosRegSetAng() cannot be used once RotateMotor() has been used.
  * The limit on NXT runtime for a program is 10 minutes. The NXT switches off after that length of run time regardless of whether the program has completed or not.
  
Below are the figures collected by the program.
  
{{dylanw:coordinatesnareas.jpg}}\\ 
  
Explanation:
-*"color" refers to the colormaps pre-configured in the "NXTCam View software"; there are only 2 in this case where 0 caters to the blue landmark while 1 caters to the white one. +  * "color" refers to the colormaps pre-configured in the "NXTCam View software"; there are only 2 in this case where 0 caters to the blue landmark while 1 caters to the white one. 
-*"left", "right", "top" and "btm" refer to the coordinates of the blobs. +  * "left", "right", "top" and "btm" refer to the coordinates of the blobs. 
-*"area" refer to the area of the corresponding blob.+  * "area" refer to the area of the corresponding blob.
  
Going Further:
  * With the area value, I can filter out false blobs, which are supposedly small, leaving only the bigger blob, which is the actual landmark (a sketch of such a filter is shown below this list).
  * The blobs will become smaller if the landmarks are too far away; the area threshold which filters out the false noisy blobs must be consistent with the distance threshold which filters out landmarks that are too far.
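
As a rough illustration of that filtering idea, the sketch below keeps only the blobs whose area exceeds a threshold and takes the largest survivor as the landmark. The array layout and threshold value are assumptions for illustration, not the format of my recorded data.

<code matlab>
% Sketch: filter camera blobs by area, keep the largest as the landmark.
% Assumed layout: one blob per row, columns = [left right top btm area].
blobs = [ 10  14  20  23   12;    % small, noisy blob
          60  95  30  80 2450;    % large blob (the real landmark)
         120 125  40  44   20 ];  % small, noisy blob
AREA_THRES = 100;                 % assumed area threshold

candidates = blobs(blobs(:,5) > AREA_THRES, :);  % drop the false blobs
[~, idx] = max(candidates(:,5));                 % largest remaining blob
landmark = candidates(idx, :);                   % (handle the empty case too)
</code>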

== Trial 2 ==

The design of the robot is shown in the picture below.
  
It is specially designed this way to eliminate the need for a transformation from the position of the sensor to the wheels; hence the vehicle can easily be modeled as a point in space.
  
{{dylanw:NXTfinalDesign1.jpg}}{{dylanw:NXTfinalDesign2.jpg}}\\ 
  
The robot moves in a straight line for 5 steps.
If a blob is detected and its area is big enough for it to be considered a real landmark, the distance from it to the ultrasonic sensor is recorded.
  
**Note**: Only up to 4 files can be written in an nxc program.
  
The nxc code for this trial is similar to the one above.
Here is the video. (I changed the white obstacle to black)
  
{{youtube>Q4xNY9fbCFA?large}}\\ 
  
The recorded files are displayed below with brief explanations. These files were recorded in a different trial from the one in the video above.
  * **blue.dat**
{{dylanw:blue.jpg}}\\ 

  * If a true blob is not detected in a scan, that is, there is no blue object in that scan, a default value of 255 is written to that line of the file.
  * If a true blob is detected, the angle and the distance at which it is detected are recorded.
  * If more than one blob is detected, each pair of angle and distance values is recorded separately from the other pairs on the line.

  * **white.dat**

{{dylanw:white.jpg}}\\ 

  * Same explanation as for the blue blob detection above.
  * Note that a black object is used instead.

  * **distFwdAngle.dat**

{{dylanw:distFwdAng.jpg}}\\ 

  * Each line represents the state of the robot at each step of its journey.
  * In each line, the value before the space is the change in orientation of the robot, while the value after it is a flag indicating whether the robot moved forward or not (a parsing sketch is shown below).
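
As a rough sketch of how a file like **distFwdAngle.dat** could be read back for processing, the snippet below parses each "turn angle, forward flag" line into vectors; the variable names are my own.

<code matlab>
% Sketch: parse "<turn angle> <forward flag>" lines, one per robot step.
data = dlmread('distFwdAngle.dat');  % one row per step
dPhi  = data(:,1);                   % change in orientation per step
moved = data(:,2);                   % 1 if the robot stepped forward, else 0

phi = cumsum(dPhi);                  % example use: dead-reckoned heading
</code>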
  
Thoughts:
  * Angles made by the motor are not accurate; the motor does not return to the center after each scan. **Solution**: Try using the inbuilt PID controller to reduce errors.
  * True blobs are sometimes not detected. **Solution**: Adjust the AREA_THRES, RIGHT_THRES and LEFT_THRES values, OR reduce the EYE_TURN_ANGLE value to have more extensive scans, OR ascertain a more precise or tolerant colormap for the camera to effectively and correctly track the objects.
  
== Trial 3 ==

  * Motor angles are accurate with the implementation of the PID controller. The tension in the cables connected to the ports of the ultrasonic sensor and the camera produces an unwanted torque that turns the motor and makes its angle inaccurate. The PID controller applies a counter torque to fix that error, which causes the vibration of the camera and sensor.
  * A lower EYE_TURN_ANGLE to cover more detail in each scan increased the success of detecting obstacles, along with adjustments to the other variables.
Here are the results of the data collected in the experiment shown in the video.
  
{{dylanw:bluedat.jpg}}{{dylanw:whitedat.jpg}}\\ 

{{youtube>fN0vzS6yzbQ?large}}\\ 

== Trial 4 ==

Change of experiment plan.
  
In addition, I will use a white background instead, as the capturing of blobs based on the designated colormap is not very effective. This picture shows the blobs detected when the same colormap is used to capture images at different angles against a green background. Using a white background and black landmarks might make the difference, since they are the extremes of the color spectrum.
  
{{dylanw:badoverlapping.jpg}}\\ 
  
The final experimental setup is shown in the picture below.
  
{{dylanw:trial4expsetup.jpg}}\\ 
  
The nxc code is shown below.
  
**NOTE**: There is no obstacle avoidance implemented for this experiment, as the maximum runtime of the NXT is limited (up to 10 minutes). The robot will make a 140-degree scan of the environment in front of it before taking a step forward. It takes 6 steps at most.
  
The **EYE** in the code refers to the motor, ultrasonic sensor and NXT camera assembly.
  
<code c>
 #include "D:\NXT\NXTCAM\nxtcamlib-default.nxc" #include "D:\NXT\NXTCAM\nxtcamlib-default.nxc"
  
NXTCam_SendCommand(camPort, CAMADDR, 'D'); // disable tracking
}
</code>
  
This is the full video at normal speed.
  
{{youtube>YjQGph-r7jo?large}}\\ 
  
This is a compilation of the 'blob' and 'LMdist' data collected.
  
{{dylanw:retrieveddata.jpg}}\\ 
  
  * Each line represents the 29 data points from all the scans done at each move step.
  * The last line is incomplete, as the NXT reached its maximum run time.
  * Detected blobs and their corresponding distances are highlighted in yellow.
  * There are no blobs detected on the right side for the 1st few steps because those blobs are too far away and are thus filtered out.
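
To see how the clusters shift as the robot advances, each (step, scan angle, distance) detection can be mapped into world coordinates. Below is a minimal sketch of that mapping for a robot moving straight ahead; the step length and values are assumptions for illustration.

<code matlab>
% Sketch: map (step, scan angle, distance) detections to world coordinates
% for a robot moving straight ahead. Assumed step length and example data.
STEP_LEN = 10;            % assumed forward distance per step
det = [1 -40 35;          % rows: [step, scan angle (deg), distance]
       2 -30 33;
       3 -20 31];

ang = det(:,2)*pi/180;
x = det(:,1)*STEP_LEN + det(:,3).*cos(ang);  % along the direction of travel
y = det(:,3).*sin(ang);                      % across it
plot(x, y, '*');   % detections of one obstacle should cluster together
</code>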
  
**Thoughts**
  * Implementation of SLAM is possible, as we can see that the detected values are in clusters and that they shift to the side as the robot moves forward (the sketch above shows this mapping).
  
The matlab code can be found here. **GITHUB**
  
The result of the code is shown below.
  
{{dylanw:trial4.jpg}}\\ 
  
  * The black dots represent the true positions of the obstacles.
  * The '*' marks represent the detected positions of the obstacles.
  * The green line is the true route of the robot from the odometry input data.
  * The red line is the route estimated by the EKF.
  
**Thoughts**
  * An outlier obstacle was detected.
  * The implemented EKF is fairly accurate for this number of steps.
  * Newly detected obstacles at each step are correctly matched to the nearby obstacles.

== Trial 5 ==

I tested out the algorithm on another experimental setup as shown below.
  
{{dylanw:trial5expsetup.jpg}}\\ 
  
With adjustments to the minimum distance threshold for landmarks to be classified as the same landmark, the results are shown below.
  
  * **min_dist**=11

{{dylanw:trial5mindist11.jpg}}\\ 

  * **min_dist**=10

{{dylanw:trial5mindist10.jpg}}\\ 

  * **min_dist**=8,9

{{dylanw:trial5mindist8_9.jpg}}\\ 

  * **min_dist**=7

{{dylanw:trial5mindist7.jpg}}\\ 

  * **min_dist**=6

{{dylanw:trial5mindist6.jpg}}\\ 
  
Thoughts:
  * The number of observed landmarks increases as the **min_dist** threshold value decreases. This is expected, as observed landmarks whose mutual distances were initially well under the threshold value are now above it and are thus considered distinct landmarks (a sketch of this association rule is shown after the scan image below).
  * At a higher threshold value for **min_dist**, the number of observed landmarks fits the experimental setup, but there is a high possibility of wrongly associating truly distinct landmarks as the same observed landmark.
  * Inaccuracy is also due to the limitations of the equipment. Below are the observed landmarks at the very first scan of the experiment. One of the observed landmarks, circled in green, is an abnormality. It may be due to overlap during scanning between the nearer landmark circled in red and the further one circled in blue. Adjustments to the various threshold values need to be made to improve the results.

{{dylanw:trial5scan1.jpg}}\\ 
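
As a rough sketch of the **min_dist** rule discussed above: a new detection is matched to the nearest known landmark if it lies within **min_dist** of it, and otherwise starts a new landmark. The landmark list layout and values are assumptions for illustration.

<code matlab>
% Sketch: nearest-neighbour landmark association with a min_dist threshold.
min_dist = 10;                % threshold, as varied in the trials above
landmarks = [20 30; 50 35];   % example known landmarks, one [x y] per row
obs = [22 28];                % example new detection

d = sqrt(sum((landmarks - repmat(obs, size(landmarks,1), 1)).^2, 2));
[dmin, idx] = min(d);
if dmin < min_dist
    matched = idx;                    % same landmark: EKF-update landmark idx
else
    landmarks = [landmarks; obs];     % distinct landmark: add it to the map
end
</code>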
  
=== Going Further ===

Adjustments to values like **RIGHT_THRES**, **LEFT_THRES**, **AREA_THRES** and **EYE_TURN_ANGLE** will improve the results. The optimum result will require a good balance of these values.