Simultaneous Localization and Mapping (SLAM)
==Motivation==
SLAM gives the robot information about its own whereabouts and, more relevantly for us, the positions of the features in its surroundings.
This can be essential to our project for identifying each rung during each step of the ladder ascent.
Refer [http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-412j-cognitive-robotics-spring-2005/projects/1aslam_blas_repo.pdf here] for more details on SLAM.
==Learning==
With reference to the [http://dasl.mem.drexel.edu/~billgreen/slam/slam.html webpage] assigned by Prof Oh for me to work on, I did the homework exercises; here is my code.
===Homework 1===
<syntaxhighlight lang="matlab">
R=0.01;
Q=0.00001;
updatedState=-0.37727;    % initial value stated in data.m
updatedCovariance=R;      % initial covariance is the measurement covariance, R
Vmeasured=zeros(1,50);    % initialize 50 zeros for Vmeasured
Vkalman=zeros(1,50);
for i=1:1:50
    if i~=1
        predictedState=updatedState;                % A=1 and B=0, there is no input
        predictedCovariance=updatedCovariance + Q;  % A=1
        innovation=voltage(i) - predictedState;     % H=1
        innovationCovariance=predictedCovariance + R;
        kalmanGain=predictedCovariance * innovationCovariance^-1;
        updatedState=predictedState + kalmanGain * innovation;
        updatedCovariance=(1-kalmanGain) * predictedCovariance;  % I matrix=1 here, H=1
        Vkalman(i)=updatedState;
    end
end
for i=1:1:50
    Vmeasured(i)=-0.37727;
end
...
...
</syntaxhighlight>
Refer [http://studentdavestutorials.weebly.com/kalman-filter-with-matlab-code.html here] for an example of how the algorithm looks for a simple Kalman filter problem.
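For reference, the code above implements the standard scalar Kalman recursion with state transition A = 1, measurement matrix H = 1 and no control input, where <math>z_k</math> is the measured voltage:

<math>\hat{x}^-_k = \hat{x}_{k-1}, \qquad P^-_k = P_{k-1} + Q</math>

<math>K_k = \frac{P^-_k}{P^-_k + R}, \qquad \hat{x}_k = \hat{x}^-_k + K_k (z_k - \hat{x}^-_k), \qquad P_k = (1 - K_k) P^-_k</math>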
===Homework 2===
I personally recommend [http://elib.uni-stuttgart.de/opus/volltexte/2005/2183/pdf/kleinbauer.pdf this paper] for a better reference on Kalman and extended Kalman filters.
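In outline, the extended Kalman filter in the code below propagates the state through the nonlinear motion model and uses its Jacobian <math>F</math> (together with the measurement Jacobian <math>J_h</math>) for the covariance and gain computations:

<math>\hat{x}^-_k = f(\hat{x}_{k-1}, u_k), \qquad P^-_k = F P_{k-1} F^T + W Q W^T</math>

<math>K_k = P^-_k J_h^T \left( J_h P^-_k J_h^T + V R V^T \right)^{-1}</math>

<math>\hat{x}_k = \hat{x}^-_k + K_k \left( z_k - J_h \hat{x}^-_k \right), \qquad P_k = (I - K_k J_h) P^-_k</math>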
<syntaxhighlight lang="matlab">
...
...
% constant Jacobian matrices
V=eye(3);
Vtrans=transpose(V);
Jh=eye(3);
Jhtrans=transpose(Jh);
W=zeros(3);
Wtrans=transpose(W);
I=eye(3);

% initialization for i=1, t=0
deltaTheta=vel*tan(phi)/L;    % constant
theta=(0)*deltaTheta;         % increases by the constant deltaTheta every timestep
Vkalman=zeros(3,N);
sysmodel=zeros(3,1);

% initialize the initial state to zero
updatedState=zeros(3,1);
predictedState=zeros(3,1);
updatedCovariance=P;          % as stated in the question -> P(1:3,1:3) = 1;

% main iteration
for i=2:deltaT:N              % t=1 and onwards
    x=deltaT*vel*cos(theta);
    y=deltaT*vel*sin(theta);
    theta=(i-1)*deltaTheta;
    input=[x;y;theta];        % values are at k-1
    % every timestep the true position of the model changes by input
    % (start from the 2nd step onwards)
    sysmodel(1,i)=sysmodel(1,i-1)+input(1);
    sysmodel(2,i)=sysmodel(2,i-1)+input(2);
    sysmodel(3,i)=input(3);
    F=[1 0 -1*vel*sin(theta);
       0 1 vel*cos(theta);
       0 0 1];
    Ftrans=transpose(F);
    predictedState(1)=updatedState(1)+input(1);
    predictedState(2)=updatedState(2)+input(2);
    predictedState(3)=updatedState(3);
    predictedCovariance=F*updatedCovariance*Ftrans + W*Q*Wtrans;
    % initial P(k) is as stated in the question -> P(1:3,1:3) = 1;

    temp=Jh*predictedState;
    innovation=[z(1,i)-temp(1);
                z(2,i)-temp(2);
                z(3,i)-temp(3)];   % actual measurement - H*predicted state
    innovationCovariance=Jh*predictedCovariance*Jhtrans + V*R*Vtrans;

    kalmanGain=(predictedCovariance*Jhtrans)/innovationCovariance;
    updatedState=predictedState+kalmanGain*innovation;
    updatedCovariance=(I-kalmanGain*Jh)*predictedCovariance;

    Vkalman(1,i)=updatedState(1);
    Vkalman(2,i)=updatedState(2);
    Vkalman(3,i)=updatedState(3)+input(3);
    % the updated theta state works on the difference between measured and actual,
    % so the input value must be added back onto that difference
end

% convert theta from radians to degrees for plotting
for i=1:deltaT:N
    z(3,i)=z(3,i)*360/2/pi;
    Vkalman(3,i)=Vkalman(3,i)*360/2/pi;
    sysmodel(3,i)=sysmodel(3,i)*360/2/pi;
end

% graphs
...
...
</syntaxhighlight>
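The plotting code is elided above; as a minimal sketch (assuming the z, sysmodel and Vkalman arrays defined above), the three trajectories could be compared like this:

<syntaxhighlight lang="matlab">
% Compare measured, modelled and EKF-estimated trajectories in the x-y plane.
figure; hold on;
plot(z(1,:), z(2,:), 'b.');                % raw measurements
plot(sysmodel(1,:), sysmodel(2,:), 'g-');  % system model (true) trajectory
plot(Vkalman(1,:), Vkalman(2,:), 'r-');    % EKF estimate
legend('measured','model','EKF');
xlabel('x'); ylabel('y');
</syntaxhighlight>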
===Homework 4===
*slam.m
<syntaxhighlight lang="matlab">
% predict the vehicle pose one timestep ahead using the motion model
xest(1,k) = xest(1,k-1) + dt*Vc*cos(phi) - dt*Vc/L*tan(alpha)*(a*sin(phi)+b*cos(phi));
xest(2,k) = xest(2,k-1) + dt*Vc*sin(phi) + dt*Vc/L*tan(alpha)*(a*cos(phi)-b*sin(phi));
xest(3,k) = phi + dt*Vc/L*tan(alpha);
...
...
% Jacobian A of the motion model: skeleton to be filled in
A(1,1)=??; A(1,2)=??; A(1,3)=??;
A(2,1)=??; A(2,2)=??; A(2,3)=??;
A(3,1)=??; A(3,2)=??; A(3,3)=??;
...
...
% filled-in Jacobian entries
A(1,1)=1; A(1,2)=0; A(1,3)=Vc*dt*(-1)*(sin(phi) + tan(alpha)/L*(a*cos(phi)-b*sin(phi)));
A(2,1)=0; A(2,2)=1; A(2,3)=Vc*dt*(cos(phi) - tan(alpha)/L*(a*sin(phi)+b*cos(phi)));
A(3,1)=0; A(3,2)=0; A(3,3)=1;
Pest = ??;
</syntaxhighlight>
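The filled-in A entries follow from differentiating the process model above with respect to the state <math>(x, y, \phi)</math>:

<math>A = \frac{\partial f}{\partial x} = \begin{bmatrix} 1 & 0 & -V_c\,dt\left(\sin\phi + \frac{\tan\alpha}{L}(a\cos\phi - b\sin\phi)\right) \\ 0 & 1 & V_c\,dt\left(\cos\phi - \frac{\tan\alpha}{L}(a\sin\phi + b\cos\phi)\right) \\ 0 & 0 & 1 \end{bmatrix}</math>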
*new_state.m
<syntaxhighlight lang="matlab">
...
...
% convert a laser range-bearing return into a global landmark position
x = xest(1,k)+zlaser(1)*cos(xest(3,k)-pi/2+zlaser(2));
y = xest(2,k)+zlaser(1)*sin(xest(3,k)-pi/2+zlaser(2));
...
...
</syntaxhighlight>
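Here zlaser(1) is the measured range <math>r</math> and zlaser(2) the bearing <math>\beta</math> within the scan; the <math>-\pi/2</math> offset suggests the scan's zero bearing points to the vehicle's right, so a new landmark's global position follows from the vehicle pose <math>(x, y, \phi)</math> as:

<math>x_L = x + r\cos\left(\phi - \tfrac{\pi}{2} + \beta\right), \qquad y_L = y + r\sin\left(\phi - \tfrac{\pi}{2} + \beta\right)</math>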
*updateNew.m and updateExisting.m
<syntaxhighlight lang="matlab">
...
...
% Jacobian of the range-bearing measurement with respect to the vehicle pose
Jh(1,1) = -dx/r;    Jh(1,2) = -dy/r;     Jh(1,3) = 0;
Jh(2,1) = dy/(r^2); Jh(2,2) = -dx/(r^2); Jh(2,3) = -1;
% Jh columns for the landmark states in the augmented state vector
Jh(1,numStates-1) = 0; Jh(1,numStates) = 0;
Jh(2,numStates-1) = 0; Jh(2,numStates) = 0;
...
...
% standard EKF update
K = Pest*Jh'/(Jh*Pest*Jh'+R);
xest(:,k) = xest(:,k)+K*innov;
xest(3,k) = normalizeAngle(xest(3,k));
I = eye(size(K*Jh));
Pest = (I-K*Jh)*Pest;
...
...
</syntaxhighlight>
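With <math>dx</math> and <math>dy</math> the offsets from the vehicle to the landmark and <math>r = \sqrt{dx^2 + dy^2}</math>, the filled-in Jh rows are the partial derivatives of the range and bearing measurements with respect to the vehicle pose <math>(x, y, \phi)</math>, consistent with the <math>-\pi/2</math> bearing convention used in new_state.m:

<math>\frac{\partial h}{\partial (x, y, \phi)} = \begin{bmatrix} -dx/r & -dy/r & 0 \\ dy/r^2 & -dx/r^2 & -1 \end{bmatrix}</math>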
==Implementation==
===NXT + Ultrasonic Sensor===
I will be implementing a pseudo “SLAM” on the NXT.
I will be using the ultrasonic sensor to detect obstacle distances from the robot.
Since this sensor can only detect one object at a time, it cannot do any updating and correction, which is the second part of the SLAM process.
I will just plot a map of the route taken by the NXT and the experimental area using the data retrieved.
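As a rough sketch of this mapping step (the array names odo, scans and eyeAngles are hypothetical stand-ins for the data logged by the NXT), each ultrasonic range can be projected into global coordinates from the dead-reckoned robot pose:

<syntaxhighlight lang="matlab">
% Minimal sketch: dead-reckon the robot pose from the odometry log and
% project each ultrasonic range into global coordinates.
% odo(k,:) = [turn angle, forward distance] for step k (hypothetical),
% scans(k,j) = range at eye angle eyeAngles(j) during step k (hypothetical).
x = 0; y = 0; theta = 0;    % initial robot pose
route = [x; y];             % route taken by the NXT
obstacles = [];             % detected obstacle points
for k = 1:size(scans,1)
    for j = 1:length(eyeAngles)
        d = scans(k,j);     % range reading at this eye angle
        if d < 255          % the NXT ultrasonic sensor returns 255 for "no echo"
            obstacles(:,end+1) = [x + d*cos(theta + eyeAngles(j));
                                  y + d*sin(theta + eyeAngles(j))];
        end
    end
    theta = theta + odo(k,1);          % turn, then move forward
    x = x + odo(k,2)*cos(theta);
    y = y + odo(k,2)*sin(theta);
    route(:,end+1) = [x; y];
end
plot(route(1,:), route(2,:), 'g-'); hold on;   % route of the robot
plot(obstacles(1,:), obstacles(2,:), 'k.');    % mapped obstacle points
</syntaxhighlight>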
I designed the robot such that the ultrasonic sensor sits along the axle of the motors driving the wheels.
Picture of the experimental setup.
<syntaxhighlight lang="c">
...
inline void delay()
{
    Wait(DELAY);
}

sub open_for_write()
{
    DeleteFile(TEST_FILE_NAME);
    CreateFile(TEST_FILE_NAME,TEST_FILE_SIZE,test_handle);
}

task main()
{
    open_for_write();
    int init;
    init = NXTCam_Init(camPort, CAMADDR);       // initialize
    NXTCam_SendCommand(camPort, CAMADDR, 'E');  // enable tracking

    //+++++++++++++SCANNING++++++++++//
    int bluecount=1,whitecount=1;
    s="eyeFrontAng=";
    s+=NumToStr(eyeFrontAng);
    TextOut(0,LCD_LINE6,s,false);
    delay();

    //++++++++++detect black landmark++++++++++//
    RotateMotor(EYE,EYE_POWER,-EYE_LEFT);
    eyeCurAng=-EYE_LEFT;
    while (eyeCurAng<EYE_RIGHT)
    {
        delay();
        ClearScreen();
        s="eyeCurAng=";
        s+=NumToStr(eyeCurAng);
        WriteLn(test_handle,s);
        TextOut(0,LCD_LINE7,s,false);
        NXTCam_SendCommand(camPort, CAMADDR, 'B');  // object tracking
        NXTCam_SendCommand(camPort, CAMADDR, 'E');  // enable tracking
        delay();
        delay();
        delay();
        NXTCam_GetBlobs(camPort, CAMADDR, nblobs, color, left, top, right, bottom);
        // saving details of each blob
        for (colorCount=0;colorCount<nblobs;colorCount++)
        {
            s="color:";
            s+=NumToStr(color[colorCount]);
            s+=" left:";
            s+=NumToStr(left[colorCount]);
            s+=" right:";
            s+=NumToStr(right[colorCount]);
            s+=" top:";
            s+=NumToStr(top[colorCount]);
            s+=" btm:";
            s+=NumToStr(bottom[colorCount]);
            area = (right[colorCount]-left[colorCount])*(bottom[colorCount]-top[colorCount]);
            s+=" area:";
            s+=NumToStr(area);
            WriteLn(test_handle,s);
        }
        delay();
        RotateMotor(EYE,EYE_POWER,EYE_TURN_ANGLE);
        eyeCurAng+=EYE_TURN_ANGLE;
    }//end while
    RotateMotor(EYE,EYE_POWER,-EYE_RIGHT);
    NXTCam_SendCommand(camPort, CAMADDR, 'D');  // disable tracking
}
</syntaxhighlight>
Things to take note:
*Perform ''#include'' on '''only''' one of the "nxcCam" files (i.e. ''nxtcamlib.nxc'', ''nxtcamlib-default.nxc'' or ''nxtcamlib-common.nxc''); otherwise there will be "same identifier" errors from having the identifiers defined in multiple files.
*The example tutorial ''camtest-v3.nxc'' ''#include''s ''nxtcamlib-default.nxc'', where the ''NXTCam_GetBlobs'' function takes 8 arguments. If you ''#include'' ''nxtcamlib.nxc'' instead, it takes only 7 arguments.
*Using a sub() to process the results from the ''NXTCam_GetBlobs'' function will not work. Process the values inside the main loop itself.
*Do not disable the camera until the end of the program; the camera cannot be enabled again once it has been disabled during the runtime of the program.
*PosRegSetAngle() cannot be used once RotateMotor() has been used.
*The NXT's runtime limit for a program is 10 minutes. The NXT switches off after that length of run time, regardless of whether the program has completed.
Below shows the collected figures from the program.
[[File:coordinatesNAreas.jpg]]
<syntaxhighlight lang="c">
...
#define LM_DIST_FILE_NAME "LMdist.csv"
#define FILE_SIZE 1048
#define DEFAULT 255
byte blob_handle=0,distFwd_ang_handle=0,LMdist_handle=0;
#define DIST_THRES 45        // only register a landmark distance if it is near enough
#define AREA_THRES 1200
#define MAX_AREA_THRES 20000
#define LEFT_THRES 46
#define RIGHT_THRES 130
#define BLOBHEIGHT 60
string sblob,sLMdist,sdistFwdAng;
// for getBlob()
int color[10];
int left[10];
int top[10];
int right[10];
int bottom[10];
int nblobs;
// global variables
long eyeCurAng;
int journey=0;               // journey++ after every scan-and-move
#define END_OF_JOURNEY 7     // experiment steps

inline void delay()
{
    Wait(DELAY);
}

sub open_for_write()
{
    DeleteFile(BLOB_FILE_NAME);
    DeleteFile(LM_DIST_FILE_NAME);
    DeleteFile(DISTFWD_ANG_FILE_NAME);
    CreateFile(BLOB_FILE_NAME,FILE_SIZE,blob_handle);
    CreateFile(DISTFWD_ANG_FILE_NAME,FILE_SIZE,distFwd_ang_handle);
    CreateFile(LM_DIST_FILE_NAME,FILE_SIZE,LMdist_handle);
}

sub fwd()
{
    RotateMotorPID(BOTH_MOTOR,WHEEL_POWER,WHEELS_TURN_ANGLE,PID_1,PID_5,PID_0);
    delay();        // give time for correction of angle by PID
    distFwd=1;
    angle=0;
}

task main()
{
    PosRegSetMax(EYE, MAX_POWER, MAX_ACC);
    PosRegEnable(EYE, PID_1, PID_5, PID_0);
    SetSensorLowspeed(US);
    open_for_write();
    int init,blobHeight,detected=0;
    init = NXTCam_Init(camPort, CAMADDR);       // initialize
    NXTCam_SendCommand(camPort, CAMADDR, 'E');  // enable tracking
    while(journey<END_OF_JOURNEY)
    {
        sdistFwdAng=NumToStr(angle);            // record state of robot
        sdistFwdAng+=" ";
        sdistFwdAng+=NumToStr(distFwd);
        WriteLn(distFwd_ang_handle,sdistFwdAng);

        //+++++++++++++++SCANNING++++++++++//
        PosRegSetAngle(EYE, -EYE_LEFT);  // set the initial starting angle for the eye of the robot
        delay();                         // give time for correction of angle by PID
        eyeCurAng=-EYE_LEFT;             // save current angle of eye
        detected=0;                      // reset variables before scanning loop
        while (eyeCurAng<=EYE_RIGHT)     // per scan
        {
            dist=SensorUS(US);
            detected=0;                  // reset variables before scanning the next angle
            NXTCam_SendCommand(camPort, CAMADDR, 'B');  // object tracking
            NXTCam_SendCommand(camPort, CAMADDR, 'E');  // enable tracking
            delay();                     // give time to stabilise blob detection
            NXTCam_GetBlobs(camPort, CAMADDR, nblobs, color, left, top, right, bottom);

            // blob filter: saving details of each blob
            if (nblobs!=0)  // if blobs of the designated color are detected
            {
                for (colorCount=0;colorCount<nblobs;colorCount++)  // each same-color blob that is detected
                {
                    area = abs(right[colorCount]-left[colorCount])*abs(bottom[colorCount]-top[colorCount]);  // area of each blob detected at this angle
                    mid = (right[colorCount]+left[colorCount])/2;          // midpoint of each blob detected at this angle
                    blobHeight = abs(bottom[colorCount]-top[colorCount]);  // height of each blob
                    if (area>AREA_THRES && area<MAX_AREA_THRES  // blob has a big enough area, and is not too big due to detection error
                        && blobHeight>BLOBHEIGHT                // blob is tall enough
                        && mid<RIGHT_THRES && mid>LEFT_THRES    // blob lies in the middle of the camera view
                        && dist < DIST_THRES                    // blob detected is not too far away to be invalid
                        && detected==0)                         // take in only one blob that passes the filters
                    {
                        sblob=NumToStr(2);
                        Write(blob_handle,sblob);  // save value 2 for a true blob detected at this particular angle
                        detected++;
                    }
                }//endfor
            }//endif
            if (detected==0)  // if no blobs are recognised at this angle
            {
                sblob=NumToStr(1);
                Write(blob_handle,sblob);  // save value 1 for no true blob detected at this particular angle
            }
            sLMdist=NumToStr(dist);
            Write(LMdist_handle,sLMdist);  // save the distance detected at this particular angle
            PosRegAddAngle(EYE, EYE_TURN_ANGLE);
            delay();
            eyeCurAng+=EYE_TURN_ANGLE;
        }//end while
        sblob="";
        WriteLn(blob_handle,sblob);      // after the full 140-degree scan completes, go to the next line
        sLMdist="";
        WriteLn(LMdist_handle,sLMdist);  // after the full 140-degree scan completes, go to the next line

        //+++++++++++end of 1 SCAN+++++++++++//
        PosRegSetAngle(EYE, ORIGIN);     // return to front
        delay();                         // give time for correction of angle by PID

        //++++++obstacle avoidance+++++++++//
        fwd();                           // move forward after the full 140-degree scan
        journey++;
    }//endwhile
    NXTCam_SendCommand(camPort, CAMADDR, 'D');  // disable tracking
}
</syntaxhighlight>
This is the full video at normal speed: [https://www.youtube.com/watch?v=YjQGph-r7jo].
This is a compilation of the 'blob' and 'LMdist' data collected; a sketch of how such data might be parsed follows the notes below.
*Each line represents the 29 data values from all the scans done at each move step.
*The last line is incomplete, as the program reached the maximum run time of the NXT.
*Detected blobs and their corresponding detected distances are highlighted in yellow.
*No blobs are detected on the right side for the first few steps because those blobs are too far away and are thus filtered out.
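As a sketch of how this data might be parsed (assuming the files have been read into matrices blob and LMdist, one row per move step and one column per scan angle, with the value 2 marking a blob that passed all the filters), the detected landmark distances can be extracted like this:

<syntaxhighlight lang="matlab">
% Minimal sketch: pull out the distances at which true blobs were detected.
% blob and LMdist are hypothetical variable names for the logged data.
detected = (blob == 2);           % logical mask of true detections
for k = 1:size(blob,1)            % each move step
    idx   = find(detected(k,:));  % scan angles with a detection
    dists = LMdist(k,idx);        % corresponding ultrasonic distances
    fprintf('step %d: %d blob(s), distances %s\n', ...
            k, numel(idx), mat2str(dists));
end
</syntaxhighlight>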
'''Thoughts:'''
*Implementing SLAM is possible, as we can see that the detected values form clusters which shift to the side as the robot moves forward.
The MATLAB code can be found on '''GitHub'''.
The result of the code is shown below.
*The black dots represent the true positions of the obstacles.
*The '*' markers represent the positions of the obstacles as detected.
*The green line is the true route of the robot from the odometry input data.
*The red line is the route estimated by the EKF.
'''Thoughts:'''
*An outlier obstacle is detected.
*The EKF implemented is fairly accurate over this number of steps.
*Newly detected obstacles at each step are correctly matched to the nearby obstacles.
====Trial 5====
I tested out the algorithm on another experimental setup as shown below.
With an adjustment to the minimum distance threshold (min_dist) for two observations to be classified as the same landmark, the results are shown below; a sketch of this association rule follows the thoughts below.
*min_dist = 11
*min_dist = 10
*min_dist = 8, 9
*min_dist = 7
*min_dist = 6
'''Thoughts:'''
*The number of observed landmarks increases as the min_dist threshold value decreases. This is expected, as observed landmarks whose mutual distance was initially well under the threshold value are now above it and are thus considered distinct landmarks.
*At a higher threshold value for min_dist, the number of observed landmarks fits the experimental setup, but there is a high possibility of wrongly associating truly distinct landmarks as the same observed landmark.
*Inaccuracy is also due to the limitations of the equipment. Below are the observed landmarks from the very first scan of the experiment. One of the observed landmarks, circled in green, is an anomaly. It may be due to overlap during scanning between the nearer landmark circled in red and the further one circled in blue. Adjustments to the various threshold values need to be made to improve the results.
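A minimal sketch of the association rule discussed above, assuming landmarks holds the positions of the landmarks observed so far (one per column; a hypothetical name) and p is a newly observed position: an observation within min_dist of an existing landmark is treated as a re-observation of it, otherwise it is registered as a distinct landmark.

<syntaxhighlight lang="matlab">
% Minimal sketch of nearest-neighbour landmark association with a min_dist
% threshold; 'landmarks' (2xN) and 'p' (2x1) are hypothetical names.
if isempty(landmarks)
    landmarks = p;                          % first landmark ever observed
else
    d = sqrt(sum((landmarks - p).^2, 1));   % distance to every known landmark
    [dmin, idx] = min(d);
    if dmin < min_dist
        matched = idx;                      % same landmark: re-observation
    else
        landmarks(:,end+1) = p;             % distinct landmark: append it
        matched = size(landmarks,2);
    end
end
</syntaxhighlight>

Lowering min_dist therefore splits borderline pairs into distinct landmarks, which matches the trend observed in the trials.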
===Going Further===
Adjustments to values like RIGHT_THRES, LEFT_THRES, AREA_THRES and EYE_TURN_ANGLE will improve the results. The optimum result will require a good balance of these values.