

Simultaneous Localization and Mapping (SLAM)

Motivation

SLAM gives the robot information about its own whereabouts and, more relevantly, the positions of features in its surroundings.

This can be essential to our project for identifying each rung during each step of the ladder ascent.

Refer here [http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-412j-cognitive-robotics-spring-2005/projects/1aslam_blas_repo.pdf] for more details on SLAM.

Learning

With reference to the webpage [http://dasl.mem.drexel.edu/~billgreen/slam/slam.html] that Prof. Oh assigned me to work on, I did the homework problems; here is my code.

Homework 1

<syntaxhighlight lang="matlab">
R=0.01;
Q=0.00001;
updatedState=-0.37727;%initial value stated in data.m
updatedCovariance=R;%initial covariance is the measured covariance, R
Vmeasured=zeros(1,50);%initialized 50 zeroes for Vmeasured
Vkalman=zeros(1,50);

for i=1:1:50

  if i~=1
  predictedState=updatedState;%A=1 and B=0, there is no input
  predictedCovariance=updatedCovariance + Q;%A=1
  innovation=voltage(i) - predictedState;%H=1
  innovationCovariance=predictedCovariance + R;
  kalmanGain=predictedCovariance * innovationCovariance^-1;
  updatedState=predictedState + kalmanGain * innovation;
  updatedCovariance=(1-kalmanGain) * predictedCovariance;%I matrix=1 here, H=1
  end
  Vkalman(i)=updatedState;%store the estimate (at i=1 this is just the initial state)

end

for i=1:1:50

  Vmeasured(i)=-0.37727;

end
… …
</syntaxhighlight>

Refer here [http://studentdavestutorials.weebly.com/kalman-filter-with-matlab-code.html] for an example of what the algorithm looks like for a simple Kalman filter problem.
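Since data.m itself is not reproduced on this page, the snippet below is a hypothetical stand-in that generates a synthetic voltage vector so the filter loop above can be run end-to-end; the seed and construction are my assumptions, not part of the homework files.

<syntaxhighlight lang="matlab">
% Hypothetical stand-in for data.m: the true signal is the constant
% -0.37727 V observed through Gaussian noise of variance R.
rng(0);                                    % reproducible noise (assumption)
R = 0.01;                                  % same R as in the filter above
voltage = -0.37727 + sqrt(R)*randn(1,50);  % 50 noisy measurements
% After running the filter loop, the three traces can be compared with:
% plot(1:50, voltage, 'b.', 1:50, Vkalman, 'g-', 1:50, Vmeasured, 'r--');
</syntaxhighlight>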

Homework 2

I personally recommend this reference [http://elib.uni-stuttgart.de/opus/volltexte/2005/2183/pdf/kleinbauer.pdf] for a better treatment of Kalman and extended Kalman filters.
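Before reading the homework code, it may help to see the bare EKF recursion it follows. This is a generic sketch with placeholder names (f, F, h, Jh are function handles), not the homework's actual variables:

<syntaxhighlight lang="matlab">
function [x, P] = ekf_step(x, P, u, z, f, F, h, Jh, Q, R)
% One generic EKF iteration. f/h are function handles for the process and
% measurement models; F/Jh return their Jacobians at the current estimate.
xPred = f(x, u);                       % state prediction
Fk = F(x, u);
PPred = Fk*P*Fk' + Q;                  % covariance prediction
Hk = Jh(xPred);
innov = z - h(xPred);                  % innovation
S = Hk*PPred*Hk' + R;                  % innovation covariance
K = PPred*Hk'/S;                       % Kalman gain
x = xPred + K*innov;                   % state update
P = (eye(length(x)) - K*Hk)*PPred;     % covariance update
end
</syntaxhighlight>

The script below specializes this recursion with a constant measurement Jacobian Jh equal to the identity, and W set to zero so the process-noise term W*Q*W' vanishes.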

<syntaxhighlight lang="matlab">
… …
%const Jacobian Matrices
V=eye(3);
Vtrans=transpose(V);
Jh=eye(3);
Jhtrans=transpose(Jh);
W=zeros(3);
Wtrans=transpose(W);

I=eye(3);

%Initialize for i=1,t=0
deltaTheta=vel*tan(phi)/L;%constant

theta=(0)*deltaTheta;%increases by const deltaTheta every timestep

Vkalman=zeros(3,N);
sysmodel=zeros(3,1);

%initialize initial state to zero
updatedState=zeros(3,1);
predictedState=zeros(3,1);
updatedCovariance=P;%as stated in qns->P(1:3,1:3) = 1;

%main iteration
for i=2:deltaT:N%t=1 and onwards

  x=deltaT*vel*cos(theta);
  y=deltaT*vel*sin(theta);
  theta=(i-1)*deltaTheta;
  input=[x;y;theta];%values are at k-1
  
  %every timestep the true position of the model changes by input
  %start from 2nd onwards
  sysmodel(1,i)=sysmodel(1,i-1)+input(1);
  sysmodel(2,i)=sysmodel(2,i-1)+input(2);
  sysmodel(3,i)=input(3);
  
  F=[1 0 -1*vel*sin(theta); 0 1 vel*cos(theta); 0 0 1];
  Ftrans=transpose(F);
  
  predictedState(1)=updatedState(1)+input(1);
  predictedState(2)=updatedState(2)+input(2);
  predictedState(3)=updatedState(3);
     
  predictedCovariance=F*updatedCovariance*Ftrans + W*Q*Wtrans;
  %initial P(k) is as stated in qns->P(1:3,1:3) = 1;
  temp=Jh*predictedState;
  
  innovation=[z(1,i)-temp(1);
              z(2,i)-temp(2);
              z(3,i)-temp(3)];
  %actual measurement - H*predicted state
  
  innovationCovariance=Jh*predictedCovariance*Jhtrans + V*R*Vtrans;
  kalmanGain=(predictedCovariance * Jhtrans)/ innovationCovariance;
  updatedState=predictedState+kalmanGain*innovation;
  updatedCovariance=(I-kalmanGain*Jh) * predictedCovariance;
  Vkalman(1,i)=updatedState(1);
  Vkalman(2,i)=updatedState(2);
  Vkalman(3,i)=updatedState(3)+input(3);
  %the updated theta state works on the difference between measured and actual,
  %so the input value needs to be added back onto that worked difference

end

for i=1:deltaT:N
  z(3,i)=z(3,i)*360/2/pi;
  Vkalman(3,i)=Vkalman(3,i)*360/2/pi;
  sysmodel(3,i)=sysmodel(3,i)*360/2/pi;
end

%graphs
… …
</syntaxhighlight>
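The %graphs section is elided above; purely as a sketch of what it might contain (assuming the variables z, Vkalman, sysmodel and N defined earlier in the script), one possible epilogue is:

<syntaxhighlight lang="matlab">
% Possible plotting epilogue (assumes z, Vkalman, sysmodel and N from
% earlier in the script, with headings already converted to degrees).
figure; hold on;
plot(1:N, z(3,1:N), 'b.');          % measured heading
plot(1:N, Vkalman(3,1:N), 'g-');    % EKF estimate
plot(1:N, sysmodel(3,1:N), 'r--');  % model/odometry only
legend('measured','EKF estimate','model');
xlabel('timestep'); ylabel('\theta (degrees)');
</syntaxhighlight>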

Homework 4

*slam.m
<syntaxhighlight lang="matlab">

  xest(1,k) = xest(1,k-1) + dt*Vc*cos(phi) - dt*Vc/L*tan(alpha)*(a*sin(phi)+b*cos(phi));
  xest(2,k) = xest(2,k-1) + dt*Vc*sin(phi) + dt*Vc/L*tan(alpha)*(a*cos(phi)-b*sin(phi));
  xest(3,k) = phi + dt*Vc/L*tan(alpha);

… …

  A(1,1)=??; A(1,2)=??; A(1,3)=??;
  A(2,1)=??; A(2,2)=??; A(2,3)=??; 
  A(3,1)=??; A(3,2)=??; A(3,3)=??;

… …

  A(1,1)=1; A(1,2)=0; A(1,3)= Vc*dt*(-1)* (sin(phi) + tan(alpha)/L*(a*cos(phi)-b*sin(phi)));
  A(2,1)=0; A(2,2)=1; A(2,3)=Vc*dt*(cos(phi) - tan(alpha)/L*(a*sin(phi)+b*cos(phi))); 
  A(3,1)=0; A(3,2)=0; A(3,3)=1;
  Pest = ??;

</syntaxhighlight>
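As a sanity check on the filled-in Jacobian, the analytic entries of A can be compared against finite differences of the vehicle model. The parameter values below (Vc, L, a, b, dt, alpha) are illustrative assumptions, not the course dataset's values:

<syntaxhighlight lang="matlab">
% Finite-difference check of the analytic Jacobian A filled in above.
% All parameter values here are illustrative assumptions.
Vc = 3; L = 2.83; a = 0.95; b = 0.5; dt = 0.025; alpha = 0.1;
f = @(x) [x(1) + dt*Vc*cos(x(3)) - dt*Vc/L*tan(alpha)*(a*sin(x(3))+b*cos(x(3)));
          x(2) + dt*Vc*sin(x(3)) + dt*Vc/L*tan(alpha)*(a*cos(x(3))-b*sin(x(3)));
          x(3) + dt*Vc/L*tan(alpha)];
x0 = [1; 2; 0.3]; h = 1e-7; Anum = zeros(3);
for j = 1:3
    e = zeros(3,1); e(j) = h;
    Anum(:,j) = (f(x0+e) - f(x0))/h;      % numerical df/dx, column j
end
phi = x0(3);
A = [1 0 -Vc*dt*(sin(phi) + tan(alpha)/L*(a*cos(phi)-b*sin(phi)));
     0 1  Vc*dt*(cos(phi) - tan(alpha)/L*(a*sin(phi)+b*cos(phi)));
     0 0  1];
disp(max(abs(A(:)-Anum(:))));             % should be near zero (~1e-7)
</syntaxhighlight>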

*new_state.m
<syntaxhighlight lang="matlab">
… …

  x = xest(1,k)+zlaser(1)*cos(xest(3,k)-pi/2+zlaser(2));
  y = xest(2,k)+zlaser(1)*sin(xest(3,k)-pi/2+zlaser(2));

… …
</syntaxhighlight>
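For context, new_state.m above is the inverse observation model; the forward model that the update files linearize predicts a range and bearing from the vehicle pose and a stored landmark. Below is a sketch using the same -pi/2 offset convention; it is my reconstruction, not the course's listing:

<syntaxhighlight lang="matlab">
function zhat = observe_landmark(xv, yv, phi, xl, yl)
% Predict the laser (range, bearing) measurement of landmark (xl,yl)
% from vehicle pose (xv,yv,phi), mirroring the -pi/2 offset above.
dx = xl - xv;
dy = yl - yv;
r = sqrt(dx^2 + dy^2);                % predicted range
b = atan2(dy, dx) - phi + pi/2;       % predicted bearing
zhat = [r; b];
end
% Differentiating [r; b] with respect to (xv, yv, phi) yields exactly the
% vehicle-pose block of Jh used below: [-dx/r, -dy/r, 0; dy/r^2, -dx/r^2, -1].
</syntaxhighlight>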

*updateNew.m and updateExisting.m
<syntaxhighlight lang="matlab">
… …

  Jh(1,1) = -dx/r;   Jh(1,2) = -dy/r;   Jh(1,3) = 0;
  Jh(2,1) = dy/(r^2);   Jh(2,2) = -dx/(r^2);   Jh(2,3) = -1;
  Jh(1,numStates-1) = 0;     Jh(1,numStates) = 0;
  Jh(2,numStates-1) = 0;     Jh(2,numStates) = 0;

… …

  K = Pest*Jh'/(Jh*Pest*Jh'+R);
  xest(:,k) = xest(:,k)+K*innov;
  xest(3,k) = normalizeAngle(xest(3,k));
  I = eye(size(K*Jh));
  Pest = (I-K*Jh)*Pest;

… …
</syntaxhighlight>
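normalizeAngle is not reproduced in the snippet above; a common minimal implementation, which I am assuming rather than quoting, wraps the heading into [-pi, pi):

<syntaxhighlight lang="matlab">
function a = normalizeAngle(a)
% Assumed implementation: wrap an angle in radians into [-pi, pi).
a = mod(a + pi, 2*pi) - pi;
end
</syntaxhighlight>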

Implementation

NXT + Ultrasonic Sensor

I will be implementing a pseudo “SLAM” on the NXT.

I will be using the ultrasonic sensor to detect obstacle distances from the robot.

Since this sensor can only detect one object at a time, it cannot do any updating and correction, which is the second step of the SLAM process.

I will just plot a map of the route taken by the NXT and of the experimental area using the data retrieved.
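Concretely, mapping then reduces to projecting each range reading into the world frame from the current pose estimate, which is the same geometry the MATLAB processing script further below applies per sample:

<syntaxhighlight lang="matlab">
% Minimal sketch: project one ultrasonic range reading into world
% coordinates from the current pose estimate. The pose values reuse the
% script's starting pose; the range reading itself is hypothetical.
x = 61.7; y = 38.1; phi = 0;   % pose estimate (cm, cm, rad)
dist = 42;                     % hypothetical range reading in cm
mapX = x + dist*cos(phi);      % obstacle x in the world frame
mapY = y + dist*sin(phi);      % obstacle y in the world frame
</syntaxhighlight>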

I designed the robot such that the ultrasonic sensor is along the axle of the motors driving the wheels.

[Images: robot design]

[Image: picture of the experimental setup]

This is the "nxc" code for the whole procedure.
<syntaxhighlight lang="c">
#define DIST 30
#define EX_TIME 60000 //experiment time
#define REV_POWER 50
#define TURN_ANGLE 139 //turn 30 deg about left wheel
#define FWD_POWER 50
#define COUNT 2 //number of times to turn
#define FILE_NAME "distance.dat"
#define SPD_FILE_NAME "distFwd.dat"
#define ANG_FILE_NAME "angle.dat"
#define FILE_SIZE 2048

int distFwd=0,angle=0,count=COUNT+1;
byte handle=0,distfwd_handle=0,angle_handle=0;

inline void turnleft()
{
  RotateMotor(OUT_C,REV_POWER,TURN_ANGLE);
  angle++;//on every turn, angle increases
  distFwd=0;
  count++;
}

inline void fwd()
{
  RotateMotor(OUT_AC,FWD_POWER,180);//move forward by 0.5*circumference of wheel
  distFwd=1;
}

task main()
{
  //declarations
  SetSensorLowspeed(S4);
  int dist;
  long t, initTime;
  DeleteFile(FILE_NAME);
  DeleteFile(SPD_FILE_NAME);
  DeleteFile(ANG_FILE_NAME);
  CreateFile(FILE_NAME,FILE_SIZE,handle);
  CreateFile(SPD_FILE_NAME,FILE_SIZE,distfwd_handle);
  CreateFile(ANG_FILE_NAME,FILE_SIZE,angle_handle);
  //MAIN LOOP
  string s;
  initTime = CurrentTick();//start-of-run timestamp
  while(CurrentTick() - initTime <= EX_TIME)
  {
    //keeping track of time
    t = CurrentTick() - initTime;
    dist = SensorUS(S4);
    if (dist > 200) dist = 255;//objects that are too far have great inaccuracy,
    //so I just standardized their value above a certain threshold
    s = NumToStr(dist);
    WriteLn(handle, s);//save distance sensed
    s = NumToStr(distFwd);
    WriteLn(distfwd_handle, s);//save forward or not from the previous step
    s = NumToStr(angle);
    WriteLn(angle_handle, s);//save angle
    if ((dist <= DIST) && (count < COUNT))
    {
      turnleft();
    }
    else if (dist <= DIST && count >= COUNT)//obstacle after fwd
    {
      count = 0;
      turnleft();
    }
    else if (dist > DIST && count < COUNT)//obstacle disappears for the 1st time
    {
      turnleft();
    }
    //the COUNT is there to ensure that the NXT has turned an angle big enough
    //to account for its width
    else if (dist > DIST && count >= COUNT)
    {
      fwd();
    }
  }
  Off(OUT_AC);
  CloseFile(handle);
  CloseFile(distfwd_handle);
  CloseFile(angle_handle);
}
</syntaxhighlight>
This is the "matlab" code for the data processing part. Parts used for updating in the Extended Kalman Filter are commented out, since there is no update step in this pseudo "SLAM" experiment.
<syntaxhighlight lang="matlab">
%VicExperiment
clear all; clc;

%%%%%%%%COLLECTING DATA%%%%%%%%%%%%%
fid = fopen('distance.dat');%read file
fid_time = fopen('time.dat');
fid_x = fopen('distFwd.dat');
fid_angle = fopen('angle.dat');
Dist = fscanf(fid, '%f %*c',inf);
Time = fscanf(fid_time, '%f %*c');
X = fscanf(fid_x, '%d %*c');
Angle = fscanf(fid_angle, '%f %*c');
fclose(fid); fclose(fid_time); fclose(fid_x); fclose(fid_angle);
%%%%%%%%COLLECTING DATA%%%%%%%%%%%%%

% declarations
r = 12.7/2;%distance from wheel to sensor position along axle
chord = sqrt(2*r*r-2*r*r*cos(pi/6));%linear distance between sensor positions on each 30 deg turn
%used for calculating position of sensor when NXT turns
%NXT will not turn about sensor position
%so have to account for offset from the turning point (left wheel)

%set matrices
% EKF components used in multiple functions
global xest Pest A Q W
global numStates k
xest = zeros(3,1); % state matrix
% Pest = zeros(3,3); % covariance matrix
numStates=3; % initially, there are no landmarks
%A = zeros(3,3); % process model jacobian/state transition
%W = eye(3); % process noise jacobian//noise to predicted state
%Q = [.49 0 0; 0 .49 0; 0 0 (7*pi/180)^2];%noise to predicted state
%Jh = zeros(2,3); % sensor model jacobian
K = zeros(3,3); % kalman gain
t = Time(1); %1st time in time.dat
k = 1; % loop iteration counter

% initialize estimates
xest(1,1) = 61.7; % x position of vehicle
xest(2,1) = 38.1; % y position of vehicle
xest(3,1) = 0; % phi of vehicle
% Pest(:,:) = [0.1, 0, 0; 0, 0.1, 0; 0, 0, 15*pi/180];
map = zeros(2,62);
map(1,1) = xest(1,1) + Dist(1);%1st obstacle sensed
map(2,1) = xest(2,1);

% initial inputs
angle = 0;
move = 0.5*5.6*pi;

%%EKF
while k<62%number of data collected
    k=k+1;
    t=Time(k);
    dt=t-Time(k-1);
    %v and angle will change before loop again
    %every time xest(3,k-1)>phi,xest(1,k)
    phi = xest(3,k-1)/360*2*pi; % in rad
    if (move ~= 0)
        xest(1,k) = xest(1,k-1) + move*cos(phi);% x prediction of NXT US sensor position if it moved forward
        xest(2,k) = xest(2,k-1) + move*sin(phi);% y prediction of NXT US sensor position if it moved forward
        xest(3,k) = xest(3,k-1); % NXT orientation in degrees (no change)
    else
        xest(1,k) = xest(1,k-1) + chord*cos(phi);% x prediction of NXT US sensor if it turned
        xest(2,k) = xest(2,k-1) + chord*sin(phi);% y prediction of NXT US sensor if it turned
        xest(3,k) = angle; % NXT orientation in degrees (changes)
    end

    % % Calculate Jacobian A based on vehicle model (df/dx)
    % A(1,1)=1; A(1,2)=0; A(1,3) = move*sin(phi);
    % A(2,1)=0; A(2,2)=1; A(2,3) = -move*cos(phi);
    % A(3,1)=0; A(3,2)=0; A(3,3) = 1;
    % % rest of A (which is just 2 null matrices and 1 identity) is added in new_state function
    %
    % % Calculate prediction for covariance matrix
    % Pest = A*Pest*A' + W*Q*W';%Using initial Pest for 1st timestep

    %update v and angle/alpha from speed.dat and angle.dat
    move = X(k)*0.5*5.6*pi;%diameter of wheel is 5.6cm
    angle = Angle(k)*30;%every turn upon sensing a close obstacle is 30 degrees

    % if Dist(k)>200
    %     Dist(k)%do something if the sensed wall is too far?
    % end
    if Dist(k)~= 255
        map(1,k) = xest(1,k) + Dist(k)*cos(phi);
        map(2,k) = xest(2,k) + Dist(k)*sin(phi);
    else
        map(1,k) = xest(1,k-1);
        map(2,k) = xest(2,k-1);
    end
end

mapOutline=[0,105.4,127,127,39.4,0,0;
            0,0,25.4,101.6,101.6,80,0];%plots experimental boundary
hold all;
%plot(xest(3,1:k));
plot(xest(1,1:k),xest(2,1:k),'green'); % robot position (model)
axis([0 127 0 101.6]);
hold on;
plot(map(1,1:k),map(2,1:k),'*'); % map of what robot saw
axis([0 127 0 101.6]);
plot(mapOutline(1,:),mapOutline(2,:),'red');
</syntaxhighlight>
This is the result in Matlab. The red line represents the experimental boundary, the green line represents the route taken by the robot WITHOUT corrections, and the blue dots represent the detected obstacles at the respective parts of the route.

[Image: SLAM1.jpg]

<syntaxhighlight lang="c">
… …
inline void delay()
{
	Wait(DELAY);
}

sub open_for_write()
{
	DeleteFile(TEST_FILE_NAME);
	CreateFile(TEST_FILE_NAME,TEST_FILE_SIZE,test_handle);
}

task main()
{
	open_for_write();
	int init;
	init = NXTCam_Init(camPort, CAMADDR);//initialize
	NXTCam_SendCommand(camPort, CAMADDR, 'E');//enable tracking
	//+++++++++++++SCANNING++++++++++//
	int bluecount=1,whitecount=1;
	s="eyeFrontAng=";
	s+=NumToStr(eyeFrontAng);
	TextOut(0,LCD_LINE6,s,false);
	delay();
	//++++++++++detect black landmark++++++++++//
	RotateMotor(EYE,EYE_POWER,-EYE_LEFT);
	eyeCurAng=-EYE_LEFT;
	while (eyeCurAng<EYE_RIGHT)
	{
		delay();
		ClearScreen();
		s="eyeCurAng=";
		s+=NumToStr(eyeCurAng);
		WriteLn(test_handle,s);
		TextOut(0,LCD_LINE7,s,false);
		NXTCam_SendCommand(camPort, CAMADDR, 'B'); // object tracking
		NXTCam_SendCommand(camPort, CAMADDR, 'E'); // enable tracking
		delay(); delay(); delay();
		NXTCam_GetBlobs(camPort, CAMADDR, nblobs, color, left, top, right, bottom);
		//saving details of each blob
		for (colorCount=0;colorCount<nblobs;colorCount++)
		{
			s="color:";
			s+=NumToStr(color[colorCount]);
			s+=" left:";
			s+=NumToStr(left[colorCount]);
			s+=" right:";
			s+=NumToStr(right[colorCount]);
			s+=" top:";
			s+=NumToStr(top[colorCount]);
			s+=" btm:";
			s+=NumToStr(bottom[colorCount]);
			area = (right[colorCount]-left[colorCount])*(bottom[colorCount]-top[colorCount]);
			s+=" area:";
			s+=NumToStr(area);
			WriteLn(test_handle,s);
		}
		delay();
		RotateMotor(EYE,EYE_POWER,EYE_TURN_ANGLE);
		eyeCurAng+=EYE_TURN_ANGLE;
	}//end while
	RotateMotor(EYE,EYE_POWER,-EYE_RIGHT);
	NXTCam_SendCommand(camPort, CAMADDR, 'D'); // disable tracking
}
</syntaxhighlight>

Things to take note of:

*Perform #include on only one of the "nxcCam" files (i.e. nxtcamlib.nxc, nxtcamlib-default.nxc or nxtcamlib-common.nxc), otherwise there will be "same identifier" errors from having the same identifiers in multiple files.
*The example tutorial camtest-v3.nxc does #include nxtcamlib-default.nxc, where the NXTCam_GetBlobs function takes 8 arguments. If you #include the nxtcamlib.nxc file instead, it takes only 7 arguments.
*Using a sub() to process the results from the NXTCam_GetBlobs function will not work. Process the values inside the main loop itself.
*Do not disable the camera until the end of the program; the camera cannot be enabled again once it has been disabled during the runtime of the program.
*PosRegSetAng() cannot be used once RotateMotor() has been used.
*The NXT's runtime limit for a program is 10 minutes. The NXT switches off after that length of run time regardless of whether the program has completed.

Below are the collected figures from the program.

[Image: coordinatesNAreas.jpg]

<syntaxhighlight lang="c">
… …
#define LM_DIST_FILE_NAME "LMdist.csv"
#define FILE_SIZE 1048
#define DEFAULT 255
byte blob_handle=0,distFwd_ang_handle=0,LMdist_handle=0;
#define DIST_THRES 45 //only register landmark distance if near enough
#define AREA_THRES 1200
#define MAX_AREA_THRES 20000
#define LEFT_THRES 46
#define RIGHT_THRES 130
#define BLOBHEIGHT 60
string sblob,sLMdist,sdistFwdAng;
//for getBlob()
int color[10];
int left[10];
int top[10];
int right[10];
int bottom[10];
int nblobs;
//global variable
long eyeCurAng;
int journey=0;//journey++ after every scan and move
#define END_OF_JOURNEY 7 //experiment steps

inline void delay()
{

	Wait(DELAY);
}

sub open_for_write()
{
	DeleteFile(BLOB_FILE_NAME);
	DeleteFile(LM_DIST_FILE_NAME);
	DeleteFile(DISTFWD_ANG_FILE_NAME);
	CreateFile(BLOB_FILE_NAME,FILE_SIZE,blob_handle);
	CreateFile(DISTFWD_ANG_FILE_NAME,FILE_SIZE,distFwd_ang_handle);
	CreateFile(LM_DIST_FILE_NAME,FILE_SIZE,LMdist_handle);
}

sub fwd()
{
	RotateMotorPID(BOTH_MOTOR,WHEEL_POWER,WHEELS_TURN_ANGLE,PID_1,PID_5,PID_0);
	delay();//give time for correction of angle by PID
	distFwd=1;
	angle=0;
}

task main()
{

PosRegSetMax(EYE, MAX_POWER, MAX_ACC);
PosRegEnable (EYE, PID_1, PID_5, PID_0);
SetSensorLowspeed(US);
open_for_write();
int init,blobHeight,detected=0;
init = NXTCam_Init(camPort, CAMADDR);//initialize
NXTCam_SendCommand(camPort, CAMADDR, 'E'); // enable tracking
while(journey<END_OF_JOURNEY)
{
	sdistFwdAng=NumToStr(angle);//record state of robot
	sdistFwdAng+=" ";
	sdistFwdAng+=NumToStr(distFwd);
	WriteLn(distFwd_ang_handle,sdistFwdAng);
	//+++++++++++++++SCANNING++++++++++//
	PosRegSetAngle (EYE, -EYE_LEFT);//set the initial starting angle for the eye of the robot
	delay();//give time for correction of angle by PID
	eyeCurAng=-EYE_LEFT;//save current angle of eye
	detected=0;//reset variables before scanning loop
	while (eyeCurAng<=EYE_RIGHT)//per scan
	{
		dist=SensorUS(US);
		detected=0;//reset variables before scanning the next angle
		NXTCam_SendCommand(camPort, CAMADDR, 'B');  // object tracking
		NXTCam_SendCommand(camPort, CAMADDR, 'E'); // enable tracking
		delay();//give time to stabilise blob detection
		NXTCam_GetBlobs(camPort, CAMADDR, nblobs, color, left, top, right, bottom);
		//blobFilter
		//saving details of each blob
		if (nblobs!=0)//if blob of the designated color is detected
		{
			for (colorCount=0;colorCount<nblobs;colorCount++)//each blob of the designated color that is detected
			{
				area = abs(right[colorCount]-left[colorCount])*abs(bottom[colorCount]-top[colorCount]);
				//calculate area of each blob detected in this angle
				mid = (right[colorCount]+left[colorCount])/2;
				//calculate mid point of each blob detected in this angle
				blobHeight = abs(bottom[colorCount]-top[colorCount]);
				//calculate height of each blob
				if (area>AREA_THRES && area<MAX_AREA_THRES//if blob has a big enough area, and is not too big due to error in detection
					&& blobHeight>BLOBHEIGHT//if blob is tall enough
					&& mid<RIGHT_THRES && mid>LEFT_THRES//if blob lies in the middle of the camera view
					&& dist < DIST_THRES//if blob detected is not too far to be invalid
					&& detected==0) //take in only one blob that passes the filters
				{
					sblob=NumToStr(2);
					Write(blob_handle,sblob);//save value 2 for a true blob detected at this particular angle
					detected++;
				}
			}//endfor
		}//endif
	if (detected==0)//if no true blob was recognised at this angle
	{
		sblob=NumToStr(1);
		Write(blob_handle,sblob);//save value 1 for no true blob detected at this particular angle
	}
	sLMdist=NumToStr(dist);
	Write(LMdist_handle,sLMdist);//save distance detected at this particular angle
	PosRegAddAngle(EYE, EYE_TURN_ANGLE);
	delay();
	eyeCurAng+=EYE_TURN_ANGLE;
	}//end while
	sblob="";
	WriteLn(blob_handle,sblob);//after full 140 degrees scan completed, goes to next line
	sLMdist="";
	WriteLn(LMdist_handle,sLMdist);//after full 140 degrees scan completed, goes to next line
	//+++++++++++end of 1 SCAN+++++++++++//
	PosRegSetAngle (EYE, ORIGIN);//return to front
	delay();//give time for correction of angle by PID
	//++++++obstacle avoidance+++++++++//
	fwd();//move forward after full 140 degrees scan
	journey++;
}//endwhile

	NXTCam_SendCommand(camPort, CAMADDR, 'D');//disable tracking
}
</syntaxhighlight>

This is the full video at normal speed. [Video: YjQGph-r7jo]

This is a compilation of the 'blob' and 'LMdist' data collected.

[Image: compilation of the 'blob' and 'LMdist' data]

*Each line represents the 29 data points from the scans done at each move step.
*The last line is incomplete because the NXT reached its maximum run time.
*Detected blobs and their corresponding detected distances are highlighted in yellow.
*There are no blobs detected on the right side for the first few steps because those blobs are too far away and are thus filtered out.

Thoughts:

*Implementing SLAM is possible, as we can see that the detected values are in clusters and that they shift to the side as the robot moves forward.

The matlab code can be found here. [GITHUB]

The result of the code is shown below.

[Image: EKF result plot]

*The black dots represent the true positions of the obstacles.
*The '*' markers represent the positions of the detected obstacles.
*The green line is the true route of the robot from the odometry input data.
*The red line is the route estimated by the EKF.

Thoughts:

*One outlier obstacle was detected.
*The EKF implemented is fairly accurate for this number of steps.
*Newly detected obstacles at each step are correctly matched to the nearby obstacles.

Trial 5

I tested the algorithm on another experimental setup, shown below.

[Image: Trial 5 experimental setup]

With adjustments to the minimum distance threshold below which landmarks are classified as the same landmark, the results are shown below.

*min_dist=11 [image]
*min_dist=10 [image]
*min_dist=8,9 [image]
*min_dist=7 [image]
*min_dist=6 [image]

Thoughts:

*The number of observed landmarks increases as the min_dist threshold value decreases. This is expected, as pairs of observed landmarks whose separation was initially just under the threshold value now fall above it and are thus considered distinct landmarks.
*At a higher threshold value for min_dist, the number of observed landmarks fits the experimental setup, but there is a high possibility of wrongly associating truly distinct landmarks as the same observed landmark.
*Inaccuracy is also due to the limitations of the equipment.

Below are the observed landmarks from the very first scan of the experiment. One of the observed landmarks is an anomaly, circled in green. It may be due to overlap during scanning between the nearer landmark circled in red and the further one circled in blue. Adjustments to the various threshold values need to be made to improve the results.

[Image: observed landmarks from the first scan]

Going Further

Adjustments to values like RIGHT_THRES, LEFT_THRES, AREA_THRES and EYE_TURN_ANGLE will improve the results. The optimum result will require a good balance of these values.
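Since the min_dist rule is the crux of Trial 5, here is a hedged MATLAB sketch of that nearest-neighbour association; the function name and structure are illustrative and not taken from the actual GitHub code:

<syntaxhighlight lang="matlab">
function [idx, landmarks] = associate(obs, landmarks, min_dist)
% Nearest-neighbour association: match observation obs (2x1 position) to
% the closest stored landmark (columns of the 2xM matrix landmarks) if it
% is within min_dist; otherwise register obs as a brand-new landmark.
if isempty(landmarks)
    landmarks = obs; idx = 1; return;
end
d = sqrt(sum((landmarks - repmat(obs, 1, size(landmarks,2))).^2, 1));
[dmin, idx] = min(d);                 % closest existing landmark
if dmin > min_dist                    % too far from everything known:
    landmarks = [landmarks, obs];     % add as a distinct landmark
    idx = size(landmarks, 2);
end
end
</syntaxhighlight>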
