
Simultaneous Localization and Mapping (SLAM)

Motivation

SLAM gives the robot information about its own whereabouts and, more relevantly, the positions of features in its surroundings.

This can be essential to our project for identifying each rung during each step of the ladder ascent.

Refer here for more details of SLAM.

Learning

With reference to the webpage assigned to me by Prof Oh, I completed the homework assignments; here is my code.

Homework 1

homework1.m
R=0.01;
Q=0.00001;
updatedState=-0.37727;%initial value stated in data.m
updatedCovariance=R;%initial covariance is the measured covariance, R
Vmeasured=zeros(1,50);%initialized 50 zeroes to Vmeasured
Vkalman=zeros(1,50);
 
for i=1:1:50
    if i~=1
    predictedState=updatedState;%A=1 and B=0, there is no input
    predictedCovariance=updatedCovariance + Q;%A=1
    innovation=voltage(i) - predictedState;%H=1
    innovationCovariance=predictedCovariance + R;
    kalmanGain=predictedCovariance * innovationCovariance^-1;
    updatedState=predictedState + kalmanGain * innovation;
    updatedCovariance=(1-kalmanGain) * predictedCovariance;%I matrix=1 here, H=1
    Vkalman(i)=updatedState;
    end
end
 
for i=1:1:50
    Vmeasured(i)=-0.37727;
end
...
...

Refer here for an example of how the algorithm looks for a simple Kalman filter problem.
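
For reference, the scalar Kalman filter recursion that homework1.m implements (state transition $A = 1$, no input, measurement matrix $H = 1$) is:

$$\hat{x}_k^- = \hat{x}_{k-1}, \qquad P_k^- = P_{k-1} + Q$$
$$K_k = \frac{P_k^-}{P_k^- + R}, \qquad \hat{x}_k = \hat{x}_k^- + K_k\,(z_k - \hat{x}_k^-), \qquad P_k = (1 - K_k)\,P_k^-$$

where $z_k$ is the k-th voltage measurement, $Q$ the process noise variance, and $R$ the measurement noise variance.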

Homework 2

I personally recommend here for a better reference on Kalman and extended Kalman filters.

homework2.m
...
...
%const Jacobian Matrices
V=eye(3);
Vtrans=transpose(V);
Jh=eye(3);
Jhtrans=transpose(Jh);
W=zeros(3);
Wtrans=transpose(W);
 
I=eye(3);
 
%%%%%Initialize for i=1,t=0%%%%%
deltaTheta= vel*tan(phi)/L;%constant
 
theta=(0)*deltaTheta;%increases by const deltaTheta every timestep
 
Vkalman=zeros(3,N);
sysmodel=zeros(3,1);
 
%initialize state to zero
updatedState=zeros(3,1);
predictedState=zeros(3,1);
updatedCovariance=P;%as stated in qns->P(1:3,1:3) = 1;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 
%main iteration
for i=2:deltaT:N%t=1 and onwards
    x=deltaT*vel*cos(theta);
    y=deltaT*vel*sin(theta);
    theta=(i-1)*deltaTheta;
    input=[x;y;theta];%values are at k-1
 
    %every timestep the true position of the model changes by input
    %start from 2nd onwards
    sysmodel(1,i)=sysmodel(1,i-1)+input(1);
    sysmodel(2,i)=sysmodel(2,i-1)+input(2);
    sysmodel(3,i)=input(3);
 
    F=[1 0 -1*vel*sin(theta); 0 1 vel*cos(theta); 0 0 1];
    Ftrans=transpose(F);
 
    predictedState(1)=updatedState(1)+input(1);
    predictedState(2)=updatedState(2)+input(2);
    predictedState(3)=updatedState(3);
 
    predictedCovariance=F*updatedCovariance*Ftrans + W*Q*Wtrans;
    %initial P(k) is as stated in qns->P(1:3,1:3) = 1;
 
    temp=Jh*predictedState;
 
    innovation=[z(1,i)-temp(1);
                z(2,i)-temp(2);
                z(3,i)-temp(3)];
    %actual measurement - H*predicted state
 
    innovationCovariance=Jh*predictedCovariance*Jhtrans + V*R*Vtrans;
 
    kalmanGain=(predictedCovariance * Jhtrans)/ innovationCovariance;
    updatedState=predictedState+kalmanGain*innovation;
    updatedCovariance=(I-kalmanGain*Jh) * predictedCovariance;
 
    Vkalman(1,i)=updatedState(1);
    Vkalman(2,i)=updatedState(2);
    Vkalman(3,i)=updatedState(3)+input(3);
    %updated state of theta work on the difference between measured and actual
    %need add back the input value onto the worked difference
end
 
 
for i=1:deltaT:N
z(3,i)=z(3,i)*360/2/pi;
Vkalman(3,i)=Vkalman(3,i)*360/2/pi;
sysmodel(3,i)=sysmodel(3,i)*360/2/pi;
end
 
%graphs
...
...
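
For reference, the loop above is the standard extended Kalman filter recursion; in the notation of the code (F the process-model Jacobian, Jh the measurement-model Jacobian, W and V the noise Jacobians):

$$\hat{x}_k^- = f(\hat{x}_{k-1}, u_k), \qquad P_k^- = F P_{k-1} F^\top + W Q W^\top$$
$$\nu_k = z_k - h(\hat{x}_k^-), \qquad S_k = J_h P_k^- J_h^\top + V R V^\top$$
$$K_k = P_k^- J_h^\top S_k^{-1}, \qquad \hat{x}_k = \hat{x}_k^- + K_k \nu_k, \qquad P_k = (I - K_k J_h) P_k^-$$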

Homework 4

slam.m

slam.m
    xest(1,k) = xest(1,k-1) + dt*Vc*cos(phi) - dt*Vc/L*tan(alpha)*(a*sin(phi)+b*cos(phi));
    xest(2,k) = xest(2,k-1) + dt*Vc*sin(phi) + dt*Vc/L*tan(alpha)*(a*cos(phi)-b*sin(phi));
    xest(3,k) = phi + dt*Vc/L*tan(alpha);
...
...
    A(1,1)=??; A(1,2)=??; A(1,3)=??;
    A(2,1)=??; A(2,2)=??; A(2,3)=??; 
    A(3,1)=??; A(3,2)=??; A(3,3)=??;
...
...
    A(1,1)=1; A(1,2)=0; A(1,3)= Vc*dt*(-1)* (sin(phi) + tan(alpha)/L*(a*cos(phi)-b*sin(phi)));
    A(2,1)=0; A(2,2)=1; A(2,3)=Vc*dt*(cos(phi) - tan(alpha)/L*(a*sin(phi)+b*cos(phi))); 
    A(3,1)=0; A(3,2)=0; A(3,3)=1;
    Pest = ??;
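
The Pest line above is left for the student to fill in. A plausible completion, assuming the standard EKF covariance prediction (the same form appears, commented out, in the implementation code further below), is:

Pest = A*Pest*A' + Q;   % predicted covariance; use A*Pest*A' + W*Q*W' instead if a noise Jacobian W is defined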
  • new_state.m
new_state.m
...
...
    x = xest(1,k)+zlaser(1)*cos(xest(3,k)-pi/2+zlaser(2));
    y = xest(2,k)+zlaser(1)*sin(xest(3,k)-pi/2+zlaser(2));
...
...
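
These two lines are the inverse observation model: given a laser return with range zlaser(1) and bearing zlaser(2) (the -pi/2 offset suggests the bearing is measured from the vehicle's right-hand side), the landmark's global position is

$$x_L = x_v + r\cos\!\left(\phi - \tfrac{\pi}{2} + b\right), \qquad y_L = y_v + r\sin\!\left(\phi - \tfrac{\pi}{2} + b\right)$$

where $(x_v, y_v, \phi)$ is the current pose estimate, $r$ the range and $b$ the bearing.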

updateNew.m and updateExisting.m

updatenewandexisting.m
...
...
    Jh(1,1) = -dx/r;   Jh(1,2) = -dy/r;   Jh(1,3) = 0;
    Jh(2,1) = dy/(r^2);   Jh(2,2) = -dx/(r^2);   Jh(2,3) = -1;
 
    Jh(1,numStates-1) = 0;     Jh(1,numStates) = 0;
    Jh(2,numStates-1) = 0;     Jh(2,numStates) = 0;
...
...
    K = Pest*Jh'/(Jh*Pest*Jh'+R);
    xest(:,k) = xest(:,k)+K*innov;
    xest(3,k) = normalizeAngle(xest(3,k));
    I = eye(size(K*Jh));
    Pest = (I-K*Jh)*Pest;
...
...
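
The Jacobian Jh above corresponds to a range-bearing measurement model. With $dx = x_L - x_v$, $dy = y_L - y_v$ and $r = \sqrt{dx^2 + dy^2}$, the predicted measurement is (up to the fixed bearing offset used in the code)

$$h(x) = \begin{bmatrix} \sqrt{dx^2 + dy^2} \\ \operatorname{atan2}(dy, dx) - \phi \end{bmatrix}$$

and differentiating with respect to the vehicle states $(x_v, y_v, \phi)$ gives exactly the entries used above: $-dx/r$, $-dy/r$, $0$ for the range row and $dy/r^2$, $-dx/r^2$, $-1$ for the bearing row.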

Implementation

NXT + Ultrasonic Sensor

I will be implementing a pseudo “SLAM” on the NXT.

I will be using the ultrasonic sensor to detect obstacle distances from the robot.

Since this sensor can only detect one object at a time, it cannot do any updating or correction, which is the second half of the SLAM process.

I will just plot a map of the route taken by the NXT and the experimental area using the data retrieved.

I designed the robot such that the Ultrasonic Sensor is along the axle of the motor driving the wheels.

2013-03-30_19.02.55.jpg
2013-03-30_19.02.06.jpg

Picture of the experimental setup.

2013-04-03_08.01.01.jpg

This is the NXC code for the whole procedure.

homework4.c
#define DIST 30
#define EX_TIME 60000//experiment time
#define REV_POWER 50
#define TURN_ANGLE 139 //turn 30 deg abt left wheel
#define FWD_POWER 50
#define COUNT 2//number of times to turn,formula=
#define FILE_NAME "distance.dat"
#define SPD_FILE_NAME "distFwd.dat"
#define ANG_FILE_NAME "angle.dat"
#define FILE_SIZE 2048
 
int distFwd=0,angle=0,count=COUNT+1;
byte handle=0,distfwd_handle=0,angle_handle=0;
long initTime,t;
 
inline void turnleft()
{
  RotateMotor(OUT_C,REV_POWER,TURN_ANGLE);
  angle++;//on every turn, angle increase
  distFwd=0;
  count++;
}
 
inline void fwd()
{
  RotateMotor(OUT_AC,FWD_POWER,180);//move forward by 0.5*circumference of wheel
  distFwd=1;
}
 
task main()
{
  //declarations
  SetSensorLowspeed(S4);
 
  int dist;
  DeleteFile(FILE_NAME);
  DeleteFile(SPD_FILE_NAME);
  DeleteFile(ANG_FILE_NAME);
  CreateFile(FILE_NAME,FILE_SIZE,handle);
  CreateFile(SPD_FILE_NAME,FILE_SIZE,distfwd_handle);
  CreateFile(ANG_FILE_NAME,FILE_SIZE,angle_handle);
 
  //MAIN LOOP
  string s;
  initTime = CurrentTick();//reference time for the experiment duration
  while(CurrentTick() - initTime <= EX_TIME)
  {
    //keeping track of time
    t = CurrentTick() - initTime;
    dist = SensorUS(S4);
    if (dist >200) dist = 255;//objects that are too far have great inaccuracy, 
                              //so I just standardized their value above a certain threshold
    s = NumToStr(dist);
    WriteLn(handle, s);//save distance sensed
    s = NumToStr(distFwd);
    WriteLn(distfwd_handle, s);//save forward or not from the previous step
    s = NumToStr(angle);
    WriteLn(angle_handle, s);//save angle
 
    if ((dist <= DIST) && (count < COUNT))
    {
       turnleft();
    }
 
    else if (dist <= DIST && count >= COUNT)//obstacle after fwd
    {
       count = 0;
       turnleft();
    }
 
    else if (dist > DIST && count < COUNT)//obstacle disappear for 1st time
    {
        turnleft();
    }
 
    //the COUNT is there to ensure that the NXT has turn an angle big enough
    //to account for its width
 
    else if (dist > DIST && count >= COUNT)
    {
        fwd();
    }
  }
  Off(OUT_AC);
  CloseFile(handle);
  CloseFile(distfwd_handle);
  CloseFile(angle_handle);
}

This is the MATLAB code for the data-processing part.

The parts used for the update step of the Extended Kalman Filter are commented out, since there is no update step in this pseudo-“SLAM” experiment.

%VicExperiment
clear all; clc;
 
%%%%%%%%COLLECTING DATA%%%%%%%%%%%%%
fid = fopen('distance.dat');%read file
fid_time = fopen('time.dat');
fid_x = fopen('distFwd.dat');
fid_angle = fopen('angle.dat');
 
Dist = fscanf(fid, '%f %*c',inf);
Time = fscanf(fid_time, '%f %*c');
X = fscanf(fid_x, '%d %*c');
Angle = fscanf(fid_angle, '%f %*c');
 
fclose(fid);
fclose(fid_time);
fclose(fid_x);
fclose(fid_angle);
%%%%%%%%COLLECTING DATA%%%%%%%%%%%%%
 
% declarations
r = 12.7/2;%distance from wheel to sensor position along axle
chord = sqrt(2*r*r-2*r*r*cos(pi/6));%linear distance between sensor position on each 30 deg turn
                                    %used for calculating the position of the sensor when the NXT turns
                                    %NXT will not turn about sensor position
                                    %so have to account for offset from the turning point (left wheel)
 
%set matrices
% EKF components used in multiple functions
global xest Pest A Q W
global numStates k
 
xest = zeros(3,1);  % state matrix
% Pest = zeros(3,3);  % covariance matrix
 
numStates=3;        % initially, there are no landmarks
 
%A = zeros(3,3);                 % process model jacobian/state transition
%W = eye(3);                     % process noise jacobian//noise to predicted state
%Q = [.49   0   0; 0 .49 0; 0 0 (7*pi/180)^2];%noise to predicted state
%Jh = zeros(2,3);                % sensor model jacobian
K = zeros(3,3);                 % kalman gain
 
t = Time(1);      %1st time in time.dat
k = 1;            % loop iteration counter
 
% initialize estimates
xest(1,1) = 61.7;      % x position of vehicle
xest(2,1) = 38.1;      % y position of vehicle
xest(3,1) = 0;    % phi of vehicle
% Pest(:,:) = [0.1, 0, 0; 0, 0.1, 0; 0, 0, 15*pi/180];
 
map = zeros(2,62);
map(1,1) = xest(1,1) + Dist(1);%1st obstacle sensed
map(2,1) = xest(2,1);
 
% initial inputs
angle = 0;
move = 0.5*5.6*pi;
 
%%EKF
while k<62%number of data collected
    k=k+1;
    t=Time(k);
    dt=t-Time(k-1);
 
    %v and angle will change before loop again
    %everytime xest(3,k-1)>phi,xest(1,k)
    phi = xest(3,k-1)/360*2*pi;  % in rad
    if (move ~= 0)
        xest(1,k) = xest(1,k-1) + move*cos(phi);% x prediction of NXT USsensor position if it moved forward
        xest(2,k) = xest(2,k-1) + move*sin(phi);% y prediction of NXT USsensor position if it moved forward
        xest(3,k) = xest(3,k-1);                % NXT orientation in degrees (no change)
    else
        xest(1,k) = xest(1,k-1) + chord*cos(phi);% x prediction of NXT USsensor if it turned
        xest(2,k) = xest(2,k-1) + chord*sin(phi);% y prediction of NXT USsensor if it turned
        xest(3,k) = angle;                       % NXT orientation in degrees (changes)
    end
 
%     % Calculate Jacobian A based on vehicle model (df/dx)
%     A(1,1)=1; A(1,2)=0; A(1,3) = move*sin(phi);
%     A(2,1)=0; A(2,2)=1; A(2,3) = -move*cos(phi);
%     A(3,1)=0; A(3,2)=0; A(3,3) = 1;
%     % rest of A (which is just 2 null matrices and 1 identity) is added in new_state function
%         
%     % Calculate prediction for covariance matrix
%     Pest = A*Pest*A' + W*Q*W';%Using initial Pest for 1st timestep
 
    %update v and angle/alpha from speed.dat and angle.dat
    move = X(k)*0.5*5.6*pi;%diameter of wheel is 5.6cm
    angle = Angle(k)*30;%every turn upon sensing a close obstacle is 30 degrees
 
%     if Dist(k)>200
%         Dist(k)%do something if the sensed wall is too far?
%     end
    if Dist(k) ~= 255
        map(1,k) = xest(1,k) + Dist(k)*cos(phi);
        map(2,k) = xest(2,k) + Dist(k)*sin(phi);
    else
        map(1,k) = xest(1,k-1);
        map(2,k) = xest(2,k-1);
    end
end
 
mapOutline=[0,105.4,127,127,39.4,0,0;
            0,0,25.4,101.6,101.6,80,0];%plots experimental boundary
 
hold all;
%plot(xest(3,1:k));
plot(xest(1,1:k),xest(2,1:k),'green');          % robot position (model)
axis([0 127 0 101.6]);
hold on;
plot(map(1,1:k),map(2,1:k),'*');          % map of what robot saw
axis([0 127 0 101.6]);
plot(mapOutline(1,:),mapOutline(2,:),'red');
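
The prediction step in the loop above is plain dead reckoning, and each ultrasonic reading is projected into the global frame along the robot's heading:

$$x_k = x_{k-1} + d_k\cos\phi, \qquad y_k = y_{k-1} + d_k\sin\phi$$
$$m_x = x_k + z_k\cos\phi, \qquad m_y = y_k + z_k\sin\phi$$

where $d_k$ is the step length (half a wheel circumference on a forward step, or the chord length on a turn), $z_k$ is the ultrasonic range, and $(m_x, m_y)$ is the mapped obstacle point.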

This is the result plotted in MATLAB. The red line represents the experimental boundary, the green line represents the route taken by the robot WITHOUT corrections, and the blue dots represent the obstacles detected along the corresponding parts of the route.

slam1.jpg

This is a video of the experiment.


Thoughts:

  • This is not exactly a SLAM implementation as the Ultrasonic sensor can only detect one obstacle at a time. Without detecting the position of the old landmarks, there can be no updating and SLAM cannot be implemented.
  • The ultrasonic sensor is fairly inaccurate.

Areas to improve:

  • Save data to arrays before writing to files; recording data will be much faster.
  • Put delays between scanning and moving so that the equipment is stable before a measurement is taken and recorded.
  • Make sensor detect an area of obstacle instead of a point, and use it to do SLAM.
  • Use the NXT camera to pinpoint position of static obstacles and find the distance at each step of the robot motion.

NXT + Ultrasonic Sensor + NXT camera

The plan is to use the NXT camera to find the positions of landmarks as the camera scans 180 degrees in front of the robot.

The ultrasonic sensor then measures the distance to a landmark once it is detected at the center of the camera's view (the ultrasonic sensor is aligned to have the same orientation as the camera).

Tutorial on NXTCamera

Refer to here for installation instructions.

Here are some things I want to add on:

  • The device that needs the USB driver appears under “Other devices” in the “Device Manager” window.

I am using Windows 7.

devicelocation.jpg

  • After installing it, an entry labeled “USB Driver” or something similar will appear at the same location (sorry, I did not take a screenshot of it).

This is what the tutorial meant by “install 2 USB drivers”.

  • Upon the 1st connection using the “NXTCam View” software, I had to configure the port of the new device. I chose Port 5.

comport.jpg

In the “NXTCam View” software, I set the 1st colormap to the color of the blue pole and the 2nd colormap to the color of the white pole.

bluecolormap.jpg
whitecolormap.jpg

The picture below is a demonstration of the tutorial sample code.

The NXT screen shows the blobs of 2 different colours captured in one frame instance based on the colormaps pre-configured using the “NXTCam View” software.

tutorialnxtcamera.jpg

Trial 1

This is the first trial of capturing landmarks in a 180-degree sweep of the surroundings in front of the robot, and writing the values into a file.


This is the nxc code for:

  • Turning the camera 180 degrees to the right, 10 degrees at a time, starting from 90 degrees to the left.
  • Getting the coordinates of the detected blob(s) to be stored in a file for processing in Matlab.
#include "D:\NXT\NXTCAM\nxtcamlib-default.nxc"
 
//hardware informations
#define CAMADDR 0x02
const byte camPort  =  IN_1;
#define EYE OUT_B
#define EYE_POWER 20
#define EYE_LEFT 90
#define EYE_RIGHT 90
#define LEFT_THRES
#define RIGHT_THRES
#define EYE_TURN_ANGLE 10
 
//delay()
#define DELAY 500
 
//file writing
int colorCount=0,area;
#define TEST_FILE_NAME "test.csv"
#define TEST_FILE_SIZE 2048
byte test_handle;
//#define AREA_THRES 2000 //can be used to filter out false noisy blobs, not implemented in this code
 
int color[10];
int left[10];
int top[10];
int right[10];
int bottom[10];
int nblobs;
 
long eyeCurAng,eyeFrontAng;
int blueMid,whiteMid;
string s;//global string for saving values to files
 
inline void delay()
{
	Wait(DELAY);
}
 
sub open_for_write()
{
	DeleteFile(TEST_FILE_NAME);
	CreateFile(TEST_FILE_NAME,TEST_FILE_SIZE,test_handle);
}
 
task main()
{
open_for_write();
int init;
init = NXTCam_Init(camPort, CAMADDR);//initialize
NXTCam_SendCommand(camPort, CAMADDR, 'E'); // enable tracking
 
//+++++++++++++++SCANNING++++++++++//
int bluecount=1,whitecount=1;
s="eyeFrontAng=";
s+=NumToStr(eyeFrontAng);
TextOut(0,LCD_LINE6,s,false);
delay();
 
//++++++++++detect black landmark++++++++++//
RotateMotor(EYE,EYE_POWER,-EYE_LEFT);
eyeCurAng=-EYE_LEFT;
 
while (eyeCurAng<EYE_RIGHT)
{
  delay();
  ClearScreen();
  s="eyeCurAng=";
  s+=NumToStr(eyeCurAng);
  WriteLn(test_handle,s);
  TextOut(0,LCD_LINE7,s,false);
  NXTCam_SendCommand(camPort, CAMADDR, 'B');  // object tracking
  NXTCam_SendCommand(camPort, CAMADDR, 'E'); // enable tracking
  delay();
  delay();
  delay();
  NXTCam_GetBlobs(camPort, CAMADDR, nblobs, color, left, top, right, bottom);
 
  //saving details of each blob
  for (colorCount=0;colorCount<nblobs;colorCount++)
  {
    s="color:";
    s+=NumToStr(color[colorCount]);
    s+=" left:";
    s+=NumToStr(left[colorCount]);
    s+=" right:";
    s+=NumToStr(right[colorCount]);
    s+=" top:";
    s+=NumToStr(top[colorCount]);
    s+=" btm:";
    s+=NumToStr(bottom[colorCount]);
    area = (right[colorCount]-left[colorCount])*(bottom[colorCount]-top[colorCount]);
    s+=" area:";
    s+=NumToStr(area);
    WriteLn(test_handle,s);
  }
 
  delay();
 
  RotateMotor(EYE,EYE_POWER,EYE_TURN_ANGLE);
  eyeCurAng+=EYE_TURN_ANGLE;
 
}//end while
 
RotateMotor(EYE,EYE_POWER,-EYE_RIGHT);
 
NXTCam_SendCommand(camPort, CAMADDR, 'D'); // disable tracking
}

Things to take note:

  • Perform #include on only one of the NXTCam library files (i.e. nxtcamlib.nxc, nxtcamlib-default.nxc or nxtcamlib-common.nxc); otherwise there will be a “same identifier” error from defining the same symbols in multiple files.
  • The example tutorial “camtest-v3.nxc” #includes nxtcamlib-default.nxc, where the NXTCam_GetBlobs function takes 8 arguments. If you #include the nxtcamlib.nxc file instead, it takes only 7 arguments.
  • Using a sub() to process the results from the NXTCam_GetBlobs function will not work. Process the values inside the main loop itself.
  • Do not disable the camera until the end of the program; the camera cannot be re-enabled once it has been disabled during the program's runtime.
  • PosRegSetAng() cannot be used once RotateMotor() has been used.
  • The NXT's runtime limit for a program is 10 minutes; the NXT switches off after that, regardless of whether the program has completed.

Below shows the collected figures from the program.

coordinatesnareas.jpg

Explanation:

  • “color” refers to the colormaps pre-configured in the “NXTCam View” software; there are only 2 in this case, where 0 corresponds to the blue landmark and 1 to the white one.
  • “left”, “right”, “top” and “btm” refer to the coordinates of the blobs.
  • “area” refers to the area of the corresponding blob.

Going Further:

  • With the area value, I can filter out false blobs, which are typically small, leaving only the bigger blob, which is the actual landmark.
  • The blobs will become small if the landmarks are too far; the area threshold which filters out the false noisy blobs must make sense with the distance threshold which filters out landmarks that are too far.

Trial 2

Design of the robot is as shown in the picture below.

The camera and sensor are placed on the same vertical axis as the left wheel, which is the pivot wheel when turning.

This is deliberately designed to eliminate the need to transform the sensor position to the wheel position, so the vehicle can easily be modeled as a point in space.

nxtfinaldesign1.jpgnxtfinaldesign2.jpg

The robot moves in a straight line for 5 steps.

At each step it scans 140 degrees in front of itself and records the distance of any detected blob of the designated colors.

If a blob is detected and its area is big enough to be considered a real landmark, its distance from the ultrasonic sensor is recorded.

Note: Only up to 4 files can be written in an nxc program.

The nxc code for this trial is similar to the one above.

Here is the video. (I changed the white obstacle to black)


The recorded files are displayed below with brief explanations. These files were recorded in a different trial from the one shown in the video above.

  • blue.dat

blue.jpg

  • If a true blob is not detected in a scan (that is, there is no blue object in the scan), a default value of 255 is written to a line in the file.
  • If a true blob is detected, the angle and the distance at which it is detected are recorded.
  • If more than one blob is detected, each pair of angle and distance values is recorded on the line separately from the other pairs.
  • white.dat

white.jpg

  • Same explanation as for the blue blob detection above.
  • Note that a black object is used instead.
  • distFwdAngle.dat

distfwdang.jpg

  • Each line represents the state of the robot at each step of its journey.
  • In each line, the value before the space is the change in orientation of the robot, while the value after it is a flag indicating whether the robot moved forward or not.

Thoughts:

  • Angles made by the motor are not accurate; the motor does not return to the center after each scan. Solution: try using the inbuilt PID controller to reduce errors.
  • True blobs are sometimes not detected. Solution: adjust the AREA_THRES, RIGHT_THRES and LEFT_THRES values, OR reduce the EYE_TURN_ANGLE value for more extensive scans, OR determine a more precise or tolerant colormap so that the camera tracks the objects effectively and correctly.

Trial 3
  • Motor angles are accurate with the PID controller implemented. The tension in the cables connected to the ports of the ultrasonic sensor and the camera applies an unwanted torque that turns the motor and makes its angle inaccurate; the PID controller gives a counter-torque to fix that error, which causes the camera and sensor assembly to vibrate.
  • A lower EYE_TURN_ANGLE, covering more detail in each scan, increased the success rate of detecting obstacles, along with adjustments to the other variables.

Here are the results of the data collected in the experiment shown in the video.

bluedat.jpgwhitedat.jpg


Trial 4

Change of experiment plan.

Use only one color for the landmark.

The ultrasonic sensor will record ALL the values in each scan, regardless of whether a blob is detected.

Detected blobs will be filtered, and a true blob will be flagged at the corresponding angle and distance.

These changes are to make the collection of data easier for transfer to Matlab.

In addition, I will use a white background instead, as the capture of blobs based on the designated colormap is not very effective. The picture below shows the blobs detected when the same colormap is used to capture images at different angles against a green background. Using a white background and black landmarks might make a difference, since they are the extremes of the color spectrum.

badoverlapping.jpg

The final experimental set up is shown in the picture below.

trial4expsetup.jpg

The NXC code is shown below.

NOTE: No obstacle avoidance is implemented for this experiment, as the maximum runtime of the NXT is limited (up to 10 minutes). The robot makes a 140-degree scan of the environment in front of it before taking a step forward. It takes at most 6 steps.

The EYE in the code refers to the motor, ultrasonic sensor and NXT camera assembly.

#include "D:\NXT\NXTCAM\nxtcamlib-default.nxc"
 
//hardware information
//EYE
#define CAMADDR 0x02
const byte camPort  =  IN_1;
#define EYE OUT_B
#define EYE_POWER 20
#define EYE_LEFT 70
#define EYE_RIGHT 70
#define EYE_TURN_ANGLE 5
#define US S2
#define MAX_POWER 75
#define MAX_ACC 15
#define ORIGIN 0
//wheels
#define R_MOTOR OUT_C
#define L_MOTOR OUT_A
#define BOTH_MOTOR OUT_AC
#define WHEEL_POWER 50
#define WHEELS_TURN_ANGLE 180//move forward by 0.5*5.6*pi
#define RIGHT_WHEEL_TURN_ANG 136
 
//delay()
#define DELAY 1500
 
//file writing
int colorCount=0,area,mid,dist,distFwd=0,angle=0;
#define EYEANG_FILE_NAME "eyeAng.csv"//store angle of eye when true blob detected at each step
#define BLOB_FILE_NAME "blob.csv"
#define DISTFWD_ANG_FILE_NAME "distFwdAng.csv"//store fwd or not at each step and if angle++ or not
#define LM_DIST_FILE_NAME "LMdist.csv"
#define FILE_SIZE 1048
#define DEFAULT 255
byte blob_handle=0,distFwd_ang_handle=0,LMdist_handle=0;
#define DIST_THRES 45 //only register landmark distance if near enough
#define AREA_THRES 1200
#define MAX_AREA_THRES 20000
#define LEFT_THRES 46
#define RIGHT_THRES 130
#define BLOBHEIGHT 60
string sblob,sLMdist,sdistFwdAng;
 
//for getBlob()
int color[10];
int left[10];
int top[10];
int right[10];
int bottom[10];
int nblobs;
 
//global variable
long eyeCurAng;
int journey=0;//journey ++ after every scan and move
#define END_OF_JOURNEY 7 //experiment steps
 
inline void delay()
{
  Wait(DELAY);
}
 
sub open_for_write()
{
	DeleteFile(BLOB_FILE_NAME);
	DeleteFile(LM_DIST_FILE_NAME);
	DeleteFile(DISTFWD_ANG_FILE_NAME);
 
	CreateFile(BLOB_FILE_NAME,FILE_SIZE,blob_handle);
	CreateFile(DISTFWD_ANG_FILE_NAME,FILE_SIZE,distFwd_ang_handle);
	CreateFile(LM_DIST_FILE_NAME,FILE_SIZE,LMdist_handle);
}
 
sub fwd()
{
	RotateMotorPID(BOTH_MOTOR,WHEEL_POWER, WHEELS_TURN_ANGLE, PID_1,PID_5,PID_0);
	delay();//give time for correction of angle by PID
	distFwd=1;
	angle=0;
}
 
task main()
{
	PosRegSetMax(EYE, MAX_POWER, MAX_ACC);
	PosRegEnable (EYE, PID_1, PID_5, PID_0);
	SetSensorLowspeed(US);
	open_for_write();
	int init,blobHeight,detected=0;
	init = NXTCam_Init(camPort, CAMADDR);//initialize
	NXTCam_SendCommand(camPort, CAMADDR, 'E'); // enable tracking
	while(journey<END_OF_JOURNEY)
	{
		sdistFwdAng=NumToStr(angle);//record state of robot
		sdistFwdAng+=" ";
		sdistFwdAng+=NumToStr(distFwd);
		WriteLn(distFwd_ang_handle,sdistFwdAng);
 
		//+++++++++++++++SCANNING++++++++++//
		PosRegSetAngle (EYE, -EYE_LEFT);//set the initial starting angle for the eye of the robot
		delay();//give time for correction of angle by PID
		eyeCurAng=-EYE_LEFT;//save current angle of eye
		detected=0;//reset variables before scanning loop
		while (eyeCurAng<=EYE_RIGHT)//per scan
		{
			dist=SensorUS(US);
			detected=0;//reset variables before scanning the next angle
 
			NXTCam_SendCommand(camPort, CAMADDR, 'B');  // object tracking
			NXTCam_SendCommand(camPort, CAMADDR, 'E'); // enable tracking
			delay();//give time to stabilise blob detection
			NXTCam_GetBlobs(camPort, CAMADDR, nblobs, color, left, top, right, bottom);
 
			//blobFilter
			//saving details of each blob
			if (nblobs!=0)//if blob of the designated color is detected
			{
				for (colorCount=0;colorCount<nblobs;colorCount++)//each blob of the same color that is detected
				{
					area = abs(right[colorCount]-left[colorCount])*abs(bottom[colorCount]-top[colorCount]);
					//calculate area of each blob detected in this angle
					mid = (right[colorCount]+left[colorCount])/2;
					//calculate mid point of each blob detected in this angle
					blobHeight = abs(bottom[colorCount]-top[colorCount]);
					//calculate height of each blob
 
					if (area>AREA_THRES && area<MAX_AREA_THRES//if blob has a big enough area, and is not too big due to detection error
						&& blobHeight>BLOBHEIGHT//if blob is tall enough
						&& mid<RIGHT_THRES && mid>LEFT_THRES//if blob lies in the middle of the camera view
						&& dist < DIST_THRES//if blob detected is not too far to be invalid
						&& detected==0) //take in only one blob that passes the filters
					{
						sblob=NumToStr(2);
						Write(blob_handle,sblob);//save value 2 for a true blob detected at this particular angle
						detected++;
					}
 
				}//endfor
			}//endif
 
		if (detected==0)//if at this angle no blobs recognised, value is 0
		{
			sblob=NumToStr(1);
			Write(blob_handle,sblob);//save value 1 for no true blob detected at this particular angle
		}
 
		sLMdist=NumToStr(dist);
		Write(LMdist_handle,sLMdist);//save distance detected at this particular angle
 
		PosRegAddAngle(EYE, EYE_TURN_ANGLE);
		delay();
		eyeCurAng+=EYE_TURN_ANGLE;
 
		}//end while
 
		sblob="";
		WriteLn(blob_handle,sblob);//after full 140 degrees scan completed, goes to next line
 
		sLMdist="";
		WriteLn(LMdist_handle,sLMdist);//after full 140 degrees scan completed, goes to next line
 
		//+++++++++++end of 1 SCAN+++++++++++//
		PosRegSetAngle (EYE, ORIGIN);//return to front
		delay();//give time for correction of angle by PID
 
		//++++++obstacle avoidance+++++++++//
 
		fwd();//move forward after full 140 degrees scan
		journey++;
	}//endwhile
 
NXTCam_SendCommand(camPort, CAMADDR, 'D'); // disable tracking
}

This is the full video at normal speed.


This is a compilation of the 'blob' and 'LMdist' data collected.

retrieveddata.jpg

  • Each line contains the 29 data points from the scans made at each move step.
  • The last line is incomplete because the NXT reached its maximum runtime.
  • Detected blobs and their corresponding detected distances are highlighted in yellow.
  • No blobs are detected on the right side for the first few steps because those blobs are too far away and are thus filtered out.

Thoughts:

  • Implementing SLAM is possible, as the detected values appear in clusters that shift to the side as the robot moves forward.

The MATLAB code can be found here on GitHub.
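
As a rough, hypothetical sketch of the kind of conversion that code performs (the variable names, file layout and angle sign convention here are my assumptions, not the repository code), turning one scan row into global landmark coordinates might look like:

% Hypothetical sketch (not the actual GitHub code): project the flagged blob
% detections from one 140-degree scan into the global frame.
pose       = [61.7; 38.1; 0];                 % assumed robot pose [x; y; heading in degrees]
scanAngles = -70:5:70;                        % the 29 eye angles per scan used in the NXC code
blobFlags  = ones(1,29); blobFlags(15) = 2;   % example blob.csv row (2 = true blob, 1 = none)
dists      = 255*ones(1,29); dists(15) = 30;  % example LMdist.csv row (cm)
 
landmarks = zeros(2,0);
for j = 1:numel(scanAngles)
    if blobFlags(j) == 2 && dists(j) < 45     % 45 cm = DIST_THRES from the NXC code
        a = (pose(3) - scanAngles(j))*pi/180; % global bearing of this reading (sign convention assumed)
        landmarks(:,end+1) = [pose(1) + dists(j)*cos(a);
                              pose(2) + dists(j)*sin(a)];
    end
end
disp(landmarks)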

The result of the code is shown below.

trial4.jpg

  • The black dots represent the true position of the obstacles.
  • The '*' represents the position of the obstacles detected.
  • The green line is the true route of the robot from the odometry input data.
  • The red line is the route estimated by the EKF.

Thoughts:

  • An outlier obstacle was detected.
  • The EKF implemented is fairly accurate for this number of steps.
  • Newly detected obstacles at each step are correctly matched to the nearby obstacles.

Trial 5

I tested out the algorithm on another experimental setup as shown below.

trial5expsetup.jpg

With adjustments to the minimum distance threshold (min_dist) for observations to be classified as the same landmark, the results are shown below.

  • min_dist=11

trial5mindist11.jpg

  • min_dist=10

trial5mindist10.jpg

  • min_dist=8,9

trial5mindist8_9.jpg

  • min_dist=7

trial5mindist7.jpg

  • min_dist=6

trial5mindist6.jpg

Thoughts:

  • The number of observed landmarks increases as the min_dist threshold value decreases. This is expected: observed landmarks whose separation was previously within the threshold, and which were therefore merged, now exceed it and are treated as distinct landmarks.
  • At a higher min_dist threshold value, the number of observed landmarks fits the experimental setup, but there is a high possibility of wrongly associating truly distinct landmarks as the same observed landmark.
  • Inaccuracy is also due to the limitations of the equipment. The figure below shows the observed landmarks at the very first scan of the experiment. One of the observed landmarks is an anomaly, circled in green; it may be due to overlap during scanning between the nearer landmark circled in red and the further one circled in blue. Adjustments to the various threshold values need to be made to improve the results. A sketch of the min_dist association rule is given after the figure.

trial5scan1.jpg
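
The min_dist parameter acts as a nearest-neighbour gate for data association: a new observation is matched to the nearest known landmark if it lies within min_dist of it, and is otherwise registered as a new landmark. A minimal sketch of that rule, with hypothetical names (the actual implementation is in the GitHub code linked earlier):

% Hypothetical sketch of the min_dist association rule (not the actual GitHub code).
% landmarkList is 2-by-M (known landmark positions), obs is a 2-by-1 observation.
function [landmarkList, idx] = associateLandmark(landmarkList, obs, min_dist)
    if isempty(landmarkList)
        landmarkList = obs; idx = 1; return
    end
    d = sqrt(sum((landmarkList - obs).^2, 1));  % distance from obs to every known landmark
    [dmin, idx] = min(d);
    if dmin > min_dist                          % too far from all known landmarks:
        landmarkList(:,end+1) = obs;            % register obs as a new landmark
        idx = size(landmarkList, 2);
    end                                         % otherwise reuse the nearest landmark's index
end
 
Lowering min_dist tightens the gate, so more observations fail to match an existing landmark and are registered as distinct ones, which is consistent with the behaviour observed above.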

Going Further

Adjustments to values like RIGHT_THRES, LEFT_THRES, AREA_THRES and EYE_TURN_ANGLE will improve the results. The optimum result will require a good balance of these values.
