====== Potential Fields Navigation with ARToolkit and NXT ======
  
**Authors:** Dylan Wallace Email: <wallad3@unlv.nevada.edu>, Alvaro Pintado
\\
**Keywords:** Potential Fields, PFNav, ARToolkit, Computer Vision, Navigation, NXT, C++, C, Path Planning
\\
**Date:** Last modified on 09/13/16
\\

===== Introduction =====

Potential Fields is a concept that comes from the fundamentals of electricity and magnetism. A charge at high potential (positive) repels a positive test charge, while a charge at low potential (negative) attracts it. With many such attractive and repulsive points present, a vector field is created representing the net force at each point in space. This field can be navigated by following the vector at the current location, which leads toward the point of lowest potential in the field (the global minimum).

This can be applied to robotics by creating a field where obstacles are repulsive points and the target is an attractive point. If the robot follows the velocities (vectors) of this field, it is led to the target while also navigating around the obstacles.

In order to recognize the obstacles, ARToolkit is used because of its strengths in enclosed environments. This allows the robot to update its information about the obstacles and target in real-time, giving it the ability to handle mobile obstacles and targets. The NXT was used for this project due to its ease of use and its Bluetooth communication capabilities. The velocities for each wheel (calculated from the vector-field velocities) are sent over a Bluetooth connection from the host PC to the NXT, allowing for real-time communication and navigation.

For information on how Potential Fields work, please see these [[http://daslhub.org/unlv/wiki/lib/exe/fetch.php?media=alvarop:lecturexxhandouts.pdf|lecture notes]].
\\
{{ youtube>L_391dBWB-I?large }}
\\

If you just want to try out the program yourself, here is an [[https://github.com/D-Wazzle/Potential-Fields-Navigation/blob/master/PFNav%20ARToolkit.exe|executable]] that you can place into the bin directory of ARToolkit.

If you want to view the source code directly, here is the [[https://github.com/D-Wazzle/Potential-Fields-Navigation/blob/master/ARToolkitPFNav.cpp|C++ file]] for the program.
===== Motivation and Audience =====
  
\\
The rest of this tutorial is presented as follows:
  * Integrating Potential Fields
  * Integrating Bluetooth
  * Final Words
  
==== Integrating Potential Fields ====
  
We will start by editing the file that we used for the ARToolkit Coordinate Tracking tutorial. This file provides the necessary base for our ARToolkit program to run.

The first thing that we will want to define in our program is a new struct used to track the robot's pose. This struct is declared as follows:
  
<code c++>
struct Mobile
{
	float x;
	float y;
	float theta;
};
</code>

In order to integrate the Potential Fields, we must first provide some global variables for our program to use. These globals are used for outputting our data to the screen.
 +
 +<code c++>
 +float   velLeftPrev;
 +float   velRightPrev;
 +char xValue[8][256];
 +char    yValue[8][256];
 +char    velLeftVal[256];
 +char    velRightVal[256];
 +char    rVelVal[256];
 +char deltaThetaVal[256];
 +char WVal[256];
 +char    angle[256];
 +char ur1[256];
 +char ur2[256];
 +char ur3[256];
 +char    ur4[256];
 +</code>

Now we must define the constants used for the Potential Fields calculations. Keep in mind that these constants may need to be changed for your setup; rho0, nAttract, and nRepulse in particular will definitely need to be tweaked for optimal performance.
 +
<code c++>
const float PI = 3.14159;
const float unitDeg = 7.0;
const float wheelR = 1.0625;
const float baseR = 2.15625;
const float T = 0.1;
const float rho0[] = { 0, 450, 450, 450, 450 };
const float nAttract = 1.05;
const float nRepulse[] = { 0, 1500, 1500, 1500, 1500 };
</code>
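To build intuition for how rho0 and nRepulse interact while tuning, the following small standalone sketch (hypothetical, not part of the tutorial program) evaluates the magnitude of the repulsive gradient used in the main loop below at a few distances:

<code c++>
#include <cstdio>
#include <cmath>

// Magnitude of the repulsive gradient at distance rho from one obstacle,
// using the same expression as the tutorial's main loop:
// nRepulse * rho * (1/rho - 1/rho0) / rho^3, and 0 outside the influence radius.
static float repulseMag(float rho, float rho0, float nRepulse)
{
	if (rho >= rho0) return 0.0f;   // obstacle has no influence this far away
	return nRepulse * rho * ((1.0f / rho) - (1.0f / rho0)) / powf(rho, 3);
}

int main()
{
	const float rho0 = 450.0f, nRepulse = 1500.0f;   // the tutorial's values
	for (float rho = 100.0f; rho <= 500.0f; rho += 100.0f)
		printf("rho = %.0f -> |u_r| = %.6f\n", rho, repulseMag(rho, rho0, nRepulse));
	return 0;
}
</code>

The repulsion falls off rapidly with distance and cuts off entirely at rho0, so raising nRepulse makes close-range avoidance stronger without affecting far-away behavior.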

Now we can move on to the mainLoop function, where our Potential Fields calculations actually occur. Since these are implemented in the same function as the video processing, the values are re-calculated for every video frame. This is what gives the program the ability to calculate the navigation in real-time.

First we will declare some variables.
 +
<code c++>
// PF Variables (indices 1..4 are used for the four obstacles; index 0 is unused)
float rho[5];
float ur_x[5];
float ur_y[5];
float xObstacle[5];
float yObstacle[5];
float xTarget;
float yTarget;
float xNext;
float yNext;
float Angle;
float A;
float B;
float xVel;
float yVel;
Mobile mobile;
mobile.x = 0.0;
mobile.y = 0.0;
mobile.theta = 90.0;
xCoord[0] = 0.0;
yCoord[0] = 0.0;
</code>

Now we can implement the Potential Fields calculations into our program, just after the end of our Coordinate Tracking code. This first chunk calculates the angle of the robot from the relative positions of the two markers mounted on it, which gives us a real-time, accurate model of the robot's rotation. The next part sets the locations of the target and obstacles and updates the position of the robot. Finally, we output the calculated angle of the robot for debugging.
 +
<code c++>
A = xCoordRel[7] - xCoordRel[6];
B = yCoordRel[7] - yCoordRel[6];

Angle = atan2(B, A)*(180 / PI);

// atan2 returns an angle in (-180, 180]; shift negative results into [0, 360)
if (B < 0) {
	Angle += 360;
}

xTarget = xCoordRel[5];
yTarget = yCoordRel[5];
mobile.x = xCoordRel[7];
mobile.y = yCoordRel[7];
mobile.theta = Angle;
for (h = 1; h < 5; h++) {
	xObstacle[h] = xCoordRel[h];
	yObstacle[h] = yCoordRel[h];
}

sprintf(angle, "Angle: %.2f  ", Angle);
argDrawStringsByIdealPos(angle, 10, 150);
</code>
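The angle normalization above can be checked in isolation. This standalone sketch (with hypothetical names, not part of the tutorial file) reproduces the two-marker heading computation:

<code c++>
#include <cmath>
#include <cstdio>

// Heading of the vector from marker (ax, ay) to marker (bx, by), in degrees,
// wrapped to [0, 360) the same way as the tutorial's main loop.
static float headingDeg(float ax, float ay, float bx, float by)
{
	const float PI = 3.14159f;
	float A = bx - ax;
	float B = by - ay;
	float angle = atan2f(B, A) * (180.0f / PI);
	if (B < 0) angle += 360.0f;   // map (-180, 0) into (180, 360)
	return angle;
}

int main()
{
	printf("%.1f\n", headingDeg(0, 0, 1, 1));    // rear marker below-left of front marker
	printf("%.1f\n", headingDeg(0, 0, 1, -1));   // negative-y case exercises the wrap
	return 0;
}
</code>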

Now we will implement the main section of the Potential Fields calculations. It sits inside a for loop that runs over the given obstacles. It first calculates the repulsive gradient for each obstacle and outputs it to the screen. Then it calculates the next X & Y coordinates and the velocities from the Potential Fields formulas. Finally, it invokes a function called diffSteer, which handles the kinematic equations and Bluetooth commands. For information on this function please see the next section.
 +
<code c++>
for (j = 1; j < 5; j++) {
	rho[j] = sqrt(pow((yObstacle[j] - mobile.y), 2) + pow((xObstacle[j] - mobile.x), 2));

	// Calculate gradient (Eqns 11 and 13)
	if (rho[j] < rho0[j]) {
		ur_x[j] = nRepulse[j] * (xObstacle[j] - mobile.x)*((1 / rho[j]) - (1 / rho0[j])) / pow(rho[j], 3);
		ur_y[j] = nRepulse[j] * (yObstacle[j] - mobile.y)*((1 / rho[j]) - (1 / rho0[j])) / pow(rho[j], 3);
	}
	else {
		ur_x[j] = 0;
		ur_y[j] = 0;
	}
}

sprintf(ur1, "Ux(1): %.1f; Uy(1): %.1f", ur_x[1], ur_y[1]);
//sprintf(ur2, "Ux(2): %.1f; Uy(2): %.1f", ur_x[2], ur_y[2]);
argDrawStringsByIdealPos(ur1, 10, 25);
//argDrawStringsByIdealPos(ur2, 10, 50);

xNext = mobile.x - (T*nAttract*(mobile.x - xTarget)) - (T*ur_x[1]) - (T*ur_x[2]) - (T*ur_x[3]) - (T*ur_x[4]);
yNext = mobile.y - (T*nAttract*(mobile.y - yTarget)) - (T*ur_y[1]) - (T*ur_y[2]) - (T*ur_y[3]) - (T*ur_y[4]);
xVel = -nAttract*(mobile.x - xTarget) - ur_x[1] - ur_x[2] - ur_x[3] - ur_x[4];
yVel = -nAttract*(mobile.y - yTarget) - ur_y[1] - ur_y[2] - ur_y[3] - ur_y[4];
diffSteer(mobile, xVel, yVel);
</code>
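Putting the pieces together, a tiny standalone simulation (hypothetical, with a single obstacle and smaller illustrative constants than the tutorial's arena) iterates the same position update and shows the point converging to the target while steering off the obstacle:

<code c++>
#include <cmath>
#include <cstdio>

// Run the tutorial's discrete position update for `steps` iterations from the
// origin, with one obstacle; writes the final position into (xOut, yOut).
// Constants here are illustrative, not the tutorial's tuned values.
static void simulate(int steps, float &xOut, float &yOut)
{
	const float T = 0.1f, nAttract = 1.05f, nRepulse = 5.0f, rho0 = 3.0f;
	const float xTarget = 10.0f, yTarget = 0.0f;
	const float xObs = 5.0f, yObs = 1.5f;   // obstacle near the straight-line path

	float x = 0.0f, y = 0.0f;
	for (int i = 0; i < steps; i++) {
		float rho = sqrtf(powf(yObs - y, 2) + powf(xObs - x, 2));
		float urx = 0.0f, ury = 0.0f;
		if (rho < rho0) {   // repulsion only inside the influence radius
			urx = nRepulse * (xObs - x) * ((1 / rho) - (1 / rho0)) / powf(rho, 3);
			ury = nRepulse * (yObs - y) * ((1 / rho) - (1 / rho0)) / powf(rho, 3);
		}
		x = x - T * nAttract * (x - xTarget) - T * urx;
		y = y - T * nAttract * (y - yTarget) - T * ury;
	}
	xOut = x; yOut = y;
}

int main()
{
	float x, y;
	simulate(200, x, y);
	printf("final position: (%.2f, %.2f)\n", x, y);   // settles near the target (10, 0)
	return 0;
}
</code>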
==== Integrating Bluetooth ====
  
  - Writes the component velocities to the NXT to then control the motor power on the Bot
  
<code c++ diffSteer.cpp>
static void diffSteer(Mobile &mobile, float xVel, float yVel)
{
	// ... (function body elided in this revision; see the full C++ file linked above)
}
</code>
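Since the body of diffSteer is elided here, the differential-drive conversion it performs can be sketched as follows. This is a minimal sketch under assumed conventions (forward speed taken as the projection of the field velocity onto the heading, turn rate proportional to heading error, illustrative gain), not the tutorial's exact implementation; wheelSpeeds and its gains are hypothetical names:

<code c++>
#include <cmath>
#include <cstdio>

struct Mobile { float x, y, theta; };   // theta in degrees, as in the tutorial

// Convert a desired (xVel, yVel) field velocity into left/right wheel angular
// speeds using standard differential-drive inverse kinematics:
//   vLeft = (v - w*baseR)/wheelR,  vRight = (v + w*baseR)/wheelR
static void wheelSpeeds(const Mobile &m, float xVel, float yVel,
                        float wheelR, float baseR,
                        float &vLeft, float &vRight)
{
	const float PI = 3.14159f;
	float thetaRad = m.theta * PI / 180.0f;

	// Desired heading from the field velocity; heading error wrapped to (-PI, PI]
	float desired = atan2f(yVel, xVel);
	float err = desired - thetaRad;
	while (err > PI)   err -= 2 * PI;
	while (err <= -PI) err += 2 * PI;

	float v = cosf(err) * sqrtf(xVel * xVel + yVel * yVel);  // forward component
	float w = 2.0f * err;                                    // illustrative turn gain

	vLeft  = (v - w * baseR) / wheelR;
	vRight = (v + w * baseR) / wheelR;
}

int main()
{
	Mobile m = { 0, 0, 0 };   // robot facing along +x
	float vL, vR;
	wheelSpeeds(m, 1.0f, 0.0f, 1.0625f, 2.15625f, vL, vR);
	printf("straight: vL = %.3f, vR = %.3f\n", vL, vR);   // equal when already aligned
	return 0;
}
</code>

The actual diffSteer additionally packages these wheel speeds into Bluetooth motor commands for the NXT, as described in the list above.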
==== Final Words ====
  
This tutorial's objective was to show how a robot can be programmed to perform real-time navigation using ARToolkit and Potential Fields. Once the concepts were conveyed, the reader could create their own program for real-time navigation using the ARToolkit SDK & Potential Fields with an NXT robot.
\\
\\
For questions, clarifications, etc., Email: <wallad3@unlv.nevada.edu>