Potential Fields Navigation with ARToolkit and NXT

Authors: Dylan Wallace (Email:), Alvaro Pintado
Keywords:​ Potential Fields, PFNav, ARToolkit, Computer Vision, Navigation, NXT, Cpp, C, Path Planning
Date: Last modified on 09/13/16


Potential Fields is a concept that comes from the fundamentals of electricity and magnetism. A high-potential (positive) charge pushes a positive test charge away from it, while a low-potential (negative) charge attracts it. With many of these attractive or repulsive points present, a vector field is created representing the net force at each point in space. This vector field can be navigated by following the vector at the current location, which leads toward the point of lowest potential in the field (the global minimum).
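In symbols, this idea can be written compactly (a sketch following Khatib's classic formulation; here the attractive gain corresponds to nAttract, the repulsive gains to nRepulse, and the influence radius to rho0 in the constants defined later in this tutorial):

```latex
% Attractive potential pulls toward the target q_t; each obstacle q_i
% contributes a repulsive potential while the robot is within its
% influence radius rho_0.
U_{att}(q) = \tfrac{1}{2}\,\eta_a \lVert q - q_t \rVert^2
\qquad
U_{rep,i}(q) =
\begin{cases}
\tfrac{1}{2}\,\eta_r \left( \dfrac{1}{\rho_i} - \dfrac{1}{\rho_0} \right)^2 & \rho_i < \rho_0 \\[4pt]
0 & \text{otherwise}
\end{cases}

% The commanded velocity is the negative gradient of the total
% potential, and the robot follows it with time step T:
v = -\nabla U
  = -\eta_a (q - q_t)
    + \sum_i \eta_r \left( \frac{1}{\rho_i} - \frac{1}{\rho_0} \right) \frac{q - q_i}{\rho_i^{3}}
\qquad
q_{next} = q + T\,v
```

These are the same terms that appear in the xVel/yVel and xNext/yNext update lines later in the program.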

This can be applied to robotics by creating a field where obstacles are repulsive points, and the target is an attractive point. If the robot is instructed to follow the velocities (vectors) of this field, it will be led to the target, while also navigating around the obstacles.

In order to recognize the obstacles, ARToolkit is used because of its strengths in enclosed environments. This allows for the robot to update its information about the obstacles and target in real-time, giving the robot the ability to avoid mobile obstacles and targets. The NXT was used for this project due to its ease of use, and its Bluetooth communication capabilities. The velocities for each wheel (calculated based on the velocities from the vector field) are sent over a Bluetooth connection from the host PC to the NXT, allowing for real-time communication and navigation with the NXT.

For information on how Potential Fields work, please see these lecture notes.

If you just want to try out the program yourself, here is an executable that you can place into the bin directory of ARToolkit.

If you want to view the source code directly, here is the C++ file for the program.

Motivation and Audience

This tutorial's motivation is to use the ARToolkit and NXT to navigate a robot through an “obstacle course”. This tutorial assumes the reader has the following background and interests:

* This tutorial combines all of the previous tutorials in the Potential Fields series. Please complete those tutorials before this one, in order to have a full understanding of this tutorial.

The rest of this tutorial is presented as follows:

  • Integrating Potential Fields
  • Integrating Bluetooth
  • Final Words

Integrating Potential Fields

We will start by editing the file that we used for the ARToolkit Coordinate Tracking. This file provides the necessary base for our ARToolkit program to run.

The first thing that we will want to define in our program is a new struct used to track the robot. This struct will be declared as follows:

struct Mobile {
	float x;
	float y;
	float theta;
};

In order to integrate the Potential Fields, we must first provide some global variables for our program to use. These variables are mainly used for outputting our data to the screen.

float   velLeftPrev;
float   velRightPrev;
char	xValue[8][256];
char    yValue[8][256];
char    velLeftVal[256];
char    velRightVal[256];
char    rVelVal[256];
char	deltaThetaVal[256];
char	WVal[256];
char    angle[256];
char	ur1[256];
char	ur2[256];
char	ur3[256];
char    ur4[256];

Now we must define the constants that are used for the Potential Fields calculations. Keep in mind that these constants may need to be changed for your setup; in the case of rho0, nAttract, and nRepulse, they will definitely need to be tweaked for optimal performance.

const float         PI = 3.14159;
const float         unitDeg = 7.0;
const float         wheelR = 1.0625;
const float         baseR = 2.15625;
const float         T = 0.1;
const float         rho0[] = { 0, 450, 450, 450, 450 };
const float         nAttract = 1.05;
const float         nRepulse[] = { 0, 1500, 1500, 1500, 1500 };

Now we can move on to the mainLoop function, where our Potential Fields calculations actually occur. Since these are implemented in the same function that processes the video, the values will be re-calculated for every video frame. This is what gives the program the ability to calculate the navigation in real-time.

First we will declare some variables.

//PF Variables
float rho[4];
float ur_x[4];
float ur_y[4];
float xObstacle[5];
float yObstacle[5];
float xTarget;
float yTarget;
float xNext;
float Angle;
float A;
float B;
float yNext;
float xVel;
float yVel;
Mobile mobile;
mobile.x = 0.0;
mobile.y = 0.0;
mobile.theta = 90.0;
xCoord[0] = 0.0;
yCoord[0] = 0.0;

Now we will actually be able to implement the Potential Fields calculations into our program. This will occur just after the end of our Coordinate Tracking program. The first chunk calculates the angle of the robot based on the relative positions of the markers located on the robot. This implementation allows us to have a real-time, accurate model of our robot's rotation. The last bit sets the locations of the target and obstacles, and updates the position of the robot. Finally, we output the calculated angle of the robot for debugging.

A = xCoordRel[7] - xCoordRel[6];
B = yCoordRel[7] - yCoordRel[6];
Angle = atan2(B, A)*(180 / PI);
if (B < 0) {
	Angle += 360;
}
xTarget = xCoordRel[5];
yTarget = yCoordRel[5];
mobile.x = xCoordRel[7];
mobile.y = yCoordRel[7];
mobile.theta = Angle;
for (h = 1; h < 5; h++) {
	xObstacle[h] = xCoordRel[h];
	yObstacle[h] = yCoordRel[h];
}
sprintf(angle, "Angle: %.2f  ", Angle);
argDrawStringsByIdealPos(angle, 10, 150);

Now we will implement the main section of the Potential Fields calculations. This is located within a for loop that runs over the given number of obstacles. It first calculates the repulsive potential of each obstacle and outputs it to the screen. Then it calculates the next X and Y coordinates and the velocities based on the Potential Fields formulas for these values. Finally, it invokes a function called diffSteer, which handles the kinematic equations and Bluetooth commands. For information on this function, please see the next section.

for (j = 1; j < 5; j++) {
	rho[j] = sqrt(pow((yObstacle[j] - mobile.y), 2) + pow((xObstacle[j] - mobile.x), 2));
	// Calculate gradient (Eqns 11 and 13)
	if (rho[j] < rho0[j]) {
		ur_x[j] = nRepulse[j] * (xObstacle[j] - mobile.x)*((1 / rho[j]) - (1 / rho0[j])) / pow(rho[j], 3);
		ur_y[j] = nRepulse[j] * (yObstacle[j] - mobile.y)*((1 / rho[j]) - (1 / rho0[j])) / pow(rho[j], 3);
	}
	else {
		ur_x[j] = 0;
		ur_y[j] = 0;
	}
}
sprintf(ur1, "Ux(1): %.1f; Uy(1): %.1f", ur_x[1], ur_y[1]);
//sprintf(ur2, "Ux(2): %.1f; Uy(2): %.1f", ur_x[2], ur_y[2]);
argDrawStringsByIdealPos(ur1, 10, 25);
//argDrawStringsByIdealPos(ur2, 10, 50);
xNext = mobile.x - (T*nAttract*(mobile.x - xTarget)) - (T*ur_x[1]) - (T*ur_x[2]) - (T*ur_x[3]) - (T*ur_x[4]);
yNext = mobile.y - (T*nAttract*(mobile.y - yTarget)) - (T*ur_y[1]) - (T*ur_y[2]) - (T*ur_y[3]) - (T*ur_y[4]);
xVel = -nAttract*(mobile.x - xTarget) - ur_x[1] - ur_x[2] - ur_x[3] - ur_x[4];
yVel = -nAttract*(mobile.y - yTarget) - ur_y[1] - ur_y[2] - ur_y[3] - ur_y[4];
diffSteer(mobile, xVel, yVel);

Integrating Bluetooth

All of the calculations for the potential-field navigation are done on the host PC that runs the program. As each set of motor commands is calculated, it must be sent over Bluetooth so that the NXT Brick can perform the correct movements in sync with the calculations.

For this, a function has to be written to calculate the motor commands based off of the Potential Fields calculations.

The function takes 3 parameters:

  1. The Mobile struct mobile
    1. x position of NXT
    2. y position of NXT
    3. angle of NXT
  2. xVel: the velocity in the x direction calculated by the potential field equations
  3. yVel: the velocity in the y direction calculated by the potential field equations


The function then:

  1. Calculates the resultant velocity vector from the x and y velocities
  2. Calculates an angular velocity term from the difference between the target heading and the robot's current heading
  3. Writes the resulting left and right wheel velocities to the NXT to control the motor power on the robot

static void diffSteer(Mobile &mobile, float xVel, float yVel)
{
	cli::array<wchar_t, 1>^ left = { 0x0C, 0x00, 0x00, 0x04, 0x01, 0x00, 0x03, 0x00, 0x00, 0x20, 0x00, 0x00, 0x00, 0x00 };
	cli::array<wchar_t, 1>^ right = { 0x0C, 0x00, 0x00, 0x04, 0x00, 0x00, 0x03, 0x00, 0x00, 0x20, 0x00, 0x00, 0x00, 0x00 };
	float rVel = sqrt(pow(xVel, 2) + pow(yVel, 2));
	sprintf(rVelVal, "rVel: %.2f", rVel);
	argDrawStringsByIdealPos(rVelVal, 10, 75);
	//float thetaTarget = ((PI / 2) - atan2((xNext - mobile.x), (yNext - mobile.y)))*(180 / PI);
	float thetaTarget = atan2(yVel, xVel)*(180 / PI);
	float deltaTheta = thetaTarget - mobile.theta;
	float W = deltaTheta*(baseR / wheelR);
	sprintf(WVal, "W: %.2f", W);
	argDrawStringsByIdealPos(WVal, 10, 100);
	float rightVel = rVel + W;
	float leftVel = rVel - W;
	sprintf(velRightVal, "rightVel: %.2f", rightVel);
	sprintf(velLeftVal, "leftVel: %.2f", leftVel);
	argDrawStringsByIdealPos(velRightVal, 10, 125);
	argDrawStringsByIdealPos(velLeftVal, 150, 125);
	left[5] = leftVel + 10;
	right[5] = rightVel + 10;
	// Clamp the power set-points to the NXT's valid range of -100..100
	if (leftVel > 100)
		left[5] = 100;
	else if (leftVel < -100)
		left[5] = -100;
	if (rightVel > 100)
		right[5] = 100;
	else if (rightVel < -100)
		right[5] = -100;
	// Stop the robot once the resultant velocity is small (near the target)
	if (rVel < 4.0) {
		left[5] = 0;
		right[5] = 0;
	}
	NXT->Write(left, 0, 14);
	NXT->Write(right, 0, 14);
	velLeftPrev = leftVel;
	velRightPrev = rightVel;
}

Final Words

This tutorial's objective was to show how a robot can be programmed to perform real-time navigation using ARToolkit and Potential Fields. Once these concepts have been conveyed, the reader should be able to create their own program for real-time navigation using the ARToolkit SDK and Potential Fields with an NXT robot.

For questions, clarifications, etc., Email:

nxt_pf_nav_ar_toolkit.txt · Last modified: 2017/02/09 13:45 by dwallace