How do you do Position Tracking?

I heard it in another forum, but really don’t know how to implement it.

It’s… hard, mainly because the VEX sensors are prone to drift, and a very small error can cause major problems. 1826 managed to build a robot that used position tracking, and it worked fairly well, so one of them might be able to help you.

Take a look at this snip:

I took that from team WPI1’s autonomous robot code here: Team-Optimistic · GitHub

It works by computing the deltas for the left and right quad encoders over one time step (how far each encoder has moved since we last checked), recording the average and difference of those two values, and using them to compute our delta theta (how much we have rotated since we last checked). Then we compute our delta x and delta y in the global frame (the field frame in this case) and our velocity in the local frame (the base_link frame), and finally sum the new numbers from this time step into a running estimate. Here is the code linked above, broken down line by line:

Compute our right and left change:

const int32_t rightDelta = (rightQuad - lastRightQuad),
      leftDelta = (leftQuad - lastLeftQuad);

Save the new quad values for the next loop:

lastRightQuad = rightQuad;
lastLeftQuad = leftQuad;

Compute the average change and difference in change:

const float avg = (rightDelta + leftDelta) / 2.0,
                  dif = (rightDelta - leftDelta) / 2.0;

Compute the distance we moved and the angle we turned:

const float dist = (avg * straightConversion) / 1000.0, // robot's coordinate frame
      dtheta = dif * thetaConversion;

Compute our new theta:

const float theta = thetaGlobal + dtheta;
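One thing to watch: nothing in the original code wraps theta, so it grows without bound as the robot spins. That is fine for the sin/cos math below, but if you ever compare headings or feed theta to a turn controller, a normalization helper (not part of the original code, just a common pattern) keeps it in [-pi, pi):

```cpp
#include <cmath>

// Hypothetical helper, not in the original code: map any angle into [-pi, pi).
double wrapAngle(double theta) {
    theta = std::fmod(theta + M_PI, 2.0 * M_PI); // shift so the seam lands at 0
    if (theta < 0)
        theta += 2.0 * M_PI;                     // fmod can return negatives
    return theta - M_PI;                         // shift back
}
```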

Compute how far we moved in the global frame:

const float dx = cos(theta) * dist, //world coordinate frame
                  dy = sin(theta) * dist;

Compute our velocity in the local frame:

const float v = 1000* dist / dt,
                  vtheta = 1000 * dtheta / dt;

Here we construct a message for ROS. The important parts to see are the first line (we say that our linear velocity straight forward in the local frame is v) and the second- and third-to-last lines (we increment our global theta by dtheta, and say our angular velocity around the straight-upwards axis, the z axis, is vtheta):

odom->twist.twist.linear.x = v;
odom->twist.twist.linear.y = 0;
odom->twist.twist.linear.z = 0;
odom->twist.twist.angular.x = 0;
odom->twist.twist.angular.y = 0;
thetaGlobal += dtheta;
odom->twist.twist.angular.z = vtheta;
odom->twist.covariance = ODOM_TWIST_COV_MAT;

Here we make a second ROS message. Notice that we increment our global x and y positions by dx and dy, tell ROS about our new global position, and tell ROS our theta is thetaGlobal (converted to a quaternion):

xPosGlobal += dx;
yPosGlobal += dy;
odom->pose.pose.position.x = xPosGlobal;
odom->pose.pose.position.y = yPosGlobal;
odom->pose.pose.position.z = 0;
odom->pose.pose.orientation = tf::createQuaternionMsgFromYaw(thetaGlobal);
odom->pose.covariance = ODOM_POSE_COV_MAT;
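Pulling the steps above together, here is a minimal self-contained sketch of the same dead-reckoning loop with the ROS plumbing stripped out. The conversion constants are made-up placeholder values; in practice they come from your wheel diameter, encoder resolution, and track width:

```cpp
#include <cmath>
#include <cstdint>

struct Odometry {
    // Placeholder conversion factors (assumed values, tune for your drivetrain):
    double straightConversion = 0.36;  // mm of travel per encoder tick
    double thetaConversion = 0.0031;   // radians per tick of left/right difference

    int32_t lastRightQuad = 0, lastLeftQuad = 0;
    double xPosGlobal = 0, yPosGlobal = 0, thetaGlobal = 0;

    // One time step: feed in the current raw encoder counts.
    void update(int32_t rightQuad, int32_t leftQuad) {
        // How far each side moved since we last checked.
        const int32_t rightDelta = rightQuad - lastRightQuad;
        const int32_t leftDelta = leftQuad - lastLeftQuad;
        lastRightQuad = rightQuad;
        lastLeftQuad = leftQuad;

        // Average = forward motion; difference = rotation.
        const double avg = (rightDelta + leftDelta) / 2.0;
        const double dif = (rightDelta - leftDelta) / 2.0;

        const double dist = avg * straightConversion;  // robot frame, mm
        const double dtheta = dif * thetaConversion;   // radians

        // Project this step's travel into the global (field) frame.
        const double theta = thetaGlobal + dtheta;
        xPosGlobal += std::cos(theta) * dist;
        yPosGlobal += std::sin(theta) * dist;
        thetaGlobal += dtheta;
    }
};
```

Calling update() every loop iteration with fresh quad values maintains the running (x, y, theta) estimate, just like the ROS version above.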

@rbenasutti How do you deal with the drift, noise, and general inaccuracy of the vex sensors?

Well, something everyone can do is mount their quads on unpowered wheels; it makes a world of difference.

Past that, I used a full EKF last year for the autonomous robot (not currently possible to do without a co-processor, we used a raspi). But for a 15 sec auto, it doesn’t matter. Unpowered encoder wheels are pretty darn good to begin with. You can use ultrasonics and line sensors to correct every now and then if you really need to.

@rbenasutti Thanks! Have you implemented this program yet in competition? If so, how well did it work?

It worked fairly well for what it was. We won a match and were competitive in a number of others, which was our goal for the year. It’s hard to beat a human driver autonomously, so I’m proud of the team. We had a few issues here and there, but nothing terrible; our only real problems were ones that left us not moving from the start of a match. That’s unfortunate, but still encouraging, because it meant the code itself worked and was solid even though it had a few hiccups getting off the ground. Worlds was only the second competition this robot ever ran in; with more practice I think we would have done better.

Read more here: WPI1 Reveal - VEX Robot Showcase - VEX Forum

@rbenasutti In that thread someone mentioned that you used LIDAR and an IMU. I’m in regular VRC and can’t use those sensors, so how much did you rely on them?

Tabor is another member on the team. Without those sensors, the robot would not have been possible in the slightest. We need LIDAR to see objects on the field and we need an IMU to keep our position tracking under control.

Edit: I should say that you really don’t need those for VRC. 15 seconds is not long, we went for the entire match so we had to be rock solid. Unpowered quads are plenty.

I’m thinking of using this system for the 15 sec autonomous period and 60 sec programming skills. Of course, the field would always be set up, so I wouldn’t need the LIDAR, right? Also, would the tracking itself work well enough for 60 seconds without the LIDAR? BTW, thanks for answering all my questions.

We use the IMU to help with tracking, not the LIDAR. In skills, you might have a shot, but you should still bump into walls to line up and such. We needed it because we could have objects thrown on us, we could tip and our quads could come off the ground while still spinning, and we slam into the fence many times per match, so just quads was not going to cut it.

Thanks a lot for answering all my questions.
… I just now realized that you made a robot that drove completely autonomously, even during the driver skills part. That’s amazing! How many lines of code did it take for everything?

Here is the cloc printout for all repositories:

      45 text files.
      45 unique files.                              
     198 files ignored. v 1.60  T=0.23 s (160.6 files/s, 24912.6 lines/s)
Language                     files          blank        comment           code
C                               14            392            458           1651
C++                              9            279            373           1187
C/C++ Header                     5             64            148            265
CMake                            4            102            407            122
XML                              4             51            108             97
YAML                             1              5              0             29
SUM:                            37            893           1494           3351

My bad, I forgot to init a submodule. Take 2:

     113 text files.
     113 unique files.                                          
     243 files ignored. v 1.60  T=0.20 s (492.5 files/s, 58152.0 lines/s)
Language                     files          blank        comment           code
C                               51           1059            927           4671
C++                              9            279            373           1187
C/C++ Header                    31            335           1149            907
CMake                            4            102            407            122
XML                              4             51            108             97
YAML                             1              5              0             29
SUM:                           100           1831           2964           7013

Do I dare ask why?

Not actually much code if you look through the repos.
We were all bored with regular vex and wanted to do something different. This seemed like a fun and challenging project for the year.


And to show it could be done :).