I’ve implemented odometry on my robot using 5225A’s document as a guide. I also created a display program so I can see what the odometry is doing on the brain’s screen. My program works perfectly for translational motion, but for some reason it doesn’t read point turns (turning on the spot) correctly. If I do a point turn, the display shows the robot doing basically a donut rather than just turning on the spot. Has anyone else experienced this issue?
My tracking code:
while(1) {
    // Get encoder values (DEGREES)
    LPos = LTrack.position(rotationUnits::deg);
    RPos = RTrack.position(rotationUnits::deg);
    SPos = STrack.position(rotationUnits::deg);

    // Calculate the distance traveled by each tracking wheel (INCHES),
    // converting degrees to radians
    deltaDistL = ((LPos - LPrevPos) * M_PI / 180) * WHEEL_RADIUS;
    deltaDistR = ((RPos - RPrevPos) * M_PI / 180) * WHEEL_RADIUS;
    deltaDistS = ((SPos - SPrevPos) * M_PI / 180) * WHEEL_RADIUS;

    // Update previous values to be used next loop (DEGREES)
    LPrevPos = LPos;
    RPrevPos = RPos;
    SPrevPos = SPos;

    // Total change in each of the L and R encoders since last reset (INCHES)
    // These are used to calculate the absolute orientation of the bot
    totalDeltaDistL += deltaDistL;
    totalDeltaDistR += deltaDistR;

    // Calculate the current absolute orientation (RADIANS)
    currentAbsoluteOrientation = THETA_START - ((totalDeltaDistL - totalDeltaDistR) / (LTrackRadius + RTrackRadius));

    // Calculate the change in the angle of the bot (RADIANS)
    deltaTheta = currentAbsoluteOrientation - previousTheta;

    // Update the previous theta value (RADIANS)
    previousTheta = currentAbsoluteOrientation;

    // If we didn't turn, then we only translated
    if (deltaTheta == 0) {
        deltaXLocal = deltaDistS;
        // Could be either L or R, since if deltaTheta == 0 we assume they're equal
        deltaYLocal = deltaDistL;
    }
    // Else, calculate the new local position
    else {
        // Calculate the changes in the X and Y values (INCHES)
        // General equation is:
        // Chord = 2 * Radius * sin(deltaTheta / 2)
        deltaXLocal = 2 * sin(deltaTheta / 2.0) * ((deltaDistS / deltaTheta) + STrackRadius);
        deltaYLocal = 2 * sin(deltaTheta / 2.0) * ((deltaDistR / deltaTheta) + RTrackRadius);
    }

    // The average angle of the robot during its arc (RADIANS)
    avgThetaForArc = currentAbsoluteOrientation - (deltaTheta / 2);

    // Rotate the local displacement into global (field) coordinates
    deltaXGlobal = (deltaYLocal * cos(avgThetaForArc)) - (deltaXLocal * sin(avgThetaForArc));
    deltaYGlobal = (deltaYLocal * sin(avgThetaForArc)) + (deltaXLocal * cos(avgThetaForArc));

    // Update global positions
    xPosGlobal += deltaXGlobal;
    yPosGlobal += deltaYGlobal;

    // Loop every 10 milliseconds
    task::sleep(10);
}
deltaDistL = ((LPos - LPrevPos) * M_PI / 180) * WHEEL_RADIUS;
deltaDistR = ((RPos - RPrevPos) * M_PI / 180) * WHEEL_RADIUS;
deltaDistS = ((SPos - SPrevPos) * M_PI / 180) * WHEEL_RADIUS;
Do you set the LPrevPos etc. values before entering the loop? Just making sure that the calculation on the first loop iteration is correct…
All position and prevPosition values are declared as 0 outside the loop, and all the encoders are reset before starting the loop.
What is STrackRadius
set to?
These values are all the distances from the tracking center to the wheels:
double LTrackRadius = 5.75;
double RTrackRadius = 5.75;
double STrackRadius = 6.0;
To be clear, the tracking center is not actually moving in a donut?
Correct. The robot is turning about its center, not translating at all. However, the program that displays the odometry calculations shows the robot moving in a donut.
Another interesting note: I don’t remember this happening when I had my odometry setup with the old quad encoders. I don’t see a reason why the new sensors would cause this issue, though.
This behavior is what you would expect if your back tracking wheel did not work at all. Have you checked to ensure that it’s working properly?
The back wheel detects strafing just fine
I was able to fix the problem by changing the plus sign in:
((deltaDistR / deltaTheta) + RTrackRadius);
to a minus sign