My team is using three tracking wheels for odometry, and we are trying to use the left and right tracking wheels to track the robot's heading. We followed this guide and the BLRS wiki, but the angle tracking isn't very accurate; it is sometimes more than 50 degrees off. Is there a better algorithm for this, or should we stick with the inertial sensor? We used the inertial sensor last season, but it was often a few degrees off by the end of a skills run.
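For reference, our heading update is roughly the sketch below (the sL and sR numbers are placeholders, not our real measurements; deltaL and deltaR are the change in each tracking wheel's travel per loop):

```cpp
// Rough sketch of the heading update we're doing, based on the wiki
// formula. sL and sR are the distances from the tracking center to the
// left and right tracking wheels; deltaL and deltaR are each wheel's
// travel since the last update, all in the same linear units (inches).
const double kPi = 3.14159265358979323846;
const double sL = 3.625;   // placeholder value, not our real measurement
const double sR = 3.625;   // placeholder value, not our real measurement

double headingRad = 0.0;   // accumulated heading in radians

// Called every loop with the change in each tracking wheel's distance.
void updateHeading(double deltaL, double deltaR) {
    // Difference in arc length divided by the wheel spacing gives the
    // change in heading for this iteration.
    headingRad += (deltaL - deltaR) / (sL + sR);
}

double headingDeg() {
    return headingRad * 180.0 / kPi;
}
```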
It always stays in the general area of where it should be, but it isn't always accurate. If the robot turns right, the angle correctly moves in that direction, but it doesn't always move by the exact amount it is supposed to.
We already tried something like this, but it isn't scaled correctly. One value of sL and sR works well when we rotate the robot 1080 degrees, but the heading is far off when it rotates a few more full turns.
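To be concrete, the way we tuned it was roughly this: spin the robot a known total angle and back out the effective sL + sR from the encoder travel (the readings below are made-up numbers, just to show the math):

```cpp
// Sketch of how we tuned the track width: turn the robot in place by a
// known total angle, then solve for the effective wheel spacing that
// makes the encoder travel match that angle. Readings are hypothetical.
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979323846;

    // Total tracking-wheel travel recorded while turning in place (inches).
    double totalLeft  = 68.2;   // hypothetical reading
    double totalRight = -67.9;  // hypothetical reading

    // The robot was turned 1080 degrees (three full rotations) by hand.
    double actualTurnRad = 1080.0 * kPi / 180.0;

    // Effective track width (sL + sR) that makes the math match reality.
    double effectiveWidth = (totalLeft - totalRight) / actualTurnRad;
    std::printf("Use sL + sR = %.3f inches\n", effectiveWidth);
    return 0;
}
```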
Check your physical trackers. Make sure they rotate freely when spun, that they don't drag on the field tiles, and that they aren't clogged with dust (a common problem if you're using the red optical shaft encoders). You can test this by spinning them and watching the readout on the brain screen. Also, the inertial sensor works like a charm if you throw a couple of wall resets into your code.
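If it helps, a wall reset can be as simple as overwriting your tracked heading with a known value whenever the drive is squared up against a wall. Rough sketch of the idea below; setTrackedHeading is a stand-in for whatever call your odometry or IMU setup uses to overwrite heading, and the 30-degree sanity window is arbitrary:

```cpp
// Minimal wall-reset idea: when the robot is pushed flat against a wall
// whose orientation you know, overwrite the tracked heading with that
// known angle so drift can't accumulate across the whole run.
#include <cmath>

static double trackedHeadingDeg = 0.0;  // stand-in for your odometry/IMU heading

// Placeholder for whatever your framework uses to overwrite the heading
// (e.g. resetting the IMU rotation or the odometry state).
void setTrackedHeading(double deg) {
    trackedHeadingDeg = deg;
}

// Call this once the drive is squared up against a wall of known angle.
void wallResetHeading(double wallHeadingDeg) {
    // Sanity check (with wraparound) so a bad bump doesn't snap the
    // heading to a wall you weren't actually touching.
    double error = std::fabs(std::remainder(trackedHeadingDeg - wallHeadingDeg, 360.0));
    if (error < 30.0) {
        setTrackedHeading(wallHeadingDeg);
    }
}
```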