So Team GOAT would like to present some explanations of the Lidar code that ran at the VEXU World Championship. We thought it would be a fun and interesting challenge to write a dynamic autonomous (one that locates balls around the field on its own and picks them up) that we had programmed ourselves 100% (I believe it was the first fully autonomous, individually programmed VEX robot ever). This is obviously a difficult challenge in itself, but what made it even more difficult is that we decided to forgo any additional processing power and did it all on the Cortex in ROBOTC. I would do a full code release, but the issue is that we wrote it all in the span of 3 weeks, and there are lots of hacked-together and/or bloated aspects that are not something prospective programmers should be exposed to. (If anyone wants to play around and explore it, I can add you to the private git repo.) So instead, here is an explanation of the steps that went into autonomously locating balls.
To ensure accuracy, steps 1-4 were run 3 times, and a ball was only confirmed if it was computed to exist in at least 2 of the 3 iterations.
1 Parsing
Consists of reading in packets made up of a start byte, a starting index, 4 consecutive distances in mm, and an RPM measurement of the motor. This data is read and saved into an array of shorts, beginning at the starting index and ending at the starting index + 3.
http://i.imgur.com/QNj550v.jpg raw data
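A minimal sketch of the parsing step. The struct layout and field widths here are illustrative guesses at the wire format described above, not the exact packet definition:

```c
#define NUM_POINTS 360

/* Hypothetical packet layout: start byte, packet index, motor RPM, then
   4 consecutive distances in mm. Field names/widths are assumptions. */
typedef struct {
    unsigned char start;   /* fixed start-of-packet marker */
    unsigned char index;   /* packet index; first angle = index * 4 */
    unsigned short rpm;    /* motor speed measurement */
    short dist[4];         /* 4 consecutive distances, mm */
} LidarPacket;

short scan[NUM_POINTS];    /* one full revolution of distances */

/* Copy a packet's 4 distances into the scan array at the right angles. */
void parsePacket(const LidarPacket *p) {
    int base = p->index * 4;
    int i;
    for (i = 0; i < 4; i++) {
        if (base + i < NUM_POINTS)
            scan[base + i] = p->dist[i];
    }
}
```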
2 Chassis filter
Points falling in predefined angular ranges that are blocked by the chassis are eliminated from the list, so no processing time is wasted on useless data.
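A sketch of that filter, assuming the blocked ranges are stored as angle intervals and invalid points are flagged with a sentinel (both assumptions on my part; the real ranges were measured on the robot):

```c
#define NUM_POINTS 360
#define INVALID (-1)

/* Hypothetical blocked range: start/end angles, inclusive. */
typedef struct { int start; int end; } AngleRange;

/* Mark every point inside a blocked range as invalid so the later,
   more expensive filters skip chassis returns entirely. */
void chassisFilter(short *scan, const AngleRange *blocked, int nRanges) {
    int r, a;
    for (r = 0; r < nRanges; r++)
        for (a = blocked[r].start; a <= blocked[r].end; a++)
            scan[a % NUM_POINTS] = INVALID;
}
```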
3 Triangle filter
While still in polar coordinates, the data was searched for every distinct local minimum spanning 3 points, such that neither neighbouring point is more than 50 mm farther away than the middle point. This test was designed to be extremely cheap computationally, as it had to run on more or less all 360 data points. Each point was checked against its 2 neighbours to see if it passed the filter, and keeping everything in polar coordinates kept the cost down. This produced a list of possible balls.
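The per-point check can be sketched as below; a sketch under the assumptions that invalid points are non-positive and the 50 mm threshold applies to both neighbours:

```c
#define NUM_POINTS 360
#define TRIANGLE_THRESH 50  /* mm: neighbours may be at most this much farther */

/* Cheap polar-space test: point a is a possible ball if it is a distinct
   local minimum and both neighbours are no more than TRIANGLE_THRESH mm
   farther away. Runs on raw polar data, so no trig is needed. */
int isTriangle(const short *scan, int a) {
    short left  = scan[(a + NUM_POINTS - 1) % NUM_POINTS];
    short mid   = scan[a];
    short right = scan[(a + 1) % NUM_POINTS];
    if (mid <= 0) return 0;               /* skip invalid/filtered points */
    return mid < left && mid < right      /* distinct local minimum */
        && left  - mid <= TRIANGLE_THRESH
        && right - mid <= TRIANGLE_THRESH;
}
```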
4 Circular filter
This is the first test outside of polar coordinates. It uses Cartesian coordinates in a reference frame with the robot at the origin and the X axis extending out toward each possible ball, which avoids the cost of normalizing the data to the global reference frame. In this reference frame, a circle of known radius was calculated that crossed both the point to the left and the point to the right of the center (remember, any 2 points within 2r of each other are guaranteed to have 2 circles of radius r that intersect them). The middle point was then used to calculate the certainty of a ball based on its distance from the circle's center relative to the known radius. Any candidate with more than 5% uncertainty was refused. This test was extremely expensive due to the nature of the math involved, but extremely restrictive on non-ball clusters of points.
Here is half of the general equation for computing a circle from 2 points, after we did a lot of work optimizing it:
circle.k = (-circle.a * sqrt(-bd2 * (a2 - 2 * circle.a * circle.c + b2 - 2 * circle.b * circle.d + c2 + d2)
                                  * (a2 - 2 * circle.a * circle.c + b2 - 2 * circle.b * circle.d + c2 + d2 - 4 * R2))
            + circle.c * sqrt(-bd2 * (a2 - 2 * circle.a * circle.c + b2 - 2 * circle.b * circle.d + c2 + d2)
                                   * (a2 - 2 * circle.a * circle.c + b2 - 2 * circle.b * circle.d + c2 + d2 - 4 * R2))
            + a2 * b2 - a2 * d2
            - 2 * circle.a * b2 * circle.c + 2 * circle.a * circle.c * d2
            + pow(circle.b, 4) - 2 * pow(circle.b, 3) * circle.d
            + b2 * c2 + 2 * circle.b * pow(circle.d, 3)
            - c2 * d2 - pow(circle.d, 4))
           / (2 * (circle.b - circle.d) * (a2 - 2 * circle.a * circle.c + b2 - 2 * circle.b * circle.d + c2 + d2));
5 Normalize data so it is centered on the center of the robot
6 Compute global coordinates of ball
7 Blacklist balls in protected or inaccessible zones
8 Compute closest ball from list of allowed confirmed balls
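Steps 7-8 together can be sketched as below. The rectangular no-go zone is a placeholder; the real blacklist covered the protected and physically unreachable areas of the field:

```c
typedef struct { double x, y; } Ball;

/* Hypothetical rectangular blacklisted zone, in field coordinates (mm). */
#define ZONE_X_MIN 0.0
#define ZONE_X_MAX 600.0
#define ZONE_Y_MIN 0.0
#define ZONE_Y_MAX 600.0

int isBlacklisted(Ball b) {
    return b.x >= ZONE_X_MIN && b.x <= ZONE_X_MAX
        && b.y >= ZONE_Y_MIN && b.y <= ZONE_Y_MAX;
}

/* Skip blacklisted balls and return the index of the closest allowed
   confirmed ball to the robot, or -1 if none remain. Squared distance
   avoids a sqrt per ball. */
int closestBall(const Ball *balls, int n, double robotX, double robotY) {
    int best = -1, i;
    double bestD2 = 0.0;
    for (i = 0; i < n; i++) {
        double dx, dy, d2;
        if (isBlacklisted(balls[i])) continue;
        dx = balls[i].x - robotX;
        dy = balls[i].y - robotY;
        d2 = dx * dx + dy * dy;
        if (best < 0 || d2 < bestD2) { best = i; bestD2 = d2; }
    }
    return best;
}
```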
Updates coming if there is interest.