I teach PLTW Automation & Robotics and we just upgraded last year so I don’t have a lot of experience using the line tracker with the V5 brains.
Here’s the issue we’re finding:
We want to use a single line tracker to detect whether or not a white block is in front of the sensor (this is the factory cell project from the older A&R curriculum; I really like this project, so I'm still using it. Using a line tracker for this purpose worked alright with the old VEX Cortexes).
When we look at the 3-Wire Ports screen on the brain, it pretty consistently shows values of 70-71% when there's nothing in front of the sensor, and values anywhere between 3% and 50% when the block is in front of it, depending on how close the block is (within a range of about 2 cm). That right there is counterintuitive: shouldn't the reflectivity it detects be near 0 when there's nothing there? And the code definitely isn't interpreting it as 70% when nothing is there, because we've tried coding "if reflectivity > 20%, start the motor" and the motor doesn't start, which it should if the code were getting the 70% shown on the brain's 3-Wire Ports screen.
We have tried different thresholds, and we can run the same program multiple times without changing anything and sometimes it works and sometimes it doesn't. I know the distance between the block and the sensor might explain some of the inconsistency, but that still wouldn't explain why the numbers don't make sense. Is there something I'm missing about how to interpret the line tracker input that the 3-Wire Ports screen shows? Do you have any tips for getting the line tracker to work more consistently?
We are having similar issues. We are on a dark surface trying to detect a lighter piece of tape for a sumo bot competition. I would think that we should write the following line of code:
if(LineTracker.reflectivity() > 60) to indicate that we have found the lighter-colored perimeter line.
But the code doesn't trigger unless we say: if(LineTracker.reflectivity() < 60), suggesting that the lighter-colored tape is reflecting less than the dark surface.
In any case, even when we write the comparison seemingly backwards, it still doesn't work reliably. As mentioned in the original comment, when we set the thresholds to match what the brain monitor suggests the Line Tracker is sensing (about 60% over the lighter line and 40% over the darker surface), those values don't always trigger appropriately in the code.
The dashboard doesn't know the difference between a line sensor, a potentiometer, or any other analog device, so it just shows the raw reading as a percentage of the 12-bit analog range. You could either ask for that raw value in code (i.e., 0 to 4095 for 12 bits) and use that as your threshold, or simply test different conditions, display your calculated reflectivity on the brain screen, and then decide on a suitable threshold from that.
For reference, we assume a raw digital value of 3000 would be 0% reflectivity and a raw value of 0 would be 100%.
As the line sensor works in the IR spectrum, darker materials can sometimes reflect more IR than lighter ones. It's not always intuitive; you have to test the specific materials you want to use.
Update:
Not that this has 100% fixed it (it's still being a bit inconsistent), but treating reflectivity percentages between about 18 and 40 as "block detected" seems to be working. It's still tricky to find a balance between false positives if the minimum is too low and false negatives (failing to detect the block when it's there) if the minimum is too high.
We're going to put in a button as a backup: if the sensor fails to detect the block, the user can press the button to trigger the function instead.
I'm still really curious why it reads about 70% reflectivity when there's nothing there and lower numbers when something is there. I get that there's ambient IR in the room, but why isn't it getting an even higher value when an object (white PLA, which shouldn't absorb much IR) is right in front of it?
Anyway, after some trial and error, that range of just under 20% up to 40% reflectivity seems to work OK.
Thank you everyone. This was extremely helpful. Do you have any suggestions for how we can take the analog values from the Line Sensor and calculate a threshold percentage? Is there a suggested formula? Also, our brains do show analog input values. Are these also potentially imprecise for a Line Sensor, or can we rely on them? Is there any benefit to outputting the values to the console in VEXcode Pro instead of using the values shown on the brain?