If you have any questions, don’t hesitate to ask!
full code please
Hello, this is very good work. I would like to know more details about the code added to the driver.
Would it be possible to provide the code to add to my V5,
and to start a build?
This would be a great study. I second @lacsap. I would love to see the code.
I imagine it’s similar to the code I posted here, with the addition of measuring the object’s apparent size to adjust the robot’s distance from the object.
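The distance-keeping idea could look something like this minimal sketch: the closer the object, the wider its bounding box appears, so you drive until the reported width matches a target. All names here (`TARGET_WIDTH`, `KP`, `drive_power_from_width`) are illustrative assumptions, not taken from any specific VEX API or from the code linked above.

```python
# Hypothetical sketch: hold distance using an object's apparent size.
# TARGET_WIDTH and KP are made-up tuning values, not from a real API.

TARGET_WIDTH = 120   # desired bounding-box width (pixels) at the ideal distance
KP = 0.8             # proportional gain for drive power

def drive_power_from_width(object_width):
    """Drive forward when the object looks too small (too far away)
    and backward when it looks too big (too close)."""
    error = TARGET_WIDTH - object_width
    power = KP * error                      # response scales with error
    return max(-100.0, min(100.0, power))  # clamp to the motor power range
```

You would feed `object_width` from whatever width field your vision sensor reports each frame, and send the returned power to both drive motors.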
It’s also probably similar to what I made last year =)
The code can be found here.
In your demonstration, the robot seemed unnecessarily jerky. Is it because you are telling the robot to turn “right” or “left” depending on the direction of the error?
The better solution is to use a P loop to scale the motor power proportionally to the error.
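A minimal sketch of that P loop for the turning axis, assuming the sensor reports the object's horizontal pixel position in a frame assumed here to be 316 pixels wide (verify against your sensor's actual resolution). The names and gain are illustrative, not from any real API:

```python
# Proportional turn control: turn power scales with how far the object
# is from the frame center, instead of a fixed left/right command.
# FRAME_WIDTH and KP_TURN are assumptions to tune on a real robot.

FRAME_WIDTH = 316            # assumed camera frame width in pixels
FRAME_CENTER = FRAME_WIDTH / 2
KP_TURN = 0.5                # proportional gain

def turn_power(object_x):
    """Signed turn power: small error -> gentle correction, so the
    robot converges smoothly instead of oscillating left/right."""
    error = object_x - FRAME_CENTER
    return KP_TURN * error
```

In the drive loop you would then apply something like `left = base + turn_power(x)` and `right = base - turn_power(x)`, so the correction fades out as the object centers.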
Nice explanation at the end of the video btw
My color signature is always poorly tuned. Maybe it’s because the red and green objects are translucent. I wrote the signature code myself, and its performance is concerning.
How far away did the tracking work? When my team tried something similar, the vision sensor started failing to track objects once it was more than ~48" from its target (most likely due to poor resolution).