I want to create a program for a VR headset that interfaces with the VEX Brain and: displays status info about the robot to the side of the driver's view (in VR), gives warnings about the opposing alliance's activities, gives suggestions about what to do, streams live footage from the AI Vision Sensor on the robot to the headset, keeps track of rings and the score, and automates simple tasks that could be handed off to an algorithm, like corner guarding/battles, defending a specific target, and taking the optimal route to the hang.
Would it be legal?
Is it possible to get a VR headset and a VEX Brain to communicate, or even just for the Brain to transmit information?
How would you track rings, mobile goals (MGs), and enemy bots so that the VR program can keep track of the score and/or issue warnings?
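For the score-tracking part, one approach (independent of how the tracking data actually gets produced) is to keep a running model of which rings are on which stakes and recompute the score from that. Here's a minimal sketch in Python; the point values and the `ScoreTracker` class are placeholder assumptions for illustration, not the official scoring rules:

```python
# Hypothetical sketch: estimate a running score from tracked ring placements.
# RING_POINTS and TOP_RING_BONUS are placeholder values, NOT official rules.
RING_POINTS = 1          # assumed value of any scored ring
TOP_RING_BONUS = 2       # assumed extra value for the top ring of a stake

class ScoreTracker:
    def __init__(self):
        # stake_id -> list of ring colors ("red"/"blue"), bottom to top
        self.stakes = {}

    def add_ring(self, stake_id, color):
        """Record that a ring of the given color was scored on a stake."""
        self.stakes.setdefault(stake_id, []).append(color)

    def score(self, alliance_color):
        """Recompute the total for one alliance from the tracked state."""
        total = 0
        for rings in self.stakes.values():
            for i, color in enumerate(rings):
                if color == alliance_color:
                    total += RING_POINTS
                    if i == len(rings) - 1:  # top ring gets the bonus
                        total += TOP_RING_BONUS
        return total
```

The nice property of "recompute from state" (rather than incrementing a counter) is that a single bad detection only corrupts one stake's record, which you can correct later, instead of permanently skewing a running total.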
How do I broadcast the AI Vision Sensor feed to an outside computer/headset?
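Streaming raw video off the Brain is likely impractical given its link bandwidth; a more realistic route is to send only the AI Vision Sensor's detection data (class IDs and bounding boxes) to the computer and render overlays there. A sketch of a possible wire format using Python's `struct` module is below; the packet layout and field sizes are my own assumptions, not any VEX protocol:

```python
import struct

# Hypothetical wire format for one detection: class id (uint8) plus
# bounding-box x, y, width, height (uint16 each), big-endian.
DET_FMT = ">BHHHH"
DET_SIZE = struct.calcsize(DET_FMT)  # 9 bytes per detection

def encode_detections(dets):
    """dets: list of (class_id, x, y, w, h) tuples -> bytes payload.
    First byte is the detection count, then fixed-size records."""
    return struct.pack(">B", len(dets)) + b"".join(
        struct.pack(DET_FMT, *d) for d in dets)

def decode_detections(payload):
    """Inverse of encode_detections: bytes -> list of tuples."""
    count = payload[0]
    out = []
    for i in range(count):
        off = 1 + i * DET_SIZE
        out.append(struct.unpack(DET_FMT, payload[off:off + DET_SIZE]))
    return out
```

You'd send these packets over whatever channel the Brain exposes (e.g. its USB serial port) and decode them on the computer/headset side; a few dozen bytes per frame is a far easier bandwidth problem than video.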
What are some other things that could be done automatically (like the auto corner guard)?
How do you think I should implement those things (in software and hardware)?
If you set aside the whole Brain connection and pretend the VR headset is completely isolated from the Brain, that could work, but you'd have to jump through A LOT of hoops to get usable data.
It would be a fun project in the off-season, maybe with a simplified version of the game, but during the season, spending 10% of that time on driver practice will make your game better.
If you look at the full rule (added emphasis is mine):
The one thing I think might be possible is:
This shouldn't require VR or anything; you just need to determine where the robot is relative to the ladder (you could maybe use the Vision Sensor to find the ladder). Obstacles between you and the ladder could be detected with a Distance Sensor.
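The steer-toward-the-ladder idea above can be sketched as a simple control step: turn so the ladder's bounding box is centered in the camera image, and stop driving forward when the Distance Sensor reports something close ahead. All the constants here (resolution, field of view, obstacle threshold) are assumed example values, not actual sensor specs:

```python
# Hypothetical sketch: steer toward the ladder using a vision bounding box,
# pausing when a front distance sensor sees an obstacle.
# These constants are ASSUMPTIONS for illustration, not real sensor specs.
IMAGE_WIDTH_PX = 320       # assumed horizontal camera resolution
HFOV_DEG = 60.0            # assumed horizontal field of view
OBSTACLE_MM = 300          # assumed "something is in the way" threshold

def heading_error_deg(ladder_center_x):
    """Degrees to turn so the ladder is centered (negative = turn left).
    Uses a linear pixel-to-angle approximation, fine for small offsets."""
    offset_px = ladder_center_x - IMAGE_WIDTH_PX / 2
    return offset_px * (HFOV_DEG / IMAGE_WIDTH_PX)

def drive_command(ladder_center_x, front_distance_mm):
    """One control step: return (turn_deg, go_forward)."""
    if front_distance_mm < OBSTACLE_MM:
        return (0.0, False)  # obstacle ahead: stop, let avoidance take over
    return (heading_error_deg(ladder_center_x), True)
```

In practice you'd run this in a loop on the Brain, feeding `ladder_center_x` from the vision object's reported center and `front_distance_mm` from the Distance Sensor, and scale the turn output into motor commands.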