Please describe your experiences with the V5 Vision Sensor.

  1. 2 weeks ago

    I have yet to get my hands on a V5 Vision Sensor (ordered in July, shipped in late December, and now lost in a logistical swamp locally known as FedEx), so I've been wondering what sort of experiences teams have had with it. Please share your experiences, how you've used it, what programming language you've been using, and that sort of thing. I'm debating whether or not it's even worth trying to fuss with these sensors this season once they arrive - if they ever do. Thanks.

  2. I’m not our programmer, but I’ll do my best to provide some insight. We recently started messing with the vision sensor. I know we use VCS, and I believe C++. At somewhat close range the auto-aim for angling works quite nicely for us. I know we’re currently working on getting the height to auto-adjust as well by backing up/moving forward. Overall, if your robot is consistent you can do some cool things with it, but it’s definitely not the be-all-end-all solution.
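    A rough illustration of that auto-adjusting idea, written against the VCS/VEXcode-style vex::vision C++ API: drive forward or backward until the flag reaches a target width in the image. The port numbers, the SIG_GREEN_FLAG signature values, the target width, and the gain below are placeholder assumptions to tune, not the team's actual code.

      // Minimal range-holding sketch: drive until the flag reaches a target pixel width.
      #include "vex.h"
      using namespace vex;

      // Placeholder signature -- set the real values from the Vision Utility.
      vision::signature SIG_GREEN_FLAG (1, 0, 0, 0, 0, 0, 0, 2.5, 0);
      vision VisionSensor (PORT10, 50, SIG_GREEN_FLAG);
      motor  LeftDrive  (PORT1);
      motor  RightDrive (PORT2, true);

      int main() {
        const int    targetWidth = 60;   // desired flag width in pixels (assumed)
        const double kP          = 0.8;  // proportional gain (assumed)

        while (true) {
          VisionSensor.takeSnapshot(SIG_GREEN_FLAG);
          if (VisionSensor.largestObject.exists) {
            // Flag smaller than target -> too far away -> drive forward, and vice versa.
            int    error = targetWidth - VisionSensor.largestObject.width;
            double drive = kP * error;
            LeftDrive.spin(directionType::fwd, drive, velocityUnits::pct);
            RightDrive.spin(directionType::fwd, drive, velocityUnits::pct);
          } else {
            LeftDrive.stop();
            RightDrive.stop();
          }
          task::sleep(20);  // the sensor only produces a new frame every ~20 ms
        }
      }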

  3. Rick TYler

    Jan 9 · Teachers/Coaches · Event Partner · V5 Beta Moderator · Redmond, Washington · Founder of Exothermic Robotics

    We've been playing with them around the office (and using them in curriculum development) since the beta days. As long as you remember that they are tracking reflected ambient light and not a radiant source, we've been really happy with them.

    Of course, we always program with Robot Mesh Studio, and have sample programs available in Blockly, Python and C++. Remember that RM Studio uses the standard VEX low-level API calls, and so most source code is compatible between RM Studio C++ and VCS C++ Pro.

    Here are a couple of links:

    Robot built and programmed by teachers at one of our training courses last summer.

    Here are sample proportional solutions for "aiming" a robot towards a vision target.

    Blockly: http://ow.ly/UgzW50jSCzl
    Python: http://ow.ly/UWuh50jSCJb
    C++: http://ow.ly/2L8q50jSCJB
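    For readers who can't follow the links, here is a minimal sketch of the same proportional-aiming idea in C++ (it is not a copy of the linked samples). The port numbers, signature values, and gain are assumptions to tune; 316 pixels is the sensor's horizontal resolution, so 158 is the image center.

      // Minimal proportional aiming sketch: turn until the target is centered.
      #include "vex.h"
      using namespace vex;

      // Placeholder signature -- set the real values from the Vision Utility.
      vision::signature SIG_FLAG (1, 0, 0, 0, 0, 0, 0, 3.0, 0);
      vision VisionSensor (PORT10, 50, SIG_FLAG);
      motor  LeftDrive  (PORT1);
      motor  RightDrive (PORT2, true);

      int main() {
        const int    imageCenterX = 158;  // half of the 316 px image width
        const double kP           = 0.3;  // proportional gain (assumed)

        while (true) {
          VisionSensor.takeSnapshot(SIG_FLAG);
          if (VisionSensor.largestObject.exists) {
            // Positive error = target is right of center, so turn right.
            int    error = VisionSensor.largestObject.centerX - imageCenterX;
            double turn  = kP * error;
            LeftDrive.spin(directionType::fwd, turn, velocityUnits::pct);
            RightDrive.spin(directionType::rev, turn, velocityUnits::pct);
          } else {
            LeftDrive.stop();
            RightDrive.stop();
          }
          task::sleep(20);
        }
      }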

  4. Deleted last week by Cam
  5. @Cam With some lighting the vision sensor is terrible.

    Are you saying that with different kinds of lighting you are getting different kinds of results? Would you mind elaborating on that a little? For example, which kinds of lighting gave you problems?

  6. Deleted last week by Cam
  7. Depending on how far away you are, the green flag can wind up being only a couple pixels wide. With a target that small, you have to have incredibly precise signature tuning.
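    One common way to cope with returns that small is to simply ignore any detection below a minimum pixel width, so signature noise at long range doesn't get trusted. A small helper sketch along those lines (the 8-pixel threshold is an arbitrary assumption):

      #include "vex.h"
      using namespace vex;

      // Returns true only if the camera sees a match wide enough to trust.
      bool flagDetected(vision &cam, vision::signature &sig, int minWidth = 8) {
        cam.takeSnapshot(sig);
        return cam.largestObject.exists && cam.largestObject.width >= minWidth;
      }

    Anything narrower than the threshold is treated as "nothing seen," which is usually better than steering toward a few noisy pixels.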

  8. @FullMetalMentor For example, which kinds of lighting gave you problems?

    The success of the signatures heavily depends on the lighting they were taken in and the lighting they're actually used in. In tournament use, you would probably want to set the signatures straight from the field for optimal results.
    Our team found that casting a bit of a shadow on the target helped both when setting signatures and when getting the vision sensor to actually pick up the target.

  9. Adam T

    Jan 10 · Event Partner · Jenison, MI · Jenison Robotics
    Edited last week by Adam T

    @FullMetalMentor Are you saying that with different kinds of lighting you are getting different kinds of results? Would you mind elaborating on that a little? For example, which kinds of lighting gave you problems?

    For one of our teams at a league night, the cafeteria lighting was just too poor for the vision sensor to detect flags reliably. It worked OK in gym lighting and classroom lighting.

    At one event the host school's colors prominently include maroon, and every banner hanging high on the wall in the gym registered as a red flag. It was a real issue. Christmas time was not good either; you never knew when someone in a Santa hat would walk behind the field and trigger the sensor.

  10. Adam T

    Jan 10 · Event Partner · Jenison, MI · Jenison Robotics
    Edited last week by Adam T

    For what it's worth, the team using the vision sensor spent December rebuilding their robot. They just got their vision sensor working again last night, and they are VERY glad to have it back.

  11. @Adam T For what it's worth, the team using the vision sensor spent December rebuilding their robot. They just got their vision sensor working again last night, and they are VERY glad to have it back.

    Do you mind if I ask what you're using the Vision Sensor for? Are you trying to do things like acquire balls and/or caps? Or are you using it for aiming at flags? Line following, maybe? Is there any particular application you've tried with it that simply doesn't work?

  12. Adam T

    Jan 10 · Event Partner · Jenison, MI · Jenison Robotics

    They are using it as an aim bot for flags. They haven't really tried it for any other use. They could add a second vision sensor for other tasks, but then a single battery might not get them through a tournament being drained like that.

    Another of our teams tried to use a vision sensor to align with posts, but black/grey doesn't register for a signature (so I've been told).

  13. @Adam T ...They could add a second vision sensor for other tasks, but then a single battery might not get them through a tournament being drained like that....

    Do the vision sensors use a lot of energy? Does it matter if you're using the wifi-enabled part of it?

  14. The wifi eats a ton of power.

  15. @John TYler The wifi eats a ton of power.

    Okay, but we can disable that for the Vision Sensor, can't we?

  16. Adam T

    Jan 11 · Event Partner · Jenison, MI · Jenison Robotics

    I'll post it if I get any data, but the team can definitely operate comfortably with the vision sensor on their robot. And yes, the WiFi can be disabled, and it helps a lot. Just anecdotally, the more components they add, the more power they use. I would feel more comfortable if each team had two batteries, but they get by with one battery and one vision sensor.

  17. 3038922

    Jan 11 · China Ares

    Is there any way to adjust the parameters adaptively? I don't understand what those parameters mean at all, and I don't even know how much time to allow for each loop. Entering the parameter data by hand is too difficult.
    PixyMon is very good software, but it doesn't seem to support VEX, and importing the data is very tedious.
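    On the loop-timing part of the question: a common pattern is to poll the sensor roughly once per camera frame, about every 20 ms; polling much faster than that just rereads the same data. A minimal sketch, with placeholder port and signature values:

      #include "vex.h"
      using namespace vex;

      vision::signature SIG_FLAG (1, 0, 0, 0, 0, 0, 0, 3.0, 0);  // set from the Vision Utility
      vision VisionSensor (PORT10, 50, SIG_FLAG);

      int main() {
        while (true) {
          int count = VisionSensor.takeSnapshot(SIG_FLAG);  // returns how many objects matched
          if (count > 0) {
            // VisionSensor.largestObject now holds the biggest match for this frame.
          }
          task::sleep(20);  // roughly one loop per camera frame
        }
      }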

  18. @FullMetalMentor As of a previous VEXos version, it is disabled by default. I know it was on by default in beta, but I don't know which VEXos version changed it. With 1.0.5 you definitely have to opt in to using the camera wifi.

  19. Has anyone figured out how to set up or program the vision sensor so that it will work with any background? We had a tournament where there was a very large blue background that interfered with all of our programs.

  20. You could check the size of your target. If it's too large, assume the background is the wrong color and ignore those vision returns. There would be no way to pick out your primary target using just a blue signature; you would have to fall back on either a green signature or a blue-green color code.
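    A sketch of that size-check-and-fall-back idea in C++; the 150-pixel cutoff, port, and signature values are assumptions you would tune for your own robot and field.

      #include "vex.h"
      using namespace vex;

      // Placeholder signatures -- set the real values from the Vision Utility.
      vision::signature SIG_BLUE  (1, 0, 0, 0, 0, 0, 0, 3.0, 0);
      vision::signature SIG_GREEN (2, 0, 0, 0, 0, 0, 0, 3.0, 0);
      vision VisionSensor (PORT10, 50, SIG_BLUE, SIG_GREEN);

      int main() {
        const int maxFlagWidth = 150;  // anything wider is assumed to be background

        while (true) {
          VisionSensor.takeSnapshot(SIG_BLUE);
          bool blueLooksLikeFlag = VisionSensor.largestObject.exists &&
                                   VisionSensor.largestObject.width < maxFlagWidth;
          if (!blueLooksLikeFlag) {
            // Blue return is missing or suspiciously big -- fall back to green.
            VisionSensor.takeSnapshot(SIG_GREEN);
          }
          if (VisionSensor.largestObject.exists) {
            // VisionSensor.largestObject.centerX is the x-position to aim at.
          }
          task::sleep(20);
        }
      }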
