Purdue Dynamic Framework Demonstration

Hey everyone, it has been a long time since I have posted on the forums, but callouts have concluded and we are officially kicking off the 2012-2013 season, which means we have to disassemble our old robots. :frowning:

One robot in particular that gained a lot of attention at Worlds this past year, and that I wanted to share, was Honey Badger, our interaction zone bot. On him we had a bit of software we had been working on since last September: a dynamic object recognition and allocation system, or DORA for short. Using two ultrasonic sensors and a webcam, Honey Badger is able to dynamically find objects on the field and grab them. This was designed so that after the pyramid was destroyed we could still easily pick up objects and deliver them to our isolation zone robot, despite not knowing where the objects might be on the field.
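For anyone curious about the general idea, here is a rough illustration (not our actual DORA code): threshold the webcam image for an object's color, steer based on where the blob's centroid sits relative to the image center, and use the ultrasonic distance to decide when the object is close enough to grab. A minimal Python/OpenCV sketch along those lines, assuming hypothetical `read_ultrasonic_cm()`, `steer()`, and `grab()` callbacks wired to the robot, might look like this:

```python
# Rough illustration only -- not the actual DORA code.
import cv2
import numpy as np

# Example HSV bounds for a red game object -- these would need tuning.
RED_LOWER = np.array([0, 120, 70])
RED_UPPER = np.array([10, 255, 255])

def find_object_centroid(frame, lower, upper):
    """Threshold the frame for one color and return the largest blob's centroid, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

def track_and_grab(read_ultrasonic_cm, steer, grab, grab_range_cm=15):
    """Steer toward the largest matching blob until the ultrasonic says it is in reach.

    read_ultrasonic_cm, steer, and grab are hypothetical callbacks to the drive base and claw.
    """
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            centroid = find_object_centroid(frame, RED_LOWER, RED_UPPER)
            if centroid is None:
                steer(0.0)          # nothing seen: hold course (or spin to search)
                continue
            # Error is how far the blob sits from the image center, normalized to -1..1.
            error = (centroid[0] - frame.shape[1] / 2) / (frame.shape[1] / 2)
            steer(0.5 * error)      # simple proportional turn command
            if read_ultrasonic_cm() < grab_range_cm:
                grab()
                break
    finally:
        cap.release()
```

The real system on Honey Badger is more involved than this, but the basic see-a-blob, steer-toward-it, grab-when-the-ultrasonics-agree loop is the core idea.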

Video:
http://www.youtube.com/watch?v=lPtOc66stGk


This looks so cool. Thanks for the share.

Very cool! Could you go into more detail about how it works? When it drives toward objects it seems very unstable, especially for the last red object. Is there a reason for this? Is that just a way to avoid the blue objects?