Custom Joystick for Handicapped User

I have a student with a severe handicap in my Robotics class. He is only able to move his head left and right (he has no movement in his arms, legs, toes, etc.). I would like him to be able to move the robot. I was considering adapting a partner joystick somehow with a large push button.
Anyone ever tried something like that before?
This would be for classroom use and not for any vex event or competition.
Thanks!

This may sound weird, but what about using the accelerometer on the joystick and strapping the joystick to his head? Tilting his head left and right would control turning, and tilting his head forward and backward would control going forward and backward.
Edit//: Doing this would actually allow him to compete, because using the accelerometer in the remote is VEX-legal.

The code should look something like this:
// arcade-style mixing: AccelY is forward/back tilt, AccelX is left/right tilt
motor[right] = vexRT[AccelY] - vexRT[AccelX];
motor[left]  = vexRT[AccelY] + vexRT[AccelX];
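
Expanded into a standalone program, a minimal sketch might look like the following. It assumes RobotC on the Cortex with drive motors named right and left, and the deadband of 15 is just a guess so the robot holds still while his head is roughly level:

task main()
{
  while(true)
  {
    int fwd  = vexRT[AccelY];   // forward/backward head tilt
    int turn = vexRT[AccelX];   // left/right head tilt

    // ignore small tilts so the robot stays still when his head is close to level
    if(abs(fwd) < 15)  fwd  = 0;
    if(abs(turn) < 15) turn = 0;

    motor[right] = fwd - turn;
    motor[left]  = fwd + turn;

    wait1Msec(20);
  }
}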

Good idea! Attach it to a helmet. I would love to see this in competition, along with other enabling technologies for students who need reasonable accommodations.

Love to see you demo different approaches at Worlds (in the pit :) ).

Challenge Accepted! Maybe I can also take advantage of my 3D printer for a more professional mount for the remote :).

Team 6135K worked on various alternative input methods for their FUTURE Foundation Robot construction challenge entry, including electromyography (measurement of nerve signals in muscles), electrooculography (measurement of eye position), and electroencephalography (measurement of brain waves). They called it “Emoto-Bot” because the idea is to improve the emotional state of the person being assisted. I am dead certain they would be happy to help you with this.

A description of their entry is here:
http://challenges.robotevents.com/challenge/65/entry/3887

The essay describing it is here:

http://challenges.robotevents.com/uploads/0005830_original.pdf

Watch this video:

The first part of the video shows simple encoders. At the 53 second mark, they show control of VEX motors using nerve firings. You could measure nerve firings in the neck or face for your purposes.

At 1:25 they’re wiring to measure eye position. Note that the values shown in the debugStream change as Spencer (the test subject) moves his eye up and down. That value is read directly from an analog port on the cortex.
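
If you want to poke at a signal like that yourself, watching a raw analog value in the debug stream takes only a few lines of RobotC. This is just a sketch; it assumes the sensor is wired to a Cortex analog port and configured with the (hypothetical) name eyeSensor:

task main()
{
  while(true)
  {
    // Cortex analog ports return 0-4095; watch how the value moves with the eye
    writeDebugStreamLine("eye sensor: %d", SensorValue[eyeSensor]);
    wait1Msec(100);
  }
}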

They used one cortex to take the data and process it into joystick commands, then hooked into the main joystick through the partner joystick port. Using this system and a program on the data acquisition cortex, they could control any robot.
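
On the robot side, the nice part of feeding commands in through the partner joystick port is that the drive program never has to know where the data came from; it just reads the Xmtr2 channels. A rough tank-drive sketch, assuming motors named left and right and the usual Ch2/Ch3 assignments (only an example):

task main()
{
  while(true)
  {
    // whatever the data-acquisition cortex sends shows up as ordinary partner-joystick values
    motor[left]  = vexRT[Ch3Xmtr2];
    motor[right] = vexRT[Ch2Xmtr2];
    wait1Msec(20);
  }
}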

As I said, they’d be happy to help.

There is no restriction on mechanically mounting the joystick controllers in VEX - just no tampering with the electronics inside.

I'll give a club shirt for a demo of your system.

Thank you, everyone, for your suggestions.
I was able to modify a joystick so that it mounts to my student's headpiece, 3D printed a button to go over the existing button, and programmed the joystick to work as a partner joystick. It was a fun project and the results speak for themselves…
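
For anyone wanting to replicate the partner-joystick side, the program can be as simple as mapping the big button to a motion. The button channel and speed below are only an illustration, not the code actually used:

task main()
{
  while(true)
  {
    if(vexRT[Btn8UXmtr2] == 1)   // the large 3D-printed button on the partner joystick
    {
      motor[left]  = 60;
      motor[right] = 60;
    }
    else
    {
      motor[left]  = 0;
      motor[right] = 0;
    }
    wait1Msec(20);
  }
}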

Wow! That is very inspiring, and I'm glad you were able to find a way for him to drive.

I know it might be a stretch programming-wise, but you could use a sensor to do eye tracking. I'm not sure which one would be best for this.

I doubt you’d be able to come up with a reasonable implementation with VEX sensors. I would expect it to require expensive third-party hardware (or at least a decent camera) and some ridiculous programming to go with it.

I know it’s expensive, but you could possibly use something like this. If you do end up getting something like that, I would be more than happy to write some code to help implement it. In case you didn’t see the other link, here it is in plain form: https://www.naturalpoint.com/trackir/