I’m new to these forums (and VEX), and I’ve been searching the forums for anything that will help me create a tracking robot, but I haven’t been able to find much.
I want to mount an ultrasonic sensor on a servo and have it scan from left to right for objects. If it finds an object, I want the robot to turn towards that object using the position of the servo. This is where I need help with a few things. I have it tracking from the left to keep it from getting confused about the position.
First of all, does anyone know of or have any code that will help me set up the servo to pan from left to right and then jump back to the left again, all while taking readings from the ultrasonic?
Secondly, I am NOT using VEX motors; I will be using the digital outputs to control an H-bridge, which will then move my two NPC motors forward or back (I am doing a tank-style robot). With that said, does anyone know of an efficient way of correlating the servo position with the motors? I really don’t have much control over the position of the motors, because they only know forward, reverse, or stop.
This setup is the alternative to the four-ultrasonic setup in my other post.
Usually, you would use shaft encoders to determine how much each of your wheels/tracks has turned, and have your program compute the resulting turn angle. This obviously won’t be perfect because of slippage, but with some experimentation you should be able to get pretty close.
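As a rough sketch of the math (the counts-per-revolution, wheel diameter, and track width below are made-up numbers; measure your own robot):

    #define PI             3.14159265
    #define COUNTS_PER_REV 90.0     /* encoder ticks per wheel revolution */
    #define WHEEL_DIAM     10.0     /* wheel diameter, cm */
    #define TRACK_WIDTH    30.0     /* distance between the tracks, cm */

    double turnAngleDeg(long leftCounts, long rightCounts)
    {
        double circ      = PI * WHEEL_DIAM;
        double leftDist  = (leftCounts  / COUNTS_PER_REV) * circ;
        double rightDist = (rightCounts / COUNTS_PER_REV) * circ;
        /* differential-drive approximation: heading change in radians is
           (rightDist - leftDist) / TRACK_WIDTH; convert to degrees */
        return (rightDist - leftDist) / TRACK_WIDTH * (180.0 / PI);
    }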
It may also be effective to just turn some, rescan, turn a small bit more, etc. Once the scanned object is dead ahead (or within a few degrees), start moving towards it. You could create a feedback loop that adjusts the turning power based on the off-center angle of the detected object.
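Since your H-bridge only knows forward/reverse/stop, that feedback loop can be as simple as bang-bang steering with a deadband. A rough sketch; readTargetAngle(), spinLeft(), spinRight(), and driveForward() are just stand-ins for your own scan and H-bridge routines:

    #define DEADBAND 5    /* degrees that count as "dead ahead" */

    while (1) {
        int angle = readTargetAngle();  /* placeholder: + is left of center, - is right */
        if (angle > DEADBAND)
            spinLeft();                 /* target is off to the left, turn towards it */
        else if (angle < -DEADBAND)
            spinRight();                /* target is off to the right */
        else
            driveForward();             /* lined up, head for the target */
    }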
I think 4 ultrasonics will be easier than a scanning one, but you need to be careful of blind spots.
You will probably need an array (there’s a rough C sketch of steps 1-4 below):

1. Have an array of 256 locations, plus variables to store the minimum value and the minimum’s position.
2. Move the servo to P (position) 0.
3. Take a reading and store it in array location P, comparing whether the reading is < the minimum. If it is, replace the minimum with the new minimum (and record P).
4. Move to position P + 1 and jump back to step 3 until P = 255.
5. You would know which servo position corresponds to how many degrees from the current heading, so simply rotate your robot that far (if you don’t have encoders, since you’re using NPC motors, try a compass sensor!).
6. Move forward a certain distance, then repeat the process again.
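In plain C, steps 1-4 might look roughly like this (SetScanServo() and ReadUltrasonic() are placeholders for whatever servo and ultrasonic calls your setup provides, and the settle delay is a guess):

    unsigned int readings[256];
    unsigned int minVal = 65535;       /* start higher than any real reading */
    int minPos = 0;
    int p;

    for (p = 0; p <= 255; p++) {
        SetScanServo(p);                 /* placeholder: move servo to position p */
        Wait(20);                        /* give the servo time to settle */
        readings[p] = ReadUltrasonic();  /* placeholder: one distance sample */
        if (readings[p] < minVal) {
            minVal = readings[p];        /* new closest object */
            minPos = p;                  /* remember where it was */
        }
    }
    /* minPos now holds the servo position of the nearest object */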
Well, I guess I will just stick to using 4 ultrasonics, since it seems a panning one would be a tad bit more complicated than I can handle.
Now I just need to figure out how I can keep track of how far I am moving, since I don’t think I can use encoders with NPC motors and tank treads. And I’ve already used up 4 interrupt ports, 8 digital ports, and 3 analog ports. Is there any nifty VEX accessory that can help me out?
You have 6 interrupt ports. Try chaining encoders to your NPC motor outputs; that may be the easiest method. Or get some other encoder that will fit your drivetrain.
Some issues to be aware of are interference between the ultrasonic sensors and, since you are using an even number of sensors, the fact that it may not go straight easily. As Dean mentioned, encoders won’t provide perfect accuracy due to slippage and/or gear backlash, but they give a good estimate.
Well here is the code that I have written for the 4 ultrasonic sensor setup.
Do you think it will work for the purpose? Also, I am not sure if I was using the “break” command properly, and I need to know whether the way I put the "while"s inside the "if"s makes any sense.
I couldn’t figure out how to paste the code directly onto here from easyC Pro, so I just uploaded the entire saved easyC Pro file in zip format. This is the link to the upload, since I had already uploaded the code beforehand.
0.5. You have two ‘frontleftline = GetAnalogInput(…)’ lines, which is likely not what you want.
Put StartUltrasonic(…) outside your while loop; you only need to call it once to turn them on…
unless you are only turning them on one at a time to prevent interference…
You may want to make your print-to-screen fire only a few times per second, or else you will flood your console. Try using timers to clock it.
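Something along these lines, assuming easyC-style timer and print calls (double-check the exact names and signatures in your version of easyC):

    StartTimer(1);                        /* assumed easyC timer call */
    while (1) {
        /* ... sensor sampling ... */
        if (GetTimer(1) >= 250) {         /* ~4 prints per second */
            PrintToScreen("front: %d\n", frontReading);  /* whatever you're watching */
            PresetTimer(1, 0);            /* assumed call to restart the 250 ms window */
        }
    }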
The code can be improved by modeling it with a state chart and coding according to that; it will result in far fewer nested ifs.
I didn’t run it, but it looks to me like you may have some logic issues in your code. The first two if blocks (<= 20 and > 20) between them cover every case, so the rest of the else-ifs never get considered. I didn’t continue reading, but I think you really need to replan most of the code.
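To make that concrete, the shape of the problem is something like this (simplified, not your exact code):

    if (front <= 20) {
        /* ... */
    } else if (front > 20) {
        /* ... between them, these two tests cover every value of front ... */
    } else if (left <= 20) {
        /* ... so this branch, and everything after it, can never run */
    }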
Come to think of it, a fully autonomous sumo bot is not a trivial task to begin with. For example, you would also need to handle conditions where the target is within ‘blind spots’ your ultrasonics do not cover (if any), and also move straight towards the target (remember that an ultrasonic has a rather wide angle of detection). But this really depends on how your ultrasonics are positioned.
I am starting and stopping them one at a time to prevent interference. Would that make it too slow to be effective?
To be honest, I really have no idea what that is. I would appreciate it if you could show me what you are talking about.
Hmm, I thought that was what I wanted, since my main sensors are just the front and back, and I will be attacking mainly with the front. I could change the nesting to make it better… I’m not sure though.
OK, this is another question I have: does the micro automatically switch to the Else If statement if the If condition becomes false in the middle? I don’t want to take that for granted while making the program.
Yeah, I’m definitely gonna have to spend sleepless nights trying to figure it out. The main thing I want to figure out is how to ensure that the robot is going straight towards the target, not just clipping the edge of it. If only the ultrasonic had a way of triangulating the position of an object so that it could center itself.
You may want to take some ideas from the echo-autonomous-driving-bot I built a while back. Its navigation is based on simple pattern recognition. There is a spreadsheet that explains the basic patterns it understands, in a zip file at the bottom of that thread.
Hmm, wouldn’t it be hard to implement mapping when you are going against a moving target? Another problem is that the sensors will probably sense the people standing around the sumo ring, so that is going to be a factor if I choose to do any type of mapping.
…wow, I think I might be in a little over my head this time, haha. But I’ve got a few months to figure something out. Maybe once we actually get the robot built I will be able to see exactly what’s lacking.
But just from looking at my code, does it look feasible so far? Ignoring the fact that there are blind spots and such.
Try making it take multiple maps while standing still; if anything changes (for example, a dot on A1 changes to a dot on B9 on a grid), then that is a moving object. You could also use a sound sensor to detect the motor noise of your competition. Just so I get an idea: how many robots are you competing against in one match, and do you have alliances?
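A rough sketch of that idea in C, comparing two scans taken while sitting still (takeScan(), the array size, and the threshold are all placeholders to adapt):

    #define STEPS     64
    #define THRESHOLD 10    /* change in cm that counts as "something moved" */

    int firstScan[STEPS], secondScan[STEPS];
    int p, movingPos = -1;

    takeScan(firstScan);    /* placeholder: fill the array from your sensors */
    takeScan(secondScan);   /* second map, taken a moment later */
    for (p = 0; p < STEPS; p++) {
        int diff = firstScan[p] - secondScan[p];
        if (diff < 0) diff = -diff;   /* absolute change at this heading */
        if (diff > THRESHOLD)
            movingPos = p;            /* something moved here: likely the opponent */
    }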
The TS mentioned in his first post that he originally wanted to do a scanning mechanism similar to what KHall had done before. Perhaps he can read through what KHall did and try it out.
As for what a state chart is, it’s basically defining the different states your robot can be in. For this situation I think there would be 4 states: move forward, turn left, turn right, reverse.
Or perhaps:
attack, seek, avoid-line
So what’s left is to determine the conditions (from the sensors) that would cause one state to switch to another. E.g., assuming it starts in the default state ‘seek’, what would ‘seek’ do? Turn until the enemy is straight ahead of it before moving into the ‘attack’ state? What does the ‘attack’ state do? Move forward towards the target while compensating for misalignment?
enum State { SURVIVE, HUNT, TARGET, ATTACK };   // the possible states
enum State state = HUNT;                        // whatever default makes sense

while (1) {
    // … sensor sampling code
    switch (state) {
        case SURVIVE:
            // … survive code
            break;
        case HUNT:
            // … hunt code
            break;
        case TARGET:
            // … target code
            break;
        case ATTACK:
            // … attack code
            break;
    }
}
This is where I have hit a wall, since I’m not really sure how to translate that into easyC programming. I’m not really sure what the “case” construct is or how I can replicate it in the VEX software. Do you think you can help translate that basic code structure into easyC or just regular C code, since I think that’s probably Java?
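For what it’s worth, that snippet is C-style pseudocode rather than Java, and a switch/case translates one-for-one into the If / Else-If blocks easyC already gives you. A minimal sketch, using plain #defines for the state names:

    #define SURVIVE 0
    #define HUNT    1
    #define TARGET  2
    #define ATTACK  3

    int state = HUNT;   /* pick whichever default state fits */

    while (1) {
        /* ... sensor sampling code ... */
        if (state == SURVIVE) {
            /* ... survive code ... */
        } else if (state == HUNT) {
            /* ... hunt code ... */
        } else if (state == TARGET) {
            /* ... target code ... */
        } else if (state == ATTACK) {
            /* ... attack code ... */
        }
    }

Each branch runs that state’s code; when a transition condition is met, you just assign a new value to state, and the next pass through the loop takes the matching branch.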