Programming Line Followers (RobotC)

For the past few months, for both our Gateway and our Round Up robots, I have been trying to use line followers with limited success. When following the line, the motion is very sawtoothy, and when trying to find the line the robot sometimes misses it entirely. I was just wondering what has worked best for you, both for the line sensor array itself and for the code used? Thank you in advance.

I posted some EasyC code for a line tracker in the Code section of the forum. I also found line tracking to be very sawtoothy (great description). It was good enough for our purposes, so we never completely solved the problem. Here is a video of our results.

I think part of the problem is that the mass of the robot takes time to start/stop/change direction. Since the line trackers pretty much just tell you if the line is under them or not, you have a hard time tracking smoothly.

I always thought it would be interesting to mount the trackers onto an arm that can be swept across the robot’s path using a servo. You could then use a fast-ish feedback loop to have the arm track the line, and a slower feedback loop to have the robot adjust its path to keep the arm centered. I’m not sure if it would be an improvement, but my gut instinct is that you could do smoother tracking.
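
Something like this very rough sketch is what I'm imagining (all of the names, thresholds, and gains below are made up, and I haven't actually tried this):

task main()
{
  int armPos = 0;                          // servo command, 0 = arm centered
  int loopCount = 0;

  while(true)
  {
    // Fast loop: nudge the servo so the sensor stays on the line edge.
    // Assumes the white tape reads lower than the grey tile.
    if(SensorValue(lineSensor) < 1500)     // 1500 = guessed line threshold
      armPos += 2;                         // on the line: sweep one way
    else
      armPos -= 2;                         // off the line: sweep back

    if(armPos > 127)  armPos = 127;        // keep the servo command in range
    if(armPos < -127) armPos = -127;
    motor[armServo] = armPos;

    // Slow loop: every 10th pass, steer the drive to re-center the arm.
    loopCount++;
    if(loopCount >= 10)
    {
      loopCount = 0;
      motor[leftDrive]  = 50 + armPos / 4;
      motor[rightDrive] = 50 - armPos / 4;
    }

    wait1Msec(20);
  }
}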

Cheers,

  • Dean

Thanks for the fast input, but what I REALLY am looking for is an efficient way to basically find a line and then straighten the robot out on it. Unfortunately I am using RobotC, so the EasyC code isn't that helpful. Thanks for the try, though. What I currently have is this:
The array is arranged like a triangle, one sensor centered at the front and three across the back:

 |
|||

So what happens in the code is that the robot backs up until the center back line sensor hits the line, then it rotates until the front center sensor hits the line as well. The main problem with this is that the line sensors sense the line before they are actually directly above it. Another problem is that sometimes the robot won't sense the line in the first place.
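
In rough outline (the sensor and motor names and the threshold below are just placeholders, not my exact code), it does something like this:

void findLine()
{
  // back up until the center back sensor sees the line
  // (assumes the white tape reads lower than the grey tile)
  while(SensorValue(backCenter) > 1800)      // 1800 = guessed line threshold
  {
    motor[leftDrive]  = -40;
    motor[rightDrive] = -40;
  }

  // rotate in place until the front center sensor sees the line too
  while(SensorValue(frontCenter) > 1800)
  {
    motor[leftDrive]  =  40;
    motor[rightDrive] = -40;
  }

  // stop once both sensors report the line
  motor[leftDrive]  = 0;
  motor[rightDrive] = 0;
}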

This site: http://www.inpharmix.com/jps/PID_Controller_For_Lego_Mindstorms_Robots.html might just have the answer to all your questions about smooth line following. It's about PID line following with the NXT, but the idea is the same. I'm sure that you will be able to write ROBOTC code for it once you understand the concept.
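
As a very rough starting point, the proportional-only version from that article might look something like this in RobotC (the sensor and motor names and all of the constants are just guesses to illustrate the idea; a full PID adds integral and derivative terms on top of this):

task main()
{
  int target = 1800;        // reading halfway between white tape and grey tile
  float Kp = 0.05;          // proportional gain - needs tuning
  int baseSpeed = 50;
  int error;
  int turn;

  while(true)
  {
    // follow the *edge* of the line with a single sensor
    error = SensorValue(lineSensor) - target;
    turn  = Kp * error;

    motor[leftDrive]  = baseSpeed + turn;
    motor[rightDrive] = baseSpeed - turn;

    wait1Msec(20);
  }
}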

Thanks so much. Hopefully this helps.

Unfortunately that link is broken. D:

http://www.inpharmix.com/jps/PID_Controller_For_Lego_Mindstorms_Robots.html
Try this.
//Andrew

The C source is posted here, which should be pretty portable to RobotC. Of course, since my tracker isn't working any better than what you've already got, it may not be of much use to you.

That is what I was expecting the scanning platform to help with: scan back and forth and build up an array of samples. Using some simple code, you can find the center of the line. Kind of like a Cylon eye. Combining scanning with forward motion might still be sawtoothy, though, since you would only be getting a couple of position updates a second.
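
For example, something like this, assuming the sweep fills an array of readings and that the white tape reads lower than the grey tile (all names and sizes here are made up):

int sweepReadings[21];       // one reading per arm position in a sweep

int findLineCenter()
{
  int i;
  int bestIndex = 0;

  // the lowest (brightest) sample in the sweep should be closest
  // to the center of the white line
  for(i = 1; i < 21; i++)
  {
    if(sweepReadings[i] < sweepReadings[bestIndex])
      bestIndex = i;
  }
  return bestIndex;          // arm position where the line was centered
}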

If your line is just wide enough that it can be partially seen by two adjacent sensors at the same time, then perhaps you could use the difference in their two analog readings to get a better fix on the line.
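
Maybe something as simple as this (names are made up, and it assumes the white line reads lower than the grey tile):

int lineOffset()
{
  // negative = line biased toward the left sensor,
  // positive = line biased toward the right sensor, 0 = centered
  return SensorValue(leftSense) - SensorValue(rightSense);
}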

Cheers,

  • Dean

Thanks, this helped. Any idea whether the light sensors would read a slightly lower value if they are just on the edge of the line? I would test it myself, but the robot is at another member's house.

One thing to bear in mind is that the positioning of the sensors with respect to the rotational axis of the robot, and the type of drive system used, will affect your results. If the sensors are close to the center of rotation, driving the robot to the left or right may not move them enough, and that will cause more swinging back and forth. Perhaps test moving the sensors forwards or backwards and see how that affects the robot's motion. Moving them too far away can make tight turns hard, but try some experiments. Also make sure the sensors are close enough to the floor so that a reasonable difference between dark and light values is seen.

Yes, I have taken most of those into account, although I'm not sure I can move them farther forward, so it seems they are in the best spots. I was just wondering if anyone had a good way for the robot to find a line and then straighten itself onto it?

Team 1103 posted their code from Round Up; it's on the forum somewhere (including a plain text version). You may want to take a look at that.

Yeah, that would be cool, although he uses EasyC, right? I can still use the concept, though.

OK, so I just wrote this based on that Mindstorms link. It's a first draft, but I was wondering if it would work at all:

// shared values, written by linefollowset2 and read by linefollow2
int greyvalue = 0;
int lineleft = 0, lineright = 0, linecenter = 0, linecentr = 0;

task linefollowset2()
{
  while(true)
  {
    // running average of the previous value and the two "grey" reference sensors
    greyvalue = (greyvalue + SensorValue(greyr) + SensorValue(greyl))/3;

    // how far each tracker's reading is below the grey reference
    lineleft   = (greyvalue - SensorValue(lineleftr))/4;
    lineright  = (greyvalue - SensorValue(linerightr))/4;
    linecenter = (greyvalue - SensorValue(linecenterr))/8;
    linecentr  = (greyvalue - SensorValue(linecentrr))/8;
  }
}

task linefollow2()
{
  while(true)
  {
    // one side of the drive
    motor[one] = lineleft + linecenter + linecentr + (127 - (lineright/3));
    motor[two] = lineleft + linecenter + linecentr + (127 - (lineright/3));

    // the other side of the drive
    motor[three] = (127 - (lineleft/3)) + linecenter + linecentr + lineright;
    motor[four]  = (127 - (lineleft/3)) + linecenter + linecentr + lineright;
  }
}

Any ideas on whether that would work?

I have also thought of using a servo-mounted sensor. Seems reasonable, and much lower mass to move. It also allows the possibility of tracking a line that is not under your robot, for those cases where you want to drive parallel to a line 6" to your right.

In the few experiments I have done, I found it is easier to follow the edge of a line with one sensor than the center of a line with three sensors. The line trackers return an analog value; if you only compare that against a threshold, so that they "just tell you if the line is under them or not," that's a mis-feature of your own implementation.

One of the line-tracker kit-bots in Robot magazine advertised both PID and some kind of "track memory" to enable faster lap times on a closed-path course, by remembering the lengths of the straightaways versus the turns. Jordan's code attempted similar things, with ramp-up and ramp-down speed routines for following a line for a specific distance.

**** Swan has sample RobotC code for a full PID implementation, and a YouTube video comparing it to the typical sawtooth motion.

What angle do you approach the line at? The usual method we see in Elevation autonomous videos is to cross the line at right angles, back up, rotate in place until parallel to the line, then follow it.
If you want to approach the line at odd angles and then smoothly sweep-turn onto it, you will need at least a state machine to remember what you are currently doing. Line sensors mounted both fore and aft might help: drive until the front sensor crosses the line, then turn until the front re-crosses or the back crosses, and figure it out from there.
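
A bare-bones skeleton of that kind of state machine might look like this (the state names, sensor names, speeds, and threshold are all just illustrative):

task main()
{
  // states
  int SEARCHING = 0;         // driving forward, looking for the line
  int FRONT_CROSSED = 1;     // front sensor found it, sweep-turning onto it
  int ALIGNED = 2;           // both sensors on the line, hand off to the follower
  int lineThreshold = 1800;  // guessed reading between white tape and grey tile

  int state = SEARCHING;

  while(true)
  {
    if(state == SEARCHING)
    {
      motor[leftDrive]  = 60;
      motor[rightDrive] = 60;
      if(SensorValue(frontSensor) < lineThreshold)
        state = FRONT_CROSSED;
    }
    else if(state == FRONT_CROSSED)
    {
      motor[leftDrive]  = 60;     // sweep-turn until the rear sensor
      motor[rightDrive] = 20;     // also picks up the line
      if(SensorValue(rearSensor) < lineThreshold)
        state = ALIGNED;
    }
    else // ALIGNED
    {
      motor[leftDrive]  = 0;      // stop here, or start the line-following loop
      motor[rightDrive] = 0;
    }

    wait1Msec(20);
  }
}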

In general, mount the line sensors as close to the floor as possible (maybe even on spring-loaded spacers used as idler rollers), and have them peek through a 3" black paper square to block ambient light.