I’m curious how many people use Modkit for their VexIQ programming, rather than RobotC? Any estimates?
I’d also be interested to know what people think the limits of Modkit are? Or, what’s the most complicated project you’ve successfully implemented using Modkit?
I’m just getting started, but I’m finding that - even though it looks like a fairly powerful language - my Modkit programs rarely work the way I expect them to the first time, and that considerable debugging is required to make even simple things work. Do people have the same experience with RobotC?
Is there a site that offers tips for successful Modkit-Vex programming somewhere?
For students that are used to Scratch, Modkit makes a lot of sense very quickly. It’s really good for simple programs, but it is very easy for students to get confused with more complicated ones. The main issue is that they often end up starting many parallel threads by using the broadcast command, forgetting that the other threads may still be running. Once you understand the logic of it, it really is quite good. It addresses key loop structures and Booleans, which is essential for the UK curriculum. We have done some excellent sessions on robot movement, line tracking, object sorting and lots more.
For me, there are still some fundamental issues which need to be addressed to make it as useful as RobotC for the classroom, such as the ability to rename components, save and open files locally (initially we didn’t think this would be an issue, but it is really hard for students to share programs with each other and for us to share example programs with students), and copy and paste. It’s close, but still feels very unfinished (which it is!) compared with RobotC.
Yes, we tried writing a program to do a few scripted moves with the basic drive train robot and, while it seems obvious in retrospect, it took us a while to realize we needed delays between the event broadcasts long enough to ensure each move was completed before the next one began. Otherwise, the robot would just move erratically, because two (or more?) different threads were trying to drive the motors at the same time. It would be helpful if there were some simple mechanism for coordinating threads - perhaps a block you could stick in the middle of a sequence that would pause and wait for an event to be received. Then each motion thread could broadcast an event when it finished, signaling that it was complete. Without that, it seems you just have to be conservative about allowing enough time for each step to complete.
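To make the idea concrete: Modkit doesn’t have a wait-for-event block, but the pattern I’m describing looks roughly like this in Python, using a `threading.Event` as the “done” broadcast. The motor commands are simulated with a sleep; everything here is a sketch of the coordination pattern, not real VEX IQ API calls.

```python
import threading
import time

done = threading.Event()
log = []  # records the order moves actually ran in

def run_move(name, duration):
    """Simulates one motion thread: 'drive' for duration, then signal done."""
    log.append(name + " start")
    time.sleep(duration)       # stand-in for the actual motor commands
    log.append(name + " end")
    done.set()                 # broadcast "this move is finished"

def do_move(name, duration):
    """Start a move and block until its done event is received."""
    done.clear()
    threading.Thread(target=run_move, args=(name, duration)).start()
    done.wait()                # pause here instead of guessing a fixed delay

for step in ["forward", "turn left", "forward"]:
    do_move(step, 0.05)

print(log)
# ['forward start', 'forward end', 'turn left start', 'turn left end',
#  'forward start', 'forward end']
```

Because the sequencer waits on the event rather than a timed delay, each move is guaranteed to finish before the next one starts, no matter how long it actually takes.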
I’m facing an issue now that may be similar. I’m trying to write a program that uses the distance sensor (mounted on a motor) to scan for a direction the robot can move after encountering an obstacle. The distance readings seem to take a finite amount of time to settle after the sensor has been reoriented, and they are a little noisy. What I suspect may be necessary is to pause briefly after reorienting the sensor before trusting the distance it reports. In any case, setting delays between measurements is critical, but there’s no feedback from the sensor on this and no guidelines about what it’s reasonable to expect.
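One alternative to a fixed pause would be to poll until the readings stop jumping around. Here’s a sketch of that idea in Python: accept a distance only once a few consecutive readings agree within a tolerance. The fake sensor stands in for the real IQ distance sensor (I don’t know its actual settling behaviour, so the numbers are illustrative), and the seed just makes the demo repeatable.

```python
import random

random.seed(42)  # fixed seed so the demo is repeatable

def make_sensor(true_mm=300, settle_after=5):
    """Returns a fake sensor: junk readings at first, then noisy-but-close."""
    state = {"calls": 0}
    def read():
        state["calls"] += 1
        if state["calls"] <= settle_after:
            return random.uniform(0, 1000)       # unsettled, wild values
        return true_mm + random.uniform(-3, 3)   # settled, small noise
    return read

def settled_distance(read, tolerance=10, needed=3, max_polls=50):
    """Poll until `needed` consecutive readings are within `tolerance` mm."""
    window = []
    for _ in range(max_polls):
        window.append(read())
        window = window[-needed:]
        if len(window) == needed and max(window) - min(window) <= tolerance:
            return sum(window) / needed
    return None  # never settled - treat as a sensor fault

sensor = make_sensor()
d = settled_distance(sensor)  # close to 300 once the readings settle
```

The same logic could be built in Modkit with a repeat-until loop comparing successive sensor readings, which would adapt to however long the sensor actually takes rather than hard-coding a delay.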
I agree with the need to be able to save and load programs locally. It looks like you can save programs locally in .mkc files, but I don’t see a way to load one of these back in. And, although they’re JSON files, they aren’t really meant to be human-readable - it would be helpful for students to be able to save a copy of their program in a form that they could study and share offline.
@calvc01 You said, “Once you understand the logic of it, it really is quite good.” How would you describe “the logic of it”?
Also, I’m curious how you solved line tracking. I’m familiar with line-tracking algorithms for robots that were designed for that, but the good designs typically require multiple sensors to let the robot figure out which way to correct when it veers from the line. Were you able to get something working with just a single color sensor?
By “the logic of it” I just mean the way that you would solve a particular problem by using multiple threads and broadcast commands – it’s just a little bit different to how you would use subroutines in some other languages.
For line tracking, there are loads of ways to do it, but ultimately they boil down to two approaches: a single sensor or multiple sensors.
Firstly, let’s look at a single sensor. The VEX IQ colour sensor can be set to “greyscale” mode, which gives you a reading in “% of white”. This means it returns 100% if what it is seeing is perfectly white and 0% if it is perfectly black. This allows you to make a line tracker using the “edge follow” technique - i.e. you are not tracking the line itself, but the transition between a black line and a white background. To see what I mean, set yourself up a line of black tape on a white piece of paper and create a simple program to “calibrate” the black, white and edge thresholds.
When you put the sensor over the black line, you’ll get a reading of around 1 to 10%; over the white background it will probably be 70%+; and over the edge of the line (partially white, partially black) it will be about 30 to 50%. Say we are going to follow the right-hand edge of the line. You can then set three basic thresholds: if the reading is greater than 50%, you have strayed to the right (onto the white), so you need to turn left; if the reading is less than 30%, you have strayed to the left (onto the black), so you need to turn right; otherwise, go forward.
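The calibration step can be sketched as follows (in Python rather than Modkit blocks, purely for illustration): record a handful of greyscale readings over each surface, then place the thresholds halfway between the edge reading and its neighbours. The sample numbers are invented to match the ranges described above.

```python
def calibrate(black_samples, white_samples, edge_samples):
    """Given lists of %-white readings, return (low, high) thresholds."""
    black = sum(black_samples) / len(black_samples)
    white = sum(white_samples) / len(white_samples)
    edge = sum(edge_samples) / len(edge_samples)
    # Put each threshold halfway between the edge reading and its neighbour.
    low = (black + edge) / 2    # below this: strayed onto the black line
    high = (edge + white) / 2   # above this: strayed onto the white background
    return low, high

# Illustrative readings: black ~5%, white ~75%, edge ~40%
low, high = calibrate([4, 6, 5], [74, 76, 75], [38, 42, 40])
# low = 22.5, high = 57.5
```

Calibrating per venue matters because lighting and tape/paper contrast shift all three readings, so fixed 30%/50% thresholds may not transfer between setups.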
In reality, this probably won’t work well: the robot will bounce from side to side and never settle. A better program would use a few more thresholds that vary the turn speed depending on how far you have strayed from the optimum point.
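Taken to its limit, varying the turn speed with distance from the optimum point is proportional control. Here’s a minimal Python sketch of that idea for the right-hand-edge follower; the target value, gain and base speed are made-up tuning constants, and the function just computes motor speeds from a greyscale reading.

```python
TARGET = 40      # % white at the ideal point on the right-hand edge
GAIN = 0.8       # tuning constant: how hard to steer per % of error
BASE = 50        # base forward speed
MAX = 100        # motor speed limit

def motor_speeds(greyscale_pct):
    """Return (left, right) speeds for following the right-hand edge."""
    error = greyscale_pct - TARGET   # positive: too white, strayed right
    turn = GAIN * error
    # Steer left (slow the left side) when too white, right when too black.
    left = max(-MAX, min(MAX, BASE - turn))
    right = max(-MAX, min(MAX, BASE + turn))
    return left, right

# On the ideal point the robot drives straight:
# motor_speeds(40) -> (50.0, 50.0)
# Strayed onto white (60%): left slows, right speeds up -> gentle left turn.
```

The further the reading drifts from the target, the harder the correction, which is what stops the side-to-side bouncing of the fixed-threshold version.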
To do much better line following, you can use multiple colour sensors. Feel free to drop me an e-mail if you would like me to send a bit more info: chris dot calver at rapidelec dot co dot uk. Also happy to help with your rotating distance sensor program.
Thanks Chris. I’ll try to put something together on the distance sensor program. Some ideas occurred to me while describing my problems to you, and I’d like to try those out first.