Access to Slot and Program Name at Runtime

One of my teams has a single Python file that implements their custom driver-control logic. Their autonomous code lives in that same file, because it’s more convenient and shares code with the driver controls (functions like dispenseBlocks()). They have several different routines to run for auton.

They have logic like this:

if slotNumber == 1:
    pass  # slot 1 routine here
elif slotNumber == 2:
    pass  # slot 2 routine here
elif slotNumber == 3:
    pass  # slot 3 routine here

Nicely organized. But there are problems here:
(1) They can’t find a way to detect at runtime which slot the program is running in, so they have a hack with a variable at the top of the file:

slotNumber = 1

They set the variable to 1, change the Slot to 1, and save. Then set it to 2, change the Slot to 2, and save. Repeat for all the slots they need.
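To make the shape of the hack concrete (the routine names below are made up, not their actual code), it’s really just a dispatch on that hand-edited variable, which could equally be written as a dict lookup:

```python
# Hypothetical stand-ins for the team's real auton routines
def auton_left():
    return "left"

def auton_right():
    return "right"

def auton_skills():
    return "skills"

# The hack: must be edited by hand before downloading to each slot
slotNumber = 1

# Equivalent to the if/elif chain, but one line per routine
AUTON_BY_SLOT = {1: auton_left, 2: auton_right, 3: auton_skills}

def run_auton(slot):
    routine = AUTON_BY_SLOT.get(slot)
    if routine is None:
        return "no-op"  # unknown slot: do nothing rather than crash
    return routine()
```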

(2) Another problem is that they can’t control how the program name displays on the Gen2 brain. So as they toggle through the slots, sometimes under extreme time pressure (!), they just see the same name scrolling by. The Gen2 brain doesn’t even offer a helpful indicator of what slot # is currently selected (e.g. 1:Code, 2:Code, etc.). Even that would have been helpful.

My questions:
(1) Are we missing something? Is there a way to access the current Slot # at runtime?
(2) Is there a way to programmatically set the program display name on the brain?

BTW, before you suggest that we add an LED button to correspond to each program…the kids thought of that already. First, the ports on their brain are nearly at capacity - adding 4-5 new cables is not a good option. Second, the bot also doesn’t have a great place to add a bunch of LEDs physically (possibly fixable with rebuilding).

Another solution we’ve thought of is sharing code - but can the brain support multiple Python files, and Python modules? Even then it’s still suboptimal, because you’d have to maintain and re-download each Python file into each slot every time the shared module code changes, right? That’s bound to lead to mistakes.

We figure, all we really need is:

(1) Must-have: runtime slot # access, like:

if brain.slotNumber == 1:

(2) Ideally: a way to configure the program name based on the slot # (I know this is a harder challenge, given how the program is bootstrapped).

Suggestions welcome! Thank you.

  1. There’s no way. We do not expose that inside vexos to a user program.
  2. Also no way; that’s completely controlled by the metadata that gets downloaded to the brain alongside the user program. The best you can do is display something on the screen once the program is running.

You can import a module from the SD card, though I’m not sure I would recommend that for competition use, and any external code you create this way is not supported by VEXcode. In general, all Python projects are intended to be single-file projects. Using the VSCode extension and C++ allows multi-file projects, libraries, etc.
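For anyone curious about the mechanics, importing a module from a removable location is just a matter of the module search path. Here is a plain-Python sketch using a temp directory to stand in for the SD card mount (the actual mount path and behavior on the brain are not shown here, and as noted this isn’t really recommended for competition):

```python
import os
import sys
import tempfile

# Stand-in for the SD card mount point (on the brain this would be
# whatever path the Python VM exposes for the card)
sd_dir = tempfile.mkdtemp()

# A shared module the team might keep on the card
with open(os.path.join(sd_dir, "shared_auton.py"), "w") as f:
    f.write("def dispense_blocks():\n    return 'dispensing'\n")

# Make the "card" visible to the import system, then import normally
sys.path.insert(0, sd_dir)
import shared_auton

result = shared_auton.dispense_blocks()
```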

if slotNumber == 1:
    pass  # slot 1 routine here
elif slotNumber == 2:
    pass  # slot 2 routine here
elif slotNumber == 3:
    pass  # slot 3 routine here

Normally some type of user interaction (selection via a screen menu, etc.) would be used to select different paths through a user program.

Normally you would not download a program with the same name to multiple slots; change the name before download.


I’d just have a button on the controller to change the variable value, and one light on the side that changes color depending on the selection.
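The cycling part of that is just modular arithmetic; a plain-Python sketch of the idea (the color list and names are illustrative, with the actual button callback and LED call left out):

```python
# Illustrative program/color pairs; on the robot, a Touch LED or external
# light would be set to colors[selected] inside the button callback
colors = ["red", "green", "blue", "yellow"]
selected = 0

def next_selection():
    global selected
    # wrap around to the first option after the last one
    selected = (selected + 1) % len(colors)
    return colors[selected]
```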

Got it - thank you for your quick reply on a Sunday!

I think your comment about selection via screen menu is the one that will help the team move forward.

For autonomous, per RSC6 no handling of the controller is permitted; however, we can event-handle the brain arrows, of course. But your main point is good: change a variable, and having an LED indicator is a terrific idea too. Thank you so much.

I wanted to add a few further thoughts about this that you may want to consider.

First of all, Python programs are limited in size and in general cannot be as large as C++ programs when measured using simple metrics such as lines of code. The reason is that the Python code is initially loaded into memory as Python source and then compiled (for want of a better word) into byte code that is then executed; this all uses additional memory, and success is only known when the code is run. C++, on the other hand, does not have this limitation: we pretty much know how much memory is used at compile time, before download. On IQ generation 2 this means that Python code is limited to a few hundred lines, in terms of size somewhere around 32k bytes, but even this may prove too much. Just beware of having all possible code as part of a single program if it’s getting large.
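To make the "source plus byte code" cost concrete, here is a desktop CPython illustration. The brain runs a different VM, so only the idea transfers (both the source text and the compiled code object are resident in RAM at once), not the absolute numbers:

```python
import sys

# A trivial stand-in for a user program
source = "def auton():\n    return 42\n"

# Compiling keeps the source string alive *and* produces a code object;
# on a memory-constrained VM both costs are paid before anything runs
code_obj = compile(source, "<user-program>", "exec")

source_bytes = sys.getsizeof(source)
bytecode_bytes = sys.getsizeof(code_obj)

# The compiled code works like any other module-level code
namespace = {}
exec(code_obj, namespace)
```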

Second, you mentioned

Does this mean they are selecting a new program during an autonomous run to select a different path? If so, be aware that there will be a small performance hit to load the code, compile it, run any inertial sensor calibration that VEXcode may add, etc.; there may be 3-5 seconds of delay before the code actually runs.

Finally, I had a think about what I would implement as a program selector on IQ2. Given the possible constraint of time pressure when selecting something, I concluded that, as well as displaying the selection in text, it may be useful to assign distinct colors to the different selections; it may be easier to decide to run the “green” program than to think about “slot 1”.

So, although it’s rare I post any complete examples these days, here is a small example that implements a simple selection mechanism. The screen looks like this.


It’s simple and just cycles through four selections (obviously extendable) using the left and right buttons; the next and previous selection colors are shown on the left and right sides of the screen. The check button is used to “run” that selection; what that means would be determined by the rest of the program. I just added a 2-second delay and then enabled the selector again.

IQ2 selector
# ---------------------------------------------------------------------------- #
#                                                                              #
# 	Module:                                                      #
# 	Author:       james                                                        #
# 	Created:      11/5/2023, 16:04:15 PM                                       #
# 	Description:  IQ2 project                                                  #
#                                                                              #
# ---------------------------------------------------------------------------- #

# Library imports
from vex import *

# Brain should be defined by default
brain = Brain()

program_names = ["select first", "select second", "select another", "select last"]
program_colors = [Color.RED, Color.GREEN, Color("#0080FF"), Color.YELLOW]

program_selector = 0
program_max = len(program_names)
program_running = False

def display_program(is_running):
    # determine the color for the selected program, next program
    # and previous program
    color = program_colors[program_selector]
    color_next = program_colors[(program_selector + 1) % program_max]
    color_prev = program_colors[(program_selector - 1 + program_max) % program_max]

    # Draw a color for the selected program
    # and an indication of what left and right keys would select
    brain.screen.set_pen_color(color_prev)
    brain.screen.set_fill_color(color_prev)
    brain.screen.draw_rectangle(0, 40, 20, 48)
    brain.screen.set_pen_color(color_next)
    brain.screen.set_fill_color(color_next)
    brain.screen.draw_rectangle(140, 40, 20, 48)

    # display a circle to put the number inside of
    brain.screen.set_pen_color(color)
    brain.screen.set_fill_color(color)
    brain.screen.draw_circle(80, 60, 28)

    # display the program number, centered in the circle
    brain.screen.set_pen_color(Color.WHITE)
    text_width = brain.screen.get_string_width("%d" % program_selector)
    brain.screen.print_at("%d" % program_selector, x=(160 - text_width) // 2, y=65, opaque=False)

    # display a program label
    brain.screen.print_at(program_names[program_selector], x=20, y=100)
    if is_running:
        brain.screen.print_at("Running... %d" % program_selector, x=10, y=15)
    else:
        # this helps remove some screen redraw artifacts (not all)
        brain.screen.print_at("                ", x=10, y=15, opaque=True)

def right_press():
    global program_selector
    if program_running:
        return
    # increase program_selector by 1 and wrap around
    program_selector = (program_selector + 1) % program_max
    display_program(False)

def left_press():
    global program_selector
    if program_running:
        return
    # decrease program_selector by 1 and wrap around
    program_selector = (program_selector - 1 + program_max) % program_max
    display_program(False)

def check_press():
    global program_running
    if program_running:
        return
    program_running = True
    display_program(True)

    if program_selector == 0:
        # run an auton seq here
        # sleep just simulates that
        sleep(2000)
    elif program_selector == 1:
        sleep(2000)
    elif program_selector == 2:
        sleep(2000)
    elif program_selector == 3:
        sleep(2000)

    # re-enable the selector
    program_running = False
    display_program(False)

# register brain button callbacks and show the initial selection
brain.buttonLeft.pressed(left_press)
brain.buttonRight.pressed(right_press)
brain.buttonCheck.pressed(check_press)

display_program(False)



Wow, a full working sample - fantastic. Thank you!

Of course, unfortunately, in the spirit of G2, I’ll have to privately appreciate this example code and instead challenge that team on Monday to code a selector from scratch. :slight_smile: But when they succeed, I’ll share this implementation as a source of inspiration and knowledge. For some reason, we’ve never considered building a UI on the brain’s little screen.

RE: Code size - yes, they ran into this problem last year when their python code got large. They ended up aggressively cleaning up the code, finding opportunities to use functions, etc. I figured the VM’s gotta have a big footprint and even with Gen2’s larger RAM there’s not much room for a big program. My recollection is that the upper limit was smaller than I expected though; the program was maybe 600 lines long, and they had to prune it to less than 500.

Here’s their Slapshot Code that came close (GitHub - AriaCoder/Axobotl_2022). I think they didn’t commit the version that hit the out-of-memory exception. I encourage them to try not to commit broken code…but it might have been nice to have that example in the history.

Good tip about using C++ instead, to raise the memory ceiling. I haven’t used C++ in like 15 years, and the team only knows python and blocks … but we can probably give it a go. Still early in the season.

Thank you again!


Haha, yeah, that’s why I’ve backed off posting so many examples.

Now, the unfortunate part of this is that the team is already running many thousands of lines of my code, as I wrote the IQ2 vexos, the C++ SDK, and the Python VM. Although my opinion doesn’t really matter in this situation, it would be that taking inspiration from examples such as this, which will score no points for the team as it doesn’t control the robot, is really no different from taking inspiration for a build from watching YouTube videos of other robots.

Here’s their Slapshot Code that came close (GitHub - AriaCoder/Axobotl_2022)

And if they wrote that unassisted, then a simple selector like the one I posted is well within their capabilities.
Perhaps just show them the program running and then let them create their own.


If they cannot use a potentiometer sensor like the V5 has to dial in an autonomous selection, could they use the IQ optical sensor to read the color of a little card (attached to the robot) in front of it at the beginning of autonomous?
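The card-to-routine mapping could be as simple as a few hue bands; a plain-Python sketch (the thresholds and routine names below are guesses and would need tuning against the real sensor and cards):

```python
# Hypothetical hue bands (degrees) for each card color; tune these on the
# real optical sensor before trusting them at a competition
def routine_for_hue(hue):
    if hue < 20 or hue >= 340:
        return "red_auton"
    if 90 <= hue < 150:
        return "green_auton"
    if 200 <= hue < 260:
        return "blue_auton"
    return "default_auton"  # unrecognized color: fall back safely
```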

Yeah, G2’s funny that way. I always say that engineering never happens in a vacuum, and code gets copied (and AI-generated!) by professionals all day long; I haven’t written my own quicksort() since college. You know they get judges’ questions like, “Did you write this yourself?” - so… I mean, it’s turtles all the way down, right?

The main programmer hasn’t ever done brain draw/graphical operations (even in blocks), so an exact implementation would have a learning curve for her, but I know she could figure it out and probably enjoy it. My guess is she’d go with a text-based prompt to keep the code lean and simple, maybe I’ll suggest making it at least a bigger font and color. I think they’ll also like the idea from the other poster about setting an LED color; they already have a “health check” LED on the bot. In practice for auton, they’ve sometimes accidentally run the wrong program in the heat of the timed run and your instinct to have big colors would be helpful, I’m sure.

Thanks again for your amazing level of support, and for the IQ2 vexos and python VM too - I recall that we were ecstatic to learn Gen2 had python support and pre-ordered Gen2 kits on that basis alone.

Just one coach/mentor’s opinion, but I believe in VIQRC you can’t keep a card attached to the bot (it’s not a legal VEX part), though you could use VEX pieces instead. I imagine you can keep colored cards in your possession (not attached to the bot) without any problems. The color sensor seems pretty good, so I imagine it’s usable for this purpose? I’d be concerned about misfires and timing: your green shirt might accidentally trigger the program before you flash the red card? Or maybe you’d have to hold the card up to the sensor before you press Start? I can imagine misplacing cards, fumbling for the right color, or the lighting in the Skills room affecting the sensor. Lots of possible mishaps. I suspect the aforementioned brain-button approach would be faster and more reliable. Cool idea, though.

Simple solution that our kids have used: use a bumper switch to pick the path through the code, instead of Touch LEDs. Every press changes the color (and updates the screen in a visible way).

This would require all your options in a single file, which, to James’ point, may run up against Python’s size limits, but there should also be opportunities for reuse via functions.
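One way to keep all the options in one file while staying lean is a single table that drives the screen label, the indicator color, and the code path together; a sketch with hypothetical routine names:

```python
# Each entry: (label, indicator color, routine); all names are illustrative.
# One table drives the screen text, the light color, and the code path,
# so adding a routine means adding exactly one line here.
def drive_forward():
    return "forward"

def drive_left():
    return "left"

OPTIONS = [
    ("Forward", "green", drive_forward),
    ("Left",    "blue",  drive_left),
]

def run_option(index):
    # wrap the index so repeated button presses cycle through the table
    label, color, routine = OPTIONS[index % len(OPTIONS)]
    return label, color, routine()
```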