Sacramento State Management & Venue Problems

Hey everyone,

Last weekend, my team, 5776A, participated in the VEX Sacramento State Championship, which we’d been looking forward to ever since we qualified in November. Honestly, after over 3 months of meeting every day to work on driver practice, tune our programming skills/autons, and scrimmage with local teams, we felt ready to compete at our best and beat our previous skills score of 43.

Sadly, we found that this wasn’t the case. Our robot uses 3 line sensors: two to detect ball movement, and one to detect our robot’s puncher position. All of these are used extensively not only in our autonomous programs but are also crucial during the driver control period. In essence, line sensors work by emitting a faint infrared light and detecting how much is reflected back. Our brand new line sensors read values of around 2900 without a ball covering them, and 100-300 when a ball passed over them, giving me (the programmer) the ability to detect a dip in the values and tell the robot that it was carrying a ball. In addition, many other teams relied on a vision sensor for their autonomous/programming skills programs, which direct sunlight affected by causing the sensor to overexpose.
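To give a sense of the kind of dip detection we relied on, here’s a minimal Python sketch. This is not our actual competition code, and `ball_present` is an illustrative stand-in, not a VEX API; the numbers are just the readings described above.

```python
# Illustrative sketch of ball detection via a line sensor "dip".
# Readings from our testing: ~2900 with nothing over the sensor,
# ~100-300 when a ball covers it.

UNCOVERED = 2900    # typical reading with no ball
COVERED_MAX = 300   # typical reading with a ball over the sensor
THRESHOLD = (UNCOVERED + COVERED_MAX) // 2  # midpoint cutoff

def ball_present(reading: int) -> bool:
    """A ball is detected when the reading dips below the cutoff."""
    return reading < THRESHOLD
```

You can see why direct sunlight breaks this: once the sensor saturates and reads 100-200 all the time, every reading is below the cutoff and the “dip” carries no information.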

Unfortunately, the event was set up so that the fields were placed 5-6 feet from a massive glass wall that let sunlight pass through unobstructed. The problem is that the fields were completely bathed in direct sunlight (and its infrared component), meaning that our line sensors were consistently reading values of only 100-200 instead of the 2900-3000ish we were used to. This absolutely destroyed our tolerances, and calibrating didn’t help because the sensors were barely reading above their minimum value to begin with. Our other programmers and I scrambled to fix the issues, which we managed to minimize by making our robot fully manual, but that caused an extreme drop in speed and consistency. We lost the ball auto-indexing program and macros we had spent so long on, which had given our robot a competitive advantage. So yes, this affected us, but I want to stress that we’re not just complaining about our own problems: from what I gathered, over 20% of the teams competing were affected by the sunlight in some way, and their robots’ performance was degraded.

Event Photos Link

That part is alright. What angered us was the lack of responsibility taken by the EP of the event and our Regional Support Manager. Not only did our EP refuse to close the curtains on the window wall (which were motorized), but she also pushed the blame onto our teams. I quote: “All of these other teams (not using sensors) are doing fine, why can’t you?” She even said, “We knew it was a possibility (that sensors could be affected) but decided it would be okay.” Our Regional Support Manager could not be found at the time, and calling his “event day phone” yielded no response even after I left a voicemail detailing the problem. Others affected and I persisted and practically begged our EP to run skills on the practice field instead (which was farther from the glass wall), to which she threatened, “Talk to me one more time and I’ll disqualify all of your teams from judged awards.” She did say later that she didn’t have permission from the venue to close the curtains, but she refused to make any effort to remedy the situation and also refused to give us the venue’s contact information so we could ask permission ourselves.

Now you might think, “This just seems like the kind of rant a team would make after underpreparing for a state competition and not qualifying.”

Let me preface this next part by saying that yes, 5776A did qualify for the World Championship a couple of days after our State Championship, through a skills score we received at the Google Signature Event (unqualified skills spots). We waited until after we received our invite and registered to post about our situation. This post isn’t meant to read like a rant, and we understand that unforeseen issues can arise the morning of a tournament. We’re not at all asking for teams to be given world qualification spots now. This is more of an issue about responsibility, or the lack thereof. I’d feel absolutely crushed if I were one of the teams that paid $250 and drove 3 hours to an event, just to see their world qualification chances destroyed by an improperly managed event with no one taking responsibility for it.

We’ve got a few questions we’d like help answering now in order to move on with the situation.

What can teams like mine do when faced with similar situations?

What kind of standards can we expect from a State Championship in the future?

Can teams expect to be provided a proper lighting environment at the World Championships or should we avoid using light-based sensors at all?

Whose responsibility is it to deal with issues like these when they inevitably pop up in the future? Is it RECF for picking the venue? The EP for improper planning? VEX for making sensors susceptible to ambient lighting changes? The students for using these sensors?

Thanks for making it through the post. We appreciate every response.

- Sid P on behalf of:
Ayush, 5327X
Sharwin, 5776E
Tanvi, 5776X
Akash, 5776T
Jish, 5327S


That actually sucks. I’ve also experienced horrible lighting at 2 comps. In one, a floodlight was shining right into the eyes of whoever was in the blue driver station, blinding me and my teammates whenever we were on blue and giving red a real advantage. At another comp, spotlights projected blinding red and blue light all over the field, rendering all light-based/color-based sensors useless and giving the drivers horrible headaches. I wish people would take lighting into account when setting up events. Whenever my school hosts events, we just stick with simple gym lights. No problems.


I guess we should all ask EPs to fix the lighting, but we should also build our robots so that they can at least function decently without light/color-based sensors.


You’ve got to be kidding me, EP!

We expend a lot of effort teaching our students to rely on feedback from sensors, instead of doing everything manually or by time, because that is what’s necessary in the real world.

Yes, in the real-real world there are many unforeseen conditions, and analog sensors are subject to a lot of environmental noise and should not be the single point of trust when your program needs to make a decision.

But VRC is an educational program. It is a learning environment, halfway between an ideal world where there is no noise and the real world where s*** happens. Nobody should expect students to handle all aspects of the latter from the get-go.

Did any adults, preferably engineers who understand how IR overexposure makes these sensors useless, talk to the EP?

I am always polite to volunteers and EPs at venues, but if I had been there, I would have made a lot of noise until the curtains were closed, especially since closing them was technically possible.

It is definitely not the students’ responsibility. It is the responsibility of all adults (mentors, EPs, and especially RECF) to create an environment that rewards students for learning and practicing advanced skills (i.e., using sensors and automated algorithms).


I was thinking the same thing when I read the OP. So sorry that happened. If it were a tournament I was hosting, I would have at least made an effort. I’ve seen my teams struggle with sensors and know how much blood, sweat, and tears are put into it.


@sid.p, so what lessons can be learned from this?

I am an avid practitioner of Murphy’s Law as it applies to engineering. In the real world, s*** always happens at the worst possible moment, especially if humans are in the decision-making loop.

I am sorry that you and your fellow teams at the event ran into a situation like that this early in your learning. A reasonable lesson plan would have you learning, at this stage, that external noise may be slightly different between home testing and the venue, not that you would face full overexposure.

You would be expected to look for changes in the sensor readings instead of relying on hard-coded values, but not to have fully redundant logic and secondary sensors (using different physical means of measurement from the primaries) in case the primary sensors catastrophically fail.
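For instance, looking for relative changes rather than hard-coded values might look like this minimal Python sketch (purely illustrative; the function names are made up, and the numbers come from the readings quoted earlier in the thread):

```python
# Sketch of threshold-from-baseline calibration instead of hard-coded values.
# At startup the sensor is sampled with nothing covering it; a ball is then
# detected as a significant relative drop from that baseline.

def calibrate(samples):
    """Average a few startup readings to get the uncovered baseline."""
    return sum(samples) / len(samples)

def ball_present(reading, baseline, drop_fraction=0.5):
    """Detect a ball as a drop below drop_fraction of the baseline."""
    return reading < baseline * drop_fraction
```

Note that this is exactly the approach that still fails under saturation: if the baseline itself collapses to near the sensor’s minimum, there is no dip left to detect, no matter where you put the cutoff.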

Learning how to implement that level of redundancy should come much later in your learning process, after you have fully mastered the primary sensors in a somewhat controlled environment.

On the other hand, you could thank the EP for teaching you a hard life lesson way ahead of schedule. I think you now understand very well why I keep referring to Murphy’s Law all the time.

I am sure many experienced engineers who mentor on this forum will be more than happy to help you learn how to do proper sensor fusion, where you use multiple sensors at once to validate each other and detect abnormal events.
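As a taste of what that cross-validation could look like, here is a minimal Python sketch (purely illustrative, not VEX API code): a primary optical reading is only trusted when a secondary sensor using a different physical principle, such as a limit switch, agrees.

```python
# Minimal sketch of two-sensor cross-validation: when the two sensors
# disagree, fall back to the mechanical switch (which is immune to light)
# and flag a possible fault in the optical sensor.

def fused_ball_present(optical_says, switch_says):
    """Return (ball_present, fault_suspected) from two independent sensors."""
    if optical_says == switch_says:
        return optical_says, False
    # Sensors disagree: trust the mechanical switch, flag a fault.
    return switch_says, True
```

Real sensor fusion goes much further (weighting, filtering, modeling each sensor’s failure modes), but even a simple agreement check like this can detect the “optical sensor is saturated” condition automatically.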

As for Worlds, it will be held indoors, just like last year, and I would reasonably expect it to be consistent with last year’s environment, unless Kentucky Expo decided to switch all their lighting to LEDs or something like that. :slight_smile:


Yes, I was thinking this too. Having a backup is important. I work on spacecraft, and multiple backups and redundancies are common there. However, for robotics, these kids are lucky to get just one strategy working. Having multiple redundancies is a lot to expect.


Thanks a lot for your support. We do look for changes in sensor readings instead of hard-coded values, but the light made our value change practically zero, so we couldn’t do much about it. We ended up running fully manual.

We also considered using backup sensors; however, VEX only offers 8 legacy ports at the moment, all of which we’ve used up. I was really looking forward to the release of their legacy port expander, but as expected, it hasn’t been released yet. I’ve attempted sensor fusion as well, but we simply don’t have enough ports to put sensors on all our subsystems, much less combine readings.

This year will be my third year at Worlds, but our first using line sensors. It’ll also be the first year the vision sensor is used at Worlds in a competitive environment. I wanted to be absolutely certain Worlds would be sensor-safe, since others have mentioned that events they’ve been to used tungsten lighting, which seems to mess with sensors as well.

In any case, thanks a lot for commenting and trying to help us out; we appreciate it.


If you need to connect more digital sensors, like limit switches or even QEs, there is a temporary workaround:

DM me if you have questions and want to discuss specific sensor allocation with that method.

It is great that you are trying to use the Vision sensor! But I would be very cautious about its performance at Worlds, since this is the first year since its release and not all quirks have been discovered and fixed yet. You definitely need a plan B for it…


Thanks for the post, OP!

In the weeks leading up to Sacramento, our team worked overtime and put massive effort into developing a high-scoring yet reliable programming skills routine. We eventually achieved a consistent 26-point routine using line sensors for sensing caps and the vision sensor for shooting flags.

However, when we arrived at states, all our efforts instantly went down the drain due to direct sunlight on the skills field. The sun saturated the vision sensor’s readings and completely disabled the line sensors, causing not only our programming skills run to fail but our match autonomous as well.

Our programming skills score was reduced from 26 to a measly 2 points. We were devastated by the EP’s lack of understanding of our situation, and felt it was terribly unfair that our efforts had come to nothing just because of the venue.

Looking back on the situation, I do wish we had had a plan B! But I guess we can thank the EP for teaching us a valuable lesson, and we’ll be sure to implement some sort of redundancy in the future.

Enjoy the video :))

5327S Skills 26 Point autonomous (killed by sunlight)


Don’t get me wrong here, but I actually like venues with natural light coming in (I am biased; I work in a room with floor-to-ceiling, south-facing windows).

Lighting will vary from venue to venue. This may be an extreme case, or maybe not. The goal is for students to learn about the limitations of the components they are using and the range of environments in which they will operate. For example, cars operate from below -40 °F to above 120 °F…

Nowhere in the Game Manual does it specify the lighting environment for competition. Anecdotally, we know from past competition seasons that sensors can be difficult to use across different venues. As others have mentioned, it is worth considering alternative solutions, because on competition day you might have to implement a different plan quickly.

I know my MS team was considering a light sensor for ball detection in the intake as a possibility, but there was a lot of variability, and a limit switch appeared to them to be a better solution. I am sure that if one failed, they would go back to the other, because it was on their list of what is possible.

Glad you are moving up to Worlds. I would not be too hard on the EP. Worlds had issues with the fields during Skyrise, but by the time they figured it out, they could not change the environment, since too many teams had already been impacted by the flaw.

Definitely an error by the EP for not working through this with you. However, I place most of the blame on VEX for not supplying modulated sensors for VRC and IQ. This whole calibrate-and-pray approach is a terrible way to introduce kids to sensor systems when there is a perfectly good, and pretty cheap, way to fix it. Saturation is one thing, and the EP should fix that, but general changes in light conditions should be handled by the sensor itself so students get robust values.


Imagine doing skills at states. :slight_smile:

I really don’t like how this thread has drifted away from the fact that the EP (and whoever was working with her) didn’t take light into account and didn’t have the ability to fix it.

I have talked to the DVHS (5776) teams for years, and I highly doubt this was their first and only way to track the ball. But when the same system worked at every other venue they attended, and even on different fields where the light wasn’t a problem at this same event, I think expecting them to account for this is too much.

Yes, in the future they will have to deal with redundancy in the workplace, but when they have used all of the legacy ports available, there really isn’t much left for them to do.

At Arizona States we ran HS matches on three fields. While going through matches, we noticed that one of the fields was causing many more V5 disconnects than the others (I was the head ref). So when a couple of teams came up before eliminations and asked not to use that field, we conferred and decided to run only on the other two fields for elims. So yes, sometimes things happen, but as volunteers and EPs, we should strive (and I know most EPs in our region do) to keep outside factors from affecting competition or skills play.


Thanks, Justin. We’ve definitely experimented with different sensors in the past and found that line sensors were the most effective and port-efficient method of tracking balls. And in our 5 months of using this robot, we never had a single issue with them at regionals. It should be quite embarrassing for RECF as a whole that they can’t maintain regional tournament standards at a State Championship event.


To change gears here, what can be done for other teams to prevent these types of problems?
In the real world, redundant safeties are everywhere. Very few crucial components rely on only one thing. As others have said, having a plan B is a really good idea.
I would’ve used a potentiometer, an encoder with PID, or even a limit switch if things went wrong. But that’s more effort than should be necessary.
So now I’m left to wonder if an anti-slip light shield would’ve helped. One thing I’m personally good at is improvisation and adaptation. While you shouldn’t need those skills for factors outside your control (and I applaud you for trying to prepare anyway), sometimes there’s no other choice.
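As a sketch of what picking a plan B automatically could look like, here is a small illustrative Python snippet (all names and the saturation threshold are made up, not real VEX code): at startup, check whether the optical baseline looks sane, and pick the mode for the match accordingly.

```python
# Startup self-check that selects a sensing mode before the match: if the
# optical "uncovered" baseline looks saturated, fall back to a limit switch,
# or to fully manual control if no backup sensor is fitted.

SATURATION_FLOOR = 1000  # below this, the uncovered baseline is suspect

def choose_mode(optical_baseline, has_limit_switch):
    """Pick the indexing mode once, based on a startup sensor check."""
    if optical_baseline >= SATURATION_FLOOR:
        return "optical"
    return "limit_switch" if has_limit_switch else "manual"
```

The design point is that the decision happens once, at initialization, so the match code doesn’t have to second-guess a saturated sensor mid-run.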

If this thread has done nothing else, I hope it raises awareness among other EPs about lighting concerns, for the few remaining World qualifiers and for future events.


Thanks for the OP Sid.

First, congrats on your qualification to Worlds! As a competitor at Google, I can testify that your programming skills routine was one of the best and cleanest I’ve seen all year, and a small field variance shouldn’t have caused it to malfunction. But massive overexposure to direct sunlight? That’ll do it.

There was a similar problem at SD states. The fields were set up backwards, so the flags faced the same direction as the red and blue bleachers and the event staff, who were all wearing red shirts. One team with a vision sensor asked event staff whether they could buy a big white sheet to put behind the skills field to help the vision sensor. The event staff gave the ok, so the team’s adult adviser went out and bought a huge paper sheet. Upon the adviser’s return a few hours later, the event staff changed their mind, said he could not put the sheet behind the field, and then denied they had ever said he could.

So obviously we can point fingers and accuse individual people of being idiots and so on, but I think this reflects a bigger problem. There need to be guidelines in place for standard fields: lighting, background, properties of game elements, quality of floor tiles, etc. In games like ITZ and SS, anti-static treatment of the field floors significantly changed gameplay; this year, the variable stiffness of the flags has led to a lot of inconsistency; and of course, bathing a field in direct sunlight when teams rely on color, line, and vision sensors is absurd. VEX needs to put guidelines in place and make its products correctly, and the adults who host events need to be responsive to the requests of students, who are ultimately a lot more invested in the program than most of them are.

I’ll be leaving this program in a month and a half, but a lot of people here will be in vex for another year, two, or five. Getting these things right would be awesome.


We should encourage teams to try using sensors and attempt more sophisticated programming. Younger teams that encounter a situation like the one that occurred in Sacramento are likely to react by rejecting the use of the line sensor, or sensors in general on their next robot. This would be especially true if other teams from the same school, that had simple manual robots, fared better in the same situation. That’s not what we want.


I think it would be smarter if they had made the netting solid, to block out any background color distractions…


Unfortunately, these variances are bound to occur and are largely out of any competitor’s control. As much as I hate these imperfections, including BO1, I think this is something VEX competitors will have to deal with. One incident for one team won’t change VEX’s mind about lighting issues or flag and field defects. I guess part of being a VEX competitor is embracing the conditions that we, the competitors, can’t control, whether we hate them or not.