Inconsistent skills scoring.

It has come to the attention of multiple teams that the Tournament Manager scoring program allows you to add the highest-stack bonus onto skills runs. We have looked at some teams' skills scores and found values such as a 109 driver run, which seems very likely to have been scored with a highest-stack bonus. The only two ways to get an odd score are highest-stack bonuses and goals in the 5-point zone, and the latter is unlikely in this case (it would require at least 7 cones along with all the goals), although not impossible. We are worried that some skills scorers may be including highest stack in their scoring, inflating teams' scores and making it much more difficult to place in skills. (I didn't know which topic to post this under, so I chose General Forum.) Please discuss your views on this topic and possible solutions.

It is extremely unlikely that teams scoring over 100 points are using the 5-point zone. If a robot can put 7 goals in the 20- and 10-point zones, it would be pure insanity to throw away 5 points and try to make up the difference on cones for no good reason; I do not believe anyone is doing that. What's more, literally the only odd scores we see over 100 are 107 (the common 102 run plus a stack bonus) and 109 (the common 104 run plus a stack bonus). If teams WERE putting a goal in the 5-point zone and stacking on it, we would expect to see 101, 103, and 105 runs, none of which appear in skills even once.

As it stands, bad scoring is going to determine which teams go to Worlds. 5 points is a LOT once you are over 100, and if bonuses are awarded on 2 goals, then 10 points is an almost insurmountable advantage (and pretty much invisible, because the total is no longer odd). I don't think it can be overemphasized how badly this needs to be fixed immediately if Worlds qualification through skills is to maintain integrity. The fix is really simple: don't allow stack bonuses to be awarded in the scoring tool, or record how each score is made up so that any mistakes can be subtracted later. Failure to change this is going to make the World Championship a celebration of teams from regions with scorers who don't know the rules.

Note that this is not a criticism of teams with 107 or 109 scores. Some of those runs could legitimately have used 5-point goals (everyone over 100 knows they didn't, but I have to pretend they could have, just in case), and it's not their job to police accurate scoring. That said, you do have to wonder whether it's possible not to notice you've been given 5 free points when you are good enough to nail a 100+, but I'll give the benefit of the doubt and say those teams must not know the rules either. This issue is only going to get worse later in the season as the ceiling is pushed higher, and wrongly awarded stack bonuses become undetectable whenever an even number of them is given. This has to be fixed right now.
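The parity argument above can be sketched in a few lines of Python. The point values are assumptions taken from the In the Zone scoring discussed in this thread (cones worth 2 points; mobile goals worth 20/10/5 by zone; a 5-point highest-stack bonus that is not legal in skills; up to 8 goals total), and parking is ignored for simplicity:

```python
from itertools import product

# Hedged assumptions from In the Zone scoring as discussed in this thread:
# cones = 2 points, mobile goals = 20/10/5 by zone, highest-stack bonus = 5
# (NOT legal in skills), at most 8 mobile goals; parking is ignored.
CONE, ZONES, STACK_BONUS, MAX_GOALS = 2, (20, 10, 5), 5, 8

def reachable_scores(allow_five_zone=True, stack_bonuses=0, max_cones=40):
    """Every total reachable from goals and cones
    (plus any illegally awarded stack bonuses)."""
    zones = ZONES if allow_five_zone else ZONES[:2]
    totals = set()
    for goals in product(range(MAX_GOALS + 1), repeat=len(zones)):
        if sum(goals) > MAX_GOALS:
            continue
        base = sum(g * v for g, v in zip(goals, zones))
        totals.update(base + c * CONE + stack_bonuses * STACK_BONUS
                      for c in range(max_cones + 1))
    return totals

# With no 5-point-zone goals, every reachable total is even, so an odd
# score implies either a 5-point goal or a stack bonus:
assert all(s % 2 == 0 for s in reachable_scores(allow_five_zone=False))
# The common even runs behind the suspicious odd scores are reachable
# without the 5-point zone (107 = 102 + bonus, 109 = 104 + bonus):
assert 102 in reachable_scores(allow_five_zone=False)
assert 104 in reachable_scores(allow_five_zone=False)
```

Since every point source except the 5-point zone and the stack bonus is even, any odd total must include an odd number of those 5-point items, which is exactly why 107 and 109 stand out.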

This was fixed in a previous TM release (either the latest or the one before it). The real issue is that early-season scores were entered by people who did not realize that the highest-stack bonus is not part of skills scoring. This is a training issue for the people who do the scoring.

I would not be so alarmed by this.

And they fixed the scores. When we uploaded the data, they could see which teams got these points in skills, and they just adjusted them.

Where are you seeing 109? There are no 109s as far as I can see in the global skills rankings.

I think their point is that there should be no situation where a skills run has an odd score, so any odd score deserves scrutiny as to whether it was mis-scored by awarding highest-stack bonus(es).

109 was specifically mentioned; I just want to know where that was observed.

I don’t see 109, but I do see 95, 99, 97, 77, and 79, and those are just in the top 80. Those are only driver skills, though, because IMO a 5-point mobile goal is not reasonable for a score that high; in programming skills it is more plausible.

It’s still visible there because he hasn’t done a full update. All of 6842Z’s scores on RobotEvents correctly reflect the 104 now from our first tournament.

Jacob 8757 and StimpNZ, it seems you are not aware of the official response to this question that Grant Cox made over a month ago. You may want to review it, as I believe it answers your questions.

Jacob 8757 and StimpNZ both referenced a score of 109, and StimpNZ referenced a score of 107. The 109 score does not exist (sorry, I just double-checked, and the 107 does exist but is not even in the top 100). My assumption, therefore, is that this complaint is being generated by people looking at that site. That site is unofficial and, more importantly, it is not accurate, so it should not be relied on, especially when raising questions such as this. Unfortunately, when that site publishes outdated data for which the correct data is readily available elsewhere, it does everyone a disservice, as can be seen here.

I spot-checked all of those scores and they all do indeed come from scoring in the 5 point zone. Hopefully this puts your mind at ease.

vexDB does not publish false information; @nallen01 has simply yet to rescrub the skills scores to update them. At the time of their publishing, they did reflect RobotEvents’ scores.

Thanks, it does

vexDB is showing (“publishing”) incorrect/outdated scores right now. The OP and StimpNZ presumably used that information when creating their posts. I understand why it’s incorrect (it hasn’t been refreshed in over a month), but that doesn’t change the fact that it is incorrect/outdated data.

I appreciate what vexDB is trying to do, but in this case it’s causing harm. Furthermore, there isn’t even a mention or disclaimer anywhere that the data is unofficial and may be out of date.

LOL. Looks like someone may be jealous of the great job a team does for free in their spare time, one that makes it easier to find lots of information than the official site, which still has incorrect (or at least misleading) info on it.

The official site still refers to the top 50 skills scores getting to Worlds when, in fact, there will be teams who score in the top 50 who will not earn the automatic bid, and teams well outside the top 50 who will, due to the change this year.

This year, it is the top 35 high school teams and the top 15 middle school teams. If we went by today’s scores, 8 of the top 15 middle school teams would be outside the overall top 50, and 8 high school teams inside the top 50 would not get the automatic bid.

I know it would probably take quite a bit of time and a very high level of skill to edit the web page so it could show teams where they rank, whether high school or middle school, and changing the verbiage on the skills page would be a task way too hard and confusing to do, but one can only dream.

Maybe we could get the kids from to help with that.

I am sure someone will be defensive and feel a need to chastise me for my remarks but that will just illustrate the accuracy of my assessment.

Keep in mind that the incorrect information originates from a bug in skills scoring in Tournament Manager. The skills scoring is still not completely fixed: you can no longer award the highest stack, and the cone count is limited correctly, but you can still score more than 8 mobile goals, which I would consider a problem even though it highlights the text in red. So even though vexDB can’t be relied on right now, the root cause comes back to an error in TM that DWAB took too long to fix (at least that’s my opinion, but I’m sure others agree). I still look at vexDB first because, even though it’s made by one person who isn’t paid for his work, it has a nicer interface and more features than RobotEvents, which I assume is made by multiple paid developers (though if it’s not, that would explain a lot).
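To illustrate the fix people are asking for, a scoring tool could reject illegal skills entries outright instead of only highlighting them in red. This is a hypothetical sketch, not TM's actual code; the function name, fields, and the 8-goal limit are assumptions based on the In the Zone rules discussed in this thread:

```python
# Hypothetical skills-entry validation (NOT Tournament Manager's code).
# Assumed limits from this thread: at most 8 mobile goals total, and no
# highest-stack bonuses in skills. Point values (goals 20/10/5, cones 2)
# follow the In the Zone scoring discussed above.

def validate_skills_entry(goals_20, goals_10, goals_5, cones, stack_bonuses=0):
    """Return the skills total, or raise ValueError for an illegal entry."""
    errors = []
    if min(goals_20, goals_10, goals_5, cones, stack_bonuses) < 0:
        errors.append("counts cannot be negative")
    if goals_20 + goals_10 + goals_5 > 8:
        errors.append("more than 8 mobile goals scored")
    if stack_bonuses:
        errors.append("highest-stack bonus is not awarded in skills")
    if errors:
        raise ValueError("; ".join(errors))
    return goals_20 * 20 + goals_10 * 10 + goals_5 * 5 + cones * 2

# A common 104 run: 5 goals in the 20-point zone plus 2 cones.
print(validate_skills_entry(5, 0, 0, 2))  # -> 104
```

Rejecting the entry at input time, rather than flagging it, would have prevented the inflated 107/109 scores from ever reaching RobotEvents.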

If this was fixed in a TM release, that’s great to hear. I checked my copy last night and the stack bonus was still available, so I guess I’ll have to do an update myself. I did know that some scores had been “fixed” on RobotEvents, but that puts someone in the position of judging when a score is legitimate, which is easy when a single 5-point bonus is involved but pretty much impossible if two are awarded. Also, vexDB is sometimes the only source of data we have: if you’ve checked the official VEX U skills rankings recently, you’ll see there aren’t a lot of scores on there (read: none). It’s good that the bad skills scores that had been uploaded were fixed, but the most important thing is that no new ones can be added. Thanks for the replies, everyone.


As of this moment, 6842Z (PigPen) has a score of 106. That’s all fine and dandy. However, their original RobotEvents skills score was 107. It is highly unlikely that this score genuinely changed by exactly one point, because many different things would need to change to make that sensible. Multiple teams (to my knowledge) with these kinds of scores have had them changed by a point or so just to make them even, which makes absolutely no sense. Please address this issue; it is not an adequate fix for the problem the community brought up last time.

Additionally, this entire issue came about because Tournament Manager allows the input of inaccurate scores. I find it extremely crucial that DWAB Tech or RobotEvents also address the problem in Tournament Manager itself.

I feel like they may have simply scored 106. Here they are with a 104 in early November.

Here’s what I think happened: the score of 107 is a separate run from what is now the score of 106. I was watching the skills scores as they happened at this event, and I don’t remember 107 ever being 6842Z’s driver skills score there (or at the very least not before the 106 happened).

This isn’t PigPen, but I can confirm that 6842Z did get 104 at this event, where I was running skills.