I know this might well come across the wrong way. I don’t mean to disrespect team 4004X, or the organisers of VEXPO 14. It’s also possible that the RECF is already aware of this issue.
However, I feel the subject of this post is important and this is the only way I can think of to make sure the issue is fixed.
As you may know, there was a tournament called VEXPO 14 held in Tahlequah, Oklahoma in November 2014. Among the skills scores uploaded were two abnormally high ones - 65 by 2398E, and 48 by 4004X. The 65-point run has since been removed because it was clearly a mistake. The reason I am making this thread is that the 48-point run is also clearly a mistake, but it hasn’t been removed. It is currently still taking up a spot in the global top 30. This is a problem because it looks increasingly likely that 48 will end up being a qualifying score at the end of the season (which is very soon).
To make the case that this score of 48 was in fact a mistake, here are the reasons why we ought not to believe that 4004X scored 48 points at VEXPO 14:
We know scoring at the event was unreliable.
The event submitted a skills score that was later taken down from robotevents.com because it wasn’t accurate. That by itself is probably grounds to disregard any other scores from the event, but at the very least it means there’s a good chance other mistakes were also made.
We can also see this from the scores at the low end of robot skills - a robot skills run can only end with a score of 1 or 2 points if the robot actively removes cubes from tiles. That seems less likely than the runs simply being recorded incorrectly.
A forum member claims that the Robot Skills award was not given at the event because of uncertainty about the scores.
Despite much discussion, no one in the community has come forward to back up the claim that 4004X scored 48 points at this event.
Unlike 2398, who came forward to say that the score recorded for them wasn’t accurate, neither 4004X nor anyone who knows them has (to the best of my knowledge) said anything on the forum or in the wider online Vex community to support the 48 point claim.
The score was very high for the time it was scored.
The red point is 4004X’s supposed score. It doesn’t look all that remarkable, unless you know that all the green triangles represent just one school in Singapore (Hai Sing Catholic School). It wasn’t until nearly two months after VEXPO 14 that a team other than Hai Sing managed to beat 48 points. If they really did score 48 then they were way ahead of the vast majority of US teams, let alone Oklahoma teams.
No other 4004 team has a robot skills score over 14 this season.
Not all organisations share people and resources between their teams, but they do tend to at least share ideas. It would be very strange to see an organisation where the best team is setting world records while the second-best team can only manage to score four cubes on posts in a minute.
4004X’s OPR was low.
OPR is a measure of a team’s ability to score points in competition matches (find out more here: http://vex.us.nallen.me/extras/ranking_methods).
If 4004X had actually scored 48 points in skills at VEXPO 14, they would have had a significantly lower OPR than any other team scoring more than 46 points in skills at any other event.
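For anyone unfamiliar with how OPR numbers like these are produced: OPR is usually calculated by solving a least-squares system over all qualification alliances, so each team gets the point contribution that best explains the alliance scores it was part of. The sketch below illustrates that general approach with made-up match data; the exact method used by the site linked above may differ in its details.

```python
# Minimal OPR sketch: estimate each team's point contribution by
# least squares over alliance scores. All data here is hypothetical.
import numpy as np

def opr(alliances, scores, n_teams):
    """Solve A @ x ~= b, where each row of A marks the teams on one
    alliance and b holds that alliance's score; the solution x is
    each team's estimated per-match contribution (its OPR)."""
    A = np.zeros((len(alliances), n_teams))
    for row, teams in enumerate(alliances):
        for t in teams:
            A[row, t] = 1.0
    x, *_ = np.linalg.lstsq(A, np.asarray(scores, dtype=float), rcond=None)
    return x

# Hypothetical event: 4 teams, 6 two-team alliances.
alliances = [(0, 1), (2, 3), (0, 2), (1, 3), (0, 3), (1, 2)]
scores = [30, 70, 40, 60, 50, 50]
print(opr(alliances, scores, 4))  # → approximately [10. 20. 30. 40.]
```

In this toy example the system is consistent, so the estimates recover each team's true contribution exactly; with real match data the fit is approximate, which is why OPR is an estimate rather than a measurement.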
Teams that are able to set world record robot skills scores are usually well known and come from areas with other very competitive teams. It would be very strange for a team to come out of the blue and score a world record in robot skills at an event like this - where their match performance is decent but not great, and where another world record skills score just happens to be entered by mistake. As such, it seems highly unlikely that this is a genuine score. Since there is good evidence that the scores recorded at this event were unreliable, and since this score may cost a deserving team a World Champs qualification, 4004X’s score of 48 should be removed from the robot skills rankings.
I believe this is an important issue, not just for the one team who could lose their qualification spot but for the whole global skills qualification system. The system relies on trust - event partners have a responsibility to make sure that skills runs are properly set up, well refereed and accurately reported. To maintain that trust, I think the RECF should be more careful about the scores it accepts - including not accepting highly implausible scores from events that are known to have reported other scores inaccurately.
Thank you for reading.