VEX Virtual Worlds 2020

If you watched the award ceremony for the TN Halloween Tournament yesterday, you would have heard an REC representative (please correct me if any part of this is wrong, as I am notably bad at titles) announce that the same algorithm used for the 2020 Virtual Worlds competition correctly predicted the winners of past Worlds competitions.

I, and I'm sure many others, was (or would have been) very surprised by that statement, for what I understand to be legitimate reasons: several results of the Virtual Worlds competition struck many people as very surprising.

How the algorithm worked:

  1. Each team's top 15 scores were saved for that team
  2. In each match, one saved score is randomly drawn for each team on an alliance (a score cannot be used again)
  3. The alliances' average scores are compared, and the alliance with the higher average wins

[Alliance Selection]

  1. Each team picks the team directly below it in the rankings
  2. Remaining scores are used for Division Elimination matches

Given this methodology, the system seems bound to produce a large amount of randomness in match results.
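To make that concrete, here is a minimal sketch of the match logic as I understand it from the description above. The team numbers and score pools are made up purely for illustration, and this is just my reading of the system, not RECF's actual code:

```python
import random

def simulate_match(red_alliance, blue_alliance, score_pools):
    """Decide one match under the described system.

    red_alliance / blue_alliance: lists of team IDs.
    score_pools: dict mapping team ID -> that team's remaining saved
    top-15 scores (drawn scores are removed so they can't be reused).
    """
    def draw_average(alliance):
        drawn = []
        for team in alliance:
            pool = score_pools[team]
            # Randomly pick one of the team's remaining saved scores and consume it.
            drawn.append(pool.pop(random.randrange(len(pool))))
        return sum(drawn) / len(drawn)

    # Higher average wins (ties arbitrarily go to blue here; the description doesn't say).
    return "red" if draw_average(red_alliance) > draw_average(blue_alliance) else "blue"

# Made-up teams and score pools, purely for illustration.
pools = {
    "1234A": [95, 90, 88, 84, 80],
    "5678B": [70, 68, 66, 60, 55],
    "9012C": [92, 89, 85, 81, 77],
    "3456D": [75, 73, 71, 65, 62],
}
print(simulate_match(["1234A", "5678B"], ["9012C", "3456D"], pools))
```

Even with the top-15 restriction, which particular scores happen to be drawn can swing any given match, which is where the randomness comes from.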

Our surprise could very well be unwarranted, but I think it would be useful to see these results for ourselves so we can collectively better understand the accuracy of Virtual Worlds 2020. In particular, it would be valuable to see how this algorithm performed when tested on past Worlds events (and which events those were), especially if it performs as well as the above statement claims.

I am in no way trying to take credit away from the teams who ranked high in the Virtual Worlds competition; I know that many of them were super competitive teams that deserve some level of recognition. I also appreciated RECF's attempt to recreate Worlds in light of the growing threat of Covid-19.

8 Likes

I mean, does it really matter that the algorithm was flawed? They're never going to create an algorithm that can accurately determine the winners, and when there are pretty much no stakes involved (it's not a real event, so nobody really cares about the winners/scores tbh), I don't see how it's that important a thing to worry about.

6 Likes

Yes and no. There are no "real" stakes; however, I did note that the results from the VEX Virtual Worlds appear to be showing up in the team stats as if they were legitimate results. I find this a little strange, and potentially very frustrating for those who think their actual results would have been significantly better (rightly or wrongly).
Thinking more on it, I really don't appreciate the system putting up results from an algorithm as if they actually happened. Looking back on those stats in the future, one might not even know that the event itself didn't occur and could assume the results are accurate.

Also, it just seems like a mighty strong claim for the RECF to make with no backing data, especially considering the nature of the competition. If someone stated something similar in their engineering notebook, I'd be looking for more info for sure…

14 Likes

I am really taking the results with a pinch of salt.

And I believe Dan Mantz (or was it someone else from REC?) explicitly mentioned that this Virtual Worlds was just for fun.

As for the accuracy of the results or the effectiveness of the algorithm… well… just consider this: I think none of the teams from China even made the Worlds semi-finals… kinda hard to imagine that happening at an actual Worlds. Plus… they ranked 8059A at #64 out of 93 teams!! (ok… I am biased there. lol)

13 Likes

I definitely agree with this. The only reason I even thought to make this post was that the statement contradicted this viewpoint (and I wanted to see the tested results for it).

Also, I'm not sure whether Dan Mantz said that, but in the stream I linked, Grant Cox did say that the simulation was just for fun.

2 Likes

Just for fun, and yet red trophies were still mailed out :confused:

I’m not saying that the teams who received them didn’t deserve them, but I consider it to be pretty disrespectful to send some of the most distinguished awards in the largest robotics competition in the world to teams based on a flawed algorithm.

Speaking as one of the teams that made it to the world finals, I 100% do not believe this claim.

If Worlds had happened, we would not have made it to the finals. That's just a fact. I definitely understand the frustration of teams better than us.

About Backtesting

If they had backtested this algorithm on Starstruck and Turning Point, there is no way the results would have been right. I don't know if they made up the statistic, only backtested certain games, or what, but either way it seems a bit messed up.
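For what it's worth, a backtest like that wouldn't be hard to run if the per-team score data were published. Here's a toy sketch of one way to do it; this is my own simplified version (no alliances, scores get reused across trials, plain single-elimination bracket), not RECF's actual procedure:

```python
import random
from collections import Counter

def backtest(team_scores, actual_champion, trials=1000):
    """Repeatedly run a simplified single-elimination bracket where each
    head-to-head is decided by drawing one saved score per team, and
    count how often the real-world champion wins the simulation.

    team_scores: dict of team ID -> list of that team's saved scores.
    actual_champion: the team ID that won the real event.
    Returns the fraction of trials in which the simulation "got it right".
    """
    wins = Counter()
    for _ in range(trials):
        teams = list(team_scores)
        random.shuffle(teams)
        while len(teams) > 1:
            next_round = []
            for a, b in zip(teams[0::2], teams[1::2]):
                # Each head-to-head: draw one score per team, higher score advances.
                sa = random.choice(team_scores[a])
                sb = random.choice(team_scores[b])
                next_round.append(a if sa >= sb else b)
            if len(teams) % 2:          # odd team out gets a bye
                next_round.append(teams[-1])
            teams = next_round
        wins[teams[0]] += 1
    return wins[actual_champion] / trials
```

If that fraction came out low on Starstruck or Turning Point data, it would contradict the claim; if it came out high, fair enough. Either way, publishing the numbers would settle it.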

9 Likes

I mean, does it really matter though? Doesn't RECF have bigger things to think about right now than listening to people complain about the results of a competition that didn't even happen? It was nice of them to do it even though they didn't have to, and at the end of the day it really didn't mean anything to any of the teams, the pretend winners or the pretend losers.

12 Likes