Video verification of skills runs

Perhaps next year’s game is the board game “Sorry!” blown up onto a 12-foot field and played with robots. When you land on an opponent’s robot, you must catapult it back to its starting tile, somehow without damaging it. :stuck_out_tongue:

This seems fair to me. I’m sure Worlds will end up with at least one lost spot.

Moving forward, perhaps some measures should be taken next year to ensure skills scores are accurate, seeing as they are a major part of the current qualification system. A reasonable measure would be to require a video of each run, submitted to the RECF rep in each region, so that if a skills score is disputed, the concrete evidence is there. This would be fairly easy, as I’m sure a skills-scoring volunteer with a smartphone would be more than willing to record the runs.
Often fewer than 20 teams in total do skills runs at an average-size tournament, so there would only be 30-40 minutes of footage per tournament. This footage would only need review when a score seems out of place or unlikely.

This seems a little inadequate for dedicated skills challenge events, though, where there would be about 400-480 minutes of footage per tournament.

Also, even if volunteers had smartphones, I’m not sure how many would be willing to record the runs and send them to the RECF. A video of every run seems a bit too much.

On the other hand, the only other verification methods I can think of are to have teams record their own runs or to record a practice run that scores about as much as the submitted score. It would be nice if all teams followed the honor system, but obviously that doesn’t always happen.

Any other ideas? It seems that verification is not always possible, but when it isn’t, other teams often start to question scores.

480 minutes of footage?? I do not remember a tournament I went to, besides Worlds, where there were 480 skills runs. At 1 minute per skills run, the minutes of footage are roughly equal to the number of runs. An average local tournament has 20-30 total skills runs, meaning there would only be 20-30 minutes of footage.

A video doesn’t need to be perfect quality. It just needs to show that a team managed to score the points recorded by the scorekeepers. My guess is that, between the Top 30 Robot and Programming Skills scores, plus the spots at state and regional competitions allocated to skills, about a fourth of the teams at Worlds are determined by skills scores. In addition, state competitions pick teams off their waiting lists by skills score. It seems to me like this kind of verification is worth it.

I think the important thing to upload is each team’s best run at the event (and it’s only really important for teams who beat their previous best score). That shouldn’t be all that much video.

To fully verify tiebreakers through videos, you would need to upload every run, but I don’t think that’s all that important. If a team can show that their highest run is genuine, that should almost always be enough.

I would have thought that (depending on the game) a couple of photos showing the final condition of the field and the robot, preferably with the team number visible, would be enough to verify a score.

The problem is that the skills field often gets less attention than the competition fields. At many events I see less experienced volunteers using a phone for timing, and almost never using computer control to enable and disable the robot. Sometimes I see teams that are allowed to make multiple runs in a short period of time if they make a mistake in the first few seconds. At a recent competition I saw one team try perhaps 25 times in the space of 15 minutes before they achieved a score they were happy with.

So I agree that, as these scores are used as a means of qualifying for the World Championship, we need a more formalized and consistent method of running the skills fields.

I may have been unclear, but when I mentioned “Skills Challenge” I meant an event hosted specifically for skills runs. Often there are two fields continually running skills challenges for 8+ hours; assuming a 1-minute run followed by a 1-minute reset, each field produces about 240 minutes of run footage, so roughly 480 minutes across both fields.

This is true in theory, but the reality in most cases is very different. Most skills competitions are fairly small, often don’t last a full 8 hours, don’t run at constant 1-minute cycles, and usually have only 1 field. For example:
http://www.robotevents.com/robot-competitions/vex-robotics-competition/re-vrc-15-2755.html
17 teams signed up for a skills event is a fairly good turnout. In total, the teams made 40 attempts that day, so only about 40 minutes of footage would have been taken.

Another issue with this (aside from the ones you pointed out) is that a volunteer at a tournament might know, or be part of, the host team. It’s conceivable that a team could set up a scored field by hand and simply photograph it to submit a score. A video is not impossible to fake, but it is much harder to fake than a single photograph.

You guys have to understand that in an ideal world, yes, video evidence of scores would be nice, but you have to look at the scale. The suggestion that regional managers review video evidence is A LOT of work. Most regional managers are in charge of multiple regions, and some are in charge of whole countries. As I’m sure you’re all aware, there are anywhere from 1 to 3 events going on in a single state almost every Saturday during the regular season. Multiply that by the number of states/countries a regional manager is in charge of, and you’re easily suggesting that a regional manager review not 1 video reel from an event, but somewhere between 9 and 12 in one weekend.

What you’re thinking of is an ideal scenario. Yes, skills runs are 1 minute each, but in the video reel you also have to make it easily identifiable which team is currently running on screen. Also, keep in mind that some events allow unlimited skills runs. You’re then suggesting that somebody look through the entire video to find out whether some teams’ scores are legitimate. Shooting video requires equipment; a phone camera can record a run, but now we need somebody willing to use their phone to record every skills run, edit the footage, and submit it to somebody to review.

The fact of the matter is that the RECF simply does not have enough staff to support video recordings of skills scores. The staff listed on their website is pretty much it, and they do more than just answer questions; they actively try to start programs in other countries and regions, in addition to all the usual work like organizing Worlds and state competitions. If they hired more people, yes, it could work, but please realize that video evidence for every competition is A LOT of work logistically.

I think only the videos of teams whose scores are called into question, or at most the videos of teams who qualify for Worlds through skills (both the top 30s and those who get one of their state’s spots), would be reviewed. People would not have to watch every video in order to verify scores.

That being said, I do agree that it is somewhat unrealistic. That is a lot of video footage to store somewhere, and honestly a lot of people have a very difficult time getting one video off their phone, let alone a whole competition’s worth. And if you just have volunteers hang onto the videos and note who recorded the skills runs at each event, then the RECF has to go and track down some volunteer if scores are questioned later on.

I’m not sure I agree with the idea; however, one way to implement it would be a dedicated PC (or iPad) application that would enable/disable the skills run, save the scores, and record and archive the video using a webcam. A mini Tournament Manager type of thing dedicated to skills.

Yes, I like that idea. It’s pretty much just the new TM app, with a video recording feature that automatically uploads the videos to the server together with the scores.
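
To give a feel for how little the recording half would actually take, here’s a rough Python sketch, assuming a webcam readable through OpenCV. Everything here (the function names, the archive folder, the JSON score log, the team number) is invented for illustration; TM doesn’t expose anything like this.

```python
# Minimal sketch of a dedicated skills-station recorder.
# Assumes a webcam readable via OpenCV (pip install opencv-python).
# All names and file layouts here are hypothetical.
import cv2
import json
import time
from pathlib import Path

ARCHIVE = Path("skills_archive")
ARCHIVE.mkdir(exist_ok=True)

def record_run(team: str, run_seconds: int = 60, camera_index: int = 0) -> Path:
    """Record one skills run from the webcam and return the video path."""
    cap = cv2.VideoCapture(camera_index)
    if not cap.isOpened():
        raise RuntimeError("Could not open webcam")
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out_path = ARCHIVE / f"{team}_{int(time.time())}.avi"
    writer = cv2.VideoWriter(str(out_path),
                             cv2.VideoWriter_fourcc(*"MJPG"),
                             fps, (width, height))
    start = time.time()
    while time.time() - start < run_seconds:  # one 60-second run
        ok, frame = cap.read()
        if not ok:
            break
        writer.write(frame)
    cap.release()
    writer.release()
    return out_path

def log_score(team: str, score: int, video: Path) -> None:
    """Append the score and its video file to a simple JSON-lines log."""
    record = {"team": team, "score": score,
              "video": video.name, "time": time.time()}
    with open(ARCHIVE / "scores.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")

if __name__ == "__main__":
    video = record_run("1234A")      # hypothetical team number
    log_score("1234A", 95, video)    # score entered by the referee
```

The point being that the hard part isn’t the recording itself; it’s tying each clip to a team and a score so nobody has to scrub through raw footage later.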

You’re misunderstanding what’s being proposed here.

Teams would upload videos of their skills runs to a site run by the RECF. Those videos would be very short (just over 1 minute each).

If a score qualified a team for Worlds and needed to be verified, that could easily be done by watching the video for that score. A minute of video for each team that qualifies through skills isn’t a huge amount of footage. All the setup involved would be one more submission box on robotevents.com where teams can enter the URL of an unlisted video on YouTube.

Yes, making sure they have a video of their best score uploaded is a bit more work for teams. But compared to the work involved in building a robot that can qualify for Worlds (and the work involved in actually going to Worlds), it’s not that big of a deal. And in many cases events will have enough volunteer support to do this for the teams.

My worry is that with teams uploading videos of their skills runs themselves, there might be questions about the authenticity of the videos. In that case, the RECF would have to contact the event partners and ask them to verify that the run in the video happened at their event. Like I said before, this is an issue that would be easier to handle with a lot more staff, but unless the RECF actually hires a bunch of people to review all that footage, I’m pretty sure we won’t be seeing video replay anytime soon.

I believe the point of video verification would be that, when something is questionable, that specific run could be reviewed. Say a team asked the RECF, “Did team X really score N?” As JPearman mentioned, hopefully there would be a skills portion of TM where the RECF could pull up the runs in question, and it would take all of 5 minutes to look at and verify.

I think an important part of video verification is the use of a consistent camera (hopefully distributed by the RECF) that integrates with TM software to crop the video (or save timestamps in the video) for each run. If good software is made for this, I don’t see video verification being much hassle at all (plug in your camera and go). In fact, checking all the teams’ top scores from an event should take no more than a few minutes for staff members. If the software isn’t developed as it should be, though, this could create a lot of issues and waste a lot of time, as you’re suggesting.
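
As a rough illustration of the timestamp approach: rather than cutting clips, the scoring software could log each run’s start time against one continuous recording per field, and a reviewer could seek straight to a team’s best run. The CSV layout and function names below are made up for the sake of the sketch.

```python
# Sketch of the timestamp-index approach, assuming one continuous
# recording per field per day. The index file format is invented.
import csv
from datetime import datetime, timedelta

def log_run(index_file: str, team: str, score: int, start: datetime) -> None:
    """Append one run's team, score, and start time to the day's index."""
    with open(index_file, "a", newline="") as f:
        csv.writer(f).writerow([team, score, start.isoformat()])

def seek_offset(index_file: str, team: str, recording_start: datetime) -> timedelta:
    """Return how far into the day's recording a team's best run begins."""
    best = None
    with open(index_file) as f:
        for t, score, start in csv.reader(f):
            if t == team and (best is None or int(score) > best[0]):
                best = (int(score), datetime.fromisoformat(start))
    if best is None:
        raise LookupError(f"No runs logged for {team}")
    return best[1] - recording_start
```

With something like this, “check team 1234A’s top run” becomes a single seek in a video player instead of scrubbing through hours of footage.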

I think you’re misunderstanding what’s being proposed too? My understanding was that we’re talking about a camera that records the skills fields at each event and is later used to verify “did this team really score this amount?” To me, having teams upload videos themselves seems like it could lead to bad things… perhaps I’m misunderstanding as well?