We had a situation where our alliance team dropped their robot one minute before the match. The judges called them up to the stage and gave a 30-second grace period. The robot was in several pieces, so only one team was up on stage. What is the rule, and how is the score determined for such a match?
Thanks Steve. We did score during that match while the other team stayed at their pit table trying to reassemble. The judges did score that match, but we were awarded zero. In effect, it was as if the match had been nullified.
So if an alliance partner is not present, do both teams get zero?
We are IQ Elementary team 15A. This was at Mt. SAC on 11/15, our last pre-selected alliance match of the day. We had a good score of over 100 but were given 0 points. I was not paying attention and thought we got that score; I only found out 2 days later that we got a 0. I thought I would request clarification from official VEX staff.
Edit: I realized I should post in the official Q&A sub-forum if we need an answer from the staff. I will post there as well if they do not reply here…
It looks like you’re seeing a bug in the software for how it displays results on the web. If you download the CSV file (link at the bottom of the page) it shows team 15A received 51 points, and team 2151A received 0.
I think they assumed that 99% of the time, both teams will receive the same score, so they simply show one score.
The results in the CSV file appear correct. Wouldn’t you agree?
Yeah, you are right: a software bug. I did not check the CSV. I will check with the team whether it was scored correctly. So we got the 51; looks fine.
Edit: This might also mean that the same software may not calculate the final scoring (ranking) correctly. If the same algorithm evaluates our score as 0 and the 2 lowest scores are dropped, it would have dropped the wrong ones. For example, 51 was not our lowest score, so if the 0 was dropped instead, the ranking would also be in error. I will have to validate it manually.
Update: It looks like the software bug also affects how the scores (rankings) are displayed during finals determination, at least visually. Checking manually, the final scores appear to have been calculated with the wrong score dropped (when discounting the 2 lowest). So while the final score in the CSV may be correct, it was not calculated correctly, and this would affect the next rounds. I will post this as a bug in the official sub-forum. Thanks.
We take concerns regarding the rankings in Tournament Manager very seriously, and so I’ve investigated your allegation of a bug in the rankings calculation and what you are stating does not appear to match the available data. I opened the tournament database for the event in question and found that team 15A earned qualification scores of 14, 30, 41, 51, 57, 65, 73, and 106. I confirmed that Tournament Manager dropped the 2 lowest scores, leaving scores of 41, 51, 57, 65, 73, and 106 for a total of 393. The total of 393 points matches what is displayed in the Tournament Manager for qualification rankings and is the total which was used when comparing team 15A to the others at the event in order to populate the Finals matches. This placed team 15A in 2nd place after qualifications and caused them to be paired with the #1 seed, 10600A, in Finals Match #6.
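For readers who want to check the arithmetic themselves, the drop-the-two-lowest calculation described above can be sketched in a few lines of Python. This is just an illustration of the ranking rule as stated in this thread, not Tournament Manager's actual code, and the function name is mine:

```python
def qualification_total(scores, drop=2):
    """Sum a team's qualification scores after discarding the `drop` lowest."""
    if len(scores) <= drop:
        return 0  # assumption: with too few matches, nothing is left to count
    # Sort ascending, skip the lowest `drop` values, sum the rest.
    return sum(sorted(scores)[drop:])

# The scores reported for team 15A above:
print(qualification_total([14, 30, 41, 51, 57, 65, 73, 106]))  # -> 393
```

Dropping 14 and 30 leaves 41 + 51 + 57 + 65 + 73 + 106 = 393, matching the total staff quoted.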
Does that satisfy your concern? If not, can you please elaborate further on how you feel the wrong scores were dropped?
Please note that the display of scores on RE.com is unrelated to any calculations performed within Tournament Manager for the rankings or placement in the finals. The data you see in the .CSV file is the data that Tournament Manager calculated and uploaded. After the upload, RE.com uses the data for display purposes, but this in no way affects the calculations or operation of the tournament.
I hope this helps clear up some concerns raised in this thread.
I should have pointed out that there are two different pieces of software in use: Tournament Manager, which… well… manages the tournament, and the web software that displays the results in a pretty fashion for us to review. Sometime last year, they updated the web software for IQ so that it only shows one score for both the "red alliance" and the "blue alliance".
It would be cool (for people like me, who actually use the CSV file) to indicate in that file which scores are dropped for each team.