Let’s assume that team 1 specializes in making good software, like bleeding-edge software (like Microsoft/Linux). However, team 1 is not that good at making good-quality robots (computers). Now, let’s assume that team 2 specializes in making good robots and has all the expertise for really good build quality (like AMD/Intel), but doesn’t really specialize in the software.
If these two teams work together, they can make the most beautiful and well-programmed robots (or computers) that are not only well built but also function well. However, if they are forced not to collaborate, they won’t do well competitively (cough Microsoft phone cough), despite having knowledge that is strong, cohesive, and deeper than what they’d have if they spread themselves thin.
In this sort of situation, can team 1 legally compete with team 2’s robot design, provided that in exchange team 2 gets the mutual benefit of team 1’s software? Or do we artificially limit a team’s capability over the notion that “teams must learn everything and spread themselves thin for the sake of fairness”? Should we accept the practicality (or impracticality) of applying such a realistic (or unrealistic) decision to the STEM field?
Does RECF endorse or incentivize collaboration between teams? And if so, in what way(s)?
This is a really good/interesting question.
So VEX robotics is meant to give an introduction to real-world engineering, right? So my question would be (I don’t know the answer): would this happen between Microsoft and another company?
If it would, then I would say you should treat it as G4 (that’s the number in the IQ manual; I don’t know if it’s a different rule number for V5 and VEX U).
As per G4, clause iii: if a Student leaves a Team to join another Team, G4 still applies to the Students remaining on the previous Team. For example, if a Coder leaves a Team, then that Team’s Robot must still represent the skill level of the Team without that Coder. One way to accomplish this would be to ensure that the Coder teaches or trains a “replacement” Coder in their absence.
I think it’s safe to say, however, that one team could not function without the other and thus the robot would not represent the skill level of the teams.
Sorry, I know that was hard to read. To put it clearly: it could go either way, since this could probably happen in real-world engineering, but the robots would not reflect the skill level of the teams.
Good question though!
G4 - The Robot must represent the skill level of the Team
I am not affiliated with RECF or the GDC, but my interpretation is - to the extent collaboration between teams leads to knowledge sharing (e.g. leveling up team2’s programming knowledge), collaboration is welcome. To the extent that team1 directly programs team2’s robot - to the extent that team2 cannot explain it - that is a G4 violation, and could escalate to G2 and G1 violations.
At that point, if they are using the same hardware and software, why aren’t they just the same team? Or have half from each university compete on the other university’s team.
To the extent you suggested, I find it absurd to say that should be legal. Now, if team 1 designs a new mechanism, like a custom-fabricated rubber intake, and shares their process (or notebook entry) with team 2 so that team 2 can replicate it, I think that is acceptable. And similarly, if team 2 develops a new algorithm and shares the pseudocode and math behind the algorithm with team 1, then team 1 should be able to replicate it.
Using your company analogy: if a team makes something (hardware, software, etc.), I think a good way to think about it is as if it were “patented.” You can design something very similar and you can base your design heavily off theirs, but you shouldn’t be hole-counting the entire robot or copy-pasting another team’s code.
That is true, but there are situations where there is a degree of nullification. For example, AMD licenses its AMD64 (x86-64) patents to Intel while Intel licenses its x86 patents to AMD, pseudo-nullifying the patents so that both Intel and AMD can use the x86 and AMD64 architectures.
This sort of situation, I believe, also happens with VEX: a lot of the time RECF pays VEX to have someone talk about and represent VEX at events, and VEX likely pays RECF for its competition-hosting services and gives discounts.
I think the issue here is “what is the purpose of competitive robotics?” If the purpose is collaboration and teamwork, then is the concept of preventing teams from sharing designs or code with other teams the antithesis of collaboration and teamwork?
Libraries like LemLib, JAR-Template, OkapiLib, etc. push teams to look into the back-end and source code to see not just what something does but how it operates. Sometimes seeing examples can be a really quick way to learn. In other ways, the act of socializing, understanding, and implementing libraries, whether well documented or poorly written, is one of the greatest coding skill sets to learn. I learned how to code when I was young by just grabbing what other people wrote, seeing how it worked, and molding it into something I liked. It took me 2-3 years in VEX to learn PID, 1-2 years too many, which I could have learned far more easily if someone had just shown me how to do it or given me an example to understand it fundamentally.
However, there really isn’t a clean “if they understand it, it’s legal; if they don’t, it’s not” line, since with so many teams having the code/templates available, nobody would raise an eyebrow if a team doesn’t understand how a library or template fundamentally works. But just having the code available raises the knowledge ceiling far higher and faster than isolating teams in their own walled gardens. More information => more knowledge => more mental tools available, and therefore greater intelligence.
So if code can do it, robot designs can and should too.
What are you talking about? G4 is extremely clear. The robot (and code) must match the ability of the team. If a team uses a template, it must be able to explain the concepts of the functionality used in the templates. If a team uses a mechanical system (e.g. a PTO) on their robot, they must be able to describe how it works, how they built it, etc.
While I don’t agree with this statement as written, I do agree with the sentiment of raising the ceiling through knowledge sharing.
And while I understand the GDC’s stance on prohibiting the use of AI (e.g. ChatGPT) for Notebooks and code, I do disagree with it. While I acknowledge that tools like Copilot (or other code assistants) can be used to write code that is potentially above the ability of the team, I believe these tools can provide excellent help in understanding code as well as fostering good coding techniques. With first-hand experience using these tools in a professional capacity, I am also of the opinion that there is a certain amount of skill (or at least a learning curve) to getting the most out of them. Copilot will definitely hallucinate code, and will produce code that doesn’t compile or doesn’t do what the user specified.
Tools like Copilot, Grammarly, and LLMs (like ChatGPT) are becoming table stakes in the professional world. Restricting their use (like restricting the use of 3D printers) is a constraint that I believe is overly restrictive, however well-intentioned.
The Robotics Education & Competition (REC) Foundation’s global mission is to provide every educator with competition, education, and workforce readiness programs to increase student engagement in science, technology, engineering, math, and computer science.
Programming libraries are resources that teams can use, but they should not be utilized unless teams understand the underlying concepts of how the libraries work. Last year as a judge, I encountered issues with teams using libraries they may not have understood and it was brought up as a G2 issue.
I agree that there should be mentorship within the community. I also think that collaboration between teams is good. For example, teams QUEEN and TNTN are currently working on some custom electronics in a collaboration effort and this is both legal and a good collaboration effort.
With respect to hardware, there is documentation, such as on the BLRS website, for different mechanisms, but again, I do not believe it is to the benefit of a team to let another team design its robot or to hole-count another team’s robot.
I feel like I’m not communicating appropriately about what I mean… In all of my years competing, I was never asked about the “ability of my team” or to show proof of anything during inspections. I have never heard of any team getting interrogated or brought up over “Team members may move from one Team to another for non-strategic reasons outside of the Team’s control.”
In my opinion, this rule literally adds fluff to the manual and holds negligible merit, and honestly is the opposite of a real-world situation, since companies collaborate with each other, which is a literal contradiction of the statement in red. VEX itself collaborates with RECF, and the two even sell products and services to each other, despite being two separate “teams” of people.
And to be honest… I’m not saying this out of spite, but it is impossible to stop a team from collaborating with another team, regardless of what pixie-dust solution the GDC proposes. I feel like this rule is no different from the GDC being that grumpy guy who gets mad at children from separate families for playing together outside in front of his lawn. So instead of the GDC pretending that this rule is enforceable and has merit, telling teams they must stick within their walled gardens and not collaborate with each other, I feel like even if this rule exists, any team would and should collaborate, give each other ideas, and share code and designs with each other, because this rule is so unenforceable that it practically holds no merit in the manual.
And honestly, I will guarantee you that teams who collaborate on ideas and share designs and code have more time to dig deeper into the research hole without needing to spread themselves thin, and are often more successful later on, because they are less stressed and have more time to actually enjoy things at their own pace. If you were given a library, you would use it. You’d be comfortable. But then you’d grow bored, grow curious, and want to expand that library yourself. Then you’d look into the source code and start to learn the fundamentals of C++ out of your own curiosity, on your own terms, because you’re reading another student’s code, which is often better explained since they are closer to your grade level than someone older.
I honestly wish I could believe this mission, but the price hikes have been so substantial that costs may even surpass those of FTC.
Given FTC’s more practical applications and its eerily similar costs to VRC, the only difference is the “scare factor”: I feel like a VEX robot looks less scary as an entry into STEM than an FTC robot. However, I feel like the replacement costs of broken electronics and motors would push VEX past FTC in cost if you play your cards right with borrowing equipment and materials for potentially $0. Given these sorts of discrepancies, like the prevention of teams from 3D-printing parts for their robots, I feel like the only thing RECF has a grasp on is the competition environment for “workforce readiness,” which is honestly a contradiction when it comes to restricting team collaboration.
I understand that teams who use libraries should understand how the libraries work. But I find that if competitions are taken seriously and non-seriously at the same time, they become contradictory, without a concrete path. Any team can quite literally be fine if they simply talk the talk with blind confidence, or just glance at the code really quickly beforehand to get a general understanding of it. And honestly, that’s fine. But if that’s all it takes, then the rule shouldn’t exist, and we can let curiosity control whether or not they check the code.
If there are so many teams doing x and y already, why penalize the small few that get caught when practically most are equally guilty? I feel like in that sort of situation you might as well just allow it without questioning any team because honestly, it doesn’t even matter at that point, so long as they eventually try to learn the code out of curiosity.
As for mechanical design, V5 robots nowadays are quite rinse-and-repeat, almost formulaic; the mathematics of the mechanical parts isn’t really present, and the design work is pragmatic rather than mathematical. That being said, VEX robots themselves are more in line with art and LEGO pieces than engineering, if we set aside programming as a whole (which may explain the term “STEAM” being used more frequently than “STEM”). Like any art, it often starts with sketching over previous art and coloring inside the lines, then understanding the fundamentals and eventually the formulas needed to create a masterpiece. Similarly, most good VEX robots I have seen come from looking over your shoulder to see how other people made their artwork and using that as inspiration for your own art.
Therefore, there is quite a blurry line between “hole-counting” and “looking over your shoulder to vaguely see the holes.” The only difference is the level of detail you let yourself see; both achieve pretty much the same result and knowledge gained.
Most good teachers not only explain how something works, but also give an example (often practical) to put what is taught in a clear light. I would consider anything other than deliberately and concisely giving an explanation a sort of “obscurantism.” I believe that all individuals have the right to learn, not only by using what was already learned but by having the leisure to learn it on their own terms. By actively restricting knowledge and forcing individuals to make their own library, only those taught in a well-rounded educational institution would know how to do so, and therefore those who are less fortunate would be unable to learn it. At least presenting the code gives the less fortunate the capacity to be curious, a good foot in the door, and the spare time to learn how it works fundamentally via reverse-engineering.
In my less than one year competing, I actually have seen this! There was a team (who I will obviously not name) who were very, very young. They were around 8 years old, were competing in IQ, and had a modified Ben Lipper bot at the first competition I saw them at; but at the second, it was heavily modified and a lot better. There was a lot of discussion about whether these kids had built it or it was mentor-built (it was a private team). I heard from my mentor that there was some “investigation” into the team, but ultimately they found that the team was legit.
The judges ask about your robot, though, and if you have no clue how your robot works, chances are it’s mentor-built.
Tell that to the teams that have been removed from consideration for judged awards or removed completely from events. Your experience does not mean there have not been consequences. G1/G2/G4 are very real issues and are dealt with by Judge Advisors, Head Referees, and Event Partners. Sure, there are no big posters saying “we have caught # of teams this season.”
I would recommend you read the Guide to Judging, specifically the section on academic honesty - integrity matters.
As for RECF making space for collaboration, it is pretty clear it is encouraged at the VEX U level, where you do not need to attend a specific college to be part of a team. If you want to collaborate, it has to be at the per-team level for purposes of design, build, and programming.
I would spend more time on a response, but I have a plane to catch to the EP Summit, where such issues are discussed in depth. RECF is not “pixie dust”; there are real consequences for teams that do not follow the Game Manual and RECF policies.
The tech industry is heavy on not collaborating in the ad-hoc way you wish; look at all those non-competes employees must sign. (Yes, the FTC is trying to cancel them, but the Supreme Court is indicating that it might be overreaching its authority to regulate without specific congressional approval…)
The “real world” industry is not so utopian as you make it to be.
I understand your concerns, and yes, these are hard issues to police, but the rules are there to encourage teams to follow them even when nobody is looking, just like no one will know if you sped to work yesterday unless you actually got pulled over for it.
However, I would suggest that you try being a judge at a competition this year. When you interview teams and go over their notebooks, it becomes quite clear which teams can say, “PID is a control algorithm that sets the robot’s speed proportional to the distance away from the target, slows down as it gets close to the target to prevent overshooting, and uses an integral term to overcome small forces of friction in the drivetrain,” compared to a team that says, “we tried coding PID, but couldn’t get it to work. Our team captain said to just use so-and-so library.”
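For anyone who (like the earlier poster) never got shown a concrete example, here is a minimal sketch of those ideas in plain C++ (C++17). It deliberately avoids any specific VEX SDK calls; the struct, gains, and toy simulation are made up for illustration, not taken from any team’s code.

```cpp
#include <algorithm> // std::clamp (C++17)
#include <cstdio>

// Minimal PID sketch matching the explanation quoted above:
//  - the proportional term drives output in proportion to distance from the
//    target, so the robot naturally slows down as the error shrinks;
//  - the integral term accumulates leftover error so the robot can overcome
//    small drivetrain friction near the target;
//  - the derivative term damps the approach to reduce overshoot.
struct PID {
    double kP, kI, kD;       // tuning gains (hypothetical values below)
    double integral = 0.0;   // running sum of error
    double prevError = 0.0;  // error from the previous loop iteration

    // error = target position - measured position (e.g. encoder degrees)
    // dt    = loop period in seconds
    double step(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;

        double output = kP * error + kI * integral + kD * derivative;
        return std::clamp(output, -100.0, 100.0); // cap at +/-100% motor power
    }
};

int main() {
    PID pid{0.5, 0.02, 0.1};                           // made-up gains; real ones come from tuning
    double position = 0.0, target = 1000.0, dt = 0.02; // 20 ms control loop

    // Toy "simulation": pretend the commanded power moves the robot each loop.
    for (int i = 0; i < 200; ++i) {
        double power = pid.step(target - position, dt);
        position += power * 0.5; // stand-in for real drivetrain physics
        if (i % 20 == 0)
            std::printf("t=%.2fs pos=%.1f power=%.1f\n", i * dt, position, power);
    }
}
```

On a real robot the loop body would read an encoder or inertial sensor and write the clamped output to the motors instead of updating a fake position.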
Additionally, there is a big difference between RECF and VEX. RECF is a non-profit that designs the game and has employees to help at events, while VEX is a for-profit company. RECF does not have the ability to control the prices of VEX’s parts, but regardless, that’s not really relevant to whether or not teams should be allowed to collaborate. The GDC writes the Game Manual with rules limiting parts to try to ensure a more level playing field for teams, but that doesn’t mean the system is perfect.
I just want to add that it is hard to catch teams breaking G4, but it is possible. Between our VEX U team and the South Carolina Regional Events and Management, we have volunteers at almost every competition in the state. Having consistent volunteers interacting with the community and talking to teams, we have been able to catch individuals competing on multiple teams.
But overall, I believe that the rules RECF has are there for a purpose, and “you probably won’t get caught” and “everyone else is doing it” aren’t good reasons to break those rules.
My read is that, with the heightened emphasis on Engineering Notebooks this year, G4 violations (possibly elevating to G2 and G1) will start to become more prevalent. I’m pretty sure Grant or James has mentioned seeing teams AT WORLDS that could NOT explain how their code worked, nor could they write code that (and I’m paraphrasing) could make their robot go straight and then turn right.
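For context on how low that bar is, here is roughly what “go straight and then turn right” looks like as a timed sketch. The drive functions below are hypothetical stubs standing in for whatever motor API a team’s SDK actually provides, so the example stays self-contained:

```cpp
#include <cstdio>

// Stubs standing in for a real drive API; names and units are made up for
// illustration only. A real program would command the drivetrain motors and
// actually sleep for the given time.
void setDrivePower(double leftPct, double rightPct) {
    std::printf("left=%.0f%%  right=%.0f%%\n", leftPct, rightPct);
}
void waitMs(int ms) { (void)ms; /* sleep omitted in this stub */ }

int main() {
    setDrivePower(50, 50);   // go straight: both sides at the same power
    waitMs(1000);

    setDrivePower(50, -50);  // turn right: left forward, right reversed (point turn)
    waitMs(400);

    setDrivePower(0, 0);     // stop
}
```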
Please read G4’s clause iii, which says that G4 still applies - TO BOTH teams. The robot and code must match the skill level of the teams competing.
Where the statement in red is:
Points ii and iii are intended to represent real-world situations that are found in industry engineering. If a vital member of a professional engineering team were to suddenly leave, the remaining members of the team should still be capable of working on / maintaining their project.
This is literally how things work. When a co-worker leaves a company or moves to another team, if the remaining members of the team are unable to maintain their former colleague’s work, bad things happen.
As far as code and templates go, this has been discussed:
The spirit behind all these rules is a couple of things:
To combat the “clone-bots”, especially where, say, a team of Seniors and a team of Freshmen from the same org show up with the “same” bot and perform virtually identically, but the Freshmen are unable to explain anything about their robot to judges (or competitors, etc.), making it pretty clear that the Seniors made two robots and gave one to the Freshmen.
With the (new) emphasis on citations in the Engineering Notebook, teams are forced to document these collaborations, making it clearer how “balanced” the collaboration was. It’s one thing (IMO) for two teams to work together on a mech where each team contributes equally, and another thing when it’s more akin to the situation above.