While this post’s purpose isn’t to argue whether they should have posted it or not, it will be interesting to see how the GDC answers it and how this impacts autonomous development for teams (both VRC & VEXU) this season.
The only thing is that I do not know how this will be enforced. While it is pretty easy to check for use of this template or Okapi (I think), there are many templates on this forum and elsewhere for chassis control, PID, odometry, etc. The only way I can think of is having a Q&A with judges where they inspect your code and make you explain it? This would take a lot of time for the Q&A itself, as well as requiring judges who have experience in programming. Maybe this could only be enforced for Excellence/Design/Think award candidates?
Seems like this should be covered by the notes on programming in the RECF student centered policy.
My opinion for what it’s worth.
Templates such as the JAR template, which is available to everyone (i.e. open source), should be allowed as long as students using it understand how it works and can explain the concepts used.
A template written by an adult* (or senior student programmer) on one team should not be shared internally with other teams of the same organization.
(* I guess it should not be written by an adult under any circumstances)
Curious distinction on "I guess it should not be written by an adult under any circumstances" - VEXU participants are considered "adults" in the context of VRC HS/MS. Would that place projects such as Purdue's ARMS and LEGS (and, pending its future, OkapiLib) in jeopardy under this definition? Conceivably PROS itself would fall outside the definition of "template" being used here.
I don’t really know. I was thinking primarily of HS/MS and mentors directly involved with an organization. But I’m not sure where the line is between community projects such as ARMS and frameworks provided to teams that are created by professional programmers.
I read the student centered policy again. This section is relevant.
Students should be able to understand and explain the code used on their machines, and students should be able to demonstrate that they can program on a level equivalent to the code included on their machine.
Based on that, and my experience talking to teams over the past 12 years, I think most HS/MS teams should not be using ARMS, LEGS, or OkapiLib.
I agree. While I love the concept of these libraries, they really don't help the roboteer understand what is going on. Even super simple PID functions throw programming teams for a loop when asked to explain them. I'm all for putting a facade over some of the bit twiddling that goes on to make the library more efficient.
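For reference, the core of a positional PID controller really does fit in a few lines; the sketch below uses made-up names and is not taken from any particular template:

```cpp
#include <cassert>

// Minimal positional PID controller (illustrative sketch, not from any
// specific library). Output = kP*error + kI*integral(error) + kD*d(error)/dt.
struct PID {
    double kP, kI, kD;
    double integral = 0.0;
    double prevError = 0.0;

    // One control step: error is (target - measured), dt is seconds elapsed.
    double step(double error, double dt) {
        integral += error * dt;                       // accumulate the I term
        double derivative = (error - prevError) / dt; // rate of change for D
        prevError = error;
        return kP * error + kI * integral + kD * derivative;
    }
};
```

Being able to walk through what each term does (and why, in practice, the integral needs clamping to avoid windup) is roughly the level of explanation judges seem to be looking for.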
I even look at the drive base code that is part of the standard coding packages (Vex Code/Text/Block). Most roboteers can’t even explain what that does past “Makes the robot move”.
In the PIC days there was a WPILibrary you could use that saved a ton of time and lines of code. But we spent the first three weeks of programmer build season going over each function and talking about how it worked and what similar hand-crafted C code would look like. It helped teams because they also got the interactions between some of the library routines.
I’ve worked in IT, where libraries are used on a regular basis to save time, but I’ve come across developers who can’t explain why they picked a library, or how it works, its dependencies, memory used, etc. I traced a huge performance issue back to a developer who pulled in a 420 MB (runtime size) library to get access to a function they could have written in about the time it took me to write this post.
Maybe there is an opportunity for VEX/RECF to put out a certification in OkapiLib (or ARMS, LEGS, JAR) to show the judges that the programmer does understand the code.
The library part is one of the reasons I looked sideways at the VEX AI program. A huge chunk of that program was based on code you really couldn’t see but were expected to use.
(and as a complete aside, I’m thrown every time I see JAR Template; my brain goes "oh cool, it’s a JAR, we can do Java!" and then I need to reset)
The Student-Centered Policy is real, and has been enforced for sure with regard to judged awards and/or pulling team(s) from events. This season there are new procedures for G1/G2 violations/complaints: Code of Conduct Reporting Tools
This is posted at every event for the Over Under competition season. I highly recommend ALL teams and volunteers read this knowledge base article.
I like this consistent process - it allows educating teams about Student-Centered policy before going for a hammer. It is not applied in a vacuum - the EP, Head Referee, and Judge Advisor work as a team to deal with issues at their events, and loop in the RECF EEM and TEM before any extreme action is taken (such as removal from consideration for judged awards, or from the event).
This was discussed in depth at EP Summit, and I expect it to be discussed at Coaches Summit this summer. Student-centered violations apparently became a bigger issue last season.
As James indicated, teams should be prepared to explain their code, why they chose to use an outside package, and be prepared to demonstrate they can write their own version of the routine.
With VEXU being outside of the VRC MS/HS program, by definition virtually all of its participants are considered adults and do not meet the definition of a student on a VRC team.
To remove all doubt about a team’s capabilities, would it be possible to consider making a tutorial on how to write the JAR template, explaining all the fundamentals, so that whoever uses it would have a clear understanding of what drives its functionality? That way a team could start using the template without prior experience and end up fully equipped to make something like it themselves.
So that interpretation would seem to also bar “most” teams from using even VRC HS/MS authored libraries such as LemLib or JAR.
To me, as a software professional, "be prepared to demonstrate they can write their own version of the routine" is not the only way to satisfy "demonstrate that they can program on a level equivalent to the code included on their machine".
I’m generally disappointed with the quality of code I see in James’ annual ask for teams to share the prior year’s code. I think that draconian policies/interpretations of "student-centered" in this context have more potential to stifle students than to empower them to see/use/learn from better-written and better-organized code bases.
My personal belief is that students should be able to use any open-source software. Projects like LemLib, JAR, SylvieLib and others raise the floor for teams, and their use should be encouraged. How many teams struggle with PID because they don’t know if they have a bug in their code or if they haven’t tuned their parameters properly? Personally, I don’t find there to be much value in having 5000 PID implementations floating around when open-source implementations exist.
Considering the explosion of PROS and VEXCode templates, I think it would be a really cool idea for RECF to create a process to make templates legal for all teams. For instance, the RECF could allow teams to submit their templates to its website, including tutorials and practice problems on how to write the code. After approval by RECF, any team that completes the tutorial and practice problems would be given a certification they can provide to judges showing that they have the knowledge needed to utilize the template.
Templates are not illegal on their own. The question at hand is whether the code matches the skill level of the team. There have been instances of adults providing teams code that is well above their skill level and that the teams do not understand.
I love the idea of tutorials that help teams develop new coding skills and help solve problems.
I wholeheartedly agree on this part. I started out my coding adventure on Roblox, of all places, when I was 10 years old (my Roblox account is 12 years old now, older than a large number of people on this forum, and I occasionally participate in developer/beta programs as well). I learned Lua by looking at code written by other people and trying to understand how it works on the back end. Making these systems readily available and open source allows teams to easily use more advanced programs, while steering them toward better coding practices by exposing them to software that is neater and better documented. If templates become more readily available and the necessary structure is set, it can also give teams the opportunity to show that they can work with open-source code from GitHub to enhance their robot’s capabilities, which I consider a massive stepping stone for a resume builder. It announces that a team can work with open-source communities to further improve designs and systems, which can be perceived as a massive green flag for computer science students around the world.
Documenting a project correctly is as much work as actually writing the code in the first place, if not more. Some of the template code I have looked at doesn’t have a single comment, let alone any explanation of what the code is trying to achieve. Software should be written to be read and understood by others; at the moment, the quality of the various open-source offerings varies considerably.
Are teams that don’t understand how the various Vex sensors work barred from using those sensors? How deeply are students expected to be able to explain how the inertial sensor works and “be prepared to demonstrate they can write their own version of the routine” that provides the heading? If a team were to use the Vex GPS sensor, are they to be expected to understand in-depth the patent and algorithm it uses to provide location? Or are those somehow different because Vex supplies them?
I understand (and agree with) the desire and ethos behind “student-centered”. I understand that we don’t want John Carmack programming VRC Middle School robots. I understand we’d like to ensure that the line between “team collaboration” and “teams building/programming robots for other teams in their org or friends” promotes a culture where the robot a team fields is a product of the work of that team.
What I fear is that a stringent definition will stifle legitimate good-faith collaborative efforts. I see James’ name credited in both JAR and Sylib for providing insight and advice. I think that allowing these libraries to be used by the community scales this knowledge much more effectively, and probably reduces the incentives to DadCode, than making teams question whether they’ll be disqualified for using them.
Given where I believe this policy is going, I don’t see the value in driving students to implement their own PID. So much of software is building on what someone else has done. There is real skill involved in taking what someone else has done (e.g. a PID implementation, or motion profiling, or whatever technique) and making it work for your situation.
Much like physically constructed DadBots, I know we don’t want programmed DadBots. I do think judging is a reasonable way to suss out the following three cases:
Team Newbie, who uses open-source PID but can’t explain what the terms mean, and says something like "it was really hard to get right and we got help from X/Y/Z, but now our robot moves more reliably"
Team Sister-team/DadProgrammed bot, using open-source PID, who can’t explain what the terms mean or even say where it is being used
Team Dome - who use open-source PID because it saved them the time and hassle of writing and debugging it. They understand and can explain the terms along with the pros and cons of integral windup, etc.
I think a policy that has a chilling effect on using JAR/LemLib/ARMS/SyLib in an effort to prevent the second type of team has the knock-on effect of punishing the other two types of teams.
I know I posted this concept on a separate thread about 3D Printing. Perhaps making changes to RobotEvents to allow/require teams to publish/link to any 3rd party open-source software their robot uses would be a path forward.
First, at EP Summit, it was pointed out we should not be using "DadBot". That said, that overall issue is being tackled. I do not expect that a hunt for sharing of ideas between teams is the issue being addressed by the Student-Centered policy.
I agree with the idea, but enforcement of this is nearly impossible. To confirm that a team wasn’t using any third party software, EPs/judges/refs would have to do the following:
have access to the team’s code and manually confirm this. This would require much more extensive training than is currently needed, and would be a very long process. Having to read through 60+ programs would not be fun for any party
confirm that the code submitted by the team is the same as the code being run on their robot. I’m not really sure how this could be done effectively.
To give my brief take on this argument/debate/discussion that has taken place many times over the past couple years on the unofficial VRC discord:
yes, teams should be allowed to publish open-source, free-to-use libraries/templates, and other teams should be allowed to use those. In my opinion, these libraries accelerate how fast a team is able to start programming an autonomous, without doing a large majority of the work for them. Learning PID and odometry is a process that will take a long time for many people, but in my experience developing autonomous routes and tuning them to perfection is always what takes the longest. If a team with no knowledge of how a library works decides to use it, they will still have to spend some amount of time learning how to implement that library on their robot, how to tune PID, odometry, etc., and how to effectively use those tools (albeit less than learning to create the tools themselves). If they encounter a bug in that library, they will likely be at a loss and unable to progress further with it. On the contrary, if a team understands the inner workings of a library and decides to use it, they are simply able to start programming their autonomous routes faster. If they find a bug in the open-source code, they are simply able to download the original code, fix the problem, and use the updated code on their robot. This creates an environment where students are allowed to ask for help from other students, and other students are allowed to help them.
In terms of whether VRC competitors are allowed to use tools created by VEXU competitors, I still think it should be allowed. VEXU members are still students; we are just in college/university instead of middle/high school. In my opinion, there is no difference between a high school freshman asking a high school senior for help, and a high school junior asking a college sophomore for help.
In conclusion, open source free to use templates raise the skill floor for everyone without taking away many of the advantages of programming these things yourself, and should be permitted to exist in competitions.
teams should NOT be allowed to publish closed-source/paid-for code, and teams should not be permitted to essentially pay for a program. Pretty self-explanatory.
teams should be expected to be able to explain these templates and how they were used on a robot. This is already explained in the student centered policy. As I said above, I think efficiently enforcing this is near impossible, and should rather be left to judges when deciding an award. If two teams are similar in every field, but one used a template while the other did everything themselves, the latter should receive the award. Teaching judges to ask “did you use any third party software while programming your robot”, and following up with “what software did you use and how did you use it”, continuing with any relevant questions about that, is a good way to ascertain a large majority of teams’ understanding of the libraries they used.
I think I’ve had this discussion before. Teams need to be able to dig their own silica, forge it into a pure bar of silicon, then slice it into 1 mm discs, photo-imprint the die layout, and then photo-etch the proper "doped" material onto the substrate, laser-weld #32 wire onto the imprint, then encapsulate that in a potting compound they can attach pins to. They need to write an Intel 8080 emulator to run on the chip they developed.
Yes, for sensors they need to be able to say the sonar sensor issues a “ping” from the tiny speaker and at the same time starts a counter. The sound bounces off the wall or game object and is picked up by the microphone. The timer is stopped. By looking at the time and knowing that the speed of sound is ~1100 feet per second, the distance can be calculated.
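That explanation maps directly to arithmetic. One detail worth remembering is that the timer measures the round trip, so you halve it; a minimal sketch (the function name is made up for illustration):

```cpp
#include <cassert>
#include <cmath>

// Time-of-flight distance for a sonar ping (illustrative sketch).
// The echo travels out and back, so the measured time covers twice
// the distance to the object.
double sonarDistanceFeet(double roundTripSeconds) {
    const double kSpeedOfSoundFtPerSec = 1100.0; // ~1100 ft/s in air
    return roundTripSeconds * kSpeedOfSoundFtPerSec / 2.0;
}
```

A 10 ms round trip, for example, works out to 5.5 feet to the target.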
There are a few thousand descriptions of IMUs and how they work, how you read them at the "bare metal" level, and the data you get back. While an IMU is part of rocket science, it’s not rocket science to know how they work.
Likewise the code behind vision sensors is easy to explain. Google will be your friend.
I would expect a team that is using a PID routine to be able to explain it. I actually don’t care if they use the code from StackOverflow, since the entire internet appears to run on samples from there. (Or according to my Devs that just jam that gist into production code)
But I expect teams to be able to explain it. PID ( and because I appear to be on a rant here AND Holonomic Drive) code isn’t that hard. We are looking for a higher level of understanding than “I push the buttons the robot moves” (I’m fresh off of rocket camp where they teach us how to teach rocketeers some of the math behind rockets. So they are not going “I say 3-2-1 GO and push the button and rocket goes UP!!” ) A little bit of math and Computer Science goes a long way.
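Holonomic mixing is a good example of "a little math goes a long way": for an X-drive, the three driver inputs combine into four wheel powers with nothing but addition and subtraction. A sketch, with made-up names and sign conventions:

```cpp
#include <cassert>

// X-drive mixing sketch (names and sign conventions are illustrative).
// Each wheel power is a signed combination of the forward, strafe, and
// turn commands, each in [-1, 1].
struct WheelPowers {
    double frontLeft, frontRight, backLeft, backRight;
};

WheelPowers mixXDrive(double forward, double strafe, double turn) {
    return {
        forward + strafe + turn,  // front-left
        forward - strafe - turn,  // front-right
        forward - strafe + turn,  // back-left
        forward + strafe - turn,  // back-right
    };
}
```

A real implementation would also clamp or normalize the four outputs when a combination exceeds the motor range, and being able to explain why each sign is what it is shows exactly the level of understanding discussed above.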
But in your instant case it comes down to why I just hate the "Student Led" or "Student Centered" dogma. Dumping $1800 worth of parts on a table, now adding the FSL library, and saying "Build a winning robot!" is just stupid. I have a new grandchild, so I’m going to apply Student Centered to them: I’m going to stack books around them, and when they crawl over them I expect they can read?!?
What I want, and have done for almost 20 years, is to mentor/teach roboteers on how this stuff works. "Hey, think of how an elevator works and how smooth that is. Here is how that works; let’s talk through this method called PID and how we can create one. Awesome, now that you know that, there is one built into the code that you can use." I would expect to see that discussion marked in their notebooks.
Lather, rinse, repeat with sonar, rotational sensors for distance, vision sensors, etc. I love how "Artificial Intelligence" has become a lump term for "computer stuff I don’t understand" (by that inference my microwave from 2005 has AI built in, since there is a Popcorn button, therefore it "knows" how to make popcorn; in reality it sets the time to 4:25 and starts). Teams should still know how to calculate distance between two points and all the other things in the "highly secret and expensive library". In the 70s I wrote a highly efficient hidden-line/hidden-surface routine that ended up being used by a ton of software. I don’t expect roboteers to be able to reproduce that code, but knowing how hidden-line/hidden-surface removal works and then using the library is fine.
The overall theme is "if you know how it works with some level of detail, then use the library". And I hear a voice in the back going, "I don’t need to know how to build an internal combustion engine or a multi-phase electric motor to drive a car." You don’t, but the roboteers I/you/we are investing in need the ability to dig down more than a few layers.
I’m all up for sharing, I’m 100% behind not reinventing the wheel (or lines of code). But I am also all in for knowing why you are doing this other than “makes the robot move”. Inspire and educate.
Here is one thing I am concerned about if a ruling is passed that in some way makes it illegal for teams to use templates, or to use a template they don’t understand. Yes, there is the official process of reporting it as a student-centered policy breach, but I could see this becoming quite common. Also, this would likely only be discovered in the middle of qualification matches. So what is supposed to happen day-of? Do you not allow the team to enter elims? Do you void their skills scores? Without doing these things it gives teams an unfair advantage that could qualify them for a state event (at least in Kansas, teams frequently win state quals from skills scores). It also seems unnecessarily harsh to DQ multiple teams from an event (say team 123A made a PID program and shared it with 123B-F).