Currently it's easy to cheat. When claiming a certification, you can, for example, submit someone else's work as your own.
Instead of trying to prevent cheating - which is futile - we need to catch cheaters after the fact.
The good news is we have essentially infinite time to do this. It doesn't matter whether a day has passed or a week or a year. If someone cheated by submitting someone else's work as their own, we can eventually detect this and forever brand that person as a cheater. (Once someone is marked as a cheater, it is extremely difficult to appeal. Most of the time cheaters just delete their accounts and leave the community, which is for the better.)
So here is what I propose we build: some sort of crawler that visits the submitted projects of everyone who has claimed a certification and runs a series of checks against each of them.
If anything is awry with the project, we flag it for human review. And if it's clear after human review that they cheated, we flag them as a cheater.
So really there are two systems that we would need to build: the automated crawler itself, and a review tool where a human can confirm or dismiss whatever the crawler flags.
If the tools are powerful enough (and don't kick up too many false positives), we could have a single paid staff member audit these. Even though millions of people use freeCodeCamp, only a few thousand people claim certifications each month, and I believe the vast majority of these are legitimate.
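To make that split concrete, here is a purely illustrative TypeScript sketch. None of this reflects freeCodeCamp's actual code, and all of the names in it (Submission, Flag, reviewQueue, crawl, reviewNext) are made up.

```ts
// Sketch of the two systems: an automated checker that flags suspicious
// submissions, and a queue that a human reviewer works through.

interface Submission {
  userId: string;
  certification: string;
  projectUrls: string[];
}

interface Flag {
  submission: Submission;
  reason: string; // e.g. "link returns 404" or "duplicate of another submission"
}

const reviewQueue: Flag[] = [];

// System 1: the crawler runs every check against every project link and
// pushes anything suspicious onto the review queue.
async function crawl(
  submission: Submission,
  checks: Array<(url: string) => Promise<string | null>>
): Promise<void> {
  for (const url of submission.projectUrls) {
    for (const check of checks) {
      const problem = await check(url);
      if (problem) reviewQueue.push({ submission, reason: problem });
    }
  }
}

// System 2: a human works through the queue and makes the final call.
function reviewNext(decide: (flag: Flag) => "cheated" | "legitimate"): void {
  const flag = reviewQueue.shift();
  if (!flag) return;
  if (decide(flag) === "cheated") {
    console.log(`Flag ${flag.submission.userId} as a cheater`);
  }
}
```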
Still, the prospect of anyone having cheated dampens the sense of accomplishment for everyone else. This is why we have been so aggressive about banning cheaters in the past.
We are academic honesty hardliners and don't want cheaters in our community. Help us build the tools to better detect cheating so we can all have more peace of mind.
As for the web crawler part, we can use DOM loading to detect whether the link is legitimate, fake, or broken (404 error). [There must be a way to detect whether the page loads and displays something, or whether the server just throws a default error page.]
Also, the test suite we use hopefully requires a script to be inserted in the <head> tag, which can be detected.
Links failing these initial tests can be rejected straight away, and the user may be given a warning first, since these could be honest human errors as well. Failure to rectify could lead to more severe steps.
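Here is a rough sketch of those initial checks, assuming a Node 18+ environment with the global fetch API. The marker used to spot the test-suite script is an assumption and would need to match the real bundle URL.

```ts
// Rough sketch of the initial link checks: does the URL resolve, does the page
// render something, and is the test-suite script present in the markup?
// TEST_SUITE_MARKER is an assumed value.

const TEST_SUITE_MARKER = "testable-projects-fcc";

async function checkLink(url: string): Promise<string | null> {
  let response: Response;
  try {
    response = await fetch(url);
  } catch {
    return "link does not resolve";
  }
  if (!response.ok) {
    return `server returned ${response.status}`; // e.g. a 404 or a default error page
  }
  const html = await response.text();
  if (html.trim().length === 0) {
    return "page loads but displays nothing";
  }
  if (!html.includes(TEST_SUITE_MARKER)) {
    return "test-suite script not found in the page";
  }
  return null; // nothing suspicious at this stage
}
```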
That all sounds like a good start, but none of it would catch someone who copies another person's working project.
What we can do is have the test script generate a code that is written to a database when the project passes the tests successfully. We could also take the CSS, salt and hash it, and log that too. Then, when a user submits a project, if a duplicate is found, their account can be flagged (a rough sketch is below).
This at least gives us a way forward, and it would help prevent cheating in the future.
Of course, this is an alternative to the web crawler, but if we use both side by side, I think it would make for better security.
Edit: Doing the salting and database thing would, to quite some extent, solve the problem @moT01 raised.
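A minimal sketch of the salt-and-log idea, using Node's built-in crypto module. The salt value and the in-memory Map standing in for the database are assumptions for illustration.

```ts
import { createHash } from "node:crypto";

const SALT = "replace-with-a-secret-salt";       // assumed; kept server-side
const fingerprints = new Map<string, string>();  // fingerprint -> userId

// Normalize the CSS slightly so trivial whitespace edits don't change the hash.
function fingerprintCss(css: string): string {
  const normalized = css.replace(/\s+/g, " ").trim();
  return createHash("sha256").update(SALT + normalized).digest("hex");
}

// Called when a user submits a project: flag the account if the same CSS
// fingerprint was already logged for a different user.
function recordSubmission(userId: string, css: string): "ok" | "duplicate" {
  const fp = fingerprintCss(css);
  const existingOwner = fingerprints.get(fp);
  if (existingOwner && existingOwner !== userId) {
    return "duplicate"; // flag this account for human review
  }
  fingerprints.set(fp, userId);
  return "ok";
}
```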
@QuincyLarson Are there any stats on how many projects are submitted in a day? If it's a low number, then a group of people could go through the links and check whether they are valid.
But if it's more, then we could do something similar to what @thecodingaviator suggested. We could log each link into a database when it is submitted, and the next time any user submits a link, a script could check whether that link already exists in the database. If it does, it should be flagged for someone to check, or the user could be sent a warning. That way no new links will be plagiarized and the same link can't be copy-pasted for each project.
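Something like the following could do the exact-link check, with an in-memory Map standing in for the real database and the function names made up for the example.

```ts
const submittedLinks = new Map<string, string>(); // normalized link -> userId

// Strip trailing slashes and lowercase the host so near-identical URLs match.
function normalizeLink(link: string): string {
  const url = new URL(link);
  return `${url.protocol}//${url.host.toLowerCase()}${url.pathname.replace(/\/+$/, "")}`;
}

// Returns true when the same link was already submitted by a different user.
function isDuplicateLink(userId: string, link: string): boolean {
  const key = normalizeLink(link);
  const previousOwner = submittedLinks.get(key);
  if (previousOwner && previousOwner !== userId) {
    return true; // flag for review or send the user a warning
  }
  submittedLinks.set(key, userId);
  return false;
}
```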
Hi, just popping in with my suggestions.
It would be a great idea to use an extension to handle the situation. For example, if the user passes the whole test suite, a random code is generated as proof that the person meets all the criteria, and they can then paste that code into the challenge submit area along with the link.
Maybe after the link and the code have been submitted, the link is checked against the database to see whether it was submitted earlier, and if so, the account is flagged as a possible cheater.
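A small sketch of how the one-time proof code could work; the storage and function names here are only illustrative.

```ts
import { randomBytes } from "node:crypto";

const issuedCodes = new Map<string, string>(); // code -> project link it was issued for

// Would run inside the test-suite extension once every test has passed.
function issueProofCode(projectLink: string): string {
  const code = randomBytes(8).toString("hex");
  issuedCodes.set(code, projectLink);
  return code; // shown to the user to paste into the challenge submit area
}

// Would run on the submission form: the code must exist and match the link.
function verifySubmission(projectLink: string, code: string): boolean {
  return issuedCodes.get(code) === projectLink;
}
```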
Sorry for intruding and thanks for listening.
Have a nice day,
Carlos
We could potentially use https://github.com/sindresorhus/capture-website
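For example, something like this could save a screenshot of each submitted project for a reviewer to glance at. The option names are from the package's README as I remember it, so double-check them against the repo.

```ts
import captureWebsite from "capture-website";

// Save a full-page screenshot of a submitted project for human review.
// The output path scheme is just an example.
async function screenshotSubmission(projectUrl: string, userId: string): Promise<string> {
  const outputPath = `screenshots/${userId}.png`;
  await captureWebsite.file(projectUrl, outputPath, { fullPage: true });
  return outputPath;
}
```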
How many projects are submitted a day? A week?
Maybe all could be volunteer reviewed.
Or users could have to link a Codepen to their FCC account, proving they're the owner.
Volunteer review could lead to some projects never being (really) approved, though. Maybe with a max-time restriction?