Anon cheats through college
Submitted 3 days ago by Early_To_Risa@sh.itjust.works to greentext@sh.itjust.works
https://sh.itjust.works/pictrs/image/08dff855-4581-4899-9a3c-a74944b92aa6.png
Comments
SkunkWorkz@lemmy.world 3 days ago
Yeah, fake. No way you can get 90%+ using ChatGPT without understanding code. LLMs barf out so much nonsense when it comes to code. You have to correct it frequently to make it spit out working code.
Artyom@lemm.ee 2 days ago
If we’re talking about freshman CS 101, where every assignment is the same year-over-year and it’s all machine graded, yes, 90% is definitely possible because an LLM can essentially act as a database of all problems and all solutions. A grad student TA can probably see through his “explanations”, but they’re probably tired from their endless stack of work, so why bother?
If we’re talking about a 400 level CS class, this kid’s screwed and even someone who’s mastered the fundamentals will struggle through advanced algorithms and reconciling math ideas with hands-on-keyboard software.
AeonFelis@lemmy.world 2 days ago
- Ask ChatGPT for a solution.
- Try to run the solution. It doesn’t work.
- Post the solution online as something you wrote all on your own, and ask people what’s wrong with it.
- Copy-paste the fixed-by-actual-human solution from the replies.
threeduck@aussie.zone 3 days ago
Are you guys just generating insanely difficult code? I feel like 90% of all my code generation with o1 works first time? And if it doesn’t, I just let GPT know and it fixes it right then and there?
KillingTimeItself@lemmy.dbzer0.com 2 days ago
the problem is more complex than initially thought, for a few reasons.
One, the user is not very good at prompting, and will often fight with the prompt to get what they want.
Two, often times the user has a very specific vision in mind, which the AI obviously doesn’t know, so the user ends up fighting that.
Three, the AI is not omniscient, and just fucks shit up, makes goofy mistakes sometimes: version assumptions, code compat errors, just weird implementations of shit, the kind of stuff you would expect AI to do that’s going to make it harder to manage code after the fact.
unless you’re using AI strictly to write isolated scripts in one particular language, ai is going to fight you at least some of the time.
Earflap@reddthat.com 3 days ago
Cannot confirm. LLMs generate garbage for me, I never use them.
nimbledaemon@lemmy.world 2 days ago
I just generated an entire Angular component (table with filters, data services, using in-house software patterns and components, based off of existing work) using Copilot for work yesterday. It didn’t work at first, but I’m a good enough software engineer that I iterated on the issues, discarding bad edits and referencing specific examples from the extant codebase, and got Copilot to fix it. 3-4 days of work (if you were already familiar with the existing way of doing things) done in about 3-4 hours.
But if you didn’t know what was going on and how to fix it, you’d end up with an unmaintainable, non-functional mess, full of bugs we have specific fixes in place to avoid but Copilot doesn’t care about, because it doesn’t have an idea of how software actually works, just what it should look like. So for anything novel or complex you have to feed it an example, then verify it didn’t skip steps or forget to include something it didn’t understand/predict, or make up a library/function call.
So you have to know enough about the software you’re making to point that stuff out, because just feeding whatever error pops out of your compiler may get you to working code, but it won’t ensure quality code, maintainability, or intelligibility.
surph_ninja@lemmy.world 2 days ago
A lot of people assume their not knowing how to prompt is a failure of the AI. Or they tried it years ago, and assume it’s still as bad as it was.
JustAnotherKay@lemmy.world 2 days ago
My first attempt at coding with chatGPT was asking about saving information to a file with python. I wanted to know what libraries were available and the syntax to use them.
It gave me a three-page write-up about how to write a library myself, in Python. Only it had an error on damn near every line, so I still had to go Google the actual libraries and their syntax and slog through documentation.
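For the record, that question needs no third-party library at all; a minimal sketch of the answer using only the stdlib `json` module (the file name and data here are just illustrative):

```python
import json
from pathlib import Path

# Structured data to persist.
data = {"user": "anon", "score": 90}

# Serialize to a file using only the standard library.
path = Path("save.json")
path.write_text(json.dumps(data, indent=2))

# Read it back and parse it.
loaded = json.loads(path.read_text())
print(loaded)
```

`pickle` or `csv` cover the same ground for binary objects and tabular data respectively, also without installing anything.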
UnsavoryMollusk@lemmy.world 3 days ago
Garbage for me too except for basic beginners questions
WoodScientist@sh.itjust.works 1 day ago
Two words: partial credit.
xor@lemmy.dbzer0.com 3 days ago
i guess the new new gpt actually makes code that works on the first try
Eheran@lemmy.world 3 days ago
You mean o3-mini? Wasn’t it on the level of o1, just much faster and cheaper? I noticed no increase in code quality, perhaps even a decrease. For example, it fails to remember things far more often, like variables that have a different name. It also easily ignores a bunch of my very specific and enumerated requests.
Maggoty@lemmy.world 2 days ago
Usually this joke is run with a second point of view saying: do I tell them, or let them keep thinking this is cheating?
dilroopgill@lemmy.world 3 days ago
deepseek runs solid, autoapprove works sometimes lol
nednobbins@lemm.ee 2 days ago
The bullshit is that anon wouldn’t be fsked at all.
If anon actually used ChatGPT to generate some code, memorize it, understand it well enough to explain it to a professor, and get a 90%, congratulations, that’s called “studying”.
MintyAnt@lemmy.world 2 days ago
Professors hate this one weird trick called “studying”
JustAnotherKay@lemmy.world 2 days ago
Yeah, if you memorized the code and its functionality well enough to explain it in a way that successfully bullshits someone who can sight-read it… you know how that code works. You might need a linter, but you know how that code works and can probably at least fumble your way through a shitty v0.5 of it.
naught101@lemmy.world 2 days ago
I don’t think that’s true. That’s like saying that watching hours of guitar YouTube is enough to learn to play. You need to practice too, and learn from mistakes.
RobertoOberto@sh.itjust.works 2 days ago
I don’t think that’s quite accurate.
The “understand it well enough to explain it to a professor” clause is carrying a lot of weight here - if that part is fulfilled, then yeah, you’re actually learning something.
Unless of course, all of the professors are awful at their jobs too. Most of mine were pretty good at asking very pointed questions to figure out what you actually know, and could easily unmask a bullshit artist with a short conversation.
nednobbins@lemm.ee 2 days ago
It’s more like if you played a song on Guitar Hero enough to be able to pick up a guitar and convince a guitarist that you know the song.
Code from ChatGPT (and other LLMs) doesn’t usually work on the first try. You need to go fix and add code just to get it to compile. If you actually want it to do whatever your professor is asking you for, you need to understand the code well enough to edit it.
It’s easy to try for yourself. You can go find some simple programming challenges online and see if you can get ChatGPT to solve a bunch of them for you without having to dive in and learn the code.
Maggoty@lemmy.world 2 days ago
No he’s right. Before ChatGPT there was Stack Overflow. A lot of learning to code is learning to search up solutions on the Internet. The crucial thing is to learn why that solution works though. The idea of memorizing code like a language is impossible. You’ll obviously memorize some common stuff but things change really fast in the programming world.
aliser@lemmy.world 3 days ago
deserved to fail
Agent641@lemmy.world 3 days ago
Probably promoted to middle management instead
SaharaMaleikuhm@feddit.org 3 days ago
He might be overqualified
kabi@lemm.ee 3 days ago
If it’s the first course where they use Java, then one could easily learn it in 21 hours, with time for a full night’s sleep. Unless there’s no code completion and you have to write imports by hand. Then, you’re fucked.
rockerface@lemm.ee 3 days ago
If there’s no code completion, I can tell you even people who’ve been coding as a job for years aren’t going to write it correctly from memory. Because we’re not being paid to memorize this shit, we’re being paid to solve problems optimally.
ryannathans@aussie.zone 3 days ago
Also get paid extra to not use java
spamfajitas@lemmy.world 3 days ago
My undergrad program had us write Java code by hand for some beginning assignments and exams. The TAs would then type whatever we wrote into Eclipse and see if it ran. They usually graded pretty leniently, though.
jaemo@sh.itjust.works 3 days ago
Perfectly articulated.
404@lemmy.zip 3 days ago
My first programming course (in Java) had a pen and paper exam. Minus points if you missed a bracket. :/
SatanClaus@lemmy.dbzer0.com 3 days ago
Haha same. God that was such a shit show. My hand writing is terrible lmao
ECB@feddit.org 3 days ago
I got -30% for not writing comments for my pen and paper java final.
Somehow it just felt a bit silly to do, I guess
DragonOracleIX@lemmy.ml 3 days ago
It was the same for the class I took in high school. I remember the teacher saying that it’s to make sure we actually understand the code we write, since the IDE does some of the work for you.
kopasz7@sh.itjust.works 3 days ago
Remember having to use (a modified version of?) Quincy for C. Trying to paste anything would put random characters into your file.
Still beats programming on paper.
TootSweet@lemmy.world 3 days ago
generate code, memorize how it works, explain it to profs like I know my shit.
ChatGPT was just his magic feather all along.
Bashnagdul@lemmy.world 3 days ago
Dumbo reference
boletus@sh.itjust.works 3 days ago
Why would you sign up to college to willfully learn nothing?
Gutek8134@lemmy.world 3 days ago
My Java classes at uni:
Here’s a piece of code that does nothing. Make it do nothing, but in compliance with this design pattern.
When I say it did nothing, it had literally empty function bodies.
boletus@sh.itjust.works 3 days ago
Yeah, that’s object-oriented programming and interfaces. It’s a shit way to teach people without a practical example, but it’s a completely passable way to do OOP in industry: you start by writing interfaces to structure your program and fill in the implementation later.
Now, is it a good practice? Probably not; imo software design is impossible to get right without iteration, but people still use this method… good to understand why it sucks.
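The interface-first workflow being described can be sketched like this (in Python with `abc` rather than the course’s Java; the class names are made up for illustration):

```python
from abc import ABC, abstractmethod

# Step one: declare the contract up front, with deliberately empty bodies.
class Storage(ABC):
    @abstractmethod
    def save(self, key: str, value: str) -> None: ...

    @abstractmethod
    def load(self, key: str) -> str: ...

# Step two, in a later iteration: fill in an actual implementation
# that satisfies the interface.
class InMemoryStorage(Storage):
    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def save(self, key: str, value: str) -> None:
        self._data[key] = value

    def load(self, key: str) -> str:
        return self._data[key]

store = InMemoryStorage()
store.save("greeting", "hello")
print(store.load("greeting"))
```

The point of the exercise is that `Storage` pins down the shape of the program before any behavior exists, which is exactly the “code that does nothing” assignment above.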
e8d79@discuss.tchncs.de 3 days ago
So what? You also learn math with exercises that ‘do nothing’. If it bothers you so much add some print statements to the function bodies.
ICastFist@programming.dev 3 days ago
Mine were actually useful, gotta respect my uni for that. The only bits we didn’t manually program ourselves were the driver and the tomcat server, near the end of the semester we were writing our own Reflections to properly guess the object type from a database query.
TheSlad@sh.itjust.works 3 days ago
A lot of kids fresh out of high school are pressured into going to college right away. It’s the societal norm for some fucking reason.
Give these kids a break and let them go when they’re really ready. Personally I sat around for a year and a half before I felt like “fuck, this is boring, let’s go learn something now”. If I had gone to college straight from high school I would’ve flunked out and just wasted all that money for nothing.
boletus@sh.itjust.works 3 days ago
Yeah, I remember in high school they were pressuring everybody to go straight to uni, and I personally thought it was kinda predatory.
dwindling7373@feddit.it 3 days ago
To get hired.
boletus@sh.itjust.works 3 days ago
A diploma ain’t gonna give you shit on its own
SoftestSapphic@lemmy.world 3 days ago
To get the piece of paper that lets you access a living wage
Crampi@sh.itjust.works 3 days ago
To get a job so you don’t starve
blackbeards_bounty@lemmy.dbzer0.com 3 days ago
Because college is awesome and many employers use a degree as a simple filter anyway
boletus@sh.itjust.works 3 days ago
Not a single person I’ve worked with in software has gotten a job with just a diploma since like the early 2000s
Maybe it’s different in some places.
GraniteM@lemmy.world 1 day ago
If you go through years of education, learn nothing, and all you get is a piece of paper, then you’ve just wasted thousands of hours and tens of thousands of dollars on a worthless document. You can go down to FedEx and print yourself a diploma on nice paper for a couple of bucks.
If you don’t actually learn anything at college, you’re quite literally robbing yourself.
SoftestSapphic@lemmy.world 3 days ago
This person is LARPing as a CS major on 4chan
It’s not possible to write functional code without understanding it, even with ChatGPT’s help.
ILikeBoobies@lemmy.ca 3 days ago
where’s my typo
;
Simulation6@sopuli.xyz 3 days ago
I don’t think you can memorize how code works well enough to explain it and not learn coding.
xelar@lemmy.ml 2 days ago
Brainless GPT coding is becoming the new norm at uni.
Even if I get the code via ChatGPT, I try to understand what it does. How are you gonna maintain these hundreds of lines if you don’t know what they do?
janus2@lemmy.zip 3 days ago
isn’t it kinda dumb to have coding exams that aren’t open book? if you don’t understand the material, on a well-designed test you’ll run out of time even with access to the entire internet
disclaimer did not major in CS
Fleur_@lemm.ee 2 days ago
Why would you even be taking the course at this point
Xanza@lemm.ee 3 days ago
pay for school do anything to avoid actually learning
Why tho?
Melatonin@lemmy.dbzer0.com 1 day ago
They’re clever. Cheaters, uh, find a way.
burgersc12@mander.xyz 3 days ago
Bro just sneak to the bathroom and use chatgpt on your phone like everyone else does
IndustryStandard@lemmy.world 2 days ago
Anon volunteers for Neuralink
licheas@sh.itjust.works 2 days ago
Why do they even care? It’s not like your future bosses are going to give a flying fuck how you got your code. At least, they won’t until you cause the machine uprising or something.
Psaldorn@lemmy.world 3 days ago
Now imagine how it’ll feel in interviews
2ugly2live@lemmy.world 3 days ago
He should be grateful. I hear programming interviews are pretty similar, as in the employer provides the code, and will pretty much watch you work it in some cases. Rather be embarrassed now than interview time. I’m honestly impressed he went the entire time memorizing the code enough to be able to explain it, and picked up nada.
Korhaka@sopuli.xyz 3 days ago
Open the browser in one VM. Open chatgpt in another VM.
GarlicToast@programming.dev 3 days ago
I was a TA when ChatGPT was released. Most students shot themselves in the foot this way before we figured out what was happening. Grades went from bell-shaped to U-shaped: a few students got 85+, the rest failed. It was brutal. I thought I had failed my students horribly before I found out it was happening in all classes.
If you’re actually stuck in such a situation, solve as many problems as you can. An approach that will work for most people:
1. Try to solve.
2. Fail.
3. Take a peek and understand your failure. If the peek didn’t include the full solution, go back to step 1. Else continue to step 4.
4. Move on to the next question and go back to step 1.
Make sure to skip questions if they are too easy. Every ~4 hours take a 20-minute nap (not longer than 25 minutes). If you actually manage to solve enough problems to pass, go to sleep: 4.5 hours, or a longer multiple of 1.5 hours.
After the exam, go back and solve all the homework yourself. DO NOT cram it; spread it out, or you will retain nothing long term.
Good luck.
WolfLink@sh.itjust.works 3 days ago
Any competent modern IDE or compiler will help you find syntax mistakes. Knowing the concepts is way more important.
RaoulDook@lemmy.world 3 days ago
Unless they’re being physically watched or had their phone sequestered away, they could just pull it up on a phone browser and type it out into the computer. But if they want to be a programmer they really should learn how to code.
levzzz@lemmy.world 3 days ago
Java is literally easy bro tf is there to stress about…
nsrxn@lemmy.dbzer0.com 3 days ago
run it in a vm
PlantDadManGuy@lemmy.world 2 days ago
I mean at this point just commit to the fraud and pay someone who actually knows how to code to take your exam for you.
NigelFrobisher@aussie.zone 2 days ago
I remember so little from my studies I do tend to wonder if it would have been cheating to… er… cheat. Higher education was this horrendous ordeal where I had to perform insane memorisation tasks between binge drinking, and all so I could get my foot in the door as a dev and then start learning real skills on the job (e.g. “agile” didn’t even exist yet then, only XP. Build servers and source control were in their infancy. Unit tests were the distant dreams of a madman.)
Ascend910@lemmy.ml 2 days ago
virtual machine
bappity@lemmy.world 3 days ago
Java is piss easy, they’re fine
UnfairUtan@lemmy.world 3 days ago
nmn.gl/blog/ai-illiterate-programmers
Every time we let AI solve a problem we could’ve solved ourselves, we’re trading long-term understanding for short-term productivity. We’re optimizing for today’s commit at the cost of tomorrow’s ability.
boletus@sh.itjust.works 3 days ago
Hey that sounds exactly like what the last company I worked at did for every single project 🙃
Daedskin@lemm.ee 3 days ago
I like the sentiment of the article; however this quote really rubs me the wrong way:
Why would that ship have sailed? No one is forcing you to use an LLM. If, as the article supposes, using an LLM is detrimental, and it’s possible to start having days where you don’t use an LLM, then what’s stopping you from increasing the frequency of those days until you’re not using an LLM at all?
I personally don’t interact with any LLMs, neither at work nor at home, and I don’t have any issue getting work done. Yeah, there was a decently long ramp-up period — maybe about 6 months — when I started on my current project at work where it was more learning than doing; but now I feel like I know the codebase well enough to approach any problem I come up against. I’ve even debugged USB driver stuff, and, while it took a lot of research and reading USB specs, I was able to figure it out without any input from an LLM.
Maybe it’s just because I’ve never bought into the hype; I just don’t see how people have such a high respect for LLMs. I’m of the opinion that using an LLM has potential only as a truly last resort — and even then will likely not be useful.
gamermanh@lemmy.dbzer0.com 3 days ago
Because the tools are here and not going away.
LLMs can do actually useful shit. Their point is that leaning only, or mostly, on an LLM hurts you; that doesn’t make it an invalid tool in moderation.
You seem to think of an LLM only as something you can ask questions to. That’s one of their worst capabilities, and far from the only thing they do.
Mnemnosyne@sh.itjust.works 3 days ago
“Every time we use a lever to lift a stone, we’re trading long term strength for short term productivity. We’re optimizing for today’s pyramid at the cost of tomorrow’s ability.”
Ebber@lemmings.world 3 days ago
If you don’t understand how a lever works, then it’s a problem. Should we let any person with an AI design and operate a nuclear power plant?
julietOscarEcho@sh.itjust.works 3 days ago
Precisely. If you train by lifting stones you can still use the lever later, but you’ll be able to lift even heavier things by using both your new strength AND the lever’s mechanical advantage.
By analogy, if you’re using LLMs to do the easy bits in order to spend more time on harder problems, fuckin a. But the idea that you can just replace actual coding work with copy-paste is a shitty one. Again by analogy with rock lifting: now you have noodle arms and can’t lift shit if your lever breaks or doesn’t fit under a particular rock or whatever.
trashgirlfriend@lemmy.world 3 days ago
“If my grandma had wheels she would be a bicycle. We are optimizing today’s grandmas at the sacrifice of tomorrow’s eco friendly transportation.”
AeonFelis@lemmy.world 3 days ago
Actually… Yes? People’s health did deteriorate due to over-reliance on technology over the generations. At least, the health of those who have access to that technology.
AdamBomb@lemmy.sdf.org 2 days ago
LLMs are absolutely not able to create wonders on par with the pyramids. They’re at best as capable as a junior engineer who has read all of Stack Overflow but doesn’t really understand any of it.
Guttural@jlai.lu 3 days ago
This guy’s solution to becoming crappier over time is “I’ll drink every day, but abstain one day a week”.
I’m not convinced that “that ship has sailed” as he puts it.
Hoimo@ani.social 3 days ago
Not even. Every time someone lets AI run wild on a problem, they’re trading all trust I ever had in them for complete garbage that they’re not even personally invested enough in to defend it when I criticize their absolute shit code. Don’t submit it for review if you haven’t reviewed it yourself, Darren.
wizardbeard@lemmy.dbzer0.com 3 days ago
My company doesn’t even allow AI use, and the amount of times I’ve tried to help a junior diagnose an issue with a simple script they made, only to be told that they don’t actually know what their code does to even begin troubleshooting…
“Why do you have this line here? Isn’t that redundant?”
“Well it was in the example I found.”
“Ok, what does the example do? What is this line for?”
Crickets.
I’m not trying to call them out, I’m just hoping that I won’t need to familiarize myself with their whole project and every fucking line in their script to help them, because at that point it’d be easier to just write it myself than try to guide them.
Glide@lemmy.ca 3 days ago
Capitalism is inherently short-sighted.
Agent641@lemmy.world 3 days ago
Nahhh, I never would have solved that problem myself, I’d have just googled the shit out of it til I found someone else that had solved it themselves
merc@sh.itjust.works 3 days ago
And also possibly checking in code with subtle logic flaws that won’t be discovered until it’s too late.