If you run your own AI, that point doesn’t matter. What matters more is that if you don’t use a skill, you straight up lose it, and that’s what AI is doing to developers. Mfs straight up forget how to write code.
It’s actually helpful that there are a handful of nonsense phrases AI picked up from reading journal articles wrong. Journal articles are commonly published in magazine format with a bunch of narrow columns, so there’s some gibberish the AI scraped by reading across the page instead of down the columns. I want to make a database of those nonsense phrases so that I can just Ctrl+F in a journal article to see if I should skip reading it because it’s AI garbage.
I use AI to become a better coder, not to replace me.
You can use local models for free, it’s just slower.
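For instance, if you have something like Ollama running locally, it’s one HTTP call (a minimal sketch; assumes Ollama is installed on the default port and the model name is just whatever you happen to have pulled):

```python
# Minimal sketch: query a locally hosted Ollama server over its HTTP API.
# Assumes Ollama is running on localhost:11434 and "llama3.2" has been pulled;
# swap in whatever model you actually have.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Write a one-line shell command that counts files in a directory.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=300,  # local inference on CPU can take a while
)
resp.raise_for_status()
print(resp.json()["response"])
```

No subscription and no data leaving your machine; the trade-off is that it’s slower and the models are smaller.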
I’ve been using ChatGPT to help me build a Bubble website. That is, I am doing all the work; I just bounce questions off it about how to achieve things and how to structure conditional statements correctly.
Because I’m basically sanity-checking everything it says rather than copying blindly, it’s interesting to see just how much it gets caught in a loop of misinformation. I’m lucky to be one of those learners who just needs an example, even a shitty one, to figure things out myself, so I often find myself using it simply to see how it’s NOT done.
But yeah, I know jack shit about coding but I’m sure AI code sucks ass.
Good for you for wanting to learn a new skill and for taking the things LLMs spit out with healthy skepticism. I’m afraid future generations will lack such motivation.
Having been a coder for decades before AI came on the scene, I don’t understand how inexperienced programmers could possibly write a serious amount of working code with AI.
It’s wrong, like, at least half the time, but as an experienced coder, I can look at the “code” it generated and know what it was trying to do, and then write it correctly. I do find AI useful when I’m not sure how to go about solving a particular code-related issue, but … it just gives me something to think about, not an answer I can use directly.
It’s like google-coding in 2010; nothing you search for is exactly what you need, but it could help you see why your code isn’t working.
I really don’t dig that comparison. When you look up a snippet on Stack Overflow, for example, you can immediately see the quality of the answer, as well as feedback from real people.
Yeah, like if you start coming across snippets that aren’t even properly indented, you know you’re scraping the bottom of the barrel (been there while struggling to fix email templating I knew nothing about, back in the day). Now, the code you get from an LLM looks totally legit to the untrained eye, and it may even come with a convincing explanation.
But you won’t have any indication that it’s dead wrong until you try to run it. And even then, it may be “working” in an unintended way, because you don’t actually understand what you copy+pasted, and of course neither does the LLM.
I can’t even imagine the spaghetti bowl you can get yourself into if you just keep vibe coding yourself deeper and deeper, while understanding nothing.
The spaghetti bowl is the real problem. You can make something that works, but it’s fragile, because the solution is rarely general and never elegant. An individual snippet might be surprisingly elegant, but it will reimplement the same code 3 different ways in 3 different places, and the whole thing turns into a mess.
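To make that concrete, here’s a made-up toy example (hypothetical order-handling functions, not from any real LLM output) of the pattern: the same validity check reimplemented three different ways in three different places, versus one helper you can fix in one spot.

```python
# Toy illustration: the same "is this order valid?" rule written three
# different ways in three different places, the way duplicated generated
# code tends to end up.

def create_order(order: dict) -> None:
    if order.get("qty", 0) > 0 and order.get("price") is not None:
        print("created")

def update_order(order: dict) -> None:
    valid = "price" in order and order.get("qty", 0) > 0
    if valid:
        print("updated")

def ship_order(order: dict) -> None:
    if not ("qty" in order and order["qty"] > 0 and "price" in order):
        return
    print("shipped")

# What the maintainable version looks like: one rule, one place to change it.
def is_valid_order(order: dict) -> bool:
    return order.get("qty", 0) > 0 and order.get("price") is not None
```

Each copy “works” today; the mess shows up the first time the rule changes and only two of the three copies get updated.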
You can see the quality if you’re an experienced coder. My comment lacks personal context in that I was in school in 2010 and there were plenty of my classmates who would plug snippets into their projects without fundamentally understanding what it did or learning what the project was supposed to teach us. Similar to a shortcut with AI in 2025.
There are definitely people who cut & pasted from Stack Overflow in the work environment, too. The difference is that I, as the clean-up crew, could google their code and find the post it came from … and then I could read the comments and figure out wtf they thought they were trying to do. When they paste LLM-generated code in, there’s no trace of where the dumbfuckery came from.
Just thinking about it makes me glad I’m near retirement.
That was exactly my point: for the “non-experts”, googling and using AI are very much not the same thing, as googling provides them with a lot more actual information (quality, alternatives).
I tried using ChatGPT to write a basic batch file, and it ended up such a horrendous mess that I gave up halfway through. Fucker got told four times and still kept putting the REM on the same line as actual code.
Well, if it helps for y’all to know: if I can’t put my measly webpage-making skills to decent use in the course of a week’s time, I’ll be buying the services of a freelancer, because hoooooly shite am I rusty.
(I need to update my basic website and am terribly lazy. Maybe making some extra cash would make a kid somewhere happy.)
((Don’t message me here though I don’t check messages))
It’s the same cycle since the '70s. Whether it’s COBOL or VB.NET or vibe coding, the premise hasn’t changed.
There are three broad categories of code:
1. Monkey code (random applets that are almost entirely business logic and non-critical)
2. Actual code (most things)
3. Crazy shit like kernel or browser code.
I can see vibe coding, situationally, lowering the barrier to entry for (1). But that’s no different from COBOL or VB.NET, which both promised “MBAs can now write code”, a promise that conveniently never extends to maintaining said code. And vibe coding doesn’t help with that either; ChatGPT is an awful debugger.
Your boss thinks ChatGPT will help with (2), but it either won’t, or will only help very slightly as an advanced autocomplete. For any problem-solving that requires more specific domain knowledge than can automatically find its way into their tiny context windows, LLMs are essentially useless.
… So I’m not worried. Today’s vibe coders are yesterday’s script kiddies.
Debugging is the hardest part, and now you get to spend all your time doing it.
It’s not possible to make you unskilled if you’re skilled. At worst, you’d get rusty. It is possible that your skills might not be in high demand anymore though.
The only thing that would make programmers not be in demand is if “vibe coding” were truly producing a better product than traditional programming. So far, the only ones making that claim are the ones desperately trying to sell “AI” before the bubble bursts. It’s true that there are some companies that really want to believe it. But, companies are always desperately hoping for something that can allow them to fire their expensive workers. It’s rare that that works out.
It’s been aggressively pushed on new programmers though, a whole generation who might never develop those skills to begin with.
So was Mountain Dew. That doesn’t mean people had to drink it.
You are thinking too short-term.
This is not about you, but about the next generations.
In that case we’re not talking about “deskilling”; we’re talking about “not skilling in the first place”.
But, those are completely different things. I was never skilled in riding horses, the way I assume my great grandparents were. I didn’t learn how to use a sliderule like my grandfather did. But, I still learned skills that were valuable for the moment in history where I grew up. There’s never any guarantee that a baby born today will get to the age of 20 with skills that are useful enough that someone will pay them to use those skills.
As for programming, it isn’t some kind of nefarious goal to make sure that tomorrow’s children won’t know how to do it. It’s an immediate short-term goal to try to save money by not having to hire people with specialty skills. If that gamble pays off, then it will be like using a sliderule. Kids won’t learn it because it isn’t a skill that’s in demand anymore. If AI turns out to be a niche thing, rather than a massively transformational technology, then tomorrow’s kids will learn to be programmers in whatever languages are hot in 20 years.
No, it’s about deskilling the workforce, not an individual.
That just sounds like a conspiracy theory.
You don’t need a conspiracy to motivate companies to make you dependent on their subscription service. Their goal is not to deskill workers for evil’s sake. They want the norm to be using their systems instead of your brain.
Might be, but it’s obvious that they want people to rely on their products and then sell it as a subscription. Like everything else
It’s worse than that.
The goal isn’t to sell coding superpowers to programmers. It’s to drive a wedge between employer and employee. Make both of them dependent on an intermediary instead of each other.
Think DoorDash but for coding gigs. You don’t have a job, but a series of push notifications offering a chance to review an 18-line PR for $3.81.
Remember to respond within the next 90 seconds to maintain your priority status, and don’t decline too many offers.
Edit: See also, chickenized reverse-centaurs.
This 11-year-old Adult Swim comedy video doesn’t even feel that ridiculous anymore.
Here’s a fun thing. I’ve been using the latest AI to write backend and frontend code. Every couple of weeks I have to stop, go through every line and module, throw out pretty much 90% of the code, and manually refactor and rewrite it.
It offers a good starting point, but the minute things get slightly complicated, you have to step in. I feel bad for people who think this will make it so they don’t need experienced developers and architects. They’re in for a rough ride.
An interesting point I heard the other day: if AI can replace entry-level jobs by doing the simple scripts that AI can definitely do (because it essentially just spits out its Stack Overflow/Reddit/etc. training data verbatim), then companies no longer need entry-level programmers.
If they don’t need entry level programmers, how do you get future senior programmers? Skipping directly to advanced stuff without getting practical experience on the simple stuff is incredibly hard.
What happens when the current senior programmers retire in larger numbers and there are very few replacements, because the ladder is gone?
That’s a problem for Q72, and they’re incapable of looking past Q4. Besides, they’ll have already jumped ship by then; what do the execs care, as long as they make this quarter just ever so slightly more profitable?
> Every couple of weeks, have to stop, go through every line and module, and throw out pretty much 90% of the code
> It offers a good starting point
It doesn’t sound like a good starting point if you have to throw out 90% of it every couple of weeks.
Agree. Software engineering is a marathon - not a sprint. These AI tools are useful to get something up real quick, but I have a hard time seeing how they can be useful for long term maintenance work.
> Software engineering is a marathon - not a sprint.
Oh BOY, have I got news for you: there’s this ‘brand new shiny’ thing called Agile at almost every fucking company ever.
Agile doesn’t claim that a project can be completed in a sprint.
Tell that to middle management
My middle management knows this. I don’t stick around for shit management, we don’t deserve each other. There are always other opportunities.
It’s still a marathon, even if the name “sprint” is used. The point is the same: software engineering is about ensuring long term maintenance. It’s about building software that can sustain through multiple sprints.
The typical code from an AI agent can barely sustain a single sprint without having to restart from scratch.
I know, but in most companies they don’t give a fuck.
What’s done is done; sure, there can be some minor maintenance, but goodness forbid you need to rewrite something to handle the 10x throughput that built up over the years.
I am usually able to get some cleanup tasks in, but from what I’ve heard, not many people are.
It’s just sad that some think ‘sprint’ means ‘this is done, and don’t you dare tell me you need more time; what have you been doing the last X sprints?’.
> I am usually able to get some cleanup tasks in, but from what I’ve heard, not many people are.
If the company you work for truly does not value this effort, then do not do it.
It’s not your code base. It’s theirs. You are not being rewarded for saving them from themselves. Don’t work for free.
Plus “getting something up real quick” is the fun part.
The first draft is fun.
The second draft is pain.
The third draft is cathartic.
Figure out features, add add add.
Add/change features, realise the spaghetti mess and poor design decisions you made in the first draft.
Clean everything up with better design and code.
Drag feels schadenfreude for them. If they’re going to fire their workforce to chase trends, it would be fun to watch them go out of business over it.
It’s exactly the opposite of teaching a man to fish. This is telling that man to depend on whatever floats down the river and just pick whatever seems edible. Whether the man gets enough or poisons himself, nobody will know, because the skill to fish will have been lost.
Like people who have only ever had a smartphone for everything: they’ll never know the advantages of an actual computer and will struggle when they need to use one.
I think this is much less convincing than the simpler explanation: they’re selling AI as a replacement for skilled labor, not as a way to intentionally deskill actual software engineers.
Capitalism already has a way of preventing you from making your own commodities - you sell your time, and the less they pay you for it relative to how much you need to live, the less time you have for yourself to put towards self sufficiency. We don’t have many FOSS products, not because nobody has the knowledge or skill to make them, but because nobody has the time to make them.
There are plenty of reasons to hate corporate-owned AI products, we don’t need to be hallucinating new ones.
I run free local models…
This. I hope we just dislike the monopolization of AI here, not the technology in general. Self-hosting is the way.
The output is still slop, no matter if it’s local or oligarch-owned.
It gives you access to information in an extremely efficient way, though. Before AI, I was often scrolling through hundreds of forum posts to find the solution to my problems; now I just ask AI and get a straightforward answer. It’s a great efficiency tool in general and increases your overall skillset. Sure, it won’t replace highly skilled people yet, but usually those people are only very skilled in one area; AI lets people raise their baseline across skills, so anyone can code, write, gather information, etc.
IMO the problem lies in what big corps and governments will do with it, and that will fuck us heavily.
LLMs provide as much information as a parrot repeating the words it hears most often.
It’s a terrible, terrible “source” of information that will lead to an insane amount of misinformed people.
It seems you haven’t gotten your hands on it yet; there’s so much more than just ChatGPT. You should definitely try out Perplexity: it made using search engines obsolete for me, and the information is neutral, up to date, and on point, at least most of the time; only rarely do you have to push it or give it more information. Sadly it’s big corp and closed source, but you can self-host with Perplexica and local AI models running on your own hardware. Loathing it is one thing, but completely ignoring it is another; you shouldn’t hate what you don’t know.
You are assuming too much.
> It’s a terrible, terrible “source” of information
There’s not much room for interpretation here.
It’s called Fuck_AI, not Fuck_Monopolization_of_AI.
“Sir, we are 100% reactionary, no room for nuance in these parts!”
Remember when people said digital artists weren’t artists, because they didn’t use traditional mediums?
Yeah @[email protected], local-free-open is gonna eat some billion-$ corp’s lunch; this is exciting for everyone, almost.
Eugene, are you able to connect the Reality Check complaint screenshot to the thread for me?
Also, any issue with one volunteer contributor training a model on public-domain code and sharing it with a friend, who uses it for free and locally to generate a custom script? (For context, say the friend is a disabled non-coder who wants an accessibility script.) I’m considering the contrast between that and the non-coder searching StackOverflow all day, and looking to understand what the volunteer and disabled friend did wrong in one scenario but not the other.
Excuse the sympathetic scenario, just wanna make things easier. Not asking anyone to defend Sam Altman!
Yeah, two completely similar things! The machine that lets you draw without constantly buying expensive art supplies vs. the “let’s turn shitty ideas into full products for the sake of social media clout” machine. 🤡
Right?
I’ll go against the grain here: I’m not worried. If you actually care about what you do, even vibe coding can teach you something; it could be a starting point. The internet is not going away, and just looking up this or that thing the AI spit out will help you learn what you’re working with.
Is it the same as a uni CS course? Of course not, but how many of us got our start just tinkering with stuff we didn’t understand?
> The internet is not going away, and just looking up this or that thing the AI spit out will help you learn what you’re working with.
I think you mean “sifting through several pages of worthless search results while looking for something the AI spit out”
The internet is worse than it used to be, and it can still get worse.
Bad search results and bad documentation specifically are a different problem, though.
They’re a problem being made worse by AI.
While I agree with you, the unfortunate trend among common folks is to take the easiest path to accomplishing their goal.
If that means using a tool they don’t understand to achieve a solution instead of being forced to learn from tinkering, I think most people will opt for that route.
They won’t take that extra step to comprehend what the AI spits out.
Those kinds of people would have behaved the same anyway, copy-pasting from the internet or wasting others’ time some other way.
I guess we could argue whether giving them AI will act as a multiplier for their damage output or will reduce it because the AI will be savvier than them, but personally I don’t see things changing much.