It’s like a conspiracy theory for that guy. Everyone who tells him it’s not true that you can get rid of programmers has to be a programmer, and therefore can’t be trusted.
To be fair, we should probably all start migrating to cybersecurity positions. They’ll need it when they discover how many vulnerabilities were created by all the non-programmers vibe coding.
We’re cooked because our leaders are pumping the AI bubble while crashing the rest of the economy. When that bubble pops, those programmers are going to have to find a job in the nuclear-sized crater where the economy used to be.
If it’s any consolation, there was a nuclear-sized crater of a job market for all engineers (software and otherwise) after Reagan’s eight years were up. The dot-com aftershocks were pretty huge, and the 2008 housing crisis hit everything really hard too. Then in 2012 I got laid off thanks to the end of our Afghanistan debacle, then there was that pandemic thingy…
So, yeah, the current foolishness is going to make a hell of a mess, but for the past 35 years (and longer, I’m sure) there has always been a huge mess either being cleaned up or coming soon; those are just the ones I’ve been hit by.
My favorite thing about ChatGPT is having to constantly tell it that I’m writing stuff in AstroJS, not ReactJS.
So, uh, where’s this link for a junior position starting out at $145k? Asking for a friend…
I find DeepSeek is incomparably better at coding tasks.
Depends on the language, I’d assume. The last thing I heard was that the current Codestral version is best suited to Python, for example.
They call them “Agents” now to desperately slap a patina of futuristic competence over the word “chatbot”.
I realise the dumbass here is the guy saying programmers are ‘cooked’, but there’s something kind of funny about how the programmer talks about people misunderstanding the complexities of his job, and about how LLMs easily make mistakes because they can’t grasp the nuances of what he does every day and understands deeply. He rightly points out that without his specialist oversight, AI agents would fail in ridiculous and spectacular ways, yet he happily and vaguely adds a throwaway statement at the end, “replacing other industries, sure”, with the exact same blitheness and lack of personal understanding with which ‘Ace’ proclaims all programmers cooked.
I find this is a really common trope where people appreciate the complexity of the domain they work in, but assume every other domain is trivial by comparison.
There’s a saying in Mandarin that translates to something like: Being in different professions is like being on opposite sides of a mountain. It basically means you can never fully understand a given profession unless you’re actually doing it.
I was making ChatGPT do some tedious thing, and I kept telling it “you got X wrong”, and it kept going “oh, you’re right, I got X wrong, I will not do that again” and then giving the exact same output. lol, the one time ChatGPT gives me consistent output for the same prompt.
Yeah, same with Codestral. You have to tell it what to do very specifically, and once it gets stuck somewhere you have to move to a new session to get rid of the history junk.
Both it and ChatGPT also repeatedly told me to save binary data I wanted to store in memory as a list, with every 1024 bytes being a new entry… in the form of a string (supposedly). And the worst thing is that, given the way it extracted the data later on, this unholy implementation from hell would probably even have worked up to a certain point.
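For the curious, a minimal sketch of what that kind of suggestion amounts to (hypothetical code, not the actual chat output; the latin-1 trick is just one way to stuff raw bytes into strings):

```python
# Hypothetical reconstruction of the suggestion, not the actual chat output:
# chunk binary data into 1024-byte pieces and keep each chunk as a *string*.
data = open("blob.bin", "rb").read()  # any binary file, name is made up

chunks = [
    data[i:i + 1024].decode("latin-1")  # latin-1 maps every byte value 1:1
    for i in range(0, len(data), 1024)
]

# ...and the later "extraction" that makes this unholy scheme mostly work:
restored = b"".join(chunk.encode("latin-1") for chunk in chunks)
assert restored == data
```

It round-trips fine right up until anything treats those “strings” as actual text (UTF-8 re-encoding, JSON serialization, trimming), at which point the bytes get mangled.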
The Y2K-level event for fixing the many many many mistakes of pure LLM coding without experts driving the things is going to be incredibly lucrative for some people. I’d guess in 3-5 years we’re going to see a whole boutique market around “shit I fired all of the devs and now my codebase is spaghetti (and linguine and angel hair and bow ties with some rigatoni and penne mixed in) to the extent that the AI can’t even try anymore”.
I’d guess this is going to happen in a lot of industries. At some point the undirected slop is going to clog things up to the point companies start folding over it, and then “Unslop LLC” is going to be the hottest shit for a while.
The problem is that too many execs are thinking like this guy. It’s not actually tenable to replace programmers with AI, but people who aren’t programmers are less likely to understand that.
If they actually follow through, there will be a very satisfying level of shit that comes down on their head.
When AI can sit through a dozen meetings discussing stupid things, only to finalize whatever you had already decided beforehand, then I’ll be worried.
Personally, I would happily let my AI bot attend the stupid scrum meetings for me. Let it tell my scrum master and stakeholders whatever progress I’ve made today and in the sprint. Don’t bother me during my coding time.
We made a (so far internal) tool at work that takes your activity from GitHub, your calendar, and the issue tracker, feeds all that to a local LLM, and spits out a report of what you’ve been doing for the week. It messes up sometimes, but it speeds up writing the report dramatically. This is one of those cases where an LLM actually fits.
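Schematically it’s something like this (a rough sketch with made-up helper names and endpoint, not the actual internal tool):

```python
# Rough sketch of the idea: gather a week of activity, hand it to a local LLM,
# get a draft status report back. Helpers and endpoint are hypothetical.
import datetime as dt
import requests

def fetch_github_activity(user: str, since: dt.date) -> str:
    return f"(GitHub events for {user} since {since} would go here)"

def fetch_calendar(user: str, since: dt.date) -> str:
    return f"(calendar entries for {user} since {since} would go here)"

def fetch_issues(user: str, since: dt.date) -> str:
    return f"(issue-tracker updates for {user} since {since} would go here)"

def weekly_report(user: str) -> str:
    since = dt.date.today() - dt.timedelta(days=7)
    context = "\n\n".join([
        fetch_github_activity(user, since),
        fetch_calendar(user, since),
        fetch_issues(user, since),
    ])
    prompt = (
        "Summarise the following week of activity as a short status report, "
        "grouped by project. Only mention things that appear in the data.\n\n"
        + context
    )
    # Assumes a local LLM behind an OpenAI-compatible HTTP endpoint.
    resp = requests.post(
        "http://localhost:8080/v1/chat/completions",
        json={"model": "local", "messages": [{"role": "user", "content": prompt}]},
        timeout=120,
    )
    return resp.json()["choices"][0]["message"]["content"]
```

A human still reads the draft before it goes anywhere, which is what keeps the “it messes up sometimes” part tolerable.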