Post:
If you’re still shipping load‑bearing code in C, C++, Python, or vanilla JavaScript in 2025, you’re gambling with house money and calling it “experience.”
As systems scale, untyped or foot‑gun‑heavy languages don’t just get harder to work with—they hit a complexity cliff. Every new feature is another chance for a runtime type error or a memory bug to land in prod. Now layer LLM‑generated glue code on top of that. More code, more surface area, less anyone truly understands. In that world, “we’ll catch it in tests” is wishful thinking, not a strategy.
We don’t live in 1998 anymore. We have languages that:
- Make whole classes of bugs unrepresentable (Rust, TypeScript)
- Give you memory safety and concurrency sanity by default (Rust, Go)
- Provide static structure that both humans and LLMs can lean on as guardrails, not red tape
At this point, choosing C/C++ for safety‑critical paths, or dynamic languages for the core of a large system, isn’t just “old school.” It’s negligence with better marketing.
Use Rust, Go, or TypeScript for anything that actually matters. Use Python/JS at the edges, for scripts and prototypes.
For production, load‑bearing paths in 2025 and beyond, anything else is you saying, out loud:
“I’m okay with avoidable runtime failures and undefined behavior in my critical systems.”
Are you?
Comment:
Nonsense. If your code has reached the point of unmaintainable complexity, then blame the author, not the language.
So there is apparently a problem with languages such as JavaScript and the solution is to use languages such as TypeScript.
Wut?
TypeScript and safety-critical paths should not be in one sentence.
wut ?
Why ?
Genuinely curious to learn from your arguments
It’s Javascript with types. You are still using one hundred NPM packages to do the simplest thing. Any string can be JSON. And Node is single-threaded, so if you plan to create some kind of parallel computation, you’d need to run 16 Docker containers of your Node server, one per CPU core, with NGINX or some other load balancer at the business end, and hope that your database engine won’t reorder transactions. And yeah, Docker is mandatory, because Node version in your latest Ubuntu release is already outdated.
don’t just m-dash
chat gippity
Maybe, but always remember LLMs are trained on real people. Some people naturally use similar styles to some LLM tics, as it was stolen from them in the first place.
Don’t just state—regurgitate!
Go and Python and Typescript all have their own footguns.
I assume Rust is the same, but haven’t used it personally to see
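For example (just an illustration, not anyone’s codebase in particular), one classic Python footgun is the mutable default argument:

```python
# Default values are evaluated once, at function definition time, so a
# mutable default is silently shared between every call.
def append_item(item, bucket=[]):
    bucket.append(item)
    return bucket

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2]  <- same list as before, not a fresh one

# The usual fix: a None sentinel, with the list created inside the call.
def append_item_fixed(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket
```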
Rust is the footgun; it’s so perfect that you genuinely cannot just sit down and type out what you need.
Sounds like they want Ada Spark and not Rust.
Just don’t do bugs. How hard is that?
If I don’t have documentation or defined features, then I can’t do bugs!
According to all teams I’ve worked on.
Pretty fucking hard.
I know this is satire, but really though, better languages that make various classes of defects unrepresentable do reduce defects. It’s wild that such a statement needs to be made, but our industry is filled with folks who don’t think critically about decisions like these.
Like the age old advice for getting better at Smash Brothers - Don’t get hit.
My second favorite prompt, behind “Do not hallucinate”
I don’t get it.
Maybe the joke is nothing complex is written in fad languages?
Maybe the joke is the discounting of peer review and testing?
Maybe the joke is the lack of devops knowledge where Python is extra steps over other scripting languages?
It seems like promotion of fad languages. When I was younger, I chased fads and lost hard. I’ll stick with C and C++. Run-time failures happen to everyone, including fad languages. Here’s looking at you, Rust CVEs. Better to have loved and lost, something, something.
I’m completely confused by why they seem to think it’s impossible to have coding errors in Rust. I’m also confused as to why they seem to think that errors are actually a problem. You get them, you fix them. Who cares about what language you do it in.
This stinks of somebody who’s been in the industry for about 2 years and now thinks they’re hot shit.
As an embedded dev, good luck not using C
You could use Forth.
Not an embedded dev. What’s the Rust situation in the embedded world? Is it ever used?
In my corner of the embedded world, it feels like everyone is practically jumping to integrate Rust. In the sense that vendors that haven’t had to innovate for 10+ years will suddenly publish a Rust API out of the blue. And I’m saying “out of the blue”, but I do also regularly hear from other devs that they’ve been pestering the vendors to provide a Rust API, or have even started writing their own wrappers for their C APIs.
And while it’s certainly a factor that Rust is good, in my experience they generally just want to get away from C. Even our management is well aware that C is a liability.
I guess I should add that while I say “jumping”, this is the embedded world where everything moves extremely slowly, so we’re talking about a multi-year jump. In our field, you need to get certifications for your toolchain and code quality, for example, so lots of work is necessary to formalize all of that.
I’ve been an embedded developer for coming up on 20 years at this point, and recently went through a job hunt. Of the three that made it to the offer stage, two used Rust almost exclusively in their embedded stack, and one used Rust in their embedded Linux stack and was trying to decide if they were going to use Rust in their bare-metal/RTOS stack. I ended up at one of the Rust places, though I had no Rust experience. I have to say, while I do find many parts of the syntax too cute by half, in general I’m pretty happy with it as an embedded language. My current target architectures are ARM Cortex-M7 and Cortex-A53. In general, toolchain and debugger support has been good; peripheral support has been OK but could use improvement.
It’s not widely used. Some car manufacturers (Toyota, if I remember correctly) have started testing it. Some parts are really nice.
There is exactly one HAL for I2C, SPI, and I/O pins. As long as both your chip and your peripheral driver implement against it, it just works. There are more unified abstractions in the works for things like DMA, but they are not officially stable yet.
Cooperative multithreading can easily be integrated thanks to async Rust and executors like Embassy.
All the crates that are no_std compatible can be included.
It’s not perfect, but it’s getting there.
Bindings have been getting added to the Linux kernel so drivers can theoretically be written in Rust
Android has moved its IPC mechanism, Binder, over to Rust
Rust is more of a C++ replacement, no? Rather go with Odin, Zig, or C3 for systems stuff?
You often just want to go with what’s popular, since hardware vendors will only provide APIs for select languages.
Well, and depending on the field, you may need to get certifications for your toolchain and such, so then you have to use what’s popular.
Not really. The things where C++ used to shine are usually best done in higher level languages.
Rust is a great C replacement.
Rust is more of a C++ replacement, no?
It’s fast and memory safe, so it’s good for stuff you might do in C but where you don’t want to risk a memory leak or segmentation fault.
I was thinking more in terms of atomic dependencies, how it handles features in-language vs. in-code, and reliance on the toolchain vs. standalone. In short, how you as a programmer are supposed to use it.
I know that some people have managed to get it working, but I have yet to see it in practice. Granted, my experience in the industry is currently only what I learned during my studies and 2 internships.
In general, C is supported. C++ is sometimes supported and very few people even talk about Rust.
it’s just negligence with better marketing
God damn, I hate that tone. It reeks of LinkedIn LLM-powered personal branding. Weak ideas with writing that tries to sound strong is the worst.
Weak ideas with writing that tries to sound strong.
I move to make this the new definition of “Marketing”.
I guess I used to associate marketing speak with “let’s say something so bland nobody can really disagree with it”, but this trend of writing platitudes like they are groundbreaking expert insights is really grating to me.
So much. It feels like an offshoot cousin of clickbait, basically.
I halfway agree. I always say form shapes function. Sure, you can write good code in any language, but some encourage it more than others. Ultimately it’s the programmer’s fault when things get overly complex, though.
Honestly, I more than half agree because the factor most seem to conveniently ignore is that languages and environments that encourage better and safer code are aimed at the lowest common denominator.
The lowest common denominator of developers are the ones that benefit the most from a reduction in defects or unsafe code they may produce. They are the biggest pool of developers. And in my experience, the ones least likely to proactively take measures to reduce defect rates unless it’s forced upon them and/or baked into their environment.
They are the ones that will slap “any” in TypeScript to resolve errors instead of actually resolving them, or the ones that will use “dynamic” in C# instead of actually fixing the bad design… etc.
“Blame the author, not the language”
Says the person whose comment screams that they have never worked professionally with a team before.
There is no excuse to not use statically typed, safe languages nowadays. There are languages that let you build faster like Python and Typescript, but faster does not mean safer. Even if your code is flawless it still isn’t safe because all it takes is a single flawed line of code. The more bug vectors you remove the better the language is.
Even if your code is flawless it still isn’t safe because all it takes is a single flawed line of code.
If there is a single flawed line of code, the code isn’t flawless.
Even if the code is flawless now, all it takes is a single flawed line of new code. This is of course true for all languages, but type safety helps a lot as some types of flaws would not compile.
I am not arguing against type safety, just pointing out the glaring contradiction in defense of it.
let you build faster like Python
I have to write so much boilerplate code to make sure my objects are of the correct type and have the required attributes! Every time I write an extension for Blender that uses context access, I have to make sure that the context is correct, that the context has the proper accessor attributes (which may not be present in some contexts), that the active datablock is not None, that the active datablock’s data type (with respect to Blender, not Python) is correct, that the active datablock’s data is not None… either all that, or let the exception fall through the stack and catch it at the last moment with a bare except and a generic error message.
I used to think that static typing was an obstacle. Now I’m burning in isinstance/hasattr/getattr/setattr hell.
I loved Python when I was a junior dev. Now I hate it (except for things like computational math). I have to add debug statements to figure out that someone snuck the wrong type into the code.
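To make that concrete, here is roughly what the defensive dance looks like (a simplified sketch; the attribute names are illustrative stand-ins, not the exact Blender API):

```python
# Simplified sketch of the runtime checks described above; attribute names
# are hypothetical, not the real bpy API.
def get_active_mesh_data(context):
    obj = getattr(context, "active_object", None)
    if obj is None:
        raise RuntimeError("No active object in this context")

    obj_type = getattr(obj, "type", None)
    if obj_type != "MESH":
        raise RuntimeError(f"Expected a MESH datablock, got {obj_type!r}")

    data = getattr(obj, "data", None)
    if data is None:
        raise RuntimeError("Active object has no data block")

    return data
```

Every one of those checks exists only because nothing upstream guarantees the types.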
Type checkers are your friend if you can enforce them. I’ve started using them in my new projects and find that they make those types of bugs harder to sneak in, especially if you’re strict about requiring type hints/definitions in your functions and classes.
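For instance, a minimal sketch of the kind of mismatch a checker catches before anything runs:

```python
# With hints in place, a checker such as mypy or Pyright flags the second
# call at check time; plain CPython would only fail once that line executes.
def scale(values: list[float], factor: float) -> list[float]:
    return [v * factor for v in values]

scale([1.0, 2.5], 2.0)  # fine
scale([1.0, 2.5], "2")  # checker error: str is not compatible with float
```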
I like ty, but it’s immature. Check out Pyright as well.
I have to write so much boilerplate code to make sure my objects are of the correct type and have the required attributes!
That is the trap that, sadly, my company fell for too. The POC was written in Python. Very fast, I might add. But it was only that: a POC. If the whole backend crashed due to unexpected user input - no one cared. If the frontend displayed gibberish because the JSON made wrong assumptions about undefined data types - sweep it under the rug, just don’t do that during presentations.
But when it came to building a resilient system, one that can be shipped to customers and preferably maintained by them (with a minimal consulting contract for access to our guys)… we cursed the way Python worked.
There are definitely use cases where something like C is still the best option because it’s faster. For most consumer software it’s unnecessary, but it’s not obsolete for all applications.
Hell, assembly code is still necessary for the lowest-level init code. Once you have a functional stack and some var init logic you can graduate to C.
That’s ridiculous. Everyone knows it’s best to write modern bootloaders in Matlab.
There have been multiple operating systems written in Haskell
I believe you and I’m sure they were fine.
I wrote an XML parser in LabVIEW once. Just because you can doesn’t mean it’s the right thing to do lol.
You joke, but my first “let’s make Facebook, but…” comment was from an electrical engineer buddy that wanted to use Matlab. That was the whole pitch. “Facebook, but Matlab.”
It did not go far.
Real men use Scratch for everything.
A little hair-splitty, but an assembly-free bootloader is definitely doable on some platforms – Cortex-M processors load the stack pointer from the vector table, and the initialized memory setup can be taken care of with memcpy.
True, but you’re not gonna be setting the access levels or doing anything else with control registers on a Cortex-M in pure C, let alone boot to a safe state with zeroed registers.
Yeah, if your bootloader is expected to handle that you’re going to need assembly. That can also be delegated to the kernel, RTOS, or bare metal reset vector later on in the boot sequence, though. I had to write a bootloader for an embedded system like this once and it basically just applied firmware updates, validated the firmware, and handed control over to the firmware.
You’re just describing more components that are written in C and assembly.
My point is that assembly isn’t strictly required. You can do memory-mapped reads and writes from C all you want, which is enough for plenty of I/O: storage, serial, sensors, GPIOs… You can build quite a few things with these without touching system registers.
I’m not saying we should abolish assembly. Just that it isn’t a universal requirement.
In my 15+ years of experience many of the actual field problems are not language / programming related at all. Unclear requirements or clear but stupid requirements cause loads of issues. These are often caused by communication problems between people and / or organizational issues.
It depends a lot on the industry of course. For embedded software, low level networking etc I mostly agree with you. However, in business applications or desktop applications it’s from my experience mostly bad requirements / communication.
Don’t forget to add incompetent leadership to that list. If a feature needs to be shipped by some arbitrary deadline and the engineers are forced to rush through the design process, you end up with a patchwork hack of tech debt that leads to more tech debt.
Python isn’t “untyped”; it is, in fact, strongly typed. (And is markedly different from, and superior to, JavaScript on that point.)
This rant feels like it was written by an OO programmer who was never able to wrap his head around functional programming.
Why are you talking about functional programming? Python sure as hell isn’t FP.
You might be confusing using functions with functional programming. Python is an object-oriented language at its core, most people use it procedurally, and like most modern languages it also supports functional paradigms.
Yeah, plus it has type hints and tooling to make said type hints mandatory.
Also, like, fuck golang, it’s such a shit language and the compiler does very little to protect you. I’d say that mypy does a better job of giving you AOT protection.
Also, like, fuck golang, it’s such a shit language and the compiler does very little to protect you
I never understood why people like it. It’s a “new” language, and it still doesn’t seem to get the basics right. No proper null handling, and don’t get me started on interface{}. It’s like they set out to build a better alternative to C++ while ignoring the other developments outside C/C++ for the past 15 years. The compiler is damn quick, though.
I dumped 18 years of C++ experience for Go in 2018, and never wanted to come back. It took me a couple of months to get accustomed.
Go’s main feature is a green light for ignoring the OOP baggage collected over decades, which makes writing code an unnecessary burden. And Go has the tools for not doing that.
Yes, sometimes it can be a bit ugly, but if you’re ready to trade academic impeccability for ease of use, it’s a real blast.
I’ve seen a lot of bad Go code that tried to do the OOP things taught in school or books. Just don’t. Go requires a different approach, a different mindset. Then everything falls into place.
It’s also dynamically typed and inferior to TypeScript.
Depends entirely on the use case. Python is loved for data processing, but Python GUIs get messy. And so do JS and TS GUIs.
I’ve never met a desktop GUI bigger than a single page with buttons that wasn’t messy and complicated.
Granted, I’m used to Qt in C++ and python, so I don’t think I’m the best sample collector.
Delphi could do complex GUIs pretty well. That makes me miss it sometimes.
Typescript fucking sucks.
Why?
Do you mean Python has something to do with functional programming, or did I misread? Because I would say e.g. TypeScript is (slightly) closer to FP than Python.
Yes. Python is a multi-paradigm language, but IMO proper “pythonic” python looks a lot more functional than OO, with liberal use of duck-typed list comprehensions and such.
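A small sketch of the contrast I mean (illustrative only):

```python
# "Pythonic" style: a duck-typed comprehension that works on any iterable
# whose elements happen to have .active and .name, no base class required.
def names_of_active(items):
    return [item.name for item in items if item.active]

# The heavier OO framing of the same thing, which tends to read as less idiomatic:
class ActiveNameCollector:
    def __init__(self, items):
        self.items = items

    def collect(self):
        result = []
        for item in self.items:
            if item.active:
                result.append(item.name)
        return result
```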
I’m not even going to bother commenting on that train wreck of a post, but I just wanted to mention that I hate the writing style of programming-related LinkedIn posts. They’re just chock-full of sweeping generalizations presented as absolute truth in an extremely patronizing tone.
Why can’t people just say, “In my opinion, X technology is a better fit for Y situation for Z reason,” instead of “Every time you encounter X, you must do Y, otherwise you’re dead wrong.”
It’s just simultaneously so arrogant and also aggressively ignorant. If someone spoke to me like that in real life, I would never want to speak with them again. And these people are broadcasting this shit to their entire professional network.
Yeah, particularly the broadcasting really irks me.
That is an opinion you can hold for yourself and then make compromises as you encounter reality. I do expect programmers to hold strong opinions.
But when you broadcast it, you strip yourself of the option to make compromises. You’re just saying something which is going to be wrong in one way or another in most situations. I do expect programmers to be smarter than that.
Nonsense. If your code has reached the point of unmaintainable complexity, then blame the author, not the language.
I feel like there’s about one person that can cast this stone, and that’s because preventing this has turned Torvalds into an abusive bridge troll at times, but he’s actually been successful.
Well, the kernel is unmaintainably complex. Linus saves his sanity by not looking deeply into modules and only inspecting the surfaces.










