While I (almost) agree with the conclusion, there are a lot of unproven assumptions and plain bullshit in this blog post. I always cringe at the “AI is democratising software development” argument in particular. It is wrong on so many levels. Software development is not an ivory tower. Everyone with an internet connection has had free access to all the resources needed to learn the necessary skills, for decades. Everyone with an interest in actually learning that stuff and willing to put a bit of effort into it was able to do so. What LLMs provide is not democratisation but the illusion that everyone can produce software, effortlessly and without any skills whatsoever. Software development is much more than churning out lines of code that seem to work. The vibecoding approach is like trying to build your own car without having the skills, asking an AI to assemble it as the sum of individual parts which all come from different car models, from a Lada to a Ferrari. The end result might be drivable, but it will be neither secure nor efficient nor fast nor stable nor maintainable. A Frankenstein car. Everyone with half a brain would agree that’s not a good idea, yet with LLMs people just pretend it’s fine.
Everyone could always learn woodworking, weaving, sewing, smithing, … that is not an argument. The point is that better tools make it easier to learn/perform/perfect these skills. Today anyone with a little torch and a hammer can play around with steel; 300 years ago you had to at least take on an apprenticeship to ever get to do that. Sewing with a sewing machine is so much faster that there is not much time to invest before you can make your own clothes.
Not everyone has 100s of hours of free time to sink into this and that skill “the purist way”. Any tool that makes the learning curve shallower and/or the process itself easier/cheaper/… helps democratize these things.
You argue as if everyone needs to be a super duper software architect, while most people just want to create some tool or game or whatever they think of, just for themselves.
> Not everyone has 100s of hours of free time to sink into this and that skill
That’s life, buddy. Nobody can learn everything, so communities rely on specialists who have mastered their craft. Would you rather be treated by a doctor with 100s of hours of study and practice, or by a random person off the street with ChatGPT? If something is worth studying for 100s of hours, then there’s more nuance to the skill than any layman or current AI system can capture in a few-sentence prompt.
What kind of nonsense comparison is that? Somewhat off topic, borderline straw man.
People still have their jobs; better tools enable them to do more things in their free time. Some even switch professions later on, once they have enough experience. Lowering the bar (investment, skill, …) is simply a good thing.
I personally have spent those 100s (actually more like 1000s) of hours studying Software Engineering, and I was doing my best to give an example of how current AI tools are not a replacement for experience. Neither is having access to a sewing machine or a blowtorch and hammer (you still need to know about knots and thread / metallurgy / the endless number of techniques for using those tools).
Software in particular is an extremely theoretical field, similar to medicine (thus my example with a doctor).
ChatGPT is maybe marginally better than a simple web search when it comes to learning. There is simply no way to compress the decade of experience I have into a few hours of using an LLM. The usefulness of AI for me starts and ends at fancy auto-complete, and even that only slightly speeds up my already fast typing.
Getting a good result out of AI for coding requires so much prerequisite knowledge to ask the right questions that a complete novice is not even going to know what they should be asking for without going through those same 100s of hours of study.
Well, if you want to use that stuff for your personal use, that’s totally fine. But there is a difference between doing that and selling your creation as a product. To pick up on your example, it’s great if someone learns woodworking and puts together a table or something. You probably won’t sell it, though, because unless you get really good at it, the piece of furniture will not meet the standards for a good product. It’s exactly the same when using LLMs to put together a piece of software: it will fall apart quickly unless you put some serious work into it. A lot of people think LLMs are a shortcut to learning this stuff and then go on to pretend to be professional software developers. I also doubt these vibecoders learn much about coding when they don’t even understand what the LLM is putting together for them as a result of a few wishful prompts.