I was at a company that tried to develop some AI apps and mostly failed, but I learned a lot about how to use AI: what can be done and what isn't sensible to do with it.
That’s basically the “AI is replacing jobs” vs. “AI can’t replace jobs” paradox.
The C-suite doesn’t get it. AI is a hugely accessible tool that anyone can use, but only trained people can evaluate the results. Yet the C-suite trusts those results, because software has been so predictable (so trustworthy) in the past.
So the C-suite replaces employees with AI. The AI can’t actually do the job it pretends it can do. Everyone suffers, and the people selling the shovels profit most from the gold rush.
It lies on its resume and in its interviews, but in ways that are hard to detect.
I bet there was a similar sentiment when automation replaced blue collar jobs.
And yet all that automation still requires tool-and-die manufacturing and maintenance. Buy a tool and die purpose-built for your process, and a year down the line you still need the supplier to maintain the actual die: the actuators and machinery can be maintained by anyone, but the “business logic” is what produces a good, high-quality part. Process changes? Updated design? Switching to a supplier with a slightly different material? Back to the supplier for a new die.
But so many jobs were made “redundant” by cheap tooling and automation, and now it’s (nearly) impossible to actually manufacture something at scale in America.
Except that LLMs just execute the most likely next step, to the most likely dimensions, based on the prompt and on the popularity of similar previous processes.
Fine for art and other subjective media; not for manufacturing, and not for engineering.
I guess you could write automated tests that define the behaviour you want.
Probably better to write out the behaviour you want and have the AI generate the automated tests…
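The test-first idea above can be sketched like this (a minimal Python example; the `slugify` function and its rules are hypothetical, purely for illustration). The point is that the tests, not the generated code, are the source of truth:

```python
import re

def slugify(title: str) -> str:
    # Implementation under test -- this is the part you might let an AI draft,
    # since the tests below define what "correct" means.
    slug = title.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumeric runs to '-'
    return slug.strip("-")

def test_slugify() -> None:
    # Each assertion encodes one behavioural requirement, written *before*
    # the implementation. If the AI's code passes, it meets the spec as written.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  spaces  everywhere  ") == "spaces-everywhere"
    assert slugify("already-a-slug") == "already-a-slug"

test_slugify()
print("all behaviour tests passed")
```

The catch, of course, is that the tests only pin down the behaviour you thought to write down; anything unspecified is still up to the model.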