As always, I use the term “AI” loosely. I’m referring to these scary LLMs coming for our jobs.
It’s important to state that I find LLMs helpful in very specific use cases, but overall this is clearly a bubble, and the promised advances have not materialized despite the hundreds of billions of VC dollars thrown at the industry.
So as not to go full-on polemic, we’ll skip the knock-on effects on the power grid and water supplies.
No, what I want to talk about is the idea that software, in its current form, needs to be as competent as its user.
Simply put: How many of your coworkers have been right 100% of the time over the course of your career? If N>0, say “Hi” to Jesus for me.
I started working in high school, as most of us do, and a 60% success rate was considered fine. At the professional level, I’ve seen even lower tolerated with tenure, given how much the work turns into internal politics past a certain level.
So what these companies are offering is not parity with senior staff (Ph.D.-level, my ass), but rather parity with the new blood who hasn’t yet had that one fuckup that doesn’t leave their mind for weeks.
That crucible is important.
These tools are meant to replace inexperience with incompetence, and the beancounters at some clients are likely satisfied those words look similar enough to pass muster.
We are, after all, at this point, the “good enough” country. LLM marketing is on brand.


This pretty much sums things up, in my experience.
We’re encouraged (coughrequiredcough) to use LLMs at work. So I tried.
There are things they can do. Sometimes. But you know what they can’t do? Be liable for a fuckup.
When I ask a coworker a question and they confidently answer wrong, they fucked up, not me. When I ask an LLM? The LLM isn’t liable; I am, for not verifying it. And if I’m verifying everything anyway, why am I using the LLM?
They fuck up often enough that I can’t put my credibility on the line over speedy slop. People at work consider me a good programmer (don’t ask me how, I guess the bar is low lol). Imagine if my code were just whatever an LLM shat out. It’d be the exact same quality as the code from all of my other coworkers who use whatever their LLM shat out. No difference in quality.
And we would all be liable when the LLMs fucked up. We would learn something. We would, not the LLM. And the LLM would make the same exact fuckup the next time.