I don't agree that the code is cheap.
It doesn't require a pipeline of people to be trained and that is huge, but it's not cheap.
Tokens are expensive.
We don't know what the actual cost is yet.
We have startups, who aren't turning a profit, buying up all the capacity of the supply chain.
There are so many impacts here that we don't have the data on.
You completely ignored the post you're replying to.
To recap, the author disagrees that writing code is cheap, because we've collectively invested trillions of dollars and redirected entire supply chains into automating code generation. The externalities will be paid for generations to come by all of humanity; it's just not reflected in your Claude subscription.
GP is not totally ignoring the post he replied to: we have open models that are basically 6 months behind closed SOTA models, that we can run in the cloud, and we know exactly how much these cost to run.
The cat is out of the bag: compute will keep getting cheaper, as it has for 60 years or so.
It's always been maintenance that's been the killer and GP is totally right about that.
And if we look at a company like Cloudflare, which went roughly five years without a serious outage and then had five serious outages in the six months since it drank the AI Kool-Aid, we kinda have a first data point on how amazing AI is from a maintenance point of view.
We all know we're generating more lines of underperforming, insecure, probably buggy code than ever before.
Maintaining it is becoming more costly. The increasing burden of review on FOSS maintainers is one example. AWS going down because an agent decided to re-write a piece of critical infrastructure is another. We are rapidly creating new kinds of liability.
Unlikely. FOSS is mostly driven by zero-cost maintenance, but AI tools need money to burn. So only a few FOSS projects will receive sponsored tools, and some will definitely refuse to use them for ideological reasons (for example, they could be considered a poison pill from a copyright perspective).
We kind of do? Local models (though not state of the art) set a floor on this.
Even if prices are subsidized now (they are) that doesn't mean they will be more expensive later. e.g. if there's some bubble deflation then hardware, electricity, and talent could all get cheaper.
Or, and bear with me here, there is a problem even if you aren't experiencing it.
I've been using Spotlight since it was introduced for... everything.
In Tahoe it has been absolutely terrible. Unusable.
Always indexing.
Never showing me applications which is the main thing I use it for (yes, it is configured to show applications!).
They broke something.
Not seeing complaints doesn't mean they don't exist.
Not to mention the UI latency common in Electron apps, which is just a low-level constant annoyance.
Now, I was at Red Hat at the time, in the BU that built podman, and Docker was just largely refusing any of Red Hat's patches around rootless operation; this was one of the top 3, if not the top, motivations for Red Hat spinning up podman.
You'd have to point me to those PRs; I don't recall anything specifically around rootless.
I recall a lot of things like a `--systemd` flag to `docker run`, and just general things that reduce container security to make systemd fit in.