another AI question

Craig MacGregor cmacgreg at gmail.com
Tue Apr 7 23:09:45 EDT 2026


> I'm curious what others think "AI apocalypse" means. Is it a concern,
> and if so, what does it mean in practice?

I think the AI companies are panicking because the coding use case is 
growing in popularity, and has paying customers... but they are losing 
money on both training and inference, and at the same time, local models 
are catching up quickly. The window of time that anybody has the "best" 
models is short, and technical users are also the ones that will use the 
most resources (openclaw and junk like that), run local models, and jump 
from provider to provider, depending on the price that month... the 
other uses, like cheating on homework, reflecting the user's neuroses 
back to them, and generating slop images/video are unprofitable at best, 
and fraught with so many social/legal issues... and every other awful 
use case similar to chatbots is already essentially a commodity, too (I 
just got an email for the "Wegmans AI Assistant" as I am writing this, 
haha). The fact that they can now quantify who will pay for their 
services and how much is why the sky is falling... they're not going to 
be able to replace every job with AI, at most it's a few hundred dollars 
per developer, per month. Every non-developer use of ChatGPT and Copilot 
that I've seen or heard of seems like a waste of time and money (OK 
maybe image/audio/video generation isn't useless, but it looks awful and 
is mostly harmful).

I figure OpenAI and Oracle will be hit the hardest (the Ellisons seem to 
have other interests these days, maybe they see the writing on the wall, 
too). Nvidia will probably be OK, Anthropic has the most to gain (right 
now anyway). Microsoft will probably also be hit by OpenAI/Oracle 
fallout, but the damage will likely be minimized (and they will probably 
absorb OpenAI).

Regarding BSDs and other free software... I think forking is going to 
become a lot more common. It's a lot easier to fork than to submit a 
patch when you've had Claude hacking away at some project for a few hours, or 
there is some sort of disagreement. So there's likely to be a glut of 
garbage free software projects (not that this is really anything new)... 
AI-generated "bug fixes" are certainly already a well-known issue. I 
think the for-profit open-source companies are cooked, as they say; it's 
hard to justify paying for extended features when you can just have 
Claude extend software to mimic the paid features... their entire 
business model has to be "AI can't be trusted, pay us instead".

-craig


More information about the talk mailing list