

This is spot on.
And on the apparently contradictory “AGI” development, I’ll add that Big Tech kingpins like Altman/Zuck/Musk are hyping that while doing the exact opposite. They talk it up, then turn around, cut fundamental research, and focus on scaling plain transformers up, or even on making AI ‘products’ like Zuck is. And the models they deploy are extremely conservative, architecturally.
Going on public statements from all three, I think their technical knowledge is pretty poor and they may be drinking some of their own kool-aid. They might believe scaling up transformers infinitely will somehow birth more generalist ‘in the sky’ intelligence, when in reality it’s fundamentally a tool like you describe, no matter how big it is.
It’s so ridiculous that many projects don’t even support pip+venv (much less system Python packages. shivers). They literally check whether that’s what you’re trying to do and pre-emptively fail.
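For context, the “unsupported” workflow in question is just a few standard commands (a minimal sketch; the editable install assumes the project ships a `pyproject.toml` or `setup.py` at all):

```shell
# The boring, standard workflow many projects pre-emptively reject:
python3 -m venv .venv       # create an isolated environment
. .venv/bin/activate        # activate it
pip install -e .            # editable install of the project itself
```

That’s it. No Docker, no third-party manager, nothing exotic.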
Some projects become impossible to run in any combination because some dependency (looking at you, sentencepiece) no longer works outside a narrow set of circumstances, unless you hand-build some obscure GitHub PR, disable all the dependency checks, and cross your fingers and hope the wheel builds.
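The “hand-build the PR, disable the checks” incantation usually looks something like this. This is a hypothetical sketch: `<user>/<repo>@<pr-branch>` is a placeholder, not the actual PR, and both flags only suppress pip’s safety nets rather than fix anything:

```shell
# Hypothetical workaround: install a dependency straight from an unmerged
# PR branch, skipping dependency resolution AND build isolation.
# <user>/<repo>@<pr-branch> is a placeholder -- substitute the real PR.
pip install --no-deps --no-build-isolation \
    "git+https://github.com/<user>/<repo>@<pr-branch>"
```

And then you cross your fingers, because with `--no-build-isolation` the wheel builds against whatever toolchain happens to be on your machine.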
And I’m sorry, but I don’t have 200GB of space, tons of spare RAM, and an intricate Docker passthrough setup for every random Python project. I already have like four different third-party managers (conda/mamba, uv, poetry… what’s the other one? Ugh)