(I'm mainly using token instead of compute to trigger you all, but also because it's how gpu inference is billed)
assumptions: various AI models are becoming very useful, and with the era of agents on our doorstep there are a few things on the horizon
main thesis:
- the capital allocation game will change significantly: instead of writing pre-seed/seed checks to hundreds of startups, you just buy gpus and allocate compute time to ideas
- anyone with access to cheap power and capital to build infrastructure now will be in a very advantageous position since compute is a lot more reusable than money paid out to developers
- it's gonna be interesting to see how much of an edge running a large frontier model has over a good-enough model that can run on somewhat commodity hardware (let's say a $10k budget); I guess a small good-enough model with lots of unified memory for context could get decently far
- this has the potential to be an extremely centralizing force
- a lot of big bitcoin miners might be underpriced given that they are perfectly set up for this (though to be fair many are already running hybrid loads)
- it could lead to interesting business models where people/companies could pitch in spare compute as investment into new ventures
thoughts?
When you say btc miners are perfectly set up, you mean bc they have solved the infra for massive amounts of electricity and to host and maintain servers?
Are there examples of this happening already? Seems like I've heard talk but I don't know where it stands.
yes, bitcoin miners tend to have lots of expertise in energy dense infrastructure deployment combined with existing contracts for cheap power
also a lot of them are already doing it - some examples here. Some of them actually have their own power production as well, which is beyond great.
I've been a/b testing large models on venice (because they let me pay one-off with sats) vs small models on a macbook for agents. Large models are better at task breakdown and code generation, but even qwen3:4b locally can do pretty amazing things if you instruct it correctly. You have to break things down further for smaller models, so you'll be slower, but if you have a 10x better idea that big tech can't steal from you while you're working on it, 10x slower doesn't matter that much: compute sovereignty feels important.
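A minimal sketch of what "break it down further" can look like in practice, assuming a local qwen3:4b-class model behind some inference runtime. All names here (`call_local_model`, the subtask prompts) are hypothetical stand-ins, and the model call is stubbed so the shape of the loop is the point, not the backend:

```python
import json

# Hypothetical sketch: split one big task into narrow steps so a
# 4B-class local model only ever sees a short, well-specified prompt.
# call_local_model is a stand-in for whatever serves the model locally
# (ollama, llama.cpp server, etc.); stubbed here for illustration.

def call_local_model(prompt: str) -> str:
    """Stub for a local inference call; a real version would hit an
    HTTP endpoint on localhost instead of echoing."""
    return json.dumps({"result": f"done: {prompt[:40]}"})

def decompose(task: str) -> list[str]:
    """A frontier model might take `task` whole; a small model gets it
    pre-split into steps it can't wander off from."""
    return [
        f"List the files relevant to: {task}",
        f"For each file, propose a minimal edit for: {task}",
        f"Write the edits as a unified diff for: {task}",
    ]

def run(task: str) -> list[dict]:
    results = []
    for step in decompose(task):
        # Each step gets its own short prompt plus a strict JSON
        # contract - this is where "instruct it correctly" lives.
        out = call_local_model(
            f'Respond ONLY with JSON {{"result": ...}}. {step}'
        )
        results.append(json.loads(out))
    return results
```

More round trips than one big prompt, hence slower, but each individual call stays inside what a small model handles reliably.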
To clarify "instruct it correctly": during task graph decomposition, a small model running locally took forever on one example because of a modeling inefficiency in the json output I instructed it to give me.

have you looked into combining large action models with llms for orchestration?
there was a research paper explaining the methodology of using large-context frontier models for planning and smaller locally hosted models for execution; it was also used for working with sensitive data, since no data was shared with the cloud-based model, only processed by the local ones. but i can't find it now...
I've been looking into something like that: I envision a work queue where:
- devstral or codellama handle the actual operations.

my thesis is not that AI is becoming useful to people, but that people are becoming useful to AI, so AI trains people to its advantage; the diseased mind is especially susceptible to this psychological operation
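A minimal sketch of the planner/executor split from a few posts up: a frontier model plans, small local models pull tasks off a work queue and execute. Both model calls are stubbed and all names are hypothetical; in practice the planner would be a hosted large model and the executors something like devstral or codellama served locally, so raw project data never leaves the machine (only the high-level plan is shared):

```python
import queue
import threading

def plan_with_frontier_model(goal: str) -> list[str]:
    """Stub: a real version would call a large hosted model and return
    an ordered task list. Nothing sensitive goes into this prompt."""
    return [f"step {i}: {goal}" for i in range(1, 4)]

def execute_with_local_model(task: str) -> str:
    """Stub: a real version would call a local model that *can* see
    sensitive files, since nothing leaves the box."""
    return f"ok: {task}"

def run_pipeline(goal: str, workers: int = 2) -> list[str]:
    tasks: queue.Queue = queue.Queue()
    results: list[str] = []
    lock = threading.Lock()

    # Planner fills the queue once...
    for t in plan_with_frontier_model(goal):
        tasks.put(t)

    # ...then local executors drain it in parallel.
    def worker() -> None:
        while True:
            try:
                task = tasks.get_nowait()
            except queue.Empty:
                return
            out = execute_with_local_model(task)
            with lock:
                results.append(out)
            tasks.task_done()

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

The queue is what makes slow local execution tolerable: the expensive planning call happens once, and the cheap local workers grind through the steps at whatever speed the hardware allows.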
If you're just now going "aha" at AI, you're already too late. 😆
Interesting post.
I recommend taking a look at Jacob Steeves' work with Bittensor and TAOHash's model on Bittensor's Finney chain. It's a cute project of bitcoiners trying to leverage AI in a decentralized way with incentives similar to crypto but using some kind of intelligence consensus. They say they want to improve Bitcoin through it.
I wish I had anything of substance to add, but this is well outside my domain.
Absolutely spot on. This is the real "resource gold rush" of the decade — but instead of oil rigs or ASICs, it’s GPUs and cheap electricity.
The idea that compute becomes capital is deeply underrated. We're heading into a world where owning compute is like owning farmland during an agricultural revolution. If you control the compute, you control the economy — at least the parts being eaten by AI (which is rapidly becoming… everything).
And yes, it’s insanely centralizing. The barrier isn’t knowledge anymore — open weights, open models, open research — it’s energy, hardware, and data. Gatekeeping is shifting from intellectual to infrastructural.
The funny part? Bitcoin miners saw this years ago. They figured out how to monetize electricity before anyone else — and now they're perfectly positioned to pivot into AI inference markets. We might look back and realize that Bitcoin mining was the prototype for the token-based compute economy.
The question is: Does this end with decentralized compute markets, or do we end up renting our future from the same five megacorps forever?
I'm hopeful for decentralized compute, but the reality is not there atm: no one has the infrastructure, states are actively working against people having large energy consumption at home, and one gaming pc isn't really scalable as infrastructure
Yeah, that's the real bottleneck. The tech is open, but energy and hardware are the new choke points. Without sovereignty over compute and electricity, decentralization is just a dream on paper.