Today
Les Barclays asks Who Captures the Value When AI Inference Becomes Cheap?:
Enterprise AI chatbots and platforms are currently marginally useful but largely disappointing because they fall short of reliably delivering true business transformation. The issue is that they don't sufficiently understand specific business processes, and the wider business transformation required to enable AI takes time. Based on what I’ve written, I’m led to believe that the larger direction of AI will be single-subject applications built by startups.
As subsidies end and true costs surface, AI services will become more expensive before eventually being commoditised. The current race to the bottom on simple tasks will continue, but complex reasoning and giant contexts will carry premium prices that reflect real compute costs.
A long read, but a really interesting summary of the current state of the industry. One of my goals this year is to better understand the funding landscape around AI adoption.
Memory and continuous learning are perhaps the biggest bottlenecks holding back strong AI, among other things. Current systems are narrowly capable but still brittle. Solving continuous learning and memory seems non-negotiable if the field is going to shift to high-level machine intelligence.
Amen.

