OpenClaw is the hottest application in artificial intelligence right now, drawing strong interest from developers and AI enthusiasts looking for practical, locally run agent systems. The project has attracted attention for its approach to autonomous assistance: a control plane that runs on mini PCs, home servers, and other hardware, and that can use local AI compute alongside cloud compute to work on complex tasks. OpenClaw has a lot of momentum right now, and instead of building a big company around it, its founder is moving to OpenAI.
Peter Steinberger Goes to OpenAI
This weekend, both Peter Steinberger and Sam Altman confirmed that Steinberger is joining OpenAI. OpenClaw itself is expected to transition to a foundation while remaining open source and independent.
I’m joining @OpenAI to bring agents to everyone. @OpenClaw is becoming a foundation: open, independent, and just getting started. https://t.co/XOc7X4jOxq
— Peter Steinberger (@steipete) February 15, 2026
Normally, we do not cover folks changing jobs, but this one felt like it was going to have a big impact. If you have been away for a few days, the rush has pushed the roughly $10,000 Apple Mac Studio M3 Ultra 512GB out to a 50+ day lead time, and Mac Minis are selling out at retail locations. All of this is because OpenClaw is becoming extremely popular and there is a Mac .app that you can use.
For our STH readers, Project TinyMiniMicro nodes, or simply an Ubuntu VM running in Proxmox VE or another hosting platform, will work as well. The rush to Apple Mac platforms seems to be driven by the unified memory for running local models, plus the ease of getting Apple products up and running.
Final Words
This feels like a big win for OpenAI and perhaps a big loss for Anthropic. Overall, OpenClaw presents many security threat vectors, but it is also neat to see it run longer-duration tasks.
Also, if you can, MiniMax M2.5 is a really good model that you can run locally, though you probably want 256GB or more of memory to do so. Over the weekend, Patrick showed the brand-new MiniMax M2.5 model running as a back end using 8-bit MLX on a Mac Studio M3 Ultra.
This is MiniMax-M2.5 MLX running in LM Studio on an Apple Mac Studio M3 Ultra 512GB. Fast enough out of the box for hosting OpenClaw, n8n workflows, and Open WebUI for the team. pic.twitter.com/A9NoS9JMBR
— Patrick J Kennedy (@Patrick1Kennedy) February 14, 2026
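For readers who want to try something similar on Apple Silicon, the mlx-lm Python package is one way to run an MLX-quantized model outside of LM Studio. The sketch below is a minimal example rather than Patrick's exact setup, and the 8-bit MiniMax-M2.5 repository name is a placeholder assumption.

```python
# Minimal sketch: run an MLX-quantized model locally on Apple Silicon.
# Requires Apple Silicon and: pip install mlx-lm
from mlx_lm import load, generate

# Placeholder model ID; the exact 8-bit MiniMax-M2.5 MLX conversion name is an assumption.
MODEL_ID = "mlx-community/MiniMax-M2.5-8bit"

# Download (or load from the local cache) the weights and tokenizer.
model, tokenizer = load(MODEL_ID)

prompt = "Explain why local models pair well with agent frameworks."

# Generate a short completion; max_tokens keeps the demo quick.
print(generate(model, tokenizer, prompt=prompt, max_tokens=256))
```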
It is a big upgrade over gpt-oss-120b. OpenClaw also works great with frontier models, and you can hook it up to local models like MiniMax-M2.5 or Kimi K2.5 as well as higher-end models via OpenRouter; a rough sketch of what that looks like is below. OpenClaw will blow through tens of millions of tokens very quickly, so local AI is a good way to lower costs. We are covering this because a lot of folks ask how AI agents become useful. This is a great example of technology becoming more accessible, and there is a lot of interest in the project, so it is a good one to play with on a holiday like today.
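As a rough illustration of the local-versus-hosted split, both LM Studio's local server and OpenRouter expose OpenAI-compatible endpoints, so the same client code can target either one. This is a minimal sketch under that assumption, not OpenClaw's actual configuration, and the model identifiers are illustrative placeholders.

```python
# Minimal sketch: one OpenAI-compatible client, two back ends.
# This is NOT OpenClaw's configuration format; it only shows the general pattern.
# Requires: pip install openai
import os
from openai import OpenAI

# Local back end: LM Studio serves an OpenAI-compatible API on port 1234 by default.
# The API key is ignored locally, but the client still requires a value.
local = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

# Hosted back end: OpenRouter fronts many frontier models behind one API.
hosted = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

def ask(client: OpenAI, model: str, prompt: str) -> str:
    """Send a single chat turn and return the reply text."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# Model names below are illustrative placeholders, not confirmed identifiers.
print(ask(local, "minimax-m2.5-mlx-8bit", "Outline the steps to triage a bug report."))
print(ask(hosted, "anthropic/claude-sonnet-4.5", "Outline the steps to triage a bug report."))
```

The idea is that a cheap, always-on local endpoint can handle the bulk of the token burn, while a hosted endpoint is reserved for the harder steps.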