X Watchlist: AI/Local Inference Leaders
March 22, 2026
Source: https://x.com/0xSero/status/2035064089345478658
Captured: 2026-03-21
Accounts
| # | Handle | Name/Role | Why They Matter |
|---|---|---|---|
| 1 | @karpathy | Andrej Karpathy | Best AI thought leader. Nanochat = best intro to training LLMs. |
| 2 | @steipete | Peter Steinberger | OpenClaw creator. Peekaboo, summarize.sh, oracle. GitHub is a treasure. |
| 3 | @badlogicgames | Mario (badlogic) | Mario's Pi = the simplest, best open-source agentic loop. Setting new open-source standards. |
| 4 | @TheAhmadOsman | Ahmad Osman | GPU king. Giveaways, dense educational content on self-hosting/home inference. Connected to open weight labs, does interviews. |
| 5 | @sudoingX | sudoingX | Up-and-comer pushing the limits of single-GPU inference. |
| 6 | @Ex0byt | Ex0byt | Will be fundamental in making local inference on massive models possible. |
| 7 | @alexinexxx | alexinexxx | GPU kernel programming. Hard worker, learning in public. |
| 8 | @gospaceport | gospaceport | Hardware/homelab expert. Teaches hardware economics. Most impressive homelabs. |
| 9 | @alexocheema | Alex Cheema | Founder of EXO Labs. Pioneering inference on Apple hardware. Mac mini/Studio expert. |
| 10 | @nummanali | nummanali | Prolific CLI tool builder: LLM subscription budget tracking, running Claude Code with alternative models. |
| 11 | @thdxr | Dax (Opencode) | Opencode team. Good writer. Anti-doomer content. |
| 12 | @juliarturc | Julia Turc | LLM compression science. Best source for learning compression techniques. |
| 13 | @Teknium | Teknium | Nous Research & Prime Intellect. Hard-working, principled open-source teams. |
| 14 | @victormustar | Victor Mustar | Head of Product at Hugging Face. Enables publishing/sharing. |
| 15 | @louszbd | louszbd | Head of community at ZAI. Top open-weight LLMs. Supercharged the movement. |
| 16 | @SkylerMiao7 | Skyler Miao | Making frontier intelligence fit on $10K hardware via MiniMax. |
| 17 | @crystalsssup | crystalsssup | Building the best open-weight model. Releasing research ahead of their next-gen model. |
| 18 | @0xSero | 0xSero | The curator. Local inference community builder. |