Image created with gemini-3.1-flash-image-preview (prompt written by claude-opus-4.7). Image prompt: High-end product photo of a classic cherry-dipped vanilla soft-serve cone with a perfect curl, wrapped in a custom paper sleeve printed with a hand-drawn small-town map and bold vintage diner lettering spelling ‘LOCAL’, sitting on a checkered red-and-white counter with a tiny ’75 — Milford, DE’ enamel pin clipped to the sleeve, soft directional studio light, glossy macro detail, shallow depth of field, landscape composition.
You can now run DeepSeek4-Flash on a 256GB Mac. Next up: speed 🚀 PR:
https://x.com/Prince_Canuma/status/2047685898163147125
A completely local agent that lives right inside your browser. Powered by Gemma 4 E2B and WebGPU, it uses native tool calling to: 🔍 Search browsing history 📄 Read and summarize pages 🔗 Manage tabs 100% local. No servers needed!
https://x.com/googlegemma/status/2048805789788413984
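The three capabilities in the tweet above (search history, summarize pages, manage tabs) map naturally onto native tool calling: the model is handed a list of function declarations and replies with structured calls instead of prose. A minimal sketch of what such declarations might look like in the common OpenAI-style JSON shape; the tool names and parameters here are illustrative assumptions, not the extension's actual API:

```python
# Sketch of OpenAI-style tool declarations for a local browser agent.
# Tool names and parameter schemas are hypothetical examples.

TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "search_history",
            "description": "Search the user's browsing history by keyword.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "summarize_page",
            "description": "Read the page in the given tab and return a summary.",
            "parameters": {
                "type": "object",
                "properties": {"tab_id": {"type": "integer"}},
                "required": ["tab_id"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "close_tab",
            "description": "Close the tab with the given id.",
            "parameters": {
                "type": "object",
                "properties": {"tab_id": {"type": "integer"}},
                "required": ["tab_id"],
            },
        },
    },
]

def tool_names(tools):
    """Return the declared function names, as the model would see them."""
    return [t["function"]["name"] for t in tools]
```

At inference time the full `TOOLS` list is passed alongside the chat messages, and the runtime executes whichever declared function the model selects — all of which can happen entirely in the browser when the model itself runs on WebGPU.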
Here is how to run a coding agent fully locally on your machine with @googlegemma and Pi. – Gemma 4 26B A4B activates 4B parameters per token. – Pi provides four tools: read, write, edit, and bash. – LM Studio runs a server at localhost:1234 by default. – Pi runs YOLO by …
https://x.com/_philschmid/status/2048719354905108623
Learn how to run a local coding agent! Use: – Pi agent – Gemma 4 26B – Serving engine of choice: e.g. LM Studio
https://x.com/googlegemma/status/2049163687639007451
Lately I’ve been having fun with running coding agents fully locally. The setup I landed on is: – Pi agent – Gemma 4 26B A4B – Server of choice: LM Studio/Ollama/llama.cpp I wrote a step-by-step guide with instructions on how to set it up:
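The servers listed above all expose an OpenAI-compatible chat-completions endpoint (LM Studio defaults to localhost:1234; Ollama and llama.cpp's llama-server do the same on their own ports), which is what lets an agent like Pi talk to any of them interchangeably. A minimal sketch using only the Python standard library — the model identifier is an assumption and should match whatever model you actually loaded into the server:

```python
import json
import urllib.request

# LM Studio serves an OpenAI-compatible API at localhost:1234 by default;
# Ollama (11434) and llama.cpp's llama-server (8080) expose the same shape.
BASE_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt, model="gemma-4-26b-a4b"):
    """Build a standard chat-completions payload for a local server.

    The model name here is illustrative; use the identifier your
    server reports for the model you loaded.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a local coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.2,
    }

def ask_local(prompt):
    """POST the request to the local server and return the reply text."""
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        BASE_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the request shape is the standard one, switching from LM Studio to Ollama or llama.cpp only means changing `BASE_URL` — nothing else in the agent has to know which server is behind it.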
https://x.com/patloeber/status/2048715918541558075
Totally offline agents are possible!
https://x.com/Teknium/status/2048975223853350976
Vibe code without internet 🚀 I built a vibe coding app powered by Gemma 4, running fully on-device on Mac with MLX. Pick your model, then chat or build with it. Watch it build the Chrome Dino game offline using Gemma 4 27b. Open sourcing all of it below👇
https://x.com/ammaar/status/2049169134429073471
Sigma: a fully private AI browser that runs agents locally on your machine. – No cloud. – No data leaving your device. – Open source. Qwen, Gemma, Nemotron – all running right in your browser. This is the direction browser AI should go!
https://x.com/kimmonismus/status/2049244932477759767
This is where we are right now. And I’m not gonna lie, it feels pretty magical 🧚♀️ Qwen3.6 27B running inside the Pi coding agent via llama.cpp on a MacBook Pro. For non-trivial tasks on the @huggingface codebases, this feels very, very close to hitting the latest Opus in Claude
https://x.com/julien_c/status/2047647522173104145