Image created with OpenAI GPT-Image-1. Image prompt: over-the-top 1990s pro-wrestling promo poster, wind-tunnel runway featuring “Mistral Mauler” with billowing cape of swirling code gusts; cyclone spotlights, grainy print texture, vivid neon titles

We now have audited data on water consumption for AI. Over the 18-month lifespan of Mistral Large 2, a 123B model, all water usage (chats, training, hardware, and data centers included) came to as much water as 678 US households use in a year. Each additional query uses 45 mL. (Fixed) https://x.com/emollick/status/1947782699948675528
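To put the 45 mL marginal figure in more familiar terms, a quick back-of-the-envelope calculation (using only the per-query number cited above, not the lifetime totals) converts it into queries per liter:

```python
ML_PER_QUERY = 45  # marginal water use per query, per the Mistral audit cited above

# How many queries consume roughly one liter of water?
queries_per_liter = 1000 / ML_PER_QUERY

# Total water for one million queries, in liters.
liters_per_million_queries = ML_PER_QUERY * 1_000_000 / 1000

print(f"~{queries_per_liter:.1f} queries per liter")        # ~22.2
print(f"1M queries ≈ {liters_per_million_queries:,.0f} L")  # 45,000 L
```

The same arithmetic underlies the household comparisons in the visualization experiment below: the framing changes, but the numbers don't.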

I gave Claude the Mistral report on its AI’s environmental impact and the prompt: “visualize this in two different ways, one that makes the numbers appear positive, one that makes them seem negative, using vivid comparisons” (I then had it do some error checking & corrections) https://x.com/emollick/status/1948090558309613587

Mistral started it, DeepSeek scaled it, Kimi K2 confirmed it: it’s always more convenient to train an MoE https://x.com/hkproj/status/1947571673021993152

RT @MistralAI: In our continued commitment to open-science, we are releasing the Voxtral Technical Report: https://x.com/andrew_n_carr/status/1947779499032285386
