Image created with gemini-3.1-flash-image-preview and claude-opus-4.7. Image prompt: Using the provided Alesso ‘Heroes (We Could Be)’ reference image, preserve exactly the pure white landscape field, the galaxy-punchout Milky Way starfield clipped inside all letterforms, the precise vertical type stack, and the font contrast between bold condensed grotesque and light geometric sans, but replace ‘HEROES’ with ‘NVIDIA’ in the same condensed-grotesque galaxy-punchout headline, keep ‘(we could be)’ unchanged beneath it, replace ‘ALESSO’ with ‘SILICON FORGE’ in the light geometric all-caps galaxy treatment, keep ‘FEATURING.’ unchanged, and replace ‘TOVE LO’ with ‘CUDA’ in the condensed-grotesque galaxy-punchout style. Maintain identical letter tracking, generous white margins, and landscape aspect ratio with no illustrations or decorative elements.
I asked Jensen, why don’t you just become a hyperscaler yourself (rather than funding different neoclouds)?
https://x.com/dwarkesh_sp/status/2044868433381073000
Jensen on the famous story about Larry Ellison and Elon Musk begging him for GPUs over dinner: “That never happened. We absolutely had dinner, and it was a wonderful dinner. At no time did they beg for GPUs. They just had to place an order.” Jensen says Nvidia’s allocation
https://x.com/dwarkesh_sp/status/2044989230112506351
Nvidia has locked up many years of scarce components – almost a hundred billion dollars in purchase commitments. Is this Nvidia’s big moat? A competitor might design a great accelerator, but they don’t have Jensen’s LTAs with SK Hynix, TSMC, etc. Jensen: “If our next several
https://x.com/dwarkesh_sp/status/2044808033411223559
The most viral moment of the @dwarkesh_sp x Jensen Huang conversation was widely misunderstood. It wasn’t really just about China, TPUs, or even Nvidia’s moat. It was about a much deeper disagreement over what it means for America to “win” in AI
https://x.com/TheTuringPost/status/2046366547619270665
Also, somehow everyone missed that Jensen Huang all but called Dario Amodei’s mindset a loser’s mindset
https://x.com/TheTuringPost/status/2046585887400604116
🚀SonicMoE🚀 now runs at peak throughput on NVIDIA Blackwell GPUs 😃 54% & 35% higher fwd/bwd TFLOPS than the DeepGEMM baseline and 21% higher fwd TFLOPS than the official Triton example. SonicMoE still maintains its minimum activation memory footprint: the same as a dense model
https://x.com/WentaoGuo7/status/2047007230847766951
NVIDIA Isaac GR00T N1.7 early-access is here – Open, commercially licensed 3B-parameter VLA model for humanoid robots – Action Cascade architecture (VLM reasoning + DiT motor control) – Trained on 20k+ hours of human egocentric video – Boosts dexterous finger-level manipulation
https://x.com/TheHumanoidHub/status/2045235958451421378
Nvidia backs AI company Vast Data at $30 billion valuation
https://www.cnbc.com/2026/04/22/nvidia-backs-ai-company-vast-data.html
Building a Fast Multilingual OCR Model with Synthetic Data
https://huggingface.co/blog/nvidia/nemotron-ocr-v2
What I learned this week: – Pretraining parallelisms – Can distillation be stopped – Mythos and the cybersecurity equilibrium – Pipeline RL – On why pretraining runs fail At the end of my conversation with @michael_nielsen, we talked about how to actually retain what you
https://x.com/dwarkesh_sp/status/2044793688279371982
Transformers are not the end game. AI still needs a breakthrough. I talked to @FidlerSanja, VP of AI Research at NVIDIA, who leads the company’s Spatial Intelligence Lab, and she explains why ↓ And you should definitely watch our full conversation to understand where AI is heading
https://x.com/TheTuringPost/status/2046016440529248431
GPT-5.5 was designed for and trained on Nvidia GB200/300. The model itself helped in the deployment and improvement of the inference stack
https://x.com/scaling01/status/2047377992016384068
Holy crap, NVIDIA just made it drastically easier to create large scale explorable 3d worlds. No manual stitching of smaller 3d generations like other 3d models. Lyra 2.0 looks pretty damn impressive.
https://x.com/bilawalsidhu/status/2044681790195912972