Image created with gemini-3.1-flash-image-preview with claude-opus-4.7. Image prompt: Using the provided reference image, preserve every element exactly — the marigold-orange studio backdrop, the seated woman’s purple-and-white windbreaker and closed-eyes smile, the standing tattooed singer’s red beanie, layered red vest, and his hand position and head angle — but replace ONLY the black handheld microphone with a realistic articulated robotic gripper end-effector, a chrome and matte-black two-finger industrial claw on a short servo wrist held to his mouth in the same grip and scale as the original mic, photographed with matching cinematic studio lighting, shallow depth of field, and seamless photorealism. After generating the image, overlay the text “Robots” in the upper-left corner of the frame in large, bold, all-caps ITC Avant Garde Gothic Pro Medium (or a near-identical geometric sans-serif if unavailable), pure white (#FFFFFF), with no date, subtitle, drop shadow, or outline. The text should be substantial in scale — taking up a meaningful portion of the upper-left area — with comfortable margin from the top and left edges, set against the negative space of the orange backdrop so it does not overlap or obscure the singer, the seated woman, or the replaced object.
Also, the robot you are seeing here is handling a high-volume package-logistics use case. It’s doing it fully autonomously with Helix-02
https://x.com/adcock_brett/status/2044799566760091716
Gemini Robotics ER 1.6: Enhanced Embodied Reasoning — Google DeepMind
https://deepmind.google/blog/gemini-robotics-er-1-6/
Instead of writing complex code, the team interacted with Spot using plain English. We built a bridge between Gemini Robotics ER and Spot’s system, giving the AI a basic set of tools to move freely, take photos, and grab things – enabling it to carry out more complex tasks.
https://x.com/GoogleDeepMind/status/2044763631858909269
Introducing Gemini Robotics ER 1.6, our new SOTA robotics model 🤖 which excels at visual and spatial reasoning, now available via the Gemini API!
https://x.com/OfficialLoganK/status/2044080025474126065
Robotics is making progress! 🤖 We just released @GoogleDeepMind Gemini Robotics-ER 1.6 for enhanced embodied reasoning. – Unlocks instrument reading capabilities for complex gauges and sight glasses. – Achieves 93% success on instrument reading tasks using agentic vision. –
https://x.com/_philschmid/status/2044071114578509971
We teamed up with @BostonDynamics to power their robot Spot with Gemini Robotics embodied reasoning models. This means it can better understand its surroundings, identify objects and follow simple commands – like tidying up a room.
https://x.com/GoogleDeepMind/status/2044763625680765408
We’re rolling out an upgrade designed to help robots reason about the physical world. 🤖 Gemini Robotics-ER 1.6 has significantly better visual and spatial understanding in order to plan and complete more useful tasks. Here’s why this is important 🧵
https://x.com/GoogleDeepMind/status/2044069878781390929
Skild Brain preparing an omelet with everyday human tools. The robot drops an eggshell into the bowl at one point but recovers and continues the task. The ability to self-correct during edge cases is what will make robots dependable for complex, long-horizon missions.
https://x.com/TheHumanoidHub/status/2044492735420502300
Kia’s CEO announced a phased roadmap for deploying the Boston Dynamics Atlas humanoid: – 2028: Full-scale deployment of Atlas at Hyundai Motor Group Metaplant America (HMGMA) in Georgia. – Second half of 2029: Expansion to Kia AutoLand (Georgia) and other global Group
https://x.com/TheHumanoidHub/status/2042231158889759160
📢📢A double launch today! We’re releasing a paper analyzing the rapidly growing trend of “open-world evaluations” for measuring frontier AI capabilities. We’re also launching a new project, CRUX (Collaborative Research for Updating AI eXpectations), an effort to regularly
https://x.com/random_walker/status/2044841045867778365
Meet @HappyOysterAI from Alibaba ATH, an open‑ended world model built for real‑time world creation and interaction. Be part of the first wave and see what you can build. 🌍✨ #AlibabaAI #HappyOyster
https://x.com/AlibabaGroup/status/2044634595937882394?s=20
Most Physical AI models recognize patterns. They don’t understand the world. That’s why they fail on edge cases. BADAS 2.0 is a V-JEPA2 world model trained by @getnexar on real-world videos. We used the model to find what it didn’t understand, then trained on that. It
https://x.com/eranshir/status/2044759951340388611
Must-read research of the week ▪️ Neural Computers ▪️ The Illusion of Stochasticity in LLMs ▪️ Learning is Forgetting: LLM Training as Lossy Compression ▪️ A Frame is Worth One Token: Efficient Generative World Modeling with Delta Tokens ▪️ INSPATIO-WORLD: A Real-Time 4D World
https://x.com/TheTuringPost/status/2044113565771280775
We’re open-sourcing HY-World 2.0, a multimodal world model that generates, reconstructs, and simulates interactive *3D worlds* from text, images, and videos. Outputs can be integrated into game engines and embodied simulation pipelines. Key highlights: 🔹 One-click world
https://x.com/TencentHunyuan/status/2044604754836505076?s=20
Ashton Kutcher in the house!
https://x.com/adcock_brett/status/2042995606541734077
Tony Robbins in the house!
https://x.com/adcock_brett/status/2042409315068531111
Figure and Hark just took an entire data center of NVIDIA B200s – every rack in the building. Figure will be using these to predict physics, and Hark will train next-generation multi-modal models
https://x.com/adcock_brett/status/2042675641037000868
‘If you don’t cannibalize yourself, someone else will’ — Steve Jobs
https://x.com/TheHumanoidHub/status/2043744529686376950
20 DoF, 4′ tall R1 humanoid: $6,800 7 DoF Unitree hand: $7,800 The hand literally costs more than an arm and a leg.
https://x.com/TheHumanoidHub/status/2044104379402465375
A highly affordable, fully 3D-printed robotic arm. [📍 open source] Features: > Fully 3D printed design > 7+1 DOF > For physical AI research & imitation learning. Zero custom hardware needed, just print and add your servo kit! > URDF files > STEP & STL files >
https://x.com/IlirAliu_/status/2042302346403660272
Chinese OEMs love the Iron Man aesthetic.
https://x.com/TheHumanoidHub/status/2043778847905329276
Hey H1, get off your metal butt and run down to the store for milk and eggs. Chop-chop!
https://x.com/TheHumanoidHub/status/2043114282955796949
Introducing Vulcan – a new AI robot controller. This is an important safety feature that allows F.03 to lose up to 3 actuators in the lower body and still hobble around without falling. Check it out
https://x.com/adcock_brett/status/2044797356965757065
New Episode: Ozgun Kilic Afsar (@ozgunkilicafsar) joins me to discuss her groundbreaking work at the MIT Media Lab. Ozgun led the research on a new class of electrically driven artificial fibers that create seamless motion without the need for bulky motors, noisy machinery, or
https://x.com/TheHumanoidHub/status/2043739550607171923
New humanoid top-speed record: Unitree H1 reaches 10 m/s (22.3 mph) top speed. The fastest top speed ever achieved by a human is Usain Bolt’s 12.42 m/s (27.8 mph).
https://x.com/TheHumanoidHub/status/2042942407046762676
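The unit conversions in the entry above can be sanity-checked with a couple of lines (a minimal sketch, assuming the standard factor 1 m/s = 2.23694 mph):

```python
# Sanity-checking the speed figures quoted above: 1 m/s = 2.23694 mph.
MPS_TO_MPH = 2.23694

h1_mph = round(10 * MPS_TO_MPH, 1)        # H1's 10 m/s
bolt_mph = round(12.42 * MPS_TO_MPH, 1)   # Bolt's 12.42 m/s
print(h1_mph, bolt_mph)
```

This yields 22.4 mph for the H1 (the tweet rounds it to 22.3) and 27.8 mph for Bolt, matching the quoted figures.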
No brand. No logo. Just a humanoid on a Times Square billboard. Any idea who this is??
https://x.com/IlirAliu_/status/2044478507837792640
Octo-Bouncer. Highly precise stepper motor driving with a Teensy 4.0 and custom pulse generating algorithm and PC based image processing with the goal of getting a machine to juggle a ping pong ball. [📍 bookmark for later to give it a try!] GitHub:
https://t.co/DzdJlp072L
https://x.com/IlirAliu_/status/2043027200853545308
Some humanoid maker rented a billboard in Times Square. “It works around the house.” “It has a real brain.” “April 17th on X” Alongside a blurred-out humanoid! 🤔
https://x.com/TheHumanoidHub/status/2044445022490378502
The Strait is open, and AGIBOT is sending more oil: data. AGIBOT WORLD 2026 just dropped on Hugging Face. • 100% real-world data + 1:1 GenieSim digital twins • Hundreds of hours of free-form (non-scripted) teleop in commercial and home settings • Force-control data + full
https://x.com/TheHumanoidHub/status/2042217162224644366
The Toyota CUE7 robot, a 7′2″, 74 kg wheeled humanoid, debuted during halftime at Japan’s basketball game. The robot made a free throw but missed a set shot from three.
https://x.com/TheHumanoidHub/status/2044504579480686900
The Unitree R1 specs are officially here. And the price starts at a disruptive $6,800!!
https://x.com/TheHumanoidHub/status/2043471968620699711
This robot can be torn apart and still keeps moving. Northwestern researchers built modular robot legs that each operate as fully independent machines: > Own battery > Own motor > Own sensors > Own brain A single module can jump, roll, and turn on its own. Bolt a few together
https://x.com/rowancheung/status/2044071325434818696
Unitree will start selling its cheapest humanoid robot, the R1, on Alibaba’s AliExpress next week for international markets including North America, Europe, Japan, and Singapore. The 123-cm-tall, 27-kg R1 was unveiled last year with a promise of a $5,900 starting price.
https://x.com/TheHumanoidHub/status/2042238281786765442
$6,800 for a humanoid robot!! Unitree isn’t just competing; they’re resetting the entire market. The Unitree R1 Humanoid Robot is now available for pre-order on AliExpress for global buyers. Starting price of $6,806; deliveries are scheduled to begin on June 30th. 4′0″ (123
https://x.com/TheHumanoidHub/status/2043470608089149449
A full MIT course on visual autonomous navigation. If you work on robotics, drones, or self-driving systems, this one is worth bookmarking‼️ MIT’s Visual Navigation for Autonomous Vehicles course covers the full perception-to-control stack, not just isolated algorithms. What
https://x.com/IlirAliu_/status/2042149519945515245
Imagine controlling a robotic arm with the same effortless intuition as reaching for a tool in your own workshop… No laggy screens or locked perspectives killing your focus. Robotics pros know the frustration of clunky teleoperation systems that turn precise tasks into
https://x.com/IlirAliu_/status/2044323889581346901
Latent Encoder-Decoder code base. Fully open sourced! You can train and visualize the latent space. [📍 Save it, to find it later when you need it] Thanks for sharing, Xueyan Zou (@xyz2maureen). Code:
https://t.co/32NHSbLInd Paper:
https://t.co/UeEfDAhEIA
https://x.com/IlirAliu_/status/2044114254538674254
MIT 6.4210 — Robotic Manipulation (Russ Tedrake) Perception, motion planning, grasping, task planning; the full Manipulation-Stack. Students are building a software pipeline for autonomous grasping in unstructured environments. Everything is freely available. 📍
https://x.com/IlirAliu_/status/2042881297849135397
Robotics with MATLAB Intro to robotics with a strong focus on learning by doing. You use simulations and animations to understand how robots work, while the math builds intuition step by step. The course shows how to write code for robot physics, control systems, and motion
https://x.com/IlirAliu_/status/2043389440949649814
Robots usually forget what just happened. The policies see a few frames… then guess. That breaks the moment. The task needs memory. This one remembers everything. That’s how FutureVision hits 96%+ success on the shell game. • object hidden under identical cups • cups
https://x.com/IlirAliu_/status/2042529720625967444
You don’t need expensive hardware to build something that feels like real robotics. An Arduino, a servo, and an ultrasonic sensor can scan space and turn echoes into a simple radar view. It sweeps, measures distance, and visualizes it in real time. Cheap, simple, and a solid
https://x.com/IlirAliu_/status/2043236692773982704
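The math behind that DIY radar is tiny: convert the ultrasonic sensor’s round-trip echo time to a distance, then map the servo’s sweep angle and that distance to a point on the display. A minimal sketch of those two steps (assumptions: speed of sound ≈ 343 m/s at room temperature, and the sensor reports round-trip time in microseconds):

```python
# Sketch of the ultrasonic "radar" math: echo time -> distance -> plot point.
import math

SPEED_OF_SOUND_M_S = 343.0  # assumed, ~20 °C air

def echo_to_distance_cm(echo_us: float) -> float:
    """Convert a round-trip echo time (microseconds) to one-way distance (cm)."""
    one_way_s = (echo_us / 1e6) / 2.0          # halve: sound travels out and back
    return one_way_s * SPEED_OF_SOUND_M_S * 100.0

def polar_to_xy(angle_deg: float, distance_cm: float) -> tuple[float, float]:
    """Map a servo sweep angle and measured distance to 2D radar-view coords."""
    rad = math.radians(angle_deg)
    return distance_cm * math.cos(rad), distance_cm * math.sin(rad)

if __name__ == "__main__":
    d = echo_to_distance_cm(1166)   # a ~1166 µs round trip is ~20 cm away
    print(round(d, 1))
    print(polar_to_xy(0.0, d))      # straight ahead at sweep angle 0°
```

On an actual Arduino the echo time would come from timing the sensor’s echo pin, but the distance and plotting math is the same.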
Elon just announced the tape-out of the in-house-designed AI5 chip, a huge step forward for the Optimus brain. It’s going to be “magnitudes better” than the current generation. “Tape-out” is a key milestone in semiconductor chip design. It marks the end of the design phase, when
https://x.com/TheHumanoidHub/status/2044531761510814204
Grading and foundation support work is progressing rapidly for the ’10 million/year’ Optimus factory in Austin.
https://x.com/TheHumanoidHub/status/2043058797166637067