
NVIDIA’s car can explain itself

PLUS: Boston Dynamics reveals new Atlas

Together with

Howdy, it’s Barsee.

Happy Tuesday, AI family, and welcome to another AI Valley edition.

Today’s climb through the Valley reveals:

  • NVIDIA’s new self-driving AI can explain its decisions

  • Boston Dynamics reveals Electric Atlas

  • Amazon launches Alexa+ on the web to rival ChatGPT

  • OpenAI’s “Project Gumdrop”

  • Plus trending AI tools, posts, and resources

Let’s dive into the Valley of AI…

MINIMAX

Courtesy: MiniMax

MiniMax-M2.1 delivers SOTA performance on major coding benchmarks (SWE, VIBE, Multi-SWE), outperforming Gemini 3 Pro and Claude Sonnet 4.5.

Built on a 10B active / 230B total MoE architecture, it offers faster inference, easier deployment, and even the ability to run locally.

On Code Arena, M2.1 ranks #1 among open-source models and #6 overall, performing close to Gemini 3 Pro while surpassing GPT-5.2.

For builders shipping production apps, agents, and tools, M2.1 is designed to be strong, fast, cost-efficient, and deployable.

*This is sponsored

THROUGH THE VALLEY

Courtesy: NVIDIA

At CES 2026, NVIDIA CEO Jensen Huang said “the ChatGPT moment for physical AI is here,” unveiling Alpamayo, a new autonomous driving system that reasons about what it sees instead of just reacting. Unlike traditional self-driving cars, Alpamayo explains its decisions, walks through unusual situations, and plans actions step by step. The first car to use it will be the Mercedes-Benz CLA, launching in Europe in early 2026 with a backup safety system running alongside it. NVIDIA also announced new AI supercomputers, faster speech models, humanoid robot software, and physical AI tools already being used by companies like Uber and Hitachi.

Boston Dynamics has unveiled a new version of its humanoid robot Atlas, which Hyundai plans to deploy in its factories starting in 2028. The robot is fully electric, stands 6 feet 2 inches tall, and can lift up to 110 pounds, with hands that use tactile sensing to adjust grip in real time. Atlas can work in temperatures from −4°F to 104°F and run for about four hours on a swappable battery. Hyundai plans to deploy the robots at plants like its Savannah, Georgia facility, with a long-term goal of producing around 30,000 Atlas units per year.

Courtesy: Amazon

Amazon has launched Alexa.com, a browser-based version of its new AI-powered Alexa+ assistant, putting it in direct competition with chatbots like ChatGPT, Gemini, Claude, and Grok. Early Access users can now use Alexa+ on the web for research, writing, and planning, marking its first major step beyond Echo devices. Alexa+ also supports agent-style actions through partners like Expedia, Yelp, Angi, Square, Uber, and OpenTable. Amazon says usage has jumped, with shopping and cooking interactions up 3 to 5 times. The Alexa mobile app is also shifting to a chatbot-first design, making conversation the core experience.

Concept render by Ventuals on 𝕏.

OpenAI is preparing its first consumer AI device, known internally as Project Gumdrop, with Foxconn set to handle production in Vietnam or the US. The device, expected in 2026 or 2027, is still in design and may take the form of a smart pen or small audio device with a microphone and camera that sends handwritten notes directly to ChatGPT. At the same time, OpenAI is expanding its UAE Stargate data center, aiming for 200MW of AI compute by late 2026. Separately, co-founder Greg Brockman donated $25 million to Trump’s MAGA Inc., highlighting growing ties between AI leaders and politics.

A Google engineer said Anthropic’s Claude Code built a working distributed agent system in about an hour, a task her team had spent over a year developing. With minimal prompts, Claude produced a usable prototype, showing how fast AI-assisted coding has advanced from simple code completion to full system orchestration. At the same time, Google’s Gemini 3.0 Pro decoded handwritten notes from a 1493 Nuremberg Chronicle, combining handwriting analysis, historical context, and biblical timelines. Aside from a few math errors, the interpretation was accurate, highlighting how modern models can unlock complex historical material at scale.

Courtesy: Neuralink

Elon Musk says Neuralink is moving faster toward large-scale production, with plans to ramp up output in 2026 and automate much of its brain implant surgery process. A key design change now allows the implant’s electrode threads to pass through the brain’s protective layer without removing it. Neuralink’s device, which Musk calls a “Fitbit in your skull,” lets paralyzed patients control computers using their thoughts through a chip and over 1,000 electrodes. After FDA approval, the company raised $650 million and has implanted the device in 12 patients so far, including quadriplegic patient Noland Arbaugh.

TRENDING TOOLS

  • Radial > Create macOS shortcuts you control with simple gestures

  • Talo > Real-time voice translation for calls, live events, and streaming broadcasts

  • Z.ai GLM-4.7 > A new model from a Chinese AI startup built for long, multi-step tasks in production environments

  • /agent by Firecrawl > An API that navigates complex websites and automatically extracts structured data

THINK PIECES / BRAIN BOOST

THE VALLEY GEMS

What’s trending on social today:

THAT’S ALL FOR TODAY

Thank you for reading today’s edition. That’s all for today.

💡 Help me get better and suggest new ideas at [email protected] or @heyBarsee

👍️ New reader? Subscribe here

Thanks for being here.

REACH 100K+ READERS

Acquire new customers and drive revenue by partnering with us

Sponsor AI Valley and reach over 100,000 entrepreneurs, founders, software engineers, and investors.

If you’re interested in sponsoring us, email [email protected] with the subject “AI Valley Ads”.