The AI Rundown by Lightscape Partners - 11/10/25
U.S. Blocks Nvidia’s Blackwell Chips to China, Google Rolls Out Ironwood TPUs, and OpenAI Signs $38B AWS Megadeal
Good morning and welcome back to another edition of The AI Rundown by Lightscape Partners.
Washington just drew a harder line on AI hardware exports, barring Nvidia from selling its upcoming Blackwell chips to China. Blackwell is the U.S.’s most advanced training and inference silicon, so this isn’t a symbolic move: it tightens the choke point on frontier AI compute and pushes Chinese players further toward domestic alternatives.
At the same time, Google unveiled its 7th-gen Ironwood TPUs, a massive inference play built for the era of always-on AI. Anthropic has already committed to using up to 1 million of these chips, on pods with 9,216 TPUs, 1.77 PB of shared memory, and 9.6 Tbps links, because the price/performance math works at enterprise scale.
And OpenAI quietly made one of the largest AI infra commitments to date: a $38 billion, seven-year deal to run on AWS UltraClusters stocked with Nvidia GB200/GB300. That deal gives OpenAI cloud diversification beyond Microsoft, rewards Amazon with a marquee AI tenant, and signals how much compute OpenAI thinks it will actually need by 2026.
Stay tuned as we explore these stories and their implications for the future of AI, technology, and innovation.
If you haven’t yet, please support the newsletter by subscribing!
Hardware
Google unveils 7th-generation Ironwood TPU chips with over 4x the performance of its predecessor to power the age of inference. Link.
Anthropic committed to accessing up to one million Ironwood chips in a deal valued in the tens of billions of dollars.
Ironwood pods interconnect 9,216 TPUs with 1.77 petabytes of shared memory and 9.6 terabit-per-second optical links (see the quick per-chip math after this list).
Google argues end-to-end vertical integration, from silicon through software, yields better cost-performance at cloud scale.
Anthropic cites favorable price-performance and reliability as it scales Claude capacity for enterprise workloads.
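As a rough sanity check on the pod figures above, dividing the shared memory across the chip count works out to roughly 192 GB per TPU. The snippet below is just that back-of-envelope arithmetic under decimal-unit assumptions; the per-chip number is inferred from the pod totals, not quoted from a Google spec sheet.

```python
# Back-of-envelope check on the Ironwood pod figures cited above.
# Assumes decimal units (1 PB = 1,000,000 GB); the per-chip value is an
# inference from the pod totals, not an official per-chip spec.
tpus_per_pod = 9_216
shared_memory_pb = 1.77

shared_memory_gb = shared_memory_pb * 1_000_000        # PB -> GB
hbm_per_chip_gb = shared_memory_gb / tpus_per_pod

print(f"~{hbm_per_chip_gb:.0f} GB of memory per TPU")  # ~192 GB per chip
```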
White House bars Nvidia from selling upcoming Blackwell AI chips to China amid national security concerns. Link.
Officials said America’s most advanced semiconductors should remain stateside, tightening earlier export restrictions.
Blackwell is Nvidia’s top-tier generation for training frontier models and high-end inference systems globally.
Decision follows prior bans on A100 and H100 GPUs and signals further limits on cutting-edge chip exports.
Nvidia pledged compliance while prioritizing shipments to other markets and U.S. customers with strong demand.
Product Launches
Xpeng debuts VLA 2.0 model and a lineup of robotaxi EVs and humanoid robots at its November AI Day. Link.
Vision-Language-Action 2.0 maps vision directly to actions via implicit tokens, bypassing slower language mediation (a conceptual sketch follows this list).
Proprietary Turing chips power in-vehicle inference with roughly 2,250 TOPS for upgraded end-to-end driving.
Three purpose-built robotaxi models arrive in 2026 with 3,000 TOPS and redundant safety systems per vehicle.
Navigation Guided Pilot advances approach Tesla FSD-level capability, and platform licensing begins with Volkswagen as the first customer.
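For readers curious what “maps vision directly to actions via implicit tokens” could look like, here is a minimal, purely illustrative PyTorch sketch. The module names, dimensions, and query scheme are our own assumptions for exposition, not Xpeng’s actual VLA 2.0 design; the point is only that learned action queries attend to visual features and decode straight to control outputs, with no intermediate language step.

```python
# Illustrative only: a hypothetical vision-to-action pipeline in the spirit of
# a VLA-style model. Sizes, names, and the token scheme are assumptions.
import torch
import torch.nn as nn

class VisionToActionSketch(nn.Module):
    def __init__(self, d_model=512, n_action_tokens=8, action_dim=3):
        super().__init__()
        # Vision backbone stand-in: patchify camera frames into feature vectors.
        self.patchify = nn.Conv2d(3, d_model, kernel_size=16, stride=16)
        # Learned "implicit tokens" that query visual features directly,
        # replacing the intermediate language step a chained VLM would use.
        self.action_queries = nn.Parameter(torch.randn(n_action_tokens, d_model))
        self.cross_attn = nn.MultiheadAttention(d_model, num_heads=8, batch_first=True)
        # Decode attended tokens straight to controls (e.g. steer, throttle, brake).
        self.action_head = nn.Linear(d_model, action_dim)

    def forward(self, frames):                                      # (B, 3, H, W)
        feats = self.patchify(frames).flatten(2).transpose(1, 2)    # (B, N, d_model)
        queries = self.action_queries.expand(frames.size(0), -1, -1)
        attended, _ = self.cross_attn(queries, feats, feats)        # (B, T, d_model)
        return self.action_head(attended)                           # (B, T, action_dim)

actions = VisionToActionSketch()(torch.randn(1, 3, 224, 224))
print(actions.shape)  # torch.Size([1, 8, 3])
```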
Enterprise + Consumer AI Applications
Cognizant partners with Anthropic to embed Claude across 350,000 employees and enterprise client workflows. Link.
Claude integrates into platforms to automate documentation, troubleshooting, and customer interactions under human oversight.
Consultancy aims to agentify offerings so AI systems handle multi-step processes with measurable ROI and compliance.
Deal follows Anthropic alliances with Deloitte and IBM, extending reach into large regulated enterprises worldwide.
Approach targets faster delivery, higher productivity, and standardized, trustworthy AI across eleven industry sectors.
Apple nears a deal to pay about $1 billion per year for Google’s Gemini model to power Siri. Link.
Arrangement reportedly brings a 1.2-trillion-parameter model to Siri, enabling more conversational and useful assistant behavior.
Apple keeps web search separate while renting Google’s largest language model to accelerate Siri improvements.
Seven-year pact underscores the urgency to upgrade Siri while internal projects mature toward first-party capabilities.
Neither firm commented; rivals are also overhauling assistants, bringing generative AI to mainstream devices.
Data Centers + Energy
Meta plans roughly $600 billion of U.S. AI data center spending over three years. Link.
Company will front-load capital to erect multiple supercomputing campuses aimed at long-term superintelligence goals.
A $27 billion financing with Blue Owl funds a massive Louisiana site; Texas builds are also underway.
Aggressive capacity buildout ensures readiness for optimistic AI progress scenarios and surging compute requirements.
Plan dwarfs peers’ capex and ties into public-private programs backing domestic AI infrastructure expansion.
Startup Funding & Valuations
Inception raises $50 million to build diffusion-based AI models for code and software. Link.
Seed led by Menlo Ventures with Microsoft M12, Nvidia NVentures, Snowflake, Databricks, Andrew Ng, and Andrej Karpathy.
Diffusion LLMs refine outputs iteratively, promising lower latency and compute costs versus autoregressive generators (a toy comparison follows this list).
Mercury model is already integrated into developer tools such as ProxyAI and Kilo Code for practical use cases.
Team targets large-scale code operations, arguing holistic refinement handles big codebases faster and cheaper.
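To make the latency argument concrete, the toy sketch below contrasts the two decoding styles: an autoregressive decoder spends one forward pass per new token, while a diffusion-style decoder drafts the whole block as mask tokens and refines it in a small, fixed number of parallel passes. The `model` callable, mask token, and step count are illustrative assumptions, not Inception’s Mercury implementation.

```python
import torch

def autoregressive_decode(model, prompt_ids, n_new):
    """Standard LLM decoding: one forward pass per generated token."""
    ids = prompt_ids
    for _ in range(n_new):                               # n_new sequential passes
        logits = model(ids)                              # (1, seq_len, vocab)
        next_id = logits[:, -1].argmax(-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=-1)
    return ids

def diffusion_style_decode(model, prompt_ids, n_new, mask_id, steps=4):
    """Diffusion-style decoding: draft a masked block, refine it in parallel."""
    draft = torch.full((1, n_new), mask_id, dtype=torch.long)
    ids = torch.cat([prompt_ids, draft], dim=-1)
    for _ in range(steps):                               # a few parallel passes
        logits = model(ids)                              # (1, seq_len, vocab)
        # Re-predict the whole generated block at once; a real model would also
        # schedule which positions to unmask or keep at each step.
        ids = torch.cat([ids[:, :-n_new], logits[:, -n_new:].argmax(-1)], dim=-1)
    return ids
```

Generating 256 tokens costs 256 sequential passes in the first function but only `steps` passes in the second, which is the source of the claimed latency and cost advantage.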
MoEngage secures $100 million to expand globally and enhance Merlin AI marketing tools. Link.
Co led by Goldman Sachs Alternatives and A91 Partners with significant primary capital boosting total funding.
Platform helps brands use first-party data for personalized outreach and automated decision making at scale.
North America now contributes about thirty percent of revenue after sustained international growth initiatives.
Repeat backing from Goldman validates fundamentals and supports hiring, product, and market expansion.
Beacon raises $250 million to fund an anti-private-equity strategy for AI roll-ups. Link.
General Catalyst, Lightspeed, and D1 Capital led; total funding reaches about $335 million.
Holding company acquires niche software firms and modernizes their products and back-office operations with automation.
Acceleration teams rewrite codebases, automate payroll and accounting, and speed delivery using AI techniques.
CEO targets permanent ownership with reinvestment, avoiding the typical cost-cutting and quick-exit playbook.
Giga raises $61 million to scale emotionally aware AI agents for customer support calls. Link.
Round led by Redpoint with Y Combinator and Nexus; platform handles multilingual calls using company knowledge bases.
Startup pivoted from on-premises LLM operations to autonomous support agents resolving queries end-to-end.
Analytics, compliance, and quality tools are built in; volumes already reach millions of monthly interactions.
Funds expand engineering and go-to-market to meet demand across ecommerce, finance, healthcare, and telecom.
Regulation + Legal
EU draft plan considers easing AI Act rollout with targeted simplifications and delayed penalties. Link.
Proposal exempts narrow internal systems from database registration and phases in labeling requirements for AI-generated content.
One-year grace period would hold off fines until August 2027 as firms adapt to new compliance regimes.
Move follows heavy lobbying by U.S. tech firms and mirrors prior softening of environmental requirements.
Presentation expected November 19; ethicists warn against weakening oversight for high-impact systems.
Safety + Ethics
Berkshire warns of AI deepfakes impersonating Warren Buffett amid rising fraud and misinformation. Link.
Press release titled “It’s Not Me” highlights convincing videos offering advice never given by the Oracle of Omaha.
Company cautions that less informed viewers could be misled and urges verification with official Berkshire sources.
FBI previously flagged scammers using AI-generated voices of U.S. officials to breach government accounts.
Platforms and regulators face pressure to counter synthetic media risks as the tools become broadly accessible.
OpenAI
OpenAI strikes a $38 billion, seven-year deal to run on Amazon Web Services UltraClusters. Link.
Agreement provides dedicated Nvidia GB200 and GB300 capacity, scaling toward roughly one million GPUs by 2026.
Diversifies beyond prior exclusivity by allowing cloud capacity purchases outside the Microsoft partnership.
Amazon shares hit all time highs as investors viewed the contract as a marquee AWS win in AI infrastructure.
White House approved arrangement under domestic compute initiatives encouraging U.S.-based AI buildouts.
Sam Altman denies seeking a government bailout, clarifying that the discussions covered only loan guarantees for chip factories. Link.
OpenAI plans about 30 gigawatts of compute, spending roughly $1.4 trillion over eight years.
Talks with officials concerned onshoring semiconductor supply, not guarantees for OpenAI data centers.
OpenAI explores monetizing excess capacity, potentially selling compute to other companies as scale grows.
White House reiterated there will be no federal bailouts for AI; market forces must discipline spending.
Thank you for reading the AI Rundown by Lightscape Partners. Please send any questions, comments, or suggestions to [email protected].
