
The AI Rundown by Lightscape Partners – 04/01/24

Top tech companies are going after Nvidia's chip moat, OpenAI debuts more cutting-edge genAI products, and GPT-4 loses the throne for the first time in its existence.

Screenshot from shy kids – “Air Head”

Good morning and happy April.

Last week in AI,

  • Top tech companies are pooling resources to level the GPU playing field against Nvidia.

  • OpenAI flexed more of their generative AI models, highlighting Sora’s capabilities and a new Voice Engine model.

  • “The king is dead” as Anthropic’s Claude 3 overtakes GPT-4 in performance, dethroning it for the first time since its release.

Check out the AI conferences happening this month here.

If you’d like a quick refresher on industry terms and the current landscape of AI, skip to the bottom.

Top AI Stories of the Week

The largest tech companies go after Nvidia’s lead in AI chips, targeting CUDA, the software developers depend on. Link.

  • More than 4 million developers worldwide rely on Nvidia’s CUDA software to build AI and other applications.

  • The UXL Foundation, a coalition of top tech companies including Qualcomm, Google, and Intel, is developing a suite of software and tools that will be able to “power multiple types of AI accelerator chips.”

  • As an open-source project, UXL aims to enable computer code to run on any machine, agnostic of its chip or hardware (a sketch of what that looks like follows this list).

  • Developers’ dependency on the CUDA software could be Nvidia’s biggest moat, meaning UXL could truly level the playing field.
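For a concrete sense of what “chip-agnostic” code means here, below is a minimal sketch in SYCL, the open standard underpinning the oneAPI specification that the UXL Foundation stewards. The vector-add kernel and names are illustrative only, not code from UXL; the point is that the same kernel source can be compiled for Nvidia, Intel, AMD, or CPU backends without a CUDA-specific rewrite.

```cpp
// Illustrative only: a hardware-agnostic vector add written in SYCL,
// the open standard behind oneAPI (which the UXL Foundation stewards).
// The same kernel source can target Nvidia, Intel, AMD, or CPU backends.
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    // Let the runtime pick whatever accelerator is available (CPU fallback).
    sycl::queue q{sycl::default_selector_v};
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    constexpr size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    {
        // Buffers hand the host data to whichever device was selected.
        sycl::buffer<float> A(a.data(), sycl::range<1>{n});
        sycl::buffer<float> B(b.data(), sycl::range<1>{n});
        sycl::buffer<float> C(c.data(), sycl::range<1>{n});

        q.submit([&](sycl::handler& h) {
            sycl::accessor x{A, h, sycl::read_only};
            sycl::accessor y{B, h, sycl::read_only};
            sycl::accessor z{C, h, sycl::write_only};
            // Element-wise add, expressed once, compiled per backend.
            h.parallel_for(sycl::range<1>{n}, [=](sycl::id<1> i) {
                z[i] = x[i] + y[i];
            });
        });
    } // Buffer destructors wait for the kernel and copy results back to c.

    std::cout << "c[0] = " << c[0] << " (expected 3)\n";
    return 0;
}
```

Whether UXL’s tooling ends up exposing exactly this programming model remains to be seen; the example simply shows the kind of portability the coalition is chasing.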

OpenAI debuts a new series of Sora generations and their new Voice Engine model. Link.

  • The model has been in development since late 2022 and can create a synthetic voice based on a 15-second clip of someone speaking.

  • The Voice Engine model will power the text-to-speech function of ChatGPT.

  • In a blog article titled “Sora: first impressions,” OpenAI debuted a video in collaboration with shy kids showcasing the true capability of Sora. I highly recommend watching the video.

  • The videos in the blog illustrate how creative media companies and directors can work alongside AI to create new art and media.

Anthropic’s Claude 3 overtakes GPT-4 in performance capabilities. Link.

  • Claude 3 Opus, Anthropic’s most powerful new model, surpassed GPT-4 on Chatbot Arena, the crowdsourced LLM leaderboard hosted on Hugging Face.

  • Variations of GPT-4 had topped the leaderboard since it launched in May 2023.

  • The shift in power suggests that the foundational LLM builders will likely leapfrog one another with each new release, a win for consumers.

Software, LLMs + Generative AI

Databricks releases DBRX, their new open-source foundational LLM. Link.

  • The new model outperforms Mixtral MoE, Llama 2 70B, and Grok-1 in language understanding, programming, and math.

  • The model was trained in 2 months with $10M, highlighting the ever-increasing speed and cost-efficiency of developing new foundational models.

Chinese tech giant Baidu will reportedly provide the generative AI behind Apple’s iPhone 16 and other products in China. Link.

  • Baidu is poised to be Apple’s local generative AI model provider for the iPhone 16, macOS, and the upcoming iOS 18.

  • Each generative AI model will have to be approved by Chinese regulators before they can be launched to the public.

  • OpenAI’s ChatGPT and Google’s Gemini are not available in China, where the “top” generative AI model is still up for grabs.

xAI announces Grok-1.5, their latest LLM.

  • The new model follows the open-source release of Grok-1 and offers improved capabilities and 128K tokens of context length.

  • The model is set to be available on X to premium users.

Researchers at MIT announce a new AI image generation method that generates images 30x faster. Link.

  • The new method, called “Distribution Matching Distillation,” can generate images 30x faster than Stable Diffusion with outputs of the same or better quality.

Infrastructure + Hardware

AI21 Labs introduces Jamba, a state-of-the-art hybrid SSM-Transformer model. Link.

  • The production-grade model enhances Mamba Structured State Space model technology with elements of the dominant Transformer architecture.

  • This is the first production-grade Mamba-based model, built on a novel SSM-Transformer hybrid architecture.

Intel confirms Microsoft Copilot will run locally on future PCs. Link.

  • Next-gen AI PCs will require 40 TOPS of NPU performance.

  • Copilot computation currently occurs in the cloud; running models locally would improve the user experience and decrease latency.

Venture

Amazon invests another $2.75B in Anthropic. Link.

  • Amazon had already invested $1.25 billion, bringing their total investment to $4 billion.

  • The recurring investments, the latest from a major cloud provider, highlight the ongoing race to stay ahead of the AI curve.

Upcoming AI Conferences

  • San Jose, CA – April 16–17, 2024

Get up to speed on the current landscape

Image credit: Sonya Huang and Pat Grady, Sequoia Capital

Thank you for reading the AI Rundown by Lightscape Partners. Please send any questions, comments, or suggestions to [email protected].