January 26, 2025

Huge shoutout to the SemiAnalysis team for a ridiculously high-signal report answering the trillion-dollar question: Can AMD catch Nvidia?

TL;DR: Not anytime soon.


Recently, I've been getting several questions from friends that go like this: “How close is AMD to Nvidia?”

This is mainly due to the $500B Stargate Project that was just announced — but I didn’t really have a ton of conviction in my answer… “It’s CUDA”.

Like, it’s just software — clearly this can be replicated, right? When AMD has a market cap of $200B and NVDA sits at $3.5T, they should be ready to sell their left kidney and the naming rights to their first-born to catch up ASAP. But I didn’t feel like I had a solid answer as to what this software moat — the ‘CUDA Moat’ — actually looked like, or how big it really was.

But recently, I started digging into GPU programming for a systems course I’m taking at UCSD and read this report from SemiAnalysis — a lot of it clicked with concepts I already knew. And it turns out the moat is more than just CUDA.

The SemiAnalysis report deserves all the love it can get: it is an incredible analysis, and it’s free. I think anyone with an opinion on AMD vs. NVDA should read it (ask your favorite LLM to translate the technically dense sections) — but I’ll summarize my thoughts here for a broader audience.

I’m going to attempt to answer these two important questions in an approachable way:

  1. Why is Nvidia currently winning?
  2. Why will Nvidia continue winning?

<aside> 💡

*For context, I used to work as an investment banker but pivoted to studying ML back in 2023. Currently, I’m a grad student at UCSD studying machine learning and trying my hand at research.

I remember trying to understand the semiconductor ecosystem several times back in finance and finding the learning curve way too steep. Now, having used various accelerators (TPUs, Nvidia GPUs), read many systems papers, and studied CUDA programming, I feel like I’m starting to wrap my head around much of this.

Preface: I’m no expert in GPU programming, and I haven’t directly used an AMD chip before. This post summarizes my opinion on the situation — please interpret accordingly.*

</aside>

Hold up…What is this CUDA thing?

But AMD has great specs!

Why NVDA is Currently Winning

Why NVDA will Continue Winning

What Would it Take to Change My Thesis?