The AI landscape is shifting from GPUs to AI accelerators

My thoughts on the shifting AI landscape, from GPUs to AI accelerators

Sung Kim
5 min read · Aug 25, 2023

This article is not thoroughly researched; it consists merely of my thoughts, expressed on this blog. I hope to evolve it as I research these topics further.


Many people assume that Nvidia’s dominance in the AI hardware market will continue for a few years. This assumption is understandable, given that the number of people pre-training Large Language Models (LLMs) or fine-tuning LLMs is increasing at an exponential rate. Evidence of this growth can be seen by simply looking at the number of models being uploaded to the Hugging Face Hub on a daily basis. Most likely, the majority of these individuals are using Nvidia GPUs or Google’s TPUs to train these models.
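As a rough illustration of the kind of check I mean, here is a minimal sketch using the huggingface_hub client library to list the most recently created public models on the Hub. The exact sort key ("createdAt") and parameters shown are my assumption of a reasonable way to query this; it is an unscientific gauge, not a rigorous measurement.

```python
from huggingface_hub import HfApi

# Rough, unscientific gauge of upload volume: list recently created
# public models on the Hugging Face Hub. The "createdAt" sort key is
# an assumption about how the Hub API exposes creation time.
api = HfApi()
recent_models = api.list_models(sort="createdAt", direction=-1, limit=20)

for model in recent_models:
    print(model.id)
```

Counting results like this over time would give only a crude growth curve; the Hub's own statistics are a more reliable source, but even a quick query makes the pace of new uploads visible.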

I would like to argue that the AI landscape is shifting away from GPUs toward AI accelerators as companies move these LLMs into production. The very success Nvidia enjoys today will cost it its virtual monopoly on AI hardware: it will cede to other AI hardware companies an AI inference market that will be far larger than the AI training market.

Problem Statement

Let me illustrate this with a typical business scenario. Your team decides to…

Written by Sung Kim

A business analyst at heart who dabbles in AI engineering, machine learning, data science, and data engineering. Threads: @sung.kim.mw