Nvidia Corporation is famous in the tech world for designing graphics processing units (GPUs) for the gaming and professional computing markets, as well as chips for the mobile computing and automotive industries.

The company was founded in 1993 by Jen-Hsun “Jensen” Huang, Curtis Priem and Chris Malachowsky in Santa Clara, California.

In the 1990s, computer games were CPU-based rather than GPU-based, but CPUs were quickly becoming insufficient for the task. The rapid shift from MS-DOS to Windows raised the demands placed on graphics, which called for a new, more efficient coprocessor. This was where Nvidia came into play.

By committing to revolutionise the graphics world through its GPU manufacturing and innovations in the gaming sector, Nvidia has pushed forward 3D graphics, high-performance computing (HPC) and even artificial intelligence (AI).

What is the history of Nvidia?

When Nvidia was first created, the competition in its field was fierce. Alongside ATI Technologies, Chips and Technologies, S3 Graphics and 3Dfx, it was just one of many tech companies offering similar products in the GPU market. However, in 1999, Nvidia released the GeForce, its own brand of GPUs, and broke into the more advanced 3D graphics field. To date there have been 18 generations of GeForce, and today its only real competitor is AMD’s Radeon line of GPUs.

The GeForce’s success led to the company winning a contract to develop graphics hardware for Microsoft’s Xbox console.

In 2006, ATI was bought by AMD while Nvidia was working on new developments of its GPU technology. Around the same time, Nvidia released the Compute Unified Device Architecture (CUDA), a parallel computing platform and programming model for its GPUs. CUDA was revolutionary: it allowed programmers to write massively parallel programs, making simulations and other workloads that process large amounts of data far more efficient. Today, many universities worldwide teach CUDA courses.
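To illustrate what those parallel programs look like, here is the canonical introductory CUDA example (not drawn from this article): a kernel in which each GPU thread adds one pair of array elements, so that a million additions run concurrently rather than in a CPU loop. It is a minimal sketch, compiled with Nvidia’s `nvcc` compiler and requiring a CUDA-capable GPU to run.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// A kernel: each thread computes one element of the result,
// so the additions execute in parallel across the GPU's cores.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;             // one million elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);      // unified memory, visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();           // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The same pattern, scaled up to far larger kernels, is what underpins the scientific simulations and machine-learning workloads described above.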

When Nvidia released the Tegra line of systems-on-a-chip (SoCs), which combined an Arm CPU with an Nvidia GPU, it marked another important development. While at first Tegra was mostly used by carmakers for in-dash systems, Nintendo adopted it in 2017 for its Switch console.

Apart from new inventions and creations, Nvidia is also known for acquiring other companies, including its rival 3Dfx in 2002 and Hybrid Graphics in 2006. One of the most noteworthy purchases happened in 2019, when Nvidia bought Mellanox Technologies for a reported $7bn to benefit from its more advanced data processing units (DPUs).

Today, its revenue has reached $26.9bn, and Nvidia has firmly established itself in the tech manufacturing world.

The GPU is one of the most vital technological advancements of recent decades, underpinning machine learning, video editing and modern gaming software. Beyond it, Nvidia is known for other innovations, stretching back to its 3D graphics work in 1993. The company’s hardware also powered the breakthrough AlexNet neural network, which helped spark the modern AI era. Today’s technological discoveries are possible partly because of Nvidia itself.

For example, the former Nvidia Quadro line, now branded RTX, works on the same basis as GeForce but is tuned for professional visual computing workloads such as computer-aided design (CAD). Nvidia’s GeForce NOW is a cloud gaming service that streams games a user owns, effectively turning almost any device into a gaming console while keeping every game up to date automatically. The DGX servers are also an important part of Nvidia’s portfolio: the company’s own line of GPU-packed systems built specifically for HPC and AI projects.

Throughout its existence, Nvidia has also gone beyond its competitors’ innovations with Spectrum, a next-generation Ethernet platform that provides high-performance networking and effective security for the data centre.

The year 2022 also marked the launch of Nvidia Omniverse Cloud, the company’s platform for building metaverse applications. CEO Huang said: “With Omniverse in the cloud, we can connect teams worldwide to design, build, and operate virtual worlds and digital twins.” Ultimately, it will be able to “train, simulate, test and deploy AI-enabled intelligent machines with increased scalability and accessibility” through its Isaac Sim app, part of the cloud.

How is Nvidia involved with AI?

Nvidia is involved with AI thanks to its GPU and CUDA innovations. Its hardware already powers the majority of AI applications today, covering more than 95% of the machine learning field. For instance, ChatGPT was trained on 10,000 of the company’s GPUs in a Microsoft supercomputer.

Huang and Nvidia have been focusing on AI since 2014, with the CEO describing it as “one of the most exciting applications in high-performance computing today”. Nowadays, Nvidia could be considered a full-stack AI company, rather than just a hardware or software vendor.

At the end of May 2023, Nvidia’s shares rocketed by over 25% on the back of its announcement of a collaboration with Microsoft to accelerate enterprise-ready generative AI. The GPU corporation will integrate with Azure Machine Learning to build an end-to-end cloud platform on which developers can build and manage AI applications based on large language models. Dan Ives, Wedbush Securities analyst, said: “We view Nvidia at the core hearts and lungs of the AI revolution.” Its share price closed at $401, giving the company an overall value of $990bn.

Huang explained that “Generative AI’s versatility and capability has triggered a sense of urgency at enterprises around the world to develop and deploy AI strategies.” In his view, Nvidia is not just a GPU manufacturing company but an AI supercomputing corporation.

Its platform strategy, which spans chips, hardware, AI and software development systems, is still hard to beat. Manuvir Das, vice-president of enterprise computing at Nvidia, told VentureBeat: “We know we have the best-combined hardware and software platform for being the most efficient at generative AI. We constantly operate with the motto that we have no advantage – and nobody is going to outwork us or out-innovate.”
