Good luck building a VR PC: Ethereum miners are buying all the GPUs

Once the tap turns on again, GPUs will restore PCs and edge computing to glory


Last month, one of my friends noted he’d been having enormous trouble trying to buy the components to assemble a virtual-reality-ready PC. Motherboards, memory, CPUs and solid state drives were easy to find, but the one absolutely essential component - a beefy GPU to drive a head-mounted display at a vomit-preventing 90 Hz - he couldn’t find anywhere. Every online vendor seemed to be out of stock, with long waiting times and stern warnings restricting purchases to ‘ONLY TWO PER HOUSEHOLD’. Why would anyone need two graphics cards? One for each eye?

This shortage had developed suddenly, over the month of May, in a curious lock-step with a seemingly unrelated development - an enormous rise in the price of a cryptocurrency known as Ethereum. A successor to Bitcoin, this second generation of ‘magic internet money’ algorithmically grows its total money supply, rather like a central bank. Rather than employ legions of macroeconomists and quantitative easing, Ethereum demands ‘proof of work’ - solutions to a cryptographic puzzle that require billions of educated guesses. Those guesses - you guessed it - can be tremendously accelerated by the very same GPUs that my friend had tried to buy.
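
Ethereum’s real puzzle, Ethash, is a memory-hard hashing scheme deliberately tuned to suit GPUs, but the shape of the work is the same everywhere: hash, check, repeat until lucky. The toy sketch below - plain SHA-256 with an assumed difficulty, emphatically not Ethash itself - is only meant to show what ‘billions of educated guesses’ looks like in code.

```python
import hashlib

def toy_proof_of_work(block_data: bytes, difficulty_bits: int = 20) -> int:
    """Find a nonce whose SHA-256 digest of block_data + nonce falls below
    a difficulty target. A toy stand-in for a real proof-of-work scheme."""
    target = 1 << (256 - difficulty_bits)   # smaller target = harder puzzle
    nonce = 0
    while True:
        guess = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(guess, "big") < target:
            return nonce                     # this guess 'wins' the puzzle
        nonce += 1

# With 20 difficulty bits, roughly a million guesses are needed on average;
# real networks demand many orders of magnitude more - which is where GPUs come in.
print(toy_proof_of_work(b"example block header"))
```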

Economics, it turns out, was the culprit. As Ethereum ballooned to its highest historical value (nearly US$400 for a single ETH), it became a wise investment to buy a cheap PC with a beefy power supply, stock it up with GPUs, and let it compute its way into profits. One such PC could - in the right circumstances - earn up to $10,000 a year, for a $2,500 outlay. The formula, simply put: GPUs + electricity + time = profits!
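
Whether that formula actually pays depends on a handful of inputs the article doesn’t spell out: rig hashrate, power draw, electricity price and the network’s total hashrate. The back-of-the-envelope sketch below uses assumed, illustrative mid-2017 figures (only the ~$400 ETH price comes from the article) to show how the arithmetic can land near the $10,000-a-year mark.

```python
# Back-of-the-envelope Ethereum mining estimate. Every figure below is an
# illustrative assumption, not a measurement from the article.

RIG_HASHRATE_MHS = 180.0            # e.g. six cards at ~30 MH/s each (assumed)
RIG_POWER_WATTS = 900.0             # whole-rig draw at the wall (assumed)
ELECTRICITY_USD_PER_KWH = 0.10      # assumed household tariff
ETH_PRICE_USD = 380.0               # near the peak the article mentions
NETWORK_HASHRATE_MHS = 60_000_000.0 # assumed total network hashrate
ETH_ISSUED_PER_DAY = 30_000.0       # assumed daily block rewards

def daily_profit_usd() -> float:
    share_of_network = RIG_HASHRATE_MHS / NETWORK_HASHRATE_MHS
    revenue = share_of_network * ETH_ISSUED_PER_DAY * ETH_PRICE_USD
    power_cost = (RIG_POWER_WATTS / 1000.0) * 24 * ELECTRICITY_USD_PER_KWH
    return revenue - power_cost

daily = daily_profit_usd()
print(f"~${daily:.2f} per day, ~${daily * 365:,.0f} per year")
```

Tweak the assumed ETH price or network hashrate and the margin shrinks just as quickly - the same rig can swing from goldmine to space heater.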

Now that all the GPUs have been sucked into get-rich-quick schemes, what happens to a commercial and enterprise VR marketplace still trying to find its footing? This is less about kids at home playing PC games (though they’re clearly affected as well) than a story of all the other things we’ve come to depend on from our GPUs. These fast-and-efficient successors to the ‘math coprocessor’ (remember when those were a thing?) have become the single most important element in computing.

While graphics provide the obvious use case for GPUs, they're also the accelerant for all sorts of workloads.

The rise of VR, machine learning - even this ‘tulipmania’-style GPU shortage - points toward a larger shift: the generational tug-of-war between the centre and the periphery. PCs subverted mainframes, then the cloud drew everything back toward the centre. While the cloud hasn’t quite peaked, the next swing of the pendulum, into ‘edge computing’, enabled by highly performant GPUs, looks to be well underway.

As computing moves away from the general-purpose CPU (Intel has spent four years stuck at 14nm), this shift to the edge will make the GPU more important than the CPU. The GPU is central to all the roles we expect 21st century computers to fill. This may be the reason Apple recently ditched chip designer Imagination in favour of their own, home-grown GPU. Within a few years, every computing device of consequence - supercomputer, desktop or smartphone - will be driven by architectures and operating systems that centre on the GPU.

Meanwhile, the global shortage of GPUs implies fat profits for Nvidia and AMD as they shovel their chips into the maw of a market hungry to turn maths into capacity - and money. The PC, nearly driven into irrelevance by tablet computing, comes roaring back as a platform for visualisation and learning, transformed by the GPU into a power-hungry, expensive, finicky and absolutely essential tool for modern business. The pendulum swings again, and suddenly the edges are (yet again) the most interesting place to be, the place where the real work of computing happens.

For the moment, though, my friend has to patiently wait out this shortage. The chips will come: there’s too much money on the table. And as they arrive in their billions, the entire face of computing will change completely. ®
