Artificial intelligence is increasingly making its presence felt in many areas of our lives, particularly since the launch of ChatGPT. Depending on your view, it’s the big bad bogeyman taking jobs and causing widespread copyright infringement, or a gift with the potential to catapult humanity into a new age of enlightenment.
What many have achieved with the new technology, from Midjourney and LLMs to smart algorithms and data analysis, is beyond radical. It’s a technology that, like most of the silicon-based breakthroughs that came before it, has a lot of power behind it. It can do a lot of good, but also, many fear, a lot of bad. And which of those we get depends entirely on how it is wielded, controlled and regulated.
It’s no surprise then, given how quickly AI has entered the zeitgeist, that tech companies and their sales teams are leaning into the technology just as much, cramming its various iterations into their latest products, all with the aim of encouraging us to buy their hardware.
Check out this new AI-powered laptop, the motherboard that uses AI to overclock your CPU to the limit, the new webcams with AI technology for deep learning. You get the point. You just know that from Silicon Valley to Shanghai, shareholders and business executives are asking their marketing teams “How can we get AI into our products?” in time for the next CES or the next Computex, however modest the value may actually be for us consumers.
My biggest bugbear comes in the form of the latest generation of CPUs launched by the likes of AMD, Intel and Qualcomm. Now, these are not bad products, not by a long shot. Qualcomm is making leaps and bounds in the desktop and laptop chip markets, and the performance of both Intel and AMD’s latest chips is nothing if not impressive. Generation after generation, we’re seeing higher performance scores, better efficiency, wider connectivity, lower lag, and ridiculous power savings (here’s looking at you, Snapdragon), among a whole host of innovative design changes and choices. For most of us mortals, it’s magic far beyond the basic 0s and 1s.
Despite that, we still get AI slapped on everything, whether it actually adds anything useful to a product or not. Chipmakers have added neural processing units (NPUs) to their chips: co-processors designed to accelerate the low-level math that AI workloads lean on. These are then put into low-power laptops, letting them run features such as Microsoft’s Copilot assistant and tick the AI box, as if a sliver of local silicon makes much difference to a predominantly cloud-based service.
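To make that concrete, here’s a minimal sketch of how an application typically opts into one of those NPUs, using ONNX Runtime as one common route (an illustration only, not how Copilot itself is wired up); model.onnx is a hypothetical placeholder, and QNNExecutionProvider is the provider name for Qualcomm’s NPUs:

```python
# Minimal sketch: pointing an inference workload at an NPU with ONNX Runtime.
# "model.onnx" is a hypothetical placeholder; QNNExecutionProvider targets
# Qualcomm NPUs and needs an onnxruntime build that actually includes it.
import numpy as np
import onnxruntime as ort

# Prefer the NPU if this build exposes it, otherwise fall back to the CPU.
wanted = ["QNNExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in wanted if p in ort.get_available_providers()]

session = ort.InferenceSession("model.onnx", providers=providers)

# Build a dummy input matching the model's first input (assumes float32;
# dynamic dimensions are filled in with 1).
meta = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in meta.shape]
outputs = session.run(None, {meta.name: np.zeros(shape, dtype=np.float32)})

print("executed with:", session.get_providers()[0])
```

The NPU, in other words, is just another backend the runtime can pick or fall back from; the feature works either way, which is rather the point.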
The thing is, though, when it comes to AI, CPU performance is negligible. Like, seriously insignificant, to the point where it’s not even mildly relevant. It’s like trying to launch NASA’s JWST space telescope with a bottle of Coke and some Mentos.
The Emperor’s New Clothes?
I’ve spent the last month testing a variety of laptops and processors, specifically for how they handle AI tasks and apps. Using UL’s Procyon benchmark suite (from the makers of the 3DMark series), you can run its Computer Vision inference test and have it spit out a nice number for you, giving you a score for each component. Intel Core i9-14900K? 50. AMD Ryzen 9 7900X? 56. Ryzen 9 9900X? 79 (that’s a 41% gen-on-gen performance increase, by the way; seriously huge).
But here’s the thing: run a GPU through the same test, such as Nvidia’s RTX 4080 Super, and it scores 2,123. That’s a 2,587% performance increase over the Ryzen 9 9900X, and that’s not even using Nvidia’s own TensorRT SDK, which pushes the score higher still.
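If you want to sanity-check those uplift figures, it’s the standard percentage-increase formula; here’s a quick sketch using the scores quoted above:

```python
# Sanity-checking the uplift figures quoted above.
def pct_increase(new: float, old: float) -> float:
    """Percentage increase going from an old benchmark score to a new one."""
    return (new - old) / old * 100

print(f"Ryzen 9 7900X -> Ryzen 9 9900X: {pct_increase(79, 56):.0f}%")      # ~41%
print(f"Ryzen 9 9900X -> RTX 4080 Super: {pct_increase(2123, 79):,.0f}%")  # ~2,587%
```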
The simple fact is that AI demands parallel processing performance like nothing else, and nothing delivers it better than a graphics card right now. Elon Musk knows this: he’s just installed 100,000 Nvidia H100 GPUs in xAI’s latest AI training system. That’s more than $1 billion worth of graphics cards in a single supercomputer.
Obscured by clouds
To add insult to injury, the vast majority of popular AI tools today require cloud computing to fully function anyway.
LLMs (large language models) on the scale of ChatGPT and Google Gemini require so much processing power and storage that running them on your own machine simply isn’t an option. Even Adobe’s Generative Fill and neural filter technology in the latest versions of Photoshop rely on cloud computing to process images.
It’s just not possible, or practical, to run the vast majority of these wildly popular AI programs on your own home machine. There are exceptions, of course; certain AI imaging tools are far easier to run on a single machine, yet in 99% of use cases you’re still better off letting cloud computing handle it.
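To give a flavour of what that exception looks like in practice, here’s a minimal sketch of local image generation using Hugging Face’s diffusers library; it assumes a CUDA-capable GPU with enough VRAM and the publicly available Stable Diffusion v1.5 checkpoint:

```python
# Minimal sketch: generating an image locally with Stable Diffusion.
# Assumes: pip install diffusers transformers torch, a CUDA GPU with
# several GB of VRAM, and a first run that downloads the checkpoint.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision to fit consumer GPUs
)
pipe = pipe.to("cuda")

image = pipe("a retro CPU wearing a tiny crown, studio photo").images[0]
image.save("ai_hype.png")
```

Note that everything here runs on the GPU, not the CPU or NPU, which rather underlines the point about where local AI performance actually lives.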
The only major exception to this rule is localized upscaling and supersampling. Things like Nvidia’s DLSS and Intel’s XeSS, and even to a lesser extent AMD’s own FSR (although that one is predominantly not based on deep-learning models, running instead on standard shader hardware, meaning you don’t need dedicated AI silicon), are great examples of localized AI done well. But otherwise you’re basically out of luck.
But still, here we are. Another week, another AI-powered laptop, another AI chip, much of which, in my opinion, amounts to much ado about nothing.