Building the ultimate AI and machine learning PC

By Jon Martindale | Published February 10, 2025

One of the best ways to reduce your vulnerability to data theft or privacy invasions when using large language models or other machine learning tools is to run the model locally. Depending on the model you choose, you don’t even need the most powerful system in the world, although it certainly helps.

Here’s how to build a PC for AI and machine learning workloads, so you can keep your data secure and private, and ensure the AI is always ready and waiting for you.
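
If you want a feel for what that looks like in practice, here’s a minimal sketch of running a small open-weight language model locally with the Hugging Face transformers library and PyTorch (both need installing first; the model name is only an example, so swap in whatever fits your hardware):

```python
# A minimal sketch of running a small open-weight model locally with the
# Hugging Face transformers library (pip install transformers torch).
# The model name below is only an example; pick one that fits your hardware.
import torch
from transformers import pipeline

device = 0 if torch.cuda.is_available() else -1  # 0 = first GPU, -1 = CPU
generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    device=device,
)

result = generator(
    "Explain in one sentence why running an LLM locally helps with privacy:",
    max_new_tokens=80,
)
print(result[0]["generated_text"])
```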


What does an AI PC need?

AI PCs aren’t drastically different from high-powered PCs built for other tasks, but they do have some quirky requirements that make building a system with AI in mind a little different from building a powerful gaming PC.

Although many of the major CPU manufacturers have talked a lot in the past year about neural processors and how efficient they can be, those chips only tend to offer a few tens of TOPS (trillions of operations per second). That might seem like a lot, until you find out that an Nvidia RTX 4090 can deliver over 1,300 TOPS.

In short, for AI workloads the CPU is far less important. A fast processor is always helpful, and having lots of cores will absolutely speed up your machine learning workloads and keep the system responsive even when it’s working hard, but the real horsepower comes from the graphics card.
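
As a quick illustration, most machine learning frameworks let you check for a CUDA-capable GPU and fall back to the CPU when one isn’t present. A minimal PyTorch sketch, assuming PyTorch is installed:

```python
# Minimal PyTorch sketch: prefer the GPU when one is present, fall back to the CPU.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")

# A toy matrix multiply to illustrate where the heavy lifting happens.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b  # on a GPU this runs across thousands of parallel cores
print(c.shape)
```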

So we’re looking for a powerful GPU, preferably with lots of video memory, lots of system memory when that’s not enough, and some expansive and fast local storage. That also means we need a high-end motherboard. While that won’t give us any additional AI performance in its own right, a top-tier motherboard ensures smooth power to the CPU and GPU, as well as adding support for multiple graphics cards if you really want to accelerate your machine learning tasks, or run more than one in parallel.
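
How much VRAM is enough depends on the model. As a rough, illustrative rule of thumb, the weights alone take about two bytes per parameter at 16-bit precision, and less when quantized; the numbers in this sketch are approximations, not exact requirements:

```python
# Rough back-of-the-envelope VRAM estimate for running (not training) a model.
# These are approximations for illustration only; real usage also includes
# activations, the KV cache, and framework overhead.
def estimate_vram_gb(num_params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    weights_gb = num_params_billion * 1e9 * bytes_per_param / (1024 ** 3)
    return weights_gb * overhead  # add ~20% headroom for everything else

for precision, bytes_per_param in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"7B model @ {precision}:  ~{estimate_vram_gb(7, bytes_per_param):.1f} GB")
    print(f"70B model @ {precision}: ~{estimate_vram_gb(70, bytes_per_param):.1f} GB")
```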

Outside of that you can put it in whatever case you like, with a big power supply and some good cooling to keep the system running without overheating and throttling. Some nice-to-haves include high efficiency and a lower power draw to keep running costs down, though that runs counter to our high-end GPU choices. We’ll also consider future upgradeability.

CPU

A hand holding AMD's Ryzen 9 9950X.
Jacob Roach / Digital Trends

Usually the CPU is the heart of a PC, whether it’s used for gaming, office work, streaming, or video editing. It still plays a part in our AI and machine learning PC, but it’s not the linchpin.

Still, you want a modern one with lots of cores and, preferably, a strong upgrade path for the future. To that end, we’d recommend the AMD Ryzen 9 9950X. It’s one of AMD’s latest CPUs, with 16 cores and 32 threads. It’s relatively low-power for such a high-end CPU, too, and will give you plenty of scope for running your own large language models, or just supporting the system that’s training them on a monstrous GPU.

If you want a more affordable alternative, the last-generation Ryzen 9 7950X is still plenty capable, around $100 cheaper, and offers excellent performance. If you’re more of an Intel fan, consider the Core Ultra 9 285K or Core Ultra 7 265K; they have boatloads of cores and impressive efficiency, as well as their own onboard neural processors.
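
Whichever CPU you pick, it’s worth confirming how many threads the operating system actually sees and setting your framework’s CPU thread count accordingly. A small sketch, assuming PyTorch is installed and SMT or Hyper-Threading is enabled:

```python
# Quick sketch: see how many logical processors the OS exposes and pin PyTorch's
# CPU thread count to the physical core count (a common starting point; tune for
# your own workload).
import os
import torch

logical_threads = os.cpu_count()  # e.g. 32 on a 16-core/32-thread Ryzen 9 9950X
print(f"Logical processors: {logical_threads}")

physical_cores = logical_threads // 2 if logical_threads else 1  # assumes SMT is on
torch.set_num_threads(physical_cores)
print(f"PyTorch CPU threads: {torch.get_num_threads()}")
```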

Motherboard

ASRock X870 Pro RS motherboard and box.
ASRock

The motherboard is rarely the most exciting component in any custom-build PC, but in an AI and machine learning computer it plays a bigger role than you’d think. You want something with strong, stable VRMs to handle all the power this system will be dealing with. Ideally, you want PCI Express 5.0 support for the fastest storage, and support for multiple graphics cards doesn’t hurt if you want to double up your training GPUs.

Or you can just get any old motherboard, because it’ll probably do. I’m being facetious, because who wants to spend close to $1,000 on a motherboard? But ultimately, anything outside of the bargain-basement models will probably suffice; just make sure it has the features you want for your kind of budget.

Also make sure to get one that matches your CPU’s socket. If in doubt, double-check before buying.

Graphics card

The RTX 5090 sitting on top of the RTX 4080.
Jacob Roach / Digital Trends

If you’re going to sink your budget into any component in your AI and machine learning PC, make it the graphics card. When you’re training large language models, or even just running big and complicated ones, you need a powerful graphics card. They have the VRAM to store the model on the card itself, and the thousands of parallel processing cores to actually run it.

If you don’t have much budget to spare, look to a card like the Nvidia RTX 3060 12GB — you can grab that for around $300 at the time of writing. However, if you really want to push your AI training or run some of the most advanced, demanding models, then the higher-end you can go, the better. The RTX 5090 is the best graphics card in the world right now, but it’s very hard to get hold of.

Last-generation alternatives aren’t that easy to find either, so you may need to wait a little. The best we could find at the time of writing was a renewed RTX 3090 for $1,500, or an RTX 4070 Ti Super with 16GB of VRAM.
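
Whichever Nvidia card you end up with, it’s easy to confirm how much VRAM it actually exposes. A minimal PyTorch sketch:

```python
# Minimal sketch: report each visible Nvidia GPU and its total VRAM via PyTorch.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        vram_gb = props.total_memory / (1024 ** 3)
        print(f"GPU {i}: {props.name}, {vram_gb:.1f} GB VRAM")
else:
    print("No CUDA-capable GPU detected")
```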

What about AMD? Unfortunately, while AMD’s graphics cards are great for gaming, they just don’t compete with Nvidia’s CUDA and Tensor cores for AI tasks yet. Maybe that will change, but for now, if you want to build an AI PC, Nvidia GPUs are the best option.

Memory

The Kingston Fury Renegade DDR5 memory modules in white.
Kingston

You can min-max performance with memory, but it’s not going to make a massive difference in an AI PC. The best thing you can do is make sure you have lots of fast memory and not overthink it, unless you’re into overclocking.

Grab yourself a 64GB kit of DDR5-6400 memory from a major manufacturer like Corsair, Kingston, G.Skill, Patriot, or TeamGroup. Anything faster and you have to start dabbling in settings tweaks to make the most of it. Better to just make sure you have enough.
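
If you want to confirm what the operating system actually sees once it’s installed, a tiny sketch using the third-party psutil package (an assumption here; install it with pip) does the job:

```python
# Minimal sketch using the third-party "psutil" package (pip install psutil)
# to confirm how much system memory the OS actually sees.
import psutil

total_gb = psutil.virtual_memory().total / (1024 ** 3)
print(f"Installed RAM visible to the OS: {total_gb:.1f} GB")
```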

Storage

Hand holding up WD Black SSD.
WD

Lots of fast storage is useful for AI and machine learning PCs so that they can handle all the training data you’re going to be throwing their way. Fortunately, modern storage is faster and cheaper than ever, so you can grab yourself several terabytes of PCIe 5 SSD storage for a few hundred dollars.

Any of the major brand name SSDs will do here, but like with memory, just make sure you have lots of it.
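
If you want a rough sanity check that a new drive is performing as expected, a crude timing sketch like the one below will do; the file name is just a placeholder, and a dedicated benchmarking tool such as fio or CrystalDiskMark will give far more reliable numbers:

```python
# Crude sequential write/read timing sketch. Reads may be served from the OS
# page cache, so treat this as a sanity check, not a proper benchmark.
import os
import time

path = "throughput_test.bin"     # placeholder temp file in the current directory
size_mb = 1024                   # write 1 GB
chunk = os.urandom(1024 * 1024)  # 1 MB of random data

start = time.perf_counter()
with open(path, "wb") as f:
    for _ in range(size_mb):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())
write_s = time.perf_counter() - start

start = time.perf_counter()
with open(path, "rb") as f:
    while f.read(1024 * 1024):
        pass
read_s = time.perf_counter() - start

os.remove(path)
print(f"Write: {size_mb / write_s:.0f} MB/s, Read: {size_mb / read_s:.0f} MB/s")
```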

Power

PC power supply cables on a modular PSU.
Digital Trends

Power supplies are one area where you don’t want to try to scrimp and save. A good power supply makes sure that your whole, expensive AI PC stays healthy for the long term. Get a 1,200W or higher Titanium- or Platinum-rated PSU from one of the major PSU brands and you’ll have a solid choice. EVGA, Corsair, Seasonic, FSP, Thermaltake, Enermax, Super Flower, and be quiet! are all great options.
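
To size the PSU, a simple back-of-the-envelope sum of component power draws plus generous headroom gets you in the right ballpark. The TDP figures in this sketch are approximate and only for illustration; check the specs of your actual parts:

```python
# Back-of-the-envelope wattage estimate for sizing a PSU. TDP figures below are
# approximate manufacturer numbers used for illustration; GPUs can also spike
# well above their rated TDP under load.
components_w = {
    "GPU (e.g. RTX 4090 class)": 450,
    "CPU (e.g. Ryzen 9 9950X)": 170,
    "Motherboard, RAM, SSDs, fans": 100,
}

load_w = sum(components_w.values())
recommended_psu_w = load_w * 1.5  # ~50% headroom keeps the PSU in its efficiency sweet spot

print(f"Estimated sustained load: {load_w} W")
print(f"Suggested PSU capacity:  ~{recommended_psu_w:.0f} W")
```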

Put it all together

If you grabbed all the above hardware but want some tips on how to actually build the thing, we’ve got you covered. Once it’s done (or you’ve had someone else build it for you) you’ll be off and running with a super powerful, super capable AI and machine learning PC.
