Last Modified Aug 8, 2021


The primary motivator: my current home desktop is an aging, overclocked i7-2600K. Built in 2012, it's ancient by today's (October 2020) standards. And when you're a professional user, time is money (or at least productivity). Plus there's the "my parts are going to start dying soon" feeling when your desktop is eight years old.

So, what do I need to come up to date for a decent while? Where are my pain points?


Memory

This is a primary motivator for an upgrade. I need at least 128G of RAM for some of the things I work on; the faster, the better. That puts me closer to what one would deploy for a deep learning system used for training (not just inference), and it also lets me run two or three OSes in virtual machines without feeling the walls closing in. That, in turn, allows me to do parallel development on Linux and Windows relatively quickly. For most C++ work this is beneficial even if you only ship on one platform: I fairly frequently have bugs exposed on one platform that lie hidden (lurking in wait) on another. Many of these are found in unit tests, which is awesome. Those that are not are sometimes exposed by compiler warnings (also awesome). For the others... it's nice to have two different toolchains from which to draw out bugs before a customer experiences them.


CPU cores

Clock speed isn't growing by leaps and bounds like in the old days. But for much of what I do, higher core counts save me time. The Microsoft C++ compiler is happy to use multiple cores when compiling, as are the build tools on Linux, FreeBSD, macOS, et al. So jumping from a 4-core, 8-thread CPU to a 24-core, 48-thread CPU makes a significant difference in the amount of time I spend waiting for a build or compile to finish. Of course, I don't spend my whole day waiting for compiles on any project; my needs depend on what I'm doing at the time. Small stuff, like day-to-day TDD, isn't wildly affected by core count, simply because only a small piece of code needs to be recompiled (often just a single compilation unit). And unfortunately, linkers don't see much benefit from multiple cores as a general rule. But when I'm refactoring, I'm likely changing header files, which often means recompiling many compilation units. Since refactoring is critical to long-term success, I try to do what I can to make it less painful. CPU is important here.
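None of this parallelism is automatic from a bare invocation, for what it's worth. A minimal sketch of the flags involved (the job-count choice is mine; the commented commands assume GNU make, CMake, and MSVC respectively):

```shell
# Derive a job count from the logical CPU count (48 on a 3960X).
JOBS=$(nproc)
echo "parallel build jobs: ${JOBS}"

# GNU make on Linux/FreeBSD (assumes a Makefile in the current dir):
#   make -j "${JOBS}"
# CMake, portable across generators and toolchains:
#   cmake --build . --parallel "${JOBS}"
# MSVC, from a Visual Studio developer prompt; /MP compiles the
# listed sources concurrently within one cl.exe invocation:
#   cl /MP a.cpp b.cpp c.cpp
```

Note that linking still happens serially at the end regardless of the job count, which is why linkers remain the bottleneck mentioned above.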


Fast storage

This is huge; maybe bigger than CPU and RAM. Compiling is an I/O-intensive activity, and the I/O demand scales roughly linearly with the parallelism of your build. Fortunately, we've come a long way in eight years! My old desktop has Samsung SATA SSDs. They are great for what they are, but PCIe 4.0 NVMe is a full decimal order of magnitude faster than SATA, and it impacts nearly everything I do. Every program launches faster. The compiler/assembler/linker spends far less time saving output (literally 90% less). A 10X increase in persistent I/O is a game changer. My 2018 MacBook Pro runs circles around my old desktop here (roughly 6X faster), thanks to PCIe 3.0 NVMe. PCIe 4.0 NVMe ups the ante again. This is the right time to take advantage of all of the improvements here.
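If you want to sanity-check claims like these on your own drives, even a crude sequential-write probe shows the gap. This is a sketch only (`fio` is the proper benchmarking tool, and `/tmp/ddtest` is a hypothetical stand-in for a file on the drive under test):

```shell
# Write 64 MiB and force it to physical media before dd reports
# throughput; without conv=fdatasync you would mostly be
# measuring the RAM page cache, not the drive.
dd if=/dev/zero of=/tmp/ddtest bs=1M count=64 conv=fdatasync
rm -f /tmp/ddtest
```

Expect very roughly 500 MB/s from a good SATA SSD versus multiple GB/s from PCIe 3.0/4.0 NVMe; exact numbers depend heavily on the drive and filesystem.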

High speed networking

I have 10G ethernet in my home. The infrastructure is fiber where I need it, DAC where it makes sense (within the same rack), and 10GBase-T to workstations and laptops. Technically I could run fiber to my desktop, since it almost never moves. But for everything else I need 10GBase-T (connector cycle rating, convenience of compatibility with 1GBase-T, etc.). I have an older Intel dual-10G card in my current desktop. It works well, but it's power-hungry; the electricity is not a big deal, but the heat is. And I want on-board 10G so I don't have to populate a PCIe slot to get it. So a board with either Aquantia 10G or modern Intel 10G was important to me.


GPU

For deep learning work, I need a modern GPU; without one, training is an order of magnitude slower (or worse). I don't have good options with my old desktop. It doesn't have the power delivery for a good deep learning GPU: the power supply can't do it, and the on-board delivery isn't sufficient either. The slot layout of the motherboard isn't amenable to a pair of modern (3-slot!) GPU cards. And the case is too small to cool beefy GPUs.

AMD versus Intel

I have no religion here. It's about getting what I need for the next several years, and bang for the buck. My needs put me in HEDT (high-end desktop)/workstation territory for some things, but desktop territory for others. I don't need a workstation-class GPU, for example, in the sense that I don't really care about FP64 or even FP32, nor ECC, etc. I do need CPU, as does every professional, and I need RAM and I/O. Today, my best option is AMD Threadripper 39xxX, and I chose the 3960X based on bang for the buck. While Intel slashed the price of the i9-109xx parts, and even the Xeon W-3175 isn't out of reach price-wise, AMD is ahead for my needs: PCIe 4.0, more than enough lanes, and multi-core performance that pretty much rules the roost at the moment. And given that I don't need more than 256G of RAM for the foreseeable future, Threadripper (versus EPYC) suits my needs and easily wins in the bang-for-the-buck category.

I will readily admit that the big headache with current Threadripper is the thermals. It's not so much a total power issue (though 280W TDP is nothing to sneeze at) as a lack of available options. Coolers with cold plates that cover the whole heatspreader of sTRX4 are still rare, and some that do cover it are not up to the task. From all the research I did before this build, I knew my only real option was a custom watercooling loop. And there are currently only two water blocks that I know work effectively: the Heatkiller IV Pro for sTRX4 and the Optimus Absolute Threadripper 3+. EK's EK-Velocity sTR4 has had mixed reviews, and I was unable to find a review of the Bitspower Summit ELX for AMD TRX40. I also didn't find any reviews of the Swiftech Apogee SKF Prestige TR4, nor stock of it anywhere.

This isn't my first custom loop build. But my old desktop uses an all-in-one for a reason: I don't get excited about the maintenance of a custom loop. My desktops have been watercooled for a long time, going back to the mid-1990s, but I've never enjoyed having to flush a custom loop annually, or even top one off. If your computer is critical to your livelihood, downtime for maintenance is just not a good thing. In this case, though, I decided it was worth it.

Planning and preparation help, but in the end you still have to spend many hours setting up the loop (cleaning and flushing parts, installing everything, leak testing, etc.), and you'll still need to do periodic maintenance. If you're making this tradeoff, just be aware of what you're trading: custom loops give you a lot of flexibility, but they take much more time to build correctly and they require periodic maintenance. And if you're using a case with a lot of tempered glass, you likely want hard tubing (probably PETG), which is time-consuming and not cheap (the tubing itself is cheap, but the fittings are not, and you'll likely want some tools you may not already own).

Component Criteria and Priorities

Looking at my parts list, some would accuse me of being anti-RGB. The reality is that I just don't care; RGB lighting is not on my priority list at all. I want robust components that meet my performance needs. I don't need RGB lighting, and I'm compulsive enough that I'd fuss over it if it were among my criteria. I've seen some nice full-RGB builds. I've also seen some aesthetically awful ones. The reality for me is that I'm building a professional machine, and hence RGB is a distraction. If by happenstance I wound up with a nice RGB setup, that'd be fine. But I can't compromise performance and reliability (nor my time) for lighting effects. This isn't a gaming machine (I'm not a PC gamer), and I have nothing against those who want RGB lighting everywhere. Have at it! It's just not important to me at all for this machine.

This is not to say I don't like case lighting of some kind when there's a custom watercooling loop inside; I want visibility to see dirty coolant, for example. But it doesn't need to be RGB. In fact, white is ideal. I can't believe it's 2020 and we still don't have RGBW lighting, but instead only have RGB (addressable or not).

So... fans? Noctua, period. Highest performance, highest reliability and longevity, and quiet. Preferably the NF-A12 PWM, though I bought a few NF-F12 PWM units due to local availability and my long-term great experiences with them; I also needed them to experiment with radiator choices. The remaining fans will be NF-A12 PWM. No lighting, and a bizarre frame/rotor/blade color scheme, but I don't care: I've no interest in trading off fan performance and longevity for RGB. This was an easy decision.

Radiators... fitment rules the day here, along with loop routing. I wound up with a Corsair XR7 360 in the bottom of the case (intake), an EK PE360 in the side (intake), and an Alphacool ST30 X-Flow in the top of the case (exhaust). It's possible to go slightly thicker in any of these locations, but I didn't want to make access to the motherboard headers difficult. I wanted a cross-flow radiator in the top so I could keep nearly all of the plumbing out of the way.

There was also a cosmetic consideration, which you'll see when I get to the point where I can take some pictures. The Asus ROG Zenith II Extreme Alpha motherboard has a diagonal graphic scheme, with diagonals on the I/O shroud and a large, obvious diagonal split between the chrome chipset heatsink cover and the black anodized heatsink for the M.2 slots between the PCIe slots. There's also an OLED display on the I/O shroud that I don't want obscured.

My plumbing layout is designed mostly for function; I don't consider watercooling all that interesting to observe. I mean, think about it... would you highlight the water pump, radiator, and hoses in your car's engine compartment? It's not like this stuff is high-tech or bleeding edge. It's pump(s), radiator(s), fan(s), and plumbing. That's not to say it shouldn't look nice; it's just that minimalism and function should be the guiding principles for the watercooling. So the CPU block's output plumbing follows the diagonal of the motherboard up to the top rear of the case, where it enters the cross-flow radiator. This is also convenient because there's nothing there that needs to be accessed: no motherboard headers, etc. The input to the CPU block follows a roughly perpendicular path to the top front of the motherboard. There's no electrical connection I can't reach, and I can even replace my RAM (carefully), since I spaced the tubing far enough from the motherboard to allow it. All without interrupting the watercooling.

It's worth noting that I'm not looking to win any overclocking competitions. I want quiet, and obviously I want longevity. Nine radiator fans are only quiet if I'm able to run them at low speed, so more radiator surface area helps. But the critical reason for three radiators is simply that the AMD Threadripper 3960X has a TDP of 280W, and from the reviews I've seen thus far, the NVIDIA 3080 hits peaks of 450+W power consumption. This is the sad reality we're in at the moment: the fast parts we need as professionals are either prohibitively expensive (say, Quadro cards or dual Xeon Platinum), very power-hungry (AMD Threadripper, NVIDIA 3080/3090), or both (AMD EPYC). When I initially spec'd out what I wanted, the NVIDIA RTX 3000 series cards were still an unknown, but I knew they'd be big power consumers. What I didn't know was what the Founders Edition cooling setup would be, what the AIB cards would have, or what water blocks would be available. So I planned for enough radiator to handle the Threadripper 3960X, an NVIDIA 3000 series GPU (for deep learning training work), and a second GPU (maybe Big Navi, depending on price/performance/availability). Since I'm not a PC gamer, this is all about compute, where I might run training at full bore for very long periods. Less thermal throttling means faster training runs. Less noise matters too, since this computer will be a desktop in my home office. So... all the radiator I can fit conveniently, without getting in the way.
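To put rough numbers on that reasoning (every figure below is an assumption: the TDP and peak draws are from the reviews mentioned above, the second GPU is a placeholder, and ~100W dissipated per 120mm of radiator at low fan speeds is a common watercooling rule of thumb, not a measurement):

```shell
CPU_W=280                  # Threadripper 3960X TDP
GPU1_W=450                 # reported NVIDIA 3080 peak draw
GPU2_W=300                 # placeholder for a possible second GPU
LOAD=$((CPU_W + GPU1_W + GPU2_W))

RAD_120MM_SECTIONS=9       # three 360mm radiators
CAPACITY=$((RAD_120MM_SECTIONS * 100))   # ~100W per 120mm, quiet fans

echo "worst-case load ~${LOAD}W vs quiet-fan capacity ~${CAPACITY}W"
```

The arithmetic is tight at quiet fan speeds, and headroom only appears at higher (louder) speeds or with thicker radiators, which is exactly why I fit every radiator the case allows.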

I also had to consider radiator airflow. Since I planned all of this without having the case on hand, and without real data, I took a guess that a 54mm-thick intake radiator plus a 35mm-thick intake radiator would be reasonably balanced by a 30mm exhaust radiator and a 120mm rear exhaust fan. Top exhaust is always wise anyway, due to convection in the room where the computer will live. Each radiator's fan set is independently controllable via BIOS fan curves, as is the rear exhaust fan, so I should be able to get reasonably balanced airflow.

I want slight positive pressure from the intake radiators, which have easy-to-clean filters. I'm sure I'll have to fiddle to get it where I want it, especially given the open cutouts in the rear of the case (all of the PCIe slot covers are vented, the vertical GPU cover plate is vented, and the right side panel has 360mm of venting for the PSU and hard drive cages).
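A back-of-the-envelope version of that balancing act (the free-air CFM figure is an assumed round number; real flow through a radiator plus filter is far lower and changes with PWM duty cycle, so this only shows the direction of the bias, not the magnitude):

```shell
INTAKE_FANS=6              # two 360mm intake radiators
EXHAUST_FANS=4             # one 360mm exhaust radiator + rear fan
CFM_PER_FAN=60             # assumed free-air figure per 120mm fan

INTAKE=$((INTAKE_FANS * CFM_PER_FAN))
EXHAUST=$((EXHAUST_FANS * CFM_PER_FAN))
echo "intake ~${INTAKE} CFM vs exhaust ~${EXHAUST} CFM"
# Positive pressure holds while intake exceeds exhaust; in practice
# you trim the exhaust fan curves in the BIOS until the filtered
# intakes dominate and dust stops sneaking in through the vents.
```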

My brand preference for radiators: Hardware Labs. They happen to make the Corsair XR7 and XR5 radiators too, so those fall in my top preference. Next up is probably EK; they're easy to get, they fit in most cases, and their quality control seems good (even when their design isn't). Then Alphacool, Bitspower, etc. Don't go no-name here.

Watercooling pump... there are many options, but basically two prominent pump designs: D5 and DDC (though the D5 sort of has two variants). Pumps of either design from reputable companies are good choices. My preference is D5, due to my personal experience with reliability; my sample size is small, but it's what I'm most comfortable with in my own machines. I wanted one with a short reservoir, and the EK Quantum Kinetic TBE 200 D5 PWM fit the bill.

Tubing... I'm going with PETG for the moment. Sounds crazy to do hardline as temporary, but most of the cost is in fittings (the PETG tubing is cheap). Sacrificing some PETG is cheaper than sacrificing fittings, and I know I want hardline for the final build. So instead of buying soft line fittings and soft tubing that I knew I wouldn't keep, I bought PETG and will be using the fittings indefinitely. It's also worth noting that it's likely that I won't have to redo all of the tubing to swap GPUs; I won't be moving the pump, radiators or CPU. In the end I might go with copper, but it's a lot more work than PETG to make it look nice, unless I use a whole lot of fittings.

Fittings... I chose Bitspower, 16mm for the tubing. I like beefier fittings with knurled collars where they'll fit, just to make finger-tightening easier; sometimes I need to snug a fitting in a spot where the tightening tool for some fittings won't reach.

Coolant... solid-free only. Dyed is OK, but nothing with solids (no pastels). At a minimum, distilled water and a biocide (say Mayhem XT-1 Nuke). From there... it's up to you. I've not found big differences in solid-free coolants as long as they have biocide and ideally some corrosion inhibitors. Obviously the metals and plastics in your loop should guide you in making the right choice. I'm going with distilled water and Mayhem XT-1 Nuke.

Power supply... the saddest part of the story at the moment (October 3, 2020). I really wanted the Corsair AX1000 or the Seasonic PRIME TX-1000, but COVID-19 wreaked havoc on the supply chain for power supplies, and I was unable to find either at less than outrageous (scalper) prices. In some ways it's worse than the shortages we saw during the early days of crypto mining. For now I'm using a Seasonic PRIME TX-850, which should work just fine with a single GPU; it just won't be sufficient for two.

CPU... the Threadripper 3960X is the sweet spot for my needs for the foreseeable future. I needed 24 cores based on the builds I run on my 12-core Xeon machine in the basement (which has a much lower base clock, but shows benefits from parallelism all the way up to 24 threads/processes). I wanted PCIe 4.0 since it's the current standard and gives a big bump for NVMe (I'm sure it'll be superseded by PCIe 5.0 before I retire this machine). And I wanted enough PCIe lanes to run multiple x16 GPUs plus multiple PCIe NVMe drives.

Motherboard... what I really wanted: 10G ethernet, 8 RAM slots, a minimum of 2 PCIe NVMe slots, a PCIe x16 slot layout amenable to two GPUs (on water) plus another card (a 4x4 NVMe card, for example), a reasonable number of rear USB ports (Type-A and Type-C), audio, and Thunderbolt 3. Unfortunately, no TRX40 board offers all of these. The easiest to drop was Thunderbolt 3 (which I mostly like having for portable storage). Pricing for a high-end motherboard here is a bit odd, partly due to limited choices (presumably driven by the small market): there's a big jump from boards in the $400 range to $700-$850, with only a handful of choices at the moment.

GPU... story to be continued once RDNA 2 launches and/or RTX 3000 Founders Edition cards can actually be purchased.

Storage... I need PCIe NVMe. Minimum 1TB boot drive and 1TB data drive. My needs will grow here, but it's enough to be usable. A full Windows Pro development setup takes up a considerable amount of space, as does a full Linux development setup. The obvious choice is the Samsung 980 Pro, but it's not yet available. Second choice is Sabrent Rocket.


Some of the parts below will eventually be reallocated.

For a boot drive, I want a Samsung 980 Pro but they're not available yet. The Sabrent will become a data drive when the Samsung 980 Pro is available.

For a GPGPU, I am waiting until later in the year. The RX580 will likely be reassigned as a pass-through card for a macOS VM once I choose a GPU for deep learning. I don't want to buy anything significant here before AMD releases RDNA 2 cards, despite the fact that I fully expect to wind up with an NVIDIA solution (Ampere or not). RDNA 2 should put some pressure on NVIDIA to release a card that isn't hobbled RAM-wise like the current 3080, and will hopefully alleviate some of the Ampere availability issues.

For what it's worth... NVIDIA should be pooped on for the Ampere launch. We have no cards available (Founders Edition or AIB), and those who got AIB cards have had some showstopper stability issues. Some look to be capacitor issues on AIB cards, but there also appear to be some driver stability problems. These are things that would not have occurred if NVIDIA hadn't shorted the AIB vendors on information and driver updates, and had provided clear guidance on capacitor layout and materials. EVGA seems to have verified that the six-POSCAP layout is not optimal. Yes, POSCAPs are easier than an array of MLCCs. But no AIB wants to have saved pennies on QC only to spend tons on recalling cards, tossing out existing inventory, and redoing the design and layout to ship stable cards before the holiday season. Of course, any AIB maker that didn't find these issues on their own deserves some of the blame. And both NVIDIA and the AIB makers should be ashamed of the abysmal launch availability; even Intel doesn't launch with such paltry inventory. The end result: NVIDIA's image takes an unnecessary (but earned) hit.

Further... what's the story with the coolers? The NVIDIA cards have what appears to be a good air cooling solution on both the 3080 and 3090. Meanwhile, the AIB makers continue to produce cards with suboptimal cooling that recirculates hot air in the chassis, exhausts little to no air from the chassis, and is airflow-constrained (especially if adjacent PCIe slots are occupied). Good luck if you need more than one GPU; if you're pairing with EPYC or 3rd-gen Threadripper, you'll need a huge PSU and the means to remove a whole lot of heat via components that aren't included with an AIB card (chassis fans, radiators, water blocks, etc.). We've always had airflow problems in tower PCs for the GPU (or any PCIe card), and even NVIDIA didn't think that far out of the box with their cooling solution. Why don't any of the AIB cards have a similar solution? Did NVIDIA lock down some patents, or make it part of the licensing agreements? Or is it just that the AIB makers only target users who don't care about stability, noise, longevity, etc.? For earlier generations of cards, the recirculation wasn't a big issue. But I've seen multiple reports of the 3000 series cards peaking at 450 watts. That's a tremendous amount of heat to be dumping against the motherboard and recirculating in the case.

I think the most disappointing part of this launch, for me, is that NVIDIA hasn't done anything to close the gap we have to jump for common professional workloads, namely deep learning. The 3090 does a nice job of filling the gap for creative professionals; the Founders Edition card fills in nicely there. But we still have a big jump from there to a really good card for deep learning (namely training), CAD, etc. AMD is guilty of the same, of course, especially since they stopped production of the Radeon VII long before we saw a replacement (and I'd argue that outside of what they've done with Apple, they're way behind NVIDIA, and that there really isn't yet a replacement for the Radeon VII).

At the moment... Ampere feels overhyped. If you're really pessimistic... it feels like a pure money grab, one that failed due to poor availability and the AIB makers' stability problems. I've never been terribly fond of NVIDIA's software, but I can say that about AMD and most other makers outside of Intel; driver and OS issues tend to plague GPU launches from all camps. But when it feels like the AIB makers didn't have enough information and time to launch a stable product, and you can't buy any of the cards anywhere, and the one you do get has a high probability of proving unusable... you all suck. Really. We, as consumers, had literally zero need for the headaches and costs of a premature launch. As near as I can tell, we got an Ampere launch in September simply to be in front of AMD's RDNA 2 launch, and it was blown spectacularly. Professional users who need cards right now will buy what is available and works, which is the RTX 2080 or Titan, at higher cost and with a bitter taste in their mouths. Gamers will be salivating for months over cards they can't buy, and potentially dealing with instability if they manage to get a card before December 2020.

And unless AMD has a similarly bad availability problem in the first weeks of launch... they're likely going to eat NVIDIA's lunch at the 3080-and-below performance marks, which is most of the gaming market. If the AMD cards come with 16G or more of RAM, they will also pull in some of the professionals working from home (where the personal desktop fund doesn't support buying a Quadro or AMD Pro card): not most of the CAD and scientific users, but creative professionals and a small slice of deep learning professionals. Though sans Tensor cores in the hardware, it'll be a small group; and sans support in ROCm, it'll be almost no one working in machine learning. ROCm is still a work in progress, and still behind in general usability due to the lack of Navi support. But I don't see it going away, and the HPC world remains invested in AMD. It could fail, but it's in AMD's court and in their interest. They've done a nice job taking HEDT share from Intel with Threadripper 39xxX and EPYC on the CPU side. Despite the fact that I think I'll wind up with an NVIDIA solution this year, I want AMD to be competitive, and Intel too. I'm a hardware consumer, and I benefit from the competition. So I'm hoping for a smooth and interesting RDNA 2 launch regardless of what I need to buy for professional use.

Wish List by Vendor

Totals by Vendor

Parts Ordered

Below is the list of parts I've ordered or received. A few parts changed when moving from the wish list to the bought list. One was 'radiator 1', which changed from an Alphacool XT45 X-Flow to a Corsair XR7. That's largely because the Corsair is built by Hardware Labs, and (based on reviews) appears to be slightly better than Hardware Labs' own offering, with high marks for quality and cooling. And I wanted one thicker radiator to start, to see what I think will work for me in the Lian Li PC-O11D XL case.

My plan is a 360mm side radiator, a 360mm bottom radiator, and either no top radiator (just fans) or a slim top radiator. As a general rule (with exceptions), thicker radiators dissipate more heat. So I want to go as thick as I can without making motherboard access overly difficult or completely destroying the cosmetics. The PC-O11D is a showcase chassis; you see the guts in all their glory. I didn't buy it for the showcase aspects; I bought it for the right layout, the features I need, and the good build quality. I care more about the noise-blocking ability of tempered glass panels than their visual transparency, but it does mean I want a reasonably tidy interior.

Parts Ordered by Vendor

Totals by Vendor