CPUs
The first piece of the puzzle in a build-it-yourself (BIY) machine learning server is the CPU. The CPU dictates which motherboards are compatible, and the motherboard you select must have enough PCIe slots to support the GPUs. From there, picking the remaining components becomes easier.
When it comes to machine learning, more cores, more threads, and more L3 cache are always better; however, the budget drives the decision. The two giants of the industry are Intel and AMD, and the main choices are the Intel Xeon Platinum series, AMD EPYC, and AMD Threadripper.
The third-generation Xeon Platinum processors are the latest and greatest from Intel, with 32-40 cores, 48-60 MB of L3 cache, and 64-80 threads, depending on the processor. Along with that comes a hefty price tag. These CPUs are ideal for high-performance computing, cloud, AI, and IoT workloads. The high-end, mid-range, and low-end options in the series are the following:
- Xeon 8380: 40 cores + 80 threads + 60 MB L3 at $8,090
- Xeon 8351N: 36 cores + 72 threads + 54 MB L3 at $3,027
- Xeon 8358P: 32 cores + 64 threads + 48 MB L3 at $3,950
The AMD EPYC is designed for the data center. It's the ideal processor for cloud computing, AI, high-performance computing, and other intense workloads. The high-end, mid-range, and low-end versions are as follows:
- EPYC 7763: 64 cores + 128 threads + 256 MB L3 at $7,890
- EPYC 7513: 32 cores + 64 threads + 128 MB L3 at $2,840
- EPYC 7313: 16 cores + 32 threads + 128 MB L3 at $1,083
The AMD Threadripper and Threadripper PRO series target high-end desktops and workstations rather than the data center. A range of options, from high end to low end, is as follows:
- PRO 3995WX: 64 cores + 128 threads + 256 MB L3 at $6,102
- PRO 3955WX: 16 cores + 32 threads + 64 MB L3 at $1,146
- 3990X: 64 cores + 128 threads + 256 MB L3 at $5,099
- 2970WX: 24 cores + 48 threads + 64 MB L3 at $668
- 1900X: 8 cores + 16 threads + 16 MB L3 at $193
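Since the budget drives the decision, one quick way to compare the options above is cost per core and cost per MB of L3 cache. A minimal sketch in Python, using only the core counts, cache sizes, and list prices quoted above (the ranking it prints is illustrative, not a recommendation):

```python
# Value comparison of the CPUs listed above: (model, cores, threads, L3 in MB, price in USD).
cpus = [
    ("Xeon 8380",  40,  80,  60, 8090),
    ("Xeon 8351N", 36,  72,  54, 3027),
    ("Xeon 8358P", 32,  64,  48, 3950),
    ("EPYC 7763",  64, 128, 256, 7890),
    ("EPYC 7513",  32,  64, 128, 2840),
    ("EPYC 7313",  16,  32, 128, 1083),
    ("PRO 3995WX", 64, 128, 256, 6102),
    ("PRO 3955WX", 16,  32,  64, 1146),
    ("3990X",      64, 128, 256, 5099),
    ("2970WX",     24,  48,  64,  668),
    ("1900X",       8,  16,  16,  193),
]

# Sort by dollars per core, cheapest first, and print both value metrics.
for model, cores, threads, l3, price in sorted(cpus, key=lambda c: c[4] / c[1]):
    print(f"{model:<12} ${price / cores:>7.2f}/core  ${price / l3:>7.2f}/MB L3")
```

By dollars per core, the budget Threadrippers lead the table, while the top Xeon Platinum parts carry the largest premium; cost per MB of L3 tells a similar story in EPYC's favor thanks to its 128-256 MB caches.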