BlueEye

How to build an inexpensive server for machine learning?

Ready-made ML servers are very expensive and, for home use, also very loud. We can use commercial clouds for ML computations, but this solution has its own disadvantages: we do not have full control over the system, we have to adapt to the solutions of a specific cloud provider, and there are the costs. Below I will show you how to build a very cheap server for around 2000 PLN ($500). The only more serious problem will be building a cooling system for the Tesla cards.

We need these parts:
  - Tesla K20X accelerator

My server specification:
  - Corsair TX 750W power supply
Assemble the server in sequence, following these steps:
  1. Assemble the server without installing the accelerators.
  2. Power on, enter the BIOS/UEFI, and set the integrated graphics card as the default. This is important because once the Tesla devices are installed, the motherboard would otherwise use them as the graphics card and you would not be able to connect a monitor (Tesla chips have no video outputs).
    Set the integrated graphics card as default
  3. Install the CUDA cards and a cooling system if the devices do not have active cooling. A few examples of how to do this are shown below:
  4. Install the OS and additional software. A minimal GPU detection check is sketched below.
Motherboard with two Tesla K20Xm cards installed
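
As a quick sanity check after step 4, a minimal CUDA device query such as the sketch below can confirm that the driver and both Tesla cards are visible to the system. The file name check_gpus.cu and the output format are arbitrary choices for illustration, not part of the original build guide; it assumes the CUDA toolkit has been installed.

```cuda
// check_gpus.cu - minimal sketch: list the CUDA devices visible after driver install.
// Build with: nvcc check_gpus.cu -o check_gpus
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        // A failure here usually means the driver or toolkit is not set up correctly.
        std::printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    std::printf("Detected %d CUDA device(s)\n", count);

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, compute capability %d.%d, %.1f GB global memory\n",
                    i, prop.name, prop.major, prop.minor,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```

With two K20Xm cards installed and the driver working, the program should report two devices. nvidia-smi gives a similar overview, including temperatures, which is useful when testing a home-built cooling setup.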