Posted on Tue 22 January 2019
It’s 2019. How much RAM do you need?
For a desktop or laptop, it’s fairly easy: everyone needs 8GB to get general work done. 16GB is suitable for people who are doing heftier work, and people who need 32GB or more always have a specific reason that they can articulate and justify.
Servers are different. A server can be scaled vertically (made or purchased in a heftier configuration) up to a point, but you will nearly always have a reason to scale horizontally as well (more machines of the same calibre) – high availability, disaster recovery, or workloads that are cost-effective to parallelize across machines.
A “house” server might play many roles at once: DNS and DHCP, mail, SSH gateway, web server, application server, database server, media archive, backup storage… such a box should have room to grow in whatever direction later turns out to be necessary, or be cheap enough that the tasks can easily be distributed among many of them. Some people like to play virtualization games, buying one or two hefty physical machines and spreading the workload among a handful (or dozens) of virtual machines. Virtualization eats up memory, though it no longer incurs the CPU overhead it once did.
If you gravitate to many small servers, it’s reasonable to supply them with relatively small amounts of RAM: 2GB will easily handle DNS, DHCP, routing, firewalling, NTP and other low-intensity services – all at once, even, if you are careful. A media server might want 4GB to improve caching, but not much more. Database services are comparatively RAM-hungry, but it all goes to performance improvement – if your demands are low, you can get by with less. In particular, if you can run your database off SSD, I can’t think of a home workload that would need much more RAM than it takes to avoid swapping.
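“Enough RAM to avoid swapping” is easy to check empirically. Here is a small sketch (my own helper names, and it assumes a Linux-style /proc/meminfo) that reports available memory and how much swap is actually occupied – a small server that consistently shows zero swap in use is sized fine:

```python
#!/usr/bin/env python3
"""Report memory headroom and swap use from /proc/meminfo (Linux).
Helper names are illustrative, not from any particular tool."""

def parse_meminfo(text):
    """Parse /proc/meminfo-style text into a dict of values in kB."""
    fields = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        parts = rest.split()
        if parts:
            fields[key] = int(parts[0])  # values are reported in kB
    return fields

def swap_in_use_kb(fields):
    """Swap currently occupied, in kB (0 if the box has no swap)."""
    return fields.get("SwapTotal", 0) - fields.get("SwapFree", 0)

if __name__ == "__main__":
    try:
        with open("/proc/meminfo") as f:
            info = parse_meminfo(f.read())
    except FileNotFoundError:
        print("no /proc/meminfo on this system")
    else:
        avail = info.get("MemAvailable", info.get("MemFree", 0))
        print(f"available: {avail // 1024} MB")
        print(f"swap in use: {swap_in_use_kb(info) // 1024} MB")
```

If swap use is nonzero and climbing under your normal workload, that box wants more RAM; if it stays at zero with headroom to spare, it doesn’t.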
In a business context, throw all of these rules of thumb out. Do a proper evaluation of what you actually intend to run, and then get extra RAM anyway. Even if it all goes to extra caching, it won’t go to waste.
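A first pass at that kind of evaluation can be as simple as tallying resident memory per service on a machine already running the workload. A sketch (my own function names; it assumes `ps -eo comm=,rss=` output, and note that RSS double-counts shared pages, so treat the totals as an upper bound):

```python
#!/usr/bin/env python3
"""Tally resident memory per command from `ps -eo comm=,rss=` output.
A rough upper bound: RSS double-counts pages shared between processes."""

import subprocess
from collections import defaultdict

def tally_rss(ps_lines):
    """Sum RSS (kB) per command name from `ps -eo comm=,rss=` lines."""
    totals = defaultdict(int)
    for line in ps_lines:
        parts = line.split()
        # last field must be the numeric RSS; everything before is the name
        if len(parts) >= 2 and parts[-1].isdigit():
            totals[" ".join(parts[:-1])] += int(parts[-1])
    return dict(totals)

if __name__ == "__main__":
    try:
        out = subprocess.run(["ps", "-eo", "comm=,rss="],
                             capture_output=True, text=True).stdout
    except FileNotFoundError:
        print("no ps on this system")
    else:
        top = sorted(tally_rss(out.splitlines()).items(),
                     key=lambda kv: kv[1], reverse=True)[:10]
        for name, kb in top:
            print(f"{kb // 1024:6d} MB  {name}")
```

Run it on a representative box, add up what you actually intend to deploy, and then – per the advice above – buy more than that.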