Cost to run a PC 24/7 per month to host a game server?

JordanStyp

n00b
Joined
Feb 6, 2022
Messages
21
I wasn't sure which section to post this in, so I thought this was a good spot. I'm currently running an MSI B450 Tomahawk Max board with a Ryzen 5 5600X, 3600 MHz G.Skill RAM, a Radeon 6600 XT, a Sabrent 1TB M.2 SSD, and a Lepton 500 W 80 Plus Gold PSU. I'm thinking about using this system to host an Ark server, and I'm curious whether anyone has hosted a server on similar specs, and how much the electricity cost per month. I'm also curious whether it ran well and whether there were any issues. The server would be for about 20 players, with mods. Any advice would help a lot. TIA!
 
I don't think it's possible for us to estimate the 24-hour average load that game server causes, and hence where the power consumption falls between idle and fully busy.

You should buy a power meter.

You didn't say where you are, but in the US a device running 24/7 will usually cost you roughly $1 per year for each watt it draws.
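That rule of thumb falls straight out of the hours in a year; here's a quick sketch (the ~$0.115/kWh rate is an assumed US-average figure, not from the thread):

```python
# One watt drawn continuously for a year:
hours_per_year = 24 * 365                       # 8,760 hours
kwh_per_watt_year = 1 * hours_per_year / 1000   # 8.76 kWh

# At an assumed US-average rate of ~$0.115/kWh:
cost_per_watt_year = kwh_per_watt_year * 0.115
print(f"${cost_per_watt_year:.2f}")             # roughly $1 per watt-year
```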
 
My efficient 14,000 BTU window unit costs about $30/month to run in the summer; it used 231.8 kWh in September at about 10¢/kWh.

So, if your PC draws an average of 300 W around the clock, and you pay 10¢/kWh, you can expect a similar cost. Note that doesn't factor in fuel charges and the like, which vary by region and company, or discounts/surcharges for (not) running during peak hours. Check with your local energy provider for your rates, and check your bill for your current usage.
 
Figure ~0.2 kW consumption × 24 hrs × 30 days ≈ 144 kWh.
I pay 16¢/kWh, so that's ~$23.04/month.

Adjust the usage (measured with a watt meter) and the kWh cost (from your utility bill; make sure to use the final figure including fees/tax) to match your situation.
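The same arithmetic as a small helper, so you can swap in your own measured draw and rate (the 200 W / 16¢ figures just mirror the example above):

```python
def monthly_cost(avg_watts: float, dollars_per_kwh: float, days: int = 30) -> float:
    """Cost of a device drawing avg_watts continuously for `days` days."""
    kwh = avg_watts / 1000 * 24 * days
    return kwh * dollars_per_kwh

# ~0.2 kW at $0.16/kWh over a 30-day month:
print(f"${monthly_cost(200, 0.16):.2f}")  # $23.04
```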
 
Here's some points of comparison for you from equipment I've owned and tested:

10850K @ 5 GHz, 3080 Ti GPU, 64 GB DDR4-3600 (4×16 GB), WD SN750 1TB SSD, Samsung 850 EVO, Corsair dual-140mm AIO water cooler, EVGA 850 W PSU
164 watts idle
227 watts performing backup
592 watts gaming (Spider-Man: Miles Morales)

i7-7700K, 32 GB DDR4-3000, 500 GB Samsung 970 EVO, dual monitors connected, air cooler
32 watts idle (11/24/2022)
88 watts performing backup
120 watts max boot draw

Home network closet, all equipment (2/21/2022):
73 watts per the Kill A Watt meter
82 watts per the UPS readout
Dell T20 w/ 3 VMs, Asus AP, Huawei Wi-Fi modem, 5-port switch, 8-port PoE switch, 1 PoE camera
$70.34 per year

Dell 9020 Micro, i5-4590T quad core (no HT), 12 GB RAM, 256 GB SSD, Server 2022 + Hyper-V, 1 VM
15 watts idle
31 watts max during boot; 35 max observed during Windows updates.
----
Determine how many watts are spent in an hour. Divide the number of watts by 1000 to convert the number from watts to kilowatts. 60 watts divided by 1000 results in 0.06 kilowatts.
Multiply the number of kilowatts by the number of hours the device will be used to find kilowatt hours. If the device will be used for three hours, multiply 0.06kW by three to produce 0.18kWh. If a 60W light bulb is left on for three hours, it will use 0.18kWh of energy.

Galactica Server example:
34 watts / 1000 = 0.034 kilowatts
0.034 kW * 24 hours of operation = 0.816 kWh per day
0.816 kWh * $0.11 per kWh = $0.09 cost per day, about $32.76 per year at 24/7 operation (8,760 hours in a year)

43 watts / 1000 = 0.043 kW
0.043 kW * 24 hours = 1.032 kWh per day
At $0.125 per kWh, that's about 13 cents per day,
$47.09 per year.
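Both examples can be checked with the same per-year formula (the $0.125/kWh in the second line is the rate implied by the $47.09 total, not one stated in the post):

```python
def yearly_cost(watts: float, dollars_per_kwh: float) -> float:
    """Cost of running a load 24/7 for the 8,760 hours in a year."""
    return watts / 1000 * 8760 * dollars_per_kwh

print(f"${yearly_cost(34, 0.11):.2f}")   # $32.76
print(f"${yearly_cost(43, 0.125):.2f}")  # roughly $47
```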

Edited to correct.
 
This was extremely helpful! Ty! I just need to do some math now 😂
 
Determine how many watts are spent in an hour. If you have a 60W light bulb, for every hour the bulb is turned on, it uses 60W of power.
Nice and helpful post - thanks! It seems to have given me an irresistible urge to be super-nitpicky though, so here goes (sorry):

Power (watts) is an instantaneous quantity. It's not spent per hour or minute or anything like that; the bulb draws 60 W of power regardless of the number of hours it's turned on. That's why one has to multiply by hours later to get energy (Wh). So just figure out how much power the device draws, and use that figure in the rest of the calculations.

(Then, even more nitpicky, 34 watts = 0.034 kW. While 34 watts / 1000 = 34 mW. ;))
 
Edited for clarification.
 