Optimal hashrate vs. power tradeoff (+ data for 6 x RX470 under Ubuntu)

magickmagick Member Posts: 67
I finally got around to sticking a power-monitoring plug on a rig with 6 x Sapphire Nitro+ RX470 8GB on an ASRock H81 Pro BTC motherboard, Celeron G1840, 2GB RAM, ADATA 120GB SSD, running Ubuntu 16.04.1 with the amdgpu-pro 16.40 driver. All the cards have a vBIOS using the 1750MHz timing strap, a 2150MHz memory clock, and a 1100MHz GPU clock in BIOS, though I initially ran them at 1265MHz. With those settings and an aggressive fan control script, the rig produced a stable 165Mh/s, or 27.5Mh/s per card, the best I could manage without compromising stability. That weighs in at 970W continuous load, or 5.89W/Mh/s, on a Corsair AX1200i PSU.
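For the curious, here's roughly what the fan control amounts to. This isn't my exact script, just a minimal sketch of the sysfs approach against amdgpu's hwmon interface; the card index, temperature threshold, and PWM curve are all illustrative, and older drivers (amdgpu-pro 16.40 included) may expose things differently, so check the paths on your own box first.

```python
#!/usr/bin/env python3
# Minimal fan-curve sketch for amdgpu's hwmon sysfs interface.
# Run as root. Card index, threshold, and curve are illustrative.
import glob
import time

HWMON = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

def read_temp_c():
    with open(HWMON + "/temp1_input") as f:
        return int(f.read()) / 1000  # reported in millidegrees C

def set_pwm(value):
    with open(HWMON + "/pwm1_enable", "w") as f:
        f.write("1")  # 1 = manual fan control
    with open(HWMON + "/pwm1", "w") as f:
        f.write(str(value))  # 0-255

while True:
    t = read_temp_c()
    # Aggressive curve: full fan at 70C and above, steep ramp below.
    set_pwm(255 if t >= 70 else int(100 + (t / 70) * 155))
    time.sleep(5)
```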

With the power monitor installed and everything else as above, I dropped the GPU clocks back to 1100MHz, which yields 157Mh/s for 907W at the wall (5.77W/Mh/s). That hurt the hashrate a bit too much, so I also measured at 1155MHz, getting 161Mh/s for 933W at the wall (5.79W/Mh/s). Then I plugged the numbers into the calculator on whattomine.com. The result confirms (no great surprise here) that chasing the highest achievable hashrate and footing the power bill accordingly doesn't necessarily make much sense.
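If you'd rather drop clocks from software than reflash: on mainline amdgpu kernels you can pin a card to one of its DPM sclk states through sysfs. I can't promise amdgpu-pro 16.40 exposes the same files, so treat this as a hypothetical illustration and look at what pp_dpm_sclk lists on your own system before writing anything.

```python
# Pin card0 to a chosen sclk DPM state (run as root). The state
# index below is illustrative; read pp_dpm_sclk first to see what
# states your vBIOS actually offers.
CARD = "/sys/class/drm/card0/device"

with open(CARD + "/power_dpm_force_performance_level", "w") as f:
    f.write("manual")

# pp_dpm_sclk lists entries like "0: 300Mhz ... 7: 1265Mhz *";
# write the index of the state you want to force.
with open(CARD + "/pp_dpm_sclk", "w") as f:
    f.write("3")  # illustrative index
```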

My incremental power cost works out to $0.135 per kWh. At that rate, mining Ethereum, the daily net returns (at time of writing) would be $2.83 for 157Mh/s after power costs of $2.96, $2.88 for 161Mh/s after power costs of $3.05, or $2.93 for 165Mh/s after power costs of $3.14. Not exactly high finance; if one were looking to produce the maximal amount of Ether, it's a difference of $0.29 a day gross between the highest and lowest figures, but only $0.11 of that is additional profit. So: 3.88% extra profit for 6.1% extra energy cost. For Ethereum Classic, the daily net figures (again at time of writing) are $3.15, $3.22 and $3.28 respectively, with hashrates and power costs as above. The numbers tell a similar story: it seems just about worth going above 1100MHz, but not by any great margin. Running at 1155MHz produces ~98.2% of the net output I get at 1265MHz, with less noise and less heat. Which is "better" will of course depend on how much we care about efficiency versus output, and what we pay for power.
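If you want to re-run the comparison with your own electricity price, the arithmetic is just daily revenue minus wall power cost. A quick sketch, backing the revenue rate out of my 165Mh/s row; expect a few cents' drift versus the figures above, since those came through whattomine's own rounding:

```python
# Marginal profit vs marginal power cost for the three clock settings.
# REV_PER_MHS is implied by the post's 165Mh/s figures and will not
# match current network conditions; swap in your own numbers.
KWH_PRICE = 0.135                   # $/kWh at the wall
REV_PER_MHS = (2.93 + 3.14) / 165   # $/day per Mh/s (net + power cost, per Mh/s)

for mhs, watts in [(157, 907), (161, 933), (165, 970)]:
    power_cost = watts / 1000 * 24 * KWH_PRICE   # $/day
    net = mhs * REV_PER_MHS - power_cost
    print("%d Mh/s: power $%.2f/day, net $%.2f/day" % (mhs, power_cost, net))
```

Swap in your own KWH_PRICE and revenue rate and the break-even clock more or less falls out.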

Thought I'd share current numbers for what must be a fairly typical rig in case they're of use to anyone, whether scoping out a build or fiddling with BIOS settings. If anyone has equivalent Windows figures for comparison (or better Linux figures!), do chime in; it would be interesting to know what sort of difference that makes.
