For the record, per the "Cleaning Up This Forum" sticky post, I have spent more than 30 minutes searching this forum, and I have yet to find an answer to this particular question.
I want to run my 1060 6GB along with two new 1070s.
I am thinking of running a 650W Gold unit, but I also like the idea of getting a 1000W.
I could possibly add 3 more GPUs down the road for a total of 6, "if" I COI the cost of the two new GPUs and the PSU. Saying that tends to make people automatically reply that the 1000W is the better option, since there's "hope" that later down the road I'll add those 3 GPUs.
In truth, I have no clue whether I'll actually add 3 more (6 total), and even if I knew 100% that I would, I'd be looking at 6-12 months before I even COI'd.
From research, I understand that a PSU doesn't reach its peak efficiency until it's running at around 50%+ load, and I figure I wouldn't hit 50% of a 1000W unit running 3 GPUs total. I undervolt my 1060, and I plan to do the same with the 1070s.
Would the savings from running a lower-wattage PSU at 50%+ load over those 6-12 months be greater than the extra cost of a totally overpowered PSU that would run well under 50% load for that period?
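To make my question concrete, here's the back-of-envelope math I'm doing. Every number in it is an assumption on my part: the undervolted card draws, the system overhead, the Gold efficiency points at each load level, and the electricity price are all illustrative guesses, not measured values.

```python
# All figures below are assumptions for illustration, not measurements.
GPU_DRAW_W = {"1060": 100, "1070 a": 130, "1070 b": 130}  # assumed undervolted draws
SYSTEM_OVERHEAD_W = 60                # assumed CPU/motherboard/fans
PRICE_PER_KWH = 0.12                  # assumed electricity price in $/kWh

dc_load = sum(GPU_DRAW_W.values()) + SYSTEM_OVERHEAD_W  # DC-side load in watts

# Assumed 80 Plus Gold efficiency: a bit higher near 50% load than well below it.
for psu_w, eff in [(650, 0.90), (1000, 0.87)]:
    load_pct = dc_load / psu_w * 100
    wall_w = dc_load / eff            # AC draw at the wall
    kwh_per_month = wall_w * 24 * 30 / 1000
    cost = kwh_per_month * PRICE_PER_KWH
    print(f"{psu_w}W PSU: {load_pct:.0f}% load, {wall_w:.0f}W at the wall, "
          f"~${cost:.2f}/month")
```

With those guesses the 650W unit runs near its efficiency sweet spot and the 1000W unit doesn't, and the difference works out to only a dollar or two a month, so the real question is whether that gap times 6-12 months outweighs the price difference between the two PSUs.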