Any word on POW mining period?

o0ragman0o Member, Moderator Posts: 1,291 mod
Are there any clear estimates of how long POW mining will be running for?

I'm spreadsheeting GPUs for earning efficiency over time. Capital cost of the rig and power consumption obviously play a big part over a limited period, so for shorter terms a lower-power, lower-hashrate card may earn more than an expensive card.
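As a rough illustration of that trade-off, here is a minimal sketch; the hashrates, wattages and prices are made-up placeholders rather than benchmark figures, and the tariff is my local $0.22/kWh:

```python
# Quick sanity check of the short-vs-long-term trade-off, with made-up card figures.
# Tuples are (hashrate MH/s, watts, purchase price $); none of these are benchmarks.
cheap_card = (16, 60, 150)    # hypothetical low-power, low-hashrate card
big_card = (32, 250, 600)     # hypothetical expensive high-hashrate card

for days in (90, 180, 365):
    for mhs, watts, price in (cheap_card, big_card):
        total_gh = mhs / 1000 * 86400 * days                  # GH hashed over the period
        total_cost = price + watts / 1000 * 24 * days * 0.22  # card + power at $0.22/kWh
        print(days, mhs, round(total_gh / total_cost), "GH per $")
```

Which card comes out ahead shifts with the mining period and the electricity price, which is what I'm trying to expose in the spreadsheet.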

Comments

  • Genoil 0xeb9310b185455f863f526dab3d245809f6854b4d Member Posts: 769 ✭✭✭
    How do you measure each GPU's power draw? I think eth mining pulls less power from each card than full TDP.
  • quantumgravity Member Posts: 13
    Considering the AMD 390 series is coming out later this year, the depreciation on currently available GPUs will likely accelerate.
  • o0ragman0o Member, Moderator Posts: 1,291 mod
    @Genoil Without direct measurements I'm only going off the factory stats for each card. I read the other day that the GTX 750 Tis, which are rated at 60W, are actually firmware-limited to only 38.5W. So without knowing the actual power draw of each card, I can only really use the factory stats as a comparison for now.

    Would love to see some real numbers though...

    @quantumgravity, the R9 380 is already out, but only because it's using existing Hawaii silicon. So it isn't anything more than an incremental gain and still can't really touch a GTX 980, which benchmarked at 18.4 MH/s @ 165W; the R9 380 is spec'd at 220W.

    The one to really watch for is the launch of the R9 Nano, but whether it launches in time to get an ROI within the limited POW period remains to be seen. From what I've been able to figure, it may hash at ~44 MH/s for only 175W at maybe $500 off the shelf. Even this is hard to predict, given the new HBM technology sounds ideally suited to the memory-bandwidth-dependent Dagger/Hashimoto algorithm.

    For a domestic electricity cost of $0.22/kWh (Australia), it looks like the GTX Titans are taking it out if you have the money. If not, the next most efficient hashing cards per after-market price (read: second-hand eBay cards) for a mining period of between six months and a year are showing up as the HD 7950, R9 270/270X and GTX 750 Ti.

    In the end, the biggest deciding factors will be the price of ether on the exchange and how big the network gets as all those reanimated Scrypt GPUs come online. By current testnet figures, we'll need to exchange at over $0.35/eth, and that will need to rise considerably as total hashrate grows (a rough break-even sketch follows at the end of this comment).

    Presale eth went for 2000 eth/btc, or about $0.12/eth at current value. I've seen off-market offers of anywhere between $0.60/eth and $2.50/eth, so gawd knows what it will hit on launch. If there's a dump of presale ether, then $0.12 might be the floor and you'd be better off just buying off-market instead of mining.

    I think it'll be an exciting and pretty volatile birth....
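    Here's a rough sketch of that break-even calculation. All the rig and network figures are hypothetical placeholders rather than testnet numbers, and the issuance assumption (roughly 5 ETH per ~15 second block) is mine:

    ```python
    # Break-even ether price: the $/ETH at which a day of mining covers a day of power.
    # Every input here is a hypothetical placeholder; capital cost is ignored.
    def breakeven_eth_price(rig_mhs, rig_watts, tariff, network_ghs, network_eth_per_day):
        share = (rig_mhs / 1000) / network_ghs         # rig's fraction of network hashrate
        eth_per_day = share * network_eth_per_day      # ETH this rig mines per day
        power_cost = rig_watts / 1000 * 24 * tariff    # dollars of electricity per day
        return power_cost / eth_per_day

    # e.g. a 100 MH/s, 600 W rig at $0.22/kWh on a hypothetical 1,000 GH/s network,
    # assuming roughly 5 ETH issued every ~15 s (about 28,800 ETH/day):
    print(round(breakeven_eth_price(100, 600, 0.22, 1000, 28800), 2))
    ```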
  • quantumgravity Member Posts: 13
    Actually the 3xx series doesn't look to have any improvement in memory bandwidth over the 2xx series, so the performance difference will be negligible.

    Also, I would say that the fixed cost will be negligible for a short POW period, since the GPUs can be re-purposed or resold at or near the purchase price.

    The lower-power cards (e.g. the 7xxx series) have a correspondingly lower hashrate based on benchmarks. The higher-power cards (e.g. the 2xx series) have a better hashrate per watt and a better price per MH/s, so they will be more profitable. This is going from data in the benchmark thread.

    But if you are planning to mine based on how many dollars you will earn, then you can't really know for sure at all. I've been using the price of ETC (~$1) to approximate the future value of ETH, though I'm sure it will be pumped as soon as it is on the exchanges.
  • quantumgravity Member Posts: 13
    edited June 2015
    @o0ragman0o You got your post in right before mine, LOL. The R9 Nano looks interesting. It will give about the same hashrate as two 280Xs and save about $25 per month with electricity at $0.10 per kWh. So, at that electricity rate it will pay for the difference in price in about three months. The Nano might actually be the best bet.
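    Checking that arithmetic (the ~250W per 280X is my assumption, not a figure from this thread):

    ```python
    # Monthly power saving of one R9 Nano (~175 W) vs two 280Xs (~250 W each, assumed).
    watts_saved = 2 * 250 - 175                 # ~325 W less draw for a similar hashrate
    kwh_per_month = watts_saved / 1000 * 24 * 30
    print(round(kwh_per_month * 0.10, 2))       # ~$23.40/month at $0.10/kWh, roughly the $25 quoted
    ```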
  • quantumgravity Member Posts: 13
    Also, I don't plan on selling any of my mined ETH, so the exchange rate doesn't really matter for me.
  • o0ragman0o Member, Moderator Posts: 1,291 mod
    OK, spreadsheet is up here.

    Make a copy, have a play. Main parameters are electricity cost and the number of days to mine.
    Then there is a GPU table with hashrate, watts, and price of each GPU.

    There are two charts: one shows the curves of each card over the mining period, the other integrates the mining over the period. The curves are calculated as GH.Days / cost. They change depending on the mining period and electricity costs, and show that some cards are better for short-term mining and others better in the long term (see the sketch at the end of this comment).

    The left-hand column is for comparing subsets of GPUs: just blank out the cell of the GPU you don't want and filter out blanks. Hashrates and prices in italics are speculative values.

    Overall it gives a good sense of how much you should be paying for different cards.


    There is also a second sheet with a simple farm earnings calculator.
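    For anyone who wants to reproduce the curves outside the sheet, here is a minimal sketch of the GH.Day/cost metric as I read it (total gigahashes produced over the period divided by total cost); the card figures in the example call are placeholders:

    ```python
    # GH.Day/cost curve for one card: cumulative work divided by cumulative cost
    # (card price + electricity) at each day of the mining period.
    def gh_day_per_dollar_curve(hashrate_mhs, watts, card_cost, tariff, max_days=365):
        gh_per_day = hashrate_mhs / 1000 * 86400          # GH hashed per day
        cost_per_day = watts / 1000 * 24 * tariff         # dollars of power per day
        return [gh_per_day * d / (card_cost + cost_per_day * d)
                for d in range(1, max_days + 1)]

    curve = gh_day_per_dollar_curve(16, 60, 150, 0.22)    # hypothetical 750 Ti-class card
    print(curve[29], curve[179], curve[-1])               # day 30, day 180, day 365
    ```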
  • Michael_A London Member Posts: 61
    @o0ragman0o
    How did you manage to estimate the hashrate and the price for the AMD Radeon R9 Nano and AMD Radeon R9 Fury? They will only be available this summer.
    On another note, what do you think about the Radeon R9 295X2?
  • quantumgravity Member Posts: 13
    That is an excellent and very useful source of information. Thank you @o0ragman0o
  • jzen Member Posts: 49
    edited June 2015
    It would be interesting to see the data analyzed in terms of total target hashrates: for example, what would be the cost efficiency of achieving 100, 200, 300 MH/s and so on using multiple cards of each type...
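    A minimal sketch of that idea, assuming one card type per farm; the card figures in the example are placeholders, not benchmark data:

    ```python
    # Cost of hitting a target farm hashrate with N identical cards over a mining period.
    import math

    def cost_for_target(target_mhs, card_mhs, card_price, card_watts, tariff, days):
        n = math.ceil(target_mhs / card_mhs)                    # cards needed to hit the target
        capital = n * card_price
        power = n * card_watts / 1000 * 24 * days * tariff
        return n, round(capital + power, 2)

    for target in (100, 200, 300):                              # MH/s targets
        print(target, cost_for_target(target, 16, 150, 60, 0.22, 180))
    ```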
  • o0ragman0o Member, Moderator Posts: 1,291 mod
    @Michael_A, it's a guesstimate based on AMD's claim that the Nano will be 2x more powerful per watt than the R9 290X, so it's basically the known 290X hashrate adjusted to the Nano's TDP.

    Here's an article also analyzing the Nano from what is already known
  • o0ragman0o Member, Moderator Posts: 1,291 mod
    @jzen, in the more complex spreadsheet this one was derived from, I have another column showing a 'target' price as well as a market price for each card. A hash.day/cost threshold can be entered, and the target price for each card is calculated and compared with the market price. You can then see what kind of 'bargain' you're getting. If the target price ends up more than the market price, then you know to avoid those cards. (A rough sketch follows at the end of this comment.)

    Even though it seems informative I didn't include that column because I couldn't work out a sensible way to choose a threshold.

    I've just realised also that the 'days' range in the table was not calculating according to input... fixed now.
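    A rough sketch of how such a target-price column could work, taking 'target price' to mean the card price at which the chosen hash.day/cost threshold is exactly met; the threshold and card figures below are arbitrary, which is exactly the difficulty with choosing a sensible threshold:

    ```python
    # Card price at which (GH/s x days) / (card price + electricity) equals the threshold.
    def target_price(hashrate_mhs, watts, tariff, days, threshold_ghday_per_dollar):
        gh_days = hashrate_mhs / 1000 * days               # hashrate in GH/s times days mined
        power_cost = watts / 1000 * 24 * days * tariff     # electricity over the period
        return gh_days / threshold_ghday_per_dollar - power_cost

    # e.g. a ~16 MH/s, ~60 W card at $0.22/kWh over 180 days with an arbitrary threshold,
    # to be compared against the card's market price:
    print(round(target_price(16, 60, 0.22, 180, 0.01), 2))
    ```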
  • o0ragman0o Member, Moderator Posts: 1,291 mod
    @Michael_A, as for the R9 295X2, it's obviously a heavy-hashing card, but at the current price and power usage it always ends up in the middle of the pack.

    Given that some of the capital cost can be regained by selling GPUs after POW, it might be worth playing with the price minus the expected resale value of the card at some point in the future. However, in a year or so the Fiji GPUs will pretty much dominate, and power-hungry cards like the 295X2 will have a greatly depreciated market value.

    Note that the most significant factor in the curves is the electricity price, and at the end of the day you want the most ETH for the least $. If you have free power, the curves are straight, so you beg, borrow or steal the highest-hashing card you can find. If on the other hand you have free cards, or manage to recoup their cost once sold, then the lines are completely flat and it becomes purely a hashrate-per-watt equation.

    If, like me, you're in Australia paying $0.22/kWh with no existing rig, then cards like the older HD 7950 and R9 270 always seem to come out on top because of the cheaper capital outlay...

    All this is heavily dependent on how long POW will run. Stephen has said there is no scheduled timeline to POS Serenity, so I'm guessing at a ballpark figure of a year of POW.
  • Genoil 0xeb9310b185455f863f526dab3d245809f6854b4d Member Posts: 769 ✭✭✭
    edited June 2015
    I wonder how these figures hold up against cloud mining. I've now finally managed to do some real olympic mining @ 44MH/s on Amazon EC2. I'm currently paying $0.50/hr (spot instances), including a few hundred GB of storage.

    Let me try, correct me if I'm wrong:

    If I enter the data into your sheet, it turns out that it will only be more profitable for about 40 days: $0.50/hr equals about 2,300W continuous, but with $0 initial cost.
  • o0ragman0o Member, Moderator Posts: 1,291 mod
    edited June 2015
    OK, set your GPU cost to $0 and the kWh cost to $0.50. You should get a flat line in the graph for your cloud-miner card and see something like 1800 GH.Day/$. Taken at face value, that looks outstandingly profitable.

    It really needs to be worked out purely in hashes per dollar, though: 44 MH/s x 3600 s / $0.50 = ~317 GH/$. That turns out not to be that competitive when a 750 Ti is over 2000 GH/$.
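    The same arithmetic spelled out; the implied wattage assumes the $0.22/kWh tariff used elsewhere in this thread:

    ```python
    # EC2 spot instance: work delivered per dollar of running cost, with no capital outlay.
    mhs, dollars_per_hour = 44.0, 0.50
    print(round(mhs / 1000 * 3600 / dollars_per_hour))   # ~317 GH per dollar
    print(round(dollars_per_hour / 0.22, 2))             # ~2.27 kW: what $0.50/hr buys at $0.22/kWh
    ```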
  • o0ragman0o Member, Moderator Posts: 1,291 mod
    @Genoil, I've had to redo the sheet a bit. I've included new columns, including one for daily running cost, which is normally calculated as tariff x watts but can be set to a static value for something like EC2.

    I've dropped the area-under-curve graph as it was misleading, and replaced it with a graph based on the number of days of POW mining speculated by the user. The line graph now spans 720 days.

    Each point along the line graph includes the amortisation of the total cost of the card over that number of days. Where there is no card cost, the slope and curvature are zero, so the line is flat.
  • Michael_A London Member Posts: 61
    edited June 2015
    @o0ragman0o Thank you for your advice; your latest spreadsheet will be very useful, especially when Frontier becomes official.
    I just can't wait to get my hands on the new Fiji GPUs. I will also test Amazon EC2; it might be worth using as a plan B... Salutations to @terzim for his amazing guide. Bellissimo ;)
    @Genoil 44 MH/s sounds pretty good! Are you using your special CUDA miner or OpenCL?
  • Genoil 0xeb9310b185455f863f526dab3d245809f6854b4d Member Posts: 769 ✭✭✭
    edited June 2015
    @Michael_A my CUDA miner of course :). But it sounds better than it really is. Mind you, that's 4 GPUs hashing at only 11 MH/s each. The GRID K520s Amazon uses aren't really made for heavy GPGPU tasks, but rather for 3D acceleration on remote virtual desktops. It's basically the server version of the GTX 680. If only I could get access to a Tesla K80 for a few hours, just to benchmark it...