First Build + Workstation/server card benchmark

Cpt_Cudlz Member Posts: 21
Well, I've done it. I've seen a few posts running around about it, but no real info to date. So - in the name of science, of course - I picked up a few FirePro S10000 12 GB cards and I've started mining with one of them. Why only one? Because I'm still waiting for FedEx to bring me my beefier PSU that can handle all of them. For those of you who have wondered, here's what I'm running and what I've found so far:
ASUS M5A97, AMD chipset with FX-4350 CPU.
16GB RAM
120 GB SSD
GT 710 for a simple energy-efficient VGA output (no video output direct from the motherboard)
Windows 10 Professional, but I might get set up on Ubuntu. Then again, my already limited Linux skills are pretty rusty.
Currently running Claymore's Dual Miner, since I'll be running both AMD and Nvidia cards; a sample start line is below.
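For anyone curious, a minimal ETH-only start line for Claymore's Dual Miner looks roughly like this (pool, wallet, and GPU indices are placeholders; check the miner's readme for the full flag list):
EthDcrMiner64.exe -epool <pool_host:port> -ewal <your_ETH_address>.rig1 -epsw x -mode 1 -di 01
(-mode 1 skips the dual coin; -di 01 would point it at GPUs 0 and 1, i.e. both halves of one S10000.)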
The motherboard recognizes the two Tahiti cores of each S10000 as individual GPUs. That's probably normal, but this is my first build, so how would I know? I'm just hoping my CPU can handle enough PCIe lanes, etc. to manage all five S10000s that I have awaiting installation and tuning, in addition to the five 1050 Tis that are also on the way. Not to mention, I've yet to test all of my risers, splitters, etc.
For now, I just have a couple of 92mm fans, salvaged from the case my brother-in-law had for the motherboard, pointed right at the S10000 I've installed... which is sitting right on the itty-bitty motherboard while I wait for the rest of my components to finish the build. This thing is monstrous, both in size and power.

So, the max TDP of the S10000 is rated at 375 W, though I'm finding mixed information on whether that's per core or for the whole card. I really, REALLY hope it's the whole card. I'm currently running an EVGA 850BQ PSU that I got to fire up and check out the motherboard, since the motherboard was a random Christmas gift/hand-me-down from my brother-in-law. That said, I haven't really been able to push the card that far; it already set off the over-wattage protection on the PSU a couple of times during benchmarking, along with a couple of random display issues. I'm not sure if my motherboard is trying to switch the display output from the 710 to the S10000 as the latter starts pulling some power, or what. Still troubleshooting that.

As for hashing... HOLY COW. Each core can pull about 20 MH/s, at least until it crashes the PSU. So far, I've managed one stable benchmark at ~50% intensity, and each core averages 11.4 MH/s at that intensity. I know, I know: the jump from 50% to 100% intensity isn't necessarily linear, so that won't simply scale from 22 MH/s to ~40 MH/s. However, AwesomeMiner approximates this configuration as "Dual 20.4 MH/s," so I'll have to see how that pans out given adequate power to the cards and some fine tuning in Afterburner.

Now to the rest of the build:
All of the above, quite obviously
5x FirePro S10000
5x GTX 1050 Ti
Dual PSUs: the aforementioned 850BQ for the motherboard, three of the 1050 Tis, and the fans, plus a 2350 W (supposedly) monstrosity I came across on Wish for dirt cheap (likewise with the 1050 Tis; I hope to God they're the real deal)
10x 40mm fans in a custom open-air frame I'll be constructing myself, and I might find a use yet for the 92mm fans I salvaged from the old tower case. I like the idea of each GPU having a fan to itself, but I'm expecting some diminishing returns on CFM airflow to cooling efficiency.
All in, given some tuning, I should be looking at about 250 MH/s and around 2,000 W of power draw (rough math below). It'll take some trial and error to optimize the power-draw-to-hashing-power ratio, but I think I'm headed in the right direction so far.
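Back-of-the-envelope, assuming each S10000 really does ~40 MH/s and each 1050 Ti lands around the usual ~12 MH/s ballpark:
5 x 40 MH/s = 200 MH/s (S10000s)
5 x 12 MH/s = 60 MH/s (1050 Tis)
Total ~ 260 MH/s
Power at stock limits would be 5 x 375 W + 5 x 75 W ~ 2,250 W, so landing near 2,000 W at the wall assumes a decent undervolt on the FirePros.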

And in case anyone was wondering how I ever afforded five S10000s: they're not $3,500 anymore. You can find 'em on eBay for about $300. Far less expensive than the Vega and 1080 series cards in terms of upfront cost, but at nearly double the power draw. Theoretically, though, they should offer an extra 10-ish MH/s over their gaming-oriented counterparts. Whether that's worth it, I've yet to see. Anyway, thanks for reading, and I'll keep you posted on build progress and final benchmarks as I finish up.

Cheers,

Cuddles

Comments

  • BUTUZ Member Posts: 139 ✭✭
    Thanks for experimenting for us! I would be very interested to know how the S10000 performs in Zcash, if you could try it?
  • Cpt_Cudlz Member Posts: 21
    Update #1! The max TDP of 375 W is for the whole unit, so that's a plus. BUT, these cards play nicest with Windows 8.1. There are Windows 10 drivers, but they're still controlled by the old Catalyst Control Center, which offers fairly little in terms of OC'ing. Sapphire Trixx works, but only allows up to +100 MHz on the core clock and memory clock. It does NOT support voltage control, so I'm on the hunt for a compatible BIOS update for the Tahiti cores, and I might play around with flashing to another VBIOS from a more recent card with similar lithography, transistor count, bus performance, and other specs - namely, anything that is compatible with Crimson, so I can switch the workload to compute. That's probably a pipe dream, but I've unbricked flashed cards before, so it's worth a try or two. I've still gotta get that Equihash benchmark for BUTUZ, but I'm thinking I'll hold off until my big PSU gets here, so I can push the card without crashing the PSU.
  • Cpt_Cudlz Member Posts: 21
    Update #2! The Equihash benchmark is in at 535 Sol/s. I also figured out the crashing issue: the PSU is just fine for the one card. The damn thing overheats, and that's what shuts everything down. Cooling options for the Tahiti cores are available all over, since the same core is shared across so many Radeon cards. Being that I'm insane, though, I'm going to run the craziest liquid cooling loop I've ever done. Again, I've scavenged for dirt-cheap parts. I haven't really said it explicitly, but there is heavy emphasis on making this a budget rig with high profitability. Anyway, I know other Radeon cards with the Tahiti core have been used to mine in the past; as to why the passive cooling is such a killer in my case, I don't know. I guess the active heatsinks really do make that big of a difference. Even with the added expense of the liquid cooling, I'm still in on the S10000s for under $350 apiece, with hashing power that still rivals cards that cost two or three times as much, depending on the algo and miner. My big remaining concern is seeing how all these cheap Chinese parts are going to hold up under the pressure of mining.
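
    For anyone who wants to reproduce an Equihash run on AMD, a typical start line for Claymore's Zcash AMD miner looks roughly like this (pool and wallet are placeholders, and your miner of choice may differ, so treat this as a sketch):
    ZecMiner64.exe -zpool <pool_host:port> -zwal <your_ZEC_address>.rig1 -zpsw x -i 4
    (-i sets intensity; dropping it a notch is an easy way to keep a passively cooled card from cooking itself.)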
  • progfoxx Member Posts: 5
    Keep us posted on the cooling! You are not running 5 cards currently, only 1?
  • Cpt_Cudlz Member Posts: 21
    progfoxx said:

    Keep us posted on the cooling! You are not running 5 cards currently, only 1?

    That's correct. Just doing some testing and benchmarking on the one to start out. I still have to finish building my frame, and I'm waiting on the shipping for a more powerful PSU that will support all 5 cards. I found some compatible liquid heatsinks, a pump, and a pretty big radiator, but again, I can't finish the build until everything gets here. The thing with some of these individual components is that they're pretty much available only manufacturer-direct, but since I'm not a supplier, I can't exactly call up Corsair, or whoever, to ask for only ten of their heatsinks. So, I've resorted to ordering direct from China, which is a pain because sometimes the quality isn't very good, and I have to wait for shipping. I've had pretty good luck so far, though, so I'm not terribly concerned. I just wish I had patience to match my luck.
  • progfoxx Member Posts: 5
    Cpt_Cudlz said:

    progfoxx said:

    Keep us posted on the cooling! You are not running 5 cards currently, only 1?

    That's correct. Just doing some testing and benchmarking on the one to start out. I still have to finish building my frame, and I'm waiting on the shipping for a more powerful PSU that will support all 5 cards. I found some compatible liquid heatsinks, a pump, and a pretty big radiator, but again, I can't finish the build until everything gets here. The thing with some of these individual components is that they're pretty much available only manufacturer-direct, but since I'm not a supplier, I can't exactly call up Corsair, or whoever, to ask for only ten of their heatsinks. So, I've resorted to ordering direct from China, which is a pain because sometimes the quality isn't very good, and I have to wait for shipping. I've had pretty good luck so far, though, so I'm not terribly concerned. I just wish I had patience to match my luck.
    Have you tested multiple S10000 cards, and received a consistent hashrate?
  • Cpt_Cudlz Member Posts: 21
    progfoxx said:

    Cpt_Cudlz said:

    progfoxx said:

    Keep us posted on the cooling! You are not running 5 cards currently, only 1?

    That's correct. Just doing some testing and benchmarking on the one to start out. I still have to finish building my frame, and I'm waiting on the shipping for a more powerful PSU that will support all 5 cards. I found some compatible liquid heatsinks, a pump, and a pretty big radiator, but again, I can't finish the build until everything gets here. The thing with some of these individual components is that they're pretty much available only manufacturer-direct, but since I'm not a supplier, I can't exactly call up Corsair, or whoever, to ask for only ten of their heatsinks. So, I've resorted to ordering direct from China, which is a pain because sometimes the quality isn't very good, and I have to wait for shipping. I've had pretty good luck so far, though, so I'm not terribly concerned. I just wish I had patience to match my luck.
    Have you tested multiple S10000 cards, and received a consistent hashrate?
    Not just yet, since I'm still waiting on risers, splitters, and other things of that nature. In the meantime, I'm hunting for a BIOS from other Tahiti-core cards that are compatible with the new Radeon Crimson control system. I'm thinking I might have to run a Linux environment to allow so many cards on my system, but setting the workload to "compute" in Crimson ReLive usually allows Windows to support more than the 8-card limit.
  • progfoxx Member Posts: 5
    Cpt_Cudlz said:


    Not just yet, since I'm still waiting on risers, splitters, and other things of that nature. In the meantime, I'm hunting for a BIOS from other Tahiti-core cards that are compatible with the new Radeon Crimson control system. I'm thinking I might have to run a Linux environment to allow so many cards on my system, but setting the workload to "compute" in Crimson ReLive usually allows Windows to support more than the 8-card limit.

    I've got one on the way! I'd be interested to see how 5+ cards are handled in Windows. 8.1 is fine with me (as your findings suggest there may be driver compatibility issues in Win10). Any possibility of testing Monero? My card isn't expected until the 17th.
  • Procrastitator Member Posts: 16
    This is really cool! Have you tried using a bios editing program to change the voltage and reduce power consumption?

    This might be some help: https://anorak.tech/t/anoraks-amd-vbios-hex-modification-tutorial/126/4
  • Cpt_Cudlz Member Posts: 21

    This is really cool! Have you tried using a bios editing program to change the voltage and reduce power consumption?

    This might be some help: https://anorak.tech/t/anoraks-amd-vbios-hex-modification-tutorial/126/4

    Short version: Yes, and it failed... spectacularly.

    Long version: I tried a full flash to the VBIOSes of various Tahiti-core cards, only to run into several BSOD issues in Windows 10. Essentially, the driver didn't like the card's new BIOS. Ultimately, I ran into system-wide catastrophic failure of the OS through the many attempts at flashing and editing the registry, both manually and with DDU's cleanup feature, as I tried to run the different VBIOSes and their corresponding drivers. In the end, I had to do a fresh install, AND I came to find out that it shouldn't be a huge deal, as the latest Windows 10 Creators Update supposedly supports more cards per single driver than previous builds, with or without AMD's fancy new compute workload.

    The S10000 Passive (soon to be water-cooled), though, is still recognized as two separate Sky 900s at each PCIe slot, which may cause some problems later. Then again, it may not. I guess that will depend on whether each card gets its own instance of the driver, and whether this latest update will support as many as 10 cards on the single driver, plus the five 1050 Tis on their driver. Absolute worst case, I just run my rig off of two mobos. Second-to-worst case, I run Ubuntu and use some Linux magic to run the executable GUI miners (I like the profit-switching options in MinerGate, though I really don't care for their pools, and AwesomeMiner has better pool options).

    The good news is that I haven't totally bricked one of my S10000s, and even if I had, I have four more from which I can back up the BIOS and flash it to the broken one. Better news: the AMD FirePro Control Center will let me adjust voltages and clock speeds without any other utilities, which keeps my OS environment nice and light, with no bogged-down RAM or storage space issues. The tradeoff, though, is that it seems I can only adjust those settings when I am outputting video from the card, which I expect to eat into computing power that would be better devoted to mining. I still have to check whether the settings hold after I'm no longer using the card for my display output, since that will be the GT 710's job. Initially, AMD's auto-detect installed the classic Catalyst Control Center, so the FirePro Control Center may yet allow me to adjust settings for cards that are not actively outputting the display. In the meantime, I'll look into that nifty tutorial you linked to. Thanks for the info.
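
    If anyone wants to try the same flashing dance, the usual backup/flash cycle with ATIFlash from an elevated command prompt goes roughly like this (adapter indices and filenames are just examples; a dual-GPU card shows up as two adapters, so back up both):
    atiflash -i                      (list adapters; the S10000 shows up as two)
    atiflash -s 0 master_orig.rom    (back up adapter 0's BIOS)
    atiflash -s 1 slave_orig.rom     (back up adapter 1's BIOS)
    atiflash -p 0 modded.rom         (program adapter 0; -f forces past ID checks, at your own risk)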
  • progfoxx Member Posts: 5
    edited January 2018
    @Cpt_Cudlz Interesting findings. Since you can adjust the voltages/ clock speeds through the FirePro Control Center -- is it even relevant to pursue modifying the BIOS to do so?

    I'd be curious to see these cards used in a virtualized environment -- which could potentially eliminate that bottleneck of having only a certain number of cards (in theory) by enabling GPU pass-through to VMs. Not sure if that would kill performance -- but it is indeed worth a shot, considering these cards are VDI-compliant and used heavily in environments that assign GPUs to virtual machines.
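
    On a Linux/KVM host, the passthrough setup is roughly: enable the IOMMU and hand the GPU to vfio-pci before the host driver grabs it. Something like this on Ubuntu (the device ID is a placeholder; exact steps vary by distro and board):
    # add amd_iommu=on iommu=pt to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, then:
    echo "options vfio-pci ids=1002:xxxx" | sudo tee /etc/modprobe.d/vfio.conf
    sudo update-grub && sudo update-initramfs -u && sudo reboot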
  • Procrastitator Member Posts: 16
    Cpt_Cudlz said:

    This is really cool! Have you tried using a bios editing program to change the voltage and reduce power consumption?

    This might be some help: https://anorak.tech/t/anoraks-amd-vbios-hex-modification-tutorial/126/4

    Short version: Yes, and it failed... spectacularly.

    Long version: I tried a full flash to the VBIOSes of various Tahiti-core cards, only to run into several BSOD issues in Windows 10. Essentially, the driver didn't like the card's new BIOS. Ultimately, I ran into system-wide catastrophic failure of the OS through the many attempts at flashing and editing the registry, both manually and with DDU's cleanup feature, as I tried to run the different VBIOSes and their corresponding drivers. In the end, I had to do a fresh install, AND I came to find out that it shouldn't be a huge deal, as the latest Windows 10 Creators Update supposedly supports more cards per single driver than previous builds, with or without AMD's fancy new compute workload. The S10000 Passive (soon to be water-cooled), though, is still recognized as two separate Sky 900s at each PCIe slot, which may cause some problems later. Then again, it may not. I guess that will depend on whether each card gets its own instance of the driver, and whether this latest update will support as many as 10 cards on the single driver, plus the five 1050 Tis on their driver. Absolute worst case, I just run my rig off of two mobos. Second-to-worst case, I run Ubuntu and use some Linux magic to run the executable GUI miners (I like the profit-switching options in MinerGate, though I really don't care for their pools, and AwesomeMiner has better pool options). The good news is that I haven't totally bricked one of my S10000s, and even if I had, I have four more from which I can back up the BIOS and flash it to the broken one. Better news: the AMD FirePro Control Center will let me adjust voltages and clock speeds without any other utilities, which keeps my OS environment nice and light, with no bogged-down RAM or storage space issues. The tradeoff, though, is that it seems I can only adjust those settings when I am outputting video from the card, which I expect to eat into computing power that would be better devoted to mining. I still have to check whether the settings hold after I'm no longer using the card for my display output, since that will be the GT 710's job. Initially, AMD's auto-detect installed the classic Catalyst Control Center, so the FirePro Control Center may yet allow me to adjust settings for cards that are not actively outputting the display. In the meantime, I'll look into that nifty tutorial you linked to. Thanks for the info.
    I know for RX 480 cards you have to install some driver patch to get them working with a modded bios. Perhaps the driver patch for RX 480 cards would work here.

    This link has the instructions for the process, seems pretty easy: https://www.monitortests.com/forum/Thread-AMD-ATI-Pixel-Clock-Patcher
  • Procrastitator Member Posts: 16
    You've inspired me to buy one to experiment with. Hopefully, between the two of us, we can turn these into low(er)-power mining beasts!
  • Cpt_Cudlz Member Posts: 21
    progfoxx said:

    @Cpt_Cudlz Interesting findings. Since you can adjust the voltages/ clock speeds through the FirePro Control Center -- is it even relevant to pursue modifying the BIOS to do so?

    I'd be curious to see these cards used in a virtualized environment -- which could potentially eliminate that bottleneck of having only a certain number of cards (in theory) by enabling GPU pass-through to VMs. Not sure if that would kill performance -- but it is indeed worth a shot, considering these cards are VDI-compliant and used heavily in environments that assign GPUs to virtual machines.

    @progfoxx To the extent that it would allow compatibility with more recent driver versions and AMD's newest control suite, it might still be worth modding the BIOS. The Tahiti cores can also handle a great deal more clock speed than the S10000's native BIOS allows, with some of the HD 7900 series cards allowing OC speeds up to 1500 MHz core/1700 MHz VRAM. One of the special edition 7990s is purported to boast a 6000 MHz effective memory clock across six 1000 MHz GDDR5 units. My guess is that there is something in the BIOS that allocates memory to the stream processors managed by the core in a way that allows each VRAM chip to function independently in that manner.

    Another interesting thing I've found is conflicting information about which variant of the Tahiti core is present on the S10000's PCB. Most databases report it as having the Tahiti PRO or PRO GL variant, whereas all of my exploration in the BIOS says that it has the Orthrus variant. How Orthrus stacks up to its counterparts is unclear, since most databases only report information for consumer cards, and Orthrus seems to be unique to the FirePro workstation cards. Another shot in the dark: the variant was codenamed Orthrus, after the mythical two-headed dog, for the dual-core setup on the PCB. Two heads, two cores, each of which has its own BIOS designating the master/slave relationship between the two. This was part of the many problems I ran into: I only backed up the master BIOS the first time I went to flash. When I flashed back to manufacturer original, I had to back up the proper master and slave BIOSes from one of the other cards.

    The bottom line is: some of the later Tahiti-core Radeon HD series cards are supported in Crimson ReLive, and the FirePro cards are not. Some of the gains blockchain'ers are seeing from switching their Radeon cards' workloads from graphics to compute aren't negligible. They're not huge, but they might be worth it at the TDP of the S10000. There's also the added plus of having absolute certainty that running 10+ cards won't crash Windows.
  • rmh Member Posts: 410 ✭✭✭
    I don't get why you're experimenting with server equipment on Windows.
    The Linux fglrx driver has no problem with a modded BIOS on Tahiti cards, just saying.
  • Cpt_Cudlz Member Posts: 21
    rmh said:

    I don't get why you're experimenting with server equipment on Windows.
    The Linux fglrx driver has no problem with a modded BIOS on Tahiti cards, just saying.

    Well, if you recall, I mentioned my "rusty, already limited skills" in that area. I wouldn't have known that, but thanks for the info. I'll have to give that a shot.
  • Procrastitator Member Posts: 16
    Have you checked how many watts it pulls while mining? Earlier you mentioned it tripped the PSU's over-wattage protection, but later you said it was actually just overheating. Since mining is mostly memory-intensive, I'm curious whether it's using all 375 watts.
  • Cpt_Cudlz Member Posts: 21
    @Procrastitator I have a multimeter running around here somewhere, though I'd have to look up which pins to check. The max TDP of the rig technically exceeds the wattage limit of most wall power monitors, but setting the voltages down ought to reduce the load to a level that wouldn't trip one. That said, I need to actually buy a wall power meter.

    In other news, I've just about got everything I need to start the final stage of the build, but man, that shipping time from Shenzhen is a real pain. The wait has been killing me. While I have set out to build on a budget, I'm also getting down to the last of the money I'd set aside for this. Between waiting on delivery for some of the components and running my budget dry, it might be a minute before I have all of my final benchmarking done and get this thing off the ground.

    Right now, I'm relearning old stuff and learning a bunch of new stuff in Ubuntu, playing around with optimizing the rig as a dual-boot system and comparing Ubuntu to Windows 10. In particular, the Windows drivers and FirePro Control Center have all kinds of quirks that make setting voltages and clock speeds basically impossible without having the S10000 set as my actual display output. What's more, since the mobo recognizes each core of each individual card as a unique device, I can only ever adjust the settings for one of the cores from Windows, and I never know which one that is. Presumably, it's the master core. I'll keep you up to date as I get things wrapped up. I really hope that's soon, because I've got a ton on my plate right now.
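
    As a rough rule of thumb, dynamic power scales with frequency times voltage squared, so even a modest undervolt pays off. A made-up example, just to show the scale:
    (0.95 V / 1.05 V)^2 ≈ 0.82, i.e. roughly 18% less core power at the same clock
    On a 375 W card that would be on the order of 50-70 W, and less in practice, since memory and board losses don't scale the same way.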
  • Juggar Member Posts: 11
    edited January 2018
    I saw these cards many months ago and decided they were not worth it at all.

    They are passive and require at least 20 CFM of airflow; if you ran them just as they were, then no wonder they shut down. Cooling will be an ordeal.

    They are GCN 1.0 and will be crippled for Ethash (ETH mining). The amount of memory does not matter; the memory controller itself is the issue with these GCN 1.0 cards, and you will never see 20 MH/s per GPU now. That 11 figure you quoted seems right per GPU. And I don't have to tell you that a card that costs more than a GTX 1060 (25 MH/s) and hashes less (22 MH/s) while using over 4 times the power is a terrible deal.
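
    To put numbers on that (assuming a tuned 1060 pulls roughly 90 W at the wall; your figures may differ):
    GTX 1060: 25 MH/s / ~90 W ≈ 0.28 MH/s per watt
    S10000:   22 MH/s / 375 W ≈ 0.06 MH/s per watt
    Call it 4-5x worse efficiency on Ethash, before you even account for the cooling overhead.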

    The S10000 6GB = 2 x 3GB Radeon 7950 GPUs. When's the last time you heard of anyone mining ETH with a Radeon 7950? Years ago.

    Back in June 2017, 11 MH/s was the norm for a 7950. They have not been viable for ETH mining for a LONG time. Even the RX cards were feeling it, but AMD released mining drivers to fix the RX cards. GCN 1.0 cards could not be helped. So you are getting the correct speed (11 MH/s per GPU).

    https://www.reddit.com/r/EtherMining/comments/6inw91/low_hashrate_on_hd_7950_112_mhs/


    That's an OK Equihash benchmark, nearly GTX 1080 power, but again, it's using more than double the power, with all the added BS of cooling it (and the cost associated with that). If I had to mine with these, that's what I would do. But you won't ROI any faster than someone buying a 3GB GTX 1060, so what's the point, really... And come summertime you will have some seriously hot cards putting out tons of heat. A GTX 1070 Ti for $500 is a better buy in the long run, I believe.

    I can't believe you didn't know GCN 1.0 cards were crippled before you bought these. Now you've got to figure out a cooling solution and buy MASSIVE power supplies that WILL cost an arm and a leg. Not to mention all the BS you've been through software-wise.

    Why didn't you just buy passive S9150s??????? You would have been so much better off, and they are not going to be crippled for ETH mining the way these will be. Well, very slightly impacted (the 512-bit bus saves them), but not by much before PoS. With proper cooling you could see nearly 30 MH/s on an S9150, and they were recently $200-$250 each on eBay.

    If you didn't know, S9150 = R9 290X (GCN 2.0). R9 290Xs are still good for ETH mining, even if they are a bit power-hungry. Much better value than a one-trick pony (GCN 1.0).

    My friend almost bought these (I already have enough cards), but you could have gotten them for $1000 (they were taking best offers):

    https://www.ebay.com/itm/263408440073?ul_noapp=true


    No one here should EVER, and I mean EVER, be buying S10000s for mining. But hey man, I was always curious whether someone would ever actually mine with these, so thanks for taking one for the team.

  • Procrastitator Member Posts: 16
    I found 3 S10000s with 3-fan cooling systems for $310 each. It's really just for fun, though; I used RX 580s in my serious builds. Pulling 20 MH/s per core (40 MH/s per card) isn't too shabby imo.
  • Juggar Member Posts: 11
    edited January 2018

    I found 3 S10000s with 3-fan cooling systems for $310 each. It's really just for fun, though; I used RX 580s in my serious builds. Pulling 20 MH/s per core (40 MH/s per card) isn't too shabby imo.

    If you READ his post, he never actually did. He was getting 11 MH/s... which is exactly what you'd expect a 7950 to do, and these cards have the EXACT same chip.

    I'm telling you people that these are fucking crippled. What's wrong with you???!!!!
  • Juggar Member Posts: 11
    edited January 2018

    Simply put, nothing about the S10000 makes it acceptable for mining Eth.
  • Procrastitator Member Posts: 16
    edited January 2018
    Juggar said:

    I found 3 S10000s with 3-fan cooling systems for $310 each. It's really just for fun, though; I used RX 580s in my serious builds. Pulling 20 MH/s per core (40 MH/s per card) isn't too shabby imo.

    If you READ his post, he never actually did. He was getting 11 MH/s... which is exactly what you'd expect a 7950 to do, and these cards have the EXACT same chip.

    I'm telling you people that these are fucking crippled. What's wrong with you???!!!!
    Chill out and work on your reading comprehension skills. @Cpt_Cudlz clearly stated in his original post that he is getting 20 MH/s per core.
    Cpt_Cudlz said:

    Each core can pull about 20 MH/s, at least until it crashes the PSU.

  • Juggar Member Posts: 11
    edited January 2018

    Juggar said:

    I found 3 S10000s with 3-fan cooling systems for $310 each. It's really just for fun, though; I used RX 580s in my serious builds. Pulling 20 MH/s per core (40 MH/s per card) isn't too shabby imo.

    If you READ his post, he never actually did. He was getting 11 MH/s... which is exactly what you'd expect a 7950 to do, and these cards have the EXACT same chip.

    I'm telling you people that these are fucking crippled. What's wrong with you???!!!!
    Chill out and work on your reading comprehension skills. @Cpt_Cudlz clearly stated in his original post that he is getting 20 MH/s per core.
    Cpt_Cudlz said:

    Each core can pull about 20 MH/s, at least until it crashes the PSU.

    Dude, he was talking about a benchmark...

    I promise you are in for a rough ride. No one is getting 20 fucking MH/s on a 7950 now... not on our current DAG. Load up Claymore's miner, OP, and actually start a real mining run. When it loads up the current DAG, you'll see what I'm talking about.

    If you benchmarked at the initial DAG, then yes, it would show 20. But that is simply not the case anymore.

    You need to return/cancel whatever you bought. I'm not fucking trolling; I don't want you to lose money, man.

    Look up the S10000 and look at the cores and their architecture. Now look at a 7950. They're the same damn thing. Now look up Ethereum mining for 7000 series cards. You'll notice that results from the last year show the DAG has rendered the cards crippled and only good for Zcash mining at this point.

    OP is clearly a novice who knows just enough to be dangerous to himself (and others). There is some seriously bad info here, and it's already affected YOU.
  • Cpt_Cudlz Member Posts: 21
    Like @Procrastitator said, this is just getting started and mostly for fun. With some voltage adjustments that actually take, I can profitably mine Zcash for now and focus on gathering resources for a more serious build. Less the cost of power, an extra $800/month is nothing to scoff at, especially considering how fuckin' broke I am - I'm really fuckin' broke. That's not accounting for how the market will change with respect to Zcash, but that's how the game is played, I guess. In any case, I have enthusiasm and a preexisting skill set that I can put to use, and if that means resorting to lower-cost, higher-power cards until I get my shit going, then so be it. I got these for their high(er) VRAM bandwidth and their relatively low base price. The cost of energy will, as @Juggar pointed out, probably be a bit much. You're also very right to assert that I probably know just enough to get myself into trouble. The present goal, then, is to learn enough and do enough to get myself back out of trouble.

    Now, the latest Ubuntu drivers do support a compute workload for the S10000. This allows for some significant modification of their overall function. By knocking down the core voltages, I should be able to mitigate some of that power cost, among other parameter adjustments I can make to optimize their performance. The problem remains, though, that these cards I bought think they're Sky 900s. The Sky 900 is another dual-Tahiti card, but it doesn't make any damn sense considering all of the product and package labeling. The stickers, boxes, and S/Ns all point to the 12 GB FirePro S10000 Passive. I do have a water-cooling setup in the works, as well. The power consumption of the water cooling won't be but another $2/month, so even if I get my money back and replace these cards, which is looking like a really good possibility about now, I can still run the water cooler to really push whatever cards I end up with and not worry about cooling.

    To clarify, I'm getting closer to 11 MH/s per core at a tolerable (i.e., not overheating like all hell) load. How far can I push them? Can't say just yet. Will mining Ethereum be profitable? Maybe not as much as some other algos/coins, and that's fine by me. I can go post to the other boards specific to those coins and chat about it all there. I might well end up taking the advice from @Juggar and getting my money back, maybe buying a couple of Vega 56s or 64s for the money. Just two of them would achieve about 80% of the hashing power at 1/3 the power consumption.

    The fact is, I'm wading into some murky water and uncharted territory here, at least for me. My presence on the forum is as much about calling on the experience of more seasoned miners as it is a learning experience for me as an enthusiast and general tech junkie. Given that these cards may not be what they were advertised to be, I definitely have a legitimate case to pursue against the eBay seller who sold them to me and get my money back. All of this said, I'm still doing as much as I can while I wait for a bunch of damn packages to come in from freakin' Shenzhen/wherever else overseas. So, I get what you all are getting at here, and while it comes across as harsh in some cases, I do appreciate your concern and advice. I'm a let-the-chips-fall-where-they-may kind of guy, so that's how I'll move forward. At the very least, I have something that will work for now, and I don't stand to lose all that much. I'm in at about $2k for absolutely everything, so a two-month turnaround to break even is fine by me. Would I get that back as quickly mining Ethereum? No, but it is what it is, and I'm going to figure this out, because that's what I do. I'm a problem solver, and any advice you guys have for me will be given due consideration.
  • Juggar Member Posts: 11
    edited January 2018
    @Cpt_Cudlz Mine Zcash and pray the market recovers. Proper cooling will be key here. The dual-core Tahiti boards (aka 7990, S10000, Sky 900) are some of the most, if not the most, power-hungry, hot things AMD has ever made. I won't blame you if you return them for being mislabeled.

    In 2017, I found a seller (on eBay) selling a 10-pack of Sky 900 cards for $1000. What a deal, right? Even now, mining Zcash with that would be great. Ultimately, the seller backed out and claimed they were "out of stock", then turned around and listed them for 3x the cost.

    Before I made the commitment to buy, I did a lot of research, and that's how I've come to know about these cards.

    In my opinion, if you want to buy things like this, the S9150s were a better deal, but I'm sure they are jacked up in price as well now.

    If you bought these from China over AliExpress or something, I would not count on them taking a return, and even if they do, I'm sure they'll have you pay the return shipping. I just bought some CPUs (under $100 each) from an Alibaba seller in Shenzhen that should mine $3 a day. Well, that was before the crash, but it will rebound. The way I see it, you get what you get from the Chinese sellers without much protection.
  • Cpt_Cudlz Member Posts: 21
    @Juggar Fortunately, my seller is domestic, but technically doesn't take returns. If he sold me the wrong cards, though, I can pursue formal action through eBay buyer protection. The 1050 Tis were off of Wish, so similar to Alibaba, etc. I think I'll mine the Zcash, but do it on a BTC-payout pool like MiningPoolHub or something. Since I'll have so many cards running, I can always multi-mine, too, until the market stabilizes a little bit, and have my hand in a few different pots. So, if I can get my money back and find a few 1060s or something, awesome. If not, I can mine, cash out, buy new cards, and repeat until I have a better all-around rig.
  • Cpt_Cudlz Member Posts: 21
    UPDATE! Windows is dumb. For these particular cards, however, Ubuntu is dumber still. So, it turns out that my cards are what they say they are; AMD tech support has confirmed this for me. I have the frame up, the liquid loop running, and everything running smoothly so far. I tried firing this up in Ubuntu, but the latest kernel doesn't support the fglrx drivers, which are needed for optimal performance with these cards. Conversely, the most recent compatible kernel version for the drivers doesn't like to run the miners. So, I was totally screwed there. Soooo... I snagged an open-box ASUS Z270-E for dirt cheap at my local Micro Center, and I'm running four of the cards off of that and the other one off of the old AM3+ board I have.

    For Ether, I can't - just absolutely CANNOT - get above 25 MH/s per card stable. It's out of the question. But for Equihash, these things blow even the Vega 64 away, by about 20%, which might not really be worth the extra ~100% power consumption, but I'm fixing to get my house all solarized this year, so maybe I can swing it. I'm just praying the market recovers soon from all this recent mess, so I can actually turn some real profit. I'll take what I can get for now, though.

    In other news, still waiting on those 1050 Tis from friggin' China, but I have more than enough room on my motherboards for them. Also, I did find some pretty damn stable 1-to-3 and 1-to-4 PCIe splitters, if anyone was wondering how those do. In theory, I can hook up as many cards as there are PCIe lanes on my CPUs (16 for my Intel chipset; not really sure about the AM3+ board I have). I'll try to get some pictures up soon of the crazy liquid loop I'm running. As a plus, the heat off the radiator is helping to warm my house. I doubt I'll make up the electricity costs in heating, but like I said, I'll take what I can get.
  • rmh Member Posts: 410 ✭✭✭
    Cpt_Cudlz said:

    I tried firing this up in Ubuntu, but the latest kernel doesn't support the fglrx drivers, which are needed for optimal performance with these cards. Conversely, the most recent compatible kernel version for the drivers doesn't like to run the miners. So, I was totally screwed there.

    Ubuntu 14.04.2 works out of the box with fglrx. ("apt-get install fglrx-updates" on a fresh install)
    http://old-releases.ubuntu.com/releases/14.04.2/ubuntu-14.04.2-server-amd64.iso
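
    From a fresh 14.04.2 install the whole sequence is roughly this (from memory, so double-check against aticonfig's help; the --initial step writes an xorg.conf entry for every adapter):
    sudo apt-get update
    sudo apt-get install fglrx-updates
    sudo aticonfig --initial --adapter=all
    sudo reboot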