OT Update 2: Nvidia & AMD about to lose a lot of sales from cryptominers?


Comments

  • Greymom Posts: 1,140
    Kitsumo said:

    As I understand it, GPU mining for Bitcoin isn't profitable and hasn't been for a long time. Bitcoin miners use ASICs almost exclusively. The main source of our pain is Ethereum, which was made to be ASIC-resistant, so it requires GPUs. You can just look at https://bitgur.com/map and see that even if ETH does fall out of favor, there are at least a dozen currencies waiting to take its place. I think the best we can hope for is that investors start pulling more money out of cryptocurrencies in general.

    Imago said:
    Greymom said:

    As I understand it, the Ethereum software loads a large dataset into VRAM and crunches away at it, sort of like GPU rendering.  The minimum memory required is 3 GB, and newer versions of the app will supposedly require even more.   New cards with more, faster, memory (and more cores too)  are advantaged.  According to the mining sites, my 2-year-old R9-290x 8 GB card could net about $50 per year.  A 1070ti more like $450.  It looks like most GPUs have a 2-3 year payback time.

    I saw a video that says the same: the costs are far greater than the income. So why do they insist on it?

    In that video the guy spent about $2,000 on a "base" rig with 4 GTX 1070 Tis, and adding the energy it consumed to the initial cost, even with the currency at high values, he needed two years just to earn back the $2,000 he initially spent!

    About the "last" bitcoin... I guess it is the only way we can get rid of the GPU shortage. Once the mine is empty, miners will put down their pickaxes...

    It will only be a Wall Street matter!

    I guess the reason for the insane level of interest in Ethereum and others (given the rather modest income from base mining)  is the belief that the value of the coins will continue to go up rapidly for years to come.

    But in comparison, if a miner had invested $10,000 in NVIDIA stock one year ago instead of buying parts for a mining rig, it would now be worth $25,000.
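
    A rough back-of-the-envelope sketch of that payback math, for anyone who wants to plug in their own numbers. This is Python, and every figure below is a hypothetical placeholder rather than a measured value:

    ```python
    # Payback estimate for a mining rig: hardware cost divided by daily
    # income net of electricity. All inputs are made-up example numbers.

    def mining_payback_years(hardware_cost, gross_per_day, watts, price_per_kwh):
        """Years until net mining income repays the hardware, at constant prices."""
        electricity_per_day = watts / 1000 * 24 * price_per_kwh
        net_per_day = gross_per_day - electricity_per_day
        if net_per_day <= 0:
            return float("inf")  # never pays back at these numbers
        return hardware_cost / (net_per_day * 365)

    # e.g. a $2,000 four-card rig grossing ~$4.50/day at 600 W and $0.12/kWh
    print(round(mining_payback_years(2000, 4.50, 600, 0.12), 1))  # ~2.0 years
    ```

    Run with those example numbers it lands right around the two-year payback mentioned above, while the stock comparison is a simple 2.5x over the same period with no power bill.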

  • kyoto kid Posts: 41,859

    ...yes

    ...and I'd probably have that 1070 by now.

  • outrider42 Posts: 3,679

    The interest in Ethereum is mostly due to its ability to be mined by anyone. Small-time people can mine with their rigs and make a little money on the side. Just build a machine, let it run on its own, and let it make money while you do...nothing. The only effort required is building the PC, turning it on, and starting the mining. And that's it. No effort at all. That is the appeal, and that is all there is to it. It is also driving the value, as Ether is considered the prime alternative to Bitcoin, and Bitcoin is generally out of reach for any normal user. It doesn't matter how small you build, most half-decent GPUs can mine Ether for a profit. A small one, yes, but profit nonetheless. Some people suggest buying GPUs, even at inflated prices, and mining in your spare time to make that money back. A financial analyst even broke it all down, taking into account time playing video games and time mining. It is actually not so bad of a plan. The risk is small, as you are building a PC you want to use, not just a mining rig. So even if the market crashes...you are not really out anything other than possibly having overpaid for some parts.

    Few people are actually hoarding Ether. Most people are trading it out for real money at exchanges. But all this mining and trading is driving the value up.

  • kyoto kid Posts: 41,859

    ...nah, not going to succumb to the craze for the small return it would have.

     

  • I wanted to get a 1080 Ti but the prices are still ridiculous, and I am now waiting/saving for Ampere. I hope Nvidia will make enough Amperes not to have a shortage.

  • kyoto kid Posts: 41,859

    ...if this mining craze keeps up, I fear prices will still be ridiculous.

  • Kitsumo Posts: 1,221

    When prices change and more people start mining (or stop), the difficulty adjusts to make mining more (or less) difficult. When the price is going up, any idiot can make money mining. When it's stable, everyone's barely profitable at best. When it's falling, everyone is mining at a loss, holding their coins to sell when the price rebounds. So, as prices fall, we should get some relief on GPU prices, but mining will probably never go away completely. Pandora's box, yadda yadda...
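
    For anyone curious how that self-correction works, here is a toy sketch of the feedback loop (illustrative only, not Ethereum's actual difficulty algorithm):

    ```python
    # Toy difficulty retarget: the network nudges difficulty so blocks keep
    # arriving near a target interval. More hashrate -> faster blocks ->
    # difficulty rises -> per-miner profitability falls back down, and vice versa.

    TARGET_BLOCK_TIME = 15.0  # seconds; roughly Ethereum's target at the time

    def retarget(difficulty, observed_block_time):
        # blocks arriving too fast (miners piled in) -> raise difficulty
        # blocks arriving too slowly (miners left)   -> lower difficulty
        return difficulty * TARGET_BLOCK_TIME / observed_block_time

    d = 1000.0
    d = retarget(d, 10.0)  # blocks at 10 s -> difficulty rises to 1500
    d = retarget(d, 20.0)  # blocks at 20 s -> difficulty drops to 1125
    print(d)
    ```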

  • Imago Posts: 5,662

    I found an article about new Nvidia cards, where they talk about Turing, a mining-oriented "GPU"...

    I hope it can drag enough miners from other GPUs for enough time to let prices deflate a bit.

    They say it should be on the market in April. Let's cross fingers!

  • outrider42 Posts: 3,679
    edited March 2018

    The return would be owning a decent machine for Iray. <.<

    All speculation indicates that Nvidia will ask a much higher MSRP on their next gen cards, whatever they are called. (Volta, Ampere, and Turing have all been brought up as code names.)

    I noticed some comments about Intel getting into the GPU business. That is true. And the whole thing is weird. First Intel hired AMD's former head of graphics development. This was extremely unexpected, as he had just taken part in launching AMD Vega. Then Intel showed plans to include AMD graphics on Intel CPU chip designs. This was such a strange marriage that numerous brains exploded. And there is no doubt a connection between the two. Right now, it is not known if Intel is going to make fully discrete GPUs, or focus on these combo dies. But I really doubt Intel would stick with their arrangement with AMD for long after bringing Raja aboard. They have plans.

    https://www.theverge.com/circuitbreaker/2018/1/7/16861164/intel-amd-core-i5-i7-processor-radeon-rx-vega-m-gpu-cpu-h-series-ces-2018

    But either way, this creates pressure on Nvidia if these perform well and are priced reasonably. They will certainly be cheaper than buying a CPU and GPU separately. And being combo chips they should not be very desirable to miners, since you can't stack them. Too bad Iray doesn't work on them.

    A number of things may go against Nvidia if they are not careful. Microsoft recently added AMD Freesync to their Xbox console lineup. This means that you can hook an Xbox to a Freesync monitor and eliminate screen tear. This gives TV makers a reason to add Freesync to units, and doing so is fairly simple and cheap. Xbox uses AMD. So does the PS4, and the PS4 is the market leader. If Sony also adds Freesync to PS4, that will be a huge boost to AMD and Freesync. For those not versed, Freesync is better than Vsync when it comes to frame times in games. What are these? Basically, when you play a game and the system renders frames at a different pace from what is being displayed, you can get screen tearing. Vsync was created to force the system to wait for each screen refresh, at either 30 or 60 frames per second. This works to stop the screen tear, but it can create a small amount of input lag. Gamers flip out over input lag, and always look to make it as small as possible. Freesync solves that problem. Nvidia has their own version of this, because of course they do, called Gsync. Freesync is open, while Gsync is exclusive to Nvidia, because of course it is, LOL. And Gsync costs a lot more to build into your monitor because Nvidia charges a high fee to license it.
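
    To make the frame-time point concrete, here is a toy sketch (plain Python, not real graphics code) of why vsync adds input lag while adaptive sync does not:

    ```python
    # With vsync, a finished frame waits for the next fixed refresh tick before
    # it reaches the screen; that wait is the extra input lag gamers complain
    # about. With Freesync/G-Sync the panel refreshes when the frame is ready.

    REFRESH_INTERVAL = 1 / 60  # a fixed 60 Hz panel

    def present_with_vsync(frame_done_at, last_refresh_at=0.0):
        """Time the frame actually hits the screen under vsync."""
        ticks = int((frame_done_at - last_refresh_at) // REFRESH_INTERVAL) + 1
        return last_refresh_at + ticks * REFRESH_INTERVAL

    def present_with_adaptive_sync(frame_done_at):
        """Freesync/G-Sync: the display refreshes as soon as the frame is ready."""
        return frame_done_at

    frame_done = 0.021  # frame finished 21 ms into the refresh cycle
    print(present_with_vsync(frame_done))          # ~0.033 s: ~12 ms spent waiting
    print(present_with_adaptive_sync(frame_done))  # 0.021 s: no added wait, no tear
    ```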

    Anyway...having major consoles use Freesync turns the tables on Nvidia a bit. So while Nvidia enjoys a huge lead in the GPU race, they still face some big hurdles going forward that could possibly keep them in check (and most importantly, keep them from charging whatever they feel like for GPUs.) Competition is important, even if you don't use the competitor, what they do still has a big impact on what the leader does.

    Post edited by outrider42 on
  • Oso3D Posts: 15,085

    So there are several possibilities.

    Folks have floated the notion that GPUs will get so expensive that it will refocus developers on CPU rendering.

    The other possibility is that sustained mining will encourage a lot more development of GPUs, so cheaper, not-as-cutting-edge GPUs will become much faster very quickly. At which point CPU rendering will be a thing of the past.

    So, let's find out. ;)

     

  • CPU rendering is a step backwards. With a GPU, you can have one (or many) that does only rendering. No screen output, no audio output, just contributing to the rendering.

    CPUs can't do that. If you've got a dual-Xeon setup, you might be able to dedicate one to rendering and one to everything else, but not in Daz Studio 4.10. I've got a dual-Xeon system with 48GB RAM. I can either use both CPUs in Iray or none.

    On the other hand, I can attach 4 4x GPU clusters to my PC and have 16 GPUs doing nothing but rendering. And then I can put 2 more in the PC itself, so 18 GPUs running on one system dedicated to one task. You simply cannot get that with a CPU.

  • Oso3D Posts: 15,085

    Yes, but if it got to the point where 'CPU to render scene in X time' starts costing less than 'GPU to render scene in X time,' then shrugs, that's what people go do.

    Mind you, that'd require a bit shift, but it entirely depends on how trends unfold.

     

  • Griffin Avid Posts: 3,817

    Well, I took the plunge. Found a new EVGA GeForce 1080Ti for $1200 on Amazon. More than I wanted to pay, but not the $1500 or more some people were asking. I've been looking all over the web for some time. I've waited two years for this card, and decided to pinch elsewhere and just do it. I wasn't going to wait another year or two for the price to go back down to $800 range that it was around Christmas. Still kicking myself for not getting it then, but didn't have the funds. I'm happy with my card, and I better be, because I'm done making hardware purchases for a while.

    @Llynara

    This, right there. I hesitated to speak on it, 'cause, well, EVERYONE is freaking out over the horrible prices. My 10-year-old motherboard fried, so I had no choice; I had to do something. The full plunge and a whole new system. Methinks the TIME I'll save with a faster rig will be the true value in savings.

    I also have enough parts left over to rebuild a machine with a 1060 card. The advice is to build it for mining, and the drawback (no pun intended) is the electricity.

    Not really interested in getting involved with mining. You really never know. We might look back and consider these purchases as necessary, or a bad investment, or getting the last gems from an era. When does a new card arrive that adds things and does NOT subtract things?

    I can see the internet split over the next card releases: tons of benchmarks, and also tons of threads back and forth about what they should/could have done instead. What we gained, what we lost, and tons of "That was a stupid move" and speculation: "Nvidia had no choice!" "It's better for us!" "No, we are not their target customer"... etc...

    All I know is I couldn't simply sit back and wait this out.

    I definitely couldn't beat them; now the question is, do you join them?

  • Havos Posts: 5,582

    CPU rendering is a step backwards. With a GPU, you can have one (or many) that does only rendering. No screen output, no audio output, just contributing to the rendering.

    CPUs can't do that. If you've got a dual-Xeon setup, you might be able to dedicate one to rendering and one to everything else, but not in Daz Studio 4.10. I've got a dual-Xeon system with 48GB RAM. I can either use both CPUs in Iray or none.

    On the other hand, I can attach 4 4x GPU clusters to my PC and have 16 GPUs doing nothing but rendering. And then I can put 2 more in the PC itself, so 18 GPUs running on one system dedicated to one task. You simply cannot get that with a CPU.

    You can use Task Manager to tell it to only use 1 CPU with Daz Studio. You have to do it each time you run DS, so it is not as convenient as just clicking a box like when selecting the GPUs.
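
    If you would rather not redo that in Task Manager every launch, a small script can set the affinity for you. A minimal sketch, assuming the psutil package is installed; the process name "DAZStudio.exe" and the core range are assumptions you should adjust for your own machine:

    ```python
    # Pin every running DAZ Studio process to the first 24 logical cores
    # (roughly one socket of a dual-Xeon box). Process name and core range
    # are assumptions; check Task Manager for the real executable name.
    import psutil

    def pin_process(name="DAZStudio.exe", cores=range(24)):
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == name:
                proc.cpu_affinity(list(cores))  # restrict to these logical cores
                print(f"Pinned PID {proc.pid} to {len(list(cores))} cores")

    pin_process()
    ```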

  • kyoto kid Posts: 41,859

    The return would be owning a decent machine for Iray. <.<

    All speculation indicates that Nvidia will ask a much higher MSRP on their next gen cards, whatever they are called. (Volta, Ampere, and Turing have all been brought up as code names.)

    I noticed some comments about Intel getting into the GPU business. That is true. And the whole thing is weird. First Intel hired AMD's former head of graphics development. This was extremely unexpected, as he had just took part in launching AMD Vega. Then Intel shows plans to include AMD Graphics on Intel CPU chip designs. This was such a strange marriage that numerous brains exploded. And there is no doubt a connection between the two. Right now, it is not known if Intel is going to make fully discrete GPUs, or focus on these combo dies. But I really doubt Intel would stick with their arrangement with AMD for long after bringing Raja aboard. They have plans.

    https://www.theverge.com/circuitbreaker/2018/1/7/16861164/intel-amd-core-i5-i7-processor-radeon-rx-vega-m-gpu-cpu-h-series-ces-2018

    But either way, this creates pressure on Nvidia if these perform well and are priced reasonably. They will certainly be cheaper than buying a CPU and GPU separate. And being combo chips they should not be very desirable to miners since you can't stack them. Too bad Iray doesn't work on them.

    A number of things may go against Nvidia if they are not careful. Microsoft recently added AMD Freesync to their Xbox console lineup. This means that you can hook an Xbox to a Freesync monitor and eliminate screen tear. This gives TV makers a reason to add Freesync to units, and doing so is fairly simple and cheap. Xbox uses AMD. So does the PS4, and the PS4 is the market leader. If Sony also adds Freesync to PS4, that will be huge boost to AMD and Freesync. For those not versed, Freesync is better than Vsync when it comes to frame times in games. What are these? Basically, when you play a game and the system renders frames at a different pace from what is being displayed, you can get screen tearing. Vsync was created to force the system to wait for each screen refresh, at either 30 or 60 frames per second. This works to stop the screen tear, but it can create a small amount of input lag. Gamers flip out over input lag, and always look to make it as small as possible. Freesync solves that problem. Nvidia has their own version of this, because of course they do, called Gsync. Freesync is open, while Gsync is exclusive to Nvidia, because of course it is, LOL. And Gsync costs a lot more to build into your monitor because Nvidia charges a high fee to license it.

    Anyway...having major consoles use Freesync turns the tables on Nvidia a bit. So while Nvidia enjoys a huge lead in the GPU race, they still face some big hurdles going forward that could possibly keep them in check (and most importantly, keep them from charging whatever they feel like for GPUs.) Competition is important, even if you don't use the competitor, what they do still has a big impact on what the leader does.

    ...none of these moves will help with Iray unless Nvidia changes tack and decides to embrace OpenCL, which is what AMD uses.

  • kyoto kid Posts: 41,859
    edited March 2018

    CPU rendering is a step backwards. With a GPU, you can have one (or many) that does only rendering. No screen output, no audio output, just contributing to the rendering.

    CPUs can't do that. If you've got a dual-Xeon setup, you might be able to dedicate one to rendering and one to everything else, but not in Daz Studio 4.10. I've got a dual-Xeon system with 48GB RAM. I can either use both CPUs in Iray or none.

    On the other hand, I can attach 4 4x GPU clusters to my PC and have 16 GPUs doing nothing but rendering. And then I can put 2 more in the PC itself, so 18 GPUs running on one system dedicated to one task. You simply cannot get that with a CPU.

    ...you are still limited to the VRAM on your largest GPU (minus about 17% if you are running in W10). If the scene cannot be held in the card's memory you are back on the CPU, and all those cores become useless. In that respect, a dual high-core-count Xeon system with a boatload of memory actually is better.
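
    A quick way to see how much card memory is actually left before that fallback kicks in is to ask the driver. A minimal sketch, assuming nvidia-smi is on the PATH:

    ```python
    # Print total vs free VRAM per GPU, so you can judge whether a scene
    # (after the Windows 10 / WDDM reservation) will still fit on the card.
    import subprocess

    def vram_report():
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=memory.total,memory.free",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        for i, line in enumerate(out.strip().splitlines()):
            total, free = (int(x) for x in line.split(","))
            print(f"GPU {i}: {free} MiB free of {total} MiB total")

    vram_report()
    ```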

    Post edited by kyoto kid on
  • nicstt Posts: 11,715

    CPU rendering is a step backwards. With a GPU, you can have one (or many) that does only rendering. No screen output, no audio output, just contributing to the rendering.

    CPUs can't do that. If you've got a dual-Xeon setup, you might be able to dedicate one to rendering and one to everything else, but not in Daz Studio 4.10. I've got a dual-Xeon system with 48GB RAM. I can either use both CPUs in Iray or none.

    On the other hand, I can attach 4 4x GPU clusters to my PC and have 16 GPUs doing nothing but rendering. And then I can put 2 more in the PC itself, so 18 GPUs running on one system dedicated to one task. You simply cannot get that with a CPU.

    This is not really true; GPUs are RAM limited and further do not have slots for additional RAM, and whilst this is the case, CPUs have their place.

    They are not the preferred option, because of their speed, when the scene fits; when the scene doesn't fit, despite optimisation, GPUs suck at rendering.

  • Havos Posts: 5,582
    kyoto kid said:

    The return would be owning a decent machine for Iray. <.<

    All speculation indicates that Nvidia will ask a much higher MSRP on their next gen cards, whatever they are called. (Volta, Ampere, and Turing have all been brought up as code names.)

    I noticed some comments about Intel getting into the GPU business. That is true. And the whole thing is weird. First Intel hired AMD's former head of graphics development. This was extremely unexpected, as he had just took part in launching AMD Vega. Then Intel shows plans to include AMD Graphics on Intel CPU chip designs. This was such a strange marriage that numerous brains exploded. And there is no doubt a connection between the two. Right now, it is not known if Intel is going to make fully discrete GPUs, or focus on these combo dies. But I really doubt Intel would stick with their arrangement with AMD for long after bringing Raja aboard. They have plans.

    https://www.theverge.com/circuitbreaker/2018/1/7/16861164/intel-amd-core-i5-i7-processor-radeon-rx-vega-m-gpu-cpu-h-series-ces-2018

    But either way, this creates pressure on Nvidia if these perform well and are priced reasonably. They will certainly be cheaper than buying a CPU and GPU separate. And being combo chips they should not be very desirable to miners since you can't stack them. Too bad Iray doesn't work on them.

    A number of things may go against Nvidia if they are not careful. Microsoft recently added AMD Freesync to their Xbox console lineup. This means that you can hook an Xbox to a Freesync monitor and eliminate screen tear. This gives TV makers a reason to add Freesync to units, and doing so is fairly simple and cheap. Xbox uses AMD. So does the PS4, and the PS4 is the market leader. If Sony also adds Freesync to PS4, that will be huge boost to AMD and Freesync. For those not versed, Freesync is better than Vsync when it comes to frame times in games. What are these? Basically, when you play a game and the system renders frames at a different pace from what is being displayed, you can get screen tearing. Vsync was created to force the system to wait for each screen refresh, at either 30 or 60 frames per second. This works to stop the screen tear, but it can create a small amount of input lag. Gamers flip out over input lag, and always look to make it as small as possible. Freesync solves that problem. Nvidia has their own version of this, because of course they do, called Gsync. Freesync is open, while Gsync is exclusive to Nvidia, because of course it is, LOL. And Gsync costs a lot more to build into your monitor because Nvidia charges a high fee to license it.

    Anyway...having major consoles use Freesync turns the tables on Nvidia a bit. So while Nvidia enjoys a huge lead in the GPU race, they still face some big hurdles going forward that could possibly keep them in check (and most importantly, keep them from charging whatever they feel like for GPUs.) Competition is important, even if you don't use the competitor, what they do still has a big impact on what the leader does.

    ...none of these moves will help with Iray unless Nvidia changes their tack and decides to embrace Open_CL which is what AMD uses.

    Nvidia does support OpenCL. It is Iray that does not.

  • tj_1ca9500b Posts: 2,057
    edited March 2018

    Just an update on the Ethereum front...

    https://arstechnica.com/tech-policy/2018/03/ether-plunges-after-sec-says-dozens-of-ico-investigations-underway/

    And now we continue to wait and see if the bottom actually falls out, with GPU prices dropping accordingly...  Keep in mind that there's probably a minor backlog of gamers also patiently waiting on cheaper graphic card prices...

    Post edited by tj_1ca9500b on
  • Imago Posts: 5,662

    Just an update on the Etherium front...

    https://arstechnica.com/tech-policy/2018/03/ether-plunges-after-sec-says-dozens-of-ico-investigations-underway/

    And now we continue to wait and see if the bottom actually falls out, with GPU prices dropping accordingly...  Keep in mind that there's probably a minor backlog of gamers also patiently waiting on cheaper graphic card prices...

    Always fewer of them than miners...

    And you are forgetting that a big number of pre-owned GPUs will flood the market at the same time, so there will be a nice amount of cards available. As far as I can recall, even damaged cards can be refurbished and returned to pristine condition! I think (but math is my sworn enemy, so don't believe me) at least 200K GPUs will suddenly appear on the second-hand market, bringing prices even lower.
    IF the cryptocurrencies keep losing value enough to discourage at least some miners...

  • Greymom Posts: 1,140

    Hmmm.   Two very interesting developments today:

    https://www.investopedia.com/news/nvidias-days-benefiting-crypto-mining-are-numbered-analyst-warns/?partner=YahooSA&yptr=yahoo

    This claims a new algorithm that will eliminate or reduce  the need for all this GPU power...but no info or links at all on this miraculous new algorithm...I've heard these claims before.

    More interesting (and IMO credible):

    Is this why NVIDIA transferred IRAY development to a third party?:   https://finance.yahoo.com/news/nvidia-rtx-technology-realizes-dream-170000039.html  a new real-time render engine for VOLTA, but apparently not Maxwell.

    The plot increases in viscosity!

  • Greymom Posts: 1,140
    edited March 2018

    Walmart claims to have some MSI GTX 1080s (8GB) in stock for $669-$749.  Not cheap, but the lowest price I have seen lately.

    MSI GTX 1070ti (8 GB) for $609.  Better than last week at $749.

     

    Post edited by Greymom on
  • Padone Posts: 4,015
    edited March 2018

    It is clear that the direction is realtime PBR. For DAZ Studio there's iClone 7 already, and Blender's EEVEE is supposed to be released this year. Both have importers for DAZ assets.

    This means fast photoreal rendering on the average card. That's especially good for independent short movies. No more mega-power cards or render farms required for animation. Provided that you can fit the textures in VRAM.

    Post edited by Padone on
  • kyoto kid Posts: 41,859

    ...don't forget Octane 4.

  • JStryder Posts: 168

    Virtual reality and mixed reality are waiting in the wings, ready to soak up GPUs too if prices do drop. VR and AR are more oriented to compact portable devices, but they still absorb investment and resources for making chips. Investors' focus on portable devices has played a role in the shortage of graphics cards for PCs. Crypto-currency and blockchain are not going to magically end anytime soon, either. They will expand to new applications; blockchain is where cloud computing was ten or fifteen years ago in terms of market penetration. A lot of data is going to move to blockchains or similar distributed ledgers, because storing data outside of some billionaire's data farm is safer and avoids a single party having control over the data. Data will become more of a public resource and more under the control of those to whom it belongs. There are still problems to be overcome, but lots of geniuses are working on those problems.

    With all that going on, the price of graphics cards won't crash anytime soon and may never return to 2016 levels. After a major drop in crypto-currency prices (like now) might be your best opportunity to buy for a while.

  • Kitsumo Posts: 1,221
    edited March 2018
    JStryder said:

    ... Crypto-currency and blockchain is not going to magically end anytime soon, either. It will expand to new applications, is where cloud computing was ten or fifteen years ago in terms of market penetration...

    ...With all that going on, the price of graphics cards won't crash anytime soon and may never return to 2016 levels. After a major drop in crypto-currency prices (like now) might be your best opportunity to buy for a while.

    I agree totally. I don't like cryptocurrency, but it appears to be here to stay, at least for a while. We may have a few weeks of GPU price drops before the Ethereum difficulty adjusts to the new lower prices (and makes mining easier, attracting miners to start buying again).

    Looking at the chart, mining profitability is near an all-time low, meaning this is probably the best we can hope for until the supply of cards is increased. Happy shopping everyone!

    [Attached image: ETH profitability.jpg, 982 × 587]
    Post edited by Kitsumo on
  • Greymom Posts: 1,140
    edited March 2018

    As of this morning, Newegg has a much larger selection of in-stock 1070s and 1080s than they did a couple of weeks ago.  Prices still inflated, but they have dropped for some cards. : )

    Also an interesting development: the NVIDIA site does not have "Notify" buttons anymore for specific products, just "out of stock"...

    Post edited by Greymom on
  • nonesuch00 Posts: 18,729

    Just a matter of 3rd-party resellers speculating and getting stuck with large inventory they couldn't resell to businesses and governments, now coming back to roost.

  • Imago Posts: 5,662
    edited March 2018