OT Update 2: Nvidia & AMD about to lose a lot of sales from cryptominers?


Comments

  • Oso3D Posts: 15,085

    I think they are in a precarious position; if they build a business around mining-optimized cards and that market evaporates... they'd be in serious trouble.

    But it’s a great idea otherwise!

  • outrider42 Posts: 3,679

    Current rumors are that Nvidia is going to cash in on the inflated market with much higher prices on the 2000 series. The flagship successor to the 1080 Ti could be well over $1,000, even $1,500.

    https://wccftech.com/rumor-nvidias-gtx-2080-flagship-graphics-cards-will-be-priced-significantly-upwards-of-699-msr-up-to-1500/

    But nobody really knows what Nvidia is doing. I'm not sure Nvidia really knows what they are doing, LOL. They were originally going to have the new cards out in the next month or so, but now the speculation is later in the year. Pascal is supposedly at the end of its life, meaning production is supposed to have stopped. But then how are they getting new stock out?

    My guess is that with 1080s going so wild, it would be easy to charge $700 or more MSRP for a 2080. But this would be a huge risk, too. If the mining boom were to suddenly die around the launch period, prices would bottom out and the high price on the 2080 would backfire.

  • Kitsumo Posts: 1,221

    Nvidia is going through with mining-focused cards. They did some before, but they weren't anything special. However, this one is kind of interesting. Look at the rumored specs.

    – GPU: P102-100
    – CUDA Cores: 3200
    – Base Clock: 1582 MHz
    – Memory Clock: 11 Gbps
    – Physical Memory Size: 5 GB
    – Memory Type: GDDR5X
    – Memory Interface Width: 320-bit
    – Memory Bandwidth: 400 GB/s
    – Bus Support: PCIe Gen1 x4
    – Card Size: 21.5 cm length, 12.5 cm height, dual slot
    – Max TDP: 250 Watt
    – Power Connectors: 2x 8-pin PCI-E

    A Pascal card with 3200 cores is a beast by any standard. Though it only has 5 GB, this is a mining card, so it will not have any video output. Which raises a question... does a GPU with no video out get affected by Windows 10's VRAM usage? Or will all 5 GB be usable?
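    For the curious: once a card like that is in hand, the "usable VRAM" question can be answered empirically by querying NVML. A minimal Python sketch, assuming the nvidia-ml-py package (imported as pynvml) is installed:

        # Query total vs. free VRAM per GPU through NVML.
        # Assumes the nvidia-ml-py package (pynvml) is installed.
        from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                            nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

        nvmlInit()
        try:
            for i in range(nvmlDeviceGetCount()):
                mem = nvmlDeviceGetMemoryInfo(nvmlDeviceGetHandleByIndex(i))
                print(f"GPU {i}: {mem.free / 2**30:.2f} GB free "
                      f"of {mem.total / 2**30:.2f} GB")
        finally:
            nvmlShutdown()

    If Windows reserves memory even on a display-less card, it would show up here as a gap between total and free before any render starts.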

    There is no price on this card, but given that it is a cut-down 1080 Ti with no video out, it shouldn't be too high.

    I'm not sure miners will go for that. With regular video cards, if the bottom drops out of the market, at least they have a video card they can sell to some sucker enthusiast. With a stripped-down card that's only good for mining, they're just stuck with it. Also, I hear the difficulty (and RAM requirement) of Ethereum is going up; 4 GB cards are supposed to become useless for it by 2020, so a 5 GB card will have a limited lifespan. The scientific crowd would probably like it, though, not to mention SETI@home and BOINC.
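    That 2020 figure roughly checks out against the published Ethash constants (1 GiB initial dataset, growing 8 MiB per 30,000-block epoch). A back-of-the-envelope sketch; the days-per-epoch value assumes ~13-second blocks:

        # Rough estimate of when the Ethash DAG outgrows a card's VRAM.
        # Dataset constants are from the Ethash spec; the block time is assumed.
        DATASET_BYTES_INIT = 2**30           # 1 GiB at epoch 0
        DATASET_BYTES_GROWTH = 2**23         # +8 MiB per epoch
        DAYS_PER_EPOCH = 30000 * 13 / 86400  # ~4.5 days

        for gb in (4, 5):
            epochs = (gb * 2**30 - DATASET_BYTES_INIT) // DATASET_BYTES_GROWTH
            years = epochs * DAYS_PER_EPOCH / 365
            print(f"{gb} GB card: full at epoch ~{epochs}, ~{years:.1f} years after epoch 0")

    Epoch 0 was mid-2015, so a 4 GB card ages out right around 2020, and 5 GB only buys about a year and a half more.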

    Unless they're opening a new factory to make these, I don't think it'll help. Miners are buying everything they can get their hands on. I think the only way for them to fix this mess is to ramp up production. Flood the market with cards. Once everyone's mining, it won't be profitable anymore. Of course, if that happened, the market would be flooded with high-end cards, with retailers trying to unload everything they have, new and used, just to cut their losses. Good for us but bad for the industry.

    I'm eager to see what they come out with this summer though.

  • outrider42 Posts: 3,679

    They could probably sell it to an eager Iray user... 3200 cores is nothing to sneeze at, even if it is only 5 GB. And yes, the science community would flip out to have these kinds of cards for research. There are plenty of uses for these cards besides mining. So I really don't think they would have any problem selling these, even if it's not a huge hit among miners.

  • dragotx Posts: 1,147

    And one of the fans in one of my 1070s decided to start failing overnight. I'm not a happy camper

  • dragotx Posts: 1,147
    Rottenham said:
    dragotx said:

    And one of the fans in one of my 1070s decided to start failing overnight. I'm not a happy camper

    Will you repair it?

    I've tried cleaning it; I'm seeing if that helps. If not, I'm going to check into the warranty, and if that doesn't work, I'll look into replacing it. I've got it cooking a render right now and it's staying inside the acceptable heat range, so I may be ok
  • kyoto kid Posts: 41,848
    edited March 2018
    Greymom said:
    kyoto kid said:

    Your 740 KK is a Kepler.

    ...misnomer, actually a 750 Ti.

    The 750 and 750 Ti are Maxwell-based GPUs, thank goodness. Based on the GM107, which is first-generation Maxwell.

    ..didn't know that, thought they were Kepler based.  Thanks.

    Post edited by kyoto kid on
  • dragotx Posts: 1,147
    Rottenham said:
    dragotx said:
    Rottenham said:
    dragotx said:

    And one of the fans in one of my 1070s decided to start failing overnight. I'm not a happy camper

    Will you repair it?

     

    I've tried cleaning it; I'm seeing if that helps. If not, I'm going to check into the warranty, and if that doesn't work, I'll look into replacing it. I've got it cooking a render right now and it's staying inside the acceptable heat range, so I may be ok

    I took a brief look around; parts seem to be affordable and readily available, and there are many YT vids dealing with it.

    Yeah, I know it can be done, and the cooling system can even be upgraded if I want.  I just don't want to bork the warranty without giving it a shot first

  • Taoz Posts: 10,256

     

    Rottenham said:
    dragotx said:

    And one of the fans in one of my 1070s decided to start failing overnight. I'm not a happy camper

    Will you repair it?

    Should be covered by the warranty, I guess.

    MSI did a good job with their "silent mode" where the fans only run when the card gets over a certain temperature, which in practice means only when you render.

    Unfortunately, companies often use low-quality fans in their products to save a few bucks. I once had the fan bearings in a PSU in a brand-new PC fail only two weeks after purchase. Fans with ball or ceramic bearings cost a little more but are definitely worth it; they are very reliable and last for years. I never use anything else, and companies shouldn't either, IMO.

  • kyoto kid Posts: 41,848

    outrider42 said:

    Current rumors are that Nvidia is going to cash in on the inflated market with much higher prices on the 2000 series. The flagship successor to the 1080 Ti could be well over $1,000, even $1,500.

    https://wccftech.com/rumor-nvidias-gtx-2080-flagship-graphics-cards-will-be-priced-significantly-upwards-of-699-msr-up-to-1500/

    But nobody really knows what Nvidia is doing. I'm not sure Nvidia really knows what they are doing, LOL. They were originally going to have the new cards out in the next month or so, but now the speculation is later in the year. Pascal is supposedly at the end of its life, meaning production is supposed to have stopped. But then how are they getting new stock out?

    My guess is that with 1080s going so wild, it would be easy to charge $700 or more MSRP for a 2080. But this would be a huge risk, too. If the mining boom were to suddenly die around the launch period, prices would bottom out and the high price on the 2080 would backfire.

    ...1,500$?  Crikey, almost better off getting a P5000 and having 5 extra GB along with the ability to sidestep W10 WDDM even if it has fewer cores.

  • Greymom Posts: 1,139
    edited March 2018
    kyoto kid said:
    Greymom said:
    kyoto kid said:

    Your 740 KK is a Kepler.

    ...misnomer, actually a 750 Ti.

    The 750 and 750 Ti are Maxwell-based GPUs, thank goodness. Based on the GM107, which is first-generation Maxwell.

    ..didn't know that, thought they were Kepler based.  Thanks.

    I got a couple cheap a while back, but wanted to check to be sure they were Maxwell first:  http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750.html

    The 4GB models are scarce and selling at a premium now, of course...about double or more what they were last year...deja vu....

    Post edited by Greymom on
  • Kitsumo said:
    Imago said:

    I was checking my Amazon wishlist to see if some price dropped (the 1070 Ti I was going to buy went from 400$ to 750$ in 7 days). My blood froze once again...
    Then in a corner I noticed a thing... Something that made me think "I can play their same game!" Everyone here has one or two "old" GPUs in their cabinets, put away because the new ones "are better", saved for emergencies, future rigs, or a present to a nephew... Why not ADD THEM to our PCs like they do, adding CUDAs to our system?

    https://www.amazon.com/SEDNA-express-slots-Riser-Multiplier/dp/B075DF9L5V/ref=sr_1_78?ie=UTF8&qid=1520875751&sr=8-78&keywords=PCI-E+Express+riser

    Does anyone know if this "download more RAM" idea I got could work? I mean, they use the GPUs at 100% to mine more, so I think we can do the same...
    I have two old cards, nothing exceptional, but the combined CUDA count of the three of them will surpass a 1080 Ti!

    Yep, someone covered it earlier https://www.daz3d.com/forums/discussion/comment/3396241/#Comment_3396241 . Renders take a little longer to start (copying textures across a PCI-E x1 bus instead of x8 or x16) but after that it rendered normally. I'm going to try it once I get a new card. The only problem is finding someplace to mount your cards.

    This is similar to the Amfeltec GPU cluster I have, in principle at least. It's an easy way to expand your GPU capabilities, and with the Amfeltec unit, it comes with a frame on which to mount the board. However, as long as you have a spare PC case and the mount holes on the boards line up, you should be fine, and you have a place to mount the PSU for the cards.

    The big issue is that your lowest-RAM GPU will dictate the scene size, unless that's been fixed in recent Iray updates.
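    To put the x1-versus-x16 startup penalty in rough numbers (nominal link rates, and the 4 GB scene size is a made-up assumption):

        # Approximate texture-upload time over different PCI-E links.
        # Link rates are nominal; the scene size is an assumption.
        rates_gbs = {"PCIe 2.0 x1": 0.5, "PCIe 3.0 x1": 1.0,
                     "PCIe 3.0 x8": 8.0, "PCIe 3.0 x16": 16.0}
        scene_gb = 4.0

        for link, rate in rates_gbs.items():
            print(f"{link}: ~{scene_gb / rate:.1f} s to upload the scene")

    So even a slow riser only adds seconds at the start of a render, which matches the report in that earlier thread.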

  • kyoto kid said:

    outrider42 said:

    Current rumors are that Nvidia is going to cash in on the inflated market with much higher prices on the 2000 series. The flagship successor to the 1080 Ti could be well over $1,000, even $1,500.

    https://wccftech.com/rumor-nvidias-gtx-2080-flagship-graphics-cards-will-be-priced-significantly-upwards-of-699-msr-up-to-1500/

    But nobody really knows what Nvidia is doing. I'm not sure Nvidia really knows what they are doing, LOL. They were originally going to have the new cards out in the next month or so, but now the speculation is later in the year. Pascal is supposedly at the end of its life, meaning production is supposed to have stopped. But then how are they getting new stock out?

    My guess is that with 1080s going so wild, it would be easy to charge $700 or more MSRP for a 2080. But this would be a huge risk, too. If the mining boom were to suddenly die around the launch period, prices would bottom out and the high price on the 2080 would backfire.

    ...1,500$?  Crikey, almost better off getting a P5000 and having 5 extra GB along with the ability to sidestep W10 WDDM even if it has fewer cores.

    Yet again, with Iray, core count outranks VRAM unless you're building 10+ GB scenes. A Titan Z will do a lot as long as you keep it under 6 GB total. Unless the new one has been released, it's still the highest-core double-decker available. Even if the new double-Titan has been released, the Titan Z is still cheaper.

     

  • kyoto kid Posts: 41,848
    edited March 2018

    ...yeah, I tend to build big scenes which, even using certain workarounds, would hit diminishing returns time-wise, so the P5000 or even a Maxwell Titan X would be a more valuable asset. Again, if the scene doesn't fit in VRAM, all those cores mean nothing.

    The Titan Z actually used to be quite expensive, as I remember around 1,500$, and when the Titan X appeared, it was priced at 998$. 6 GB would handle maybe 50% of my scenes, so in that respect, more VRAM actually translates to better render performance.

    Post edited by kyoto kid on
  • Imago Posts: 5,658
    Kitsumo said:

    Yep, someone covered it earlier https://www.daz3d.com/forums/discussion/comment/3396241/#Comment_3396241 . Renders take a little longer to start (copying textures across a PCI-E x1 bus instead of x8 or x16) but after that it rendered normally. I'm going to try it once I get a new card. The only problem is finding someplace to mount your cards.

    Ah, I see!

    It looks like the increase isn't that much... Well, in the end it was really a "download more RAM" thing! cheeky

    Anyway, I hope GPU prices will drop soon. I don't want the miners to go bankrupt due to a sudden failure of cryptocurrency... I just want to buy a piece of technology at its right price!

  • Taoz Posts: 10,256
    kyoto kid said:

    ...yeah, I tend to build big scenes which, even using certain workarounds, would hit diminishing returns time-wise, so the P5000 or even a Maxwell Titan X would be a more valuable asset. Again, if the scene doesn't fit in VRAM, all those cores mean nothing.

    Agree, I often run out of VRAM with 8 GB. Could easily use 16 GB or more.

  • Imago said:
    Kitsumo said:

    Yep, someone covered it earlier https://www.daz3d.com/forums/discussion/comment/3396241/#Comment_3396241 . Renders take a little longer to start (copying textures across a PCI-E x1 bus instead of x8 or x16) but after that it rendered normally. I'm going to try it once I get a new card. The only problem is finding someplace to mount your cards.

    Ah, I see!

    It looks like the increase isn't that much... Well, in the end it was really a "download more RAM" thing! cheeky

    Anyway, I hope GPU prices will drop soon. I don't want the miners to go bankrupt due to a sudden failure of cryptocurrency... I just want to buy a piece of technology at its right price!

    If by "download more RAM" you mean "more GPUs means more VRAM", such is not the case, sadly. As I said, the Titan Z was two 6GB GPUs on one board, advertised as 12GB. Where Iray is concerned, it's two 6GB cards. The 2880x2 cores stack, but Iray only sees 6GB of VRAM. Even with a Titan X Pascal, a 1080ti, a Titan Z, a 980 (4GB), and 2 780ti (3GB), if my scene goes over 3GB, it dumps to the CPU.

    If Nvidia wanted to make a "targeted market" GPU series, it should be a GTX-priced line that had the capability to stack VRAM. No outputs, just processing, and a more flexible way to join them than a hard plastic SLI connector. They could even go so far as to make a GTX-based VCA in 4, 6, or 8 GPU configurations that could be daisy-chained for future expandability, and that would report to the OS as a single GPU to get around Windows' GPU count limit (without having to disable resources in Device Mismanager).

    If they really wanted to get ahead of the pack, they'd drop embedded VRAM entirely and use user-swappable SSDs, so long as their transfer rate was comparable. You could buy a 2GB card to save a few bucks up front, then attach a couple of 6GB SSDs to add 12GB more at a later time. This takes the pressure off them to keep the market supply up, and cuts out the headache and heartache of end-users who scrimp and save for a better card only to find the price has gone up another $200+ by the time they're ready to buy.

    It'd be nice if AMD cards were suddenly way better and cheaper for mining than Nvidia cards :P

     

  • Richard Haseltine Posts: 107,997
    The big issue is that your lowest-RAM GPU will dictate the scene size, unless that's been fixed in recent Iray updates.

    Iray will use, and always has used, any card into which the scene fits - if a card won't take the scene it will be dropped without affecting other cards.
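    In other words, the fit test is per card. An illustrative sketch only (not Iray's actual code; the card list and sizes are made up):

        # Illustrative only - NOT Iray's real logic. A card joins the render
        # iff the scene fits in its own VRAM; the rest carry on regardless.
        cards_gb = {"Titan X Pascal": 12, "1080 Ti": 11, "980": 4, "780 Ti": 3}
        scene_gb = 6

        active = [name for name, vram in cards_gb.items() if vram >= scene_gb]
        print("Rendering on:", ", ".join(active) if active else "CPU fallback")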

  • Imago Posts: 5,658
    Imago said:
    Kitsumo said:

    Yep, someone covered it earlier https://www.daz3d.com/forums/discussion/comment/3396241/#Comment_3396241 . Renders take a little longer to start (copying textures across a PCI-E x1 bus instead of x8 or x16) but after that it rendered normally. I'm going to try it once I get a new card. The only problem is finding someplace to mount your cards.

    Ah, I see!

    It looks like the increase isn't that much... Well, in the end it was really a "download more RAM" thing! cheeky

    Anyway, I hope GPU prices will drop soon. I don't want the miners to go bankrupt due to a sudden failure of cryptocurrency... I just want to buy a piece of technology at its right price!

    If by "download more RAM" you mean "more GPUs means more VRAM", such is not the case, sadly. As I said, the Titan Z was two 6GB GPUs on one board, advertised as 12GB. Where Iray is concerned, it's two 6GB cards. The 2880x2 cores stack, but Iray only sees 6GB of VRAM. Even with a Titan X Pascal, a 1080ti, a Titan Z, a 980 (4GB), and 2 780ti (3GB), if my scene goes over 3GB, it dumps to the CPU.

    If Nvidia wanted to make a "targeted market" GPU series, it should be a GTX-priced line that had the capability to stack VRAM. No outputs, just processing, and a more flexible way to join them than a hard plastic SLI connector. They could even go so far as to make a GTX-based VCA in 4, 6, or 8 GPU configurations that could be daisy-chained for future expandability, and that would report to the OS as a single GPU to get around Windows' GPU count limit (without having to disable resources in Device Mismanager).

    If they really wanted to get ahead of the pack, they'd drop embedded VRAM entirely and use user-swappable SSDs, so long as their transfer rate was comparable. You could buy a 2GB card to save a few bucks up front, then attach a couple of 6GB SSDs to add 12GB more at a later time. This takes the pressure off them to keep the market supply up, and cuts out the headache and heartache of end-users who scrimp and save for a better card only to find the price has gone up another $200+ by the time they're ready to buy.

    It'd be nice if AMD cards were suddenly way better and cheaper for mining than Nvidia cards :P

     

    No, my thought was more to add CUDA cores for Iray renders.

    I used the "download more RAM" phrase because it sounded like too easy a solution for a hard problem... Like downloading RAM! cheeky
    Anyway, the price increased again in the last few days... At this rate, the Nvidia 20XX series on day one will cost half as much as the older GPUs... then skyrocket by the early evening of the same day!

  • Taoz Posts: 10,256
    edited March 2018
    Rottenham said:
    Taoz said:

    Should be covered by the warranty, I guess.

    MSI did a good job with their "silent mode" where the fans only run when the card gets over a certain temperature, which in practice means only when you render.

    Unfortunately, companies often use low-quality fans in their products to save a few bucks. I once had the fan bearings in a PSU in a brand-new PC fail only two weeks after purchase. Fans with ball or ceramic bearings cost a little more but are definitely worth it; they are very reliable and last for years. I never use anything else, and companies shouldn't either, IMO.

    Yes, these speed-controlled fans, PWM as they are called, are new to me. I wonder if there are any measurable energy savings from using them.

    Not much, I think; in general fans don't use a lot of power, like 3-5 W depending on size.
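    Quick back-of-the-envelope, assuming a 5 W fan running 24/7 and a $0.12/kWh rate (both figures are assumptions):

        # Annual energy cost of one always-on fan (assumed figures).
        watts, usd_per_kwh = 5.0, 0.12
        kwh_per_year = watts * 24 * 365 / 1000
        print(f"{kwh_per_year:.0f} kWh/yr, about ${kwh_per_year * usd_per_kwh:.2f}/yr")

    So a few dollars a year at most, even before any PWM savings.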

     

    Rottenham said:

    I've been using Noctua fans myself. Agreed, the quality is worth a couple of dollars more. I've never had a fan outright fail, but I've had them begin to squeal and moan. I would appreciate seeing MTBF numbers published for the parts I buy, but that's not about to happen. There are plenty of blue LEDs, though.

    I used to use Antec 3xSpeed fans with ball bearings; I've used and still use many of these and only one has failed so far, after several years of use. No funny noises, but in general they are not the quietest.

    I bought a stack of Arctic fans a few years ago; they're cheap, have "Fluid Dynamic Bearings" and 6 years of warranty. I have one in a machine that's been running for the equivalent of about 3 years 24/7, still fine and no noises. Not PWM, but there is a PWM version also.

    https://www.amazon.com/ARCTIC-F14-Standard-Configuration-possible/dp/B01I6H5HHO/

     

    Post edited by Taoz on
  • The big issue is that your lowest-RAM GPU will dictate the scene size, unless that's been fixed in recent Iray updates.

    Iray will use, and always has used, any card into which the scene fits - if a card won't take the scene it will be dropped without affecting other cards.

    Then there's a problem with my installation or something, because as soon as the scene exceeded 3 GB, it dumped off to the CPU only. If I unchecked the 780 Tis, then it fell to the 980 for the limit: if the scene exceeded 4 GB total, it dumped off to the CPU only. If I unchecked the 780 Tis and the 980, leaving the Titan Z, Titan Pascal, and 1080 Ti, those three did the render, as long as I didn't exceed 6 GB total.

     

  • Taoz Posts: 10,256
    Imago said:
    Kitsumo said:

    Yep, someone covered it earlier https://www.daz3d.com/forums/discussion/comment/3396241/#Comment_3396241 . Renders take a little longer to start (copying textures across a PCI-E x1 bus instead of x8 or x16) but after that it rendered normally. I'm going to try it once I get a new card. The only problem is finding someplace to mount your cards.

    Ah, I see!

    It looks like the increase isn't that much... Well, in the end it was really a "download more RAM" thing! cheeky

    Anyway, I hope GPU prices will drop soon. I don't want the miners to go bankrupt due to a sudden failure of cryptocurrency... I just want to buy a piece of technology at its right price!

    If they really wanted to get ahead of the pack, they'd drop embedded VRAM entirely and use user-swappable SSDs, so long as their transfer rate was comparable.

    I'm not sure how VRAM works on a video card, but if there are a lot of constant read/write operations an SSD wouldn't last long. Besides, VRAM is a lot faster than an SSD, and they're making it faster and faster, which there must be a reason for. But maybe that's only relevant for gaming, and not still renders.
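    The speed gap alone makes the case. Using the 400 GB/s figure from the rumored spec list above, and assuming ~3 GB/s sequential reads for a fast 2018 NVMe drive:

        # Bandwidth gap: GDDR5X (figure from the spec list above) vs. a
        # fast NVMe SSD (assumed ~3 GB/s sequential read).
        gddr5x_gbs, nvme_gbs = 400.0, 3.0
        print(f"VRAM is ~{gddr5x_gbs / nvme_gbs:.0f}x faster, "
              "before even counting SSD write endurance")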

  • Holy Frijoles! Newegg has a refurbished Titan Z for $1300+!

    That's just insane! A Titan XP is only $1200 and it's a true 12GB card. Though it doesn't have 5000 cores.

    I wish I could get $1300 for mine, it'd be gone today! lol

     

  • kyoto kid Posts: 41,848
    Imago said:
    Kitsumo said:

    Yep, someone covered it earlier https://www.daz3d.com/forums/discussion/comment/3396241/#Comment_3396241 . Renders take a little longer to start (copying textures across a PCI-E x1 bus instead of x8 or x16) but after that it rendered normally. I'm going to try it once I get a new card. The only problem is finding someplace to mount your cards.

    Ah, I see!

    It looks like the increase isn't that much... Well, in the end it was really a "download more RAM" thing! cheeky

    Anyway, I hope GPU prices will drop soon. I don't want the miners to go bankrupt due to a sudden failure of cryptocurrency... I just want to buy a piece of technology at its right price!

    If by "download more RAM" you mean "more GPUs means more VRAM", such is not the case, sadly. As I said, the Titan Z was two 6GB GPUs on one board, advertised as 12GB. Where Iray is concerned, it's two 6GB cards. The 2880x2 cores stack, but Iray only sees 6GB of VRAM. Even with a Titan X Pascal, a 1080ti, a Titan Z, a 980 (4GB), and 2 780ti (3GB), if my scene goes over 3GB, it dumps to the CPU.

    If Nvidia wanted to make a "targeted market" GPU series, it should be a GTX-priced line that had the capability to stack VRAM. No outputs, just processing, and a more flexible way to join them than a hard plastic SLI connector. They could even go so far as to make a GTX-based VCA in 4, 6, or 8 GPU configurations that could be daisy-chained for future expandability, and that would report to the OS as a single GPU to get around Windows' GPU count limit (without having to disable resources in Device Mismanager).

    If they really wanted to get ahead of the pack, they'd drop embedded VRAM entirely and use user-swappable SSDs, so long as their transfer rate was comparable. You could buy a 2GB card to save a few bucks up front, then attach a couple of 6GB SSDs to add 12GB more at a later time. This takes the pressure off them to keep the market supply up, and cuts out the headache and heartache of end-users who scrimp and save for a better card only to find the price has gone up another $200+ by the time they're ready to buy.

    It'd be nice if AMD cards were suddenly way better and cheaper for mining than Nvidia cards :P

     

    ...not sure if it is possible to pool VRAM for rendering (doesn't even work with Quadros).  It can be done through NVLink for the latest Volta Tesla cards but that is strictly for compute operations only.

    I agree with Taoz about using SSDs for such heavy duty read/write operations.

    As to VRAM, I believe the next generation (20xx) cards will have GDDR6 from what I have read.  HBM2 is still more expensive than conventional VRAM, so it may only be seen on the upper-end Quadro cards. (The Volta GV-100 has 16 GB of HBM2 but costs around 6,500$ - 7,000$.)
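    For reference, memory bandwidth is just the per-pin data rate times the bus width. The pairings below are typical published figures, not specific card specs:

        # Bandwidth = per-pin rate (Gbps) x bus width (bits) / 8.
        # Pairings are illustrative, based on typical published figures.
        def bandwidth_gbs(rate_gbps, bus_bits):
            return rate_gbps * bus_bits / 8

        for name, rate, bus in [("GDDR5X", 11, 352),    # 1080 Ti class
                                ("GDDR6", 14, 256),     # early GDDR6 pairing
                                ("HBM2", 1.75, 4096)]:  # four stacks, V100 class
            print(f"{name}: {bandwidth_gbs(rate, bus):.0f} GB/s")

    That is why HBM2 stays on the $6,000+ cards for now: roughly double the bandwidth, at a much higher cost per GB.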
