OT Update 2: Nvidia & AMD about to lose a lot of sales from cryptominers?


Comments

  • CypherFOX Posts: 3,401
    edited March 2018

    Greetings,

    The real thing that'll bring down GPU mining is not newer GPUs.  It's ASICs targeted specifically at the ASIC-resistant cryptocurrencies.  Bitcoin has been beyond the realm of GPU mining for a while, but many of the newer coins are based on Ethereum, and that has a RAM requirement that is harder to manage in ASIC form.

    As Ethereum becomes more mineable by application-specific processors, we might see a pullback in the number of GPU devices being sold to miners.  Or folks getting out of GPU mining before the ASICs come in, if they're aware of what's happening.

    kyoto kid said:

    See the Advanced tab of Render Settings.

    ...I have and it requires a remote system to link to.  That still means more expense.

    Sidenote: It doesn't...  You could run the Iray Server on your same box to get the effect you're looking for.  It just needs an IP address, and you can put your own (or 127.0.0.1, aka localhost, if the server was listening on it).
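    A minimal sketch of that loopback check, in Python (the port below is only a placeholder, not necessarily Iray Server's default; use whatever port your local server instance was actually started with):

    ```python
    import socket

    def is_listening(host: str = "127.0.0.1", port: int = 9090, timeout: float = 2.0) -> bool:
        """Return True if something accepts TCP connections at host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        # 9090 is just an example; substitute the port your Iray Server was configured with
        # before pointing Daz Studio's Advanced render settings at 127.0.0.1.
        host, port = "127.0.0.1", 9090
        state = "listening" if is_listening(host, port) else "not reachable"
        print(f"{host}:{port} is {state}")
    ```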

    --  Morgan

     

    Post edited by CypherFOX on
  • outrider42 Posts: 3,679
    kyoto kid said:
    kyoto kid said:

    The return would be owning a decent machine for Iray. <.<

    All speculation indicates that Nvidia will ask a much higher MSRP on their next gen cards, whatever they are called. (Volta, Ampere, and Turing have all been brought up as code names.)

    I noticed some comments about Intel getting into the GPU business. That is true. And the whole thing is weird. First Intel hired AMD's former head of graphics development. This was extremely unexpected, as he had just taken part in launching AMD Vega. Then Intel showed plans to include AMD graphics on Intel CPU chip designs. This was such a strange marriage that numerous brains exploded. And there is no doubt a connection between the two. Right now, it is not known if Intel is going to make fully discrete GPUs, or focus on these combo dies. But I really doubt Intel would stick with their arrangement with AMD for long after bringing Raja aboard. They have plans.

    https://www.theverge.com/circuitbreaker/2018/1/7/16861164/intel-amd-core-i5-i7-processor-radeon-rx-vega-m-gpu-cpu-h-series-ces-2018

    But either way, this creates pressure on Nvidia if these perform well and are priced reasonably. They will certainly be cheaper than buying a CPU and GPU separately. And being combo chips they should not be very desirable to miners since you can't stack them. Too bad Iray doesn't work on them.

    A number of things may go against Nvidia if they are not careful. Microsoft recently added AMD Freesync to their Xbox console lineup. This means that you can hook an Xbox to a Freesync monitor and eliminate screen tear. This gives TV makers a reason to add Freesync to units, and doing so is fairly simple and cheap. Xbox uses AMD. So does the PS4, and the PS4 is the market leader. If Sony also adds Freesync to the PS4, that will be a huge boost to AMD and Freesync. For those not versed, Freesync is better than Vsync when it comes to frame times in games. What are these? Basically, when you play a game and the system renders frames at a different pace from what is being displayed, you can get screen tearing. Vsync was created to force the system to wait for each screen refresh, at either 30 or 60 frames per second. This works to stop the screen tear, but it can create a small amount of input lag. Gamers flip out over input lag, and always look to make it as small as possible. Freesync solves that problem. Nvidia has their own version of this, because of course they do, called Gsync. Freesync is open, while Gsync is exclusive to Nvidia, because of course it is, LOL. And Gsync costs a lot more to build into your monitor because Nvidia charges a high fee to license it.
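    To put rough numbers on that (a deliberately simplified model, not any vendor's actual implementation): with Vsync a finished frame waits for the next fixed refresh tick, while adaptive sync lets the display refresh when the frame is ready.

    ```python
    import math

    REFRESH_HZ = 60.0
    REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between refreshes at 60 Hz

    def vsync_display_ms(render_ms: float) -> float:
        """With Vsync, a frame is held until the next refresh boundary after it finishes."""
        return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

    def adaptive_sync_display_ms(render_ms: float) -> float:
        """With Freesync/Gsync the panel refreshes when the frame is ready
        (ignoring the panel's min/max refresh range for simplicity)."""
        return render_ms

    for render_ms in (10.0, 17.0, 25.0):
        print(f"frame rendered in {render_ms:4.1f} ms -> "
              f"vsync shows it after {vsync_display_ms(render_ms):4.1f} ms, "
              f"adaptive sync after {adaptive_sync_display_ms(render_ms):4.1f} ms")
    ```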

    Anyway...having major consoles use Freesync turns the tables on Nvidia a bit. So while Nvidia enjoys a huge lead in the GPU race, they still face some big hurdles going forward that could possibly keep them in check (and most importantly, keep them from charging whatever they feel like for GPUs). Competition is important; even if you don't use the competitor, what they do still has a big impact on what the leader does.

    ...none of these moves will help with Iray unless Nvidia changes their tack and decides to embrace OpenCL, which is what AMD uses.

     

    These moves are more about competition, which affects...everything. Strong competition leads to more innovation and better prices in the market. The better AMD competes with Nvidia, the better and cheaper these GPUs become. The Titan V would not cost $3000 if AMD had real competition against it.

     

    Obviously there are other factors driving prices right now, but if Nvidia has no threat from AMD, they can charge whatever they want for their cards and get away with it.

     

    Nvidia is starting to do some shady things to bully the market with their position. Right now there is controversy around their GPP program, which by all accounts is putting pressure on 3rd party board makers to sign up to its terms. What happens if they refuse? They lose the benefits they used to have with Nvidia: support and, most importantly, early access to the stock they need to build their cards. GPP members get access to chips first. But the GPP has special guidelines that basically ask for preferential treatment for Nvidia. It's already had effects, with 2 GPP members releasing AMD GPUs without their normal gaming branding attached. It's as dirty as it sounds. Even if Nvidia gets slammed by regulatory sanctions, the damage may already be done, which is the whole point of such shady practices. I'm sure Nvidia would happily pay any fine they receive if it meant the death of AMD's GPU division. It would be worth it for them.

    ...as you and others here know, I tend to value VRAM higher than boatloads of stream processors as I create highly complex and detailed scenes (remember Alpha Channel's epic sweeping scenes? Yeah, as a former painter, that is my inspiration).  The 999$ Vega Frontier with 16 GB of HBM2 would be very attractive, however Iray is CUDA only, and as we are just a small segment of the GPU consumer base next to gaming enthusiasts, in that light the notion of "competition" from AMD has been and always will be moot.  Add to this Otoy's forthcoming Octane 4, which will be available for 20$ a month and offers fast out of core rendering that makes the need for a super high VRAM GPU unnecessary (and will also support both OpenCL and the Vulkan API).

    While Daz incorporating Iray seemed a great advancement back in 2015, I find it has become somewhat limited for many given what has happened with GPU pricing and availability.  If they also had a similar out of core render mode, the situation might be different.  3DL and Octane both have standalone solutions (3DL's is free up to 4 cores/8 threads).  Iray's standalone is not only expensive (like the full unlimited core version of 3DL) but has no "third party" processing interface for Daz (like Renderman RIB for 3DL or Reality for LuxRender), only the high priced pro grade software.  Were there one for Daz, it would at least reduce the resource demand of having to keep the scene open in Daz while rendering in CPU mode. In that light it would be a bit faster as there would be less chance of falling into much slower swap mode.

    And if the Vega Frontier was able to better compete with Nvidia then you would see Nvidia follow suit with a 16 GB card of their own closer to that price range. Instead, we got a Titan V that is freakin' $3000. So again, that is why competition is so vital. Just like how we can thank AMD for finally getting us past 4 cores in mainstream CPUs because of how well Ryzen competes against Intel. But AMD's Vega has struggled, in large part because AMD simply cannot manufacture these things. Plus the performance of Vega is nowhere near as groundbreaking as Ryzen.

    So it bears repeating: how well AMD competes has a direct effect on what you can do with Daz Iray, even though Daz Iray cannot use AMD cards. This is why you want AMD to do well, to push Nvidia to do better. One would also hope that other rendering software would serve as competition for Iray to push it to improve as well. But that is not as likely (IMO); gamers and now AI are the things Nvidia focuses on first. That Pascal did not ship with a working Iray driver is all the proof one needs to understand Nvidia's priorities. You think Nvidia would ship GPUs with no game drivers and expect gamers to wait several months to get them? LOL, such a farce would put them out of business.

    Iray is going to have some serious shortcomings if it doesn't keep up with other render engines this year. That's also something I have been warning about for a really long time, but that's another topic.

  • kyoto kid Posts: 41,851

    ...the issue with Nvidia is they will not compete against their more expensive pro grade Quadro line.  This is part of why VRAM for the GTX and Titan lines is capped at 11 and 12 GB respectively.  If they came out with a 16 GB GDDR6 GTX card even at the MSRP of a TitanXp, it would put a dent in sales of their 2,000$ Quadro P5000 and make the P4000 moot (actually both the standard 1080 and 1080 Ti do the latter).    However, if funds were not a concern and I was given the choice between the P5000 and the Titan V I'd actually take the Quadro for the higher VRAM as again for my purposes, that is more important.

    Part of the reason behind AMD's troubles with Vega Frontier is that HBM2 memory is in even shorter supply than standard VRAM (all Quadros save for the 7,500$ GP100 still use GDDR5/5X).

    Sadly as I have mentioned we are a small segment of the GPU buying public, so our needs are not that high of a priority.  Indeed, if they did the same to gaming enthusiasts as they did to us with Pascal, it would have produced a major uproar and possibly driven some to the other side, as games are not dedicated to a specific "brand" of GPU like render engines are.

    I already consider Iray to have major shortcomings compared to Octane and the forthcoming Octane 4.  Either you need a big VRAM GPU, which for many has become stupidly expensive these days, or you are stuck for the most part with rendering on the much slower CPU. Again, with Octane's out of core mode you don't need an ultra high memory GPU card or have to spend time optimising to get improved rendering performance.  If Nvidia faces any challenge in the near future it may come not so much from AMD as from developers like Otoy, since V4 will also support OpenCL as well as Intel integrated graphics, no longer making it "brand specific" like Iray.

    ...and at 20$ a month for the subscription path, I see it as a better more elegant solution to the situation.

  • bluejaunte Posts: 1,990

    Do you know how much VRAM your scenes actually require?

  • kyoto kid Posts: 41,851

    ...more than the card I currently have as they more often than not dump to the CPU.   I tend to prefer creating vast sweeping scenes with a high amount of detail and effects at large resolution sizes which comes from my painting days.  Not into portraits or simple vignettes.

  • bluejaunte Posts: 1,990

    But you currently have 2GB? What makes you think your stuff wouldn't fit in 11GB? The log file should contain the actual amount that your scene required. Something like

    IRAY   rend stat : Texture memory consumption: 1.17012 GiB for 39 bitmaps (device 0)
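    If digging through the log by hand is a pain, a small script can pull those lines out. A sketch (the log-file location varies by install; Help > Troubleshooting in Daz Studio should point you at the actual file, so pass that path in):

    ```python
    import re
    import sys

    # Iray writes lines like:
    #   IRAY   rend stat : Texture memory consumption: 1.17012 GiB for 39 bitmaps (device 0)
    # This only surfaces those "memory consumption" lines; texture and geometry are
    # reported separately, so read it as a gauge rather than an exact scene total.
    PATTERN = re.compile(r"memory consumption:\s*([\d.]+)\s*GiB", re.IGNORECASE)

    def report(log_path: str) -> None:
        found = []
        with open(log_path, "r", errors="ignore") as log:
            for line in log:
                match = PATTERN.search(line)
                if match:
                    found.append((float(match.group(1)), line.strip()))
        if not found:
            print("No memory-consumption lines found in", log_path)
            return
        for gib, line in sorted(found, reverse=True)[:10]:
            print(f"{gib:8.3f} GiB | {line}")

    if __name__ == "__main__":
        report(sys.argv[1])  # pass the path to your Daz Studio log file
    ```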

  • kyoto kid Posts: 41,851
    edited March 2018

    ...4 GB, I acquired an old Gigabyte 750 Ti.

    Also don't forget that the larger the resolution size, the more memory that is required.
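    Back-of-the-envelope numbers for that point (the render buffer alone, which is only one of several things Iray keeps in VRAM, so treat this as a floor rather than a total):

    ```python
    def buffer_mib(width: int, height: int, channels: int = 4, bytes_per_channel: int = 4) -> float:
        """Approximate size of one float32 RGBA framebuffer in MiB."""
        return width * height * channels * bytes_per_channel / (1024 ** 2)

    for w, h in [(1920, 1080), (3840, 2160), (7680, 4320)]:
        print(f"{w}x{h}: ~{buffer_mib(w, h):.0f} MiB per float32 RGBA buffer")
    ```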

    Post edited by kyoto kid on
  • drzap Posts: 795
    edited March 2018
    outrider42 said:

    And if the Vega Frontier was able to better compete with Nvidia then you would see Nvidia follow suit with a 16 GB card of their own closer to that price range. Instead, we got a Titan V that is freakin' $3000. So again, that is why competition is so vital. Just like how we can thank AMD for finally getting us past 4 cores in mainstream CPUs because of how well Ryzen competes against Intel. But AMD's Vega has struggled, in large part because AMD simply cannot manufacture these things. Plus the performance of Vega is nowhere near as groundbreaking as Ryzen.


    The Titan V wasn't developed for the amateur 3D market, so technically, "we" didn't really get anything.  It's meant for the science and AI researchers and they are getting a hellava bargain at $3000.    "Our" card is due to hit this summer and then we can judge whether or not we are getting fleeced due to lack of competition.  We can decide for ourselves if we want to spend the quid for a Titan V, but to complain that Nvidia is marketing an overpriced card to 3D users is a misunderstanding of where (or to whom) the card is aimed.

    Post edited by drzap on
  • vrba79 Posts: 1,432

    I ended up going the APU build route. Not regretting my decision thus far.

  • Ghosty12 Posts: 2,080

    Just added another update in this saga, with the news that it is looking really good for those of us looking for video cards.. :)

  • outrider42 Posts: 3,679
    drzap said:

    The Titan V wasn't developed for the amateur 3D market, so technically, "we" didn't really get anything.  It's meant for the science and AI researchers and they are getting a hellava bargain at $3000.    "Our" card is due to hit this summer and then we can judge whether or not we are getting fleeced due to lack of competition.  We can decide for ourselves if we want to spend the quid for a Titan V, but to complain that Nvidia is marketing an overpriced card to 3D users is a misunderstanding of where (or to whom) the card is aimed.

    You are misunderstanding my point. Are you telling me that if AMD was competing with the Titan V that it would still be $3000? No, it would not.

    I'm not sure where you get this idea that Titans are exclusively for pro industries. The original Titans had GTX in their name, and Titans have been marketed as GTX members, and the card is sort of a hybrid between GTX and pro level series. And what is GTX? GTX is their gaming brand, is it not? Titans get gaming drivers. If Nvidia truly wanted to sell this as a pro card then it would have had a different name branding on it, and they have Quadro for this purpose. The Titan is their "pony car", but make no mistake, it is still a car. Not a truck, which is what a pro card would be compared to.

    And again, history is on that side. Titans have sold for $1000, $1200, and $1500 in past iterations, which is not so out of reach for "us". These were not decades ago, so inflation is not the cause of this price. The Titan Z was $3000, but it was not received very well because of its price.

    If you watch Nvidia's conference it will be quite clear that what they are gearing towards science and research is Quadro, not Titan. There is no mention of Titan whatsoever at this conference.

  • Gazukull Posts: 96

    For those folks interested in the article I had mentioned: I followed this guy's build pretty much, except even used ECC RAM is up in price now.

    http://www.beer30.org/building-a-32-thread-workstation-for-under-750/

  • drzap Posts: 795
    edited March 2018

    If you look at the spec sheet for TitanV, you will see one unusual line that stands out as different from any other generation:  Tensor cores.   Now, what use would a non-scientist / AI researcher have for tensor cores?  We don't use tensor cores in gaming or 3D rendering or anything outside a laboratory.  So, should we choose to pay $3000 for the thing, please understand that we are paying extra for things that are useless in our 3D work, and that responsibility lies completely in our laps, not with Nvidia or AMD.  The TitanV is a break from the past.  A budget scientific research card.   It even offers buyers free access to scientific software!  Nvidia has no obligation to keep its Titan line consistent with any of our expectations.  And to expect AMD to automatically have a rival card for this very new niche is not very realistic.

    Post edited by drzap on
  • agent unawares Posts: 3,513
    drzap said:

    If you look at the spec sheet for TitanV, you will see one unusual line that stands out as different from any other generation:  Tensor cores.   Now, what use would a non-scientist / AI researcher have for tensor cores?

    Aren't tensor cores extremely likely to be used in AI denoising?

  • tj_1ca9500b Posts: 2,057

    So, the 32GB HBM2 Quadro GV100 was officially launched today, with a whopping $8,999 MSRP.

    Doing a quick comparison of the specs between it and the P6000 (24GB GDDR5X), which has a $4,999 MSRP, shows a small bump in 32 bit compute, from 12 Tflops (P6000) to 14.8 Tflops (GV100), despite having a significantly higher core count. 

    Almost twice the money for an extra 33% of VRAM seems steep to me... even if it is say 23% faster... I'm struggling hard enough already trying to swallow maybe buying a $5K card price wise, and $9K is just, ugh!
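    Putting the same comparison into ratios, using the MSRPs and spec figures quoted above:

    ```python
    # Ratios between the Quadro GV100 and P6000, using the figures quoted above.
    gv100 = {"price_usd": 8999, "vram_gb": 32, "fp32_tflops": 14.8}
    p6000 = {"price_usd": 4999, "vram_gb": 24, "fp32_tflops": 12.0}

    for key in ("price_usd", "vram_gb", "fp32_tflops"):
        print(f"{key}: {gv100[key] / p6000[key]:.2f}x")  # ~1.80x, ~1.33x, ~1.23x
    ```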

    ----

    Anyways, back on topic, I think AMD and NVidia will do just fine if/once the cryptominers move on to more specialized solutions.  It'll take some time for those solutions to hit the market en masse, and there are a number of other segments that are chomping at the bit for GPU capacity (deep learning, etc.).  Most of those segments are more NVidia oriented, of course.  Plus, I think that both companies have been careful about ramping up their production, so while there may be a short term glut of cards flooding the market, well they've weathered that before.

     

  • Almost twice the money for an extra 33% of VRAM seems steep to me... even if it is say 23% faster... I'm struggling hard enough already trying to swallow maybe buying a $5K card price wise, and $9K is just, ugh!

    It's not ugh for the people it's designed for... the cost/energy/space saving is worth it to enterprise-level people, whose pockets are deeper than ours... I know what's on my Christmas list this year :)

  • Gazukull Posts: 96

    From anandtech, pertinent to the discussion.

  • drzap Posts: 795
    edited March 2018

    "Aren't tensor cores extremely likely to be used in AI denoising?"   

    I have been following this line of thinking with a view to purchasing Voltas for a rendering rig.  Since Redshift, Iray, and Octane all have working builds with AI denoising, my primary conclusion is that tensors will have no significant implications for the end user.  However, tensor cores probably quicken the training of the denoiser but that is largely on the developer side before the product is shipped.  Further training is done as more renders are made, which don't involve tensor cores.  The basis for my assumptions is from the Redshift forums where users have already started using the TitanV and V100 in production work.  Of course, if the final release of the AI denoiser allows tensor cores to speed up the render, I would be tickled pink.  In the meantime, I have gone ahead with my purchase of several 1080ti's until everything shakes out.

    Post edited by drzap on
  • kyoto kid Posts: 41,851

    ...actually the GV100 compares more with its predecessor, the 16 GB GP100, in performance; the latter retails for around 7,000$.

    Save for single-precision FP and boost clock performance, the P6000 pretty much falls rather short:

    Memory bus width (10.6x that of the P6000)
    Error correction (full vs. half)
    FP64 performance (roughly 19.5x that of the P6000)
    Tensor performance of 118.5 TFlops (nothing comparable on the P6000)
    More than one third additional CUDA cores.

    ..and NVLink support between cards at 200GB/s (vs. 2GB/s bandwidth for high bandwidth SLI) as well as allowing memory pooling for pure compute tasks (It still retains PCIe 3.0 x 16 as the system connection though).
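    A quick sanity check on those ratios, using the published (approximate) spec numbers for the two cards:

    ```python
    # Approximate published specs: Quadro GV100 vs Quadro P6000.
    specs = {
        "memory bus width (bits)": (4096, 384),   # HBM2 vs GDDR5X
        "FP64 throughput (TFlops)": (7.4, 0.37),
        "CUDA cores": (5120, 3840),
    }

    for name, (gv100, p6000) in specs.items():
        print(f"{name}: {gv100} vs {p6000} -> {gv100 / p6000:.1f}x")
    ```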

  • kyoto kid Posts: 41,851

    ..meanwhile back to the topic. 

    Interesting news. Only time will tell if Ethereum and Monero miners see the new ASICs as a more solid hardware investment and if the hit will be big enough to get GPU prices back in line.

    At best I see this happening in Q1, maybe Q2 of next year, as miners will still look to squeeze as much return out of their investment in GPU cards before making the switch and disposing of them on the resale market. By then we will be looking at the successor to Pascal (whatever Nvidia plans to call it), most likely with a price bump due to the new technology along with the continued shortage of memory chips and wafers.

    I wouldn't expect to see a new 8 GB GTX card priced under 400$ like the 1070 was, with 11 maybe 12 GB remaining the VRAM cap for the the consumer line even given the boost to Nvidia's top end Quadro card (most likely the Volta replacement for the P5000 will continue to have 16 GB but be upgraded to HBM2 memory and have Tensor cores as well as NVLink compatibility along with a price increase).

  • KitsumoKitsumo Posts: 1,221
    edited March 2018
    kyoto kid said:

    ..meanwhile back to the topic. 

    Interesting news. Only time will tell if Ethereum and Monero miners see the new ASICs as a more solid hardware investment, and if the hit will be big enough to get GPU prices back in line.

    At best I see this happening in Q1, maybe Q2 of next year, as miners will still look to squeeze as much return out of their investment in GPU cards before making the switch and disposing of them on the resale market. By then we will be looking at the successor to Pascal (whatever Nvidia plans to call it), most likely with a price bump due to the new technology along with the continued shortage of memory chips and wafers.

    I wouldn't expect to see a new 8 GB GTX card priced under $400 like the 1070 was, with 11 or maybe 12 GB remaining the VRAM cap for the consumer line even given the boost to Nvidia's top-end Quadro card (most likely the Volta replacement for the P5000 will keep 16 GB but move to HBM2 memory and gain Tensor cores and NVLink support, along with a price increase).

    Well, Monero already made a change to its algorithm to avoid ASIC use, so there's nothing to say they can't do it again. And the Ethereum creators specifically said they want their currency mined on GPUs which everyone can afford, as opposed to only people rich enough to buy ASICs cornering the market. So it looks like they're prepared to do whatever it takes to prevent ASIC mining. But at least the ETH price is still dropping. It's at its lowest since the surge started in December, and profitability is at an all-time low. Prices should start to fall soon, hopefully.
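    To illustrate what "ASIC resistant" usually means in practice, here is a toy sketch (not the real Ethash or CryptoNight algorithm): each hash is forced to do pseudo-random reads from a large dataset, so raw arithmetic speed alone doesn't help, and an ASIC would also need a lot of fast memory, which erodes its advantage over a GPU.

```python
# Toy memory-hard hash -- NOT Ethash/CryptoNight, just the general idea:
# every round does a random fetch from a big dataset, so memory bandwidth
# and capacity, not just ALU speed, bound the hash rate.
import hashlib, random

DATASET_WORDS = 1_000_000                     # stand-in for Ethereum's multi-GB DAG
dataset = [random.getrandbits(64) for _ in range(DATASET_WORDS)]

def memory_hard_hash(nonce, rounds=64):
    acc = nonce
    for _ in range(rounds):
        acc ^= dataset[acc % DATASET_WORDS]   # pseudo-random memory fetch
        acc = int.from_bytes(
            hashlib.sha256(acc.to_bytes(8, "little")).digest()[:8], "little")
    return acc

print(hex(memory_hard_hash(42)))
```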

    I can't decide between getting a 1080ti now or waiting to see what the new cards look like. I don't know if they will be that big of a leap over the current cards. Plus how long after the xx70/xx80 release are we going to have to wait for Ti versions? Plus availability. Lots of frustrated gamers have been waiting to buy for about 7 months now. This could be a long summer.

    Post edited by Kitsumo on
  • CybersoxCybersox Posts: 9,275

    All I know is that I'm so glad I broke down and bought a 1070 last year before the prices spiked.  I just did a lookup on Amazon and the exact same model I bought for $339.99 last May now sells for $699.99, and even the used ones are selling for over $500.  :o  Instead of investing in stocks and real estate, I should have invested in Nvidia cards...

     
  • kyoto kidkyoto kid Posts: 41,851
    edited March 2018
    Kitsumo said:

    Well, Monero already made a change to its algorithm to avoid ASIC use, so there's nothing to say they can't do it again. And the Ethereum creators specifically said they want their currency mined on GPUs which everyone can afford, as opposed to only people rich enough to buy ASICs cornering the market. So it looks like they're prepared to do whatever it takes to prevent ASIC mining. But at least the ETH price is still dropping. It's at its lowest since the surge started in December, and profitability is at an all-time low. Prices should start to fall soon, hopefully.

    I can't decide between getting a 1080ti now or waiting to see what the new cards look like. I don't know if they will be that big of a leap over the current cards. Plus how long after the xx70/xx80 release are we going to have to wait for Ti versions? Plus availability. Lots of frustrated gamers have been waiting to buy for about 7 months now. This could be a long summer.

    ...what I mentioned was a "best case scenario"; however, I am still more inclined to agree with your outlook that ETH and Monero miners (as well as whatever new coins become "hot") will continue using GPUs.  This indeed poses a bleak future for us, as card prices will continue to remain artificially high due to demand outpacing supply.  This is why I feel the affordability window has closed and will stay closed for a while to come.

    For the moment I'm just biding my time working with 3DL until Octane 4 is released.  Interestingly, in stepping back, I find I am enjoying this again, not having to wait countless hours to see the results.

    Post edited by kyoto kid on
  • freni-kynfreni-kyn Posts: 394
    nicstt said:

    Time to switch to AMD, and their new Rendering system; best part is you can use AMD and NVidia cards together.

    You can??? 

  • KitsumoKitsumo Posts: 1,221
    kyoto kid said:

    ...what I mentioned was a "best case scenario"; however, I am still more inclined to agree with your outlook that ETH and Monero miners (as well as whatever new coins become "hot") will continue using GPUs.  This indeed poses a bleak future for us, as card prices will continue to remain artificially high due to demand outpacing supply.  This is why I feel the affordability window has closed and will stay closed for a while to come.

    For the moment I'm just biding my time working with 3DL until Octane 4 is released.  Interestingly, in stepping back, I find I am enjoying this again, not having to wait countless hours to see the results.

    I can't predict what the crypto guys are going to do. I know it's pretty much unprofitable to mine with just about any card, as you can see here (you can plug in different cards and see the expected profit). So unless someone has free electricity, it's unprofitable to mine Ethereum for now. The question is how many miners will stubbornly continue buying cards, hoping the ETH price goes back up. I'm not trying to be a Debbie Downer, I guess I'm just trying not to get my hopes up too high.
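    For anyone curious what those calculator sites are doing under the hood, it boils down to arithmetic like the sketch below; every input (hashrate, yield per MH/s, coin price, power draw, electricity rate) is an illustrative assumption here, not live data.

```python
# Back-of-envelope Ethereum mining profitability (all inputs are assumptions).
hashrate_mh    = 30.0      # Ethash hashrate of a mid-range card, MH/s (assumed)
eth_per_mh_day = 0.00004   # difficulty-dependent yield, ETH per MH/s per day (assumed)
eth_price_usd  = 400.0     # assumed spot price
card_watts     = 150.0     # at-the-wall draw while mining (assumed)
kwh_price      = 0.12      # USD per kWh (assumed)

revenue    = hashrate_mh * eth_per_mh_day * eth_price_usd
power_cost = card_watts / 1000 * 24 * kwh_price
print(f"daily revenue: ${revenue:.2f}")
print(f"daily power:   ${power_cost:.2f}")
print(f"daily profit:  ${revenue - power_cost:.2f}")
```

    With inputs like these the margin is pennies a day, which is the point those calculators make.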

    I just realized something new. I used to buy a $150 card every three or four years. Now I'm considering a $900 card. Is this going to be the new normal? I guess the technology is advancing. I can look at pretty pictures while I eat my peanut butter and jelly sandwiches and huddle around my pc for warmth. During the housing bubble, Home Depot was rumored to be introducing a home equity line of credit so you could shop in their stores and charge it directly to your HELOC. Maybe Nvidia will have to try that so we can afford their cards. Either that or we all work for them and get paid in cards.cheeky

  • NathNath Posts: 2,941
    Kitsumo said:

    I can't predict what the crypto guys are going to do. I know it's pretty much unprofitable to mine with just about any card, as you can see here (you can plug in different cards and see the expected profit). So unless someone has free electricity, it's unprofitable to mine Ethereum for now. The question is how many miners will stubbornly continue buying cards, hoping the ETH price goes back up. I'm not trying to be a Debbie Downer, I guess I'm just trying not to get my hopes up too high.

    I just realized something new. I used to buy a $150 card every three or four years. Now I'm considering a $900 card. Is this going to be the new normal? I guess the technology is advancing. I can look at pretty pictures while I eat my peanut butter and jelly sandwiches and huddle around my pc for warmth. During the housing bubble, Home Depot was rumored to be introducing a home equity line of credit so you could shop in their stores and charge it directly to your HELOC. Maybe Nvidia will have to try that so we can afford their cards. Either that or we all work for them and get paid in cards.cheeky

    I see a (tongue in cheek) option here: Nvidia introduces its own cryptocurrency, and if you opt in to mine for them you get a card at a reduced price (or a mining-based subscription, money back after every milestone achieved or something). What could possibly go wrong?

  • nicsttnicstt Posts: 11,715
    freni-kyn said:

    You can??? 

    Of course, although it takes some work: export to Blender and use Cycles for now; as soon as the AMD renderer works in Blender, use that instead.

    ... I also saw a thread somewhere about someone making a plugin for AMD's renderer for Studio.

  • kyoto kidkyoto kid Posts: 41,851
    edited March 2018
    Kitsumo said:

    I can't predict what the crypto guys are going to do. I know it's pretty much unprofitable to mine with just about any card, as you can see here (you can plug in different cards and see the expected profit). So unless someone has free electricity, it's unprofitable to mine Ethereum for now. The question is how many miners will stubbornly continue buying cards, hoping the ETH price goes back up. I'm not trying to be a Debbie Downer, I guess I'm just trying not to get my hopes up too high.

    I just realized something new. I used to buy a $150 card every three or four years. Now I'm considering a $900 card. Is this going to be the new normal? I guess the technology is advancing. I can look at pretty pictures while I eat my peanut butter and jelly sandwiches and huddle around my pc for warmth. During the housing bubble, Home Depot was rumored to be introducing a home equity line of credit so you could shop in their stores and charge it directly to your HELOC. Maybe Nvidia will have to try that so we can afford their cards. Either that or we all work for them and get paid in cards.cheeky

    ...not being a "Debbie Downer" at all. People who believe they can get what they think is "free money" will pursue it even if it ends up costing them more in the long run (look at chronic gamblers).  The possibility of a big payout blinds them to the investment cost. I see this whole situation devolving into little more than a new form of gambling, hoping that another spike in value will occur like the one in late December (ETH also rose considerably, just not to the ridiculous high that Bitcoin did).  Maybe the novelty will eventually wear off (of course I had hoped the same would have occurred in the case of SUVs, so much for that) and people will go back to their normal lives again.

    We can only hope.

    Post edited by kyoto kid on
  • kyoto kid said:

    ...not being a "Debbie Downer" at all. People who believe they can get what they think is "free money" will pursue it even if it ends up costing them more in the long run (look at chronic gamblers).  The possibility of a big payout blinds them to the investment cost. I see this whole situation devolving into little more than a new form of gambling, hoping that another spike in value will occur like the one in late December (ETH also rose considerably, just not to the ridiculous high that Bitcoin did).  Maybe the novelty will eventually wear off (of course I had hoped the same would have occurred in the case of SUVs, so much for that) and people will go back to their normal lives again.

    We can only hope.

    SUVs have been around since the 1960s... I can name at least four models off the top of my head that have been around that long, at least here in the US.
  • kyoto kidkyoto kid Posts: 41,851
    edited March 2018

    ...yeah, but not in the proliferation we have today, which began in the mid-1990s when every car company from BMW to VW to even "down to earth" Saturn began making them.  Crikey, while waiting for a bus in the city centre I often see more SUVs (not counting pickups) than sedans these days.

    Those old ones were at least practical, rugged, and designed to be work vehicles, with the only "luxury" appointments being an AM radio and heat. Oh, and back then they were simply called "four wheel drives" or "utility vehicles".  The term "Sport Utility" borders on being an oxymoron.

    Most of today's are built for status and looks, not practicality. Take an Escalade where an old Land Rover would easily go and it would look like it had been through a demo derby, if it even survived the trip.

    Post edited by kyoto kid on