OT: New Nvidia Cards to come in RTX and GTX versions?! First whispers of an RTX Titan.


Comments

  • linvanchene Posts: 1,386
    edited November 2018

    @ Octane

    2018-11-23: edited to remove outdated information


    Please check the official thread on the Otoy forum for updated information about V4 licensing.
    OctaneRender 4 / All Access - Update FAQ
    https://render.otoy.com/forum/viewtopic.php?f=7&t=69646

    - - -

    @ RTX pricing

    Prices may be different in certain regions.

     

    But at least in my store it would be more expensive to purchase two GTX 1080 Ti than one RTX 2080 Ti.

    It would be even more expensive to upgrade to an i9 system mainboard to use four instead of two cards.

    Going from a 1070 to a 2080 Ti for rendering, you get the performance you pay for.

    To me it is absolutely amazing that people can get almost the performance of four GTX 1080 Ti for rendering by purchasing two RTX 2080 Ti...
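    To put rough numbers on that, here is a back-of-the-envelope sketch (hypothetical, not a benchmark): it assumes the claim above that one RTX 2080 Ti renders about as fast as two GTX 1080 Ti, and the $700 / $1200 US launch prices cited later in this thread.

    ```python
    # Illustrative price-per-throughput comparison.
    # Assumptions (not measurements): one 2080 Ti ~= two 1080 Ti for
    # rendering, per the claim above; US launch prices of $700 and $1200.
    cards = {
        "GTX 1080 Ti": {"price": 700,  "speed": 1.0},  # speed in "1080 Ti units"
        "RTX 2080 Ti": {"price": 1200, "speed": 2.0},
    }
    for name, c in cards.items():
        print(f"{name}: ${c['price'] / c['speed']:.0f} per 1080 Ti of rendering speed")
    # GTX 1080 Ti: $700 per unit of speed
    # RTX 2080 Ti: $600 per unit of speed
    # Two 2080 Ti (~$2400) would then roughly match four 1080 Ti (~$2800)
    # while occupying two PCIe slots instead of four.
    ```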

    - - -

     

    @ Test Results

     

    Edited 2018-10-23

    - added proper dimensions

    - updated screenshot

    Maybe I am amazed at all this progress because I started GPU rendering in 2013, when I had to struggle to fit a 1.6 GB scene into a 2 GB card and it took more than 24 hours to render a scene at 3840 x 2160.

    I increased the render size of the Iray test scene 2: ~1 h 13 min at 7000 x 9100 resolution.

    This would have taken several days to render a few years ago without GPU rendering...
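    For scale, some simple pixel arithmetic (assuming render time grows roughly linearly with pixel count at fixed quality settings, which is only approximately true):

    ```python
    # Compare pixel counts of the 2013 render and this one.
    old_pixels = 3840 * 2160   # the frame that took more than 24 h in 2013
    new_pixels = 7000 * 9100   # this render: ~1 h 13 min
    print(f"{new_pixels / old_pixels:.1f}x the pixels")   # ~7.7x
    # ~7.7x the pixels in roughly 1/20th of the wall-clock time suggests a
    # throughput gain on the order of 150x between the two setups; different
    # scenes and settings, so take it as a rough indication only.
    ```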

    - - -

     

     

  • kyoto kid Posts: 41,844

    ...if I had to pay $195 for the Daz plugin to use the free version, the notion of "free" would be pretty much moot. $20 a month to have access to the Daz plugin as well is a much better deal.

    So reading the details, the $20 price includes a subscription to both Octane VR and the Daz plugin, as each is listed at $10.

  • linvanchene Posts: 1,386
    edited November 2018

    2018-11-23: edited to remove outdated information


    Please check the official thread on the Otoy forum for updated information about V4 licensing.
    OctaneRender 4 / All Access - Update FAQ


    https://render.otoy.com/forum/viewtopic.php?f=7&t=69646

  • outrider42 Posts: 3,679
    edited October 2018
    DirectX 12 is only for gaming anyway. There is a game called "Enlisted" which claims it can run at 4K and 90 frames per second with ray tracing enabled. However, I don't think they use the full feature set, and they use Vulkan instead of DirectX 12. The game is an MMO and chose Vulkan for that reason.

    Battlefield 5 is now said to run at 4K 30 fps or 1080p 60 fps. That's an improvement, and perhaps more time to polish will yield better performance. All of the demos shown during the Turing launch were made very quickly on short notice; some teams only had 5 days with Turing hardware to put together their demo.

    But none of that pertains to Iray or OptiX Prime, nor does it change the sting of the prices. The Linus video on the 2070 is pretty much like every review of the 2070 I have seen. They all see a tiny bit better value compared to the other RTX cards, but not really. $500 is not much of a value for slightly-better-than-1080 performance. Past x70 models launched with better performance than the previous gen's x80, and they did so at a better price. The 670 beat the 580, and did so for much less. The 770 beat the 680, the 970 beat the 780, and the 1070 destroyed the 980, and in all of these cases the new x70 came in at a much lower price. But the 2070 launches at a price too similar to the 1080's. Just like how the 2080 launched at a price higher than the 1080 Ti, and the 2080 Ti launched at... ridiculous.

    It just happens that there are plenty of unsold Pascal cards in stock, though, unlike any previous generation...

  • I like the denoising function in Iray, but it doesn't work well for skin surfaces. After the denoising, an image is very plastic and unnatural. In the worst scenario, with a small number of samples, there are even geometry distortions.

    Otoy has more accurate denoising. I imported Vic7 into Unity3D and ran Octane Render 4.

    2 (two) samples:

    Daz3D Iray

    image

    Unity/Octane

    image

     

    5 samples:

    Daz3D Iray

    image

    Unity/Octane

    image

     

    50 samples:

    Daz3D Iray (17 sec)

    image

    Unity/Octane (4 sec)

    image

  • nicstt Posts: 11,715

    I agree that denoising skin is effectively useless.

  • Kitana Aeon Posts: 17
    edited October 2018
    nicstt said:

    I agree that denoising skin is effectively useless.

    There needs to be a custom shader designed that tells the renderer not to denoise skin, or some special settings that make denoising useful.

  • Aala Posts: 140
    nicstt said:

    I agree that denoising skin is effectively useless.

    There needs to be a custom shader designed that tells the renderer not to denoise skin, or some special settings that make denoising useful.

    Can't do it on the shader end, because the denoiser is a post-processing effect. What actually needs to happen is for the denoiser, since it's machine-learning based, to be taught not to denoise skin.
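    One workaround that should be possible in principle, precisely because the denoiser is a post-process: render the denoised and raw images as separate passes and composite them with a skin mask, keeping the raw (noisy but undistorted) pixels on skin. A minimal sketch of the compositing step, assuming your renderer can export a raw beauty pass, a denoised pass, and a skin matte; this is not a built-in feature of Iray or Octane:

    ```python
    import numpy as np

    def keep_raw_on_skin(raw, denoised, skin_mask):
        """Composite: raw render where the mask is 1 (skin), denoised elsewhere.

        raw, denoised: float arrays, shape (H, W, 3), values in [0, 1]
        skin_mask:     float array, shape (H, W), 1.0 on skin pixels
        """
        m = skin_mask[..., np.newaxis]   # broadcast mask to (H, W, 1)
        return m * raw + (1.0 - m) * denoised

    # Illustrative usage with stand-in data:
    h, w = 64, 64
    raw = np.random.rand(h, w, 3)        # pretend: noisy beauty pass
    denoised = np.random.rand(h, w, 3)   # pretend: denoised pass
    mask = np.zeros((h, w))
    mask[16:48, 16:48] = 1.0             # pretend: skin matte
    out = keep_raw_on_skin(raw, denoised, mask)
    ```

    The skin region would still need enough samples to converge on its own, but the rest of the image could stop rendering much earlier.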

  • Kitana Aeon Posts: 17
    edited October 2018

    Can someone link me to the benchmarking thread? I can't find it.

  • outrider42 Posts: 3,679
    The link is in my sig.
  • Thank you.

  • outrider42 Posts: 3,679

    Hmm... that's a blower-style card, so it may be a reference design. If that is the case, it will be running at a lower clock speed than other models, plus a blower can be less efficient at cooling, though that may depend. In a multi-GPU setup a blower may not be a bad thing, because it ejects the air out the back instead of simply blowing its warm air around the case. So some food for thought, but in general people prefer dual or triple fan coolers.

    Right now, like literally right now for the next FIVE HOURS, Newegg has an MSI Gaming 1080 for $489 plus a $20 rebate, meaning the card is $469 after rebate.

    https://www.newegg.com/Product/Product.aspx?Item=N82E16814127950&cm_re=1080-_-14-127-950-_-Product

    I have an MSI Gaming 970 and I like it. It runs cool on air, and I would expect this 1080 to perform similarly since it uses much the same cooling.

    Anyway, I bought an EVGA 1080 Ti SC2 for just over $500 a few weeks ago. :) Used, from eBay, with no tax and free shipping, plus it still has 2 years of warranty. I think that will do fine. When the scene fits both the 970 and the 1080 Ti, I see a nice boost above what the 1080 Ti can do alone. So I am quite pleased. After tax these 1080s would cost about the same, but of course the 1080 Ti has that extra 3 GB and a boatload of CUDA cores.

  • Artini Posts: 10,290

    Are there any thoughts on the RTX 2070 cards?

    They have just started appearing and some of them have quite good prices, like the MSI RTX 2070 ARMOR 8G or the Palit GeForce RTX 2070 Dual.

    Still, one can find good deals on a GTX 1080 or GTX 1080 Ti, but if one would like to go for RTX cards, maybe it is an option.

  • nicstt Posts: 11,715
    edited October 2018
    Artini said:

    Are there any thoughts on the RTX 2070 cards?

    They have just started appearing and some of them have quite good prices, like the MSI RTX 2070 ARMOR 8G or the Palit GeForce RTX 2070 Dual.

    Still, one can find good deals on a GTX 1080 or GTX 1080 Ti, but if one would like to go for RTX cards, maybe it is an option.

    Yeh, don't buy it.

    Personally, I think the 20 series cards are ridiculously overpriced. We're expected to pay now for something still not available... No thanks.

    The only 20 series card I can sort of understand folks buying is the Ti version; as silly as the price is, there is no card available that does what it does for the price.

  • Not good news that people are starting to have problems and RMA the 2080 Ti... BSODs, crashes, glitches, and so on. If you search on Google, the GeForce forums show the results.

  • algovincian Posts: 2,664

    https://forums.geforce.com/default/topic/1075374/geforce-rtx-20-series/2080-ti-fe-artifacts-crash-bsod-/1/

    Think I'll put my plans to go hardware shopping this month on hold and wait for the dust to settle.

    - Greg

  • I am not worried about new technology... If Daz cannot or will not keep up with new graphics technology, then it will be their problem...

  • nicstt Posts: 11,715

    How is Daz not keeping up?

    The 4.11 beta can use the 20 series cards, which provide increased render performance over the 10 series; I'm not sure how valuable that is, as I'm noticing considerably slower renders, although I have no idea what percentage of that is due to beta code.

  • outrider42 Posts: 3,679

    It's far too early to tell if a handful of forum posts indicate a broader problem. There are always going to be defective products, that is a given. And with how negative the reception to Turing is, the people covering tech are jumping on this. I bet you that if Turing were cheaper and making people happy, these forum posts would not even be in the news. The question that matters is whether there are a lot more defective cards than usual. Right now the third-party board makers and retailers that have been questioned are saying they have not seen an increase in returns. Of course that may not mean much, and it does not include Nvidia's own Founders Edition cards.

    However, on the flip side, it would be logical given the sheer size of the die. The 2080 Ti die is 754 mm², making it one of the largest chips Nvidia has ever made, and by a large margin. When you make such a large chip, there are more things that can go wrong. This is one of the reasons such large chips are rare (that and the cost, which poor yields directly drive up). I would bet this is a reason why the 2080 Ti is a cut-down version of the full chip... it will take a long time for Nvidia to get the near-perfect chips of a fully enabled Turing. So given this, I can see how this is possible.
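    To put rough numbers on the yield intuition, here is an illustrative sketch using the classic Poisson yield model; the defect density is an assumed value, not Nvidia's actual figure, and the GP102 die used in the 1080 Ti is about 471 mm² for comparison:

    ```python
    import math

    def poisson_yield(die_area_mm2, defects_per_cm2):
        """Fraction of dies with zero defects under a Poisson defect model."""
        return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

    D0 = 0.1  # defects per cm^2; an assumed, illustrative value
    for name, area_mm2 in [("GP102 (1080 Ti)", 471), ("TU102 (2080 Ti)", 754)]:
        print(f"{name}: {area_mm2} mm^2 -> ~{poisson_yield(area_mm2, D0):.0%} perfect dies")
    # GP102 (1080 Ti): 471 mm^2 -> ~62% perfect dies
    # TU102 (2080 Ti): 754 mm^2 -> ~47% perfect dies
    ```

    Partially defective dies can still be sold with the bad units disabled, which fits the 2080 Ti shipping as a cut-down TU102.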

    This could also be an issue with the new GDDR6 VRAM; being new, it could have teething problems. So you have a huge new chip and a new form of VRAM, and there are things that can go wrong.

    Also, several YouTubers experienced their own issues with review samples, so even they have had problems.

  • X3Z Posts: 14
    edited November 2018
    alex_e85 said:

    I like the denoising function in Iray, but it doesn't work well for skin surfaces. After the denoising, an image is very plastic and unnatural. In the worst scenario, with a small number of samples, there are even geometry distortions.

    Otoy has more accurate denoising. I imported Vic7 into Unity3D and ran Octane Render 4.

    In your examples Octane doesn't have SSS, only a simple diffuse layer with some kind of specularity, while Iray uses physical SSS in its skin shader. This is the big difference in calculations: you simply cannot denoise SSS, because it is nothing more than color noise until you get some more samples.
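    A one-line way to see why a handful of samples cannot work here (standard Monte Carlo error behavior, not specific to Iray or Octane): a pixel value is an average of N random path samples, so its noise only shrinks with the square root of N:

    ```latex
    \text{RMS noise} \propto \frac{\sigma}{\sqrt{N}}
    \qquad\Rightarrow\qquad
    \frac{\text{noise at } N=2}{\text{noise at } N=50} = \sqrt{\frac{50}{2}} = 5
    ```

    Stochastic SSS has a large per-sample variance σ, so at 2 to 5 samples the skin pixels are still dominated by color noise that no post filter can reliably separate from real detail.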

  • Ghosty12 Posts: 2,080
    edited November 2018
    nicstt said:
    Artini said:

    Are there any thoughts on the RTX 2070 cards?

    They have just started appearing and some of them have quite good prices, like the MSI RTX 2070 ARMOR 8G or the Palit GeForce RTX 2070 Dual.

    Still, one can find good deals on a GTX 1080 or GTX 1080 Ti, but if one would like to go for RTX cards, maybe it is an option.

    Yeh, don't buy it.

    Personally, I think the 20 series cards are ridiculously overpriced. We're expected to pay now for something still not available... No thanks.

    The only 20 series card I can sort of understand folks buying is the Ti version; as silly as the price is, there is no card available that does what it does for the price.

    Not to forget that some who bought the new cards are now having to RMA them due to the cards going kaput...

    It's far too early to tell if a handful of forum posts indicate a broader problem. There are always going to be defective products, that is a given. And with how negative the reception to Turing is, the people covering tech are jumping on this. I bet you that if Turing were cheaper and making people happy, these forum posts would not even be in the news. The question that matters is whether there are a lot more defective cards than usual. Right now the third-party board makers and retailers that have been questioned are saying they have not seen an increase in returns. Of course that may not mean much, and it does not include Nvidia's own Founders Edition cards.

    However, on the flip side, it would be logical given the sheer size of the die. The 2080 Ti die is 754 mm², making it one of the largest chips Nvidia has ever made, and by a large margin. When you make such a large chip, there are more things that can go wrong. This is one of the reasons such large chips are rare (that and the cost, which poor yields directly drive up). I would bet this is a reason why the 2080 Ti is a cut-down version of the full chip... it will take a long time for Nvidia to get the near-perfect chips of a fully enabled Turing. So given this, I can see how this is possible.

    This could also be an issue with the new GDDR6 VRAM; being new, it could have teething problems. So you have a huge new chip and a new form of VRAM, and there are things that can go wrong.

    Also, several YouTubers experienced their own issues with review samples, so even they have had problems.

    The other thing that I have heard, though it is just a rumor, is that these cards are just a stopgap until next year, when there is supposedly a new GPU coming out or something like that...

  • kyoto kid Posts: 41,844

    ...if that is the situation and AMD comes out with a "rocking" architecture first, Nvidia might find themselves in trouble.

  • outrider42 Posts: 3,679
    ghosty12 said:
    nicstt said:
    Artini said:

    Are there any thoughts on the RTX 2070 cards?

    They have just started appearing and some of them have quite good prices, like the MSI RTX 2070 ARMOR 8G or the Palit GeForce RTX 2070 Dual.

    Still, one can find good deals on a GTX 1080 or GTX 1080 Ti, but if one would like to go for RTX cards, maybe it is an option.

    Yeh, don't buy it.

    Personally, I think the 20 series cards are ridiculously overpriced. We're expected to pay now for something still not available... No thanks.

    The only 20 series card I can sort of understand folks buying is the Ti version; as silly as the price is, there is no card available that does what it does for the price.

    Not to forget that some who bought the new cards are now having to RMA them due to the cards going kaput...

    It's far too early to tell if a handful of forum posts indicate a broader problem. There are always going to be defective products, that is a given. And with how negative the reception to Turing is, the people covering tech are jumping on this. I bet you that if Turing were cheaper and making people happy, these forum posts would not even be in the news. The question that matters is whether there are a lot more defective cards than usual. Right now the third-party board makers and retailers that have been questioned are saying they have not seen an increase in returns. Of course that may not mean much, and it does not include Nvidia's own Founders Edition cards.

    However, on the flip side, it would be logical given the sheer size of the die. The 2080 Ti die is 754 mm², making it one of the largest chips Nvidia has ever made, and by a large margin. When you make such a large chip, there are more things that can go wrong. This is one of the reasons such large chips are rare (that and the cost, which poor yields directly drive up). I would bet this is a reason why the 2080 Ti is a cut-down version of the full chip... it will take a long time for Nvidia to get the near-perfect chips of a fully enabled Turing. So given this, I can see how this is possible.

    This could also be an issue with the new GDDR6 VRAM; being new, it could have teething problems. So you have a huge new chip and a new form of VRAM, and there are things that can go wrong.

    Also, several YouTubers experienced their own issues with review samples, so even they have had problems.

    The other thing that I have heard, though it is just a rumor, is that these cards are just a stopgap until next year, when there is supposedly a new GPU coming out or something like that...

    A lot of people say that, but I'm not so sure, because if they release a successor that quickly it will only undermine the Turing launch that much more. The public opinion of Nvidia is dropping like a rock right now. They may be #1 by a large margin now, but that can change very quickly if anybody suddenly competes, be it Intel or AMD. I really hope AMD can get it together like they have for CPUs; this industry needs competition.

    I don't know, Nvidia has gone so far off the path that nobody can really predict what they will do. I say they will only release a successor if AMD or Intel get something competitive out there. If not, then the '3000' series, or whatever it will be called, will wait for 2020. Otherwise there is no reason to launch a new series; they'd just be competing against themselves.
  • I say they will only release a successor if AMD or Intel get something competitive out there. If not, then the '3000' series, or whatever it will be called, will wait for 2020. Otherwise there is no reason to launch a new series; they'd just be competing against themselves.

    This... they hold all the cards now (pun not intended). They can just sit back, wait for AMD to try to respond, and go from there.

    Supply is still crazy short for these as well... I'm still waiting on mine, which I ordered a month ago.

  • linvanchene Posts: 1,386
    edited November 2018

    The public opinion of Nvidia is dropping like a rock right now. They may be #1 by a large margin now, but that can change very quickly if anybody suddenly competes, be it Intel or AMD. I really hope AMD can get it together like they have for CPUs,

    Which public opinion? The opinion of some self-proclaimed social media influencers broadcasting over YouTube?

    When I talk to people not in the creative industry, most are not even aware of Turing and RTX.

    Those people walk into a store and purchase a pre-built computer that fits the price they are willing to spend, without giving a second thought to what kind of video card is included.

    Game companies will either support RTX or they will not.

    In a few years you may find cards including RTX and tensor cores at prices ready for the mainstream audience.

    It is normal that early adopters pay a higher price...

    At that point casual users may play their game and be happy with the result of real-time ray tracing without understanding why it is so special.

    That audience does not care either what kind of video card sits in their PlayStation or Xbox.

    So the only people talking about RTX now are technology enthusiasts and some hardcore gamers.

     

    this industry needs competition.

    Which part of the video card industry? The one providing GPUs for games or the one providing solutions for the creative industry?

    Agreed, it will be interesting to observe what will happen to the game market if some games with RTX do indeed offer real-time ray tracing on computers with Nvidia cards.

    Is it worth the time and effort for the big studios to support RTX when the current-generation PlayStation and Xbox include AMD cards that cannot use it?

    Maybe the next PlayStation or Xbox will use Nvidia cards featuring RTX?

    - - -

    But for the creative industry, all of this simply does not matter anymore.

    edited:

    After reading some replies I decided to edit this passage to make the intention clearer.

    Most creative software relies on Nvidia CUDA.

    Many software applications and plugins offer speed increases by using Nvidia CUDA.

    Other GPU providers may not be able to win the creative market back.

    And even if they could present some miracle technology, it would take several years until that market could shift.

    - - -

    What the computer graphic industry needs right now is common standards.

    Many render engines and software packages supporting RTX and tensor cores would be the best thing that could happen.

    Different render engines using and supporting open-source standards like the Nvidia Material Definition Language (MDL) would also make importing and exporting licensed content between different software and render engines so much easier.

    - - -
  • Linvanchene,

    Depending on how you look at it, Nvidia already has too much power in this game, and many view this as the fuel for the problem itself. Having more programmers adopt Turing, RTX, and MDL would indeed be the best possible thing... for Nvidia; all of it would serve to push Nvidia further toward monopoly. But is that better for the rest of us in the long run? It basically tells everyone else (competitors) that if they want to play at all, they must at least start by playing by Nvidia's rules, with Nvidia's tech... allowing Nvidia to do as they have done, which is to dictate to everyone else how things will be done well into the future.

  • drzap Posts: 795
    edited November 2018

    "But for the creative industry this all does simply not matter anymore.

    Most creative software relies on Nvidia Cuda.

    AMD and Intel may not be able to win the creative market back.

    And even if they could present some miracle technology it would take several years until that market could shift."

     

     

    Photoshop and the numerous image editing programs on the market do not rely on CUDA
    Nuke, After Effects, Premiere Pro, Mocha and the numerous other post-production packages on the market do not rely on CUDA
    Zbrush, Mudbox, Modo and the other software tools that digital sculptors use in the pro industry do not rely on CUDA
    Mari, Substance Painter and the numerous other tools that texture artists use do not require or even benefit from CUDA.
    Marvelous Designer, Shave and a Haircut, Yeti, FumeFX, and the countless other FX simulation products on the market don't rely on CUDA
    Neither Maya, C4D, 3DS Max, Houdini, nor any of the major content creation platforms rely on CUDA

    I could go on and on. Not even a majority of renderers on the market rely on or even use CUDA. So while I think the rest of your argument is sound, you need to redefine what you think "most creative software" is. Most creative software is GPU agnostic, and Nvidia is far from having a stranglehold on the industry as long as the common OpenGL standard is being used. AMD has been in Apple computers for a very long time, and Apple has a pretty decent market share among creatives. AMD GPUs have never been dominant, but they offer some unique features and price/performance ratios that keep some creative pros buying them. AMD doesn't need to win back a market they haven't lost. Intel's GPU chips will probably be integrated into motherboards, and that will give them an advantage with many creative professionals who then don't need to even think about a GPU decision.

  • Ghosty12 Posts: 2,080


    ghosty12 said:
    nicstt said:
    Artini said:

    Are there any thoughts on the RTX 2070 cards?

    They have just started appearing and some of them have quite good prices, like the MSI RTX 2070 ARMOR 8G or the Palit GeForce RTX 2070 Dual.

    Still, one can find good deals on a GTX 1080 or GTX 1080 Ti, but if one would like to go for RTX cards, maybe it is an option.

    Yeh, don't buy it.

    Personally, I think the 20 series cards are ridiculously overpriced. We're expected to pay now for something still not available... No thanks.

    The only 20 series card I can sort of understand folks buying is the Ti version; as silly as the price is, there is no card available that does what it does for the price.

    Not to forget that some who bought the new cards are now having to RMA them due to the cards going kaput...

    It's far too early to tell if a handful of forum posts indicate a broader problem. There are always going to be defective products, that is a given. And with how negative the reception to Turing is, the people covering tech are jumping on this. I bet you that if Turing were cheaper and making people happy, these forum posts would not even be in the news. The question that matters is whether there are a lot more defective cards than usual. Right now the third-party board makers and retailers that have been questioned are saying they have not seen an increase in returns. Of course that may not mean much, and it does not include Nvidia's own Founders Edition cards.

    However, on the flip side, it would be logical given the sheer size of the die. The 2080 Ti die is 754 mm², making it one of the largest chips Nvidia has ever made, and by a large margin. When you make such a large chip, there are more things that can go wrong. This is one of the reasons such large chips are rare (that and the cost, which poor yields directly drive up). I would bet this is a reason why the 2080 Ti is a cut-down version of the full chip... it will take a long time for Nvidia to get the near-perfect chips of a fully enabled Turing. So given this, I can see how this is possible.

    This could also be an issue with the new GDDR6 VRAM; being new, it could have teething problems. So you have a huge new chip and a new form of VRAM, and there are things that can go wrong.

    Also, several YouTubers experienced their own issues with review samples, so even they have had problems.

    The other thing that I have heard, though it is just a rumor, is that these cards are just a stopgap until next year, when there is supposedly a new GPU coming out or something like that...

     

    A lot of people say that, but I'm not so sure, because if they release a successor that quickly it will only undermine the Turing launch that much more. The public opinion of Nvidia is dropping like a rock right now. They may be #1 by a large margin now, but that can change very quickly if anybody suddenly competes, be it Intel or AMD. I really hope AMD can get it together like they have for CPUs; this industry needs competition.

     

    I don't know, Nvidia has gone so far off the path that nobody can really predict what they will do. I say they will only release a successor if AMD or Intel get something competitive out there. If not, then the '3000' series, or whatever it will be called, will wait for 2020. Otherwise there is no reason to launch a new series; they'd just be competing against themselves.

    Can only hope that AMD/Intel do have something on the GPU front, as it is sorely needed...

  • outrider42 Posts: 3,679
    Early adopters do not determine success. Nearly every hardware launch is a sell-out at first, even if the product flops. The reason a product dies is that it was unable to hit the mainstream, where the real money is. And you seriously cannot just brush aside the vast majority of... well... everybody bashing Nvidia. YouTubers have a lot of power now; there is a reason they are called "influencers". And if you look up RTX videos, you will have a very hard time finding many that are positive towards Turing. The comment sections are the same way. No GPU launch in history has had this much negative press. I can also compare this to the $600 launch of the PlayStation 3 in 2006. That ended in disaster for Sony, and the company went through a very hard time. It wasn't until prices dropped a lot, along with good games, that things got going. And they received a lot of help from Xbox because of the infamous Red Ring of Death. If the 360 had not had such a disastrous failure rate, Sony might have struggled and never recovered.

    "Creative software" already has a GPU line dedicated to them, Quadro. If you are a professional in that field and price doesn't matter, then you buy a Quadro, not a RTX 2080ti. Quadro has many features for creation that the gaming cards lack.

    I get that people can expect a higher price on new tech, but this much of an increase? Are you seriously going to sit there and tell us that Turing should cost $1200? And the new tech in Turing is only a promise. We are still waiting for the first game that even uses tensor cores or ray tracing. Not even Iray uses the ray tracing cores on Turing... and Nvidia owns Iray. Meanwhile all customers are being asked to pay super inflated prices. You simply cannot justify the cost increase going from the $700 1080 Ti to the $1200 2080 Ti. Yes, some people have paid that, but I would wager that many are not too happy about paying that figure. Even if they don't follow the news at all and never heard of the backlash against Turing, they don't have to: they will feel it the moment they see $1200 on the 2080 Ti. Unless they are truly insulated, any customer can see that $1200 vs $700 is a massive price increase. Not everybody wants the new features, either, but they are being asked to pay for them. The 2070 for $500 is also a special level of absurd. The 2080 for $800-900 doesn't even make sense; why buy that over a 1080 Ti?

    Octane is gaining traction, and they are going to start including support for AMD cards as well, so they will not be CUDA-exclusive for much longer. Many render engines adopting Nvidia standards would actually be the worst thing for the industry. If Nvidia tech becomes the standard, that will be the end of competition for good. The market needs more open standards that everyone can support. And the same goes for gaming. If all games become Nvidia titles, then why would anyone ever buy AMD again? That would be the end of their GPU business even if consoles keep using them. Eventually the consoles would have to shift if they wanted to support Nvidia standards. Which, by the way, the Nintendo Switch is Nvidia.