2080 Ti Hallelujah!

Because products are becoming higher in quality and G8 takes longer to render, my 980 was struggling. I spent a lot of time outdoors over the summer, came back from Spain keen to get back into rendering, and had to face up to tediously long renders.

I thought about getting a 1080 Ti... it is both the right time and the wrong time. They are still expensive here in the UK, and I didn't fancy spending on a product that is essentially obsolete. I should have bought one a year ago.

The 2080 is too slow. I wanted a decent increment in performance. The price of the 2080 Ti is breathtakingly, stupidly high, but I am retired and I spend a lot of time working on 3D ideas. I decided to take a gamble that the real-time ray tracing might amount to something in the next couple of years (discussions suggest that OptiX Prime might not get any benefit, but who knows).

So the 2080 Ti dropped straight in. I had a panic because I hadn't fully seated one of my PCIe power connectors, and the board got sniffy and told me to get more dilithium crystals. But I figured it out. I was already using the Studio 4.11 beta, so a quick driver update was all it took to get going.

It is all good. It just works. I'm so pleased. I'm poor, and will be eating bread and peanut paste till Xmas, but hey, quick renders.

I have attached a quirky image that I have been working on. It's a complex scene with 2 x G8 and 1 x G3, plus water, and it took 20 minutes to reach 90% convergence.



Comments

  • FSMCDesigns Posts: 12,843

    Cool. Personally, I'll be waiting, since nearly every article I have read has said that the performance gain over the 1080 doesn't come close to justifying the opening price of the 2080 Ti, and many tech reviews say they can't recommend it at its current price.

    Here is a good article: https://www.extremetech.com/gaming/278454-nvidia-rtx-2080-and-rtx-2080-ti-review-you-cant-polish-a-turing

    I also love this quote: "ExtremeTech, therefore, does not recommend purchasing either the RTX 2080 or RTX 2080 Ti for their hoped-for performance in future games. Ray tracing may be the future of gaming, but that future isn't here yet."

    I am about to purchase a new vehicle, so I can't justify the cost of a new GPU; I'm envious of you. Enjoy the rendering!

  • Valkeerie Posts: 163

    FSMCDesigns said:

    Cool. Personally, I'll be waiting, since nearly every article I have read has said that the performance gain over the 1080 doesn't come close to justifying the opening price of the 2080 Ti, and many tech reviews say they can't recommend it at its current price.

    Here is a good article: https://www.extremetech.com/gaming/278454-nvidia-rtx-2080-and-rtx-2080-ti-review-you-cant-polish-a-turing

    I also love this quote: "ExtremeTech, therefore, does not recommend purchasing either the RTX 2080 or RTX 2080 Ti for their hoped-for performance in future games. Ray tracing may be the future of gaming, but that future isn't here yet."

    I am about to purchase a new vehicle, so I can't justify the cost of a new GPU; I'm envious of you. Enjoy the rendering!

    Yes, all you say is true. It is not a rational purchase at this time. However (I speak personally), this isn't an entirely rational hobby, and it brings me so much pleasure that I have been able to persuade myself :-)

    I hope your vehicle is a thing of joy and splendour.

  • L'Adair Posts: 9,479

    @Valkeerie, Did you replace the 980? Or did you just add the 2080Ti?

    I'm asking for two reasons. 1) Having both might affect the speed of the render, and 2) I've got a 1080 in a rig with room for two more video cards and I'm curious if it will play nice with later cards, like the 1080Ti, or even better, a 2080Ti. No idea when I'll be able to afford a second card, though.

    Congratulations on the new card. Hopefully you can still afford some jam to go with that peanut paste, for a bit of variety in your diet. :D

  • Valkeerie Posts: 163
    L'Adair said:

    @Valkeerie, Did you replace the 980? Or did you just add the 2080Ti?

    I'm asking for two reasons. 1) Having both might affect the speed of the render, and 2) I've got a 1080 in a rig with room for two more video cards and I'm curious if it will play nice with later cards, like the 1080Ti, or even better, a 2080Ti. No idea when I'll be able to afford a second card, though.

    Congratulations on the new card. Hopefully you can still afford some jam to go with that peanut paste, for a bit of variety in your diet. :D

    I replaced the 980. With its 4GB, I figured (based on reading here) that it might cause more trouble with textures than it was worth. I've read that GPU memory should match across multiple cards; I don't know how true that is.

  • FSMCDesigns Posts: 12,843
    Valkeerie said:

    Yes, all you say is true. It is not a rational purchase at this time. However, (I speak personally) this isn't an entirely rational hobby and it brings me so much pleasure I have been able to persuade myself :-)

    I hope your vehicle is a thing of joy and splendour.

    No kidding, my GF looks at me with that WTH look when I talk about rendering or how much I have spent on it, LOL

    My i7 system is almost 3 years old, so maybe I can get a 2080 Ti or better once I upgrade in a year; guess we'll see.

    Yeah, the new vehicle won't be a thing of joy or splendour, just a used way to get from point A to point B comfortably and affordably, LOL. I stopped buying new a while back, and it makes no sense to drive the car of my dreams in Dallas traffic, but I appreciate the thought.

    Keep us informed on how the new card is performing.

  • L'Adair Posts: 9,479
    Valkeerie said:
    L'Adair said:

    @Valkeerie, Did you replace the 980? Or did you just add the 2080Ti?

    I'm asking for two reasons. 1) Having both might affect the speed of the render, and 2) I've got a 1080 in a rig with room for two more video cards and I'm curious if it will play nice with later cards, like the 1080Ti, or even better, a 2080Ti. No idea when I'll be able to afford a second card, though.

    Congratulations on the new card. Hopefully you can still afford some jam to go with that peanut paste, for a bit of variety in your diet. :D

    I replaced the 980. With its 4GB, I figured (based on reading here) that it might cause more trouble with textures than it was worth. I've read that GPU memory should match across multiple cards; I don't know how true that is.

    Interesting. I've not read that, but I have read that Daz/Iray will "drop" any card that doesn't have enough memory to hold the entire scene. So, as an example, if one card has 8GB and one has 11GB, theoretically only the 8GB card will be ignored if the scene takes 9GB, but both will be ignored if the scene takes 12GB.

    I'll have to do some real research on mixing cards: whether or not different memory capacities are a problem, and whether or not it's feasible to mix architectures (Maxwell and Pascal, for example). Or wait until the same card I already own drops low enough at Newegg to fit my budget! (Blasted cryptominers!)

  • L'Adair said:
    Valkeerie said:
    L'Adair said:

    @Valkeerie, Did you replace the 980? Or did you just add the 2080Ti?

    I'm asking for two reasons. 1) Having both might affect the speed of the render, and 2) I've got a 1080 in a rig with room for two more video cards and I'm curious if it will play nice with later cards, like the 1080Ti, or even better, a 2080Ti. No idea when I'll be able to afford a second card, though.

    Congratulations on the new card. Hopefully you can still afford some jam to go with that peanut paste, for a bit of variety in your diet. :D

    I replaced the 980. With its 4GB, I figured (based on reading here) that it might cause more trouble with textures than it was worth. I've read that GPU memory should match across multiple cards; I don't know how true that is.

    Interesting. I've not read that, but I have read that Daz/Iray will "drop" any card that doesn't have enough memory to hold the entire scene. So, as an example, if one card has 8GB and one has 11GB, theoretically only the 8GB card will be ignored if the scene takes 9GB, but both will be ignored if the scene takes 12GB.

    That is how it works, yes. There is no issue with having cards with different capacities installed and marked for use with Iray.
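
    A minimal sketch of that per-card rule, just as an illustration of the behaviour described above (not Iray's actual implementation; the card names and sizes are made-up examples):

    # Illustrative Python sketch of the per-card rule described above: each GPU
    # participates in a render only if the whole scene fits in its own VRAM.
    # The card names and sizes are hypothetical examples.
    def cards_used_for_render(cards_gb, scene_gb):
        """Return the cards whose VRAM can hold the entire scene."""
        return {name: vram for name, vram in cards_gb.items() if vram >= scene_gb}

    rig = {"GTX 1080": 8, "RTX 2080 Ti": 11}

    print(cards_used_for_render(rig, 9))   # {'RTX 2080 Ti': 11} -> only the 11GB card renders
    print(cards_used_for_render(rig, 12))  # {} -> nothing fits, so Iray falls back to the CPU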

  • outrider42 Posts: 3,679

    FSMCDesigns said:

    Cool. Personally, I'll be waiting, since nearly every article I have read has said that the performance gain over the 1080 doesn't come close to justifying the opening price of the 2080 Ti, and many tech reviews say they can't recommend it at its current price.

    Here is a good article: https://www.extremetech.com/gaming/278454-nvidia-rtx-2080-and-rtx-2080-ti-review-you-cant-polish-a-turing

    I also love this quote: "ExtremeTech, therefore, does not recommend purchasing either the RTX 2080 or RTX 2080 Ti for their hoped-for performance in future games. Ray tracing may be the future of gaming, but that future isn't here yet."

    I am about to purchase a new vehicle, so I can't justify the cost of a new GPU; I'm envious of you. Enjoy the rendering!

    It is very important to note that those are all gaming benchmarks, and as such they have little to do with actual Iray performance. The people who have tried the two Iray benchmarks posted by sickleyield and me are seeing almost double the performance of the 1080 Ti. Not quite double, but certainly far higher than the 30% or so you see in gaming. And this is without Iray being optimized for Turing yet. It stands to reason that it could get faster with more updates.

    Here is the Daz user benchmark thread, which is also now in my sig: https://www.daz3d.com/forums/discussion/53771/iray-starter-scene-post-your-benchmarks#latest

    Keep in mind that no video game is making use of Turing's new features, the Tensor cores and the ray tracing cores. So of course video games are only going to show relatively small improvements. Right now Iray does not support the ray tracing cores (and there is some question as to whether it ever can, but that's another topic). But I believe that Iray is making use of the Tensor cores to some degree, and that is where the extra performance is coming from. It is currently believed that Turing is running off the Volta drivers for Iray, and Volta does have Tensor cores, so this is logical. (The only Volta card available is the $3,000 Titan V.)

    The 2080 Ti is about twice the price of the 1080 Ti, but again, it also gets you about twice the performance. So it is not such a bad investment for Iray, and as I said, it might yet improve even more with driver updates.
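
    Put as arithmetic, "twice the price for twice the performance" is just a performance-per-dollar comparison. A quick sketch using the prices mentioned in this thread and an assumed ~1.9x speedup (an assumption; the exact factor varies by scene):

    # Rough performance-per-dollar check using numbers quoted in this thread:
    # 1080 Ti MSRP $699, 2080 Ti $1,200, and a reported near-2x Iray speedup.
    price_1080ti = 699.0
    price_2080ti = 1200.0
    speedup = 1.9  # "almost double" per the benchmark thread; an assumption, not a measurement

    perf_per_dollar = (speedup / price_2080ti) / (1.0 / price_1080ti)
    print(f"2080 Ti delivers {perf_per_dollar:.2f}x the Iray work per dollar of a 1080 Ti")
    # ~1.11x: roughly break-even on value, with absolute speed nearly doubled.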

  • Valkeerie Posts: 163

    outrider42 said:
    FSMCDesigns said:

    Cool. Personally, I'll be waiting, since nearly every article I have read has said that the performance gain over the 1080 doesn't come close to justifying the opening price of the 2080 Ti, and many tech reviews say they can't recommend it at its current price.

    Here is a good article: https://www.extremetech.com/gaming/278454-nvidia-rtx-2080-and-rtx-2080-ti-review-you-cant-polish-a-turing

    I also love this quote: "ExtremeTech, therefore, does not recommend purchasing either the RTX 2080 or RTX 2080 Ti for their hoped-for performance in future games. Ray tracing may be the future of gaming, but that future isn't here yet."

    I am about to purchase a new vehicle, so I can't justify the cost of a new GPU; I'm envious of you. Enjoy the rendering!

    It is very important to note that those are all gaming benchmarks, and as such they have little to do with actual Iray performance. The people who have tried the two Iray benchmarks posted by sickleyield and me are seeing almost double the performance of the 1080 Ti. Not quite double, but certainly far higher than the 30% or so you see in gaming. And this is without Iray being optimized for Turing yet. It stands to reason that it could get faster with more updates.

    Here is the Daz user benchmark thread, which is also now in my sig: https://www.daz3d.com/forums/discussion/53771/iray-starter-scene-post-your-benchmarks#latest

    Keep in mind that no video game is making use of Turing's new features, the Tensor cores and the ray tracing cores. So of course video games are only going to show relatively small improvements. Right now Iray does not support the ray tracing cores (and there is some question as to whether it ever can, but that's another topic). But I believe that Iray is making use of the Tensor cores to some degree, and that is where the extra performance is coming from. It is currently believed that Turing is running off the Volta drivers for Iray, and Volta does have Tensor cores, so this is logical. (The only Volta card available is the $3,000 Titan V.)

    The 2080 Ti is about twice the price of the 1080 Ti, but again, it also gets you about twice the performance. So it is not such a bad investment for Iray, and as I said, it might yet improve even more with driver updates.

    I read the ExtremeTech article, and it makes some very sound points about the way Nvidia introduces new technology. I found myself nodding in agreement; the advantages of real-time ray tracing may not be realised within the lifetime of the card.

    It's good to see that you are experiencing decent speedups in your benchmarking, and prior to purchase I did read the thread where you discussed them. I'll mention two things I forgot to say in my original post.

    The first is that my PC (Win7) has been stable and reliable for about three years, and I had serious doubts about sticking two 1080 Tis in the chassis. I didn't think my power supply would handle it, and I would have had to tear the machine down to fit a new one. I can do that, I built it from scratch, but I didn't want to. The other factors were cooling and noise, and again, a single 2080 Ti made more sense. If I were planning a chassis for dual 1080 Tis, I'd probably go for water cooling.

    Lastly, I have the Asus dual-fan 2080 Ti, and it is quieter on full render than the 980 was at idle. I'm so accustomed to the sound of VTOL aircraft coming in to land that I don't even realise when the 2080 Ti is busy.

  • joseft Posts: 310
    Valkeerie said:

    outrider42 said:
    FSMCDesigns said:

    Cool. Personally, I'll be waiting, since nearly every article I have read has said that the performance gain over the 1080 doesn't come close to justifying the opening price of the 2080 Ti, and many tech reviews say they can't recommend it at its current price.

    Here is a good article: https://www.extremetech.com/gaming/278454-nvidia-rtx-2080-and-rtx-2080-ti-review-you-cant-polish-a-turing

    I also love this quote: "ExtremeTech, therefore, does not recommend purchasing either the RTX 2080 or RTX 2080 Ti for their hoped-for performance in future games. Ray tracing may be the future of gaming, but that future isn't here yet."

    I am about to purchase a new vehicle, so I can't justify the cost of a new GPU; I'm envious of you. Enjoy the rendering!

    It is very important to note that those are all gaming benchmarks, and as such they have little to do with actual Iray performance. The people who have tried the two Iray benchmarks posted by sickleyield and me are seeing almost double the performance of the 1080 Ti. Not quite double, but certainly far higher than the 30% or so you see in gaming. And this is without Iray being optimized for Turing yet. It stands to reason that it could get faster with more updates.

    Here is the Daz user benchmark thread, which is also now in my sig: https://www.daz3d.com/forums/discussion/53771/iray-starter-scene-post-your-benchmarks#latest

    Keep in mind that no video game is making use of Turing's new features, the Tensor cores and the ray tracing cores. So of course video games are only going to show relatively small improvements. Right now Iray does not support the ray tracing cores (and there is some question as to whether it ever can, but that's another topic). But I believe that Iray is making use of the Tensor cores to some degree, and that is where the extra performance is coming from. It is currently believed that Turing is running off the Volta drivers for Iray, and Volta does have Tensor cores, so this is logical. (The only Volta card available is the $3,000 Titan V.)

    The 2080 Ti is about twice the price of the 1080 Ti, but again, it also gets you about twice the performance. So it is not such a bad investment for Iray, and as I said, it might yet improve even more with driver updates.

    I read the ExtremeTech article, and it makes some very sound points about the way Nvidia introduces new technology. I found myself nodding in agreement; the advantages of real-time ray tracing may not be realised within the lifetime of the card.

    It's good to see that you are experiencing decent speedups in your benchmarking, and prior to purchase I did read the thread where you discussed them. I'll mention two things I forgot to say in my original post.

    The first is that my PC (Win7) has been stable and reliable for about three years, and I had serious doubts about sticking two 1080 Tis in the chassis. I didn't think my power supply would handle it, and I would have had to tear the machine down to fit a new one. I can do that, I built it from scratch, but I didn't want to. The other factors were cooling and noise, and again, a single 2080 Ti made more sense. If I were planning a chassis for dual 1080 Tis, I'd probably go for water cooling.

    Lastly, I have the Asus dual-fan 2080 Ti, and it is quieter on full render than the 980 was at idle. I'm so accustomed to the sound of VTOL aircraft coming in to land that I don't even realise when the 2080 Ti is busy.

    Like you, my next upgrade will likely be a single card rather than two. It's either that, or break the bank, get two or three cards, and do a custom water-cooling loop.

    My current rig has twin Titan Xs plus a 770 to drive my displays. With three cards in there, even with a good case designed for airflow and cards with very good air coolers, the cards are so close together that they feed each other heat, and it heat-soaks the case, so I have to take the side off to vent it. Even doing that, and using MSI Afterburner to create custom fan profiles so the fans work harder and sooner, they still hit their thermal throttling thresholds. And there is the noise on top of that.

    When I am convinced there is a single card that will give me a reasonable performance boost over the twin Titan Xs, I will probably do that; a single card is much easier to keep cool.

  • Artini Posts: 10,307

    Great posts here. I also have an old computer with Windows 7, but I bought a GTX 1080 just before they released the GTX 1080 Ti.

    If the drivers for the 2080 Ti become more mature and Iray becomes much faster with it, then I will consider a purchase.

    Right now, I'm just watching the threads about the 2080 Ti, so thanks for yours.


  • Valkeerie Posts: 163
    Artini said:

    Great posts here. I also have an old computer with Windows 7, but I bought a GTX 1080 just before they released the GTX 1080 Ti.

    If the drivers for the 2080 Ti become more mature and Iray becomes much faster with it, then I will consider a purchase.

    Right now, I'm just watching the threads about the 2080 Ti, so thanks for yours.


    I'm glad it was useful. Just to say, I've spent about 7 hours in the Studio 4.11 beta since I installed the card, with lots of test renders, and it has all been solid. Not one flaky moment. I was very apprehensive about the purchase, and I can say that I'm delighted with the result. In the past I've had to avoid atmospheric effects (mist, fog, clouds, etc.) because they were so slow, but now it's fog in everything :-) More fog!

  • linvanchene Posts: 1,386
    edited October 2018

    It would be great if there could be one thread on this forum where those who have purchased a 2080 Ti could focus on sharing their experience of how it works in DAZ Studio, without having to justify the purchase.

    - - -

    I shared some test scene results last week and wonder whether others who purchased a 2080 Ti observe similar behavior:


    - - -
    Test system

    Win 10 Pro 64bit
    Intel Core i7 5820K
    ASUS X99-E WS
    64 GB RAM

    Asus GTX 1080 STRIX A8G
    Asus GTX 1080 Ti FE
    ASUS GeForce RTX 2080 Ti TURBO

    Nvidia Driver Version: 416.16
    DAZ Studio 4.11.0.231
    Preview Viewport Wire Shaded
    DAZ Studio was closed and restarted between renders.

    - - -

    Iray Render Test 2 scene source:

    https://direct.daz3d.com/forums/discussion/comment/3969916/#Comment_3969916

    - - -


    1x GTX 1080
    OptiX Prime Acceleration Off
    11 minutes 22.98 seconds
    OptiX Prime Acceleration On
    10 minutes 13.62 seconds

    - - -

    1x GTX 1080 Ti
    OptiX Prime Acceleration Off
    8 minutes 28.28 seconds
    OptiX Prime Acceleration On
    7 minutes 34.25 seconds

    - - -

    2x GTX 1080 Ti
    OptiX Prime Acceleration Off
    4 minutes 27.10 seconds
    OptiX Prime Acceleration On
    4 minutes 1.66 seconds

    - - -

    1x RTX 2080 Ti
    OptiX Prime Acceleration Off
    4 minutes 37.84 seconds
    OptiX Prime Acceleration On
    4 minutes 44.83 seconds

    - - -

    2x RTX 2080 Ti
    OptiX Prime Acceleration Off
    2 minutes 36.17 seconds
    OptiX Prime Acceleration On
    2 minutes 36.98 seconds
    - - -
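
    To make the pattern easier to see, here is a small Python sketch that tabulates the times above and the effect of OptiX Prime for each configuration (the numbers are copied straight from the list):

    # Render times in seconds, copied from the results above: (OptiX off, OptiX on).
    results = {
        "1x GTX 1080":    (11 * 60 + 22.98, 10 * 60 + 13.62),
        "1x GTX 1080 Ti": ( 8 * 60 + 28.28,  7 * 60 + 34.25),
        "2x GTX 1080 Ti": ( 4 * 60 + 27.10,  4 * 60 +  1.66),
        "1x RTX 2080 Ti": ( 4 * 60 + 37.84,  4 * 60 + 44.83),
        "2x RTX 2080 Ti": ( 2 * 60 + 36.17,  2 * 60 + 36.98),
    }

    for config, (off, on) in results.items():
        change = (off - on) / off * 100  # positive means OptiX Prime helped
        print(f"{config}: OptiX Prime changes render time by {change:+.1f}%")

    # The Pascal cards gain roughly 10%; both RTX 2080 Ti configurations come out
    # slightly slower with OptiX Prime on, as if it is not engaging at all on Turing.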



    - - -


    Can anyone else confirm that currently an RTX 2080 Ti is not using OptiX Prime Acceleration at all?


    I was under the impression that OptiX Prime Acceleration is based on CUDA.

    https://en.wikipedia.org/wiki/OptiX

    - - -


    The 2080 Ti had to be initialized the first time dForce was run, but after that it worked.

    It would still be great to be able to use both 2080 Tis for dForce...

    - - -


  • ebergerly Posts: 3,255
    Linvanchene, so you bought 2 x 2080ti's? Thx.
  • According to a former Nvidia employee who posted in one of these threads, OptiX uses CUDA; OptiX Prime is a different thing and doesn't. Iray uses OptiX Prime.

  • linvanchene Posts: 1,386
    edited October 2018
    ebergerly said:
    Linvanchene, so you bought 2 x 2080ti's? Thx.

    The 1080 Ti is used to run the display, OpenGL, and PhysX.

    The two 2080 Tis are assigned as CUDA devices.

    That way it is possible to render and edit in Photoshop at the same time...


    I did consider upgrading to an i9 and an Asus Sage mainboard.

    But because it is currently not quite clear in which direction all of this is headed, I am happy with the speed increase I get by just swapping the rendering GPUs to 2080 Tis.


    [Attached screenshots: Cuda and OpenGL separate.png (1920 x 1080), PhysX.png (1920 x 1080)]
  • outrider42 Posts: 3,679

    At this time, all signs point to Turing not using OptiX. The former Nvidia employee posted that OptiX Prime may not be able to be updated to use the ray tracing cores. But that does not mean there will never be an OptiX Prime update for Turing at all. I believe Turing will get much faster if/when it gets proper OptiX Prime updates, even if it never gets to use the ray tracing cores. Pretty much all cards run faster with OptiX on, and by a noticeable margin.

    It's too bad nobody has done a Titan V test on these benchmarks. I'd like to see how it compares.

  • mikek Posts: 195
    edited October 2018

    outrider42 said:

    The 2080 Ti is about twice the price of the 1080 Ti, but again, it also gets you about twice the performance. So it is not such a bad investment for Iray, and as I said, it might yet improve even more with driver updates.

    The 2080 Ti is closer to being a Titan than a Ti. Even the price is exactly what the last Titan cost.
    Unless one needs a new card right now, one option probably worth considering is waiting to see whether Nvidia releases a "real Ti" next year. The 2080 Ti is also a bit limited on memory for my taste, considering the high price of what used to be a Titan.

  • Valkeerie Posts: 163

    Re: OptiX Prime

    Yes, I've just unticked the OptiX Prime box, and the difference seems negligible.

    One thing I have noticed, both with the 980 and now with the 2080 Ti, is that GPU memory seems either to leak or to become fragmented. After a few hours of rendering it will drop out into CPU rendering. I then have to reboot, and it is fine again. Memory leaks are legendary in complex programs, so this would not be completely unexpected.
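
    One way to watch for that creep, assuming the Nvidia driver's nvidia-smi tool is on your PATH, is to log GPU memory use between renders. A minimal sketch:

    # Poll nvidia-smi to log VRAM usage over a session, to see whether used
    # memory creeps upward between renders. Assumes nvidia-smi is on the PATH
    # (it ships with the Nvidia driver). Stop it with Ctrl+C.
    import subprocess
    import time

    def vram_per_gpu_mib():
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # One line per GPU: "used, total" in MiB.
        return [tuple(map(int, line.split(","))) for line in out.strip().splitlines()]

    while True:
        for i, (used, total) in enumerate(vram_per_gpu_mib()):
            print(f"GPU {i}: {used} / {total} MiB used")
        time.sleep(60)  # log once a minute during a render session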

  • grinch2901 Posts: 1,247
    Valkeerie said:

    Re: OptiX Prime

    Yes, I've just unticked the OptiX Prime box, and the difference seems negligible.

    One thing I have noticed, both with the 980 and now with the 2080 Ti, is that GPU memory seems either to leak or to become fragmented. After a few hours of rendering it will drop out into CPU rendering. I then have to reboot, and it is fine again. Memory leaks are legendary in complex programs, so this would not be completely unexpected.

    I experience the same on my GTX 1070. I render simple scenes for a while, make some adjustments to poses, render and save, adjust, render and save... and then suddenly it's using the CPU and taking forever. Same scene, same content, but the memory goes from "fits" to "doesn't fit" after a while.

  • Artini Posts: 10,307
    edited October 2018

    Is that not an issue with the new Daz Studio 4.11 beta?

    I still use the 4.10 beta and it seems pretty stable.


  • Valkeerie Posts: 163
    Artini said:

    Is that not an issue with the new Daz Studio 4.11 beta?

    I still use the 4.10 beta and it seems pretty stable.


    I have been experiencing the memory leak/fragmentation thing for a long time. I thought 4.10 was better than 4.9 (but it still did it in 4.10), and it is quite possible that the 4.11 beta still needs a little honing.

  • outrider42 Posts: 3,679
    mikek said:

    outrider42 said:

    The 2080 Ti is about twice the price of the 1080 Ti, but again, it also gets you about twice the performance. So it is not such a bad investment for Iray, and as I said, it might yet improve even more with driver updates.

    The 2080 Ti is closer to being a Titan than a Ti. Even the price is exactly what the last Titan cost.
    Unless one needs a new card right now, one option probably worth considering is waiting to see whether Nvidia releases a "real Ti" next year. The 2080 Ti is also a bit limited on memory for my taste, considering the high price of what used to be a Titan.

    The 2080 Ti is not the full Turing chip, so there is room for something bigger. But not massively bigger; the performance gain would not be much. So the only things that can really change are the VRAM offerings and the price.

    But who is competing against Nvidia right now? There is no reason to change the price unless the cards straight up bomb. Even with gamers pushing back, I personally don't see a price change unless AMD gets something out the door that actually competes. Who knows when, or even if, that will happen.

    I do think a Titan will come, but the Titan V was $3,000, and the 2080 Ti is $1,200. So even if a Titan "T" comes along with more VRAM, it might end up being $3,000.
  • mikek Posts: 195

    outrider42 said:

    The 2080 Ti is not the full Turing chip, so there is room for something bigger. But not massively bigger; the performance gain would not be much. So the only things that can really change are the VRAM offerings and the price.

    But who is competing against Nvidia right now? There is no reason to change the price unless the cards straight up bomb. Even with gamers pushing back, I personally don't see a price change unless AMD gets something out the door that actually competes. Who knows when, or even if, that will happen.

    I do think a Titan will come, but the Titan V was $3,000, and the 2080 Ti is $1,200. So even if a Titan "T" comes along with more VRAM, it might end up being $3,000.

    I'm not so much talking about a new Titan. Previous Titans have always been expensive, with the justification of having more power and far more RAM. But the RAM advantage has been gone since the last generation, so the Titan isn't that interesting anymore if this continues:
    Titan      6GB   Feb 2013   $999
    780 Ti     3GB   Nov 2013   $699

    Titan X    12GB  Mar 2015   $999
    980 Ti     6GB   Jun 2015   $649

    Titan Xp   12GB  Apr 2017   $1,200
    1080 Ti    11GB  Mar 2017   $699

    The release cycle of the last couple of years has been: first a Titan for $1,000+, and an x80 for $550-$650. Some time later came the Ti, which was more or less a cheap Titan for $650-$700.
    Now we got the x80 and the Ti together: the Ti with a Titan price, in the Titan release slot, but nothing named Titan in sight.
    Not sure what's going on. It looks like the 2080 Ti is what used to be the Titan. Maybe there will still be a classic Ti equivalent next year? A full year without any release would be a bit strange; they have been releasing a high-end GPU every year for some time now. But who knows, maybe they will just sit back and wait for Intel GPUs in 2020.

  • outrider42 Posts: 3,679
    Everything about Turing is strange, new territory for Nvidia. Why they didn't call the 2080 Ti the new Titan, nobody knows. The cards have Tensor and ray tracing cores. The prices are sky high. The 2070 has not released yet, but it really looks like a lame duck in my opinion and that of many others; YouTuber after YouTuber has spoken about the crazy prices on these cards. The 2060 is nowhere in sight, and will it even be a Turing chip if it has no ray tracing?

    And how would they do a "traditional" x80 Ti at this point? I don't see such a path without a drastic change to what they have released so far. The 2080 is already $800 itself, so there isn't much room for an x80 Ti.

    VRAM has sort of stagnated at 12GB, but what can we do? Nvidia is only competing with themselves right now. If AMD popped out a 16GB gaming card we might see a change; until then, no. Some games are pushing lots of VRAM at 4K, but it will still be a while before they max out a 12GB card. Supposedly the Cyberpunk 2077 demo was running on a 1080 Ti, which, if you saw it, was amazing. It had a lot of people on screen at times, and they all looked different; in most games the NPCs walking around look alike to save on memory. As crowd density and lush foliage go up, demands will generally go up, and large open worlds can tax VRAM. So they are getting there. I wonder how ray tracing impacts VRAM in gaming? That could be an interesting topic.
  • mikek Posts: 195
    edited October 2018

    Yes, the way they are changing things around, and the uncertainty, is why it might not be a bad idea to wait if someone is only considering an upgrade. Have they dropped the Titan, or will it come later? Will there be a cheaper high-end card like the classic Ti? Is the Ti replacing the Titan?

    outrider42 said:

    And how would they do a "traditional" x80 Ti at this point? I don't see such a path without a drastic change to what they have released so far. The 2080 is already $800 itself, so there isn't much room for an x80 Ti.

    The classic Ti didn't cost much more than the x80: the 780 Ti was $50 more, the 980 Ti $100 more, and the 1080 Ti was the same price. Something in the $800 to $850 range would still work.
    There is also the possibility of Intel releasing a high-end card in 2020. If it comes with reasonably good performance and a decent price, Nvidia would most likely lower their prices again with the next generation.

    outrider42 said:

    I wonder how ray tracing impacts VRAM in gaming?

    My first guess would be that it won't change much, at least at the moment, as it's still very limited. On the other hand, with Daz Iray there is a fair change in memory consumption for the same scene just from increasing the resolution.

  • outrider42 Posts: 3,679
    VRAM usage with ray tracing probably depends on the game engine; there are a lot of things at play. In some ways VRAM use might actually decrease: to fake reflections in current games you need cube maps in place of real reflections, and with real ray tracing you don't need those maps anymore, because the surface settings take care of it. That might save a bit of memory. On the flip side, to make ray tracing work correctly you may need to keep things loaded in memory that aren't needed when faking reflections, because light sources and objects outside the camera can still cast shadows into the view. That could be a big resource hog. And how far does the ray tracing extend? In a large open-world game where you can see for long distances, are they going to keep geometry loaded that far out just to cast shadows? That would be a massive amount of memory.

    And like I said, Nvidia broke all their past traditions with Turing, so the historical prices of the x80 Ti just don't apply anymore. Everything is out the window. Nvidia has also made a mockery of MSRP with how they've done the Founders Edition cards. Interestingly, the 1080 Ti FE had no markup over MSRP, but other Pascal cards did.

    Basically, Nvidia rewrote the rule book with this launch. It's anybody's guess what happens next. But many gamers are not happy with RTX prices; it all comes down to that. If the prices were lower, nobody would have complained. With prices being what they are, Turing is being picked apart from every angle. I think it will be very interesting to see what sales look like. And we can only hope that AMD or even Intel come through with something soon. I'd say more, but I can't.
  • Thanks for posting this. I was under the impression that NVLink would pool the memory of multiple 2080 Tis; I'm glad I found out now and did not go out to buy four ;-)

    I'm still up in the air between a 2950X and a 2990WX with 64GB of RAM, but will most likely get dual RTX 2080 Tis. Most of the images I render are in the 960x640 to 1280x960 range: quite small compared to what most people aim for (I render retro console graphics, so I don't need the excess size). Combining the 32 Ryzen cores with dual RTX 2080 Tis, do you believe I will end up with renders that take roughly a minute or two at most? For comparison's sake, I'm currently using a 960 with 2GB of VRAM (crunched and slow) and a 4790K with 16GB of system RAM. Even a 640x640 render takes almost 2 hours with a lot of lighting and shadows (fireplace plus a Gen 8 character); then again, that could be pushing into CPU-only render mode, since the card is so limited on VRAM.

    Thanks!

  • outrider42 Posts: 3,679
    edited November 2018

    It really depends on what you are trying to render. Iray seems to scale very well, so if a card is 3 times faster in one scene, it is probably about 3 times faster in just about all scenes. With that information you can check out the bench thread in my sig and compare your 960 to those who have tested the 2080 Ti or any other card. There are two benches, SY's and mine, so be sure you are comparing the right ones. Then it is a simple matter of calculating the difference against your current render speeds. A 2080 Ti will certainly be a lot faster than a 960, but some things can still pump up long render times.
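
    That calculation is just scaling your current render time by the ratio of the two cards' benchmark times. A sketch, with placeholder benchmark numbers (substitute the actual GTX 960 and 2080 Ti results from the thread):

    # Estimate a new render time from benchmark ratios, as described above.
    # Both benchmark times below are placeholders, not real measurements.
    my_card_bench_s = 1800.0   # hypothetical: GTX 960 time on the benchmark scene
    new_card_bench_s = 120.0   # hypothetical: RTX 2080 Ti time on the same scene
    my_render_s = 2 * 3600.0   # your current two-hour render

    estimated_s = my_render_s * (new_card_bench_s / my_card_bench_s)
    print(f"Estimated render time on the new card: {estimated_s / 60:.0f} minutes")
    # Caveat from the thread: if the 960's 2GB forced CPU-only rendering,
    # the real-world improvement will be larger than this ratio suggests.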

    BTW, if you render that small, try using the 4.11 beta with its new denoiser enabled. I think for you the denoiser will feel like magic! Give it a try.

    One more thing, just to blow some minds: it has been tested and proven that NVLink CAN POOL VRAM on the 2080 Ti. The people at V-Ray have tested this and verified that it works. There is a performance penalty in NVLink mode, but it is still faster than a single 2080 Ti. The other caveat is that nobody actually knows how much VRAM gets pooled; the tools that report VRAM are not reporting it correctly. But they rendered a scene that would not fit on a single 2080 Ti.

    Here is the link that I mentioned. Judging by how they did this, I would think it is possible for Iray as well, but nobody has tested it yet. I hope any Daz users who have two 2080 Tis see this and try it. I know some of them did buy NVLink.

    https://www.chaosgroup.com/blog/profiling-the-nvidia-rtx-cards

    If anyone does have the hardware to test this, I'd love to see them also run my and SY's benchmarks to see how much the render speeds differ.

    Additionally, another V-Ray post tested 2080 Tis with the Quadro version of NVLink and got the VRAM to pool even more easily. My bet is that it also performs better than the gaming NVLink. Of course, the Quadro NVLinks are super expensive. Also note that the 2080 Ti can only pair two cards over NVLink.

    See the test on the far right of their chart? That is the one that would only work with NVLink mode enabled.

  • pwiecek Posts: 1,598
    Valkeerie said:

    The 2080 is too slow.

    Do you mean "The 980 is too slow"?
