Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...

Comments

  • marble Posts: 7,500

    Frankly, all the talk about power and speed, while perhaps interesting, is of no consequence to me. My only concern is VRAM. It doesn't matter how fast the card is if the scene I want to render will not fit and drops to CPU. What's the point of spending over $1,000 for a piece of high-technology that is just not up to the task? And it looks increasingly probable that the VRAM increase, if it happens at all, will be significantly delayed. Which all prompts the search for alternatives such as rendering in other software or hoping for something like Google Filament to be a gift from the gods.
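
    To make the "will it fit" question concrete: here is a minimal pre-render check, assuming the pynvml package is installed (SCENE_ESTIMATE_GB is a made-up stand-in for your own guess at a scene's footprint). Just a sketch of the idea, not anything Daz Studio does for you:

    ```python
    # Rough pre-flight check: if the scene estimate exceeds free VRAM,
    # Iray drops the render to CPU.
    import pynvml

    SCENE_ESTIMATE_GB = 9.5  # hypothetical estimate of the scene's VRAM footprint

    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values in bytes
        free_gb = mem.free / 1024**3
        print(f"Free VRAM: {free_gb:.1f} GB")
        if SCENE_ESTIMATE_GB > free_gb:
            print("Scene probably won't fit -- expect a CPU fallback.")
    finally:
        pynvml.nvmlShutdown()
    ```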

  • All the rumor sites are now saying the 3090 will have 24 GB VRAM.

  • nicstt Posts: 11,715
    marble said:

    Frankly, all the talk about power and speed, while perhaps interesting, is of no consequence to me. My only concern is VRAM. It doesn't matter how fast the card is if the scene I want to render will not fit and drops to CPU. What's the point of spending over $1,000 for a piece of high-technology that is just not up to the task? And it looks increasingly probable that the VRAM increase, if it happens at all, will be significantly delayed. Which all prompts the search for alternatives such as rendering in other software or hoping for something like Google Filament to be a gift from the gods.

    Seriously, try Blender.

  • nicstt Posts: 11,715
    edited August 2020

    All the rumor sites are now saying the 3090 will have 24 GB VRAM.

    I don't care what they're saying; I will wait until I see what is released. After the release of the last cards, and how long it took Nvidia to deliver what they promised, I'll be taking what Nvidia claim with a pinch of salt too.

    Post edited by nicstt on
  • marble Posts: 7,500

    All the rumor sites are now saying the 3090 will have 24 GB VRAM.

    I'm not in the 3090 league, I'm afraid. I have been hoping that the 3070 16GB rumour had some truth to it, but I'm pretty much doubting that now.

  • marble said:

    All the rumor sites are now saying the 3090 will have 24 GB VRAM.

    I'm not in the 3090 league, I'm afraid. I have been hoping that the 3070 16GB rumour had some truth to it, but I'm pretty much doubting that now.

    Well, they did a GTX 960 with 4GB back when the maximum memory available on Maxwell was 4GB of GDDR5, so if RDNA 2 does some 16GB midrange cards then Nvidia should too. Just hope to see them soon, though.

  • I think whether the 3070 gets 16 GB of VRAM will depend on what AMD does, too. Looks like sometime in October, according to some. Maybe you can get some more money saved up by then and get a slightly better Nvidia card, and we'll maybe know how they work with Iray by then.

  • marble Posts: 7,500

    I think whether the 3070 gets 16 GB of VRAM will depend on what AMD does, too. Looks like sometime in October, according to some. Maybe you can get some more money saved up by then and get a slightly better Nvidia card, and we'll maybe know how they work with Iray by then.

    I wish. I don't have spare cash as I'm on a fixed state pension, so my annual holiday comes out of my life savings, and if I choose not to take a holiday (meaning a trip somewhere), I use that money for a luxury such as a computer upgrade. With the pandemic, travel is limited, so this year I'm spending on the computer.

  • nonesuch00 Posts: 18,762

    I've a feeling that the 30XX series will not be fast enough for someone on my small budget to justify spending $1,400 on a 3090 GPU with 24GB that I'd only get one or two years' use of before a card with 24GB or more VRAM that is actually fast enough comes out. For that reason I've made up my mind to buy the 3060 6GB/12GB or 3070 8GB/16GB GPUs, depending on cost. It might come down to the 3060 12GB or the 3070 8GB; I'm not sure.

  • marble said:

    I think whether the 3070 gets 16 GB of VRAM will depend on what AMD does, too. Looks like sometime in October, according to some. Maybe you can get some more money saved up by then and get a slightly better Nvidia card, and we'll maybe know how they work with Iray by then.

    I wish. I don't have spare cash as I'm on a fixed state pension, so my annual holiday comes out of my life savings, and if I choose not to take a holiday (meaning a trip somewhere), I use that money for a luxury such as a computer upgrade. With the pandemic, travel is limited, so this year I'm spending on the computer.

    I'd wait on some user reviews, then, before making the leap. You've had continuous problems with Blender, so I don't recommend it. Likewise, the jury is still out on UE: it shows some promise with high-quality renders but is still rather complicated in some ways, though reports say FBX export with the Daz rig works better than the Epic rig. The current bridge seems focused on games rather than the film-production side of UE, which may be closer to Iray in results. Anything game-related isn't going to get you the quality you've said before that you want. Filament causes hesitation for me because of the game focus. It's still early, though.

    I'm 64 and, like you, have been trying to do some quality work before I die. My stroke last year has slowed me down, but I'm still wary of the fool's errands we see a lot of around here. I'm fortunate that I still have a job, but who knows how long it will last. Good luck, Marble!

  • Visuimag Posts: 578
    edited August 2020

    marble said:

    Frankly, all the talk about power and speed, while perhaps interesting, is of no consequence to me. My only concern is VRAM. It doesn't matter how fast the card is if the scene I want to render will not fit and drops to CPU. What's the point of spending over $1,000 for a piece of high-technology that is just not up to the task? And it looks increasingly probable that the VRAM increase, if it happens at all, will be significantly delayed. Which all prompts the search for alternatives such as rendering in other software or hoping for something like Google Filament to be a gift from the gods.

    Very much why I've cherished my TITAN RTX and will probably add another over buying one of these new cards. Of course, I'd need to see the new cards' NVLINK performance, but I'm willing to bet I'd be set with two TITAN RTXs for a good while!

    Post edited by Visuimag on
  • nicstt Posts: 11,715
    Visuimag said:

    marble said:

    Frankly, all the talk about power and speed, while perhaps interesting, is of no consequence to me. My only concern is VRAM. It doesn't matter how fast the card is if the scene I want to render will not fit and drops to CPU. What's the point of spending over $1,000 for a piece of high-technology that is just not up to the task? And it looks increasingly probable that the VRAM increase, if it happens at all, will be significantly delayed. Which all prompts the search for alternatives such as rendering in other software or hoping for something like Google Filament to be a gift from the gods.

    Very much why I've cherished my TITAN RTX and will probably add another over buying one of these new cards. Of course, I'd need to see the new cards' NVLINK performance, but I'm willing to bet I'd be set with two TITAN RTXs for a good while!

    You might find them going up in price if it turns out they are actually good at what they do and the new 3000 series is disappointing.

    I have the cash for one, but after moving my rendering to Blender, I decided to wait and see. My Threadripper outperforms my 980ti, and it's a first-gen Threadripper.

  • Re: The Remarkable Art & Science of Modern Graphic Card Design. I'm interested!

    I'm building a machine, gradually, learning the newer technology as I go. One concern I've had is whether or not the new GPUs will require anything new or different in the other components I need to get.

    So according to the video at 6:18, the GPUs will include an adaptor to make their 12-pin connectors compatible with PCIe 8-pin cables. If I'm understanding that correctly, hurray, one concern resolved.
    [Also planning: ASRock x570 Taichi, R7 3700x]
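
    If anyone else is doing the same PSU planning, the connector math is simple: the slot supplies up to 75 W and each 8-pin cable up to 150 W per the PCIe spec. A quick sketch of that budget (the connector lists are just examples, not any specific card):

    ```python
    # Max board power = slot plus supplemental connectors,
    # per PCIe spec limits: slot 75 W, 6-pin 75 W, 8-pin 150 W.
    SLOT_WATTS = 75
    CONNECTOR_WATTS = {"6-pin": 75, "8-pin": 150}

    def board_power_limit(connectors):
        """Upper bound on power deliverable to a card."""
        return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in connectors)

    # A 12-pin connector fed by the dual 8-pin adaptor draws from two 8-pin cables:
    print(board_power_limit(["8-pin", "8-pin"]))  # 375 W ceiling
    ```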

    My current machine can no longer render, so I'm eager to get the new build built. But it hasn't been easy. A person can't study for what doesn't exist yet.

    Oh! Just wondering: Does DS usually have the ability to render Iray, albeit poorly, with newly-released cards, or will the 30xx's be totally unable to render in Iray until a new version of DS comes out?

    Anyway, it's nice to have some good news these days, isn't it?
    -Ken

  • marble Posts: 7,500
    marble said:

    I think whether the 3070 gets 16 GB of VRAM will depend on what AMD does, too. Looks like sometime in October, according to some. Maybe you can get some more money saved up by then and get a slightly better Nvidia card, and we'll maybe know how they work with Iray by then.

    I wish. I don't have spare cash as I'm on a fixed state pension, so my annual holiday comes out of my life savings, and if I choose not to take a holiday (meaning a trip somewhere), I use that money for a luxury such as a computer upgrade. With the pandemic, travel is limited, so this year I'm spending on the computer.

    I'd wait on some user reviews, then, before making the leap. You've had continuous problems with Blender, so I don't recommend it. Likewise, the jury is still out on UE: it shows some promise with high-quality renders but is still rather complicated in some ways, though reports say FBX export with the Daz rig works better than the Epic rig. The current bridge seems focused on games rather than the film-production side of UE, which may be closer to Iray in results. Anything game-related isn't going to get you the quality you've said before that you want. Filament causes hesitation for me because of the game focus. It's still early, though.

    I'm 64 and, like you, have been trying to do some quality work before I die. My stroke last year has slowed me down, but I'm still wary of the fool's errands we see a lot of around here. I'm fortunate that I still have a job, but who knows how long it will last. Good luck, Marble!

    This is my hobby in retirement. I was a techie for my whole working career, supporting computer hardware and networking in the latter years. So I'm not afraid of technology, but I find that I'm slow to grasp new techniques these days, so learning Blender, for example, has become a slog and my comfort zone is still DAZ Studio. If DAZ could improve the timeline, speed up dForce, add soft-body physics, and allow me to render a scene with 4 characters for the cost of a 3070, I would look forward to enjoying this hobby for my remaining years. I didn't have a stroke, but I did need a quadruple bypass in my mid-40s, so I know about being slowed down.

  • outrider42 Posts: 3,679

    @outrider42

    There should be a significant power savings by going to a smaller node, even with the performance bump. We've already seen this in Picasso vs Renoir on the AMD end. I need to put a caveat on this, though: if Nvidia is indeed using a Samsung node for the new cards, it's not a simple apples-to-apples comparison of a larger node vs a smaller node. Kind of like trying to compare GloFo silicon to TSMC and Intel silicon.

    A more linear comparison would be seeing how much Intel has eked out of their 14nm node over the years, but that's been incremental improvements at best, and maybe comparing that to the Intel 10nm node.

    In the last day or so I did see a rumor suggesting that the core count bump with the new cards, along with the clock speed improvements, may mean as much as a 30%-50% improvement vs. the previous gen cards, but of course until we see independent benchmarks, grain of salt and all that. If we were talking, say, a 50% improvement with the 3090 vs the Titan RTX, that's pretty significant from a content creation standpoint. Even 30% is a huge jump. At that point you are talking about getting away with just 2 newer cards vs 3 older cards, but of course the power budget situation and heat management need to be taken into account.

    On the leak/rumor front, here's a leak featuring the Gainward 3090 and 3080 cards:

    https://videocardz.com/newz/gainward-geforce-rtx-3090-and-rtx-3080-phoenix-leaked-specs-confirmed

    The WCCFTech article goes into a bit more detail regarding the power connectors on the Gainward cards:

    https://wccftech.com/gaiwanrd-geforce-rtx-3090-geforce-rtx-3080-phoenix-custom-graphics-cards-pictured-specifications-detailed/

    Two 8-pin power connectors sound nice for people not wanting to have to track down a new 12-pin connector that'll work with their PSU... If NVLink is indeed being left off of the 3080s, though, that's definitely something a few people around here might care about. Grain of salt and all that.

    Of course, that is always the case. That is why clock speeds often increase. My 670 had a clock of right around 1000 MHz. That was good back then, but today it would be considered quite low. You get more performance by increasing clock speeds; it is often considered "free" performance since it requires no additional hardware besides better cooling. But Nvidia has generally capped their top GPUs at around 250 Watts, and that has been the norm for quite a long time. The top Ampere breaks that cap by a full 100 Watts at 350 Watts, and the 2nd-tier 3080 breaks it too at 320 Watts. For reference, the 2080ti was rated at 260 Watts, and the Titan RTX was rated at 280 Watts. That 20 Watt difference is largely due to the additional VRAM; 13GB more VRAM will use about that much. The 3090 is using even more energy than the Titan RTX, but it will be a lot faster.
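
    As a sanity check on that VRAM figure, the arithmetic is simple (board TDPs as rated; the per-GB number that falls out is a ballpark, not a measurement):

    ```python
    # Attribute the Titan RTX vs 2080 Ti TDP gap to the extra GDDR6.
    TITAN_RTX = {"tdp_w": 280, "vram_gb": 24}
    RTX_2080_TI = {"tdp_w": 260, "vram_gb": 11}

    extra_gb = TITAN_RTX["vram_gb"] - RTX_2080_TI["vram_gb"]  # 13 GB
    extra_w = TITAN_RTX["tdp_w"] - RTX_2080_TI["tdp_w"]       # 20 W
    print(f"~{extra_w / extra_gb:.1f} W per extra GB of GDDR6")  # ~1.5 W/GB
    ```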

    That shows Nvidia is working very hard to add additional performance to these GPUs, beyond what they normally would do.

    There is also something that I have seen no tech people really discussing: that Nvidia is using 3 different types of cores in their GPUs now, while AMD might still be using just one type of core.

    Think about that. Back in the PS5 deep dive, they mentioned that the PS5 GPU has no dedicated ray tracing cores. This GPU is doing ray tracing through its existing cores. The PS5 is using a modified RDNA2, and this indicates that AMD is not using dedicated cores.

    So what's the big deal? Well, it means that AMD can use their entire die space for their regular cores and run ray tracing and upscaling on those same cores. Nvidia cannot do this: they have to dedicate a large amount of die space to the ray tracing and tensor cores they created. So while Nvidia will likely have a big performance advantage at ray tracing and upscaling/denoising, AMD might actually have an advantage at traditional raster rendering. Most video games today still rely on traditional raster performance, so that is bad news for Nvidia.
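
    A toy way to see the trade-off (every area fraction here is invented for illustration; real die breakdowns aren't public):

    ```python
    # Toy die budget: area spent on dedicated RT/tensor cores is area
    # not available for traditional shader cores.
    def shader_area(rt_tensor_fraction, die_area=1.0):
        """Fraction of the die left for regular shader cores."""
        return die_area * (1.0 - rt_tensor_fraction)

    print(shader_area(0.25))  # dedicated-core design: 0.75 of the die for shaders
    print(shader_area(0.0))   # unified design: the whole die for shaders
    ```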

    Thus, to make up for this potential disadvantage, Nvidia made large GPUs and overclocked them as much as possible to get every drop of performance out of them. So even though this is a smaller process node, the end result is a GPU that might use a lot more power than normal.

    So ultimately I believe a lot of what we are seeing comes down to Nvidia's design decision to create these dedicated ray tracing and tensor cores. That choice is a double-edged sword for them.
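
    And to put numbers on the quoted "2 newer cards vs 3 older cards" point above: treating one previous-gen card as the baseline and assuming near-linear multi-GPU scaling in Iray, the napkin math looks like this (the speedups are just the rumored range, not benchmarks):

    ```python
    # Aggregate render throughput, one previous-gen card = 1.0 units.
    OLD_CARD = 1.0

    for per_card_speedup in (1.3, 1.5):  # rumored 30%-50% per-card improvement
        two_new = 2 * per_card_speedup * OLD_CARD
        three_old = 3 * OLD_CARD
        print(f"+{per_card_speedup - 1:.0%} per card: "
              f"two new = {two_new:.1f} units vs three old = {three_old:.1f} units")
    ```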

  • outrider42 Posts: 3,679
    nicstt said:
    Visuimag said:

    marble said:

    Frankly, all the talk about power and speed, while perhaps interesting, is of no consequence to me. My only concern is VRAM. It doesn't matter how fast the card is if the scene I want to render will not fit and drops to CPU. What's the point of spending over $1,000 for a piece of high-technology that is just not up to the task? And it looks increasingly probable that the VRAM increase, if it happens at all, will be significantly delayed. Which all prompts the search for alternatives such as rendering in other software or hoping for something like Google Filament to be a gift from the gods.

    Very much why I've cherished my TITAN RTX and will probably add another over buying one of these new cards. Of course, I'd need to see the new cards' NVLINK performance, but I'm willing to bet I'd be set with two TITAN RTXs for a good while!

    You might find them going up in price if it turns out they are actually good at what they do and the new 3000 series is disappointing.

    I have the cash for one, but after moving my rendering to Blender, I decided to wait and see. My Threadripper outperforms my 980ti, and it's a first-gen Threadripper.

    A 1660 also outperforms a 980ti in the Iray benchmark thread. That is not the 1660ti or the 1660 Super; that is the plain-Jane 1660, a sub-$200 GPU. The 980ti is quite old and outdated now. It is great that a Threadripper can beat a 980ti, but the entire RTX lineup does as well, all the way down to the 2060, which utterly destroys the 980ti in Iray.

  • nicstt Posts: 11,715

    Re: The Remarkable Art & Science of Modern Graphic Card Design. I'm interested!

    I'm building a machine, gradually, learning the newer technology as I go. One concern I've had is whether or not the new GPUs will require anything new or different in the other components I need to get.

    So according to the video at 6:18, the GPUs will include an adaptor to make their 12-pin connectors compatible with PCIe 8-pin cables. If I'm understanding that correctly, hurray, one concern resolved.
    [Also planning: ASRock x570 Taichi, R7 3700x]

    My current machine can no longer render, so I'm eager to get the new build built. But it hasn't been easy. A person can't study for what doesn't exist yet.

    Oh! Just wondering: Does DS usually have the ability to render Iray, albeit poorly, with newly-released cards, or will the 30xx's be totally unable to render in Iray until a new version of DS comes out?

    Anyway, it's nice to have some good news these days, isn't it?
    -Ken

    They might, which is why many folks have been saying wait; don't make decisions based on rumour.

  • nicstt Posts: 11,715

    Re: The Remarkable Art & Science of Modern Graphic Card Design. I'm interested!

    Oh! Just wondering: Does DS usually have the ability to render Iray, albeit poorly, with newly-released cards, or will the 30xx's be totally unable to render in Iray until a new version of DS comes out?

    Anyway, it's nice to have some good news these days, isn't it?
    -Ken

    How would anyone but Nvidia know what is happening? If Daz have any early info, they won't be able to say due to NDA. The lack of comment from Daz means either NDA or they don't know.

    ...And what good news? Speculation and guesswork is not news.

  • TheKD Posts: 2,711

    Hopefully both Nvidia and Daz learned from last time and have an update ready to roll out on release, lol.

  • kyoto kid Posts: 41,925
    nicstt said:

    All the rumor sites are now saying the 3090 will have 24 GB VRAM.

    I don't care what they're saying; I will wait until I see what is released. After the release of the last cards, and how long it took Nvidia to deliver what they promised, I'll be taking what Nvidia claim with a pinch of salt too.

    ..for me it's one of these:

    [Attached image: salt lick.jpg]
  • outrider42 Posts: 3,679

    There is no word yet on Iray and Ampere, as Ampere has not even been officially announced. Most of what has been shown so far is not official Nvidia.

    However, some things have come straight from 3rd-party AIBs, so many pieces of info are basically as confirmed as confirmed can be. But what you will not hear about on Sept 1 is Iray.

    As I have said before, the changeover to Iray RTX changed how Iray itself is updated. It now has the full OptiX 6, not the old OptiX Prime. Prime needed to be recompiled for every new GPU arch; OptiX 6 does not. So Ampere *should* work with Iray pretty quickly, if not on launch day. I am willing to bet that Ampere will work with Daz Iray right away. However, don't take my word for it; wait for somebody to confirm. There is always somebody who gets one of these cards at launch and posts in the forums.
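
    To make the recompile point concrete: OptiX Prime shipped arch-specific kernels, while OptiX 6-era kernels ship as PTX that the driver can JIT-compile for GPUs newer than the build. A minimal sketch of that compatibility check, assuming the pynvml package (the PTX target tuple is hypothetical):

    ```python
    # PTX built for an older compute capability can be JIT-compiled by the
    # driver for newer architectures; arch-specific binaries cannot.
    import pynvml

    OLDEST_PTX_TARGET = (6, 0)  # hypothetical arch the renderer's PTX targets

    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        major, minor = pynvml.nvmlDeviceGetCudaComputeCapability(handle)
        if (major, minor) >= OLDEST_PTX_TARGET:
            print(f"sm_{major}{minor}: driver can JIT the shipped PTX")
        else:
            print(f"sm_{major}{minor}: predates the PTX target; needs a rebuild")
    finally:
        pynvml.nvmlShutdown()
    ```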

    As for me, I am keeping my 1080tis, so if by chance I am wrong, I have a backup plan. And I can just play Cyberpunk 2077 on my Ampere. If you do not have a backup plan, then just wait.

  • nicstt said:

    How would anyone but Nvidia know what is happening? If Daz have any early info, they won't be able to say due to NDA. The lack of comment from Daz means either NDA or they don't know.

    ...And what good news? Speculation and guesswork is not news.

    Good news that they are actually coming out next week (unless that's also speculation), so that I can finally get an idea of what the hell I can put into my new machine.

    Good news that I might be back to accomplishing something I enjoy instead of sitting here in this damn apartment listening to crappy world news, trying to deal with damn cataracts and other body, & computer, failures while my bit of time left on this ridiculous planet is wasting away.

    Damn! I'll take any bit of good news that I can freaking get these days.
    If I have to, I'll forgo the new cards and get whatever's left in the 20xx's. I just want to know ASAP what my options are.

  • fred9803 Posts: 1,565
    edited August 2020

    Gainward GeForce RTX 3090 Phoenix Golden Sample

    Post edited by fred9803 on
  • tj_1ca9500b Posts: 2,057
    fred9803 said:

    Gainward GeForce RTX 3090 Phoenix Golden Sample

    I was about to post this article, which features the above card:

    https://wccftech.com/nvidia-geforce-rtx-3090-geforce-rtx-3080-custom-graphics-cards-leak-out/

    In any case, yeah, when official product pages go online with the specs, that's a bit more than just a rumor, I'd say...

    So, if the latest leaks hold true, we are looking at a 24GB and a 10GB option at launch, but of course we have about a day to go until Nvidia's official presentation.  Gee, I wonder what a few people around here might be doing tomorrow around that time...  I still say Dr. Su looks better in a leather jacket than Jensen does!

  • I'm waiting patiently for the watercooled variants to surface.

  • Leana Posts: 12,799
    edited August 2020

    Oh! Just wondering: Does DS usually have the ability to render Iray, albeit poorly, with newly-released cards, or will the 30xx's be totally unable to render in Iray until a new version of DS comes out?

    For 10xx and 20xx, new versions of Iray and DS were needed (and it took quite a bit of time for Iray to support them, IIRC). No one but Nvidia could say whether or not that will be needed for 30xx, but I personally wouldn't bet on them being supported on release.

    Post edited by Leana on
  • nicstt Posts: 11,715
    edited August 2020
    nicstt said:

    How would anyone but Nvidia know what is happening? If Daz have any early info, they won't be able to say due to NDA. The lack of comment from Daz means either NDA or they don't know.

    ...And what good news? Speculation and guesswork is not news.

    Good news that they are actually coming out next week (unless that's also speculation), so that I can finally get an idea of what the hell I can put into my new machine.

    Good news that I might be back to accomplishing something I enjoy instead of sitting here in this damn apartment listening to crappy world news, trying to deal with damn cataracts and other body, & computer, failures while my bit of time left on this ridiculous planet is wasting away.

    Damn! I'll take any bit of good news that I can freaking get these days.
    If I have to, I'll forgo the new cards and get whatever's left in the 20xx's. I just want to know ASAP what my options are.

    The only thing that seems to be 100% true is that Nvidia have a clock counting down; they will be announcing something when said clock reaches zero - or so we're being led to believe.

    Whatever card I end up getting, presuming it's an Nvidia option, will be a Strix variant or perhaps a watercooled one; and if I go AMD and Strix is an option there, the same applies.

    Post edited by nicstt on
  • nicstt Posts: 11,715
    Leana said:

    Oh! Just wondering: Does DS usually have the ability to render Iray, albeit poorly, with newly-released cards, or will the 30xx's be totally unable to render in Iray until a new version of DS comes out?

    For 10xx and 20xx, new versions of Iray and DS were needed (and it took quite a bit of time for Iray to support them, IIRC). No one but Nvidia could say whether or not that will be needed for 30xx, but I personally wouldn't bet on them being supported on release.

    The 10 series took ages, IIRC; the 20 series was, if not immediate, then close.

  • tj_1ca9500b Posts: 2,057
    edited August 2020

    Where's that meme of a big salt factory...

    OK, HUGE grain of salt (the article even says this), but if true, there may be some happy campers around here tomorrow. If not, well, you were warned; don't get your hopes up yet!

    https://wccftech.com/rumor-nvidia-rtx-3090-performance-slides-leaked/

    Almost 2x performance over 2080 Ti if leaked slides pan out.  If not, well that's why it's tagged RUMOR.

    1 day plus 1 hour to go, plus factor in the initial blah blah at the beginning of the presentation time....

    Post edited by tj_1ca9500b on
  • nicstt said:
    Leana said:

    Oh! Just wondering: Does DS usually have the ability to render Iray, albeit poorly, with newly-released cards, or will the 30xx's be totally unable to render in Iray until a new version of DS comes out?

    For 10xx and 20xx, new versions of Iray and DS were needed (and it took quite a bit of time for Iray to support them, IIRC). No one but Nvidia could say whether or not that will be needed for 30xx, but I personally wouldn't bet on them being supported on release.

    The 10 series took ages, IIRC; the 20 series was, if not immediate, then close.

    The 20x0 cards came in two steps: basic support was a lot quicker than for the 10x0s, but not instant by any means, while full support taking advantage of the RTX cores took quite a bit longer (I can't recall how that compared with the 10x0 release, but I know we had it before I bought my 2080Ti at the end of last year).
