Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...


Comments

  • tj_1ca9500b Posts: 2,057
    edited October 2020
    Ghosty12 said:

    So, I just watched the AMD Big Navi/RDNA 2 presentation.

    Short form, 6900XT trades blows with 3090, 6800XT trades blows with the 3080, slightly lower power consumption, at least in the AMD benches.  We won't see the cards for almost another month though, and December for the 6900XT.

    There's also a 6800, which is a bit cheaper.  If I'm remembering correctly, the 6900XT is $999 and the 6800XT is $649, and I missed the price for the 6800.  All three cards have 16GB of VRAM as I remember, and some other features.

    So short form, if AMD's benchmarks are honest, AMD is finally able to trade blows with the top end Nvidia cards.

    In other news, I've seen a couple of rumors in the last week about a 3080 Ti and 3070 Ti in the works, but no hard specs as of yet.

    Nvidia's 'out of the gate' pricing makes a bit more sense if the AMD numbers are accurate.  Of course, we get to wait on the independent reviewers to see how that shakes out in the various upcoming reviews.

    Rumor was that Nvidia canned those rumored cards, though seeing what AMD have just revealed Nvidia may want to reconsider..

    The rumors that surfaced this week indicated that the rumored 16 and 20 GB cards were the ones that were put on the backburner before now, and that these 'Ti' cards are not the same as those other cards.  Whether the rumored 3080 Ti would have 12GB of VRAM (i.e. half of a 3090) or more than that is unknown at this point.  And, being a rumor, well, mountain of salt and all that.  Plus, as the rumors indicated that the new cards were just entering testing, odds are good we wouldn't be seeing them before sometime next year in any case.

    My take on the situation is that Nvidia has nothing to worry about until November 18th, and even then, the MSRP pricing should keep them competitive for now.  If AMD had trounced 3080 performance with the 6800XT it'd be a different story, but as it stands a lot of gamers will have the attitude of 'if it's basically the same, I'll stick with Nvidia'.  Not sure how many gamers jumped on the Radeon bandwagon for RDNA1, but my impression was that it was lower than 20-25% of the market, so Nvidia is still very much in the driver's seat.

    The hope here, of course, is that once the supply situation stabilizes, this will put downward pressure on prices, helping us creative types out a bit with our budgets.  I'm not holding my breath, though, as it's very likely that won't happen before the new year...

    Since AMD hasn't announced a 32GB RDNA 2 card as of yet (at least not that I've seen), or even a 24GB one, I'm not expecting 3090 prices to drop much below MSRP any time soon, but it'd be nice.

    Post edited by tj_1ca9500b on
  • Ghosty12 Posts: 2,080
    edited October 2020

    Ahh cool, good to know about those cards, as it is hard to tell these days with all these hardware sites and techtubers what is real information and what is not.. In the end it will be interesting to see what Nvidia do when November 18 hits.. Hopefully it will be good for us all, but yes I am holding out for now till we see what Nvidia come up with..

    Post edited by Ghosty12 on
  • Mendoman Posts: 404
    edited October 2020
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    That's a great theory, but did you ever stop to think that (if I remember correctly) the 780 had 3GB and the 980 had 4GB of VRAM? By your logic, there was never any need for a higher VRAM amount because... well, back in the day no game probably required any more than those high end cards had. And when you single out the 1080 Ti and 2080 Ti, those might have been the flagships, but game developers don't make their games to run only on flagship GPUs. The normal 1080/2080 has only 8GB, so that's the reason why very few games require more than that nowadays. Now that AMD has released Big Navi, and all of those cards have 16GB of VRAM, that's an entirely new ballpark. I bet quite a few gamers are going to consider that extra 6GB future proofing, even if you don't see it as relevant.

    Post edited by Mendoman on
  • PerttiA Posts: 10,024
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    Bragging rights, the second most important reason on the list of things driving computer technology forward, the first being [censored]...

  • algovincian Posts: 2,665
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    And what percentage of the PC gaming market do you think has a card with more than 8GB?

    - Greg

  • Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    And what percentage of the PC gaming market do you think has a card with more than 8GB?

    - Greg

    Once the AMD RX cards are released, possibly a lot.  Anyone remember when everyone dumped their Nvidia Geforce4 Ti 4600 cards in 2002 for ATI Radeon 9700 Pro cards?  It could happen again (except for those of us that are shackled to Iray for rendering).  Heck, ATI even sponsored QuakeCon that year (the only reason I remember this).  I still have a pre-production Ratzpad from the [H]ard OCP workshop that year around here somewhere.

  • Mendoman Posts: 404
    edited October 2020

    Well, I think what AMD did will have an impact on the gaming industry, but there are still millions of Nvidia users and older AMD users, so game developers will probably have to support them for quite some time. But I'd bet that every single future AMD sponsored game will have "High" and "Low" resolution textures, meaning that every single 3080 Nvidia owner knows that they are playing with low-res textures while their AMD-using counterparts, who paid less for their cards, are using high-resolution textures. In my opinion, Nvidia really can't wait another 2-3 years for next generation cards, because currently they are losing quite badly. Their GPUs are at about the same performance level, use more power, have less memory and are more expensive. That is not a situation they want to be in for long, or AMD will start to take a bigger and bigger share of the cake. I hope Nvidia drops the price of the 3090, since as I see it, it's the only card worth buying anymore...

    Post edited by Mendoman on
  • algovincian Posts: 2,665
    Mendoman said:

    Well, I think what AMD did will have an impact in the gaming industry, but there's still millions of Nvidia users and old AMD users, so game developers probably have to support that for quite some time. But I'd bet that every single future AMD sponsored game will have "High" and "Low" resolution textures, meaning that every single 3080 Nvidia owner knows that they are playing with lowres textures, while their AMD using counterparts, who paid less for their cards, are using high-resolution textures. In my opinion, Nvidia really can't wait another 2-3 years for next generation cards, because currently they are losing quite bad. Their GPUs are about the same performance level, use more power, has less memory and are more expensive. That is not a situation they want to be for long, or AMD will start to take bigger and bigger share of the cake. 

    We'll see. AI is only going to become more and more prevalent in games, and I'm not just talking about rendering. It's here that Nvidia has a huge advantage.

    - Greg

  • Mendoman said:
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    That's a great theory, but did you ever stopped to think about that  (if I remember correctly ) 780 had 3GB and 980 had 4GB of VRAM. In your logic, there was never any need for higher VRAM amount because.... well, back in the day no game probably required any more than those high end cards had. And when you single out 1080ti and 2080ti, those might have been the flagships, but game developers don't make their games to run only on flagship GPUs. Normal 1080/2080 has only 8GB, so that's the reason why very few games require more than that nowadays. Now that AMD released Big Navis, and all have 16GB of VRAM, that's entirely new ballpark. I bet quite many gamers are going to consider that extra 6GB future proofing, even if you don't see it relevant.

    LOL. There were games that pushed 3GB when the 780 Ti was the flagship; see the "will it run Crysis" memes. Same story with the 980: there was much complaining when it came out with only 4GB, as it was clear that 4GB wasn't really adequate at 1440p and wasn't going to work at 4K.

    Why don't the mid tier cards need more VRAM? Because most gamers play at 1080p, not at 4K. At 1080p no game pushes anything like 6GB, much less 8GB (no game pushes 8). As far as VRAM goes you could probably get by on a 4 or 5GB card at 1080p just fine. It's at 1440p and 4K that you need more VRAM, and at those resolutions you're spending more on the monitor, so it is also reasonable to expect you to spend more on the GPU to drive that monitor.

    So there remains no substance to your argument, no matter how much you wish there were.

  • RayDAnt Posts: 1,155
    edited October 2020
    Mendoman said:
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    That's a great theory, but did you ever stopped to think about that  (if I remember correctly ) 780 had 3GB and 980 had 4GB of VRAM. In your logic, there was never any need for higher VRAM amount because.... well, back in the day no game probably required any more than those high end cards had. And when you single out 1080ti and 2080ti, those might have been the flagships, but game developers don't make their games to run only on flagship GPUs. Normal 1080/2080 has only 8GB, so that's the reason why very few games require more than that nowadays. Now that AMD released Big Navis, and all have 16GB of VRAM, that's entirely new ballpark. I bet quite many gamers are going to consider that extra 6GB future proofing, even if you don't see it relevant.

    LOL. There were games that pushed 3Gb when the 780ti was the flagship see "will it run Crysis" memes. Same with the 980 ti. There was much complaining when the 980 ti came out with only 4Gb. As it was clear that 4Gb wasn't really adequate at 1440p and wasn't going to work at 4k.

    Why don't the mid tier cards need more VRAM? Because most gamers play at 1080p not at 4k. At 1080p no game pushes anything like 6Gb much less 8Gb (no game pushes 8). As far as VRAM goes you could probably get by on a 4 or 5Gb card at 1080p just fine. It's at 1440p and 4k that you need more VRAM and at those resolutions you're spending more on the monitor so it is also reasonable to expect you to spend more on the GPU to drive that monitor.

    So there remains no there to your argument no matter how much you wish there was.

    Also keep in mind that AMD is - if anything - further along the road to high-speed direct-to-SSD storage transfer adoption than Nvidia, making the 16GB frame buffers on their new cards that much less relevant for the vast majority of their potential user base in the medium term than even what Nvidia has going.

    Post edited by RayDAnt on
  • tj_1ca9500b Posts: 2,057

    Just an interesting note.

    I wanted to note that AMD's last two presentations have been pleasantly short.  No doubt that's due to video editing and cutting out the 'excess fat', something you can't do with a live presentation, but both the Ryzen 5000 series reveal and today's Big Navi reveal clocked in at about 20 minutes or so.  Just the facts, ma'am!

    Sure, some people may want a few more details, but as these were just announcements and not launches, I figure that AMD can get down into the weeds later if they need to.  My point is, I didn't have to block out a full hour of my day for these announcements.  Well, if you combined both of them into one announcement maybe...

    Next stop, the Ryzen 5000 launches!

  • RayDAnt said:

    Also keep in mind that AMD is - if anything - further along the road to hi-speed direct to SSD storage transfer adoption than Nvidia. Making these 16GB framebuffers on their new cards that much less reveant for the vast majority of their potential user base in the medium term than even what Nvidia has going.

    @RayDAnt

    How would you respond to my assumption that Radeon ProRender is going to support out-of-core rendering very well? As it stands, the only reason I am excited about the 6900XT, with its smaller 16GB of VRAM, is that it will probably make the 3090s cheaper. If the VRAM buffer mattered less, then AMD might still be a viable choice.

    Blender's Simplify render setting is magic, but it still can't replace a larger texture map for a close-up shot.

  • Mendoman Posts: 404
    edited October 2020
    Mendoman said:
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    That's a great theory, but did you ever stopped to think about that  (if I remember correctly ) 780 had 3GB and 980 had 4GB of VRAM. In your logic, there was never any need for higher VRAM amount because.... well, back in the day no game probably required any more than those high end cards had. And when you single out 1080ti and 2080ti, those might have been the flagships, but game developers don't make their games to run only on flagship GPUs. Normal 1080/2080 has only 8GB, so that's the reason why very few games require more than that nowadays. Now that AMD released Big Navis, and all have 16GB of VRAM, that's entirely new ballpark. I bet quite many gamers are going to consider that extra 6GB future proofing, even if you don't see it relevant.

    LOL. There were games that pushed 3Gb when the 780ti was the flagship see "will it run Crysis" memes. Same with the 980 ti. There was much complaining when the 980 ti came out with only 4Gb. As it was clear that 4Gb wasn't really adequate at 1440p and wasn't going to work at 4k.

    Why don't the mid tier cards need more VRAM? Because most gamers play at 1080p not at 4k. At 1080p no game pushes anything like 6Gb much less 8Gb (no game pushes 8). As far as VRAM goes you could probably get by on a 4 or 5Gb card at 1080p just fine. It's at 1440p and 4k that you need more VRAM and at those resolutions you're spending more on the monitor so it is also reasonable to expect you to spend more on the GPU to drive that monitor.

    So there remains no there to your argument no matter how much you wish there was.

    Uhmm, aren't you contradicting yourself here? If 4GB was not enough for 1440p gaming, doesn't that automatically mean that 8GB is not enough for 4K gaming? I mean, 1440p is about 3.7M pixels and 4K is about 8.3M pixels, right (quick pixel math at the end of this post)? If there was lots of complaining back then that the 780 Ti wasn't enough for 1440p gaming, why do you think that nowadays people would be happy that the 3080 is barely adequate for 4K gaming?

     

    I think we just have to agree to disagree about my arguments. Maybe the constant increase in VRAM usage just stops here because Nvidia and you say so, but I fail to see it if AMD keeps pushing it forward.
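
    For what it's worth, here is a quick sanity check of those pixel counts: just a throwaway C++ sketch of the arithmetic behind the 3.7M/8.3M figures, nothing more.

    ```cpp
    #include <cstdio>

    int main() {
        // Raw pixel counts for the two resolutions being compared.
        const long long px1440p = 2560LL * 1440LL;  // 3,686,400 (~3.7M)
        const long long px4k    = 3840LL * 2160LL;  // 8,294,400 (~8.3M)

        printf("1440p: %.1f Mpixels\n", px1440p / 1e6);
        printf("4K:    %.1f Mpixels\n", px4k / 1e6);
        printf("4K has %.2fx the pixels of 1440p\n",
               static_cast<double>(px4k) / static_cast<double>(px1440p));
        return 0;
    }
    ```

    So the figures themselves are right; whether VRAM needs actually scale with that 2.25x pixel ratio is exactly what gets argued in the replies below.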

    Post edited by Mendoman on
  • Mendoman said:
    Mendoman said:
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    That's a great theory, but did you ever stopped to think about that  (if I remember correctly ) 780 had 3GB and 980 had 4GB of VRAM. In your logic, there was never any need for higher VRAM amount because.... well, back in the day no game probably required any more than those high end cards had. And when you single out 1080ti and 2080ti, those might have been the flagships, but game developers don't make their games to run only on flagship GPUs. Normal 1080/2080 has only 8GB, so that's the reason why very few games require more than that nowadays. Now that AMD released Big Navis, and all have 16GB of VRAM, that's entirely new ballpark. I bet quite many gamers are going to consider that extra 6GB future proofing, even if you don't see it relevant.

    LOL. There were games that pushed 3Gb when the 780ti was the flagship see "will it run Crysis" memes. Same with the 980 ti. There was much complaining when the 980 ti came out with only 4Gb. As it was clear that 4Gb wasn't really adequate at 1440p and wasn't going to work at 4k.

    Why don't the mid tier cards need more VRAM? Because most gamers play at 1080p not at 4k. At 1080p no game pushes anything like 6Gb much less 8Gb (no game pushes 8). As far as VRAM goes you could probably get by on a 4 or 5Gb card at 1080p just fine. It's at 1440p and 4k that you need more VRAM and at those resolutions you're spending more on the monitor so it is also reasonable to expect you to spend more on the GPU to drive that monitor.

    So there remains no there to your argument no matter how much you wish there was.

    Uhmm, aren't you contradicting yourself here now? If 4GB was not enough for 1440p gaming, doesn't that automatically mean that 8GB is not enough for 4k gaming? I mean 1440p is like 3.7M pixels and 4k is like 8.3M pixels, right? If there was lots of complaining back then that 780ti wasn't enough for 1440p gaming, why do you think that nowadays people would be happy that 3080 is not enough/barely adequate for 4k gaming?

     

    I think we just have to agree to disagree about my arguments. Maybe history is wrong, and constant increase of VRAM usage just stops here because Nvidia and you say so, but I just fail to see it if AMD is pushing it forward.

    You seem to think VRAM is some sort of linear thing based on the number of pixels. That's not remotely true.

    The frame buffer is essentially irrelevant: 4 bytes of color x 8.3M pixels is only, roughly, a 32 MB frame buffer at 4K and 8-bit color. You could store movies in a couple of GB if that was all that was going on.

    What is really going on, and I just assumed you'd know this, is that texture sizes increase not linearly but geometrically. But just as we've learned that 4K textures are not needed very often, game designers quickly found they didn't need heavyweight textures pretty much ever. They need textures that are just sharp enough; they may only use a 1K or 2K texture even if the output resolution is 4K, because that is good enough.

    And of course the proof is in the actual facts: there is not a single game that exceeds 8GB. IIRC Red Dead Redemption 2 is the heaviest game out there texture-wise, and it is right around 6GB at 4K. Flight Sim 2020 uses more VRAM, but it does so with streamed geographic data rather than textures, and it doesn't seem to really hurt performance that much to run it on a lower-VRAM card (you can find people testing it on RTX Titans versus 2080 Tis and not getting a change in FPS outside of variance).
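
    To put rough numbers on the frame buffer and texture math above, here is a minimal C++ sketch. The byte counts are plain uncompressed RGBA8 figures; real engines use block compression and mipmaps, so treat these as illustrative upper bounds rather than what any particular game actually allocates.

    ```cpp
    #include <cstdio>

    int main() {
        const double kMiB = 1024.0 * 1024.0;

        // One 4K frame buffer at 4 bytes per pixel (8-bit RGBA).
        const long long fb4k = 3840LL * 2160LL * 4LL;
        printf("4K frame buffer: %.1f MiB\n", fb4k / kMiB);  // ~31.6 MiB

        // Uncompressed square RGBA textures grow with the square of the edge
        // length; this is the "geometric" growth, and it is why texture
        // budgets, not the frame buffer, dominate VRAM use.
        for (int edge = 1024; edge <= 8192; edge *= 2) {
            const long long bytes = (long long)edge * edge * 4LL;
            printf("%d x %d texture: %.0f MiB uncompressed\n", edge, edge, bytes / kMiB);
        }
        return 0;
    }
    ```

    Mipmaps add roughly another third on top, and BCn block compression typically cuts the texture figures by 4-8x, but the square-law growth from 1K to 8K (4 MiB up to 256 MiB per texture) is the point.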

  • It is true that right now games do not use the 8GB of memory on most video cards, but with the Xbox Series X/S using RDNA 2 graphics with up to 16GB of GDDR6, game developers may soon start designing games to use all of that memory, and those with 8GB cards may be crying when the games are converted to PC..

  • outrider42 Posts: 3,679
    Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    Incorrect. Doom Eternal uses 9GB at 4K with all settings maxed, and it actually uses it; it is NOT simply reserving it. Take a look at the Hardware Unboxed videos on this topic; they did an investigation of Doom's performance. At max 4K settings, Doom abnormally punishes 8GB cards compared to the higher-capacity ones. By dropping the texture detail a notch and running the benchmarks across all GPUs with this new setting, they found the 8GB cards performed more in line with where they were supposed to be. (If you want to eyeball VRAM on your own card, there's a rough sketch at the end of this post.)

    You may also remember that Nvidia specifically showed Doom running on a 3080 versus the 2080, and that the performance difference was massive. The reason for this difference ties into this: the 2080 was getting hammered by running out of VRAM. When you normalize the benchmark by dropping the textures, the performance gap shrinks drastically.

    So there is at least one example of a currently available game that uses more than 8GB of VRAM and suffers a clear performance hit on GPUs with 8GB. We are also just about to start a whole new console generation, and traditionally a new console generation pushes PC gaming specs up, as what is considered average or mid range moves up.

    Now that we have 3 different AMD cards all boasting 16GB, there will be more games that do this in the future. I would bet money that AMD sponsored titles in particular will not be shy about using more VRAM in the very near future. The question of "How much VRAM is needed now?" is very much a chicken-and-egg situation. Developers will always try to build games that work with currently available hardware; it is extremely rare for any video game to feature settings that are impossible to run on hardware available at the time of release. The baseline has now moved. We now have a 16GB card in the mid range, and yes, $550 is mid range, folks.

    Benchmarks are going to be important. AMD barely even mentioned ray tracing in this presentation. That is a red flag that their ray tracing is not so great. They also have no DLSS-like feature at launch, only a promise that they will have one in the future...but promises are not exactly good reasons to buy hardware. Most of the benchmark numbers they showed were from AMD sponsored games, which should of course grant them an edge. Their 6900XT numbers were very sketchy, as they overclocked that GPU using their "RAGE mode" and SAM features to achieve numbers higher than what would normally be possible. Those auto-overclock numbers are only possible if you buy an entire new PC with all compatible parts: the 6900XT, a Ryzen 5900X, and an AMD 500 series motherboard. If you lack any of these parts, the 6900XT doesn't hit these numbers. But at the very least they are competitive, which is as important as winning outright.

    Another thing is that a lot of people thought AMD would be using way less power to do this. The 6800XT at 300 Watts is not way less than 320. It is less, but that is not drastic, and I can't imagine anybody choosing a GPU based on it using 20 Watts less than another, LOL. The 6900XT might be a bit different, since the 3090 is so power hungry, but I have a feeling that the overclocks on the 6900XT will push it well above 300, so not a huge win there, either.

    The 6800 is pretty interesting, as it tackles the 3070 and appears to easily beat it, and does so with double the VRAM. However, it is not such an easy victory, since it costs $80 more and, oh BTW, it uses a good 30 Watts more, too...which I find funny considering how many people are suddenly talking about power draw these days. So are we really going to split hairs over the 3080 using 20 Watts more than the 6800XT when the 6800 uses 30 more than the 3070?
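
    For anyone who wants to eyeball VRAM on their own card while a game runs, here is a minimal C++ sketch using NVML, the library behind nvidia-smi (this assumes the NVML header and library from the NVIDIA driver or CUDA toolkit are available). Note that it reports memory allocated on the board, so separating 'allocated' from 'actively used' still takes the kind of per-game profiling Hardware Unboxed did.

    ```cpp
    #include <cstdio>
    #include <nvml.h>  // ships with the NVIDIA driver / CUDA toolkit; link against nvml

    int main() {
        if (nvmlInit_v2() != NVML_SUCCESS) {
            fprintf(stderr, "NVML init failed (no NVIDIA driver?)\n");
            return 1;
        }
        nvmlDevice_t dev;
        nvmlMemory_t mem;
        if (nvmlDeviceGetHandleByIndex_v2(0, &dev) == NVML_SUCCESS &&
            nvmlDeviceGetMemoryInfo(dev, &mem) == NVML_SUCCESS) {
            // 'used' is memory currently allocated on the board, not memory a
            // running game is necessarily touching every frame.
            printf("VRAM: %llu MiB used of %llu MiB total\n",
                   (unsigned long long)(mem.used >> 20),
                   (unsigned long long)(mem.total >> 20));
        }
        nvmlShutdown();
        return 0;
    }
    ```

    Run it in a loop (or just watch nvidia-smi) while playing and you can at least see whether an 8GB card is sitting at its ceiling.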

  • outrider42 Posts: 3,679

    To go further, look at the interest people have in going beyond 60 fps. Even the consoles can do this, with numerous multiplayer-focused games pushing high frame rate modes. The consoles have also impacted televisions. Now you can buy TVs that offer variable refresh rates and high frame rates. There are a good number of PC gamers actually looking at these TVs instead of monitors because of their size, and because OLED's incredible picture looks great even with game mode turned on. No gaming monitor has decent blacks, because blacks are one of the things that must be sacrificed for latency. Local dimming adds a lot of input lag as it is a post-image process, and many other features that help LED TVs achieve a great picture are post processes as well.

    At any rate, the bar has changed. TV specs are changing just for these new consoles, and that is only the beginning of the advancements we will be seeing. PC hardware in general is going through major shifts in a very short time. SSDs are about to become mandatory. What used to be great hardware, like Pascal, is now getting left in the dust; the once mighty 1080 Ti will likely be getting smacked around not just by a 3060, but by a 3050. 8-core CPUs will start to become much more common. And the funny part is that the new consoles do these things I just said, too. The consoles will bring about new shifts as much as anything.

  • Ghosty12 said:

    Well AMD just launched their new cards, and well it was interesting and all of their cards come with 16GB of vram.. And it looks like AMD's top of the range card is taking on the 3090, at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8 Gb. Even for most casual creator uses 10Gb is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking then 4 to 8 GB of system ram is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system ram, as it adds in an amount of future proofing to peoples computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. and the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11Gb cards for a long time. Nvidia made a very valid point though even with 2 generations of 11Gb flagships there is not a single game that pushes past 8Gb even with every texture setting maxed out at 4k. So why make consumers keep paying for VRAM they aren't using?

    Incorrect. Doom uses 9gb at 4K with all settings maxed, and it actually uses it, it is NOT simply reserving it. Take a look at HardwareUnboxed videos about this topic, they did an investigation of Doom's performance. On 8gb cards, Doom will abnormally punish them compared to the higher capacities at max 4k settings. By dropping the texture detail a notch and running the benchmarks across all GPUs with this new setting, they found the 8gb cards performed more inline with where they were supposed to be.

    You may also remember that Nvidia specifically showed Doom running on a 3080 versus the 2080, and that the performance difference was massive. The reason for this difference ties into this, the 2080 was getting hammered by running out of VRAM. When you normalize the bench by dropping the textures, the performance gap shrinks drastically.

    So there is at least one example of a game currently available that uses more than 8gb VRAM, and suffers a clear performance hit on GPUs with 8gb. We are also just about to start a whole new console generation, and traditionally a new console generation pushes PC gaming specs up, as what is considered average or mid range is moved up.

    Now that we have 3 different AMD cards all boosting 16gb, there will be more games that do this in the future. I would bet money that AMD sponsored titles in particular will not be shy about using more VRAM in the very near future. The question of "How much VRAM is needed now?" is very much a chicken and egg situation. Developers will always try to build games that work with currently available hardware. It is extremely rare for any video game to feature settings that are impossible to run on hardware that was currently available at the time of release. The baseline has now moved. We now have a 16gb card in the mid range, and yes, $550 is mid range, folks.

    Benchmarks are going to be important. AMD barely even mentioned ray tracing in this presentation. That is a red flag that their ray tracing is not so great. They also have no DLSS like feature at launch, only a promise that they will in the future...but promises are not exactly good reasons to buy hardware. Most of the benchmark numbers they showed were AMD sponsored games, which should of course grant them an edge. Their 6900XT numbers were very sketchy, as they overclocked that GPU using their "RAGE mode" and SAM features to achieve numbers higher than what would be normally possible. Those auto overclock numbers are only possible if you buy a entire new PC with all compatible parts, the 6900XT, a Ryzen 5900XT, and a AMD 500 series motherboard. If you lack any of these parts, the 6900XT doesn't hit these numbers. But at the very least they are competitive, which is as important as winning outright.

    Another thing is that a lot of people thought AMD would be using way less power to do this. The 6800XT at 300 Watts is not way less than 320. It is less, but that is not drastic, and I can't imagine anybody choosing a GPU based on it using 20 Watts less than another, LOL. The 6900XT might be a bit different, since the 3090 is so power hungry, but I have a feeling that the overclocks on the 6900XT will push it well above 300, so not a huge win there, either.

    The 6800 is the pretty interesting, as it tackles the 3070 and appears to easily beat it, and do so with double the VRAM. However, it is not such an easy victory, since it costs $80 more and oh BTW it uses a good 30 Watts more, too...which I find funny considering how so many people are suddenly talking about power draw these days. So are we really going to split hairs over the 3080 using 20 Watts more than the 6800XT when the 6800 uses 30 more than the 3070?

    Something is wonky with Hardware Unboxed's test, and many people have pointed it out, as you can see from that Reddit thread. I just checked it myself: my 2070 runs Doom within variance of my 1080 Ti. If Ultra Nightmare at 4K really needed more than 8GB I should be seeing a massive FPS hit, but I'm not. The cards are comparable in pretty much every game and Doom is no exception, no matter what settings I use. That the rest of the tech review world did not jump all over that says they think HU was full of it.

  • JamesJAB Posts: 1,766

    It's not only Doom Eternal clocking in at 9GB of VRAM usage... Resident Evil 3 running max settings at 4K needs 14GB of VRAM to avoid spilling over into system RAM.

    Also, on a side note... AMD has already won the video game ray tracing war.  RDNA 2 has one hardware ray tracing unit per compute unit.  This is cool and all, but the kicker is that the PS5 and Xbox Series X and S have RDNA 2 GPUs, so every single console game dev will be optimizing their games for AMD's ray tracing.

    Unfortunately for Nvidia, they decided to make ray tracing a "premium" feature that is only on the high-end, expensive cards, and that is why there are only a dozen or so games out that use the feature.

  • RayDAnt Posts: 1,155
    edited October 2020
    RayDAnt said:

    Also keep in mind that AMD is - if anything - further along the road to hi-speed direct to SSD storage transfer adoption than Nvidia. Making these 16GB framebuffers on their new cards that much less reveant for the vast majority of their potential user base in the medium term than even what Nvidia has going.

    @RayDAnt

    How would you respond to my assuming that Radeon ProRender is going to support outof core rendering very well?

    Seems reasonable. Although imo all this GPU direct SSD access stuff (assuming it really gets off the ground next year) is gonna end up obsoleting things like out-of-core rendering altogether.

     

    It is true that right now games do not use the 8GB of memory on most video cards but with the Xbox X/S using RDNA2 graphics with 16GB of GDDR6, soon game developers may start designing game to use all of that memory and those with 8GB cards may be crying when the games are converted to PC..

    Not exactly. Keep in mind that all the next gen consoles you speak of use a very high-speed but shared (between GPU and CPU) memory pool, meaning that you need to include system RAM capacity in the equation on the PC side before attempting to draw any conclusions.

     

     

    JamesJAB said:

    Not only Doom Eternal clocking in at 9GB Vram usage... Resident Evil 3 running max settings at 4K needs 14GB ov Vram to avoid swapping from sysem RAM.

    Also on a side note.... AMD has already won the video gaming Raytracing war.  RDNA2 has 1 hardware Raytracing core per compute unit.  This is cool and all, but the kicker is that the PS5 and Xbox Series X and S have RDNA2 GPUs.  So every single console game dev will be optimizing their games for AMD Raytracing.

    Unfortunately for Nvidia they decided to make Raytracing a "premium" feature that is only on the high end expensive cards.  And that is why there are only 12 or so games out that use the feature.

    There is no such thing as "Nvidia" or "AMD" exclusive raytracing. Both Nvidia's and AMD's (upcoming) raytracing acceleration hardware is exposed via GPU-agnostic APIs like DXR (part of DirectX), meaning that any game designed to take advantage of raytracing acceleration via DirectX, either in the past (when that was an Nvidia "RTX" exclusive) or in the future (when it will be both an Nvidia and an AMD thing), will be able to do it regardless of vendor.
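
    As a concrete illustration of that vendor neutrality, the same DirectX 12 feature query reports DXR support no matter whose GPU (or software fallback in the driver) is underneath. A minimal sketch, assuming a Windows build with the Windows SDK and d3d12.lib:

    ```cpp
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a device on the default adapter. Nothing here knows or cares
        // whether that adapter is Nvidia, AMD, or something else entirely.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            printf("No D3D12 device available\n");
            return 1;
        }

        // DXR support is reported through the generic feature-check interface.
        // If the call fails, the zero-initialized struct reads as "not supported".
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                    &opts5, sizeof(opts5));

        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
            printf("DXR tier 1.1 supported\n");
        else if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            printf("DXR tier 1.0 supported\n");
        else
            printf("DXR not exposed by this device/driver\n");
        return 0;
    }
    ```

    A game built against DXR just makes calls like this and uses whatever the driver reports; it never has to special-case the GPU vendor.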

    Post edited by RayDAnt on
  • Thanks for all the info and analysis guys.  Interesting stuff.

    Sooo....

    All that being said, is there anything in the works at Daz to make future releases less Iray/Nvidia dependent?  Any rumors of anything?

    Cheers!

  • Mendoman Posts: 404
    edited October 2020

    ...nevermind...

    Post edited by Mendoman on
  • kenshaw011267 Posts: 3,805
    edited October 2020
    JamesJAB said:

    Not only Doom Eternal clocking in at 9GB Vram usage... Resident Evil 3 running max settings at 4K needs 14GB ov Vram to avoid swapping from sysem RAM.

    Also on a side note.... AMD has already won the video gaming Raytracing war.  RDNA2 has 1 hardware Raytracing core per compute unit.  This is cool and all, but the kicker is that the PS5 and Xbox Series X and S have RDNA2 GPUs.  So every single console game dev will be optimizing their games for AMD Raytracing.

    Unfortunately for Nvidia they decided to make Raytracing a "premium" feature that is only on the high end expensive cards.  And that is why there are only 12 or so games out that use the feature.

    LOL

    https://www.pcgamer.com/resident-evil-3-best-settings/#:~:text=Simply put: The VRAM numbers,3—but it runs fine.

    As already pointed out, this whole 'raytracing is unique to RTX' thing was BS. Nvidia subtly told everyone it was by comparing RTX cards to other cards with raytracing turned on. DirectX will very happily use software to raytrace if the driver supports it; the Nvidia driver does, and the AMD driver doesn't to the best of my knowledge, but that could well change with the next update. If you have a GTX card and want to really tank your FPS, load up a ray traced game and turn ray tracing on. It's pretty brutal.

    So there will be no special sauce for these RDNA2 cards when they come out. Every title that supports ray tracing will support them. There might be some wonkiness with Vulkan, I read, but I also read that should be resolved shortly.

    Post edited by kenshaw011267 on
  • tj_1ca9500b Posts: 2,057

    I'm just spitballing here, but I had a thought about the '3080 Ti' rumor.

    So, the 3000 series Nvidia cards are on a Samsung process node.  There have been some rumors, though, that Nvidia is looking at moving 3000 series production to TSMC.  One way they could do this is to manufacture the 'Ti' cards at TSMC and keep manufacturing the current cards at Samsung.  The difference in process nodes could give the TSMC-manufactured 3xxx cards an added boost, depending on which node it is.  We might also see improved power consumption numbers, or not.

    It's a thought, anyways.  I'm curious to see where the Nvidia + Samsung partnership goes, and I actually don't mind it.  It's nice not to have all of the PC chip manufacturing in the TSMC basket (and Intel's, of course), plus it puts pressure on TSMC to keep their pricing in check.

  • nonesuch00 Posts: 18,753
    edited October 2020

    Well this is embarrassing. AMD finally releases its Big Navi video cards, and even after a few months of saving money for one I'll still wind up with a model they are calling the Flounder.

    Will AMD Radeons have anything similar to this DLSS stuff that will clean up and upscale old movies and toons from the very early days of film? What about this Omniverse SDK that nVidia has talked about? It seems that if you have more interests than just playing games, AMD comes up much, much shorter than nVidia does. Or maybe it's just that people talk about the nVidia equivalents much more than the AMD equivalents?

    Post edited by nonesuch00 on
  • Well this is embarrassing. AMD finally releases it's Big Navi video cards and even after a few months of saving money for one I'll still wind up with a model they are calling the Flounder.

    Will AMD Radeons have anything similar to this DLSS stuff that will clean up and upscale old movies and toon from the very early days of film? What about this Omniverse SDK that nVidia has talked about? It seems if you have more insterests than in just playing games AMD comes up really much much shorter than than nVCidia does. Or maybe it's just that people talk about the nVidia equivalents much more than the AMD equivalents? 

    AMD's DLSS equivalent is called Super Resolution. There isn't a ton known about it yet. It won't be enabled when the cards launch, and it doesn't appear that it will require game makers to send in high-res game images to AMD like DLSS does.

  • I'm just hoping that we get an RTX 3090 price drop, or maybe a 3080 Ti or 3070 Ti with more memory than the current RTX 3000 cards.  New rumors abound.

  • I'm just hoping that we get a RTX3090 price drop or maybe a 3080Ti or 3070Ti with more memory that the current RTX 3000 cards.  New rumors abound.

    Nvidia has definitely cancelled the double-VRAM 3080 and 3070. They could bring them back at some later date, but they are gone for now. Apparently the yields of the actual Ampere chips are simply too low to split the SKUs. Gamers Nexus, which is about as reputable a news source as there is, reported this today.

  • nonesuch00 Posts: 18,753

    I'm just hoping that we get a RTX3090 price drop or maybe a 3080Ti or 3070Ti with more memory that the current RTX 3000 cards.  New rumors abound.

    Cool, if the RTX 3090 drops to $1000 or less I'll save for another 3 months. I'd better measure the card thickness and length very carefully though, for my motherboard layout (microATX board, so RAM and NVMe SSD height might be a problem) and case width (microATX case).

  • nonesuch00 Posts: 18,753
    edited November 2020

    Well this is embarrassing. AMD finally releases it's Big Navi video cards and even after a few months of saving money for one I'll still wind up with a model they are calling the Flounder.

    Will AMD Radeons have anything similar to this DLSS stuff that will clean up and upscale old movies and toon from the very early days of film? What about this Omniverse SDK that nVidia has talked about? It seems if you have more insterests than in just playing games AMD comes up really much much shorter than than nVCidia does. Or maybe it's just that people talk about the nVidia equivalents much more than the AMD equivalents? 

    AMD's DLSS equivalent is called Super Resolution. There isn't a ton about it known. It won't be enabled when the cards launch and it doesn't appear that it will require game makers to send in high res game images to AMD like DLSS does.

    Thanks, I didn't know DLSS involves sending game images to nVidia's AI supercomputers for training. That definitely explains the better results the expert sites say nVidia's DLSS gets compared to AMD's (presumably local-only) solution.

    Post edited by nonesuch00 on