New NVIDIA Ampere leaks/rumours 3080 Ti, etc.

Sevrin Posts: 6,314

What else should you do while rendering but scan the globe for news about stuff that will make it quicker?

So there are new rumours about the upcoming Ampere cards, and it's kind of exciting.

So, in summary:
NVIDIA Ampere will feature:
1- "Massive" performance uplift in ray tracing.
2- Higher Rasterization throughput.
3- More VRAM across the board.
4- Lower TDPs.
5- Slightly higher clock speeds.
6- "Potentially" limited overclocking capability due to lower voltages.

WCCFTech Article here

The least certain, but possibly most exciting part of the rumour is about lower pricing.  I'll be in my bunk.

Comments

  • Asari Posts: 703
    For Iray it will be interesting to see how much speed the improved RT cores will provide. Also, whether the 3080ti can render considerably faster at 8K resolution. I find that the 2080ti renders well at 4K resolution, but at 8K it's a long overnighter.
  • 7- More money !

    Lol :)

  • marble Posts: 7,500

    7- More money !

    Lol :)

    Probably six months to save up for one.

  • I think Nvidia has learned its lesson. The 1080ti was a card every gamer wanted, and a lot of them bought one. The 2080ti is viewed by most gamers as more than they want to spend.

    The Ampere cards will almost certainly cost less.

  • daveso Posts: 7,939

    $500 is my top price, so it's going to have to be a lot cheaper

  • I have the 2080ti and right now that is ample for me. As it is, I can render and pump out higher-end content in a few minutes. At most I'd probably consider it if I did more animation, which I don't.

  • kyoto kid Posts: 42,137

    ...however, for the Daz-integrated Iray version it likely won't really matter.

  • Robinson Posts: 751
    daveso said:

    $500 is my top price, so it's going to have to be a lot cheaper

    Same, but it won't be. I expect prices similar to the current range, i.e. the xx70 range will be the sweet spot for me. This is OK actually. A 2070 now offers similar performance to the more expensive 1080 from a few years back (with way better RT performance, of course). Spending less just means you're a few years behind, that's all.

  • outrider42 Posts: 3,679

    Ha, I had thought of starting this thread, but now I don't have to.

    I actually think Nvidia will hit nearly all of those points. I've said it before, but 2020 is going to be a wild year for GPU hardware. So Nvidia HAS to show up with something good at a reasonable price. AMD is doing pretty well; in fact, the 5700 is outselling Nvidia's offerings in some markets. People are just not happy with Nvidia. There are gamers who argue that ray tracing is a gimmick in games and they don't want it.

    Of course they are wrong. Only a fool would truly believe ray tracing is a gimmick in games. It is hard, but it will get there. Even consoles will have ray tracing next year. It's not a fad if consoles are getting it.

    At any rate, rational or not, gamers are frustrated with Nvidia. And Nvidia is going to have serious competition in 2020. AMD is set to release "big Navi", which will feature a GPU they call "the Nvidia Killer". Now certainly they could be grandstanding, but that would be very stupid and would backfire if they failed. Rather, they really do seem to believe they have something really good, something that bests the 2080ti. This would be huge, because AMD has not competed with Nvidia at the top since at least 2012.

    Competition will drive prices down. I am very confident that the 3080ti will be cheaper than the 2080ti. How much, I can't say. I doubt it will be as low as the 1080ti's $700, though. I'm thinking $800 at best; it could be $900. And unlike the 2080ti, which has an absurd price for the FE, the FE for the 3080ti will not be $1200. I think it will drop to under $1000.

    For performance, yes, moving to 7nm will make a huge difference. They can do so much more with that. I also think using Samsung to fab the chips will be a major help as well.

    Pricing is only one thing. With both AMD and Intel releasing GPUs in 2020, Nvidia will want to push performance as well. They cannot screw around.

    The big complaint is about ray tracing's performance hit in games, so I believe Nvidia will be spending most of their effort on killing that. First, I believe the RT cores will be improved; it's only natural, just like CUDA has improved each generation. Turing is only the 1st iteration. On top of this, I believe Nvidia will double the RT core count in each tier. Currently, Turing has 1 RT core per SM; I believe Ampere will have 2 RT cores per SM. So not only will these cores be faster, there will be twice as many of them (or more). That will lead to massive ray tracing gains (rough arithmetic at the end of this post).

    We've already seen what RT cores can do for Iray.  Going even faster will of course push Iray even faster.

    I also think Nvidia will work on making its Tensor cores more effective, more intelligent at denoising tasks. However, I don't know if this will impact Iray. I think they need to rework how Tensor works for Iray.

    But of course ray tracing is only part of rendering. The rest is shading. And I think this will improve, too. I don't think the gain will be anything like the RT core gains, but rather a proper CUDA generational jump. That's still good.

    Then there is VRAM. I already predicted Turing would go past 12GB of VRAM, and I was right. Maybe it was just the Titan, but still, the Titan jumped to 24GB. I believe the next Titan will likely stay at that number. However, I think we will see more gains in the mainstream cards. The 2060 Super is already sporting 8GB, so it's logical the 3060 will see an 8GB version. It would be nuts to have 3 straight tiers (x60, x70, and x80) all at 8GB. The 3070 might stay at 8GB, but I think the 3080 will move to 10GB. The 3080ti could be interesting, but I think it will approach 16GB, though more likely it will be 12-14GB.

    One thing to keep in mind is the consoles will have lots of VRAM in 2020. Rumors speculate possibly 16GB. Why does that matter? Because PC gamers would be PISSED if a console had more VRAM than their $1000 GPU. I'm totally serious. So if the consoles really do get that much VRAM, then look for the 3080ti to at least match that amount. The 3080 might also come closer, and again for the same reason. Plus, if consoles get that much, that would indicate that games could start pushing more VRAM. Console games often limit what PC games do, as many developers simply target console specs as their goals.

    I also think we could get some sweet surprises with NVLink as well. If Nvidia goes with PCIe 4.0, which they should, that would open the door to blazing speeds for GPUs to talk to each other. You might not even need an NVLink connector to do NVLink.

    Some of this probably sounds pie in the sky. But I did predict gamers would get both RT and Tensor cores long before the cards were announced.

    But basically, my advice if anyone is looking to buy a GPU is to wait for 2020 if you can. You will be able to get more bang for your buck.

    Oh, not to mention that new Ryzen CPUs should be coming out as well. Plus Intel is actually slashing prices to compete. 2020 is going to be an awesome year for hardware all around.
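
    And here is the rough RT-core arithmetic from above as a quick sketch (plain C++/CUDA host code). Only the Turing figures are real (the 2080ti has 68 SMs with 1 RT core each); every Ampere number in it is nothing more than my own guess.

        // Back-of-the-envelope guess at Ampere RT throughput vs Turing.
        // Known: TU102 on the 2080ti has 68 SMs with 1 RT core each.
        // Guessed: same SM count, 2 RT cores per SM, each ~1.5x faster.
        #include <cstdio>

        int main() {
            const int    turing_sms       = 68;   // 2080ti (known)
            const int    turing_rt_per_sm = 1;    // Turing (known)
            const int    ampere_sms       = 68;   // ASSUMPTION
            const int    ampere_rt_per_sm = 2;    // ASSUMPTION
            const double per_core_speedup = 1.5;  // ASSUMPTION

            const int turing_rt = turing_sms * turing_rt_per_sm;   // 68
            const int ampere_rt = ampere_sms * ampere_rt_per_sm;   // 136
            const double gain = (double)ampere_rt / turing_rt * per_core_speedup;

            printf("RT cores: %d -> %d, rough RT throughput gain: ~%.1fx\n",
                   turing_rt, ampere_rt, gain);   // ~3.0x with these guesses
            return 0;
        }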

  • outrider42 Posts: 3,679

    I didn't cover the low to mid range segment. This segment, the $200-500 range GPUs, will see the most competition of all. While AMD might challenge Nvidia for the performance crown, it is still anybody's guess as to whether they actually will. As for Intel, they have already stated publicly that they will not be releasing high end GPUs in 2020; they may surprise us, but it looks unlikely. Intel will first release mainstream GPUs in the price range I mentioned. AMD has already been quite strong in this segment as well.

    So that means the mid range segment is going to be an absolute bloodbath. And we the consumers will reap the benefits. So look for your $200 or $400 to get you a lot more GPU from this battle. I would advise trying to wait for all 3 companies to get some cards out, because they will price themselves accordingly. There will probably be a variety of big sale discounts as well.

    There is even a small possibility that Intel has a way to use CUDA on their GPUs, because we just don't know how they will function yet. I stress this is a very small chance, but it is out there. So if Intel GPUs can use CUDA, you would have a choice between them and Nvidia for Iray. That would be pretty rad.

  • kyoto kid Posts: 42,137

    ...PCIe 4.0? Wonder if that will even run on my old PCIe 2.0 MBs. I know PCIe 3.0 will, but with all the changes to required specs lately it's hard to tell what the new requirements will be.

  • nicstt Posts: 11,715

    7- More money !

    Lol :)

    This!

    ... But you forgot 'lots'.

  • outrider42 Posts: 3,679
    kyoto kid said:

    ...PCIe 4.0? Wonder if that will even run on my old PCIe 2.0 MBs. I know PCIe 3.0 will, but with all the changes to required specs lately it's hard to tell what the new requirements will be.

    PCIe is always backward compatible as part of its standard. The biggest thing here is just the bus bandwidth: PCIe 4.0 doubles 3.0's 16 GB/s to 32 GB/s, while PCIe 2.0 caps at 8 GB/s (rough numbers at the end of this post). So an Ampere GPU would be forced to run at PCIe 2.0 bus speed. However, I very seriously doubt it would be an issue, as cards don't even saturate 3.0's bandwidth in most situations, certainly not for Iray. Running Iray on 2.0 has no effect whatsoever on rendering, so I cannot imagine a situation where it would with Ampere. Because, as I have said before, Iray is not like video games; the bus is not being pushed as hard, at least for single GPUs.

    The one and only situation that may pose a performance penalty is multi-GPU setups, and that would be only if such a setup depended on 4.0 bus speed for communication. We don't know if Iray is even capable of taking advantage of that.

    Here is a question: has anybody used multiple GPUs in a PCIe 2.0 setup? I'd like to know how that compares to the same cards in a 3.0 setup. If there is a difference, then we have an answer for how 4.0 would affect multi-GPU performance.
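
    For reference, here is the bandwidth arithmetic behind those numbers as a tiny sketch (plain C++/CUDA host code); the per-lane figures are the usual approximate effective rates for an x16 slot, per direction.

        // Approximate per-direction bandwidth of a x16 slot by PCIe generation.
        #include <cstdio>

        int main() {
            const char*  gen[]         = {"1.x", "2.0", "3.0", "4.0"};
            const double gb_per_lane[] = {0.25, 0.5, 1.0, 2.0};   // GB/s per lane, roughly
            const int    lanes         = 16;

            for (int i = 0; i < 4; ++i)
                printf("PCIe %s x16: ~%.0f GB/s per direction\n",
                       gen[i], gb_per_lane[i] * lanes);
            // Prints ~4, ~8, ~16, ~32 GB/s. A single GPU rendering in Iray doesn't
            // come close to saturating even the 2.0 figure once the scene is loaded.
            return 0;
        }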

  •  

    There is even a small possibility that Intel has a way to use CUDA on their GPUs, because we just don't know how they will function yet. I stress this is a very small chance, but it is out there. So if Intel GPUs can use CUDA, you would have a choice between them and Nvidia for Iray. That would be pretty rad.

    CUDA is part of the Nvidia GPU's hardware. Nvidia provides an API for directly accessing that HW. Intel could not copy that HW; it is covered by numerous patents that Nvidia zealously protects.

    Both Intel and AMD could produce a software library to emulate CUDA (I know there is an open source project underway to do that), but software emulation of hardware always comes at a significant performance penalty.
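
    To make the distinction concrete, here is a minimal, purely illustrative CUDA sketch (a toy kernel, not anything from Iray): the C++ source and the runtime API calls are the software side, while the SMs and their CUDA cores that actually execute the launched threads are the hardware side.

        // Toy CUDA example: the __global__ function and the <<<...>>> launch are the
        // software/API layer; the threads they spawn are scheduled onto the GPU's SMs
        // and executed by the hardware CUDA cores.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void scale(float* data, float factor, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
            if (i < n) data[i] *= factor;
        }

        int main() {
            const int n = 1 << 20;
            float* d = nullptr;
            cudaMalloc(&d, n * sizeof(float));               // runtime API call (software)
            cudaMemset(d, 0, n * sizeof(float));             // give the buffer defined contents
            scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);     // executed on the SMs (hardware)
            cudaDeviceSynchronize();
            cudaFree(d);
            printf("kernel finished\n");
            return 0;
        }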

  • outrider42 Posts: 3,679
    edited November 2019

    At the end of the day CUDA is nothing but software. There is nothing stopping Nvidia from licensing CUDA. And Nvidia uses Intel-patented tech in their own stuff as well... just look at Iray, which uses Intel Embree for CPU-based rendering. All of these companies use some kind of tech that was built by the others when it suits them. Android pays royalties to Microsoft. Intel has a mountain of patents of their own, and they could wheel and deal their way to CUDA by leveraging the patents Nvidia needs, if they really wanted to. And if Intel licensed CUDA, that would mean Nvidia makes royalties on every Intel GPU sold.

    And I never said this is a sure thing. The odds are quite slim. But an Intel discrete GPU is a whole new ballgame, they could try something totally new. We do know that Intel is focusing more on professional use than gamers, and CUDA is a big player in several fields, like GPU rendering.

    Post edited by outrider42 on
  • No, CUDA isn't just software. CUDA is a specific HW pipeline inside each SM in an Nvidia GPU. It allows certain operations to be performed in parallel, as if the GPU had thousands of processing cores.

    Nvidia makes literal billions off CUDA. They will not be licensing those patents, ever. It is at the core of their business.

  • 7- More money !

    Lol :)

    Actually, they already announced the 3000 series will cost LESS than the 2000s.

  • outrider42 Posts: 3,679

    It is literally a software layer that accesses the GPU. That is in the description. It is software, pure and simple.

    Also, you might want to look at this. https://nvidianews.nvidia.com/news/intel-to-pay-nvidia-technology-licensing-fees-of-1-5-billion

    "Under the new agreement, Intel will have continued access to NVIDIA's full range of patents.  In return, NVIDIA will receive an aggregate of $1.5 billion in licensing fees, to be paid in annual installments, and retain use of Intel's patents, consistent with its existing six-year agreement with Intel. "

    Intel paid Nvidia 1.5 BILLION a few years ago for "licensing fees". Now, there is a bit more to that deal than what was announced to the public, and I suppose I cannot discuss that here on the Daz forums. But regardless of what the actual reason was, the fact of the matter is that Intel and Nvidia made a deal with each other concerning their technology. A big one. It has since expired, interestingly enough just last year.

    Moreover, Intel has the pure patent muscle to really push Nvidia around and force the issue. Intel owns quite a few patents that both Nvidia and AMD use in their GPUs. They can use that to their advantage at a negotiation table. And they have billions of cash in the bank to sweeten that deal, billions more than Nvidia is even worth as a company.

    That is why Intel's move into discrete GPU is such a wild card.

    I am not saying it will happen. But I am only saying it is possible. I am sure Nvidia would not just happily let Intel license CUDA, but business can be very complicated sometimes, and these two companies have a long history of deals, patent lawsuits, and other shenanigans that make anything possible. Dude, Intel and AMD made a CPU+GPU combo on a single chip just last year in the middle of their bloody battle with each other. How many people predicted that?

    We have a Romance of the Three Kingdoms era of GPU technology coming upon us.

  • Sevrin Posts: 6,314

    It is literally a software layer that accesses the GPU. That is in the description. It is software, pure and simple.

    Also, you might want to look at this. https://nvidianews.nvidia.com/news/intel-to-pay-nvidia-technology-licensing-fees-of-1-5-billion

    "Under the new agreement, Intel will have continued access to NVIDIA's full range of patents.  In return, NVIDIA will receive an aggregate of $1.5 billion in licensing fees, to be paid in annual installments, and retain use of Intel's patents, consistent with its existing six-year agreement with Intel. "

    Intel paid Nvidia 1.5 BILLION a few years ago for "licensing fees". Now, there is a bit more to that deal than what was announced to the public, and I suppose I cannot discuss that here on the Daz forums. But regardless of what the actual reason was, the fact of the matter is that Intel and Nvidia made a deal with each other concerning their technology. A big one. It has since expired, interestingly enough just last year.

    Moreover, Intel has the pure patent muscle to really push Nvidia around and force the issue. Intel owns quite a few patents that both Nvidia and AMD use in their GPUs. They can use that to their advantage at a negotiation table. And they have billions of cash in the bank to sweeten that deal, billions more than Nvidia is even worth as a company.

    That is why Intel's move into discrete GPU is such a wild card.

    I am not saying it will happen. But I am only saying it is possible. I am sure Nvidia would not just happily let Intel license CUDA, but business can be very complicated sometimes, and these two companies have a long history of deals, patent lawsuits, and other shenanigans that make anything possible. Dude, Intel and AMD made a CPU+GPU combo on a single chip just last year in the middle of their bloody battle with each other. How many people predicted that?

    We have a Romance of the Three Kingdoms era of GPU technology coming upon us.

    Yeah, that deal came to an end 2 years ago, but... Nvidia-Intel licensing was really just a payment:

    The deal is over, and guess what, Intel didn't announce any new license for the GPU, simply as this never was a GPU licensing deal. People close to the matter from several different independent sides have confirmed that the deal was never about licensing. AMD's deal with Intel turned out to be a Vega GPU stitched with EMIB to a Core processor and nothing more than that.

  • It is literally a software layer that accesses the GPU. That is in the description. It is software, pure and simple.

    Also, you might want to look at this. https://nvidianews.nvidia.com/news/intel-to-pay-nvidia-technology-licensing-fees-of-1-5-billion

    "Under the new agreement, Intel will have continued access to NVIDIA's full range of patents.  In return, NVIDIA will receive an aggregate of $1.5 billion in licensing fees, to be paid in annual installments, and retain use of Intel's patents, consistent with its existing six-year agreement with Intel. "

    Intel paid Nvidia 1.5 BILLION a few years ago for "licensing fees". Now, there is a bit more to that deal than what was announced to the public, and I suppose I cannot discuss that here on the Daz forums. But regardless of what the actual reason was, the fact of the matter is that Intel and Nvidia made a deal with each other concerning their technology. A big one. It has since expired, interestingly enough just last year.

    Moreover, Intel has the pure patent muscle to really push Nvidia around and force the issue. Intel owns quite a few patents that both Nvidia and AMD use in their GPUs. They can use that to their advantage at a negotiation table. And they have billions of cash in the bank to sweeten that deal, billions more than Nvidia is even worth as a company.

    That is why Intel's move into discrete GPU is such a wild card.

    I am not saying it will happen. But I am only saying it is possible. I am sure Nvidia would not just happily let Intel license CUDA, but business can be very complicated sometimes, and these two companies have a long history of deals, patent lawsuits, and other shenanigans that make anything possible. Dude, Intel and AMD made a CPU+GPU combo on a single chip just last year in the middle of their bloody battle with each other. How many people predicted that?

    We have a Romance of the Three Kingdoms era of GPU technology coming upon us.

    I've explained this twice! Why do you continue to contradict established facts? CUDA "cores" are referenced in every description of every Nvidia GPU. Those are real physical things that are part of each SM. They are what the CUDA API interacts with.

    As Sevrin already pointed out, Intel never licensed anything from Nvidia. This was a payment to settle the anti-competition lawsuit when Intel started putting iGPUs on their CPUs and made it impossible for Nvidia to continue selling on-motherboard graphics chipsets.

    Intel may have a lot of money, but Nvidia does as well, and Nvidia knows full well that CUDA and GPGPUs are the future of all sorts of things. They stand to make billions off Nvidia Drive in just the next few years. My datacenter has more money in Quadros than in Xeons; it's recent, and it's still not that much more, but I cannot see that changing any time soon.

    tbh I expect Intel's discrete GPU to not make much impact. They've already said they don't even plan to compete for the high end of the market. That most consumers don't buy the top tier doesn't matter; they read/watch the reviews and make a lot of their buying decisions based on which competitor is on top. Further, their iGPUs have been generally awful, and the recent new version was not much of an improvement.

  • kyoto kid Posts: 42,137

    ...I remember when Intel tried their hand at a dedicated GPU once before, back in 2010, with the old Larrabee-based GPU chip, which was intended as the core for a 3D GPU card but was scrapped due to poor performance.

    Not holding my breath about this new offering.

  • nonesuch00 Posts: 18,830

    nVidia, Intel, AMD: none of them have exclusive rights to the mathematics of visual computer graphics, computer software, or computer hardware, so any of them, or another entity altogether, could come up with the next greatest thing since sliced bread. Of those that are big now, I personally think AMD is the one that's going to vastly exceed what Intel or nVidia are doing, simply because they are more focused on giving the consumer value for money compared to Intel and nVidia, and that forces them to be more innovative to have competitive hardware and software rather than "cheaper hardware and software because it's inferior". Why do I think that? Because they've demonstrated that with the Ryzen CPUs and their new GPUs, both integrated and discrete.
