Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...


Comments

  • I may have spoken too soon. AMD may be announcing its MI100 Instinct CDNA-based accelerators on November 16th...

    https://www.aroged.com/2020/11/amd-unveils-instinct-cdna-compute-accelerators-nov-16/

    I'm not sure how useful compute accelerators are for rendering, and even if you could use one, it'd mainly help with Blender, Poser 12, and other 'brand agnostic' GPU rendering engines; it'll also likely be pricey. Has anyone here tried using an AMD Instinct for rendering? I'm just curious; still planning on that 3090 early next year...

    These are new products, so no one has seen them yet, but since they lack CUDA they are useless for Iray.

  • outrider42 Posts: 3,679
    Ghosty12 said:

    Well, AMD just launched their new cards, and it was interesting: all of their cards come with 16GB of VRAM. And it looks like AMD's top-of-the-range card is taking on the 3090 at $500 less. I think Nvidia may want to be concerned with what AMD has shown. We'll have to wait and see how Nvidia responds; only time will tell.

    It's enough VRAM because they're gaming cards, and no game calls for even 8 GB. Even for most casual creator uses, 10 GB is plenty. You have to understand that DS is a very, very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes, and considering the cost and performance issues, Nvidia has clearly decided against that. Maybe they'll eventually release the prosumer-oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while.


    By that thinking, 4 to 8 GB of system RAM is more than enough for most people. But we all know most people will go for 16 to 32 GB of system RAM, as it adds a measure of future-proofing to people's computers.

    No. There are very good reasons for more system RAM. There is no current reason for more VRAM. And the argument about not pushing VRAM because no cards exist that don't have more? WTF?

    Even in the Nvidia lineup there have been 11 GB cards for a long time. Nvidia made a very valid point, though: even with two generations of 11 GB flagships, there is not a single game that pushes past 8 GB, even with every texture setting maxed out at 4K. So why make consumers keep paying for VRAM they aren't using?

    Incorrect. Doom uses 9 GB at 4K with all settings maxed, and it actually uses it; it is NOT simply reserving it. (If you want to check the used-versus-reserved question yourself, see the monitoring sketch below.) Take a look at Hardware Unboxed's videos on this topic; they investigated Doom's performance. On 8 GB cards, Doom abnormally punishes them compared to higher-capacity cards at max 4K settings. By dropping the texture detail a notch and re-running the benchmarks across all GPUs with that setting, they found the 8 GB cards performed more in line with where they were supposed to be.

    You may also remember that Nvidia specifically showed Doom running on a 3080 versus the 2080, and that the performance difference was massive. The reason ties into this: the 2080 was getting hammered by running out of VRAM. When you normalize the benchmark by dropping the textures, the performance gap shrinks drastically.

    So there is at least one example of a currently available game that uses more than 8 GB of VRAM and suffers a clear performance hit on 8 GB GPUs. We are also just about to start a whole new console generation, and traditionally a new console generation pushes PC gaming specs up, as what is considered average or mid-range moves up.
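    Here is a minimal monitoring sketch for that used-versus-reserved check, assuming the third-party nvidia-ml-py bindings (imported as pynvml). Per-process figures are closer to "actually used" than the device-level counter, which also includes driver reservations; this is only a rough home version of what the reviewers measured.

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    for _ in range(60):  # sample once per second for a minute
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"device: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB in use")
        # Per-process numbers separate real allocations from driver overhead.
        for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
            if p.usedGpuMemory is not None:  # None where the driver can't report it
                print(f"  pid {p.pid}: {p.usedGpuMemory / 2**30:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```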

    Now that we have three different AMD cards all boasting 16 GB, there will be more games that do this in the future. I would bet money that AMD-sponsored titles in particular will not be shy about using more VRAM in the very near future. The question of "how much VRAM is needed now?" is very much a chicken-and-egg situation. Developers will always try to build games that work with currently available hardware; it is extremely rare for any video game to feature settings that are impossible to run on hardware available at the time of release. The baseline has now moved. We now have a 16 GB card in the mid-range, and yes, $550 is mid-range, folks.

    Benchmarks are going to be important. AMD barely even mentioned ray tracing in this presentation; that is a red flag that their ray tracing is not so great. They also have no DLSS-like feature at launch, only a promise that they will have one in the future... but promises are not exactly good reasons to buy hardware. Most of the benchmark numbers they showed were from AMD-sponsored games, which should of course grant them an edge. Their 6900XT numbers were very sketchy, as they overclocked that GPU using their "Rage Mode" and SAM features to achieve numbers higher than what would normally be possible. Those auto-overclock numbers are only possible if you buy an entire new PC with all compatible parts: the 6900XT, a Ryzen 9 5900X, and an AMD 500-series motherboard. If you lack any of these parts, the 6900XT doesn't hit those numbers. But at the very least they are competitive, which is as important as winning outright.

    Another thing: a lot of people thought AMD would be using way less power to do this. The 6800XT at 300 watts is not way less than 320. It is less, but not drastically, and I can't imagine anybody choosing a GPU based on it using 20 watts less than another, LOL. The 6900XT might be a bit different, since the 3090 is so power hungry, but I have a feeling the overclocks on the 6900XT will push it well above 300, so not a huge win there, either.

    The 6800 is the pretty interesting one, as it tackles the 3070 and appears to beat it easily, with double the VRAM. However, it is not such an easy victory, since it costs $80 more and, oh by the way, uses a good 30 watts more, too... which I find funny considering how many people are suddenly talking about power draw these days. Are we really going to split hairs over the 3080 using 20 watts more than the 6800XT when the 6800 uses 30 more than the 3070?

    Something is wonky with Hardware Unboxed's test, and many people have pointed it out, as you can see from that Reddit thread. I just checked it myself: my 2070 runs Doom within variance of my 1080 Ti. If ultra nightmare at 4K really needed more than 8 GB, I should be seeing a massive FPS hit, but I'm not. The cards are comparable in pretty much every game, and Doom is no exception no matter what setting I use. That the rest of the tech review world did not jump all over this says they think HU was full of it.

    Digital Foundry also observed this issue in their video. So something is wonky with 2 independent reviewers then! I'm sorry, but I would trust HU before 99% of "the rest of the tech world"...or you.

    At any rate, I'll agree to disagree.

    But remember how I also said this was a chicken-and-egg situation? And that once 16 GB cards enter the more mainstream market, we will start to see games using more VRAM? Especially from AMD-sponsored titles? Remember that? Well, just one week after the AMD reveal, we have a game saying it will REQUIRE 12 GB of VRAM for its highest settings. Oh snap! Not only does this step over the 3080, but even the 2080 Ti.

    https://videocardz.com/newz/godfall-requires-12gb-of-vram-for-ultrahd-textures-at-4k-resolution

    Of course, this title also happens to be AMD-sponsored. AMD is going to leverage this advantage hard and actively encourage game devs to take advantage of this VRAM. But even without the marketing, this is something that needs to happen: I've said before that VRAM is actively restricting video game design. Now that we have 16 GB cards entering the market at multiple price points, the time has come. While tech that pulls data from SSD, like the PS5's, will eventually be a thing, it is still much easier to design a game around VRAM since it is right there on the GPU. Besides, most people will not have the PC hardware to do this for some time. The fact that AMD is even releasing 16 GB cards is a sign that AMD itself understands this.

    If that is not enough, the lesser AMD cards are rumored to have some large VRAM capacities as well. Rumors say the 6700XT could have 12 GB itself; a possibly $400 card would have more VRAM than the 3080.

    As for the console advantage for AMD, it will be a factor this time. Sure, providing the hardware for the previous generation didn't help AMD... that is because AMD sucked so bad back then. AMD had nothing to compete with Nvidia, but the bigger issue was adoption: Nvidia had total command of the GPU market. Game developers are always going to target the larger market, and since AMD could never gain any traction, why would devs support them? That changes now. AMD is poised to take at least some market share from Nvidia, almost by default, as long as they can simply supply stock. With AMD cards becoming more popular, it is only logical that devs will start to specifically target them more. AMD now has some of the fastest hardware, and they are involved with consoles. They have a serious advantage, thanks to Nvidia's mistakes.

    Poser's press release for Poser 12 has some sick burns aimed directly at Iray, LOL. Poser made, I think, the best choice they possibly could by teaming up with Blender's Cycles, since it can support different hardware. Now maybe we will see a Poser comeback; all of this can be tied together. If customers become frustrated with Nvidia, they may move to different software, software like Poser and Blender's Cycles, which can use AMD hardware. If that happens, those customers may start buying models elsewhere, too. If I were Daz, I'd be at least a little bit nervous about this. The Daz to Blender "export" is just a script, and it is limited in what it can do. Poser actually has SuperFly built right in, no messy and time-consuming export process needed. This is something Daz has a genuine problem with; a lot of people are not fond of needing to export everything.

  • As PCGamer pointed out with regard to RE3, until it's actually tested, what a publisher says is required is irrelevant.

    Did you read the Reddit thread you linked about the HU test? I wasn't alone in pointing out that the test wasn't showing the results they claimed.

  • marble Posts: 7,500
    outrider42 said:

    Poser made, I think, the best choice they possibly could by teaming up with Blender's Cycles, since it can support different hardware.

    Reading through the feature list for Poser 12, it looks to me like they have only implemented the CUDA rendering capabilities, which require an Nvidia GPU.

    GPU assisted rendering is supported via Nvidia CUDA for faster renders, users will be able to harness the power of their Nvidia GPU(s) when using the SuperFly renderer.

    System requirements for Poser installation are as follows:

    • CUDA enabled device required for hardware accelerated final rendering (2 GB VRAM minimum, Compute Capability 2.0 minimum)
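    If you want to check an installed card against those stated minimums, here is a quick sketch, again assuming the nvidia-ml-py (pynvml) bindings; only the Compute Capability 2.0 and 2 GB thresholds are taken from the list above.

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):          # older bindings return bytes
        name = name.decode()
    major, minor = pynvml.nvmlDeviceGetCudaComputeCapability(handle)
    vram_gb = pynvml.nvmlDeviceGetMemoryInfo(handle).total / 2**30
    ok = (major, minor) >= (2, 0) and vram_gb >= 2
    print(f"{name}: CC {major}.{minor}, {vram_gb:.0f} GB VRAM ->",
          "meets minimum" if ok else "below minimum")
pynvml.nvmlShutdown()
```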

  • NylonGirl Posts: 2,209

    "Another bonus is that Cycles doesn't restrict users to one type of GPU. Cycles supports GPU rendering with three GPU rendering modes: CUDA, which is the preferred method for older Nvidia graphics cards; OptiX, which utilizes the hardware ray-tracing capabilities of Nvidia's Turing architecture & Ampere architecture; and OpenCL, which supports rendering on AMD graphics cards. This means multiple GPUs are also supported, unlike Iray that only functions with Nvidia hardware."

    https://www.posersoftware.com/article/488/poser-12-update-how-the-new-superfly-improves-art-render-time
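    For reference, switching between those Cycles backends is scriptable. A minimal sketch using bpy, Blender's bundled Python module, per the Blender 2.9x API (run it inside Blender; "OPENCL" is the AMD path in 2.9x):

```python
import bpy  # Blender's bundled Python module; run inside Blender 2.9x

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"  # or "CUDA"; "OPENCL" is the AMD path in 2.9x
prefs.get_devices()                  # refresh the detected-device list

for dev in prefs.devices:
    dev.use = True                   # enable every device Cycles detected
    print(dev.name, dev.type, "enabled")

bpy.context.scene.cycles.device = "GPU"
```

    The same selection works headless, e.g. blender -b scene.blend -E CYCLES -f 1 -- --cycles-device OPTIX, where everything after the bare -- is passed to Cycles rather than to Blender itself.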

  • Asari Posts: 703
    edited November 2020
    I have this problem too. I moved away from Iray precisely to become more hardware-independent from Nvidia, and the fact that you have to spend $1500+ to get a card with more than 10GB is kind of a bad deal for me. However, I only gained the choice between CPU rendering and Nvidia rendering, which turned my upgrade game into Threadripper vs. 3090... because if the render engine is OptiX-based, there is no way AMD cards will work. The Threadripper blasts away all benchmarks and eats even the 3090 with ease, but at this price point...

    Somehow I feel the upgrade game has not become easier for people who do GPU rendering at a hobbyist/student level. It would be interesting to see how Octane and Cycles fare with AMD GPUs. I've read people claim that OpenCL is slower than OptiX, so it would be interesting to see how it fares in Blender.

    Post edited by Asari on
  • nonesuch00 Posts: 18,729
    outrider42 said:

    Poser made, I think, the best choice they possibly could by teaming up with Blender's Cycles, since it can support different hardware. Poser actually has SuperFly built right in, no messy and time-consuming export process needed.

    I have an AMD Radeon RX 570 8GB, and while it's a good GPU, Blender 2.9's Cycles support for it is a bit patchy and buggy. That won't stop me from buying one of those Big Navi cards next year, after I buy an obligatory 3060 or 3070, because I do expect AMD to make big inroads catching up to Nvidia in software over the next couple of years as well. At least that's my hope, because without it, AMD exceeding Nvidia in hardware quality and speed is pretty useless.

  • tj_1ca9500b Posts: 2,057

    For those of you that may want to try to snag a Ryzen 5000 series CPU today...

    https://wccftech.com/amd-ryzen-9-5950x-ryzen-9-5900x-ryzen-7-5800x-ryzen-5-5600x-zen-3-ryzen-5000-desktop-cpus-available-today-heres-where-to-buy-them/

    The scalper bots will probably still win, but best of luck!

  • billyben_0077a25354 Posts: 771
    edited November 2020

    Yes, there appears to be hope for a decent rendering card.

    Post edited by billyben_0077a25354 on
  • For those of you that may want to try to snag a Ryzen 5000 series CPU today...

    https://wccftech.com/amd-ryzen-9-5950x-ryzen-9-5900x-ryzen-7-5800x-ryzen-5-5600x-zen-3-ryzen-5000-desktop-cpus-available-today-heres-where-to-buy-them/

    The scalper bots will probably still win, but best of luck!

    I have a friend who works in sales at a Microcenter, and they do work on commission, so take this advice with that knowledge.

    Microcenter is only selling to actual persons in their actual stores. If you live near one and you want one of these CPUs, you might want to consider that as an option.

  • nonesuch00 Posts: 18,729

    I have a friend who works in sales at a Microcenter, and they do work on commission, so take this advice with that knowledge.

    Microcenter is only selling to actual persons in their actual stores. If you live near one and you want one of these CPUs, you might want to consider that as an option.

    I checked the one nearest me: the 3070s are all sold out, but they do have 3080s. It's probably like that at all their stores. I guess people really are buying for need and not just to have the crème de la crème top card. If you search on Amazon, the prices are silly price gouging, and this time the usual justification for the markups (folks building lots of cryptocurrency rigs) isn't flying. Even so, Microcenter and every established chain or online business I've seen has markups between $30 and $60 over the MSRP of Nvidia's models.

  • nonesuch00 Posts: 18,729

    My motherboard's manufacturer (Gigabyte) told me in a support ticket that the B450 board I bought, originally designed way back for the Ryzen 2000 CPU generation, will get a BIOS upgrade to support the Ryzen 9 5950X. Pretty sweet, even though the earliest I am likely to buy is autumn 2021!

  • I got a 5600X, plus all the bits and pieces for a new system for the SO, at Microcenter. There was no significant rush on them as far as I could tell, but they were doing more business than usual, so who knows how long they'll stay in stock.

    Everything went together smoothly, and I am now waiting through the Windows update process.

    According to what I've seen, the 400-series mobos won't get BIOS updates to support the new CPUs till next year.

  • tj_1ca9500b Posts: 2,057

    I got a 5600X, plus all the bits and pieces for a new system for the SO, at Microcenter.

    Congrats on the new CPU!

  • Congrats on the new CPU!

    It's not mine, it's the wife's. It's to get her off my system and off the laptop she's been using since her desktop died back in August. I'm staying on my R7 2700 for a while.

    It is up and running now, and some quick benchmarks before she kicked me off show it to be at least 50% faster than my 2700 in games and at DS CPU renders (even with 2 fewer cores).

    She seems happy which means I'm happy.

  • She seems happy which means I'm happy.

    Ah, you've discovered the secret to true happiness. Happy wife, happy life.

    I am thinking about adding another 32GB of memory and maybe upgrading my Ryzen 3700X to a 5000-series CPU after I upgrade my GPU. It will be a 3080 Ti if they have 20 GB of memory, or I'll go the full Monty and get a 3090 if the 3080 Ti is released with a subpar quantity of memory for rendering.

    Now let's hope the current 3080 Ti rumors are true (3090 chips that failed binning for 3090 speeds, paired with 20 GB of memory).

  • Now let's hope the current 3080 Ti rumors are true (3090 chips that failed binning for 3090 speeds, paired with 20 GB of memory).

    The problem with that rumor is that the 3080 and 3090 are the same chip, so 3090s that fail QA become 3080s. If yields were high enough, I'm sure they could split the yields up more, but right now, with no 3080s on the shelves, I really cannot see Nvidia splitting the product stack up. Next spring, or whenever they get the yields up, maybe; but you guys counting on some imminent release are, I think, going to be disappointed.

  • nicstt Posts: 11,715

    I got a 5600X, plus all the bits and pieces for a new system for the SO, at Microcenter.

    Ha, nice.

    I think it's finally time to upgrade my 1950X Threadripper; it is now outclassed. But it has been a great CPU, and still is in all fairness; there is just better out there.

    Not sure whether to go Ryzen 9 or wait for the next Threadripper. I'm also expecting to go with an AMD GPU if the performance lift from pairing them plays out. I render in Cycles, so I don't care about CUDA, presuming I'm not taking a performance hit in Cycles by using AMD.

  • tj_1ca9500b Posts: 2,057

    Looks like EPYC Milan is starting to make appearances in the benchmark databases:

    https://wccftech.com/amd-3rd-gen-epyc-7713-milan-zen-3-cpu-spotted-64-cores-spotted-benchmarked/

  • Looks like EPYC Milan is starting to make appearances in the benchmark databases:

    https://wccftech.com/amd-3rd-gen-epyc-7713-milan-zen-3-cpu-spotted-64-cores-spotted-benchmarked/

    They've been in some AWS data centers for at least a month. 

  • The problem with that rumor is that the 3080 and 3090 are the same chip, so 3090s that fail QA become 3080s. Right now, with no 3080s on the shelves, I really cannot see Nvidia splitting the product stack up.

    I am hoping that the reason we are not seeing regular 3080s right now is that Nvidia is saving the GPUs they have for a 3080 Ti release before the end of the year. I think Nvidia is realizing they may have goofed with the 3080 only having 10 GB of memory, the perception being that a 3080 with only 10 GB will not have long enough legs to future-proof buyers, whereas the 16 GB on the RX 6000 series cards looks like a better option. It doesn't matter if that's true or not, and we also don't know what game memory requirements will be in Q3 and Q4 of next year. With 16 GB of memory in the Xbox and PlayStation and on the RX 6000 series cards, game developers will figure out a way to use it at some point, probably very soon, leaving the 10 GB cards out in the cold. Again, this is all just wishful thinking and rumors.

  • PerttiA Posts: 10,024

    I am hoping that the reason we are not seeing regular 3080s right now is that Nvidia is saving the GPUs they have for a 3080 Ti release before the end of the year. I think Nvidia is realizing they may have goofed with the 3080 only having 10 GB of memory.

    It was a weird decision in the first place to have less VRAM than the previous generation; it would have been enough if they had even gone up by just 1 GB.

  • With 16 GB of memory in the Xbox and PlayStation and on the RX 6000 series cards, game developers will figure out a way to use it at some point, probably very soon, leaving the 10 GB cards out in the cold.

    That's post hoc stuff. They'd annoy the heck out of their own consumers, plus they're losing sales every day by not having units on the shelf. This fantasy, held by pretty much only DS users, that console devs will suddenly start producing such games is just absurd.

    Console game devs are mostly the big publishers, and they produce games cheaply; they don't spend money they don't have to. To add textures they'd have to jump their prices, which they won't do ($60 US is the sweet spot), or ship them as a separate paid download, which is very unlikely to get past the EU regulators.

    Beyond that is the download size. Textures do not compress well. Going from 4 GB of textures to 10 means a much bigger download. For someone in an urban/suburban area with a decent high-speed connection that's not a problem, but for anyone else...

    You might see some games push up over 6 GB, but there simply isn't going to be some huge rush to 12, and it will not force Nvidia to release cheap cards with lots of VRAM. Even if 4K gaming does start calling for more than 8 GB (I'll bet serious money it won't), 1080p definitely won't, and most gamers are not leaving 1080p. Nvidia is not stupid, and they mostly sell cards to 1080p gamers.

  • RayDAnt Posts: 1,154

    With 16 GB of memory in the Xbox and PlayStation and on the RX 6000 series cards, game developers will figure out a way to use it at some point, probably very soon, leaving the 10 GB cards out in the cold.

    FYI, the idea that both the PlayStation 5 and the Xbox Series X have 16GB of video memory is a PR marketing ploy. They only actually have 10GB of VRAM wired in such a way as to be usable for live graphics rendering; the other 6GB sits behind a roughly half-throughput bus interface and is only really usable for the OS and non-graphics functions (like audio processing).
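    For what it's worth, the widely reported Xbox Series X figures behind that split fall straight out of the bus widths. A back-of-envelope sketch using Microsoft's published numbers (14 Gbps GDDR6; a 320-bit bus for the 10 GB pool versus an effective 192-bit slice for the 6 GB pool):

```python
gbps_per_pin = 14        # GDDR6 speed shared by both pools
fast_bus_bits = 320      # the 10 GB "GPU-optimal" pool sees the full bus
slow_bus_bits = 192      # the 6 GB pool is effectively a narrower slice

fast = gbps_per_pin * fast_bus_bits / 8  # -> 560.0 GB/s
slow = gbps_per_pin * slow_bus_bits / 8  # -> 336.0 GB/s
print(f"10 GB pool: {fast:.0f} GB/s, 6 GB pool: {slow:.0f} GB/s")
```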

    PerttiA said:

    It was a weird decision in the first place to have less VRAM than the previous generation; it would have been enough if they had even gone up by just 1 GB.

    It isn't a weird decision at all if you understand the full ramifications of high-speed direct SSD access on graphics processing. That's why you see everyone (except AMD on the desktop end, undoubtedly because bigger number = better) targeting the exact same number (10GB) in hardware designs meant to take advantage of that tech in the near future.

  • nonesuch00 Posts: 18,729
    RayDAnt said:


    It isn't a weird decision at all if you understand the full ramifications of high-speed direct SSD access on graphics processing. That's why you see everyone (except AMD on the desktop end, undoubtedly because bigger number = better) targeting the exact same number (10GB) in hardware designs meant to take advantage of that tech in the near future.

    What's the near future? They should demo it, shouldn't they, with an nVidia GPU.

  • What's the near future? They should demo it, shouldn't they, with an nVidia GPU.

    The tech demos are coming. I'm not as confident as some that this will be on the desktop in the next few years, but Nvidia is promising it to enterprise users in experimental form this generation and, hopefully, as a standard part of the Quadro lineup next generation.

    But until it is actually deployed, who knows? Personally, I think this all could be rendered moot: PCIe Gen 5 and 6 are on the horizon, along with DDR5, so the throughput of mainstream systems may increase to the point where it just doesn't matter any more. Where that will leave things like Iray, who really knows?
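    To illustrate why fast SSD streaming could reduce how much VRAM a game needs resident at once, here is a deliberately simplified sketch: a fixed VRAM budget treated as a least-recently-used cache, with misses filled from disk. load_from_ssd and upload_to_gpu are hypothetical stand-ins, not any real engine or Nvidia API.

```python
from collections import OrderedDict

class TextureCache:
    """Fixed VRAM budget managed as a least-recently-used (LRU) cache."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # texture id -> size in bytes

    def request(self, tex_id, size):
        if tex_id in self.resident:          # hit: mark as recently used
            self.resident.move_to_end(tex_id)
            return
        while self.used + size > self.budget and self.resident:
            victim, vsize = self.resident.popitem(last=False)  # evict oldest
            self.used -= vsize               # a real engine frees GPU memory here
        # data = load_from_ssd(tex_id)       # hypothetical fast NVMe read
        # upload_to_gpu(data)                # hypothetical GPU upload
        self.resident[tex_id] = size
        self.used += size

cache = TextureCache(budget_bytes=10 * 2**30)    # a 10 GB card
cache.request("rock_albedo_4k", 64 * 2**20)      # 64 MiB texture
cache.request("cliff_normal_4k", 64 * 2**20)
print(f"{cache.used / 2**20:.0f} MiB resident")  # -> 128 MiB
```

    The faster the miss path (the SSD read), the smaller the budget can be before eviction causes visible stutter, which is the whole argument for pairing modest VRAM with very fast storage.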

  • tj_1ca9500b Posts: 2,057
    edited November 2020

    So, Steve at GN noticed something about 2 sticks of RAM vs 4 sticks on Zen 3...

    Ryzen has 2 memory channels, of course ('little' Threadrippers have 4; 'big' Threadrippers and EPYCs have 8). If I'm remembering correctly, for overclocking purposes, going with 1 stick per channel was preferred up to this point, so this is a significant change. Short form: with Zen 3, it may be a better idea to fill all of your slots. Steve noticed a 2-8% increase using 4 sticks vs 2.

    I wish he had had a set of 2x16GB sticks to test against the 4x8GB sticks with the same timings, but Steve was pretty sure his systems were using less than 16 GB of memory in the tests he ran. Nonetheless, it's a significant increase, so worth noting.

    There's also another nuance: how many 'ranks' are on each stick, if I understood what Wendell was getting at (as Steve relayed it) correctly. Wendell may do a follow-up on that subject. (A quick sanity check on the theoretical numbers is sketched below.)
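    As a point of reference, the theoretical peak bandwidth doesn't change with stick count; sticks-per-channel and ranks only affect how efficiently the controller interleaves accesses, which is where a 2-8% gap can come from. A quick back-of-envelope calculation, assuming the dual-channel DDR4-3200 commonly paired with Ryzen 5000:

```python
transfers_per_sec = 3200e6  # DDR4-3200: 3200 mega-transfers per second
bytes_per_transfer = 8      # each channel is 64 bits wide
channels = 2                # desktop Ryzen; 'little' TR: 4, 'big' TR/EPYC: 8

peak = transfers_per_sec * bytes_per_transfer * channels
print(f"theoretical peak: {peak / 1e9:.1f} GB/s")  # -> 51.2 GB/s
```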

    Post edited by tj_1ca9500b on
  • The weird thing is that the memory controller is supposed to be unchanged from Zen 2, so I just don't get this. When I go to work Monday, I'll see what I have in the way of a matched set of 4 sticks of DDR4 to bring home and test this myself.

    Also, this will be somewhat awkward for the average user to take advantage of. 4x4GB kits are essentially unavailable right now (I could not find any), so if you just want 16GB of RAM, you'd be out of luck.

  • tj_1ca9500b Posts: 2,057
    edited November 2020

    The weird thing is that the memory controller is supposed to be unchanged from Zen 2, so I just don't get this.

    This may be the result of a number of improvements. The ones jumping out at me at the moment are that they increased the number of table walkers from 2 to 6, which reduces that bottleneck, and increased the store queue depth from 48 to 64. There are other improvements as well.

    https://www.anandtech.com/show/16214/amd-zen-3-ryzen-deep-dive-review-5950x-5900x-5800x-and-5700x-tested/4

    Speculating here, but AMD MAY have already built the IO die in such a way that the CPU can directly handle either slot in a channel, with the IO die just being the 'bridge'. The 'cross-CCX' thing in Zen 1 and 2 may have incurred some sort of penalty here before, which was resolved somehow when the CCX grew to 8 cores. I honestly don't know, but I look forward to Wendell's investigation into this subject.

    Post edited by tj_1ca9500b on