Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...


Comments

  • tj_1ca9500b Posts: 2,057

    Now officially released and officially sold out unless you're willing to waste $1,000 for a few weeks of bragging rights, the AMD Radeon RX 6800 has an MSRP of $579 and the AMD Radeon RX 6800 XT an MSRP of $649 at the AMD.com website.

    The scuttlebutt is that AMD's board partners may be saving a lot of their allocation of 6000 series Radeon dies for their own custom designs, which supposedly start launching about a week from now.  So I definitely wouldn't recommend paying scalper prices for the reference design.  The reference design is competent, but as Steve from Gamers Nexus pointed out in his teardown, there are some missed opportunities that the custom designs have addressed in previous-gen cards.

    So if you don't mind waiting a week or so, yeah there will be a second wave of these coming, and some may be better designs...

    Also, with the rumored Ti designs supposedly in the pipeline, it might behoove you to wait on those if you aren't in an immediate hurry to upgrade.  But if Radeon fits the bill for whatever reason, the 6000 series cards are very nice...

    And, if you do Linux, well Phoronix likes 'em!

    https://www.phoronix.com/scan.php?page=article&item=amd-rx6800-opencl&num=1

     

  • nonesuch00 Posts: 18,848
    edited November 2020

    tj_1ca9500b said:

    Now officially released and officially sold out unless you're willing to waste $1,000 for a few weeks of bragging rights…

    Nah, I read this thread to get the news & rumours without searching the web. I actually plan on buying both a GeForce RTX 3060 Ti 8GB and a Radeon RX 6600 16GB by June 2021. I'm not sure of the real name of the Radeon RX 6600 16GB though; its code name is 'Big Flounder' and it's the entry-level Big Navi card.

  • Oh, what a terrible time to build a barebones system specifically for an Iray render upgrade!!  I wish I had found this thread earlier.
    I had to settle on a GTX 1660 with only 6GB of VRAM.  I currently have a GTX 960 4GB, and it doesn't 'Iray.'

    I've looked everywhere, and the only place I'm finding 8GB+ Nvidia graphics cards is on eBay, selling for 2-3X their regular MSRP.

    Anyway, hopefully these cards will return at their regular price again and this hoarding/scalping graphics card BS will cease.  I never thought graphics cards would become the new toilet paper rush during this pandemic thingy.

  • tj_1ca9500b Posts: 2,057
    edited November 2020

    I was doing my internet rounds this morning, and I noticed that some miner dude who had camped out all night managed to snag one of two 'restock' Ryzen 9 5950X CPUs at his local Micro Center, along with an RX 6800 GPU.  That store only received two 6800 XTs; not sure how many 6800s they had (certainly not enough for the line of 70+ people).  My point is that 5950X CPUs are still trickling into the supply chain at least, so that's something...  With the next gen EPYC CPUs in the pipeline, I was a bit concerned that 5950Xs might not show up again for a while.  It may still be nearly impossible to get one unless you don't mind paying scalper prices ATM, but it's something...

    Myself, I'm still watching for a 'retail' 4750G, but that may not happen.  I may end up buying that unboxed one that ships from China via Newegg at some point, but I'm not in an immediate hurry ATM (early next year maybe).  With the '5000 series APU refresh' models coming out, well, it's a long shot, but I'm hoping that AMD finally relents and does a retail 8-core model of Renoir.  I'm pretty happy with my 2400G + 1080 Ti setup, but I'd love having 4 more cores, plus still having the integrated GPU for desktop duties, keeping my PCIe slots open for other things (read: Nvidia GPUs and other add-on cards).

    There's a Lenovo PC out there with a 4700G, but paying $700+ just to get the APU and then discarding everything else seems very impractical; plus I'd still need to grab a B550/X570 board for it, as the Lenovo board would probably be pretty useless for a multi-GPU setup.  And if for some reason Lenovo decided to solder the CPU...

    I wish I had a Micro Center nearby, but alas, only Walmart and Best Buy... and Office Depot, Staples, etc., but yeah, they usually don't have 'bleeding edge' stuff.

  • 4750G is not going into the retail chain. The only way to get it is from some grey market source. All the 4000 series desktop APUs are OEM only; they are being sold in trays of 1,000. Why do you want a bad iGPU if you already have a 1080 Ti?

  • nonesuch00 Posts: 18,848

    tj_1ca9500b said:

    I was doing my internet rounds this morning… My point is that 5950X CPUs are still trickling into the supply chain at least, so that's something…

    I'd totally go for an 8-core or 16-core Zen 3 APU with Big Navi graphics, but I reckon those, or something with fewer cores, will come out Autumn 2021.

  • tj_1ca9500b Posts: 2,057
    edited November 2020

    kenshaw011267 said:

    4750G is not going into the retail chain. The only way to get it is from some grey market source. All the 4000 series desktop APUs are OEM only; they are being sold in trays of 1,000. Why do you want a bad iGPU if you already have a 1080 Ti?

    I know already that you don't understand the performance benefit of running your desktop on the iGPU while you use the Nvidia GPU ONLY for Iray rendering, but there is one.  Mainly, the dGPU doesn't need to allocate resources to the desktop, which avoids the resulting lag if you want to do other stuff like light gaming or working in a separate Daz instance in non-Iray viewport mode while a render is baking.

    And 'freeing' all of my PCIe slots for other cards, i.e. not allocating a PCIe slot to a 'desktop' GPU, is helpful to me.  In my current build there is only a single PCIe x16 slot (mini-ITX), but in my next build, well, I will need as many PCIe slots as I can get (for a Ryzen build anyway) for other add-on cards, including a multi-GPU setup.

    Looking at my Firestorm GPU monitor for my 1080 Ti now: GPU clock 164 MHz, GPU memory clock 405 MHz, memory utilization 0%.  That's my 1080 Ti's 'idle' state when my computer is on and the iGPU is running the desktop, or even when working in Daz Studio in non-Iray render modes.  Note the 0% memory utilization.

    Also, the iGPU is very power efficient: 65W or less for both the CPU and integrated GPU, and maybe up to 95W if you push things.  The Renoir APUs in particular are very efficient (note the long battery lives in the notebook reviews).

    THAT is why.  I had significant lag issues with dual 1080s on my uber laptop, which used the 1080s to run the desktop while a render was baking, and of course it needed to allocate some of that precious 8 GB of VRAM for said desktop.

    But I get that you don't get it. 
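    For anyone who wants to sanity-check this on their own machine, here's a minimal sketch that reads the same numbers straight from the Nvidia driver via NVML instead of a vendor overlay utility (assuming the separately installed nvidia-ml-py Python package):

    # Minimal sketch (assumes `pip install nvidia-ml-py`): print the clocks
    # and VRAM usage the Nvidia driver itself reports for each installed GPU.
    from pynvml import (
        nvmlInit, nvmlShutdown, nvmlDeviceGetCount, nvmlDeviceGetHandleByIndex,
        nvmlDeviceGetName, nvmlDeviceGetMemoryInfo, nvmlDeviceGetClockInfo,
        NVML_CLOCK_GRAPHICS, NVML_CLOCK_MEM,
    )

    nvmlInit()
    try:
        for i in range(nvmlDeviceGetCount()):
            handle = nvmlDeviceGetHandleByIndex(i)
            mem = nvmlDeviceGetMemoryInfo(handle)  # .total/.used/.free, bytes
            core = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_GRAPHICS)  # MHz
            vram = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_MEM)       # MHz
            print(f"GPU {i} ({nvmlDeviceGetName(handle)}): core {core} MHz, "
                  f"mem {vram} MHz, {mem.used / 2**20:.0f} MiB of "
                  f"{mem.total / 2**20:.0f} MiB used")
    finally:
        nvmlShutdown()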

     

    tj_1ca9500b said:

    I know already that you don't understand the performance benefit of running your desktop on the iGPU while you use the Nvidia GPU ONLY for Iray rendering, but there is one…

    I know you are making claims you cannot support. I have already dealt with this many times. Why do you keep repeating debunked nonsense?

    The not-amazing thing is I can run my system on just a 2070 (I physically pulled the 1080 Ti to test) while rendering. I watched a full 4K movie: no lag and no reduction in render speed, because CUDA isn't used by movie playback. I did the exact same thing with the 1080 Ti with the 2070 pulled. Still no issue, for the same reason. Since the DS viewport doesn't use CUDA at all unless you use Iray preview, you can run a separate instance without issue on an 8GB card as well.

    I will explain to you again that it does not matter in the least what your crappy monitoring SW says: WDDM ALWAYS reserves VRAM on every commercial graphics card attached to the system.  I get that you don't like this, but reality is reality. You are welcome to look it up.

    Now you're talking about a laptop as a comparison point. Laptops suck. They always will compared to a desktop. Do not compare apples to oranges.

    But in the end those are OEM parts. Grey market is the only option to get them. AMD released them so guys like Dell and Lenovo could produce machines like the i3 and i5 boxes that are on corporate desks all over the world.
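    For what it's worth, the reservation is easy to see directly: ask CUDA (which is what Iray allocates through) how much VRAM is actually free versus the card's raw total. A rough sketch, assuming the pycuda package and an Nvidia driver are installed:

    # Rough sketch (assumes `pip install pycuda`). On an otherwise idle card,
    # the gap between total and free is memory already spoken for - the WDDM
    # reservation, display scanout, and any other running processes.
    import pycuda.autoinit             # creates a CUDA context on GPU 0
    import pycuda.driver as cuda

    free, total = cuda.mem_get_info()  # bytes, as seen from inside CUDA
    print(f"total VRAM:             {total / 2**30:.2f} GiB")
    print(f"free to this context:   {free / 2**30:.2f} GiB")
    print(f"already reserved/used:  {(total - free) / 2**30:.2f} GiB")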

     

  • tj_1ca9500b Posts: 2,057
    edited November 2020

    kenshaw011267 said:

    I know you are making claims you cannot support. I have already dealt with this many times. Why do you keep repeating debunked nonsense?…

    Until you actually try using the iGPU, or have a second discrete GPU installed just to run your desktop while a render is baking (not for rendering AT ALL), nothing that you say is relevant.  Whether the system 'caps' the amount of VRAM it'll allow one process to use isn't the same thing as the amount of VRAM the GPU is currently using.  I KNOW that in my setup, the only thing that VRAM actually gets used for is rendering.  Memory never hits 100% utilization, so that tells me there's a cap (note that I'm still running an older version of Windows 10 on my offline system, so I still 'lose' 19% of my 1080 Ti's VRAM to that cap), but except when a scene is being 'set up' for an Iray render run, I experience virtually no lag while a render is baking and I'm playing some game in windowed mode to pass the time, doing Photoshop work, or working in a second Daz instance (unless I try to use the Iray viewport).  Which wasn't my experience on my uber laptop (I saw lag regularly when a render was baking), which was otherwise awesome when it worked.

    And the lag I experienced was more 'mouse/app responsiveness' related, not render time related, just to make that clear.  It was enough to be noticeable to me, sometimes significantly so.

    BTW, I'm using Zotac's control panel utility to monitor my card usage.  As it gives me real-time memory usage, clock numbers, and temps, I tend to take it at its word.  The card idles a couple of degrees above ambient when memory is at 0% usage, holding that 164 MHz GPU clock, and is cool to the touch.  So I know the card is idle/not thinking about the desktop.  The fact that the desktop is running on Radeon graphics might help here; I'd be curious to know if the same behavior would result (it should) with two Nvidia cards (with the second, perhaps smaller, one set aside for desktop duty only).

    And the laptop I referenced was this one:

    https://us.msi.com/Laptop/GT83VR-TITAN-SLI-6th-Gen-GTX-1080-SLI.html

    For all intents and purposes, it's a desktop replacement, not a crappy laptop.  It cost over $4,000 when they were being made (due to 'because they could ask that much/virtually no competition'), with dual 1080s, an 18.4" screen (only FHD though), and a mechanical keyboard.  It routinely beat a single 1080 Ti in Iray benchmarks by about the expected amount (I shared my result in the benchmarking thread at the time).  But due to a design flaw it ended up pretty much bricking itself, and I went to an interim system (which I'm still using now).  I did recently get it working again, but haven't put it back into use yet as I don't trust it (multiple MSI laptop models several years ago had the same problem I experienced; it's well documented, even though MSI wasn't the best at admitting/identifying the problem on their forums, etc.).  I may start rendering with it again to see if the BIOS flash I did to 'reset' the issue, and the settings I picked, stave off the 'gradually increasing boot times' issue permanently, or if the problem persists, as a good portion of my renders will fit in the 8GB envelope (not all though; I like to render multiple characters in complex environments, and even with the 1080 Ti I hit the VRAM wall often enough).

    So, in short, the reason a number of us recommend using a second card (or iGPU) to run your desktop is to avoid a bit of lag when you want to do other stuff on the same computer while a render is baking on your other card.  It's as simple as that.  Your attempts to obfuscate this with the WDDM issue are irrelevant.

    ---------------------------------------------------------------------------------

    Anyways, that's not why I'm here.  More 3060 Ti leaks:

    https://videocardz.com/newz/nvidia-geforce-rtx-3060-ti-founders-edition-pictured

    Short form: 8 GB GDDR6 (non-X), $399, faster than a 2080 Super; that's the current rumor.  If those numbers hold, that card may be very attractive to people looking for a good entry card with a decent amount of VRAM to get started with.  Of course, it'll probably sell out almost immediately, but in a few months, when the supply chain catches up with demand (hopefully), it'll be a great 'budget' card for Iray rendering.

     

  • I see no point in continuing to argue a point that has long been disproved. Multiple people have tested this claim and you are simply wrong. You can find when I tested it. You can find when others tested this, and we all found the same thing. There is zero benefit to a dedicated monitor card.

  • tj_1ca9500b Posts: 2,057

    kenshaw011267 said:

    I see no point in continuing to argue a point that has long been disproved. Multiple people have tested this claim and you are simply wrong… There is zero benefit to a dedicated monitor card.

    Show me where multiple people actually disproved it.  I can show you where a number of us recommend offloading the desktop to a second GPU, and our experiences doing so, such as in this thread (note that I am not a participant in that thread):

    https://www.daz3d.com/forums/discussion/235136/second-graphic-card-for-daz-studio

    The OP experienced the lag issue I've been talking about, and the respondents noted having the same issue, and that adding a second GPU addressed it.  Can't be more clear than that.

    The latest Nvidia cards are faster, so the lag may be a lot less with the new cards; plus, I'm not sure how the whole 'two sets of cores' thing plays into that, and I don't have a 20xx series card, so other 20xx users (other than yourself) will need to share their experience.  We already know your position; I'm interested in hearing from others.

    But, by all means, please do link to a thread where others back up what you are saying.  I'm not seeing it in my Google search...

     

  • nonesuch00 Posts: 18,848
    edited November 2020

    For what it's worth, I want a 16-core/32-thread Zen 3 APU with Big Navi graphics, solely for the possibility of having modern ray tracing and DLSS/AI-capable (and the AMD equivalent) AMD and Nvidia GPUs in my computer at the same time with no card swapping, while keeping my nice muted microATX small tower case and motherboard. I want it more from the perspective of a curious programmer than a person looking for a powerful GPU to game with. I would probably need to upgrade my power supply though, as the current one is only a 450W unit.

    Also, I just replaced my 14-year-old Acer 24" 1920x1080 monitor that was still using the yellower 3500K color temperature setting and was so old it wasn't even an IPS panel. It didn't break, but I wanted matching dual 27" IPS 1920x1080 monitors. What a great upgrade that was. The color temperature is 5500K or maybe higher, the monitor is bright, and the extra 3" is very helpful for seeing my renders as they are. A lot of those old renders had details and artifacts that, for some very strange reason, did not show up on the old monitor even when viewed at full size. I was very, very surprised at that. I just figured the old 24" FHD monitor was fine, but it must not have been, even though it seemed to work.

    nonesuch00 said:

    For what it's worth, I want a 16-core/32-thread Zen 3 APU with Big Navi graphics… Also, I just replaced my 14-year-old Acer 24" monitor…

    That would be reasonable. However, all Zen APUs are Vega. A lot of people assume Zen 3/RDNA 2 APUs will be coming since RDNA 2 is in the consoles, but AMD has not said a thing about it, and a mid-tier gaming computer on a chip would seem to be a big deal. They may have to wait until demand for the console SoCs dies down, or there may be exclusivity deals with MS and Sony that preclude selling them for a while.

    TN panels had a host of issues that could have prevented the old monitor from showing the image clearly enough for you to see the problems you can see now. Or it could have just been age.

  • nicstt Posts: 11,715

    kenshaw011267 said:

    …There is zero benefit to a dedicated monitor card.

    Zero for you, then.

    I get a lot of benefit. If you can't see it, don't use it.

  • nonesuch00 Posts: 18,848

    kenshaw011267 said:

    That would be reasonable. However, all Zen APUs are Vega… TN panels had a host of issues that could have prevented the old monitor from showing the image clearly enough for you to see the problems you can see now. Or it could have just been age.

    Well, I can tell you I am not buying an APU that isn't Zen 3/RDNA 2, so if AMD wants to be clever and put out a CPU/GPU combination from 2 or 3 years ago, that's their prerogative, but I wouldn't be buying it.  However, I don't know enough about my B450 motherboard to say whether it could support the RDNA 2 part of such an APU. I might have to buy a new motherboard, but for such an APU it would be well worth it going forward. If not, I'll just keep using the Big Flounder card I'm buying next spring, so not a bad situation in either case. I do like the convenience an APU would bring in having both brands of GPUs in a small tower at the same time, though.

    Yeah, I'm still little short of flabbergasted at some of the noise and just bad morph and texture combinations (which I chose, but my monitor didn't show correctly) that absolutely did not and do not show up on that old monitor. It's like a very smart AI cleaned up the images before putting them on my old monitor, but I know that's not true, so it has to be the low, yellow color temperature hiding some of that noise, and the tighter pixel pitch of FHD on the smaller 24" monitor hiding the rest. I've heard LED lights don't really burn out but get dimmer and dimmer, such that you want to replace them every 5 years rather than the rated 20+ years. I suppose the LEDs in computer monitors are the same.

  • NylonGirl Posts: 2,294

    I like having the system use the CPU's integrated graphics. I think if the hardware is there, it might as well be used for something.

  • kenshaw011267 Posts: 3,805
    edited November 2020

    nonesuch00 said:

    Well, I can tell you I am not buying an APU that isn't Zen 3/RDNA 2… I do like the convenience an APU would bring in having both brands of GPUs in a small tower at the same time, though.

    AMD has been very cagey about their APUs. The widely held assumption when the Zen chiplet design came out with Zen 2 was that there would quickly be Vega and then Navi chiplets to put on the APU packages, but there haven't been. Every APU so far has been an 'old tech' monolithic die. Speculation is that the communication between chiplets is too slow for graphics processing, but they haven't said. So far they have kept APUs to the low end of their product stack. Even those Ryzen 4000 desktop SKUs which had better CPU halves still had really terrible GPU sides (at best the same as the 3200G, which was the low end of the Ryzen 3000 APUs: 8 Vega compute units).

    But those SoCs in the consoles are Zen 2/RDNA 2, so we know they can make monolithic dies with all that, but they also draw something like 150W (though does anyone really care, since that's the whole computer minus memory?). They may not be able to get enough to sell any, or they may make an announcement at CES 2021; Lisa Su is delivering the keynote at the virtual show, so who knows.

    I would never hold onto a monitor for more than about 5 years except under dire circumstances. They definitely get dimmer and the pixels get less distinct.

  • RayDAnt Posts: 1,162
    edited November 2020

    kenshaw011267 said:

    …There is zero benefit to a dedicated monitor card.

    Here is a snapshot of the very real (if not monumental) difference in GPU resource consumption that having a dedicated monitor card (in this case the iGPU of an 8700K) makes on an Nvidia GPU (in this case a Titan RTX).

    Here's what you see with a single 4K monitor physically connected to a dGPU DisplayPort output (the GPU-Z instance on the left shows the iGPU, the one on the right shows the dGPU):

    And here's what you see with exactly the same system state, but the single 4K monitor switched to being physically connected to the motherboard's (iGPU) video output:

    The most obvious difference is the 633MB reduction in dedicated VRAM usage seen when display duty is being handled separately. However, this is not a fixed value. Doing exactly the same test but with the display set to 1080p resulted in a VRAM usage reduction of just 237MB. Meaning that the impact on dedicated VRAM consumption from a GPU driving connected displays is directly linked to both the resolution and the number of displays being driven. If I had two 4K monitors on my system and performed the same test, there would likely be a 1.266GB reduction in dedicated VRAM usage with display duties offloaded. That would be far from an insignificant VRAM uplift - especially on something with less than the 24GB total of a Titan RTX or 3090.
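    If anyone wants to reproduce this without GPU-Z, the A/B test is easy to script: run the same query once with the monitor cabled to the dGPU and once with it cabled to the motherboard output, then compare. A rough sketch, assuming the nvidia-ml-py package:

    # Rough sketch (assumes `pip install nvidia-ml-py`): report the dGPU's
    # dedicated VRAM currently in use. Run once per cabling configuration;
    # the difference between runs is roughly the cost of driving the display.
    from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                        nvmlDeviceGetMemoryInfo)

    nvmlInit()
    try:
        used = nvmlDeviceGetMemoryInfo(nvmlDeviceGetHandleByIndex(0)).used
        print(f"dGPU dedicated VRAM in use: {used / 2**20:.0f} MiB")
    finally:
        nvmlShutdown()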

    [Attached screenshots: 4k monitor on dGPU.png (1467 x 1162), 4k monitor on iGPU.png (1490 x 1158)]
  • kenshaw011267 Posts: 3,805
    edited November 2020

    I have patiently and repeatedly explained how GPU-Z, and every other such monitor, does not accurately report anything useful about VRAM usage, and that you can easily enough find this yourself with a simple Google search. Why do you guys keep going back to this? You should definitely look this up, though, because [gosh] I'm sick of being called a liar. WDDM reserves VRAM on every commercial card attached, period. That usage is not accurately reported by any monitoring program going. That's why you can see 0 VRAM usage on a card not connected to a monitor when it definitely does not have its full VRAM available.

    So please just stop until you get educated on the actual facts!

  • Can the RX 6800 do Iray rendering in Daz Studio?

  • NylonGirl Posts: 2,294
    wdbnia223 said:

    Can the RX 6800 do Iray rendering in Daz Studio?

    No. It has to be Nvidia.
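    Iray is built on Nvidia's CUDA, so a quick way to check whether a system has any Iray-capable GPU at all is to ask CUDA to enumerate devices. A rough sketch, assuming the pycuda package; an AMD-only box (such as one with just an RX 6800) will report zero devices, and Iray falls back to the CPU:

    # Rough sketch (assumes `pip install pycuda`): list the CUDA devices
    # Iray could use. Zero devices means CPU-only rendering.
    import pycuda.driver as cuda

    try:
        cuda.init()
        count = cuda.Device.count()
    except Exception:  # no Nvidia driver at all (e.g. an AMD-only system)
        count = 0

    if count == 0:
        print("No CUDA devices found - Iray would fall back to the CPU.")
    for i in range(count):
        dev = cuda.Device(i)
        major, minor = dev.compute_capability()
        print(f"{dev.name()}: compute capability {major}.{minor}, "
              f"{dev.total_memory() / 2**30:.1f} GiB VRAM")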

  • RayDAnt Posts: 1,162
    edited November 2020

    kenshaw011267 said:

    WDDM reserves VRAM on every commercial card attached, period.

    Yes. And it utilizes an additional amount of VRAM beyond that minimum resource allocation for each display attached to that card, in order to deliver a video feed to it. After all, that is the primary function of the computational resources physically located on graphics cards.

     

    kenshaw011267 said:

    That usage is not accurately reported by any monitoring program going. That's why you can see 0 VRAM usage on a card not connected to a monitor…

    Where are you seeing examples of 0 VRAM usage on a card not connected to a monitor in the data I posted? Because the minimums for the headless 8700K and Titan RTX configurations found there are 227MB and 1.534GB respectively.

  • If you have a Titan RTX and an internal GPU on your 8700K, why would you not set the Titan to run in TCC driver mode so it has the full 24GB of VRAM available for rendering?  That would seem to solve the VRAM reservation issue, no?

  • RayDAnt Posts: 1,162
    edited November 2020


    Because the largest real-world VRAM consumption I have yet to see in my Daz/Iray usage has been 17GB (I said it when I first got it: a full 24GB framebuffer is wholly unnecessary for modern Iray workloads), and leaving it in WDDM mode keeps its resources fully available for other types of workloads (like OpenGL-enabled apps and, yes, gaming). Plus, the overall performance gain of TCC mode is negligible.

    On top of all that, the Titan RTX in my system is actually easier to keep cool than the 8700K (due to the latter not being delidded), so the more work I can farm off to the GPU the better. 

    The irony about the Titan RTX and its support for things like NVLink and TCC driver functionality is that it's the one card where those things are generally the least useful to have.
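    For reference, you can check (and, on supported cards under Windows, switch) the driver model from the command line with nvidia-smi. A rough sketch; note that the driver_model query fields are Windows-only, and switching requires an elevated prompt plus a reboot:

    # Rough sketch: query each GPU's current WDDM/TCC driver model via
    # nvidia-smi (these query fields exist on Windows only).
    import subprocess

    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,name,driver_model.current",
         "--format=csv"],
        capture_output=True, text=True,
    )
    print(result.stdout)

    # To move a non-display card to TCC (elevated prompt, then reboot):
    #   nvidia-smi -i <gpu index> -dm TCC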

  • The world has gone completely mad. I've got my 2060 Super on eBay; it's over £400 with 17 hours still to go. I decided to list it now because I know prices are good, but I wasn't expecting that.
  • nicstt Posts: 11,715

    haha, damn

  • Gr00vus Posts: 372

    Looks like my chances of getting a 3090 anywhere near retail went down a fraction.

    Maybe something will shake loose in 2022. Until then I'm stuck with my coil-whiny, underpowered 980 Ti.

  • Torquinox Posts: 4,560

    My understanding is, if you want a 3090 right now, you should buy a system that has one installed. The systems I've seen with 3090s are monsters - pricey, but definitely loaded for bear. YMMV
