Atrocious performance in Iray on my new RTX 2080 Ti's

So I recently built a secondary PC containing two RTX 2080 Ti's. I wanted to use it for Iray, since I'd been reading good things about people's benchmarks and render times, and it seemed like Nvidia had addressed a lot of the issues the 2080 Ti's were having with some recent drivers. I knew it was a bit of a gamble to get these, but I figured it would pay off.

So far, no good. My old system is running dual 1080's. Rendering the same scenes with the same settings, both in the 4.11 public build, the following happens: at the start, the RTX's take the lead as you would expect. They're a bit slower to kick in, but when they do, they rapidly overtake the 1080's on iterations. Then, several minutes in, there is an inevitable slowdown. Sometimes this is accompanied by a black screen, other times it isn't. If there is a black screen, I eventually get my display back and it appears as though nothing has crashed and Iray is still rendering. This is only half true: Iray does still render, though at a crawl, but canceling the render immediately shuts down Daz Studio.

The 1080's, meanwhile, are happily chugging along. While the RTX's outpace them at first, the 1080's catch up and steam right past. Basically, I've not rendered anything to completion with my RTX's yet, because they never make it through the whole process.

What is going on here? Is it the Turing architecture simply not playing nice with the 4.11 public build? Then why do others get such stellar results? Is it faulty hardware, or problematic drivers? Is it an insufficient PSU? I have an 850W Corsair that by all accounts should suffice, but it could be that a 1000W unit would've been better suited. I guess that is easily tested by disabling one of the cards and seeing what happens. Meanwhile, I could really use some insight from others. I'm worried that I might've gotten two RTX's from a sick batch, since what other explanation can there be for 1080's outperforming them?
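If power does turn out to be the suspect, something like this little Python loop is how I'd watch the actual draw during a render instead of guessing (just a rough sketch; it assumes Python is installed and nvidia-smi is on the PATH, and the query fields are standard nvidia-smi ones):

    # Rough sketch: poll per-GPU power draw once a second during a render.
    # Assumes nvidia-smi is on the PATH; stop with Ctrl+C.
    import subprocess
    import time

    while True:
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=index,name,power.draw,power.limit",
             "--format=csv,noheader"],
            capture_output=True, text=True,
        ).stdout.strip()
        print(time.strftime("%H:%M:%S"), "|", out.replace("\n", " || "))
        time.sleep(1)

If the cards are reporting draw well under their limits when the slowdown hits, that would point away from the PSU.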

Comments

  • How much RAM is in the rig with the dual 2080 Ti's? That behavior certainly sounds like the system doesn't have enough RAM to support those cards.

  • The Dude Posts: 6
    edited December 2018

    16 GB of 3000 MHz DDR4. You think that could be it? Honestly, I hadn't even considered RAM, since I always assumed it was GPU RAM that mattered. For what it's worth, I took out one of the RTX's and am now (so far) successfully rendering my scene.

    Edit: nvm, blackscreened and Daz Studio shut down again. After the black screen, I noticed Iray had reinitialised itself: between all the iteration numbers was the "VERBOSE" message and a mention of the CPU, the same one you get at the start of a render.

    Edit 2: Swapped out the cards. After originally trimming my configuration down to the card in the first PCIe slot, I've now put the other card in that slot. This one won't even let me render in Photoreal mode, even in the 4.11 public build; I just straight up get a black picture after an instant render completion. So I have one card that gets me to a 65% complete render at around 1900 iterations on Photoreal before crashing, and a card that doesn't let me render to begin with. Two faulty cards? RAM throttling still? Surely it's not a PSU issue; 850W is more than enough to support a single card.

  • My 1080 Ti uses well over 16GB of system RAM during a render that uses most of the card's 11GB, so I do think that is why your renders are failing. I have no idea why they're failing instantly with one card and not the other.

  • The Dude Posts: 6
    edited December 2018

    My 1080 Ti uses well over 16GB of system RAM during a render that uses most of the card's 11GB, so I do think that is why your renders are failing. I have no idea why they're failing instantly with one card and not the other.

    Alright, so I just put both cards back in and ran another render with HWMonitor open. During rendering, my system memory consumption spikes to 86% and stays there (actually, it peaks around 88%). GPU memory usage is about 40% on either card. I'm also noticing jerkiness in mouse movement, with jitters and hangups, something that had escaped me earlier because I was mostly watching the Iray iteration report passively while working on my other PC. Seems to corroborate what you're saying?
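
    For anyone wanting to log this over time instead of eyeballing HWMonitor, a quick script along these lines would capture the same numbers (a rough sketch only; it assumes the psutil package is installed and nvidia-smi is on the PATH):

        # Rough sketch: log system RAM and per-GPU memory use once a second.
        # Assumes psutil is installed and nvidia-smi is on the PATH.
        import subprocess
        import time

        import psutil

        while True:
            ram = psutil.virtual_memory()  # system-wide RAM stats
            gpus = subprocess.run(
                ["nvidia-smi",
                 "--query-gpu=index,memory.used,memory.total",
                 "--format=csv,noheader"],
                capture_output=True, text=True,
            ).stdout.strip().replace("\n", " || ")
            print(f"{time.strftime('%H:%M:%S')} "
                  f"RAM {ram.percent:.0f}% ({ram.used / 2**30:.1f} GB) | {gpus}")
            time.sleep(1)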

    Edit: the system just hard froze and bluescreened. I think this is starting to paint a pretty clear picture :|

  • Get another 16GB, exactly matched to what's in there now, and I strongly suspect that will solve your problem.

  • mx90209 Posts: 69
    The Dude, did you get more RAM and did that solve your issues?
  • The RAM of your system should be at least double the RAM of your GPU. E.g. if you use an 8GB GPU, then 16GB of system RAM is the absolute minimum.

    If you got the 8GB 2080's, I'd suggest you spend some lunch money on 24GB of fast RAM, and your issues should be resolved.
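
    As a quick way to see where a system stands against that rule of thumb, a sketch like this works (it assumes the psutil package is installed and nvidia-smi is on the PATH; the 2x factor is only this rule of thumb, not an official requirement):

        # Sketch: compare installed system RAM to 2x the largest GPU's VRAM.
        # The 2x factor is just the rule of thumb above, not an official number.
        import subprocess

        import psutil

        ram_gb = psutil.virtual_memory().total / 2**30
        vram_gb = max(
            int(mib) / 1024  # nvidia-smi reports MiB with "nounits"
            for mib in subprocess.run(
                ["nvidia-smi", "--query-gpu=memory.total",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True,
            ).stdout.split()
        )
        print(f"System RAM: {ram_gb:.1f} GB, largest GPU VRAM: {vram_gb:.1f} GB")
        print("OK" if ram_gb >= 2 * vram_gb else "Below the 2x rule of thumb")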

  • ebergerly Posts: 3,255

    The RAM of your system should be at least double the RAM of your GPU. E.g. if you use an 8GB GPU, then 16GB of system RAM is the absolute minimum.

    If you got the 8GB 2080's, I'd suggest you spend some lunch money on 24GB of fast RAM, and your issues should be resolved.

    Very true. As an example, I just did a scene where my system RAM usage started around 4GB, and after I loaded the scene into Studio it went up to about 7GB. When I hit Render, Studio/Iray loads all the data (textures, object info, etc.) into system RAM; at that point my system RAM usage was just under 17GB. Then Studio/Iray converts and condenses all of that data into Iray/CUDA instructions and sends it to the GPU, which filled 5.6GB of my card's 11GB of memory.

    So 5.6GB of GPU VRAM and 17GB of system RAM means the system RAM is about 3 times the GPU VRAM. Of course that 17GB includes 4GB of non-Studio applications, but it's probably a reasonable reflection of what others might see. So yeah, I generally assume you need system RAM equal to roughly 3x the GPU VRAM your render uses.
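
    Plugging in those numbers, for anyone who wants the arithmetic spelled out (these are just my figures from that one scene, not universal constants):

        # Sanity-check the ratio from the numbers above (one scene, my system).
        baseline = 4.0      # GB of RAM in use before loading the scene
        after_load = 7.0    # GB after loading the scene into Studio
        at_render = 17.0    # GB once Studio/Iray has staged the render data
        vram_used = 5.6     # GB filled on the card's 11 GB

        print(f"Scene data held by Studio: {after_load - baseline:.1f} GB")
        print(f"RAM-to-VRAM ratio at render time: {at_render / vram_used:.1f}x")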
