Iray and high monitor resolutions

So, I will probably start shopping for parts for a new build in a week or two. I'll be replacing a 27" iMac, which has a Retina display (either 4K or 5K, resolution-wise). The plan is to build this new machine for rendering, but I will definitely miss the big monitor and its high resolution, so a new display is definitely something I'll be purchasing as well. Back in the day, when I actually kept up with this type of stuff, resolution was a big factor in memory requirements. Is that still the case? Do folks usually buy machines with one card to drive their displays and another card (or cards) to handle rendering? For gaming, it makes sense that you can increase frame rates by reducing the display resolution in the game's settings, but rendering doesn't work that way. So, does it even matter?

I remember when I was first trying to learn CUDA programming, my piddly desktop at work didn't really have enough memory to do a whole lot. That was possibly 10 years or more ago, on a moderately powerful desktop with a not-very-powerful video card.
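Out of curiosity, I looked at how you'd probe that today: with the CUDA toolkit installed, asking the driver how much VRAM is actually free is only a few lines. This is just a minimal sketch of my own (nothing from Daz or Iray), but it's a handy sanity check before loading a big scene:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeBytes = 0, totalBytes = 0;

    // Ask the driver for free/total memory on the current device.
    cudaError_t err = cudaMemGetInfo(&freeBytes, &totalBytes);
    if (err != cudaSuccess) {
        std::fprintf(stderr, "cudaMemGetInfo: %s\n", cudaGetErrorString(err));
        return 1;
    }

    // "Free" already excludes whatever the desktop and other apps have
    // claimed, which is the number a renderer actually gets to work with.
    std::printf("VRAM free: %.2f GB of %.2f GB total\n",
                freeBytes / 1e9, totalBytes / 1e9);
    return 0;
}
```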

Comments

  • ArgleSW Posts: 153

    Rendering does not care what resolution your monitor is. It all depends on the render output dimensions you choose. If you render an image at 4K, it will take the same render time whether your monitor is 1080p or 4K.

    I highly recommend you stick to one video card. Whatever your budget is for a GPU, get a single high-end card instead of two mid-range ones.
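    On the resolution point: if you want to convince yourself, here's a toy timing sketch (a stand-in shading kernel, not Iray itself, and it assumes the CUDA toolkit). The 4K pass does about four times the pixels of the 1080p pass, and the attached monitor never enters into it:

    ```cuda
    #include <cstdio>
    #include <cuda_runtime.h>

    // Toy "shader": one thread per output pixel, so total work scales
    // with the render's width x height, not the attached display.
    __global__ void shade(float* img, int w, int h) {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= w || y >= h) return;
        float v = 0.0f;
        for (int i = 0; i < 256; ++i)  // stand-in for real shading work
            v += __sinf(v + x * 0.001f + y * 0.002f + (float)i);
        img[y * w + x] = v;
    }

    static float timeRenderMs(int w, int h) {
        float* img = nullptr;
        cudaMalloc(&img, sizeof(float) * w * h);
        dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0);
        cudaEventCreate(&t1);
        cudaEventRecord(t0);
        shade<<<grid, block>>>(img, w, h);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        cudaEventDestroy(t0);
        cudaEventDestroy(t1);
        cudaFree(img);
        return ms;
    }

    int main() {
        std::printf("1080p render: %.2f ms\n", timeRenderMs(1920, 1080));
        std::printf("4K render:    %.2f ms\n", timeRenderMs(3840, 2160));
        return 0;
    }
    ```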

  • mambanegra Posts: 599

    ArgleSW said: "Rendering does not care what resolution your monitor is. It all depends on the render output dimensions you choose. If you render an image at 4K, it will take the same render time whether your monitor is 1080p or 4K. I highly recommend you stick to one video card. Whatever your budget is for a GPU, get a single high-end card instead of two mid-range ones."

    I wasn't suggesting that the renderer cares, but I know the system has to use some amount of that memory to maintain the display of your desktop and the applications you can see (and some you probably aren't looking at). Maybe it's so trivial it isn't worth mentioning, though. I've dug and dug, and nobody talks about what I'm worrying about, only the effect of higher resolutions on game frame rates, which is a different issue. I'm probably imagining a problem that isn't there.
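    If anyone else wonders the same thing, one way to put a number on it (assuming an Nvidia card and its standard driver utility) is to check what the card reports while the desktop is just idling; the "used" column is roughly what the display and visible apps are costing you:

    ```
    nvidia-smi --query-gpu=name,memory.used,memory.total --format=csv
    ```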

  • Silent Winter Posts: 3,915

    I don't think the desktop display makes much difference, but if you've got DS set to the Iray or Texture Shaded (OpenGL) preview, it's using some VRAM to display that image as well as to work on the render. (You could just switch to the wireframe preview, which is much lighter on the GPU, before rendering if necessary.)

    As far as I know anyway - I'm no expert.

  • kenshaw011267 Posts: 3,805

    If you're not gaming, almost any GPU will work to drive a 4K monitor.

    You should get the best GPU you can afford. 

  • ArtAngel Posts: 2,073
    edited May 2019

    I beg to differ. I can open the viewport and multiple aux viewports (e.g. side by side: top, bottom, back, front and four other views) on one screen on my 38" 4K. It makes all the difference in the world. In Photoshop I can view 4-up. It is worth investing in.

  • tj_1ca9500b Posts: 2,057
    edited May 2019

    Yeah, I'm in the 'One or more GPUs for rendering, plus a basic GPU to drive the viewport' camp.

    For rendering, you should of course get the biggest, baddest Nvidia GPU you can afford; the more memory the better, with overall rendering speed the secondary but still important consideration. This is why the 1080 Ti is still popular: 11 GB of VRAM plus decent speed. Sure, its successors are faster, but 1080 Tis have some pretty compelling price points at the moment. That may change now that mining is a thing again, not to mention the impending increased tariffs for US buyers...

    I use a Ryzen 2400G with integrated Vega graphics to drive my 4K monitor and the Daz viewport, with my 1080 Ti 100% dedicated to rendering. That lets me work on other stuff while a render is baking, without the screen lagging. Sure, the system is a bit slower, since Daz still needs a bit of CPU power, but I can work in one Daz instance using the Texture Shaded preview while a second instance is busy baking a render.

    I could also use the Iray viewport in the first instance while the second instance is baking a render, but that may slow down the render speed, or may drop a render to CPU only if I'm near/at the memory limits.  Plus, it tends to lag...

    More cores would probably help performance a bit more when I'm setting up renders in Daz while the other instance is busy baking, but AMD hasn't released an 8-core Ryzen with Vega graphics yet. Those aren't due until 2020... In the meantime, yeah, I love this setup.

    My system is Mini-ITX, otherwise I would probably stick a cheap card in a second PCIe slot to drive the monitor and viewport and get an 8-core CPU. My goal was to eventually build a 7nm Threadripper system that would essentially do this, but 7nm Threadripper has apparently been delayed - it was removed from AMD's current roadmap.

    Using AMD Vega for the viewport has an interesting advantage, in that the system doesn't have to think about not using that card for an Iray render. I'm sure you could drop in a low-end GTX card as well; you may need to 'uncheck' it in the render settings, though. As I understand it, the viewport relies on OpenCL unless set as an Iray viewport, and supposedly AMD's current OpenCL implementation is more up to date than Nvidia's, from what I've read.
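    You can actually see that from the CUDA side, too. This little device dump is just a sketch (assuming the CUDA toolkit is installed), but on a setup like mine it only ever lists the Nvidia card, since CUDA never enumerates the Vega iGPU, so there's nothing for Iray to grab by accident:

    ```cuda
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int n = 0;
        cudaGetDeviceCount(&n);  // counts CUDA (i.e. Nvidia) devices only

        for (int i = 0; i < n; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            std::printf("CUDA device %d: %s (%.1f GB VRAM)\n",
                        i, prop.name, prop.totalGlobalMem / 1e9);
        }
        return 0;
    }
    ```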

    I also have a theory that by using the Vega graphics for the desktop, Windows 10 doesn't assign the 18% VRAM tax to the Nvidia card, but I'm not 100% sure of this. I've had a few scene files that on my old system would drop to CPU only but now fit on the graphics card, though my old system only had 6.4 GB available for rendering (1080s). I've added a couple more characters and such to recent scenes to take advantage of the extra RAM, and I'm pretty sure I've gone above 8.91 GB for a couple of those scenes, but I haven't bothered to check the numbers. It's a fun theory at least...

    The latest Intel vulnerabilities have definitely reinforced my decision to go with an AMD CPU. 

    So yeah, get a badass card (or cards) for your Iray rendering, and a cheap card that can drive your viewport. Heck, even Intel integrated graphics should be able to do the job here (not sure about 4K, though), assuming your motherboard has a monitor port for said graphics.

  • kenshaw011267 Posts: 3,805

    I'm not sure I'd call a refurb card for $700 compelling, except against Nvidia's crazy pricing of the RTX 2080 Ti.

  • mambanegra Posts: 599

    kenshaw011267 said: "I'm not sure I'd call a refurb card for $700 compelling, except against Nvidia's crazy pricing of the RTX 2080 Ti."

    Yeah, I saw a "Renewed" (I think that was the term the company used) 1080 Ti for well under $1,000 on Amazon. Has anyone had any experience with buying refurbished cards? I've noticed lots of folks claiming they buy theirs from eBay, which I'm more than a bit leery of, but maybe I shouldn't be?

    I think I will plan on getting two. I had been considering AMD graphics for the display card, thinking the price points might be a bit better, since (as far as I know) miners prefer Nvidia to AMD.

    The crappy thing is that Apple is hosting their developers conference in two weeks (I think?) and is expected to finally reveal their plans for the new pro machine. If they actually reveal a truly upgradable machine (not unlike the original Mac Pro, one of which is sitting beside my desk acting as a file server), I would probably stay with them, just because I have my workflow set up the way I like it, and switching to Windows will require a lot of changes and probably more than a little grumbling. But I'm off next week, which would be GREAT for building a computer... decisions, decisions. I'm 99% sure what they reveal will be something stupid and useless for me, so I'm tempted to go ahead and start the final transition... but part of me says wait. I hate logical me :P
