VRAM usage

I am currently using a GTX 970 with 4 GB of VRAM. For monitoring I use GPU-Z. For some scenes I hit the VRAM limit of this card: GPU-Z shows 3.9 GB of VRAM in use and the GPU load at 0%, so my system is switching to CPU rendering.
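
As a side note, the reading GPU-Z shows can also be pulled from a small script via NVIDIA's NVML library. This is only a minimal sketch, assuming the nvidia-ml-py (pynvml) Python bindings are installed and an NVIDIA driver is present:

```python
# Minimal sketch: query current VRAM usage through NVML.
# Assumes the nvidia-ml-py package ("pip install nvidia-ml-py").
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU, e.g. the GTX 970
    mem = nvmlDeviceGetMemoryInfo(handle)   # sizes are reported in bytes
    print(f"VRAM used: {mem.used / 1024**3:.2f} GB of {mem.total / 1024**3:.2f} GB")
finally:
    nvmlShutdown()
```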

What I would like to know is how much VRAM I would need for a given scene. Is there a way in Daz Studio, or with another application, to see this?

Michel

Comments

  • Havos Posts: 5,581

    Working out the GPU memory needed is non-trivial, as it depends on so many factors: the geometry in the scene (i.e. total polygons), the textures needed, and any compression that may be applied to the textures before they are moved into GPU memory. DS is not able to do this, and I suspect it would be very hard to add, though maybe in the future they could enhance it to give a rough estimate.
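
    As a very rough rule of thumb for the texture side (my own back-of-the-envelope, not anything DS reports): an uncompressed RGBA texture costs width x height x 4 bytes, plus about a third again for mipmaps, so the 4K maps on a single character add up quickly. A small illustration in Python:

    ```python
    # Rough uncompressed-texture VRAM estimate (RGBA8, + ~1/3 for mipmaps).
    def texture_mb(width, height, channels=4, mipmaps=True):
        size = width * height * channels  # bytes for the base level
        if mipmaps:
            size *= 4 / 3                 # a full mip chain adds ~33%
        return size / (1024 ** 2)

    print(texture_mb(4096, 4096))         # one 4K map: ~85 MB
    print(10 * texture_mb(4096, 4096))    # ten 4K maps on one figure: ~850 MB
    ```

    The compression mentioned above usually brings the real footprint down from these numbers, but it shows why textures, not polygons, tend to dominate.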

    If you think your scene could be close to the limit, then try hiding a few of the high-poly/detailed-texture items to see if the scene will fit after that. If it does, make them visible again, and then adjust things that should make minimal difference to the final render. For example, hide any off-screen items (unless they are needed because they appear in reflections, or cast shadows onto the visible scene). If you have background characters in the scene, try reducing them to Base resolution, and maybe remove any bump, specular etc. maps on them.

    If the scene needs way over 4 GB you could render it in two passes, hiding some characters in one and the others in the second pass. The resulting images can then be recombined in post-work. This relies on the characters in each pass not casting shadows over each other, and not overlapping unless all characters of one pass are entirely behind those of the other.
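
    If you do the recombination in an image editor it is just layering, but here is a minimal sketch of the same idea in a script, assuming Pillow and that the character pass was saved as a PNG with a transparent background (the file names are placeholders):

    ```python
    # Minimal sketch: composite a transparent character pass over a
    # background pass. File names are placeholders.
    from PIL import Image

    background = Image.open("pass_background.png").convert("RGBA")
    characters = Image.open("pass_characters.png").convert("RGBA")

    # alpha_composite respects the character pass's alpha channel, so only
    # its opaque pixels cover the background render.
    final = Image.alpha_composite(background, characters)
    final.save("final_render.png")
    ```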

    I have the same card as you, btw, and I have rendered some pretty complex scenes with it using the GPU only, for example with 5 Genesis characters (all at High Resolution, but none using HD), plus many scene objects and props.

  • Thanks for the reply, very useful.

    Indeed, with the GTX 970 I can load 4 G3 figures with clothing and hair in a scene and stay within the 4 GB limit. I also use this card for the OS, which needs about 0.7 GB, so if I could use a separate card or my motherboard GPU to run the OS, I guess adding a 5th G3 figure would be no problem. (I have to figure out how that works.)

    The scene that is bigger than 4 GB consists of 3 G3 models with clothing and hair, plus Stonemason's London model. One G3 model is using HD.

    I purchased the GTX 970 used for a good price and would consider upgrading to a 980 Ti. My guess is I would be better off waiting until the release of the NVIDIA Pascal GPUs, expected somewhere in 2016, and then buying such a card with the largest amount of VRAM I can afford.

    Michel

  • Havos Posts: 5,581

    The Pascals could be some time away, and I suspect they will be pretty expensive to start with. However, when they do arrive the prices of the older cards may drop, so a 980 would become more affordable. From what I have read, however, a Pascal card would be very nice :-)

    Since I do not play games I use the motherboard GPU for my display; I see little difference on screen compared to driving it from the GPU card.

    I do not have the London set, but I would imagine it uses a fair few resources given the incredible detail of that set. One HD character will use up more memory, but remember that textures tend to use up the majority of the GPU memory, not polygons. Some sets I have that my card struggles with are Andrey Pestryakov's forest sets, which are also very detailed.

  • Is using the motherboard GPU as straightforward as using the DVI output of the motherboard instead of the one on the GTX 970?

  • mjc1016 Posts: 15,001

    Is using the motherboard GPU as straightforward as using the DVI output of the motherboard instead of the one on the GTX 970?

    Yep, that's pretty much it... although some BIOS settings may disable the onboard GPU when a card is installed. There is usually a setting in the BIOS to enable both.

  • Thanks,

    Will try that.

    Michel

  • Havos Posts: 5,581
    mjc1016 said:

    Is using the motherboard GPU as straightforward as using the DVI output of the motherboard instead of the one on the GTX 970?

    Yep, that's pretty much it... although some BIOS settings may disable the onboard GPU when a card is installed. There is usually a setting in the BIOS to enable both.

    Interesting, that may be happening to me. Although I am using the motherboard graphics, I do not see any screen output during boot-up until the Windows login prompt arrives. All the text printed as it boots only appears if I am viewing through the GPU card. I will have to check my BIOS settings, thanks for the advice.

  • JCThomas Posts: 254

    There was/is also an issue with the GTX 970 and VRAM. The 970 has two chunks of VRAM, a 3.5 GB chunk and a 500 (maybe 512) MB chunk. The 3.5 GB chunk is clocked at 7 Gbps, but the last 500 MB is significantly slower, down to roughly 1/7th of the bandwidth in some benchmarks. The last 500 MB of VRAM is only accessed when the first 3.5 GB is at capacity. I'm not sure if this is causing your issue, but it seems relevant at least. If your scene fits in 4 GB, it would at least explain a slowdown in rendering (but not a reversion to CPU only).
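
    As a back-of-the-envelope on where that 1/7th figure comes from (numbers are from the published specs and the reporting at the time, so treat them as approximate):

    ```python
    # Approximate GTX 970 memory bandwidth split (256-bit bus, 7 Gbps GDDR5).
    total_gb_s = 256 / 8 * 7            # 32 bytes per transfer * 7 GT/s = 224 GB/s

    fast_segment = total_gb_s * 7 / 8   # 3.5 GB on 7 of 8 controllers: 196 GB/s
    slow_segment = total_gb_s * 1 / 8   # 0.5 GB on the last controller:  28 GB/s

    print(fast_segment, slow_segment, slow_segment / fast_segment)  # ratio ~ 1/7
    ```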

    There was a lot of consternation about this when gamers first discovered the issue, and there was even talk about a class action against Nvidia. It was basically explained away as a "feature" lost in communication between marketing and developers.

    While I'm certain the 4 GB 980 does not have the same issue, I'm not sure about the 4 GB 960. But even if the 960 has a single chunk of 4 GB VRAM, the 970 would still be the better choice due to more CUDA cores.

    Anyway, if it's in the budget I'd say go with a 980 Ti now; it wouldn't prevent you from adding a Pascal card down the road.

  • Thanks for the input.
