Looking for advice on nVidia cards


Comments

  • lol That's what I thought - I thought it was a joke - no offence einar

  • Havos said:

    For iRay the full scene is copied onto both cards, not shared across them. Having two 970s will give you twice the cores and you should render in around half the time, but the scene is still limited to 4GB total. Having said that, I have a 4GB 970 and so far I have not built a scene that the GPU could not render, and some of my scenes are quite complex.

    Thanks for clarifying.

    Is Iray linked to DirectX, do you know?

    It's just that with DX12 they are trying to get the VRAM shared...

  • nicstt Posts: 11,715
    Havos said:

    I am not sure many of us would describe a Titan X buyer as "being on a tight budget"

    He/she only got one /nod

  • mtl1 Posts: 1,508

    I find that card temperatures, at least for Nvidia cards, tend to be manufacturer dependent. EVGA, imo, has superior cooler designs, and its cards generally run much cooler than the competition :)

    However, like some posters have already said, having sufficient case fans and ventilation is important too. If your case can't vent the hot air, then no amount of card cooling will help.

  • Yep, I have plenty of air flow. Got a couple of big chassis fans and a liquid-cooled CPU, so it stays quite cool. The side of my case is basically open too, as it has a mesh there.

  • Havos Posts: 5,576
    mrmorph said:
    Havos said:

    For iRay the full scene is copied onto both cards, not shared across them. Having two 970s will give you twice the cores and you should render in around half the time, but the scene is still limited to 4GB total. Having said that, I have a 4GB 970 and so far I have not built a scene that the GPU could not render, and some of my scenes are quite complex.

    Thanks for clarifying.

    Is Iray linked to DirectX, do you know?

    It's just that with DX12 they are trying to get the VRAM shared...

    No idea about DirectX; I have not read anywhere that there is a link between it and Iray.
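    The VRAM point quoted above matters in practice: since Iray duplicates the whole scene onto every card, extra GPUs add cores but not memory. A minimal sketch of the rule (the function name and scene sizes are made up for illustration):

```python
# Sketch of the Iray multi-GPU memory rule described above: the full scene is
# copied to EVERY card, so the scene must fit in the smallest card's VRAM.
# Function name and sizes are hypothetical.

def fits_on_gpus(scene_size_gb, gpu_vram_gb):
    """Each GPU needs its own full copy, so only the smallest card matters."""
    return scene_size_gb <= min(gpu_vram_gb)

# Two 4 GB GTX 970s: twice the CUDA cores, but still a 4 GB scene limit.
print(fits_on_gpus(3.5, [4, 4]))  # True  - renders on both cards
print(fits_on_gpus(5.0, [4, 4]))  # False - scene no longer fits on the GPU
```

    When a scene exceeds VRAM, Iray drops back to CPU rendering rather than splitting the scene across cards.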

  • fastbike1 Posts: 4,078

    The Geforce cards are intended to run at the design value (typically 80 C). You can find some discussion on the Nvidia ARC that there may be some performance hit if they are not running at the design temp. The thermal limit is quite a bit higher than 80C. I believe that both the Titan X and GTX980TI will overclock themselves until the design temperature is reached.

    The point here is that many/most of the cards are designed to run at, and operate best at 80C.

  • namffuak Posts: 4,406

    fastbike1 said:

    The Geforce cards are intended to run at the design value (typically 80 C). You can find some discussion on the Nvidia ARC that there may be some performance hit if they are not running at the design temp. The thermal limit is quite a bit higher than 80C. I believe that both the Titan X and GTX980TI will overclock themselves until the design temperature is reached.

    The point here is that many/most of the cards are designed to run at, and operate best at 80C.

    I'm not seeing any noticeable difference for rendering; possibly it makes a difference in gaming (which I don't do). Per GPU-Z my core and memory clocks ramp up to the same speed and I hit 97% GPU load just as I did before upping the fan speed. My maximum fan speed seems to be at 63% and the card runs at 63C. And I seem to be performance capped because I've not gone in and tweaked the card to allow a higher voltage. This is on a GTX 980 TI.

  • nicstt Posts: 11,715
    edited November 2015
    mrmorph said:
    Havos said:

    For iRay the full scene is copied onto both cards, not shared across them. Having two 970s will give you twice the cores and you should render in around half the time, but the scene is still limited to 4GB total. Having said that, I have a 4GB 970 and so far I have not built a scene that the GPU could not render, and some of my scenes are quite complex.

    Thanks for clarifying.

    Is Iray linked to DirectX, do you know?

    It's just that with DX12 they are trying to get the VRAM shared...

    Using shared RAM with DX12 is likely with Pascal; so not likely atm, and maybe never with previous generations of cards.

    Pascal architecture is due out from 2nd quarter next year.

    http://blogs.nvidia.com/blog/2015/03/17/pascal/

    Post edited by nicstt on
  • nicstt Posts: 11,715
    mtl1 said:

    I find that card temperatures, at least for Nvidia cards, tend to be manufacturer dependent. EVGA, imo, has superior cooler designs, and its cards generally run much cooler than the competition :)

    However, like some posters have already said, having sufficient case fans and ventilation is important too. If your case can't vent the hot air, then no amount of card cooling will help.

    Number of fans is less important than having them balanced. Basically, go for positive or negative air pressure. And an odd number of fans doesn't mean you have one or the other; it depends on how much air they move.

    And airflow where it's needed; throwing fans at the case is exactly that, it doesn't remove the potential for hotspots.

  • SixDs Posts: 2,384

    "Geforce cards are intended to run at the design value (typically 80 C)"

    Not exactly. I'm assuming that you are referring to the TDP (Thermal Design Power)? That is not a metric that was really intended for consumer guidance in the beginning, although it has come to be used as a guideline for overclockers. It is a goal for the engineers of graphics cards to achieve in the design and implementation of thermal solutions: in other words, design your thermal solution to keep temperatures at or below this value in normal usage.

    Running any microprocessor at high temperatures does not, in and of itself, result in increased performance. It can, however, result in premature failure over time. It is not like an automobile engine, which achieves optimal performance once it reaches a certain operating temperature. With CPUs and GPUs, the cooler the better. To put it another way, increased power consumption and higher temperatures may be necessary evils in order to increase performance, but the increased performance is not a result of either the increased voltages or the attendant higher temperatures; it is the other way around. If this were not so, we could all simply replace our heatsinks and fans with crappy ones and increase our performance.

  • The new 980 8GB cards should be coming out in December. That has the potential of lowering the price of the 6GB cards a little.

  • nicstt Posts: 11,715

    The new 980 8GB cards should be coming out in December. That has the potential of lowering the price of the 6GB cards a little.

    I'm good till Pascal now; a 980ti and a 970.

  • I picked up a 970 in the sales in the end - it will do for now. A 980 Ti was too big a stretch with Christmas coming up.

    Next year I will see where they are with the Pascal versions.

  • wiz Posts: 1,100

    The new 980 8GB cards should be coming out in December. That has the potential of lowering the price of the 6GB cards a little.

    That's what they said last November. The elusive 8GB 980 has been a month or two away for over a year now.

    I stopped waiting a couple of months ago.

  • I got my 970 today which I bought in the sales last weekend. An EVGA SSC GTX 970 with the improved cooling system. So thanks to all that gave advice.

    I did a quick test on a small scene that took 1 minute 43 seconds on just my CPU.

    With the 970 in play it rendered in 18 seconds !

    Nice, less than a fifth of the time, so that will do me for a while :)
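    For what it's worth, those render times above pencil out; a quick back-of-the-envelope check:

```python
# Speedup from the render times reported above: 1 min 43 s on CPU alone
# vs 18 s with the GTX 970 in play.
cpu_seconds = 1 * 60 + 43   # 103 seconds
gpu_seconds = 18
speedup = cpu_seconds / gpu_seconds
print(round(speedup, 1))    # 5.7 - a bit better than one fifth of the time
```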

  • FSMCDesigns Posts: 12,843
    Havos said:

    I am not sure many of us would describe a Titan X buyer as "being on a tight budget"

    No kidding, LOL

  • If you can get one, the best bang-for-buck Iray card is an Nvidia GTX 780 6GB. I purchased 2 of these cards from Newegg refurbished and they were a great purchase at around $350 US dollars; pay any more than this and you are better off with an Nvidia 980 6GB. You can still find them on ebay, but most owners of these cards are asking premium prices, and that just makes the Nvidia 980 GTX 6GB a better buy.

    If you are on a budget: a combo of an AMD/ATI CPU with built-in video and a $90 US dollar Nvidia Tesla M2090 is a great budget buy. You cannot run an Nvidia gaming card as primary video and a Tesla at the same time, it won't work, but an ATI card for video and a Tesla M2090, which has 6GB of VRAM and 500 CUDA cores, should be in most people's budgets. Just be aware that a Tesla has no video out and needs at least a 650 watt plus power supply. The best low-powered solution is an AMD/ATI CPU/video combo chip. That way the power supply does not have to power two video cards, just your Tesla.
