PCI-E 2.0 Standard and Geforce GCs

Szark Posts: 10,634

I am after some advice from the PC-savvy crowd here. 

I am looking into getting an additional GC or two for Iray, and I only have two spare slots in this micro system:

  1. 1 x PCI Express x16 slot, running at x16
    (The PCI Express x16 slot conforms to PCI Express 3.0 standard.)
    * Whether PCI Express 3.0 is supported depends on CPU and graphics card compatibility.
  2. 2 x PCI Express x1 slots
    (All PCI Express slots conform to PCI Express 2.0 standard.)

The first slot houses the main GC, which is not used for rendering at all. 

I see that a high number of the GeForce range uses PCI-E 3.0 x16. How does that affect me and the performance of the card if it is plugged into a PCI-E 2.0 slot?


Comments

  • hphoenix Posts: 1,335
    edited March 2016

    The version of PCI-E is really a question of how much bandwidth it supports (via a higher per-lane transfer rate and more efficient encoding of the data moving along those lanes), and all the new versions are fully backwards compatible.

    The simple explanation is that putting a PCI-E v2.0 card in a PCI-E v3.0 slot is the same as if you'd plugged that card into a PCI-E v2.0 slot.  It won't use the extra bandwidth.  Putting a PCI-E v3.0 card into a PCI-E v2.0 slot will limit the card to the PCI-E v2.0 bandwidth limits.
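    To put numbers on that: per-lane bandwidth is set by each PCI-E version's transfer rate and line encoding. A small sketch (the rates and encodings are the published spec figures; real-world throughput runs somewhat lower):

```python
# Rough per-lane PCI-E bandwidth, from transfer rate and line encoding.
# Spec figures only; actual throughput is lower due to protocol overhead.

PCIE = {
    # version: (transfer rate in GT/s, encoding payload bits, encoding total bits)
    "2.0": (5.0, 8, 10),     # 8b/10b encoding
    "3.0": (8.0, 128, 130),  # 128b/130b encoding
}

def bandwidth_gbs(version, lanes):
    """Approximate usable bandwidth in GB/s for a given slot width."""
    rate, payload, total = PCIE[version]
    per_lane = rate * payload / total / 8  # GT/s -> GB/s after encoding overhead
    return per_lane * lanes

print(f"PCI-E 2.0 x16: {bandwidth_gbs('2.0', 16):.1f} GB/s")  # ~8.0 GB/s
print(f"PCI-E 3.0 x16: {bandwidth_gbs('3.0', 16):.1f} GB/s")  # ~15.8 GB/s
```

    So a v3.0 card dropped into a v2.0 x16 slot is capped at roughly half its theoretical transfer bandwidth, which only matters when data is actually moving across the bus.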

    Unless your machine is pretty high-end, you probably won't ever hit those limits.  Bandwidth is largely a question of what it CAN do, not what the machine is capable of delivering to it.

    For something like DS Iray rendering, where everything is loaded onto the card at the beginning and then runs on the card, most of the data transfer bandwidth is needed during the initial loading period.  Higher bandwidth means it loads quicker (assuming the southbridge and memory can keep up with it.)  But beyond that, the actual rendering speed will hardly be affected at all.
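    As a rough illustration of why the slot version barely matters for this workload, assume (hypothetically) a 3 GB scene uploaded once over the theoretical x16 bandwidths:

```python
# One-time scene upload at theoretical x16 slot bandwidth (illustrative figures).
scene_gb = 3.0  # hypothetical scene size, not a measurement
for slot, gbs in [("PCI-E 2.0 x16", 8.0), ("PCI-E 3.0 x16", 15.75)]:
    print(f"{slot}: ~{scene_gb / gbs:.2f} s to upload the scene")
# Either way the upload is a fraction of a second, against a render
# measured in minutes or hours; after that the bus sits mostly idle.
```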

     

    For gaming however, bandwidth becomes more of an issue, as new textures/meshes may be streamed to the card as they are needed, and that requires bandwidth.  The faster you can do that, the less 'stutter' you may get when a bunch of new monsters/players/effects/whatever pop into the scene.

     

    Post edited by hphoenix on
  • Szark Posts: 10,634

    Thank you for enlightening me; that is what I gleaned from the net, but I trust folks here more. Again thanks for that, so no need to worry then. Time for a new power supply first. :)

  • hphoenix Posts: 1,335

    Best to wait and see what the new Pascal generation of nVidia cards brings anyway.

    (current leaks and reveals by nVidia show the Titan equivalent will come with 16GB of RAM, and have about 3x-4x the raw compute power of a current Titan.  High-end version could have 32GB ram..... *drools*)

     

  • mjc1016 Posts: 15,001
    hphoenix said:

    Best to wait and see what the new Pascal generation of nVidia cards brings anyway.

    (current leaks and reveals by nVidia show the Titan equivalent will come with 16GB of RAM, and have about 3x-4x the raw compute power of a current Titan.  High-end version could have 32GB ram..... *drools*)

     

    Here you go, hphoenix...

    I need one of those too...(both the card and the towel).

    Everything I've seen indicates that they will be out sometime over the next few months.

    towel2.png
  • Ivy Posts: 7,165

    The Nvidia Pascal and Tesla GCs cost a bit more than the Titan GCs as well; that's something to keep in mind.

    This is the cheapest place I could find the Tesla GC, at around $5000 at newegg.com, and it has a sweet 24 gigs of RAM that's stackable to 1TB:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814132041

    Yeah, drooling is about all I can do. 5 grand for a graphics card hardly seems worth it just for rendering DAZ Iray, when I can (and have) buy 2 Titan Xs and have 3 grand left over to buy DAZ products: http://www.amazon.com/EVGA-GeForce-GAMING-Graphics-12G-P4-2990-KR/dp/B00UVN21RQ

  • nicstt Posts: 11,715
    edited March 2016

    I'd wait till Pascal, but be aware that there is at least a report or two stating that some Pascal cards will have current tech in them, namely Maxwell.

    This is likely to be in the lower-end cards, but I am guessing there.

    EDIT

    hphoenix said:

    Best to wait and see what the new Pascal generation of nVidia cards brings anyway.

    (current leaks and reveals by nVidia show the Titan equivalent will come with 16GB of RAM, and have about 3x-4x the raw compute power of a current Titan.  High-end version could have 32GB ram..... *drools*)

     


    Oh? Where did you see that? And yeah - /drools

    Post edited by nicstt on
  • hphoenix Posts: 1,335
    edited March 2016

    Thanks, mjc....*wipes*

     

    Current display items for nVidia's automotive products are still using Maxwell, with the new memory and architecture.

    According to earlier leaks, Pascal should debut around late April to early May for the high-end workstation cards (Quadro stuff.)  The first consumer-level cards (Titan and 1080 level stuff) are expected late June to early August.

    (That K80 that was linked is not Pascal-based.  It's a server-oriented (hence pricey) dual-GPU single card with 12GB per GPU....typical prices for a Titan-Z 12GB card are about $1500...so two would run around $3000.  Add in server-level cooling and such, and yeah, $5000 is about what I'd expect.  Those are cards that go in render appliances and rack-mount 4U render servers.  Not consumer level.  But expect to see similar prices and products for the early Pascal cards, which will be targeting that market.)

     

    Oh, and here is the link that shows the roadmap/timeline from nVidia:  http://techfrag.com/2016/01/25/nvidia-2014-2017-gpu-roadmap-pascal-titan-gp100-to-debut-in-april-gtx-1080-in-june-and-volta-in-2017/

     

    Post edited by hphoenix on
  • Szark Posts: 10,634

    You guys crack me up. I mean that in a good way. 

     

    Ok, I am settling on http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-960/specifications. It is within my price range for now. I noticed on the spec sheet the wattage needed from the power supply, 400 W, so if I were to get a third card, would I need to double that amount and get a 1000 W power supply?

  • hphoenix Posts: 1,335
    edited March 2016
    Szark said:

    You guys crack me up. I mean that in a good way. 

     

    Ok, I am settling on http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-960/specifications. It is within my price range for now. I noticed on the spec sheet the wattage needed from the power supply, 400 W, so if I were to get a third card, would I need to double that amount and get a 1000 W power supply?

    Don't trust the 'spec sheet' power requirements.  Go to one of the good review sites and find out what the peak power draw under load for that card is.  Here's a good comparison:  http://www.tomshardware.com/reviews/nvidia-geforce-gtx-960,4038-8.html

    In other words, a GTX 960 should consume a maximum of around 170 watts.  What you have to worry about is what CPU, motherboard, and other components are drawing from the PSU alongside it, and making sure the +12V rail that feeds the GPU has the current rating you need (most require 20A, some as much as 40A.)

    Assuming your machine is pretty typical, a box with ONE of these cards would need at least a 500W PSU, better 650W.  If your machine is pretty demanding power-wise (multiple mechanical HDDs and optical drives, additional PCI cards, etc.) I'd recommend a 750W.  With two of these cards, you'd want 650W minimum, better 800W, and for power-hungry systems a 900W+ PSU.

     

    ETA:  You can also look up the TDP and such for the other components in your system (http://www.buildcomputers.net/power-consumption-of-pc-components.html).  Most only vary a little bit from manufacturer to manufacturer and model to model.  A mechanical HDD under full load typically draws about 10 to 20 watts; SSDs draw less than 5W.  Fans vary, but can be 2 to 5 watts apiece.  CPUs vary considerably, as do other items.  Typical builds run (without GPU) around 300W; higher-end builds run 400-500W.

    Also, don't target the total for your PSU's maximum wattage.  You want some breathing room (as power supplies lose efficiency as they approach their maximum ratings!)  Aim for about 1.25 to 1.5 times your calculated 'max load' power.  So if your machine draws 600W at full burn, aim for a 750W-900W power supply rating.  Also, better regulation is good.....buy power supplies with at least an 80+ Bronze rating, preferably higher, if you can afford it.
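    The sizing rule above can be sketched as a quick calculation. The component wattages here are illustrative round numbers within the ranges mentioned, not measurements of any particular machine:

```python
# Sketch of the PSU sizing rule: sum estimated full-load draws, then add
# 25-50% headroom so the PSU never runs near its rated maximum.
# All wattages below are hypothetical round figures for illustration.

components = {
    "CPU (i7-class, full load)": 150,
    "GTX 960 (peak)": 170,
    "Motherboard + RAM": 70,
    "2x mechanical HDD": 30,
    "Fans, USB, misc": 30,
}

def recommend_psu(components, headroom=(1.25, 1.5)):
    """Return (estimated max load, low PSU target, high PSU target) in watts."""
    load = sum(components.values())
    lo, hi = (load * h for h in headroom)
    return load, lo, hi

load, lo, hi = recommend_psu(components)
print(f"Estimated max load: {load} W -> shop for a {lo:.0f}-{hi:.0f} W PSU")
```

    With these example numbers the estimate lands in the 550-700 W range, consistent with the recommendations in this thread.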

    Lots of people tend to try to save money on a system by skimping on the power supply, and it is the biggest single cause of failures in other components.  Spend a little more here, and the other stuff in the box will tend to last a LOT longer, and work more stably.

     

    Post edited by hphoenix on
  • Szark Posts: 10,634

    Oh man, big help again, thank you. Yeah, I did calculate the other stuff I have, which is an i7 3770 3.4 GHz quad, one GT 610 GC, two 1TB internal HDDs and two USB HDDs (1x1TB and 1x3TB), and an LCD monitor and a pair of speakers. And I figured I wouldn't need much more than 650-800, but that spec sheet shocked me into asking. LOL Sweet, I will get one ordered.

  • hphoenix Posts: 1,335
    Szark said:

    Oh man, big help again, thank you. Yeah, I did calculate the other stuff I have, which is an i7 3770 3.4 GHz quad, one GT 610 GC, two 1TB internal HDDs and two USB HDDs (1x1TB and 1x3TB), and an LCD monitor and a pair of speakers. And I figured I wouldn't need much more than 650-800, but that spec sheet shocked me into asking. LOL Sweet, I will get one ordered.

    No prob.  The USB drives won't draw very much (the USB hubs or their external power supplies power them), though motherboard USB hubs (i.e., the built-in ports) can consume a good bit.  Still not nearly as much as the other big power eaters, though (typically a hub-powered USB drive only draws around 10W.)  So based on what you describe above, BEFORE the GPU, you've got about 280-300W being consumed at full load.  The monitor and speakers are externally powered, so they don't come into play.  At peak, the GTX 960 draws around 120-170W, so with one in there, that's another 170W, plus the GT 610 at 29W max, so about 480-500W total.  So best would be 700-750W, 80+ Bronze or better.

    A 750W that meets all those good things (Active PFC, 80+ Bronze or better, good name brand) can be had for about $75-$100 on Newegg.

     

  • Szark Posts: 10,634

    Even better, cheers; that gives me a starting point to search in the UK. Yeah, 50 pounds sounds about right then, cool. I got one lined up but I will go and search these and compare.

  • JCThomas Posts: 254
    edited March 2016

    Glad to see you're getting lots of helpful information.

    However, 280-300 watts is an extreme overestimate of your system's current power consumption. Not meaning to be a contrarian, but the 400W requirement listed on Nvidia's site is actually meant to be conservative, as are all of their guidelines. They're there in case you have a very poor quality PSU. Anything 80+ that's not some off-brand would be fine. If all you're going to add is the 960, anything over 500W would be extreme overkill.

    Always good to have room to grow, but if you're planning on sticking with the 960, no reason to spend the extra money on more watts. For some perspective, I recently built a gaming box, mostly just for the fun of the build, with an i5-4590, 128GB SSD, 2TB HDD, 8GB RAM, and a GTX 970 on a small-form factor 450W Silverstone PSU. It is more than enough.

    Your 3770 has a TDP of 77W. While TDP isn't an exact measure of how much power your CPU will consume, it's still pretty close. Even doubling TDP to estimate consumption, say 150W, would be very far on the safe side. Your system now is probably consuming somewhere around 100W at load.

    Even this would do, and only 20 USD after rebate:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16817139026

    Post edited by JCThomas on
  • Szark Posts: 10,634

    Thanks. Yeah, I am planning on adding a further card after the additional 960, and yes, I will be buying a branded PSU...I need reliability.

    Well, I can get that in the UK...cool.

  • DAZ_Spooky Posts: 3,100
    edited March 2016

    There is about a 10% speed difference between a PCI-e 3 x16 slot and a PCI-e 2 x16 slot. (And no the PCI-e 2 slot is not the faster one. LOL)

    You are also significantly better off with a minimum of x4 speed on the slot. (One of the reasons that the current external graphics solutions for laptops are not that good, yet.) The problem is not the speed of the render, because that is all on the card, but the speed of things getting loaded onto the card. 

    Post edited by DAZ_Spooky on
  • StratDragon Posts: 3,273

    Newegg is not the company they were a few years ago, and enough complaints submitted online to Better Business establishments attest to this. I would look at competitors' prices as well, especially if you need to return something, which they used to be very good with. In the past I'd say they were as reliable as anyone, but that's just not the case any longer.

  • DAZ_Spooky Posts: 3,100
    Szark said:

    Thanks. Yeah, I am planning on adding a further card after the additional 960, and yes, I will be buying a branded PSU...I need reliability.

    Well, I can get that in the UK...cool.

    I have never had an Antec power supply fail on me. I have outgrown them over time, but I have never had one fail. 

  • nicstt Posts: 11,715
    edited March 2016
    hphoenix said:
     

    Lots of people tend to try to save money on a system by skimping on the power supply, and it is the biggest single cause of failures in other components.  Spend a little more here, and the other stuff in the box will tend to last a LOT longer, and work more stably.

     

    +1

    Gold or Platinum rating is the place to start, and don't trust it just because it says Gold or Platinum - research is your friend /nod

    Post edited by nicstt on
  • Richard Haseltine Posts: 108,093

    I'm pretty sure the PSU I had with a failing capacitor was an Antec, but that was eight or nine years ago.

  • Szark Posts: 10,634

    Cool thanks everyone. Yeah nicstt, totally agree and still researching.

  • DAZ_Spooky Posts: 3,100
    edited March 2016

    I'm pretty sure the PSU I had with a failing capacitor was an Antec, but that was eight or nine years ago.

    I am not saying they don't or can't fail. :) Just that in my experience, over the past 18+ years of building computers (and not just for me), I have never had one fail. :) (In that same time frame I have had power supplies from a couple of other companies fail; with one company I had two fail on me.) 

    Note that unless you overload it, if it survives the first month, it is unlikely to fail for at least 3 years. :) 

    Post edited by DAZ_Spooky on
  • Szark Posts: 10,634
    edited March 2016

    So do the Antec PSUs have these Bronze, Gold, Platinum ratings?

     

    PS scrap that

    Post edited by Szark on
  • Szark Posts: 10,634

    Arrr too many to choose from LOL http://store.antec.com/power-supply/

     

  • Szark Posts: 10,634

    ok getting a handle on what is what. I am going to dig more before I buy. Thanks for the help folks.

  • hphoenix Posts: 1,335
    edited March 2016
    JCThomas said:

    Glad to see you're getting lots of helpful information.

    However, 280-300 watts is an extreme overestimate of your system's current power consumption. Not meaning to be a contrarian, but the 400W requirement listed on Nvidia's site is actually meant to be conservative, as are all of their guidelines. They're there in case you have a very poor quality PSU. Anything 80+ that's not some off-brand would be fine. If all you're going to add is the 960, anything over 500W would be extreme overkill.

    Always good to have room to grow, but if you're planning on sticking with the 960, no reason to spend the extra money on more watts. For some perspective, I recently built a gaming box, mostly just for the fun of the build, with an i5-4590, 128GB SSD, 2TB HDD, 8GB RAM, and a GTX 970 on a small-form factor 450W Silverstone PSU. It is more than enough.

    Your 3770 has a TDP of 77W. While TDP isn't an exact measure of how much power your CPU will consume, it's still pretty close. Even doubling TDP to estimate consumption, say 150W, would be very far on the safe side. Your system now is probably consuming somewhere around 100W at load.

    Even this would do, and only 20 USD after rebate:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16817139026

    Sorry, this is incorrect.  TDP is Thermal Design Power.  Not all power is dissipated thermally (though most is), and while the CPU may have a TDP of 77W on average, the peak consumed power can spike much higher under full load.  Don't believe me?  Hook up a watt-meter to the PS and watch the difference between idle and full bore on all cores.  And 77W is an average figure, not a peak.  Load power for a 3770K system was measured at 166W.  (see http://www.bit-tech.net/hardware/2012/04/23/intel-core-i7-3770k-review/8 )

    Furthermore, you don't want the PS to be running at full rating even at full load, as power supply efficiency and regulation drop dramatically once you are at 80% of rating.  Look at some good tech reviews, you'll find the graphs of regulation vs. load and efficiency vs. load all over the place.

    I've been building systems for almost 30 years.  I've also been an Electrical Engineer for about 20.  Trust me, for this system, you'll want the 750W.  Go below that and when you have it going full-blast on a heavy render, and something chugs the CPU and drives to full load......you'll get drop-outs, or even shutdowns/bluescreens.  Seen it happen ALL the time.

    AVERAGE users don't ever peg all their cores at once and max out the GPU.  So they don't require the kind of power that people doing rendering do.  Browsing the web, even most gaming, won't peg all the cores and the GPU simultaneously.  Heck, it's tough to do with a lot of rendering software.....but it does happen.  And when it does, having the system stay up and running, rather than blue-screen or shutdown, makes a big difference.

     

    Post edited by hphoenix on
  • Szark Posts: 10,634

    I am going for a 740W one.

    So what is the difference, in layman's terms, between this Bronze and Platinum efficiency? Is it just power consumption efficiency or something else? Big difference in price between the top and bottom. If I have to pay more for possible reliability then I will do so. I just want to make sure my not-so-healthy bank balance isn't going to waste. :)

  • hphoenix Posts: 1,335
    edited March 2016

    https://en.wikipedia.org/wiki/80_Plus

    This will give you the full info.  The TL;DR version:  80+ certification indicates a power supply will function at 80% efficiency over its full output range (tested at 20%, 50%, and 100% load.)  Bronze, Gold, Platinum, etc. are just higher levels of efficiency across the board.  Naturally, building a PSU to maintain high efficiency even at full load requires more and pricier components, so they tend to get more expensive.

     

    I personally don't think the difference between 80+ Bronze and the higher levels (Gold, Platinum, etc.) is worth the additional cost.  For 24/7 servers, sure.  For a consumer desktop, it's not really needed (unless it's only a couple of bucks' difference in price, then by all means get the better one...)
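    A sketch of what those tiers mean in practice: at a tier's minimum certified efficiency, the power drawn at the wall for a given DC load works out as below (115 V, non-redundant figures from the 80 Plus tables):

```python
# Minimum efficiency at 20% / 50% / 100% load for some 80 Plus tiers
# (115 V internal, non-redundant figures).
TIERS = {
    "80 Plus":          {0.20: 0.80, 0.50: 0.80, 1.00: 0.80},
    "80 Plus Bronze":   {0.20: 0.82, 0.50: 0.85, 1.00: 0.82},
    "80 Plus Gold":     {0.20: 0.87, 0.50: 0.90, 1.00: 0.87},
    "80 Plus Platinum": {0.20: 0.90, 0.50: 0.92, 1.00: 0.89},
}

def wall_draw(dc_load_watts, tier, load_fraction):
    """AC watts drawn at the wall for a DC load, at the tier's minimum efficiency."""
    eff = TIERS[tier][load_fraction]
    return dc_load_watts / eff

# Example: a hypothetical 450 W DC load sitting at 50% of a 900 W unit's rating.
for tier in ("80 Plus Bronze", "80 Plus Gold"):
    print(f"{tier}: {wall_draw(450, tier, 0.50):.0f} W at the wall")
```

    The difference between tiers is a few tens of watts on the electric bill and in waste heat, which is why the higher tiers mostly pay off on machines that run loaded around the clock.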

     

    Post edited by hphoenix on
  • nicstt Posts: 11,715
    Szark said:

    I am going for a 740W one.

    So what is the difference, in layman's terms, between this Bronze and Platinum efficiency? Is it just power consumption efficiency or something else? Big difference in price between the top and bottom. If I have to pay more for possible reliability then I will do so. I just want to make sure my not-so-healthy bank balance isn't going to waste. :)

    Greater efficiency; you save cash on your electric bills. There is also the possibility the components are better quality - no idea if they are, but it's a reasonable hope. You also get less wasted heat, as more of the input power is converted to the energy you need rather than to heat. I'm sure I saw something about that somewhere.

  • Szark Posts: 10,634
    edited March 2016

    It's sinking in finally. This PC is hardly ever off, and it's rendering overnight many nights of the week, so I have to give this some serious thought over the next week or two. But I can't really argue with 30 years' experience, hphoenix ;) I haven't had to worry about this sort of information in the past. A PSU blew, and I replaced it without knowing anything apart from the wattage. :)

    What I am doing is trying to get the max out of this two-year-old system. I just maxed out the RAM to 16GB, which makes a heck of a difference when rendering with Iray and Vue. :) Next is the PSU and then a 960, then a few months after that another 960. After that I'll save up for a new rig.

    nicstt "There is also the possibility the components are better quality - no idea if they are, but it's a reasonable hope".

     amen, yeah hope. LOL

    Post edited by Szark on
  • mjc1016 Posts: 15,001

    Another thing....generally there are better warranties on the 'upper' levels (Gold, Platinum), too.  Just looked at a few...Bronze was running around 3 yr, Gold at 5 yr, and Platinum at 10 yr (all EVGA ones); Antec seems to do 3 yrs on everything.
