OT: RTX 40 Series a Power-Hungry GPU? Update: 4090 Ti, Release Date?

Ghosty12 Posts: 1,982
edited May 2022 in The Commons

Well it seems that the leaks and rumors of the next-gen RTX 40 series are out. The rumors abounding suggest that these new cards might be very power hungry, even more so than the 3090. But as with all rumors, take it with a truckload of salt until we see actual specs. If true, though, that will be one big yikes for people's power bills.

And another rumor is that these new cards might be announced in the third quarter of this year; again, only time will tell.

https://videocardz.com/newz/some-nvidia-geforce-rtx-40-graphics-cards-with-ad102-gpu-are-now-rumored-to-consume-more-than-450-watts

Update: Rumors running rampant now are that there is a 4090 Ti being tested that supposedly has a 900 watt TGP.

https://www.pcgamer.com/if-a-900w-nvidia-geforce-card-appears-in-the-next-generation-the-gpu-industry-has-lost-its-head/

Update 1: Rumors are now abounding that Nvidia might be releasing the 40 series cards around July/August.

https://videocardz.com/newz/nvidia-geforce-rtx-40-series-now-rumored-to-launch-in-early-q3


Comments

  • Taoz Posts: 9,732

    A GTX 1070 uses 110W; if a 450W card renders 5 times faster, it's actually cheaper to use.
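    To make that per-render cost argument concrete, here is a minimal sketch (the $0.15/kWh electricity price and the render times are illustrative assumptions, not figures from this thread):

    ```python
    # Energy per finished render = power (W) x render time (h), converted to kWh.
    # Assumed numbers for illustration; substitute your own card's wattage,
    # render time, and local electricity price.
    PRICE_PER_KWH = 0.15  # assumed electricity price in $/kWh

    def cost_per_render(power_watts, render_hours):
        kwh = power_watts * render_hours / 1000
        return kwh * PRICE_PER_KWH

    # A 110W card needing 5 hours vs. a 450W card that is 5x faster (1 hour):
    print(cost_per_render(110, 5.0))  # ~$0.083 per render
    print(cost_per_render(450, 1.0))  # ~$0.068 per render -> cheaper despite higher wattage
    ```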

  • daveso Posts: 6,438

    if the pricing is anything like the current generation's, I'll still be using my 2070

  • TheMysteryIsThePoint

    Taoz said:

    A GTX 1070 uses 110W; if a 450W card renders 5 times faster, it's actually cheaper to use.

    The rumor is that the 4090 Ti, or whatever it'll be called, will be 800W+. Your logic is still valid, but many people dreaming about dual configurations have been shot down by this, as commercial off-the-shelf PSUs wouldn't be able to support that, and if you live in the US, neither would your house's wiring.

  • outrider42 Posts: 3,679
    edited February 2022

    There is no way Nvidia would ship a consumer GPU that uses 800W. This rumor is straight up false. If Nvidia has a product that uses 800W, it is a special product designed for datacenter use that has to be special ordered.

    Any modern home in the US can handle 800W. 1500W electric heaters are common. Most heaters have 2 or 3 modes, 500W, 1000W, and 1500W. So 800W would be like low to medium heat from a space heater. That is how much heat such a product could produce. 

    However, the other rumors of 400W or 450W are totally possible, and actually could be very true. Remember Nvidia is supposed to launch a 3090 Ti; they officially announced it in January, but then went silent. Speculation is that the 3090 Ti is in fact TOO hot, using 400+ watts, and that the boards cannot handle the heat produced. If this is true, this alone shoots down any possible 800W card. If they can't control a 400W card, they certainly cannot control an 800W one. My personal opinion is that Nvidia is using the 3090 Ti as a test to see just how much is possible, and, if it releases, whether consumers are willing to buy a 450W card.

    Keep in mind you can always tune your GPUs to use less energy, a lot less, even. It is not like you have to run a 3090 at 350W or 400W all the time.
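    For reference, the board power limit on NVIDIA cards can be lowered from the command line with nvidia-smi; a rough sketch is below (the 280 W value is just an example, the call needs admin rights, and the limit must stay within the card's supported min/max range):

    ```python
    import subprocess

    # Show the current, default, and min/max enforceable power limits.
    subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

    # Lower the board power limit to 280 W (example value; run with admin rights,
    # and keep it within the limits reported by the query above).
    subprocess.run(["nvidia-smi", "-pl", "280"], check=True)
    ```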

    Competition is driving this. Nvidia wants to have the fastest GPU, always, and they are willing to go to great lengths to do that. AMD might also push wattage up next gen in an attempt to do the same. They want that performance crown, too; it has been a very long time since AMD has been able to claim it. Owning the performance crown is big for marketing. People buying cards in lower tiers might buy AMD or Nvidia simply because "they make the fastest cards", even if that particular SKU is beaten by a similarly priced model from the other company. It might sound silly, but this really does happen. Nvidia has built huge brand power in large part because they have been on top for quite a while now.

    But don't expect all the cards to be that crazy. The mid tier and low end cards will not be like that. You might see a small bump up in a particular segment.

    BTW, the 1070 does around 2.5 iterations per second in the Iray benchmark scene run in the forums. The 3090 can do around 20 iterations per second. That is a factor of 8, and the 3090 is a 350W card. At least my Founders Edition is; some 3090s do go higher. Either way, that is 8 times the performance for a little over 3 times the power draw.

    The 3060 gets about 8.2 iterations per second in that test.  The 3060 uses about the same power as the 1070, 115 to 120W I believe. So over 3.3 times the performance for basically the same power.

    The 4000 series is supposedly going to be 2 times faster in general, with a 2.5 times boost to ray tracing. If those gains are actually achieved while only pushing power up a bit, you can make some rough guesses. So the "4090" might do 40 iterations per second, or more. And a potential "4060" could do 16 iterations per second, or more. If the 4060 can do that while using, say, 150W, well, that would be pretty sweet in my book. A 4070 might be as fast as the 3090. You might think this sounds crazy, but the 1070 is as fast as the 980 Ti, the fastest previous-gen card. So Nvidia has actually done this before.
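    One way to frame those numbers is iterations per second per watt; the sketch below uses the benchmark figures quoted above, while the 40 series rows are pure speculation based on the rumored uplift:

    ```python
    # Iterations/second figures quoted above for the forum Iray benchmark scene.
    # The "(guess)" entries simply apply the rumored ~2x generational uplift;
    # they are speculation, not measurements.
    cards = {
        "GTX 1070":         (2.5, 110),   # (iterations/s, board power in W)
        "RTX 3060":         (8.2, 120),
        "RTX 3090":         (20.0, 350),
        "RTX 4060 (guess)": (16.0, 150),
        "RTX 4090 (guess)": (40.0, 450),
    }

    for name, (its, watts) in cards.items():
        print(f"{name:18} {its:5.1f} it/s  {watts:4d} W  {its / watts:.3f} it/s per W")
    ```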

    How is this possible now? Nvidia chose Samsung 8nm for Ampere (the 3000 series). This node is not as good as what TSMC offers; it is more like TSMC 10nm. Next gen, all signs point to Nvidia going back to TSMC and moving to 5nm. Node sizes are mostly marketing these days, but TSMC 5nm is truly about twice as dense as Samsung 8nm. Thus Nvidia would be able to pack almost twice as many transistors into the same space as they can with Ampere. That alone is a massive boost. Then Nvidia is going to try to push these chips hard, clocking them as high as possible; this is where the power draw comes into play. Just looking at it this way, it is easy to imagine how Nvidia could seriously more than double the performance of Ampere. AMD has their own trick, as they are expected to introduce multiple chiplets on their GPUs, like they do with Ryzen. If they achieve this, then they can really push performance up. All signs point to them doing just that, and they might even beat Nvidia in the process.

    That is just competition at work. Of course Iray cannot run on AMD GPUs, but the competition AMD creates still impacts the Nvidia GPUs you can buy. So it is always good to root for competition so consumers can benefit. Intel is finally getting into this game as well, but their cards will not have CUDA to run Iray, either. Still, more competition is better for us all. I really think Intel could shake the whole market up, at least in the low and mid range. Intel could eventually challenge AMD and Nvidia for the performance crown in a couple years.

    At any rate, the next generation of GPUs is set to be one of the biggest leaps yet. We can only hope that something like crypto doesn't get in the way, and it does seem like crypto is in decline for now. Will that hold up until the next gen releases? Who knows. Crypto is too unpredictable.

  • nonesuch00 Posts: 17,929

    TheMysteryIsThePoint said:

    Taoz said:

    A GTX 1070 uses 110W; if a 450W card renders 5 times faster, it's actually cheaper to use.

    The rumor is that the 4090 Ti, or whatever it'll be called, will be 800W+. Your logic is still valid, but many people dreaming about dual configurations have been shot down by this, as commercial off-the-shelf PSUs wouldn't be able to support that, and if you live in the US, neither would your house's wiring.

    Which means it's not going to happen. Thanks for the info, not that I expect to get a 4000 series card either.

  • kyoto kid Posts: 40,575

    ...they'll all be unaffordable once they hit the market and get snapped up by speculators again. I have my 3060, I'm fine for a while.

  • marble Posts: 7,449

    One can only hope that Nvidia does not push Iray to greater limits requiring these new monster cards. I have a 3090 and the heat coming from the fans around my legs is considerable. I think many will be hopeful of alternatives to Iray, and I hope that DAZ is not entirely tied to Nvidia. I will certainly never be able to replace the 3090 with a newer-generation top-of-the-line card. That was a fortunate one-off and will have to last for some years to come.

  • kyoto kid Posts: 40,575

    ...well there is Blender Cycles which isn't Nvidia dependent and we do have the bridge.

    About the only other hope is someone in the PC realm (like AMD) pushes ARM technology to where it can perform as well as (or better than) a GPU card for rendering.  Unfortunately for now the main focus seems to be on mobile systems, not desktops.

  • alexhcowley Posts: 2,310

    I have a top-end 3080 Ti which requires 400 watts at full throttle.  In this context, I find it hard to believe that the next generation cards will have twice the power requirements.

    Cheers,

    Alex. 

  • Ghosty12 Posts: 1,982

    kyoto kid said:

    ...they'll all be unaffordable once they hit the market and get snapped up by speculators again. I have my 3060, I'm fine for a while.

    Same here, I am set for quite a while now with my 3060. :)

  • ColinFrench Posts: 635
    edited February 2022

    I was hoping the smaller node Nvidia will be using for these cards would lower the power requirement. I guess on an equal performance basis it might, but of course Nvidia would want to push the performance higher, otherwise they can't brag about it.

    I would probably be happy with a mid-tier card, say a 4070 sort of thing, since even that would be a big jump from my current 1080 Ti. That should have more reasonable power requirements. But the problem is that mid-tier cards typically have less VRAM as well, and that's where it gets sticky for us Daz'ers in our weird little niche. I definitely don't want to go lower than my current 11GB. We'll see.

  • Torquinox Posts: 2,559

    It doesn't matter anyway. A lot of people here likely won't be getting one due to cost/availability issues.

  • Ghosty12 Posts: 1,982

    Added an update in my OP on a rumored 4090 Ti in testing that supposedly has a 900 watt TGP.

  • outrider42 Posts: 3,679

    Again, there is no way they release a 900 Watt card to the consumer market. If there is any truth to the rumor (it is important to remember these are just rumors), then the product being tested is some other product not intended for consumers.

    If a card actually uses 900 Watts, then your PC is using well over 1000 Watts. That will blow breakers in a lot of homes. There will be fires, as numerous fires get started by space heaters running on poorly maintained electrical outlets. It would be suicide for Nvidia to release a 900 Watt card to consumers. Even the super mega Hopper H100 does not use that much power.

    The 4090 may indeed use 600 Watts according to MLID. That still seems crazy, but again we need to remember that the 3090 was pitched as a creator product first. Yes, Nvidia talked gaming, but the first words about the 3090 were how it was intended for creators who needed the extra memory. Sounds like us.

    So how do you pitch a 600 Watt card? Actually, it is very easy. If the new card increases ray tracing 2.5x and normal performance by 2x, you have a simple answer. You pitch your 600 Watt 4090 as being faster than two 3090s in SLI while actually using less energy. After all, two 3090s would be 700 Watts minimum. Plus the 4090 costs less than buying two 3090s; it is a win-win, LOL. There's your sales pitch.

    Pay me, Jensen!

    But seriously, that is one way to explain these values. The top products are not intended for gamers, who do not need 24GB of VRAM anyway. The 3090 never made sense to gamers, offering just 10% more performance than the 3080 while priced at DOUBLE the cost. The 4090 will be the same way; it is not really for gamers. Certainly some high rollers will buy it because they have to have the fastest card...but if AMD scores a win as some predict, then the fastest card would actually be AMD's top card, not the 4090.

  • PerttiA Posts: 9,471

    outrider42 said:

    Again, there is no way they release a 900 Watt card to the consumer market. If there is any truth to the rumor (it is important to remember these are just rumors), then the product being tested is some other product not intended for consumers.

    If a card actually uses 900 Watts, then your PC is using well over 1000 Watts. That will blow breakers in a lot of homes. There will be fires, as numerous fires get started by space heaters running on poorly maintained electrical outlets. It would be suicide for Nvidia to release a 900 Watt card to consumers. Even the super mega Hopper H100 does not use that much power.

    I still can't understand how you still have such problems with electricity in the 21st century...

  • prixat Posts: 1,585

    PerttiA said:

    outrider42 said:

    Again, there is no way they release a 900 Watt card to the consumer market. If there is any truth to the rumor (it is important to remember these are just rumors), then the product being tested is some other product not intended for consumers.

    If a card actually uses 900 Watts, then your PC is using well over 1000 Watts. That will blow breakers in a lot of homes. There will be fires, as numerous fires get started by space heaters running on poorly maintained electrical outlets. It would be suicide for Nvidia to release a 900 Watt card to consumers. Even the super mega Hopper H100 does not use that much power.

    I still can't understand how you still have such problems with electricity in the 21st century...

    It's a bit of a U.S.-specific problem, due to their historically poor choice of electrical standards. Max draw from a U.S. domestic socket is 1500W; max from a UK socket is 3000W.
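    Those socket figures are essentially just voltage times current; a quick sketch works them out (the 80% continuous-load derating used for the US is the usual NEC rule of thumb, assumed here rather than stated in the post):

    ```python
    # Rough outlet headroom: volts x amps, with the US figure derated to 80%
    # for continuous loads (common NEC rule of thumb; an assumption, not from the post).
    us_continuous = 120 * 15 * 0.8   # 1440 W, commonly rounded to ~1500 W
    uk_socket = 230 * 13             # 2990 W, commonly rounded to ~3000 W
    print(us_continuous, uk_socket)
    ```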

  • Ghosty12 Posts: 1,982

    Just added another update to the ongoing saga of the 40 series.

  • namffuak Posts: 4,063

    Take the power requirements with a (fairly large) grain of salt. I'm running a long render right now and took a snapshot:

    1080 TI 97% GPU load, board power draw 186.1 W

    3060     95% GPU load, board power draw 129.4 W

    CPU    100% load, 99.1 W (not part of the render - running handbrake converting some avi video to mp4)

    APC PowerChute monitor reports total power draw is 583 W; this will be everything - including the 980 Ti driving the monitors, the six case fans, the CPU fan, five internal SSD drives, spinning power for seven external USB drives, my fiber interface box, and my wifi router. Both GPUs are voltage (VRel) limited. The monitors, network switch, and my laptop are on a different UPS.
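    For anyone who wants to take the same kind of snapshot, nvidia-smi can report per-board power draw and load directly; here is a minimal sketch (it assumes the NVIDIA driver's nvidia-smi tool is on the PATH):

    ```python
    import subprocess

    # One-off snapshot of GPU name, utilization, board power draw, and power limit,
    # similar to the readings quoted above.
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,utilization.gpu,power.draw,power.limit",
         "--format=csv"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)
    ```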

  • kyoto kid Posts: 40,575

    prixat said:

    PerttiA said:

    outrider42 said:

    Again, there is no way they release a 900 Watt card to the consumer market. If there is any truth to the rumor (it is important to remember these are just rumors), then the product being tested is some other product not intended for consumers.

    If a card actually uses 900 Watts, then your PC is using well over 1000 Watts. That will blow breakers in a lot of homes. There will be fires, as numerous fires get started by space heaters running on poorly maintained electrical outlets. It would be suicide for Nvidia to release a 900 Watt card to consumers. Even the super mega Hopper H100 does not use that much power.

    I still can't understand how you still have such problems with electricity in the 21st century...

    It's a bit of a U.S.-specific problem, due to their historically poor choice of electrical standards. Max draw from a U.S. domestic socket is 1500W; max from a UK socket is 3000W.

    ...add to that homes and apartments with older wiring.  Lived in one place that was still "knob & tube" with the old-fashioned fuse box.

  • Chumly Posts: 793

     

    prixat said:

    It's a bit of a U.S.-specific problem, due to their historically poor choice of electrical standards. Max draw from a U.S. domestic socket is 1500W; max from a UK socket is 3000W.

    My experience is different.

    It's not "just" a US problem.  I have lived in the US (30 yrs), the UK (5 yrs), and in Germany (20 yrs), and all 3 champions of the West can have crappy house wiring.  My last house (DE) would blow fuses if my computer and printer were plugged into an outlet on the same wall (among other electrical problems); my current DE house has many rooms with just one electrical outlet... sheesh.  In my last house in the UK, I had to be cognisant of the wattage of my kitchen mixer, and often had to turn everything off in the kitchen just to use the mixer.  Maybe that was a wiring issue specific to the house... but still.  And gadzooks, I don't even want to begin discussing the state of UK plumbing (mixer taps?  water pressure?... ugghhh).



    My US electrical problems were centered around the ages of the houses I lived in.  There was a spate of time in the 60s/70s when aluminum wiring was used.  So no, in my experience, it "ain't" just the US.

  • nonesuch00 Posts: 17,929

    I was offered an RTX 3000 series card as a gift yesterday, but I declined and said I am saving for an RTX 4000 series. I did thank them though. Never thought I'd be turning down a free RTX 3000 GPU, but it would have been a waste of money at this point.

  • LeatherGryphon Posts: 11,173

    And watch out for places built in the late middle part of the last century that had aluminum electrical wires in the house.  Very subject to corrosion and subsequent overheating or sparking at the connection points.  Bad, bad, bad!

  • Ghosty12 Posts: 1,982

    This unfortunately will be a big concern for those living in houses with really old electrical wiring that is not up to the newer housing codes in various countries.

  • LeatherGryphon Posts: 11,173
    edited May 2022

    Most of my life I've lived in older houses.  But my father was handy with household repairs, and he and some friends (one was a qualified electrician) rewired our big house over the years to remove the knob & tube original wiring and replace it with 12-gauge Romex (standard copper household 2-conductor w/ground), along with a new distribution panel and proper 3-prong wall outlets.  The improvement came over many years, though, and at some point we learned that the switch that sent power out to the detached garage was somehow boogered, because the rest of the house would flicker if the garage was powered.  One learned not to go out to the garage at night, until the rickety old garage was finally torn down and replaced with a new one, properly wired.  Yay!

    The place I currently live in is another of the 100+ year old houses in this town, but it has at some point at least been upgraded to semi-modern wiring. Still, I don't have any properly grounded outlets in my computer area.  A couple of outlets have 3-prong sockets but I don't believe they are properly grounded.  My little circuit tester lights its red light indicating "improper ground".  But I haven't been fatally shocked yet.  Wheee... will try harder in the future.

  • Seven193 Posts: 1,064
    edited May 2022

    I haven't looked at PSUs in a while, but there apparently have been improvements.

    The RTX 3090 Ti already requires 3 PCIe 8-pin cables for power. Its max power is 450W.  One 8-pin PCIe cable is safely rated for 150W, so 3*150 = 450.  For a 900W card, that would require 6 PCIe 8-pin connectors.  That's too many.  But Nvidia and PSU makers have now upgraded to 12-pin PCIe connectors.

    1000W PSUs are already coming with 12-pin PCIe cables that can safely deliver around 450W-600W, so a 900W card would presumably require two 12-pin PCIe connectors.  So, if you intend to get a RTX 4090 card, you're gonna need a beefy PSU with these new 12-pin PCIe connectors.
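    A small sketch of that connector math is below (the per-connector ratings are the commonly cited figures from this post; actual limits depend on the specific card and PSU):

    ```python
    # Commonly cited power budgets per source, in watts (figures from the post;
    # real limits depend on the card's BIOS and the PSU).
    PCIE_SLOT = 75    # power available from the PCIe slot itself
    EIGHT_PIN = 150   # per 8-pin PCIe cable
    TWELVE_PIN = 600  # upper end quoted for the new 12-pin cable

    def board_budget(eight_pin=0, twelve_pin=0, slot=True):
        return (PCIE_SLOT if slot else 0) + eight_pin * EIGHT_PIN + twelve_pin * TWELVE_PIN

    print(board_budget(eight_pin=3))   # 3090 Ti style: 75 + 3*150 = 525 W
    print(board_budget(twelve_pin=2))  # hypothetical 900W card: 75 + 2*600 = 1275 W
    ```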

    For me, I'm happy with what I've got.  I don't see myself going beyond 1000W, because I like my microwave in the kitchen, and not in my computer. :)
     

  • oddbob Posts: 348

    Seven193 said:

    The RTX 3090 Ti already requires 3 PCIe 8-pin cables for power. Its max power is 450W.  One 8-pin PCIe cable is safely rated for 150W, so 3*150 = 450.

    There's another 75W from the slot as well. The Strix 3090 could pull 480W maxed out; apparently the Ti is over 500.

  • oddbob Posts: 348

    namffuak said:

    Take the power requirements with a (fairly large) grain of salt. I'm running a long render right now and took a snapshot:

    1080 TI 97% GPU load, board power draw 186.1 W

    3060     95% GPU load, board power draw 129.4 W

    The cards use more power while gaming, especially if ray tracing and upscaling are in use. My 3090 uses a fairly consistent 340W while gaming; 500W-plus 40 series cards are a real possibility.

  • namffuak Posts: 4,063

    oddbob said:

    namffuak said:

    Take the power requirements with a (fairly large) grain of salt. I'm running a long render right now and took a snapshot:

    1080 TI 97% GPU load, board power draw 186.1 W

    3060     95% GPU load, board power draw 129.4 W

    The cards use more power while gaming, especially if ray tracing and upscaling are in use. My 3090 uses a fairly consistent 340W while gaming; 500W-plus 40 series cards are a real possibility.

    Quite possibly - but for those of us who are not gamers, the power requirements are apparently overstated. FWIW, the last games I tried out were Zork, Myst, and Doom II - not necessarily in that order - and I pretty much lost interest in them an hour or so in.

  • daveso said:

    if the pricing is anything like the current generation's, I'll still be using my 2070

    I think the prices will be nuts until next year.  Yet another mining bubble has burst, so it's possible we won't see a single mining operation in Iran buying 1,000 cards, which will help consumer prices.  Worst case scenario is a lot of 30xx series cards becoming available second hand.  But if the 40xx series is so much better (I saw the 4070 has a bus size convenient for 12GB, for example, rather than the 3070's 8GB), it may not be worthwhile buying a 30xx even if it is cheaper.

  • nonesuch00 Posts: 17,929

    LeatherGryphon said:

    And watch out for places built in the late middle part of the last century that had aluminum electrical wires in the house.  Very subject to corrosion and subsequent overheating or sparking at the connection points.  Bad, bad, bad!no

    I lived in a huge old farm house as a teenager that had an old metal medicine cabinet with a mirror, and there was a short in the lighting over that cabinet such that if you touched it you got a shock, and as I found out, if you got your whole hand on that metal, the alternating current going through your hand glued your hand to the cabinet. So there I was, looking in the mirror, not able to look away because my hand was glued to the cabinet by electricity. I felt like Peter Griffin, "Oh! I can't stop looking in the rearview mirror!" right before he crashes his car! I did manage to separate myself from the mirror, though, by throwing my weight away from my glued hand, pulling it free, as I had no control over that arm below the shoulder.
