RTX Pro 6000 (Blackwell) selling for $11,000+ (retail)

EZ3DTV Posts: 1,494
edited May 7 in The Commons

I got an email from Nvidia today stating that this card can now be purchased.

COPILOT: Yes! NVIDIA has released the RTX PRO 6000 Blackwell, which is built on the Blackwell architecture and features 96GB of GDDR7 memory. This workstation GPU is designed for AI workloads, content creation, and development, and is priced at over $11,000.

COMMENT: It seems the lag between when this hardware is introduced on the market and when Daz Studio 6 (2025) will be ready to utilize it is the issue here.

Software always lags new hardware, and this is no exception. Most new hardware launches with alpha-quality software, and it can take months or even years before the software catches up.

Do not confuse this card with the RTX 6000 built on the Lovelace architecture, which has 48 GB of memory.

This is the email I received from NVIDIA:

Now Available: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
Powering the next era of AI.
Starting today, professionals can experience unmatched performance and tackle the most demanding workflows with the NVIDIA RTX PRO™ 6000 Blackwell Workstation Edition GPU, now available from our channel partners.

The new RTX PRO 6000 Blackwell Workstation Edition, powered by the NVIDIA Blackwell architecture, offers unmatched speed, precision, and efficiency, enabling you to push the boundaries of what’s possible.

Find a Partner

Is anyone sorry they bought the RTX 5090? I'm glad I waited...

 

Post edited by EZ3DTV on

Comments

  • kyoto kid Posts: 41,889
    edited May 8

    ...I am looking at 96 GB as the basic RAM upgrade for my system (which can later be expanded to 192).

    96 GB of VRAM is insane. There would be little need to optimise anything with that much overhead. However, with an 11,000 USD price tag, yeah, just a bit out of our league. Also rather power hungry with a 600 W TDP.

    Crikey I'd be more than happy to have an RTX Pro 4000 with 24 GB.

    Post edited by kyoto kid on
  • WendyLuvsCatz Posts: 40,150

    I doubt DAZ studio will support it devil

  • kyoto kid Posts: 41,889

    ...yeah, from what I've gathered after reading a few reviews, it seems more suited for scientific/mathematical modelling and AI development than rendering Vicky in a temple wearing a chain mail bikini and wielding a sword.

    ...but it probably would only take a few seconds to render a NVIATWAS scene.

  • Daz Jack Tomalin Posts: 13,818
    edited May 8

    RexRed said:

    Is anyone sorry they bought the RTX 5090? I'm glad I waited...

    Why would anyone be sorry? I don't think there's many with that sort of money to spend on a GPU... the 5090s are probably out of a lot of people's budgets as it is.

     

    Post edited by Daz Jack Tomalin on
  • EZ3DTV Posts: 1,494

    kyoto kid said:

    ...I am looking at 96 GB as the basic RAM upgrade for my system (which can later be expanded to 192).

    96 GB of VRAM is insane. There would be little need to optimise anything with that much overhead. However, with an 11,000 USD price tag, yeah, just a bit out of our league. Also rather power hungry with a 600 W TDP.

    Crikey I'd be more than happy to have an RTX Pro 4000 with 24 GB.

    I have 24 GB of video RAM, and I run out in my Daz scenes all the time. An Ultra Scenery scene with camera culling turned off can pretty much do that alone.

    I am happy with 24 GB, but it does not fully meet my needs at all; I am constantly slimming down a scene to avoid the black screen.

    Is 96 GB enough? Well, I need to see. I have made scenes with 12 Gen 8/9 figures and had to turn off teeth and leave out many items just to get them to render.

    24 GB is actually a low bar when you want to make detailed scenes. And then there is animation that requires scenes with 360-degree panoramas and no culling. What about 8K skin textures and 24K HDRIs?

    With a Daz library of over 13 thousand assets and over a million objects, I can easily envision a scene that would utilize this library to a much larger extent.

    Would I need 96 GB? I am sure I would push scenes to this extent if it were available on my system.
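    As an aside, the texture math alone backs this up. A back-of-the-envelope sketch (the map counts, 8-bit RGBA assumption, and HDRI size below are illustrative; Iray's real footprint also includes geometry, compression, and working buffers):

```python
# Rough, illustrative VRAM math for uncompressed textures.
# Assumptions: 8-bit RGBA (4 bytes/pixel) unless stated, no mipmaps,
# no texture compression -- a real renderer will differ.

def texture_mb(width, height, channels=4, bytes_per_channel=1):
    """Uncompressed size of one texture map in MiB."""
    return width * height * channels * bytes_per_channel / 2**20

skin_8k = texture_mb(8192, 8192)  # one 8K RGBA map: 256 MiB

# A 24K x 12K equirectangular HDRI stored as 16-bit half floats, RGB:
hdri_24k = texture_mb(24576, 12288, channels=3, bytes_per_channel=2)  # 1728 MiB

# Twelve figures with ~6 8K maps each (diffuse, normal, roughness, ...):
scene_mb = 12 * 6 * skin_8k + hdri_24k
print(f"{scene_mb / 1024:.1f} GiB of textures alone")  # ~19.7 GiB
```

    So a dozen high-resolution figures plus one large HDRI already sit near a 24 GB card's ceiling before any geometry is counted.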

  • EZ3DTV Posts: 1,494
    edited May 8

    Daz Jack Tomalin said:

    RexRed said:

    Is anyone sorry they bought the RTX 5090? I'm glad I waited...

    Why would anyone be sorry? I don't think there's many with that sort of money to spend on a GPU... the 5090s are probably out of a lot of people's budgets as it is.

     

    I would spend $11,000 on a graphics card for Daz with 96 GB of RAM.

    I know people who are planning on buying 2x 5090s for their system; well, that is nearly $6,000.

    And then, what about AI using the graphics card to take your Daz items and turn them into animations and detailed scenes? This is just around the corner.

    This is where having multiple NVIDIA Pro 6000s in a system would be needed. And what about the billions Hollywood currently spends on 3D hardware? They would certainly utilize this technology to its fullest for the entertainment value.

    This all utilizes NVLink, which means pooling CUDA cores over multiple cards.

    I think these are overpriced, but three of them is about the price of a new car. Is that too much to spend on a robust 3D business?

    Daz should be able to employ this kind of hardware to future-proof their market share, including state-of-the-art ray tracing capabilities.

    Exporting Daz items to Omniverse is a real pain when they can simply remain in Daz Studio… Otherwise, Daz and Omniverse need true compatibility.

    If Nvidia wants to sell these cards to 3D artists, they need "a lot" more photorealistic content than they have…

    Post edited by EZ3DTV on
  • Taoz Posts: 10,266
    edited May 8

    kyoto kid said:

    Also rather power hungry with a 600 W TDP.

    If you use electric heating or some other heating at the same price, you can render as much as you like in winter, for free (all that power will turn into heat). In summer, OTOH, your air conditioning will eat up everything you saved during winter.
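    The winter trade-off is easy to put numbers on. A tiny sketch (the $0.30/kWh tariff is a made-up example rate; substitute your own):

```python
# Electricity cost of running a 600 W TDP card at full load.
# The tariff is an assumed example rate, not a quoted price.

TDP_KW = 0.600        # 600 W card at full load
PRICE_PER_KWH = 0.30  # assumed USD per kWh

def render_cost(hours, kw=TDP_KW, price=PRICE_PER_KWH):
    """USD cost of a render session; nearly all of it ends up as room heat."""
    return hours * kw * price

print(f"{render_cost(8):.2f} USD")  # an 8-hour overnight render: 1.44 USD
```

    In winter that 4.8 kWh of heat offsets electric heating watt for watt; in summer the air conditioning pays for it a second time.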

    Post edited by Taoz on
  • EZ3DTV Posts: 1,494
    edited May 8

    Taoz said:

    kyoto kid said:

    Also rather power hungry with a 600 W TDP.

    If you use electric heating or some other heating at the same price, you can render as much as you like in winter, for free (all that power will turn into heat). In summer, OTOH, your air conditioning will eat up everything you saved during winter.

    I am a music artist as well and I need absolute silence from computer fan noise. This is why I cut a hole in the wall and my computer towers are in a different room than my monitors and other equipment. I run long USB and HDMI cords along with Wi-Fi and Bluetooth when necessary. This way my vocals and other acoustic tracks are pristine with zero noise.

    So, a single AC unit cooling this "other room" in the summer is adequate for keeping the towers and their respective cards at reasonable temperatures. 

    Post edited by EZ3DTV on
  • Daz Jack Tomalin Posts: 13,818
    edited May 8

    RexRed said:

    Daz Jack Tomalin said:

    RexRed said:

    Is anyone sorry they bought the RTX 5090? I'm glad I waited...

    Why would anyone be sorry? I don't think there's many with that sort of money to spend on a GPU... the 5090s are probably out of a lot of people's budgets as it is.

     

    I would spend $11,000 on a graphics card for Daz with 96 GB of RAM.

    I know people who are planning on buying 2x 5090s for their system; well, that is nearly $6,000.

    And then, what about AI using the graphics card to take your Daz items and turn them into animations and detailed scenes? This is just around the corner.

    This is where having multiple NVIDIA Pro 6000s in a system would be needed. And what about the billions Hollywood currently spends on 3D hardware? They would certainly utilize this technology to its fullest for the entertainment value.

    This all utilizes NVLink, which means pooling CUDA cores over multiple cards.

    I think these are overpriced, but three of them is about the price of a new car. Is that too much to spend on a robust 3D business?

    Daz should be able to employ this kind of hardware to future-proof their market share, including state-of-the-art ray tracing capabilities.

    Exporting Daz items to Omniverse is a real pain when they can simply remain in Daz Studio… Otherwise, Daz and Omniverse need true compatibility.

    If Nvidia wants to sell these cards to 3D artists, they need "a lot" more photorealistic content than they have…

    I'm not suggesting there isn't a market for it - but it's not really here - and I say that as someone with a whole rack of 4090s/3090s.  Also, I don't believe they're really looking to sell these cards to 3D artists.. this is all AI use-case, I imagine.

    If you're looking for whether or not DS 2025 supports it - I don't know.  I would guess that it would, given it's the same family - but that's just my assumption, don't quote me in an official capacity on that.

    But again, if it doesn't support it (at this current stage at least) then I guess we'll be alienating a very very small number of users.

    Finally, personally speaking - there is no way I'd want to drop that amount of money on something with the reputation of those power connectors.  It's pretty much why I'm not pulling the trigger on upgrading.. I think 4 in a rack would just be a fire hazard.

    Post edited by Daz Jack Tomalin on
  • EZ3DTV Posts: 1,494

    Daz Jack Tomalin said:

    RexRed said:

    Daz Jack Tomalin said:

    RexRed said:

    Is anyone sorry they bought the RTX 5090? I'm glad I waited...

    Why would anyone be sorry? I don't think there's many with that sort of money to spend on a GPU... the 5090s are probably out of a lot of people's budgets as it is.

     

    I would spend $11,000 on a graphics card for Daz with 96 GB of RAM.

    I know people who are planning on buying 2x 5090s for their system; well, that is nearly $6,000.

    And then, what about AI using the graphics card to take your Daz items and turn them into animations and detailed scenes? This is just around the corner.

    This is where having multiple NVIDIA Pro 6000s in a system would be needed. And what about the billions Hollywood currently spends on 3D hardware? They would certainly utilize this technology to its fullest for the entertainment value.

    This all utilizes NVLink, which means pooling CUDA cores over multiple cards.

    I think these are overpriced, but three of them is about the price of a new car. Is that too much to spend on a robust 3D business?

    Daz should be able to employ this kind of hardware to future-proof their market share, including state-of-the-art ray tracing capabilities.

    Exporting Daz items to Omniverse is a real pain when they can simply remain in Daz Studio… Otherwise, Daz and Omniverse need true compatibility.

    If Nvidia wants to sell these cards to 3D artists, they need "a lot" more photorealistic content than they have…

    I'm not suggesting there isn't a market for it - but it's not really here - and I say that as someone with a whole rack of 4090s/3090s.  Also, I don't believe they're really looking to sell these cards to 3D artists.. this is all AI use-case, I imagine.

    If you're looking for whether or not DS 2025 supports it - I don't know.  I would guess that it would, given it's the same family - but that's just my assumption, don't quote me in an official capacity on that.

    But again, if it doesn't support it (at this current stage at least) then I guess we'll be alienating a very very small number of users.

    Finally, personally speaking - there is no way I'd want to drop that amount of money on something with the reputation of those power connectors.  It's pretty much why I'm not pulling the trigger on upgrading.. I think 4 in a rack would just be a fire hazard.

    Unofficial 12V-2x6V Power Connector Melts NVIDIA GeForce RTX 5090

    by AleksandarK, Feb 10th, 2025 10:50

    NVIDIA's high-TDP flagship GPU, the GeForce RTX 5090, appears to cause additional headaches for users, not including the high power bill. According to a Reddit user, we now have the first documented case of a melted power connector on NVIDIA's flagship GeForce RTX 5090 Founders Edition, reigniting concerns over high-wattage GPU safety from the last generation. While playing Battlefield 5, Reddit user ivan6953 detected a burning odor and immediately shut down their system, only to discover severe damage to both the RTX 5090's 12V-2×6 connector and their ASUS ROG Loki SFX-L PSU. The user had employed a Moddiy 12VHPWR cable, marketed as ATX 3.0/PCIe 5.0-compliant and rated for RTX 5090's 600 watts of power. Despite claims of secure installation—audible clicks at both ends—the cable melted at 500-520 W load, charring connectors on the GPU and PSU.

    Notably, the same cable had powered an RTX 4090 FE for two years without issue. NVIDIA's RTX 5090 FE ships with a redesigned adapter featuring a longer, more flexible cable and an angled connector to reduce strain in compact builds. NVIDIA asserts that no incidents have occurred with its bundled adapter, emphasizing compliance with the updated 12V-2×6 standard, which shortens sensing pins to prevent power flow if connections loosen. Hence, an older connector cannot provide 100% secure usage despite the user thinking that the sensing pins are touching properly.

     A second case reported by Spanish YouTuber Toro Tocho involved a melted PSU-side connector, though the GPU remained undamaged. This resulted from worn connectors or improper seating despite user assurances of correct installation. RTX 5090 owners should avoid third-party adapters and rely solely on NVIDIA's included cable or PSU-native connectors. Regular inspections for discoloration, wear, or bending near connectors are also advised, particularly for systems with high power consumption. While NVIDIA investigates, the takeaway is clear: cutting corners on power delivery risks costly hardware failures. The RTX 5090's melting woes appear isolated to unofficial accessories—a small solace for early adopters navigating the pitfalls of the high-TDP GPU era.
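    For context on why reviewers scrutinize these margins, here is the basic per-pin arithmetic (the 9.5 A per-pin rating is the figure commonly cited for these terminals; treat it as an assumption):

```python
# Per-pin current for a 600 W card over the 12V-2x6 connector,
# assuming a perfectly balanced load across its six 12 V conductors.

POWER_W = 600
VOLTS = 12.0
PINS = 6            # six current-carrying 12 V pins
PIN_RATING_A = 9.5  # commonly cited per-pin rating (assumption)

total_a = POWER_W / VOLTS       # 50.0 A total
per_pin_a = total_a / PINS      # ~8.33 A per pin
headroom = 1 - per_pin_a / PIN_RATING_A
print(f"{per_pin_a:.2f} A per pin, {headroom:.0%} headroom")
```

    Roughly 8.3 A per pin against a 9.5 A rating leaves only about 12% headroom, which is why any imbalance or poor contact matters so much at this wattage.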

     

  • Daz Jack Tomalin Posts: 13,818

    Here's one example where that is discussed - by someone far more experienced than me - and again, personally speaking, the whole spec is just too close for comfort, especially in a render server environment where it would be under load almost 24/7. On cards that expensive it should never even be a possibility, even with 'unofficial accessories', which is really grasping at straws given the history of the companies who make 3rd party cables.

  • Squishy Posts: 710

    Trying to imagine what it would feel like to have eleven thousand dollars I could spend on a video card cool

  • WendyLuvsCatz Posts: 40,150

    Squishy said:

    Trying to imagine what it would feel like to have eleven thousand dollars I could spend on a video card cool

    none of my cars even cost that much when I used to drive surprise 

  • EZ3DTV Posts: 1,494

    Daz Jack Tomalin said:

    Here's one example where that is discussed - by someone far more experienced than me - and again, personally speaking, the whole spec is just too close for comfort, especially in a render server environment where it would be under load almost 24/7. On cards that expensive it should never even be a possibility, even with 'unofficial accessories', which is really grasping at straws given the history of the companies who make 3rd party cables.

    Question to Copilot:

    How many 5090s have reportedly had melted cables?

    Answer:

    There have been a few reports of RTX 5090 GPUs experiencing melted power cables, but the exact number isn't clear. One of the first credible cases involved a user playing Battlefield 5 when they noticed a burning smell—turns out their RTX 5090 FE’s power connector and cable had melted. Another report suggests that MSI’s yellow-tipped 12V-2x6 cable, designed to prevent melting, still failed. However, some investigations have debunked certain claims, pointing to prior RTX 4090 FE cabling issues rather than a flaw in the 5090 itself.

    It seems like third-party cables and improper connections might be contributing factors. Nvidia and its partners have introduced safety features like angled connectors and warning lights to reduce the risk.

    Question to Copilot:

    How many 5090s have been sold to date?

     Answer: 

    Exact sales numbers for the RTX 5090 haven't been officially disclosed, but tracking sites suggest it's been in and out of stock frequently. The average selling price on eBay has reportedly dropped to $4,000, indicating that supply is stabilizing.

    Comment: I would take a guess that more than a thousand are currently in operation. 

     

  • background Posts: 592
    edited May 8

    You can trust Nvidia.

    Nvidia.jpg
    1158 x 554 - 67K
    Post edited by background on
  • Squishy Posts: 710

    I like that people treat answers given by LLMs as authoritative and trustworthy, lol we are so absolutely cooked

  • Daz Jack Tomalin Posts: 13,818

    Heh yea, I thought it was angled connectors which were recalled in the first place over safety fears

  • EZ3DTV Posts: 1,494

    background said:

    You can trust Nvidia.

    Copilot is Microsoft's AI. Copilot is often wrong, but is more often very helpful and accurate.

    It seems the info I have posted here is accurate.

    What I do not trust is YouTube videos with obvious clickbait titles and only one example to back them up.

  • background Posts: 592

    As far as I know, a major contributing factor is that Nvidia removed the load-balancing circuitry which, on pre-40xx cards, would not allow the GPU to power up unless there was roughly similar current on each of the 12 V conductors between the PSU and GPU. As they are now, the GPU will power up if just one of the 12 V conductors is connected properly.
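    To illustrate that point (with made-up contact resistances, purely for the shape of the effect): parallel conductors share current by Ohm's law, so one degraded contact silently shifts its load onto the others.

```python
# Current sharing across parallel 12 V conductors with no load balancing.
# Resistance values are illustrative, not measured.

def per_wire_current(total_a, resistances_mohm):
    """Split a total current across parallel branches by conductance."""
    conductances = [1.0 / r for r in resistances_mohm]
    total_g = sum(conductances)
    return [total_a * g / total_g for g in conductances]

TOTAL_A = 50.0  # 600 W / 12 V

# Healthy connector: six similar ~5 mOhm contacts -> ~8.3 A each.
healthy = per_wire_current(TOTAL_A, [5.0] * 6)

# One worn contact at 50 mOhm: that wire nearly drops out, and the
# remaining five carry ~9.8 A each -- a condition the load-balancing
# circuitry on older cards would have refused to power up under.
worn = per_wire_current(TOTAL_A, [50.0, 5.0, 5.0, 5.0, 5.0, 5.0])
print([f"{a:.2f}" for a in worn])
```

    With no per-wire sensing, nothing on the card notices the shift until a pin overheats.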

  • EZ3DTV Posts: 1,494
    edited May 8

    background said:

    You can trust Nvidia.

    According to YouTube my 4090 was supposed to burn up.

    It is going strong with not one single issue, and I have played the most demanding games, sometimes for up to 12 hours, and put it through massive rendering projects.

    There are lots of issues with them melting, but mine is fine. I do not overclock.

    I am very interested in what a 5090 will do with Daz 2025 once the scripts and utilities are restored.

    It would be nice to see the power connector on the 5090 split into three or four (dividing off the power), along with the connector on the Pro 6000.

    The very purpose of Daz 2025 is the new Blackwell cards and, hopefully, a new NVIDIA rendering engine.

    Post edited by EZ3DTV on
  • EZ3DTV Posts: 1,494
    edited May 8

    background said:

    As far as I know, a major contributing factor is that Nvidia removed the load-balancing circuitry which, on pre-40xx cards, would not allow the GPU to power up unless there was roughly similar current on each of the 12 V conductors between the PSU and GPU. As they are now, the GPU will power up if just one of the 12 V conductors is connected properly.

    Copilot:

    Nvidia's RTX 40-series GPUs reportedly lack the load-balancing circuitry found in previous generations, which ensured equal current distribution across all 12 V conductors. This change means that even if only one 12 V conductor is properly connected, the GPU can still power up, potentially leading to uneven power draw and a higher risk of overheating.

    Some discussions suggest that PCI-SIG is working on a redesign to address these concerns, possibly by modifying the sense pins to enforce a fully inserted connection before power is delivered. Additionally, PWM controllers could be used to cut power if unsafe conditions are detected.

    Question: What is PCI-SIG?

    Copilot answer:

    PCI-SIG (Peripheral Component Interconnect Special Interest Group) is a nonprofit consortium responsible for developing and maintaining the PCI, PCI-X, and PCI Express (PCIe) standards. Founded in 1992, PCI-SIG ensures industry-wide compatibility for peripheral component interconnects, allowing seamless communication between processors and expansion devices.

    The organization is headquartered in Beaverton, Oregon, and has over 800 member companies, including AMD, Intel, Nvidia, and Qualcomm. PCI-SIG regularly updates specifications to improve data transfer speeds, power efficiency, and interoperability.

    Comment: I assume that third-party Nvidia card manufacturers are already addressing these issues on their own until a general solution is presented.

    Or, just make sure both are firmly plugged in.

     

     

    Post edited by EZ3DTV on
  • Greymom Posts: 1,140
    edited May 8

    $6,000!  (Or worse, $11,000!)  Ack!  Way too rich for my blood.  Got two good used RTX 3090s for $1,700 (for both) from eBay.  Prices have been drifting up recently, closer to $1,000 each.

    Post edited by Greymom on
  • Greymom Posts: 1,140

    NVIDIA's new "AI Agent" workstation chipset is capable of ONE PETAFLOP of throughput (4-bit). The estimated cost of the workstation (no monitor, etc.) is $3,000. At the moment it will only run their AI OS, but there are rumors it will be adapted as a PC workstation later. It is about the size of a brick. Now if we could just run IRAY or CYCLES....

  • EZ3DTV Posts: 1,494

    Greymom said:

    NVIDIA's new "AI Agent" workstation chipset is capable of ONE PETAFLOP of throughput (4-bit). The estimated cost of the workstation (no monitor, etc.) is $3,000. At the moment it will only run their AI OS, but there are rumors it will be adapted as a PC workstation later. It is about the size of a brick. Now if we could just run IRAY or CYCLES....

    Yes! Exactly! There is a new (Nvidia) PC revolution and let's hope Daz Studio (and our assets) are firmly along for the ride. 2025(V6) is a perfect vehicle for this!  

  • Richard Haseltine Posts: 108,209

    The very purpose of Daz 2025 is the new blackwell cards and hopefully, a new NVIDIA rendering engine.  

    Not the only purpose, no - it is a general application update. The release as an alpha may be related, but even then, as a new major version, they might want earlier testing than with minor versions, so it may well not be the only factor.

  • Greymom Posts: 1,140

    I am hoping that we will eventually be able to use rented time on Blackwell (or GH-series) NVIDIA servers to run renders online.

  • NylonGirl Posts: 2,222

    WendyLuvsCatz said:

    Squishy said:

    Trying to imagine what it would feel like to have eleven thousand dollars I could spend on a video card cool

    none of my cars even cost that much when I used to drive surprise 

    I would probably get a new floor or new roof.

  • Masterstroke Posts: 2,313

    kyoto kid said:

    ...yeah, from what I've gathered after reading a few reviews, it seems more suited for scientific/mathematical modelling and AI development than rendering Vicky in a temple wearing a chain mail bikini and wielding a sword.

    ...but it probably would only take a few seconds to render a NVIATWAS scene.

    Nah, that is for old Poser.
    For DAZ Studio, it is:
    Pixie On A Mushroom In The Middle Of A Forest
    POAMITMOAF

  • kyoto kid Posts: 41,889
    edited May 9

    RexRed said:

    kyoto kid said:

    ...I am looking at 96 GB as the basic RAM upgrade for my system (which can later be expanded to 192).

    96 GB of VRAM is insane. There would be little need to optimise anything with that much overhead. However, with an 11,000 USD price tag, yeah, just a bit out of our league. Also rather power hungry with a 600 W TDP.

    Crikey I'd be more than happy to have an RTX Pro 4000 with 24 GB.

    I have 24 GB of video RAM, and I run out in my Daz scenes all the time. An Ultra Scenery scene with camera culling turned off can pretty much do that alone.

    I am happy with 24 GB, but it does not fully meet my needs at all; I am constantly slimming down a scene to avoid the black screen.

    Is 96 GB enough? Well, I need to see. I have made scenes with 12 Gen 8/9 figures and had to turn off teeth and leave out many items just to get them to render.

    24 GB is actually a low bar when you want to make detailed scenes. And then there is animation that requires scenes with 360-degree panoramas and no culling. What about 8K skin textures and 24K HDRIs?

    With a Daz library of over 13 thousand assets and over a million objects, I can easily envision a scene that would utilize this library to a much larger extent.

    Would I need 96 GB? I am sure I would push scenes to this extent if it were available on my system.

    ...I've been rendering on a Maxwell Titan X (12 GB) for the last several years. Granted, I can't "click on the render icon, take a few sips of coffee, and the render is done," but it still gets the job done, and much faster than CPU rendering.

    I tend to do fairly detailed scenes with little optimisation.

    I also have a 12 GB 3060 still in the box, waiting for when I can upgrade the rest of the system to Win 11 specs, as the x58 BIOS is too old to recognise it.

    Post edited by kyoto kid on
  • Daz Jack Tomalin Posts: 13,818
    edited May 9

    RexRed said:

    background said:

    You can trust Nvidia.

    Copilot is Microsoft's AI. Copilot is often wrong, but is more often very helpful and accurate.

    It seems the info I have posted here is accurate.

    What I do not trust is YouTube videos with obvious clickbait titles and only one example to back them up.

    It's cool, I'm not here to argue.  For your info though, the guy in the vid owns and runs Thermal Grizzly.. if you can't trust someone with his (engineering) background, that's totally your call - maybe ask Copilot about him for some info on his credentials.

    Post edited by Daz Jack Tomalin on