Nvidia plans heavy cuts to GPU supply in early 2026


Comments

  • kpr Posts: 364

    have claimed / reportedly / could be / If this is true

    cheeky

  • "reportedly"

  • Maybe the memory suppliers will ramp up production to take advantage of the higher prices, leading to a glut of memory chips and lower GPU prices. 

  • Havos Posts: 5,629

    background said:

    Maybe the memory suppliers will ramp up production to take advantage of the higher prices, leading to a glut of memory chips and lower GPU prices. 

    It is a shortage of memory causing the problems, so we can assume production is already at max speed. Memory factories take years to build and cost billions, so there is no short term fix to lack of supply. From what I read, it is demand from AI data centers that has caused a big rise in overall demand, and thus a shortage for manufacturers like nVidia.

  • The memory shortage exists so people can chat with their AI girlfriends and have AI write their TPS reports.

  • Torquinox Posts: 4,512

    First it was the crypto miners. Then it was the NFTs. Now it's the AI data centers. Always something with chip supply! And should we mention the vast quantities of water and electricity required for all this AI stuff? 

  • 3DSaga Posts: 761

    Torquinox said:

    First it was the crypto miners. Then it was the NFTs. Now it's the AI data centers. Always something with chip supply! And should we mention the vast quantities of water and electricity required for all this AI stuff? 

    True. I live in an area that struggles to provide both, and I have yet to hear of any investment by these companies to improve or expand the water or electrical infrastructure they draw on so heavily.

  • No affordable GPU = No PC = No 3d software = no DAZ Studio = no 3d content
    Well, with a low retirement income ahead, this will save me a lot of money, as one of my two addictions is taken away from me.
    Pen and paper will be cheaper..., I hope.
    Meanwhile I'm giving AI the "wednesday finger".

  • 3DSaga said:

    Torquinox said:

    First it was the crypto miners. Then it was the NFTs. Now it's the AI data centers. Always something with chip supply! And should we mention the vast quantities of water and electricity required for all this AI stuff? 

    True. I live in an area that struggles to provide both, and I have yet to hear of any investment by these companies to improve or expand the water or electrical infrastructure they draw on so heavily.

    The problem is that power and water are more-or-less closed markets, so when data centers drastically increase demand, prices rise for everyone. The effect is that we're basically subsidizing the effort to replace us with AI and take our jobs.

    There's no reason these data centers need to be built in populated areas. Governments should force these companies to build far away from the cities, to pay for the necessary new roads, and to fund their own nuclear power plant and water system.

    That'll also minimize the collateral damage of hitting these places with tactical nukes when the AI goes rogue and tries to kill us all. ;)
  • kyoto kid Posts: 42,134
    edited December 2025

    ...yeah, in the last two and a half months the upgrade I was saving my nickels and dimes for has more than doubled in price, thanks to DRAM and, to an extent, NVMe SSD prices spiking. It will now take years to finish saving for it, and by that time it will likely be at least two generations behind the times.

    I'm still working on a 13-year-old system built on an X58 motherboard with 24 GB of DDR3 memory, a six-core Westmere Xeon, and a Maxwell Titan X. The MB's BIOS is so ancient it won't recognise the 12 GB EVGA 3060 that I bought a few years ago (which I put back in the box).

    Maybe Daz needs to look at adding another render engine that supports AMD (of course, then we would need some way to convert Iray shaders to that render language) or reconsider going back to RenderMan.

  • memcneil70 Posts: 5,658

    doubledeviant said:

    3DSaga said:

    Torquinox said:

    First it was the crypto miners. Then it was the NFTs. Now it's the AI data centers. Always something with chip supply! And should we mention the vast quantities of water and electricity required for all this AI stuff? 

    True. I live in an area that struggles to provide both, and I have yet to hear of any investment by these companies to improve or expand the water or electrical infrastructure they draw on so heavily.

    The problem is that power and water are more-or-less closed markets, so when data centers drastically increase demand, prices rise for everyone. The effect is that we're basically subsidizing the effort to replace us with AI and take our jobs.


    There's no reason these data centers need to be built in populated areas. Governments should force these companies to build far away from the cities, to pay for the necessary new roads, and to fund their own nuclear power plant and water system.

    That'll also minimize the collateral damage of hitting these places with tactical nukes when the AI goes rogue and tries to kill us all. ;)

    Just to point out that many areas of the United States, and the world, have deserts where water is a finite resource, becoming scarcer each year as populations grow beyond pre-AI tolerances. The Colorado River is a case in point. Right now Colorado is experiencing a drought, with December temperatures in the 50s-60s F instead of snow and temps in the 20s/30s F building the snowpack that river needs for the summer. This will affect every state that depends on that water supply, from homes to agriculture, let alone AI farms from here to Southern California. And we no longer have a 'fire season'; it is all year round. Last week was a nightmare for the western part of the Denver metro area as homes and businesses had their power shut off because of high winds, to avoid fires, and then 4-5 days later still had no power. They didn't do this on the eastern side of the state, and that is where the wildfires sparked off.

    I bluntly have no desire to donate a drop of water to the AI industry.

  • Torquinox Posts: 4,512

    I agree with @memcneil70, well said.

  • IceDragonArt Posts: 12,966

    Not sorry that I just had a new machine built this month.  But it was definitely pricey.  It will have to last me for the next 20 years at which point I will probably be too old to use it.

  • Just a reminder to avoid politics, thank you.

  • Cam Fox Posts: 372
    edited December 2025

    I did a RAM upgrade earlier this year. The kit I bought has since quadrupled in price. I hope it lasts because there's no way I could afford to replace it now. :/

     

    One consequence of hardware being so expensive is it might push the software side to innovate and run on lower spec hardware. In the Daz world maybe that would translate to steady demand for optimized models, thoughtful instancing, etc.
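    The "optimized models, thoughtful instancing" point can be made concrete with a rough back-of-the-envelope sketch (hypothetical numbers, not Daz Studio code): an instanced copy of a prop only needs a transform matrix, while a full copy duplicates the entire mesh in memory.

    ```python
    # Back-of-the-envelope sketch of why instancing keeps big scenes within
    # reach of lower-spec hardware. All sizes are illustrative assumptions.

    def mesh_bytes(vertex_count, bytes_per_vertex=32):
        """Rough memory for one unique mesh copy (position + normal + UVs, etc.)."""
        return vertex_count * bytes_per_vertex

    def scene_bytes(copies, vertex_count, instanced):
        """Memory for `copies` of the same prop, with or without instancing."""
        transform = 16 * 4  # one 4x4 float32 matrix per instance
        if instanced:
            # one shared mesh, plus a cheap transform per instance
            return mesh_bytes(vertex_count) + copies * transform
        # every copy carries its own full mesh
        return copies * mesh_bytes(vertex_count)

    # 500 copies of a 100k-vertex tree:
    full = scene_bytes(500, 100_000, instanced=False)  # ~1.6 GB
    inst = scene_bytes(500, 100_000, instanced=True)   # ~3.2 MB
    ```

    Under these assumed sizes, five hundred copies of a 100k-vertex tree drop from roughly 1.6 GB to about 3 MB, which is why instancing is one of the cheapest ways to keep a large scene inside a modest GPU's memory.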

  • FSMCDesigns Posts: 12,851

    I got a new 4080 system last year, and between Fooocus, Forge and a couple of online AI apps, I have spent less on 3D assets and put less strain on my PC, doing most of my images with AI and only using DS for reference images for poses, image to image, etc. I figure I am good for another 5 years. I am more concerned with how controlling and intrusive Windows has become.

  • HamEinar Posts: 133

    Cam Fox said:

    I did a RAM upgrade earlier this year. The kit I bought has since quadrupled in price. I hope it lasts because there's no way I could afford to replace it now. :/

     

    One consequence of hardware being so expensive is it might push the software side to innovate and run on lower spec hardware. In the Daz world maybe that would translate to steady demand for optimized models, thoughtful instancing, etc.

    Unfortunately, the trend is to move away from the consumer market - on a hardware level for now - and then you and I are forced to lease hardware through the many datacenters; no reason for either hardware or software developers to change anything when they can make more money from B2B. PC is becoming CC; out with the personal, in with the cloud.

  • Taoz Posts: 10,299

    HamEinar said:

    Cam Fox said:

    I did a RAM upgrade earlier this year. The kit I bought has since quadrupled in price. I hope it lasts because there's no way I could afford to replace it now. :/

     

    One consequence of hardware being so expensive is it might push the software side to innovate and run on lower spec hardware. In the Daz world maybe that would translate to steady demand for optimized models, thoughtful instancing, etc.

    Unfortunately, the trend is to move away from the consumer market - on a hardware level for now - and then you and I are forced to lease hardware through the many datacenters; no reason for either hardware or software developers to change anything when they can make more money from B2B. PC is becoming CC; out with the personal, in with the cloud.

    That trend seems to collide with another trend among consumers and cloud-dependent businesses to reject or move away from the cloud, which I've seen several YouTube channels talk about, e.g.: https://www.youtube.com/@CloudComputingInsider

  • AI is the latest flash in the pan; with so many research $$$ being thrown at it, some companies are gonna lose big! That, and nuclear fusion. But the billionaires behind it will walk away fat. It's always the little people who get trampled.

  • NylonGirl Posts: 2,293
    I doubt the companies are spending their own money.
  • kenmo Posts: 1,142

    All of my video cards for the last 10 years have been Nvidia. Presently using an RTX 2060 Super with 8 GB of memory.

    But I am sick of all the price hikes for Nvidia. First it was crypto mining.

    I am hoping the steep rise in prices will be a wake-up call to software developers, and that there will be a trend to develop graphics apps for AMD/ATI and Intel as well.

    Competition is good, and we have it in the gaming industry, so why not in graphics apps?

    Cheers & Merry Christmas..

    Kenmo

  • kenmo Posts: 1,142

    doubledeviant said:

    3DSaga said:

    Torquinox said:

    First it was the crypto miners. Then it was the NFTs. Now it's the AI data centers. Always something with chip supply! And should we mention the vast quantities of water and electricity required for all this AI stuff? 

    True. I live in an area that struggles to provide both, and I have yet to hear of any investment by these companies to improve or expand the water or electrical infrastructure they draw on so heavily.

    The problem is that power and water are more-or-less closed markets, so when data centers drastically increase demand, prices rise for everyone. The effect is that we're basically subsidizing the effort to replace us with AI and take our jobs.


    There's no reason these data centers need to be built in populated areas. Governments should force these companies to build far away from the cities, to pay for the necessary new roads, and to fund their own nuclear power plant and water system.

    That'll also minimize the collateral damage of hitting these places with tactical nukes when the AI goes rogue and tries to kill us all. ;)
     

    I like this idea of forcing AI centers into remote rural locations. But if too remote, they may have trouble recruiting staff. With no high-speed internet, Walmart, Costco, McDonald's, hospitals, dentists, theatres, Best Buy, pro sports, etc., it may be hard to lure people to the "wilds" of the USA or Canada. After all, these are not frontier or pioneering times. People love their 21st-century comforts.

  • TheMysteryIsThePoint Posts: 3,306
    edited December 2025

    FSMCDesigns said:

    I got a new 4080 system last year, and between Fooocus, Forge and a couple of online AI apps, I have spent less on 3D assets and put less strain on my PC, doing most of my images with AI and only using DS for reference images for poses, image to image, etc. I figure I am good for another 5 years. I am more concerned with how controlling and intrusive Windows has become.

    I'm heading in this direction, as well. The tech is not quite there yet, but I don't see how this can NOT simply be the way we do things in the future. I generated this with two frames. Two frames. No cloth sim, no hair sim, no pyro sim, just two frames and a prompt that I put slightly above zero effort into. You can't make a movie with generic things like this, but with the way the technology is advancing on a monthly basis, this has simply got to be the shape of things to come. I'm going to become proficient in ComfyUI because the prospect of forgetting all about exporting and rigging and simulation and rendering is, to be honest, appealing.

    https://youtu.be/4b-21ccUNLQ

    But to be honest, I think Generative AI is the best thing to happen to DAZ, because producing a single frame is exactly what DAZ Studio is among the best software at. It's a pretty ironic plot twist that the best tools for animation/video could end up including DAZ Studio.

     

  • HamEinar Posts: 133

    Taoz said:

    HamEinar said:

    Cam Fox said:

    I did a RAM upgrade earlier this year. The kit I bought has since quadrupled in price. I hope it lasts because there's no way I could afford to replace it now. :/

     

    One consequence of hardware being so expensive is it might push the software side to innovate and run on lower spec hardware. In the Daz world maybe that would translate to steady demand for optimized models, thoughtful instancing, etc.

    Unfortunately, the trend is to move away from the consumer market - on a hardware level for now - and then you and I are forced to lease hardware through the many datacenters; no reason for either hardware or software developers to change anything when they can make more money from B2B. PC is becoming CC; out with the personal, in with the cloud.

    That trend seems to collide with another trend among consumers and cloud-dependent businesses to reject or move away from the cloud, which I've seen several YouTube channels talk about, e.g.: https://www.youtube.com/@CloudComputingInsider

    It's just hard to move away from the cloud when datacenters dictate manufacturing, making hardware impossibly scarce for consumers - if you can't get the hardware, or the prices keep skyrocketing, it will become yet another social divide. Profit is very motivating for any business, of course, and subscriptions vs. one-time purchases are definitely the way for software developers to earn more - plus, when the novelty of chatbots and image generation wears off, there will be INSANE amounts of processing power available to run pretty much any software/hardware combo on demand. For businesses, it's the snowball effect: if they don't get on the train, they will be absorbed and left behind... Sure wish some kind of counter-movement would rise, though.

  • csaa Posts: 971

    TheMysteryIsThePoint said:

    I'm going to become proficient in ComfyUI because the prospect of forgetting all about exporting and rigging and simulation and rendering is, to be honest, appealing.

    https://youtu.be/4b-21ccUNLQ

    TheMysteryIsThePoint,

    When it comes to AI and ML, I'm more optimistic than otherwise. Maybe it helps to have a grasp of the technical underpinnings. Maybe it also helps to have been around the block a couple of times. The world never comes to an end, despite the gloomy prognostications; at the same time, human nature being what it is, it's important to stay grounded and see the hype for what it is -- talented, determined and charismatic folks eager to uncork the genie's bottle and worry about the consequences later.

    BTW, impressive video! Seeded with just two frames? I'm curious about the figure: G8 or G9? She has a passing resemblance to Tera Patrick. 

    Cheers!

  • kenmo Posts: 1,142

    Tera Patrick? Never heard of her. Is it a Genesis figure that is sold on the DAZ store?

    I wish more 3D apps would support AMD/ATI. I've been using Nvidia GPUs for quite some time, but I'm tired of all the issues (crypto mining, rising prices, supply shortages, etc.).

    Would a trucking industry only use GM trucks and not Ford or Dodge?

    Thanks kindly...

     

  • kenmo said:

    Tera Patrick? Never heard of her. Is it a Genesis figure that is sold on the DAZ store?

    I wish more 3D apps would support AMD/ATI. I've been using Nvidia GPUs for quite some time, but I'm tired of all the issues (crypto mining, rising prices, supply shortages, etc.).

    Would a trucking industry only use GM trucks and not Ford or Dodge?

    A company might, to make maintenance and support logistics easier. But due to their very different APIs, nVidia and AMD are not so simple to switch between - if different truck makers used different fuels, then an industry might well settle on just one, especially if it had clear advantages for their operations.

    Thanks kindly...

     

  • NylonGirl Posts: 2,293

    Richard Haseltine said:

    A company might, to make maintenance and support logistics easier. But due to their very different APIs, nVidia and AMD are not so simple to switch between - if different truck makers used different fuels, then an industry might well settle on just one, especially if it had clear advantages for their operations.

    I don't think it's a clear advantage anymore. Their whole business depends on a product that seems increasingly unobtainable. 

  • kenmo Posts: 1,142

    NylonGirl said:

    Richard Haseltine said:

    A company might, to make maintenance and support logistics easier. But due to their very different APIs, nVidia and AMD are not so simple to switch between - if different truck makers used different fuels, then an industry might well settle on just one, especially if it had clear advantages for their operations.

    I don't think it's a clear advantage anymore. Their whole business depends on a product that seems increasingly unobtainable. 

    Exactly my point. Most of us are not Disney or Pixar. We cannot afford the very expensive hardware upgrades for our consumer-level computers. The PC revolution was all about giving the average household access to powerful computer tech. Now we are trending back towards the old mainframe environment, with centralized storage and processing. We are losing the autonomy of our desktops, which were created many years ago precisely to escape the control of the mainframe.

    There needs to be a break away from Nvidia for the average computer user who wishes to pursue 3D art.

  • kenmo said:

    NylonGirl said:

    Richard Haseltine said:

    A company might, to make maintenance and support logistics easier. But due to their very different APIs, nVidia and AMD are not so simple to switch between - if different truck makers used different fuels, then an industry might well settle on just one, especially if it had clear advantages for their operations.

    I don't think it's a clear advantage anymore. Their whole business depends on a product that seems increasingly unobtainable. 

    Exactly my point. Most of us are not Disney or Pixar. We cannot afford the very expensive hardware upgrades for our consumer-level computers. The PC revolution was all about giving the average household access to powerful computer tech. Now we are trending back towards the old mainframe environment, with centralized storage and processing. We are losing the autonomy of our desktops, which were created many years ago precisely to escape the control of the mainframe.

    There needs to be a break away from Nvidia for the average computer user who wishes to pursue 3D art.

    Daz can use only the render engines that are available for license on acceptable terms. And by "advantage" I meant one that does a better job, which does not seem to be how you are using it here.
