RTX 2080ti questions

24 Comments

  • Bobvan Posts: 2,653
    edited May 2020

    The PC store had a lineup & was moving slowly, so I couldn't find out *if* they have it in stock; I'll go back this week, the 14th. That's if plans aren't altered by the germaphobe zombie nation we have been turned into..

    Post edited by Bobvan on
  • outrider42 Posts: 3,679
    nicstt said:

    I've been suffering with the inner struggle of wanting to go from a 1080TI to 2080TI (went from the 980TI to the 1080TI) and I don't know how much of a difference I'd see in my case. I have heard that the 3x line coming out will offer more VRAM and a better price point. 

    Worst reason ever for making a decision, based on what you've 'heard'. Basically, if you listen, you'll 'hear' anything and everything on the internet.

    Wait until Nvidia (or those officially authorised) release specs.

    I've been holding off for years; basically, my opinion is that Nvidia are greedy, so I've been reluctant to part with cash.

    Of course one should always wait for the proper reveal. However, it still pays to be aware of what may be coming. These rumors come from multiple sources that have a history of being more right than wrong. Nothing would suck more than to save up and buy new hardware only for it to be completely surpassed just a few months later. The Turing line has been out since 2018 now. If this was early or mid 2019, I would say go for the 2080ti. But in mid 2020, with a new lineup coming soon, it's hard to suggest that. Even if the new line comes out at the same price as Turing, I'd consider that a huge win if the performance is there. I'd be shocked if the performance was not a big lift over Turing.
  • Bobvan Posts: 2,653
    nicstt said:

    I've been suffering with the inner struggle of wanting to go from a 1080TI to 2080TI (went from the 980TI to the 1080TI) and I don't know how much of a difference I'd see in my case. I have heard that the 3x line coming out will offer more VRAM and a better price point. 

    Worst reason ever for making a decision, based on what you've 'heard'. Basically, if you listen, you'll 'hear' anything and everything on the internet.

    Wait until Nvidia (or those officially authorised) release specs.

    I've been holding off for years; basically, my opinion is that Nvidia are greedy, so I've been reluctant to part with cash.

     

    Of course one should always wait for the proper reveal. However, it still pays to be aware of what may be coming. These rumors come from multiple sources that have a history of being more right than wrong. Nothing would suck more than to save up and buy new hardware only for it to be completely surpassed just a few months later. The Turing line has been out since 2018 now. If this was early or mid 2019, I would say go for the 2080ti. But in mid 2020, with a new lineup coming soon, it's hard to suggest that. Even if the new line comes out at the same price as Turing, I'd consider that a huge win if the performance is there. I'd be shocked if the performance was not a big lift over Turing.

    This always happens. I'm sure there are $1200 systems that outdo my 3-year-old $5K beast; that's just the way it goes..

  • Leana Posts: 12,755
    Sevrin said:
    Leana said:
    TheKD said:

    They were saying June, but with all this zombie apocalypse stuff, that might not happen.

    Remember that when the 20xx cards were released it took months to have an Iray version which supported them and a few more to have a DS beta version including that version of Iray (and a lot more before it turned into a General Release)...

    I'm not sure that's a fair comparison.  The 20 series introduced RTX.  The 30 series so far sounds like it's mostly the 20 series on steroids, rather than a new rendering paradigm.

    I'm not talking about the Iray version which supported RTX features, I'm talking about the first one which would recognize the cards as something usable to render.
    Granted, it may not take as long for the 30xx cards, but I certainly don't expect them to work in DS on release.

  • Bobvan Posts: 2,653
    edited May 2020

    It's OK, my original question was "would my system support it?", not "is it OK, should I?". I do appreciate the various feedback but will prolly go ahead if the price is right..

     

    I do render quite a bit. Even in the days of LuxRender, when it took eons..

    Post edited by Bobvan on
  • outrider42 Posts: 3,679
    Leana said:
    Sevrin said:
    Leana said:
    TheKD said:

    They were saying June, but with all this zombie apocalypse stuff, that might not happen.

    Remember that when the 20xx cards were released it took months to have an Iray version which supported them and a few more to have a DS beta version including that version of Iray (and a lot more before it turned into a General Release)...

    I'm not sure that's a fair comparison.  The 20 series introduced RTX.  The 30 series so far sounds like it's mostly the 20 series on steroids, rather than a new rendering paradigm.

    I'm not talking about the Iray version which supported RTX features, I'm talking about the first one which would recognize the cards as something usable to render.
    Granted, it may not take as long for the 30xx cards, but I certainly don't expect them to work in DS on release.

    My understanding is that the new OptiX 6.0 that Iray RTX uses works differently from the old OptiX Prime. The old OptiX Prime needed to be recompiled for every new graphics architecture, which is why we had to wait so painfully long for Daz to get supported. However, OptiX 6.0 does not need to be recompiled. So, going by this information, any new GPU generation should work right out of the box for Iray. They may not be optimized, and an update would address that, but they should work; they may only need a driver update.
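    For anyone curious, there's an easy way to picture that "recognition" layer: the CUDA runtime that ships with the driver either sees a card and reports its compute capability, or it doesn't, and Iray sits on top of that. Here's a minimal host-side sketch, assuming the CUDA toolkit is installed (the file name and build line are just examples):

        // devquery.cpp -- build with: nvcc devquery.cpp -o devquery
        // Lists every GPU the CUDA runtime recognizes, with its compute
        // capability (Turing cards report 7.5). If a brand-new card shows up
        // here after a driver update, the layer Iray builds on can see it.
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int count = 0;
            cudaError_t err = cudaGetDeviceCount(&count);
            if (err != cudaSuccess) {
                std::printf("CUDA error: %s\n", cudaGetErrorString(err));
                return 1;
            }
            for (int i = 0; i < count; ++i) {
                cudaDeviceProp prop;
                cudaGetDeviceProperties(&prop, i);
                std::printf("GPU %d: %s (compute %d.%d)\n",
                            i, prop.name, prop.major, prop.minor);
            }
            return 0;
        }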

  • outrider42 Posts: 3,679
    Leana said:
    Sevrin said:
    Leana said:
    TheKD said:

    They were saying June, but with all this zombie apocalypse stuff, that might not happen.

    Remember that when the 20xx cards were released it took months to have an Iray version which supported them and a few more to have a DS beta version including that version of Iray (and a lot more before it turned into a General Release)...

    I'm not sure that's a fair comparison.  The 20 series introduced RTX.  The 30 series so far sounds like it's mostly the 20 series on steroids, rather than a new rendering paradigm.

    I'm not talking about the Iray version which supported RTX features, I'm talking about the first one which would recognize the cards as something usable to render.
    Granted, it may not take as long for the 30xx cards, but I certainly don't expect them to work in DS on release.

    My understanding is that the new OptiX 6.0 that Iray RTX uses works differently from the old OptiX Prime. The old OptiX Prime needed to be recompiled for every new graphics architecture, which is why we had to wait so painfully long for Daz to get supported. However, OptiX 6.0 does not need to be recompiled. So, going by this information, any new GPU generation should work right out of the box for Iray. They may not be optimized, and an update would address that, but they should work; they may only need a driver update.

     

    Bobvan said:

    It's OK, my original question was "would my system support it?", not "is it OK, should I?". I do appreciate the various feedback but will prolly go ahead if the price is right..

     

    I do render quite a bit. Even in the days of LuxRender, when it took eons..

    Then the answer is yes. Any PC that can install a 2080ti can do it. You can literally pop a 2080ti into any PC with a supported PCIe slot and it will work for Iray. It uses a little more electricity than a 1080ti, but that's not an issue with a 750 W supply. You could even use both the 1080ti and the 2080ti at the same time if you have two PCIe slots to support them.
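    For the power question, the back-of-the-envelope math looks like this. The TDP figures are nominal board-power numbers (roughly 250 W for a 1080ti, 260 W for a 2080ti Founders Edition), so treat this as a rough sketch rather than measured draw:

        // psu_check.cpp -- rough PSU headroom estimate; no substitute for a
        // real calculator like PC Part Picker. Wattages are assumed nominal TDPs.
        #include <cstdio>

        int main() {
            const int psu          = 750; // watts
            const int gtx1080ti    = 250; // nominal TDP
            const int rtx2080ti    = 260; // nominal TDP (Founders Edition)
            const int cpu_and_rest = 200; // allowance for CPU, drives, fans

            int single = rtx2080ti + cpu_and_rest;
            int dual   = gtx1080ti + rtx2080ti + cpu_and_rest;

            std::printf("2080ti alone: ~%d W (%d W headroom on %d W)\n",
                        single, psu - single, psu);
            std::printf("Both cards:   ~%d W (%d W headroom on %d W)\n",
                        dual, psu - dual, psu);
            return 0;
        }

    Even with both cards rendering flat out, that lands around 710 W of nominal draw, which is why 750 W is workable for the dual setup, just without much headroom to spare.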

  • Bobvan Posts: 2,653
    edited May 2020
    Leana said:
    Sevrin said:
    Leana said:
    TheKD said:

    They were saying June, but with all this zombie apocalypse stuff, that might not happen.

    Remember that when the 20xx cards were released it took months to have an Iray version which supported them and a few more to have a DS beta version including that version of Iray (and a lot more before it turned into a General Release)...

    I'm not sure that's a fair comparison.  The 20 series introduced RTX.  The 30 series so far sounds like it's mostly the 20 series on steroids, rather than a new rendering paradigm.

    I'm not talking about the Iray version which supported RTX features, I'm talking about the first one which would recognize the cards as something usable to render.
    Granted, it may not take as long for the 30xx cards, but I certainly don't expect them to work in DS on release.

    My understanding is that the new OptiX 6.0 that Iray RTX uses works differently from the old OptiX Prime. The old OptiX Prime needed to be recompiled for every new graphics architecture, which is why we had to wait so painfully long for Daz to get supported. However, OptiX 6.0 does not need to be recompiled. So, going by this information, any new GPU generation should work right out of the box for Iray. They may not be optimized, and an update would address that, but they should work; they may only need a driver update.

     

    Bobvan said:

    It's OK, my original question was "would my system support it?", not "is it OK, should I?". I do appreciate the various feedback but will prolly go ahead if the price is right..

     

    I do render quite a bit. Even in the days of LuxRender, when it took eons..

    Then the answer is yes. Any PC that can install a 2080ti can do it. You can literally pop a 2080ti into any PC with a supported PCIe slot and it will work for Iray. It uses a little more electricity than a 1080ti, but that's not an issue with a 750 W supply. You could even use both the 1080ti and the 2080ti at the same time if you have two PCIe slots to support them.

    Thx, but I will be selling the 1080. As you say, double the speed of one 1080 sounds promising. In any case, in a few years when my beast is getting up there, it will be time for a new system with what will then be the greatest..

     

    If it does pan out, I will offer you kind folks the opportunity to purchase my 1080ti prior to posting it on my local CL.

    Post edited by Bobvan on
  • Granted, you would wait forever if you feared a better/cheaper product becoming available after you committed to purchasing one.

    We are witnessing the resurgence of AMD, and Navi X2 doesn't even have to beat the 3080ti in order to make Nvidia's entire consumer line cheaper.

    And when Vulkan is supported, a whole new realm of possibilities opens up for Daz users.

    If you can wait, wait. There hasn't been a time like this for a very long time.

  • Bobvan Posts: 2,653
     

    And when Vulkan is supported, a whole new realm of possibilities opens up for Daz users.

     

    What does Spock have to do with anything? jk :)

  • FSMCDesigns Posts: 12,843
    Bobvan said:
     

    And when Vulkan is supported, a whole new realm of possibilities opens up for Daz users.

     

    What does Spock have to do with anything? jk :)

    Your system will then live longer and prosper, LOL

    https://developer.nvidia.com/Vulkan

     

  • Bobvan Posts: 2,653
    edited May 2020
    Bobvan said:
     

    And when Vulkan is supported, a whole new realm of possibilities opens up for Daz users.

     

    What does Spock have to do with anything? jk :)

    Your system will then live longer and prosper, LOL

    https://developer.nvidia.com/Vulkan

     

    Good one bro! Isn't Khronos also some ST planet? laugh

    Post edited by Bobvan on
  • MelissaGT Posts: 2,611

    If you want to know how much faster the 2080ti is, just look at the benchmark thread in my signature. All the RTX cards have been benched, plus the 1080ti and many others. I have TWO 1080tis, and they are roughly as fast as a single 2080ti. So if you ask me, I think it's worth thinking about. But now we also have the 3000 series due very soon.

    Nvidia is holding a special event on May 14 and they are expected to announce their next generation GPU, Ampere. They are not expected to reveal the gaming lineup, which is what we are talking about here, but we can gather what the performance of the gaming cards will be.

    There are lots of rumors and things going around. Right now they point to a May/June launch for the professional lineup, and September for the gamers. I do not think July was ever the target for gaming. If you look up my previous posts, I predicted August at the very earliest with September being possible. And that was back in 2019 long before corona came along.

    I expect dramatic improvements across the board, especially ray tracing. Turing is only the first generation of ray tracing hardware, and to be blunt, I do not think Turing is going to age well. Ampere will likely have much faster ray tracing that puts Turing to shame. Also, all Ampere cards will ray trace.

    No rumors have pointed at pricing at all. I find this very interesting, and what this tells me is that Nvidia is monitoring AMD as closely as they can. Now some people think Nvidia has the top to themselves. That is true, but AMD has cards in the works that can beat the 2080ti. Some rumors suggest that AMD might even beat the 3080ti in traditional gaming, while the 3080ti will offer better ray tracing. So we might have a very competitive market coming up. As for coronavirus delays, they are not an issue. The next gen consoles are on track to release as scheduled, and these are AMD products. AMD and Nvidia recently bought up the remaining capacity for 7nm fabs from TSMC, the company that produces their chips. Operations are a full go at this point.

    I always buy NVIDIA cards, but wouldn't think twice about going to AMD if AMD could put out something that wouldn't have problems with games, etc. I'm a gamer first, and though AMD offers (in general) a far better price point, it usually results in having issues with game compatibility, etc. The one big example that comes to mind is that when The Witcher 3 came out, AMD users could not make use of all the graphics options offered. And that...would really suck big donkey you-know-whats if I had to deal with that. So I just buy NVIDIA cards. Does that continue the cycle of NVIDIA greed about charging whatever they feel like because they can? Yeah, probably. I wish it was like the CPU race...I always used to buy Intel, but now AMD and Intel are so close in performance that Intel had to lower their prices to compete more directly with AMD. This gives users a choice. (I switched to AMD with my last build and love the RYZEN chipset line something fierce.)

  • Bobvan Posts: 2,653

    I had AMD years ago and had temp problems, but that's going back a while..

  • kenshaw011267 Posts: 3,805

    The Nvidia keynote for GTC drops on Tuesday. If they are going to announce Ampere, that would be where and when; however, there is nothing in their press releases implying such. Even so, it has generally taken 4 to 6 months to get consumer cards launched, so if they announce Tuesday it could be fall/winter before the flagship cards start hitting shelves. If the announcement is Tuesday and they release any sort of specs, people can make rational decisions then. But holding off on purchases for a product that isn't even officially announced seems weird to me.

    Vulkan is not a render engine. It is a 3D graphics API, like OpenGL or DirectX. It has very little to do with DS, and adding support would not make a ton of sense unless they were adding a Linux version.
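    To be fair to both sides, the vendor-neutral part is real: the same Vulkan enumeration call lists AMD, Nvidia, and Intel GPUs alike. A minimal sketch, assuming the Vulkan SDK/loader is installed (link with -lvulkan):

        // vkdevices.cpp -- enumerate every Vulkan-capable GPU, any vendor.
        #include <cstdio>
        #include <vector>
        #include <vulkan/vulkan.h>

        int main() {
            VkInstanceCreateInfo createInfo = {};
            createInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;

            VkInstance instance;
            if (vkCreateInstance(&createInfo, nullptr, &instance) != VK_SUCCESS) {
                std::printf("Failed to create a Vulkan instance\n");
                return 1;
            }

            uint32_t count = 0;
            vkEnumeratePhysicalDevices(instance, &count, nullptr);
            std::vector<VkPhysicalDevice> gpus(count);
            vkEnumeratePhysicalDevices(instance, &count, gpus.data());

            for (VkPhysicalDevice gpu : gpus) {
                VkPhysicalDeviceProperties props;
                vkGetPhysicalDeviceProperties(gpu, &props);
                std::printf("Found GPU: %s\n", props.deviceName); // AMD or Nvidia
            }

            vkDestroyInstance(instance, nullptr);
            return 0;
        }

    None of this makes Vulkan a renderer, of course; it's the plumbing a renderer could be built on.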

     

  • No one said it was a render engine. Vulkan would allow Daz to support raytracing on both AMD and NVidia GPUs.

  • outrider42 Posts: 3,679

    If you want to know how much faster the 2080ti is, just look at the benchmark thread in my signature. All the RTX cards have been benched, plus the 1080ti and many others. I have TWO 1080tis, and they are roughly as fast as a single 2080ti. So if you ask me, I think it's worth thinking about. But now we also have the 3000 series due very soon.

    Nvidia is holding a special event on May 14 and they are expected to announce their next generation GPU, Ampere. They are not expected to reveal the gaming lineup, which is what we are talking about here, but we can gather what the performance of the gaming cards will be.

    There are lots of rumors and things going around. Right now they point to a May/June launch for the professional lineup, and September for the gamers. I do not think July was ever the target for gaming. If you look up my previous posts, I predicted August at the very earliest with September being possible. And that was back in 2019 long before corona came along.

    I expect dramatic improvements across the board, especially ray tracing. Turing is only the first generation of ray tracing hardware, and to be blunt, I do not think Turing is going to age well. Ampere will likely have much faster ray tracing that puts Turing to shame. Also, all Ampere cards will ray trace.

    No rumors have pointed at pricing at all. I find this very interesting, and what this tells me is that Nvidia is monitoring AMD as closely as they can. Now some people think Nvidia has the top to themselves. That is true, but AMD has cards in the works that can beat the 2080ti. Some rumors suggest that AMD might even beat the 3080ti in traditional gaming, while the 3080ti will offer better ray tracing. So we might have a very competitive market coming up. As for coronavirus delays, they are not an issue. The next gen consoles are on track to release as scheduled, and these are AMD products. AMD and Nvidia recently bought up the remaining capacity for 7nm fabs from TSMC, the company that produces their chips. Operations are a full go at this point.

    I always buy NVIDIA cards, but wouldn't think twice about going to AMD if AMD could put out something that wouldn't have problems with games, etc. I'm a gamer first, and though AMD offers (in general) a far better price point, it usually results in having issues with game compatibility, etc. The one big example that comes to mind is that when The Witcher 3 came out, AMD users could not make use of all the graphics options offered. And that...would really suck big donkey you-know-whats if I had to deal with that. So I just buy NVIDIA cards. Does that continue the cycle of NVIDIA greed about charging whatever they feel like because they can? Yeah, probably. I wish it was like the CPU race...I always used to buy Intel, but now AMD and Intel are so close in performance that Intel had to lower their prices to compete more directly with AMD. This gives users a choice. (I switched to AMD with my last build and love the RYZEN chipset line something fierce.)

    AMD has been selling their 5700 series pretty well. With Ryzen, AMD has made huge strides not just in hardware, but in the all-important mind share. While they have not competed at the very top of the GPU race for a while, the 5700XT is a very solid card that can run with a 1080ti. AMD's biggest problem with their previous arch was that they were strictly limited to a specific number of compute units. But with RDNA2 that all changes. They can now pack in as many CUs as they can, so now they can truly compete. And also remember this: the 5700 debuted, and out of the blue Nvidia released "Super" versions of their cards and the prices dropped. The Super versions did not have the Founder's Edition price tags, which meant that prices were better. The competition had a near instant effect on the market. The 2080ti was the only one that did not get a Super version; it's also the only one that the 5700XT was unable to compete with.

    Let's just look at the upcoming consoles. The Xbox Series X has been getting a lot of attention for how powerful it is. This thing is looking to easily match a 2080 in performance. People have been asking how on earth MS and AMD could afford to do that in what will likely be a $500 console.

    The answer to that is that the consoles are not actually that powerful compared to what AMD is cooking! The GPUs in the PS5 and Xbox are just the start of the big hardware gains we are going to see with AMD and its RDNA2. These are still cheap GPUs.

    Now with that understanding, it is easier to grasp just how much more powerful the full lineup of AMD will be. We have a situation where AMD and Nvidia are basically playing a game of chicken with each other. Both are waiting for the other to make the first move.

    But Nvidia does not want AMD to grab the performance crown. If AMD did, that would capture the mind share of gamers, showing that AMD is truly back. That sort of thing trickles down, so that even if people are not buying the fastest GPUs, they may still choose AMD simply because of reputation and brand. That is why having the fastest card is so important; it's a marketing tool. That is why both companies are pushing full steam ahead with their launch plans. Nvidia will not wait to launch if AMD pushes out first and beats the 2080ti. Nvidia will respond as fast as they can.

    As far as what is being rumored, the 3080 will have 10GB and the 3080ti might get 12GB. It does seem like Nvidia has yet to decide exactly how much VRAM the 3080ti will get. They are probably waiting on AMD. Another very interesting rumor is that the 3000 series will have 4 times the ray tracing power of Turing. That's per tier. So the 3060 will have 4 times faster ray tracing than the 2060, the 3070 4x the 2070, and so on.

    If that last bit is true, and frankly I expect it to be because I predicted it last year, then Turing will quickly become obsolete as ray tracing becomes the standard. The rumors say that, basically, ray tracing in games will not have the huge performance hit it does with Turing.

    The 3000 series will also offer a new iteration of Tensor cores and DLSS 3.0. One of the juiciest rumors was that they might use Tensor cores to help compress data in VRAM, which would mean games would use less VRAM. If this can somehow translate to Iray, just think about how incredible that would be.

    These rumors have not gone away, and come from sources that have been correct in the past. But I also believe this to be possible. Consider this, Turing is on the same fab as Pascal, and look at how much performance they still gained (in part because the chips got bigger.) Now they jump to a 7nm fab, which is vastly smaller. They can do a lot more on a chip now, and they get higher clocks. So not only will they offer far higher core counts, but these cores will be faster at their jobs on top of that. It makes perfect sense to me that Ampere will be big. And again, just look at AMD Ryzen, which has been on the 7nm fab, too. 

  • If you want to know how much faster the 2080ti is, just look at the benchmark thread in my signature. All the RTX cards have been benched, plus the 1080ti and many others. I have TWO 1080tis, and they are roughly as fast as a single 2080ti. So if you ask me, I think it's worth thinking about. But now we also have the 3000 series due very soon.

    Nvidia is holding a special event on May 14 and they are expected to announce their next generation GPU, Ampere. They are not expected to reveal the gaming lineup, which is what we are talking about here, but we can gather what the performance of the gaming cards will be.

    There are lots of rumors and things going around. Right now they point to a May/June launch for the professional lineup, and September for the gamers. I do not think July was ever the target for gaming. If you look up my previous posts, I predicted August at the very earliest with September being possible. And that was back in 2019 long before corona came along.

    I expect dramatic improvements across the board, especially ray tracing. Turing is only the first generation of ray tracing hardware, and to be blunt, I do not think Turing is going to age well. Ampere will likely have much faster ray tracing that puts Turing to shame. Also, all Ampere cards will ray trace.

    No rumors have pointed at pricing at all. I find this very interesting, and what this tells me is that Nvidia is monitoring AMD as closely as they can. Now some people think Nvidia has the top to themselves. That is true, but AMD has cards in the works that can beat the 2080ti. Some rumors suggest that AMD might even beat the 3080ti in traditional gaming, while the 3080ti will offer better ray tracing. So we might have a very competitive market coming up. As for coronavirus delays, they are not an issue. The next gen consoles are on track to release as scheduled, and these are AMD products. AMD and Nvidia recently bought up the remaining capacity for 7nm fabs from TSMC, the company that produces their chips. Operations are a full go at this point.

    I always buy NVIDIA cards, but wouldn't think twice about going to AMD if AMD could put out something that wouldn't have problems with games, etc. I'm a gamer first, and though AMD offers (in general) a far better price point, it usually results in having issues with game compatibility, etc. The one big example that comes to mind is that when The Witcher 3 came out, AMD users could not make use of all the graphics options offered. And that...would really suck big donkey you-know-whats if I had to deal with that. So I just buy NVIDIA cards. Does that continue the cycle of NVIDIA greed about charging whatever they feel like because they can? Yeah, probably. I wish it was like the CPU race...I always used to buy Intel, but now AMD and Intel are so close in performance that Intel had to lower their prices to compete more directly with AMD. This gives users a choice. (I switched to AMD with my last build and love the RYZEN chipset line something fierce.)

    I plan on upgrading to Ryzen soon too.  Current specs are:

    Intel i5 4670K, Gigabyte Z97X-Gaming 7, EVGA GTX 1070 Gaming, 2x 8GB Patriot Viper 3 DDR-3 1866 memory, Samsung 860 EVO 1TB SSD, Hitachi HDT721010SLA360 1TB HDD, 2 x Western Digital Blue WD20EZRZ 2TB HDD in raid 1, EVGA SuperNova 1200 P2 Platinum PSU, HP DVD1720 optical drive, CoolerMaster CM 690 II Case, Samsung SyncMaster P2370 Monitor @ 1080p, Windows 10 Professional 64

    Upgrading to a Ryzen 3700X and whatever motherboard I can find (motherboards are in very short supply right now) along with 32GB of system memory once my stimulus gets here. I currently have a GTX 1070, so I plan on waiting until the 30 series Nvidia cards come out to see if I can afford a 30 series card or pick up a 2080TI cheap.

     

    Bobvan said:

    I had AMD years ago and had temp problems, but that's going back a while..

    Yeah, I still have some Abit boards in storage as collector's items with Athlon processors MELTED onto the sockets. Used to be an extreme overclocker for gaming and had a few cooler fans and water pumps fail back in the day. I also have some old 3DFX Voodoo and Voodoo 2 cards around here somewhere in a box of old parts.

    To the OP: if you already have the 750W PSU, you should be good. But if you haven't bought it yet, I would go with the biggest power supply you can afford, even if you have to wait on the GPU for an additional month or two.

  • kenshaw011267 Posts: 3,805

    No one said it was a render engine. Vulkan would allow Daz to support raytracing on both AMD and NVidia GPUs.

    No, it would not.

    DirectX already supports real-time rays on all GPUs. That does not mean iRay supports AMD. iRay doesn't use DirectX or any of those APIs. It works directly with CUDA-X. I assume that is how every other CUDA application works.

    What iRay does is directly use the RT cores on the RTX cards, which is much, much faster than software emulation, which is how DirectX does it on non-RTX cards.
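    To make that distinction concrete: the ray tracing that "all GPUs" get is exposed through the graphics API and queried by the application, completely separately from iRay's CUDA path. A Windows-only sketch of that query, assuming a recent Windows 10 SDK (this is the route games take, not iRay):

        // dxrcheck.cpp -- ask Direct3D 12 whether the default adapter (any
        // vendor) exposes DirectX Raytracing. Iray never touches this path; it
        // talks to CUDA/OptiX directly, so DXR support on an AMD card does
        // nothing for Iray.
        #include <cstdio>
        #include <d3d12.h>
        #include <wrl/client.h>
        #pragma comment(lib, "d3d12.lib")

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<ID3D12Device> device;
            if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                         IID_PPV_ARGS(&device)))) {
                std::printf("No D3D12 device available\n");
                return 1;
            }
            D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
            if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                      &opts5, sizeof(opts5)))) {
                bool dxr = opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
                std::printf("DXR supported: %s\n", dxr ? "yes" : "no");
            }
            return 0;
        }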

  • Bobvan Posts: 2,653
    edited May 2020

    Naahhh, I'm just buying the single GPU, provided I get the sale price I was referring to earlier.

    Post edited by Bobvan on
  • SnowSultan

    A 750-watt PSU should be more than enough. However, if you ever plan on adding another card, you might want to think about upgrading to a 1,000-watt PSU.

    Wait for the 3000-series cards? I probably won't unless they're going to be cheaper (never happen). And when I say "wait" I mean waiting until after the "early buyers" go through the guinea pig phase. I plan on getting a Titan RTX in my next build, so I don't really expect the prices to drop much on that particular card when the 3000-series are released to justify waiting for another year. I would imagine Nvidia will be asking for $3500+ for the next gen Titan card. Prices just seem to keep going up & up with each successive generation of Nvidia cards, so unless we're going to be able to start using AMD cards for Iray rendering in Daz, I really don't care about AMD graphics cards.

    At this point, Ampere is all hype and no show, kind of like what they did with Turing, and we saw how that came out: extremely overpriced video cards with disappointing ray tracing capabilities (although it's not like there were a whole bunch of games that took advantage of the "feature"). I think the Turing series cards were a bit rushed to release and just something to hold Nvidia over until the next over-hyped release...

    Yes, Nvidia is greedy, but right now they have a fairly large chunk of the market.

  • I have the same issue. I have a 1080 Ti and I'm beginning to suffer, so I wonder if I should upgrade, and if so, to what (or if I should wait, as usual). Does it make a really big difference between a 1080 and a 2080 in terms of scene weight or render speed? Has anyone already made this change?

    This actually surprises me! I always thought that PAs use monster graphics card setups like Titans, Quadros, or triple SLI with 1080Ti's or 2080Ti's in order to get their promo renders done fast. surprise

  • outrider42 Posts: 3,679

    A 750-watt PSU should be more than enough. However, if you ever plan on adding another card, you might want to think about upgrading to a 1,000-watt PSU.

    Wait for the 3000-series cards? I probably won't unless they're going to be cheaper (never happen). And when I say "wait" I mean waiting until after the "early buyers" go through the guinea pig phase. I plan on getting a Titan RTX in my next build, so I don't really expect the prices to drop much on that particular card when the 3000-series are released to justify waiting for another year. I would imagine Nvidia will be asking for $3500+ for the next gen Titan card. Prices just seem to keep going up & up with each successive generation of Nvidia cards, so unless we're going to be able to start using AMD cards for Iray rendering in Daz, I really don't care about AMD graphics cards.

    At this point, Ampere is all hype and no show, kind of like what they did with Turing, and we saw how that came out: extremely overpriced video cards with disappointing ray tracing capabilities (although it's not like there were a whole bunch of games that took advantage of the "feature"). I think the Turing series cards were a bit rushed to release and just something to hold Nvidia over until the next over-hyped release...

    Yes, Nvidia is greedy, but right now they have a fairly large chunk of the market.

    There was a difference in 2018...no competition. Now there will be competition. That changes things. Also there was tremendous pushback on RTX prices. The Super line adjusted almost the entire lineup with the exception of the 2080ti.

    Prices will not increase with the 3000 series. If they do, then AMD will totally destroy Nvidia. Nvidia is not like Intel. They will compete much more aggressively when pushed. The existence of the Super cards is proof of that.

    I don't expect prices to drop, maybe a little, but not much. I expect the 3060-3080 to be very similar in price to their 2000 counterparts. The 3080ti is a bit of a wild card, but I do not expect the prices to go up, the same goes for the Ampere Titan. Again, May 14 is just days away. We are going to see what Ampere is.
  • A 750-watt PSU should be more than enough. However, if you ever plan on adding another card, you might want to think about upgrading to a 1,000-watt PSU.

    Wait for the 3000-series cards? I probably won't unless they're going to be cheaper (never happen). And when I say "wait" I mean waiting until after the "early buyers" go through the guinea pig phase. I plan on getting a Titan RTX in my next build, so I don't really expect the prices to drop much on that particular card when the 3000-series are released to justify waiting for another year. I would imagine Nvidia will be asking for $3500+ for the next gen Titan card. Prices just seem to keep going up & up with each successive generation of Nvidia cards, so unless we're going to be able to start using AMD cards for Iray rendering in Daz, I really don't care about AMD graphics cards.

    At this point, Ampere is all hype and no show, kind of like what they did with Turing, and we saw how that came out: extremely overpriced video cards with disappointing ray tracing capabilities (although it's not like there were a whole bunch of games that took advantage of the "feature"). I think the Turing series cards were a bit rushed to release and just something to hold Nvidia over until the next over-hyped release...

    Yes, Nvidia is greedy, but right now they have a fairly large chunk of the market.

     

    There was a difference in 2018...no competition. Now there will be competition. That changes things. Also there was tremendous pushback on RTX prices. The Super line adjusted almost the entire lineup with the exception of the 2080ti.

     

    Prices will not increase with the 3000 series. If they do, then AMD will totally destroy Nvidia. Nvidia is not like Intel. They will compete much more aggressively when pushed. The existence of the Super cards is proof of that.

     

    I don't expect prices to drop, maybe a little, but not much. I expect the 3060-3080 to be very similar in price to their 2000 counterparts. The 3080ti is a bit of a wild card, but I do not expect the prices to go up, the same goes for the Ampere Titan. Again, May 14 is just days away. We are going to see what Ampere is.

    We are going to see what Nvidia wants us to see. If you don't think Nvidia is going to charge a lot more (like $300-$500 more) at the outset for a 3080Ti and the Ampere Titan vs their previous generation counterparts, then I have Bill Gates on the phone offering to split the difference with everyone who buys those cards. laugh Sure, Nvidia could lower the price of say the 3060, 3070, and 3080 series to compete with AMD in the gaming industry, but rest assured, Nvidia IS going to make up the difference by charging more for the aforementioned higher end 3000 series cards.

  • hlln334 Posts: 30

    I've done a similar upgrade (1080 to 2080ti) and I've seen a boost in render speeds, especially with the updated DS. I went with an 1100W PSU because I was thinking I'd be adding a second card eventually, but I haven't gone that far yet. Use PC Part Picker to get an estimate of your power needs, but I agree with SnowSultan and think 750W should be fine.

    I upgraded from a 980TI to a regular 2080 (not TI or Super) and got a TREMENDOUS boost in speed. But, to be fair, I still use my 980TI for rendering as well (unless the scene can't fit in its 6GB of VRAM).

    I use both cards to render. I'm surprised a lot of you aren't keeping your 980TI in the system to have even more CUDA cores.

     

     

  • No one said it was a render engine. Vulkan would allow Daz to support raytracing on both AMD and NVidia GPUs.

    No, it would not.

    DirectX already supports real-time rays on all GPUs. That does not mean iRay supports AMD. iRay doesn't use DirectX or any of those APIs. It works directly with CUDA-X. I assume that is how every other CUDA application works.

    What iRay does is directly use the RT cores on the RTX cards, which is much, much faster than software emulation, which is how DirectX does it on non-RTX cards.

    Yes, it would. A list of AMD GPUs already supports Vulkan.

    And you'll have to point out to me where I said IRay would be supported by AMD.

  • nicstt Posts: 11,715
    edited May 2020
    nicstt said:

    I've been suffering with the inner struggle of wanting to go from a 1080TI to 2080TI (went from the 980TI to the 1080TI) and I don't know how much of a difference I'd see in my case. I have heard that the 3x line coming out will offer more VRAM and a better price point. 

    Worst reason ever for making a decision, based on what you've 'heard'. Basically, if you listen, you'll 'hear' anything and everything on the internet.

    Wait until Nvidia (or those officially authorised) release specs.

    I've been holding off for years; basically, my opinion is that Nvidia are greedy, so I've been reluctant to part with cash.

     

    Of course one should always wait for the proper reveal. However, it still pays to be aware of what may be coming. These rumors come from multiple sources that have a history of being more right than wrong. Nothing would suck more than to save up and buy new hardware only for it to be completely surpassed just a few months later. The Turing line has been out since 2018 now. If this was early or mid 2019, I would say go for the 2080ti. But in mid 2020, with a new lineup coming soon, it's hard to suggest that. Even if the new line comes out at the same price as Turing, I'd consider that a huge win if the performance is there. I'd be shocked if the performance was not a big lift over Turing.

    Agreed that knowing what is coming is a good idea.

    ... But all we can be confident about is Nvidia releasing a new iteration sometime this year (very probable). Anything else is at best speculation, and invariably guesswork.

    I'm not making plans to spend hundreds or thousands based on speculation, never mind guesswork.

    Post edited by nicstt on
  • Asari Posts: 703
    edited May 2020
    Regarding the RTX capabilities of the 20xx and 30xx cards: it really depends on your scene and what you're rendering. If you often render scenes with lots of geometry, like trees, you will benefit greatly from the RTX capabilities. Mesh hair also has lots of geometry. However, if you mainly render transmapped hair with a background that isn't geometry-intensive (Migenius tests have shown that indoor environments don't seem to profit much from RTX), you might not benefit from the RTX capability of either card, and the speed increase from a 1080ti to a 2080ti will turn out to be a lot less.

    I think this is the difficulty with RTX: it is more difficult to benchmark, and it really depends on the scene. We had benchmarks where RTX outperformed non-RTX by a factor of 3, and we had scenes where the speed gain of RTX was merely around 15%. Sure, the 20xx and 30xx series will be faster cards overall. But RTX is a factor when you decide whether they are worth the money, because these cards are expensive.
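    That spread (3x down to ~15%) is about what you'd expect if you think of it Amdahl's-law style: the RT cores only accelerate the ray/geometry intersection share of the render, so the overall gain depends on how big that share is. A toy calculation with made-up illustrative fractions, not measured Iray numbers:

        // rtx_speedup.cpp -- Amdahl's-law style estimate of overall render
        // speedup when only the ray-traversal fraction of the work is accelerated.
        #include <cstdio>

        // fraction   = share of render time spent on ray/BVH traversal
        // rt_speedup = assumed hardware speedup on just that share
        double overall_speedup(double fraction, double rt_speedup) {
            return 1.0 / ((1.0 - fraction) + fraction / rt_speedup);
        }

        int main() {
            // Geometry-heavy scene (trees, mesh hair): traversal dominates.
            std::printf("80%% traversal, 10x RT cores: %.2fx overall\n",
                        overall_speedup(0.80, 10.0));
            // Shader/texture-bound indoor scene: traversal is a small slice.
            std::printf("20%% traversal, 10x RT cores: %.2fx overall\n",
                        overall_speedup(0.20, 10.0));
            return 0;
        }

    With the same hardware, the first scene comes out around 3.6x faster and the second only about 1.2x, which is roughly the range the benchmarks show.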

    If the 3080ti only has 12GB of VRAM, that's a bit disappointing. I had hoped it would increase to 13 or 16, and that the new Titan would go up to 36. 8K textures are becoming the standard outside of Daz-land, so I imagine G9 might feature 8K maps. Now that's a huge challenge for Iray, which drops to CPU unapologetically once you exceed the VRAM limit.

    Post edited by Asari on
  • nicstt Posts: 11,715

    No one said it was a render engine. Vulkan would allow Daz to support raytracing on both AMD and NVidia GPUs.

    No, it would not.

    DirectX already supports real-time rays on all GPUs. That does not mean iRay supports AMD. iRay doesn't use DirectX or any of those APIs. It works directly with CUDA-X. I assume that is how every other CUDA application works.

    What iRay does is directly use the RT cores on the RTX cards, which is much, much faster than software emulation, which is how DirectX does it on non-RTX cards.

    Yes, it would. A list of AMD GPUs already supports Vulkan.

    And you'll have to point out to me where I said IRay would be supported by AMD.

    The confusion may arise because you said "Daz"; folks use "Daz" to mean Studio regularly and consistently. It would certainly allow Daz to integrate support into Studio, which is what I presume you mean? I also presume you're not talking about one of their other products.

  • kenshaw011267 Posts: 3,805

    No one said it was a render engine. Vulkan would allow Daz to support raytracing on both AMD and NVidia GPUs.

    No, it would not.

    DirectX already supports real-time rays on all GPUs. That does not mean iRay supports AMD. iRay doesn't use DirectX or any of those APIs. It works directly with CUDA-X. I assume that is how every other CUDA application works.

    What iRay does is directly use the RT cores on the RTX cards, which is much, much faster than software emulation, which is how DirectX does it on non-RTX cards.

    Yes, it would. A list of AMD GPUs already supports Vulkan.

    And you'll have to point out to me where I said IRay would be supported by AMD.

    All GPUs already support Vulkan. However, if you don't think it would support iRay, why is this even being brought up here?
