Comments
The PC store had a lineup & was moving slow, so to find out *if* they may have it in stock I will go back this week, the 14th, that's if plans are not altered by the germaphobe zombie nation we have been turned into.
This always happens. I am sure there are 1200 systems that outdo my 3-year-old 5K beast; that's just the way it goes.
I'm not talking about the Iray version which supported RTX features, I'm talking about the first one which would recognize the cards as something usable to render.
Granted, it may not take as long for the 30xx cards, but I certainly don't expect them to work in DS on release.
It's OK, my original question was "would my system support it?", not "is it OK, should I?". I do appreciate the various feedback but will probably go ahead if the price is right.
I do render quite a bit, even back in the days of LuxRender when it took eons.
My understanding is that the new OptiX 6.0 that Iray RTX uses is different from the old OptiX Prime. The old OptiX Prime needed to be recompiled for every new graphics architecture, which is why we had to wait so painfully long for Daz to get supported. However, OptiX 6.0 does not need to be recompiled. So going by this information, any new GPU generation should work right out of the box with Iray. It may not be optimized, and an update would address that, but it should work. It may only need a driver update.
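If that is right, the mechanism would be similar to how ordinary CUDA programs stay forward compatible: ship PTX alongside the native machine code so the driver can JIT-compile it for architectures that did not exist yet. A minimal sketch of that idea (the toy kernel and nvcc flags are my own illustration, not how Iray or OptiX is actually built):

```cpp
// forward_compat.cu -- toy kernel, only to illustrate the compile-time side of the story.
// Example build line:
//
//   nvcc -gencode arch=compute_75,code=sm_75 \
//        -gencode arch=compute_75,code=compute_75 \
//        forward_compat.cu -o forward_compat
//
// code=sm_75 emits native machine code for Turing; code=compute_75 embeds PTX that a
// newer driver can JIT for architectures released after this binary shipped, which is
// the "may only need a driver update" idea.
#include <cstdio>

__global__ void hello() {
    printf("hello from the GPU\n");  // device-side printf, just to prove the kernel ran
}

int main() {
    hello<<<1, 1>>>();
    cudaDeviceSynchronize();  // wait for the kernel (and its printf) to finish
    return 0;
}
```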
Then the answer is yes. Any PC that can install a 2080ti can do it. You can literally pop a 2080ti into any PC with a supported PCIe slot and it will work for Iray. It uses a little more electricity than a 1080ti, but that's not an issue with a 750 W supply. You could even use both the 1080ti and 2080ti at the same time if you have two PCIe slots to support them.
Thanks, but I will be selling the 1080. As you say, double the speed of one 1080 sounds promising. In any case, in a few years when my beast is getting up there it will be time for a new system with whatever will then be the greatest.
If it does pan out I will offer you kind folks the opportunity to purchase my 1080ti prior to posting it on my local CL.
Granted, you would wait forever if you feared a better/cheaper product becoming available after you committed to purchasing one.
We are witnessing the resurgence of AMD, and Navi X2 doesn't even have to beat the 3080ti in order to make Nvidia's entire consumer line cheaper.
And when Vulkan is supported, a whole new realm of possibilities opens up for Daz users.
If you can wait, wait. There hasn't been a time like this for a very long time.
What does Spock have to do with anything? jk :)
Your system will then live longer and prosper, LOL
https://developer.nvidia.com/Vulkan
Good one, bro! Isn't Khronos also some ST planet?
I always buy NVIDIA cards, but wouldn't think twice about going to AMD if AMD could put out something that wouldn't have problems with games, etc. I'm a gamer first, and though AMD offers (in general) a far better price-point, it usually results in having issues with game compatibility, etc. The one big example that comes to mind is that when The Witcher 3 came out, AMD users could not make use of all the graphics options offered. And that...would really suck big donkey you-know-what's if I had to deal with that. So I just buy NVIDIA cards. Does that continue the cycle of NVIDIA greed about charging whatever they feel like because they can? Yeah, probably. I wish it were like the CPU race...I always used to buy Intel, but now AMD and Intel are so close in performance that Intel had to lower their prices to compete more directly with AMD. This gives users a choice. (I switched to AMD with my last build and love the RYZEN chipset line something fierce.)
I had AMD years ago and had temp problems, but that's going back a while.
The Nvidia keynote for GTC drops on Tuesday. If they are going to announce Ampere, that would be where and when; however, there is nothing in their press releases implying such. Even so, it has generally taken 4 to 6 months to get consumer cards launched. So if they announce Tuesday, it could be fall/winter before the flagship cards start hitting shelves. If the announcement is Tuesday and they release any sort of specs, people can make rational decisions then. But holding off on purchases for a product that isn't even officially announced seems weird to me.
Vulkan is not a render engine. It is a 3D graphics API, like OpenGL or DirectX. It has very little to do with DS, and adding support would not make a ton of sense unless they were adding a Linux version.
No one said it was a render engine. Vulkan would allow Daz to support raytracing on both AMD and NVidia GPUs.
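To make that concrete, here is a rough sketch (my own illustration, assuming recent Vulkan headers; nothing Daz has announced) of how an application can ask Vulkan whether a GPU from any vendor exposes hardware ray tracing via the cross-vendor VK_KHR_ray_tracing_pipeline extension:

```cpp
#include <cstdio>
#include <cstring>
#include <vector>
#include <vulkan/vulkan.h>

int main() {
    // Minimal instance; a real application would also fill in VkApplicationInfo.
    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        // List the device extensions and look for the vendor-neutral ray tracing one.
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasRT = false;
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
                hasRT = true;

        std::printf("%s: ray tracing pipeline %s\n",
                    props.deviceName, hasRT ? "available" : "not exposed");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

The point of the sketch is simply that the same query works on AMD and Nvidia hardware alike; the renderer on top would still have to be written against that API.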
AMD has been selling their 5700 series pretty well. With Ryzen, AMD has made huge strides not just in hardware, but in the all-important mind share. While they have not competed at the very top of the GPU race for a while, the 5700XT is a very solid card that can run with a 1080ti. AMD's biggest problem with their previous arch was that they were strictly limited to a specific number of compute units. But with RDNA2 that all changes. They can now pack in as many CUs as they can, so now they can truly compete. And also remember this: the 5700 debuted, and out of the blue Nvidia released "Super" versions of their cards and the prices dropped. The Super versions did not have the Founder's Edition price tags, which meant that prices were better. The competition had a near instant effect on the market. The 2080ti was the only one that did not get a Super version; it's also the only one that the 5700XT was unable to compete with.
Let's just look at the upcoming consoles. The Xbox Series X has been getting a lot of attention for how powerful it is. This thing is looking to easily match a 2080 in performance. People have been asking how on earth MS and AMD can afford to do that in what will likely be a $500 console.
The answer to that is that the consoles are not actually that powerful compared to what AMD is cooking! The GPUs in the PS5 and Xbox are just the start of the big hardware gains we are going to see with AMD and its RDNA2. These are still cheap GPUs.
Now with that understanding, it is easier to grasp just how much more powerful the full lineup of AMD will be. We have a situation where AMD and Nvidia are basically playing a game of chicken with each other. Both are waiting for the other to make the first move.
But Nvidia does not want AMD to grab the performance crown. If AMD did, that would capture the mind share of gamers and show that AMD is truly back. That sort of thing trickles down, so that even if people are not buying the fastest GPUs, they may still choose AMD simply because of reputation and brand. That is why having the fastest card is so important; it's a marketing tool. That is why both companies are pushing full steam ahead with their launch plans. Nvidia will not wait to launch if AMD pushes out first and beats the 2080ti. Nvidia will respond as fast as they can.
As far as what is being rumored, the 3080 will have 10GB and the 3080ti might get 12GB. It does seem like Nvidia has yet to decide exactly how much VRAM the 3080ti will get. They are probably waiting on AMD. Another very interesting rumor is that the 3000 series will have 4 times the ray tracing power of Turing. That's per tier: the 3060 will have 4 times faster ray tracing than the 2060, the 3070 4x the 2070, and so on.
If that last bit is true, and frankly I expect it to be because I predicted it last year, then Turing will quickly become obsolete as ray tracing becomes the standard. The rumors say that, basically, ray tracing in games will not have the huge performance hit it does with Turing.
The 3000 series will also offer a new iteration of Tensor cores and DLSS 3.0. One of the juiciest rumors was that they might use Tensor cores to help compress data in VRAM, which would mean games would use less VRAM. If this can somehow translate to Iray, just think about how incredible that would be.
These rumors have not gone away, and come from sources that have been correct in the past. But I also believe this to be possible. Consider this, Turing is on the same fab as Pascal, and look at how much performance they still gained (in part because the chips got bigger.) Now they jump to a 7nm fab, which is vastly smaller. They can do a lot more on a chip now, and they get higher clocks. So not only will they offer far higher core counts, but these cores will be faster at their jobs on top of that. It makes perfect sense to me that Ampere will be big. And again, just look at AMD Ryzen, which has been on the 7nm fab, too.
I plan on upgrading to Ryzen soon too. Current specs are:
Intel i5 4670K, Gigabyte Z97X-Gaming 7, EVGA GTX 1070 Gaming, 2x 8GB Patriot Viper 3 DDR-3 1866 memory, Samsung 860 EVO 1TB SSD, Hitachi HDT721010SLA360 1TB HDD, 2 x Western Digital Blue WD20EZRZ 2TB HDD in raid 1, EVGA SuperNova 1200 P2 Platinum PSU, HP DVD1720 optical drive, CoolerMaster CM 690 II Case, Samsung SyncMaster P2370 Monitor @ 1080p, Windows 10 Professional 64
Upgrading to a Ryzen 3700X and whatever motherboard I can find (motherboards are in very short supply right now) along with 32 GB of system memory once my stimulus gets here. I currently have a GTX 1070, so I plan on waiting until the 30-series Nvidia cards come out to see if I can afford a 30-series card or pick up a 2080TI cheap.
Yeah, I still have some Abit boards in storage as collectors' items with Athlon processors MELTED onto the sockets. Used to be an extreme overclocker for gaming and had a few cooler fans and water pumps fail back in the day. I also have some old 3DFX Voodoo and Voodoo 2 cards around here somewhere in a box of old parts.
To the OP: if you already have the 750W PSU, you should be good. But if you haven't bought it yet, I would go with the biggest power supply you can afford, even if you have to wait on the GPU for an additional month or two.
No, it would not.
DirectX already supports real-time rays on all GPUs. That does not mean Iray supports AMD. Iray doesn't use DirectX or any of those APIs. It works directly with CUDA-X. I assume that is how every other CUDA application works.
What Iray does is directly use the RT cores on the RTX cards, which is much, much faster than the software emulation DirectX falls back to on non-RTX cards.
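As a rough illustration of that point (my own sketch, not Iray's actual code): a CUDA application only ever sees devices that the CUDA runtime enumerates, which means Nvidia hardware. An AMD card simply never shows up in this list, regardless of what DirectX or Vulkan can do with it.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);  // only CUDA-capable (i.e. Nvidia) GPUs are counted
    std::printf("CUDA devices visible: %d\n", count);

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop{};
        cudaGetDeviceProperties(&prop, i);
        std::printf("  %d: %s, %zu MiB VRAM, compute capability %d.%d\n",
                    i, prop.name, prop.totalGlobalMem >> 20, prop.major, prop.minor);
    }
    return 0;
}
```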
Nah, I'm just buying the single GPU, provided I get the sale price I was referring to earlier.
A 750-watt PS should be more than enough. However, if you ever plan on adding another card, you might want to think about upgrading to a 1,000-watt PS.
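For rough numbers (the board-power figures below are ballpark TDP-style estimates on my part, not measurements, and transient spikes are not modeled), the arithmetic looks something like this:

```cpp
#include <cstdio>

int main() {
    const int cpu       = 150;  // high-end desktop CPU under load (estimate)
    const int gtx1080ti = 250;  // approximate reference board power
    const int rtx2080ti = 260;  // approximate reference board power
    const int restOfRig = 100;  // drives, fans, RAM, motherboard (generous estimate)

    int oneCard  = cpu + rtx2080ti + restOfRig;
    int twoCards = cpu + rtx2080ti + gtx1080ti + restOfRig;

    std::printf("single 2080 Ti build: ~%d W\n", oneCard);   // ~510 W, comfortable on 750 W
    std::printf("2080 Ti + 1080 Ti:    ~%d W\n", twoCards);  // ~760 W, why 1000 W gets recommended
    return 0;
}
```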
Wait for the 3000-series cards? I probably won't unless they're going to be cheaper (never happen). And when I say "wait" I mean waiting until after the "early buyers" go through the guinea pig phase. I plan on getting a Titan RTX in my next build, and I don't really expect the prices on that particular card to drop enough when the 3000-series cards are released to justify waiting another year. I would imagine Nvidia will be asking $3500+ for the next-gen Titan card. Prices just seem to keep going up & up with each successive generation of Nvidia cards, so unless we're going to be able to start using AMD cards for Iray rendering in Daz, I really don't care about AMD graphics cards.
At this point, Ampere is all hype and no show, kind of like what they did with Turing, and we saw how that came out: extremely overpriced video cards with disappointing ray tracing capabilities (although it's not like there were a whole bunch of games that took advantage of the "feature"). I think the Turing series cards were a bit rushed to release and were just something to hold Nvidia over until the next over-hyped release...
Yes, Nvidia is greedy, but right now they have a fairly large chunk of the market.
This actually surprises me! I always thought that PAs use monster graphics card setups like Titans, Quadros, or triple SLI with 1080Ti's or 2080Ti's in order to get their promo renders done fast.
We are going to see what Nvidia wants us to see. If you don't think Nvidia is going to charge a lot more (like $300-$500 more) at the outset for a 3080Ti and the Ampere Titan vs their previous-generation counterparts, then I have Bill Gates on the phone offering to split the difference with everyone who buys those cards.
Sure, Nvidia could lower the price of say the 3060, 3070, and 3080 series to compete with AMD in the gaming industry, but rest assured, Nvidia IS going to make up the difference by charging more for the aforementioned higher end 3000 series cards.
I upgraded from a 980TI to a regular 2080 (not TI or Super) and got a TREMENDOUS boost in speed. But, to be fair, I still use my 980TI for rendering as well (unless the scene can't fit in its 6GB of VRAM).
I use both cards to render. I'm surprised a lot of you aren't keeping your 980TI in the system to have even more CUDA cores.
Yes, it would. Plenty of AMD GPUs already support Vulkan.
And you'll have to point out to me where I said IRay would be supported by AMD.
Agreed that knowing what is coming is a good idea.
... But all we can be confident about is Nvidia releasing a new iteration sometime this year (very probable). Anything else is at best speculation, and invariably guesswork.
I'm not making plans to spend hundreds or thousands based on speculation, never mind guesswork.
I think this is the difficulty with RTX: it is more difficult to benchmark, and it really depends on the scene. We had benchmarks where RTX outperformed non-RTX by a factor of three, and we had scenes where the speed gain of RTX was merely around 15%. Sure, the 20xx and 30xx series will be faster cards overall. But RTX is a factor when you decide whether they are worth the money, because these cards are expensive.
If the 3080ti will only have 12GB of VRAM, that's a bit disappointing. I had hoped it would increase to 13 or 16 and the new Titan would go up to 36. 8K textures are becoming the standard outside of Dazland, so I imagine G9 might feature 8K maps. Now that's a huge challenge for Iray renders, which drop to CPU unapologetically once you exceed the VRAM limit.
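For a sense of scale (back-of-the-envelope only; real renderers may compress or downscale textures, so treat this as a worst case), uncompressed 8K maps add up quickly:

```cpp
#include <cstdio>

int main() {
    const long long side  = 8192;              // 8K texture edge length
    const long long rgba8 = side * side * 4;   // 4 bytes per texel (8-bit RGBA)
    const double    mib   = rgba8 / (1024.0 * 1024.0);

    std::printf("one 8K RGBA8 map: ~%.0f MiB\n", mib);       // ~256 MiB
    std::printf("a figure with 10 such maps: ~%.1f GiB\n",   // ~2.5 GiB before any geometry
                10 * mib / 1024.0);
    return 0;
}
```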
The confusion may arise because you say Daz; folks say Daz to mean Studio regularly and consistently. It would certainly allow Daz to integrate support in Studio, which is what I presume you mean? I also presume you're not talking about one of their other products.
All GPUs already support Vulkan. However, if you don't think it would support Iray, why is this even being brought up here?