Comments
I can't see cloud-based rendering becoming commonplace for the foreseeable future. Considering the time it takes for a complex scene to be loaded into my RAM locally before rendering, I hate to think how long it would take to send that information over a broadband connection. Also, when I work on a scene, I might be at it for several hours setting up the scene, tweaking poses and expressions, and over those several hours I might easily make a dozen test renders. On my local machine I don't really pay anything for them, but if I was paying per render on a cloud server, I wouldn't be able to work accurately on anything but the simplest of scenes. Maybe a professional CG artist can get it right first time, or not have to worry about the cost of rendering if they have a big company behind them, but I'm sure that hobbyists will be keeping their own local setups for decades yet.
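For what it's worth, here's a quick back-of-the-envelope sketch of the upload problem, assuming a 4 GB scene and a 20 Mbps home upload line (both numbers are made up for illustration):

```python
# Back-of-envelope: how long just uploading a scene to a cloud renderer might take.
# Scene size and upload speed are illustrative assumptions, not measurements.
scene_size_gb = 4        # assumed size of a complex scene with all its textures
upload_mbps = 20         # assumed home broadband upload speed, in megabits per second

scene_megabits = scene_size_gb * 8 * 1000      # GB -> megabits (decimal units)
transfer_minutes = scene_megabits / upload_mbps / 60
print(f"~{transfer_minutes:.0f} minutes just to upload the scene")  # ~27 minutes
```

And that is before the render even starts, and before pulling the result back down.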
Privacy, proprietary content and content restrictions are three very important hurdles that cloud needs to handle before it will become the primary method. Nothing more secure than locally hosting all of your own stuff.
There. Fixed that for me. Different strokes for different folks. I'm sure lots of people will be happy with cloud rendering - it's less cost up front. Just like some would rather rent a home than spend the time, money, blood, sweat & tears it takes to own one. And I'm fine with that; if it makes sense to you, go for it. As for me, I'll keep rendering in my tiny overheated room on my own GPUs, surrounded by my books, CDs, DVDs and vinyl records.
Additionally, I guess GPUs can also be used for playing games. Plus there are all sorts of niche projects and hobbies that you can't do with cloud rendering. Apparently now there's a Single Board Computer (similar to Raspberry Pi) with a PCI-E connector on it. So it's an SBC with a low powered processor, but it can connect to any standard GPU. I don't know what it's meant for, but I can imagine the possibilities. That's why I love having my own stuff.
I'm trying to imagine having a very old and/or inexpensive GPU that is pretty much incapable of doing an Iray preview. And trying to set up the scene and guess at what the final rendered image will look like from just a texture shaded view or whatever. No idea what the emissive lights and materials will look like in the final render. No clue. And then you just assume it's okay, send it to the cloud for rendering, and get back something that's nothing like you wanted.
Repeat process over and over and over.
Yeah, not gonna happen.
Yea, I run an Iray render server and while everyone that uses it says how much it helps, it really is quite a hard sell for people to invest in it as a service. Certainly is a great option for people who want access to hardware otherwise out of their reach though.
Thank you for telling me about your Iray render server (a while ago). The only reason I have hesitated to use the service is because of this: "Pay as you go - $0.40 a minute (unlimited* access billed monthly)" This is the service I would most likely want. The problem for me is knowing how much a render is going to cost before it's done. It's $0.40 a minute, but how many minutes does it take to render an image? I suppose it depends on file size.
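A rough sketch of how that billing adds up; the render times here are purely guesses for illustration:

```python
# What a single render costs at the quoted $0.40/minute pay-as-you-go rate.
# The render times below are guesses for illustration; actual time depends on
# scene complexity, resolution and convergence settings rather than file size.
rate_per_minute = 0.40

for render_minutes in (15, 30, 60, 120):
    print(f"{render_minutes:>3} min render -> ${render_minutes * rate_per_minute:.2f}")
```

So a dozen hour-long test renders on a complex scene would add up quickly, which is exactly the concern raised above.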
...some of us have no choice but to rent (housing prices where I live are obscene and I am now in my mid 60s and retired so not going to take on a mortgage at this stage).
That also applies to subscription software.
Being on Social Security, I cannot afford to plop down $600 up front for an Octane perpetual licence and the Daz plugin. The subscription track gives me the full engine plus plugin for $20 per month; that's two and a half years of use before the subscription adds up to the same amount, with the cost spread out over time, which is easier to manage on a tight budget.
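A quick sketch of the break-even math, using the prices quoted above:

```python
# Break-even point of a $20/month subscription against a $600 perpetual license
# (prices as quoted in the post above, not verified).
perpetual_cost = 600
monthly_cost = 20

breakeven_months = perpetual_cost / monthly_cost
print(f"Break-even after {breakeven_months:.0f} months (~{breakeven_months / 12:.1f} years)")
# -> Break-even after 30 months (~2.5 years)
```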
Would a 2080ti render twice as fast as my 1080ti? Would an SLI setup be four times as fast?
Not impressed by that game either. They need to use photogrammetry for their game textures instead of stuff made in Photoshop, IMO.
Just so all of you RTX-using people are aware, there is a new version of the Daz Studio Pro Beta - 4.11.0.335:
https://www.daz3d.com/forums/discussion/265581/daz-studio-pro-beta-version-4-11-0-335-updated/p1
One of the things in this update is an updated Iray which adds support for the tensor cores that are part of the RTX technology. This won't accelerate rendering (it doesn't add RT core support) but it should provide a hardware speedup for the AI denoiser. So be sure to update your NVIDIA drivers and give that functionality a good beating.
So is this update only worth grabbing if you've got an RTX card, or does the AI denoiser work on GTX too?
The denoiser works on my GTX 1080ti quite nicely.
Thanks, will have to wait to grab it till this render finishes baking lol.
I know what you mean. When I started college in '92 at a major university (ACC, football program and all), $2600 covered a whole semester, books, room, meal plan, everything. Today, it covers basic tuition at the local tech school. We're getting priced out of everything. I'll finish this degree sooner or later.
The 2080ti is definitely faster, but I don't think it's twice as fast. The two cards will work together fine as long as they're both installed; enabling SLI will actually cause problems and slow them down.
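A rough way to picture it, assuming Iray simply adds the cards' throughput when both are selected as render devices. The relative speed figure here is an assumption for illustration, not a benchmark result:

```python
# Rough multi-GPU render-time model: with both cards selected as render devices
# (SLI disabled), throughput is approximately the sum of the individual cards.
# The relative speeds are illustrative assumptions, not benchmark numbers.
relative_speed = {"1080ti": 1.0, "2080ti": 1.6}   # assumed: 2080ti ~60% faster pre-RTX

baseline_minutes = 60   # hypothetical render time on the 1080ti alone

print(f"1080ti alone:        {baseline_minutes:.0f} min")
print(f"2080ti alone:        {baseline_minutes / relative_speed['2080ti']:.0f} min")
print(f"Both cards together: {baseline_minutes / sum(relative_speed.values()):.0f} min")
```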
Iray is a rendering engine, just like Octane is a render engine. They do the same thing. Daz does not design Iray in any way, true, but Daz does need to implement the plugin once Nvidia provides the new SDK, and they tailor it to Daz Studio. Iray is simply another product that Nvidia sells; it is not just designed for small studios or whatever. It just turns out that it has become more niche than Nvidia probably hoped for, most probably because it is exclusive to their hardware. Luxmark is hardware agnostic, and so is Blender. Iray also allows for customization; after all, you have seen the Iray programming guide. It may not be as pliable as a game engine, but game engines are designed to be highly adjustable on purpose. They have to be in order to accommodate all the different kinds and styles of games there are, and every game has very unique demands on what it needs. A first-person shooter will value high frame rates and speed, whereas a third-person exploration game may desire more atmospheric effects for mood.
The game changer comment has weight because of just how very rare it is for anybody at Daz to talk about future product features. Daz NEVER does that. I cannot even recall a time when somebody from Daz went on record in this fashion. It shows that they are extremely confident in what is coming out of RTX.
And they have good reason to be. RTX GPUs are already about twice as fast as last gen, and that is without their prime feature enabled. Every software that has enabled RTX so far has seen massive, and I mean MASSIVE, gains. Want me to define massive? By massive I mean multiple times faster than not having RTX. The typical generational leap is usually only 50%, if even that. Turning RTX on gives results that are in the 2-3X+ range. No generation has ever seen this much of a leap in performance. Take a look at what turning on RTX does for Octane vs turning it off. And keep in mind this chart only lists RTX cards. If the 1080ti was on here, it would be pretty far behind the 2080ti even without RTX enabled. These results are pretty common. I can't really think of any instance where turning RTX on has not resulted in a huge performance gain.
I believe the results seen here will be duplicated when Iray gets RTX enabled. Just look at this, the 2060 with RTX on is faster than the 2080ti without RTX!!! That is huge. Just think of that. A $350 2060 could blow the doors off the (formerly) $700 1080ti. The 2060 with RTX enabled could be rendering faster than the 2080ti is right now. That is amazing. For comparison, the 1060 certainly was not even close to the 980ti. The 960 was nowhere near the 780ti. I could go on and on about each generation. This is what I mean by a huge generational leap and why so many in the rendering community are excited. If that is not a "game changer", then what in the world is? LOL.
And why would Iray not benefit like Octane? There is no possible way that Iray doesn't get similar benefits. Iray is not a game engine. You do not "partially" integrate ray tracing cores here. A game might only use RTX for shadows (like the ironically named "Shadow of the Tomb Raider"), or reflections like Battlefield 5, or global illumination like Metro Exodus. Each of these games only implements ray tracing for one single aspect of their lighting. Iray is fully ray traced just like Octane.
Gaming does have a ray traced benchmark, and now that Pascal GPUs can use ray tracing in games, people have tested Pascal in ray tracing gaming benchmarks. The prime bench is 3DMark's Port Royal. The results are pretty drastic. Like Octane, the 2060 destroys the mighty 1080ti, which can barely even handle the bench. The jump to the 2080ti is quite massive here. So even in gaming, it is possible to see just how much the ray tracing cores add to RTX cards.
Game changer is such a marketing term though. What game is being changed here exactly? So you'll render faster. That's great, but it changes nothing other than not having to wait as long for a render to finish. What game is changed when I have to wait 5 minutes instead of 20? I still have to wait. I still have to set up the scene, and all the workflows are still the same.
A game changer would be if, solely because of RTX, we would suddenly get a real-time viewport with the same quality as a finished render. Or if we suddenly got completely real-time ray-traced games with the same quality as baked lightmaps. That will not happen for a while, so let's say what we have now could at least be the beginning of a game changer. But for offline rendering? I just see a very welcome speed boost, one that is indeed much higher than the usual generational steps. This does not change any game whatsoever. Game changer implies that something that was previously done in one manner can now be done in a completely different and much more comfortable manner.
For the price they are asking, I might go for it if it doubled the VRAM. The scenes I do usually have to be done in multiple passes, and more VRAM would mean fewer passes. Cutting a few minutes off of each render is not going to make me go out and spend over a grand (for the 2070) on a graphics card; that's over a week's pay, and work is sporadic as an independent contractor.
Yea I think a lot did, which is why I now do an 'all you can eat' monthly subscription. Drop me a PM if you want more info, I don't want to derail the thread :)
...well, it doesn't show up in the DIM.
I render for a living, so to speak. If speed was such a game changer, I would throw more hardware at it or maybe use a render farm. Nothing keeps me from spending more money to get faster rendering. A 2080 TI costs a lot of money too, so that problem wasn't solved either. It just happens to be a good upgrade, at least for us rendering folks, from a price vs speed point of view. That's pretty much it though. This doesn't change my game, but maybe it's just semantics and we have different understandings of what exactly a game changer means.
Iray was a game changer for Daz Studio. It rendered much faster (with proper hardware at least), it changed the workflows of PAs towards a more modern PBR style, and it was more intuitive and more productive. It brought Daz Studio into the age of GPU rendering. That's the type of thing that I call a game changer. Fully ray-traced games are what I would call a game changer, not even so much for the gamer but for game developers, who would get around a lot of annoying hackery and endless hours of lightmap baking.
I tend to agree. I really think it's pretty much a game changer. But not totally a game changer, maybe only 75%. But it's certainly awesome. No doubt. And let's face it, the whole thing is a total beast. Seriously.
And let's not forget jaw dropping. Though I'm not sure. Maybe. I think it's more of a beast than jaw dropping. But no doubt it will leave all others in the dust.
!!!!!!!!!!!!!
But that's not the scenario. You could already increase the speed, you just needed to throw more hardware at it. If cost isn't an issue and only speed matters, you can absolutely do that. It's what VFX studios do with their render farms.
Whaaaat? Did I read this right? You, our favorite hardcore pessimist, handing out warnings left and right and advocating caution with RTX not a few months ago (maybe weeks), suddenly think this is a game changer after all? Well, I'm kinda glad to hear that.
I don't know what "3-6 times faster" actually means. Mathematically, this is confusing.
Does it take 1/3rd to 1/6th the time that the old one did?
Actually it was a bit of sarcasm, highlighting how pointless it is to try to define marketing phrases that are ultimately meaningless, like "game changing" and "awesome" and "it's a beast". Which is why I immediately dismiss them as soon as I hear them.
Both ways of describing it are correct.
The "3-6 times faster" means that in a given amount of time you could render 3-6 times as many frames for an animation, for example.
On the other side, it also means that a single frame would take between 1/3rd to 1/6th as long to finish rendering.
So it just depends on which way you want to look at it.
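A small sketch showing both readings side by side, using a made-up 30-minute baseline:

```python
# How an N-times speedup reads both ways: time per frame drops to 1/N,
# and the number of frames finished in a fixed window grows by N.
# The 30-minute baseline is a hypothetical example, not a benchmark.
old_minutes_per_frame = 30

for speedup in (3, 6):
    new_minutes = old_minutes_per_frame / speedup
    frames_in_old_slot = old_minutes_per_frame / new_minutes
    print(f"{speedup}x faster: {new_minutes:.0f} min per frame, "
          f"{frames_in_old_slot:.0f} frames in the old 30-minute window")
```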
Yeah, when I added a 1070 to my 1060, I was expecting a huge difference in render time, since I was more than doubling the core count. Same when I went from a 960 to a 1060; reality did not match my expectations. It seems to me that GPUs are in the same kind of slump CPUs have been in as far as diminishing returns every generation.