Comments
@wolf359 Heh, well CC3 forums are really quiet, but to be honest, I don't really know what to talk about in there. I mean, for me CC3 is just a pit stop where I transfer my Daz characters to a more game-friendly format, and then I move on to other programs where the real work is done. I have no idea why there even is an Iray plugin, since what's the point of rendering there without any environments, poses, etc., when it's obviously just a character creator program? I check their forums quite often to see what they are planning, and I like their 2019 roadmap, but I just don't have much to say there...
Actually that is exactly the purpose for which the program was designed: a content conversion utility for use in game dev pipelines. Hence the zero interest from Poser users, who (like their Daz Studio contemporaries) largely create still illustrations and pinups, not animation for video games.
As far as I can tell, Iray has been a blunder for Reallusion. There is very little interest or discussion in the forums about rendering animation with a unidirectional, brute-force path tracer that doesn't even have a dedicated hair shader.
This assumes that Unity or Unreal developers want to attract still image renderers rather than animators.
And that they'd devote budget and resources to do that just to get a niche market, and that they'd be okay with simplifying their program for point-and-click users. I don't really see the logic, when the gaming market is more lucrative.
And capturing that market needs more than just the software. You need content, which they do not create, and it has to be made to target this specific audience and not just anyone (and someone has to explain to me why, if some competing content is already out there, there are posts from people wanting to transfer DAZ content to these other platforms). Also, assuming that some random content, even with RTX tech, would move Poser or DS users out of their comfort zone is just too far-fetched. That's also forgetting the emotional value. I'm sure Victoria/Michael/Genesis are a long love story for lots of people and that they wouldn't go away; they'd just complain on the forum or eventually file a request. Should I also point out that there is a big difference between a random fixed figure that has only one shape and set of textures and a figure that can shapeshift into thousands of (more or less) different ones?
That some interested people are planning to use, or already using, Unity/Unreal/whatever, sure. Shifting a market... you'd need a much bigger lever than just RTX for that (reminder: did all Poser users move to DS when they felt like DAZ had abandoned them?).
No, they didn't. They moved to other sites that did try to maintain a level of support for the older figures or those made for Poser itself. Some have since moved here, but not completely, as I understand it.
Yes. Right now you always have to go through another tool as a mid-point, i.e. Maya (Daz to Maya, which you still need to tweak, fix up, etc.) or something like iClone 3DXChange, which I've found to be quite buggy and prone to crashing. Exporting directly to Unreal and/or Unity would be quite nice. If you watch the tutorials on YouTube, there's a lot of work involved in getting characters into these engines. It's boilerplate and tedious, and certainly either scriptable or plugin-able.
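On the "certainly scriptable" point: here's a minimal sketch of what the Unreal side of that automation could look like, using the editor's built-in Python scripting plugin. The folder, file name, and destination path are placeholder assumptions, and it presumes the FBX has already been exported from Daz Studio; a real setup would still need to configure the skeletal mesh and material import options.

```python
# Minimal sketch: automating the tedious FBX import step on the Unreal side,
# using the Unreal Editor's Python API (Editor Scripting / Python plugin enabled).
# Paths and asset names are placeholders; the Daz-side FBX export is assumed
# to have been done already (File > Export in Daz Studio).
import unreal

def import_daz_fbx(fbx_path, destination="/Game/DazCharacters"):
    """Import a single FBX exported from Daz Studio into the current project."""
    task = unreal.AssetImportTask()
    task.filename = fbx_path            # e.g. "C:/Exports/Genesis8Female.fbx"
    task.destination_path = destination # content-browser folder to import into
    task.automated = True               # suppress the interactive import dialog
    task.replace_existing = True        # overwrite on re-import after tweaks
    task.save = True

    asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
    asset_tools.import_asset_tasks([task])
    return task.imported_object_paths   # assets that were created

if __name__ == "__main__":
    # Run from the Unreal editor's Python console or as a startup script.
    import_daz_fbx("C:/Exports/Genesis8Female.fbx")
```

That only covers the import half of the round trip, but it's the kind of boilerplate step that a plugin or script could take off your hands.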
I think that it's more a question of Daz being able to expand their market. Surely the number of people who want to render still images is shrinking, but Unity and Unreal users are a growth market.
Welp, I don't know if there's any real-time ray tracing in this, but it's still just about the best thing I've ever seen come out of a game engine. Mostly due to Megascans, I guess.
https://www.youtube.com/watch?v=9fC20NWhx4s
According to Epic, it is using Unreal 4.21, and RTX support is only coming in 4.22. So there's no RTX in this video. But at the end of the day, does it even matter when this is the result? Another noteworthy thing is that this video was made by three people, not a massive studio.
There is an RTX video out called "Troll". Troll runs on a single 2080 Ti.
There is a little more info in this article, which highlights Unreal as a whole. There is also a cool physics demo linked in there.
https://www.polygon.com/2019/3/20/18273847/unreal-engine-graphics-future-gdc-2019
No, it doesn't. Just to clarify for those who may not know: there is absolutely ray-traced GI going on here, it's just that it was pre-baked into the textures through lightmapping. It's essentially offline rendering like we do; UE4 then bakes all the information directly into the textures. This doesn't work for moving lights, though. And not for reflections, of course.
And the biggest problem: to really benefit from Iray or Octane you need an NVIDIA graphics card. The Unreal Engine also works with AMD cards. So the ideal solution would be a cranked-up Unreal plugin for Daz Studio.
Any other Unity dev waiting for April 4th (https://unity.com/ray-tracing)? Who knows, maybe it really is working already....
I've heard a lot about Unreal Engine improvements as well. I haven't personally tried it. Why would you render in Unreal Engine rather than in Daz if you are going for still images?
Does Unreal Engine 4 (before the upcoming ray tracing patch) render better than Iray in Daz Studio? Can anyone show some comparisons? Thanks.
Simple. Because of speed.
If you can render anywhere close to real time with Unity or Unreal... just think about how fast a still image would render. It would take just a few seconds to resolve. You do not need to animate anything. You move the camera around, the image resolves itself in real time for you, and you take your shot. Not to mention having that real-time feedback on your progress: how many times have you set everything up only to find that something was off when you started rendering? With Iray you need to stop rendering, close it, and go back. With how things are shaded sometimes, even in Iray Preview, it can be very hard to tell what you have until you actually hit "render". With a game engine you do not have that problem. You see what you get, live.
Think of it as being like the Iray Preview mode in Daz Studio, but better. The game engines are extremely flexible. If you want non-PBR elements in the scene, you can totally do that. You can have hybrid methods of rendering going on, so if you wish to do things like particle effects, which Iray is VERY poor at, you can have cool-looking particle effects using classic game rendering while still using ray tracing. Iray cannot do that. Unreal and Unity can do great particle effects already, without ray tracing.
In that sense, the game engines can actually offer an advantage. They can also do motion blur and other effects as well, which Daz Iray lacks. This flexibility is what makes the game engines so compelling. Imagine being able to render both 3DL and Iray at the same time, picking and choosing what parts of the scene use which render so that you get the best of both worlds. Unreal and Unity can do this, and they can do it in real time.
Ray tracing is only just now coming out for Unreal and Unity, so there are not a lot of things to show yet. Unity does not even have ray tracing yet, officially, as it only releases in April. So it will take some time for people to show what they are capable of.
This is a bit misleading, though. As mentioned above, ray tracing has been in game engines for a while, just not in real time. It's pre-baked into the textures through lightmapping: pretty much a render button and a lengthy process that can take hours on complex maps, currently done on the CPU in UE4, though I think they're working on a GPU implementation. This only works for static lights. Reflections have to be faked some other way, like reflection captures and screen-space reflections.
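To make that baked-versus-real-time distinction concrete, here is a deliberately toy Python sketch. It is not UE4 code, and the "GI" function is just a stand-in; the point is only that a lightmap is an expensive solve done once and cached per texel, which is why it falls apart the moment a light moves.

```python
# Toy illustration of baked lightmaps vs. per-frame lighting. Not engine code.

def expensive_gi(texel, light_pos):
    """Stand-in for an offline global-illumination solve (hours on a real map)."""
    # distance-squared falloff as a placeholder for the real lighting math
    d2 = sum((t - l) ** 2 for t, l in zip(texel, light_pos)) or 1.0
    return 1.0 / d2

texels = [(x, y, 0.0) for x in range(4) for y in range(4)]
static_light = (1.0, 2.0, 3.0)

# "Bake": run the expensive solve once, store the result per texel (the lightmap).
lightmap = {t: expensive_gi(t, static_light) for t in texels}

# Runtime with a static light: just a lookup per texel -- effectively free.
frame_static = [lightmap[t] for t in texels]

# Runtime with a moving light: the baked values are stale, so you either re-bake
# (offline again) or evaluate lighting per frame -- which is what RTX-style
# real-time ray tracing makes affordable.
moving_light = (4.0, 0.0, 1.0)
frame_moving = [expensive_gi(t, moving_light) for t in texels]  # per-frame cost
```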
I'm not sure most people realize that many of these awesome-looking games are technically not completely real-time rendered, as oftentimes the lightmaps were pre-baked, which is essentially not that different from what we do in Iray, including waiting around for it to finish.
Well, there will be the time factor that you will spend adjusting your Daz/Iray materials for the Unreal/Unity engine, and for complex scenes with many materials it could become labor intensive. Daz transmapped hair looks particularly bad in Unreal, as the iClone community is now learning since Reallusion started showcasing their live link to Unreal.
Also, the export method will make a big difference in how your Genesis people are going to look over in Unreal/Unity. If it is FBX, say goodbye to Daz JCMs and HD morphs. If it is .obj export, you will get true mesh fidelity but will need to re-export the .obj for every change, unless you are using something like Alembic (with Unity, at least). And any dForce clothing will definitely have to be re-exported as a static mesh every time.
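On that re-export churn specifically: here is a hedged sketch of one way to automate at least the Unreal-side half of the loop, polling an export folder and re-importing any OBJ/FBX that Daz Studio has rewritten. It uses the Unreal editor's Python API; the folder path and destination are assumptions, and nothing here recovers the JCMs or HD detail lost in the export itself.

```python
# Sketch: re-import any OBJ/FBX in an export folder that has changed on disk,
# run inside the Unreal editor's Python environment. Paths are assumptions.
import os
import unreal

EXPORT_DIR = "C:/Exports"            # where Daz Studio writes .obj/.fbx files
DESTINATION = "/Game/DazCharacters"  # content-browser folder to import into

def reimport_if_changed(seen_mtimes):
    """Compare file modification times against the last pass and re-import changes."""
    for name in os.listdir(EXPORT_DIR):
        if not name.lower().endswith((".obj", ".fbx")):
            continue
        path = os.path.join(EXPORT_DIR, name)
        mtime = os.path.getmtime(path)
        if seen_mtimes.get(path) == mtime:
            continue                 # unchanged since last pass
        seen_mtimes[path] = mtime

        task = unreal.AssetImportTask()
        task.filename = path
        task.destination_path = DESTINATION
        task.automated = True
        task.replace_existing = True # overwrite the previous import in place
        task.save = True
        unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

# Example: run one pass; a real setup would hook this to an editor timer or a
# file-system watcher rather than a blocking loop.
seen = {}
reimport_if_changed(seen)
```

It doesn't remove the need to hit export in Daz Studio after every dForce or shape change, but it keeps the Unreal side from becoming a manual import chore.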