Comments
It uses displacement just fine, presuming the polys are dense enough to support it. It works like most render engines not named Firefly or 3Delight (and probably RenderMan).
Note: I do recommend setting the Render Time Sub-D level instead of the Tessellation level for better results; for the same level of Sub-D, the Tessellation level is not handled on the card as well as the Render Time Sub-D level.
The one time I've found displacement useful is trying to tease out some shape from transmapped hair for closeups.
Though at that point I needed to punch up SubD to, like, 5 and frankly I would have been better off using a better hair.
Noted. Thank you for the info, Spooky.
You may want to start a separate post in one of the other forums to quickly get to the bottom of that.
As a looooong time Reality/Lux user I am getting really spoiled with Iray. Even though I still struggle with learning what means what in the material options and with setting them up as easily as I could in Reality, the interactive preview window is FREAKIN awesome!! Seeing changes as you make them - it's just fantastic. It has actually made rendering less task-oriented and more fun. With my new PC, I can get great render times with either Lux or Iray, so that's not an issue, but having to bring up Reality every time I make a small change gets old when I can see it happen as I go with Iray. I do feel a bit guilty, as I haven't used Reality/Lux in a while now, and being a big fan of both and an advocate, it kinda pains me.
Just an additional note and clarification.
When using the surface tessellation setting, the amount of video card memory spikes during the initial calculation before settling back down to the render-time usage level. The higher the tessellation level, the bigger the spike. If this spike exceeds the memory available on your card, the card will drop out of the render, causing much longer render times.
Setting the Render Time Sub-D level will increase video card memory usage because of the extra polys, but it won't cause the spike, which will keep your video card in the mix much longer.
Displacement is a balancing act. Remembering that each level of tessellation or Sub-D quadruples the number of polys on the object, don't use more than you need to get the effect you want. :)
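To put rough numbers on that quadrupling, here is a quick back-of-the-envelope sketch in Python; the base face count is just an example, not taken from any particular figure or product:

```python
# Each level of subdivision/tessellation splits every quad into four,
# so the face count grows by roughly 4x per level.
base_faces = 20_000  # hypothetical base mesh with 20k quads

for level in range(6):
    print(f"SubD/tessellation level {level}: ~{base_faces * 4 ** level:,} faces")

# Level 5 already lands at ~20 million faces, which is why high levels
# can blow past the memory available on a video card.
```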
The problem I usually run into is that if I'm close enough such that Bump/Normal isn't good enough to make the surface look right, I'm unlikely to get SubD up high enough to keep the displacement from looking obviously pixellated.
There are, again, some occasional exceptions, and it might be less of a problem with higher end cards.
Aren't displacement maps the method that is being used to create a product such as the Shape Reprojector?
http://www.daz3d.com/simtenero-shape-reprojector
Not based on the description of the product.
It appears to be creating morphs.
Hmmm, OK, but similar concept. I suppose, though, you could make Mount Rushmore on a plain cliff model using displacement. Displacement sounds like the technique all the software apps that make 3D models from photographs use, although I guess it's now faster and more accurate with that ability built into the video cards.
I think there may be some confusion here.
The only modeling software that I am aware of that can use Displacement maps for modeling is Carrara if you have the "Anything Grooves" plug-in. While Displacement maps actually move the mesh, only Anything Grooves (AFAIK) allows this movement to be baked into the mesh, and exported that way.
Something may have changed since the last time I used this method, but it really is not an efficient way to model, in any case. Morphs are generally created using modeling techniques, not displacement maps, especially since displacement maps generally require tessellation to make the shape changes, and that would change the vert order and make the morphs not work.
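For anyone trying to picture what "the mesh actually moves" means here, below is a minimal sketch in plain NumPy of baking a displacement map into vertex positions. The plane mesh, stand-in height map, and strength value are all invented for illustration; this is not any Carrara or Anything Grooves API:

```python
import numpy as np

# A flat, tessellated plane: res x res grid of vertices.
res = 64
xs, ys = np.meshgrid(np.linspace(0, 1, res), np.linspace(0, 1, res))
verts = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)
normal = np.array([0.0, 0.0, 1.0])   # every vertex shares the plane normal

# Stand-in displacement map (any grayscale image resampled to res x res works).
disp_map = 0.5 + 0.5 * np.sin(xs * 10.0) * np.cos(ys * 10.0)
strength = 0.05                      # displacement scale in scene units

# "Baking": permanently offset each vertex along its normal by the map value.
baked_verts = verts + normal * (disp_map[..., None] * strength)

# Exporting baked_verts instead of the flat plane is what turns render-time
# displacement into real geometry; note the plane had to be tessellated
# (lots of vertices) before displacing it could produce any detail.
```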
Hmmm, I read the description of what displacement does and guessed that it must use very similar methods to other products, if not the same technique used by a product such as AutoDesk 123D Catch. So that isn't the case, I guess. Still interesting, although my HW can't handle displacement anyway. Maybe next year I can try.
ZBrush can do it, as I recall, though of course it is a bit weird to start with.
Ah, this explains the issue I was having the other day with an item that had a Displacement SubD of 5 but didn't seem to be eating a lot of GPU RAM before it crashed.
I was under the impression ZBrush went the other way. You sculpted and then baked to displacement or normal maps. :) It has been a while since I looked though. :)
I took a set of Valandar's armor with some beautiful scroll work and set the surface to 6 levels of tessellation, and overflowed the video RAM on a 12GB card, but man it looked great when it finished rendering in CPU mode. :)
Not even close. Products like 123D Catch use Structure From Motion (SFM) photogrammetry to create 3D point clouds from a series of overlapping photographs. The software first resolves which portions of each photograph overlap, then it identifies points (pixels) in common across the photographs. Then, using either calculated or known camera/lens geometries, the software can determine the location of each camera plane in 3D space. Once these calculations are complete, the software can create a dense point cloud by using trigonometry to determine a point's (pixel's) location from the triangle(s) formed by that point's position relative to the camera planes. Once the dense point cloud is complete, a 3D mesh can be created via triangulation from each point in the point cloud. After this step is completed, the software can UV map the new 3D object and texture map it using the original photographs.
The actual process is much more complicated than my uber brief explanation above. But hopefully it's detailed enough to give you an idea what photogrammetric software like 123D Catch actually does.
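As a small illustration of the triangulation step described above, here is a hedged sketch in Python/NumPy of recovering one 3D point from its pixel position in two photos, given the two camera projection matrices. The cameras, point, and numbers are toy values invented for the example, not anything 123D Catch actually exposes:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # null space of A is the homogeneous point
    X = vt[-1]
    return X[:3] / X[3]

# Two toy cameras: same intrinsics, second camera shifted 1 unit along X.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known point into both photos, then recover it from the pixels.
X_true = np.array([0.2, -0.1, 5.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, x1, x2))   # roughly [0.2, -0.1, 5.0]
```

Real photogrammetry packages do this for huge numbers of matched points at once and refine the camera poses with bundle adjustment, which is where much of the complexity glossed over above comes in.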
....however I am rendering on the CPU as my card only has 1 GB of memory and that is pretty much taken up just running the displays.
....while it may have been a bother to deal with submitting to Reality every time you made a material change, the upside is you don't need to keep the scene and the Daz programme open through the render process, thus saving on system resources, and, if you have enough CPU cores, the ability to work on a new project while the other one is rendering. Also, you can pause rendering, close Lux, shut down your system, come back later, boot it all up, and continue the process instead of being forced to have it run until the render is completed.
When I read that Totte had a render of Jack's The Library, using the Iray shaders and emissives, that took 48 hours to complete (CPU), it made me say no way am I going to punish my four-year-old system like that.
Interesting thread. Just stumbled upon it...