Show Us Your Bryce Renders! Part 7
Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
A further attempt at modelling with Hexagon. Useless thingies.
@electro: Useless things, or abstract art? Either way it's a learning project.
@electro-elvis - useless thingies nicely presented.
Well, that is an interesting question. I suppose altitude is for when you want something to change over the material's y-axis, which could be the object's y-axis, depending on your mapping mode. So you can think of it as a change over distance from one end of something to the other. The translation tools give you a good deal of control over where that transition takes place in space.
Slope is most useful for terrains and is independent of mapping modes, in a similar fashion to orientation, but orientation only applies to one side. To change the response of slope you need to go in and modify the filter properties directly.
Curvature is the most difficult I would suggest because it isn't easy to get a good idea of what impact your settings make unless you render the scene. If you have a high resolution mesh, that's the time to give some thought to using curvature. It can act as a "dirt" effect or can be very effective at showing up erosion detail on terrains.
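The altitude and slope filters described above can be caricatured with a toy snow-line blend. This is a purely illustrative sketch in Python, not Bryce's actual material engine; the function names, thresholds, and the "snow" blend are invented for the example:

```python
# Hypothetical sketch (not Bryce's internals): how altitude and slope
# filters might drive a material blend. Altitude ramps with height along
# the material y-axis; slope responds to how steep the surface is.

def altitude_weight(height, low, high):
    """0 below `low`, 1 above `high`, linear ramp in between."""
    t = (height - low) / (high - low)
    return max(0.0, min(1.0, t))

def slope_weight(normal_y):
    """0 on flat ground (unit normal straight up), 1 on a vertical face."""
    return 1.0 - abs(normal_y)

# Blend a "snow" material in only where the terrain is both high and flat.
def snow_amount(height, normal_y):
    return altitude_weight(height, low=20.0, high=80.0) * (1.0 - slope_weight(normal_y))

print(snow_amount(100.0, 1.0))   # high, flat summit: full snow (1.0)
print(snow_amount(100.0, 0.1))   # high but near-vertical: almost none
print(snow_amount(10.0, 1.0))    # low ground: none (0.0)
```

The point of the toy is the puzzle-solving David describes: each filter is just a weight over some geometric quantity, and the effect comes from how you combine them.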
There is no really easy way to say, in this situation use so and so. The choice of filter is really the act of solving a puzzle. You have in mind some kind of effect and in the material lab you have a set of tools to use and the challenge is to bring the components together into some kind of solution.
To some extent it might sound easier just to UV map everything and then draw the textures in a paint package... But that's something I reckon is easier said than done!
And once you have got to grips with your textures you can also use them in 3D - something that is not so easily achieved with UV mapping. So I think there are a lot of good things about Bryce's materials functions. But it is not without drawbacks. For example, the textures made in Bryce cannot be exported effectively outside Bryce. Which is a shame.
Here's another galaxy test.
Edit. Looks to me like Electro-Elvis has used a bit of curvature filtering on his "useless" whatnots. I'd call them "interesting" rather than useless.
Useless? I don't know. The bottom right one looks like a sea shell and the top right one looks like a button on a box. All you need is PRESS on the top :-)
The label on the button has faded from too many fingers pressing on it. When it was new, it read "Make Art".
Bottom left one? Put a shaft on it and use it as a hammer :-)
Excellent work, everyone. There are too many good things going on to comment on them all. Keep it up everyone.
Here is something new from me, a first this year. I'm still fussing around with this tropical island thing but I'm using plants from ngPlant and Carrara for more realism. Lighting is with my EGDLS (EarthGlow Dome Light Strategy) which is a special light rig I use for increased realism in outdoor landscapes. I don't really have a name for it since it isn't really a scene but a test render. Thanks for your time. Feedback is greatly appreciated.
The gold one on the lower left is obviously an alien trailer hitch...
@David: Thank you for the information. I won't pretend I understood everything you wrote, but I recall you've explained these three in videos. So I think I'll revisit those videos and pay closer attention, since I now want to try and use them. And that's another fine looking galaxy.
@Rashad: Those are nice looking plants, and the quantity does give a jungle feel. But I wonder if some of those plants aren't washed out from over intense lighting. Is your EGDLS something you've already explained or something new?
@Rashad - the vegetation looks very rich and diverse. Somehow, the lighting doesn't convince me. The shadowy parts look like there was ambience at work, though I know there isn't. I also miss a bit of light through the leaves; they seem not to be translucent. You asked for comments. Otherwise, I mean, this is a beautiful scene. Strange lady being violent in such a place. ;-P
On the solid objects, the effect is very good. The trunk of the tree on the right is excellently lit. The lighting of the leaves in the shadows is less convincing. They look to be self illuminated, ambient, and a bit solid. This I understand is a limitation of the render engine rather than your skill, since I know you can do leaf transparency, but with complex lighting the "expense" of the transparency effect becomes the limiting factor.
In terms of complexity the scene is excellent.
In contrast, here's a render of two spheres. And then one of one.
Question for the lighting experts here.
Considering where the shadows on the ground from the containers on the left are: if this was real, would there be shadows of the plane on the containers on the right?
I can't decide.
If there should be, is it possible to do it in Bryce using some sort of capture?
Given that the shadows from the containers on the left are being cast from objects that are almost out of frame, and that those containers are at ground level, consider which part of those shadows corresponds to the tops of those containers. Now consider that you might reasonably expect an aircraft to be considerably higher off the ground than the tops of the containers. That would place the shadow of the plane (if it were above the containers) well into the scene, possibly not even visible. If you were now to move the plane to the position shown in the image, this would have the effect of shifting the shadow effectively the same distance in space along the ground, which would put it yet further into the scene along a similar axis to the direction of the falling shadow (coincidentally). Meaning I wouldn't expect to see the shadow from the aircraft in this shot.
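The reasoning above can be put into rough numbers. The sketch below is a hypothetical back-of-envelope calculation, nothing from Bryce itself; the container height, aircraft altitude, and sun elevation are all invented for illustration:

```python
import math

# Back-of-envelope version of the argument: the horizontal distance a
# shadow lands from the point directly beneath an object grows with the
# object's altitude and shrinks as the sun climbs. Numbers are illustrative.

def shadow_offset(altitude_m, sun_elevation_deg):
    """Horizontal distance from the object's ground point to its shadow."""
    return altitude_m / math.tan(math.radians(sun_elevation_deg))

container = shadow_offset(2.6, 30.0)    # top of a ~2.6 m container
aircraft  = shadow_offset(60.0, 30.0)   # a plane on low approach

print(round(container, 1))  # ~4.5 m
print(round(aircraft, 1))   # ~103.9 m, likely well out of frame
```

Even at a modest 60 m altitude the shadow lands over twenty times further into the scene than the container shadows, which is why it never appears in the shot.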
@StuartB4 - great render. Shadow from the plane? At best on top of the containers far away at right, probably much farther away. In my experience, Bryce casts shadows correctly. Position the camera high above the scene and look down. That's always helpful to find the shadows.
Does this help?
http://www.thenakedscientists.com/HTML/uploads/RTEmagicC_Qantas_b747_over_houses_arp_01.jpg.jpg
I don't recall actually seeing shadows from the planes even when they appear to be that close. I lived for some time (2 long years) right under the flight path into Heathrow.
Rashad: this is really an outstanding picture! To me your lighting technique is truly amazing. However, the leaves here are looking a little bit "greeny" and I can't really see shadows under this jungle canopy. I've done a tropical picture this last month and it's very hard to do (you can go to my gallery here at Daz). I used transparency on the leaves for it, but the render time was awful!!
Stuart,
Looks to me as if the shadow cast by the plane would not be visible to you from this perspective. If the sun had been higher up in the sky then indeed we would expect the shadow from the plane to fall directly beneath it. What I do think you are missing however is some degree of upward facing light. Most often we concern ourselves only with the light that shines down, we forget that light bounces back up into space from the ground. Maybe you could add a radial beneath the plane so that the shadow isn't so dark on the underside.
David B
Those galaxies look amazing. I'm really impressed to see these are made with procedurals. If you can find a means of flattening out the central sphere so that it blends into the spiral a bit more these will look fairly realistic!
Guss, Horo, David,
I cannot thank you guys enough for your detailed and honest feedback. You are all absolutely correct. There are several things going on that aren't immediately obvious. First I will explain what EGDLS is for Guss.
EGDLS stands for EarthGlow Dome Light Strategy. Just as I was describing to Stuart about the light that reflects back upward into space and how it should be illuminating the underside of the plane, Chohole's example shows clearly how incredibly bright the shaded area underneath the plane should be. In real life there is always a ton of light flying around in all different directions, not just downward. Most of the time we as Bryce artists don't think about the light that bounces upward, so I often see Bryce renders where some degree of light is missing. After years of rendering images of objects with dark undersides it begins to look "natural" to our trained eyes, but in fact it isn't. More light is almost always better than not enough.

EGDLS is basically two light domes: one that works as a Skylight blue influence, and another that is a gold colored dome for the Earth Glow. Because light domes can be directionally biased in Bryce, the Skylight dome is biased to shine more of its light downward and the EarthGlow dome is biased so that most of its light shines upward. Simply disable shadow casting from the ground plane so that the EarthGlow light can penetrate, and you now have a means to get light into the areas near the ground where skylight alone would already have been obstructed by taller objects. The problem isn't the EGDLS, as the tree trunks demonstrate. But the leaves are a different issue.
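The directional bias described above can be illustrated loosely. The following Python sketch is not Bryce's dome light implementation; the function and its bias term are invented to show how a weighting would confine a dome's energy to one hemisphere:

```python
import random

# Loose illustration (not Bryce's internals) of a "directionally biased"
# dome light: the skylight dome sends its energy downward, the earth-glow
# dome sends its energy back up.

def biased_dome(n_lights, upward_bias, seed=1):
    """Place n_lights at random heights on a sphere; weight each by how
    well its direction matches the bias (+1 = shines up, -1 = shines down)."""
    rng = random.Random(seed)
    lights = []
    for _ in range(n_lights):
        z = rng.uniform(-1.0, 1.0)          # vertical component of direction
        weight = max(0.0, z * upward_bias)  # keep only the favoured hemisphere
        lights.append((z, weight))
    return lights

skylight  = biased_dome(16, upward_bias=-1.0)  # energy biased downward
earthglow = biased_dome(16, upward_bias=+1.0)  # energy biased upward

# In the skylight dome, only the downward-pointing lights carry any weight.
down_energy = sum(w for z, w in skylight if z < 0)
up_energy   = sum(w for z, w in skylight if z > 0)
print(down_energy > 0 and up_energy == 0)  # True
```

Two such domes with opposite biases give exactly the split Rashad describes: skylight from above, earth glow from below.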
What goes wrong with this scene is exactly what David described. Due to limitations, I've had to make several compromises, one of them being the essential translucency. What you see here is my best attempt to employ ambience to do the job of translucency, at which it fails.
Below is a quick sample of what these same trees look like rendered with translucency. To answer Horo's previous question: ngPlant can use meshes or billboards for leaves, but I use only meshes for leaves because only meshes can produce translucency. Realize that this little render took 13.5 hours on my monster 8 core Xeon system, and there are only a few palms visible; it is nowhere near the complexity of a real scene. Granted the specular response is too high, but you can still see how powerful the translucency can be. The lighting levels are exactly the same in this render as in the previous one. This render was done with both domes set to the minimum of 16 lights each plus the default sunlight for a grand total of 37 point lights (which isn't very many at all). To get smooth shadows, I should be rendering with each dome quality no less than 256, totaling over 500 point lights all combined. Due to the low light quality setting, there is a ton of shadow banding visible in both the original render and this new one. The shadow banding on the woman's body indicates how few lights there really are.
Lessons learned:
I've learned that high polygon meshes generally render more slowly than low polygon meshes. A mesh that is over a million polygons will render significantly more slowly than a mesh of only a few hundred polygons. I think it is due to the Self Shading: when Bryce is calculating the way a mesh casts shadows onto itself, the higher polygonal complexity means many more calculations. This is a sad irony for me.
Part of the trick for how I accomplish this degree of complexity is that I build highly detailed models and groups of plants up to a few million polygons and then instance that group of plants all over the place. In the case of the trees, by the time I get to using them they are no longer made of separate meshes for leaves and trunk, I have consolidated them into a single mesh object (this is necessary for the Instance Lab to properly rotate the model due to a rotational bug with Grouped objects in the IL). I don't instance groups, only meshes.
This current example took 14 hours to render and a big part of it was the fact that the trees are themselves now single mesh objects. Translucency is a shadow intensive effect, so the self shading of the translucency rays adds up quickly. I suspect that if I had not consolidated these meshes leaving the leaves separate from the trunks, the resulting render would have completed in half the time. Sometimes, I just can't catch a break!
I've not been anxious to do so, but from what I can tell a true render of the full scene with translucency and even decent light quality would take over a week on my system, which is simply impractical. As many of you know, the entire point of this study was to create Howie Farkes style content for Bryce. But it seems impractical at this stage due to rendering issues. If not for the memory leak bug and the rotational bug with Instancing, I'd not need to employ this idea of "super-meshes" as a workaround to attain naturalistic levels of complexity.
C-Ram,
Thanks a ton! I loved your tropical scene!! Excellent work. I will be sending you EGDLS so you can play around with it. Enjoy.
Horo
In this second example you can see a bit better the quality of the trees ngPlant can make. What do you think? Word on the street is that ngPlant gives XFrog a run for its money, and I tend to agree. The trees above even have fruit, albeit somewhat orange looking, but still the software allows for such specificity.
Thanks very much David, Horo, Chohole and Rashad Carter.
You've all been a great help.
I think I just got so used to making sure shadows were correct on my renders, I was a bit disappointed when I didn't see any on this one.
@Rashad - yes, those ngPlants look convincing. I have collected a couple of free XFrog trees and most of them have just 2D-faces for the foliage. Having 1 to 10 different leaf images makes the tree look more natural - at first glance. So Bryce trees are still quite good, if only we could properly bend them. Arbaro also creates quite natural trees, but it's not a click-here-click-there affair. Tanstaafl.
It may also interest you that David and I have supplied what we call Skydome HDRIs with several of our products. True HDRIs with the sun removed for skylight, and nice clouds. The lower part of the panorama, from 5 degrees below the horizon down, is completely blurred; that part supplies the coloured light shining up from the ground. Such Skydome HDRIs are quite effective, but admittedly a bit less versatile than two separate domes because there is only one control for the light. On the other hand, without shadow casting, IBL renders rather fast even at high quality settings and is easy to set up.
That was a render? I thought it was two photos combined to provide a context for your question. The reason I thought that was that it looked to me like the FOV for the scene and the FOV for the aircraft element didn't match. So go on, you've piqued my curiosity now: is the aircraft a 2D element or 3D?
On the galactic front, here is an evolution of my last example. Made possible by an unexpected find in the DTE that has allowed the generation of a low level positive hypertexture in one component, a process that formerly I thought needed a minimum of three, thus freeing up the remaining two components for other texture elements. That's quite a saving!
@David - looking great. I've also been experimenting making the core larger and flatter. I sent you my settings.
@David.
It's a 3D object. It's the only object in the scene; the rest is just the HDRI.
Ah... OK right, that's good, because it means you can fix the perspective issue. I reckon you need to move your aircraft further away from the camera and then scale it up. The trick is to try and match the perspective of the HDRI with that of your model. If you have the patience, what you can do to make the process less subjective is to put a cube in the scene and scale it so that it covers one of your containers in the foreground, then copy that and move it into the scene so that you have matched it with a distant container, and copy that and line it up over another, more distant container. Now you can scale your aircraft to a size appropriate for the containers and position it according to this geometry. Once you are done with that, delete all the cubes. That should give you a better chance of getting a match between the FOV of the HDRI (which you have no control over) and the apparent FOV for your aircraft (which is wrong if the scale is wrong). A sort of relativistic FOV effect.
If that is baffling and not very well explained, I can probably provide an example. This is something I have to deal with extensively in putting "real" (geometry) groundplanes into HDRI backdrops.
Edit: Having said all that. It may be that you turn out to be right or very close. It's a bit of a weird shot to see a plane so low and I don't really have that much of an idea how big they are. But the process I outlined would be how I would go about solving this problem since the scene would provide all the necessary information to eliminate the "unknowns". So I would find it an interesting exercise nevertheless.
They are very big David
http://repeatingislands.files.wordpress.com/2013/01/1658227.jpg
I was thinking of being a bit more precise than that.
For example, if... it was a 747 http://en.wikipedia.org/wiki/Boeing_747 there is a specification chart at the end of this page.
And for containers http://www.robsbrokerageservice.com/cargo-container.pdf a listing at the top here.
All that is really needed is to establish a ratio between the container size and the jet. Then put some placeholder geometry into the scene matched to the HDRI image. Get a distribution of these to establish a 3D frame in which to put your appropriately scaled jet. That way the perspective can be matched to the HDRI.
It's only really a problem with HDRI when you have specifically sized things in your HDRI and specifically sized things you want to incorporate.
Edit: And besides "big" is a relative concept.
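The ratio idea can be made concrete with published dimensions. In this sketch the choice of a 40 ft ISO container and a 747-400 is an assumption for illustration (the thread only links to the spec pages), and the 5-unit placeholder cube length is invented:

```python
# Working the ratio method with published figures: a standard 40 ft ISO
# container is about 12.19 m long, and a Boeing 747-400 is about 70.66 m
# long. Matching a placeholder cube to a container in the HDRI then fixes
# the scale the jet model must have in the same scene.

CONTAINER_LENGTH_M = 12.19   # 40 ft ISO container
B747_400_LENGTH_M  = 70.66   # Boeing 747-400 overall length

ratio = B747_400_LENGTH_M / CONTAINER_LENGTH_M

# If the cube matched to a foreground container is 5 scene units long,
# the jet should span about ratio * 5 units.
cube_units = 5.0
print(round(ratio, 2))               # the jet is ~5.8 container lengths
print(round(ratio * cube_units, 1))  # ~29.0 scene units
```

Once the jet is scaled this way and pushed back to sit inside the cube-built 3D frame, its apparent FOV should agree with the HDRI's.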
ah OK, was only trying to be helpful :coolsmile: Having lived in the St Margarets area of Twickenham (just over 7 miles from Heathrow) I can assure you that big seems very big at times, and certainly extremely noisy. :coolmad:
Having had a drive round the perimeter road of Heathrow (slightly unofficially), they do seem bigger though I must admit, when seen from that perspective when they take off in front of you.
I thought I would share my most recent work in progress :) ... I still have a lot I would like to do to this scene, but I was very happy with my results so far and wanted to share :)
Very nice ship on the ocean render there. Interesting lighting conditions; I like how just bits of the sails are catching the sun with the rest kind of in shade, but not quite.