Who said Blender was hard?


Comments

  • j cade Posts: 1,326
    And FWIW, from what I have read ProRender doesn't really seem to have any advantages over Cycles. At this point Cycles is faster and more fully featured.
  • nonesuch00 Posts: 5,862

    For a hobby game developer Eevee is ideal. You can design and model your scene to look good in Eevee and it will mostly be able to look like that in Unity in realtime on most of the newest platforms, but not older HW like my PC or non-Metal and non-Vulkan mobile HW. That is the ultimate intent of the Octane renderer port to Unity too, sans the modeling part.

  • nonesuch00 Posts: 5,862

    I would imagine that ProRenderer being open source gets most of its features rewritten and incorporated into Cycles.

  • Joe Cotter Posts: 3,164

    Actually, PBR isn't about getting rid of shortcuts but rather moving to a 'standard' that more closely reflects real world lighting and materials. It also isn't just games. In fact it got started with animation in big studios. The point is that one can adjust wrong shaders with wrong lights or vice versa as long as we are doing a still image, but the minute one tries to animate something where lighting or shaders change, that wrong vs wrong balance doesn't work anymore. Things go all askew. So Pixar and others realized that they needed to move to a standard where they could change lighting and still have the materials work properly. This concept of PBR being some mythical thing to do with games is a common (and somewhat irritating) misperception in the industry. Again, it didn't even start with games, that's just when the average person started hearing about it.

    There is a related history: scale when modeling in 3D applications used to have absolutely no relevance to the real world. Moving objects between different environments was a real problem because the scales weren't related in any way. Eventually there was a movement toward real-world scales; Blender, for instance, moved to 1 Blender Unit = 1 meter irl. All of the applications basically moved to some relation to the real world, so now a simple scale factor can be applied to objects moving between environments. PBR is simply the next step of this evolution, but with light and shaders.

    As to Eevee, yes, it takes shortcuts, in the form of probes for instance. There is no magic to make a 20hr render take minutes... it takes compromises. (Btw, consider that Eevee is not even at gen one, but a beta of gen one...)

    A parallel evolution in cg is the evolution away from having to take shortcuts to deal with the fact that hardware isn't strong enough to do things in a simpler fashion. Optimally, we could sculpt millions of polygons, then paint without texture maps, animate it all with an automatically generated rigging system where we made simple directives and corrections to how the bending etc worked... where we could do real time complex physics simulations complete with heat and energy, surface properties... that mimics the real world, but that's a ways off yet. Currently, every time we come up with more horsepower to deal with making things real time, we come up with more we want to do that can't be done in real time. CG has moved at light speed (pun intended ;) recently and doesn't show signs of slowing down so for that I'm happy. It's a ride, enjoy it. :)

  • Joe Cotter Posts: 3,164

    Eevee will be fantastic for animation actually, both as a previz working environment for high end animation and a good enough animation for people who aren't doing big budget animation.

  • Joe Cotter Posts: 3,164
    edited August 14

    I would imagine that ProRenderer being open source gets most of its features rewritten and incorporated into Cycles.

    I haven't looked at ATI's ProRenderer licensing but it's important to note that different open source projects can be incompatible due to licensing issues. One can't just incorporate something from one open source project into another just because they are both open source.

    I haven't personally paid much attention to ATI, as it's been pretty much a given for the last 10+ years that Nvidia was what you used if you were creating content, whereas ATI was often better if one was only a consumer, due to how the chip companies focused their resources. Since I am a content creator, ATI has no relevance to me.

  • ebergerly Posts: 1,218

     Personally I'd much prefer just a speed-up in 3D view performance for Iray, so I don't need to worry about switching over to another renderer. Just simple stuff like somehow making the Camera view in Iray as fast as the Perspective view. 

    I saw a video of a guy who had like 4 GTX-1080ti's, and his viewport response in an extremely complex scene was incredible. 

    And if they could make the new Principled Shader in Blender somehow identical to the Iray shaders (or actually vice-versa) so you could texture and UV in Blender and just move it over to D|S using the awesome Principled Shader. 

  • nonesuch00 Posts: 5,862

    I would imagine that ProRenderer being open source gets most of its features rewritten and incorporated into Cycles.

    I haven't looked at ATI's ProRenderer licensing but it's important to note that different open source projects can be incompatible due to licensing issues. One can't just incorporate something from one open source project into another just because they are both open source.

    I haven't personally paid much attention to ATI as it's been pretty much a fundamental for the last 10+ years that Nvidia was what you used if you are creating content whereas ATI was often better if one was only a consumer, due to various aspects related to how the chip companies focused their resources. Since I am a content creator, ATI has no relevance to me.

    It has an MIT license which is basically unrestricted.

    https://github.com/GPUOpen-LibrariesAndSDKs/RadeonProRender-Baikal/blob/master/LICENSE.txt

    https://github.com/GPUOpen-LibrariesAndSDKs/RadeonProRender-Baikal

     

  • Joe Cotter Posts: 3,164

    In that case, it could possibly be used to speed up development of Cycles/Eevee running on ATI cards. However, there are other aspects that would come into the equation as to whether the Blender organization would want to put resources there vs. all of the other areas they are working on right now. Only Ton and others at the organization who are working on the current projects could probably tell you that.

  • nonesuch00 Posts: 5,862
    ebergerly said:

     Personally I'd much prefer just a speed-up in 3D view performance for Iray, so I don't need to worry about switching over to another renderer. Just simple stuff like somehow making the Camera view in Iray as fast as the Perspective view. 

    I saw a video of a guy who had like 4 GTX-1080ti's, and his viewport response in an extremely complex scene was incredible. 

    And if they could make the new Principled Shader in Blender somehow identical to the Iray shaders (or actually vice-versa) so you could texture and UV in Blender and just move it over to D|S using the awesome Principled Shader. 

    Well, the precision and timing across all that HW are not the same, so you can't get what you are talking about except via a game-engine-style realtime renderer like Eevee, which purposely trades some of that precision to try to yield identical realtime results on all these different hardware platforms. It's not that those professional renderers didn't take shortcuts too, just not as many.

    Or you can buy the same type of HW that guy has, or wait a few years until such capabilities are possible on one consumer-level card. Those have shortcuts too, though, and those shortcuts will be improved upon eventually, so always chasing after the 'best' is a frustrating exercise for a consumer with a limited budget.

  • nonesuch00 Posts: 5,862
    edited August 14

    In that case, it could possibly be used to speed up development of Cycles/Eevee running on ATI cards. However there are other aspects that would come into the equation as to if the Blender organization would want to put resources there vs all of the other areas they are working on right now. Only Ton and others at the organization that are working on the current projects could probably tell you that.

    Well, ProRender can run on any card that can run OpenCL 1.2, so it is not restricted to ATI cards the way CUDA-based renderers are restricted to nVidia cards. OpenCL is an open standard.

    I think that an MIT open source renderer that uses a platform-independent API like OpenCL has possibly been thoroughly investigated by the Blender Foundation and its contributors.

    Maybe I will read some of the developer notes on the Eevee / Cycles portion of the Blender designer and developer log.

  • j cade Posts: 1,326

    I would imagine that ProRenderer being open source gets most of it's features rewritten and incorporated into Cycles. 

    It's more that Cycles is at this point simply an older and more developed renderer. There's not really all that much that can be added from ProRender, because it's already there.
  • Mechaven Posts: 71

    Actually, PBR isn't about getting rid of shortcuts but rather moving to a 'standard' that more closely reflects real world lighting and materials. It also isn't just games. In fact it got started with animation in big studios. The point is that one can adjust wrong shaders with wrong lights or vice versa as long as we are doing a still image, but the minute one tries to animate something where lighting or shaders change, that wrong vs wrong balance doesn't work anymore. Things go all askew. So Pixar and others realized that they needed to move to a standard where they could change lighting and still have the materials work properly. This concept of PBR being some mythical thing to do with games is a common (and somewhat irritating) misperception in the industry. Again, it didn't even start with games, that's just when the average person started hearing about it.

    There is a related history where there was absolutely no relevance to real world in scale when modeling in 3d applications. Moving objects between different environments was a real problem because the scales weren't related in any way. Eventually, there was a movement to trying to move to some connection towards real world scales where Blender for instance moved to 1 Blender Unit = 1 Meter irl. All of the applications basically moved to some relevance to real world and so now a simple scale factor can be applied to objects moving between environments. PBR is simply the next step of this evolution but with light and shaders.

     

    Agreed, PBR is just the latest catch phrase; prior to it, terms like energy conserving or architectural were the catch phrases, but it is definitely nothing new. If anything good does come from it, it will be the standardization of some terms. Specularity, for example, used to mean fake reflections, but has somehow been carried over and come to mean reflections or glossiness. I don't know where PBR originated, but the game industry is setting the precedent for PBR shading and everyone else, including Blender, is following suit.

    Scale is a big one, I can't think of a single Blender tutorial (although I know there have been a few) where scale was regarded whatsoever. The first thing you see is the default cube or sphere scaled up 5 times to start modeling, say an eyeball. So now that eyeball is 10 meters in diameter. Then they teach how to light and shade it when light falloff and shaders like SSS are trying to obey real world characteristics of an eyeball that is almost thirty feet tall.

  • Joe Cotter Posts: 3,164
    edited August 15

    Actually, I wasn't suggesting PBR is a catch phrase, just the opposite. It is a move towards standardization which didn't exist before. As such, it is no more a catch phrase than HTML or CSS. As to it being new, here is a good example of just how far back PBR actually goes: SIGGRAPH 2010 Course: Physically-Based Shading Models in Film and Game Production. It's just that it's taken this long for the industry to actually standardize, which for anyone who has followed standards is no surprise, unfortunately. It takes a long time for standards to actually be implemented in an industry, even after they have been shown to be of significant advantage.

    As to the game industry setting the precedent, many of the aspects of PBR were first done by Pixar in animation; however, it has been a joint development, meaning that AAA game development has also had some specific advances in the area, so the two industries have played off each other in moving forward over the years. Much of the misunderstanding is that the public in general (those who follow this type of thing) are more exposed to game development than they are to the internals of advancements at DreamWorks, Pixar, ILM, etc. One has to go out of their way to find information and follow the animation/fx studios' development of technology, especially considering that traditionally the animation/fx studios have held some of their techniques close to the chest, releasing just teasers, whereas the gaming industry** is much more forward about what advances it has made, even using them as marketing material. This is changing somewhat, however, and the animation/fx studios are now moving more in line with how the gaming industry has treated releasing this type of information.

    * Almost forgot: things like conservation of energy were discussed separately from PBR, but were really always a subset of it even if not directly mentioned. It's just that these various aspects were often discussed singly and independently of the overall standardization issue.

    ** It wasn't just the gaming industry either. Video card manufacturers, hardware publications, and gaming sites often quoted gaming technology advancements in relation to how they affected hardware, so this had a synergistic effect on getting awareness out re: the gaming industry. There was no such synergy with advancements in animation/fx studios. This led to an uneven perspective: many people who follow this technology were getting only one side of the story.

    Scale is a big one, I can't think of a single Blender tutorial (although I know there have been a few) where scale was regarded whatsoever. The first thing you see is the default cube or sphere scaled up 5 times to start modeling, say an eyeball. So now that eyeball is 10 meters in diameter. 

    Yes, the software has changed to have standards; all major software now has some relation to real-world measurements: 1 Blender unit = 1 meter, 1 3DS Max unit = 1 inch, etc. These are defaults, and all packages also have a dialog box where the default can be changed. This also didn't exist a while back. However, the point is that unfortunately bad habits carry on for a long time, and since there were no real-world equivalents to measurements, people got used to modeling without any consideration of them, and this persists to this day with many modelers, modeling techniques that ignored scale having been passed down from tutor to student. I highly recommend breaking this habit if anyone has developed it. One of the benefits of having some relation to real-world measurements is that import/export functions can automatically apply scaling without the user even realizing or having to deal with it, if the importer/exporter is set up to do so. (Although an eyeball scaled 10 meters high is still going to be 10 meters high unless it is adjusted properly.)

    shaders like SSS are trying to obey real world characteristics of an eyeball that is almost thirty feet tall.

    Yes, this is a good example of why using proper scale is important. 
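    The automatic import/export scaling described above is just a factor between unit sizes. A minimal sketch (the unit values follow the defaults named in the posts above, 1 Blender unit = 1 m and 1 3DS Max unit = 1 inch; the function name is hypothetical):

    ```python
    # Sketch of automatic unit scaling between packages, using the
    # default unit sizes mentioned above. Names here are hypothetical.
    UNIT_IN_METERS = {
        "blender": 1.0,     # 1 Blender unit = 1 meter
        "max": 0.0254,      # 1 3DS Max unit = 1 inch
    }

    def convert_scale(value, src, dst):
        """Rescale a length when moving an object from src package to dst."""
        return value * UNIT_IN_METERS[src] / UNIT_IN_METERS[dst]

    # A 2 m tall figure exported from Blender into Max:
    print(round(convert_scale(2.0, "blender", "max"), 2))  # 78.74
    ```

    So a 2 m figure comes into Max as about 78.74 units (inches) with no manual rescaling, which is exactly what a well-set-up importer/exporter does behind the scenes.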

  • nonesuch00 Posts: 5,862

    Well, I will be interested to see where ProRender & Cycles go, as neither Blender nor AMD is going to be sitting still, nor have they been.

  • Lotharen Posts: 206

    Eevee seems rather interesting. I used a plugin to convert a scene from Studio to Blender using Cycles, but when I changed the renderer to Eevee everything was purple and I wasn't able to get the textures or lighting to show. I would love a dedicated plugin that would convert everything to Blender-ready. I'm not skilled enough to troubleshoot or do it myself yet. I think animating in Blender would be amazing - if only you could animate Studio figures within Blender.

  • Finlaena Posts: 334

    Hey, what's the best way to do cloth sims (i.e. Using Poser dynamics) in Blender? What settings, etc do y'all use, or how do you "pin" stuff?

  • ebergerly Posts: 1,218

    There's a bunch of youtube videos on doing cloth in Blender. I think BornCG has one or more that should get you started. 

  • bradrg Posts: 457
    edited August 15

     

    Finlaena said:

    Hey, what's the best way to do cloth sims (i.e. Using Poser dynamics) in Blender? What settings, etc do y'all use, or how do you "pin" stuff?

    Blender actually has a fantastic cloth engine: you go into the Physics tab to enable it.

    To pin vertices, create a vertex group and then select it in the cloth settings in the Physics tab.

    It even has a sewing feature, so you can create two flat shapes, add edges between them, and Blender will sew them together for you.
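    For anyone who prefers scripting, the same steps can be sketched with Blender's Python API (a hedged sketch, only runnable inside Blender; the object and group names here are made up):

    ```python
    import bpy

    # Sketch of the steps above via the bpy API (run inside Blender).
    obj = bpy.data.objects["ClothPlane"]   # hypothetical mesh object

    # 1. Create the vertex group to pin; assign the vertices you want
    #    fixed to it in Edit Mode (or via pin_group.add()).
    pin_group = obj.vertex_groups.new(name="Pins")

    # 2. Enable the cloth simulation (the Physics tab step).
    cloth = obj.modifiers.new(name="Cloth", type='CLOTH')

    # 3. Point the cloth's pinning at that vertex group.
    cloth.settings.vertex_group_mass = pin_group.name

    # 4. Sewing: with this enabled, loose edges created between separate
    #    flat pieces act as springs that pull them together during the sim.
    cloth.settings.use_sewing_springs = True
    ```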

    This is something I'm working on at the moment

     

    cloth.jpg
    218 x 310 - 48K
  • Joe Cotter Posts: 3,164
    Lotharen said:

    Eevee seems rather interesting. I used a plugin to convert a scene from Studio to Blender using cycles but when I changed the renderer to Eevee everything was purple and I wasn't able to get the textures or lighting to show. I would love a dedicated plugin that would convert everything to blender ready. I'm not skilled enough to trouble shoot or do it myself yet. I think animating in Blender would be amazing - if only you could animate studio figures within blender.

    If the objects in your scene are pink, that means that Blender can't find the texture(s) that is/are associated with that object, same as in Unity actually. You would need to go through some tutorials on how to set up your textures/shaders for Eevee. Here's one I haven't had time to check out yet but might get you started: EEVEE : Getting Started
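    If anyone wants to track down which textures are missing, a quick way is to scan the blend file's images from Blender's Python console (a hedged sketch, only meaningful inside Blender):

    ```python
    import os
    import bpy

    # List images whose source files can't be found on disk, the usual
    # cause of pink/magenta objects in the viewport.
    for img in bpy.data.images:
        if img.source == 'FILE':
            path = bpy.path.abspath(img.filepath)
            if not os.path.exists(path):
                print("missing:", img.name, "->", path)
    ```

    Blender's File > External Data > Find Missing Files can then relink them from a chosen folder.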

  • RAMWolff Posts: 7,528
    bradrg said:

     

    Finlaena said:

    Hey, what's the best way to do cloth sims (i.e. Using Poser dynamics) in Blender? What settings, etc do y'all use, or how do you "pin" stuff?

    Blender actually has a fantastic clothes engine: You go into the physics tab to enable it. 

    To pin a vertex create a vertex group, and then select it in the physics tab. 

    It even has a sewing feature, so you can create two flat shapes, attach edges between them and blender will sew them together for you. 

    This is something I'm working on at the moment

     

    Cool. Reminds me of Hourmans cloak from the JSA


  • bradrg Posts: 457

     

     

    RAMWolff said:

    Cool. Reminds me of Hourmans cloak from the JSA


    Looks like a raincoat in that image because it's all shiny from Blender's viewport window, haha.

  • Joe Cotter Posts: 3,164
    edited August 18

    Worth the watch: Eevee Viewport Demo - Wanderer

    Eevee + CMV Realtime Material Experiment by Reynante Martinez (CMV = Cycles Material Vault)

     

  • Lotharen Posts: 206

    I would like to use Blender for special effects on my renders, like rain, fog, etc., to give the images more feeling. Just trying to figure out how best to do that.

  • Joe Cotter Posts: 3,164
    edited October 4

    There are tutorials specifically for that, @Lotharen. Just search on "Blender rain / fog tutorial." For rain, there is an excellent one by Andrew Price, also known as Blender Guru; for fog, I don't remember if Andrew has one, but Gleb Alexandrov (Creative Shrimp) has some good tutorials on that and on lighting in general.

    @Inkubo, nice post. I think it pretty accurately represents the community. :)

    I am planning on switching to Linux for my main OS. My reason for the switch is that I like lean, clean, and specialized. I have a good base of knowledge I'm working from, so the cost isn't the same for me as it would be for most people, and Linux allows me to tweak it at the lowest level of the OS. I've worked with it for many years in some depth, but always as a secondary OS, that is, very part time. I've discovered over the years that one can have a good inner knowledge of something, but the day-to-day working knowledge might not be as high if one doesn't also put in the hours.

    With switching to Linux as my primary desktop, I should be, among other things, better able to answer questions for anyone else looking to go that route. Linux has gotten easier and requires less hand-holding, but when one pushes the technology in areas such as cg/gaming, etc., it can go off the rails and take some detailed knowledge to get back on track. Therefore, if one isn't technically inclined, using it for these tasks might be more work than one wants unless very motivated. I don't mind helping people get started, but as usual, the biggest challenge is that there are times when it's a struggle to condense in-depth knowledge into a reasonable package for a newer person. So know going in, if you are a newbie going down this journey, that people will try to help, but it will take many hours of self-research, education, trial and error, etc. on your part that can't always be shortcut.

    Also, I have been going back to my roots as a programmer, for whatever that might be worth. I want to be able to whip up some plugins and do some custom configurations for the various software I work with (i.e. Python, Electron, etc.). I would consider DAZ plugin programming, but when I looked at it years ago, the API seemed a bit convoluted and restricted to me. I'm not sure if that has changed or if it was just my own lack of knowledge at the time.

    So... just throwing all of that out there for anyone in the community who is like-minded and/or on a similar journey.

    In signing off I would like to say hello. I hope everyone is doing well. Warm wishes

    Joe

  • j cade Posts: 1,326
    edited October 23

    Bringing this back because... !%^(^$$#@! Fiddling around in Blender 2.79, and the combination of the principled shader + Filmic + denoising is almost enough to make me give up Iray... if Teleblender ever switches over to slotting the textures into the principled shader, I very well might.

    Less than 5 minutes setting up the materials. Only 3 textures (diffuse, bump, normal) and 4 other nodes used.

    3 minutes to render 800x1200, 11 minutes for 1600x2400, using the CPU.

    No noise, really swanky SSS.

     

    I think I'm diving down the rabbit hole for a bit

  • Joe Cotter Posts: 3,164

    Beautiful. Very nice job :)

  • Artini Posts: 2,606
    edited October 23

    Very well done, j cade. Big hopes for teleblender, then.

    Have you tried to make some renders of Daz hair and character face, yet?

     

     
