Announcing Reality 4 {Commercial}

Comments

  • pciccone Posts: 661

    Barubary said:

    One question I have been wondering about for a while is what will happen to old material presets we might have left from the Reality 2.0 days. Will the new Reality still read those?

    First let me say that you will be able to run both Reality 4 and Reality 2.5 side by side, there is no need to delete the previous version.

    Now, if we were to move the Reality 2 material settings to Reality 4, we would lose the advantages of Reality 4. Reality 2 did not have the Skin material, for example. So, if we load a scene saved with Reality 2 and render it with Reality 4, it's best to let Reality 4 do its job. If Reality 4 migrated the Reality 2 settings instead, the skin materials would be rendered as Glossy instead of Skin, and some effects, like double specular maps, would be lost.

    We seldom revisit old scenes; in those cases it's best to either try the new solutions of Reality 4 or simply re-render them with Reality 2, which can run side by side with the new version.

    For everything new it's best to use Reality 4's new conversion, materials, and automatic presets.

    Hope this helps.

  • animajik_6696dda723 Posts: 109
    edited November 2014

    Pret-A-3D said:
    AniMajik said:
    How well will Reality 4 work for animation work? Mainly rendering image sequences

    Animations are usually rendered as image sequences; at least, that's how it's done in the professional world. Pixar, DreamWorks and every other animation studio use image sequences. There are too many drawbacks to rendering directly to movie clips for that to be a viable option.
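
    As a rough sketch of how an image-sequence workflow is organized (the naming scheme below is purely illustrative, not anything Reality-specific), each frame is written to its own zero-padded file and the clip is assembled afterwards in an editor:

```python
def frame_path(shot: str, frame: int, ext: str = "png") -> str:
    # Zero-padded frame numbers keep the sequence ordered correctly
    # in file browsers and compositing applications.
    return f"{shot}_{frame:04d}.{ext}"

# File names for a short 5-frame sequence:
paths = [frame_path("shot01", f) for f in range(1, 6)]
print(paths[0])   # shot01_0001.png
print(paths[-1])  # shot01_0005.png
```

    One practical advantage of this approach: if a long render crashes at frame 1200, you resume from that frame instead of re-rendering the whole clip, and a single corrupt frame never ruins the entire movie file.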

    There are no differences in the way Reality 4 renders animations compared to the previous version. I'm planning on doing some optimizations in the future, but for this version we use the same strategy. Speed depends on your machine; when working on animations it's necessary to use fast machines. It might be that the SLG render mode can help.

    Cheers.

    Thanks for the reply. I always use image sequences (and yes, I am a pro) and only use fast Mac Pros for my editing, mograph and animation work. I was curious as to any optimizations your new ver4 had for animation, which you answered. I will wait until you update the software with those tweaks and optimizations for animation. Thanks.

    -AniMajik

    Post edited by animajik_6696dda723 on
  • FusionLA Posts: 249

    Good to hear the new Reality for Daz will be released.
    I know you mentioned people who purchased Reality from Renderosity could upgrade from your site.
    Will we also be able to upgrade through the Daz store?

  • pciccone Posts: 661

    fusionla said:
    Good to hear the new Reality for Daz will be released.
    I know you mentioned people who purchased Reality from Renderosity could upgrade from your site.
    Will we also be able to upgrade through the Daz store?

    No, the DAZ store will handle upgrades for customers who bought from DAZ.

    Cheers.

  • kyoto kid Posts: 40,658

    Pret-A-3D said:
    Barubary said:

    One question I have been wondering about for a while is what will happen to old material presets we might have left from the Reality 2.0 days. Will the new Reality still read those?

    First let me say that you will be able to run both Reality 4 and Reality 2.5 side by side, there is no need to delete the previous version.

    Now, if we were to move the Reality 2 material settings to Reality 4, we would lose the advantages of Reality 4. Reality 2 did not have the Skin material, for example. So, if we load a scene saved with Reality 2 and render it with Reality 4, it's best to let Reality 4 do its job. If Reality 4 migrated the Reality 2 settings instead, the skin materials would be rendered as Glossy instead of Skin, and some effects, like double specular maps, would be lost.

    We seldom revisit old scenes; in those cases it's best to either try the new solutions of Reality 4 or simply re-render them with Reality 2, which can run side by side with the new version.

    For everything new it's best to use Reality 4's new conversion, materials, and automatic presets.

    Hope this helps.

    ...so by reading this, it sounds as if one needs to create an entirely new scene for Reality 4 rather than rendering one already processed for Reality 2.

    Now, I always save a 3DL backup version of all my scenes as I am still in the "experimentation/learning" stage. This way, if I mess something up badly, I can delete the bad scene, then reload the backup scene, saving it as "[scene name] Lux". So in this case it would be like starting with a brand-new scene never processed in Reality or rendered in Lux, and I should be OK, correct?

    Other question: as to GPU rendering, if one has dual GPUs in SLI configuration (I have two Radeon GPUs in SLI with 3 GB GDDR5 each), Lux will only make use of the VRAM on one unit, not both, correct?

  • pciccone Posts: 661

    Kyoto Kid said:

    ...so by reading this, it sounds as if one needs to create an entirely new scene for Reality 4 rather than rendering one already processed for Reality 2.


    No, that is not at all the case. You can load your old scene and call Reality 4. The materials will be converted from scratch using the Reality 4 rules and material types. It will not take the Reality 2 settings, because that would prevent Reality 4 from using the more advanced materials.

    Hope this helps.

  • pciccone Posts: 661

    Kyoto Kid said:

    Other question: as to GPU rendering, if one has dual GPUs in SLI configuration (I have two Radeon GPUs in SLI with 3 GB GDDR5 each), Lux will only make use of the VRAM on one unit, not both, correct?

    That is a question for the Lux developer. You should post it to LuxRender.net

    Cheers.

  • kyoto kid Posts: 40,658

    Pret-A-3D said:
    Kyoto Kid said:

    ...so by reading this, it sounds as if one needs to create an entirely new scene for Reality 4 rather than rendering one already processed for Reality 2.


    No, that is not at all the case. You can load your old scene and call Reality 4. The materials will be converted from scratch using the Reality 4 rules and material types. It will not take the Reality 2 settings, because that would prevent Reality 4 from using the more advanced materials.

    Hope this helps.

    ...excellent, thanks.

  • kyoto kid Posts: 40,658
    edited November 2014

    Pret-A-3D said:
    Kyoto Kid said:

    Other question: as to GPU rendering, if one has dual GPUs in SLI configuration (I have two Radeon GPUs in SLI with 3 GB GDDR5 each), Lux will only make use of the VRAM on one unit, not both, correct?

    That is a question for the Lux developer. You should post it to LuxRender.net

    Cheers.
    ...I'll do that. I do know that Octane will only use the memory of one unit, but it will use all the CUDA cores of both. CUDA, however, is nVidia's core/thread platform; ATI uses a different technology called "stream processors".

    Post edited by kyoto kid on
  • pciccone Posts: 661

    Actually, AMD, Intel, Samsung, Motorola and others use a technology called OpenCL. nVidia itself is moving past CUDA.

    Cheers.

  • Gothic Shadow Posts: 35

    You say that CPU rendering is the best compared to GPU rendering, right? How does CPU rendering compare to the hybrid option? Does the hybrid option give an extra boost to render quality, or does the GPU limit it, making it better to stick with just the CPU?

  • pciccone Posts: 661

    Incy said:
    You say that CPU rendering is the best compared to GPU rendering, right? How does CPU rendering compare to the hybrid option? Does the hybrid option give an extra boost to render quality, or does the GPU limit it, making it better to stick with just the CPU?

    The hybrid option uses the GPU to help the rendering process by engaging it for floating-point calculations. It helps speed things up, but it can sometimes, though not always, produce images that differ from the ones generated by CPU rendering.
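
    The divergence comes down to floating-point arithmetic: CPU and GPU code paths accumulate samples in different orders, and floating-point addition is not associative, so pixel values can drift slightly between the two. A tiny Python illustration of the underlying effect (double precision here; the single precision typical of GPUs drifts more):

```python
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # one accumulation order (e.g. a serial CPU loop)
right = a + (b + c)  # another order (e.g. a parallel GPU reduction)

print(left == right)      # False: the two orders disagree in the last bits
print(abs(left - right))  # a tiny difference, but nonzero
```

    Each individual difference is negligible, but a renderer performs billions of such additions, so the two code paths can converge to visibly different images.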

    Now, all this applies to today's version of LuxRender, which is 1.3.1. LuxRender 1.4 is in the testing phase and has a completely redesigned, more efficient core. The next versions of LuxRender should bring a good amount of performance improvement, and Reality will be right there once the new LuxRender is fully tested and declared stable.

    As with almost everything in technology, this is a moving target :)

    Cheers.

  • Hiro Protagonist Posts: 699

    Pret-A-3D said:
    fusionla said:
    Good to hear the new Reality for Daz will be released.
    I know you mentioned people who purchased Reality from Renderosity could upgrade from your site.
    Will we also be able to upgrade through the Daz store?

    No, the DAZ store will handle upgrades for customers who bought from DAZ.

    Cheers.
    Does this also apply to people (like myself) who bought Reality 1 through DAZ back in 2011/12? I think the product delivery arrangement was different then to what it is now. I've subsequently upgraded Reality to 2.5 via Pret-A-3D in the period when Reality was no longer available from DAZ.

  • pciccone Posts: 661


    Does this also apply to people (like myself) who bought Reality 1 through DAZ back in 2011/12? I think the product delivery arrangement was different then to what it is now. I've subsequently upgraded Reality to 2.5 via Pret-A-3D in the period when Reality was no longer available from DAZ.

    Hi.
    You will have the option to upgrade from either company, your choice.

    Cheers.

  • RAMWolff Posts: 10,151

    You mentioned that NVidia is through with CUDA? Why would they create a technology like that and then drop it? I don't know much about it; I have only ONE program that actually uses it (3D Coat), but other than that I have no idea what it is or does.

  • StratDragon Posts: 3,167

    RAMWolff said:
    You mentioned that NVidia is through with CUDA? Why would they create a technology like that and then drop it? I don't know much about it; I have only ONE program that actually uses it (3D Coat), but other than that I have no idea what it is or does.

    1) Because its limitations are becoming more evident as time goes on, compared with the advances other platforms are achieving without it.

    2) Because they would like you to spend $600 on a new graphics card from them every eleven months.

  • pciccone Posts: 661

    RAMWolff said:
    You mentioned that NVidia is through with CUDA? Why would they create a technology like that and then drop it? I don't know much about it; I have only ONE program that actually uses it (3D Coat), but other than that I have no idea what it is or does.

    Proprietary software technologies have a history of not lasting. nVidia started CUDA years ago, before OpenCL was defined, and pushed it as a technological advantage of using nVidia cards. You can't blame them for that, but it was a short-lived solution.

    AMD, Intel, Samsung and others all need a way of accessing the GPU cores for computation. GPU cores are "dumb" and limited compared to what an Intel or ARM CPU can do, but they do one thing faster than anything else: floating-point calculations. They are also designed to work in parallel, a logical design choice for a GPU.

    So, we need to access those cores to help the CPU parallelize tasks using those fast FPUs (Floating-Point Units). If AMD made its own API, Intel made its own, and others did the same, it would cause chaos: once you write your program for a certain API, that program is bound to that hardware. That was nVidia's plan, but the world moved on and embraced GPUs from many manufacturers. This is always the case once a technology becomes mainstream.
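
    That "many simple cores working in parallel" model can be sketched as a data-parallel map: the same small floating-point function (a kernel, in GPU terminology) is applied independently to every element. A toy sketch of the concept in plain Python (a real OpenCL kernel would be written in C and dispatched to the device):

```python
def kernel(x: float) -> float:
    # One small, independent floating-point operation per element,
    # analogous to the work a single GPU core performs.
    return x * x + 1.0

data = [float(i) for i in range(8)]

# On a GPU, every element would be handled simultaneously by its own core;
# map() expresses the same structure, just sequentially.
results = list(map(kernel, data))
print(results[:3])  # [1.0, 2.0, 5.0]
```

    Because each element is computed without touching its neighbours, the work scales across however many cores the hardware happens to provide, which is exactly what a vendor-neutral API like OpenCL exposes.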

    Remember SoundBlaster? Who uses a SoundBlaster card anymore? Once sound was deemed a necessary commodity for PCs, all kinds of sound processors became available and a common API was designed to program them.

    Since creating an API for each brand would be foolish, AMD, Intel and others embraced OpenCL, which is an open standard. CUDA is proprietary and only works on one brand of cards. Programs written for OpenCL can use any compatible GPU, so switching hardware does not require recompiling an OpenCL-based program. A CUDA program simply does not run on different hardware.

    GPUs have moved to mobile devices: tablets and smartphones. iOS is today the largest gaming platform in the world, bigger than Xbox, PlayStation and Wii combined. Android is doing very well too. None of those GPUs use CUDA. All the main manufacturers of mobile devices support, or have pledged to support, OpenCL. It could not be otherwise.

    So now there are several vendors, and Intel is pretty darn big, supporting OpenCL, which leaves CUDA isolated. Note that nVidia has supported OpenCL for years, although reluctantly at first. But now nVidia has realized where the market is going and has started making cards with OpenCL acceleration in the hardware. They know that if they don't embrace OpenCL they will be left behind and will completely miss the mobile market.

    Developing two APIs is just not practical, so I predict that CUDA will be on the way out in a couple of years, if not sooner. nVidia will keep supporting it for the time being as a legacy technology, but development will be done on OpenCL and the competition will focus on the hardware.

    That's why CUDA is a technology with no future, and why programmers who developed for it, including Pilgway/3DCoat, will have to redesign their software if they want to move forward. Putting all your eggs in a single basket is never a wise decision.

    Cheers.

  • RAMWolff Posts: 10,151

    Well then. Thanks to you both for your answers. I guess I need to get out there and get a new vid card then. The one I'm using is an NVidia GeForce GTX 650.

    If that's fine I'll wait, but if there's a mid-range affordable card out there, please post. I use only NVidia; I like NVidia!

    Gracias!

  • pciccone Posts: 661
    edited November 2014

    RAMWolff said:
    Well then. Thanks to you both for your answers. I guess I need to get out there and get a new vid card then. The one I'm using is an NVidia GeForce GTX 650.

    If that's fine I'll wait, but if there's a mid-range affordable card out there, please post. I use only NVidia; I like NVidia!

    Gracias!


    There is no need to change hardware. CUDA is not hardware; it's a software API (Application Programming Interface). CUDA can go away and the hardware will be driven by another API, OpenCL. No need to change anything. Besides, most of your graphics work is done by OpenGL, which is a completely different technology. Studio, Poser, Maya, Blender: they all use OpenGL. When it comes to rendering, OpenCL can be used to speed things up. nVidia supports OpenCL, although the drivers can be buggy.

    My reply was meant to highlight how software is going to change. In the case of 3DCoat, their acceleration does not work on machines that don't use nVidia cards, so they are offering sub-par performance to a lot of users, including all Mac users. That is not right, and those kinds of decisions need to be revised, which means a lot of work in rewriting software.

    Cheers.

    Post edited by pciccone on
  • RAMWolff Posts: 10,151

    Ah, OK. I did some research at the NVidia web site and found that these newer cards are all over 9" in length. My box won't allow for really long cards, so that seemingly leaves me out of upgrading to a mid-range or high-end card, because everything past the 750 mark is built like that! Total bummer! :-(

  • SimonJM Posts: 5,957

    RAMWolff said:
    Ah, OK. I did some research at the NVidia web site and found that these newer cards are all over 9" in length. My box won't allow for really long cards, so that seemingly leaves me out of upgrading to a mid-range or high-end card, because everything past the 750 mark is built like that! Total bummer! :-(

    9"? Who says size doesn't matter? ;)
  • RAMWolff Posts: 10,151

    SimonJM said:
    RAMWolff said:
    Ah, OK. I did some research at the NVidia web site and found that these newer cards are all over 9" in length. My box won't allow for really long cards, so that seemingly leaves me out of upgrading to a mid-range or high-end card, because everything past the 750 mark is built like that! Total bummer! :-(

    9"? Who says size doesn't matter? ;)

    *snorts* You behave! ;-)~

  • Richard Haseltine Posts: 97,577

    I thought 3D Coat supported both DirectX and CUDA on Windows - not much help for Mac users, of course.

  • pciccone Posts: 661

    I thought 3D Coat supported both DirectX and CUDA on Windows - not much help for Mac users, of course.

    DirectX and CUDA are not similar technologies. In any case, Direct3D is not used: 3DCoat, like every 3D modeling program, uses OpenGL. And OpenGL and OpenCL are two different technologies.
  • kyoto kid Posts: 40,658
    edited November 2014

    ...so, based on this, if one has an nVidia CUDA-based GPU, there should be an OpenCL driver update somewhere down the road, correct?


    If not, those who invested in the $5,000+ Quadro K6000 or nVidia's $50,000 Iray® VCA will not be happy campers.

    Post edited by kyoto kid on
  • Richard Haseltine Posts: 97,577

    Pret-A-3D said:
    I thought 3D Coat supported both DirectX and CUDA on Windows - not much help for Mac users, of course.

    DirectX and CUDA are not similar technologies. In any case, Direct3D is not used: 3DCoat, like every 3D modeling program, uses OpenGL. And OpenGL and OpenCL are two different technologies.

    Yes, version 4 has only one edition; version 3, however, did come in the two flavours for Windows, so I wasn't entirely out of my mind.

  • Gothic Shadow Posts: 35

    Can I pre-download LuxRender before I own Reality, so that I don't have to download both at the same time? Or should I download Reality first, then LuxRender? Also, which version of LuxRender do I need, the OpenCL version or the non-OpenCL version? And what is the difference between them?

  • Barubary Posts: 1,201
    edited November 2014

    Pret-A-3D said:
    Barubary said:

    One question I have been wondering about for a while is what will happen to old material presets we might have left from the Reality 2.0 days. Will the new Reality still read those?

    First let me say that you will be able to run both Reality 4 and Reality 2.5 side by side, there is no need to delete the previous version.

    Now, if we were to move the Reality 2 material settings to Reality 4, we would lose the advantages of Reality 4. Reality 2 did not have the Skin material, for example. So, if we load a scene saved with Reality 2 and render it with Reality 4, it's best to let Reality 4 do its job. If Reality 4 migrated the Reality 2 settings instead, the skin materials would be rendered as Glossy instead of Skin, and some effects, like double specular maps, would be lost.

    We seldom revisit old scenes; in those cases it's best to either try the new solutions of Reality 4 or simply re-render them with Reality 2, which can run side by side with the new version.

    For everything new it's best to use Reality 4's new conversion, materials, and automatic presets.

    Hope this helps.

    Not the answer I was hoping for, but helpful nonetheless. It doesn't reduce my anticipation for Reality 4 in the slightest. Being able to keep running Reality 2.5 does help a bit. I had a feeling this was going to happen, so thank god I switched to 3Delight when working with Gen2 for the past months; I would kick myself had I spent all this time creating Reality 2.5 material presets for Gen2 :D

    Post edited by Barubary on
  • pciccone Posts: 661

    Incy said:
    Can I pre-download LuxRender before I own Reality, so that I don't have to download both at the same time? Or should I download Reality first, then LuxRender? Also, which version of LuxRender do I need, the OpenCL version or the non-OpenCL version? And what is the difference between them?

    Yes, you can download Lux today and install it. Make sure to download version 1.3.1, which can be found on my website. Start with the non-OpenCL version; you can always change later.
    Cheers.
  • pciccone Posts: 661
    edited November 2014

    Barubary, several Genesis presets are included in Reality 4.

    Cheers.

    Post edited by pciccone on