OT: New Nvidia Cards to come in RTX and GTX versions?! RTX Titan first whispers.


Comments

  • kyoto kidkyoto kid Posts: 40,560
    edited September 2018

    lol, I jump to the last page, and see the 'GN' GT1030-disgrace edition vid, and I know this is a cool place.

    [All statements are the opinion of ZDG, who does not work for or represent Daz Studio or Gamers Nexus]

    My thoughts on Iray via CUDA, vs Iray on RT and Tensor cores, lol.

I feel that Iray via CUDA is going to happen; Iray with RT and Tensor performance boosts I'm not too sure of. RT cores can be very handy, the code just needs to be written to use them. Tensor, from what I've read elsewhere, tends to be fuzzy-math based (not great for calculating exact numbers). As I had written elsewhere, it only took over three months for GTX10 cards to get Iray support going when they launched. Conversely, it 'ONLY' took the Adobe Premiere dev team over ten years to add Intel iGPU acceleration to video encoding (for an iGPU that Intel CPUs have had since the Core Duo days, lol). Judging by that, RT and Tensor acceleration for Iray may take a few months to work out, or it may take a corporate-level market-share threat to Nvidia to convince them it would be good to do. Don't get me wrong, I would love Iray to be better; I just have doubts AMD will ever pose that kind of market-share threat to Nvidia to make them want to do it.

The Turing CUDA/SM units do have a nifty trick up their sleeves that alone may explain some performance boosts. Older CUDA cores could only do an FP or INT op per cycle; the new Turing CUDA cores can do FP and INT at the same time, sort of like Hyperthreading. I do not know yet if that simultaneous CUDA execution will need new code to take advantage of, or if it will work fine with old code. My opinion on price aside, it may be a good GPU, if the code is worked out for it.

I still feel the loss of memory at the same price bracket is a major hit to RTX card value. It's easy to fill 4GB with Iray, and 8GB is not all that great either. I feel in that way, Nvidia has been a major letdown for what Iray offers. It's great if a 32GB Tesla V100 is a business write-off, not so much if you have to pay for it out of pocket. And more affordable cards often lack memory, like anything below the GTX1050ti for that matter (including the 3GB GTX1060, lol). If you're on a budget and need Iray-capable cards with a minimum of 4GB, the pickings are slim. There is the 4GB GK208 GT730 (the 384-CUDA card, not the GF108 with 96 CUDA cores) that is kind of slow, then there is the 4GB GTX1050ti, or used 4GB GTX960s; the rest gets pricey fast.
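As a rough illustration of how quickly textures alone can fill a card (a back-of-envelope sketch with illustrative sizes, not measurements from any specific product):

```python
# Rough VRAM cost of uncompressed textures (illustrative sizes only).
def texture_mib(width, height, channels=4, bytes_per_channel=1):
    """Uncompressed size of one texture map in MiB."""
    return width * height * channels * bytes_per_channel / 2**20

one_map = texture_mib(4096, 4096)   # a single 4K RGBA map: 64 MiB
figure = 10 * one_map               # ten such maps (diffuse, normal, etc.)

print(f"one 4K map: {one_map:.0f} MiB, ten maps: {figure:.0f} MiB")
```

By that arithmetic, a few fully textured figures plus geometry and the frame buffer can plausibly push past 4GB before any texture compression kicks in.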

...and if you are someone in our little pond who creates large, involved, high-quality scenes, the lack of a card with a reasonable amount of memory (16 GB & >) is highly problematic.  As I mentioned, for the cost, why couldn't they add even 1 GB to the 2080 Ti, if not 5, and give the standard 2080 10 GB?  I saw the Titan V as pretty much a dead end, particularly in light of the forthcoming (now present) developments, as it didn't get an increase in memory and doesn't have NVLink capability to warrant a 1,800$ price increase over its already overpriced predecessor, the Titan Xp (the then 700$ 1080Ti performed just as well if not better than the Xp in several categories).

    If hardware acceleration will not be implemented for Iray, then yes it will not only raise a few hackles but make people further question the price of the consumer grade RTX series.

It could also be a matter for Daz3D, which has pretty much hitched its future to Iray.  As Outrider mentioned, what if Iray stops being supported?  What of all that content in the store and in development which uses Iray-only materials?

True, there is Octane, but like LuxRender, it means artists will have to convert all materials for every scene they create (I don't see Daz releasing Octane material kits anytime soon).  With the implementation of Vulkan, Octane will also be supported on both Nvidia and AMD GPUs.  Don't see that happening with Iray, which is strictly CUDA based, as Nvidia would be shooting themselves in the foot if they did so.

Move to Pro Render? Well, that would stir things up a bit here, as so many already have purchased or upgraded their Nvidia GPUs because of Iray.  As I don't do games, I never had the need for a powerful high-memory GPU card until Iray was introduced and then effectively became Daz's primary render engine, with 3DL taking a back seat.

    Maybe finally updating 3DL?  Wowie's aweShading Kit for 3DL hit the store two days ago, and 3DL development has some interesting things in the works. 

    Hard to say where this will all go, however I do know one thing, an RTX card is definitely not in my future (unless I get a windfall and can afford a couple RTX5000s with NVLink)

     

    Post edited by kyoto kid on
  • ZarconDeeGrissomZarconDeeGrissom Posts: 5,412
    edited September 2018

    agreed, including the stuff about daz and PA products I was thinking and did not mention. As it is right now, I think the unattractive pricing on RTX cards is to sell off old GTX10 stock that didn't sell during the mining craze. For us non-gamers, well, for us that gaming is not the primary purpose of our computers, it is even worse due to the loss of memory. I do see Iray having a good use, just not for complex large scenes. 3DL just has that little bit going for it, it is kind of easy to add memory DIMMs to a motherboard, not so easy to add memory to a graphics card, lol.

    I was not totally sold on Octane when I looked at it years ago, because it required post-processing to fix the speckles sprites whatever them things are, at least Iray is a tad better with that. I honestly don't even know where to begin with what will happen in the future of 3D rendering engines, it's a tad beyond what I have time to follow now as it is, lol. There is so much potential for cool stuff, and some scary possibilities as well for PAs and artists alike.

    As for games, the RTX cards are already raising voices. You are essentially paying a tier up for an equivalent card that will deliver lower FPS with RT on. It's Hairworks 2.0 in a nutshell, and most gamers want more consistent higher FPS, not pretty shadows and reflections at a lower FPS, lol. I think Andrew and Steve at GN put it best, the people that are most excited about RT, are the people Nvidia is not trying to sell the cards to, lol. Let's just hope that whatever Daz3d does, it works with our old clothes, lol.

    Post edited by ZarconDeeGrissom on
  • kyoto kidkyoto kid Posts: 40,560

...yeah, waiting until I get my benefit cheque next month to pick up Wowie's aweShaders Kit.  I watched much of the development over on the 3Delight Lab thread and can't wait to get it.

With 24 GB of memory in my primary render system and 32 in my secondary system (need to figure out how to attempt network rendering on 3DL between the two), that would give me a total of 56 GB of memory (well, 54 after Windows) to throw at 3DL as well as Carrara render jobs.  If I could figure out how to bring the CPUs into play, I'd have 20 total CPU threads for 3DL (Carrara already allows it).

    My experiments with IBL Master were very fruitful and I felt I came very close to a sense of realism (had to fake the bounce lighting though but the test scene still rendered in far less time than with UE)

  • kyoto kidkyoto kid Posts: 40,560
    edited September 2018

    ...well Octane4 will have a much more affordable subscription track that will get quicker updates.  That has me interested.

...and it's not just GPUs; system memory and other components will be affected by the tariffs as well.

    Post edited by kyoto kid on
If I were Daz, I would start looking towards some kind of Octane integration into Studio after the Octane 4 launch, unless of course they have some kind of inside knowledge regarding Iray's future which we are not aware of!

OK, I have something I do not understand. I'm trying to stress test the memory on my new dual 2080 Tis with NVLink.

    So I loaded the largest scene I could think of, Airport Island with all the props, and Airport Island City Centre with all the props.

    https://www.daz3d.com/airport-island-airport

https://www.daz3d.com/airport-island-city-center

    I first loaded everything, worked fine in texture mode. 

    Then I decided to switch to Iray mode.

    This is what happened:

According to GPU Tweak II, in Texture mode I used 56% of the memory.

    According to GPU Tweak II, in Iray mode I used 151% of memory.

    What does this mean? 

    Is GPU Tweak an unreliable reader of memory, bugged?

    Or did the cards decide to stack the memory?

    Texturemode.jpg
    1920 x 1080 - 390K
    IrayMode.jpg
    1920 x 1080 - 448K
    airportlog.txt
    139K
  • ZarconDeeGrissomZarconDeeGrissom Posts: 5,412
    edited September 2018

I'm tempted to say no, given the cards have 11GB each and 6GB is quite shy of that.

It may be summing the usage on both cards and reporting it against one card's worth of memory, i.e. an instrumentation quirk rather than actual memory stacking.
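One way a 151% reading could arise, assuming (and this is only a guess about GPU Tweak II, not documented behavior) that the utility sums usage across both cards but divides by a single card's capacity:

```python
# Hypothetical meter behavior: usage summed over two cards, but the
# percentage computed against ONE card's 11 GB capacity.
card_capacity_gb = 11.0
per_card_use_gb = [8.3, 8.3]   # illustrative per-card usage, not measured

total_use = sum(per_card_use_gb)
reported_pct = 100 * total_use / card_capacity_gb

print(f"reported: {reported_pct:.0f}%")  # ~151%, with no pooling involved
```

So a figure over 100% wouldn't by itself prove the cards stacked their memory.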

     

    000Lbl1.png
    640 x 512 - 178K
    Post edited by ZarconDeeGrissom on
  • jonascsjonascs Posts: 35
    edited September 2018

Right! I'm doing another test with an even larger scene, using Windows Task Manager instead.

    And you seem to be right.

As I said, I used Task Manager instead and decided to render the scene. I attached the log for this as well.

     

Well, I can't seem to be able to upload the log, but it says it rendered with the CPU.

Which isn't surprising, as I was viewing in Iray mode when I started the render.

     

     

    Taskmanager.png
    1920 x 1080 - 670K
    taskmanager2.jpg
    1920 x 1080 - 525K
    Post edited by jonascs on
Here's the log for the above render

    2018-09-30 10:11:40.069 Iray VERBOSE - module:category(IRAY:RENDER):   1.5   IRAY   rend progr: CUDA device 1 (GeForce RTX 2080 Ti): Processing scene...

    2018-09-30 10:11:40.083 Iray VERBOSE - module:category(IRAY:RENDER):   1.8   IRAY   rend progr: CUDA device 0 (GeForce RTX 2080 Ti): Processing scene...

    2018-09-30 10:11:40.424 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER):   1.4   IRAY   rend error: Unable to allocate 1218423948 bytes from 622376550 bytes of available device memory

    2018-09-30 10:11:40.432 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: Unable to allocate 1218423948 bytes from 680643788 bytes of available device memory

    2018-09-30 10:11:40.440 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 1 (GeForce RTX 2080 Ti): Scene setup failed

    2018-09-30 10:11:40.442 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 1 (GeForce RTX 2080 Ti): Device failed while rendering

    2018-09-30 10:11:40.450 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER):   1.8   IRAY   rend error: CUDA device 0 (GeForce RTX 2080 Ti): Scene setup failed

    2018-09-30 10:11:40.456 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER):   1.8   IRAY   rend error: CUDA device 0 (GeForce RTX 2080 Ti): Device failed while rendering

    2018-09-30 10:11:40.457 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray WARNING - module:category(IRAY:RENDER):   1.8   IRAY   rend warn : All available GPUs failed.

    2018-09-30 10:11:40.458 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray WARNING - module:category(IRAY:RENDER):   1.8   IRAY   rend warn : No devices activated. Enabling CPU fallback.

    2018-09-30 10:11:40.464 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER):   1.8   IRAY   rend error: All workers failed: aborting render

    2018-09-30 10:11:40.464 Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : CPU: using 8 cores for rendering

    2018-09-30 10:11:40.464 Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : Rendering with 1 device(s):

    2018-09-30 10:11:40.464 Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : CPU

    2018-09-30 10:11:40.469 Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : Rendering...
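For what it's worth, the failed allocations in that log are easier to read once the byte counts are converted to GiB (a plain unit conversion of the numbers logged above):

```python
# Byte counts copied from the Iray log lines above.
requested = 1_218_423_948                       # what Iray tried to allocate
available = {0: 680_643_788, 1: 622_376_550}    # free bytes per CUDA device

GiB = 2 ** 30
print(f"requested: {requested / GiB:.2f} GiB")
for dev, free in available.items():
    print(f"device {dev} free: {free / GiB:.2f} GiB")
```

So each card had roughly 0.6 GiB free when Iray asked for about 1.1 GiB more, which is why both devices failed and the CPU fallback kicked in.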

Based on previous discussions, I think there's a suspicion that the log is reporting the uncompressed texture sizes - but most of the images will be larger than the default compression thresholds and so will take up less RAM on the card than they do as raw data.

  • ebergerlyebergerly Posts: 3,255
    edited September 2018
    drzap said:

    And let's face it, iRay is not among the top or popular renderers in the market, not even close.  

So THAT's why whenever I search for "iray" on Google what I mostly get is posts in the DAZ forums...

I never thought of that, but it looks like you're right. The NVIDIA website for Iray actually lists DAZ Studio as one of the half-dozen or so users. Maybe Iray is becoming the Carrara/Bryce/Hexagon of the NVIDIA lineup.

    Anyway, at the end of the day, I think most of this stuff is a bit irrelevant in practical terms for most of us. We get whatever software works, buy the best GPU we can afford (realizing nothing's perfect), and realize we can't change any of it (including VRAM stacking, W10 hogging VRAM, RTX/Turing utilization, NVLink functionality, and so on). And I assume most of us aren't going to run out and buy a couple of RTX-2080ti's anytime soon, so I think most of this is moot. 

On the other hand, anyone know of some cool OptiX-based renderers that can directly import Studio scenes?

    Oh, BTW, I read on the wiki that Optix isn't actually a renderer.

    This stuff is giving me a headache.  

     

     

    Post edited by ebergerly on
  • davemc0 said:

    Hi.

    I ran the OptiX team at Nvidia for five years. I've since moved to Samsung to design a new GPU, but still keep tabs on Nvidia's ray tracing work. Let me mention a few points that might help this conversation.

Nvidia's "RTX" ray tracing hardware in the Turing architecture is the real deal. It is real hardware acceleration of ray/acceleration-structure intersection and ray/triangle intersection. The performance numbers quoted by Jensen of 10 billion rays per second are obviously marketing numbers, but in OptiX (which was written on top of CUDA at the time) 300 million rays per second was a good marketing number. So the speedup is very real.

    RTX is only being exposed through ray tracing APIs. It is not accessible through Compute APIs like CUDA, OpenCL, or Vulkan Compute. It is/will be exposed only through Microsoft DXR, Nvidia OptiX, and a Vulkan ray tracing extension. Nvidia can make their own Vulkan extension or the Vulkan Ray Tracing Technical Subgroup (of which I am a member) may define a cross-platform ray tracing extension.

Thus, all existing renderers that want hardware acceleration will have to use one of these three APIs. Most renderers are, or will be, using OptiX: Pixar, Arnold, V-Ray, Clarisse, Redshift, and others.

Another thing to note is that OptiX Prime will not be accelerated on RTX; only OptiX will. This is significant because Iray uses OptiX Prime, not OptiX. Thus, Iray does not get RTX ray tracing acceleration on my RTX 2080, and something big will have to change before it does get accelerated. I don't know whether Nvidia will port Iray to OptiX, which would be a big effort, or whether that will be done by migenius or Lightwork Design, or it will just not happen. Another possibility is for some party to implement OptiX Prime on top of OptiX to get access to RTX hardware acceleration.

    If I were DAZ I would be privately freaking out about my renderer being left in the dust because it's not hardware accelerated, even though the same company made the renderer and the hardware acceleration.

    -Dave

Thanks, that's big (bad) news you're giving us. I saw the OptiX Prime DLL in the Iray directory and was wondering if that was different from OptiX.

     

    kyoto kid said:

...so Vulkan isn't specifically software based then?  Interesting.  That would make AMD Vega cards usable in Octane as well.

For 500$ less than a P5000 (800$ less than an RTX5000) and 300$ more than a 2080 Ti, there's the WX9100, offering 16 GB HBM2 and 4096 stream processors. That could be an optimal choice for someone using Octane.

You'd still have to wait for the new Octane to confirm the compatibility.

     

    kyoto kid said:

...yeah, waiting until I get my benefit cheque next month to pick up Wowie's aweShaders Kit.  I watched much of the development over on the 3Delight Lab thread and can't wait to get it.

With 24 GB of memory in my primary render system and 32 in my secondary system (need to figure out how to attempt network rendering on 3DL between the two), that would give me a total of 56 GB of memory (well, 54 after Windows) to throw at 3DL as well as Carrara render jobs.  If I could figure out how to bring the CPUs into play, I'd have 20 total CPU threads for 3DL (Carrara already allows it).

    My experiments with IBL Master were very fruitful and I felt I came very close to a sense of realism (had to fake the bounce lighting though but the test scene still rendered in far less time than with UE)

For 3DL network rendering you need one licence per computer (600$ for an eight-core licence).

     

  • kyoto kidkyoto kid Posts: 40,560
    edited September 2018

    ...the free version of 3DL supports 8 cores.  Wouldn't you be able to network that one?

    As to Octane4, Vulkan compatibility will most likely be in the 2019 release (they have already been testing it).

    Post edited by kyoto kid on
  • KitsumoKitsumo Posts: 1,210

    I'm confused. Is Vulkan supposed to replace OpenGL or OpenCL? I've seen articles about both.

  • kyoto kid said:

    ...the free version of 3DL supports 8 cores.  Wouldn't you be able to network that one?

    As to Octane4, Vulkan compatibility will most likely be in the 2019 release (they have already been testing it).

I never tried, but you could at least use one free node. However, you'll need to do that by hand from the command line. I think it's possible to script something to integrate that inside DS, but I don't know anybody doing that.

But still, you'd only have one render node. I'm not sure it wouldn't be easier to simply run DS on the other machine to render.

For Octane, I was speaking of compatibility with AMD hardware. Even if in theory it is compatible, there could be some hardware implementation specifics. You should check the Octane forum for that.

     

    Kitsumo said:

    I'm confused. Is Vulkan supposed to replace OpenGL or OpenCL? I've seen articles about both.

Both, in fact. One API to rule them all.

  • KitsumoKitsumo Posts: 1,210
     
    Kitsumo said:

    I'm confused. Is Vulkan supposed to replace OpenGL or OpenCL? I've seen articles about both.

Both, in fact. One API to rule them all.

    Oh. That should be nice. If all major vendors get behind it that could definitely work out for the consumer. As far as the 2000 series goes, I'm going to pass for now. It's not worth it for me to upgrade.

  • nicsttnicstt Posts: 11,714
    drzap said:

    Nvidia's business strategy has been shifting away from supporting their own renderers so they can invest more in the compute and AI space.  This has been shown by their dropping support for Mental Ray and handing a lot of the support for iRay to third parties.  And let's face it, iRay is not among the top or popular renderers in the market, not even close.  Mental Ray was far more successful.  There are iRay plugins for only about 3 or 4 content creation applications, then there's Daz Studio, iClone, and Substance Painter.   That's about it.  Not a great loss from Nvidia's standpoint.  But there is always Octane for DS users.  They decided to drop Optix and use the Vulkan API for RTX access, putting them in position to offer services for both Nvidia and AMD gpu's.  Octane has more than 25 plugins with more planned to come, A big update in the near future, and I doubt users would have reason to worry about them dropping support any time soon.  I'm not a big fan of Octane, but if I were a Daz Studio user, I would be knocking down their door right now.

    Indeed.

    Closer integration (and it is close now) would have me using a hammer not a fist to knock that door down. :)

    ebergerly said:
    nicstt said:

I am pleased to hear my caution is warranted;

Yeah, sometimes being negative is a positive.

    Anyway, I guess I just assumed that in the world of NVIDIA GPU's, basically EVERYTHING was written using CUDA. Guess I have some homework to do...figure out exactly what Optix is. 

    Wow. 

Actually, I'm not pleased. Dislocating my shoulder trying to pat myself on the back is something I'd rather have avoided.

    There is lots to like about IRAY tbh, support from Nvidia doesn't seem to be one.

  • nicsttnicstt Posts: 11,714

    agreed, including the stuff about daz and PA products I was thinking and did not mention. As it is right now, I think the unattractive pricing on RTX cards is to sell off old GTX10 stock that didn't sell during the mining craze. For us non-gamers, well, for us that gaming is not the primary purpose of our computers, it is even worse due to the loss of memory. I do see Iray having a good use, just not for complex large scenes. 3DL just has that little bit going for it, it is kind of easy to add memory DIMMs to a motherboard, not so easy to add memory to a graphics card, lol.

    I was not totally sold on Octane when I looked at it years ago, because it required post-processing to fix the speckles sprites whatever them things are, at least Iray is a tad better with that. I honestly don't even know where to begin with what will happen in the future of 3D rendering engines, it's a tad beyond what I have time to follow now as it is, lol. There is so much potential for cool stuff, and some scary possibilities as well for PAs and artists alike.

    As for games, the RTX cards are already raising voices. You are essentially paying a tier up for an equivalent card that will deliver lower FPS with RT on. It's Hairworks 2.0 in a nutshell, and most gamers want more consistent higher FPS, not pretty shadows and reflections at a lower FPS, lol. I think Andrew and Steve at GN put it best, the people that are most excited about RT, are the people Nvidia is not trying to sell the cards to, lol. Let's just hope that whatever Daz3d does, it works with our old clothes, lol.

    In response to the image: Bravo.

    I currently see these cards as being great sometime in the future, providing the future pans out as Nvidia expect; they should ask Intel how that is working out for them... Over-confidence can be fatal.

  • ZarconDeeGrissomZarconDeeGrissom Posts: 5,412
    edited September 2018
    kyoto kid said:

    ...the free version of 3DL supports 8 cores.  Wouldn't you be able to network that one?

Ah, I'm not clear on what agreement Daz has for what is in Daz Studio, tho the single-CPU thing is a good point I had forgotten about. That chat was a few years back in the 3DL threads, and I can't remember if the free version supports networked rendering or not. Kettu and others may know in the 3DL thread. I'm sure the launch of 8-core, 16-thread Ryzen CPUs may possibly, in theory, have the 3DL team reconsidering the 8-core limit, if that isn't already an old limit due to Threadripper and i9 CPUs. On the "GN R7 build", Daz Studio does indeed use all 16 threads of the R7 with the built-in version of 3DL (I don't 'think' that is the free version tho, and I don't think it is the full paid version either. I honestly do not remember the details of what was different between the free version and what Daz3D has).

    kyoto kid said:

    As to Octane4, Vulkan compatibility will most likely be in the 2019 release (they have already been testing it).

Total coolness. If I'm not mistaken, I think there are two different plugins for another engine at the Daz Store, Reality and another one (that Destiny's Garden and Inane Glory have done some stuff for). Oops, that was Reality and Luxus for LuxRender, not Octane, sorry; my memory is a tad fuzzy on this. There are some old threads about Octane plugin things, but they are old (2013-ish). An Octane plugin may be an option when Octane4 is out of beta, if not soon after.

    P.S. Thanks nicstt 

    Post edited by ZarconDeeGrissom on
  • Kevin SandersonKevin Sanderson Posts: 1,643
    edited September 2018

The version of 3DL that DAZ Studio uses will use unlimited cores on a single PC (actually more than the paid full version). It's a full 3DL version, though never the current version; it's just been dumbed down for DAZ users. The functions are readily available through scripting, as Kettu has proven. It can't be networked, and I don't believe the free version can be either. The free version is all command line, too. There was a problem reported a few years back that the RIB files DAZ Studio makes did not work as well in the free version, but I don't know if that was a user knowledge issue. I seem to remember that getting an animation done with it was problematic, but again, possibly a user issue. 3DL in DAZ Studio can do animation, and with motion blur! There used to be another thread here that explained the free version.

Hey, I just noticed DAZ has a prominent position on the 3Delight Cloud page! https://3delightcloud.com/

    Post edited by Kevin Sanderson on
  • kyoto kidkyoto kid Posts: 40,560

    ...hmm that last item.  Could that be a possible forthcoming service?

  • Ghosty12Ghosty12 Posts: 1,976
    edited October 2018

Here is something that will give one a heart attack: the computer parts site below is in New Zealand, and look at what they are charging for the new cards.. All I can say is one could just about buy a complete new computer for these prices..

    https://www.playtech.co.nz/computer-hardware/graphic-cards/filter/cat/520-nvidia-geforce.html?p=3

    And here is what we will be paying in Australia, not as bad but still over the top expensive..

    https://www.scorptec.com.au/product/graphics-cards/nvidia

    https://www.pccasegear.com/category/193_1966/graphics-cards/geforce-rtx-2080-ti

    So yeah I think I will be waiting for a long time before I get one of these new cards.. lol

    Post edited by Ghosty12 on
  • kyoto kidkyoto kid Posts: 40,560

...crikey, in NZ even at the exchange rate, a 2080 Ti Strix is still expensive at 1,945 USD.  Wonder if part of that is due to a VAT.

In Australia the same card is 1,593 USD.  I do know you folks have a VAT.

    I'd hang onto that 1070 Ti you have for a while.

  • Griffin AvidGriffin Avid Posts: 3,755
    edited October 2018

    Ask someone to buy it state-side and ship it to you.

    I'm sure you've got US friends as long as some of you have been posting....

    I'll wait on real-world reviews, but 30% speed increase is nothing to sneeze at. For those that do TONS of renders, that's a blessing and probably worth the cost for the time saved, overall.

    Post edited by Griffin Avid on
  • Ghosty12Ghosty12 Posts: 1,976
    edited October 2018
    kyoto kid said:

    ...crikey in NZ even at the exchange rate, a 2080 Ti Strix is still expensive at 1,945USD.  Wonder if part of that is due to a VAT.

    In Australia the same card is 1,593USD.  I do know you folks have a VAT.  

    I'd hang onto that 1070 Ti you have for a while.

Yeah, in Australia we have the GST (Goods and Services Tax), which adds another 10% to the price.. :(  In New Zealand it is 15%, so yes, it can get expensive; the main issue they have is a low exchange rate, $1 NZD is .64 USD.. Here in Australia $1 AUD is .70 USD at the moment, so I can see the take-up of the cards here taking a while.. lol
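A quick sanity check of those figures (using the 15% GST and the roughly 0.64 NZD-to-USD rate quoted in-thread; the 3,040 NZD list price is illustrative):

```python
# Convert a GST-inclusive local price to USD and back out the tax.
def local_to_usd(local_price, usd_per_local, gst_rate):
    pre_gst = local_price / (1 + gst_rate)   # listings include GST
    return local_price * usd_per_local, pre_gst

usd, pre_gst_nzd = local_to_usd(3040.0, 0.64, 0.15)  # NZ: 15% GST
print(f"~{usd:.0f} USD total, ~{pre_gst_nzd:.0f} NZD before GST")
```

which lands right around the ~1,945 USD figure mentioned earlier for the NZ Strix listing.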

    And well for myself with my new system that is in my sig, I will be holding off for quite a while.. lol

    Post edited by Ghosty12 on
There is one advantage to NVLink that has not been mentioned, and it's a good one. NVLink allows each GPU to share its video memory through the NVLink bridge, so when you have two cards the video memory is shared; two RTX 2080 Tis would give you 22GB of memory for rendering. Right now each GPU has to keep its own copy of the rendering assets, so each 1080 Ti only has access to 11GB.

  • outrider42outrider42 Posts: 3,679
That's been discussed quite a bit, actually. And the answer is in fact yes and no. VRAM pooling is **possible**, but not enabled by default. To use it, the individual piece of software must be programmed to do so. Right now no software has this working, though some render engines are working on it. There is no guarantee that Iray will get this feature.

    At this time, I think Iray has a 50-50 chance of getting it. Those are just my thoughts, though.
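A toy model of the difference being discussed (illustrative numbers; this is not Iray's actual logic, just the capacity arithmetic): without pooling, every card needs its own full copy of the scene, so the smallest card is the limit; with pooling, the scene only has to fit in the combined memory.

```python
def scene_fits(scene_gb, cards_gb, pooled):
    """Can the scene stay on GPU? Without pooling each card holds a full copy."""
    limit = sum(cards_gb) if pooled else min(cards_gb)
    return scene_gb <= limit

cards = [11, 11]                            # two 2080 Tis
print(scene_fits(16, cards, pooled=False))  # False: 16 GB > 11 GB per card
print(scene_fits(16, cards, pooled=True))   # True: 16 GB <= 22 GB combined
```

That gap between 11 GB and 22 GB is exactly why pooling support in the renderer matters so much for large scenes.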
  • ArtiniArtini Posts: 8,748
    edited October 2018
    drzap said:

    Nvidia's business strategy has been shifting away from supporting their own renderers so they can invest more in the compute and AI space.  This has been shown by their dropping support for Mental Ray and handing a lot of the support for iRay to third parties.  And let's face it, iRay is not among the top or popular renderers in the market, not even close.  Mental Ray was far more successful.  There are iRay plugins for only about 3 or 4 content creation applications, then there's Daz Studio, iClone, and Substance Painter.   That's about it.  Not a great loss from Nvidia's standpoint.  But there is always Octane for DS users.  They decided to drop Optix and use the Vulkan API for RTX access, putting them in position to offer services for both Nvidia and AMD gpu's.  Octane has more than 25 plugins with more planned to come, A big update in the near future, and I doubt users would have reason to worry about them dropping support any time soon.  I'm not a big fan of Octane, but if I were a Daz Studio user, I would be knocking down their door right now.

    Great marketing pitch.

Does anyone of you use the monthly subscription of Octane for Daz Studio?

    How well does it perform?

I see they have updated the version for Unity to Octane 4.00 RC 5, so I may give it a try there first.

     

    Post edited by Artini on
  • Artini said:
    drzap said:

    Nvidia's business strategy has been shifting away from supporting their own renderers so they can invest more in the compute and AI space.  This has been shown by their dropping support for Mental Ray and handing a lot of the support for iRay to third parties.  And let's face it, iRay is not among the top or popular renderers in the market, not even close.  Mental Ray was far more successful.  There are iRay plugins for only about 3 or 4 content creation applications, then there's Daz Studio, iClone, and Substance Painter.   That's about it.  Not a great loss from Nvidia's standpoint.  But there is always Octane for DS users.  They decided to drop Optix and use the Vulkan API for RTX access, putting them in position to offer services for both Nvidia and AMD gpu's.  Octane has more than 25 plugins with more planned to come, A big update in the near future, and I doubt users would have reason to worry about them dropping support any time soon.  I'm not a big fan of Octane, but if I were a Daz Studio user, I would be knocking down their door right now.

    Great marketing pitch.

Does anyone of you use the monthly subscription of Octane for Daz Studio?

    How well does it perform?

    I see, they have updated version for Unity to Octane 4.00 RC 5, so I may give it a try there, first.

     

I started a monthly sub for the Octane plugin for Daz. The plugin doesn't use the latest v4 of Octane yet, and I'm not sure when that will be available.

I played around with the plugin for 20-ish hours over the last week and a half, using the 4.11.0.196 beta. There are many little quirks, like a lot. I have managed to navigate through most things and can say I truly like Octane results; however, it takes quite some time to learn and get used to things. If you have never used a node editor for materials, then you will have to learn to with the plugin. Most materials work well in the conversion process. Lighting is interesting and tricky until you learn the differences and how to utilize it with your scene. Also, I have tested the different anatomical elements available for Daz models, and only the Daz ones convert well, with the others showing lines and odd behaviour with the material on them. I am sure there is a way to fix them, but I have not focused on it.

Crashes. There will be crashes sometimes. I find that if I make a lot of changes and keep rebuilding the scene, Daz will eventually crash. That is to say, I will make some material changes or displacement map changes etc., and then in the separate Octane render window pane I will have to "Rebuild Scene" if it doesn't update correctly (which happens often), and then Daz will crash if I have done that a few times. Not sure if memory consumption is just getting too high or what.

There is just way too much that I have learned in the last week and a half to put concisely in this response, so my simplified recap is: be ready to tinker... a lot. Great results, though, when you get things figured out, and you will see a lot of detail that just doesn't show up in Iray renders, at least from Daz Iray renders.

  • ArtiniArtini Posts: 8,748

    Thanks a lot for such a nice review and insights about Octane, shaneseymourstudio.

I think I will wait for v4 of Octane, then. The base version of Octane for Unity is free, so I will try it first to see if I like it over Iray renders.

     
