How many here are using the newer RTX GPUs?

Comments

  • ebergerly Posts: 3,255

    I'm curious...for those who have already bought an RTX-2080ti, could you explain how you decided to spend $1,200 for one? I've got 3 GPU's across two desktops, so I don't really need another GPU, but honestly I've been a good boy and I do deserve a present. It's just that this one has been extremely difficult to come close to justifying for me since there's so little real data showing it's worth the $$. 

    Am I missing something? Was it a gaming decision? Since I never play games that's not a factor. Have you seen that much of an Iray render time improvement? 

    Thanks. 

     

  • bluejaunte Posts: 1,990

    I had two 1080 Tis, so I replaced them with one 2080 Ti at just about zero cost. Retained the rendering performance, increased gaming performance (since I was using only one 1080 Ti for that), got real-time ray-traced GI for Metro Exodus (not impressed, though), and got RTX to play with for future games and future things that may be introduced for render engines. And I guess my case is less cramped too. It just seemed like a great trade to make at the time, since 1080 Tis were still selling so well.

  • Taoz Posts: 10,259
    kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    +1

  • FSMCDesigns Posts: 12,843
    ebergerly said:

    I'm curious...for those who have already bought an RTX-2080ti, could you explain how you decided to spend $1,200 for one? I've got 3 GPU's across two desktops, so I don't really need another GPU, but honestly I've been a good boy and I do deserve a present. It's just that this one has been extremely difficult to come close to justifying for me since there's so little real data showing it's worth the $$. 

    Am I missing something? Was it a gaming decision? Since I never play games that's not a factor. Have you seen that much of an Iray render time improvement? 

    Thanks. 

     

    As I posted earlier, I got lucky and mine was a present from the GF. Personally, I couldn't justify the cost myself. I had been looking for a 1080 ti at a decent price for a while, and I guess she heard me complain one too many times that they were getting harder to come by and the price was even crazier. She had asked why I wanted one so badly, and I explained I wanted the 11 GB of VRAM for DS, which might explain why she got the 2080 ti.

  • ebergerly Posts: 3,255
    edited April 2019
    kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    Just to give a bit of real-world perspective on the often-quoted "SSD degradation" issue...

    I have a Samsung 850 EVO 500GB SSD that was installed in November 2017, almost 1.5 years ago. That SSD, according to the manufacturer, has a (minimum) lifespan measured in what's called "terabytes written", because an SSD's cells can only be written a finite number of times before they're expected to start degrading. For my device, that number is 150 terabytes written (TBW). The warranty is 5 years or 150 TBW, whichever comes first. 

    What that means is that I would need to write the entire contents of the 0.5 TB SSD a total of 150/0.5 = 300 times. Which means that every day for almost a year I'd need to write the entire contents of the drive (which nobody ever would). 

    So let's say I wrote a more realistic 50GB on a daily basis, which is 0.05 terabytes. I'd have to do that 3,000 times to reach 150 TBW. Once a day for over eight years. That's a long time, probably much longer than your computer will last.

    Now Samsung has a handy free tool (Samsung Magician) that tells you how many terabytes your SSD has written in its lifetime. My 1.5-year-old SSD shows a total of only 7 TBW so far (see below), which is closer to 15 GB per day. At that rate, it will last AT LEAST another 30 years. 
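
    For anyone who wants to plug in their own drive's numbers, here's a rough back-of-the-envelope Python sketch of the same arithmetic (the figures are the 850 EVO example above; swap in your own drive's rating and usage):

    # Rough SSD endurance estimate from a drive's TBW (terabytes written) rating.
    # The numbers below are the Samsung 850 EVO 500GB example quoted above;
    # substitute your own drive's figures.
    TBW_RATING_TB = 150.0      # manufacturer endurance rating, in TB written
    CAPACITY_TB = 0.5          # drive capacity
    WRITTEN_SO_FAR_TB = 7.0    # total written so far, e.g. from Samsung Magician
    YEARS_IN_SERVICE = 1.5

    full_drive_writes = TBW_RATING_TB / CAPACITY_TB        # ~300 full-drive writes
    tb_per_year = WRITTEN_SO_FAR_TB / YEARS_IN_SERVICE     # ~4.7 TB written per year
    gb_per_day = tb_per_year * 1000 / 365                  # ~13 GB per day
    years_remaining = (TBW_RATING_TB - WRITTEN_SO_FAR_TB) / tb_per_year

    print(f"Full-drive writes allowed by the rating: {full_drive_writes:.0f}")
    print(f"Average written per day so far: {gb_per_day:.1f} GB")
    print(f"Years left at this pace before hitting the rating: {years_remaining:.0f}")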

    Now you're free to imagine a monster scene that overflows your system RAM and continuously writes many gigabytes to your pagefile every day for years, but that ain't gonna happen, for many reasons. 

    Now I'm not saying the whole SSD degradation thing is a myth, but, while it's somewhat factual (obviously, since all electronic & mechanical components degrade), in the real world, for most users, it's utterly irrelevant.  

    SSD.JPG
    895 x 629 - 70K
  • kyoto kid Posts: 41,859
    edited April 2019

    ...yes, but are your scenes constantly going into swap mode?  

    Also, many of those inexpensive SSDs do not necessarily have the same ratings as the Samsung EVO ones do.

  • outrider42 Posts: 3,679
    edited April 2019
    ebergerly said:

    I'm curious...for those who have already bought an RTX-2080ti, could you explain how you decided to spend $1,200 for one? I've got 3 GPU's across two desktops, so I don't really need another GPU, but honestly I've been a good boy and I do deserve a present. It's just that this one has been extremely difficult to come close to justifying for me since there's so little real data showing it's worth the $$. 

    Am I missing something? Was it a gaming decision? Since I never play games that's not a factor. Have you seen that much of an Iray render time improvement? 

    Thanks. 

     

    Bluejaunte answered this quite well. One 2080ti just about matches two 1080tis for rendering right out of the box, and it may not even be fully optimized yet. It does this while costing less than two 1080tis. Even if you bought 1080tis at the base $700, buying two would be $1,400. Simple math takes care of the rest. Now that 1080tis are extremely hard to find at that price, the 2080ti is a much better buy. After all, if you can locate a new 1080ti, it might cost $1,000 or more. I don't see the issue here.

    And that is not even accounting for the future RTX support that we know Iray is getting. Once Iray gets RTX, the performance gap will increase even more, and the $1200 2080ti will feel like a bargain.

    Ironically, from a pure gaming standpoint the 2080ti is not a great value at all. It offers a clear boost, but more like 30-50% depending on the game, over the 1080ti. The 2080ti is without question the fastest gaming card on the planet, but the price to performance is not nearly as good as it is for rendering. But there is no doubt, it is a powerful card, and the RTX features are slowly being added to games.

    Now that Daz has confirmed that RTX will be coming, I plan on picking up a 2080ti myself. But I want to wait for next gen Ryzen first. I want to build with the rumored 16 core Ryzen. I am stoked. Then I may buy a 2080ti. I don't want to sell my 1080ti's though. I will pair one with the 2080ti and use the 2nd 1080ti in my old computer for a second still pretty fast computer that I can play with. Maybe play a game while the other renders, or use both to render different scenes. Or LAN parties. Remember those?

     

    Taoz said:
    kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    +1

    Most modern SSDs are built to manage data intelligently, and they can last many years because of that. This video explains why: it covers how SSDs can die faster, but also how they can live longer.

    Additionally, if you are buying a small SSD just for paging, what is the fear? The SSD is still speeding up your computer if you use it as the page file. SSDs are getting stupid cheap now and it is easier than ever to make this upgrade. Plus...mechanical hard drives can fail WITHOUT any warning at all, and Kyoto well knows this fact. Most SSDs have monitoring tools that let you know when they are wearing down. So if you are wearing down your drive prematurely, you will know about it! You never get this warning with a mechanical drive. SSDs can also handle abuse better than mechanical drives. Drop an HDD and it is probably toast, but an SSD will survive that fall. SSDs offer far more benefits. Really people, don't be so afraid of SSDs.

    If you really don't want to use an SSD, there is another option: Optane Memory by Intel. Optane is also cheap, but not as cheap as an SSD can be now. Optane is non-volatile memory and is sort of a cross between RAM and an SSD. It's still no substitute for straight-up RAM, but it is a way to expand memory in a machine that cannot have any more RAM added to it, provided you have an M.2 port, which you likely don't if your machine is that old. That leaves an SSD as the best option.

    Linus Tech Tips did some tests with Optane Memory and found that, compared to traditional page filing on an HDD, Optane could offer major benefits. He rendered a test scene in Blender. With 16GB of RAM, the 12GB scene rendered in 45 minutes; this is your control test. With 4GB of RAM, the scene took over 2 hours as it page filed. With 4GB of RAM + 32GB of Optane memory, the scene took just 1 hour. So clearly not as fast as real RAM, but much faster than paging off an HDD. An SSD can also offer similar, if not better, paging performance. He also found that page filing to an HDD really hits the CPU hard, which further slows down the render. It's not just the HDD getting worked here; your CPU also works "extra hard" to do this, and that itself causes slowdowns. The performance is real under the correct circumstances.

    Obviously he did not test Iray, because nobody uses Iray, LOL. But I would bet that paging from SSD or Optane would cut a massive amount of time from a render that is using the page file. I am talking about hours saved here.

    Is that not worth investing $30 for???

    It would be easy to test the performance claims. Someone with an SSD can limit their computer to a small amount of RAM and do a render (you can limit your RAM without actually pulling it out). The render must use CPU-only mode and be large enough to force page filing. Do one render with an SSD page file and one render with an HDD page file. I would, but I frankly don't want to. Sorry. But if somebody wants to test it, this is how, and we can put a number to it.
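
    If anyone wants a rough number before committing to that full render test, here's a small Python sketch that just times big sequential writes and reads against two folders, one on the SSD and one on the HDD. The paths are placeholders, and it only measures raw throughput (the read pass can be flattered by OS caching), so treat it as a proxy rather than the actual page-file render comparison:

    import os
    import time

    # Placeholder paths - point one at a folder on your SSD and one on your HDD.
    TEST_DIRS = {"ssd": r"C:\temp", "hdd": r"D:\temp"}
    FILE_SIZE_MB = 2048              # 2 GB test file per drive
    CHUNK = b"\0" * (1024 * 1024)    # write in 1 MB chunks

    for label, folder in TEST_DIRS.items():
        path = os.path.join(folder, "paging_proxy.bin")

        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(FILE_SIZE_MB):
                f.write(CHUNK)
            f.flush()
            os.fsync(f.fileno())     # make sure the data actually hit the disk
        write_secs = time.perf_counter() - start

        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(1024 * 1024):
                pass                 # OS caching may inflate this read figure
        read_secs = time.perf_counter() - start

        os.remove(path)
        print(f"{label}: write {FILE_SIZE_MB / write_secs:.0f} MB/s, "
              f"read {FILE_SIZE_MB / read_secs:.0f} MB/s")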

    An SSD quite literally saved my laptop. It feels like a whole different machine thanks to its SSD. I have a 1TB SSD for my desktop as drive C. I also keep primary apps on it, like Daz, and my most played games. I also have a 250GB SSD since that was my first. I keep some games on it, too. I have a 4TB external HDD with the bulk of my Daz library on it, plus a couple of other hard drives for backing up things. Because that is how I style and profile. WOOOOOOOOO!!!!

  • kyoto kid Posts: 41,859

    mechanical hard drives can fail WITHOUT any warning at all, and Kyoto well knows this fact.

    ...indeed I do, hence getting a backup drive was the next immediate priority (I'm finally just beginning to rebuild my massive Freebie runtime). I don't have the funds for a 1.5 or 2 TB SSD (those are still too expensive for my budget), but I did get a small one for the boot and programme/utility drive (240 GB).  

  • joseft Posts: 310

    SSD life was something I was always wary of too, especially since the first one I ever got died on me inside of 2 years, and the second one had issues around that same time as well. It didn't completely die, but a few times I would get boot errors telling me no OS drive was found. It would work again after a hard reset, but I figured I should replace it before it died completely. 

    My current one is roughly 3 years old now, so I wondered how close it was to death. I checked it using that Samsung Magician program and was surprised at how little data it has written: 36 terabytes, and my model can supposedly write up to 300.

  • Taoz Posts: 10,259

    One of the reasons I'm not using SSD is that I'm not sure if the current ones will work correctly, or even work, with my hardware, which is pretty old (10+ years). If I knew for sure it would work without problems I might give it a try.

  • kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    They don't degrade that quickly.  It's actually a non-issue.  I refer you to the 2013-2015 torture test performed by Techreport.  It took 18 months of intensive writes to wear out most SSDs.  I've mentioned this before, yet the same people keep bringing this up as if it's a real thing.  Please do have a look.

  • Paradigm Posts: 423
    edited April 2019
    ebergerly said:

    I'm curious...for those who have already bought an RTX-2080ti, could you explain how you decided to spend $1,200 for one? I've got 3 GPU's across two desktops, so I don't really need another GPU, but honestly I've been a good boy and I do deserve a present. It's just that this one has been extremely difficult to come close to justifying for me since there's so little real data showing it's worth the $$. 

    Am I missing something? Was it a gaming decision? Since I never play games that's not a factor. Have you seen that much of an Iray render time improvement? 

    Thanks. 

     

    I had been saving up since the 1080TIs came out in order to buy the top-of-the-line card of the next generation. The 2080TI came out and I bought it. Sure, it's overpriced, but I had set aside money over a long period specifically to get the top-tier card, and those always come with the "top tax." It was expensive, but it didn't really feel that way because of how methodically I saved up for it.

     

    Regarding the HDD debate going on, I can personally attest that the read/write speeds have basically no effect on anything but starting up the scene and importing new assets. I have a portable HDD that I run all of my content off of on my laptop over USB 3.0 and everything works fine.

  • Takeo.Kensei Posts: 1,303
    edited April 2019
    Are the people experiencing virtual RAM swaps using SSDs for the page files? That makes page filing much faster. Obviously not a replacement for real RAM, but waaaaaaay better than using a spinning hard disc as a paging file. And since you can get 250gb SSDs for $30 or less these days, its a super easy upgrade to make.

    SSD or not, when you have no more RAM your computer will be slow. That doesn't change anything, and you're left to wait or kill apps (I have 32 GB and SSDs) because your computer will still be unresponsive for a while.

    Money is better spent on memory if that occurs often (I multiprocess a lot)

     

    kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    The read/write degradation still allows for at least 5-10 years of writes in the most pessimistic estimation, even with the latest low-cost NAND. You're more likely to kill your SSD through a controller or PCB problem.

     

  • Takeo.Kensei Posts: 1,303
    edited April 2019
     
    Taoz said:
    kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    +1

    Most modern SSDs are built in a way to manage data smart. They can last many years thanks to that. This video explains why. It covers how SSDs can die faster, but also how they can live longer.

     

    If you really don't want to use a SSD, there is another option, Optane Memory by Intel. Optane is also cheap, but not as cheap as a SSD can be now. Optane is non volatile memory and is sort of a cross between RAM and SSD (sort of). Still no substitute for straight up RAM, but it is a way to expand memory in a machine that cannot have any more RAM added to it. Provided you have a M.2 port, which you likely don't if your machine is that old. That leaves SSD as the best option.

    Linus Tech Tips did some tests with Optane Memory and found that compared to traditional page filing on a HDD, Optane could offer major benefits. He rendered a test scene in Blender. With 16GB of RAM, the 12GB scene rendered in 45 minutes, this is your control test. With 4GB of RAM, this scene took over 2 hours as it page filed. With 4GB RAM + 32GB Optane memory, the scene took just 1 hour. So clearly not as fast as real RAM, but much faster than paging off a HDD. A SSD can also offer similar, if not better paging performance. He also found that page filing to HDD really hits the CPU hard, which further slows down the render. Its not just the HDD getting worked here, your CPU also works "extra hard" to do this, and that itself causes slow downs.  The performance is real under the correct circumstances.

    Obviously he did not test Iray, because nobody uses Iray, LOL. But I would bet that paging from SSD or Optane would cut a massive amount of time from a render that is using the page file. I am talking about hours saved here.

    Is that not worth investing $30 for???

    It would be easy to test performance claims. Someone with a SSD can limit their computer to a small amount of RAM and do a render. You can limit your RAM without actually pulling it out. The render must use CPU only mode and be large enough to force page filing. One render with SSD page file and one render with HDD page file. I would, but I frankly don't want to. Sorry. But if somebody wants to test it, this is how, and we can apply a number to it.

    A SSD quite literally saved my laptop. It feels like a whole different machine thanks to its SSD. I have a 1TB SSD for my desktop as drive C. I also keep primary apps on it, like Daz, and my most played games. I also have a 250GB SSD since that was my first. I keep some games on it, too. I have a 4TB external HDD with the bulk of my Daz library on it, plus a couple other hard drives for backing up things. Because that is how I style and profile. WOOOOOOOOO!!!!

    Optane and SSD are completely different tech. You don't get the same performance, and you need to have the required hardware for it. If you don't, the investment cost is way more than $30.

  • outrider42 Posts: 3,679
     
    Taoz said:
    kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    +1

    Most modern SSDs are built in a way to manage data smart. They can last many years thanks to that. This video explains why. It covers how SSDs can die faster, but also how they can live longer.

     

    If you really don't want to use a SSD, there is another option, Optane Memory by Intel. Optane is also cheap, but not as cheap as a SSD can be now. Optane is non volatile memory and is sort of a cross between RAM and SSD (sort of). Still no substitute for straight up RAM, but it is a way to expand memory in a machine that cannot have any more RAM added to it. Provided you have a M.2 port, which you likely don't if your machine is that old. That leaves SSD as the best option.

    Linus Tech Tips did some tests with Optane Memory and found that compared to traditional page filing on a HDD, Optane could offer major benefits. He rendered a test scene in Blender. With 16GB of RAM, the 12GB scene rendered in 45 minutes, this is your control test. With 4GB of RAM, this scene took over 2 hours as it page filed. With 4GB RAM + 32GB Optane memory, the scene took just 1 hour. So clearly not as fast as real RAM, but much faster than paging off a HDD. A SSD can also offer similar, if not better paging performance. He also found that page filing to HDD really hits the CPU hard, which further slows down the render. Its not just the HDD getting worked here, your CPU also works "extra hard" to do this, and that itself causes slow downs.  The performance is real under the correct circumstances.

    Obviously he did not test Iray, because nobody uses Iray, LOL. But I would bet that paging from SSD or Optane would cut a massive amount of time from a render that is using the page file. I am talking about hours saved here.

    Is that not worth investing $30 for???

    It would be easy to test performance claims. Someone with a SSD can limit their computer to a small amount of RAM and do a render. You can limit your RAM without actually pulling it out. The render must use CPU only mode and be large enough to force page filing. One render with SSD page file and one render with HDD page file. I would, but I frankly don't want to. Sorry. But if somebody wants to test it, this is how, and we can apply a number to it.

    A SSD quite literally saved my laptop. It feels like a whole different machine thanks to its SSD. I have a 1TB SSD for my desktop as drive C. I also keep primary apps on it, like Daz, and my most played games. I also have a 250GB SSD since that was my first. I keep some games on it, too. I have a 4TB external HDD with the bulk of my Daz library on it, plus a couple other hard drives for backing up things. Because that is how I style and profile. WOOOOOOOOO!!!!

    Optane and SSD are completely different tech. You dont get the same performance and you need to have the required Hardware for that. If you don't, the inverstment cost is way more than 30$

    Where did I say more RAM was bad? Where did I say SSD or Optane would equal more RAM?

    I also said you'd need M.2 to use Optane. They may use different tech, but this conversation is about using them for the same purpose as virtual memory. I'm merely discussing options here if RAM is not an option. Why is this a problem?
  • kyoto kid Posts: 41,859
    Are the people experiencing virtual RAM swaps using SSDs for the page files? That makes page filing much faster. Obviously not a replacement for real RAM, but waaaaaaay better than using a spinning hard disc as a paging file. And since you can get 250gb SSDs for $30 or less these days, its a super easy upgrade to make.

    SSD or not when you have no more RAM, your computer will be slow. That doesn't change anything and you're good to wait or kill apps (I have 32GGB and SSDs) because your computer will still be unresponsive for a while

    Money is better spent on memory if that occurs often (I multiprocess a lot)

     

    kyoto kid said:

    ...while faster than mechanical drives,  SSDs do degrade in performance from multiple read/write operations and using VM for rendering does this constantly during the process.  This is why on an HDD it is so slow as data for the render process is written and then read back numerous times until rendering is finished.

    The Read/Write degradation allows for at least 5-10 years write in the most pessimistic estimation even with latest low-cost Nands. You'll be more likely to kill your SSD because of a controller or PCB problem

     

    ...unfortunately, having an older system, I too have to deal with memory limits set by the MB. The main system is limited to 24 GB (P6T); that's as much as I have in it. The assembly system is limited to 32 GB (P8P67), which it currently has. The Titan X is a "big" card and just fits in the older 24 GB system, which has a larger case, so that became my rendering system. Being networked, I can access its data drive from the assembly machine, so I don't need to mirror the content drive in the assembly system and worry about possible path errors. 

    The SSDs I have in both machines are SATA III. The assembly machine has a 4 GB GPU, primarily for performing Iray render tests before sending the completed scene to the render system.

  • ebergerly said:

     

    Now that Daz has confirmed that RTX will be coming, I plan on picking up a 2080ti myself. 

    I've been out of the loop: DAZ is officially going to support RTX? For just rendering, or real-time display as well?  

  • ebergerly said:

     

    Now that Daz has confirmed that RTX will be coming, I plan on picking up a 2080ti myself. 

    I've been out of the loop-DAZ is officially going to support RTX? For just rendering or real time display as well?  

    Wouldn't it be rather dumb for them not to support RTX?  I mean really, it's Nvidia.  Of course DAZ will support it. 

    Now if it was called "Slipshod RTX by some dragons, unicorns, and guys named McGyver, Kyoto, Gryphon, and Pixel", it would be another matter entirely, haha!  cheeky

  • ebergerly Posts: 3,255
    edited April 2019
    ebergerly said:

     

    Now that Daz has confirmed that RTX will be coming, I plan on picking up a 2080ti myself. 

    I've been out of the loop-DAZ is officially going to support RTX? For just rendering or real time display as well?  

    Actually, it's more about NVIDIA supporting RTX in Iray than about DAZ. And apparently NVIDIA has said that, yes, they will bring RTX support to Iray. Of course, the devil is in the details as far as when it will happen, how much will get implemented, and so on. But for those of us who were incorrectly thinking that Iray is on its way out, that's good news. Since Iray is certainly not widely used compared to other renderers, there were some concerns it might not have a long lifespan. 

  • ebergerly Posts: 3,255
    kyoto kid said:

    ...yes, but are your scenes constantly going into swap mode?  

    Also many of those inexpensive SSDs do not necessarily have the same  ratings like Samsung Evo ones do

    If you want something to be true, you can certainly manufacture a whole list of unlikely or impossible scenarios that might seem to prove it true. Whether those are relevant is another question. 

  • joseft Posts: 310
    ebergerly said:
    ebergerly said:

     

    Now that Daz has confirmed that RTX will be coming, I plan on picking up a 2080ti myself. 

    I've been out of the loop-DAZ is officially going to support RTX? For just rendering or real time display as well?  

    Actually, it's more about NVIDIA supporting Iray and RTX, not DAZ. And apparently NVIDIA has said that yes they will bring RTX support to Iray. Of course the devil is in the details as far as when it will happen, how much will get implemented, and so on. But for those of us who were incorrectly thinking that Iray is on its way out, that's good news. Since Iray is certainly not widely used compared to other renderers, there were some concerns it might not have a long lifespan. 

    Funny thing about Iray - major studios don't use it because it doesn't have the feature set that other renderers do and isn't developed fast enough, and Nvidia probably doesn't develop it fast enough because not enough people use it to be worth the extra resources.

    You would think that Nvidia would have their own renderer supporting their own RT hardware before other renderers do, but that isn't going to be the case. Redshift 3.0 is due any day now, and early support for RT is coming with it. I have not checked in on Octane recently, but I know they have also had internal beta builds supporting RT hardware for months now; I'm just not sure if they have released them publicly yet.

  • outrider42 Posts: 3,679
    edited April 2019
    ebergerly said:

     

    Now that Daz has confirmed that RTX will be coming, I plan on picking up a 2080ti myself. 

    I've been out of the loop-DAZ is officially going to support RTX? For just rendering or real time display as well?  

    Daz Studio was specifically listed as getting RTX support in 2019 with a statement direct from Daz Steve. This was not posted in the forums, but forum users quickly noticed it.

    Additionally, Steve also made a post on the forums about the future of Daz in the thread speculating about Genesis 9. He said there would be no Genesis 9 in 2019, but he also said that some very cool things would be coming to Daz Studio in 2Q-4Q. Of note here is that Steve used a plural term, so there are several things in the works.

    This is all the information that has been given. Daz is rather famous (or infamous, perhaps) for keeping their plans extremely secretive. Odds are this is all we will get until the new features, whatever they are, are released. So we do not know what this means: it could be for Iray, it could be a real-time mode, it might even be a whole new engine that is not Iray (notice Iray is not in the quote, so it could be anything). We are also now officially in 2Q, so if something is coming in 2Q like his forum post said, then it could come along any day now. If not, 2019 is the official timetable, so SOMETHING should happen before the end of the year.

    Every piece of software that has added RTX support has seen tremendous gains in performance. This is not something to sneeze at, and keep in mind Steve himself calls it a "game changer". I believe that statement; I have even used that term in my own speculation. RTX will be a game changer, and that is not just speculation anymore, that is direct from Daz Steve himself.

     

  • ebergerly Posts: 3,255
    edited April 2019

    I think there may be some confusion about what is under DAZ control and what is under NVIDIA/Iray control. My limited understanding of Iray is that it is a complete, full-featured rendering solution designed for relatively small studios and software vendors who want to provide consumer-level rendering software without requiring a ton of development. It differs from some of the other rendering solutions that are used by large CG/game/whatever production companies with large staffs of developers who want low-level, very custom and detailed control over all aspects of their software, and are willing to assign large teams of software developers to dig deep and develop customized solutions for changing business needs.

    My simple view of Iray is that DAZ (or whoever) merely has to, for the most part, develop a user interface, configure their scene files in the Iray format, configure material/surface settings that are all implemented internally by Iray, and so on, and Iray does the rest. I'm sure this is a vast oversimplification, but I think the point remains. Once DAZ decides on Iray, I think they pretty much rely on it to do just about everything behind the scenes, and they just need to build the interface per Iray requirements. I assume DAZ's focus is mainly on developing content, and as a side part of that, making sure the content is all in an Iray-compatible format. 

    And with all due respect, a phrase like "game changer" is, while somewhat exciting to some, ultimately meaningless. It can mean anything, and I think what's really important is how NVIDIA actually ties Iray into RTX technology in the future, and ultimately what that actually means for Studio/Iray in terms of render speeds and features. And until that happens and we can put numbers to it, it's all just guessing.   

    For now, the good thing (IMO) is that the RTX cards are cutting the render times of their GTX counterparts in half, and also have a good bang for the buck. Certainly not close to what all the hype promised last year, but there's always the promise that this will improve to some extent depending on what happens to Iray.  

  • bluejaunte Posts: 1,990

    That whole statement is kinda hyperbole if you dissect it a bit. laugh

  • kyoto kid Posts: 41,859
    ebergerly said:

     

    Now that Daz has confirmed that RTX will be coming, I plan on picking up a 2080ti myself. 

    I've been out of the loop-DAZ is officially going to support RTX? For just rendering or real time display as well?  

    Daz Studio was specifically listed as getting RTX support in 2019 with a statement direct from Daz Steve. This was not posted in the forums, but forum users quickly noticed it.

    Additionally, Steve also made a post on the forums about the future of Daz in the thread speculating about Genesis 9. He said there would be no Genesis 9 in 2019, but he also said that some very cool things would be coming to Daz Studio in 2Q-4Q. Of note here is that Steve used a plural term, so there are several things in the works.

    This is all the information that has been given. Daz is rather famous (or infamous perhaps) for keeping their plans extremely secretive. Odds are this is all we will get until whatever the new features are release. So we do not know what this means, it could be for Iray, it could be a real time mode, it might even be a whole new engine that is not Iray (notice Iray is not in the quote, so it could be anything). We are also now officially in 2Q, so if something is coming in 2Q like his forum post said, then something could come along any day now. If not, 2019 is the official time table, so SOMETHING should happen before the end of the year.

    Each software that has added RTX support has seen tremendous gains in performance. This not something to sneeze at, and keep in mind Steve himself calls it a "game changer". I believe that statement, I have even used that term in my own speculation. RTX will be a game changer, and that is not just speculation anymore, that is direct from Daz Steve himself.

     

    ...would be nice if something like out-of-core rendering or a render engine that can be run externally (like LuxRender) were introduced, where you could submit the scene to the render engine and then shut down the scene and programme to free up more system resources... Not all of us can afford mini-supercomputers with 64 - 128 GB of memory, 32-core CPUs, and/or GPUs with boatloads of VRAM, particularly as the majority of us are still hobbyists on meagre incomes. 

  • Greymom Posts: 1,139
    edited April 2019
    Taoz said:

    One of the reasons I'm not using SSD is that I'm not sure if the current ones will work correctly, or even work, with my hardware, which is pretty old (10+ years). If I knew for sure it would work without problems I might give it a try.

    I am running Win 7 Pro on some old X8DTT-F Supermicro server blades (LGA 1366, Tylersburg 5500 chipset, CPU XEON W5660).  I am using three different brands of 120GB SSD's (whatever I could get cheap).  So far no problems.  These blades are so old they are SATA II.  The SSD's still seem faster than SATA III mechanical drives on these boards, but I mainly got them so they would fit in the compact rack system I designed.  The blades are part of a small, low-cost render stack for Carrara, and hopefully VUE 2015 (non-AVX version) and Blender Cycles.  It is not complete yet, but so far, so good.

    I am also using SSD's on old ASUS LGA 1155 (I7-2600K CPU, SATA III) motherboards, Win 7 Pro now upgraded to Win 10 Pro, for a couple of family gaming machines.  They are working great with an obvious speed boost.

     

     

  • Paradigm Posts: 423
    kyoto kid said:

    ...would be nice if something like Out of Core rendering or a render engine that can be run externally (like LuxRender) were introduced where you could submit the scene to the render engine then shut down the scene and programme to free up more system resources..  Not all of us can afford mini-supercomputers with 64 - 128 GB of memory, 32 core CPUs, and/or GPUs with boatloads of VRAM, particularly as the majority of us are still hobbyists on meagre incomes. 


    Check out Battle Encoder Shirase. While it can't replace having tons of excess resources, it allows me to do other things while rendering. You can set it to limit the amount of CPU a program will use, which is perfect for DS. It slows down the render, obviously, but if I set it to -33% I can do stuff like browse the web, listen to music, and use Photoshop while I render.
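
    Not BES itself, but if you'd rather script something similar, here's a rough Python sketch using psutil to pin DAZ Studio to a subset of CPU cores so a render leaves some cores free for other work. It caps which cores get used rather than throttling a percentage the way BES does, and the process name below is an assumption, so check Task Manager for the exact name on your install:

    import psutil

    # Assumed process name - verify it in Task Manager; it may differ per install.
    TARGET_NAME = "DAZStudio.exe"
    # Let the render use only the first half of the logical cores.
    render_cores = list(range(psutil.cpu_count() // 2))

    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == TARGET_NAME:
            proc.cpu_affinity(render_cores)   # pin the process to those cores
            print(f"Pinned PID {proc.pid} to cores {render_cores}")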

  • Joe.Cotter Posts: 3,362
    edited April 2019

    I just put up a couple of posts in the "Who said Blender was hard" thread that I think point out a monkey wrench in all of this.

    RTTE (read to the end before judging...)

    The future is moving toward leaner client devices and pushing more to cloud servers, as it will be so much more cost effective in most situations. Specifically, the first post is about a Blender add-on to skip render farms and go directly to AWS for offline rendering. It is so much cheaper and faster than previous solutions that it could be considered a game changer, if not for the fact that it's limited to Cycles and is new tech (read: not necessarily rock solid, and harder to set up than it will be going forward). But it is a good example of where things are going, since I believe using video cards in the price range of RTX cards will be hard to justify for most people when they can get performance cheaply and without a high overhead. It also has the ripple effect of not requiring someone to even have a desktop machine over a laptop. Not that there aren't still advantages with traditional methods going forward, just that, IMO, it will be more and more of an edge case. Specifically, content creators or high-use situations where a high-end local solution is combined with a render farm implementation. I just don't see anyone using only a local solution going forward (at some point in the near future) unless they are wedded to the comfort of a system that is already working and requires no transition costs.

  • Taoz Posts: 10,259
    Greymom said:
    Taoz said:

    One of the reasons I'm not using SSD is that I'm not sure if the current ones will work correctly, or even work, with my hardware, which is pretty old (10+ years). If I knew for sure it would work without problems I might give it a try.

    I am running Win 7 Pro on some old X8DTT-F Supermicro server blades (LGA 1366, Tylersburg 5500 chipset, CPU XEON W5660).  I am using three different brands of 120GB SSD's (whatever I could get cheap).  So far no problems.  These blades are so old they are SATA II.  The SSD's still seem faster than SATA III mechanical drives on these boards, but I mainly got them so they would fit in the compact rack system I designed.  The blades are part of a small, low-cost render stack for Carrara, and hopefully VUE 2015 (non-AVX version) and Blender Cycles.  It is not complete yet, but so far, so good.

    I am also using SSD's on old ASUS LGA 1155 (I7-2600K CPU, SATA III) motherboards, Win 7 Pro now upgraded to Win 10 Pro, for a couple of family gaming machines.  They are working great with an obvious speed boost.

    Thanks, that sounds promising; I may give it a try with a small one then. If it doesn't work, I guess I can always use it when I build a better rendering machine at some point.

  • ebergerly Posts: 3,255
    Some of the benefits of powerful local GPUs: you can also use them for video games, which presumably matters to the vast majority of consumer rendering folks. You can also use them for near-realtime scene preview while building scenes; I can't imagine being without that. And I don't think the tech enthusiasts will ever part with their awesome GPUs. Add to that concerns over privacy and copyright (real or perceived), and I think cloud-based rendering will remain fairly niche.