How many here are using the newer RTX GPUs?

Comments (135)

  • ebergerly Posts: 3,255
    BTW, personally, the more I hear about internet scams, password theft, private data being stolen, and so on, the further away I stay from anything "cloud".
  • Dim Reaper Posts: 687

    I can't see cloud-based rendering becoming commonplace for the foreseeable future.  Considering the time it takes for a complex scene to be loaded into my RAM locally before rendering, I hate to think how long it would take to send that information over a broadband connection.  Also, when I work on a scene, I might be at it for several hours setting up the scene, tweaking poses and expressions, and over those several hours I might easily make a dozen test renders.  On my local machine I don't really pay anything for them, but if I were paying per render on a cloud server, I wouldn't be able to work accurately on anything but the simplest of scenes.  Maybe a professional CG artist can get it right the first time, or not have to worry about the cost of rendering if they have a big company behind them, but I'm sure that hobbyists will be keeping their own local setups for decades yet.

  • Paradigm Posts: 423

    I just put up a couple of posts in the "Who said Blender was hard" thread that I think point out a monkey wrench in all of this.

    RTTE (read to the end before judging...)

    The future is going toward leaner client devices and pushing more to cloud servers, as it will be so much more cost effective in most situations. Specifically, the first post is about a Blender add-on that skips render farms and goes directly to AWS for offline rendering. It is so much cheaper and faster than previous solutions that it could be considered a game changer, if not for the fact that it's limited to Cycles and is new tech (read: not necessarily rock solid, and harder to set up than it will be going forward). But it is a good example of where things are going, since I believe using video cards in the price range of RTX cards will be hard to justify for most people when they can get the performance cheaply and without a high overhead. It also has the ripple effect of not requiring someone to even have a desktop machine over a laptop. Not that there aren't still advantages to traditional methods going forward, just that, IMO, it will be more and more of an edge-case scenario. Specifically, content creators or high-use situations where a high-end local solution is combined with a render farm implementation. I just don't see anyone using only a local solution going forward (at some point in the near future) unless they are wedded to the comfort of a system that is already working and requires no transition costs.

     

    Privacy, proprietary content, and content restrictions are three very important hurdles that the cloud needs to clear before it becomes the primary method. Nothing is more secure than hosting all of your own stuff locally.

  • Kitsumo Posts: 1,221

     I just don't see anyone using only a local solution going forward (at some point in the near future) unless they are wedded to the comfort of a system that is already working and requires no transition costs, like owning their own hardware and not liking the idea of paying a company for a "service" they can provide for themselves.

    There. Fixed that for me. Different strokes for different folks. I'm sure lots of people will be happy with cloud rendering - it's less cost up front. Just like some would rather rent a home than spend the time, money, blood, sweat & tears it takes to own one. And I'm fine with that; if it makes sense to you, go for it. As for me, I'll keep rendering in my tiny overheated room on my own GPUs, surrounded by my books, CDs, DVDs and vinyl records.

    Additionally, I guess GPUs can also be used for playing games. Plus there are all sorts of niche projects and hobbies that you can't do with cloud rendering. Apparently now there's a Single Board Computer (similar to Raspberry Pi) with a PCI-E connector on it. So it's an SBC with a low powered processor, but it can connect to any standard GPU. I don't know what it's meant for, but I can imagine the possibilities. That's why I love having my own stuff.

  • ebergerly Posts: 3,255

    I'm trying to imagine having a very old and/or inexpensive GPU that is pretty much incapable of doing an Iray preview. And trying to set up the scene and guess at what the final rendered image will look like from just a texture shaded view or whatever. No idea what the emissive lights and materials will look like in the final render. No clue. And then you just assume it's okay, send it to the cloud for rendering, and get back something that's nothing like you wanted.

    Repeat process over and over and over. 

    Yeah, not gonna happen. 

  • ebergerly said:
    Some of the big benefits of powerful local GPUs: you can also use them for video games, which presumably covers the vast majority of consumer rendering folks. You can also use them for near-realtime scene preview while building scenes; I can't imagine being without that. And I don't think the tech enthusiasts will ever part with their awesome GPUs. Add to that concerns over privacy and copyright (real or perceived), and I think cloud-based rendering will remain fairly niche.

    Yeah, I run an Iray render server, and while everyone who uses it says how much it helps, it really is quite a hard sell for people to invest in it as a service.  It's certainly a great option for people who want access to hardware otherwise out of their reach, though.

  • Fauvist Posts: 2,219

    Yeah, I run an Iray render server, and while everyone who uses it says how much it helps, it really is quite a hard sell for people to invest in it as a service.  It's certainly a great option for people who want access to hardware otherwise out of their reach, though.

    Thank you for telling me about your Iray render server (a while ago).  The only reason I have hesitated to use the service is this: "Pay as you go - $0.40 a minute (unlimited* access billed monthly)".  This is the service I would most likely want.  The problem for me is knowing how much a render is going to cost before it's done. It's $0.40 a minute, but how many minutes does it take to render an image?  I suppose it depends on file size.
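For rough budgeting, a flat per-minute rate just multiplies render time, so timing a short test render locally gives a usable estimate before committing. A minimal sketch of that arithmetic, assuming the quoted $0.40/minute rate and made-up example times:

```python
# Rough cost estimate for per-minute cloud rendering (hypothetical numbers).
RATE_PER_MINUTE = 0.40  # quoted "pay as you go" rate in USD

def estimate_cost(final_minutes: float, test_renders: int = 0,
                  test_minutes: float = 0.0) -> float:
    """Estimate total cost: the final render plus any billed test renders."""
    total_minutes = final_minutes + test_renders * test_minutes
    return total_minutes * RATE_PER_MINUTE

# Example: a 90-minute final render plus a dozen 5-minute test renders.
print(f"${estimate_cost(90, test_renders=12, test_minutes=5):.2f}")  # $60.00
```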

  • kyoto kid Posts: 41,861
    Kitsumo said:

     I just don't see anyone using only a local solution going forward (at some point in the near future) unless they are wedded to the comfort of a system that is already working and requires no transition costs, like owning their own hardware and not liking the idea of paying a company for a "service" they can provide for themselves.

    There. Fixed that for me. Different strokes for different folks. I'm sure lots of people will be happy with cloud rendering - it's less cost up front. Just like some would rather rent a home than spend the time, money, blood, sweat & tears it takes to own one. And I'm fine with that; if it makes sense to you, go for it. As for me, I'll keep rendering in my tiny overheated room on my own GPUs, surrounded by my books, CDs, DVDs and vinyl records.

    Additionally, I guess GPUs can also be used for playing games. Plus there are all sorts of niche projects and hobbies that you can't do with cloud rendering. Apparently now there's a Single Board Computer (similar to Raspberry Pi) with a PCI-E connector on it. So it's an SBC with a low powered processor, but it can connect to any standard GPU. I don't know what it's meant for, but I can imagine the possibilities. That's why I love having my own stuff.

    ...some of us have no choice but to rent (housing prices where I live are obscene, and I am now in my mid-60s and retired, so I'm not going to take on a mortgage at this stage).

    That also applies to subscription software.

    Being on Social Security, I cannot afford to plop down $600 up front for an Octane perpetual licence and the Daz plugin.  The subscription track gives me the full engine plus plugin for $20 per month; that's nearly 3 years of use with the cost spread out over time, which is easier to manage on a tight budget.

  • davidtriune Posts: 452
    edited April 2019

    Would a 2080 Ti render twice as fast as my 1080 Ti? Would an SLI setup be four times as fast?

    Post edited by davidtriune on
  • davidtriune Posts: 452

    I had two 1080 Tis, so I replaced them with one 2080 Ti at just about zero cost. I retained the rendering performance, increased gaming performance (since I was using only one 1080 Ti for that), got real-time ray-traced GI for Metro Exodus (not impressed, though), got RTX to play with for future games and future things that may be introduced for render engines. And I guess my case is less cramped too. It just seemed to be a great trade to make at the time, since 1080 Tis were still selling so well.

    Not impressed by that game either. They need to use photogrammetry for their game textures instead of stuff made in Photoshop, IMO.

  • DAZ_Rawb Posts: 817

    Just so all of you RTX-using people are aware, there is a new version of the Daz Studio Pro Beta - 4.11.0.335:

     

    https://www.daz3d.com/forums/discussion/265581/daz-studio-pro-beta-version-4-11-0-335-updated/p1

     

    One of the things in this update is an updated Iray which adds support for the tensor cores that are part of the RTX technology. This won't accelerate rendering (it doesn't add RT core support), but it should provide a hardware speedup for the AI denoiser. So be sure to update your NVIDIA drivers and give that functionality a good beating.

  • TheKD Posts: 2,711

    So is this update only worth grabbing if you've got an RTX card, or does the AI denoiser work on GTX too?

  • davidtriune Posts: 452
    TheKD said:

    So is this update only worth grabbing if you've got an RTX card, or does the AI denoiser work on GTX too?

    The denoiser works on my GTX 1080 Ti quite nicely.

  • TheKD Posts: 2,711
    TheKD said:

    So is this update only worth grabbing if you've got an RTX card, or does the AI denoiser work on GTX too?

    The denoiser works on my GTX 1080 Ti quite nicely.

    Thanks, will have to wait to grab it till this render finishes baking lol.

  • Kitsumo Posts: 1,221
    edited April 2019
    kyoto kid said:
    ...some of us have no choice but to rent (housing prices where I live are obscene, and I am now in my mid-60s and retired, so I'm not going to take on a mortgage at this stage).

    That also applies to subscription software.

    Being on Social Security, I cannot afford to plop down $600 up front for an Octane perpetual licence and the Daz plugin.  The subscription track gives me the full engine plus plugin for $20 per month; that's nearly 3 years of use with the cost spread out over time, which is easier to manage on a tight budget.

    I know what you mean. When I started college in '92 at a major university (ACC, football program and all), $2,600 covered a whole semester: books, room, meal plan, everything. Today, it covers basic tuition at the local tech school. We're getting priced out of everything. I'll finish this degree sooner or later.

    Would a 2080 Ti render twice as fast as my 1080 Ti? Would an SLI setup be four times as fast?

    The 2080 Ti is definitely faster, but I don't think it's twice as fast. The cards will work together fine as long as they're both installed; SLI isn't needed and will actually cause problems and slow them down.

    Post edited by Kitsumo on
  • outrider42 Posts: 3,679
    edited April 2019
    ebergerly said:

    I think there may be some confusion on what is under DAZ control and what is under NVIDIA/Iray control. My limited understanding of Iray is that it is a complete and full featured rendering solution that is designed for relatively small studios and software vendors who want to provide consumer-level rendering software without requiring a ton of development. It differs from some of the other rendering solutions that are used by large CG/game/whatever production companies with large staffs of developers who want low level, very custom and detailed control over all aspects of their software, and are willing to assign large teams of software developers to dig deep and develop customized solutions for changing business needs.

    My simple view of Iray is that DAZ (or whoever) merely has to, for the most part, just develop a user interface, configure their scene files in the Iray format, configure material/surface settings that are all implemented internally by Iray, and so on, and Iray does the rest. I'm sure this is a vast oversimplification, but I think the point remains. Once DAZ decides on Iray, I think they pretty much rely on it to do just about everything behind the scenes, and they just need to build the interface per Iray requirements. I assume DAZ's main focus is mainly on developing content, and as a side part of that making sure the content is all in Iray-compatible format. 

    And with all due respect, a phrase like "game changer" is, while somewhat exciting to some, ultimately meaningless. It can mean anything, and I think what's really important is how NVIDIA actually ties Iray into RTX technology in the future, and ultimately what that actually means for Studio/Iray in terms of render speeds and features. And until that happens and we can put numbers to it, it's all just guessing.   

    For now, the good thing (IMO) is that the RTX cards are cutting the render times of their GTX counterparts in half, and also offer good bang for the buck. Certainly not close to what all the hype promised last year, but there's always the chance that this will improve to some extent depending on what happens with Iray.  

    Iray is a rendering engine, just like Octane is a render engine. They do the same thing. Daz does not design Iray in any way, true, but Daz does need to implement the plugin once Nvidia provides the new SDK, and they tailor it to Daz Studio. Iray is simply another product that Nvidia sells; it is not just designed for small studios or whatever. It just turns out that it has become more niche than Nvidia probably hoped for, most likely because it is exclusive to their hardware. Luxmark is hardware agnostic, and so is Blender. Iray also allows for customization; after all, you have seen the Iray programming guide. It may not be as pliable as a game engine, but game engines are designed to be highly adjustable on purpose. They have to be in order to accommodate all the different kinds and styles of games there are, and every game has very unique demands. A first-person shooter will value high frame rates and speed, whereas a third-person exploration game may desire more atmospheric effects for mood.

    The game changer comment has weight because of just how very rare it is for anybody at Daz to talk about future product features. Daz NEVER does that. I cannot even recall a time when somebody from Daz went on record in this fashion. It shows that they are extremely confident in what is coming out of RTX.

    And they have good reason to be. RTX GPUs are already about twice as fast as last gen, and that is without their prime feature enabled. Every piece of software that has enabled RTX so far has seen massive, and I mean MASSIVE, gains. Want me to define massive? I define massive as multiple times faster than not having RTX. The typical generational leap is usually only 50%, if even that. Turning RTX on gives results in the 2-3x+ range. No generation has ever seen this much of a leap in performance. Take a look at what turning RTX on does for Octane versus turning it off, and keep in mind this chart only lists RTX cards. If the 1080 Ti were on here, it would be pretty far behind the 2080 Ti even without RTX enabled. These results are pretty common. I can't really think of any instance where turning RTX on has not resulted in a huge performance gain. 

    I believe the results seen here will be duplicated when Iray gets RTX enabled. Just look at this, the 2060 with RTX on is faster than the 2080ti without RTX!!! That is huge. Just think of that. A $350 2060 could blow the doors off the (formerly) $700 1080ti. The 2060 with RTX enabled could be rendering faster than the 2080ti is right now. That is amazing. For comparison, the 1060 certainly was not even close to the 980ti. The 960 was nowhere near the 780ti. I could go on and on about each generation. This is what I mean by a huge generational leap and why so many in the rendering community are excited. If that is not a "game changer", then what in the world is? LOL.

    And why would Iray not benefit like Octane? There is no possible way that Iray doesn't get similar benefits. Iray is not a game engine. You do not "partially" integrate ray tracing cores here. A game might only use RTX for shadows (like the ironically named "Shadow of the Tomb Raider"), or reflections like Battlefield 5, or global illumination like Metro Exodus. Each of these games only implements ray tracing for one single aspect of their lighting. Iray is fully ray traced just like Octane.

    Gaming does have a ray traced benchmark, and now that Pascal GPUs can use ray tracing in games, people have tested Pascal in ray tracing gaming benchmarks. The prime bench is 3DMark's Port Royal. The results are pretty drastic. Like Octane, the 2060 destroys the mighty 1080ti, which can barely even handle the bench. The jump to the 2080ti is quite massive here. So even in gaming, it is possible to see just how much the ray tracing cores add to RTX cards.

    Post edited by outrider42 on
  • bluejaunte Posts: 1,990

    Game changer is such a marketing term though. What game is being changed here, exactly? So you'll render faster. That's great, but it changes nothing other than having to wait less time for a render to finish. What game is changed when I have to wait 5 minutes instead of 20? I still have to wait. I still have to set up the scene; all the workflows are still the same. 

    A game changer would be if, solely because of RTX, we suddenly got a real-time viewport with the same quality as a finished render.  Or if we suddenly got completely real-time ray-traced games with the same quality as baked lightmaps. That will not happen for a while, so let's say what we have now could at least be the beginning of a game changer. But for offline rendering? I just see a very welcome speed boost, one that is indeed much higher than the usual generational steps. This does not change any game whatsoever. Game changer implies that something that was previously done in one manner can now be done in a completely different and much more comfortable manner.

  • TheKD Posts: 2,711
    edited April 2019

    For the price they are asking, I might go for it if it doubled the VRAM. The scenes I do usually have to be done in multiple passes; more VRAM would mean fewer passes. Cutting a few minutes off of each render is not going to make me go out and spend over a grand (for the 2070) on a graphics card. That's over a week's pay, and work is sporadic as an independent contractor.

    Post edited by TheKD on
  • Fauvist said:

    Yeah, I run an Iray render server, and while everyone who uses it says how much it helps, it really is quite a hard sell for people to invest in it as a service.  It's certainly a great option for people who want access to hardware otherwise out of their reach, though.

    Thank you for telling me about your Iray render server (a while ago).  The only reason I have hesitated to use the service is this: "Pay as you go - $0.40 a minute (unlimited* access billed monthly)".  This is the service I would most likely want.  The problem for me is knowing how much a render is going to cost before it's done. It's $0.40 a minute, but how many minutes does it take to render an image?  I suppose it depends on file size.

    Yeah, I think a lot of people did, which is why I now do an 'all you can eat' monthly subscription.  Drop me a PM if you want more info; I don't want to derail the thread :)

  • kyoto kid Posts: 41,861
    DAZ_Rawb said:

    Just so all of you RTX-using people are aware, there is a new version of the Daz Studio Pro Beta - 4.11.0.335:

     

    https://www.daz3d.com/forums/discussion/265581/daz-studio-pro-beta-version-4-11-0-335-updated/p1

     

    One of the things in this update is an updated Iray which adds support for the tensor cores that are part of the RTX technology. This won't accelerate rendering (it doesn't add RT core support), but it should provide a hardware speedup for the AI denoiser. So be sure to update your NVIDIA drivers and give that functionality a good beating.

    ...well, it doesn't show up in the DIM.

  • outrider42 Posts: 3,679
    edited April 2019

    Game changer is such a marketing term though. What game is being changed here, exactly? So you'll render faster. That's great, but it changes nothing other than having to wait less time for a render to finish. What game is changed when I have to wait 5 minutes instead of 20? I still have to wait. I still have to set up the scene; all the workflows are still the same. 

    A game changer would be if, solely because of RTX, we suddenly got a real-time viewport with the same quality as a finished render.  Or if we suddenly got completely real-time ray-traced games with the same quality as baked lightmaps. That will not happen for a while, so let's say what we have now could at least be the beginning of a game changer. But for offline rendering? I just see a very welcome speed boost, one that is indeed much higher than the usual generational steps. This does not change any game whatsoever. Game changer implies that something that was previously done in one manner can now be done in a completely different and much more comfortable manner.

    Real time rendering is still about *time* though, and time itself is a game changer. 20 minutes vs 5 minutes, if that's what happens, is pretty darn game changing. Time adds up fast. A lot of people don't render in 20 minutes, maybe an hour... 2, 3, I don't know. But if you drop from 2 hours to, say, 30 minutes, that may not be real time, but chopping off 1.5 hours is a game changer. For anybody who makes money creating renders, that would be a huge income boost, and I am pretty sure that would be a big deal to them. Time can equal money. Even if it's not about money, time equals more production. That cannot be argued.

    The combo of RT cores plus Tensor denoising could make animating with Iray actually feasible. The denoiser just got Tensor, that could be a big upgrade, too. After all, something has to be done to render frames fast enough for "real time". Tensor denoising is probably a key part of that feature, and here today we have it. I'm a little surprised the new beta didn't add RT cores, that would seem to indicate our wait for RT support will be longer.

    I found something interesting. According to the official release notes, the denoiser will "look different" from the previous version. So there is indeed a change to the visual quality of the denoiser in this new version. The question is...is the change for the better?

    "Denoiser now works on HDR images before tonemapping. For very bright or dark images it uses autoexposure before denoising and inverts that afterwards. Please note, that denoised images will look different from previous versions."
    Post edited by outrider42 on
  • bluejaunte Posts: 1,990

    Game changer is such a marketing term though. What game is being changed here, exactly? So you'll render faster. That's great, but it changes nothing other than having to wait less time for a render to finish. What game is changed when I have to wait 5 minutes instead of 20? I still have to wait. I still have to set up the scene; all the workflows are still the same. 

    A game changer would be if, solely because of RTX, we suddenly got a real-time viewport with the same quality as a finished render.  Or if we suddenly got completely real-time ray-traced games with the same quality as baked lightmaps. That will not happen for a while, so let's say what we have now could at least be the beginning of a game changer. But for offline rendering? I just see a very welcome speed boost, one that is indeed much higher than the usual generational steps. This does not change any game whatsoever. Game changer implies that something that was previously done in one manner can now be done in a completely different and much more comfortable manner.

     

    Real time rendering is still about *time* though, and time itself is a game changer. 20 minutes vs 5 minutes, if that's what happens, is pretty darn game changing. Time adds up fast. A lot of people don't render in 20 minutes, maybe an hour... 2, 3, I don't know. But if you drop from 2 hours to, say, 30 minutes, that may not be real time, but chopping off 1.5 hours is a game changer. For anybody who makes money creating renders, that would be a huge income boost, and I am pretty sure that would be a big deal to them. Time can equal money. Even if it's not about money, time equals more production. That cannot be argued.

     

    The combo of RT cores plus Tensor denoising could make animating with Iray actually feasible. The denoiser just got Tensor, that could be a big upgrade, too. After all, something has to be done to render frames fast enough for "real time". Tensor denoising is probably a key part of that feature, and here today we have it. I'm a little surprised the new beta didn't add RT cores, that would seem to indicate our wait for RT support will be longer.

     

    I found something interesting. According to the official release notes, the denoiser will "look different" from the previous version. So there is indeed a change to the visual quality of the denoiser in this new version. The question is...is the change for the better?

     

    "Denoiser now works on HDR images before tonemapping. For very bright or dark images it uses autoexposure before denoising and inverts that afterwards. Please note, that denoised images will look different from previous versions."

    I render for a living, so to speak. If speed were such a game changer, I would throw more hardware at it or maybe use a render farm. Nothing keeps me from spending more money to get faster rendering. A 2080 Ti costs a lot of money too, so that problem wasn't solved either. It just happens to be a good upgrade, at least for us rendering folks, from a price vs speed point of view. That's pretty much it though. This doesn't change my game, but maybe it's just semantics and we have different understandings of what exactly "game changer" means.

    Iray was a game changer for Daz Studio. It rendered much faster (with proper hardware at least), it changed the workflows of PAs towards a more modern PBR style, and it was more intuitive and more productive. It brought Daz Studio into the age of GPU rendering. That's the type of thing that I call a game changer. Fully ray-traced games are what I would call a game changer, not even so much for gamers but for game developers, who would get around a lot of annoying hackery and endless hours of lightmap baking.

  • outrider42 Posts: 3,679
    I don't know; if you told any other business they could do something 3-6 times faster, they would likely consider it a game changer for their business.

    Like, you could make your drive-through 3-6 times faster. It's proven that Americans prefer faster drive-throughs, and that kind of speed-up would equal massive profits. Chick-fil-A is doing something like this. They have redesigned their restaurants around new drive-through concepts. Mine can have 6 people outside to make sure the cars move quickly. There are many people who will drive to a less busy restaurant if they see a long line. This ensures that Chick-fil-A gets every possible customer they can. But at the end of the day, it's simply a faster drive-through.

    You could speed up customer checkouts 3-6 times; another proven stat is that customers hate long lines or slow checkouts, even online. There are ads running now for things like Visa Checkout which all push just how fast and convenient they are.

    Or an oil change 3-6 times faster; again, some businesses base their entire business around how fast they can do this and even have speed-themed names. The one that is the fastest often wins.

    Speed matters. That's why this thread exists. The typical generational leap would be one thing, but if RTX could bring speed ups like this to Daz Studio it changes things. 3D rendering by its nature is a very expensive hobby/business to get into. Ivy just posted some numbers on how much their machine can cost. RTX is poised to bring that cost down for many people. Like I showed above, the 2060 with RTX enabled blows the doors off the top GPUs in the world without RTX. A $350 GPU can beat all the top cards not just last generation, but even right now. To anyone who can only afford $350 for a GPU, that is a real game changer. Each upgrade I made personally felt dramatic. The speed changes things.

    Speed in Iray doesn't just affect your output. It can affect how fast you learn the program. A user can only learn as fast as they can render. It's cold and cruel, but it is reality. If it takes 45 minutes just to see what your image -might- look like, how can you tell what changes need to be made in a timely manner? How do you experiment with different lighting or surfaces when it takes so long to even see a result? Now if you can see those results much faster, like the render shows up and is fairly clean in just a couple of minutes, that changes how quickly you can experiment with your scenes. With this kind of power, users will feel emboldened to try new things. I myself have been hesitant to use dForce because it takes so long. Time is NOT on my side. My playtime for Daz Studio is very small. I have a full-time job that is outside the 3D world, and a life. The time I have for Daz is quite limited, so every minute saved is a big deal to me. I think people like me are going to feel the benefits of RTX the most.

    Maybe this isn't a game changer to you. But for me, this kind of performance would be a game changer.
  • ebergerly Posts: 3,255
    edited April 2019

    I tend to agree. I really think it's pretty much a game changer. But not totally a game changer, maybe only 75%. But it's certainly awesome. No doubt. And let's face it, the whole thing is a total beast. Seriously.

    And let's not forget jaw dropping. Though I'm not sure. Maybe. I think it's more of a beast than jaw dropping. But no doubt it will leave all others in the dust. 

    Post edited by ebergerly on
  • Paradigm Posts: 423
    DAZ_Rawb said:

    Just so all of you RTX-using people are aware, there is a new version of the Daz Studio Pro Beta - 4.11.0.335:

     

    https://www.daz3d.com/forums/discussion/265581/daz-studio-pro-beta-version-4-11-0-335-updated/p1

     

    One of the things in this update is an updated Iray which adds support for the tensor cores that are part of the RTX technology. This won't accelerate rendering (it doesn't add RT core support), but it should provide a hardware speedup for the AI denoiser. So be sure to update your NVIDIA drivers and give that functionality a good beating.

    !!!!!!!!!!!!!

  • bluejaunte Posts: 1,990
    I don't know; if you told any other business they could do something 3-6 times faster, they would likely consider it a game changer for their business.

    But that's not the scenario. You could already increase the speed, you just needed to throw more hardware at it. If cost isn't an issue and only speed matters, you can absolutely do that. It's what VFX studios do with their render farms.

    ebergerly said:

    I tend to agree. I really think it's pretty much a game changer. But not totally a game changer, maybe only 75%. But it's certainly awesome. No doubt. And let's face it, the whole thing is a total beast. Seriously.

    And let's not forget jaw dropping. Though I'm not sure. Maybe. I think it's more of a beast than jaw dropping. But no doubt it will leave all others in the dust. 

    Whaaaat? Did I read this right? You, our favorite hardcore pessimist, handing out warnings left and right and advocating caution with RTX not a few months ago (maybe weeks), suddenly think this is a game changer after all? Well, I'm kinda glad to hear that.

  • I don't know what "3-6 times faster" actually means.  Mathematically, this is confusing.

    Does it take 1/3rd to 1/6th the time that the old one did?

  • ebergerly Posts: 3,255
     
    ebergerly said:

    I tend to agree. I really think it's pretty much a game changer. But not totally a game changer, maybe only 75%. But it's certainly awesome. No doubt. And let's face it, the whole thing is a total beast. Seriously.

    And let's not forget jaw dropping. Though I'm not sure. Maybe. I think it's more of a beast than jaw dropping. But no doubt it will leave all others in the dust. 

    Whaaaat? Did I read this right? You, our favorite hardcore pessimist, handing out warnings left and right and advocating caution with RTX not a few months ago (maybe weeks), suddenly think this is a game changer after all? Well, I'm kinda glad to hear that.

    Actually it was a bit of sarcasm, highlighting how pointless it is to try to define marketing phrases that are ultimately meaningless, like "game changing" and "awesome" and "it's a beast". Which is why I immediately dismiss them as soon as I hear them. 

  • DAZ_Rawb Posts: 817

    I don't know what "3-6 times faster" actually means.  Mathematically, this is confusing.

    Does it take 1/3rd to 1/6th the time that the old one did?

    Both ways of describing it are correct.

     

    "3-6 times faster" means that in a given amount of time you could render 3-6 times as many frames for an animation, for example.

    On the other side, it also means that a single frame would take between 1/3 and 1/6 as long to finish rendering.

     

    So it just depends on which way you want to look at it.
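As a quick worked example (made-up render times, just to show the arithmetic): a speedup factor divides the old per-frame time, so "3-6 times faster" and "1/3 to 1/6 the time" describe the same relationship.

```python
def new_render_time(old_minutes: float, speedup: float) -> float:
    """Per-frame time after an N-times speedup: the old time divided by N."""
    return old_minutes / speedup

def frames_in(budget_minutes: float, minutes_per_frame: float) -> float:
    """How many frames fit in a fixed time budget."""
    return budget_minutes / minutes_per_frame

# Hypothetical example: a 30-minute frame, sped up 3x and 6x.
print(new_render_time(30, 3))                 # 10.0 minutes per frame
print(new_render_time(30, 6))                 # 5.0 minutes per frame
print(frames_in(60, new_render_time(30, 6)))  # 12.0 frames per hour, vs 2 before
```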

  • TheKD Posts: 2,711

    I don't know what "3-6 times faster" actually means.  Mathematically, this is confusing.

    Does it take 1/3rd to 1/6th the time that the old one did?

    Yeah, when I added a 1070 to my 1060, I was expecting a huge difference in render time, since I was more than doubling the core count. Same when I went from a 960 to a 1060; reality did not match my expectations. It seems to me that GPUs are in the same kind of slump CPUs have been in, as far as diminishing returns every generation.
