How many here are using the newer RTX GPUs?


Comments

  • ebergerly Posts: 3,255
    DAZ_Rawb said:


    On the other side, it also means that a single frame would take between 1/3rd to 1/6th as long to finish rendering.


    True, as long as you're clear on what you're comparing. Unfortunately people throw around the term "faster" without explaining WHAT it's faster than. As long as you're clear on "it's faster than this GPU on this particular scene with these settings", then it's meaningful. But if you just make a general statement that "this thing is 10x faster", as is often done in the tech world, then it's pretty meaningless. 

    And the most meaningless of statements is something like "it's UP TO 10x faster"  laugh

  • bluejaunte Posts: 1,990
    ebergerly said:

    I tend to agree. I really think it's pretty much a game changer. But not totally a game changer, maybe only 75%. But it's certainly awesome. No doubt. And let's face it, the whole thing is a total beast. Seriously.

    And let's not forget jaw dropping. Though I'm not sure. Maybe. I think it's more of a beast than jaw dropping. But no doubt it will leave all others in the dust. laugh 

    Whaaaat? Did I read this right? You, our favorite hardcore pessimist, handing out warnings left and right and advocating caution with RTX not a few months ago (maybe weeks), suddenly think this is a game changer after all? Well I'm kinda glad to hear that laugh

    Actually it was a bit of sarcasm, highlighting how pointless it is to try to define marketing phrases that are ultimately meaningless, like "game changing" and "awesome" and "it's a beast". Which is why I immediately dismiss them as soon as I hear them. 

    I think we're outdoing each other on sarcasm laugh

    Still though, you do sound a bit more positive about it all in recent days.

  • ebergerly Posts: 3,255
    bluejaunte said:

    I think we're outdoing each other on sarcasm laugh

    Still though, you do sound a bit more positive about it all in recent days.

    I'm positive when the facts, not the hype, support optimism. As soon as I learned that Iray would continue to be supported, and RTX would be supported in Iray, and we got actual render times showing that the RTX cards are cutting render times in half (rather than the more typical 30% improvement of previous generations), I became a bit more optimistic. I'm really hoping that we can now have realtime Iray previews thanks to the denoising and other tech in RTX. And maybe even realtime cloth sims. Now THAT would be game changing for me laugh

  • Kitsumo Posts: 1,222

    I don't know what "3-6 times faster" actually means.  Mathematically, this is confusing.

    Does it take 1/3rd to 1/6th the time that the old one did?

    Thank you! I was wondering if I was the only one who hated that expression. Grammatically it is confusing. Mathematically, I think it's a bunch of horse poop. I don't think you can define "faster" until you have a benchmark for what "fast" is. I guess it's safe to assume they mean 1/3rd to 1/6th the time of the old one. If they'd said "3 to 6 times less" I would have picked up my monitor and thrown it across the room.

    As far as the game changer debate, I think it would be a game changer if they only bumped the prices 10 or 15 percent over the 10 series. This is a huge step, but it's not gonna make everyone rush out and buy new cards. For most of us, the game is still the same for now.

  • Paradigm Posts: 423
    edited April 2019
    Kitsumo said:

    I don't know what "3-6 times faster" actually means.  Mathematically, this is confusing.

    Does it take 1/3rd to 1/6th the time that the old one did?

    Thank you! I was wondering if I was the only one who hated that expression. Grammatically it is confusing. Mathematically, I think it's a bunch of horse poop. I don't think you can define "faster" until you have a benchmark for what "fast" is. I guess it's safe to assume they mean 1/3rd to 1/6th the time of the old one. If they'd said "3 to 6 times less" I would have picked up my monitor and thrown it across the room.

    As far as the game changer debate, I think it would be a game changer if they only bumped the prices 10 or 15 percent over the 10 series. This is a huge step, but it's not gonna make everyone rush out and buy new cards. For most of us, the game is still the same for now.

    Render a scene on one card. Mark the time. Render the same scene on the other card. Mark the time.

    There's nothing really grammatically or mathematically confusing as far as I see it. If one takes 9 minutes and the other takes 3 minutes, the second was 3x faster.
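
    To spell that arithmetic out (using just the example numbers above, nothing card-specific), here's a quick sketch:

    # Speedup arithmetic for the 9-minute vs 3-minute example.
    old_time = 9.0   # minutes on the slower card
    new_time = 3.0   # minutes on the faster card

    speedup = old_time / new_time    # 3.0 -> "3x faster"
    fraction = new_time / old_time   # 0.33 -> takes 1/3rd the time

    print(f"{speedup:.0f}x faster; the new render takes {fraction:.0%} of the old time")
    # 3x faster; the new render takes 33% of the old time

    So "3x faster" and "takes 1/3rd the time" are the same statement; the confusion only starts when nobody says what the baseline was.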

    Post edited by Paradigm on
  • kyoto kid Posts: 41,928

    ...still not convinced about spending big bucks.

    I don't know, if you told any other business they could do something 3-6 times faster they would likely consider it a game changer for their business.

    But that's not the scenario. You could already increase the speed, you just needed to throw more hardware at it. If cost isn't an issue and only speed matters, you can absolutely do that. It's what VFX studios do with their render farms.

    ....and for many of us therein lies the rub. 

    As Gus Grissom in The Right Stuff said, "No Bucks, No Buck Rogers."  Most of us are hobbyists who either do this in our spare time and/or are on limited budgets.

  • kyoto kid Posts: 41,928
    ebergerly said:
    DAZ_Rawb said:


    On the other side, it also means that a single frame would take between 1/3rd to 1/6th as long to finish rendering.


    True, as long as you're clear on what you're comparing. Unfortunately people throw around the term "faster" without explaining WHAT it's faster than. As long as you're clear on "it's faster than this GPU on this particular scene with these settings", then it's meaningful. But if you just make a general statement that "this thing is 10x faster", as is often done in the tech world, then it's pretty meaningless. 

    And the most meaningless of statements is something like "it's UP TO 10x faster"  laugh

    ...I remember a similar claim with Reality 4.x and the (then) new version of LuxRender that was integrated into it.  I was getting about a 2x - 3x speed boost at best, but saw that it came at the cost of image quality (and was still slower than either 3DL with UE or Iray in CPU mode).. 

  • ebergerly Posts: 3,255
    edited April 2019

    Okay everyone, stop the presses....laugh

    Not only is RTX a game changer, I just saw an NVIDIA ad that said

    "RTX. It's ON".


    Wow. That pretty much settles it for me. I gotta get me one of them RTX things. 

    And I also saw a video saying that the RTX 2080ti is 10-15 times faster than the GTX 1080ti. Obviously, my 15 minute 1080ti renders will now render in 1 minute. 

    Case closed. 

    Post edited by ebergerly on
  • Robinson Posts: 751
    joseft said:

    Redshift 3.0 is due any day now, and early support for RT is coming with it. I have not checked in on Octane recently, but I know they have also had internal beta builds supporting RT hardware for months now, just not sure if they have released publicly yet.

    I've read good things about Redshift but I'm not sure how the pipeline to use it would work starting from Daz.  Would probably involve Maya.   I have the Maya character exporter but there are problems with the pipeline, such as geografts and so forth.  You still have a lot of work to do to be productive with it.

  • Joe.Cotter Posts: 3,362
    edited April 2019

    Forum's messing up and creating multiple new posts instead of just editing :/

    Post edited by Joe.Cotter on
  • Joe.Cotter Posts: 3,362
    edited April 2019

    double post...

    Post edited by Joe.Cotter on
  • Joe.Cotter Posts: 3,362
    edited April 2019
    ebergerly said:
    ...benefits of powerful local GPUs: ... near realtime scene preview while building scenes.

    True, but one doesn't need nearly as much power to render previews as a final image, much less an animated sequence. As powerful as new GPUs might be, I don't see anyone doing animation on them for a reasonable cost.

    ebergerly said:
    Add to that concerns over privacy and copyright (real or perceived) and I think cloud based rendering will remain fairly niche.

    If you looked at what I was linking to, you would see it isn't a standard render farm. It uses AWS to basically create your own render farm in the cloud as an isolated instance, so those concerns don't really come into play.

    The other thing to take note of is that the cost of this solution looks to be a fraction of a typical render-farm solution at the moment. This, combined with the speed, opens up a lot of possibilities, such as animation versus the still images that are typical now.

    Note: I haven't tested this yet, so I can't say how feasible it is. It just looks interesting and possibly promising. I do agree that people will want to render locally going forward, but I also believe a lot will be interested in a cloud solution at some point. Just my opinion for whatever it is/isn't worth. ;)
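
    For a rough sense of the trade-off, here's a back-of-envelope sketch; the hourly rate, card price, and usage figures are placeholder assumptions for illustration, not actual AWS quotes:

    # Back-of-envelope: renting cloud GPU time vs. buying a card outright.
    # All three inputs below are assumptions, not real prices.
    cloud_rate = 3.00       # assumed $/hour for a GPU cloud instance
    card_price = 1200.00    # assumed price of a high-end local GPU
    hours_per_month = 40    # assumed GPU-hours rendered per month

    monthly_cloud_cost = cloud_rate * hours_per_month
    months_to_break_even = card_price / monthly_cloud_cost
    print(f"cloud: ${monthly_cloud_cost:.0f}/month; "
          f"a ${card_price:.0f} card breaks even after ~{months_to_break_even:.0f} months")

    Light users come out ahead renting; heavy users amortize the card quickly. That's really the whole cloud-vs-local question in two lines.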


    Post edited by Joe.Cotter on
  • Joe.Cotter Posts: 3,362

    Yea, I run an Iray render server ...

    I didn't realize that. It could be handy for Iray renders. I'm interested ;)

  • Taoz Posts: 10,269
    Paradigm said:
    DAZ_Rawb said:

    Just so all of you RTX-using people are aware, there is a new version of the Daz Studio Pro Beta - 4.11.0.335:


    https://www.daz3d.com/forums/discussion/265581/daz-studio-pro-beta-version-4-11-0-335-updated/p1


    One of the things in this update is an updated Iray which adds support for the tensor cores that are part of the RTX technology. This won't accelerate rendering (it doesn't add RT core support), but it should provide a hardware speedup for the AI denoiser. So be sure to update your NVIDIA drivers and give that functionality a good beating.

    !!!!!!!!!!!!!

    Hm, it only takes a few seconds for those renders I've post-denoised so far, so not much saved here I think?

  • Robinson Posts: 751
    Taoz said:
    Paradigm said:

    Hm, it only takes a few seconds for those renders I've post-denoised so far, so not much saved here I think?

    You converge more quickly with the denoiser enabled, so it's not just the few seconds of post render denoise you have to compare against.  At least that's how I understand it.  I think to properly compare you should probably render to 95% with it enabled and disabled and see how long it took.  I may do this later.
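
    If anyone wants to structure that test, the shape of it is something like the sketch below. run_render() is a hypothetical stand-in for however you actually kick the render off to 95% convergence (I don't know of a scriptable hook for that in Daz Studio); the timing logic itself is real:

    import time

    def run_render(denoiser_enabled: bool) -> None:
        # Hypothetical placeholder: render the identical scene to 95%
        # convergence with the denoiser on or off. Replace with your
        # actual workflow.
        raise NotImplementedError

    def timed_render(denoiser_enabled: bool) -> float:
        # Wall-clock time for one full render pass.
        start = time.perf_counter()
        run_render(denoiser_enabled)
        return time.perf_counter() - start

    t_off = timed_render(denoiser_enabled=False)
    t_on = timed_render(denoiser_enabled=True)
    print(f"denoiser off: {t_off:.0f}s, on: {t_on:.0f}s, ratio: {t_off / t_on:.2f}x")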

  • Taoz Posts: 10,269
    Robinson said:
    Taoz said:
    Paradigm said:

    Hm, it only takes a few seconds for those renders I've post-denoised so far, so not much saved here I think?

    You converge more quickly with the denoiser enabled, so it's not just the few seconds of post render denoise you have to compare against.  At least that's how I understand it.  I think to properly compare you should probably render to 95% with it enabled and disabled and see how long it took.  I may do this later.

    OK, I think I've heard others say that if you start denoising early it will slow down rendering. I guess only testing can tell.

  • Robinson Posts: 751
    edited April 2019
    Taoz said:
    Robinson said:

    OK, I think I've heard others say that if you start denoising early it will slow down rendering. I guess only testing can tell.

    Probably true, but "early" is likely quite a low number in terms of iterations. I mean, a start iteration of, say, 100 is OK. But yes, it needs testing.

    Post edited by Robinson on
  • outrider42 Posts: 3,679
    I don't know, if you told any other business they could do something 3-6 times faster they would likely consider it a game changer for their business.

    But that's not the scenario. You could already increase the speed, you just needed to throw more hardware at it. If cost isn't an issue and only speed matters, you can absolutely do that. It's what VFX studios do with their render farms.


    But if you throw enough hardware at Iray you can just about render in real time. This sort of argument fails for that reason.

    It becomes a game changer because it lowers the bar of entry for everyone. 3D is a very difficult thing to get into. If you lack the money for a decent computer, you are pretty handicapped. It is NOT simply render speed. Being able to render quickly also helps people to learn the program itself. You have had two 1080ti's, and this has allowed you to work on testing your models. You can more quickly spot things you do not like, stop the render, and make a change. Other people with lesser hardware do not have that luxury. People with slow hardware might wait for 30 or 45 minutes before they even see a partial image, let alone a full render. So if something is wrong, they need to stop, fix, and start all over. I really don't know how anybody using CPU can handle working with Iray, because I would have given up ages ago and moved on to another hobby if that was me.

    I think a lot of people overlook this. Learning is about repetition. Painters paint, over and over. Photographers snap pictures, lots and lots of them. And this applies to 3D, you need to render, lots and lots of renders. But with poor hardware you simply cannot do that. Crappy hardware stunts your ability to learn. Just to be clear, I am not saying the tools are more important than the artist, but when it comes to 3D, the tools are quite important and can affect your growth.

    When the 2060 gets RTX support, it should easily beat the very fastest Pascal GPUs ever made: the Titan Xp and the 1080ti. It will even beat the Titan Volta! Not only that, but the 2060 with RTX enabled even beats the 2080ti without RTX. Think about that. As fast as the 2080ti can render right now, the 2060 with RTX enabled could render even faster. This is not me pulling numbers out of my bum. Again, this is based on the RTX numbers from Octane, and as I said before, I see no reason why Iray would be any different. The 2060 is just $350, so for less than half of what the 1080ti launched at, you will be able to get that kind of performance. And of course each RTX card takes it a step further, to where the 2080ti gains almost 3X over its non-RTX score.

    That...is pretty game changing. That's like 3 or even 4 generational leaps at once.

  • bluejaunte Posts: 1,990

    Yeah definitely, I can agree that someone who might be able to suddenly render much faster at an affordable price could see this as a game changer. Like I said I think it's a matter of semantics and perspective. I think we can agree that it's just "pretty damn awesome" if it pans out laugh

  • Since I am getting an RTX 2080 Ti installed as we speak, I wondered what to expect. Any issues? I know they only work with the beta DS version and Iray...

    Any gamers in the group have any concerns? Issues? Happy incidents?

    Going from a GTX 970 with 4 GB of VRAM to this is kind of exciting for me. Hopefully it won't let me down. I have scenes set up that I could never render before, not to mention a few newer games on my Steam wishlist I am dying to check out.

    The performance will blow you away. I upgraded from 3 GTX 970s to a 2080 Ti. Remember to keep the GTX 970 in the system. It will still speed up your renders by maybe 15 to 20 percent with the 2080 Ti.

    For ray tracing games, most of the people I know turn it off.  It's not worth the substantial drop in fps.
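
    If you're curious why an old card alongside a new one only adds 15 to 20 percent, a rough parallel-render estimate makes it clear. The two standalone render times below are assumptions for illustration:

    # Two GPUs splitting iterations: combined rate is the sum of the rates.
    # Assumed standalone times: 2080 Ti = 10 min, GTX 970 = 60 min.
    t_fast, t_slow = 10.0, 60.0
    combined = 1 / (1 / t_fast + 1 / t_slow)   # harmonic combination of rates
    gain = t_fast / combined - 1
    print(f"together: {combined:.1f} min, ~{gain:.0%} faster than the fast card alone")
    # together: 8.6 min, ~17% faster -- in line with the 15-20% figure above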

  • fred9803 Posts: 1,565

    The performance will blow you away. I upgraded from 3 GTX 970s to a 2080 Ti. Remember to keep the GTX 970 in the system. It will still speed up your renders by maybe 15 to 20 percent with the 2080 Ti.

    Yeh I went from a single 960 to a 2080 and am very pleased with the better render times.... about 2 hours down to 30 minutes for the same scene. I guess it's down to more GPU memory and not switching to CPU.

  • dougj Posts: 92

    I swapped out a 4 GB Quadro K2200 for an RTX 2060 and I'm very happy with the performance so far. One thing that I've noticed is that Daz Studio doesn't manage GPU memory efficiently. If I make a few minor changes to a scene, e.g. tweak the render settings, and re-render a bunch of times, Daz Studio will eventually fill up the GPU memory and switch to CPU only. When that happens, I just save the scene, restart Daz Studio and load the scene, then it renders fine with the GPU again. I wonder if that's a common problem or if it only happens on my PC.
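
    One way to keep an eye on that is to poll VRAM usage between renders; a minimal sketch, assuming nvidia-smi is on your PATH (the NVIDIA driver install normally takes care of that):

    import subprocess

    def gpu_memory_used_mib() -> list[int]:
        # Ask nvidia-smi for used VRAM per GPU, in MiB, as bare numbers.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        return [int(line) for line in out.strip().splitlines()]

    print(gpu_memory_used_mib())  # e.g. [2153] -- check before/after each re-render

    If the number keeps climbing across tweak-and-rerender cycles and never comes back down, that's the point to save, restart Daz Studio, and reload before the render spills to CPU.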

  • kyoto kid Posts: 41,928
    I don't know, if you told any other business they could do something 3-6 times faster they would likely consider it a game changer for their business.

    But that's not the scenario. You could already increase the speed, you just needed to throw more hardware at it. If cost isn't an issue and only speed matters, you can absolutely do that. It's what VFX studios do with their render farms.


    But if you throw enough hardware at Iray you can just about render in real time. This sort of argument fails for that reason.

    It becomes a game changer because it lowers the bar of entry for everyone. 3D is a very difficult thing to get into. If you lack the money for a decent computer, you are pretty handicapped. It is NOT simply render speed. Being able to render quickly also helps people to learn the program itself. You have had two 1080ti's, and this has allowed you to work on testing your models. You can more quickly spot things you do not like, stop the render, and make a change. Other people with lesser hardware do not have that luxury. People with slow hardware might wait for 30 or 45 minutes before they even see a partial image, let alone a full render. So if something is wrong, they need to stop, fix, and start all over. I really don't know how anybody using CPU can handle working with Iray, because I would have given up ages ago and moved on to another hobby if that was me.

    I think a lot of people overlook this. Learning is about repetition. Painters paint, over and over. Photographers snap pictures, lots and lots of them. And this applies to 3D, you need to render, lots and lots of renders. But with poor hardware you simply cannot do that. Crappy hardware stunts your ability to learn. Just to be clear, I am not saying the tools are more important than the artist, but when it comes to 3D, the tools are quite important and can affect your growth.

    When the 2060 gets RTX support, it should easily beat the very fastest Pascal GPUs ever made: the Titan Xp and the 1080ti. It will even beat the Titan Volta! Not only that, but the 2060 with RTX enabled even beats the 2080ti without RTX. Think about that. As fast as the 2080ti can render right now, the 2060 with RTX enabled could render even faster. This is not me pulling numbers out of my bum. Again, this is based on the RTX numbers from Octane, and as I said before, I see no reason why Iray would be any different. The 2060 is just $350, so for less than half of what the 1080ti launched at, you will be able to get that kind of performance. And of course each RTX card takes it a step further, to where the 2080ti gains almost 3X over its non-RTX score.

    That...is pretty game changing. That's like 3 or even 4 generational leaps at once.

    ...some of us had no choice but to deal with the slowness, particularly when GPU prices skyrocketed because of the cryptomining fiasco. Frustrating at times? You betcha, to the point that I began following the 3DL tips and tricks threads and pushed that engine as much as I could (with the help of new utilities and scripts released by people like Parris and Wowie).

    While I am still working with relatively old gear, I was able to step up to a Titan X (Maxwell) which compared to rendering in CPU mode all these years is a vast improvement. Yeah, maybe not as fast as the RTX series, but still a heck of a lot better than what I was struggling with beforehand. 

    Short of some kind of windfall, a SOTA system with the latest hardware is little more than wishful thinking for many of us (even $350 can be a daunting amount when you are barely scraping by month to month).

    @fred9803: One of my contentions all along. If the scene dumps to the CPU because the card cannot hold it in VRAM, all those CUDA, Tensor, and RT cores are pretty useless. Having enough VRAM also translates to rendering speed.

  • Yea, I run an Iray render server ...

    I didn't realize that. It could be handy for Iray renders. I'm interested ;)

    Check your PMs.

  • joseft Posts: 310
    Robinson said:
    joseft said:

    Redshift 3.0 is due any day now, and early support for RT is coming with it. I have not checked in on Octane recently, but I know they have also had internal beta builds supporting RT hardware for months now, just not sure if they have released publicly yet.

    I've read good things about Redshift but I'm not sure how the pipeline to use it would work starting from Daz.  Would probably involve Maya.   I have the Maya character exporter but there are problems with the pipeline, such as geografts and so forth.  You still have a lot of work to do to be productive with it.


    Redshift is brilliant. Hands down the best renderer I have used. But yes, currently, people who do most of their work inside Daz Studio cannot use Redshift without exporting scenes out to software that has a Redshift plugin.

    However, if you are not already aware, there is a fancy plugin for Maya currently in development (Daz are assisting with the development) that is going to bridge that gap quite nicely. See this thread:

    https://www.daz3d.com/forums/discussion/278111/dex-dson-exchange-plugin-for-maya/p1

  • Robinson Posts: 751
    edited April 2019
    joseft said:
    Redshift is brilliant. Hands down the best renderer I have used. But yes, currently, people who do most of their work inside Daz Studio cannot use Redshift without exporting scenes out to software that has a Redshift plugin.

    However, if you are not already aware, there is a fancy plugin for Maya currently in development (Daz are assisting with the development) that is going to bridge that gap quite nicely. See this thread:

    https://www.daz3d.com/forums/discussion/278111/dex-dson-exchange-plugin-for-maya/p1

    This is going to be a must-buy for me. Also, I couldn't get the Redshift plugin to work in Maya. I installed it and ran the batch file, but when I try to enable the plugin, Maya crashes hard. I haven't had time recently to look further into it...

    Post edited by Robinson on
  • I've been watching this thread with a lot of interest. Now, some of you know me well enough to know that I'm more likely to say "spend the money" when there's a good business case for it. Buying more hard drives for backups, for example. Spending another couple hundred to go with 64GB of RAM rather than 32. Upgrading my SSDs to make sure I have the capacity to do the things I want to do. Buying a 6- or 8-core CPU rather than a 2-core.

    I'll NEVER say "I can't afford it".  I might not be able to afford it "right now", but I'll damned well find a way.

    That said, I don't like hype either, and I usually roll my eyes at anything and anybody who says something is "game changing".  But in the case of this thread I've withheld my eye-rolling so as to hear all sides first. 

    Outrider's post (which I've quoted, snipped, and commented on below) really cuts through all the chaff for me.  It makes "I can't afford it" seem like a weak argument.  Yeah, I get that some of us are on fixed incomes and all, but Outrider really makes some great points, and it's all math-based, if you think about it. 

    I've decided that I need to move forward faster on my plan to replace both of my GTX 980s with RTX 2080TI cards.


    It becomes a game changer because it lowers the bar of entry for everyone. 3D is a very difficult thing to get into. If you lack the money for a decent computer, you are pretty handicapped. It is NOT simply render speed. Being able to render quickly also helps people to learn the program itself. You have had two 1080ti's, and this has allowed you to work on testing your models. You can more quickly spot things you do not like, stop the render, and make a change.

    The above highlighted comment really spoke to me.  He's right, folks.  My workstation has 6C/12T and 64 GB RAM, so it's actually in great shape, EXCEPT for the 980 GPUs.  Reading the comment above made me have a "come to Jesus" moment and realize that it's long past time for me to make some plans to upgrade those GPUs.

    Other people with lesser hardware do not have that luxury. People with slow hardware might wait for 30 or 45 minutes before they even see a partial image, let alone a full render. So if something is wrong, they need to stop, fix, and start all over. I really don't know how anybody using CPU can handle working with Iray, because I would have given up ages ago and moved on to another hobby if that was me.

    Just my opinion, but this point has been made brilliantly. Reducing render time cuts down on low-quality time (waiting for renders) and at least opens up more quality time (time spent improving one's craft). Neither quality time nor quantity time is better than the other. But having said that, they are completely and permanently intertwined and dependent on each other, and that's basically what Outrider has said here.

    I think a lot of people overlook this. Learning is about repetition. Painters paint, over and over. Photographers snap pictures, lots and lots of them. And this applies to 3D, you need to render, lots and lots of renders. But with poor hardware you simply cannot do that. Crappy hardware stunts your ability to learn.

    This is merely the conclusion, but it's brilliant and I think it's right on target.

    Just to be clear, I am not saying the tools are more important than the artist, but when it comes to 3D, the tools are quite important and can affect your growth.

    Nobody is saying that you're a bad person if you can't afford better hardware.  And nobody is saying that people with the best hardware are guaranteed to grow their skills and craft to their fullest potential, either.  It all comes down to what each of us does with our gifts, abilities, equipment, and DESIRE.  But in every respect, this statement above is 100% true.  Your tools DO matter.

    When the 2060 gets RTX support, it should easily beat the very fastest Pascal GPUs ever made: the Titan Xp and the 1080ti. It will even beat the Titan Volta! Not only that, but the 2060 with RTX enabled even beats the 2080ti without RTX. Think about that. As fast as the 2080ti can render right now, the 2060 with RTX enabled could render even faster. This is not me pulling numbers out of my bum. Again, this is based on the RTX numbers from Octane, and as I said before, I see no reason why Iray would be any different. The 2060 is just $350, so for less than half of what the 1080ti launched at, you will be able to get that kind of performance. And of course each RTX card takes it a step further, to where the 2080ti gains almost 3X over its non-RTX score.

    That...is pretty game changing. That's like 3 or even 4 generational leaps at once.

    Yes, I agree now.  When I do my finances this weekend, I'm going to see if I can start with an upgrade for one of my 980 cards.

  • ebergerly Posts: 3,255

    I suppose another option, rather than rely on expensive hardware, is to investigate the world of digital compositing and really improve your craft and cut down render times (and do fancy effects and make changes in real time rather than have to spend long hours rendering):

    https://www.youtube.com/watch?v=gYu4esqvnQ0

  • Subtropic Pixel Posts: 2,388
    edited April 2019
    ebergerly said:

    I suppose another option, rather than rely on expensive hardware, is to investigate the world of digital compositing and really improve your craft and cut down render times (and do fancy effects and make changes in real time rather than have to spend long hours rendering):

    https://www.youtube.com/watch?v=gYu4esqvnQ0

    This is good too!  Rendering will always constrain us in some way or another, at least for the foreseeable future.  And no matter how fast our GPUs get, sometimes there are things that compositing is just better at doing. 

    Finding more efficient "other ways" is also a critical part of honing one's craft. But we still shouldn't try to chintz our way through this without good equipment. I would rather have the "composite/don't composite" decision be based on what is better for the art and/or the efficiency of the overall creative process, and not have it be a forced decision because I've put myself into a position of having no other choice.

    Post edited by Subtropic Pixel on
  • nicstt Posts: 11,715
    That said, I don't like hype either, and I usually roll my eyes at anything and anybody who says something is "game changing".  But in the case of this thread I've withheld my eye-rolling so as to hear all sides first. 

    I have been rolling my eyes; until it is demonstrated, I will continue to believe that hype plays its part.

    With Octane, sure, it looks promising, and I've been exploring it (I bought a subscription). But with the effort involved, I might as well use Octane with Blender, or even Cycles and Eevee.

    However, if Daz gets the functionality that 20-series cards might offer, then I'll be getting that Titan. A year ago I started putting £250 away a month; it matures before the end of the month. Performance and RAM: the only thing not to like is the price.
