GTX 1080 vs. GTX 1080Ti

I have been comparing these two. There seems to be a huge improvement in performance for a modest price increase.

GTX 1080 vs. GTX 1080 Ti: 2,560 CUDA cores vs. 3,584 CUDA cores.
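For a rough sense of what those core counts mean, here is a small Python sketch comparing the two cards using NVIDIA's published reference figures. The cores × clock ratio is only a naive peak-throughput estimate, not a render benchmark; partner cards clock differently from the reference boost clocks assumed here.

```python
# Spec comparison using NVIDIA's published reference figures (assumption:
# reference boost clocks; board-partner cards clock differently).
specs = {
    "GTX 1080":    {"cuda_cores": 2560, "boost_mhz": 1733, "vram_gb": 8},
    "GTX 1080 Ti": {"cuda_cores": 3584, "boost_mhz": 1582, "vram_gb": 11},
}

def peak_ratio(a, b):
    """Naive peak-throughput ratio: cores x clock (ignores memory bandwidth)."""
    fa = specs[a]["cuda_cores"] * specs[a]["boost_mhz"]
    fb = specs[b]["cuda_cores"] * specs[b]["boost_mhz"]
    return fa / fb

# Roughly a quarter to a third more raw compute, plus 3 GB more VRAM.
print(f"{peak_ratio('GTX 1080 Ti', 'GTX 1080'):.2f}x")
```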

Has anybody used this card?

Comments

  • nicstt Posts: 11,715

    Performance-wise, there really is no comparison; taking price into account there is one, although the price difference is a lot less than expected.

    I seem to remember someone posting that they had one. I am currently considering adding a 1080 Ti to my current 980 Ti and 970 setup; the only issue is I'll need a new PSU, and as my motherboard is showing signs of giving up, I'm waiting.

    Check out one of the 1080 threads in the Daz Studio forum; I've a feeling that is where I saw it.

  • Richard Haseltine Posts: 109,502

    I think someone posted to the benchmarks thread (Sickle Yield's sample scene thread).

  • Silver Dolphin Posts: 1,640

    It is not just the speed you should be looking at but the VRAM: one has 8 GB, the other 11 GB. You can do more with the 1080 Ti. If you are in need of speed, save up and buy another.

  • kyoto kid Posts: 42,135

    ...yeah, the 1080 Ti is definitely a sizeable step above the 1080.  I am considering it, as it would handle a major percentage (something like 90%) of the scenes I create.  Until Nvidia released the Titan Xp today, the 1080 Ti had the same number of CUDA cores along with better memory base speed, bandwidth, and data rate than its pricier sibling.  The new Titan Xp has all the SMs unlocked to offer 3,840 cores, but VRAM remains at 12 GB, so still not enough of an edge over the 1080 Ti in my book to warrant spending the extra $500 (I was expecting the memory to be bumped up to 16 GB).

  • CypherFOX Posts: 3,401
    edited April 2017

    Greetings,

    I want it, just so that I can use the Ti as my primary video card, my current 1080 as the spare, and then have a full 8GB available on both cards for scenes.  Right now my display seems to take up 1G of video memory by itself, so I've only got ~7GB available.  Then there's the performance boost...  But honestly, I should get a better PC first. :(

    --  Morgan

    p.s. nVidia is evidently releasing Mac OS X drivers for the Pascal generation.  This is either a tacit approval of the external case folks, or a move to show Apple that they're still available for the next generation of Mac hardware...

  • kyoto kid Posts: 42,135
    edited April 2017

    ...I read that.  Would make the next generation Mac Pro attractive for those who can afford one (well, if they also put it in a "conventional" case again instead of a coffee can, what were they thinking?).

    If I do ever get a 1080 Ti, I would use my current card to run just the displays and the Ti exclusively for rendering.

  • CypherFOX Posts: 3,401

    Greetings,

    kyoto kid said:

    ...I read that.  Would make the next generation Mac Pro attractive for those who can afford one (well, if they also put it in a "conventional" case again instead of a coffee can, what were they thinking?).

    If I do ever get a 1080 Ti, I would use my current card to run just the displays and the Ti exclusively for rendering.

    One conventional-ish case, or at least 'modular', coming up! :)

    --  Morgan

     

  • kyoto kid Posts: 42,135

    ...I sure hope so.  I was in a second-hand store last week and saw an old Mac Pro case.

  • Richard Haseltine Posts: 109,502
    CypherFOX said:

    Greetings,

    I want it, just so that I can use the Ti as my primary video card, my current 1080 as the spare, and then have a full 8GB available on both cards for scenes.  Right now my display seems to take up 1G of video memory by itself, so I've only got ~7GB available.  Then there's the performance boost...  But honestly, I should get a better PC first. :(

    Don't forget that Windows 10, at least, reserves RAM on all cards against possible connection of a display - the only way to stop that, as far as I know, is to have a compute-only card like a Tesla, which is pricey in itself and forces you down the Quadro route for everything. Presumably a card with fewer potential connections would lose less RAM than one with many, but I don't know how much (if at all) they vary in that respect.

  • Silver Dolphin Posts: 1,640
    CypherFOX said:

    Greetings,

    I want it, just so that I can use the Ti as my primary video card, my current 1080 as the spare, and then have a full 8GB available on both cards for scenes.  Right now my display seems to take up 1G of video memory by itself, so I've only got ~7GB available.  Then there's the performance boost...  But honestly, I should get a better PC first. :(

    Don't forget that Windows 10, at least, reserves RAM on all cards against possible connection of a display - the only way to stop that, as far as I know, is to have a compute-only card like a Tesla, which is pricey in itself and forces you down the Quadro route for everything. Presumably a card with fewer potential connections would lose less RAM than one with many, but I don't know how much (if at all) they vary in that respect.

    Or use Windows 7 Pro 64-bit. I have an extra gaming PC with Win10 and don't like it. I got Win10 free like everyone else, but the operating system is too Big Brother. I turned off as much as I could, but MS really wants to know what I'm doing on my PC, which is an invasion of privacy. Big business seems to be going down the same road as government (we want to know what you are doing). Soon we will all be scanned and barcoded. OK, I'm taking off my tin foil hat now : )

  • I am looking at upgrading my Iray abilities and am interested in the 1080 Ti. I already have two 980 Tis and have maxed out my motherboard. Does anyone have suggestions for a motherboard that would support up to four Nvidia cards?

  • kyoto kid Posts: 42,135
    CypherFOX said:

    Greetings,

    I want it, just so that I can use the Ti as my primary video card, my current 1080 as the spare, and then have a full 8GB available on both cards for scenes.  Right now my display seems to take up 1G of video memory by itself, so I've only got ~7GB available.  Then there's the performance boost...  But honestly, I should get a better PC first. :(

    Don't forget that Windows 10, at least, reserves RAM on all cards against possible connection of a display - the only way to stop that, as far as I know, is to have a compute-only card like a Tesla, which is pricey in itself and forces you down the Quadro route for everything. Presumably a card with fewer potential connections would lose less RAM than one with many, but I don't know how much (if at all) they vary in that respect.

    ...just one more reason I have no desire to put W10 on any system I have. W7 has an almost negligible footprint on GPU memory. I've had updating disabled since October 1st, when MS went to the rollup format for updates, and have no malware issues thanks to a very "beefy" firewall. As I am not into games, I have no need for DirectX 12 support.
  • nicstt Posts: 11,715
    edited April 2017

    I'm trying out W10 Pro on my main comp; I've shut down all the spyware crap and it's not too bad; 3 MB seems to be reserved, but tbh, that is negligible. Not sure yet if I'll switch back or not.

  • Gator Posts: 1,320
    kyoto kid said:
    CypherFOX said:

    Greetings,

    I want it, just so that I can use the Ti as my primary video card, my current 1080 as the spare, and then have a full 8GB available on both cards for scenes.  Right now my display seems to take up 1G of video memory by itself, so I've only got ~7GB available.  Then there's the performance boost...  But honestly, I should get a better PC first. :(

    Don't forget that Windows 10, at least, reserves RAM on all cards against possible connection of a display - the only way to stop that, as far as I know, is to have a compute-only card like a Tesla, which is pricey in itself and forces you down the Quadro route for everything. Presumably a card with fewer potential connections would lose less RAM than one with many, but I don't know how much (if at all) they vary in that respect.

     

    ...just one more reason I have no desire to put W10 on any system I have. W7 has an almost negligible footprint on GPU memory. I've had updating disabled since October 1st, when MS went to the rollup format for updates, and have no malware issues thanks to a very "beefy" firewall. As I am not into games, I have no need for DirectX 12 support.

    Thinking about it, it doesn't sound like a big deal.

    You're limited to the amount of RAM on the lowest card.  Say you have two 12 GB cards, one connected to displays and the other not: if 11 GB were available on the display card and 12 GB on the other, you would still be limited to 11 GB overall, the lowest card.
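    A sketch of the point above (the free-VRAM figures are hypothetical examples): with multiple GPUs, Iray duplicates the whole scene on every participating card, so the card with the least free VRAM sets the ceiling for scene size.

```python
# Sketch: the scene must fit on every rendering card, so the minimum
# free VRAM across the cards is the effective budget for the scene.
def usable_scene_budget(free_vram_gb):
    """Return the scene-size ceiling across all participating cards."""
    return min(free_vram_gb)

# e.g. two 12 GB cards, one driving displays (1 GB lost to the desktop):
budget = usable_scene_budget([11.0, 12.0])
print(budget)  # the display card's 11 GB is the effective ceiling
```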

  • CypherFOX Posts: 3,401

    Greetings,

    Don't forget that Windows 10, at least, reserves RAM on all cards against possible connection of a display - the only way to stop that, as far as I know, is to have a compute-only card like a Tesla, which is pricey in itself and forces you down the Quadro route for everything. Presumably a card with fewer potential connections would lose less RAM than one with many, but I don't know how much (if at all) they vary in that respect.

    This does not jibe with my experience.  I actually used my motherboard's 'built-in' video for a while, so I could use all 4 GB of my previous (740GTX) card, and it may have been down by ~10 MB, but nothing noticeable.

    --  Morgan

     

  • Richard Haseltine Posts: 109,502
    CypherFOX said:

    Greetings,

    Don't forget that Windows 10, at least, reserves RAM on all cards against possible connection of a display - the only way to stop that, as far as I know, is to have a compute-only card like a Tesla, which is pricey in itself and forces you down the Quadro route for everything. Presumably a card with fewer potential connections would lose less RAM than one with many, but I don't know how much (if at all) they vary in that respect.

    This does not jibe with my experience.  I actually used my motherboard's 'built-in' video for a while, so I could use all 4 GB of my previous (740GTX) card, and it may have been down by ~10 MB, but nothing noticeable.

    --  Morgan

    Well, the details were as explained to me (or my attempt at repeating what was explained to me), but I'm pretty sure people with two cards using Windows 10 had reported a higher overhead than those using Windows 7 with the same system. If that's not true now then yippee, though whether it was always incorrect or is incorrect only since a later update I wouldn't know.

  • kyoto kid Posts: 42,135

    ...still won't move to W10 until they restore the old update process and allow full user control like previous versions, as well as get rid of Cortana and make it an add-on app instead of integrating it into the OS.

  • linvanchene Posts: 1,386
    edited April 2017

    Update / Edit:

    - added links to 3rd party sources

    - added information about TAG reporting

    - - -

    @ usable VRAM of a GTX 1080 Ti with Windows 10

    When you are using Windows 10 you can use 9 GB of the 11 GB of available VRAM on a GTX 1080 Ti.

    -> Windows 10 is reserving 2 GB of VRAM on 11 GB cards.

    Tested with OctaneRender standalone 3.04

    to compare:

    Windows 10 is reserving 1 GB of VRAM on 6 GB cards.

    Windows 10 is reserving 1.4 GB of VRAM on 8 GB cards.
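    The reservations above can be summarized in a small Python sketch. Note these are the user-measured OctaneRender figures reported above, not official numbers from Microsoft or Nvidia:

```python
# Observed Windows 10 VRAM reservations reported above (in GB), keyed by
# the card's total VRAM; user measurements, not official figures.
reserved_gb = {11: 2.0, 8: 1.4, 6: 1.0}

def usable_vram(total_gb):
    """Usable VRAM under WDDM = total minus the observed reservation."""
    return total_gb - reserved_gb[total_gb]

print(usable_vram(11))  # GTX 1080 Ti: 9.0 GB usable
print(usable_vram(8))   # GTX 1080: about 6.6 GB usable
```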

    - - -

    Side Note:

    @ VRAM reservation & cards with 4 GB VRAM or lower

    So far, the users who have claimed not to observe Windows 10 reserving VRAM have cards with 4 GB of VRAM or less.

    - In the Nvidia GeForce driver documentation you can read about VRAM TAG reporting issues that appear with cards that have more than 4 GB of VRAM. Nvidia categorized that issue as a "Known Product Limitation" that is "beyond the control of NVIDIA".

    http://us.download.nvidia.com/Windows/375.70/375.70-win10-win8-win7-desktop-release-notes.pdf

    - Well-known GPU rendering companies like Otoy have confirmed the issue repeatedly and advise going back to Windows 7 to make full use of all the VRAM on GeForce-based cards.

    https://render.otoy.com/forum/viewtopic.php?f=12&t=51992

    - AFAIK Microsoft has still not published any official statement on this issue.

    For more information:

    https://social.technet.microsoft.com/Forums/windows/en-US/15b9654e-5da7-45b7-93de-e8b63faef064/windows-10-does-not-let-cuda-applications-to-use-all-vram-on-especially-secondary-graphics-cards?forum=win10itprohardware

    Personal opinion:

    From the outside it is very difficult to judge where exactly things go wrong.

    In any case, all the available information points toward the

    - Windows Display Driver Model (WDDM)

    - and the way the Total Available Graphics memory (TAG) is reported.

    - - -

    Using cards with 4 GB or less, or Windows 7, is not an option for me.

    With teeth grinding I "deal with the situation" and just spend the money knowing that things could be better.

     

    - - -

    In any case, if you upgrade from a 1080 to a 1080 Ti you get 9 GB of actually usable VRAM, and that is still better than only the 6.6 GB of actually usable VRAM on a GTX 1080.

    - - -

    My current setup is:

    Display: 1x Asus GTX 1080 STRIX A8G

    Rendering: 2 x Asus GTX 1080 Ti FE

    Personal impressions:

    - Nvidia Iray preview viewport

    With two 1080s, the Nvidia "live preview" viewport in DAZ Studio still felt a bit slow.

    I only kept it open when performing material- and surface-related tasks.

    Now with two 1080 Tis, the "live preview" viewport finally seems to update quickly enough to make it feasible to keep it open all the time when making changes to the scene, like posing, shaping, and adding props.

    It's not quite there yet in making OpenGL viewports obsolete, but I feel like we are finally getting there.

    It is fun to see the result of adjustments in good quality immediately instead of hours later when the pixels have cleared up...

  • kyoto kid Posts: 42,135
    edited April 2017

    ...when Mec4D posted video of her 2x Titan X (Maxwell) system in operation last year, the screen refresh was almost instant.

    ...and that was with only 3,072 CUDA cores each. I can imagine how quick about 7,100 total cores would be. If I only had the $1,400 (and the 24 GB of system memory I'd need to support those cards), that would be great, as it would significantly cut down on the time I spend performing test renders.

    Still on W7, so a smaller VRAM footprint.

  • KlaudM Posts: 76
    edited November 2017

    My current setup is:

    Display: 1x Asus GTX 1080 STRIX A8G

    Rendering: 2 x Asus GTX 1080 Ti FE

     

    How can you set a GPU to work only for the display? Doesn't Iray use it during rendering?

    At this time I have 2x GTX 1080 and I'm thinking of buying 2x 1080 Ti; the real doubt is whether to sell both 1080s or only one, but I need to understand the benefit :)

  • My current setup is:

    Display: 1x Asus GTX 1080 STRIX A8G

    Rendering: 2 x Asus GTX 1080 Ti FE

     

    How can you set a GPU to work only for the display? Doesn't Iray use it during rendering?

    At this time I have 2x GTX 1080 and I'm thinking of buying 2x 1080 Ti; the real doubt is whether to sell both 1080s or only one, but I need to understand the benefit :)

    Just uncheck the display card in the Advanced tab of Render Settings.

  • KlaudM Posts: 76
    edited November 2017

    My current setup is:

    Display: 1x Asus GTX 1080 STRIX A8G

    Rendering: 2 x Asus GTX 1080 Ti FE

     

    How can you set a GPU to work only for the display? Doesn't Iray use it during rendering?

    At this time I have 2x GTX 1080 and I'm thinking of buying 2x 1080 Ti; the real doubt is whether to sell both 1080s or only one, but I need to understand the benefit :)

    Just uncheck the display card in the Advanced tab of Render Settings.

    Oh, it's obvious, thanks :) But what's the benefit?
    I know the 1080 has 3 GB less memory than the 1080 Ti, but I don't understand whether excluding it completely is better, and what the reason is to have a GPU dedicated to the display.

  • My current setup is:

    Display: 1x Asus GTX 1080 STRIX A8G

    Rendering: 2 x Asus GTX 1080 Ti FE

     

    How can you set a GPU to work only for the display? Doesn't Iray use it during rendering?

    At this time I have 2x GTX 1080 and I'm thinking of buying 2x 1080 Ti; the real doubt is whether to sell both 1080s or only one, but I need to understand the benefit :)

    Just uncheck the display card in the Advanced tab of Render Settings.

    Oh, it's obvious, thanks :) But what's the benefit?
    I know the 1080 has 3 GB less memory than the 1080 Ti, but I don't understand whether excluding it completely is better, and what the reason is to have a GPU dedicated to the display.

    Well, I have only a single GPU, but the potential benefit is being able to use the rest of the system more or less normally while a render is grinding away on the second GPU.

  • KlaudM Posts: 76
    edited November 2017

    OK, it's clear; I could sell the 1080s, buy another cheap GPU, and dedicate it to the system.

  • fastbike1 Posts: 4,081

    Are you sure your motherboard doesn't have onboard video? If this is a non-gaming PC, that might be a viable solution. Personally, I have a single 980 Ti and I don't seem to get a sluggish machine while rendering. Granted, I'm usually just on the internet, but I have watched video occasionally.

    OK, it's clear; I could sell the 1080s, buy another cheap GPU, and dedicate it to the system.

     
