GTX 1080 vs. GTX 1080Ti
in The Commons
I have been comparing these two. There seems to be a huge improvement in performance for a modest price increase.
GTX 1080 vs. GTX 1080 Ti: 2560 CUDA cores vs. 3584 CUDA cores.
Has anybody used this card?

Comments
Performance-wise, there really is no comparison; taking the price into account, there is one, although the price difference is a lot less than expected.
I seem to remember someone posting they had one; I am currently considering adding a 1080ti to my current 980ti and 970 setup; only issue is I'll need a new PSU, and as my MB is showing signs of giving up, I'm waiting.
Check out one of the 1080 threads in Daz Studio forum; I've a feeling that is where I saw it.
I think someone posted to the benchmarks thread (Sickle Yield's sample scene thread).
It is not just the speed you should be looking at but the VRAM: one has 8 GB, the other 11 GB. You can do more with the 1080 Ti. If you are in need of speed, save up and buy another.
...yeah, the 1080 Ti is definitely a sizeable step above the 1080. I am considering it, as that would handle a major percentage (something like 90%) of the scenes I create. Until Nvidia released the Titan Xp today, the 1080 Ti had the same number of CUDA cores along with better memory base speed, bandwidth, and data rate than its pricier sibling. The new Titan Xp has all the SMs unlocked to offer 3840 cores, but VRAM remains at 12 GB, so still not enough of an edge over the 1080 Ti in my book to warrant spending the extra $500 (I was expecting the memory to be bumped up to 16 GB).
Greetings,
I want it, just so that I can use the Ti as my primary video card, my current 1080 as the spare, and then have a full 8 GB available on both cards for scenes. Right now my display seems to take up 1 GB of video memory by itself, so I've only got ~7 GB available. Then there's the performance boost... But honestly, I should get a better PC first. :(
-- Morgan
p.s. nVidia is evidently releasing Mac OS X drivers for the Pascal generation. This is either a tacit approval of the external case folks, or a move to show Apple that they're still available for the next generation of Mac hardware...
...I read that. Would make the next generation Mac Pro attractive for those who can afford one (well, if they also put it in a "conventional" case again instead of a coffee can, what were they thinking?).
If I do ever get a 1080 Ti, I would use my current card to run just the displays and the Ti exclusively for rendering.
Greetings,
One conventional-ish case, or at least 'modular', coming up! :)
-- Morgan
...I sure hope so. Was in a second hand store last week and saw an old Mac Pro case.
Don't forget that Windows 10, at least, reserves RAM on all cards against possible connection of a display - the only way to stop that, as far as I know, is to have a compute-only card like a Tesla, which is pricey in itself and forces you down the Quadro route for everything. Presumably a card with fewer potential connections would lose less RAM than one with many, but I don't know how much (if at all) they vary in that respect.
Or use Windows 7 Pro 64-bit. I have an extra gaming PC with Win10 and don't like it. Got Win10 free like everyone else, but the operating system is too Big Brother. Turned off as much as I could, but MS really wants to know what I'm doing on my PC, which is an invasion of privacy. Big business seems to be going down the same road as government (we want to know what you are doing). Soon we will all be scanned and barcoded. Ok, I'm taking off my tin foil hat now : )
I am looking at upgrading my Iray abilities and am interested in the 1080 Ti. I already have two 980 Tis and have maxed out my MB. Does anyone have any suggestions on a MB that would support up to four Nvidia cards?
I'm trying out W10 Pro on my main comp; I've shut down all the spyware crap and it's not too bad; 3 MB seems to be reserved, but tbh that is negligible. Not sure yet if I'll switch back or not.
Thinking about it, it doesn't sound like a big deal.
You're limited to the amount of RAM on the lowest card. You have two 12 GB cards, one connected to displays, the other not. If you had 11 GB available on the one with displays and 12 GB available on the other, overall you'd be limited to 11 GB anyway, the lowest card.
Greetings,
This does not jibe with my experience. I actually used my motherboard's built-in video for a while, so I could use all 4 GB of my previous (740GTX) card, and it may have been down by ~10 MB, but nothing noticeable.
-- Morgan
Well, the details were as explained to me (or my attempt at as explained to me), but I'm pretty sure people with two cards using Windows 10 had reported a higher overhead than those using Windows 7 with the same system. If that's not true now then yippee, though whether it was always incorrect or is incorrect only since a later update I wouldn't know.
...still won't move to W10 until they restore the old update process and allow full user control like previous versions, as well as get rid of Cortana and make it an add-on app instead of integrating it into the OS.
Update / Edit:
- added links to 3rd party sources
- added information about TAG reporting
- - -
@ usable VRAM of a GTX 1080 Ti with Windows 10
When you are using Windows 10, you can use 9 GB of the 11 GB of VRAM on a GTX 1080 Ti
-> Windows 10 is reserving 2 GB of VRAM on 11 GB cards.
Tested with OctaneRender standalone 3.04
To compare:
Windows 10 is reserving 1 GB of VRAM on 6 GB cards
Windows 10 is reserving 1.4 GB of VRAM on 8 GB cards
- - -
In any case, if you upgrade from a 1080 to a 1080 Ti you get 9 GB of actually usable VRAM, and that is still better than only the 6.6 GB of actually usable VRAM on a GTX 1080.
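The arithmetic behind those numbers can be sketched in a few lines. Note that the reservation figures are the ones reported in this thread's OctaneRender tests, not official Nvidia or Microsoft values:

```python
# Usable VRAM after the Windows 10 reservations reported in this thread.
# (total GB, reserved GB) per card; reservation figures are from the
# thread's own tests, not from any official documentation.
cards = {
    "6 GB card": (6.0, 1.0),
    "GTX 1080 (8 GB)": (8.0, 1.4),
    "GTX 1080 Ti (11 GB)": (11.0, 2.0),
}

for name, (total, reserved) in cards.items():
    usable = total - reserved
    print(f"{name}: {usable:.1f} GB usable of {total:.0f} GB total")
```

So even after the larger reservation on 11 GB cards, the 1080 Ti still nets roughly 2.4 GB more usable VRAM than the 1080 under Windows 10, going by these figures.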
- - -
My current setup is:
Display: 1x Asus GTX 1080 STRIX A8G
Rendering: 2 x Asus GTX 1080 Ti FE
Personal impressions:
- Nvidia Iray preview viewport
With two 1080s, the Nvidia "live preview" viewport in DAZ Studio still felt a bit slow.
I only kept it open when performing material- and surface-related tasks.
Now with two 1080 Tis, the "live preview" viewport finally seems to update quickly enough to make it feasible to keep it open all the time, even when making changes to the scene like posing, shaping, and adding props.
It's not quite there yet in making OpenGL viewports obsolete, but I feel like we are finally getting there.
It is fun to see the result of adjustments in good quality immediately, instead of hours later when the pixels have cleared up...
...when Mec4D posted a video of her 2x Titan X (Maxwell) system in operation last year, the screen refresh was almost instant.
...and that was with only 3072 CUDA cores each. I can imagine how quick 7168 total cores are. If I only had the $1,400 (and the 24 GB of system memory I'd need to support those cards), that would be great, as it would significantly cut down on the time I spend performing test renders.
Still on W7 so smaller VRAM footprint.
How can you set a GPU to work only for the display? During rendering, Iray doesn't use it?
At this time I have 2x GTX 1080 and I'm thinking of buying 2x 1080 Ti; the real doubt is whether to sell both 1080s or only one, but I need to understand the benefit :)
Just uncheck the display card in the Advanced tab of Render Settings.
Oh, it's obvious, thanks :) But what's the benefit?
I know the 1080 has 3 GB less memory than the 1080 Ti, but I don't understand whether excluding it completely is better, and what the reason is to have a GPU dedicated to the display.
Well, I have only a single GPU - but the potential benefit is being able to use the rest of the system more or less normally while a render is grinding away on the second GPU.
Ok, it's clear. I could sell the 1080s, buy another cheap GPU, and dedicate it to the system.
Are you sure your motherboard doesn't have onboard video? If this is a non-gaming PC, that might be a viable solution. Personally, I have a single 980TI and I don't seem to get a sluggish machine while rendering. Granted, I'm usually just on the internet, but I have watched video occasionally.