GTX 1180 Preliminary Specs Update

1356 Comments

  • tj_1ca9500b Posts: 2,057
    edited May 2018

    We may not have to wait until CES.  Nvidia is currently promoting the Ultimate Gaming Experience as part of their keynote at Computex; said keynote is scheduled for May 30th.

    Quote from the Guru3D piece:

    "Utilizing GPU computing to explore the worlds infinite possibilities."
    "Witness the ultimate gaming experience and the power of Artificial intelligence at Computex 2018"

    http://www.guru3d.com/news-story/nvidia-gtc-computex-taiwan-2018-teases-an-ultimate-gaming-experience.html

  • kyoto kid Posts: 41,857

    ...from what I read of that, it doesn't seem like an 11xx release will be coming anytime soon

  • tj_1ca9500b Posts: 2,057
    edited May 2018

    More preliminary specs for the GTX 11xx series...

    https://wccftech.com/873853-2/

    These are of course preliminary specs, so they could be subject to change.  Nothing really new, except for the performance graphs... and the note that the GTX 1180 is likely to be debuting in or around July...

  • kyoto kid Posts: 41,857

    ...still not holding my breath.

  • nicstt Posts: 11,715
    kyoto kid said:
    nicstt said:

    I would love to be wrong.

    ...same here of course, but not holding my breath.

    Likewise.

    The 1080ti can play most games at 4k settings, though maybe not at max settings. Most benchmarks these days start with 4k performance, because that's really where the differences are going to be. Almost everything can play games at 1080p beyond 60 fps now, so 1080p benches are getting silly, though you do find players who have 120 and 144 Hz monitors. For those players 1440p is the "sweet spot", as the 1080ti can deliver 1440p at those high frame rates, too.

    Even the consoles have moved to 4k, with both Sony and Microsoft shipping 4k-capable machines, bringing 4k to the masses on top of the many 4k-capable streaming boxes. 4k is not niche anymore. In a short period of time in my line of business, 4k went from an occasional sighting to about half of what I personally see, and that ratio seems to grow by the month.

    So I expect the 1100 series marketing to be all about 4k. I fully expect Jensen Huang to call the 1180 the "Ultimate 4K GPU". If he doesn't say those exact words, I'll be shocked.
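    Rough pixel arithmetic shows why 4k is where cards separate (a quick illustration only: the resolutions are the standard ones, 60 fps is just a reference rate, and this ignores everything except raw pixel throughput):

        # Pixels per frame, and pixel throughput at a reference 60 fps.
        resolutions = {
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "4k":    (3840, 2160),
        }
        for name, (w, h) in resolutions.items():
            pixels = w * h
            print(f"{name}: {pixels:,} px/frame, {pixels * 60 / 1e6:.0f} Mpx/s at 60 fps")
        # 4k pushes 4x the pixels of 1080p and 2.25x the pixels of 1440p,
        # which is why a card that coasts at 1080p can fall well short at 4k.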

    Pascal was heavily promoted as being VR-ready at its launch, but VR has sort of fallen short in popularity. 4k, however, is here to stay. But as I said, the 1080ti can already do 4k decently, so the 1180 will have to be much faster, especially if it is going to sell for a high price. The 1080ti launched at $700. The 1180 would be competing with its own older sibling.

    Yeh, I'm considering 4k for my centre monitor; I currently have 3 x 2560x1440. I'll need a new card for that though, as neither the 980ti I use for rendering nor the 970 driving my current monitors will cope.

  • nicstt Posts: 11,715

    More preliminary specs for the GTX 11xx series...

    https://wccftech.com/873853-2/

    These are of course preliminary specs, so they could be subject to change.  Nothing really new, except for the performance graphs... and the note that the GTX 1180 is likely to be debuting in or around July...

    Are they preliminary, or something between guessing and speculation?

    Preliminary, to me, would be someone at Nvidia releasing early discussions on what the actual specs should be.

  • Subtropic Pixel Posts: 2,388

    I want two 1180s to replace my GTX 980s.

  • tj_1ca9500b Posts: 2,057
    edited May 2018

    So, there's some speculation out there that Nvidia might skip 12nm and go straight to 7nm, which of course would delay the introduction of the next GPUs.  Yes, 7nm is entering volume production, but it takes months to build up enough inventory for a launch (plus you still need to assemble the various components after the GPUs are made), which is why we won't see the 7nm AMD offerings in volume until next year.

    Unless you do paper launches, of course.  The tech media hates those, so Intel, Nvidia, and AMD try to shy away from them.  1-2 months after a launch is announced is about the limit of what people are willing to tolerate, and even 30 days until decent availability after a launch announcement is considered a long time...

    As Nvidia plays their cards close to their chest, it's hard to debunk that skipping-12nm-for-7nm rumor... I don't believe that'll be the case (skipping 12nm), but I don't really know...

    So while we continue to wonder, it's 15 more days until the Nvidia keynote at Computex!

  • outrider42 Posts: 3,679
    edited May 2018

    I don't see how it could be 7nm. The rumored die size is 400mm², which is pretty huge as it is; that would be monstrous for 7nm. It would have a ridiculous CUDA count if both of these were true, and that would destroy the Titan V, not just the Titan Xp. So something does not add up. If it is 7nm, it cannot be 400mm², and if it is 400mm², it cannot be 7nm.
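    To put rough numbers on why those two rumors clash (a back-of-envelope sketch only: the GP102 figures are the public ones, but the ~3x density gain for 7nm over 16nm is an assumption, and real designs never scale this cleanly):

        # Naive estimate: CUDA cores a 400mm^2 die at 7nm could hold,
        # scaled from GP102 (the 1080ti / Titan Xp chip).
        gp102_area_mm2 = 471    # GP102 die area on 16nm
        gp102_cuda = 3840       # CUDA cores on the full GP102 die
        density_gain = 3.0      # ASSUMED ~3x logic density for 7nm vs 16nm
        rumored_area_mm2 = 400  # the rumored die size

        est_cuda = rumored_area_mm2 * (gp102_cuda / gp102_area_mm2) * density_gain
        print(f"~{est_cuda:.0f} CUDA cores")  # ~9800, nearly double the Titan V's 5120

    That is Titan V-crushing territory, which is exactly why a 400mm² die and 7nm don't fit in the same rumor.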

    Even if AMD does 7nm next year...that's next year. And for all we know AMD may not get it out until later than that given their lengthy Vega development. Vega wasn't even all that great! So I doubt Nvidia fears AMD all that much, even though I have talked about how AMD could figure into how much VRAM the 1180 may ship with. But that's probably the only thing AMD could do with Navi. They almost certainly cannot beat Nvidia in benchmarks, even with 7nm, but they could ship cards with lots of VRAM in order to get attention. If you saw a GPU with 16GB, you'd take notice. So that is why I would not be surprised if AMD went with such a high count, and that is something Nvidia would not be happy about.

    Nvidia is probably more concerned about what Intel might do than AMD. Nobody knows what Intel might do with a true GPU. GPUs are so specialized, Intel could really disrupt things if their GPUs are more robust. Nvidia has taken great care to separate gaming and professional use, even going so far as to prohibit GTX cards from data centers in its EULAs.

    AMD is in such a bad situation, that they have to go all out to get any sales. They cannot just pull even with Nvidia, they have to beat Nvidia. At something. Anything.

  • Subtropic Pixel Posts: 2,388

    So when's this damned thing coming out?

     


  • Taoz Posts: 10,259

    Nvidia has taken great care to separate gaming and professional use, even going so far as to prohibit GTX cards from data centers in its EULAs.

    Can they really do that? 

  • bytescapes Posts: 1,905

    The only thing faster than Nvidia's new GTX 1180 card will be the speed with which the cryptominers buy up every single one available.

  • I don't see how it could be 7nm. The rumored die size is 400mm², which is pretty huge as it is; that would be monstrous for 7nm. It would have a ridiculous CUDA count if both of these were true, and that would destroy the Titan V, not just the Titan Xp. So something does not add up. If it is 7nm, it cannot be 400mm², and if it is 400mm², it cannot be 7nm.

    Even if AMD does 7nm next year...that's next year. And for all we know AMD may not get it out until later than that given their lengthy Vega development. Vega wasn't even all that great! So I doubt Nvidia fears AMD all that much, even though I have talked about how AMD could figure into how much VRAM the 1180 may ship with. But that's probably the only thing AMD could do with Navi. They almost certainly cannot beat Nvidia in benchmarks, even with 7nm, but they could ship cards with lots of VRAM in order to get attention. If you saw a GPU with 16GB, you'd take notice. So that is why I would not be surprised if AMD went with such a high count, and that is something Nvidia would not be happy about.

    AMD doesn't care about VRAM at this point. I can use my Vega with a 32 GB HBCC memory segment. It can load a 50 GB dataset without any problem; you just need to activate the feature in the Radeon Settings panel. My friend uses his WX9100 with the 64 GB option, because he has 128 GB of system memory. Last time he loaded a 120 GB dataset, no problem at all. AMD said that the hard limit is 256 TB for the RX Vega and 512 TB for the professional version, but if the dataset is larger than the system memory, then they recommend an SSG, which has a special mode with 1 TB of dedicated memory.

    NVIDIA has this feature with Volta, but only if the host CPU is based on an IBM Power9 architecture.

  • kyoto kid Posts: 41,857
    edited May 2018

    The only thing faster than Nvidia's new GTX 1180 card will be the speed with which the cryptominers buy up every single one available.

    ...that is my concern as well. Bitcoin may be a lost cause for the armchair miner, but ETH is the current "hot" one and plans to remain ASIC resistant so GPUs will still be in high demand.

  • outrider42 Posts: 3,679

    I don't see how it could be 7nm. The rumored die size is 400mm², which is pretty huge as it is; that would be monstrous for 7nm. It would have a ridiculous CUDA count if both of these were true, and that would destroy the Titan V, not just the Titan Xp. So something does not add up. If it is 7nm, it cannot be 400mm², and if it is 400mm², it cannot be 7nm.

    Even if AMD does 7nm next year...that's next year. And for all we know AMD may not get it out until later than that given their lengthy Vega development. Vega wasn't even all that great! So I doubt Nvidia fears AMD all that much, even though I have talked about how AMD could figure into how much VRAM the 1180 may ship with. But that's probably the only thing AMD could do with Navi. They almost certainly cannot beat Nvidia in benchmarks, even with 7nm, but they could ship cards with lots of VRAM in order to get attention. If you saw a GPU with 16GB, you'd take notice. So that is why I would not be surprised if AMD went with such a high count, and that is something Nvidia would not be happy about.

    AMD doesn't care about VRAM at this point. I can use my Vega with a 32 GB HBCC memory segment. It can load a 50 GB dataset without any problem; you just need to activate the feature in the Radeon Settings panel. My friend uses his WX9100 with the 64 GB option, because he has 128 GB of system memory. Last time he loaded a 120 GB dataset, no problem at all. AMD said that the hard limit is 256 TB for the RX Vega and 512 TB for the professional version, but if the dataset is larger than the system memory, then they recommend an SSG, which has a special mode with 1 TB of dedicated memory.

    NVIDIA has this feature with Volta, but only if the host CPU is based on an IBM Power9 architecture.

    What gets buyer attention is what is on board the GPU. Also, HBCC introduces a whole new layer of latency, which could spell trouble for video games. While it may improve performance in a GPU that is limited in VRAM, it will almost certainly not outperform a GPU that has enough VRAM to handle the task. In other words, while this tech is great for expanding VRAM in low-capacity cards, it is NOT a satisfactory replacement for having the VRAM available in the first place. So the idea that AMD wouldn't care about VRAM at this point is invalid. They had better care, or they will be out of the GPU market in a few years.

    It was asked earlier how games could even need 12GB or more. While it is true many don't, it is possible to exceed 12GB in games at 4k with anti-aliasing enabled. Anti-aliasing is a technique that smooths out the edges of items in a scene. This is done in a wide variety of ways, and there are a ton of different AA methods to choose from. But the general approach is to render an image multiple times and blend the results together in a way that smooths out the jaggies. Obviously, this is expensive for most GPUs, in both memory and processing. Anyway, it is possible to jack some modern games to over 14GB of memory use using AA. Thus the need for such high-capacity cards at the top end is indeed coming. So it makes sense to address this now as games continue to push technology.
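    To make the memory cost concrete, here is a minimal sketch of the simplest AA flavor, ordered-grid supersampling (SSAA); real games mostly use cheaper schemes like MSAA or TAA, so this is illustrative rather than how any particular title works:

        import numpy as np

        def downsample_ssaa(hires: np.ndarray, k: int) -> np.ndarray:
            """Average each k x k block of an (H*k, W*k, 3) render down to one pixel."""
            h, w, c = hires.shape[0] // k, hires.shape[1] // k, hires.shape[2]
            return hires.reshape(h, k, w, k, c).mean(axis=(1, 3))

        # 2x2 supersampling at 1080p: the GPU holds a 3840x2160 buffer (4x the
        # pixels, so roughly 4x the framebuffer memory) for one 1920x1080 frame.
        k = 2
        hires = np.random.rand(1080 * k, 1920 * k, 3).astype(np.float32)  # stand-in render
        frame = downsample_ssaa(hires, k)
        print(frame.shape)  # (1080, 1920, 3)

    The same multiplier applies at 4k, which is how AA pushes memory use well past what the bare resolution suggests.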

     

    Taoz said:

    Nvidia has taken great care to separate gaming and professional use, even going so far as to prohibit GTX cards from data centers in its EULAs.

    Can they really do that? 

    I am no legal expert, but this is what their EULAs state now. This was a very recent change, back around January.

    https://www.digitaltrends.com/computing/nvidia-bans-consumer-gpus-in-data-centers/

     

    The only thing faster than Nvidia's new GTX 1180 card will be the speed with which the cryptominers buy up every single one available.

    At this point, I doubt this. The cards are going to retail for more than past cards, and we do not know what their mining performance will be. Nvidia has been very vocal against mining, and I would not be surprised if the cards had some kind of anti-mining feature that throttles blockchain mining tasks. And secondly, the ship has largely sailed on GPU mining. It is still big in some places, but in general the market is already moving back to an almost "normal" state. Second-hand market prices are falling fast, and retail prices are starting to as well. You can routinely find 1070s at $400 on eBay, which hasn't happened in a very long time. 1080ti's can be found for close to the $700 launch price. And things are only getting better.

    But you are correct the cards will be a hot ticket. They always are at launch. They will sell out very fast to early adopters. However, there is a caveat, and that is how long Daz Studio will take to add support for the 1100 series. It took MONTHS for Daz to update for Pascal. So for the Daz crowd, it might be better if we don't grab them right at launch anyway. If you trade your old cards in for a 1180 at launch, you will not be able to use Daz Studio at all! Even if you don't use Iray, the viewport itself needs a GPU to work correctly, so without updates Daz will not function. So you need to keep an old card handy until Daz updates. Ah, the joy of Iray. <.<

    Does Daz even support the Titan V yet?

  • kyoto kid Posts: 41,857
    edited May 2018

    ...but that's $400 for "used" (and, if it came from a mining rig, possibly "abused") instead of "new", as well as often at bid (which I've seen go higher) rather than a Buy It Now! price.

    Should ETH mining continue to maintain or even gain momentum, by the time Daz gets around to updating support they could easily be in short supply and overpriced again.  The best strategy in that event would be to purchase the new card at launch and not install it until after Daz updates their support.

  • tj_1ca9500b Posts: 2,057
    edited May 2018

    Apparently Tom's Hardware talked to a few anonymous sources inside the industry.  Here's their latest take on release dates, etc.

    https://www.tomshardware.com/news/nvidia-turing-faq,37067.html

  • outrider42 Posts: 3,679
    kyoto kid said:

    ...but that's $400 for "used" (and, if it came from a mining rig, possibly "abused") instead of "new", as well as often at bid (which I've seen go higher) rather than a Buy It Now! price.

    Should ETH mining continue to maintain or even gain momentum, by the time Daz gets around to updating support they could easily be in short supply and overpriced again.  The best strategy in that event would be to purchase the new card at launch and not install it until after Daz updates their support.

    New prices are dropping too. And even if it is used, don't forget that the warranty transfers. Nearly all GPUs have 3-year warranties, and Pascal is still less than 2 years old, so every Pascal card out there still has at least a year of warranty left on it. If you buy one and it burns out, it gets exchanged. You can always ask the seller how old the card is to see how much warranty is left.

    I spotted numerous 1070s at $400 'Buy It Now' prices; there are more of them than before!

    I'm seeing brand new 1080s for under $600 at Amazon, which is still a bit high, but not outrageously high. 1070s are under $500. The 1070ti, which actually renders faster than the 1080, is right around $500; I see a Gigabyte Windforce for $510. Yes, the 1070ti renders faster than the 1080. I have two sources for this; see the Iray benchmark thread, where I posted a long list of professional benchmarks.

    And while many cards on eBay are used, you can find new cards as well. Sometimes even big stores sell on eBay. I spotted several 1070ti's for $500. The 1070ti just came out in November, so a used one would still have at least 2 and a half years of warranty left. If I were buying right now, I'd buy a 1070ti; it's the best deal, and it's faster than a 1080 for Iray. However, I am waiting for the 1100 series to launch to see what they do.

  • kyoto kid Posts: 41,857

    ...well, I recently came into a very, very good deal on a Titan X with 12 GB and 3072 cores back when things still looked pretty bleak, so I'm taken care of for a while. Yeah, the current and forthcoming cards may have better overall performance, but again, for my purposes, having more memory is also a "speed factor".

  • tj_1ca9500b Posts: 2,057

    As I mentioned in another thread, Nvidia's Computex keynote is tomorrow (May 29th, 2018) at 7 PM PDT.  We should learn more then.

  • kyoto kid Posts: 41,857

    ...still not sold on the 16 GB 1180 unless the Titan V (which BTW has only 12 GB of HBM2 VRAM, not 16 like the article mentions) got a major upgrade.

  • ebergerly Posts: 3,255

    Until I see actual price numbers, as well as actual rendering performance numbers, preliminary specs on die size and all that other stuff seem somewhat irrelevant, and tend to induce somewhat of a yawn response from me.

    IMO, there's too much unknown stuff involved in determining how the card will ACTUALLY perform with actual render engines to jump to any conclusions based solely on tech specs. 

    Maybe in 6 months to a year we'll know enough to really make a decision. 

  • outrider42 Posts: 3,679
    edited May 2018

    Pascal was a solid improvement, including the jump to 8GB as standard for the 1080 and 1070. The 1180 should be a good refinement, and should best a 1080ti and Titan Xp (it would be a terrible upgrade otherwise).

    For Iray, every new generation gets new CUDA refinements; it's not just clock speeds and CUDA counts. In fact, CUDA often gets refined throughout a generation. The much newer 1070ti renders Iray faster than a 1080 because it has upgraded CUDA over the 1080, even though it is the same Pascal chip as the 1080, just cut down. So it's only logical to expect some good improvements, with the 1180 getting not only more CUDA cores but also faster CUDA cores on top of that.

    Computex has not historically been where Nvidia reveals GTX cards. However, if rumors of a July launch are true, they really need to be showing something soon to start the official hype train campaign. E3 is coming soon, too.

    But you are probably right about it taking a while to get benchmarks for Iray since who knows how long it will take Daz to update for the 1100 series. I can see the "My new GPU will not work in Daz Studio" threads now coming from people who don't frequent the forums every day. I seriously hope it doesn't take as long as it did to support Pascal, which was downright embarrassing.

  • Ghosty12 Posts: 2,080
    edited May 2018

    Theories and rumors floating around the net are that while Nvidia is soon to showcase the new cards, we won't see them till near the end of the year...

    outrider42 said:

    Pascal was a solid improvement, including the jump to 8GB as standard for the 1080 and 1070. The 1180 should be a good refinement, and should best a 1080ti and Titan Xp (it would be a terrible upgrade otherwise).

    For Iray, every new generation gets new CUDA refinements; it's not just clock speeds and CUDA counts. In fact, CUDA often gets refined throughout a generation. The much newer 1070ti renders Iray faster than a 1080 because it has upgraded CUDA over the 1080, even though it is the same Pascal chip as the 1080, just cut down. So it's only logical to expect some good improvements, with the 1180 getting not only more CUDA cores but also faster CUDA cores on top of that.

    Computex has not historically been where Nvidia reveals GTX cards. However, if rumors of a July launch are true, they really need to be showing something soon to start the official hype train campaign. E3 is coming soon, too.

    But you are probably right about it taking a while to get benchmarks for Iray since who knows how long it will take Daz to update for the 1100 series. I can see the "My new GPU will not work in Daz Studio" threads now coming from people who don't frequent the forums every day. I seriously hope it doesn't take as long as it did to support Pascal, which was downright embarrassing.

    That will be the thing: how long it takes for Iray support to kick in for the new cards...  But I guess that is the price one must pay for this type of rendering, even more so now with the uber-new real-time ray-tracing rendering methods that are on the way...

  • Sylvan Posts: 2,719

    I am cherishing my 980ti for now. But wow, that card is a looker!

  • outrider42 Posts: 3,679
    ghosty12 said:

    Theories and rumors floating around the net are that while Nvidia is soon to showcase the new cards, we won't see them till near the end of the year...

    outrider42 said:

    Pascal was a solid improvement, including the jump to 8GB as standard for the 1080 and 1070. The 1180 should be a good refinement, and should best a 1080ti and Titan Xp (it would be a terrible upgrade otherwise).

    For Iray, every new generation gets new CUDA refinements; it's not just clock speeds and CUDA counts. In fact, CUDA often gets refined throughout a generation. The much newer 1070ti renders Iray faster than a 1080 because it has upgraded CUDA over the 1080, even though it is the same Pascal chip as the 1080, just cut down. So it's only logical to expect some good improvements, with the 1180 getting not only more CUDA cores but also faster CUDA cores on top of that.

    Computex has not historically been where Nvidia reveals GTX cards. However, if rumors of a July launch are true, they really need to be showing something soon to start the official hype train campaign. E3 is coming soon, too.

    But you are probably right about it taking a while to get benchmarks for Iray since who knows how long it will take Daz to update for the 1100 series. I can see the "My new GPU will not work in Daz Studio" threads now coming from people who don't frequent the forums every day. I seriously hope it doesn't take as long as it did to support Pascal, which was downright embarrassing.

    That will be the thing: how long it takes for Iray support to kick in for the new cards...  But I guess that is the price one must pay for this type of rendering, even more so now with the uber-new real-time ray-tracing rendering methods that are on the way...

    Actually, other GPU-capable render engines (like Octane) had Pascal support almost right away, if not right at launch. It was ONLY Daz Studio Iray that took months to support Pascal. That's why I classify the time it took Daz to support Pascal as embarrassing, because it really was. First Nvidia had to release the updated SDK, which took a month or so, and then it took Daz another month or so to add Pascal to the 4.9 beta. And it was another period of time before the full release finally supported Pascal. That is way too long by any standard, and not acceptable in my book. That's when Daz's decision to use Iray really came back to haunt them. The fact that Daz has to depend on Nvidia to update the Iray SDK is highly problematic, because we cannot depend on Nvidia to deliver updates quickly. They didn't even update the SDK when they launched Pascal, so I expect them to do the same for the 1100 series.

    Has anyone found out whether Daz supports the Titan V yet? I'm very curious to know. There is a tiny chance we might get lucky if Daz supports Volta. That is because the 1100 series is basically Volta with the Tensor cores stripped out. So if the two are that similar, then Volta support might also cover the 1100 series, and thus a much shorter wait. If Daz supports Volta now, then maybe we actually have support for the 1100s at launch. Hey, I can dream, OK! But that would be pretty awesome. So I would love to know. I believe Nvidia released the Iray SDK for Volta some time ago.

  • Richard Haseltine Posts: 108,072
    ... It was ONLY Daz Studio Iray that took months to support Pascal. That's why I classify the time it took Daz to support Pascal as embarrassing, because it really was. First Nvidia had to release the updated SDK, which took a month or so, and then it took Daz another month or so to add Pascal to the 4.9 beta. And it was another period of time before the full release finally supported Pascal. That is way too long by any standard, and not acceptable in my book. That's when Daz's decision to use Iray really came back to haunt them. The fact that Daz has to depend on Nvidia to update the Iray SDK is highly problematic, because we cannot depend on Nvidia to deliver updates quickly. They didn't even update the SDK when they launched Pascal, so I expect them to do the same for the 1100 series.

    That isn't entirely right - it was a lot more than a month across the board, and as I recall the initial Pascal-compatible version of Iray had issues, so it may have been released for Iray Server or the plug-ins developed in-house, but it wasn't finished. I'm pretty sure Substance had to wait about as long as DS for its Pascal support.

  • tj_1ca9500b Posts: 2,057
    edited May 2018

    OK, so the Nvidia keynote held in Taiwan today is over.

    So what did I learn?

    That Jensen Huang loves his leather jackets.  Everything else was pretty much covered in the keynote from a couple of months ago.  Absolutely no new info about Nvidia cards.

    Let the speculation continue!

  • nicstt Posts: 11,715
    Estroyer said:

    I am cherishing my 980ti for now. But wow, that card is a looker!

    I have absolutely zero interest in how it looks.

    ... Looks do zilch for rendering prowess.
