Looks like we have another leak:
https://wccftech.com/nvidia-geforce-next-gen-graphics-card-pcb-leak/
...fascinating. 12 GB is about where I pegged the top end for GTX cards, so as not to overstep their more expensive pro-grade ones.
As to NVLink, I could maybe see them enabling it on the Titan V (or whatever its successor will be), but as mentioned it would have little impact on gaming, which is the GTX series' bread-and-butter market.
It's not indicative of anything; too little to go on, imo.
Just a fun note...
https://wccftech.com/micron-gddr6-memory-mass-production-official/
Since the PCB in the article I linked in the earlier post has GDDR6, you should know that those chips come in 1 GB and 2 GB versions. 1 GB x 12 chips = 12 GB; 2 GB x 12 chips = 24 GB.
So even if Nvidia goes the GDDR6 route (due to HBM2 shortages), they COULD crank out a 24 GB enthusiast/professional card on the same PCB...
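A quick sketch of that chip math, for anyone who wants to play with it (the 12-chip layout and the 1 GB/2 GB densities come from the links above; the 14 Gbps pin speed is my assumption based on the speed grades Micron has been quoting):

```python
# Back-of-envelope VRAM/bandwidth math for a 12-chip GDDR6 PCB.
# GDDR6 ships in 1 GB (8 Gb) and 2 GB (16 Gb) densities, each chip
# with a 32-bit interface; 14 Gbps/pin is an assumed speed grade.
CHIPS = 12
BITS_PER_CHIP = 32
GBPS_PER_PIN = 14  # assumption, not from the leak

bus_width = CHIPS * BITS_PER_CHIP             # 384-bit bus
bandwidth_gbs = bus_width * GBPS_PER_PIN / 8  # GB/s

for density_gb in (1, 2):
    print(f"{density_gb} GB x {CHIPS} chips = {density_gb * CHIPS} GB "
          f"on a {bus_width}-bit bus (~{bandwidth_gbs:.0f} GB/s)")
```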
Note that I said Enthusiast/Professional card, not gaming card. The Quadro P6000 is getting very long in the tooth, and Nvidia really could use a 'refresh' to fill the pricing void between the $9000 flagship Quadro GV100 card and the $3000 12 GB Titan V. Also note that the PCB includes an NVLink connector...
DISCLAIMER: ALL OF THE FOLLOWING CONTENT IS THE OPINION OF OUTRIDER42 AND IS IN NO WAY AFFILIATED WITH DAZ3D OR ANY OF DAZ3D'S EMPLOYEES OR FORUM MODERATORS.
My interpretation of this was that many of these surplus cards are only mid-grade cards. Take notice of all the different 1060 and 1050 variants there have been over time. They have been trying to find new ways to sell the same chips over and over for a while. Most of the high-end chips have already shipped out to the third-party partners, and they would be crazy to return top-end chips that are still in decent demand. But this is why Nvidia was not super keen on simply producing more cards during the crypto fad. The fad has come and gone, and in fact it died quicker than many people anticipated. Nvidia themselves had predicted April/May, but prices fell sooner than that.
But ultimately I don't think that will factor into when the 1100s launch very much. That comes purely down to AMD, or rather the lack of AMD. Without any new AMD card in sight, Nvidia is free to take as much time as they want. This works both ways, though. While it may mean a longer wait, it may also mean that the 1100 will be better than the typical generation leap. Going well over two years since Pascal, users will expect a bigger leap as well.
That's not much to go on, though. All of that is subject to change. But 12 GB would be nice. It is very interesting to see the new NVLink on this board, so Nvidia is at the very least considering this tech. NVLink would be the big-time game changer. But will they really put that on a gaming card?
How powerful the 1180 truly is comes down to how Nvidia views AMD and Intel. Does Nvidia believe Navi will be a threat? Frankly, I doubt it. I don't believe Nvidia views AMD as a threat at all now. But if Nvidia wants to bury AMD, they can do so, right here and right now. The consoles are about the only thing keeping AMD's GPU division afloat. Intel is the wild card, and nobody has any idea what they might do, so the 1180 might also reflect that. I believe Nvidia is more concerned about Intel at this point.
Intel's foray into GPUs could really shake things up. First of all, we have to consider how Intel might approach GPUs. Nvidia takes very special care to differentiate gaming and pro models. They even go so far as to make the EULAs prohibit using GTX cards in workstations...true story. But Intel may not take such a route. Intel may make GPUs that are more versatile, which would be a great way to tackle Nvidia. Going after gamers right away may not work so well, but going after the pro market, where Nvidia sells Quadro cards for crazy high prices, would be a much easier target.
This is what I think will happen. Intel will release GPUs that are not quite as fast at gaming, but will blow the doors off GTX in other computing tasks. This will probably make them very strong rendering cards, and if they can utilize CUDA, we might see a new go-to GPU for Iray. I think Intel would license CUDA to get that edge, given that CUDA is becoming more popular. One might wonder if Nvidia would actually allow Intel to license it, and that is a good question. But big companies license tech all the time; that's nothing new.
LinusTechTips recently unearthed an Intel prototype GPU from a decade ago. This GPU was quite unique: it could be programmed like a CPU to do all kinds of tasks. Obviously this was a long time ago, but some of these things could still find their way into Intel's future GPU. It could make for a very interesting piece of hardware.
And this ties back to the NVLink mentioned above. While it may or may not be targeted at gamers, NVLink would allow people to scale GPUs like never before, and it would provide a big edge over whatever Intel is doing...unless Intel has their own similar tech in the works. Which, again, would motivate Nvidia to respond.
That's how competition works for us, and Intel getting into the GPU race is a great thing for us all. With AMD falling so far behind, Intel can be the one to keep Nvidia on their toes. And Intel can be a very powerful opponent, as they are a massive company that can afford to throw money at whatever they want. That has to concern Nvidia.
Why is Intel getting into this market? It has nothing to do with a desire to sell GPUs. The reason is that Nvidia is butting in on their AI and workstation sectors. Nvidia has become a very serious threat to these areas that Intel dominates. If you watched Huang speak during any of his conferences about AI research, he has preached over and over that using GPUs for research will save money. And generally speaking, he's not wrong! As a result, Nvidia has experienced massive growth in these sectors, by far the largest growth in the company. So Intel is fighting back with their own GPU. In a way, this is similar to Microsoft creating the original Xbox in an effort to keep Sony's PlayStation from pulling people from Windows. If you recall, the PS2 launched with the option to install Linux. So MS's foray into the console market was more about protecting Windows than simply creating a new console. And today, the modern Xbox is practically running Windows 10. Anyway, this is why Intel is producing a GPU. Intel is gunning for Nvidia here; they don't even care about AMD in this market, and have even teamed up with AMD to produce laptop chips that combine Intel CPUs and Vega graphics...one of the weirdest partnerships ever. I'm not really sure what AMD gets out of this deal; I don't see how they benefit much from it when they can combine Vega with their own CPUs.
I'm still betting on multiple SKUs that offer different amounts of memory.
Consider this: Huang gave away 20 special "CEO Edition" Titan Vs to researchers recently. Those Titan Vs featured 32 GB of VRAM, 20 GB more than the launch model! They were exactly the same otherwise. Also, Nvidia has upgraded several Tesla and Quadro models with additional VRAM. They have been increasing VRAM pretty steadily. So I would not be at all surprised if the 1180 offers a premium model with 16 GB. Yes, they have Titan Vs with 12 GB, but those were released last year; they are already old in GPU terms, LOL. Plus they could easily double VRAM or better, like they proved with the CEO Edition. And that is what I think they will do. I predict they will sell new Titan Vs that offer more VRAM soon, so an 1180 with 16 GB would not be overstepping anything. I believe the CEO Edition is a hint of what is to come. By the time the 1180s release, the original Titan Vs will be almost a year old, and that is assuming they still launch this year.
Just an update. I finally got around to working on my pixel-mining rig. I got all the big pieces together with surprisingly little glue and no zip-ties at all! I could have built it cheaper and faster, but I don't like the idea of my GPUs flopping around loose. Now I just have to figure out a way to power it. I had a power supply ready to use, but once I actually looked at it, it turned out to be a 250-watt unit. I don't think that's going to be enough. If necessary, I'll keep one card in my main case and put the lower-power cards in the external case on a separate power supply.
Edit: I swear my cards aren't that dirty. That camera flash adds like 10mm of dust or something.
..."Pixel Mining Rig" I like that, however those as all Radeons which are useless for Iray.
I have a couple milk crates laying around, just cannot afford the GPUs.
No, only one is a Radeon. I'm including it to test in Luxmark, and also because I like to punish myself. The last time I tried Luxrender with an Nvidia and an ATI card at once, my computer just hard reset as soon as the render started. I think that may have been a power issue, though.
I'm trying to get through this without buying any more stuff, especially a power supply. My main system is 850W and I'm not risking it. My others are 650W and 250W. I'm thinking I can leave the 1080ti in my main case where it is now, and use the 650W to power the GTX 770, GTX 460, and the Radeon. It seems like it should work. I'll have to rip that PSU out of my old computer, so that's a job for tomorrow.
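For anyone wanting to sanity-check a split like that, a rough power budget is easy to sketch (the TDPs below are reference-card figures, and the Radeon's is a pure guess since the exact model isn't named; this ignores transient spikes, so treat it as ballpark only):

```python
# Ballpark power budget for the external box: GTX 770 + GTX 460 + Radeon
# on the spare 650 W supply. TDPs are reference-card values; the Radeon
# number is a placeholder guess since the exact model isn't named.
cards_watts = {
    "GTX 770": 230,
    "GTX 460": 160,
    "Radeon (model unknown)": 150,  # assumption
}
psu_watts = 650

draw = sum(cards_watts.values())
margin = psu_watts - draw
print(f"Estimated GPU draw: {draw} W on a {psu_watts} W supply "
      f"({margin} W of margin for risers, fans, and load spikes)")
```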
Oh it definitely will be limited. But I can still run Sickleyield's Iray benchmark and Luxmark. I could run both of those when I only had the GTX 460. This whole project is just for me to satisfy my curiosity. When crypto prices dropped earlier this year, mining hardware was really cheap, so I bought the PCIE extenders. Plus if I ever do win a small lotto jackpot, this will give me a head start on building my supercomputer.
I'm going to start saving my pennies. Hopefully when the 11 series comes out, there will be discounts on 10 series cards.
Just an update. Yesterday was a train wreck. I started testing the setup and my main computer stopped booting. It would turn on but not make it to the BIOS screen. I don't think the new hardware is causing it. Even with nothing hooked up, I can barely get it to boot. I think it's a cable with a loose connection, because it will boot fine one minute and the next time I try it, nothing.
Then I got called away for a family issue and that took the rest of the day and most of the night. Right now my PC is running (without the extra stuff) so I'm just going to leave it alone for now. Maybe I'll mess with it later this week. But it did run for a few minutes. I didn't do any testing, though. I'll start a new thread because this is super off-topic.
The big reveal may finally be coming on August 20th, the GeForce Gaming Celebration. Here is the press release:
The world’s biggest gaming expo, Gamescom 2018, runs August 21-25 in Cologne, Germany. And GeForce will loom large there — at our Gamescom booth, Hall 10.1, Booth E-072; at our partners’ booths, powering the latest PC games; and at our own off-site GeForce Gaming Celebration that starts the day before.
The event will be loaded with new, exclusive, hands-on demos of the hottest upcoming games, stage presentations from the world’s biggest game developers, and some spectacular surprises.
To join in, simply head here, register, and turn up for the start of the festivities on Monday, Aug. 20. Doors open at 5.30pm CET and the fun starts at 6pm. There’s limited space, so make sure you arrive early. Can’t make it in person? Don’t worry, it will be livestreamed.
If you somehow choose to miss Monday’s epic event, the festivities continue on Tuesday at 10am, and run clear through to 5pm. Again, space is strictly on a first-come, first-served basis, so make sure you and your clan members turn up early to go hands-on with the spectacular PC-powered demos of the most-anticipated, yet-to-be-released games.
WHEN
Monday, Aug. 20, 2018
6pm-Midnight CET
Tuesday, Aug. 21, 2018
10am-5pm CET
WHERE
We’re keeping this a secret for now, but we’ll give you all of the details once we get closer to the event. Stay tuned to your inbox and GeForce.com for further info!
------------
Now for some spitballing predictions.
There is talk of not only the 1180, but an "1180+". This does not seem to be an 1180ti, and it would be weird to release such a card this soon. No, I believe the "+" is a reference to the different SKUs I predicted earlier. The 1180+ will have more VRAM and might be clocked a bit faster. Think of this branding as being like the iPhone Plus or Samsung Galaxy+. They are basically borrowing that kind of marketing. At any rate, it is something to keep an eye on.
Will the next GTX series have Tensor cores? And what impact will they have?
I believe that they will have Tensor cores. Why? For several reasons, but the simplest being that CEO Huang himself said he would like to see Tensor cores in gaming. The question you may ask is: what would Tensor cores do for gaming? That should have been answered already: real-time ray tracing that is powered by AI. When Nvidia revealed their new ray tracing demo, they confirmed that while DirectX 12 and past GTX cards will be able to use it, the new Nvidia GPUs will handle real-time ray tracing at the hardware level, NOT in software. While they did not specify what exactly this hardware was, it is easy to conclude that Tensor cores are that hardware.
For gaming, real-time ray tracing has been called the "holy grail" for something like two decades, and that is why it matters. The idea is not new, nor is the promise, but we have never been this close to seeing it become reality for the masses. Remember the Star Wars real-time demo? Keep in mind that this demo was actually done on the DGX Station. The DGX Station is the smaller of the crazy workstations that Nvidia offers. It is a beast, yes, but it is not that far-fetched a beast compared to today's desktop. It only has four Volta V100 GPUs, which are similar to Titan Vs; it's the same chip with the same CUDA and Tensor core counts. The Titan V (and V100) have 640 Tensor cores. The Titan V is a massive card, but by the time the 1100 releases, it will be nearly a year old. It is very possible that the 1180 might have several hundred Tensor cores. The Titan V also has 5120 CUDA cores. I believe the 1180 will have about 4000. It has to be superior to the 1080ti, or it won't sell, since the expected price is pretty high. There is no way the 1180 could be close to the 1080ti in price and not be more powerful. The 1180 also has to set a mark that the next AMD Navi cannot beat in 2019, and there is the off chance that Intel could fire something off early instead of in 2020. What I am saying here is that the 1180 may offer about 25% or more of the power you saw on display in the Star Wars demo, and it could be at your fingertips this September. In just a few years' time, we will see the power used in that Star Wars demo in normal desktops. I predict five years.
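A quick back-of-envelope check on that 25% figure, using the core counts above (this assumes throughput scales roughly linearly with CUDA core count and ignores clocks, memory, and Tensor cores entirely; the 4000-core figure is my guess, not a spec):

```python
# Rough check of the "~25% of the Star Wars demo" guess. The demo ran on
# a DGX Station with four V100s (5120 CUDA cores each); 4000 cores for
# the 1180 is speculation from this thread, not a confirmed spec.
v100_cores = 5120
demo_rig_cores = 4 * v100_cores    # DGX Station: 4x V100
gtx1180_cores = 4000               # assumption

share = gtx1180_cores / demo_rig_cores
print(f"One hypothetical 1180 ~ {share:.0%} of the demo rig's CUDA cores")
```

That lands around 20%, which is in the same ballpark as the 25% guess once you allow for newer, faster cores.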
And Iray will use Tensor cores; they are a core element of the AI denoising feature. Also keep in mind the mind-blowing Iray speed that the Titan V displayed when tested by Migenius. I posted this before, but the test measured in megapaths/sec:
Titan V: 12.07
Titan X Pascal: 5.4
1080ti: 5.3
The Titan V has 33% more CUDA cores than the Titan XP and 1080ti, but it runs Iray calculations far faster than that 33% would explain. No, it is a 124% increase, more than doubling the render power of the previous-generation GPU. The other Volta GPUs also post wild numbers here. And what is different about these GPUs? 640 Tensor cores. And that is why you should be hyped to see Tensor cores in the next GTX lineup. It is my personal belief that the 1100 series will see Tensor cores. I obviously do not know how far down the lineup they will go; will the 1160 and 1150 have Tensor cores at all? I believe they will, at least a few. And if Tensor cores really do perform this well for Iray, then we will be in for quite a treat...whenever Daz Studio gets the Iray update. If you are in the market for a GPU, I strongly recommend waiting it out.
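Those percentages fall straight out of the Migenius figures, if anyone wants to check the arithmetic (the core counts are published specs; the throughput numbers are the ones quoted above):

```python
# Percent gains from the Migenius Iray figures quoted above (megapaths/sec),
# compared against the raw CUDA core counts of each card.
iray_mps = {"Titan V": 12.07, "Titan X Pascal": 5.4, "1080ti": 5.3}
cuda_cores = {"Titan V": 5120, "Titan X Pascal": 3840, "1080ti": 3584}

base = "Titan X Pascal"
speedup = iray_mps["Titan V"] / iray_mps[base] - 1
core_gain = cuda_cores["Titan V"] / cuda_cores[base] - 1
print(f"Titan V vs {base}: +{core_gain:.0%} CUDA cores, "
      f"+{speedup:.0%} Iray throughput")
```

That prints +33% cores against +124% throughput, which is the gap being attributed to the Tensor cores above.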
I also believe that we will start to see regular CUDA core counts actually decrease over the years, as Tensor cores begin to replace them. There is only so much room on a die.
...will wait and see until after the convention.
Pretty much all we can do until then is wait and see what Nvidia has up their sleeves. :)
Case of wait and see at the moment, but these dates are the clearest indication of anything related to the new cards.
We know a new card is coming. Anything else is speculation at best, and much closer to guess-work.
Yeah, until someone can fill out an entry for GTX-1180 in the attached list of benchmark GPU performance in DAZ/Iray, it's all just hype and conjecture.
Well, here is some more "rumor and speculation," and this time it is huge if true. The video below purportedly shows one of the new 1180's PCBs, and I will let you watch it, because "if" the information in it turns out to be true, then those who use Iray will be over the moon. It is the second bit of news, after the new Intel CPU story.
...yep, more speculation.
Eighteen days to go until we get the real story.
But it is still fun to do. And the so-called speculation and guesswork does have a pretty darn good track record of being more right than wrong much of the time. Just rewind to the Pascal speculation: a lot of the leaks were spot on. By the time Nvidia revealed the cards, nobody was surprised, because every single spec had been leaked already. I still remember the people who pushed back on me for saying the 1080 would launch with a single 8-pin connector...and look at how that turned out. Nvidia has done a much better job containing information this time, but we still have some things to go on, and with the reveal now in sight, more leaks are certain. You just can't keep a good secret in this day and age; there's always someone ready to open up on something juicy like this.
There are lots of things to be excited about, and real-time ray tracing is very much...real. Nvidia already announced it and said it was coming to games. This tech will help rendering, too, so it is very much something to look forward to. I'm quite interested in what the "plus" cards will offer, and expect a VRAM bump at least with them. It makes sense...just cram in some more VRAM chips and charge a fat premium. People will buy it. The new NVLink is also quite interesting. Will it really offer the ability to combine VRAM? I say yes; otherwise, what is the point of it? But I believe this will only exist for high-tier cards, like the 1180 or 1170. Or it may be one of the features that the "plus" cards will offer.
To speculate more, as I said before, I think the rise of game engines for rendering will become ever stronger, and now it will be a direct result of these new technologies. Especially Epic's Unreal Engine. Real-time ray tracing will cause a shift throughout the industry. It's not just for games, people; it is for special effects in movies and TV, and rendering in general. The potential for animation is unlimited. Just think: you create a scene and, instead of hours, it renders faster than you can blink your eyes. That is what a game engine does. It renders more than 60 individual frames every second. Or you can do 30 if you just want animation. I know, I know, some of you are already hot under the collar, ready to say "but game engines just aren't there yet," and I hear you. But they are getting closer every day, and the launch of the 1100 series will spearhead this shift ever faster. Every one of us has a certain threshold for what we will accept as "good enough." When game engines reach that point of being "good enough," then more and more people will shift over. The shift isn't just because of technology, either; it is also simple economics. Not only is it cheaper, but for people producing 3D assets, the Epic store is more profitable. Epic gives creators an astounding 88% of every sale in their Unreal asset store. No strings; everybody who sells gets that. Sell something for $10, and you get $8.80 of it. That will lead to the store becoming more and more robust over time, attracting talent from all directions.
With new GPUs coming soon, and 32-core desktop CPUs coming from AMD soon, the future sure does look bright, whether you're a hobbyist or an amateur animator. It's going to be fun.
...well, considering I do still frame illustrations and am not into gaming, not sure if the frame rate benefits justify the high cost of building a new system with funds I don't have.
I am still very skeptical whether they will offer more VRAM and features in consumer cards than their high-end ones have. Gaming is a large source of their income, but they have much bigger fish frying in the skillet as well.