Nvidia 10xx series cards
AJ2112
Posts: 1,417
I contacted Nvidia about compatibility of the 10xx series cards. The rep states that Maxwell and Quadro series cards support Iray, but the Pascal series does not. The rep answered my questions, but also left me clueless, Lol!!!!! I'm seriously holding off on purchasing a 980 Ti in favor of a 1080 card. The GTX 670 works outstandingly well, but the problem is I'm planning to upgrade to dual 4K monitors, which the 670 cannot handle.
Anyway, what are the Maxwell, Quadro, and Pascal series cards? Thanks for the feedback.
Edit: Sorry, never mind, I learned the differences on the Nvidia website.

Comments
...Maxwell is the name of the previous GPU architecture; Pascal is the new one. Pascal GPUs are more efficient than the Maxwell ones when it comes to power consumption and heat production. They are also touted to be faster, though I have yet to see any benchmarks for 3D CG software.
Quadro is just a brand name for their professional series GPUs (which are much more expensive than the GTX cards). These are designed more for high-end CGI production, compute purposes, and deep learning (AI), rather than gaming like the GTX cards are.
Hi Krypto, thanks so much for the info. I saw $5K cards and thought, yikes! Lol!
Any mention of whether they plan to add Pascal support to Iray? There's no way I am going to be able to afford one of the pro series cards. I will just stick with Reality/LuxRender for now; it runs on anything with an OpenCL driver.
The Pascal architecture should be getting Iray support very soon - the SDK was released as a non-beta over the weekend, and the change log shows it has been added to DS as an internal or private beta. We don't, of course, know how much work there will be until we see a public beta (or a release version), but things are moving now. If you don't need to buy your new card this week, I would wait a bit.
Many thanks, Richard! That is good news. I would like to be able to get a card with 8 GB or greater (preferably 12 GB or greater) and use it for all render engines. With the Pascal series, this may be affordable.
soon
so͞on/
adverb
in or after a short time.
Nvidia soon
adverb
/ɪnˈvɪdiə/ in-VID-eeə so͞on/
see Daz soon.
Daz soon
adverb
/dhaz so͞on/
Definition coming soon!
You forgot the ™ on Daz Soon™
Dang!
Before plunking down money for big monitors, read this thread on the Microsoft forum:
http://answers.microsoft.com/en-us/windows/forum/windows_7-hardware/windows-7-dual-monitors-turning-off-a-monitor/e0b8bb93-5960-4273-b959-4e3177946f8d?page=7
Jumping to DP or HDMI dual displays causes problems with a Windows 'feature'. I now have to move all my palettes back to the secondary monitor every time the computer wakes up. Daz isn't so bad, as it's only four palettes, but all my Adobe programs have a bunch of open palettes, and moving them each time is a pain. So far there's no fix.
I've got a 10xx card here and my other Iray software doesn't support it yet.
The mods in the forums over there were still saying Pascal (10xx series) support would arrive around October, last I looked.
Hopefully sooooooon if not already.
Then I guess it needs to get updated in Daz.
EW
The SDK has been released, and the Change Log for DS shows it has been added to an internal or private beta. It's not possible to say how long it will take for that to turn into a Public Beta, but a substantial step along the way has been taken.
That's great to hear.
EW
At this point any frustration should be directed toward Nvidia, not Daz. Not saying anyone is complaining, just that we can't fault Daz for Nvidia's failure to get Iray ready for Pascal.
Thanks for the feedback, friends. The latest update is that the Nvidia rep apologized and states the GTX 10 series cards are not compatible with Daz Studio; however, it's up to Daz's application developers to add compatibility.
The rep also states that Maxwell and Pascal cards are specifically designed for gaming, while the Quadro line is specifically designed for 3D graphics.
Hmmm, very interesting. I've always wondered why old Nvidia cards work as well as new ones. Other than benchmarks, there were never really any technical details on the differences. From bench testing, I noticed cards with similar CUDA core counts perform almost the same. My GTX 670 was just as fast as 900-series cards with the same number of CUDA cores, and faster than cards with fewer. I intended to purchase a new 900-series Nvidia card, but walked away with the GTX 670. Now I'm awaiting the GTX 10 series, though according to the Nvidia rep it would not make a major difference, but a Quadro will.
After my latest custom build, I noticed my computer was super fast. That's the reason I switched over to Reality, then back to Iray when Iray was introduced. I have an AMD 8-core CPU.
Also, my Asus motherboard's chipset makes a difference in graphics; my motherboard has the latest 990FX chipset. I made sure of that, because over the years I did a lot of video editing and noticed that chipsets made a difference, and many motherboards had onboard graphics. I also spoke with Asus about chipsets and SLI; they stated that Nvidia SLI is supported. I don't do any major gaming on my computer, but I do play Microsoft Flight Simulator. My GTX 670 easily handles it on dual 24" monitors at 1080 resolution.
Anyway, I'm certainly no expert, but I believe there's a lot more going on in how computers process 3D graphics than just the graphics card.
Thanks, friend. I'm using Windows 10 and thus far have had no issues with dual monitors. A buddy of mine runs dual 4K monitors on Windows 8, and they work just fine. From what I can remember, the only gripe he had was changing resolutions.
Glad to know that an Nvidia rep apologized, because the delay in Iray implementation in any program is entirely their fault. It is ultimately up to Daz to implement updates in DS to use Iray, but Daz can't be faulted for Nvidia only passing them the ball now. Pascal cards have been out for 5 months now; the delay is inexcusable. Pascal Quadros are just as useless as Pascal gaming cards for Iray purposes, and Quadro cards are a waste of money for Iray anyway.
What the rep says is actually true. Somebody did some video game benchmarks with super expensive Quadro cards and found that they really were not that great for gaming relative to their specs. You -CAN- game on a Quadro, but compared to a GTX card at a similar level, the GTX will win almost every time. And you can compute with a GTX, but a Quadro may well be better, depending on what the compute task is.
You have to think of these cards as being well tuned for their given tasks. A big pickup truck has a lot more power than most cars, but a quick and speedy car can outrace it on the track easily. Quadros are the pickup trucks, the workhorses of the GPU world. The GTX series (the Maxwells and Pascals) are the race cars of the GPU world. Video games are like races, while GPU computing is more like working on the farm. That's the best analogy I can give. It would be fantastic if they could do both super well, but that's how things have gone. AMD does this too; they have cards specifically made for workstations that compete with Quadro.
And to be clear, video games are Nvidia's bread and butter. Anything they do is dictated by gaming, first and foremost. Iray is very much not a priority for them. So keep that in mind.
GPUs are not just hardware, but also software. They have different software that does the complex math calculations differently. And while the 670 has 1344 CUDA cores, the CUDA cores in newer cards are better optimized and more efficient at their job. In other words, you cannot directly compare the CUDA core counts of GPUs across generations; that comparison only applies to cards of the same era. The 670 was an absolute BEAST in its time; I have one myself. It is still a very capable card and can game at 1080p in many modern games. I would say it is a toss-up between the 670 and the 960: the 670 can do some things faster, while the 960 can do other things faster. The 1070 completely destroys the 670 at gaming. But the 1070 cannot do Iray yet.
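By the way, if you ever want to check which generation a card actually belongs to, you can query its CUDA compute capability: Kepler cards like the 670 report 3.x, Maxwell cards 5.x, and Pascal cards 6.x. Here's a rough Python sketch using the pynvml bindings; treat it as illustrative only, since it assumes the bindings are installed and that your driver is recent enough to expose the compute capability query.

```python
# Rough sketch: list each GPU with its CUDA compute capability.
# The major version maps to the architecture generation:
# 3.x = Kepler, 5.x = Maxwell, 6.x = Pascal.
# Assumes the pynvml bindings are installed (pip install nvidia-ml-py).
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetName,
                    nvmlDeviceGetCudaComputeCapability)

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        handle = nvmlDeviceGetHandleByIndex(i)
        name = nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        major, minor = nvmlDeviceGetCudaComputeCapability(handle)
        print(f"GPU {i}: {name} (compute capability {major}.{minor})")
finally:
    nvmlShutdown()
```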
It remains to be seen exactly how the 1000 series will do with Iray. All we have are theoretical benchmarks, and I don't think specs are the only story here. I am hoping that the Iray update is more than simply driver support; I am hoping that Iray is updated to be more efficient itself. Because, to be frank, Iray kind of sucks. It is very poorly optimized software. For example, the stuff you cannot see in a scene STILL takes up memory. A character with their mouth closed? Iray is still wasting memory on textures for their teeth, inner mouth, and tongue! Do you have a car in your scene? If that car has an engine, then Iray keeps the entire engine in memory, even if the hood is closed. It is best to delete the engine if you do not use it, to save memory for Iray. But I think that is stupid; Iray should be smarter than that. Nvidia has developed technology that allows exactly this kind of thing... but for VR gaming. In the 1000 series, if you play a VR game, the GPU will actually take unseen elements and drop them from computation. And it does this more than 90 times per second!!! (Or whatever your frame rate is.) But Iray cannot do this in a standalone scene that is not even moving??? POPPYCOCK.
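To put some very rough numbers on that (my own back-of-the-envelope math, not anything measured from Iray): an uncompressed 8-bit RGBA texture costs width x height x 4 bytes, so even a few maps for surfaces you never see add up quickly.

```python
# Back-of-the-envelope VRAM cost of textures for hidden surfaces.
# My own rough math, not measured from Iray; real renderers may
# compress textures or add mipmaps, which changes the numbers.
BYTES_PER_PIXEL = 4  # 8-bit RGBA

def texture_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL):
    """Uncompressed size of one texture, in megabytes."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# Hypothetical example: a closed mouth with diffuse, bump, and specular
# maps for teeth, inner mouth, and tongue at 2048 x 2048 each.
surfaces, map_types = 3, 3
per_map = texture_mb(2048, 2048)
print(f"One 2048x2048 RGBA map: {per_map:.0f} MB")  # 16 MB
print(f"{surfaces * map_types} unseen maps: "
      f"{surfaces * map_types * per_map:.0f} MB")   # 144 MB
```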
And that is just ONE thing that Iray does horribly wrong. It should also be able to delegate tasks to the CPU and share memory with system RAM; that way, if your scene exceeds your GPU's VRAM, you are not stuck in CPU-only mode. Yes, I get that this would be slower than pure GPU, but IF you exceed VRAM, this would surely be faster than pure CPU! And besides, Octane figured out how to do this; it is not impossible. I could go on... but I won't.
At any rate, if Nvidia introduced these features to Iray, it would be amazing. Iray is still pretty new, and it is reasonable to think that it can and will become better optimized over time. Of course, Daz 3D has to update DS to take advantage of any such update to Iray.
Hi Outrider, thanks for the excellent explanation. I certainly learned something new reading your post.
...yeah, unless you are a professional CGI designer or a perfectionist artist geek like me, you don't need a Quadro.
...+1
...I'm running dual displays on W7 with a Fermi GTX 460.
The problem is not with the number of displays; it's with the connection. The older (but still widely supported) DVI works fine. It's the newer DP (DisplayPort) or HDMI connections that cause problems. Because both of these newer connectors include a line that detects whether or not the monitor is on, Windows will move apps or open windows from the one that is off to the one that's on, or to the main one. When the computer wakes up from sleep or hibernation, your open programs and windows are all stuffed onto the monitor you've designated as your primary monitor.

To the average (not terribly computer-literate) user this is an advantage, since they'd have no idea what had happened to their stuff when (for example) they undocked their laptop from the workstation with its larger secondary monitor. To graphics folks and other power users, it's a huge pain in the butt.

If you read the whole thread you'll find many workarounds and suggestions. None of them are a surefire fix, and unfortunately, none worked for me. Our only recourse is to run Linux, or complain (bitterly!... vehemently!) to Microsoft and the hardware manufacturers.

This problem cropped up recently on my home setup when I added a GTX 960 card. It has only one DVI connector, which won't support the 2560 x 1440 resolution of my nifty new BenQ GW2765 monitor, and I needed the DVI connection for my older secondary monitor anyway.
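For what it's worth, one partial workaround I've seen suggested is a script that snapshots your window positions before the machine sleeps and puts them back after it wakes. Here's a rough Python sketch using the pywin32 package; the save-file name and the idea of matching windows by their exact title are my own assumptions, so treat it as a starting point rather than a fix.

```python
# Rough workaround sketch: save the positions of visible top-level
# windows, then move them back after the machine wakes.
# Assumes the pywin32 package is installed (pip install pywin32).
import json
import win32gui

LAYOUT_FILE = "window_layout.json"  # hypothetical save location

def snapshot_layout():
    """Record the rectangle of every visible, titled top-level window."""
    layout = {}
    def record(hwnd, _):
        if win32gui.IsWindowVisible(hwnd):
            title = win32gui.GetWindowText(hwnd)
            if title:
                layout[title] = win32gui.GetWindowRect(hwnd)  # (l, t, r, b)
    win32gui.EnumWindows(record, None)
    with open(LAYOUT_FILE, "w") as f:
        json.dump(layout, f)

def restore_layout():
    """Move any window whose title was recorded back to its saved spot."""
    with open(LAYOUT_FILE) as f:
        layout = json.load(f)
    def move(hwnd, _):
        title = win32gui.GetWindowText(hwnd)
        if title in layout:
            left, top, right, bottom = layout[title]
            win32gui.MoveWindow(hwnd, left, top,
                                right - left, bottom - top, True)
    win32gui.EnumWindows(move, None)

if __name__ == "__main__":
    snapshot_layout()  # run before sleep; call restore_layout() after wake
```

Matching by exact title is fragile (apps like Photoshop change their title bar with the open document), so a real tool would match on window class or process name instead; still, it shows the general idea.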
The problem is compounded when the screens are of differing sizes and/or resolutions. For instance, when the Cintiq goes offline, Windows moves the windows from there to the main display and completely distorts the work. Complete garbage.
Kendall