Good graphics card for Daz3d?????
in The Commons
I just got a new PC that can handle Daz 3D a little bit, but it does get slowed down. I have a Dell Inspiron with an i5; it has Intel graphics on board right now. What would be a better graphics card for this computer so I can run Daz 3D better?

Comments
You need an NVIDIA card with as many CUDA cores and as much VRAM as you can afford. I would get a card with at least 4GB of VRAM. The GTX 980 Ti is a good card for its price. The 10xx series cards aren't supported by Iray yet, but once they are, they'll probably be the way to go.
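If you're not sure what NVIDIA hardware (if any) a machine already has, here's a rough sketch of a way to check. It assumes the NVIDIA driver (which bundles the nvidia-smi tool) is installed; on an Intel-graphics-only machine like the OP's it will simply report nothing:

```python
# List installed NVIDIA GPUs with their total VRAM via nvidia-smi.
# Assumes the NVIDIA driver (which ships nvidia-smi) is installed.
import subprocess

def list_nvidia_gpus():
    try:
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=name,memory.total",
             "--format=csv,noheader"],
            text=True)
    except (OSError, subprocess.CalledProcessError):
        return []  # no NVIDIA driver or GPU found
    # Each line looks like: "GeForce GTX 980 Ti, 6144 MiB"
    return [line.strip() for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    gpus = list_nvidia_gpus()
    if not gpus:
        print("No NVIDIA GPU detected - Iray will fall back to CPU rendering.")
    for gpu in gpus:
        print(gpu)
```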
Yeah, with only an i5 you're going to need a really good NVIDIA graphics card. It depends on how much patience you have. My GTX 970 4GB is just too slow for my liking these days, so I'd go for at least a GTX 980, or a 980 Ti like LayLo suggested - especially if you plan to use Iray.
But it all depends on whether you have other things you can go and do while the render is cooking. Having 2 PCs also helps.
Also, if you're happy to wait hours for a render to finish, or you only plan on doing simple scenes with HDRIs etc., then you might be able to get away with a lesser card - there are many 'ifs' and 'buts' to this question.
I'm guessing from the system specs you posted that you probably have a 450W - 500W power supply (PSU). This will limit your options quite a bit, as either a GTX 970 or GTX 980 will require a larger power supply. If you need a card now, a GTX 960 SC (Super Clocked) with 4GB would be a good option. If you can wait until the 10xx series is supported by Iray in DS, in theory a GTX 1060 with 6GB would be a better option, or you may even be able to go with a GTX 1070, thanks to the 10xx series' lower power requirements.
A good EVGA GTX 960 will be in the $200 range, and the 1060 should be a similar price (probably a bit more for the 6GB version).
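To sanity-check a PSU against a card yourself, compare its rating to NVIDIA's recommended minimum system power for each card. A quick sketch (the wattage figures below are the launch-spec recommendations as I remember them - double-check the product page for your exact card):

```python
# Compare a PSU rating against NVIDIA's recommended minimum system power.
# Wattage figures are the launch recommendations from memory;
# verify against the product page before buying.
RECOMMENDED_SYSTEM_WATTS = {
    "GTX 960": 400,
    "GTX 970": 500,
    "GTX 980": 500,
    "GTX 980 Ti": 600,
    "GTX 1060": 400,
    "GTX 1070": 500,
}

MY_PSU_WATTS = 450  # the supply guessed at above

for card, needed in RECOMMENDED_SYSTEM_WATTS.items():
    verdict = "fine" if MY_PSU_WATTS >= needed else f"needs a {needed}W+ supply"
    print(f"{card}: {verdict}")
```

On a 450W supply that leaves the GTX 960 and GTX 1060 as the comfortable choices, which matches the advice above.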
When are they going to make video cards that we can add RAM to, like motherboards?
When pigs fly. It would hurt business. Why upgrade to a new card when you could just add more VRAM?
When they figure out how to eliminate the delay caused by sockets on the GPU logic board, because as of now that would hurt the card's overall performance. Also factor in the instability that third-party RAM would introduce, along with keeping the manufacturing cost down for the additional components an upgradable card would need.
So when that happens. :-)
I would get an NVIDIA card with 6GB or more. You really can't do much with 4GB except your usual portraits and a few props. I know - I have a 4GB card and I am dying to get one with more RAM.
When a company comes along and starts doing it. When the others see the demand they will jump in. If the execution is excellent that first company will be sitting pretty in the market as the one to beat.
Right now I'm hoping to make do with 2 970s. Only 4 GB.
Down the road, as cash flow permits, hoping to upgrade to 980 Tis or maybe even a Titan X. Depending on how the market shakes out.
I am holding out for a decent 1080, for when the price settles down below list and Iray is finally supported 100%.
It isn't so much the delay - that can be handled. It's the variance in signal propagation that is added by the module contacts, along with the increase in resistance. Even the best inline sockets add variable amounts of resistance and impedance to each line. As long as each is EXACTLY the same, you can make do and compensate, but rarely is nature so forgiving. It IS possible to add RAM to a high-performance system like a GPU card using the right type of packaging. A SIMM/DIMM (Single/Dual Inline Memory Module) layout won't work well for the reasons above. However, going with a BGA (ball grid array) would easily handle those issues; but the cost to create and then sell the modules would push the price too high for the consumer ROI to make sense. There are other electronics reasons why it isn't conveniently done, but I'm sure most don't want to read them.
Kendall
They used to, back in the early 1990s, but that was when GPUs were a whole lot slower than they are now.
Speed means nothing. Things are impossible until a company really decides to tackle the issue. Then all of a sudden a new market opens up. A lot of things were said to be impossible, including the iPhone. Now look at that market. I have been around computers since the early 80s. Never say never, or that it is impossible.
Does the power supply factor in with a 3-monitor setup?
I heard Windows 7 can use 3 monitors. I was looking at GT cards with dual HDMI, plus the HDMI on the motherboard. Is this a recipe for 3 monitors?
Library on the left, preview in front, panes on the right. (Dream.)
Monitors generally have their own power cords. So no, their power is not fed from the Power Supply in the PC.
Yes, it is possible to set up dual monitors on one card and another on an integrated adapter - as long as you can get the motherboard and drivers to cooperate...
The GTX 960 and higher, as well as the Quadro K series and newer, can drive as many as 4 monitors. Some GT 610s can also do 4 monitors (Galaxy made one, and I own one).
Some AMD cards can go to more than 6 monitors, but this does no good for Iray.
Kendall
Every time I look at those setups it reminds me of the Mayan 'astronaut':
It is my understanding that a graphics card can only output to one HDMI port. I am running three monitors at home on a 980 Ti, using the three DisplayPort outputs on the card. Note I have them set up as one monitor. :) (NVIDIA Surround.) Nice peripheral vision for gaming. :)
If you are going to get a bit crazy with chairs, though, I prefer the Emperor 1510. :)
If I had that desk, a cup of hot cocoa, and a stuffed animal...
...nitey night rabbit.
Man, some of those desks look like they have more material used to make them than a motorcycle does.
Hmm, those workstations make me think there should be mysterious glowing egg things nearby.
I took up the black arts around the same time you did. At the time, the most powerful mainframe computer on the planet had 256 megabytes of main memory. My mobile phone now has four times that and it can be expanded to sixteen times.
Cheers,
Alex.
For the record, no one said the iPhone was impossible. Sharp was selling a PDA called the Zaurus that had almost all of the features of an iPhone (and some features the iPhone STILL can't match) long before the iPhone was announced. There was even a plugin card that allowed a Cell SIM to be installed to turn the Zaurus into a cell phone. I have both and was using both (can provide photos) for business.
Kendall
This is the problem with mentioning stuff we read years ago. The touch-screen idea had already been around for certain uses; I guess it comes down to how people view current technology. The iPhone was innovative in this area and opened up the market - or so we are still led to believe.
Would upping the system RAM from 8GB to 16GB help out with the rendering? I was thinking about getting this card: https://www.amazon.com/EVGA-GeForce-2048MB-Graphics-02G-P3-2619-KR/dp/B00847TPH0/ref=sr_1_10?s=pc&ie=UTF8&qid=1469754063&sr=1-10&refinements=p_89%3AEVGA My computer can't handle the best graphics cards out there, so I'm just trying to find a decent one that can handle Daz 3D better than the built-in Intel graphics I have now.
I think most NVIDIA video card and Iray users are in agreement that you need to start out with a 4GB card for the most part. I think, though, if you do nothing but render a single model with an HDRI background, 2GB will work for you.
The 16GB will help with CPU rendering - you will be able to render a heftier scene. GPU rendering will be limited to the memory left free on the card, which will be less than its full 2GB.
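If you want to see how much of a card's memory is actually left free for Iray before you start a render, here's a small sketch (same assumption as earlier in the thread: the NVIDIA driver's nvidia-smi tool is installed):

```python
# Show how much VRAM is free for Iray versus already in use.
# Assumes the NVIDIA driver's nvidia-smi tool is on the PATH.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=name,memory.total,memory.used,memory.free",
     "--format=csv,noheader"],
    text=True)

for line in out.splitlines():
    name, total, used, free = [field.strip() for field in line.split(",")]
    print(f"{name}: {free} free of {total} ({used} already used by the desktop)")
```

If the scene doesn't fit in that free VRAM, Iray drops back to CPU rendering, which is where the extra system RAM helps.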
Apple is the absolute best at taking other companies' ideas, making them "look good", and then claiming them for themselves in spectacular Al Gore fashion. From taking Xerox PARC's "mouse & GUI" and making it work on the Lisa/Mac, to making "desktop publishing" look sexy, to taking the lowly PDA and repackaging it into an integrated "phone". None were original, but all were marketed in a way to make them seem new and innovative.
Kendall
Another good example of this is Frank Soltis's hardware-independent machine interface architecture. This was first implemented by IBM on their System/38, way back in the late seventies. A cruder version of it turned up in the form of the Java Virtual Machine, and then Microsoft copied the idea when it introduced its .Net architecture.
Cheers,
Alex.
I wouldn't make a decision until 10-series cards support Iray.
This should be soon - as in weeks or a month or so, and hopefully sooner.
In all honesty, I wouldn't recommend 9-series cards unless you are getting an exceptional deal, and I'm not seeing any. Especially when 10-series cards could offer much better performance. The trouble is, we can't be certain until Iray support arrives. We know that NVIDIA has updated the SDK, but that has only just happened (in the last couple of days), so 10-series support is 'soon'.
10-series cards also offer more RAM, another plus.