Late 2015 - 2016 Mac product line with 4GB Nvidia GPU CUDA Options
StratDragon
Posts: 3,274
in The Commons
None.

Comments
Yep. Last I checked, Apple only had one computer with an nVidia graphics card, and I think it was their most inexpensive device running OSX (i.e. the cheapest laptop). I'm using Iray because it's easier to get really nice-looking stuff than fighting with the various faked effects in 3Delight, but even though I own the fastest Mac made (other than the trash bin), my renders take over 2 hours as opposed to the 15-30 minutes I would expect from Poser for similar quality. Kinda sucks. I'm curious if Reality will be any better once the new version is released, since they will offer some GPU support that isn't nVidia-specific.
Depending on your model, there are various thunderbolt solutions to connect an external card.
My dream machine was the Mac Pro... Or whatever the hell they might call it now... Even with the weird badass death drone garbage pail look it now has... (In case you haven't seen it, it has a black motherboard... Cool/evil looking)... I was debating saving more money and waiting maybe another year to get a new (Mac) computer... Then I looked into the specs and noticed no more Nvidia... Whaaaa? Then I looked at one of the open case display models and realized... I don't even think a different card would fit... Not that Apple would allow it if it did, it would probably void the warranty... So I bought a Dell and saved lots of money.
Maybe if I win the lottery and don't care how much I spend on a computer I'll get another one, but over the last couple of years, it's like they have been going out of their way to drive certain customers away.
Greetings,
I love my iMac, for doing software development, normal browsing/word processing/etc., and for setting up scenes. And if I've got something else to do, I'll let it run Iray for a little bit. But the real rendering work is done by the crappy, ancient, 8GB of RAM Windows box that sits next to it, gets the scene automatically when I save it (via Dropbox), and has a 4GB Geforce 740 GTX in it. Card cost me $119, IIRC, and it's the third video adapter in the system. (Onboard RAM, a GeForce 8800 (useless for rendering) and the 740). I'd love it if Apple came out with a more powerful Mac that had built-in strong nVidia support, but it's not likely to happen. Apple is making their systems power-friendly, which means using lower-power-use chipsets, and nVidia's just...not. :) At least for render-capable GPUs, the two companies seem to be going in very different directions right now.
-- Morgan
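The Dropbox hand-off described above can be sketched as a small watcher script on the render box: it polls the synced folder and kicks off a render whenever a new scene file lands. Everything here is an assumption for illustration - the folder path, the `.duf` extension, and the `render_cli` command are hypothetical stand-ins for whatever renderer you can drive from the command line.

```python
import os
import subprocess
import time

WATCH_DIR = os.path.expanduser("~/Dropbox/scenes")  # hypothetical synced folder
RENDER_CMD = ["render_cli", "--scene"]              # hypothetical renderer CLI

def new_scene_files(before, after, ext=".duf"):
    """Return scene files present in `after` but not in `before`, sorted."""
    return sorted(f for f in after - before if f.endswith(ext))

def watch(poll_seconds=10):
    """Poll the synced folder and render each newly arrived scene file."""
    seen = set(os.listdir(WATCH_DIR))
    while True:
        time.sleep(poll_seconds)
        current = set(os.listdir(WATCH_DIR))
        for scene in new_scene_files(seen, current):
            # Launch the render and wait for it to finish before the next one
            subprocess.run(RENDER_CMD + [os.path.join(WATCH_DIR, scene)])
        seen = current
```

One caveat with this sketch: Dropbox may still be mid-sync when the file first appears, so a real version would also wait until the file size stops changing before handing it to the renderer.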
You can't get that tiny profile with big fat power-hungry components!
I was getting ready for a trip and decided to copy some stuff up to my work laptop (recent MacBook Pro) and tried to plug a Cat 5 cable into it...oops! Wifi only! I'm surprised my iMac lets me use wires;)
The external GPU looks promising, even if the articles all warn that it's not what Thunderbolt was designed for. It might be cheaper than buying a junky old PC for rendering. I was thinking of maybe snagging one around Christmas, but I wouldn't know where to start with Windows...been way too long.
CypherFox, do you have a feel for how much your $120 card helps with render times?
I really felt let down by Apple when I saw the new Mac Pros. I was looking forward to picking one up, until I discovered that the only option for the GPU was AMD. So what did I do? With the $3,000 I would have spent on a Mac Pro, I built myself a hackintosh: a Titan Z, an i7, etc., and a 4K 32-inch monitor. But I am using an Apple keyboard to type this post!
Are hackintoshes for everyone? I don't think so. It took me a week of playing with parameters and boot options until I got it stable. There were a few times that I almost gave up and was about to switch to Windows. I did get it running, and it's more stable than the iMac I used to own.
I wish they would switch back to Nvidia, or at least give pros the choice of what they need.
Nick
There was once upon a time when Mac was made with the best hardware available. Cutting edge and badass. Today's Mac Pro and MacBook Pro are both crippled: you can't get a 17" screen, you can't put more than a 756 GB SSD inside, and you can't put more than 16 GB RAM in them. This makes Apple equipment nothing more than the Sony of today: cheaply built, already obsolete, and way too expensive for what you get. And that's NOT a good comparison.
I have decided that my next creative content laptop will be an Alienware or Asus gaming laptop running Windows, with 32 GB RAM, Nvidia, and multiple internal SSDs.
And my next workstation will also be a Windows system; probably built on ASUS motherboard with dual Xeon CPU slots, 3 Nvidia GPU cards, and 3-4 TB of internal SSD and HDD storage.
I want to learn development for both Android and iOS. But since Macs are peg-legged, my Mac OS needs are only for iOS development and no content creation, so I don't need an expensive Mac. In fact, the cheapest (or even a used 2-year old) Macbook Air will be more than enough to code for iPad and iPhone and use a browser, so to Apple: You are losing a $3000 sale, continued sales for ongoing support, accessories, and software purchases. And it's your loss more than it is mine, because I can do 95% of my stuff with Non-Apple equipment.
Oh well.
Greetings,
My home Mac workstation is a 2011 27" iMac, 3.4GHz i7. If my baseline is CPU rendering on that box, the 740 in pure GPU mode (no CPU assist) renders 2-3 times faster.
This is just as biased as the nonsense Apple fans spout. I use a late 2013 Mac Pro, 3GHz 8-core Xeon E5, as my work desktop, and for software development it is a killer system. I use Linux, Windows, and Macs on a regular basis. I love deploying on Linux, developing (and day-to-day use) on Mac, and gaming (and rendering) on Windows.
-- Morgan
All well and good, but you need to take two things into consideration.
You're paying for an external GPU chassis, many of which cost more than a capable GPU to begin with, and more importantly you're still bound to Apple's release of Nvidia drivers through OS X updates. El Capitan is being released on September 30, 2015, and subsequently any Mac coming out after that date will not be able to downgrade to anything prior. Apple seems to be phasing out Nvidia cards, so consider this: where would you get drivers? Not from Apple and not from Nvidia. One option may be 3rd-party flashed GPUs, which are usually done for the high-end cards, so you'll pay a premium on top of a premium (e.g. $1150 - $1300 for a $1000 Titan X with Apple firmware) - and Apple may also prevent the card from running through their API. The other option will be Windows 8 or 10 through Boot Camp (Windows 7 and prior are no longer permitted to be installed), or VMware and Parallels, which don't have this limitation (but you need a valid install of Windows).
As of right now you can still get an i7 iMac with a 4GB Nvidia card as a custom build (not available on the 5K), but as that stock runs out, Apple is no longer manufacturing that option in its next product line.
If you were considering a Mac for rendering and you were planning to take advantage of Iray, now is a particularly bad time to do it.
My first real computer was a Mac, for many years I mostly used Macs for work, Macs used to have a real noticeable advantage in efficiency, power and usability over Windows machines... I was a huge Mac fan, I loved my Macs.
But over time they changed... Now, the main difference is in price and design... A little still in the OS... But that tends to mostly be a six-of-one, half-a-dozen-of-the-other kinda thing... The thing that really, really killed it for me is the gradual phasing out of expandability and upgradeability... There is no "buy a cheaper mid-level unit and upgrade what you can afford as you go"... It's "buy what they sell as the top of the line and buy a new one when it becomes too awkward to maintain software upgrades"... Seriously... Take a three-year-old Mac into an Apple Store and they look at you like it's made out of wood and clay... "Where did you get that?"...
The whole thing reads like those old weekday after school specials, where the dorky nerdy kid gets a new haircut and clothes and becomes cool and turns their back on old friends...
My real wish for them is to stop that nonsense and start making computers for real people again, not computers as fashion accessories... Listen to what people want, and stop doing what they feel is cool... I don't see it happening any time soon though.
If you were considering a mac solution, you could get the cuda drivers from Nvidia. Several mac users are using a newer cards like the 980gtx and higher with older macs. This is simply an option for people who have macs but can't upgrade them or don't have a suitable upgrade path. Keep in mind this solution basically uses the card like a processor, not a videocard, so it should be able to transfer the data. You also don't have to use flashed video cards like from years prior; again people have already researched and are using this option.
I have a 17-inch MacBook that I haven't upgraded because none of the newer Macs will suit my needs. Because of the thin form factor, I wouldn't use it for any amount of extended rendering, hence I have a watercooled 6-core PC for that. When I did go into the store for a Mac machine, I walked out of it with an iPad and looked online to build my rendering machine. Right now Apple is positioning their machines as personal/entertainment machines, not really workhorses unless you're paying the $4,000 for the Mac Pro, but that doesn't have the right GPU for my needs, thus I haven't upgraded.
Any legitimate review comparing Nvidia and AMD (AnandTech and Tom's Hardware, to name two) consistently shows Nvidia using less power for the same or better performance. And with less noise, which is the reason I finally ditched ATI, well, AMD.
DirectX 12 early comparisons look different, but time will show those in their true light.
CUDA is software, not a driver; you still need a driver for the card if you plan to use it for anything outside of Iray. All Mac OS releases to present have Nvidia drivers developed by Apple, but as Apple drops Nvidia from their line, it's possible those drivers will not be developed for further generations of OS X, only as legacy GPU support for older systems with Nvidia cards. Without the proper GPU driver, newer cards will be a bottleneck for all other system tasks that require video. The next line of Pros may take a page from the rest of Apple's lineup and not have an Nvidia option at all, and installing one directly would void the warranty. The GTX 980 uses the same drivers as all the 900, 800, 700, 600, 500, and 400 series cards, because Nvidia ships unified drivers across those generations. If Nvidia changes that for future cards, those drivers won't be very Mac-friendly any longer.
If you can get CUDA from Nvidia, I thought it was assumed you could get the driver from them as well. I know you're speculating, but I've already told you that several people have Mac solutions in place and they are rendering.
That said, if for some reason the Apple OS no longer supports Nvidia, then an external solution and Boot Camp would be your answer if you already have Apple hardware. If you're looking for a new Nvidia solution from Apple, then you're probably not going to get what you need.
If you're looking for graphics power, Apple has never been a good solution as their GPUs have always been woefully underpowered and they've always supported the wrong graphics tech when software requires it. That's why they are also a terrible gaming platform.
Edit: The current Mac pros are not expandable at all. Everything is soldered in place and expansions are handled by thunderbolt cables.
The CUDA software is the one thing Apple has not wrestled from the hands of Nvidia. Apple won't let AMD or Nvidia touch hardware drivers, and Apple users are often woefully out of date with drivers, since OS X releases don't update them with the regularity that Nvidia and AMD provide to Windows and Unix/Linux users. I'm not looking to start a fanboy fight, or an "I told you so," or anything like that; I've used both Windows and Mac for my 3D needs for years, but lately it's been less and less on the Mac, as Apple has made some IMHO crazy-ass decisions about what they want in their high-end computers while ignoring what power users want. This is not a knock on AMD either. AMD makes excellent cards, and I would love to see some technology that allows 3D artists to take advantage of those cards, especially Studio users who want to remain on the Mac, or who don't have or don't like Nvidia being the only GPU solution for Iray.
Well, if you look at the signs, AMD/OpenGL/OpenCL is going to be in the same boat as Apple works on its Metal API...
http://www.extremetech.com/gaming/207767-apple-brings-its-metal-api-to-os-x-10-11-kicks-vulkan-to-the-curb
So I think your argument may be moot, as OpenCL gets tossed as far as support is concerned as well. OpenCL really isn't as mature as CUDA and is more buggy, so that's probably why Apple is going its own way and picking hardware depending on who gives them the better deal. So you may end up rendering with the CPU anyway on Macs.
SSGBRYAN, that picture tells it all for me. The technological improvements from Apple have slowed to a tiny trickle, so now we have to hook up all our devices with external cables and oh yeah, did they tell you that each device needs its own access to AC power? Surprise!
Just 4 months ago Apple introduced an iMac with a 780M with 4GB of RAM in it. (Released at the same time as the 5K iMac.) Now it is no longer available.
If you want Iray on a Mac using an NVIDIA card, then it is a 2012 Mac Pro with a flashed video card, or that iMac (watch the refurb section of the Mac Store; one may show up).
http://www.amazon.com/Apple-Mac-MD770LL-Desktop-VERSION/dp/B00747WW9E Then someplace like http://www.macvidcards.com/store/c2/Nvidia_GPUs.html
Paolo C over at Reality was one of the "horn blowers" on OS X 10.10's damaged implementation of OpenGL tools. He claims Apple responded with "it will be fixed" but didn't specify a time; as of 10.10.5 (the most recent) it's still the same set of faulty tools. It's possible 10.11 will be the timeline where they address it, but OpenGL is old tech; it's possible they just wanted him to shut up.
I don't know if the drivers would work on other models but this is what is stated on the link:
This driver update is for Mac Pro 5,1 (2010), Mac Pro 4,1 (2009) and Mac Pro 3,1 (2008) users only.
It also requires the user to update to 10.10.3, so I don't know for sure whether subsequent OS X 10.10 updates will install over these drivers.
The update is for the following cards: GeForce GTX 680, GeForce GTX 285, GeForce GT 120, GeForce 8800 GT, Quadro K5000 for Mac, Quadro 4000 for Mac, Quadro FX 4800, Quadro FX 5600
If I use Nvidia's page to get drivers for the 960 or Titan for Mac OS, they don't have anything. The GTX 680 and the Quadros are the only 4GB options for this update, and the GTX 680 is over three years old.
Love the pic btw, I'm getting as much use out of my dual Xeon space heater as I can before it comes down to the black toilet.
From what I have personally seen, and from the Beta Test team, The NVIDIA drivers that are with the Apple updates work fine, even for Mac flashed NVIDIA cards. The issue seems to be CUDA drivers which are hit and miss from Apple, so always get those from NVIDIA.
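Since the CUDA driver is the piece you install separately from NVIDIA, a quick way to check whether it's present is to probe the driver library directly. This is a sketch, not a supported tool: it uses `ctypes` to load the CUDA driver library and call its `cuDriverGetVersion` entry point; the default library path is an assumption based on the usual OS X CUDA install location and may differ on your system.

```python
import ctypes

def cuda_driver_version(libname="/usr/local/cuda/lib/libcuda.dylib"):
    """Return the CUDA driver version (e.g. 7050 for CUDA 7.5), or None
    if the driver library is missing or the query fails."""
    try:
        lib = ctypes.CDLL(libname)
    except OSError:
        return None  # no CUDA driver installed at that path
    version = ctypes.c_int(0)
    # cuDriverGetVersion returns 0 (CUDA_SUCCESS) on success
    if lib.cuDriverGetVersion(ctypes.byref(version)) != 0:
        return None
    return version.value
```

On Windows the library would be `nvcuda.dll`, and on Linux `libcuda.so`; a None result just means that particular path had no working CUDA driver behind it.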
My 2012 i5 iMac has a very paltry 1GB nVidia card. I tried the CUDA drivers for a while, but they destabilised the system so much (crash to reboot 4 or 5 times a day even without DS running. Lost me many hours worth of work.) that I got rid of them.
If I want IRay renders fast, I'll have to invest in a secondary Windows box, with a 980ti or whatever. Dunno where I'll put it though. Till that day, I just leave IRay to render overnight.
"Do this, do that, wear feathers on your butt, dance with a dead chicken in a thunderstorm, turn left 5 times, turn right 3, then gargle peanut butter while standing on your head and simultaneously singing any Justin Bieber or Miley Cyrus song with a ukelele and a Tiny Tim voice, and it will work!"
That's exactly what this sounds like, all this futzing around with special drivers and using specific old machines. I mock that, and I mock what Apple has become. What ever happened to "it just works?" Oh wait, we have that already with Intel and Windows and just about any motherboard from Asus or MSI and just about any new or 2 year old graphic card from Nvidia.
It's starting to feel to me like Nvidia and Microsoft are the "new Apple". I think Apple is far too distracted by that fancy new space ship they're constructing and they've taken their eye off the business!
My concern as well. My Windows box is pretty friendly when it comes to making it do what I need it to do. My Mac is becoming that accursed puzzle box from Hellraiser when I try to get it to do something the developers never intended. That opens a can of worms, especially for anyone who uses the Mac for more than one specific task.
To add insult to injury, if you now buy a Mac and add your own RAM, you've voided your warranty - and that's if you buy a Mac that did not have the RAM soldered to the board so it cannot be removed and upgraded later, which they've been doing for a while. So RAM upgrades need to be done by Apple, for a premium on top of their inflated RAM cost at the time of purchase.
And....BOOM.
If AMD gets bought by Microsoft, then Apple will be paying Microsoft for AMD GPU cards in all those Macbooks and Mac Pros.
Can you see the "I'm a PC vs. I'm a Mac" commercials after THAT one?
Irony is sometimes SOOOOOO........ironic.
But the fact is, AMD is a zombie company, and has been one since codename Bulldozer disappointed twice by becoming Piledriver. If Microsoft is smart, they'll turn this one down. Remember, ATI got bought by AMD back in the day, and now that whole crapstorm is going down the drain fast. Microsoft would do well to avoid this giant vortex of a swirly.
Macs have always been pretty unexpandable when it comes to upgrades. Now you can't even do the hard drive or memory. It's a moot point since I've always maxed everything out, but with things using specific types of hardware, these machines won't meet everyone's needs. I'm doubting I'll buy another Mac; I'll just get an iPad Pro.
Yeah, I too could be in the market for another iPad. I have both an iPad 3 and an iPhone 6 Plus, and I'm thinking to myself that an iPad Air or Pro would be wonderful. The decision could hinge on whether or not there are a lot of accessories (keyboards, protective folios, etc.) for the Pro. The form factor of the Air would be most compatible with all the stuff out there such as music effect units, recording gear, and so forth. Plus, I can hide the iPad and iPad Air (or the mini) sizes very easily if I'm just out running errands and need to chuck the device into the glove box or something. Can't do that with the big iPad Pro.
I bought my first IBM PC (note I say IBM) back in 1995; I've had a PC of some sort since 1981. In 1995 I was considering which to switch to, as my Amiga (4000) had just died; I spent a lot of time comparing the PC and Apple, and decided on the PC because of the upgradeability options, even though the software I needed was better on Apple, but I could get by. For the first five years I regretted that decision. For the last five years, I've been happy with my choice. And after reading this, along with other articles and threads, I'm relieved.