Intel-Based Mac Pro Gets New Graphics Cards
Over at Six Colors, Jason Snell shares the news that Apple has released three new graphics card modules for the Intel-based Mac Pro, all using AMD’s Radeon Pro W6000 series GPUs. The details and stratospheric prices are likely of interest only to particular professional customers. The news will also continue to disappoint machine learning researchers who have long been hoping for a rapprochement between Apple and GPU-maker Nvidia. However, it’s notable that Apple continues to release upgrades for the Mac Pro, perhaps suggesting that an Apple silicon Mac Pro isn’t coming soon.
My son is just about to order a PC with an NVIDIA card, about 5k in all, which his main graphics package, Cinema 4D, supports. He’s only ever used a Mac, but needs must, I suppose; his current setup, an M1 Mini and an i9 MacBook Pro, just doesn’t have the power. It’s a real shame that Apple hasn’t indicated a road map for Apple silicon, especially at the higher end. There’s a lack of clarity around working with hardware other than Apple’s, and while the system-on-a-chip approach is impressive, power users with high-end graphics needs have to know the path forward if they are to invest in Apple hardware. Software developers will lag behind too: support for plug-ins and renderers for C4D on the Mac lags about a year behind the PC.
Can he wait a few months? Apple should announce its annual September and October extravaganzas soon, and new Macs with the next generation of the M series are anticipated. New MacBook Pros, a super-powered Mac mini, and new iMacs are expected. And rumor has it that a new, less expensive version of the Pro Display is in the works:
My fingers have been crossed for an M-series MacBook Pro for so long now….
I hear you. The whole Apple/NVIDIA fallout was ridiculous. And keeping it up for so long was just petty.
Apple lost a bunch of pro sales in academia because of it. We replaced several systems that used to be Mac Pros with Linux boxes for no other reason than that, for certain scientific computing, it’s NVIDIA and nothing else, because of CUDA, of course. When Apple said Radeon/OpenCL only, they lost those workstation orders. Sure, Apple might be able to live with that. But there’s a strong contrast between such a move and how seriously academia and scientific computing used to be taken at Apple.
While I love the new M1 systems, and I know I’ll definitely be getting an M1X 14" MBP the day it comes out, I believe those whose daily work presently relies on GPUs, especially dedicated GPUs on PCIe, are the least certain about what Apple silicon means for them. It’s anything but certain that a new AS-based desktop Mac will accept third-party GPUs. Likewise, it’s hard to imagine that Apple has some kind of secret graphics core hidden away that will allow the Mac mini (or a future Mac Pro) to exploit on-board graphics at a level that immediately and completely obsoletes the dedicated professional GPU solutions out there today. Combined, that makes for a lot of uncertainty in this market segment.
But perhaps worse yet, in terms of having to choose a GPU-heavy setup right now, is that Apple has so far given every indication that such systems (Mac Pro, perhaps iMac Pro, very unlikely a high-end Mac mini) will be the last to transition. Considering they’ve so far only done the low end, and that the mid-field machines (MBP, 27"-ish iMac) are only up for AS introduction late this year, it could easily be another year until we see how this last piece of the AS transition plays out. If you need an NVIDIA solution now and can’t afford to wait until at least spring 2022 (which is already quite a gamble), I’d say a Mac is basically out of the question.
Personally, I think Apple and Nvidia need to make up so my son Tristan can stop complaining about how he can’t use Macs for his machine learning research purely because of the GPUs and CUDA. I mean, this is about family harmony!
Apple had problems with Nvidia that were very similar to the problems it had with Intel: shoddy, faulty graphics cards that caused Macs and Apple displays to fail. The cards that didn’t fail turned out to render slowly, and display quality was disappointing compared to what Nvidia claimed. In addition, Nvidia regularly missed delivery dates. Like Intel, they refused to acknowledge the problems, and they ended up losing a major lawsuit with Apple:
And there’s this one that Nvidia lost to Apple, Dell and HP:
There were also lawsuits about intellectual property between Apple and Intel. And Nvidia is infamous for getting hit with class action lawsuits:
Yeah, no question that Nvidia is far from a perfect company, but the reality of the situation is that their GPUs are the standard for machine learning (and other fields) due to the CUDA framework.
Yep. That’s it in a nutshell.
My guess is that Apple and Nvidia are not going to kiss and make up anytime soon, maybe not ever. And Apple is just getting its toes wet in the chip market. Apple has always been focused on delivering weapons of choice for the high-end graphics markets; the Mac Pro is one example of this, and it does run Metal. Apple is also clearly interested in the automotive market, which is increasingly dependent on artificial intelligence and rapid processing. I’ll bet they are either planning to make Metal more competitive with CUDA, or they’re developing something new.
Broadly, I’d agree on Apple’s commitment; why else are they building those great displays, after all? It’s just the lack of clarity: we’re left guessing, betting, presuming. Roadmaps are key to confidence, for developers and customers alike. My camera system is a Fuji GFX, quite expensive and defining a new tier of camera, but Fuji published a roadmap for lens development that spanned several years, cementing my investment.
I don’t think Apple has ever been very big on road maps, to tell you the truth. Something like “we’ll transition to AS within 2 years” is about as good as it comes. I rarely recall them giving any forward guidance with respect to hardware or support for 3rd party hardware platforms. Software, sure, but I don’t recall hardware.
Apple has a history of keeping its mouth shut until there’s a PR opportunity, like WWDC, to make a big splash. I’ll bet they have internal roadmaps.
100% correct. Apple doesn’t ever “pre-announce” features. And with good reason.
Pre-announcing means the competition gets more time to release a competing product/feature. This can be a killer if your feature gets delayed and their catch-up doesn’t.
Additionally, if you announce something and later decide it’s a bad idea for some reason, it becomes very bad PR to publicly retract the announcement. So you either lose credibility or you are forced to develop and release something you consider a bad idea.
I realize that other companies (like Microsoft) don’t share this opinion. They make their own strategic decisions using their own logic. And if it works for them, I won’t argue with success.