Submitted title was "Nvidia drops support for CUDA on macOS". We changed that for a while to "CUDA 10.2 is the last release to support macOS", which is language from the article itself. Since then someone emailed and asked why the title was like that in light of the discussion at https://news.ycombinator.com/item?id=21617016, so I've reverted to the article's title.
Edit: If the article were more of a burying-the-lede kind of thing, as sometimes happens with corporate press releases, then the original title would be misleading and it would arguably be right to change it. But that seems unlikely here?
Edit: I suppose we could switch the URL to an article like https://gizmodo.com/apple-and-nvidia-are-over-1840015246 if this really is the only story here.
Any title that gives us a clue why the story is worthy of note. At the moment it's frustrating as you have to click through to find out if you want to click through.
(title at the time of my comment was "CUDA Toolkit Release Notes")
As an ML and open source developer for over 10 years, I can say that macOS support for deep learning is already long gone; Linux is the prime AI/ML OS. However, Apple and Nvidia parting ways is a good omen for GPU competition, I believe. Whatever hardware Apple comes up with, if it's usable outside the macOS software stack, it'd be an interesting alternative.
> if usable outside MacOS software stack
Except, of course, according to Apple’s history regarding the matter, it won’t.
Unless the swift-for-tensorflow effort cross-fertilises something interesting, I suppose? If you were asked to guess the most likely path by which AMD cards become widely useful, perhaps this would be as good a bet as any.
Someone will need to get ROCm on MacOS, or something like that.
God, I really hope AMD ends its deal with Apple as well. I'd rather people abandon macOS early than have to put up with Apple-imposed implementations like Metal, HLS, and WebKit.
At the end of the day, Apple should realize that having money alone does not, by any means, make you a popular company. It's the little things that make people sign up for Apple.
*crosses fingers in OpenCL*
Emphasis on `if usable`
I don't think the general crowd will adapt to Apple's stack unless it is radically better.
Until Apple gets developers to use what it makes by enforcing it in its App Store, the way it pushed people toward WebKit, it won't have takers.
As the whiner who whined about this title: the story isn't worthy of note at all, other than someone fishing a supposedly-important detail out of it and putting it in the title. Apple hasn't so much as sold a machine with an Nvidia GPU for many years. This non-editorializing thing is explained in great illustrative detail here:
https://news.ycombinator.com/item?id=21617907
1. I didn't know the fact highlighted in the original title.
2. Doesn't this have an impact on eGPUs, Hackintoshes, and other customizations?
It basically means heavy GPU-based work is pretty much a Linux/Windows-only thing now. Huge problem as machine learning starts to become a topic of interest for people in the art and design community.
For 1., again, the standard isn't 'someone might not have known a detail in the story', it's 'does the title represent the story'. If that particular detail is worthy of highlighting, it's easy to find (or even write!) a story about that or point it out in a comment. For 2., that's been the unfortunate case for ages but more importantly, see 1.
It's not a "detail" if it's pretty much the only reason to post or read the story in the first place.
Sure, it might be an important detail to you but that's just not how titling HN stories works, as explained at great length in the thing linked in my comment and many other places.
> Huge problem as machine learning starts to become a topic of interest for people in the art and design community.
Out of curiosity, what Mac apps are you and other professionals using for art & design? Being a part time artist & ex-Mac person, I thought the stereotype of Mac being an artist/designer’s machine was almost gone. Most professional artists I know have had to switch away from Mac. To be fair, most artists I know are doing 3d and need GPUs. But I’m honestly wondering what else is keeping people on Mac and how well the platform is serving professional designers and artists today.
Perhaps they haven’t sold Nvidia hardware, but the allowance of Nvidia’s web drivers, so that I could connect my eGPU with a GTX 1080 to the 13-inch MacBook Pro, was one of the primary reasons I bought the MBP.
It’s fine if they want to kill off features on upcoming hardware, but killing off the capacity to use something people have relied on for months is not a good look. If Metal-only was the goal, add a prompt when someone enables said web drivers warning that they’re going off the reservation and that they assume the risk.
These release notes seem to be Nvidia conceding that they’re not going to be able to resolve this dispute, and that High Sierra is the last release where all but a couple of really old Nvidia GPUs work.
Yep; you’re right. And Metal isn’t the only goal: preventing people from going off the reservation is the goal. I can see both sides; there are upsides and downsides for both Apple and their customers. Either way, at this point, Apple’s strategy is clear and widely known. It sucks that you can’t use an Nvidia eGPU with a Mac, and I’d still be primarily using a Mac if you could, but it’s no longer an available choice. As consumers, the only option now is to vote with our pocketbooks.
The configuration you're describing was never officially supported. I'm not arguing whether this change in support is a good or bad thing, just addressing the concern that Nvidia might be trying to sneak something super-important past people in release notes, and thus that the editorializing should perhaps get a pass.
Even AMD eGPUs have a lot of compromises (e.g. HDCP issues), although those may or may not be Apple's fault.
I bought a lower-spec MBP with the expectation of using eGPUs (in the hope Apple and Nvidia would make up), but it's good to see it confirmed that it won't be worthwhile.
Agreed. Quite frankly the new title doesn't help people understand why it's newsworthy.
By the time we changed the title, the comments were making that quite clear.
Good point.
Personal opinion: I really don't care about "CUDA Toolkit Release Notes" or probably most other "$stuff Release Notes" on HN. I only opened the comments because others here seemed to upvote it more than I anticipated and thought "there must be something special about this release" - and indeed, nVidia dropping MacOS support is something that's worth knowing (even as a Linux dev).
Thus I personally would like either the title "CUDA Toolkit Release Notes: macOS support to be dropped" or a link to the click-baity Gizmodo piece (which is missing the CUDA aspect).
> Personal opinion: I really don't care about "CUDA Toolkit Release Notes" or probably most other "$stuff Release Notes" on HN. I only opened the comments because others here seemed to upvote it more than I anticipated and thought "there must be something special about this release" - and indeed, nVidia dropping MacOS support is something that's worth knowing (even as a Linux dev).
Would have to agree here, too. Part of the problem is that there's no obvious indicator that the title has changed (some sort of "post title history" feature would be useful here), which leaves you, as a latecomer to the article, confused as to what's remarkable about something that would normally be considered mundane, and confused about the context against which most of the previous comments were made.
Why is the threshold for changing URLs and titles lowering all the time?
It seems like it has gone from something rare, used only in very clear-cut cases, to something used daily.
That is sample bias. HN moderation of titles and URLs has been the same for many years. It was never rare, and was always happening daily.
https://blog.cuvilib.com/2019/11/27/nvidia-drops-mac-support...
This summarizes it well.