
Normally it wouldn’t. But Steve Jobs very publicly threw a camera at someone who crossed him. He had a reputation for long-held grudges that fundamentally fractured business relationships (e.g. IIRC Nvidia being permanently banned from Apple platforms for announcing something early, and a similar reaction when the ZFS integration was announced early). Guy was an asshole, but he was an extraordinarily powerful asshole.


Nvidia is difficult to work with because they're as prideful as Apple, or more so. ZFS is hard to license commercially because it means having to negotiate with Oracle, who are evil.

Well, and there were those GPUs desoldering themselves.


At the time it meant working with Sun, not Oracle. Further, it didn't need licensing: DTrace, under the same CDDL license, ships with macOS to this day.


Even further than that, Sun owned ZFS and was not bound to the license they gave to everyone else. If Apple had demanded that ZFS on Mac OS X be a proprietary component, Sun might've actually gone along with it.


Sun had, by then, mostly given up on the workstation market. They had everything to gain from making Unix mainstream, even if it was OS X instead of Solaris, and pretty much nothing to lose.


FWIW the beta-quality implementation of ZFS in macOS was removed only after Oracle bought Sun.

Though that doesn't explain how DTrace survived.

Also Steve Jobs and Larry Ellison were known to be best buddies.


> Though that doesn't explain how DTrace survived.

I think Larry Ellison discovered, much to his dissatisfaction, that open source products and their forks are incredibly hard to kill.

I'd love to have seen his face when he realized that he wouldn't be able to kill MySQL.


I had a laptop with one of those GPUs. Apple replaced the motherboard. With the same exact thing. It's like if your defective Takata air bag was replaced with... a defective Takata air bag.


Yeah Nvidia is a pain in the ass. But they weren’t banned from Apple platforms for that, as I understand it.


In hindsight I wish I said he was an incredibly effective asshole. Lots of powerful people are assholes. It’s not often they continue to hold much power as a corpse.


Still ten minutes to edit the post.


Meh, I’d rather walk my pup and eat dinner. If people are curious they'll find it.


> IIRC Nvidia being permanently banned from Apple platforms for announcing something early

It may also have been due to the major issues with their chipsets.

https://support.apple.com/en-us/HT203254


Nah Apple has had plenty of hardware issues over the years.

However Jobs was a master presenter and showman, and extremely anal about getting the L&F just right (go check the calculator story, or recountings of his preparation for keynotes; you can also see how things broke down as soon as they started bringing in third parties, or after Jobs’ death).

Keeping things under as tight a wrap as possible with extreme OPSEC[0] is one of those things Apple has always done, and it’s entirely unsurprising that Jobs would get very cross about a supplier fucking that up.

[0] I expect detrimentally so at times; Apple has long been extremely compartmentalised, at least under Jobs. Go check the history of the iPhone for flagrant examples of that, where you'd just see colleagues disappear into unknown voids and HIG went full SCP.


> Keeping things under as tight a wrap as possible with extreme OPSEC[0] is one of those things Apple has always done, and it’s entirely unsurprising that Jobs would get very cross about a supplier fucking that up.

I sat across from a graphic designer at Apple who accidentally published some of the iPod's marketing assets to the website a few days before the launch. It was discovered soon after by a rumour site and there was a flurry of meetings to fix it.

The designer was never fired or reprimanded. And he never received a screaming phone call from Steve Jobs.

I am not privy to vendor negotiations, but I would be really surprised if Apple was making multi-billion-dollar decisions based solely on a vendor leaking something. Especially back then, when Apple wasn't yet out of its recovery.


> Apple has long been extremely compartmentalised

Oh, I have been told that Apple could only build such great products because of the tight integration that would never be possible if Apple were split into, e.g., separate hardware and software companies.


Apple only feels compartmentalised if (a) you're a low level engineer and (b) you're working on a secretive project. Otherwise it's a normal big tech company.

But the team responsible for writing all of the Forth firmware code very much worked closely with the hardware teams as you would expect.


Apple used Nvidia chips for years after the 8600M GT. I think their last use of Nvidia chips was the Kepler generation. That meant that, because an MBP came with an Nvidia 650 or something, you could put a Titan in a Mac Pro, which was nice!


Nvidia still offered an "unofficial" official driver for quite a while that let you run unsupported GPUs in Mac Pros or Hackintoshes. I ran a couple of Maxwell-generation Nvidia parts this way; even though macOS out of the box didn't support them, Kepler wasn't the last generation that worked.

The only downside back then was that when a new macOS release hit, you would have to wait for Nvidia to release the updated third-party driver. IIRC there was one major OS X release where I was stuck waiting a few months before the driver landed. This extended the lifespan of a lot of Mac Pros greatly!


It was a lot of he-said she-said with those chips, but as far as I understood it, it mostly came down to Apple using cheap solder. When the Nvidia chips inevitably started to heat up, the solder came loose, which caused the notorious graphics issues. IIRC, there were even reports of people reviving dead or malfunctioning logic boards with ye olde "oven trick", which pretty much confirmed that it was an assembly issue, not a manufacturing one.


The "bumps" (industry term for solder left on bottom of chip to attach to a board/circuit) in question are specified by the GPU chip vendor, not Apple.

This is why every single NVidia customer during this period was affected, not just Apple. NVidia ended up making several large compensation payments to big vendors such as HP.

It's amazing how quickly people have forgotten what a big deal this was; it was terrible for much of the laptop industry for a year or two. It was after "Bumpgate", as tech media termed it at the time, that Apple's relationship with Nvidia ended too, coincidence or not.

> https://semiaccurate.com/2009/08/21/nvidia-finally-understan...

As for it being "he-said she-said with those chips", I can't agree. There was an entire settled class-action lawsuit at the time directly blaming NVidia for all affected Dell, HP and Apple computers. I've had to raid web.archive.org, but you can still find the details:

https://web.archive.org/web/20101011074425/http://www.nvidia...


"Soldergate" impacted all vendors that shipped nVIDIA chipsets back at the time, and the cause was not "cheap" solder - rather, the back-then relatively new lead-free solder that was mandated by the European Union's RoHS directive in ~2005-2006. The manufacturers didn't have much experience with the stuff back then and chose a solder formulation that didn't hold up well to repetitive thermal stress.


See also the Xbox 360 RRoD plague from the same period.



