
> IME Apple has always been the most honest when it makes performance claims

That's just laughable, sorry. No one is particularly honest in marketing copy, but Apple is for sure one of the worst, historically. Even more so when you go back to the PPC days. I still remember Jobs on stage talking about how the G4 was the fastest CPU in the world when I knew damn well that it was half the speed of the P3 on my desk.



Worked in an engineering lab at the time of the G4 introduction and I can attest that the G4 was a very, very fast CPU for scientific workloads.

Confirmed here: https://computer.howstuffworks.com/question299.htm (and elsewhere).

A year later I was doing bonkers (for the time) Photoshop work on very large compressed TIFF files, and my G4 laptop running at 400 MHz was more than 2x as fast as the PIIIs on my bench.

Was it faster all around? I don't know how to tell. Was Apple as honest as I am in this commentary about how it mattered what you were doing? No. Was it a CPU that was able to do some things very fast vs others? I know it was.


Funny you mention that machine; I still have one of those lying around. It was a very cool machine indeed, with a very capable graphics card, but that's about it. It did some things better/faster than a Pentium III PC, but only if you went for the bottom-of-the-barrel unit and crippled the software support (disabling MMX, as another reply mentioned).

On top of that, Intel increased frequency faster than Apple could handle. And after the release of the Pentium 4, the G4s became uncompetitive so fast that one had to wonder what could save Apple (later, down the road, it turned out to be Intel).

They tried to salvage it with the G5s, but those came with so many issues that even their dual-processor water-cooled models were just not keeping up. I briefly owned one of those after repairing it for "free" using 3 of them, supposedly dead; the only thing worth a damn in that machine was the GPU. Extremely good hardware in many ways, but also very weak in so many others that it had to be used only for very specific tasks; otherwise a cheap Intel PC was much better.

Which is precisely why they went with Intel right after, following years of subpar laptop performance because they were stuck on the G4 (and not even at high frequencies).

Now I know from your other comments that you are a very strong believer, and I'll admit that there were many reasons to use a Mac (software related), but please stop pretending they were performance competitive because that's just bonkers. If they were, the Intel switch would never have happened in the first place...


It's just amazing that this kind of nonsense persists. There were no significant benchmarks, "scientific" or otherwise, at the time or since showing that kind of behavior. The G4 was a dud. Apple rushed out some apples/oranges comparisons at launch (the one you link appears to be the bit where they compared a SIMD-optimized tool on PPC to generic compiled C on x86, though I'm too lazy to try to dig out the specifics from stale links), and the reality distortion field did the rest.


While certainly misleading, there were situations where the G4 was incredibly fast for the time. I remember being able to edit video in iMovie on a 12" G4 laptop. At that time there was no equivalent x86 machine.


Have any examples from the past decade? Especially in the context of how exaggerated the claims from the PC and Android brands they compete with are?


Apple recently claimed that RAM in their MacBooks is equivalent to 2x the RAM in any other machine, in defense of the 8GB starting point.

In my experience, I can confirm that this is just not true. The secret is heavy reliance on swap. It's still the case that 1GB = 1GB.


Sure, and they were widely criticized for this. Again, the assertion I was responding to is that Apple does this "laughably" more than competitors.

Is an occasional statement that they get pushback on really worse than what other brands do?

As an example from a competitor, take a look at the recent firestorm over Intel’s outlandish anti-AMD marketing:

https://wccftech.com/intel-calls-out-amd-using-old-cores-in-...


> Sure, and they were widely criticized for this. Again, the assertion I was responding to is that Apple does this "laughably" more than competitors.

FWIW: the language upthread was that it was laughable to say Apple was the most honest. And I stand by that.


Fair point. Based on their first sentence, I mischaracterized how “laughable” was used.

Though the author also made clear in their second sentence that they think Apple is one of the worst when it comes to marketing claims, so I don’t think your characterization is totally accurate either.


Yeah, that was hilarious; my basic workload borders on the 8GB limit without even pushing it. They have fast swap, but nothing beats real RAM in the end, and considering their storage pricing is as stupid as their RAM pricing, it really makes no difference.

If you go for the base model, you are in for a bad time, 256GB with heavy swap and no dedicated GPU memory (making the 8GB even worse) is just plain stupid.

This is what the Apple fanboys don't seem to get: their base models at a somewhat affordable price are deeply incompetent, and if you start to load them up the pricing just does not make a lot of sense...


> If you go for the base model, you are in for a bad time, 256GB with heavy swap and no dedicated GPU memory (making the 8GB even worse) is just plain stupid ... their base models at a somewhat affordable price are deeply incompetent

I got the base model M1 Air a couple of years back and whilst I don't do much gaming I do do C#, Python, Go, Rails, local Postgres, and more. I also have a (new last year) Lenovo 13th gen i7 with 16GB RAM running Windows 11 and the performance with the same load is night and day - the M1 walks all over it whilst easily lasting 10hrs+.

Note that I'm not a fanboy; I run both by choice. Also both iPhone and Android.

The Windows laptop often gets sluggish and hot. The M1 never slows down and stays cold. There's just no comparison (though the Air keyboard remains poor).

I don't much care about the technical details, and I know 8GB isn't a lot. I care about the experience and the underspecced Mac wins.


I don't know about your Lenovo and how your particular workload is handled by Windows.

And I agree that in pure performance, the Apple Silicon Macs will kill it; however, I am really skeptical that an 8GB model would give you a better experience overall. Faster for long compute operations, sure, but then you have to deal with all the small slowdowns from constant swapping. Unless you stick to a very small number of apps and a very small number of tabs at the same time (which is rather limiting), I don't know how you do it. I don't want to call you a liar, but maybe you are too emotionally attached (just like I am sometimes) to the device to realize it, or maybe the various advantages of the Mac make you ignore the serious limitations that come with it.

Everyone has their own set of tradeoffs, but my argument is that if you can deal with an 8GB Apple Silicon device, you are very likely to be well served by a much cheaper device anyway (like half the price).


All I can say is I have both and I use both most days. In addition to work-issued Windows laptops, so I have a reasonable and very regular comparison. And the comparative experience is exactly as I described. Always. Every time.

> you have to deal with all the small slowdowns from constant swapping

That just doesn't happen. As I responded to another post, though, I don't do Docker or LLMs on the M1 otherwise you'd probably be right.

> Unless you stick to a very small number of apps and a very small number of tabs at the same time

It's really common for me to have 50+ tabs open at once. And using Word is often accompanied by VS Code, Excel, Affinity Designer, DotNet, Python, and others due to the nature of what I'm doing. No slowdown.

> maybe you are emotionally attached

I am emotionally attached to the device. Though as a long-time Mac, Windows, and Linux user I'm neither blinkered nor tribal - the attachment is driven by the experience and not the other way around.

> maybe the various advantages of the Mac make you ignore the serious limitations that come with it

There are indeed limitations. 8GB is too small. The fact that for what I do it has no impact doesn't mean I don't see that.

> if you can deal with an 8GB Apple Silicon device, you are very likely to be well served by a much cheaper device anyway (like half the price)

I already have better Windows laptops than that, and I know that going for a Windows laptop that's half as cheap as the entry level Air would be nothing like as nice because the more expensive ones already aren't (the Lenovo was dearer than the Air).

---

To conclude, you have to use the right tool for the job. If the nature of the task intrinsically needs lots of RAM then 8GB is not good enough. But when it is enough it runs rings around equivalent (and often 'better') Windows machines.


None of that seems to be high load or stuff that needs a lot of RAM.


Not individually, no. Though it's often done simultaneously.

That said, you're right about lots of RAM in that I wouldn't bother using the 8GB M1 Air for Docker or running LLMs (it can run Stable Diffusion for images, though very slowly). Partly that's why I have the Lenovo. You need to pick the right machine for the job at hand.


You know that the RAM in these machines is quite different from the "RAM" in a standard PC? Apple's SoC RAM is more or less part of the CPU/GPU package and is super fast. And for obvious reasons it cannot be added to.

Anyway, I manage a few M1 and M3 machines with 256/8 configs and they all run just as fast as 16GB and 32GB machines EXCEPT for workloads that need more than 8GB for a single process (virtualization) or workloads that need lots of video memory (Lightroom can KILL an 8GB machine that isn't doing anything else...)

The 8GB is stupid discussion isn't "wrong" in the general case, but it is wrong for maybe 80% of users.


> EXCEPT for workloads that need more than 8GB for a process

Isn't that exactly the upthread contention? Apple's magic compressed-swap management is still swap management: it replaces O(1), fast(-ish) DRAM access with page decompression operations costing thousands of cycles or more. That may be faster than storage, but it's still extremely slow relative to a DRAM fetch. And once your working set grows beyond your available RAM, you start thrashing just like VAXen did on 4BSD.
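As a rough, hedged illustration of that cost gap (Python's zlib standing in for whatever compressor the kernel actually uses, so the absolute numbers are meaningless; only the ratio matters):

```python
import time
import zlib

PAGE = 4096  # a typical page size
page = (b"struct task { int pid; char comm[16]; };\n" * 200)[:PAGE]
packed = zlib.compress(page, 1)  # fast level, as a swap compressor would pick

N = 20_000

t0 = time.perf_counter()
for _ in range(N):
    copied = bytes(page)  # stand-in for a plain in-RAM copy of the page
t_copy = time.perf_counter() - t0

t0 = time.perf_counter()
for _ in range(N):
    restored = zlib.decompress(packed)  # stand-in for a compressed-swap page-in
t_decomp = time.perf_counter() - t0

assert restored == page
print(f"copy {t_copy:.3f}s, decompress {t_decomp:.3f}s, "
      f"~{t_decomp / t_copy:.0f}x slower")
```

On any recent machine the decompression loop runs an order of magnitude or more slower than the plain copy, which is the point: compressed memory beats hitting the SSD, but it's nowhere near a real DRAM fetch.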


Exactly! Load a 4GB file and welcome the spinning beach ball any time you need to context-switch to another app. I don't know how they don't realize that, because it's not really hard to get there. But when I was enamored with Apple stuff in my formative years, I would gladly ignore that or brush it off, so I can see where they're coming from, I guess.


It's not as different as the marketing would like you to think. In fact, for the low-end models even the bandwidth/speed isn't as big of a deal as they make it out to be, especially considering that bandwidth has to be shared with the GPU.

And if you go up in specs, the bandwidth of Apple Silicon has to be compared to the bandwidth of a combo with a dedicated GPU. The bandwidth of dedicated GPUs is very high, and usually higher than what Apple Silicon gives you if you consider the RAM bandwidth for the CPU.

It's a bit more complicated than that, but that's marketing for you. When it comes to speed, Apple RAM isn't faster than what can be found in high-end laptops (or desktops, for that matter).


There is also memory compression, and their insane swap speed due to SoC memory and SSD.


Every modern operating system now does memory compression


Some of them do it better than others though.


Apple uses Magic Compression.


Not sure what Windows does, but the popular method on e.g. Fedora is to carve out part of main memory as a compressed swap device and swap into that. It could be more efficient the way Apple does it, by not having to partition main memory.
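For a feel of why carving out RAM for compressed swap pays off, here's a minimal sketch; it uses Python's zlib at its fastest level as a stand-in for the LZ4-class compressors these systems actually use, and the "typical" page contents are invented for illustration:

```python
import os
import zlib

PAGE = 4096  # typical page size

# What a lot of anonymous memory looks like: zero runs and repeated strings
typical = ((b"\x00" * 24 + b"/usr/lib/libSystem.dylib\n") * 200)[:PAGE]
# Worst case: incompressible data (random bytes, encrypted/compressed media)
hostile = os.urandom(PAGE)

for name, buf in [("typical", typical), ("hostile", hostile)]:
    packed = zlib.compress(buf, 1)  # fast level, like an in-kernel compressor
    print(f"{name}: {len(buf)} -> {len(packed)} bytes "
          f"({len(packed) * 100 // len(buf)}%)")
```

A page that compresses 3:1 means the compressed region holds three evicted pages per page of RAM it occupies, while pages of random data gain nothing, which is why incompressible pages end up on disk anyway.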


This is a revolution


Citation needed?


Don't know if I'm allowed to. It's not that special though.


> The secret is heavy reliance on swap

You are entirely (100%) wrong, but, sadly, NDA...


I do admit the "reliance on swap" thing is speculation on my part :)

My experience is that I can still tell when the OS is unhappy when I demand more RAM than it can give. MacOS is still relatively responsive around this range, which I just attributed to super fast swapping. (I'd assume memory compression too, but I usually run into this trouble when working with large amounts of poorly-compressible data.)

In either case, I know it's frustrating when someone is confidently wrong but you can't properly correct them, so you have my apologies


Memory compression isn't magic and isn't exclusive to macOS.


I suggest you go and look at HOW it is done in Apple Silicon Macs, and then think long and hard about why this might make a huge difference. Maybe the Asahi Linux guys can explain it to you ;)


I understand that it can make a difference to performance (which is already baked into the benchmarks we look at); I don't see how it can make a difference to compression ratios. If anything, in similar implementations (e.g. console APUs) it tends to lead to worse compression ratios.

If there's any publicly available data to the contrary I'd love to read it. Anecdotally I haven't seen a significant difference between zswap on Linux and macOS memory compression in terms of compression ratios, and on the workloads I've tested zswap tends to be faster than no memory compression on x86 for many core machines.


How convenient :)


Regardless of what you can't tell, he's absolutely right regarding Apple's claims: saying that an 8GB Mac is as good as a 16GB non-Mac is laughable.


My entry-level 8GB M1 MacBook Air beats my 64GB 10-core Intel iMac in my day-to-day dev work.


That was never said. They said an 8GB Mac is similar to a 16GB non-Mac.


If someone is claiming “‹foo› has always ‹barred›”, then I don't think it's fair to demand a 10 year cutoff on counter-evidence.


For “always” to be true, the behavior needs to extend to the present date. Otherwise, it’s only true to say “used to”.


Clearly it isn’t the case that Apple has always been more honest than their competition, because there were some years before Apple was founded.


Interesting, by what benchmark did you compare the G4 and the P3?

I don't have a horse in this race, Jobs lied or bent the truth all the time so it wouldn't surprise me, I'm just curious.


I remember that Apple used to wave around these SIMD benchmarks showing their PowerPC chips trouncing Intel chips. In the fine print, you'd see that the benchmark was built to use AltiVec on PowerPC, but without MMX or SSE on Intel.


Ah so the way Intel advertises their chips. Got it.


Yeah, and we rightfully criticize Intel for the same and we distrust their benchmarks


You can claim Apple is dishonest for a few reasons.

1) Graphs are often unannotated.

2) Comparisons are rarely against latest-generation products. (Their argument has been that they do not expect people to upgrade yearly, so it's showing the difference along their intended upgrade path.)

3) They have conflated performance with performance per watt.

However, when it comes to battery life, performance (for a task) or specification of their components (screens, ability to use external displays up to 6k, port speed etc) there are almost no hidden gotchas and they have tended to be trustworthy.

The first wave of M1 announcements were met with similar suspicion as you have shown here; but it was swiftly dispelled once people actually got their hands on them.

*EDIT:* Blaming a guy who's been dead for 13 years for something he said 50 years ago, and primarily, it seems, for internal use, is weird. I had to look up the context, but it seems it was more about internal motivation in the 70s than anything relating to today, especially when it comes to concrete claims.


"This thing is incredible," Jobs said. "It's the first supercomputer on a chip.... We think it's going to set the industry on fire."

"The G4 chip is nearly three times faster than the fastest Pentium III"

- Steve Jobs (1999) [1]

[1] https://www.wired.com/1999/08/lavish-debut-for-apples-g4/


That's cool, but literally last millennium.

And again, the guy has been dead for the better part of this millennium.

What have they shown of any product currently on the market, especially when backed with any concrete claim, that has been proven untrue?

EDIT: After reading your article and this one: https://lowendmac.com/2006/twice-as-fast-did-apple-lie-or-ju... it looks like it was true in floating point workloads.


The G4 was a really good chip if you used Photoshop. It took Intel a while to catch up.


If you have to go back 20+ years for an example…


Apple marketed their PPC systems as "a supercomputer on your desk", but they were nowhere near the performance of a supercomputer of that age. Maybe similar performance to a supercomputer from the 1970s, but that was their marketing angle in the 1990s.


From https://512pixels.net/2013/07/power-mac-g4/: the ad was based on the fact that Apple was forbidden to export the G4 to many countries due to its “supercomputer” classification by the US government.


It seems that the US government was buying too much into tech hype at the turn of the millennium. Around the same period, PS2 exports were also restricted [1].

[1] https://www.latimes.com/archives/la-xpm-2000-apr-17-fi-20482...


The PS2 was used in supercomputing clusters.


Blaming a company TODAY for marketing from the 1990s is crazy.


Except they still do the same kind of bullshit marketing today.


> Apple marketed their PPC systems as "a supercomputer on your desk"

It's certainly fair to say that twenty years ago Apple was marketing some of its PPC systems as "the first supercomputer on a chip"[^1].

> but it was nowhere near the performance of a supercomputer of that age.

That was not the claim. Apple did not argue that the G4's performance was commensurate with the state of the art in supercomputing. (If you'll forgive me: like, fucking obviously? The entire reason they made the claim is precisely because the latest room-sized supercomputers with leapfrog performance gains were in the news very often.)

The claim was that the G4 was capable of sustained gigaflop performance, and therefore met the narrow technical definition of a supercomputer.

You'll see in the aforelinked marketing page that Apple compared the G4 chip to UC Irvine’s Aeneas Project, which in ~2000 was delivering 1.9 gigaflop performance.

This chart[^2] shows the trailing average of various subsets of super computers, for context.

This narrow definition is also why the machine could not be exported to many countries, which Apple leaned into.[^3]

> Maybe similar performance to a supercomputer from the 1970s

What am I missing here? Picking perhaps the most famous supercomputer of the mid-1970s, the Cray-1,[^4] we can see performance of 160 MFLOPS, which is 160 million floating point operations per second (with an 80 MHz processor!).

The G4 was capable of delivering ~1 GFLOP performance, which is a billion floating point operations per second.

Are you perhaps thinking of a different decade?

[^1]: https://web.archive.org/web/20000510163142/http://www.apple....

[^2]: https://en.wikipedia.org/wiki/History_of_supercomputing#/med...

[^3]: https://web.archive.org/web/20020418022430/https://www.cnn.c...

[^4]: https://en.wikipedia.org/wiki/Cray-1#Performance


>That was not the claim. Apple did not argue that the G4's performance was commensurate with the state of the art in supercomputing.

This is marketing we're talking about, people see "supercomputer on a chip" and they get hyped up by it. Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.

> The entire reason they made the claim is

The reason they marketed it that way was to get people to part with their money. Full stop.

In the first link you added, there's a photo of a Cray supercomputer, which makes the viewer equate Apple = Supercomputer = I am a computing god if I buy this product. Apple's marketing has always been a bit shady that way.

And soon after that period Apple jumped off the PPC architecture and onto the x86 bandwagon. Gimmicks like "supercomputer on a chip" don't last long when the competition is far ahead.


I can't believe Apple is marketing their products in a way to get people to part with their money.

If I had some pearls I would be clutching them right now.


> This is marketing we're talking about, people see "supercomputer on a chip" and they get hyped up by it.

That is also not in dispute. I am disputing your specific claim that Apple somehow suggested that the G4 was of commensurate performance to a modern supercomputer, which does not seem to be true.

> Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.

This is why context is important (and why I'd appreciate clarity on whether you genuinely believe a supercomputer from the 1970s was anywhere near as powerful as a G4).

In the late twentieth and early twenty-first century, megapixels were a proxy for camera quality, and megahertz were a proxy for processor performance. More MHz = more capable processor.

This created a problem for Apple, because the G4's SPECfp_95 (floating point) benchmarks crushed the Pentium III's despite running at lower clock speeds:

PPC G4 500 MHz - 22.6

PPC G4 450 MHz - 20.4

PPC G4 400 MHz - 18.36

Pentium III 600 MHz – 15.9

For both floating point and integer benchmarks, the G3 and G4 outgunned comparable Pentium II/III processors.

You can question how this translates to real world use cases – the Photoshop filters on stage were real, but others have pointed out in this thread that it wasn't an apples-to-apples comparison vs. Wintel – but it is inarguable that the G4 had some performance advantages over Pentium at launch, and that it met the (inane) definition of a supercomputer.

> The reason they marketed it that way was to get people to part with their money. Full stop.

Yes, marketing exists to convince people to buy one product over another. That's why companies do marketing. IMO that's a self-evidently inane thing to say in a nested discussion of microprocessor architecture on a technical forum – especially when your interlocutor is establishing the historical context you may be unaware of (judging by your comment about supercomputers from the 1970s, which I am surprised you have not addressed).

I didn't say "The reason Apple markets its computers," I said "The entire reason they made the claim [about supercomputer performance]…"

Both of us appear to know that companies do marketing, but only you appear to be confused about the specific claims Apple made – given that you proactively raised them, and got them wrong – and the historical backdrop against which they were made.

> In the first link you added, there's a photo of a Cray supercomputer

That's right. It looks like a stylized rendering of a Cray-1 to me – what do you think?

> which makes the viewer equate Apple = Supercomputer = I am a computing god if I buy this product

The Cray-1's compute, as measured in GFLOPS, was approximately 6.5x lower than the G4 processor.

I'm therefore not sure what your argument is: you started by claiming that Apple deliberately suggested that the G4 had comparable performance to a modern supercomputer. That isn't the case, and the page you're referring to contains imagery of a much less performant supercomputer, as well as a lot of information relating to the history of supercomputers (and a link to a Forbes article).

> Apple's marketing has always been a bit shady that way.

All companies make tradeoffs they think are right for their shareholders and customers. They accentuate the positives in marketing and gloss over the drawbacks.

Note, too, that Adobe's CEO has been duped on the page you link to. Despite your emphatic claim:

> Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.

The CEO of Adobe is quoted as saying:

> “Currently, the G4 is significantly faster than any platform we’ve seen running Photoshop 5.5,” said John E. Warnock, chairman and CEO of Adobe.

How is what you are doing materially different to what you accuse Apple of doing?

> And soon after that period Apple jumped off the PPC architecture and onto the x86 bandwagon.

They did so when Intel's roadmap introduced Core Duo, which was significantly more energy-efficient than Pentium 4. I don't have benchmarks to hand, but I suspect that a PowerBook G5 would have given the Core Duo a run for its money (despite the G5 being significantly older), but only for about fifteen seconds before thermal throttling and draining the battery entirely in minutes.


My iBook G4 was absolutely crushed by my friends' Wintel laptops that they bought for half as much. Granted, it was more portable and had somewhat better battery life (it needed it, given how much longer everything took), but performance really was not a good reason to go with Apple hardware, and that still holds true as far as I'm concerned.


The G4 was 1999, Core Duo was 2006; 7 years isn't bad.


That is a long time – bet it felt even longer to the poor PowerBook DRI at Apple who had to keep explaining to Steve Jobs why a G5 PowerBook wasn't viable!


Yeah, I really wanted a G5, but power and thermals weren't going to work and IBM/Moto weren't interested in making a mobile version.


Indeed. Have we already forgotten about the RDF?


No, it was just always a meaningless term...


It was simply a phrase to acknowledge that Jobs was better at giving demos than anyone who ever lived.


Didn’t he have to use two PPC procs to get the equivalent perf you’d get on a P3?

Just add them up, it’s the same number of Hertz!

But Steve that’s two procs vs one!

I think this was when Adobe was optimizing for Windows/Intel and was single-threaded, but Steve put out some graphs showing better perf on the Mac.



