Glancing at the press release, it seems to have the same comparisons as the video. None of Apple's comparisons today are between the M3 and the M4. They are ALL comparing the M2 and the M4. Why? It's frustrating, but today Apple replaced an M2-based product with an M4-based product, and Apple always compares product to product, never component to component, when it comes to processors. So the quoted gains look far more impressive than they would if we had M3-to-M4 numbers.


Didn't they cherry-pick their tests so they could show the M1 beating a 3090 (or the M2 a 4090; I can't remember)?

Gave me quite a laugh when Apple users started to claim they'd be able to play Cyberpunk 2077 maxed out, ray tracing included.


I'll give you that Apple's comparisons are sometimes inscrutable. I vividly remember that one.

https://www.theverge.com/2022/3/17/22982915/apple-m1-ultra-r...

Apple was comparing the power envelope (already a complicated concept) of their GPU against a 3090. Apple wanted to show that their GPU reached its peak performance at a fraction of the power a 3090 draws. What was terrible was that Apple cropped the chart right at the point where the 3090 starts pulling ahead in pure compute by throwing more watts at the problem. So their GPU was not as powerful as a 3090, but a quick glance at the chart would tell you otherwise.
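To make the trick concrete, here's a minimal Python sketch with entirely made-up curves (not real M1 Ultra or 3090 measurements): a chip that plateaus at low power looks dominant if you crop the x-axis before the competitor's curve crosses it.

    # Hypothetical numbers only: why cropping a performance-vs-power
    # chart at the crossover point misleads.
    import numpy as np
    import matplotlib.pyplot as plt

    watts = np.linspace(0, 450, 200)
    chip_a = 100 * (1 - np.exp(-watts / 40))   # plateaus early at low power
    chip_b = 140 * (1 - np.exp(-watts / 150))  # keeps scaling with more watts

    fig, axes = plt.subplots(1, 2, figsize=(10, 4))
    for ax, xmax, title in [(axes[0], 120, "Cropped chart"),
                            (axes[1], 450, "Full power range")]:
        ax.plot(watts, chip_a, label="Chip A (hypothetical)")
        ax.plot(watts, chip_b, label="Chip B (hypothetical)")
        ax.set_xlim(0, xmax)
        ax.set_xlabel("Power (W)")
        ax.set_ylabel("Relative performance")
        ax.set_title(title)
        ax.legend()
    plt.tight_layout()
    plt.show()

In the cropped view, Chip A sits above Chip B everywhere; extend the axis and Chip B ends up well ahead.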

Ultimately we didn't see one of those charts today, just a mention of the GPU being 50% more efficient than the competition. I think those charts are beloved by Johny Srouji and no one else. They're not getting the message across.


Plenty of people on HN thought the M1 GPU was as powerful as a 3090, so I think the message worked very well for Apple.

They really love those kinds of comparisons - e.g. they also compared M1s against really old Intel CPUs to make the numbers look better, knowing that news headlines won't care about the details.


> not component to component

that's honestly kind of stupid when the discussion, like this thread, is about a "new CPU!"

I'm not saying the M4 isn't a great platform, but holy cow the corporate tripe people gobble up.


They compared against really old Intel CPUs because those were the last ones they used in their own computers! Apple likes to compare device to device, not component to component.


You say that like it's not a marketing gimmick meant to mislead and obscure facts.

It's not some virtue that causes them to do this.


It's funny because your comment is meant to mislead and obscure facts.

Apple compared against Intel to encourage their previous customers to upgrade.

There is nothing insidious about this; it is in fact standard business practice.


Apple's the ONLY tech company that doesn't compare its products to its competitors'.

The intensity of the reality distortion field and hubris is mind boggling.

Turns out, you fell for it.


No, they compared it because it made them look way better to naive people. They have no qualms about comparing against the competition when it suits them.

Your explanation is a really baffling case of corporate white knighting.


Yes, I can't remember the precise combo either; there was a solid year or two of lingering misunderstandings.

I eventually made a visual showing it was the same as claiming your iPhone was 3x the speed of a Core i9: sure, if you limit the power draw of your PC to a battery the size of a Post-it pad.
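The underlying arithmetic is easy to reproduce; here's a toy calculation with invented numbers (none of these are real measurements):

    # Hypothetical efficiency figures, purely illustrative.
    phone_perf_per_watt = 3.0    # work units per watt
    desktop_perf_per_watt = 0.5

    power_cap = 10  # watts: roughly a phone-sized power budget
    print(phone_perf_per_watt * power_cap)     # 30.0 -> phone "wins" under the cap
    print(desktop_perf_per_watt * power_cap)   # 5.0

    desktop_uncapped = 250  # watts: what the desktop actually gets to use
    print(desktop_perf_per_watt * desktop_uncapped)  # 125.0 -> desktop wins uncapped

Cap both machines at a phone-sized power budget and the more efficient chip wins; let the desktop use its full budget and the ranking flips.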

Similar issues cropped up when on-device LLMs arrived; thankfully it has quieted down since then (the last egregious thing I saw was stonk-related wishcasting that Apple was obviously turning its Xcode CI service into a full-blown AWS competitor that would wipe the floor with any cloud service, given the 2x performance).


It’s an iPad event and there were no M3 iPads.

That’s all. They’re trying to convince iPad users to upgrade.

We’ll see what they do when they get to computers later this year.


I have a Samsung Galaxy Tab S7 FE, and I can't think of any use case where I'd need more power.

I agree that the iPad has more interesting software than Android for use cases like video or music editing, but I don't do those on a tablet anyway.

I just can't imagine anyone upgrading their M2 iPad for this, except a tiny niche that really wants that extra power.


I don't know who would prefer to do music or video editing on a smaller display, without a keyboard for shortcuts, without a proper file system, and with problematic connectivity to external hardware. Sure, it's possible, but why? OK, maybe there's some use case on the road where every gram counts, but that seems niche.


The A series was good enough.

I'm vaguely considering this, but entirely for the screen. The chip has been irrelevant to me for years; it's long past the point where I'd notice it.


The A series was definitely not good enough. It really depends on what you're using it for. Netflix and web? Sure. But any old HDR tablet that can maintain 24 Hz is good enough for that.

These have 2048x2732, 120 Hz displays and support 6K external displays. Gaming and art apps push them pretty hard. For the iPad user in my house, going from the 2020 non-M* iPad to a 2023 M2 iPad made a huge difference in the drawing apps. Lower latency is always better for drawing, and complex brushes (especially newer ones), selections, etc. would get fairly unusable.
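For a rough sense of the load, the pixel-throughput arithmetic (resolution and refresh rates from above, nothing else assumed) looks like this:

    # Pixels the GPU must fill per second at each refresh rate.
    width, height = 2048, 2732
    pixels_per_frame = width * height   # 5,595,136 pixels
    print(pixels_per_frame * 60)        # ~336M pixels/s at 60 Hz
    print(pixels_per_frame * 120)       # ~671M pixels/s at 120 Hz

Doubling the refresh rate doubles the fill work before apps add any brush or scene complexity on top.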

For gaming, it was pretty trivial to dip well below 60 fps on a non-M* iPad with some of the higher-demand games like Fortnite, Minecraft (high view distance), Roblox (it ain't what it used to be), etc.

But apps will always gravitate to the performance of the average user's hardware. A step change in performance won't show up in the apps until adoption follows, years down the line. Not pushing the average toward higher performance is how you stagnate the future software of these devices.


You’re right, it’s good enough for me. That’s what I meant but I didn’t make that clear at all. I suspect a ton of people are in a similar position.

I just don't push it at all. The few games I play are not demanding in graphics or CPU terms. I don't draw, 3D model, or use Logic or Final Cut or anything like that.

I agree the extra power is useful to some people. But even there we have the M1 (what I've got) and the M2 models. And I bet there are plenty of people like me who mostly bought the Pro models for the better screen and not the additional grunt.


The AX series, which is what iPads were using before the M series, was precisely the chip family that got rebranded as the M1, M2, etc.

The iPads always had a lot of power; people simply started paying more attention when the chip family was ported to the PC.


Yeah. I was just using the A to M chip name transition as an easy landmark to compare against.


AI on the device may be the real reason for an M4.


Previous iPads have had that for a long time, since the A12 in 2018. The phones had it even earlier, with the A11.

Sure, this is faster, but is it enough to make people care?

It may depend heavily on what they announce for the next version of iOS/iPadOS.


That’s my point - if there’s a real on-device LLM it may be much more usable with the latest chip.


That's because the previous iPad Pros came with the M2, not the M3. They're comparing performance with the previous generation of the same product.


> They are ALL comparing the M2 and M4. Why?

Well, the obvious answer is that those with older machines are more likely to upgrade than those with newer machines. The market for insta-upgraders is tiny.

edit: And perhaps an even more obvious answer: no iPads ever shipped with the M3, so that comparison would be less meaningful. The M4 launched today exclusively in iPads.


Because the previous iPad was the M2. So "remember how fast your previous iPad was? Well, this one is N times better."


They know that anyone who has bought an M3 is good on computers for a long while. They're targeting people who have M2 or older Macs. People who own an M3 are basically going to buy anything that comes down the pipe anyway, because who needs an M3 over an M2 or even an M1 today?


I'm starting to worry that I'm missing out on some huge gains (M1 Air user). But as a programmer who's not making games or anything intensive, I think I'm still good for another year or two?


You're not going to be missing out on much. I had the first M1 Air and recently upgraded to an M3 Air. The M1 Air has years of useful life left, and my upgrade was for reasons unrelated to performance.

The M3 Air performs better than the M1 in raw numbers, but outside of some truly CPU- or GPU-limited tasks you're not likely to actually notice the difference. The day-to-day behavior of the two is pretty similar.

If your current M1 works, you're not missing out on anything. For its power/size/battery envelope the M1 Air was pretty awesome, and it hasn't gotten any worse over time. If it does what you need, then you're good until it doesn't.


I have a 2018 15" MBP and an M1 Air, and honestly they both perform about the same. The only noticeable difference is that the MBP takes ~3 seconds to wake from sleep while the M1 is instant.


I have an M1 Air and I test drove a friend's recent M3 Air. It's not very different performance-wise for what I do (programming, watching video, editing small memory-constrained GIS models, etc.).


I wanted to upgrade my M1 because it would swap a lot with only 8 GB of RAM, and because I wanted a machine that could run big LLMs locally. I ended up going from an 8 GB M1 MacBook Air to a 64 GB M1 MacBook Pro. My other rationale was that it would speed up compilation, which it has, but not by much.
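For anyone wondering whether their own 8 GB machine is in the same boat, here's one quick way to check swap pressure, a small sketch using the third-party psutil package (not the only way to do this):

    # Requires: pip install psutil
    import psutil

    swap = psutil.swap_memory()
    mem = psutil.virtual_memory()
    print(f"swap used: {swap.used / 2**30:.1f} GiB of {swap.total / 2**30:.1f} GiB")
    print(f"memory available: {mem.available / 2**30:.1f} GiB")

If swap usage stays high during normal work, more RAM will help more than a faster chip.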

The M1 Air is a very fast machine, perfect for anyone doing normal things on a computer.


It doesn't seem plausible to me that Apple would release an "M3 variant" that can drive "tandem OLED" displays, so it's probably logical to package whatever chip progress they had (including process improvements) into an "M4".

And it signals "we are serious about the iPad as a computer" by using their latest chip.

A logical alignment with engineering (and manufacturing) progress, packaged smartly to generate marketing capital for sales and brand value.

Wonder how the newer Macs will use these "tandem OLED" capabilities of the M4.


The iPads skipped the M3, so they're comparing your old iPad to the new one.


I like comparing much older hardware with brand new to highlight how far we've come.


> I like comparing much older hardware with brand new to highlight how far we've come.

That's OK, but why skip the previous iteration then? Isn't the M2 only two generations behind? It's not that much older. It's also a marketing blurb, not a reproducible benchmark. Why leave out comparisons with the previous iteration even when you're just hand-waving over your own data?


In this specific case, it's because iPads never got the M3. They're literally comparing it with the previous model of iPad.

There were some disingenuous comparisons throughout the presentation, going back to the A11 for the first Neural Engine, and some comparisons to the M1, but the M2 comparison actually makes sense.


I wouldn't call the comparison to the A11 disingenuous; they were very clear they were talking about how far their Neural Engines have come, in the context of the competition just starting to put NPUs in their stuff.

I mean, they compared the new iPad Pro to an iPod nano; that's just using your own history to make a point.


Fair point. I just get a little annoyed when the marketing speak confuses the average consumer, and I felt as though some of the jargon they used could trip up less informed customers.


Personally, I think this is the comparison most people want. The M3 had a lot of compromises compared to the M2.

That aside, the M4 is about the Neural Engine upgrades above all else (which probably should have been compared to the M3).


What are those compromises? I may buy an M3 MBP, so I'd like to hear more.


The M3 Pro had some downgrades compared to the M2 Pro: fewer performance cores (6 vs. 8) and lower memory bandwidth (150 GB/s vs. 200 GB/s). This did not apply to the M3 and M3 Max.


Yes, kinda annoying. But on the other hand, given that Apple releases a new chip every 12 months, we can cut them some slack here; from AMD, Intel, or Nvidia we usually see a two-year cadence.


There are probably easier problems to solve in the ARM space than in x86, considering the amount of money and time already spent on x86.

That's not to say that any of these problems are easy, just that there's probably more low-hanging fruit in ARM land.


And yet they seem to be the only people picking the apparently "low-hanging fruit" in ARM land. We'll see about Qualcomm's Nuvia-based stuff, but that's been "nearly released" for what feels like years now, yet you still can't buy one to actually test.

And don't underestimate the investment Apple made - it's likely at a similar level to the big x86 incumbents'. I mean, the cost of AMD's entire Zen development team was likely a blip on Apple's balance sheet.


They don't care as much about the ARM stuff because the software development investment vastly outweighs the chip development costs.

Sure, maybe they could do better, but at what cost and for what? The only thing Apple does truly better is performance per watt, which is not relevant for a large part of the market.

x86 is still competitive performance-wise, especially in the GPU department, where Apple's attempts are rather weak compared to what is on offer across the pond. The Apple Silicon switch cost a large amount of developer effort for optimization, and in the process a lot of software compatibility was lost. It took a long time for even the most popular software to get properly optimized, and some software houses even gave up on supporting macOS because it just wasn't worth the man-hour investment considering the tiny market.

This is why I am very skeptical about the Qualcomm ARM stuff: it needs to be priced extremely well to have a chance. If consumers do not pick it up in droves, no software ports are going to happen in a timely manner and it will stay irrelevant. Considering the only thing much better than the current x86 offering is performance per watt, I do not have a lot of hope, but I may be pleasantly surprised.

Apple aficionados keep raving about battery life, but it's not really something a lot of people care about (apart from smartphones, where Apple isn't doing any better than the rest of the industry).


> Qualcomm's Nuvia-based stuff, but that's been "nearly released" for what feels like years now

Launching at Computex in 2 weeks, https://www.windowscentral.com/hardware/laptops/next-gen-ai-...


Good to know that it's finally seeing the light of day. I thought they were still in a legal dispute with ARM about Nuvia's designs?


Not privy to details, but some legal disputes can be resolved by licensing price negotiations, motivated by customer launch deadlines.


Speaking of which, whatever happened to Qualcomm's bizarre assertion that ARM was pulling a sneaky move in all its new licensing deals to outlaw third-party IP entirely and force ARM-IP-only?

There was one quiet "we haven't got anything like that in the contract we're signing with ARM" from someone else, and then radio silence. And you'd really think that would be major news, because it's massively impactful on pretty much everyone, since one of the major use cases of ARM is as a base SoC to bolt your custom proprietary accelerators onto...

Seemed like obvious bullshit at the time from a company trying to "publicly renegotiate" a licensing agreement they probably broke...


Again, I'm not saying that they are easy (or cheap!) problems to solve, but that there are more relatively easy problems in the ARM space than in the x86 space.

That's why Apple can release a meaningfully new chip every year, whereas it takes x86 manufacturers several.


> We'll see about Qualcomm's Nuvia-based stuff, but that's been "nearly released" for what feels like years now, but you still can't buy one to actually test.

That's bound more by legal than technical reasons...


Maybe for GPUs, but for CPUs both Intel and AMD release on a yearly cadence. Even when Intel has nothing new to release, the generation number gets bumped.


> Apple always compares product to product, never component to component when it comes to processors.

I don't think this is true. When they launched the M3, they compared primarily to the M1 to make it look better.



