
Potentially > 2x greater battery life for the same amount of compute!

That is pretty crazy.

Or am I missing something?



Sadly, this is only processor power consumption; you need to put power into a whole lot of other things to make a useful computer… a display backlight and the system's RAM come to mind as particular offenders.


The backlight is now the main bottleneck for consumption-heavy use. I wonder what advancements are happening there to optimize the wattage.


If the use cases involve working on dark terminals all day, or watching movies with dark scenes, or if the general theme is dark, maybe the new OLED display will help reduce the display power consumption too.


AMD GPUs have "Adaptive Backlight Management" which reduces your screen's backlight but then tweaks the colors to compensate. For example, my laptop's backlight is set at 33%, but with ABM it reduces my backlight to 8%. Personally I don't even notice it is on / my screen seems just as bright as before, but when I first enabled it I did notice some slight difference in colors, so it's probably not suitable for designers/artists. I'd 100% recommend it for coders though.
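For anyone on Linux who wants to poke at it, here's a hedged sketch, not official tooling: recent amdgpu kernels expose an ABM level from 0 (off) to 4 (most aggressive). The sysfs path and connector name below are assumptions that vary by kernel and machine, and there's also an amdgpu.abmlevel boot parameter.

    # Hedged sketch: set the AMD ABM (Adaptive Backlight Management)
    # aggressiveness for the internal panel. The sysfs attribute path
    # below is an assumption; check your kernel version and connector.
    from pathlib import Path

    def set_abm_level(level: int, connector: str = "card0-eDP-1") -> None:
        """Write an ABM level: 0 disables, 4 dims the backlight hardest."""
        if not 0 <= level <= 4:
            raise ValueError("ABM level must be in 0..4")
        attr = Path(f"/sys/class/drm/{connector}/amdgpu/panel_power_savings")
        attr.write_text(str(level))  # needs root

    set_abm_level(3)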


Strangely, Apple seems to be doing the opposite for some reason (color accuracy?): dimming the display doesn't seem to reduce the backlight as much, and they're combining it with software dimming, even at "max" brightness.

Evidence can be seen when opening iOS apps, which seem to glitch out and reveal the brighter backlight [1]. Notice how #FFFFFF white isn't the same brightness as the white in the iOS app.

[1] https://imgur.com/a/cPqKivI


The max brightness of the desktop is gonna be lower than the actual max brightness of the panel, because the panel needs to support HDR content. That brightness would be too much for most cases.


This was a photo of my MBA 15", which doesn't have an HDR-capable screen afaik. Additionally, this artifacting happens at all brightness levels, including the lowest.

It also just doesn't seem ideal that some apps (iOS) appear much brighter than the rest of the system. HDR support in macOS is a complete mess, although I'm not sure if Windows is any better.


Please give me an external ePaper display so I can just use Spacemacs in a well-lit room!


Onyx makes a 25" HDMI eInk display [0]. It's pricey.

[0] https://onyxboox.com/boox_mirapro

edit: "25, not "27


I'm still waiting for the technology to advance. People can't reasonably spend $1500 on the world's shittiest computer monitor, even if it is on sale.


Dang, yeah, this is the opposite of what I had in mind

I was thinking, like, a couple hundred dollar Kindle the size of a big iPad I can plug into a laptop for text-editing out and about. Hell, for my purposes I'd love an integrated keyboard.

Basically a second, super-lightweight laptop form-factor I can just plug into my chonky Macbook Pro and set on top of it in high-light environments when all I need to do is edit text.

Honestly not a compelling business case now that I write it out, but I just wanna code under a tree lol


I think we're getting pretty close to this. The Remarkable 2 tablet is $300, but it can't take video input, and software support for anything beyond note-taking is near non-existent. There's even a keyboard available for it. Boox and Hisense are also making e-ink tablets/phones for reasonable prices.


A friend bought it & I had a chance to see it in action.

It is nice for some very specific use cases. (They're in the publishing/typesetting business. It's… idk, really depends on your usage patterns.)

Other than that, yeah, the technology just isn't there yet.


If that existed as a drop-in screen replacement on the Framework laptop, with a high-refresh-rate color Gallery 3 panel, then I'd buy it at that price point in a heartbeat.

I can't replace my desktop monitor with eink because I occasionally play video games. I can't use a 2nd monitor because I live in a small apartment.

I can't replace my laptop screen with greyscale because I need syntax highlighting for programming.


Maybe the $100 nano-texture screen will give you the visibility you want. Not the low power of an epaper screen though.

Hmm, emacs on an epaper screen might be great if it had all the display update optimization and "slow modem mode" that Emacs had back in the TECO days. (The SUPDUP network protocol even implemented that at the client end and interacted with Emacs directly!)


QD-OLED reduces it by like 25% I think? But maybe that will never be in laptops, I'm not sure.


QD-OLED is an engineering improvement, i.e. combining existing researched technology to improve the resulting product. I wasn't able to find a good source on what exactly it improves in efficiency, but it's not a fundamental improvement in OLED electrical→optical energy conversion (if my understanding is correct.)

In general, OLED screens seem to have an efficiency around 20~30%. Some research departments seem to be trying to bump that up [https://www.nature.com/articles/s41467-018-05671-x], which I'd be more hopeful on…

…but, honestly, at some point you just hit the limits of physics. It seems internal scattering is already a major problem; maybe someone can invent pixel-sized microlasers and that'd help? More than 50-60% seems like a pipe dream at this point…

…unless we can change to a technology that fundamentally doesn't emit light, i.e. e-paper and the likes. Or just LCD displays without a backlight, using ambient light instead.


Is the iPad Pro not on OLED yet? All of Samsung's flagship tablets have had OLED screens for well over a decade now. It eliminates the need for backlighting, has superior contrast, and is pleasant to use in low-light conditions.


The iPad that came out today finally made the switch. iPhones made the switch around 2016. It does seem odd how long it took the iPad to switch, but Samsung definitely switched too early: my Galaxy Tab 2 suffered from screen burn-in that I was never able to recover from.


LineageOS has an elegant solution for OLED burn-in: imperceptibly shift persistent UI elements by a few pixels over time.
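Rough sketch of the idea (illustration only, not LineageOS's actual code; the shift radius and interval are made-up values):

    # OLED burn-in mitigation via pixel shifting: periodically nudge
    # static UI elements (status bar, nav bar) by a tiny random offset
    # so no pixel renders the same static content indefinitely.
    import random
    import time

    SHIFT_RADIUS_PX = 2      # small enough to be imperceptible (assumed)
    SHIFT_INTERVAL_S = 300   # re-shift every 5 minutes (assumed)

    def next_offset() -> tuple[int, int]:
        """Pick a new (dx, dy) translation for persistent UI elements."""
        return (random.randint(-SHIFT_RADIUS_PX, SHIFT_RADIUS_PX),
                random.randint(-SHIFT_RADIUS_PX, SHIFT_RADIUS_PX))

    while True:
        dx, dy = next_offset()
        print(f"translate static layers by ({dx}, {dy}) px")  # stand-in for the compositor call
        time.sleep(SHIFT_INTERVAL_S)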


I'm not sure how OLED and backlit LCD compare power-wise exactly, but OLED screens still need to give off a lot of light; they just do it directly instead of with a backlight.


that's still amazing, to me.

I don't expect an M4 macbook to last any longer than an M2 macbook of otherwise similar specs; they will spend that extra power budget on things other than the battery life specification.


Thanks. That makes sense.


Comparing the tech specs for the outgoing and new iPad Pro models, that potential is very much not real.

Old: 28.65 Wh (11") / 40.88 Wh (13"), up to 10 hours of surfing the web on Wi-Fi or watching video.

New: 31.29 Wh (11") / 38.99 Wh (13"), up to 10 hours of surfing the web on Wi-Fi or watching video.


A more efficient CPU can't improve that spec because those workloads use almost no CPU time and the display dominates the energy consumption.


Unfortunately Apple only ever thinks about battery life in terms of web surfing and video playback, so we don't get official battery-life figures for anything else. Perhaps you can get more battery life out of your iPad Pro web surfing by using dark mode, since OLEDs should use less power than IPS displays with darker content.


Yeah, double the PPW does not mean double the battery life, because unless you're pegging the CPU/SoC it's likely only a small fraction of the power consumption of a light-use or idle device, especially for an SoC which originates in mobile devices.

Doing basic web navigation with some music in the background, my old M1 Pro has short bursts at ~5W (for the entire SoC) when navigating around, a pair of watts for mild webapps (e.g. checking various channels in discord), and typing into this here textbox it's sitting happy at under half a watt, with the P-cores essentially sitting idle and the E cores at under 50% utilisation.

With a 100Wh battery that would be a "potential" of 150 hours or so. Except nobody would ever sell it for that, because between the display and radios the laptop's actually pulling 10~11W.
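To put numbers on it (these are my rough estimates from above, not measurements):

    # Halving SoC power barely moves runtime when the display and
    # radios dominate total draw.
    BATTERY_WH = 100.0
    REST_OF_SYSTEM_W = 10.0   # display, radios, etc.
    SOC_W = 0.5               # light web browsing, mostly E-cores

    def runtime_h(soc_w: float) -> float:
        return BATTERY_WH / (REST_OF_SYSTEM_W + soc_w)

    print(f"current SoC:      {runtime_h(SOC_W):.1f} h")      # ~9.5 h
    print(f"2x efficient SoC: {runtime_h(SOC_W / 2):.1f} h")  # ~9.8 h, barely better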


On my M1 air, I find for casual use of about an hour or so a day, I can literally go close to a couple weeks without needing to recharge. Which to me is pretty awesome. Mostly use my personal desktop when not on my work laptop (docked m3 pro).


So this could be a bit helpful for heavier duty usage while on battery.


Ok, but is it twice as fast during those 10 hours, leading to 20 hours of effective websurfing? ;)


Isn't this weird? A new chip consumes half the power, but the battery life is the same.


No, they have a "battery budget". If the CPU power draw goes down, that means the budget goes up and you can spend it on other things, like a nicer display or some other feature.

When you say "up to 10 hours" most people will think "oh nice that's an entire day" and be fine with it. It's what they're used to.

Turning that into 12 hours might be possible but are the tradeoffs worth it? Will enough people buy the device because of the +2 hour battery life? Can you market that effectively? Or will putting in a nicer fancy display cause more people to buy it?

We'll never get significant battery life improvements because of this, sadly.


The OLED likely adds a fair bit of draw; they're generally somewhat more power-hungry than LCDs these days, assuming like-for-like brightness. Realistically, this will be the case until MicroLEDs are available for non-completely-silly money.


This surprises me. I thought the big power downside of LCD displays is that they use filtering to turn unwanted color channels into waste heat.

Knowing nothing else about the technology, I assumed that would make OLED displays more efficient.


OLED will use less power for a screen of black, and LCD will use less for a screen of white. Take the average of whatever content is on your screen, and for you it may be better or may be worse.

White background document editing, etc., will be worse, and this is rather common.
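As a toy model (the coefficients are invented for illustration; real panels vary a lot):

    # LCD backlight power is roughly constant regardless of content,
    # while OLED power scales with average pixel luminance.
    LCD_BACKLIGHT_W = 3.0     # assumed constant backlight draw
    OLED_FULL_WHITE_W = 5.0   # assumed draw at an all-white screen

    def oled_w(avg_luminance: float) -> float:
        """avg_luminance in [0, 1]: 0 = all black, 1 = all white."""
        return OLED_FULL_WHITE_W * avg_luminance

    for content, lum in [("dark terminal", 0.1), ("white document", 0.9)]:
        o, l = oled_w(lum), LCD_BACKLIGHT_W
        print(f"{content}: OLED {o:.1f} W vs LCD {l:.1f} W -> {'OLED' if o < l else 'LCD'} wins")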


Can’t beat the thermodynamics of exciton recombination.

https://pubs.acs.org/doi/10.1021/acsami.9b10823


It's not weird when you consider that browsing the web or watching videos has the CPU idle or near enough, so 95% of the power draw is from the display and radios.


this


Wait a bit. M2 wasn't as good as the hype.


That's because M2 was on the same TSMC process generation as M1. TSMC is the real hero here. M4 is the same generation as M3, which is why Apple's marketing here is comparing M4 vs M2 instead of M3.


Actually, M4 is reportedly on the more cost-efficient TSMC N3E node, whereas Apple was apparently the only customer on the more expensive TSMC N3B node; I'd expect Apple to move away from M3 to M4 very quickly for all their products.

https://www.trendforce.com/news/2024/05/06/news-apple-m4-inc....


Yeah and M2 was on N5P vs M1's N5, but it was still N5. M4 is still N3.


Saying TSMC is a hero ignores the thousands of suppliers that improved everything required for TSMC to operate. TSMC is the biggest, so they get the most experience on all the new toys the world's engineers and scientists are building.


It's almost as if every part of the stack -- from the uArch that Apple designs down to the insane machinery from ASML, to the fully finished SoC delivered by TSMC -- is vitally important to creating a successful product.

But people like to assign credit solely to certain spaces if it suits their narrative (lately, that Apple isn't actually all that special at designing their chips, and that it's all solely the process advantage).


Saying TSMC's success is due to their suppliers ignores the fact that all of their competitors failed to keep up despite having access to the same suppliers. TSMC couldn't do it without ASML, but Intel and Samsung failed to do it even with ASML.

In contrast, when Apple's CPU and GPU competitors get access to TSMC's new processes after Apple's exclusivity period expires, they achieve similar levels of performance (except for Qualcomm because they don't target the high end of CPU performance, but AMD does).


TSMC being the biggest let them experiment at 10x the rate. It turns out they had the right business model that Intel didn't notice was there; it just requires dramatically lower margins, higher volumes, and far lower-paid engineers.


I thought M3 and M4 were different processes though. Higher yield for the latter or such.


And why are other PC vendors not latching on to the hero?


Apple often buys their entire capacity (of a process) for quite a while.


Apple pays TSMC for exclusivity on new processes for a period of time.


2x efficiency vs a 2-year-old chip is more or less in line with expectations (Koomey's law). [1]

[1] https://en.wikipedia.org/wiki/Koomey%27s_law
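As a quick sanity check (the doubling times below are the commonly cited figures, not data from the chip announcement):

    # Koomey's law: computations per joule double every T years, so the
    # expected efficiency gain over dt years is 2 ** (dt / T). "2x in
    # 2 years" lands between the post-2000 and historical trends.
    def koomey_gain(years: float, doubling_time_years: float) -> float:
        return 2 ** (years / doubling_time_years)

    print(f"post-2000 trend (~2.6 y doubling): {koomey_gain(2, 2.6):.2f}x")    # ~1.70x
    print(f"historical trend (~1.57 y doubling): {koomey_gain(2, 1.57):.2f}x") # ~2.42x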


Is the CPU/GPU really dominating power consumption that much?


Nah, GP is off their rocker. For the workloads in question the SOC's power draw is a rounding error, low single-digit percent.



