
I'm glad someone else said this because I was just about to. One of the things I love about Rama 1 is how it squashes the idea of a human-centric universe where everything has to occur for reasons knowable by us. Rama is truly alien, inscrutable, and fulfilling a purpose we don't get to understand. Almost as soon as it enters our solar system, it's gone for good, leaving a lot unanswered.

I wonder if this can help with the extremely irritating bug (intentional?) on the X270 where, if you give it a third-party 9-cell battery, it raises CPU_PROCHOT all the damn time, dropping my processor below 1 GHz clock speeds.

Back when I used to have an X270, I had a shell script that ran on boot and poked a register to disable the throttling behaviour. Not at all ideal, but it made the machine usable in the absence of official Lenovo batteries, which they stopped manufacturing pretty damn quickly.
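For the curious, a minimal sketch of what such a boot script tends to look like, assuming the msr-tools package (rdmsr/wrmsr) and that the platform uses the common convention of bit 0 ("BD PROCHOT") in MSR 0x1FC; the exact register and semantics vary by CPU, so treat this as illustrative rather than as the script I actually ran:

```shell
#!/bin/sh
# Hypothetical sketch: clear the BD PROCHOT bit (bit 0 of MSR 0x1FC on
# many Intel CPUs) so an EC-asserted PROCHOT no longer throttles the CPU.
# Needs root, the msr kernel module, and rdmsr/wrmsr from msr-tools.

MSR=0x1FC

# Pure arithmetic helper: clear bit 0 of a value (no hardware access).
clear_bd_prochot() {
    echo $(( $1 & ~1 ))
}

# Only touch hardware when the MSR interface is actually available.
if [ -e /dev/cpu/0/msr ] && command -v rdmsr >/dev/null 2>&1; then
    cur=$(rdmsr -d "$MSR")                       # current value, decimal
    wrmsr -a "$MSR" "$(clear_bd_prochot "$cur")" # write back to all cores
fi
```

Dropping this into a systemd unit or rc.local gets the one-shot behaviour described above; the catch, as noted later in the thread, is that firmware can simply set the bit again.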


You can use ThrottleStop[1] to disable PROCHOT on a non-standard battery. I ran into a similar throttling issue on my Dell Precision laptop when charging it via a 60 W USB-C charger instead of the proprietary 130 W barrel plug. The system triggered a warning about a low-power charger and initiated aggressive CPU frequency scaling. With ThrottleStop, I was able to use the 60 W Type-C charger just fine for lightweight tasks (web browsing, older games).

[1]: https://www.techpowerup.com/download/techpowerup-throttlesto...


Dell likes to pull this stunt on other devices too, like the 1L desktops in the OptiPlex line that I managed for many years. Even though we were using genuine Dell power adapters, if a connector came slightly unseated while still delivering power, the machine would assert PROCHOT.

This was fine until the machines randomly started asserting PROCHOT on genuine power adapters that were fully plugged in. Eventually I just used PDQ to deploy a configuration to all the machines that ran ThrottleStop in the background, set to disable PROCHOT on login.

Unfortunately, I couldn't get it to consistently disable PROCHOT pre-login, so students and teachers in my labs would regularly wait 3-4 minutes while the machines chugged along at 700 MHz preparing their accounts.


>You can use ThrottleStop[1] to disable PROCHOT

Dell disables that tinkering in the BIOS/EC on some XPS models, so ThrottleStop won't do jack.


On my non-ThinkPad Lenovo Yoga (Whiskey Lake), disabling BD PROCHOT leads to a crash.


>ThrottleStop[1] to disable PROCHOT

Can confirm, it works great to bypass throttling on an X230 with a 60 W supply.


Nice to finally know what was happening to my X270 after so many years. Good thing it doesn't happen when connected to mains power, since nowadays it's my home server.


I was the happy owner of a ThinkPad X1 Extreme G1. It had that bug out of the box, with a new original battery. Once it thermal throttled, it never went back to full GHz, and it throttled pretty quickly: big CPU, small chassis. Yes, I had a script like that.

It is still somewhere on a shelf, so maybe its day will come again.


ThinkPads do the same thing when they detect a 65 W supply instead of 90 W, even though you only need 90 W when running full tilt while charging.


Oh almost certainly. PROCHOT is programmable.


Possibly. Usually this is handled by the embedded controller, and I'm not sure whether that has been reverse engineered or not. You may be able to tristate the GPIO line that tells the CPU which pin means PROCHOT, which would let you ignore the EC's attempts to assert it.


I wonder now if a similar problem is present in the A285, which is a cousin of the X270.


Yeah, I wrote a similar script. I run it once I see the clock drop to 400 MHz, but then wait an eternity for the sudo prompt to appear before it can run.


Do you think it could also be due to an ACPI table?


It's possible. I know from the BIOS revision changelogs that the T470 did get a patch to fix this, but the X270 never did.


Not sure if the battery issue is fully related, but...

As a former owner of a T470: Lenovo included a pretty beefy component from Intel that was supposed to be feature-complete by itself for dynamically managing thermals, including funky ideas like detecting whether you were using the laptop on your legs and reducing thermals then, while giving full power when plugged in at a desk.

Come delivery time, Lenovo found out that Intel had done a half-assed job (not for the first time; compare the earlier Rapid Start "hibernation" driver), and the result is that the Kaby Lake T470 (and the X270, which shares most of the design) has broken thermals when running anything other than Windows with the special Intel driver. That leads to funny tools that run in a loop poking at an MSR in the CPU, in constant whack-a-mole with a piece of code deep in the firmware.
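The whack-a-mole loop described here could be sketched roughly as below, again assuming the common MSR 0x1FC / BD PROCHOT convention (the register and interval are illustrative, not the actual firmware-fighting tool in question):

```shell
#!/bin/sh
# Hypothetical sketch of the looping workaround: firmware keeps
# re-asserting BD PROCHOT, so re-clear bit 0 of MSR 0x1FC periodically.
# Needs root, the msr kernel module, and rdmsr/wrmsr from msr-tools.

MSR=0x1FC
INTERVAL=${1:-5}   # seconds between checks

bit0_set() {       # pure helper: true if bit 0 is set in the value
    [ $(( $1 & 1 )) -ne 0 ]
}

# Keep clearing the bit for as long as the MSR interface is present.
while [ -e /dev/cpu/0/msr ] && command -v rdmsr >/dev/null 2>&1; do
    cur=$(rdmsr -d "$MSR")
    if bit0_set "$cur"; then
        wrmsr -a "$MSR" $(( cur & ~1 ))   # clear BD PROCHOT on all cores
    fi
    sleep "$INTERVAL"
done
```

Only writing when the bit is actually set keeps the loop cheap; the firmware sets it back at its own pace, hence the whack-a-mole.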


Their ARM64 boxes are fantastic, but sold out at the moment.


I feel like ramming the entire current desktop userspace into a phone is a misguided approach. I can fully see now why Android reinvented the wheel across the board.

If I were to do a Linux Phone platform, I'd be targeting feature phone levels of functionality to begin with, with a focus on battery life and actually working telephony. I'd be aggressively throwing Wayland/GTK and all that nonsense in the bin just to get something basic working well. Draw straight to the framebuffer if you have to. This doesn't help with the app problem, but it sets a tide mark for quality & performance, and it can be iterated on.


With not-quite-current hardware as supported by Pocketblue, performance is not that much of an issue, despite the OnePlus 6 being introduced in 2018. GNOME Shell mobile is quite smooth on it.

That said, if you want to start without the entire Linux desktop stack, you can, and there's even a project that already does something like that IIUC: https://sr.ht/~mil/framebufferphone/


I got mine second-hand on eBay as new old stock: £300 for a 55" 4K panel. The only thing I can ding it for is that the backlight local dimming is done in columns, which is extremely distracting, so I turn it off. You have to remember this thing is designed to sit in a shop window in direct sunlight.

Ticks all my other boxes though: it powers on as soon as my finger leaves the button on the remote, and the same goes for input switching and any other interactions with the OSD. It's completely braindead, just how I like it.

Oh, they also sent me the model with the touch digitizer installed, so I've got capacitive touch and pen input. It has a USB-B port on the side to connect to a computer.


What's the model?


It's a NEC MultiSync M551.


I've switched to a low-carb diet this year and have cut out just about all processed foods. I am considering getting a GLP-1 injection privately in the near future. I'm hopeful that when I do get down to my target weight, my diet will remain changed, my habits will have improved, and I'll be putting my new mobility to some use.

I don't plan on going cold turkey, I'll taper off the dose slowly and see what happens.


I don't think the bad sound is necessarily deliberate; it's more a casualty of TVs becoming so thin that there's not enough room for a decent cavity inside.

I had a 720p Sony Bravia from around 2006, and it was chunky. It had nice large drivers and a big resonance chamber; it absolutely did not need a soundbar and was very capable of filling a room on its own.


A dedicated GPU is a red flag for me in a laptop. I do not want the extra power draw or the hybrid-graphics silliness. The Radeon Vega in my ThinkPad is surprisingly capable.


For me it's a necessity to run the software I need for my work (CAD design).


How so? APUs have gotten powerful enough that they can manage even moderately demanding games (at reasonable resolutions).


Dedicated GPUs in gaming laptops are a necessity for the IT industry, as they force manufacturers, assemblers and software makers to be more creative and ambitious with power draw, graphics software, and optimal use of the available hardware resources. For example: better battery and different performance modes to compensate for the GPU's higher power consumption, so a low-power mode enabled by a casual user disables the dedicated GPU and makes the OS and apps rely on the integrated GPU, while the same user on the same PC can switch to the dedicated GPU when playing a game or doing VFX or modelling work.

Without dedicated GPUs, we consumers will get only weaker hardware, slower software and the slow death of the graphics software market. See the fate of the Chromebook market segment: it is almost dead, and ChromeOS itself got abandoned.

Meanwhile, the same Google that made ChromeOS as a fresh alternative to Windows, Mac and Linux is trying to gobble up the AI market. And the AI race is on.

And the result of all this AI focus and veering away from dedicated GPUs (even by market leader Nvidia, which no longer treats GPUs as a priority) is not only skyrocketing price hikes in hardware components but other side effects too. New laptops are being launched with NPUs, which are good for AI but bad for gaming and VFX/CAD-CAM work, yet they cost a bomb. As a result, the budget laptop segment has suffered: new budget laptops ship with just 8 GB of RAM, a 250 GB/500 GB SSD and a poor CPU, hardware so weak that even basic software (MS Office) struggles on them. And yet even such poor laptops cost more these days. This kind of deliberate market crippling affects hundreds of millions of students and middle-class customers who need affordable yet decently performing PCs.


Same here... I do not wish for a laptop with >65W USB-C power requirements.


Yeah, I agree it's not worth having both an iGPU and a dedicated GPU, if that's what you're talking about; there are always issues with that setup in laptops. But I'd stay away from all laptops at this point until we get an administration that enforces antitrust. All manufacturers have been cutting so many corners that you're likely to have hardware problems within a year unless it's a MacBook or a business-class laptop.


I imagine because vsync and triple buffering introduce latency. There are cases, like games, where you don't want all that lag.


If the goal was to reduce latency, wouldn’t you want the desktop compositor out of the way when vsync is on?


That's very true, and I believe Wayland has a DRM leasing extension just for this use case. SteamVR uses it to punch through the compositor and draw straight to the screen.


It's old enough to use a CCFL backlight, and those turn yellow with age.


Thanks ;)

