
100% in agreement. Trying to get rid of my 32" 4K. Too much head panning and scanning. I want to comfortably see the entire screen without effort at less than 12 inches away. Creatives likely get some benefit from large displays, but for people who read, code, and do productive stuff, it's too much screen, too many pixels.

27" @ WQHD res seems just about right. 4K if you absolutely must.


> I want to comfortably see the entire screen without effort at less than 12 inches away.

I wouldn’t use a display from this close. It’s better for my eyes to have a larger display a little further away. I’m closer to 30” away with a 32”, and at another desk with a 38”.

> but for people who read, code, do productive stuff, it's too much screen, too much pixels.

I do all of those things and find the opposite. So it would seem it’s more down to individual preference.

> WQHD res seems just about right.

I would dislike this. Especially for text and even more at closer distances.


And before the in-person interview, the applicant is required to produce a handwriting sample of random text in front of the interviewer, which is then compared against the mailed documents.


Steve Jobs rolling in his grave. The mortal enemy. Thermonuclear war.


Enemies? Google contributes about 20% of Apple's annual profits through their default search engine deal; that's more profitable than just about anything else Apple does or makes except selling iPhones.

> The U.S. government said Apple Chief Executive Officer Tim Cook and Google CEO Sundar Pichai met in 2018 to discuss the deal. After that, an unidentified senior Apple employee wrote to a Google counterpart that “our vision is that we work as if we are one company.”

https://www.bloomberg.com/news/articles/2020-10-20/apple-goo...


The original iPhone came pre-loaded with Google search, Maps, and YouTube. Jobs competed with Google, but he also knew Google had best-in-class products.


Jobs brokered a $150M deal with Apple's arch enemy Microsoft in 1997.


I think Cook left easy money on the table by not competing against NVIDIA. They could've tested the waters by putting Apple Silicon on PCIe riser cards, maturing the toolkit for AI workloads, and selling them at competitive prices. Yes, I know they're in the business of making the entire widget, but it would've been easy money. The hardware and software stacks are there. Unlimited upside with nearly zero downside risk.


Apple seems to be avoiding building server hardware for some reason. It seems like a big opportunity: besides AI, the power efficiency of their chips would surely be attractive for datacentres. I think momentum is building for moving away from x86.


They’re building servers now in Houston for their Private Cloud Compute environments, but it’s just for them.


Given how hot my Mac gets when I do anything, they’d need the Pacific Ocean to cool that data center :)


My Apple Silicon MacBooks are the coolest-running computers I've had in decades. Something might be wrong with your cooling system.


Most Macs (both Intel and Apple Silicon) refuse to thermal-throttle until they reach the junction temp.

Both you and the parent can be correct, here; many Macs are quite cool at idle, but also throttle much slower than equivalent Intel or AMD chips under load.
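If you want to check whether your own Mac is actually throttling, macOS exposes a thermal readout via `pmset -g therm`. A minimal sketch that parses that output follows; note the field names below (`CPU_Speed_Limit`, etc.) are what Intel Macs report, and the format may differ on Apple Silicon:

```typescript
// Parse `pmset -g therm` output into a map of numeric fields,
// e.g. "CPU_Speed_Limit \t= 100" -> { CPU_Speed_Limit: 100 }.
function parseTherm(output: string): Record<string, number> {
  const fields: Record<string, number> = {};
  for (const line of output.split("\n")) {
    const m = line.match(/^\s*(\w+)\s*=\s*(\d+)/);
    if (m) fields[m[1]] = parseInt(m[2], 10);
  }
  return fields;
}

// A CPU_Speed_Limit below 100 means the CPU is being slowed for thermals.
function isThrottling(fields: Record<string, number>): boolean {
  return (fields["CPU_Speed_Limit"] ?? 100) < 100;
}
```

You'd feed it the stdout of `pmset -g therm` (via `child_process.execSync` in Node, say) and poll under load to see when the limit drops.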


> left easy money on the table by not competing against NVIDIA

What!? Seems to me the timelines don't support this. Apple has had its hands full with its chip design effort across multiple generations of CPUs; would they have had the bandwidth to take on the GPU field as well? And Apple's successes are more recent than Nvidia's with GPUs. Apple Silicon's capability wasn't there yet when Nvidia created and then conquered the GPU world.


It sure as hell would have been better than wasting time with touch bars and vision pros.

Edit: dare I add apple watches


I don't know why this is getting downvoted. Apple for sure could make very capable hardware/software for cloud AI workloads - directly taking on NVIDIA.


Love walking by the AS/400 terminal emulator screens after checkout. If it ain't broke, why fix it?


You may need to get it to a shape where you can hire people to maintain it and not rely on IBM all the time ;)

But that being said, I love TUIs. I never need to use a mouse: muscle memory and speed. Moving away from the AS/400 shouldn't mean giving up the TUI.


It's a strange curiosity why Amazon can't use its economies of scale to build a Kindle monitor that devs and writers would early-adopt like wildfire.


I fear any upstart LinkedIn competitor sans social would just become another LinkedIn over time. Just don’t see the incentives for entering the professional networking space.


Was thinking more of something like an extension that hides or filters out certain activities.
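As a sketch of that idea, the core of such an extension is just a keyword predicate applied to each feed item. Everything here is an assumption for illustration: the keyword list is made up, and the commented-out selector is a hypothetical placeholder, not LinkedIn's real DOM:

```typescript
// Hypothetical keywords to hide; tune to taste.
const HIDDEN_KEYWORDS = ["celebrating", "likes this", "promoted"];

// True if a feed item's text matches any hidden keyword (case-insensitive).
function shouldHide(postText: string): boolean {
  const text = postText.toLowerCase();
  return HIDDEN_KEYWORDS.some((kw) => text.includes(kw));
}

// In a browser content script, one might apply it roughly like this
// ("div.feed-item" is a made-up selector for illustration):
// document.querySelectorAll<HTMLElement>("div.feed-item").forEach((el) => {
//   if (shouldHide(el.innerText)) el.style.display = "none";
// });
```

Keeping the predicate pure makes it trivial to test and reuse if the site's markup changes out from under the extension.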


burpees


The new Air’s right around the corner, so hopefully some of those concerns will be addressed. I’m excited that Apple can now release yearly refreshes without hesitation. No more vendor roadmaps to rely upon. This should mean faster iteration and refinement toward what customers want.


Surprised they don't have a system status page.

