
So, if it's only 3-5% slower, then for $50-100 I could buy a slightly faster processor and never know the difference?

Just trying to check my understanding of what the 3-5% delta means in practice. It seems like a tiny tradeoff for any workstation (I wouldn't notice the difference, at least). The tradeoff for servers might vary depending on what they are doing (shared versus owned, etc.).
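As a rough sanity check of that framing (the CPU price and scaling assumption below are hypothetical, purely to illustrate the arithmetic):

    # Hypothetical numbers: a $400 CPU with a 3-5% throughput loss from the feature.
    cpu_price = 400.0
    for overhead in (0.03, 0.05):
        # Assume price scales roughly linearly with performance within one
        # product bracket (a big assumption), so buying back the lost
        # performance costs about overhead * cpu_price.
        extra = cpu_price * overhead
        print(f"{overhead:.0%} overhead ~ ${extra:.0f} extra to buy back the performance")

Under those assumptions the lost performance is worth on the order of tens of dollars, so a $50-100 step up in SKU would more than cover it.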



How many thousand tradeoffs like this are you willing to pay for?


This seems beneficial in systems where security concerns trump performance concerns. The above poster has probably made many such tradeoffs already and would likely make more (full disk encryption, virtualization, protection rings, Spectre mitigations, MMIO, ECC, etc.).
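For what it's worth, on a Linux box you can already see which of the CPU-level mitigations in that list are active via sysfs; a minimal sketch (the directory exists on recent kernels, and the report text varies by CPU and kernel version):

    from pathlib import Path

    # Each file under this directory names one hardware vulnerability and
    # reports whether a mitigation is active (e.g. spectre_v2, meltdown, mds).
    vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
    for entry in sorted(vuln_dir.iterdir()):
        print(f"{entry.name}: {entry.read_text().strip()}")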

With processor performance still increasing every generation, it does make sense for workstations where physical access should be considered part of the threat model.


Lots.

But then again, I run a few companies that deal with sensitive data. If I were just a gamer, I wouldn't care.



