So, if it's only 3-5% slower, then for $50-100 I could buy a slightly faster processor and never know the difference?
Just trying to check my understanding of what the 3-5% delta means in practice. It seems like a tiny tradeoff for any workstation (I wouldn't notice the difference, at least). The tradeoff for servers might vary depending on what they're doing (shared versus owned, etc.).
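A quick back-of-the-envelope sketch of that reasoning; all prices and percentages below are hypothetical placeholders, not quotes or benchmarks for any real SKU:

```python
# Back-of-the-envelope: does buying up a CPU tier offset a 3-5% slowdown?
# All numbers below are hypothetical placeholders, not real prices or benchmarks.

base_price = 300.0       # hypothetical price of the baseline processor ($)
upgrade_price = 380.0    # hypothetical price of the next tier up ($)
tier_speedup = 0.06      # hypothetical perf gain from the faster SKU (6%)
mitigation_cost = 0.04   # midpoint of the 3-5% slowdown being discussed

# Net throughput of (faster SKU + mitigation) relative to the unmitigated baseline:
net = (1 + tier_speedup) * (1 - mitigation_cost)
print(f"Upgrade + mitigation vs. baseline: {net:.3f}x "
      f"for an extra ${upgrade_price - base_price:.0f}")
# -> roughly 1.018x, i.e. under these assumptions the ~$80 premium
#    more than covers the 3-5% delta
```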
This seems beneficial in systems where security concerns trump performance concerns. The above poster has probably already made many such trade-offs and would likely make more (full disk encryption, virtualization, protection rings, Spectre mitigations, MMIO, ECC, etc.).
With processor performance increasing exponentially, the tradeoff does make sense for workstations where physical access should be part of the threat model.