
The Wikipedia article cited above has some practical reasons:

> Early binary computers aimed at the same market therefore often used a 36-bit word length. This was long enough to represent positive and negative integers to an accuracy of ten decimal digits (35 bits would have been the minimum). It also allowed the storage of six alphanumeric characters encoded in a six-bit character code.
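A quick back-of-the-envelope check of that arithmetic, as a minimal Python sketch: the largest ten-digit decimal magnitude fits in 34 bits, so 35 bits cover it with a sign, and six 6-bit characters fill the 36-bit word exactly.

    # Sanity-check the quoted claims about 36-bit words.
    ten_digit_max = 10**10 - 1                 # largest ten-decimal-digit magnitude
    bits_for_magnitude = ten_digit_max.bit_length()   # 34
    bits_with_sign = bits_for_magnitude + 1           # 35 -- the quoted minimum
    print(bits_with_sign)                      # 35
    print(6 * 6)                               # 36 -- six 6-bit characters per word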



Coincidentally I commented on this above at the same time you were posting -- thanks.



