
It's a reference to an AI apocalypse scenario, such as the paper clip maximiser.

https://en.wikipedia.org/wiki/Instrumental_convergence



Is it though? That article talks about AI having a value mismatch that results in it pursuing an outcome that is undesirable to humans.

Not about wiping out all value in the universe. To the paperclip maximizer, if anything, it's generating the maximal-value universe.


Would "wiping out all human value" remove your objection?



