Hacker News

AI that hallucinates accurately enough times should just carry Errors and Omissions insurance like human contractors do


Who in their right mind would underwrite that? Hallucinations are an inherent part of how these models work, and there's no reliable way to estimate whether a given model's hallucinations are "accurate enough" or not. It'd be like a reverse lottery ticket for the insurance company.



