First, someone did exactly that and created a chatbot to emulate his dead fiancée [1]. You can read about his experience there.

In my opinion, this type of chatbot will generate mostly generic messages ("So, how's the weather?"), but also some random ones (I have a chatbot right now that answers exclusively in emojis for no good reason) and some that actually follow the fine-tuned data ("I love fishing!"). I believe most people (myself included) will latch onto those last ones as proof that the chatbot is answering the way the person would have, and rationalize all evidence to the contrary ("maybe grandpa really liked emojis and I just didn't know until now").

I think it has the potential to be therapeutic, but I am not a psychologist. And I do worry about the fine line between "this realistic baby doll will help you overcome the loss of your child" and "this realistic doll of a woman is better than a real woman and I'm going to marry it".

[1] https://www.sfchronicle.com/projects/2021/jessica-simulation...