You should know by now that LLMs can and do lie in subtle ways that aren't apparent to non-experts. Using them to understand complicated concepts is a great way to "learn" incorrect information. To be fair, the same can be said of humans, but humans are worse at bullshitting convincingly.
I really just need to converse about a topic to stay inquisitive and to form structured thoughts.
It will tell me if I'm conflating concepts before bullshitting about the ways they're different. That's fine; otherwise my blind spot would have been conflating those concepts for the next decade.
In that regard it's the same as or better than a human.
I don't need it to be a source of truth; I need it to be conversational. It can make up urban legends just like a person does, and I don't care. Just give me a way to talk about a concept and decide whether I want to learn more, and it does that extremely well.