That is mostly correct, but the thing is, if you go on Stack Overflow or ChatGPT to have someone else write the solution for you, then it's on you if you end up using the wrong solution.
I've used ChatGPT (as well as SO, like many others) but always in the context of having enough knowledge (either from experience or from experimenting before opening a question) to tell whether the answers were just wrong or not.
To each their own, but I treat ChatGPT as a rubber duck 2.0, and I think that's what it excels at.
There are so many nuances to a problem that the approach of "one question, one answer" is simply naïve.
Even if we wanted to go with this approach, questions that are potential duplicates should just get an answer pointing to the other question, and the OP should decide whether it's right or not.
Having someone without the full view of the question and its context make that call is not the right approach.
> There are so many nuances to a problem that the approach of "one question, one answer" is simply naïve.
Yet I have marked hundreds of questions as duplicates, which was often welcomed by the person who asked, so clearly the system isn't fundamentally broken, even if there is some friction at times.
I've had two questions closed as duplicates. In both cases, the way I asked the question didn't remotely resemble the duplicate (there's no way I would have found it by searching), yet it had the same answer.
You can't let the OP decide whether their question is good. Every OP thinks their question is good, and most of them are idiots. I guarantee you most of the people complaining about SO do so because they suck at asking questions.
As someone with a nine-year-old account on SO, I have never asked a question there, because I didn't fancy the chance of it being immediately shut down.
From personal experience, I have come across numerous questions that appeared to address the exact issue I was struggling with, only to find them closed as duplicates, off-topic, or something similar. Often, the suggested alternative Q&As are unrelated beyond a surface-level similarity, which is especially frustrating.
Sometimes, when I find one of these closed questions, I feel a strong sense of empathy towards the asker, as if I'm the only person out there who understands them.
Unfortunately, I have always been primarily occupied with whatever problems I was dealing with at the time, so I don't have a list of examples I can share with you.
SO is undoubtedly a valuable resource, and I don't completely disagree with the moderation style. However, it has been a slow burn for me.
I don't doubt that this happens. Mods are people and people make mistakes.
Though I also think part of it is that the asker doesn't understand how to apply the answer to their issue, rather than the answer being irrelevant. In my experience, a lot of people are unwilling or unable to adapt an answer to their situation, which can be a problem on a site like SO that doesn't try to cater directly to individuals but rather to answer questions more generally.
I think issues like this contribute a lot to the dissatisfaction we see surrounding SO.
Change the word "most" to "some" and you get a statement that also applies to mods.
I like to check SO for problems I encounter, but there's no way I would ask a question there. I really don't like the toxic climate.
Instead, I figure it out on my own: I read a ton of similar questions and their answers (if they have any) and RTFM to fix the problem. Or I simply ask my friends.
If you must, you could say that SO is not one's friend. At least for me, that holds.