That's an interesting topic that sometimes gets explored in sci-fi. If an AI were created that could learn all knowledge and form it into a single consistent model of reality, but it ended up reaching conclusions we don't want to hear, what would the consequences be for humanity and living things in general?
The Hitchhiker's Guide to the Galaxy is almost exactly what Mountain_Skies is describing.
If I remember correctly, Peter Watts has a somewhat more realistic take on this in his Rifters trilogy (under the Novels section here: https://rifters.com/real/shorts.htm), where there's a brain in a box that sifts through huge amounts of information and gives advice to political leaders. The trilogy as a whole is more... deep-sea cyberpunk than particularly centered on the brain in a box, though.