AI summaries are information deodorant. When you stumble onto a misinformation site via Google, there are usually signals you can smell: the way they word their titles, how often they post on similar topics. The 'style' alone hints at the quality of the 'substance'. But read the same substance summarized by an LLM and you can't smell shit.