> "Science" increases our risk of the unknown by making us more aware of it.
I'm not sure I understand the phrase "our risk of the unknown". The risk something poses to us is surely the same whether or not we are aware of it—just that our mitigation strategies, and even the awareness that we need to mitigate, change in response to increased knowledge.
Perhaps the idea is that people are more hesitant to make decisions when they become aware that what they had previously taken as absolute is in fact conditional? That seems like a good thing (although I agree that it can be taken to paralysing extremes).
It's worded a bit ambiguously, but I think it means we have to deal with the knowledge of how much we don't know about something, and when making policy, every new thing we don't know is a point of contention that can be argued over. In some cases that's beneficial, because it keeps us from making a mistake; in others it's detrimental, because it keeps us from making a beneficial change. But if all policy decisions start tending towards infinite argumentation, as more and more open questions get linked to the topic, that's also a problem.
> Perhaps the idea is that people are more hesitant to make decisions when they become aware that what they had previously taken as absolute is in fact conditional?
Most people are horrible at dealing with uncertainty when making decisions. I don't know why this is. But taking an uncertain landscape, making a decision, and then projecting certainty works better than conveying the risks when communicating with anyone except people who are used to making executive decisions.
So if you have two politicians, one who channels a scientist's healthy (and realistic) scepticism and one who takes a random position and blasts it, the latter will tend to be more popular.
> But taking an uncertain landscape, making a decision and then projecting certainty works better than conveying the risks for communicating with everyone but people used to making executive decisions.
I think it depends on what 'better' means. It works better in the senses of getting things done, and of popularity. But, unfortunately, the things that get done are those that are some weighted combination of (a) rewarding in the short-term and (b) in the interests of the person who's good at projecting an aura of confidence.
If the 'right' decision tends to align with the interests of the decision-maker, then it's great to have that decision-maker pushing it through. But, when the decision-maker's interests are not those of the general public, paralysis might be better than a populist march into short-term gratification.
(On the other hand, I also recognize that not making any decision until you know it's the right one is just a long-winded way of never making any decision. Making decisions about whether and how to make decisions is just as complicated as the non-meta decisions themselves ….)