I know a lot of the time I'm guilty of quick/skim reading articles I like that are shared on HN. I think this one is worth reading more thoroughly than normal. I found doing so rewarding. I was particularly pleased that a conclusion I was forming as I read through was then voiced near the bottom. Namely:
> The science policy scholar Daniel Sarewitz goes so far as to argue that science usually makes public controversies worse. His argument rests on the logical incompatibility of policy expectations and what science actually does — namely, that decision makers expect certainty, whereas science is best at producing new questions. That is, the more scientists study something, the more they uncover additional uncertainties and complexities.
I'm not sure I've ever seen this basic contradiction put so cogently. We want policy (politics) to create certainty and stability. "Science" increases our risk of the unknown by making us more aware of it.
Doubtful. There is a way to project confidence while acknowledging fundamental uncertainty, and I think it's probably more effective and sustainable than outright lying. Being opaque about difficulty creates distrust in authority in the long run, because eventually you accumulate enough fuckups. I think the biggest error in modern leadership practices is confusing certainty with confidence.
Anyways, your statement is literally begging the question:
> ["people want certainty and stability" is] not going to change. Certainty and stability is what people want.
Ahem, need - at least from our leaders. It may be that creating certainty amid uncertainty is itself the core of leadership. The stories we are most certain of are the ones we use to run our lives, make decisions, and take action.
One heuristic for this is the 40-70 rule for decision making: to make a decision you should have no less than 40 percent of the information you would prefer to have, but you shouldn't keep waiting once you have 70 percent of the information you would prefer to have.
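A minimal sketch of how that rule could be encoded, assuming you can put a rough number on the fraction of desired information you actually have (the 0.40 and 0.70 thresholds come straight from the rule; the function name and everything else here is illustrative):

```python
def forty_seventy(information_fraction: float) -> str:
    """Apply the 40-70 rule to an estimated fraction (0.0-1.0) of the
    information you would ideally want before deciding."""
    if information_fraction < 0.40:
        return "gather more information"  # below 40%: deciding now is mostly guessing
    if information_fraction <= 0.70:
        return "decide now"  # inside the rule's window: enough to act on
    return "you have waited too long"  # past 70%: more data won't repay the delay

# Example: with roughly 55% of the information you'd like, the rule says act.
print(forty_seventy(0.55))  # -> decide now
```

Of course the hard part the rule glosses over is estimating that fraction in the first place.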
I'm sympathetic to this. There is a strong argument to be made that this is a need.
> It may be that creating certainty amid uncertainty is itself the core of leadership.
I would agree with this without reservation.
But the phenomenon here is being driven by what people want, not what people need. If they're benefiting from the certainty they get, that's just a coincidence.
Wanting and needing are different things, and while people may need some certainty, they want much more than they need, and they're getting more than the optimal amount.
I remember a story, maybe from The Sea Around Us: politicians had to decide how many fish could be caught the next year. The scientists said "please reduce the catch by x percent, or we believe it will be terrible." The politicians heard the advice and started talking, and in the end they settled on a reduction of less than half of x.
> "Science" increases our risk of the unknown by making us more aware of it.
I'm not sure I understand the phrase "our risk of the unknown". The risk something poses to us is surely the same whether or not we are aware of it; it's just that our mitigation strategies, and even our awareness that we need to mitigate, change in response to increased knowledge.
Perhaps the idea is that people are more hesitant to make decisions when they become aware that what they had previously taken as absolute is in fact conditional? That seems like a good thing (although I agree that it can be taken to paralysing extremes).
It's worded a bit ambiguously, but I think it means we have to deal with the knowledge of how much we don't know about something, and when making policy, every new thing we don't know is a point of contention that can be argued over. In some cases that's beneficial, because it keeps us from making a mistake; in others it's detrimental, because it keeps us from making a beneficial change. But if all policy decisions start tending towards infinite argumentation as more and more open questions are linked to the topic, that's also a problem.
> Perhaps the idea is that people are more hesitant to make decisions when they become aware that what they had previously taken as absolute is in fact conditional?
Most people are horrible at dealing with uncertainty when making decisions. I don't know why this is. But when communicating with anyone other than people used to making executive decisions, taking an uncertain landscape, making a decision, and then projecting certainty works better than conveying the risks.
So if you have two politicians, one who channels a scientist's healthy (and realistic) scepticism and one who takes a random position and blasts it, the latter will tend to be more popular.
> But when communicating with anyone other than people used to making executive decisions, taking an uncertain landscape, making a decision, and then projecting certainty works better than conveying the risks.
I think it depends on what 'better' means. It works better in the sense of getting things done, and of popularity. But, unfortunately, the things that get done are those that are some weighted combination of (a) rewarding in the short term and (b) in the interests of the person who's good at projecting an aura of confidence.
If the 'right' decision tends to align with the interests of the decision-maker, then it's great to have that decision-maker pushing it through. But, when the decision-maker's interests are not those of the general public, paralysis might be better than populist marching into short-term gratification.
(On the other hand, I also recognize that not making any decision until you know it's the right one is just a long-winded way of never making any decision. Making decisions about whether and how to make decisions is just as complicated as the non-meta decisions themselves ….)
Loved the article. (and still an avid scientist)