Two things are infinite: the universe and human stupidity; and I’m not sure about the universe.
- Albert Einstein
During combat operations, we’d always turn to the intelligence cell to tell us where to find the enemy. The answers we’d get were always some variation of “they’re likely to be here” or “it’s possible they’re there.” The simple reason those answers were vague was that the intelligence analysts – who were far smarter than the average grunt – knew there was always a chance they were wrong. Perhaps more importantly, they also knew they’d never hear the end of it if we went to one of those places they’d identified and found nothing.
For what it’s worth, they were right.
The problem with those vague assessments, however, was that they led many grunts to second-guess anything the intelligence cell told us, since we never got a definitive answer. In a way, by trying to represent uncertainty, those super-smart analysts ended up undermining their own credibility.
In a similar way, scientists are sometimes their own worst enemy, because for many of the same reasons, they often hedge their findings.
In more technical terms, the scientific method rests on the idea that it’s always possible to show a hypothesis is false, but impossible to prove it’s true. After all, additional evidence might always turn up that contradicts the supposed truth, like how for more than 1,500 years most science-minded people thought the Sun orbited the Earth rather than the other way around. The military equivalent is an assault platoon arriving at a suspected enemy compound and finding nobody there.
Swing and a miss!
Or is it?
In any event, it’s at least understandable why scientists build escape-clauses into hypotheses. This is even more understandable when the hypotheses are huge, with large numbers of variables that can’t be easily measured or tested, like climate change, because each of those variables represents a potential source of error.
That said, science is supposed to be useful, and so in cases where the scientific community believes there’s enough information, they’ll choose to accept a hypothesis. This decision considers factors like: the cost of gathering more data, the implications if the hypothesis is wrong, and the consequences of ignoring a true hypothesis. In many cases, this is a no-brainer. Smoking, once endorsed by doctors, is now universally recognized as harmful. Still, that hasn’t stopped tobacco companies from trying to make a buck.
Where accepting a hypothesis on the basis of enough information might become truly problematic is when the hypothesis, were it to be accepted, has the potential to affect the way of life of every person and creature on the planet. And not in a small way, either, but in a ‘re-imagine the economic, social, and political landscape of the entire world’ kind of way. In case there’s any doubt: here’s looking at you, climate change.
In this scenario, lots of people will seize every opportunity to question the hypothesis, because accepting it would mean drastic collective action on a scale unheard of in human history – both to lessen the effects and to help our societies prepare for consequences that are unavoidable in everything but their severity.
In other words, there will be skeptics, lots of them. This is a good thing. The scientific method is based on skepticism, after all, which is why a hypothesis can never be proven as true.
Still, just because some people say they’re skeptical about climate change doesn’t mean the hypothesis isn’t correct.