Data-Driven Objectivity

I recently had an exchange online with someone I tend to like, and it was about self-driving cars. My correspondent said that he would never, under any circumstances, get into a self-driven car. In fact, he seemed to think that self-driving cars would lead to carnage on the roads. My own opinion is that human-driven cars have already led to very demonstrable carnage, and that in all likelihood computers would do a better job. As you might imagine, this impressed my correspondent not in the least. When I observed that his objections were irrational, he said I should choose my words more carefully, but that he would overlook the insult this time.

Possibly that is a bad way to phrase my objection, but it is also, in the strict sense of the term, precisely the proper word to use. What I was saying is that his view had no basis in data or facts, and was purely an emotional response. We all have those, and I’m not claiming any superiority on that ground. But when the Enlightenment philosophers talked of reason, it was in contrast to religion and superstition, and it really did mean thinking in terms of data, facts, and logic. It is my own view that this type of thinking bears major responsibility for the progress the human race has made in science and technology over the last few centuries. And it is also my view that this type of thinking is under severe attack these days.

The hallmark of rational thinking is that it starts from a basis in observed facts, but always keeps a willingness to revise its conclusions if new facts come to light. If that seems reasonable to you, good. Now consider that one of the worst insults you can pin on a politician is “flip-flopping.” The great 20th-century economist John Maynard Keynes was accused of this and responded “When my information changes, I alter my conclusions. What do you do, sir?” That is how a rational person thinks. There are people who attack science as useless because occasionally scientists change their minds about what is going on. But that is an uninformed (to be most charitable about it) view. Science is a process of deriving the best possible explanations for the data we have, while always standing ready to discard them in favor of other explanations when new data comes in. That may bother people who insist on iron-clad certainty in everything, but in fact it does work. If it didn’t work, you wouldn’t be reading this. (Did you ever notice the irony of television commentators attacking scientists? You might think the plans for television were found in the Bible/Koran/etc.)

One of the biggest obstacles to clear, rational thinking is what is termed confirmation bias. This is the tendency of people to see the evidence that supports their view, while simultaneously ignoring any evidence that does not support it. This is why the only studies that are given credibility are what we call “double-blind” studies. An example is a drug trial. We know there is a tendency for people to get better simply because they believe they are being given a new drug (the placebo effect). In addition, we know that merely receiving attention helps. So we take great care (in a good study) to divide the sample into two groups, with one group getting the great new drug, and the other group getting something that looks exactly like it but has no active ingredient. It may be a sugar pill, or an injected saline solution, just so long as the patient cannot tell which group they are in. But the bias can also be on the experimenter side. If a team of doctors has devoted years to developing a new drug, they will naturally have some investment in wanting it to succeed. And that can lead to seeing results that are not there, or even to “suggesting” in subconscious ways to the patient that they are, or are not, getting the drug. So none of those doctors can be a part of it either. Clinicians are recruited who only know that they have two groups, A & B, and have no idea which is which. This is the classic double-blind study: neither the patient nor the experimenter has any idea who is getting the drug and who isn’t.
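The assignment step described above can be sketched in a few lines of code. This is only a toy illustration of the idea, not real clinical-trial software; the function name `blind_assign` and the details are my own invention for the example:

```python
import random

def blind_assign(participants, seed=None):
    """Randomly split participants into two coded groups, 'A' and 'B'.

    The mapping from group code to drug/placebo is generated separately
    (the "sealed key"), so neither patients nor clinicians know which
    group is which until the trial is over.
    """
    rng = random.Random(seed)
    shuffled = participants[:]          # don't mutate the caller's list
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    assignments = {p: ("A" if i < half else "B")
                   for i, p in enumerate(shuffled)}
    # The unblinding key would be held by a third party until the end.
    key = {"A": rng.choice(["drug", "placebo"])}
    key["B"] = "placebo" if key["A"] == "drug" else "drug"
    return assignments, key

patients = [f"patient{i}" for i in range(10)]
groups, sealed_key = blind_assign(patients, seed=42)
```

The point of keeping `sealed_key` separate is exactly the point of the double-blind design: everyone working with the patients sees only “A” and “B.”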

The reason we need to be this careful is that people are, by and large, irrational. People will be afraid of flying in an airplane but think nothing of getting into a car and driving, even though every bit of data says that driving is far more dangerous. People are far more afraid of sharks than they are of the food they eat, though more people die every year from food poisoning than are ever killed by sharks. And we all suffer from a massive case of the Lake Wobegon effect, in that we all tend to think we are above average, even though by definition roughly half of us are below the median on any given characteristic. We just are not good judges of our own capabilities in most cases. In fact, the Dunning-Kruger effect suggests that the least skilled among us are often the most confident in their self-assessments.

But the worst case is the person who is absolutely certain, no matter what he is certain of. Certainty is the great enemy of rationality. Years ago, Jacob Bronowski filmed a series called The Ascent of Man. In one scene, he stood in a puddle at Auschwitz and talked about people who had certainty, and said “I beg of you to consider the possibility that you may be wrong.” This is the hallmark of a rational person; this is the standard by which every scientist is judged. If you know anyone who can say “This is what I think, but I might be wrong,” you will have found the rarest kind of person, and you should cultivate their acquaintance. This type of wisdom is all too rare. And if you ever find a politician who says that, please vote for them, no matter what their party affiliation. They are worth infinitely more than a hundred of the kind who have never changed their minds about anything.
