
Notes on Real Life

Adventures in Non-Fiction Storytelling, by Penny Lane

Notes on Vaxxed (Or, Not Lying But Being Wrong)


The thought stabbed itself into my mind, an ice-pick of fear:

But what if I’m wrong?

This was back in March. I had taken a short vacation from the hell of constant uncertainty in order to write an open letter to the Tribeca Film Festival about an anti-vaccination film called Vaxxed, directed by Andrew Wakefield. The letter contained the following premises:

1. Vaccines do not cause autism.
2. Wakefield is a well-known anti-vaccination quack.
3. Vaxxed is bullshit.

I wrote the letter bolstered by the conviction that my premises were right, so I was pleased when the letter received so much attention and TFF eventually pulled the film from its lineup.

But there was the thought. Like an alarm, beeping louder each time the letter was quoted, shared, defended:

What if I’m wrong?

A lot of hand-wringing goes on about whether or not a given documentary filmmaker is lying. To be a liar means you know the truth and have intentionally disregarded it. Intentional deception is mostly bad and worth policing. For example, I have called both Andrew Wakefield and his distributor Philippe Diaz liars (more on that later), and I am not alone in the documentary community in feeling that liars have no place at the table.

But I think we have a bigger fish to fry, and that is that none of us knows the truth about… much of anything. Documentary filmmakers, subjects, critics, programmers, audiences — we all float together in a sea of wrongness and not-knowing, our only life raft the necessary shared delusion that of course we know all sorts of things! In truth, we know very little and believe lots of things, many of them wrong. My argument here is simple: we should worry less about lying and more about being wrong.

By bringing up my response to Vaxxed again, I do not mean to beat a dead horse. (Wishful thinking: if only this horse were dead! Alas, it carries on, zombie-like, impervious to rationality and common sense, growing stronger each time it is attacked.) I do not bring it up because I miss my trolls, although trolls have a way of making a girl feel important. (Hi, trolls!) I bring it up because I think it illustrates nicely the difference between knowledge and belief, a distinction of special importance for those of us who want to claim the authority associated with being a documentary filmmaker. The mantle of truth-telling is heavy, weighted with the power of shaping perceptions. We need to carry it with grace and humility, with the willingness and desire to correct wrong beliefs.

So let me go back to March, and that letter I wrote. Let’s focus on my first premise, the one upon which disagreement inspires epistemological terror, the one that feels like insanity to anyone looking across the chasm:

1. Vaccines do not cause autism.

How do I know this is true? The answer is that I don’t know this is true. Rather, I believe it is true. This belief is based mostly on inductive, or probabilistic, reasoning: the kind of reasoning behind almost every belief most of us hold.

An example of how inductive reasoning works: I believe the floor beneath my chair is solid. I came to this belief not because I have produced a Ph.D. dissertation on the matter, but because I have some experience with floors, and they’ve always been solid before. I could totally be wrong. The floor could give out, making me tragically wrong. Now that I’m thinking about it, I’m feeling pretty paranoid about this floor. But I’m probably right.

Inductive reasoning works pretty well most of the time, which is good, because there’s not much we can do about the fact that our brains are wired to do it. Unfortunately, inductive reasoning is also prone to endless errors and cognitive distortions, among them incorrect biases and harmful stereotypes, which can easily lead to being wrong even when you feel completely certain you are right. Another irony of the human experience, noted by Kathryn Schulz, is that being wrong feels just the same as being right. So even a mild skeptic should want to ask how they’ve come to believe their beliefs, and what evidence would be sufficient to change them; here I link to the amazing subreddit community “Change My View,” which should be a model for us all.

The fact that a belief can change is belief’s most salient quality, and what separates it from the more immovable category of knowledge. Belief is what we think is true, and knowledge is what we know is true. We tend to dismiss beliefs as “just beliefs,” as if there is some huge body of fixed, solid knowledge for us to draw upon instead, but… there… kind of isn’t? I mean, there isn’t very much one person can say she definitely knows to be true. We can’t trust our senses, and don’t get me started on how horrifically bad our memories are. So we must rely on our best guesses, based on not-enough evidence, to make most of our choices, for most of our lives.

My belief about Andrew Wakefield is a good example:

2. Wakefield is a well-known anti-vaccination quack.

Having spent many years looking at quackery, I’ve developed a comprehensive mental image of a quack: the conspiracy-theory worldview, the crazed minions, the persecution complex, and most of all, the bad science which has been roundly rejected by reputable members of the scientific community. Conclusion: this dude is a quack. Conclusion: this dude is not to be trusted. Conclusion:

3. Vaxxed is bullshit.

I’m probably right. But I could be wrong.

And amazingly, I find that accepting the possibility I could be wrong offers the only intellectually honest escape from the terror that I could be wrong. Neat trick, huh? “Our errors are surely not such awfully solemn things,” mildly suggests the pragmatist William James, “so maybe just chill?” Okay, WJ didn’t say that second part. But he did insist, in contradiction to radical doubters like Voltaire, that believing with conviction and even certainty was okay and even good.

Again, this is a very simple point. But I think these ideas have special significance for nonfiction filmmakers because we are in the truth business. A person can’t develop a theory of truth without confronting the limits of knowledge or the nature of belief. One can’t really pay respect to truth if all one does is “not lie.”

(An aside, perhaps a note for a future column: I need to work out this mental error I have where I equate fiction with lying. They are not the same thing. There is an ontological difference between the two, but I haven’t worked it out. And since they are so often conflated in the doc milieu, my thinking on this is hopelessly confused.)

I do not know whether Andrew Wakefield is a liar. I’ve called him a liar before, but it seems to me more likely that he’s just wrong. Doesn’t even a charlatan care about the truth? Surely even a charlatan is just like the rest of us, hilariously and hubristically convinced we can access and identify the truth? Maybe he’s a bit more… out there, casting about a bit more unmoored in the sea of not-knowing, but that doesn’t mean he’s a liar.

Although… when Philippe Diaz, the CEO of Vaxxed distributor Cinema Libre, announces out loud in front of a room full of journalists that he did some research on me and found out “she’s a troll for the right wing, and financed by the right wing corporations, and she believes that the Twin Towers of 9/11 were taken out by UFOs…” I think maybe I’m giving these people too much fucking credit.

I wonder if Philippe Diaz and Andrew Wakefield lie awake at night, terrified, tortured by an unbidden thought:

But what if I’m wrong?

* * *

PS: This essay was largely inspired by Kathryn Schulz’s absolutely marvelous book Being Wrong: Adventures in the Margin of Error. I cop to borrowing a bunch of her ideas here and can only hope I didn’t hopelessly mangle them in my application. Everyone should read her book. And if you know yourself well enough to know you won’t read a whole book, fine: here’s her TED Talk.