Episode 9 - Kevin McCain - Uncertainty and Reasoning during the Pandemic


Sahana: Hi everyone! Welcome to In Limbo Conversations! Today, we have with us Kevin McCain. He's an associate professor of philosophy at the University of Alabama at Birmingham and the co-editor of the "Routledge Studies in Epistemology" series. His central areas of research are issues in contemporary epistemology and the philosophy of science. He also concentrates on explanatory reasoning and its role in responding to skeptical challenges as well as in generating scientific and everyday knowledge. Thank you for joining us, Kevin!


Kevin: Thank you for having me.


Sahana: I thought we could start by talking about uncertainty and its relation to the complexity of our world, and, in a larger sense, how such uncertainty gets concealed in media coverage of issues. So, as a background for our audience: this year, Kostas Kampourakis and Kevin McCain co-authored a book titled "Uncertainty: How it Makes Science Advance".

In this book, they talk about the role of uncertainty in science, and about the kinds of uncertainty that are inherent in certain fields of research - like human evolution, vaccination, and forensic science. They focus on how uncertainty can drive understanding and scientific endeavors. The first thing I wanted to ask you about, Kevin, is the complexity of our world and its relation to uncertainty. If we could pick out a slice from everything that's happening in the current pandemic situation - say, the origin of the coronavirus, or the Wuhan outbreak and the spread across the world - the entire picture is just very, very complex. Could you tell us, using this instance or any other, how the complexity of our world is inherently fertile ground for uncertainty? Perhaps how uncertainty is inevitable?


Kevin: Yeah, sure. That's a great question. I think one thing that can be helpful, first, is to distinguish, as we do in the book, between two broad kinds of uncertainty. The first sort is psychological uncertainty, and this just has to do with how confident we are about something that we believe or think is true. We might be psychologically certain, right - we have absolutely no doubts. But there's another sort of uncertainty that has to do with our evidence. This is what we call "epistemic uncertainty", and it has to do with how strongly our evidence supports believing something. And one of the claims we make in the book is that rationality is a matter of matching these levels of uncertainty. Another way of putting it is: we should only be as psychologically confident of something as the strength of our evidence warrants.
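
One standard way epistemologists formalize this matching idea - a sketch, not a formula from the book - is to say that if c(h) is one's degree of confidence in a hypothesis h, and P(h | E) is how strongly one's total evidence E supports h, then rationality requires, roughly:

c(h) = P(h | E)

Psychological certainty, c(h) = 1, is then warranted only when the evidence rules out any possibility of error - which, as Kevin goes on to argue, it almost never does.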

One claim that we make is that pretty much everything is, to some degree, epistemically uncertain. We almost never - possibly never - have evidence so strong that it makes the truth of whatever we believe certain, such that there's no possibility of error. To give a simple example: right now, I know where my car is parked, but I'm not epistemically certain of that. There's still a slight chance of error. The evidence that I have is that I remember where I parked it; typically, you know, when I park it in that spot it's safe, and so on. But there's a chance that it's been towed, or something else - someone may have stolen it. I'm not epistemically certain.


We don't have epistemic certainty... what happens a lot of times is we want certainty... We want the sort of psychological certainty, and sometimes we approach things with that, but we don't have the epistemic certainty...

The more complex things are, in simplest terms, the more ways for us to go wrong.


And I come largely from an epistemology background, and my initial research, back when I was in graduate school - which was several years ago now, which is kind of a surprise to me - dealt with skepticism. And I think one of the lessons of skepticism is not that we lack knowledge or reasonable belief about the world around us.

But one of the key lessons is this: we don't have epistemic certainty. And I think what happens a lot of times is we want certainty. We want the sort of psychological certainty, and sometimes we approach things with that, but we don't have the epistemic certainty. And all of that is just to lay some groundwork for your initial question. The more complex things are, in simplest terms, the more ways for us to go wrong.

So there's a lower amount of epistemic certainty when we're talking about something like the coronavirus and these various outbreaks: there's so much information, and there are so many little details. Any one of these things can make a difference to, say, how it's spread. We're still learning about how it might be most commonly spread, what the best safety measures are, and so on. There's always going to be some uncertainty there because of our evidence - even when we've locked it down. So, for instance, there's very good evidence that wearing masks and keeping a certain distance can help fight the spread. It's not certain that it does. But it's also, you know, not certain that I'm talking to you right now, and it'd be crazy to think that I'm not. Similarly, with things that are this complex, we never have certainty. But we do have, say, good evidence, and that evidence is going to be a little more theoretical. It's going to be a little more open to change, because there's so much complexity, so much to understand.


Sahana: That does take me to how we generally treat this uncertainty. In the book too, you have taken case studies to talk about the roots and the possible impact of uncertainty in science, and I guess vaccinations are a great example in the current situation. There is a buzz about the Oxford vaccine being in Phase Three trials and about there being a viable vaccine by 2021.

Could you tell us two things? First: is this, like you just said, the mirage of psychological certainty, which sort of glosses over the underlying uncertainty? There are these popular messages about vaccination at large - could you share a bit of what you think about this? And second: why do you feel there is such a reaction to uncertainty? If that is something you're interested in sharing with us.


Kevin: Oh good, good! Yeah. I think both of those are great questions.

I'll start with the second question first and then go to the first after that... I think we have this sort of reaction to uncertainty because psychological uncertainty makes us uncomfortable. Think about why we want to know and be able to predict things about the future: the more certain we can be that something's going to happen, the better we can prepare for it, the more comfortable we are, and the less we have to fear from the unknown. I think we just naturally have this aversion to being uncertain. The problem is that we often have such an aversion to psychological uncertainty that we'll ignore epistemic uncertainty.


...we just naturally have this aversion to being uncertain. The problem is that we often have such an aversion to psychological uncertainty that we'll ignore epistemic uncertainty.



And I think - to your first question - we see some of that when we make claims like "Oh, there will be a vaccine by this time." That seems like too strong of a commitment. If you look, experts who are being more careful are saying, "Well, we're hopeful there will be a vaccine," or "We've got reason to think there will be," or "We're working as hard as we can, and there are decent odds of this happening." But you'll see others, particularly in political realms - possibly because of the uncertainty, but possibly for other reasons - saying, "We're going to have a vaccine by this time," because that's more reassuring to people. When you hear that, you go, "Oh, I can plan on my life getting back to normal by this time. Everything's okay." Whereas if you tell people, "We have good evidence" - maybe we can even quantify it and say, "I don't know, there's an 85 percent chance that we'll have a vaccine by this time" - people find that way less comforting. And sometimes it's problematic too, because in those cases people will sometimes doubt whether it's legitimate science.

One of the things we talk about in the book is that the general public has this view that if it's science, then it's certain. But as scientists widely know and accept, and philosophers know, there's always a chance of error. Careful scientists will say, "We've got a really good reason to think that this occurs," but they'll always add, "There's a slight probability that this is due to, say, random chance or some noise in the way we've gotten our data." The general public doesn't typically like that. If you say, "Well, scientists think there's this percent chance that a vaccine will come," a lot of times the general public will say, "Well, that means they don't know what they're talking about. If they can't tell me there's going to be a vaccine by this time, then they should just keep quiet, because they don't know." And I think that's because we have confusion about epistemic certainty and what's required to know something.


Sahana: That's more like an urge for black and white than gray, I guess.


Kevin: Exactly! Exactly! That's a great way to put it. We want black and white answers. We're uncomfortable in the gray - but, unfortunately, we're always in the gray.


Sahana: Right. That connects to what I wanted to ask you next, which relates to the theory you proposed in your 2014 book "Evidentialism and Epistemic Justification". As background: in 2014, Kevin authored this book, which tackles three broad meta-epistemic issues about evidentialism.


We want black and white answers. We're uncomfortable in the gray - but, unfortunately, we're always in the gray.


First, there is a significant question of what evidence is, what sort of thing counts as evidence. Second, there are questions about how broad or narrow one's criterion for evidence should be. Can you include even your childhood memories as evidence? Or only what you're thinking right now? And finally, there is also the question of what it means to believe based on one's evidence. In this book, Kevin has proposed Explanationist Evidentialism.

I wanted to look at one very specific aspect of this today, Kevin. Usually, in traditional Coherentism, only beliefs count as evidence. But in your account, you allow mental states other than beliefs, like experiences, to be evidence too. And connecting this to what it means for a belief to be based on evidence - could you tell us a bit about experiences as evidence? And maybe some instances you have seen over the past months where people have used their experience to justify a false or a true belief?


Kevin: Oh good, good. Yeah, in general, I think experiences are often great evidence when given the appropriate weight. So, for instance, why do I believe that I'm talking to you rather than, say, sitting in a room by myself - which I am - but talking to no one on the other computer screen? Well, part of my reason is various beliefs, about how computers work and so on, but another part is that I have an experience of seeing you on the screen. I have the experience of hearing you. I have lots of these experiences. Or think of something even more basic, one that requires less: imagine I look out the window and I have an experience of seeing a tree. That gives me some evidence for thinking there's a tree out there. This seems to be really good evidence. Now, whether it's evidence all on its own, or sufficiently good evidence on its own to believe there's a tree - maybe not. I might need background beliefs about, say, what trees look like and so on. That's all possible, but it does seem pretty clear that the experience is evidence. And we can go even more basic: imagine that you're currently - hopefully this is not true - experiencing pretty intense pain. That experience itself is a good reason for you to believe that you're in pain.

I think experiences can be evidence, and they are some of our strongest evidence. I'm finishing up a manuscript with a philosopher at the University of Aberdeen, Luca Moretti. He and I are expanding the explanationist picture that I've presented in previous work into a view that gives a really strong role to our appearances - experiences of a certain kind: visual appearances, auditory appearances, and so on.

I think that at the base level, a lot of our evidence, perhaps all of it, grounds out in experiences of some sort. Now, that being said, sometimes we use the evidence of experience in the wrong way. Here's what I mean - here's a common mistake that people make. We often take anecdotal evidence - things that we've experienced personally, or that someone we know has experienced - and we grant it way more strength than we do other evidence, like well-documented scientific facts.

Here's an easy example of that: I know many people who are terrified of flying in an airplane. They're convinced that it's just unsafe, despite the fact that you can give them all the evidence and say, "Look, actually, it's one of the safest modes of travel." Here in the US, you can tell them, "Thousands and thousands of people die every year in car accidents" - which they're not afraid of - "and very few die in plane crashes." But the anecdotal evidence - when they've seen on television that a plane has gone down and people have died - stands out more in their minds, and they give it more weight than they should. In that sort of case, they do have evidence based on their experience; it's just that they weigh it too strongly. They let it outweigh everything else.

And I think - again, my experience is largely of my local area in the US - we've seen a lot of that sort of thing happen in the United States concerning the COVID-19 problem. We've seen people say, "Well, I know someone - they tested positive for COVID and they were fine. So I don't need to worry about this," despite lots of warnings. That is some evidence that maybe they would be okay if they caught it, or that other people would be too, but they weigh it too heavily: "In my own experience, I know someone who's had it, they were fine, it's not a big deal," and so they disregard all of the other evidence, because they weigh their personal experience too strongly. It's one of these interesting things: experience of that sort - and even our experience of what the scientific evidence says - is what we have to go on, but we sometimes weigh certain experiences more than we should. We give them more evidential control over what we believe than we should.


...we sometimes weigh certain experiences more than we should... we give them more evidential control over what we believe than we should...


Sahana: You mentioned experiences as evidence. With beliefs, there is a certain way in which we can verify them - a third person can check them against certain reports. With one's experience, that part becomes a bit difficult. So how do you generally propose checking one's evidence? Say I'm scared that I'm going to get the coronavirus, even though I haven't had contact with anyone - maybe I just got some groceries. I could use that experience to say, "No, you know, I could have it, because I'm feeling a little bit of a headache...". So how does one verify one's experience? Is it corroboration?


Kevin: Yeah, exactly. I think it'd be corroboration with your other experiences, but also with your beliefs. The way I look at it is that your experiences are evidence, but so are your beliefs - if they're reasonable beliefs, that is. It's a tricky question whether an irrational belief counts as evidence or not; I tend to think those aren't genuine evidence. But say you have this experience and you want to know, "Do I have this? I'm having a headache," and maybe you're just a worrying person, so you go, "Well, maybe I've got it." The way to judge that, and the way to get a rational and justified response, is to look at not just that bit of evidence but your overall evidence. It's always a matter of all the evidence that I have. I have a headache, so we can go, "Well, maybe that's a symptom," but then we take into account my other evidence, which, as you mentioned, is: "Okay, I haven't been around anyone at all, so I didn't catch it from anyone. I have no other symptoms." Maybe I'm often prone to headaches at a certain time of the day anyway.


...you have all this evidence and the key is always - we have to look at your overall set of evidence...


So, you have all this evidence, and the key is always that we have to look at your overall set of evidence, because you might have - and we often do have - some evidence that supports one thing but a lot of other evidence that doesn't. So when we put it all together, we have to look and see: what does that total evidence support? Does it support thinking that I have an infection? Does it support even thinking that it's likely? And then we have to make decisions, of course, based on that. So it might end up that I think, "Well, the odds that I have it are pretty low, but I'm high-risk" - maybe I work in the health profession and I might have caught it from someone sick. Then, even though the odds are low and maybe I shouldn't believe I have it, the probability might be high enough that I should get tested anyway, because of the risk. Whereas, on the other hand, if the odds are low and I'm not at risk of infecting anyone else, then maybe I should wait and see. So it all depends - we always have to think about the total evidence that we have.
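
Kevin's testing example can be made concrete with a rough expected-cost sketch - the symbols and numbers here are illustrative, not his: let p be the probability of infection given one's total evidence, H the harm that could result from carrying on as usual while infected, and C the cost of getting tested. Then getting tested is the sensible choice whenever:

p × H > C

So a healthcare worker with, say, p = 0.05 but a very large H (the risk of infecting vulnerable patients) should test anyway, while someone with the same p who is isolating, and so has a small H, may reasonably wait and see.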


Sahana: This total evidence would include not only one's experiences but also one's beliefs. This also relates to the article you wrote in the University of Alabama Reporter about how we can be responsible news consumers. I think it was Sharon Dunwoody - she had reviewed your book, and she said "...that accepting the ubiquity of uncertainty in science is the start but learning how to not let uncertainty get in the way of understanding is the next critical step." In that article, you looked at the relationship between scientific knowledge, psychological certainty, and rationality, and we have also talked about psychological certainty and the uncertainty inherent in scientific knowledge here in our conversation. So, I wanted to ask you about some instances you mentioned in your article: on the one hand, there were the spring breakers who doubted the seriousness of the situation, and on the other hand, those who drank bleach. For both of those groups, their decision would have felt informed and rational, just as we discussed. Given this, where and how do we draw the line? Could you give us a short checklist, if possible, to help us make rational decisions amidst such uncertainty?


Kevin: Good, that's a great question! And yeah, it's unfortunate that we see the errors on both extremes. In general, none of this is going to seem too surprising. I think the way to best manage our uncertainty is, of course, first to admit that we have it. Learn about it and learn how to deal with it, rather than opting for a black and white answer because it makes us feel better. So, first, admit the uncertainty, but then do the sorts of things that are good for inquiry in general. A lot of people will say they saw something on social media, for instance, and conclude, "Well, this is the right answer." Verify your sources - if you see something on social media, that doesn't necessarily mean it's wrong, but it doesn't mean it's true either. Verify the sources. Consult multiple sources, especially independent sources if you can get them: "Okay, this news outlet is saying this about the virus - what does this other, independent source say? What does this other one say?" So, look for reliable sources and consult them. See what they're saying; check and see what the experts are saying. The doctors are saying this, the epidemiologists are saying this - look at that and give that sort of information the appropriate weight. The thing we need to do is check our sources, look at a variety of sources, see which ones are the most reliable - which ones we have the most reason to trust - look at what they're saying, and give them the appropriate weight. The fact that someone's friend on social media, who's not an expert, says something - maybe that counts as a little bit of evidence - but it's nowhere near as much evidence as, say, when the CDC gives advice on this. That's the sort of thing we need to do.


...first, admit the uncertainty... second, verify your sources...

third, consult multiple sources (preferably independent ones)...

fourth, see what the experts are saying...


Sahana: Okay. Those are the three points, so we're just going to mention them: (1) verify one's sources, (2) use multiple reliable sources, and (3) check what the experts, including the organizations, are saying. Thank you for sharing these points with us. That covers the things that I wanted to talk to you about.

In the end, I thought I would quote a tiny thing from your book... you had written this in the dedication of "Uncertainty: How it Makes Science Advance"... here are 12 hopeful words: "May you come to understand uncertainty and have no fear of it."

Thank you Kevin for joining me today! Thank you so much!


May you come to understand uncertainty and have no fear of it.


Kevin: Thank you very much!

It's been my pleasure. Thank you!



