Do You Have a Good Reason to Ignore Uncertainty? Check if I Approve of Your Reason Below
The Wombat Conference
Recently I went to a conference centred on visualising data. The conference was great, and it was the first time I have ever been able to understand most of the speeches. Usually I just stare at the speaker’s slides and wonder if I am the only one who has no idea what they are talking about. There was, however, one overwhelming sentiment shared by the speakers that seemed a little disastrous for my research. Speaker after speaker went to the podium, started their speech, and admitted they didn’t visualise uncertainty (every speaker except the keynote, which I was late for and missed). As a PhD student whose research centres on visualising uncertainty, this could be seen as a bit of a spanner thrown into my work. I will admit that it became a running joke in my notes: how long until the speaker either admitted they don’t visualise uncertainty, or outright dismissed it as infeasible in their work? I started to wonder if living in the woods and carving sticks would be a more fruitful career than the one I was currently pursuing. I have since spoken about this to multiple friends, who admonished those who openly reject visualising uncertainty, but I actually respect the honesty. In trying to improve uncertainty visualisations, I have noticed that uncertainty is rarely visualised, and there must be a reason for it that goes beyond “the current methods aren’t good”. We can all sit here and say “we should visualise uncertainty”, but the reality is that almost nobody does, and everyone has some reasoning that, for them, outweighs misrepresenting their results. Even if I spend months working on a new way to visualise uncertainty, without understanding why nobody does it my work would be born to live in a frame on my mother’s wall, read only by myself and examiners, and used by nobody.

When to NOT Visualise Uncertainty
When I was in high school, my fancy private school paid a lot of money to have a speaker come in and talk to my cohort about communicating statistics. To get his point across, the speaker gave an example about smoking. His speech went something like this:
“People are really bad at applying probabilities to themselves, especially when the probability relates to something bad. That means that if the government wants people to stop smoking to decrease the burden on the health care system, it can’t communicate that through probability. I’m sure that everyone in this room thinks that if you smoke it is certain that you will eventually get cancer and die; that is quite frankly not the case. The reality is that even if you smoke, you only have about a 1% chance of getting lung cancer. Obviously smoking also leads to other diseases and a lower quality of life, but it is a far cry from it being certain you will get cancer and die. The reason it is not communicated with uncertainty is that we don’t want anyone to smoke, and we achieve that by communicating with certainty, even if that is a misrepresentation of reality.”
At this point our head of pastoral care interrupted the speaker and said, completely seriously, “Sorry everyone, this man has no idea what he is talking about, or he is lying. If you smoke it is certain you will get cancer and you will die. There is no probability about it. OK buddy,” he continued, gesturing to the speaker, “keep going”. The speaker just stood there shocked for a couple of seconds, laughed, and then, as instructed, kept going. I imagine he remembered they had already paid him, so if they wanted to publicly discredit him and undermine his entire point, it was no skin off his back. I silently wondered if our school fees would be cheaper if they just hired whatever nightmare speaker they actually wanted. I’m sure some guy who tells students they will go to hell if they engage in premarital sex or do drugs would be much cheaper than an expert in communicating statistics, but I digress. This story may seem irrelevant, but it shows that even in a strictly controlled context, people don’t trust their audience to correctly understand probabilities, no matter how they are communicated.

Most people think the reason nobody visualises uncertainty is a lack of trust in people’s ability to understand probability. I do somewhat understand how people feel when they express this sentiment; I mark third-year statistics assignments, and only half of the students seem to know what a random variable is. That being said, the conference made it clear that there are many reasons people avoid visualising uncertainty that have nothing to do with assuming the worst about the intelligence of their audience. Having now read quite a few papers on the topic, finding out why people don’t express the uncertainty in their work is an exercise in insanity.

Prior to reading into it, I suspected that the reason authors did not visualise uncertainty was that there were just not enough good and intuitive methods. I would be unsurprised if this were a widely held belief. Now that I have read some research on it, I actually think it’s just that people are incentivised not to. The problem is not the available methods or the audience; it’s human psychology. This is not to say that when someone claims current methods are lacking they are always making up an excuse (for example, visualising uncertainty on maps is genuinely difficult), but it is highly likely that they are. Below I will go through the most common reasons cited for failing to express uncertainty, and why they fall apart once you engage with the literature. Then you might see what I mean.
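To be concrete about what “the available methods” includes at minimum: every mainstream plotting library can already draw error bars around point estimates, which is the most basic uncertainty visualisation there is. A minimal sketch in Python with matplotlib, using made-up estimates and standard errors (the data, axis labels, and file name are illustrative, not from any real study):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical point estimates with standard errors (illustrative only)
rng = np.random.default_rng(1)
x = np.arange(10)
estimates = rng.normal(5.0, 1.0, size=10)
std_err = rng.uniform(0.3, 0.8, size=10)

fig, ax = plt.subplots()
# Approximate 95% confidence intervals as error bars: estimate +/- 1.96 * SE
ax.errorbar(x, estimates, yerr=1.96 * std_err, fmt="o", capsize=3)
ax.set_xlabel("Group")
ax.set_ylabel("Estimate (95% CI)")
fig.savefig("uncertainty_sketch.png")
```

Even this bare-bones version communicates something the bare points do not; the genuinely hard cases, like maps, are where arguments about missing methods hold real weight.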
The Excuses (and The Rebuttals)
There is a large amount of literature providing new ways to visualise uncertainty and showing their effectiveness, but much less on why people don’t do it. I am going to focus on the reasons provided in “Why Authors Don’t Visualize Uncertainty” by Jessica Hullman, because it is one of the few detailed reviews I could find that conducted structured interviews with visualisation authors to find out whether they visualised uncertainty and why they would choose not to (Hullman 2020).
The paper discussed a myriad of reasons provided by the authors to explain why they don’t always visualise uncertainty. The most popular were: not wanting to overwhelm the audience; an inability to calculate the uncertainty; a lack of access to the uncertainty information; and not wanting to make their data seem questionable (Hullman 2020). Below I will discuss some more detailed reasons and give my rebuttals to them; however, every reason for not expressing uncertainty fits into one of these four categories.

I want to make it clear that the majority of those interviewed or surveyed for Hullman’s paper agreed that expressing uncertainty is important and should be done more often (Hullman 2020). As a matter of fact, some agreed that failing to visualise uncertainty is tantamount to fraud (Hullman 2020). Despite this, only a quarter of respondents included uncertainty in 50% or more of their visualisations (Hullman 2020). This means people are convinced that visualising uncertainty is important from a moral standpoint, but they have still been able to provide reasoning, satisfying to themselves, that allows them to avoid doing it. That doesn’t mean the reasoning follows consistent and sound logic. For example, at least one interviewee from Hullman’s survey claimed that expertise implies the signal being conveyed is significant, but also said they would omit uncertainty if it obfuscated the message they were trying to convey (Hullman 2020). Even some authors who were capable of calculating and representing uncertainty well did not do it, and were unable to provide a self-satisfying reason why. The friction in the explanations below is obvious, but for the time being I will ignore it and take each claim at face value. At the end I will discuss the clear overarching issue of backward justification.
Why Do I Think People Don’t Visualise Uncertainty?
I personally think all these excuses are an effort to use someone’s incompetence (the audience’s or their own) to justify not doing something they don’t want to do.
This does not mean everyone who doesn’t visualise uncertainty is evil. Widespread issues like this are almost universally created by systemic problems and norms. But the rationales provided by the participants in these studies reek of backward justification. Hullman herself notices this, writing in her paper:
“It is worth noting that many authors seemed confident in stating rationales, as though they perceived them to be truths that do not require examples to demonstrate. It is possible that rationales for omission represent ingrained beliefs more than conclusions authors have drawn from concrete experiences attempting to convey uncertainty” (Hullman 2020).
I took this as a fancy academic way of saying “I think these people are full of it and are making up random reasons to justify why their actions don’t reflect their beliefs”. This is not to say I have a poor view of the participants in the study; I think they are normal people doing what people do. Rather, I think that discussing the results without acknowledging the human psychology that got us here is disingenuous in and of itself. Authors are likely reacting to unstated norms in the field that are so accepted they don’t even question themselves when they create a visualisation that doesn’t include uncertainty.
From personal experience, visualisation as a whole seems to be generally looked down upon in science. There is a large focus on facts and much less of a focus on communication. I sometimes wonder if there is an effort to purposely make research harder to understand. I don’t think I am entirely off the mark, considering I have many memories of my undergraduate lecturers gloating to students about the high fail rates and difficulty of their courses. Obviously there is a prestige to doing something so complicated that others struggle to understand it. When you hear that the proof of the Poincaré conjecture (the only Millennium Prize problem to be solved) could only be understood by experts meeting at a conference and working through it in groups over several days, it inspires an idea of godlike intelligence. The implication is that if something is hard to understand it is a more advanced idea, and you are smarter for knowing it. Of course, something can be hard to understand because it is poorly communicated, not only because it is difficult, but that seems to be lost on a few researchers. A man asking for directions to the train station in gibberish is also difficult to understand, but he is unlikely to stand in front of a multivariate calculus lecture and brag about still being lost.
Very often, while reading papers, I am floored by how difficult some academics are to understand. Papers have become more comprehensible as I have come to understand the field, but even so, many still leave me confused. If the research put out by academia is so difficult to comprehend that it is inaccessible even to the people within it, I wonder what the point of our research is.
I want to clarify that I don’t think people are avoiding visualising uncertainty because it is more prestigious to avoid doing it. However, I do think visualising uncertainty, and visualisation as a whole, have become caught up in the scientific quest for prestige through gatekeeping the field with poor communication.
There is a common theme in Hullman’s paper of authors seeing uncertainty as a chink in their armour, a possibility of exposing something they don’t know, so they hide it. Authors don’t think audiences can understand uncertainty, so they make it completely inaccessible. Authors are afraid they don’t know how to compute uncertainty, so instead of doing it badly, they ignore it. Authors are afraid of being questioned when they show uncertainty, so they hide it. There are many field-wide issues at play when authors choose not to visualise uncertainty, and it becomes impossible to pin down a single reason. Authors don’t have to give me their honesty, but they do need to give it to themselves. So the next time you sit down to make a visualisation, be honest with yourself about why you are ignoring the uncertainty.