When should you listen to the experts?
The desirability of listening to expert advice has been debated a lot recently, first with Brexit, then with COVID. I think it’s worth trying to understand when one should listen to the experts, and when not.
First, I think expert opinion can be classified into three different buckets:
- policy advice;
- statements of fact according to the most accurate empirical evidence;
- forecasts.
Policy advice
Right from the get-go, we must recognise that, because policy is ultimately a matter of preferences and not objective truth, experts’ policy recommendations are only valuable inasmuch as their audience agrees with the policy’s objective. Take Brexit: suggesting one should listen to economists who argue the public must vote Remain, because leaving the EU would have disastrous effects on the British economy, is only good advice if one’s criteria for making this decision are economic. If one doesn’t care about the economy, and wants to vote Leave for legislative independence, as a symbolic gesture, or to stick it to Cameron, then it does not matter what economists think about the impact on trade (though of course, the advice of other experts, e.g. jurisprudence scholars, is still valuable).
This argument is simple enough, yet it is frequently overlooked: all too often, we are told people are stupid for not listening to the experts, when in fact the experts are advocating a policy whose objective differs from that of its detractors.
Statements of fact
When it comes to statements of fact, the first thing one needs to evaluate, in deciding how much weight to give an opinion, is the level of expertise of the person stating the fact. A priori, I will weigh a doctor’s advice on medical matters more than that of a lawyer.
However, there are three issues that arise when discussing facts:
First, experts are often not precise enough when they describe things as ‘facts’. This is what often causes them to change their opinion, eroding the public’s trust. The WHO famously claimed that healthy people need not wear face masks; the actual fact was that there had been no randomised controlled trial conclusively proving that wearing masks prevents transmission of a virus (then again, no randomised trial has proved that parachutes save lives either). This is a classic example of mistaking absence of evidence for evidence of absence, and then labelling the latter a ‘fact’. When the WHO subsequently issued new guidance, it was fair enough for the confused public to wonder what made the guidance right this time. Similarly, experts are often quick to label current thinking as ‘fact’, and this is exacerbated by the media, which usually do away with any nuance in the original expert opinion. For example, just a few weeks ago, scientists argued that a ‘first dose first’ vaccine strategy was a bad idea. Now, this approach is gaining ground, both among experts and policy makers. Was it a fact that prioritising full two-dose courses was a sounder approach? No; it turns out it was an assumption.
Second, laypersons often misunderstand an expert’s true domain of expertise, and professionals sometimes claim expertise in fields that are merely adjacent to theirs. A lawyer is qualified to offer advice on how the law works, and though they may have given more thought than a layperson to how the law should work, that is not their expertise. One cannot therefore claim that a lawyer is better qualified than a doctor to opine on, say, capital punishment; they may be better-read, and equipped to offer statistics on the number of wrongful convictions and so on, but, as argued in the preceding section, if one opposes capital punishment on principle, and not on some cost-benefit analysis, they have no reason to heed the lawyer’s advice.
Third, and related, the vast majority of people are very much glued to the status quo, and are incapable of thinking outside the box. I think experts are even more prone to this than average: if you have worked hard to build your expertise, questioning that hard-won knowledge and the things you have been taught can be daunting. At Google, I work with some of the smartest people on the planet, yet I’ve lost count of the times I’ve heard people justify their work practices with ‘this is the way we’ve always done it’.
So, you should trust an expert to explain how a process works, why it is in place, and how to execute it; but you should not take their word as gospel if they refuse to consider alternative processes. It is entirely possible that the conditions that necessitated the current process have changed, or that assumptions on which it rested are no longer relevant. But again, such challenges to the status quo should be informed by expert opinion — you really should understand why things are done the way they are, and experts can help you do that.
Forecasts
As with facts, you should begin by weighing a person’s opinion based on their level of expertise in the field, but there are a few additional things to take into account:
First, similar to when evaluating facts, one needs to make sure an expert is opining on their actual field of expertise. Consider Brexit again: one question that arose was whether the UK is better-placed to sign trade deals as part of the EU, or independently. On the one hand, the UK is more flexible as an independent agent: it does not have to balance the interests of 27 other states, and it can give concessions in areas that are not important to it, but which are to the other EU members. On the other, the EU has much larger scale, and is therefore likely to be prioritised by other nations. Now, a trade economist can talk about the impact of Brexit on the UK economy under different conditions, but, unless they have experience as a trade negotiator, they cannot really opine on which of these two forces will have the stronger impact. An experienced trade negotiator who has had to balance scale and flexibility will be better qualified to talk on the matter.
Second, some systems are simply too complex; no matter how good the expert, sometimes the best they can do is identify general trends, or merely estimate the likelihood of an event taking place.
Third, one-off events, or events without clear feedback loops, are very hard to predict, because no-one can actually develop expertise: how can you become better at forecasting something without practice? How can you know whether you did a good job, if you cannot evaluate the quality of your prediction (because, e.g., there were too many confounding variables)?
And finally, one ought to ask experts to make explicit the assumptions on which their forecasts are based. Unfortunately, some professionals like to opine on matters where they lack an understanding of the details of the specific situation, even though they may be subject experts. As a result, they may hold assumptions that are true in general, but not on every occasion. For example, back in 2015, Paul Krugman argued that Greece should leave the Euro. One of his arguments was that doing so would allow Greece to devalue its new currency and boost its exports. It is true that a currency devaluation tends to boost exports by making them cheaper, but what if other barriers prevent exports from growing? An increase in demand for exports doesn’t mean Greece can increase its supply to meet that demand.
The conclusion is that we need moderation and caution: do not listen to those who say ‘enough with the experts’, but don’t listen to those who urge blind faith in the pros either. Trust the experts when they have information you don’t; question their assumptions, but start by giving them the benefit of the doubt; and only outright reject their advice when what they are trying to achieve is different from what you are after.