Do we pay too much attention to the statistics?
Communications and engagement teams (or consultation teams if you’re lucky enough to have one) will all be aware that consultation is not a vote. Yet significant weight is often put on the statistics from a consultation: ‘70% of consultees support the proposed changes.’ Could this be where some organisations fall down?
Last week on Twitter, we saw an article about a recent European Commission consultation on proposals to end seasonal time changes. 4.6 million people responded, and 84% were in favour of discontinuing the clock changes. On the surface that is a staggering number of people, but when you consider the population of the European Union, it equates to less than 1%.
In a local scenario, we may begin to question whether we did enough to involve the right people if we had less than 1% respond. Did we get our stakeholder mapping right? Did we plan enough events to generate enough interest? Did we work hard enough to involve those who will be most affected by the proposals?
The important thing to note here is not that statistics aren't useful, but that presented in a certain way they can be misleading; individuals will interpret what we write in different ways. Had the article said that less than 1% of the population responded, we may have formed a different opinion of the consultation exercise. And what do we know of the 500+ million others who didn't respond?
If disproportionate weight is given to the overall statistics of a consultation, there may be cause for concern: it can lead to confirmation bias. We last wrote about this in 2017 after our Conference, where Ben Goldacre spoke about the over-interpretation of data, yet organisations still fall victim to it.
Now, we know the article was probably only 500 words long and therefore couldn't go into detail about who responded, or why they responded in the way they did, but the point still stands. When you're conducting your own consultation exercise, make sure that you present the data in a fair and transparent way.