Angel Gurría, Secretary-General of the OECD, recently wrote a commentary citing the need for statistics that give a more accurate picture of society and the economy.
As Mr. Gurría wrote in European Voice:
We need tools to measure what is going on in our society – where we are progressing, where we are failing and what are the consequences of our actions. There is nothing wrong with the quality of the indicators, but some of them are not looking at what matters.
…
Statistics are not an end in themselves. Their importance lies in the policy discussions they stimulate as much as the evidence they provide.
I agree. We have to get the numbers right to empower intelligent policy discussions.
Last month, we saw university logos applied to papers, statements and ideological manifestos, lending credence to data analysis that would have trouble holding up to serious scrutiny in peer-reviewed academic journals.
A jewel from a recent broadband study would raise the eyebrows of any reasonable statistical reviewer:
Speedtest data is not perfect, but it offers an enormous database of actual tests, which provide insight into the speeds users experience on their computers. The dataset we analyzed included about 41 million actual tests from the OECD countries, from the fourth quarter of 2008.
Translation? Even though the source data smells funny, there’s so much of it that maybe we can find something useful. Perhaps. But we’re not convinced the correct conclusions were drawn.
There is much more wrong with these reports, as Suzanne Blackwell and others wrote a few weeks ago. The Harvard study penalized Canada for OECD sampling errors (which I walked you through), such as ignoring that Quebec is part of the country. And as I wrote a couple of weeks ago, the folks at Harvard tell us on one page that Canada had no 35 Mbps services, and then three pages later, oops – they discovered one! Just not in time to use in their rankings. A rebuttal published on Wednesday by the study’s author did not address this mistake, which erroneously ranked Canada 30th for these very high-speed services. And since the Harvard study also relied on the flawed OECD sampling, the error was compounded, unfairly scoring Canada twice over.
No matter how many studies adopt flawed speed and pricing survey data, the source just won’t smell any better. By wrapping it up with impressive academic logos, the data might have taken on a better appearance – just don’t get too close and be sure to wash your hands after touching it.