Last week, PIAC released a report [full report (1 MB pdf)] calling for better disclosure about the speeds being delivered to subscribers of internet services. Increased transparency and knowledge about services are always a good thing. The report is a good read; it includes a discussion of the factors that can lead to misleading readings from many of the commonly used speed measurement tools. Of the report’s 6 recommendations, many tie into recommendation number 5: “Greater consumer education is needed surrounding broadband advertising claims.”
I agree. It is important for consumers to clearly understand what they can reasonably expect when subscribing to their telecommunications services.
However, one particularly misleading chart in the PIAC report warrants mention, since it turned out to be the highlight of CBC’s coverage of the report. On page 25, the report says:
data testing the measured download speed using the Network Diagnostic Test (NDT) tool on the Measurement Lab platform was compared against the data about advertised download speeds collected by the OECD in 2010. This chart shows that the OECD determined that the average advertised download speed in Canada according to 18 offers was 20.82 Mbps but 114,165 tests using the NDT tool showed an average measured download speed of 3.11 Mbps.
CBC highlighted this in its coverage, which was titled “Advertised internet speeds not backed up by data”:
For example, [PIAC Report co-author Janet] Lo said, at a time when the Organisation for Economic Co-operation and Development listed the average maximum advertised internet speeds in Canada as 20 megabits per second, the average internet speed among Canadians who used Google’s Measurement Lab to test their internet connections was just three megabits per second.
Now, frequent readers know that I don’t approve of the OECD methodology, but my complaint today has nothing to do with the OECD ranking. The problem is that the report compares a sampling of advertised speeds with the average across the embedded base of services people actually subscribe to. The two have very little to do with each other. Car companies may advertise their most fuel-efficient vehicles, but that has little to do with the average fuel economy consumers actually experience. Not everyone owns a brand new fuel-efficient hybrid (there is a large embedded base of older cars), and many new car shoppers elect to purchase a vehicle that is not as fuel efficient as those being advertised.
Similarly, many consumers choose not to subscribe to the higher-speed tiers, despite the high speeds being offered. If the objective is to understand whether consumers are getting what they are paying for, why compare against the speeds currently featured in ads?
The far more important comparison, for this purpose, is against the average speed to which consumers are actually subscribed. That data is available in Table 5.3.3 of the CRTC’s Monitoring Report: the CRTC reported a weighted average download speed of 7.060 Mbps for 2010 and 8.238 Mbps for 2011.
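To make the distinction concrete, here is a minimal sketch of how such a weighted average subscribed speed is computed. The tier speeds and subscriber shares below are entirely hypothetical, chosen for illustration only; they are not CRTC data:

```python
# Hypothetical subscription tiers: (download speed in Mbps, share of subscribers).
# Illustrative numbers only -- not CRTC data.
tiers = [
    (5.0, 0.40),   # legacy/value tier, 40% of subscribers
    (10.0, 0.35),  # mid tier, 35% of subscribers
    (15.0, 0.15),  # higher tier, 15% of subscribers
    (25.0, 0.10),  # flagship tier featured in the ads, 10% of subscribers
]

weighted_avg = sum(speed * share for speed, share in tiers)
print(f"Weighted average subscribed speed: {weighted_avg:.2f} Mbps")  # 10.25 Mbps
```

Even though 25 Mbps is the headline speed in this hypothetical market, the large embedded base on slower tiers pulls the weighted average down to 10.25 Mbps. That average, not the advertisement, is the appropriate baseline for judging measured speeds.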
The 3.11 Mbps measured still comes up short, but the shortfall is far less dramatic: 3.11 Mbps is roughly 44% of the 7.060 Mbps average subscribed speed, versus only about 15% of the 20.82 Mbps average advertised speed. The attention-grabbing tactic of comparing against a largely irrelevant metric like average advertised speed does not contribute to better consumer understanding of the real issues.
How will the industry and regulator respond?
Last November, I wrote that Rogers was already engaging SamKnows to verify the speeds being delivered to its customers. At the time, I noted that the CRTC already has in its 3-year plan an intent to monitor ISP upload and download speeds. I also observed:
As the broadband internet market matures to approach universal adoption, such confidence building measures can be expected to be used in marketing its services. How confident are you that you are getting what you pay for?
Will ISPs engage verification services similar to SamKnows, or will they wait for a CRTC (or Competition Bureau) order to do so?
PIAC has once again produced a report under the Industry Canada contributions program well past the funding window. They received $76,000 to produce this report for 2011-2012, with work to be completed by March 31, 2012. They clearly performed work after that date, yet no one seems to call them to account for these public funds.
It is unclear whether they have ever finished a contributions program report within the funding window.
Personally, I think that ISPs which have shown systemic issues with a lack of network infrastructure upgrades (as Rogers has twice in the past, with users across its networks seeing sub-2 Mbps speeds while paying for much more; see http://www.moneyville.ca/article/1073972–roseman-what-s-up-with-rogers-high-speed-internet) should be forced by the CRTC to run a service like SamKnows.
Personally, I think the only reason Rogers is doing the SamKnows testing is that the CRTC sat up and took notice that ISPs haven’t been following 2009-657 (as Rogers was caught not doing) by investing in their networks before implementing monetary or technical ITMPs.
The only problem comes when the TPIA/GAS ISPs start experiencing issues that can be pinpointed to an incumbent-caused problem. As it stands, if there is a local node congestion issue affecting a TPIA user (which would affect Rogers customers in that area as well), it’s nearly impossible to get Rogers to take notice; at most, Rogers will schedule a technician visit for no reason, forcing the customer to pay for a call-out that wasn’t required. It would essentially force TPIA/GAS ISPs to also run the service, as a way to prove to the incumbent that there is an actual problem, because right now nothing gets fixed unless the incumbent’s own customers complain.
If Canadians are going to get better internet network infrastructure in Canada, it’ll probably require the CRTC ordering a service agreement between incumbents and TPIA/GAS ISPs with better response times, one that actually takes their concerns into account rather than brushing them off as happens now.
The focus on speed in ISP advertising can sometimes be misleading. An analogy I read once compared it to car advertising in the ’50s and ’60s, when “horsepower” was the touted statistic in the ads because that was what could be measured. More “speed” does not necessarily lead to a better Internet experience for the user.
Latency and the variability of packet delay can greatly affect the user experience, as can the number of users simultaneously contending for network resources. So can the application the user is running: a VoIP call behaves differently from web surfing, which behaves differently from real-time gaming, and so on. The Internet is a statistically multiplexed environment, and focusing on just one element, download speed, won’t give you a true indication of how it will behave in the real world.
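As a rough sketch of that point, two connections that score identically on a download-speed test can feel very different in use. The delay samples below are made up, purely for illustration:

```python
from statistics import mean

# Hypothetical packet delay samples in milliseconds for two connections
# that measure the *same* download speed. Numbers are made up, for
# illustration only.
stable_link = [30, 31, 29, 30, 32, 30, 31, 29]
congested_link = [30, 85, 22, 140, 35, 95, 28, 110]

for name, samples in (("stable", stable_link), ("congested", congested_link)):
    # Simple jitter estimate: mean absolute difference between consecutive
    # delay samples (a simplified take on RFC 3550's interarrival jitter).
    jitter = mean(abs(a - b) for a, b in zip(samples, samples[1:]))
    print(f"{name}: mean delay {mean(samples):.1f} ms, jitter {jitter:.1f} ms")

# stable:    mean delay ~30 ms, jitter ~2 ms  -> fine for VoIP and gaming
# congested: mean delay ~68 ms, jitter ~79 ms -> painful for real-time use,
# even though a one-shot download test would report the same "speed"
```

A speed test averaged over several seconds would hide exactly this difference, which is why download speed alone is a poor proxy for quality of experience.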
But this approach is at least a start. Let’s hope it is not the last word in holding ISPs accountable…