Final comments

Friday’s mail had final submissions for the CRTC’s review of usage based billing.

I had a chance to glance through some of them over the course of the past few days.

In essence, the challenge for the CRTC is to strike the correct rates, as Primus states right up front. Regardless of the model chosen, if the rates are set appropriately, independent ISPs will be able to develop differentiated services that provide consumers with competitive options. It appears the question is no longer whether there will be some form of usage sensitive wholesale pricing, but rather what form of usage based billing will be adopted. As Bell stated:

Investments continue to be the first and preferred approach to managing network capacity but it cannot realistically be the sole approach. Without some form of pricing model that ensures that those who “use the most, pay the most”, revenues necessary to cover the costs and make investments in the network have to be absorbed by all users. Under the current wholesale model, those who use the least must subsidize those who use the most, and that is not the fairest approach to billing for Internet use.

We have two models under consideration, each with slight variants advocated by various proponents: capacity based and volume based.

While Primus suggests that the capacity based model is attractive because

  • Capacity billing models rely on industry standard practices.
  • Capacity billing models correctly link billing amounts to the impact that competitors have on the network. This correctly ensures that the competitor that has the most impact on the network pays the most. This also provides a direct incentive for competitors to manage the impact that they have on the network.
  • Capacity billing models are transparent as both the competitor and the wholesale access service provider are able to measure capacity usage at the same time and place. This significantly lowers the likelihood of disputes and streamlines the process to resolve any disputes that do occur.
  • Capacity billing models completely sever any relationship between wholesale and retail billing models.

However, as Bell has indicated, there is not necessarily a correlation between peak usage at the point of network interconnection (between the retail and wholesale ISPs) and the various portions of the shared network that may experience congestion.

95th percentile is currently used as a billing method for point-to-point services such as network transit and enterprise connections. In such circumstances, the 95th percentile adequately represents usage at the point of interconnection.
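For readers unfamiliar with the mechanics, here is a minimal sketch of how 95th percentile billing typically works. The sampling interval, the sample values and the function name are illustrative assumptions, not anything from the CRTC proceeding: throughput is sampled at regular intervals over the billing period, the top 5% of samples are discarded, and the highest remaining sample sets the billable rate.

```python
# Illustrative sketch of 95th percentile billing (hypothetical numbers).
# Throughput at the point of interconnection is sampled at regular
# intervals (commonly every 5 minutes); the top 5% of samples are
# discarded and the highest remaining sample sets the billable rate.

def billable_95th_percentile(samples_mbps):
    """Return the 95th percentile of a list of throughput samples (Mbps)."""
    ordered = sorted(samples_mbps)
    # Discard the top 5% of samples; bill on the highest of the rest.
    index = int(len(ordered) * 0.95) - 1
    return ordered[max(index, 0)]

# A month of 5-minute samples: mostly modest usage with brief bursts.
samples = [100.0] * 8360 + [900.0] * 280  # 8,640 samples ≈ 30 days
print(billable_95th_percentile(samples))  # 100.0: bursts fall in the free 5%
```

The net effect is that short bursts, however large, do not raise the bill as long as they occupy less than 5% of the billing period.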

Wholesale access is more of a point-to-multipoint type service, for which Bell argues that a volume based billing model is a better proxy. On the other hand, TELUS does not charge usage sensitive wholesale or retail rates, nor has it proposed to do so. As such, TELUS has requested that the CRTC continue to allow wholesale services to be offered on a flat rate basis, should a carrier choose to do so.

TELUS supports the general position that an ILEC or cable carrier should be able to charge volume-sensitive wholesale rates should it wish to do so. All networks are not alike. Different architectures, different levels of investment, different usage patterns and different life cycles necessarily mean that networks will experience congestion and usage driven costs differently. TELUS has observed that data usage has been growing at an accelerating pace amongst its Internet customer base, including those end-customers that are served by wholesale ISPs. As a result, TELUS (or any other network provider) might ultimately, in the near to medium term, be required to examine volume-sensitive pricing models to deal with the costs associated with managing this data growth.

In choosing between usage sensitive billing models, TELUS supports an aggregated volume based model, because it is relatively simple to implement, easy to understand and less costly. A submission on behalf of Rogers, Cogeco and Videotron agreed and discussed the risk of arbitrage that would be enabled by the 95th percentile model.
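The arbitrage risk can be made concrete with a toy comparison. All of the numbers below are hypothetical, chosen only to illustrate the mechanism: two traffic profiles carry the same total volume, but the one that packs its transfers into brief bursts shows a much lower 95th percentile, while an aggregated volume model bills both identically.

```python
# Illustrative comparison (hypothetical numbers) of the two billing models.
# Under 95th percentile billing, an ISP that packs bulk traffic into brief
# bursts can carry the same total volume while shrinking its billable rate;
# under aggregated volume billing, both profiles pay the same.

def total_volume_gb(samples_mbps, interval_s=300):
    # Mbps x seconds -> megabits; / 8 -> megabytes; / 1000 -> gigabytes
    return sum(s * interval_s for s in samples_mbps) / 8 / 1000

def p95(samples_mbps):
    ordered = sorted(samples_mbps)
    return ordered[max(int(len(ordered) * 0.95) - 1, 0)]

steady = [200.0] * 1000                  # constant 200 Mbps
bursty = [100.0] * 960 + [2600.0] * 40   # same total volume, in 4% bursts

print(total_volume_gb(steady) == total_volume_gb(bursty))  # True
print(p95(steady), p95(bursty))  # 200.0 100.0: bursty bills at half the rate
```

Since the bursts occupy under 5% of the samples, they vanish entirely from the percentile calculation even though they represent more than half the bursty profile's traffic.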

I’ll have more later in the week. I welcome your comments.

Slow news days?

WiFi safety was back in the news thanks to a tweet from Green Party leader Elizabeth May:

It is very disturbing how quickly Wifi has moved into schools as it is children who are the most vulnerable.

Which set off a maelstrom of tweets from the extremes of the informed.

Some pointed to the World Health Organization advisory on mobile phones and had trouble discerning the differences between mobile phones and WiFi. A key statement in the WHO advisory appeared to have escaped people's notice:

The power (and hence the radiofrequency exposure to a user) falls off rapidly with increasing distance from the handset. A person using a mobile phone 30–40 cm away from their body – for example when text messaging, accessing the Internet, or using a “hands free” device – will therefore have a much lower exposure to radiofrequency fields than someone holding the handset against their head.

Most of us do not walk around with a WiFi device strapped to our heads. Our laptop computers are usually more than the 30-40 cm distance away, at least when I am typing on mine. The Ontario Medical Officer of Health has said WiFi is OK, citing Health Canada standards. Princeton University – a school with a reasonable reputation – did a review of the literature to develop a position statement for its staff and students to provide assurance that they were working in a safe environment.

What was the basis for whipping up anxiety over technology that is already built to meet national and international safety standards?

It is still July. I wasn’t ready for back-to-school special interest issues like this, despite the pre-release announcements from some of the phone companies. Next thing you know, the office supply stores will start those “most wonderful time of the year” jingles.

UPDATE: David Hillier provides the best link to sum up the WiFi issue.

Managing the user experience

Yesterday’s blog post about wireless network capacity showed the opportunity for network equipment providers to supply gear to service providers facing base station and backbone exhaust.

Canadian technology firm Sandvine has been successful in helping service providers with intelligent network management. Sandvine has acquired more new clients so far in 2011 than all of last year.

As I wrote yesterday,

As smartphone penetration rates continue to rise, together with adoption of attractive streaming video and audio services, networks will continue to be challenged to provide satisfactory user experience.

As if to validate my comment, Rogers reported yesterday that smartphones now represent 48% – almost half – of its mobile customer base. Investment in capacity increases takes time – to plan, engineer, install.

How do network operators provide a superior user experience as traffic volumes continue to increase? Sandvine has a solution for service providers to intelligently manage congestion.

Driving mobile network growth

A story in itWorld Canada discussed a Credit Suisse report that suggests that more than a third of network base stations face capacity constraints and networks are operating at 80% utilization levels.

The survey results have led the investment bank to predict increased sales for network equipment providers such as Huawei, Nokia Siemens Networks, Ericsson and Alcatel-Lucent.

Consumer demand for high speed data continues to exacerbate network edge and backbone congestion, creating capital spending opportunities on both sides of the border. I have heard technical support people suggest that the network will operate better if the user adjusts their handset to use 2G only, shutting off access to the HSPA network. Amazingly, they say this with a deadpan tone of voice – although I am certain that this makes for great coffee room chatter around the call centre.

As smartphone penetration rates continue to rise, together with adoption of attractive streaming video and audio services, networks will continue to be challenged to provide satisfactory user experience.

How is your network provider performing?

The coming year should see increased levels of investment as carriers try to stay ahead of continually increasing demand.


Provincial elections and digital strategies

When Canada went to the polls in May of this year, the election and subsequent cabinet shuffle delayed the release of a National Digital Strategy. Despite all of the best intentions for a Spring 2011 release of the federal strategy, it was to be expected that the new Industry Minister would seek to put his imprimatur on a file that falls largely under his mandate, shared with his colleagues, the Ministers for Heritage and Human Resources. After all, telecommunications and the Telecom Act are within the purview of Minister Paradis.

Still, consider that many of the areas most commonly assumed to be part of a digital strategy actually fall under provincial responsibility: education, health care, social safety nets. There are a number of provinces facing elections this fall – perhaps providing incentives for provincial parties to turn their minds to progressive election agendas.

Canada’s largest province, Ontario, will have an election on October 6. We will be watching the platforms of the major parties (Liberal, Conservative and NDP). Manitoba’s provincial elections are two days earlier (October 4); PEI residents are voting October 3; Newfoundland and Labrador voters go to the polls October 11; Saskatchewan’s provincial elections are a month later (November 7).

Will the federal government stay on the sidelines with the release of its national digital strategy in order to avoid conflicts with the Ontario provincial election? Will any of Canada’s provinces scoop the federal government with the release of a comprehensive digital strategy?
