Learning from the past

An essay in Time Magazine by Clemson economist Thomas Hazlett caught my eye. He opens “How to Neuter the Net Revolution” with a quote from Professor Lawrence Lessig:

The Internet revolution has ended just as surprisingly as it began. None expected the explosion of creativity that the network produced; few expected that explosion to collapse as quickly and profoundly as it has.

The kicker is that these words are from a paper written in 2001 promoting rules to ensure an open internet.

At that time, Lessig wrote:

The Internet promised the world — particularly the weakest in the world—the fastest and most dramatic change to existing barriers to growth. That promise depends on the network remaining open to innovation. That openness depends upon policy that better understands the Internet’s past.

The internet revolution didn’t come to an end in 2001 as Lessig warned. I agree that the openness of the internet “depends upon policy that better understands the Internet’s past.” However, I am not convinced that we share the same understanding of that history.

Some observers are challenging the more idealized assumptions being set forth to justify government regulation. Hazlett writes, “the idea that the Internet is everywhere neutral, that all bits are treated equally, is false.” Frost & Sullivan principal analyst Dan Rayburn wrote: “There has never been any rule or understanding that certain networks must carry traffic for free.” And in the Washington Post, Larry Downes writes, “The engineering of the Internet has never been ‘neutral,’ nor could it be. Voice and streaming video traffic, for example, which is much more sensitive to delays, is regularly given priority.”

Downes also writes in the Harvard Business Review:

As far back as 1999, at the dawn of broadband Internet access through DSL and cable modems, the same advocates were making the same urgent pleas. Absent immediate nationalization of the Internet, they argued, ISPs were certain to block or otherwise disadvantage start-ups, leaving the Internet in the hands of a few dominant content providers. You know, like AOL, GeoCities, and Blue Mountain electronic cards. (Google search was still in beta.)

Back then, fortunately, the White House, Congress, and the FCC ignored the doomsayers. Instead, a rare bipartisan coalition held fast to the view that for emerging technologies, light-touch regulation was more likely to encourage competition and discipline market participants than the heavy hand of regulators.

Plus ça change…

To better understand the past, it may be helpful to look at an FCC working paper from 1999 [pdf], entitled “The FCC and the Unregulation of the Internet”.

The author, Jason Oxman, cites five key FCC policy decisions that benefited the development of the internet:

  • Fostering the development of an interconnected telecommunications network that ensured near universal availability of a reliable and affordable telephone system over which data services could be offered.
  • Determining through the Computer Inquiry proceedings that computer applications offered over that network were not subject to regulation, giving rise to the unregulated growth of the Internet.
  • Exempting enhanced service providers from the access charges paid by interexchange carriers, helping drive the availability of inexpensive dial-up Internet access.
  • Deregulating the telecommunications equipment market while requiring carriers to allow users to connect their own terminal equipment, helping to foster the widespread deployment of the modem and other data equipment tools that can be easily attached to the public switched network.
  • Implementing flexible spectrum licensing policies that permit innovative uses of wireless data services, leading to the development of wireless Internet applications.

And the paper cited fundamental lessons learned from 30 years of application of a deregulatory approach by the FCC:

  • Do not automatically impose legacy regulations on new technologies;
  • When Internet-based services replace traditional legacy services, begin to deregulate the old instead of regulating the new; and
  • Maintain a watchful eye to ensure that anticompetitive behavior does not develop, do not regulate based on the perception of potential future bottlenecks, and be careful that any regulatory responses are the minimum necessary and outweigh the costs of regulation.

Prescient words from a paper written 15 years ago: “Do not regulate based on the perception of potential future bottlenecks, and be careful that any regulatory responses are the minimum necessary and outweigh the costs of regulation.”

A few weeks ago, I asked if Canada’s net neutrality rules have delivered the benefits to justify the costs of regulation:

Five years later, how many countries have followed Canada’s lead? Should we be reviewing the policy framework for traffic management and content delivery examining whether our rules are appropriate?

Canada may have been first, but one might ask if Canada can be considered a leader if other countries haven’t followed behind. Are Canadians – consumers, creators and carriers – well served by the current “comprehensive approach to Internet traffic management practices”?

As Dan Rayburn observed, “Net neutrality is an incredibly complex set of problems that people keep trying to simplify and politicians try to turn into sound bites.”

What lessons from the past can continue to be applied?

When the FCC issues its determination, it may be worthwhile for the CRTC to begin a fresh look at its regulatory policy framework to ensure that Canadians continue to be positioned to lead in a global digital economy.
