ISED got it right

Is it possible that ISED got it right? With Canada’s 3800 MHz auction closing yesterday, it appears that ISED’s use of “caps” (as contrasted with spectrum set-asides) may have contributed to an auction that kept prices internationally competitive.

The total money raised in the auction was $2.16B, far below Bay Street financial analysts’ expected range of $4B to $10B. The average cost per MHz-pop was $0.29, roughly 60% lower than Scotiabank’s pre-auction estimate of $0.70. As I wrote earlier this week, the $0.29 is in line with Australia’s recent auction, which worked out to C$0.26 per MHz-pop.

Recall that only 2 years ago, the 3500 MHz auction raised $8.9B, with an average cost of $3.28 per MHz-pop.
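For readers unfamiliar with the metric, $/MHz-pop simply divides total licence fees by the product of the bandwidth sold and the population covered by the licences. A minimal sketch in Python, using round illustrative inputs (not ISED’s actual licence-by-licence tallies) chosen to land near the headline figure:

def price_per_mhz_pop(total_paid: float, bandwidth_mhz: float, population: float) -> float:
    """Price per MHz-pop: total fees / (MHz of spectrum x population covered)."""
    return total_paid / (bandwidth_mhz * population)

# Illustrative only: roughly $2.16B for about 200 MHz covering ~37M people
# works out to about $0.29 per MHz-pop, in line with the auction's headline number.
print(round(price_per_mhz_pop(2.16e9, 200, 37e6), 2))  # -> 0.29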

With the cost of capital significantly higher, financial analysts had expected more restrained bidding, but the results came in well below even those expectations. BMO Capital Markets called it “A much more disciplined auction.” Scotiabank said, “Finally a spectrum auction that does not break the bank”. TD Securities said, “In short, we are delighted with the outcome of the auction. Each of Rogers, Bell, TELUS, and Quebecor spent materially less than what we and the Street had expected in this mid-band auction.” A note from National Bank credits the cross-band spectrum cap and the greater amount of available spectrum.

The lower spectrum cost means carriers will be in a better position to invest in physical infrastructure.

A number of carriers released statements last night:

  • Bell: Bell secures the most 5G+ spectrum nationwide with acquisition of 3800 MHz licenses
  • Cogeco: 3800 MHz spectrum auction: Cogeco acquires 99 licenses in Québec and Ontario
  • Rogers: Rogers Acquires 3800 MHz 5G Spectrum Across Canada
  • SaskTel: SaskTel invests $10.2 million to acquire 3800 MHz wireless spectrum as part of its mission to deliver advanced 5G connectivity to customers across Saskatchewan
  • TELUS: TELUS secures critically important 3800 MHz spectrum licences, unleashing the full potential of 5G
  • Videotron: 3800 MHz wireless spectrum auction – Quebecor and Videotron invest nearly $300 million to move forward with Canadian expansion

Reading these, we might conclude that ISED got it right.

In any case, remember, there is a free webinar from the International Telecommunications Society next Thursday (December 7): “Optimizing spectrum auctions”. See you there!

AI-generated content

Bronwyn Howell recently wrote an article entitled “AI-Generated Content, Fake News and Credible Signals” for AEIdeas that I found to be particularly insightful.

It has been a couple of months since I wrote “Emerging technology policy”, and I think the AEI article adds some useful perspectives.

She writes about the potential for people to be misled by AI-generated content due to what she terms information asymmetry. “Exploiting information asymmetries is not new. Snake oil salesmen and the advertising industry have long been “economical with the truth” to persuade gullible consumers to buy their products.”

In a digital world, consumers of AI-generated content do not necessarily know “whether the content they consume is a factual representation or a digital creation or manipulation, but the publisher does.” Regulations requiring content generated by AI to be labeled as such are intended to help overcome the information asymmetry.

Sometimes, no harm comes from the consumer not knowing. For example, if I am not told the aliens in a sci-fi movie are computer-generated, I am unlikely to be harmed; indeed, my enjoyment may be reduced if I am reminded of this before the movie starts, or if the information is emblazoned across the screen when the aliens are in action. But sometimes harm does come from the consumer not knowing—for example, when a video shows a politician saying or doing things that they did not. Yet even here, it is not clear or straightforward. If someone is lampooning a politician for entertainment purposes, then labelling is likely unnecessary (and even potentially harmful if it detracts from the entertainment experience). But if it is an election advertisement, and the intention is to convince voters that the portrayed events are factual and not fictional, then the asymmetry is material.

Potential harms may not arise from how the content was created, but rather from the intent behind its use. If the content is intended to deceive the consumer, regardless of how the content was created, then we need to examine ways to protect the public.

It may not be sufficient to require labelling of content generated by AI. It can be too easy to lie about its origins, and indeed, labelling may not be necessary if no harm ensues. Instead, the article suggests that regulatory “controls are required for the subset of transactions in which harm may occur from fake content.” She uses the example of election advertising, where rules already exist in most jurisdictions. “This suggests electoral law, not AI controls, are the best place to start managing the risks for this application”.

Do we need technology-specific legislation and regulation? Or should we ensure that existing protections, developed for conventional technologies, can be applied where artificial intelligence generates content?

An article on ABC News earlier this week says, “The war in Gaza is highlighting the latest advances in artificial intelligence as a way to spread fake images and disinformation”.

The risk that AI and social media could be used to spread lies to U.S. voters has alarmed lawmakers from both parties in Washington. At a recent hearing on the dangers of deepfake technology, U.S. Rep. Gerry Connolly, Democrat of Virginia, said the U.S. must invest in funding the development of AI tools designed to counter other AI.

A paper [pdf, 300KB] released earlier this week by Joshua Gans of the Rotman School of Management at the University of Toronto asks “Can Socially-Minded Governance Control the AGI Beast?”. Spoiler alert: he concludes (robustly) that it cannot.

Optimizing spectrum auctions

What approach should governments follow for optimizing spectrum auctions?

Access to spectrum is vital for modern digital communications. Radio frequencies are essential for smartphone connectivity, access to the cloud and the Internet of Things, as well as potential use cases in autonomous vehicles and artificial intelligence.

Governments use auctions in order to allocate radio spectrum to companies efficiently; these auctions have a significant impact on innovation, economic well-being, and government revenues.

Let’s take a look at the results of the 3500 MHz auctions in Australia and Canada. Last week, Australia’s auction concluded, raising just under AU$725M (roughly C$650M). Similar spectrum was auctioned in Canada two years ago, raising C$8.9B, more than 13 times as much. Canadian carriers paid C$3.28 per MHz-pop; Australia’s auction worked out to C$0.26 per MHz-pop.
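A quick back-of-the-envelope check on those figures (the AUD-to-CAD rate of roughly 0.90 is my assumption, implied by the C$650M conversion above):

aus_total_cad = 725e6 * 0.90           # ~C$650M raised in Australia's 3500 MHz auction
can_total_cad = 8.9e9                  # C$8.9B raised in Canada's 3500 MHz auction (2021)
print(can_total_cad / aus_total_cad)   # ~13.6x the total proceeds

print(3.28 / 0.26)                     # ~12.6x on a per-MHz-pop basis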

Canadian carriers paid billions of dollars more in spectrum fees than their Australian counterparts. Had Canada’s auction been structured to optimize infrastructure investment instead of driving government revenues, what would have been the impact on innovation and economic well-being?

The International Telecommunications Society is hosting its next one-hour webinar on December 7, starting at 10:00 am (Eastern), looking at “Optimizing spectrum auctions”. Frequent readers know that I have been a big fan of the ITS webinars (like last month’s AI policy webinar).

Geoffrey Myers, a former economist with Ofcom and now a visiting professor in practice at the London School of Economics and Political Science, will discuss his recent book, Spectrum Auctions: Designing markets to benefit the public, industry and the economy.

The webinar will draw on his extensive experience at the UK’s communications regulator, and his study of the theory and practice of spectrum auctions. Professor Myers will explore the optimization of regulatory design in spectrum auctions, providing insights on the entire spectrum auction process. He will address several critical themes that emerge from his work:

  • How can we continually improve spectrum auction design by learning from successes and failures worldwide?
  • Are there enough analytical tools to consistently guide and support spectrum policy decisions?
  • How can expert advice, extending beyond technical and economic knowledge, shape spectrum policy effectively while considering policymakers’ practical concerns?

This webinar is a valuable resource for regulators, economists, and private sector experts involved in spectrum auction design and bidding strategies. It also provides insights for applied economists, teachers, and advanced students interested in market design and public management.

I hope to see you online. Reserve your complimentary spot today.

How fast is fast enough for broadband?

Just how fast is fast enough for broadband?

I last wrote about this 3 years ago, challenging the myth that universal fibre should be on the national agenda.

A couple of weeks ago, the FCC launched an inquiry [pdf, 155KB] to examine that question. The FCC intends to look at “universal deployment, affordability, adoption, availability, and equitable access to broadband”. The FCC Chair, Jessica Rosenworcel, says the intent is to update the US broadband standard (currently 25 Mbps down, 3 Mbps up) to 100/20 and to set a long-term goal of gigabit speeds.

The FCC Chair said that the 25/3 standard “is not only outdated, it masks the extent to which low-income neighborhoods and rural communities are being left offline and left behind.”

However, FCC data shows that 94% of US households had 100 Mbps access available by the end of 2021. According to Eric Fruits of the International Center for Law & Economics (ICLE), “If the FCC wants to increase the number of households with 100/20 Mbps speeds, it should recognize that much of the gap is driven by lower rates of adoption, rather than a lack of access to higher speeds.”

That is a familiar refrain for my readers. “The problem of increased broadband adoption can’t be fixed directly by throwing money at it, but we need to undertake more serious research into those factors that stand in the way of people subscribing to broadband.”

A September brief from ICLE was entitled “Finding Marginal Improvements for the ‘Good Enough’ Affordable Connectivity Program”. ICLE found that “about two-thirds of households without at-home internet have access, but don’t subscribe.” The brief argues that, for households without a broadband subscription, smartphone internet service may provide a superior “bang for the buck” relative to fixed broadband.

Just as mobile devices have become a substitute for wireline home phones, we need to examine the extent to which smartphones and mobile services are substitutes for home internet connections.

In 2021, Pew Research found that 19% of respondents said the most important reason for not having broadband at home is that their smartphone does everything they need to do online. That study found that 15% of US adults are “smartphone-only” internet users – that is, they have a smartphone, but do not have a home broadband connection.

What is the best approach for encouraging continued broadband investment?

Do regulators need to raise targets? CRTC data shows that more than two-thirds of Canadian broadband subscriptions were already at speeds of 100 Mbps or higher, well above Canada’s broadband objective. Ninety percent of households had access to 100 Mbps service by year-end 2021; more than three-quarters of Canadians had access to gigabit speeds.

When there is demand for higher speeds, doesn’t this demonstrate companies will make the necessary investments? As I have said many times before, the future can be brighter for Canadian innovation and investment if the government would try harder to get out of the way.

Reviewing net neutrality

I find it interesting to see how two of Canada’s most important telecom trading partners are approaching net neutrality.

Recall, earlier this year I wrote “Canada’s policy framework for net neutrality is among the most prescriptive and restrictive.” A month later, I asked if Canada should be considering a review of its network neutrality policy.

South of the border, we see the FCC looking at imposing network neutrality regulation through a recategorization of internet services under Title II. Meanwhile, the UK’s regulator, Ofcom, recently concluded a year-long review and announced that the UK is heading in the opposite direction, revising its network neutrality guidelines to relax the earlier rules. Ofcom’s Director of Connectivity, Selina Chadha, is quoted saying:

The net neutrality rules are designed to constrain the activities of broadband and mobile providers, however, they could also be restricting their ability to develop new services and manage their networks efficiently.

We want to make sure they can also innovate, alongside those developing new content and services, and protect their networks when traffic levels might push networks to their limits. We believe consumers will benefit from all providers across the internet innovating and delivering services that better meet their needs.

In the UK, certain aspects of net neutrality are imposed under Parliamentary legislation. Ofcom is responsible for monitoring and ensuring compliance and cannot change the legislation itself.

Ofcom’s statement on the new guidelines sets out:

  • ISPs can offer premium quality retail offers: Allowing ISPs to provide premium quality retail packages means they can better meet some consumers’ needs. For example, people who use high quality virtual reality applications may want to buy a premium quality service, while users who mainly stream and browse the internet can buy a cheaper package. Our updated guidance clarifies that ISPs can offer premium packages, for example offering low latency, as long as they are sufficiently clear to customers about what they can expect from the services they buy.
  • ISPs can develop new ‘specialised services’: New 5G and full fibre networks offer the opportunity for ISPs to innovate and develop their services. Our updated guidance clarifies when they can provide ‘specialised services’ to deliver specific content and applications that need to be optimised, which might include real time communications, virtual reality and driverless vehicles.
  • ISPs can use traffic management measures to manage their networks: Traffic management can be used by ISPs on their networks, so that a good quality of service is maintained for consumers. Our updated guidance clarifies when and how ISPs can use traffic management, including the different approaches they can take and how they can distinguish between different categories of traffic based on their technical requirements.
  • Most zero-rating offers will be allowed: Zero-rating is where the data used by certain websites or apps is not counted towards a customer’s overall data allowance. Our updated guidance clarifies that we will generally allow these offers, while setting out the limited circumstances where we might have concerns.

Professor Mark Jamison of the University of Florida’s Public Utility Research Center writes that “net neutrality is a concept whose time has passed.” Instead of relying on rules tailored for the digital age, the FCC is planning to bring internet service providers under Title II, regulations originally developed for monopoly wireline telephone services. Reclassification opens the door for all of the old rules to apply.

Professor Jamison notes that the proposed regulations introduce a ‘general conduct standard’ “that grants the FCC authority to prohibit anything it deems ‘unreasonable.'” He argues that, in an era of such rapid technological advancement, regulators need to make decisions based on sound evidence and adopt a ‘light-handed regulatory approach.’

Should the CRTC undertake a review similar to the one just completed by Ofcom? Would Canada, like the UK, find that the current rules may be restricting carriers’ ability to develop new services and manage their networks efficiently?

Does the CRTC have the resource capacity to take on yet another review as it implements the government’s Online Streaming and Online News Acts?
