AI-generated content

Bronwyn Howell recently wrote an article entitled “AI-Generated Content, Fake News and Credible Signals” for AEIdeas that I found to be particularly insightful.

It has been a couple of months since I wrote “Emerging technology policy” and I think that the AEI article presents some good perspectives.

She writes about the potential for people to be misled by AI-generated content due to what she terms information asymmetry: “Exploiting information asymmetries is not new. Snake oil salesmen and the advertising industry have long been ‘economical with the truth’ to persuade gullible consumers to buy their products.”

In a digital world, consumers of AI-generated content do not necessarily know “whether the content they consume is a factual representation or a digital creation or manipulation, but the publisher does.” Regulations requiring content generated by AI to be labeled as such are intended to help overcome the information asymmetry.

Sometimes, no harm comes from the consumer not knowing. For example, if I am not told the aliens in a sci-fi movie are computer-generated, I am unlikely to be harmed; indeed, my enjoyment may be reduced if I am reminded of this before the movie starts, or if the information is emblazoned across the screen when the aliens are in action. But sometimes harm does come from the consumer not knowing—for example, when a video shows a politician saying or doing things that they did not. Yet even here, it is not clear or straightforward. If someone is lampooning a politician for entertainment purposes, then labelling is likely unnecessary (and even potentially harmful if it detracts from the entertainment experience). But if it is an election advertisement, and the intention is to convince voters that the portrayed events are factual and not fictional, then the asymmetry is material.

Potential harms may not arise from how the content was created, but rather from the intent behind its use. If the content is intended to deceive the consumer, regardless of how the content was created, then we need to examine ways to protect the public.

It may not be sufficient to require labelling of content generated by AI. It can be too easy to lie about its origins, and indeed, labelling may not be necessary if no harm ensues. Instead, the article suggests that regulatory “controls are required for the subset of transactions in which harm may occur from fake content.” She uses the example of election advertising, where rules already exist in most jurisdictions. “This suggests electoral law, not AI controls, are the best place to start managing the risks for this application”.

Do we need technology-specific legislation and regulation? Or do we ensure that existing protections for conventional technologies can apply in a world where artificial intelligence generates content?

An article on ABC News earlier this week says, “The war in Gaza is highlighting the latest advances in artificial intelligence as a way to spread fake images and disinformation”.

The risk that AI and social media could be used to spread lies to U.S. voters has alarmed lawmakers from both parties in Washington. At a recent hearing on the dangers of deepfake technology, U.S. Rep. Gerry Connolly, Democrat of Virginia, said the U.S. must invest in funding the development of AI tools designed to counter other AI.

A paper [pdf, 300KB] released earlier this week by Joshua Gans of the Rotman School of Management at the University of Toronto asks “Can Socially-Minded Governance Control the AGI Beast?”. Spoiler alert: he concludes (robustly) that it cannot.

Optimizing spectrum auctions

What approach should governments follow for optimizing spectrum auctions?

Access to spectrum is vital for modern digital communications. Radio frequencies are essential for smartphone connectivity, access to the cloud, and the Internet of Things, as well as for potential use cases in autonomous vehicles and artificial intelligence.

Governments use auctions to allocate radio spectrum to companies efficiently; these auctions have a significant impact on innovation, economic well-being, and government revenues.

Let’s take a look at the results of the 3500MHz auctions in Australia and Canada. Last week, Australia’s auction concluded, raising just under AUS$725M (roughly C$650M). Similar spectrum was auctioned in Canada two years ago, raising C$8.9B – more than 13 times as much. Canadian carriers paid C$3.28 per MHz-pop; Australia’s auction worked out to C$0.26 per MHz-pop.
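The gap can be sanity-checked from the article’s own figures. A quick sketch (the helper name is mine) backs out the implied volume of MHz-pop sold in each auction from revenue and the per-MHz-pop price:

```python
def implied_mhz_pop(revenue, price_per_mhz_pop):
    """MHz-pop sold = total auction revenue / price paid per MHz-pop."""
    return revenue / price_per_mhz_pop

# Figures from the auction results above (both converted to C$)
canada = implied_mhz_pop(8.9e9, 3.28)     # roughly 2.7 billion MHz-pop
australia = implied_mhz_pop(650e6, 0.26)  # 2.5 billion MHz-pop

print(f"Canada:    {canada / 1e9:.2f}B MHz-pop")
print(f"Australia: {australia / 1e9:.2f}B MHz-pop")
```

On these numbers, the two auctions covered broadly similar volumes of spectrum-population, so the roughly 13-fold revenue difference is almost entirely a difference in price.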

Canadian carriers paid billions of dollars more in spectrum fees than their Australian counterparts. Had Canada’s auction been structured to optimize increased infrastructure investment instead of driving government revenues, what would have been the impact on innovation or economic well-being?

The International Telecommunications Society is hosting its next one-hour webinar on December 7, starting at 10:00am (Eastern), looking at “Optimizing spectrum auctions”. Frequent readers know that I have been a big fan of the ITS webinars (like last month’s AI policy webinar).

Geoffrey Myers, a former economist with Ofcom, and now visiting professor in practice at The London School of Economics and Political Science, will discuss his recent book, Spectrum Auctions: Designing markets to benefit the public, industry and the economy.

The webinar will draw on his extensive experience at the UK’s communications regulator, and his study of the theory and practice of spectrum auctions. Professor Myers will explore the optimization of regulatory design in spectrum auctions, providing insights on the entire spectrum auction process. He will address several critical themes that emerge from his work:

  • How can we continually improve spectrum auction design by learning from successes and failures worldwide?
  • Are there enough analytical tools to consistently guide and support spectrum policy decisions?
  • How can expert advice, extending beyond technical and economic knowledge, shape spectrum policy effectively while considering policymakers’ practical concerns?

This webinar is a valuable resource for regulators, economists, and private sector experts involved in spectrum auction design and bidding strategies. It also provides insights for applied economists, teachers, and advanced students interested in market design and public management.

I hope to see you online. Reserve your complimentary spot today.

How fast is fast enough for broadband?

Just how fast is fast enough for broadband?

I last wrote about this 3 years ago, challenging the myth that universal fibre should be on the national agenda.

A couple of weeks ago, the FCC launched an inquiry [pdf, 155KB] to examine that question. The FCC intends to look at “universal deployment, affordability, adoption, availability, and equitable access to broadband”. The FCC Chair, Jessica Rosenworcel says the intent is to update the US broadband standard (currently 25Mbps down, 3Mbps up) to 100/20 and set a long-term goal for gigabit speeds.

The FCC Chair said that the 25/3 standard “is not only outdated, it masks the extent to which low-income neighborhoods and rural communities are being left offline and left behind.”

However, FCC data shows that 94% of US households had 100 Mbps access available by the end of 2021. According to Eric Fruits of the International Center for Law & Economics (ICLE), “If the FCC wants to increase the number of households with 100/20 Mbps speeds, it should recognize that much of the gap is driven by lower rates of adoption, rather than a lack of access to higher speeds.”

That is a familiar refrain for my readers. “The problem of increased broadband adoption can’t be fixed directly by throwing money at it, but we need to undertake more serious research into those factors that stand in the way of people subscribing to broadband.”

A September brief from ICLE was entitled “Finding Marginal Improvements for the ‘Good Enough’ Affordable Connectivity Program”. ICLE found that “about two-thirds of households without at-home internet have access, but don’t subscribe.” The brief argues that, for households without a broadband subscription, smartphone internet service may provide a superior “bang for the buck” relative to fixed broadband.

Just as mobile devices have become a substitute for wireline home phones, we need to examine the extent to which smartphones and mobile services are substitutes for home internet connections.

In 2021, Pew Research found that 19% of respondents said the most important reason for not having broadband at home is that their smartphone does everything they need to do online. That study found that 15% of US adults are “smartphone-only” internet users – that is, they have a smartphone, but do not have a home broadband connection.

What is the best approach for encouraging continued broadband investment?

Do regulators need to raise targets? CRTC data shows that more than two-thirds of Canadian broadband subscriptions were already at speeds of 100 Mbps or higher, well above Canada’s broadband objective. Ninety percent of households had access to 100 Mbps service by year-end 2021; more than three-quarters of Canadians had access to gigabit speeds.

When there is demand for higher speeds, doesn’t this demonstrate companies will make the necessary investments? As I have said many times before, the future can be brighter for Canadian innovation and investment if the government would try harder to get out of the way.

Reviewing net neutrality

I find it interesting to see how two of Canada’s most important telecom trading partners are approaching net neutrality.

Recall, earlier this year I wrote “Canada’s policy framework for net neutrality is among the most prescriptive and restrictive.” A month later, I asked if Canada should be considering a review of its network neutrality policy.

South of the border, we see the FCC looking at imposing network neutrality regulation through a recategorization of internet services under Title II. Meanwhile, the UK’s regulator, Ofcom, recently concluded a year-long review and announced that the UK is heading in the opposite direction, revising its network neutrality guidelines to relax earlier rules. Ofcom’s Director of Connectivity, Selina Chadha, is quoted as saying:

The net neutrality rules are designed to constrain the activities of broadband and mobile providers, however, they could also be restricting their ability to develop new services and manage their networks efficiently.

We want to make sure they can also innovate, alongside those developing new content and services, and protect their networks when traffic levels might push networks to their limits. We believe consumers will benefit from all providers across the internet innovating and delivering services that better meet their needs.

In the UK, certain aspects of net neutrality are imposed under Parliamentary legislation. Ofcom is responsible for monitoring and ensuring compliance and cannot change the legislation itself.

Ofcom’s statement on the new guidelines sets out:

  • ISPs can offer premium quality retail offers: Allowing ISPs to provide premium quality retail packages means they can better meet some consumers’ needs. For example, people who use high quality virtual reality applications may want to buy a premium quality service, while users who mainly stream and browse the internet can buy a cheaper package. Our updated guidance clarifies that ISPs can offer premium packages, for example offering low latency, as long as they are sufficiently clear to customers about what they can expect from the services they buy.
  • ISPs can develop new ‘specialised services’: New 5G and full fibre networks offer the opportunity for ISPs to innovate and develop their services. Our updated guidance clarifies when they can provide ‘specialised services’ to deliver specific content and applications that need to be optimised, which might include real time communications, virtual reality and driverless vehicles.
  • ISPs can use traffic management measures to manage their networks: Traffic management can be used by ISPs on their networks, so that a good quality of service is maintained for consumers. Our updated guidance clarifies when and how ISPs can use traffic management, including the different approaches they can take and how they can distinguish between different categories of traffic based on their technical requirements.
  • Most zero-rating offers will be allowed: Zero-rating is where the data used by certain websites or apps is not counted towards a customer’s overall data allowance. Our updated guidance clarifies that we will generally allow these offers, while setting out the limited circumstances where we might have concerns.

Professor Mark Jamison of the University of Florida’s Public Utility Research Center writes that “net neutrality is a concept whose time has passed.” Instead of relying on rules tailored for the digital age, the FCC is planning to bring internet service providers under Title II, regulations originally developed for monopoly wireline telephone services. This recategorization opens the door for all the old rules to apply.

Professor Jamison notes that the proposed regulations introduce a ‘general conduct standard’ “that grants the FCC authority to prohibit anything it deems ‘unreasonable.'” He argues that in an era of such rapid technological advancement, regulators need to make decisions based on sound evidence and adopt a ‘light-handed’ regulatory approach.

Should the CRTC undertake a review similar to that which was done by Ofcom? Like the UK, would Canada find that the current environment may be restricting the ability to develop new services and manage networks efficiently?

Does the CRTC have the resource capacity to take on yet another review as it implements the government’s Online Streaming and Online News Acts?

Telecom affordability

A report from PwC Canada takes a new look at the state of telecom affordability in Canada.

The report, “Understanding the affordability of wireless and wireline services in Canada” [26-page pdf, 7.7MB], focuses on assessing three elements of Canadian telecommunications affordability:

  1. Canadian economic statistics, including telecommunications expenditure, inflation, and changing incomes.
  2. The assessment of wireless and wireline affordability in Canada, including assessing the changing prices of wireless and wireline services over time relative to increases in data consumption and changing patterns of data usage.
  3. The affordability of wireless and wireline services for Canadians against consumption and income metrics relative to global jurisdictions.

What did PwC find?

  • Canadians have been impacted by inflation, with inflation in 2021 and 2022 surpassing the rate of income growth. Prior to 2021, incomes were growing faster than inflation for every quintile except the highest.
  • Between 2017 and 2021, cellular services showed the second-largest CPI drop among the only 13 deflationary goods and services in the CPI basket, falling at a CAGR of 8.1%. Driven by the decrease in cellular services CPI, communications was also a deflationary service, with communications CPI falling by 16% from 2017 to 2022.
  • Affordability increased for all quintiles when assessing the cost of entry-level wireless and wireline plans against adjusted disposable incomes. Notably, for the lowest income quintile, the affordability of entry-level wireline plans improved by 11% between 2017 and 2021, while wireless affordability improved by 39%.
  • The price per gigabyte of wireless and wireline data fell at a CAGR of over 19% in Canada from 2017 to 2021. This is attributed to increases in data consumption significantly outpacing changes in prices, with data consumption growing at CAGRs of 24% for wireless and 28% for wireline. Among selected international peers, Canada has the second-lowest cost per gigabyte of wireline data.
  • The affordability of wireless and wireline services in Canada is on par with peer countries. As the CPI of Canadian communications has dropped, it has brought the price of services in line with international peers as a percentage of income, indicating relative affordability.
  • Together, the Canadian market and international analyses demonstrate that facilities-based competition in Canada is able to maintain a healthy telecommunications industry while delivering on network coverage, quality, and affordability.
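
Several of the report’s figures are compound annual growth rates (CAGRs), which understate the cumulative change over a period. A small sketch (the helper names are mine) shows how an annual rate compounds: an 8.1% annual decline over the four years from 2017 to 2021 implies a cumulative drop of roughly 29%.

```python
def cagr(start, end, years):
    """Compound annual growth rate between a start and an end value."""
    return (end / start) ** (1 / years) - 1

def cumulative_change(annual_rate, years):
    """Total change implied by compounding an annual rate over `years`."""
    return (1 + annual_rate) ** years - 1

# An 8.1% annual decline (CAGR of -8.1%) compounded over four years:
print(f"{cumulative_change(-0.081, 4):.1%}")  # -28.7%

# And the inverse: a value falling from 100 to about 71.3 over four
# years corresponds to a CAGR of roughly -8.1%.
print(f"{cagr(100, 71.3, 4):.1%}")  # -8.1%
```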

Earlier this year, I wrote, “Affordability is a complex and multifaceted concept that varies depending on the context and the goods or services being considered.”

The report looks at telecom affordability across various income quintiles, but it did not explicitly include a discussion of targeted affordable services such as the industry-led Connecting Families initiative. It is worth noting that Rogers recently introduced its Connected for Success 5G Wireless Program, promised as a benefit of the Shaw acquisition, and it has rolled out its broadband Connected for Success to the former Shaw footprint. TELUS offers Mobility for Good, among other targeted services, as I have described.

The PwC report lays out a fact-based narrative on telecom affordability in Canada, and paints a very different picture from the conventional wisdom.
