No new laws required to hold social media accountable for illegal content


September 21, 2020

New report to be discussed by expert panel today.

TORONTO, SEPTEMBER 21, 2020 – In the eyes of Canadian law, social media companies like Facebook and YouTube are arguably publishers, opening the platforms to legal liability for user-generated content, according to Platform for Harm, a new research report released this morning by the watchdog group FRIENDS of Canadian Broadcasting.

The report builds on a legal analysis provided by libel defence lawyer and free speech advocate Mark Donald. Longstanding common law holds that those who publish illegal content are liable for it, in addition to those who create it. According to Donald, this liability is triggered when publishers know that content is harmful but publish it anyway, or when they fail to remove it after being notified.

“Our elected officials don’t need to create new laws to deal with this problem. They don’t need to define harmful content, police social media, or constrain free expression in any new way. All government needs to do is apply existing laws. If a judge decides that content circulated on social media breaks the law, the platform that publishes and recommends that illegal content must be held liable for it,” says FRIENDS’ Executive Director Daniel Bernhard.

Social media platforms have long argued that they are simple bulletin boards that display user-generated content without editorial control, and that it is impossible to detect illegal content among the 100 billion posts made daily.

Yet Facebook and other social media platforms tell advertisers that they do indeed have the technology to recognize the content users post before it is published and pushed out to others.

In fact, the report finds that platforms like Facebook routinely exercise editorial control by promoting content users have never asked to see, including extreme content that would land any other publisher in court: for example, the promotion of illegal acts such as the Christchurch, NZ massacre. They also conceal content from users without consulting them, another form of editorial control.

“Facebook and other social media platforms have complaints processes through which they are alerted to potentially illegal or otherwise objectionable content. Yet it is their own community standards, not the law, that dictate whether they will remove a post. Even then, Facebook employees say that the company does not apply its own standards when prominent right-wing groups are involved,” says Dr. George Carothers, FRIENDS’ Director of Research.

Platform for Harm is the subject of a panel discussion today from 12:00 pm to 1:00 pm ET, co-sponsored by FRIENDS and the Centre for International Governance Innovation (CIGI). Moderated by Rita Trichur, Senior Business Writer and Columnist at The Globe and Mail, the panel features platform governance experts, lawyers and a leading political figure, who will share their opinions and firsthand experiences and discuss how to balance free speech and the rule of law in relation to harmful content online. Panelists are Daniel Bernhard, Executive Director, FRIENDS of Canadian Broadcasting; Catherine McKenna, MP for Ottawa Centre; Taylor Owen, Senior Fellow, CIGI; and Heidi Tworek, Professor of History and Public Policy, UBC.


For information or to book an interview: Jim Thompson 613-447-9592


ABOUT FRIENDS

FRIENDS of Canadian Broadcasting is a watchdog group advocating for Canadian public broadcasting, journalism and storytelling, on air and online. FRIENDS enjoys the support of 364,000 Canadians and is not affiliated with any broadcaster or political party.
