Social media platforms provide unprecedented opportunities for citizens, political candidates, activists and civil society groups to communicate, but they also pose new challenges for democracy. One key problem is the rise of harmful speech online, which can undermine democratic participation and debate. Harmful speech encompasses a range of problematic forms of communication, including hate speech, threats of violence, defamation and harassment.

Canada has well-established policies to address the most toxic forms of harmful speech in non-digital media, and some are applicable to harmful speech online. However, current regulatory approaches cannot keep pace with the speed, scale and global reach of harmful speech on social media platforms. Today, most decisions about Canadians’ exposure to harmful speech are made by foreign social media companies, with little public input or accountability. The result is an imbalance between the limited democratic oversight of online platforms and the significant threat that harmful speech poses to democracy.

This report explains some of the most problematic forms of harmful speech, how they affect democratic processes, and how they are currently addressed in Canada. It draws lessons from policy responses that are being developed or implemented in other countries and at the international level. It then sets out three mutually reinforcing policy recommendations for the Canadian context.