Marie J. Brousseau
02 Oct 2024

Bill C-63, the Online Harms Act: New Rules for Social Media in Canada?

By Marie J. Brousseau, Lawyer

Bill C-63[1] is described by the Minister of Justice as “a crucial step toward creating a safer, more inclusive online environment, where social media platforms are required to actively reduce the risk of user exposure to harmful content and help prevent its spread […].”[2] Indeed, a critical component of Bill C‑63 is the due diligence obligation imposed on private companies through the enactment of the Online Harms Act. Additionally, Parliament endeavours to comprehensively address the issue of online hate by proposing amendments to both the Canadian Human Rights Act [3] and the Criminal Code.[4]

THE ONLINE HARMS ACT

Scope and definitions

The Online Harms Act introduces and defines seven types of harmful content:

  1. Content that sexually victimizes a child;
  2. Content used to bully a child;
  3. Content that advocates or counsels a child to harm themselves (e.g., eating disorders or suicide);
  4. Intimate or sexual content communicated without consent;
  5. Content that foments hatred;
  6. Content that incites violence;
  7. Content that encourages violent extremism or terrorism.

The Online Harms Act defines “content that foments hatred” as content expressing detestation or vilification of an individual or group based on a prohibited ground of discrimination, as per the Canadian Human Rights Act. Notably, content that merely expresses disdain, dislike, or offends does not qualify as “content that foments hatred”. This threshold of intensity is a recurring theme in Bill C‑63.

The Online Harms Act would apply to operators of “regulated services” defined as social media platforms with a certain number of users, as determined by regulation. As currently drafted, the Online Harms Act defines “social media” as a website or application whose primary purpose is to facilitate online communication and content sharing between users.

Duties of Operators of Social Media

Part 4 of the Online Harms Act sets out the duties of operators. The following overview is not exhaustive, but key duties include the duty to act responsibly, the duty to protect children, and the duty to make certain content inaccessible.

Sections 54 to 63 of the Online Harms Act lay out the duty to act responsibly. Section 55 would require operators to implement measures to reduce the risk of user exposure to harmful content. For example, operators would have to provide blocking tools and publish user guidelines and standards of conduct.[5]

Sections 64 to 66 of the Online Harms Act describe the duty to protect children. This duty would include, for instance, the obligation for operators to incorporate design features such as parental control.[6]

Sections 67 to 71 of the Online Harms Act outline the duty to make certain content inaccessible. Operators of social media would have to block access to content deemed intimate content communicated without consent, or content that sexually victimizes a child, within 24 hours of the content being flagged.

The Canadian Digital Security Commission

If the Online Harms Act comes into force in its current form, it would create the Canadian Digital Security Commission (“Commission”).[7] The Commission would have the power to order operators to comply with the Online Harms Act and to impose administrative monetary penalties for non‑compliance.[8] Additionally, the Commission could receive and investigate complaints regarding intimate content or content that sexually victimizes a child[9] and would have the authority to order operators to make such content inaccessible.[10]

RELATED AMENDMENTS

While Bill C-63 calls for amendments to various acts, it specifically addresses the issue of hate by proposing amendments to the Canadian Human Rights Act and the Criminal Code.

Amendments to the Canadian Human Rights Act—Hate Speech by means of the Internet

Repealed in 2013, section 13 of the Canadian Human Rights Act used to target the use of telecommunication to communicate hate messages. Bill C-63 would reinstate a modified version of section 13 to create the discriminatory practice of communicating hate speech by means of the Internet or any other means of telecommunication. For the purposes of section 13, hate speech would express detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. However, a communication that solely discredits, humiliates, hurts, offends or expresses disdain or dislike would not be considered hate speech.[11] It is important to note that the revised section 13 would not apply to individuals or entities that operate a social media service.[12]

Amendments to the Criminal Code—Hate propaganda and offence motivated by hatred

In its current wording, Bill C-63 increases the maximum sentences for hate propaganda offences.[13] Additionally, it would introduce a definition of “hatred” as the emotion that involves detestation or vilification and that is stronger than disdain or dislike, aligning with the definitions in the Online Harms Act and the Canadian Human Rights Act. Consistent with that definition, the tabled amendments would set a threshold of intensity for public incitement of hatred and wilful promotion of hatred by specifying that a statement does not incite or promote hatred “solely because it discredits, humiliates, hurts or offends.”[14] Furthermore, Bill C-63 proposes to create a specific hate crime offence: committing an offence motivated by hatred.[15] Thus, under the proposed new section 320.1001, anyone who commits an offence under any Act of Parliament that is motivated by hatred would be guilty of an indictable offence and liable to imprisonment for life.

Conclusion

Bill C-63 ambitiously aims to hold social media platforms accountable while striking a careful balance between online safety and freedom of expression. To this end, Parliament is contemplating a wide spectrum of legislative changes that would amend fundamental existing statutes and create a new one. At the moment, Bill C‑63 is still at second reading in the House of Commons.

The full text of Bill C-63 is available here.


[1] C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, and to make consequential and related amendments to other Acts, 1st Sess, 44th Parl, 2024 (first reading 26 February 2024) (“Bill C-63, An Act to enact the Online Harms Act”).

[2] House of Commons Debates (Hansard), 44th Parl, 1st Sess, No. 327 (June 7, 2024).

[3] RSC 1985, c H-6.

[4] RSC 1985, c C-46.

[5] Bill C-63, An Act to enact the Online Harms Act, cl 1, ss 57-58.

[6] Ibid, s 64.

[7] Ibid, s 10.

[8] Ibid, ss 94, 96 to 105.

[9] Ibid, ss 11(c), 81.

[10] Ibid, ss 81(4)(b), 82(5).

[11] Bill C-63, An Act to enact the Online Harms Act, cl 34, amending the Canadian Human Rights Act, s 13.

[12] Ibid.

[13] Public incitement of hatred (section 319(1)), wilful promotion of hatred (section 319(2)), advocating genocide (section 318), and wilful promotion of antisemitism (section 319(2.1)).

[14] Bill C-63, An Act to enact the Online Harms Act, cl 14(5), amending the Criminal Code, s 319(8).

[15] Bill C-63, An Act to enact the Online Harms Act, cl 15, amending the Criminal Code, s 320.1001.
