The Responsibility of Online Platforms: a Marginal Challenge in Québec

The European Union recently expressed its view on the role that online platforms should play in combating illegal online content. In Québec, this concern was identified early on, when legislators laid down guidelines at the turn of the century. However, this head start did little to keep Québec in the lead, as the province is now seemingly lagging behind.

Online platforms are part of our daily lives. They include social networks, search engines, content-sharing and rating sites, blogs, and more. Yet we see an increasing dissemination of illegal online content inciting hatred, violence and even terrorism.

Although the concept of “illegal content” is broad and varies by jurisdiction, some content is illegal virtually everywhere. We instinctively think of the many forms of cyberbullying, widespread among minors, such as flaming (brief disparaging messages), harassment (a barrage of violent remarks), disparagement (harm to online reputation), masquerade (identity theft), happy slapping (media lynching) or outing (violation of privacy). The list goes on and, regrettably, it is only getting longer. In light of this, the Old Continent is sending a message to all that can be summed up as follows: that which is illegal offline is also illegal online.

The European Union takes the lead

For several years, the European Union has been concerned about illegal online content, and it has adopted several binding and non-binding measures to curb its spread (guidelines, codes of conduct, etc.). Since public intervention will not suffice on its own, online platforms must take a stand and become fully involved in addressing this issue. The importance of taking such a stand was at the heart of the European Commission's Communication of September 28, 2017, entitled “Tackling Illegal Content Online: Towards an Enhanced Responsibility of Online Platforms.”

In this document, the Commission calls on online platforms to redouble their efforts in proactively preventing, detecting, and removing illegal content. More specifically, it recommends that platforms adopt several measures, such as:

  • setting up mechanisms enabling users to report illegal content and investing in automatic detection technologies;
  • cooperating more closely with the public law enforcement authorities;
  • working with “trusted flaggers,” namely entities specialized in finding, detecting and identifying illicit content;
  • removing illegal content as quickly as possible (the Commission, in fact, is considering the introduction of fixed timeframes for removal of such content);
  • demonstrating greater transparency by publishing content management policies and statistics; and
  • adopting measures against repeat offenders, namely by using and developing automatic tools to prevent previously deleted content from reappearing.

These recommendations are currently non-binding, but the Commission reserves the right to strengthen the regulatory framework if the online platforms are not sufficiently proactive in the coming months. Its evaluation in this regard is expected to take place by May 2018.

Québec is letting itself fall behind

Meanwhile, nothing new on the Western front. The issue of illegal online content seems to bear little importance in Québec, despite it being just as prevalent here as it is worldwide. This, however, has not always been the case. Indeed, in the early 2000s, Québec's legislature was at the forefront, creating specific liability guidelines for online platforms.

In addition to the Civil Code of Québec, the Act to establish a legal framework for information technology sets out certain obligations and limitations regarding the liability of a “service provider acting as an intermediary.” It states, for instance, that there is no obligation to actively monitor content and that online platforms are, in principle, not liable for the activities carried out by their users. This does not mean, however, that the platforms are automatically exempt from liability.

A platform can indeed be held liable if it has actual or potential knowledge of illegal activities and does not act promptly to prevent the continuation of such activities. In other words, there is no proactive obligation to be kept informed, but if one is aware, one must act quickly.

Such laws aim to strengthen the confidence of Internet users. Without playing the active role of “Internet police,” the platforms must still take appropriate action when they have knowledge of illegal content. Some believe that platforms should also behave in a “prudent and diligent” manner, in accordance with common law principles of civil liability.

Such a requirement is broad and offers no clear indication of what compliance requires. Should online platforms systematically allow users to report illegal content, publish clear policies on content management, or introduce automatic detection tools for certain words (suicide, pornography, etc.)? It seems to us that all these measures should vary according to the content generated, the target audience, the objectives and degree of sophistication of the platform, the history of illegal content, and so on.

Ultimately, for the time being, this situation evokes a well-known narrative: the European Union, looking over its shoulder, saying to Québec: “Slow and steady wins the race.” Luckily, with the finish line still far ahead, there remains enough time to follow in our European neighbour’s footsteps by spelling out the platforms’ responsibility with regard to illegal online content.

This content has been updated on 7 April 2018 at 20 h 28 min.
