Delfi vs Estonia ECHR judgment

Posted: October 10th, 2013 | Filed under: human rights, law, privacy, thoughts

Today, the European Court of Human Rights issued its long-awaited judgment in the case of Delfi v. Estonia. The case has implications for many human rights issues, including hate speech, anonymous speech, the liability of internet portals for comments on their websites, and so on.

As I currently understand the law, the provider is not liable when a notice-and-take-down system is in place. This means that, for example, YouTube is not liable for derogatory or infringing content uploaded to its service until it receives a take-down notice (which can be submitted by anyone), after which the content is made unavailable. This system is, of course, not perfect, because it also allows material to be blocked that might not be infringing at all. It also means that the service provider must have the knowledge and expertise to distinguish infringing content from non-infringing content, which poses specific problems. The lack of clarity and inconsistency of the notice-and-take-down system was also pointed out by the intervening third party, the Helsinki Foundation for Human Rights in Warsaw (which I had the pleasure of visiting and meeting with last weekend). See their written comments to the court (PDF).
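To make the mechanics concrete, here is a minimal sketch of such a notice-and-take-down workflow. The class and method names are hypothetical and only illustrate the principle described above (publish without pre-screening, take down once a notice arrives); they are not any portal's actual implementation.

```python
# Minimal sketch of a notice-and-take-down workflow (hypothetical names):
# content is published without pre-screening and is made unavailable
# as soon as a take-down notice is received.

from dataclasses import dataclass, field


@dataclass
class Comment:
    comment_id: int
    text: str
    visible: bool = True                      # published by default
    notices: list = field(default_factory=list)


class NoticeAndTakeDown:
    def __init__(self):
        self.comments = {}

    def publish(self, comment_id: int, text: str) -> None:
        # No pre-screening: under this model the provider is not yet liable.
        self.comments[comment_id] = Comment(comment_id, text)

    def submit_notice(self, comment_id: int, reporter: str, reason: str) -> None:
        # Anyone can submit a notice; the content is taken down on receipt.
        comment = self.comments[comment_id]
        comment.notices.append((reporter, reason))
        comment.visible = False               # made unavailable once notified


if __name__ == "__main__":
    portal = NoticeAndTakeDown()
    portal.publish(1, "An allegedly defamatory comment")
    portal.submit_notice(1, reporter="anyone", reason="defamation")
    print(portal.comments[1].visible)         # False: taken down after notice
```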

The Delfi case involved a number of derogatory comments against Mr Leedo, who owns the company that provides ferry services between mainland Estonia and the two larger islands. The comments were extremely offensive (and are reproduced in English in para. 14 of the judgment).

The case will be commented on in detail by many people, but here are my preliminary observations:

1. Not all news stories require the same degree of moderation of comments by the publisher. The court held that Delfi should have expected a number of hateful or defamatory comments due to the nature of the article and should thus have exercised extra caution:

86. /—/ Therefore, the Court considers that the applicant company, by publishing the article in question, could have realised that it might cause negative reactions against the shipping company and its managers and that, considering the general reputation of comments on the Delfi news portal, there was a higher-than-average risk that the negative comments could go beyond the boundaries of acceptable criticism and reach the level of gratuitous insult or hate speech. It also appears that the number of comments posted on the article in question was above average and indicated a great deal of interest in the matter among the readers and those who posted their comments. Thus, the Court concludes that the applicant company was expected to exercise a degree of caution in the circumstances of the present case in order to avoid being held liable for an infringement of other persons’ reputations.

This is an important distinction, as the same could be said about articles that deal with other sensitive topics, such as ethnic minorities, migrants and LGBT rights, areas in which hate speech currently goes largely unchecked. Does this mean that the publisher should put extra resources into moderating comments on these topics, or ban commenting altogether?

2. The notice-and-take-down system has to be effective in order to exclude the publisher's liability. Although there was a word-based automatic filtering system as well, the portal relied on a notice-and-take-down system, which allowed users to report comments they thought were offensive or illegal. Delfi had made it very easy to report offensive content with a single click (unlike Twitter, which is only now starting to work out a reasonable system for reporting hate speech). The court thought that these systems were not adequate:

89. The Court notes that in the interested person’s opinion, shared by the domestic courts, the prior automatic filtering and notice-and-take-down system used by the applicant company did not ensure sufficient protection for the rights of third persons. The domestic courts attached importance in this context to the fact that the publication of the news articles and making public the readers’ comments on these articles was part of the applicant company’s professional activity. It was interested in the number of readers as well as comments, on which its advertising revenue depended. The Court considers this argument pertinent in determining the proportionality of the interference with the applicant company’s freedom of expression. It also finds that publishing defamatory comments on a large Internet news portal, as in the present case, implies a wide audience for the comments.

3. Liability of the publisher is related to the fact that submitters of comments were unable to modify or delete comments later. This means, according to the court, that Delfi had a substantial degree of control over the comments:

/…/ The Court further notes that the applicant company – and not a person whose reputation could be at stake – was in a position to know about an article to be published, to predict the nature of the possible comments prompted by it and, above all, to take technical or manual measures to prevent defamatory statements from being made public. Indeed, the actual writers of comments could not modify or delete their comments once posted on the Delfi news portal – only the applicant company had the technical means to do this. Thus, the Court considers that the applicant company exercised a substantial degree of control over the comments published on its portal even if it did not make as much use as it could have done of the full extent of the control at its disposal.

4. It is up to the publisher to decide which means to use in order to stop hate speech and defamatory comments. The court considered it an important factor that the domestic courts did not prescribe a specific remedy (e.g. moderation before publishing). This means that it is up to the publisher to ensure that the existing protection system is adequate and effective.

5. The court does not ban anonymous expression or find that it is the cause of hate speech or defamation. The only thing the court said was that since users were not registered, it would have been more difficult to sue them directly. Since Delfi allows comments from users who are not registered, it bears a ‘certain responsibility’ for those comments. The court goes on to state:

92. The Court is mindful, in this context, of the importance of the wishes of Internet users not to disclose their identity in exercising their freedom of expression. At the same time, the spread of the Internet and the possibility – or for some purposes the danger – that information once made public will remain public and circulate forever, calls for caution. The ease of disclosure of information on the Internet and the substantial amount of information there means that it is a difficult task to detect defamatory statements and remove them. This is so for an Internet news portal operator, as in the present case, but this is an even more onerous task for a potentially injured person, who would be less likely to possess resources for continual monitoring of the Internet. The Court considers the latter element an important factor in balancing the rights and interests at stake.

This approach by the court is problematic, because it might lead to anonymous commenting options being removed for fear of liability. One wonders whether similar comments made on a website that does not have a media company behind it would lead to the same conclusion. Given the worrying trends of possibly illegal surveillance of internet activities, data-mining and the like, does it not make sense for people not to identify themselves and to try to increase their anonymity?

Finally, the court considered that the fine of 320 euros was not in any way disproportionate given the fact that Delfi was one of the largest Internet portals in Estonia. It seems that the court could have accepted a much higher financial penalty.

My advice to the news portals would be:

  1. Make the notice-and-take-down system more effective by training the moderators and informing users better;
  2. For stories for which hateful and defamatory comments are expected (high-risk stories), use a quick pre-moderation system or proactively and constantly screen comments for hate speech and defamation (a minimal screening sketch follows this list);
  3. Clarify the policies and increase transparency of the notice-and-take-down system;
  4. Allow users to modify or delete their own comments while remaining anonymous.
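For point 2, proactive screening could start with something as simple as a word-based filter that holds flagged comments for human review before publication, roughly the kind of automatic filtering mentioned above. The sketch below is only illustrative: the word list, threshold logic and hold/publish decision are my own assumptions, not anything Delfi actually used.

```python
# Illustrative word-based pre-moderation: comments matching a blacklist are
# held for human review instead of being published directly. The word list
# and the hold/publish decision are assumptions made for this example.

import re

BLACKLIST = {"scum", "burn", "drown"}   # hypothetical terms; per-site in practice


def screen_comment(text: str) -> str:
    """Return 'publish' if no blacklisted word is found, otherwise 'hold'."""
    words = set(re.findall(r"\w+", text.lower()))
    return "hold" if words & BLACKLIST else "publish"


if __name__ == "__main__":
    print(screen_comment("I disagree with the ferry pricing"))  # publish
    print(screen_comment("He is scum and should drown"))        # hold
```

A filter like this obviously catches only the crudest abuse, which is why it would complement, not replace, human moderation and an effective notice-and-take-down procedure.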

I have previously written that anonymous comments are not evil in themselves and that removing the possibility would be too severe an infringement of freedom of expression, one that would not even have the intended effect. I am not happy that the court did not use this case to explore in more detail the issues related to anonymous expression online. I am a bit disappointed by the case made by Delfi, which they could have argued more powerfully, but the government’s lawyers should be commended for their thorough work.

The case does show very well the complexity of the issue and the difficulty of balancing different rights in the context of the Internet. Better and clearer rules in this area are a necessity.


