EU legal expert says online defamation enforceable worldwide

Facebook could be forced to remove ‘offensive’ content posted in Europe from across the entire website globally under EU court ruling

  • Maciej Szpunar, an advocate general at the EU Court of Justice, issued advice
  • He said that Facebook could be forced to ‘seek and identify’ any illegal content
  • The new ruling could help clarify how Facebook should police posts worldwide

Facebook could be forced to remove ‘offensive’ content posted in Europe – such as hate speech or defamation – and then search for similar posts spread elsewhere in the world, should a new EU ruling go ahead. 

Legal expert Maciej Szpunar, an advocate general at the EU Court of Justice, said that social media platforms like Facebook could be required to track down posts similar to content that an EU court has deemed illegal.

The new ruling, which could go ahead in the next few months, would be designed to help clarify how social media companies should police posts made by users worldwide. It would also seek to establish how far-reaching EU law could go to protect its social media users. 

In his legal opinion, Mr Szpunar said companies like Facebook can be ordered by a court ‘to seek and identify, among all the information disseminated by users of that platform, the information identical to the information that has been characterized as illegal.’         


The underlying dispute in the Facebook case concerns a user who shared an online article on their personal page about Austrian Greens politician Eva Glawischnig-Piesczek.

The user put a disparaging comment about Glawischnig-Piesczek under a ‘thumbnail’ photo from the article, so she took legal action to make Facebook stop the comment from spreading.

An Austrian court ruled that the comments were intended to insult and defame the politician and Facebook removed access to them in Austria.    

In his discussion on the case, Szpunar said that since European law on electronic commerce ‘does not regulate the territorial scope of an obligation to remove information disseminated via a social network platform, it does not preclude a host provider from being ordered to remove such information worldwide.’

A new ruling may therefore be needed to settle the question. 


He said that his recommendation sought to respect the balance between privacy rights, the freedom to do business and freedom of expression.

The adviser’s opinion is not legally binding, but the EU’s top court follows such guidance in most cases.

Mr Szpunar in January sided with Google in its fight against having to apply a so-called right to be forgotten globally. 

He issued his non-binding opinion to the European Court of Justice on the case, proposing that the court should limit the scope of the dereferencing that search engine operators are required to carry out to the EU.

Mr Szpunar said that the principle should be ‘balanced’ against other rights, such as data protection and privacy, as well as the ‘legitimate public interest’. 

His opinion will not be the last word on the matter, with the court still to deliberate and issue its final decision. But the court often does heed his advice.  


Facebook has disclosed its rules and guidelines for deciding what its 2.2 billion users can post on the social network. 

The full guidelines are published on Facebook’s website. Below is a summary of what they say: 

1. Credible violence

Facebook says it considers the language, context and details in order to distinguish casual statements from content that constitutes a credible threat to public or personal safety.

2. Dangerous individuals and organisations

Facebook does not allow any organizations or individuals that are engaged in terrorist activity, organized hate, mass or serial murder, human trafficking, or organized violence or criminal activity.

3. Promoting or publicising crime

Facebook says it prohibits people from promoting or publicizing violent crime, theft and/or fraud. It also does not allow people to depict criminal activity or admit to crimes they or their associates have committed. 

4. Coordinating harm

The social network says people can draw attention to harmful activity that they may witness or experience as long as they do not advocate for or coordinate harm. 

5. Regulated goods

The site prohibits attempts to purchase, sell, or trade non-medical drugs, pharmaceutical drugs and marijuana, as well as firearms. 

6. Suicide and self-injury

The rules for ‘credible violence’ also apply to suicide and self-injury. 

7. Child nudity and sexual exploitation of children

Facebook does not allow content that sexually exploits or endangers children. When it becomes aware of apparent child exploitation, it reports it to the National Center for Missing and Exploited Children (NCMEC).

8. Sexual exploitation of adults

The site removes images that depict incidents of sexual violence and intimate images shared without permission from the people pictured.

9. Bullying

Facebook removes content that purposefully targets private individuals with the intention of degrading or shaming them.

10. Harassment

Facebook’s harassment policy applies to both public and private individuals.

It says that context and intent matter, and that the site will allow people to share and re-share posts if it is clear that something was shared in order to condemn or draw attention to harassment.  

11. Privacy breaches and image privacy rights

Users should not post personal or confidential information about others without first getting their consent, says Facebook. 

12. Hate speech

Facebook does not allow hate speech because, it says, it creates an environment of intimidation and exclusion and in some cases may promote real-world violence. 

13. Graphic violence

Facebook will remove content that glorifies violence or celebrates the suffering or humiliation of others.

It will, however, allow graphic content (with some limitations) to help people raise awareness about issues.

14. Adult nudity and sexual activity

The site restricts the display of nudity or sexual activity.

It will also default to removing sexual imagery to prevent the sharing of non-consensual or underage content.

15. Cruel and insensitive

Facebook says it has higher expectations for content that is defined as cruel and insensitive.

It defines this as content that targets victims of serious physical or emotional harm. 

16. Spam

Facebook is trying to prevent false advertising, fraud and security breaches.

It does not allow people to use misleading or inaccurate information to artificially collect likes, followers or shares. 

17. Misrepresentation

Facebook will require people to connect on Facebook using the name that they go by in everyday life.

18. False news

Facebook says there is a fine line between false news and satire or opinion. 

For that reason, it does not remove false news from Facebook but instead significantly reduces its distribution by showing it lower in News Feed.

19. Memorialisation

Facebook will memorialise accounts of people who have died by adding ‘Remembering’ above the name on the person’s profile. 

The site will not remove, update or change anything about the profile or the account. 

20. Intellectual property

Facebook users own all of the content and information that they post on Facebook, and have control over how it is shared through their privacy and application settings. 

21. User requests

Facebook says it will comply with:

  • User requests for removal of their own account
  • Requests for removal of a deceased user’s account from a verified immediate family member or executor
  • Requests for removal of an incapacitated user’s account from an authorised representative

22. Additional protection of minors

Facebook complies with:

  • User requests for removal of an underage account
  • Government requests for removal of child abuse imagery depicting, for example:
      • Beating by an adult
      • Strangling or suffocating by an adult
  • Legal guardian requests for removal of attacks on unintentionally famous minors
