When I started my "10 things that happened…" series I decided that an update on social media legal issues would be interesting. Since then, I keep on discovering more and more publications and cases which I want to read and discuss. So there is a bit of a mixture in this article – a few top choices from my research.
The UK government has published a Code of Practice for providers of online social media platforms in relation to harmful content. This was required under section 103 of the Digital Economy Act 2017. The final version was published in April 2019, following a draft published in May 2018.
The Code provides guidance for social media platforms, and is also relevant for other sites hosting user-generated content and comments (including review websites, gaming platforms and online marketplaces). It centres around four key principles.
Also in April 2019, the UK government published its Online Harms White Paper, setting out its plans for a regulatory framework to keep UK users safe online. The harms it intends to protect against include child exploitation, cyberbullying and trolling, terrorist and extremist content, extreme pornography, harassment and cyberstalking, and hate crime.
It is proposed that the regulatory framework apply to companies that provide "services or tools that allow, enable or facilitate users to share or discover user-generated content, or interact with each other online." This includes small companies and start-ups (as well as larger social media providers).
The aim is to create a new statutory duty of care on such companies to take responsibility for the safety of their users and tackle harm caused by content or activity on their services. This will be overseen and enforced by an independent regulator.
Whilst there is still a long way to go before it becomes law, there has already been commentary on potential concerns with the content of the White Paper. See, for example, commentary by the Guardian, which notes that the paper seeks to cover both harms which are illegal and those which may be 'harmful' but are not currently illegal, yet proposes to enforce both in the same way. The BBC raises the concern that the rules may interfere with freedom of expression.
The White Paper is open for consultation until 1 July 2019.
The defendant was the chairman of the Bristol branch of the UK Independence Party (UKIP). The claimant was pictured with a Labour election candidate in a tweet from the UKIP branch Twitter account referring to him (and another pictured individual) as: "2 suspended child grooming taxi drivers". This was not true. The High Court held there was an inference that the tweet caused serious harm to the claimant’s reputation, and that the tweet had been defamatory.
The tweet was written by the vice-chairman of the branch, and the defendant had not approved the tweet. However, the defendant was held liable for the tweet as the vice-chairman was "quite clearly acting as the agent of [the defendant]". Some factors which were key to this decision include:
On the second point, there was evidence that the defendant had instructed members of the branch not to post anything on social media that was offensive, inappropriate, libellous or racist. The defendant had also given instructions to the vice-chairman not to post anything without prior approval. However, tweets had not been approved nor monitored in practice.
Damages of £40,000 were awarded. The judge said that had the libel been published in a national newspaper, damages of £250,000 or more could have been justified. However, to ensure the award was proportionate to the limited scale of the publication and difficulties in causation, he considered the lower amount appropriate in this case.
The case demonstrates the importance of taking measures to ensure social media is used appropriately by employees and others who use accounts on behalf of an organisation or individual. Such measures may include training on the risks of social media, policies on how social media may or may not be used (and by whom), approval and monitoring procedures, implementation and enforcement of such procedures, and clear consequences if they are not followed.
The judgment is available on BAILII, dated 19 December 2018.
The decision went the other way in this case involving an alleged libellous statement on Facebook. The claimant was a solicitor, and her businesses were also claimants. She claimed that a press release on a Facebook page (as well as a webinar) contained defamatory allegations. These related to the handling of litigation arising from the alleged mis-selling of off-plan property in Cyprus. The claimant was a law firm consultant who acted for claimants in this litigation.
The issues considered by the High Court included whether the wording referred to the claimant, the meaning of the wording, and whether the wording was likely to cause serious harm to the reputation of the claimant (which is the ‘serious harm’ requirement of defamation, introduced by the Defamation Act 2013).
Whilst the Court held that the meaning of the words was potentially defamatory, the 'serious harm' requirement was not met. In particular, the claimant had not demonstrated a sufficient link between what had been said and harm to her reputation.
The claim was therefore dismissed, although the judge left it open for a claim to be brought in another form which "meets or circumvents" the objections identified, or to pursue a separate claim for inducing breach of contract.
The judgment is available on BAILII, dated 16 April 2019.
This case was reported in the Law Society Gazette (and other press), as well as a blog by Yair Cohen, the solicitor representing the claimant. The case was reportedly heard in the High Court, although the articles do not appear to contain a full citation.
In July 2018, the claimant succeeded in her claim for harassment, misuse of private information and breach of confidence against the defendant, though the level of damages has not been published. The claimant and defendant had met in 2004 through the Guardian’s Soulmates dating website. Since then, the defendant had created dozens of websites publishing private information about the claimant. Amongst other actions to direct people to these websites, he created fake social media accounts on Twitter purporting to be her. The claimant sought compensation for damage to her relationships, career and personal life.
On 19 December 2018, the Advocate General of the Court of Justice of the EU (CJEU) issued his Opinion (available on the CJEU website) relating to data protection responsibilities when embedding a Facebook ‘like’ button (or similar plugin) on a website. The Opinion is taken into account by the Court when it hears the case.
The matter is a referral from the German courts relating to the interpretation of EU data protection law. The relevant law was the EU Data Protection Directive 95/46/EC, as it related to data processing activities occurring prior to the applicability of the EU General Data Protection Regulation.
The Opinion is neatly summarised in its heading:
"…the operator of a website embedding a third party plugin such as the Facebook Like button, which causes the collection and transmission of the users’ personal data, is jointly responsible for that stage of the data processing"
"The operator of the website has to provide, with regard to that data processing operation, the users with the required minimum information and obtain, where required, their consent before the data are collected and transferred"
This means that website operators are jointly responsible with Facebook (or other provider) for ensuring compliance with data protection law when using Facebook ‘Like’ buttons or similar third party plugins.
In this case, when a user clicked the Facebook ‘Like’ button on Fashion ID’s website, it transmitted information about that user’s IP address and browser string to Facebook. The Advocate General said, at that stage, "there is unity of purpose"; Fashion ID and Facebook have a shared commercial and advertising purpose in transmission of that data. Fashion ID is therefore responsible for ensuring the legitimacy of the processing and providing the user with information about the processing.
The Opinion also says that the controller’s joint responsibility should be limited to those operations for which it effectively co-decides on the means and purposes of the processing of the personal data (i.e. for which it is a joint controller). So the controller is not liable for previous and later stages of the overall "chain of processing".
It is also worth mentioning this earlier similar case involving Facebook Pages. This case was a referral from the German courts, and the CJEU judgment (available on the CJEU website) was given in June 2018.
It was held that the administrator of a Facebook fan page was a joint controller with Facebook in relation to the processing of personal data of visitors to that page (which includes visitors who are not Facebook users). The relevant data in this case was personal data collected by means of cookies installed on the computers or other devices of visitors to the fan pages. This information was then used by Facebook, in particular to improve its advertising, and to enable the fan page administrator to obtain statistics in relation to the visits to the page (using ‘Facebook Insights’).
The Court noted that joint responsibility does not necessarily imply equal responsibility, as parties may be involved at different stages of the processing and to different degrees, so that "the level of responsibility of each of them must be assessed with regard to all the relevant circumstances of the particular case".
Both these cases pre-dated the GDPR, and therefore did not apply the provisions on joint controllers under Article 26 of the GDPR. These provisions are more prescriptive as to the responsibilities of joint controllers than under the previous EU Directive. It will therefore be interesting to see if the conclusions on the extent of responsibilities of joint controllers are interpreted in a similar way to the cases referred to above. Article 26 is not necessarily inconsistent with the limitation of joint responsibility to specific processing operations, as two (or more) controllers only need be joint for the processing operations for which they have a joint purpose or means (so previous or later stages of the ‘chain of processing’ may not be joint processing operations).
Facebook now has a ‘Page Insights Controller Addendum’ which refers to the joint controller status.
The case law on linking does not relate solely to links on social media, but is interesting to discuss here nonetheless, particularly given the number of links I have included within this article!
In this case, a news portal in Hungary complained to the European Court of Human Rights (ECtHR) that it had been found liable in Hungary for linking to defamatory content. It claimed that this unduly restricted its freedom of expression (under Article 10 of the Convention for the Protection of Human Rights and Fundamental Freedoms). On 4 December 2018, the ECtHR gave its judgment and agreed that Article 10 had been violated.
In reaching its conclusion, the Court agreed that there had been interference with the Article 10 right, but also that such interference pursued a legitimate aim (to protect the rights of others). It then needed to decide whether the interference was ‘necessary in a democratic society’2. The Court concluded that it was not and, therefore, by finding the news portal liable for hyperlinking to defamatory content, there had been a disproportionate restriction on freedom of expression.
The facts of the case are important and, in particular, note that it related to journalism. The Court took into account whether the journalist was acting in good faith, and decided that "the journalist in the present application could reasonably assume that the contents, to which he provided access, although perhaps controversial, would remain within the realm of permissible criticism of political parties and, as such, would not be unlawful".
Whilst this case related to defamation, similar principles could be argued for links to other unlawful or protected content.
The Court concluded by paraphrasing the words of Tim Berners-Lee: "…hyperlinks are critical not merely to the digital revolution but to our continued prosperity – and even our liberty. Like democracy itself, they need defending".
In recent years, the Court of Justice of the EU (CJEU) has given decisions on copyright issues associated with linking, including C-466/12 Nils Svensson and Others v Retriever Sverige AB (2014) and C-150/15 GS Media BV v Sanoma Media Netherlands BV and Others (2016). The IPKat has published a useful summary table.
See also my article from 2014 in relation to Svensson, though there have since been updates to the law relating to linking and copyright.
Olivia Whitcroft, principal of OBEP, 13 June 2019
1 A citation which rolls off the tongue.
2 Note that, given its conclusion on this point, it did not consider it necessary to decide on whether the interference was ‘prescribed by law’.
This article provides general information on the subject matter and is not intended to be relied upon as legal advice. If you would like to discuss this topic, please contact Olivia Whitcroft using the contact details set out here: Contact Details