It turns out that a lot more legal stuff happened whilst I was on leave than I could fit into 10 articles…so my article number 10 is a mixed bag of other things that caught my eye.
The California Consumer Privacy Act of 2018 will be effective from 1 January 2020. It is reported to be the first US state law of its kind, containing similarities to the GDPR, including in relation to transparency of activities using consumer information, rights for data subjects, and financial penalties for breaches. It appears only to apply to information about California residents (compared to the GDPR, which applies in the EU to all personal data processed by an organisation, regardless of who the data subjects are).
Following a review of gender stereotyping by the Advertising Standards Authority (ASA), new rules were introduced into the CAP Code (for non-broadcast advertising) and BCAP Code (for broadcast advertising) from 14 June 2019. These rules state that direct marketing communications and advertisements "must not include gender stereotypes that are likely to cause harm, or serious or widespread offence." Adverts for Volkswagen and Philadelphia cream cheese were the first to be banned under this rule on 14 August 2019. For more information, see the ASA website.
I was originally going to report on the draft new ISO 27552, and suddenly I couldn’t find it any more. That’s because in the final version (published in August 2019) it changed its name to ISO/IEC 27701. The standard provides requirements and guidelines for privacy information management, as an extension to ISO/IEC 27001 and ISO/IEC 27002 (information security management). The description refers to ‘personally identifiable information’ (PII) rather than ‘personal data’, so I am not clear whether it is intended to be applied to identifiers or wider personal data sets. See the ISO website for more information.
Data protection and privacy concerns with biometric data systems have hit the news recently, including:
There has also been press coverage on concerns about the use of facial recognition technology at King's Cross Station, which the ICO is also investigating.
Learning points for those considering use of biometric data technologies include: conduct a proper DPIA in advance (and involve the ICO if needed), identify a clear legal basis and seek informed consent where required, ensure any software algorithms produce fair results (see also Article number 6 in my series about artificial intelligence), assess how to apply the principle of data minimisation, and create a bespoke ‘appropriate policy document’ where required by the Data Protection Act 2018.
In April 2019, there was a lot of press coverage about the introduction of new consent forms to allow police investigators access to a victim’s mobile phone or other device in cases of alleged rape or other sexual offences. Concerns were raised that this intruded upon the victim’s privacy, and that if the victim objected, they may be informed that the case could not be taken forward. In a press statement, the CPS indicated that it was working with victim groups and the ICO to ensure the approach "offers the necessary balance between the requirement for reasonable lines of inquiry and the complainant’s right to privacy".
The UK Intellectual Property Office has blogged about copyright and the GDPR here. The blog explains that whilst the photographer may have copyright and associated moral rights in their photos, where there are people in the image, the GDPR and the individuals’ rights also need to be addressed. The article also links to a case study highlighting the importance of metadata to evidence copyright.
The Schrems challenge against the transfer of personal data by Facebook from the EU to the US has been bouncing back and forth between the Irish and EU Courts. The case will have an impact on the validity of both the EU-US Privacy Shield and the EU model contract clauses. A full decision of the Court of Justice of the European Union is expected in 2020.
In August 2018, the Law Commission published a consultation on the electronic execution of documents. This was followed by its final report on 3 September 2019. A key conclusion is that: "An electronic signature is capable in law of being used to execute a document (including a deed) provided that (i) the person signing the document intends to authenticate the document and (ii) any formalities relating to execution of that document are satisfied". Such formalities may include a requirement for witnesses to be physically present, but the report recommends consideration of legislative reform to allow video witnessing. The report is available here.
Maybe I should have included this within Article number 4 in my series…but here it is now. Dawson-Damer v Taylor Wessing [2019] EWHC 1258 (Ch) (17 May 2019) reached the next stage of its journey in the High Court since the Court of Appeal judgment in 2017. The Court of Appeal had remitted issues for further determination by the High Court.
To recap, the case concerned a subject access request (SAR) made by an individual to a law firm in connection with legal proceedings in the Bahamas against one of the law firm’s clients. The court considered the extent to which controllers can:
In 2015 the High Court ruled that it would not exercise its discretion to enforce the SAR. It considered that it would not be reasonable and proportionate for the law firm to carry out the search for personal data on the basis that the purpose of a SAR is not to enable discovery of documents that may assist in litigation (see also my previous article on this).
However, on appeal to the Court of Appeal in 2017, the Court said: "…disproportionate effort must involve more than an assertion that it is too difficult to search through voluminous papers". In addition, there is no ‘no other purpose’ rule which acts as an automatic bar to the exercise of the Court’s discretion to enforce a SAR. This meant that the law firm had to carry out a search, rather than no search at all.
The High Court in 2019 was left with deciding, on the facts, the extent to which the law firm must search through the documents to find personal data which may need to be disclosed. The High Court directed the law firm to carry out specific additional searches.
A couple of other points arising from the case:
In the case of FSHC Group Holdings Ltd v GLAS Trust Corporation Ltd [2019] EWCA Civ 1361 (31 July 2019), the Court of Appeal clarified the circumstances in which a contract can be rectified for common mistake. To clarify what this means: where, as a result of a mistake by both parties, a written contract does not reflect the common intention of those parties, a court can change the terms to bring them in line with that common intention.
In this case, the Court concluded that, before such rectification could take place, it must be shown that either:
Common mistake may therefore be difficult to demonstrate in court, and it will, of course, be preferable for parties to check that the written terms accurately reflect their intention before entering into any agreement.
Olivia Whitcroft, principal of OBEP, 11 September 2019
Other than in relation to Intelligence services processing under Part 4 of the DPA 2018.
This article provides general information on the subject matter and is not intended to be relied upon as legal advice. If you would like to discuss this topic, please contact Olivia Whitcroft using the contact details set out here: Contact Details