Right-to-be-forgotten is limited to the EU. The Court of Justice of the European Union (CJEU) handed down its judgment in Google’s appeal against the CNIL – the French Data Protection Authority – about the territorial boundaries of the ‘right-to-be-forgotten’. The CNIL imposed a penalty of €100,000 on Google for its refusal to remove objectionable search results from the localized editions of its search engine around the globe (e.g., google.ca for Google Canada), not just from its European versions (e.g., google.fr for Google France) and its generic edition (google.com).
The CJEU mostly sided with Google, holding that it is only required to remove results from its European and generic editions, rather than globally. The CJEU arrived at this outcome because foreign jurisdictions do not necessarily strike the same balance that EU law does between a person’s fundamental right to privacy and the public’s right to information and freedom of expression.
However, the CJEU also held that Google must apply measures that effectively prevent, or at least seriously discourage, European users from gaining access to the removed links through the other editions of the search engine where the results still appear. The court also indicated that the GDPR does not preclude national regulators or courts in EU member states from nevertheless ordering global removal where national legislation (beyond the GDPR) warrants it.
CLICK HERE to read the CJEU’s judgment in Google LLC v. Commission nationale de l’informatique et des libertés (CNIL).
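Purely by way of illustration, the kind of geo-based measure the court describes could take roughly the form of the sketch below, which filters delisted links out of results served to users who appear to be located in the EU/EEA. All of the names, the sample data and the IP-to-country lookup are assumptions made for this sketch; it does not describe Google’s actual systems or anything mandated by the judgment.

```python
# Hypothetical sketch, for illustration only: filtering delisted links out of
# search results when the requesting user appears to be located in the EU/EEA.
# The country lookup, sample data and names are assumptions; they do not
# describe Google's actual implementation or anything required by the judgment.

EU_EEA = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR", "DE", "GR",
    "HU", "IE", "IS", "IT", "LI", "LT", "LU", "LV", "MT", "NL", "NO", "PL",
    "PT", "RO", "SE", "SI", "SK", "ES",
}

# Links delisted for a given query following a successful removal request.
DELISTED = {
    "john doe": {"https://example.com/old-report"},
}

# Toy IP-to-country table; a real system would query a geolocation database.
SAMPLE_GEO = {"203.0.113.7": "FR", "198.51.100.4": "US"}


def country_of(ip_address: str) -> str:
    """Return the country code inferred for an IP address (stubbed here)."""
    return SAMPLE_GEO.get(ip_address, "US")


def filter_results(query: str, results: list[str], client_ip: str) -> list[str]:
    """Drop delisted links only when the user appears to be in the EU/EEA."""
    if country_of(client_ip) not in EU_EEA:
        return results  # outside the EU/EEA, results are left untouched
    removed = DELISTED.get(query.lower(), set())
    return [url for url in results if url not in removed]


# Example: the delisted link is filtered for a French IP, kept for a US one.
results = ["https://example.com/old-report", "https://example.org/profile"]
print(filter_results("John Doe", results, "203.0.113.7"))   # one result
print(filter_results("John Doe", results, "198.51.100.4"))  # both results
```

The point of the sketch is only that the filtering is conditioned on the user’s apparent location rather than on which edition of the search engine they happen to use; real-world location inference and the scope of delisting are considerably more involved.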
Right-to-be-forgotten applies to search results containing sensitive data. In another CJEU judgment delivered on the same day, the court held that the GDPR’s restrictions on the processing of sensitive categories of data (such as information revealing a criminal record, political opinions, religious or philosophical beliefs, or sex life) also apply to search engine operators when their search results link to websites that contain the sensitive data.
According to the CJEU, when faced with a removal request, a search engine operator must take into account the specific circumstances, such as the seriousness of the interference with the data subject’s fundamental rights to privacy and protection of personal data, in order to ascertain whether including the link in the list of results is strictly necessary to protect internet users’ freedom of information. In particular, when an individual’s request to remove search results relates to criminal proceedings brought against them, the search engine operator must balance the seriousness of the offense in question, the progress and outcome of the proceedings, the time elapsed, the content and form of the publication, and the consequences of publication for that person.
CLICK HERE to read the CJEU’s judgment in John Does v. Commission nationale de l’informatique et des libertés (CNIL).
The Council of the EU advances the ePrivacy regulation. The Council of the European Union has put forward an updated draft proposal for the regulation on privacy in electronic communications, otherwise known as the ePrivacy regulation.
Among other proposed arrangements in the draft regulation, providers of communication services would be allowed to process communication data for detecting, deleting and reporting child abuse material, as long as the technology used for the processing meets certain requirements aimed at safeguarding user privacy. For example, providers would be prohibited from storing the monitored communication data over time, except where such material is detected by a hashing mechanism that compares the suspected material against a database of hashes of material previously and reliably identified as constituting child abuse.
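As a rough illustration of the hash-comparison safeguard described above, the sketch below checks content against a database of hashes of previously identified material and retains nothing unless a match is found. The hash function (SHA-256), the database and the reporting hook are all assumptions made to keep the example self-contained; real deployments rely on specialized (often perceptual) hashing tools rather than plain cryptographic hashes.

```python
import hashlib

# Illustrative only: a database of hashes of material previously and reliably
# identified as abusive, represented here as hex digests. The single sample
# entry is simply the SHA-256 of b"test", used as a stand-in.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def matches_known_material(content: bytes) -> bool:
    """Return True if the content's hash appears in the known-hash database."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES


def report_to_authority(digest: str) -> None:
    """Placeholder for the provider's reporting workflow (assumed)."""
    print("match found; reporting digest", digest)


def scan_attachment(attachment: bytes) -> None:
    """Scan one attachment; report on a match, otherwise retain nothing."""
    if matches_known_material(attachment):
        report_to_authority(hashlib.sha256(attachment).hexdigest())
    # No match: the scanned data is not stored, reflecting the draft's
    # safeguard against retaining monitored communications over time.


scan_attachment(b"test")           # matches the sample hash and is reported
scan_attachment(b"harmless data")  # no match, nothing retained or reported
```

The design point the draft emphasizes is the one reflected in scan_attachment: scanned communications that do not match the database are not stored.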
The proposal also deals with the use of cookies and other device trackers, specifying the cases in which website operators may use cookies. These include cookies needed to measure audience statistics about the website’s end-users, whether the measurement is carried out by the website operator itself or by an external service provider (such as Google Analytics) that has a GDPR-compatible agreement with the operator; cookies used for data protection purposes, for preventing fraudulent conduct, or for detecting technical faults; and cookies used to determine the end user’s location when they contact emergency services.
The proposal for the ePrivacy regulation continues to be deliberated across the EU’s political and legislative institutions.
CLICK HERE to read the current version of the draft ePrivacy regulation proposal.
Italian regulator approves GDPR code of conduct for consumer credit agencies. The Italian data protection authority (the ‘Garante’) has approved a first-of-its-kind GDPR ‘Code of Conduct’ for consumer credit reporting agencies, governing the processing of personal data of individuals located in Italy by entities that professionally manage credit information systems used by banks and other providers of credit services.
The code’s objective is to contribute to the correct application of the GDPR by credit reporting agencies, by spelling out the agencies’ duties and obligations under the GDPR. A draft of the code was proposed by an association of Italian consumer credit reporting agencies and was approved by the Italian regulator after the regulator’s comments and requested revisions were implemented.
The approved code states that processing personal data in credit information systems for credit scoring purposes serves legitimate interests of such credit agencies: the correct measurement of creditworthiness and credit risk, the correct evaluation of debtors’ reliability, and the prevention of fraud and identity theft. Therefore, these agencies need not obtain consent to process the data. However, according to the code, data subjects must receive a complete and clear privacy notice from the credit agencies. The code also defines data retention periods, mechanisms for handling data subject requests to exercise their GDPR rights, restrictions on automated scoring and credit decision-making, and the establishment of a monitoring body for the code.
CLICK HERE to read the Code of Conduct for Consumer Credit Agencies (in Italian).
Fines for improper data security and unsolicited marketing emails. The Polish data protection authority imposed a fine of more than €645,000 on an online retailer whose consumer database, spanning more than two million data subjects, suffered two data breaches due to deficient data security measures, particularly ineffective access control and authentication.
The Berlin data protection authority issued a fine of approximately €195,000 against a German food delivery company for violating the GDPR’s obligations to handle data subject requests: requests for information about the processing of their data, requests to exercise their right to have their data deleted, and requests to opt out of the processing of their data. The fine also covers the sending of advertising emails to data subjects who had not consented to receive them.
CLICK HERE to read the press release published by the Berlin data protection authority (in German).
CLICK HERE to read the Polish data protection authority’s decision (in Polish).