Many SaaS suppliers use personal data, collected on behalf of SaaS customers, in an anonymised form for their own purposes, for example for benchmarking. The UK Information Commissioner’s Office (ICO) Anonymisation Code and more recently the Article 29 Working Party’s Opinion on Anonymisation provide guidance on how to check that personal data is actually anonymous.
If you are a SaaS supplier using anonymised personal data, you should follow the recommendations in these two guides to ensure that you are properly anonymising data; otherwise you could be found to be using personal data in breach of the Data Protection Act (DPA).
Any SaaS supplier processing personal data must comply with the Data Protection Act 1998 (DPA), which regulates the disclosure and use of personal data.
Personal data is defined in the DPA as data which relates to a living individual who can be identified from those data. Accordingly, information, or a combination of information, that does not relate to and identify a living individual is not personal data.
Information which is not personal data (i.e. anonymised data) can be disclosed and used more freely, with fewer legal restrictions, as the DPA does not apply to it. However, SaaS suppliers should ensure that any personal data they have anonymised and intend to use is truly anonymous.
Most data is anonymised using:
- randomisation techniques, which alter the veracity of the data in order to remove the strong link between the data and the individual. If the data is sufficiently uncertain then it can no longer be linked to a specific individual; or
- generalisation techniques, which consist of generalising/diluting the attributes of individuals by modifying the respective scale or order of magnitude (e.g. a region rather than a city, a month rather than a week).
Such methods use the following tools to make data anonymous:
- noise addition;
- differential privacy; and
- aggregation/k-anonymity.
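To make the techniques above concrete, here is a minimal, purely illustrative sketch (not legal advice, and not drawn from the ICO or Article 29 guidance itself) of noise addition and generalisation applied to a toy customer record. All field names and the city-to-region lookup are hypothetical.

```python
import random

# Hypothetical lookup used to generalise a city up to its region.
CITY_TO_REGION = {"Manchester": "North West", "Bristol": "South West"}

def add_noise(value: int, spread: int = 3) -> int:
    # Noise addition (a randomisation technique): perturb a numeric
    # attribute so the exact figure no longer points to one individual.
    return value + random.randint(-spread, spread)

def generalise(record: dict) -> dict:
    # Generalisation: dilute attributes by widening their scale --
    # a region rather than a city, an age band rather than an exact
    # age, a month rather than an exact date.
    decade = (record["age"] // 10) * 10
    return {
        "region": CITY_TO_REGION.get(record["city"], "Unknown"),
        "age_band": f"{decade}-{decade + 9}",
        "signup_month": record["signup_date"][:7],  # keep YYYY-MM only
    }

record = {"city": "Manchester", "age": 34, "signup_date": "2014-05-17"}
print(generalise(record))
# {'region': 'North West', 'age_band': '30-39', 'signup_month': '2014-05'}
print(add_noise(record["age"]))  # e.g. 32 (random, within +/-3 of 34)
```

Note that neither tool on its own guarantees anonymity; as the guidance discusses, the result must still be tested against the re-identification risks below.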
In order to ensure that data is properly anonymised, SaaS suppliers should carry out regular risk assessment audits. SaaS suppliers should test their anonymisation and randomisation techniques against the following re-identification risks:
- singling out – is it still possible to isolate some or all records which identify an individual in the data set?
- linkability – is it still possible to link at least two records concerning the same individual or a group of individuals?
- inference – can information concerning an individual be inferred from the values of a set of other attributes?
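The singling-out test above can be approximated with a simple k-anonymity check: group records by their quasi-identifying attributes and find the smallest group. A minimal sketch, with a hypothetical toy data set (field names assumed, not taken from the guidance):

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    # Count how many records share each combination of quasi-identifier
    # values, then return the smallest group size. If the result is 1,
    # at least one individual can still be singled out.
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(groups.values())

dataset = [
    {"region": "North West", "age_band": "30-39"},
    {"region": "North West", "age_band": "30-39"},
    {"region": "South West", "age_band": "40-49"},
]
print(k_anonymity(dataset, ["region", "age_band"]))
# 1 -> the South West record is unique, so singling out is possible
```

A check like this only addresses singling out; linkability and inference need separate testing, typically against other data sets an attacker could plausibly hold.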
Audits should also assess new risks to re-identification, as technological developments may make anonymised data re-identifiable in the future.
Pseudonymised Data and Anonymised Data
SaaS suppliers should note that the Article 29 Working Party does not accept that pseudonymised data is equivalent to anonymised data. This is because pseudonymisation still allows:
- an individual to be singled out; and
- the individual to be linkable across different data sets.
Where any uncertainty exists about whether or not re-identification can occur, SaaS suppliers should obtain prior consent from individuals to the use of their anonymised data or they run the risk of breaching the DPA.
Irene Bodle is an IT lawyer specialising in SaaS agreements with over 15 years' experience in the IT sector. If you require assistance with any SaaS, ASP or software-on-demand contracts, or any other IT legal issues, contact me:
Speaker at the Berlin CloudConf 2013.
Other related articles:
- SaaS Agreements – Essential Elements
- SaaS Agreements – Essential Elements – SLAs Explained
- SaaS Agreement – Data Protection – New General Data Protection Regulation (GDPR)
- SaaS Agreements – FAQs – Prism
- SaaS Agreements – Data Protection – Which Law Applies
- SaaS Agreements – Data Protection – Microsoft must disclose data on EU server
- SaaS Agreements – Data Protection – Update on new draft EU Data Protection Regulation
- SaaS Agreements – Data Protection – The Patriot Act
- SaaS Agreements – Data Protection – FISA customer concerns
- SaaS Agreements – Data Protection – HIPAA
- SaaS Agreements – Data Protection – Safe Harbor Still Adequate
- SaaS Agreements – Data Protection – Cyber Security Issues
- SaaS Agreements – Data Protection – BYOD
- SaaS Agreements – Data Protection – Recent ICO Fines
- SaaS Agreements – Data Protection – Sub-Contractors, Model Clauses
- SaaS Agreements – Data Protection – Liability for Loss of Backup Tapes
- SaaS Agreements – Data Protection – Anonymising Data
- SaaS Agreements – Data Protection – Transfer of Data Outside the EEA
- SaaS Agreements – Data Protection – Policies and Procedures
- SaaS Agreements – Data Protection – German Customers and Data Processing Agreements
- SaaS Agreements – Data Protection – Safe Harbor, German Customers
- SaaS Agreements – Data Protection – Customer Privacy Policies
- SaaS Agreements – Data Protection – New Proposed EU Rules Part 2
- SaaS Agreements – Data Protection – New Proposed EU Rules Part 1
- SaaS Agreements – Data Protection – IT Security Requirements