Financial Sector and Data Analytics
- Alister Pearson, Senior Policy Officer at ICO
25.05.2021 04:30 pm
The Information Commissioner’s Office (ICO) is urging the finance sector to build in data protection from the start when considering data analytics projects.
The ICO recently launched a toolkit designed to help organisations comply with data protection law when using data analytics.
The toolkit guides users through the key compliance considerations, including the information rights of the public. Addressing these considerations, in turn, helps build public trust and confidence in the use of data analytics.
What do we mean by data analytics?
We have defined data analytics as the use of software to automatically discover patterns in data sets containing personal data and to use those patterns to make predictions, classifications, or risk scores. Data analytics can help the finance sector analyse large volumes of data, for example to infer how likely someone is to repay a loan, to detect suspicious activity, and to support other risk management and analysis tasks.
Integral to data analytics are algorithms, which are a set of mathematical instructions given to computer systems to complete tasks.
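To make the idea of a risk-scoring algorithm concrete, here is a purely illustrative sketch. The features, weights, and threshold are invented for illustration and do not come from the ICO or any real lender; production systems learn such parameters from historical data.

```python
import math

def repayment_risk_score(income: float, existing_debt: float,
                         missed_payments: int) -> float:
    """Toy example: combine a few applicant features with fixed
    (invented) weights into a probability-like repayment score."""
    # Weighted sum of features (a simple linear model)...
    z = 0.00005 * income - 0.0001 * existing_debt - 0.8 * missed_payments + 1.0
    # ...squashed into the range 0..1 with the logistic function.
    return 1 / (1 + math.exp(-z))

# Hypothetical applicant: the score is an input to a decision,
# not the decision itself.
score = repayment_risk_score(income=30000, existing_debt=5000, missed_payments=1)
```

Even in this toy form, the data protection questions the toolkit raises are visible: the inputs are personal data, and the output may be used to make a decision with real consequences for the individual.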
The ICO is aware that there is a growing trend across sectors to use data analytics, and in some cases, a specific category of advanced algorithm – referred to as artificial intelligence or AI – to complete tasks. The ICO has also produced two pieces of guidance – Explaining decisions made with AI guidance in partnership with the Alan Turing Institute, and the guidance on AI and data protection – on the challenges that AI poses to individuals, which provide the foundations for the newly launched toolkit.
ICO toolkit can help
Building in data protection from the start when considering data analytics is not only the law but it’s a vital step in building public trust and confidence in the technology and how the finance sector is using people’s data.
The ICO’s new toolkit takes organisations through the data protection points they would need to think about from the outset of any project involving data analytics.
Data protection officers can help
When considering using data analytics, organisations should get their data protection officer (DPO) or other information governance professionals (or both) involved from the earliest stages. They can help identify and address risks of non-compliance with data protection law, as well as ensure that organisations can demonstrate their compliance.
Finance sector professionals should also take into account how the processing using data analytics interacts with their organisation’s information governance procedures.
Assess the impact
In the vast majority of cases, the use of data analytics to process personal data is likely to result in a high risk to individual rights and freedoms, and therefore trigger the legal requirement to carry out a data protection impact assessment (DPIA). In any case, if a major project involves the use of personal data, the ICO considers it good practice to do a DPIA.
It can be easy to conclude that the impact of a data analytics solution will be positive. However, organisations should consider all types of possible impact. For example, a system may overstate the likelihood that a person will be able to repay a loan and a bank may subsequently issue a loan to that individual based on the system’s output. This may lead to that person being saddled with a debt that they have no way of paying.
To reduce the likelihood of these cases occurring, organisations should consider whether to include a meaningful human review before a final decision is made. A human reviewer may be able to detect where a system's output is statistically inaccurate and overrule it before a negative impact can occur.
In decisions which have legal or similarly significant effects, individuals must be able to challenge a decision made about them using a solely automated system and be able to request a human review.
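One common pattern for building in human review is to route borderline automated outputs to a reviewer rather than deciding automatically. The sketch below is an assumption-laden illustration (the threshold and decision labels are invented, not ICO requirements); the point is that a low-confidence score never produces a final decision on its own.

```python
# Invented threshold for illustration: scores this far from certain
# are referred to a human reviewer instead of being decided automatically.
CONFIDENCE_THRESHOLD = 0.9

def decide(score: float) -> str:
    """Route a repayment score to an outcome, deferring borderline
    cases to meaningful human review."""
    if score >= CONFIDENCE_THRESHOLD:
        return "approve"
    if score <= 1 - CONFIDENCE_THRESHOLD:
        return "decline"
    return "refer_to_human_review"
```

Note that routing borderline cases is not sufficient on its own: individuals must still be able to challenge any decision with legal or similarly significant effects and request a human review, whatever the score.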
If a DPIA identifies a high risk, and no measures can be taken to sufficiently reduce this risk, organisations need to consult the ICO before going ahead with the processing.
Data sharing
If processing using data analytics involves sharing personal data with other organisations, bear in mind that data protection law is not a barrier to data sharing. It is a way of enabling data sharing that has relevant safeguards in place to protect people’s rights.
Data sharing agreements are likely to be required where data is to be shared with other controllers or processors. A good data sharing agreement will also help to demonstrate compliance, which is key to data protection law.
Automated decision making
There are specific provisions in data protection law covering people’s rights where processing involves solely automated individual decision-making, including profiling, with legal or similarly significant effects. Organisations need to consider whether the use of data analytics falls into this category of processing, and if it does, be aware of the safeguards, including letting people know that a decision has been made about them in this way.
Where a human is involved in a decision, it is important to ensure that their involvement is meaningful. If a human is just rubber-stamping every automated decision without meaningfully reviewing it, then this will be classed as a solely automated decision. Human reviewers should have meaningful influence on the decision, including the authority and competence to go against an automated recommendation. Reviewers must weigh up and interpret the automated decision, consider all available input data, and take other relevant factors into account.
Toolkit available at ico.org.uk
Once organisations have used the toolkit, a short report will be created that suggests practical actions and provides links to additional guidance that will help improve data protection compliance.
The toolkit is available on the ICO website at www.ico.org.uk/data-analytics-toolkit