On 7 October 2022, the Information Commissioner’s Office (ICO) posted a news alert regarding the risks arising from the use of biometric technologies. The alert warns that the ICO views certain emotional analysis technologies as riskier than traditional biometric technologies, because it does not consider the current algorithms sufficiently developed to detect emotional cues accurately. The regulator sees this as creating a risk of systemic bias, inaccuracy and even discrimination.
The emotional analysis technologies currently flagged by the ICO are those that collect, store and process a range of personal data, including subconscious behavioural or emotional responses and, in some cases, special category data. Deputy Commissioner Stephen Bonner has been quoted as saying: “[…] While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.”
This warning sits alongside the heightened risk and data security requirements that already apply to biometric data, which is successfully used by various industries; for example, the financial sector uses facial recognition to verify identities by comparing photo IDs with a selfie. Biometric technologies are technologies that process biological or behavioural characteristics for the purposes of identification, verification, categorisation or profiling.
Biometric Guidance to be published in spring 2023
To help companies understand what is regarded as fair use of biometric data and the associated compliance requirements, the ICO is expected to publish Biometric Guidance in spring 2023. The ICO strongly emphasises that biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used. This uniqueness sets a high threshold for companies using such information to embed a ‘privacy by design’ approach, thereby reducing the risk factors.
See more on the ICO’s news alert here.
We will continue to monitor these developments, along with any more specific recommendations and guidance, as they are released by the ICO.
Please contact Jose Saras and Joanna Coombs-Huang if you have any questions about the above.
The material in this article is only for general review of the topics covered and does not constitute legal advice. No legal or business decision should be based on its content.