Experts from the UK’s Information Commissioner’s Office (ICO) have warned against the use of AI-powered emotional analysis systems, fearing that their drawbacks may far outweigh the benefits, at least for now.
Wearable screening tools for health monitoring and behavioural monitoring, such as tracking body position, speech, and eye and head movements, are cited as examples where the data that needs to be collected could be seen as risky.
The ICO says that this type of data, which includes subconscious behaviour, can be personally identifying, and that storing and processing it presents certain challenges for how companies can then go about using it.
Biometric data security
“While there are opportunities present, the risks are currently greater,” said ICO Deputy Commissioner Stephen Bonner. “At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.”
Then there’s the fact that “they may not work yet, or indeed ever”, according to Bonner. Learning to understand personal emotions can be extremely difficult for people; condensing this information into data that a computer could use to categorize its subjects is even more so.
Right now, the ICO wants businesses to use technology that is “fully functional, accountable, and backed by science.” The organization also says it’s “yet to see” AI technology that satisfies data protection requirements, expressing further concerns about proportionality, fairness, and transparency.
By Spring 2023, the ICO hopes to have published biometric guidance to help organizations understand the importance of security, as it sees this type of technology becoming more prevalent in the finance, fitness, education, and even immersive entertainment industries.
In the meantime, Bonner says the ICO will “continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.”