Jun 18, 2021
Laura Berrill

‘Deep concerns’ over use of live facial recognition tech

Tags: big data, data security, facial recognition technology
Concerns are around the use of live facial recognition technology combined with social media and other big data

The UK Information Commissioner says she is "deeply concerned" that live facial recognition (LFR) may be used "inappropriately, excessively or even recklessly" if combined with social media and big data.

New guidance for companies and public organisations using the technology has also been published.

Risks to privacy

Elizabeth Denham set out her concerns in a blog post. Although she believes facial recognition technology can be useful, for example in enabling people to unlock mobile phones or set up online bank accounts, she warned that the risks to privacy increase when people's faces are scanned and processed by algorithms in real time and in public places.

She added that we should be able to take our children to a leisure centre, visit a shopping mall or take a city tour without having to have our biometric data collected and analysed at every step.

The technology could, however, be used to create instant profiles of people for serving personalised adverts, or to match shoppers' faces against watch-lists of known shoplifters.

In a separate Commissioner's Opinion, the ICO revealed it was aware of proposals to use live facial recognition in billboards that might even remember faces, allowing companies to track individuals' visits across different locations and match them against their consumer preferences.

Bias and GDPR

The ICO adds that companies also need to be aware of the dangers of bias in facial recognition systems and the risks of misidentification.

The Commissioner's Opinion sets out standards for the use of live facial recognition by firms and public bodies. It also reveals that of six ICO investigations into LFR systems, none of those that went live were fully compliant with data protection law.

As a result, all of the organisations then chose to stop, or not proceed with the use of the technology.
