Jun 21, 2021

ICO warns of privacy concerns on the use of LFR technology

Organisations must demonstrate that their use of live facial recognition (LFR) is fair, necessary, and proportionate, says the Information Commissioner’s Office

Live facial recognition (LFR) technology should not be used simply because it is available and must be used for a specific purpose, the Information Commissioner’s Office (ICO) has warned.

“I am deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively, or even recklessly. When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant,” said Elizabeth Denham, the UK’s Information Commissioner.

Denham explained that with any new technology, building public trust and confidence in the way people’s information is used is crucial so the benefits derived from the technology can be fully realised.

“It is not my role to endorse or ban a technology but, while this technology is developing and not widely deployed, we have an opportunity to ensure it does not expand without due regard for data protection,” Denham added.

The Information Commissioner’s Office has said it will work with organisations to ensure that their use of LFR is lawful, and that a fair balance is struck between their own purposes and the interests and rights of the public. It will also engage with Government, regulators and industry, as well as international colleagues, to make sure data protection and innovation can continue to work hand in hand.

What is live facial recognition? 

Facial recognition is the process by which a person can be identified or recognised from a digital facial image. Cameras are used to capture these images and FRT software measures and analyses facial features to produce a biometric template. This typically enables the user to identify, authenticate or verify, or categorise individuals. 

Live facial recognition (LFR) is a type of FRT that allows this process to take place automatically and in real-time. LFR is typically deployed in a similar way to traditional CCTV in that it is directed towards everyone in a particular area rather than specific individuals. It can capture the biometric data of all individuals passing within range of the camera indiscriminately, as opposed to more targeted “one-to-one” data processing. This can involve the collection of biometric data on a mass scale and there is often a lack of awareness, choice or control for the individual in this process. 
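The “one-to-many” identification described above can be illustrated with a toy sketch: a probe template is compared against every enrolled template, and the closest match above a confidence threshold is returned. The names, template vectors, and threshold below are hypothetical; real FRT systems extract high-dimensional embeddings from face images using neural networks, not three-element lists.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two biometric template vectors (a toy
    stand-in for the embeddings a real FRT system would extract)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(probe, gallery, threshold=0.9):
    """One-to-many search: compare a probe template against every
    enrolled template; return the best match above the threshold,
    or None if no enrolled template is a confident match."""
    best_name, best_score = None, threshold
    for name, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical enrolled templates.
gallery = {"alice": [0.9, 0.1, 0.3], "bob": [0.1, 0.8, 0.5]}

print(identify([0.88, 0.12, 0.29], gallery))  # close to alice's template
print(identify([0.5, 0.5, 0.5], gallery))     # no confident match: None
```

In an LFR deployment this comparison runs automatically, in real time, against every face that passes the camera, which is why the scale of the data collection, rather than the matching arithmetic itself, is the ICO’s central concern.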

 

Why is biometric data particularly sensitive?

Biometrics are physical or behavioural human characteristics that can be used to digitally identify a person to grant access to systems, devices, or data. Biometric data extracted from a facial image can be used to uniquely identify an individual in a range of different contexts. It can also be used to estimate or infer other characteristics, such as their age, sex, gender, or ethnicity.

The security of biometric authentication data is vitally important, even more so than that of passwords, since a password can easily be changed if it is exposed. A fingerprint or retinal scan, however, is immutable.

The UK courts have concluded that “like fingerprints and DNA [a facial biometric template] is information of an ‘intrinsically private’ character.” LFR can collect this data without any direct engagement with the individual. Given that LFR relies on the use of sensitive personal data, the public must have confidence that its use is lawful, fair, transparent, and meets the other standards set out in data protection legislation.


Jul 14, 2021

Discord buys Sentropy to fight against hate and abuse online

Sentropy is joining Discord to continue fighting against hate and abuse on the internet

Discord, a popular chat app, has acquired the software company Sentropy to bolster its efforts to combat online abuse and harassment. Sentropy monitors online networks for abuse and harassment, then offers users a way to block problematic people and filter out messages they don’t want to see.

First launched in 2015 and currently boasting 150 million monthly active users, Discord plans to integrate Sentropy’s products into its existing toolkit and will also bring the smaller company’s leadership team aboard. Discord currently uses a “multilevel” approach to moderation; as of May 2020, its Trust and Safety (T&S) team, dedicated to protecting users and shaping content moderation policies, comprised 15% of its workforce.

“T&S tech and processes should not be used as a competitive advantage,” Sentropy CEO John Redgrave said in a blog post on the announcement. “We all deserve digital and physical safety, and moderators deserve better tooling to help them do one of the hardest jobs online more effectively and with fewer harmful impacts.”

Cleanse platforms of online harassment and abuse

Redgrave elaborated on the company’s natural connection with Discord: “Discord represents the next generation of social companies — a generation where users are not the product to be sold, but the engine of connectivity, creativity, and growth. In this model, user privacy and user safety are essential product features, not an afterthought. The success of this model depends upon building next-generation Trust and Safety into every product. We don’t take this responsibility lightly and are humbled to work at the scale of Discord and with Discord’s resources to increase the depth of our impact.”

Sentropy launched out of stealth last summer with an AI system designed to detect, track and cleanse platforms of online harassment and abuse. The company emerged then with $13 million in funding from notable backers including Reddit co-founder Alexis Ohanian and his VC firm Initialized Capital, King River Capital, Horizons Ventures and Playground Global.

“We are excited to help Discord decide how we can most effectively share with the rest of the Internet the best practices, technology, and tools that we’ve developed to protect our own communities,” Redgrave said.
