WASHINGTON DC: Digital civil rights group Access Now is sending a letter to Spotify CEO Daniel Ek imploring the company to abandon a technology it has patented to detect emotion, gender and age using speech recognition, Axios has learned.
Why it matters: While many of us in theory want our computers to understand who we are and what we want, the industry too often doesn’t think through how its innovations will affect different kinds of people, or what harm its data collection can cause.
In its letter, Access Now says technology that aims to determine a person’s mood and demographics based on their speech could be used to manipulate human emotion and is likely to lead to discrimination.
“This technology is dangerous, a violation of privacy and other human rights, and should be abandoned,” Access Now says in its letter to Spotify, which was obtained by Axios.
Access Now highlights four areas of particular concern.
Emotion manipulation: “Serious doubts have been raised about the scientific basis of emotion recognition technology and whether it works. While the majority of criticism has focused on inferring emotion using facial recognition systems, many of these criticisms apply equally to speech-based approaches.”
Gender discrimination: “You cannot infer gender without discriminating against trans and non-binary people. If you infer gender, according to a male-female binary from voice data, you will likely misgender trans people, and place non-binary people into a gender binary that undermines their identity.”
Privacy violations: “Based on reporting, the device would always be on, which means that it would be constantly monitoring, processing voice data, and likely ingesting sensitive information. … No one wants a machine listening in on their most intimate conversations.”
Data security: “Harvesting this kind of data could make Spotify a target for third parties seeking information, from snooping government authorities to malicious hackers.”

Our thought bubble: Information we give to companies for reasons of convenience becomes tough to claw back when they start to use it in ways that make us unhappy.
“Mood detection” and “emotional state” are particularly fuzzy categories fraught with both ethical and practical pitfalls.
Of note: Just because Spotify has received a patent doesn’t mean the company intends to build or deploy the feature.