
ACLU, Human Rights Groups Call On Zoom To Drop Plans For 'Emotion Analysis' Software


Civil rights groups are calling on Zoom to ditch plans to explore "emotion analysis software" that would use artificial intelligence to analyze the mood of videoconference participants.

In an open letter to Zoom founder Eric Yuan on Wednesday, the American Civil Liberties Union, digital-rights nonprofit Fight for the Future and nearly 30 other civil liberties organizations called such technology discriminatory, manipulative and "based on pseudoscience." 

"Zoom claims to care about the happiness and security of its users but this invasive technology says otherwise," according to the letter, which called using AI to track human emotions "a violation of privacy and human rights."

The memo also warned that harvesting such "deeply personal data" could make client companies a target "for snooping government authorities and malicious hackers."

See Also: Zoom Privacy Risks: The Video Chat App Could Be Sharing More Information Than You Think

The letter was prompted by an April 13 Protocol article indicating that the popular video communications app was actively researching how to integrate AI that can read emotional cues.

"These are informational signals that can be useful; they're not necessarily decisive," Josh Dulberger, Zoom's head of product, data and AI, told Protocol. Dulberger imagined using the tech to give sales reps a better understanding of how a video meeting went, "for instance by detecting, 'We think sentiments went south in this part of the call,'" Protocol reported.

[Image: A woman on a Zoom call. Emotion-tracking software is inherently biased, civil rights groups say, because it assumes all people display the same facial expressions and body language. Credit: FG Trade]

But, the groups contend, the technology could be used to punish employees, students and other Zoom users for "expressing the wrong emotions" based on the AI's determinations. It's also inherently biased, they added, because it assumes all people use the same facial expressions, voice patterns and body language to express themselves.

"Adding this feature will discriminate against certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices," the letter read.

The groups have called on Zoom to commit by May 20 to not implementing emotion-tracking AI in its products.

Zoom didn't immediately respond to a request for comment.
