Critics accuse AI startup of racism over ‘accent translation’ tech

  • Tech startup Sanas has been accused of racism over its “accent translation” technology.
  • Artificial intelligence and tech industry experts say the startup’s technology legitimizes racism and amounts to a form of “digital whitening.”
  • But some call center agents told Insider they think the technology will enhance their day-to-day work.

“Accent translation” startup Sanas faced accusations of racism and discrimination last week after critics said it manipulates non-American accents to sound “whiter.” The company uses speech recognition technology to change users’ accents in near real time; its main target market appears to be foreign call center employees.

Sharath Keshava Narayana, co-founder and COO of Sanas, denied that the startup’s technology was discriminatory and told Insider that the company had always intended to expand its translation model to include other accents. According to Keshava Narayana, a demo on the company’s website, in which the technology translates an Indian accent into a standard American accent, shows only an early version of the model.

“It’s not just an American having trouble understanding someone from India and vice versa,” Keshava Narayana told Insider. “If we continue to scale the product, and if we start to see more and more targeted accents, we believe this will be a localized solution.”

Sanas has been testing translation models in other countries such as India and the Philippines, and plans to bring accent translation to Latin America and South Korea, according to the startup.

However, some experts in the tech industry have called the startup’s product a form of “digital whitening.” Nakeema Stefflbauer, an AI and tech angel investor and founder of FrauenLoop, a women-led computer programming network, told Insider that the problem with Sanas’ response was that “accent suggests a sense of power and belonging.”

“When this was commercialized, there was only one direction for everyone,” she said. “It’s not about understanding, it’s about comfort — for groups that don’t want to understand, empathize with, or interact with people who are different from them. This technology doesn’t exist for the comfort of call center workers.”

She added that because Sanas pitched the technology to clients in the Global South as a tool to better understand and communicate with Americans and Western Europeans, “whether intentional or not, it was a way to reinforce racialized hierarchies, a one-way ‘solution.’”

AI and technology industry experts and call center workers spoke to Insider about what they see as the cultural costs and potential benefits of Sanas. While the company says the goal of its technology is to make people around the world sound more “local” when they’re on the phone, Stefflbauer and others in the AI field worry it’s another step toward a homogenized world, one that Silicon Valley has been pushing toward for a long time.

“This is trying to tell us what the future will look like, and how we should experience voice and communicate with people online,” Stefflbauer said. “Who are we supposed to communicate with, and who do we never hear?”

Sanas’ founding team.

Tech industry experts call accent ‘translation’ a form of ‘digital whitening’

Sanas, which has raised $32 million in funding, says on its website that its goal is to help people sound “more local and more global.” In an interview with the BBC, Keshava Narayana said 90 percent of the company’s staff and all four founders were immigrants, and rejected criticism that the company was trying to make the world sound “white and American.”

But Mia Shah-Dand, founder of Lighthouse3 and the Women in AI Ethics initiative, told Insider that, as an Indian immigrant with a non-American accent, she found Sanas’ announcement “very triggering,” especially as someone who has been “taunted” and discriminated against because of her accent.

The technology tries to erase people’s uniqueness and tell them they’re “not good enough,” she said.

“It feels like everything in Silicon Valley, as long as it’s legitimized by Stanford or MIT,” she said. “People will accept racism, accept sexism, as long as the person doing it belongs to one of these prestigious universities.”

Shah-Dand added that Sanas’ product reinforces a power dynamic that “returns to the days of colonialism.” Rather than addressing the root causes of racism and discrimination, she said, “accent translation” leans toward a form of “whitening” akin to skin-lightening: in many historically colonized countries, people feel pressure to lighten their skin to match European beauty standards.

“It’s Silicon Valley’s version of digital whitening,” Shah-Dand said. “This technology isn’t making the world a better place; it’s amplifying, abetting, and just monetizing all the hate and racism without really trying to fix anything.”

Stefflbauer told Insider that she found Sanas’ technology “really disappointing and disturbing,” especially amid a growing workplace culture of bringing one’s “whole self” to work.

“Only certain people can bring their whole selves, while anyone who doesn’t fit this mythical norm isn’t invited to bring theirs,” she said, referring to the 2018 dark surrealist comedy “Sorry to Bother You,” in which a Black telemarketer finds career success only after adopting a “white” voice.

“This is really another example of what we’re facing in trying to make the tech industry and the products and services it produces reflect the real world,” Stefflbauer said.

She added that she doesn’t see how the technology will actually address racial bias in any way.

“It didn’t even try to address that in its solution,” she said. “It basically provides support and cover for people who mistreat anyone with an accent, and lets them continue to do so.”

Call center agents tell Insider they face racialized hostility

The founders of Sanas said they came up with the idea for the startup after a friend at Stanford University was underperforming in a call center job because of his heavy Central American accent.

Call center agents who spoke with Insider said their jobs could be brutal — doubly so if they had a racially distinctive accent or name.

“Unfortunately, there are a lot of people in this world who feel they’re better than you, or choose how to speak to you, when they hear your accent,” said Dafina Swann, who has worked in call centers for more than five years.

Swann, who is from Trinidad and Tobago, said she had received many “hostile” and “negative comments” from callers demanding to speak to an American. She has also heard of colleagues being called racist names like the n-word and being told they were “not human, but Black.”

To minimize the racism they face, some call center agents told Insider they have tried imitating customers’ accents and even changing their names. Sometimes the instruction to change names came from the agent’s manager or employer.

“After I started introducing myself as Michael O’Connor, my performance ratings in customer surveys improved—all green, green, green,” Osama Badr, a call center agent from Egypt, told Insider.

Sanas co-founder Keshava Narayana said he had a similar experience working in a call center, where he trained for six weeks in accents and was told to change his name to “Ethan.”

“There are events that stay with you for a long time, and this is one of them,” he told Insider.

Some worry that manipulated voices could herald a homogeneous future for the tech industry

Shah-Dand, who said she doesn’t buy the arguments defending the technology, said people are capable of understanding different accents; call center workers are unfairly abused because callers see them as “beneath” them.

“There are a lot of people with strong accents, like Boutros Boutros-Ghali,” Shah-Dand said, referring to the former UN secretary-general. “But because they’re in a position of power, you try to understand them.”

In her work, Stefflbauer said, she has been thinking about what digital life will look like in the next 10 to 20 years, and she worries about what technologies like Sanas’ could portend.

“I’m seeing more and more examples of digital life where no one is Black, no one is brown, no one has an accent, no one has a history outside of North American mythological ideals,” Stefflbauer said. “The question is: Do we want to export this mentality, to bring this pain to everyone? Because it absolutely is.”

Other AI technologies, including facial recognition, have also faced accusations of racism and homogenization.

“Who would want to take a selfie on Instagram and have their face automatically changed to look like someone of a different race?” she said. “This is essentially that.”

But call center workers, who have to deal with racist comments in their day-to-day jobs, say a solution like Sanas’s could be a blessing.

“It definitely makes my job easier. Everyone wants to be understood,” Swann said. “There’s a job to be done, and it would be great if something could be implemented to make it easier.”
