AI Writing Assistants Guilty of ‘Cultural Stereotyping and Language Homogenization’ – Cornell Study
Monday, May 5, 2025, 22:02, by eWeek
AI-powered writing tools promise to democratize communication, helping people write faster, more clearly, and with greater confidence. But as these AI tools go global, a growing body of research warns they may be reshaping cultural identity in subtle but significant ways.
AI writing tools homogenize global voices

A new study from Cornell has identified an unexpected effect of the international reach of AI assistants: they homogenize language, making billions of users in the Global South sound more like Americans. In the study, participants from the US and India who used an AI writing assistant produced more similar writing than those who wrote without one. Indian participants also spent more time editing the AI’s suggestions to better reflect their cultural context, which ultimately reduced the tool’s overall productivity benefit.

Cultural stereotyping through predictive suggestions

“This is one of the first studies, if not the first, to show that the use of AI in writing could lead to cultural stereotyping and language homogenization,” said Aditya Vashistha, assistant professor of information science and the senior author of the study. “People start writing similarly to others, and that’s not what we want,” Vashistha added. “One of the beautiful things about the world is the diversity that we have.”

How the Cornell study was designed

The study gathered 118 participants, about half from the US and half from India. Participants were asked to write about cultural topics; half in each country wrote independently, while the other half used AI assistants. Indian participants using the AI writing assistant accepted about 25% of the tool’s suggestions, while American writers accepted roughly 19%. However, the Indian participants were far more likely to modify the suggestions to fit their cultural writing style, which made the tool much less helpful to them.

Western norms embedded in AI defaults

When participants wrote about their favorite food and holiday, for example, the AI assistant recommended distinctly American favorites, including pizza and Christmas. And when writing about their favorite actors, Indian participants who started typing “S” received suggestions of Shaquille O’Neal or Scarlett Johansson rather than famous Bollywood actor Shah Rukh Khan. The likely reason for this Western bias is that AI assistants like ChatGPT are powered by large language models (LLMs) developed by US tech companies, yet they are now used globally, including in the Global South, home to roughly 85% of the world’s population.

Rising concerns about ‘AI colonialism’

The researchers suggest that Indian users now face “AI colonialism,” with the bias of these assistants presenting Western culture as superior. This has the potential to change not only the way non-Western users write but also how they think. “These technologies obviously bring a lot of value into people’s lives,” said Paromita Agarwal, a co-author of the study. “But for that value to be equitable and for these products to do well in these markets, tech companies need to focus on cultural aspects, rather than just language aspects.”

TechnologyAdvice contributing writer Michael Kurko wrote this article.
https://www.eweek.com/news/ai-writing-tools-cornell-study/