Hearing What the Moderators Actually Do
Thursday, February 28, 2019, 20:22, by TheMacObserver
There has been much discussion in recent times about what social media companies and online platforms are doing to monitor content. For example, Facebook has moved to moderate anti-vaxxer content on its platform. Apple News is, of course, curated by editors. We often hear from the heads of companies about moderation, but not from the people who actually do it. Medium's Head of Trust and Safety spoke to people who have been on the front line of this work at a variety of tech companies. The conversation sheds light on how decisions about content get made.
This is where the trust and safety team comes in. Most companies operating an online platform have one. It sometimes goes by other names — "content policy" or "moderation" — and comes in other flavors, like "community operations." Whatever the name, this is the team that encourages social norms. They make platform rules and enforce them. They are at once the judges and janitors of the internet. This is not the job of a few dozen techie randos, but of tens of thousands of workers, both full-time employees and contractors.
https://www.macobserver.com/link/hearing-what-the-moderators-actually-do/