Hiring more non-English-language moderators and safety-testing algorithms are among UK trio's proposals
Social media platforms will be urged to protect minorities and help prevent ethnic violence by hiring non-English-language moderators and conducting safety tests on their algorithms, under proposals for a UN global code of conduct.
A British trio whose work has influenced the regulatory framework behind the online safety bill in the UK has sent a detailed plan for tackling toxic content on social media and video platforms to a UN official drawing up anti-online-hate guidelines.