A new paper published in IEEE Technology and Society Magazine examines the shortcomings of the techniques and tools currently used to identify toxic content on social media and proposes potential solutions. The authors argue that human workers remain essential to moderation, and that the support provided to them must be continually re-evaluated.
