
After the jury announced its verdict in New Jersey, I watched an Associated Press video of the statement read by Tyler Clementi’s father. Saddened and clearly carrying a heavy heart, he nevertheless looked to the future in a way that most of us could not have managed had we lost a child the way he lost Tyler. Then I glanced down at the YouTube comments: nearly every one included a gay slur or other offensive language, and I was disgusted. The comments had nothing to do with the content.
Racist and hateful online comments demean writers, video-makers, and people who thoughtfully share digital content, and it’s becoming tiresome. Masquerading as run-of-the-mill responses at the end of articles and videos, they are actually cyber-bullies’ remarks, left here and there with the goal of offending and hurting others. The time is long past for comment and blog editors everywhere, and especially at Google’s YouTube, to set up and enforce guidelines.
I know that the United States Constitution guarantees freedom of speech; however, what we are observing is not freedom of speech but the freedom to run off at the mouth and bully others in ways that have nothing to do with the content. As a result, we are teaching all sorts of silent lessons, the kind we don’t really intend to teach, to young people as they grow up.