Imagine someone walks up to the grieving daughter of a suicide victim, hands her a doctored photo of her dead parent and tells her she is the reason he killed himself.
Or picture someone walking into the newsroom of The New York Times and handing prints of pornographic, violent images to members of the staff, and predominantly to women.
Such scenes are hard even to visualize, because they so severely violate our social norms. They are repulsive, and anyone who could commit such acts while looking the target in the eye would fall far outside the spectrum of normal human interaction.
Yet the presence of Internet trolls makes clear that many people who would never perform these actions in person are eager to perform their equivalent once they are safely on the anonymous end of a screen.
Zelda Williams, the late Robin Williams’ daughter, recently left Twitter “for a good long time, maybe forever,” in response to abuse from a pair of users in the wake of her father’s death. The two users were eventually suspended. The Washington Post reports Williams had posted angrily on Tumblr in June, calling out commenters who had directed various slurs her way on social media. While Twitter has a way to report abuse after the fact (as does Tumblr), there is no way to filter hateful speech before it arrives in a user’s notifications.
As for the other example, the staff of Jezebel criticized its parent company, Gawker Media, earlier this month for failing to take swift and effective action to prevent sexually explicit and violent images from flooding the site’s comment section. Jezebel staff had no recourse except to manually dismiss the comments and ban the offending accounts - though, since IP addresses weren’t recorded on commenters’ accounts, those responsible could simply create new anonymous accounts and continue their activities. Gawker has since instituted a system that sorts “approved” replies from “pending” replies, warning readers that pending replies may contain graphic material.
This change is a useful step. I have written before about the value of moderating user comments on news sites and blogs. But trolling - the practice of spewing hateful, offensive or otherwise hurtful material at a given target - is a problem that reaches beyond the comment sections of major news sites.
Whitney Phillips, a lecturer at Humboldt State University who has written a book about bad behavior online, told The New York Times, “As long as the Internet keeps operating according to a click-based economy, trolls will maybe not win, but they will always be present.” Some have suggested that the ability to comment online should be restricted, or in some cases eliminated altogether; others have countered that such measures may end up silencing real, useful discussions that have no equivalent outlet elsewhere. Nor would such a solution have done much to help Williams, since Twitter is itself a means of communication; there is no way to simply shut down the comments there.
While Williams and other victims of trolling can sometimes remove themselves, the Internet is so pervasive today that disengaging can carry a heavy professional, social or even financial cost. As Amanda Hess observed in an article for the Pacific Standard, “for many women, steering clear of the Internet isn’t an option.” When law enforcement simply suggested she remove herself from Twitter, Hess quoted Nathan Jurgenson, who said, “Telling a woman to shut her laptop is like saying, ‘Eh! Just stop seeing your family.’”
Although not all trolling or online bullying springs from anonymity, a great deal of it does. The idea that an anonymous person with an audience will often respond with rude, inappropriate or hostile behavior is borne out in poorly moderated news site comment sections and on social media. Massively multiplayer online games, often called MMOs, have garnered a reputation for attracting abuse, especially directed at female players. Blizzard, the company behind World of Warcraft, at one point decided to force users to post under their real names in an attempt to raise the tone of discourse. Blizzard backtracked when users made clear that privacy concerns made the tactic unpalatable; perhaps ironically, the women who had found in anonymity some shield against trolls expressed concern about being forced to reveal their gender.
While banishing anonymity altogether introduces new problems - including, but not limited to, restricting discourse from civil but privacy-minded individuals who might have worthwhile contributions - it is also clear that bad online behavior flourishes when posters face no real consequences for acting out. In 2004, psychologist John Suler identified six characteristics of online interaction that can lead users’ online behavior to differ radically from their offline behavior. Some of those characteristics, such as the asynchronicity that means someone may read or respond to a comment hours after it is made, are inherent parts of the online experience. Anonymity, though, is one over which those running social media platforms or sites with comment sections have some degree of control.
In The Times, Phillips argues that trolls are a symptom of larger societal problems, not the disease itself. Yet as long as racism, misogyny, homophobia and other forms of bigotry remain, it is important to be aware of the ways in which online disinhibition will bring them to the forefront. Robust systems for reporting abuse, careful comment moderation and limits on anonymity will not eradicate online misbehavior, but are essential for curbing it.
Posted by Larry M. Elkin, CPA, CFP®