Gerrard, Ysabel. “Beyond the Hashtag: Circumventing Content Moderation on Social Media.” New Media & Society, vol. 20, no. 12, Dec. 2018, pp. 4492–4511.
Ysabel Gerrard, a lecturer at the University of Sheffield, explores the moderation of hashtags related to pro-eating disorder (pro-ED) communities on Instagram, Pinterest, and Tumblr in her scholarly article “Beyond the Hashtag: Circumventing Content Moderation on Social Media.” She finds that pro-ED content remains discoverable even without hashtags; users instead employ alternative signal phrases that identify them as pro-ED. Moreover, according to Gerrard, the recommendation systems of these platforms “recirculate pro-ED content, revealing the limitations of hashtag logics in social media content moderation” (Gerrard). In light of these findings, I think it should be up to the users of social media platforms, not the platforms themselves, to decide what content is removed. The automated policies currently in place on sites such as Tumblr and Pinterest do not cover enough ground, and in my opinion, only users, who know on a personal level what content is objectionable, can judge for themselves what is right and what is wrong.
Heins, Marjorie. “The Brave New World of Social Media Censorship.” Harvard Law Review, 20 June 2014, https://harvardlawreview.org/2014/06/the-brave-new-world-of-social-media-censorship/. Accessed 6 May 2020.
In the article “The Brave New World of Social Media Censorship,” published by the Harvard Law Review, Marjorie Heins argues that internet censorship harms freedom of speech. She discusses the opacity with which private social media sites censor their content. According to Heins and many others, the sites do have a right to run their platforms at their own discretion. However, Heins points out that the systems put in place by organizations like Facebook and Google have proven neither consistent nor effective. Heins also summarizes several points made by Professor Marvin Ammori, who argues that many companies employ lengthy terms of service that censor protected free speech. In particular, the Digital Millennium Copyright Act has prompted these services to remove content out of fear of liability; platforms such as YouTube and Facebook take a “censor now, ask questions later” approach in order to avoid lawsuits. Finally, Heins discusses New York Times v. Sullivan, a court case whose outcome protected criticism of the government as a part of freedom of speech. I think Heins raises valid points. On one hand, I understand the need for moderation systems; on the other, this censorship has curtailed certain beliefs and thereby stifled productive debate and discussion.
Rosen, Jeffrey. “Who Decides – Civility v. Hate Speech on the Internet.” HeinOnline, 2013, https://heinonline.org/HOL/LandingPage?handle=hein.journals%2Finsilaso13&div=23&id=&page=.
In the article “Who Decides – Civility v. Hate Speech on the Internet,” Jeffrey Rosen recounts how an anti-Mohammad video posted to YouTube sparked rioting in Middle Eastern countries. The United States government asked for the video to be taken down, but Google refused; Google had complete control over whether the video stayed up. As Rosen puts it, “Today, lawyers at Google, YouTube, Facebook, and Twitter have more power over who can speak and who can be heard than any president, judge, or monarch” (Rosen). After this incident, social media platforms adopted new rules and guidelines. Platforms now also use user flags, which are evaluated by lawyers, to determine whether content should remain. Though users are involved in shaping what appears on social media, the final decision still rests entirely with the platform. I think that social media platforms should have some power to remove content, but not full power over what people get to see.
Gerrard, Ysabel. Review of Behind the Screen: Content Moderation in the Shadows of Social Media, by Sarah T. Roberts. New Media & Society, vol. 22, no. 3, Mar. 2020, pp. 579–582. EBSCOhost, doi:10.1177/1461444819878844.
In her review of Sarah T. Roberts’s book Behind the Screen: Content Moderation in the Shadows of Social Media, Ysabel Gerrard, a lecturer at the University of Sheffield, examines how the employees working behind the scenes at a social media site decide whether to show, edit, or delete a post. As posts come online, moderators have mere seconds to review, edit, or delete each one objectively. Gerrard writes, “But all the moderators discussed so far share difficulties in jumping between different cultural contexts and client ‘voices’ (p. 142). As Rick explained, ‘Some of the news sites might be pro-NRA, some might be pro-gun control, so you need to put aside your personal beliefs and moderate, not “I think that comment is appropriate or not appropriate”’” (Gerrard). I do not think it is their place to tell us what is okay to look at and what is not. I firmly believe it should be up to the users themselves to decide whether content is appropriate for their viewing, because even decisions that claim to rest on objective truth will be somewhat biased.
Taken together, these articles led us to the consensus that social media platforms should have some control over what we see, but users should still have a say in what they see. Controlling what users see can create a conflict of interest, as referenced above, where moderators have to put their feelings aside so they are not taking comments down because “[they] think that comment is appropriate or not appropriate” (Gerrard). It can also be tempting to simply remove any flagged posts and comments, since it is easier to “censor now, ask questions later” (Heins), but as one of my colleagues pointed out, preventing users from deciding which posts and comments should be removed keeps them from building up a sense of what is right and wrong. If users gain more control over their social media, it might push them to be more deliberate about what they post and to discuss the controversial posts they find with other users.