Facebook Ethical and Legal Considerations
Facebook Live is a live streaming feature that allows Facebook's more than 2 billion users to share video in real time. Since the feature's launch in 2015, violent videos capturing homicide, sexual assault, and suicide have been an unwanted consequence. This raises the question of whether Facebook and similar media platforms have a legal or ethical obligation to come to the aid of a crime victim. This paper discusses the legal and ethical responsibilities of Facebook and what measures can be taken to deter the uploading of violent videos.
Legal and Ethical Obligations to Users
In December 2015, Facebook launched its Facebook Live feature, which allows users to stream video as events happen. Friends and family could share in birthday parties, engagement announcements, and baby gender reveals from across the world. But within the first year and a half of Facebook Live's existence, a dark side of the new technology began to surface: instances of murder, suicide, and sexual assault appeared on the platform. The question of what should be done to prevent these violent videos from airing was brought front and center in the spring of 2017, when videos of a 15-year-old girl being raped by several teenage boys and of an elderly man being murdered in broad daylight were posted on Facebook.
In Chicago in March 2017, a group of teenage boys sexually assaulted a 15-year-old girl. One of them live streamed the assault on Facebook, where nearly 40 people watched without a single one calling the police to help the young woman (Babwin, 2017). The police were not alerted until the girl's uncle came across the video.
A month later, a 37-year-old man named Steve Stephens uploaded a video of himself shooting an elderly man in broad daylight. The victim, Robert Godwin Sr., was returning home after a holiday meal with his family (Yan, Simon, & Gingras, 2017). The video remained on Facebook for nearly two hours; Facebook states that it deactivated Stephens' account 20 minutes after the video was brought to its attention (Lecher, 2017).
Generally speaking, a bystander is not legally obligated to be a Good Samaritan or to report a crime. Only a few states have laws on the books requiring a person to help a crime victim when doing so would not put the rescuer in undue peril, and some require reporting in cases of suspected child abuse (Volokh, http://www2.law.ucla.edu/volokh/rescue.htm#14). In light of the violent crimes streamed on Facebook, the question has been raised whether Facebook itself is legally or ethically bound to help someone who is the victim of a crime or who intends to harm themselves.
Facebook has no more legal obligation to come to the aid of a crime victim than any member of the general public. It does, however, have an ethical obligation if it wants to remain true to its mission statement. In 2017, Facebook CEO Mark Zuckerberg announced at the Facebook Communities Summit that the company was refocusing its mission: "Give people the power to build community and bring the world closer together" (https://techcrunch.com/2017/06/22/bring-the-world-closer-together/). When people consider themselves part of a community, they feel a sense of belonging to the group and, with it, a belief that others care about them and would offer help in a time of need. To foster that sense of community, Facebook needs guidelines that promote this environment, rules governing what can and cannot be posted, and policies that enforce those rules. Moral fortitude can also serve as a business strategy: it tends to support a company's long-term viability, and customers trust a company more when it is seen as honest rather than concerned only with the bottom line (textbook, ch. 2-8).
Content Review
As more violent videos appeared on Facebook, a public outcry demanded that Facebook fix the problem. In response, Facebook added 3,000 employees to its Community Operations team, bringing the total to 7,500 (https://www.washingtonpost.com/news/the-switch/wp/2017/05/03/facebook-is-adding-3000-workers-to-look-for-violence-on-facebook-live/). This team monitors posts, videos, and photos for violent or criminal content. When a user comes across such content, they can flag the post, which sends a report to one of those 7,500 employees for review.
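A minimal sketch of how such a flag-and-review pipeline could be organized appears below. The Report type, the priority rules, and the queue are illustrative assumptions for this paper, not Facebook's actual system.

```python
import heapq
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(order=True)
class Report:
    """A user-submitted flag awaiting human review (illustrative schema only)."""
    priority: int                        # lower value = reviewed sooner
    flagged_at: datetime = field(compare=False)
    post_id: str = field(compare=False)
    reason: str = field(compare=False)   # e.g. "violence", "self-harm"

review_queue: list[Report] = []          # min-heap ordered by priority

def flag_post(post_id: str, reason: str) -> None:
    # Hypothetical priority rule: threats to life jump the queue.
    priority = 0 if reason in ("violence", "self-harm") else 1
    heapq.heappush(review_queue,
                   Report(priority, datetime.now(timezone.utc), post_id, reason))

def next_report_for_review() -> Report | None:
    """Hand the most urgent pending report to a reviewer."""
    return heapq.heappop(review_queue) if review_queue else None
```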
Because of the nature of live streaming, it is almost impossible to catch a violent video while it is happening, so the Community Operations team remains one way Facebook can review its content. However, 7,500 people cannot possibly catch every questionable post, so Facebook needs more sophisticated software that can screen posts in a fraction of the time it takes a human. Such software could flag certain words and phrases and assign an urgency to the review.
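In its simplest form, triage software of that kind might look like the following sketch. The phrase lists and urgency tiers are invented for illustration; a production system would rely on trained classifiers rather than keyword matching.

```python
import re

# Illustrative phrase lists only; a real system would use ML classifiers.
URGENT_PHRASES = re.compile(r"\b(kill|shoot|suicide|hurt myself)\b", re.IGNORECASE)
ELEVATED_PHRASES = re.compile(r"\b(fight|threat|weapon)\b", re.IGNORECASE)

def triage(post_text: str) -> str:
    """Assign a review urgency to a post based on its text."""
    if URGENT_PHRASES.search(post_text):
        return "urgent"      # route to a human reviewer immediately
    if ELEVATED_PHRASES.search(post_text):
        return "elevated"    # review on a short deadline
    return "routine"         # sampled, or reviewed only if user-flagged

print(triage("someone is going to get hurt in this fight"))  # -> elevated
```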
A second way Facebook can be more proactive in preventing violent posts is to add even more staff to its Community Operations team until such software has been built. A staff of 7,500 seems impressive until you look at the volume of posts uploaded per day and realize the amount of data to be reviewed is overwhelming. As of October 2018, Facebook saw over 700 million comments, 420 million status updates, and nearly 200 million photos per day (https://zephoria.com/top-15-valuable-facebook-statistics/). For the 7,500 employees charged with monitoring content to cover the photos alone, each would need to review more than 26,000 per day. Until better software can ease the burden, more humans will need to be assigned to the task.
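The per-reviewer arithmetic behind these figures works out as follows, using the October 2018 volumes cited above.

```python
# Daily volumes cited above (October 2018 figures).
photos_per_day = 200_000_000
comments_per_day = 700_000_000
status_updates_per_day = 420_000_000
reviewers = 7_500

# Photos alone: roughly 26,700 per reviewer per day.
print(photos_per_day // reviewers)  # 26666

# All three content types combined: 176,000 items per reviewer per day.
print((photos_per_day + comments_per_day + status_updates_per_day) // reviewers)  # 176000
```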
A final way Facebook can monitor what types of videos and photos are posted is to be perfectly clear about what is and is not acceptable, so that the right content is taken down. Employees should be trained to recognize hate speech, sexually inappropriate video, and cyberbullying. There should also be a protocol for alerting authorities when a crime is witnessed in a post, whether verbally or through a picture or video.
Safeguards
If Facebook or other social media outlets wanted to ensure that no violent videos appeared on their websites, the ultimate safeguard would be eliminating video from the platform entirely, live or otherwise. A more realistic solution is technology that scans a video or photo first and uploads it only after it is deemed acceptable. Another way Facebook can reduce questionable content is to do a better job of continuously educating users on their responsibilities as members of this online community: make the posting rules clear and remind users of the consequences of violating them, using pop-up notices often enough to deliver the message but not so often that users become irritated and avoid the platform.
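A scan-before-publish gate of that kind could be wired together roughly as in the sketch below. The scan_media classifier is a placeholder assumption, not a real Facebook API; the point is simply that nothing is published until the scan returns a verdict.

```python
from enum import Enum

class Verdict(Enum):
    ACCEPT = "accept"
    REJECT = "reject"
    NEEDS_HUMAN = "needs_human"

def scan_media(media_bytes: bytes) -> Verdict:
    """Placeholder classifier -- an assumption for this sketch.
    A real system would run trained image/video models here."""
    # Conservative default: unrecognized content waits for a human.
    return Verdict.NEEDS_HUMAN

def handle_upload(media_bytes: bytes) -> str:
    """Publish only after the scan clears the content."""
    verdict = scan_media(media_bytes)
    if verdict is Verdict.ACCEPT:
        return "published"              # cleared content goes live
    if verdict is Verdict.REJECT:
        return "rejected"               # clearly violating content never posts
    return "held for human review"      # ambiguous content waits in a queue

print(handle_upload(b"...video bytes..."))  # -> held for human review
```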
The Role of An Ethics Officer or Oversight Committee
The policy setting, implementation, and oversight of these measures could be the responsibility of an Ethics Officer. Facebook currently has neither an Ethics Officer nor an oversight committee. An Ethics Officer is essentially an in-house watchdog who makes sure the company follows its mission statement and conducts business in a way that does not risk casting it in a bad light with the public or its investors. An Ethics Officer benefits a company by supplying the perspective an outside observer would bring: while a CFO may look at the financial outcome of a business decision, the Ethics Officer can weigh its broader effects and how it will help or hurt the company in the long run.
Promoting Ethical Use
Facebook can promote positive posting by encouraging peer review. Routinely reminding users how to flag or report an inappropriate or violent post will help the company discourage and remove such content. In addition, these companies should focus on the environment their young users interact in. A dedicated department should work on making minors aware of cyberbullying and of drug and alcohol abuse, and on showing them how to report posts suggesting someone may be in harm's way. This group should also engage the parents of young users, reminding them to periodically check their child's posts and providing resources if they suspect their child is engaging in risky behavior.
Conclusion
Even with the latest technology and the best of intentions, there will be instances where violent acts are broadcast over social media. However, given the number of posts, videos, and photos uploaded daily, the share containing violent or criminal activity is small. Research by BuzzFeed News found that from December 2015 to June 2017, roughly two violent videos per month were broadcast (https://www.buzzfeednews.com/article/alexkantrowitz/heres-how-bad-facebook-lives-violence-problem-is). Compared to the hundreds of millions of daily interactions, the vast majority appear to fall within the parameters of Facebook's policies for maintaining a safe environment for its users.