Facebook. What Happened
Facebook. You'd have to be living on another planet not to have heard of it. Most Americans have accounts on this social media giant. Facebook allows people to easily keep in contact with others. You can converse with others, make plans, post stories and pictures, and share videos or links to articles of interest. It has changed the way people communicate. Just like all of my high school friends, I opened an account.
My friend list grew to over 300 people as I "friended" friends of friends and their friends.
For a while, it was great. This is how we made plans to get together and sent each other jokes. I felt so popular. I felt like I really knew some of these people I had never met in person. It was an easier way to break the ice with strangers: you could see pictures and what was going on in others' lives and feel like you knew them. I felt more confident connecting because there wasn't the awkwardness of trying to find common ground.
However, I soon learned there was a dark side to all this fun. Many people posted all the "fun" things they were doing, and it sounded like their lives were so much better than mine. Why didn't they have any bad things happen to them, or any boring lulls? I started to feel inferior. Then there was the cyberbullying I noticed popping up. Some people I was close to were devastated by untruths posted about them by anonymous users hiding behind clever screen names. I started to notice questionable articles and stories making the rounds, stated as facts, when in reality they weren't true. Then news stories came out about Senate hearings and election meddling.
In 2003, Mark Zuckerberg, a 19-year-old student at Harvard University, and three fellow students started an online service called "Facemash" for Harvard students to judge the attractiveness of other students. Even though it was shut down after two days, it was so popular that the four originators started a social network called "The Facebook" in 2004, where Harvard students could post pictures and personal information about themselves. It was opened up to other schools; in just six months it had over 250,000 students from over 34 schools, and by the end of that year it had 1 million users. Soon it was known simply as Facebook and was open to anyone over the age of 13, becoming the most used social media site (Hall).
Starting out, Facebook had good intentions. Under Careers on the Facebook website, the mission statement reads, "Founded in 2004, Facebook's mission is to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what's going on in the world, and to share and express what matters to them." They also state, "As our company grows we have 5 strong values that guide the way we work and the decisions we make each day to help achieve our mission."
The company seemed really focused on improving society by helping people stay connected in a fast-paced world. The thought was that this would lead to more understanding. When Facebook first decided to offer stock to the public in its initial public offering (IPO) in 2012, Zuckerberg wrote, "People sharing more - even if just with their close friends or families - creates a more open culture and leads to a better understanding of the lives and perspectives of others. We believe that this creates a greater number of stronger relationships between people, and that it helps people get exposed to a greater number of diverse perspectives" (Zipkin).
The company started with, and still has, a lot of good qualities. Facebook makes people use their real identities; no one is allowed to pretend to be someone else. This allows people to feel safe knowing who they are conversing with. Some people might think they can hide their identity, but they really can't. People started groups to share ideas about common interests and meetings. I remember finding out about social gatherings and community events. I also sometimes found out about people having a hard time through their posts on Facebook. This allowed me to reach out to them, either by private message on Facebook or by calling. It made me feel good to be able to help someone out when I otherwise might not have even known.
Facebook is free to anyone who wants to create an account. It makes its money by charging companies for advertising. Facebook worked with these companies to prove that they could market directly to individuals using the data it collected. Companies pay a lot of money for this targeted advertising. This can be a good thing for Facebook, but not always so good for the consumer, as it can be seen as an invasion of privacy.
Facebook says it does not sell users' data. What it does is write algorithms for advertisers, using Facebook users' data, based on what the advertisers want to target. Facebook offers a few types of ads and has coined its own names for them. Sponsored stories are when an advertiser pays to highlight an action that a user takes.
For example, if a user "likes" a product, that "like" can be shown to the user's friends. Or if a user enters a sweepstakes, that story can be shared with the user's friends in the hope that they will enter the sweepstakes too. Page Post ads can be shown to anyone on Facebook. They can be links, videos, or events, among other things. They are mostly used to promote events.
Promoted Posts are shown to a page's fans and can be spread by clicking a promote button. Marketplace ads are shown in a sidebar, and clicking on them can lead to an app or a company website. The different types of ads carry different prices; some have a flat price, while others are charged per click (Darwell). So basically, companies tell Facebook what they want to do and what type of audience they want to target, and Facebook chooses who will see the ads based on users' preferences, likes, and so on, then puts the ads in their feeds. When people click on the ads, or like them, those actions get shared with their friends. And if you know your friend is interested in something, it might influence you to take a closer look.
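To make that mechanism concrete, here is a minimal, purely hypothetical sketch in Python of how interest-based targeting of this kind could work. The names (User, Campaign, place_ads, sponsored_story) and the prices are my own illustrations, not Facebook's actual systems or API; the point is only that the advertiser buys placement against criteria while the user data itself stays inside the platform.

```python
# Hypothetical illustration of interest-based ad targeting (not Facebook's real code).
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    likes: set[str]                      # interests inferred from pages the user has liked
    friends: list[str] = field(default_factory=list)
    feed: list[str] = field(default_factory=list)

@dataclass
class Campaign:
    advertiser: str
    ad_text: str
    target_interests: set[str]           # audience criteria chosen by the advertiser
    price_per_impression: float          # some ad types are priced per click instead

def place_ads(users, campaign):
    """Show the ad to users whose likes overlap the target interests.

    Returns the amount charged to the advertiser; the advertiser never
    receives the user data itself, only the placement it paid for.
    """
    bill = 0.0
    for user in users:
        if user.likes & campaign.target_interests:
            user.feed.append(f"Sponsored: {campaign.ad_text}")
            bill += campaign.price_per_impression
    return bill

def sponsored_story(user, action, users_by_name):
    """When a targeted user acts on an ad, surface that action to their friends."""
    for friend_name in user.friends:
        users_by_name[friend_name].feed.append(f"{user.name} {action}")

# Example usage: only Alice matches the campaign, and her 'like' is shown to Bob.
alice = User("Alice", {"hiking", "camping"}, friends=["Bob"])
bob = User("Bob", {"video games"})
people = {"Alice": alice, "Bob": bob}

campaign = Campaign("OutdoorCo", "New tents on sale", {"camping"}, 0.02)
cost = place_ads(people.values(), campaign)
sponsored_story(alice, "likes OutdoorCo", people)
```

In this toy version the advertiser only ever sees its bill, which mirrors Facebook's claim that it sells access to audiences rather than the data itself.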
Then there is the darker side. In contrast to its original goal of connecting people and promoting understanding and diversity, there were those who started using Facebook to promote hate and intolerance. Schoolchildren cyberbully and shame other kids under anonymous names. Some political groups across the world started using Facebook to spread lies about their opponents and cause unrest and violence in their countries.
Facebook did not really anticipate this or have any mechanisms in place to prevent things like this from happening. One example is in the country of Myanmar: "Facebook employees missed a crescendo of posts and misinformation that helped to fuel modern ethnic cleansing in Myanmar" (Stevenson). However, in an article from the Business section of Wired.com, Issy Lapowsky and Steven Levy point out that, "From a legal perspective, Facebook is under no obligation to write or enforce any of these policies.
It is protected from the consequences of its users' speech by a provision of the 1996 act that defines social media platforms as a "safe harbor" for speech. That "Section 230" provision distinguishes Facebook from a publisher that stands behind its content. Yet Facebook knows that it must go beyond the legal minimum to keep itself from descending into a snake pit of harassment, bullying, sexual content and gun-running.
There is also an increasing clamor to do away with Section 230 now that the internet startups the provision was intended to help are giants." So even though Facebook legally doesn't have to police its content, users demand it to a certain extent. And laws may change, so it's important that the company put some provisions in place to prepare for that.
Facebook does have Community Standards that outline what is and what is not allowed on Facebook. These are enforced by employees called moderators, who scan posts for violations and respond to complaints about posted content. But they walk a fine line. "Facebook's dilemma is that it wants to be a safe place for users without becoming a strict censor of their speech. In the document released today it explains, 'We err on the side of allowing content, even when some find it objectionable, unless removing that content prevents a specific harm.'"
The Community Standards page on Facebook lists the areas the policies cover, including violence, criminal behavior, safety, objectionable content, respect of intellectual property, and authenticity. One of the problems with enforcing this is that what some people consider objectionable, others consider an expression of free speech, and how that decision is made can be controversial. One plan is to use artificial intelligence programs to find objectionable content automatically. This is in progress; however, many think posters will be able to subvert it by changing up their language, as the simple example below suggests. Also, it can take days or even weeks to remove content that is clearly out of line. In that time frame, the damage may already be done, as an enormous number of people could have seen it.
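As a rough illustration of that evasion problem, consider this toy filter. It is my own example, not how Facebook's moderation systems actually work: a post that swaps a few characters sails right past a literal keyword match, which is why automated detection has to be far more sophisticated than simple pattern matching, and why it tends to lag behind posters.

```python
# Hypothetical toy moderation filter, showing how easily literal matching is evaded.
BANNED_PHRASES = {"buy illegal guns"}   # assumed example of prohibited content

def violates_policy(post: str) -> bool:
    """Flag a post only if it contains a banned phrase verbatim."""
    text = post.lower()
    return any(phrase in text for phrase in BANNED_PHRASES)

print(violates_policy("Buy illegal guns here"))      # True: caught by the filter
print(violates_policy("Buy 1llegal g-u-n-s here"))   # False: a trivial misspelling slips through
```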
The dam really opened in 2011, when the Federal Trade Commission (FTC) brought charges against Facebook for telling users they could keep their information private while allowing it to be made public, which violated federal law. Facebook settled the suit. The FTC website displays a list of complaints brought against Facebook: in December 2009, Facebook changed its website so that certain information users may have designated as private, such as their Friends List, was made public. Facebook didn't warn users that this change was coming or get their approval in advance.
Then it came to light that a political data firm hired by Trump's presidential campaign in 2016 had obtained access to the data of Facebook users. How could that happen when Zuckerberg himself assured the American public that Facebook didn't sell data to anyone? The data in question included user identities, friends, and items they had "liked." A psychology professor, Dr. Aleksandr Kogan, built an app in 2014; users who downloaded it took a personality survey. The app then took private information from their profiles and from those of their friends.
The users who participated in the survey were told their data would be collected for academic use. But no one had said friends' data would be collected, nor were the friends asked for permission. Granville goes on to say, "Facebook in recent days has insisted that what Cambridge did was not a data breach, because it routinely allows researchers to have access to user data for academic purposes, and users consent to this access when they create a Facebook account. But Facebook prohibits this kind of data to be sold or transferred 'to any ad network, data broker or other advertising or monetization-related service'." So what responsibility does Facebook have for preventing this collecting of information?
Some people think Facebook may be lying and might actually be selling users' information without saying so. As mentioned previously, in 2011 Facebook had settled a lawsuit with the FTC for pretty much the same infringements on users' privacy. At that time, the FTC specifically stated: "The proposed settlement bars Facebook from making any further deceptive privacy claims, requires that the company get consumers' approval before it changes the way it shares their data, and requires that it obtain periodic assessments of its privacy practices by independent, third-party auditors for the next 20 years." Yet here we are, a few years later, with the same 'mistakes' being made.
On April 10, 2018, Mark Zuckerberg was called before Congress to testify about the leaks related to Cambridge Analytica and the presidential election. In his testimony, Zuckerberg admitted that Facebook didn't do enough to prevent unauthorized data from getting out to app developers, that it made mistakes in allowing Russian interference in the election, and that it was slow to react when the information became clear. He did point out that people trying to game the system are becoming more sophisticated and therefore harder to spot, but that Facebook is trying to stay ahead of them. He also assured Congress that the mission of Facebook has not changed: "My top priority has always been our social mission of connecting people, building community, and bringing the world closer together. Advertisers and developers will never take priority over that as long as I'm running Facebook."
However, more information has been uncovered pointing to Facebook's cover-up of warning signs that its data was being misused. As the executives concentrated on growth and increasing profits, they compromised the privacy of millions of users. In a New York Times article, Frenkel, Confessore, Kang, Rosenberg, and Nicas expose how top executives at Facebook swept warnings from their underlings under the rug, and how employees who tried to push the issues were branded as disloyal. The authors point out that when researchers tried to warn the company that Facebook was being used for government propaganda and other malicious purposes, the company ignored them.
As for Russian activity on Facebook meant to interfere with the election, when the security chief tried to tell the board that they were finding more evidence this was happening, Facebook executive Sheryl Sandberg told him he was betraying the company and should stop investigating. Both Sandberg and Zuckerberg just wanted to ignore what was going on. They did not want to know the extent of the issue, assigned lower-level employees to oversee it, and basically buried their heads in the sand. They tried to handle the problem by getting congressmen on their side.
When the truth started coming out and news organizations were reporting it, Facebook was forced to admit what it had found. When Zuckerberg was called to testify before Congress, Facebook had hired a public relations firm that advised them to "have positive content pushed out about your company and negative content pushed out about your competitor" (Frenkel, Confessore, Kang, Rosenberg, and Nicas). This is the tactic they decided to take. Zuckerberg apologized and swore Facebook was making amends for what happened. But it became very clear that he didn't realize the extent of the damage these revelations had caused.
Facebook's problems continue, as European countries now want to question Facebook about selling user data, which Facebook claims it doesn't do. In fact, Zuckerberg testified to Congress that Facebook doesn't sell data but instead allows users to move their data to whatever apps they want. That means, of course, that the developers of those apps are getting access to users' data, and sometimes to the data of users' friends, even if the friends didn't sign up with the app. It also means that Facebook makes money by charging app developers for access to Facebook accounts.
On an episode of PBS NewsHour, Washington Post reporter Elizabeth Dwoskin told Nick Schifrin, "...one piece of the documents that was interesting is there's this point where they talk about how they are going to cut off all developers, unless those developers spend $250,000 on their mobile app program." This statement shows that even if Facebook didn't directly sell users' data, it did so in spirit. Dwoskin also pointed out that Facebook schemed to try to crush its competitors. She talks about how, as the video app Vine started growing, Zuckerberg himself talked about cutting off Vine's access to Facebook's data. Zuckerberg himself won't testify in Europe; instead, he's sending a representative. Perhaps the heat is too much?