
Facebook's Privacy Crisis and Company Culture


Facebook’s recent crisis is just one of many privacy issues the company has had to deal with in its relatively short existence. Barely two years old in 2006, the company faced user outrage when it introduced its News Feed. A year later it had to apologize for telling people what their friends had bought. Years after that, the Federal Trade Commission stepped in, and it is now looking at the company again. Facebook has a history of running afoul of regulators and weathering user anger.[i]

In 2004, when Mark Zuckerberg was a Harvard undergrad working on a skunkworks project called The Facebook, a friend asked him how he’d managed to obtain more than 4,000 emails, photos and other bits of personal info from fellow students. He said, “They trust me”. Facebook’s foundation is built on trust. Facebook users’ confidence in the company has plunged by 66 percent as a result of revelations that the data analysis firm Cambridge Analytica inappropriately acquired data on tens of millions of Facebook users, and CEO Mark Zuckerberg’s public mea culpa during two days of congressional hearings did not change that.[ii] In this paper we explore how Facebook’s culture and organizational dysfunction not only contributed to the crisis but aggravated it rather than resolving it.

Crisis Background: How It Happened

Several years ago, Facebook opened its platform to third-party apps: cake-ordering services, simple games, fortune-telling quizzes, and the like. These apps were interesting, convenient, and likable, and many Facebook users gradually let their guard down. In 2013, a psychological quiz app called “thisisyourdigitallife” appeared on the platform, offering users $5 if they answered all of its questions. More than 270,000 people completed it using their Facebook accounts. In the process, the quiz-takers’ data, and that of their Facebook friends, including the posts they liked and the photos they commented on, was quietly harvested by the developer behind the scenes: Cambridge academic Aleksandr Kogan and his company Global Science Research.[iii] Facebook did nothing about this and simply collected its fees. After profiling millions of people’s personalities, Kogan sold the results to Cambridge Analytica, a firm hired by political campaigns to analyze voters and target them with ads. Facebook’s only response was to “ask” Kogan to delete the data; it never followed up.[iv]
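To make the mechanism concrete, the sketch below (in Python, using the requests library) illustrates the kind of access the pre-2015 Graph API made possible once a quiz-taker granted an app friend-level permissions such as friends_likes. The API version, endpoints, field names, and helper functions here are assumptions chosen for illustration; this is a rough sketch of the technique, not a reconstruction of Kogan's actual application.

# Hypothetical sketch: the kind of friend-data harvesting the legacy Graph API
# (roughly v1.0, before the 2014-2015 permission changes) allowed once one
# user granted a quiz app permissions such as user_likes and friends_likes.
import requests

GRAPH = "https://graph.facebook.com/v1.0"  # assumed legacy API root

def fetch_json(path, token, params=None):
    # GET a Graph API path using the quiz-taker's access token.
    params = dict(params or {}, access_token=token)
    resp = requests.get(f"{GRAPH}/{path}", params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

def harvest(user_token):
    # Collect the consenting quiz-taker's profile and likes, then walk their
    # friend list and collect each friend's likes as well, even though those
    # friends never installed the app and never consented.
    records = []
    me = fetch_json("me", user_token, {"fields": "id,name"})
    records.append({"user": me,
                    "likes": fetch_json("me/likes", user_token).get("data", [])})
    for friend in fetch_json("me/friends", user_token).get("data", []):
        friend_likes = fetch_json(f"{friend['id']}/likes", user_token)
        records.append({"user": friend, "likes": friend_likes.get("data", [])})
    return records

Under this model, a single consenting participant could expose data on hundreds of friends, which is how roughly 270,000 quiz-takers scaled to information on tens of millions of accounts.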


During the 2016 U.S. presidential election, Cambridge Analytica exploited these personality profiles by helping the Trump campaign invest heavily and deliberately in misleading Facebook ads. It worked roughly like this: if a good friend posted that the Mexican neighbors next door were noisy and you liked the post, you might be shown a false ad claiming that the presidential candidate, Mr. Trump, promised to refuse Mexican immigration if he took office. Similarly, if you shared and endorsed a charity story, you might soon receive a fabricated push item claiming that Hillary Clinton had shamelessly abused donations. By tailoring this improper propaganda precisely to each Facebook user, the campaign nudged people toward supporting Trump when they voted. And Facebook, which took the advertising money, was effectively complicit in the manipulation of public opinion.

In our opinion, this irresponsible behavior has much to do with Facebook’s organizational structure as well as its culture. Facebook’s structure is largely vertical, built around departments such as Information Security, Business Development, Marketing, Finance, and Engineering.[v] Ironically, no department has a unit dedicated to safeguarding users’ information; even the Information Security department is mainly concerned with protecting the company’s own data. This suggests the company simply did not treat the protection of users’ information as its responsibility, since vetting the authenticity of every ad and the purpose of every embedded applet would have cost it ‘unnecessary’ time.

In addition, to expand its commercial operations rapidly, Facebook favors flat management, intended to strengthen communication between departments and finish tasks as quickly as possible. It has corporate function-based teams, geographic divisions, and product-based divisions, but the boundaries between them are very blurry.[vi] For example, some geographic divisions share resources and managers with function-based teams, and people in a product-based division may also work inside a geographic division. The problem is that employees, including managers temporarily assigned to the Asia region, may not be familiar with conditions in North America. Inserting a quiz game or pushing ads to Asian consumers might cause no great harm there, but when those same people later move to the North American sector, they naturally carry the same relaxed vigilance with them and monetize user information in a market where it causes serious damage. We believe each geography, such as Latin America, should have fixed, competent leadership with deep insight into the region’s political direction, laws, and application requirements; that way, the company can tailor its actions to each specific area.

This scandal is also rooted in Facebook’s culture. At the first layer of the culture model, observable artifacts and behaviors,[vii] we see a gym, washing machines, microwaves, refrigerators, free dining and more, all of which make it easy for employees to work late; there are no cubicles separating employees in the office area, so they communicate naturally; and whenever someone has a new idea, teams do not spend time debating it but simply build it, which sometimes makes the resulting feature hard to extend and forces clumsy modifications or outright rewrites. From these artifacts we can infer the second-layer values[viii] of Facebook: speed, openness, acting first and asking questions later, and growing and expanding as quickly as possible. Speed lets Facebook move forward fast, but a company’s development should sometimes weigh more than speed. In this data breach, we may grant that embedding entertaining applets genuinely generates profit and brings users joy, but the company never asked: what if the people behind these small programs in fact have ulterior motives? What if collecting and selling the private data of unwitting people stirs up public anger? What if pushing fake ads to, or even indirectly manipulating, every user who trusts Facebook causes irreparable damage to a reputation built up over years? We believe Facebook should slow its pace of expansion and look back to see how many hidden dangers it has overlooked along the road of development.

Crisis Response, or Lack Thereof

In a crisis, it is not the event itself that counts; it is the response. The first days and weeks of the Cambridge Analytica crisis were marked by mishandled responses, and by the time the company got its act together, considerable damage had been done. Zuckerberg defended the company reasonably well when he finally responded, but nearly every story was framed by questions about the company’s silence and its inept initial handling, both of this crisis and of earlier ones, going back to Russian actors’ use of Facebook to manipulate the 2016 U.S. presidential election. This shows how unprepared the organization was and how little structure it had for handling a crisis, even one that struck its core business. Effective crisis communication requires proper leadership, structure, and technology; organizations should maintain a permanent internal team tasked with continuously monitoring and managing crisis events, along with a crisis plan.[ix]

Poor handling of the crisis invited scrutiny from the public and governments across the world. As the spotlight grows harsher, investor sentiment and public trust suffer. Facebook can no longer resist government scrutiny without suffering major repercussions, including:

  • Lawmakers in the UK made it clear that they were extremely disappointed with Zuckerberg’s refusal to attend and answer their questions directly. UK Parliament’s decision to publish Facebook’s emails and other sensitive information will no doubt lead to further inquiries in the UK and abroad, embroiling Facebook’s partners along the way.[x]
  • The Federal Trade Commission is looking into whether Facebook violated a consent decree by enabling third parties to access users’ information without their permission.[xi]
  • Mark Zuckerberg appeared before the Senate’s Commerce and Judiciary committees to discuss data privacy and Russian disinformation on Facebook.
  • The attorney general of Massachusetts, Maura Healey, announced that her office was opening an investigation. Facebook’s lack of disclosure on the harvesting of data could violate privacy laws in Britain and several states.[xii]
  • Facebook has lost $35 billion in market value following reports that Cambridge Analytica, a data firm that worked with President Donald Trump in the 2016 elections, had unauthorized access to 50 million Facebook user accounts in one of its largest breaches yet.[xiii]

Additionally, Facebook’s mishandling of the Cambridge Analytica crisis has led to a widespread loss of trust. As a Forbes column notes, “the most valuable business commodity is trust”.[xiv] Trust is an immeasurable currency, both externally and internally; once lost, a company must climb a mountain of challenges to re-establish its integrity.[xv] Facebook faced a critical choice: admit that third parties had accessed consumer data, or withhold information that would eventually surface through other sources years later. It chose the latter, and the resulting lack of transparency cost it public trust. Jonathan Albright, a research director at the Tow Center for Digital Journalism, said he was disappointed that the CEO did not address why Facebook had allowed so much third-party access to its users’ personal information for so many years. He added:

This problem is part of Facebook and cannot be split off as an unfortunate instance of misuse. It was standard practice and encouraged. Facebook was literally racing towards building tools that opened their users’ data to marketing partners and new business verticals. So, this is something that’s inherent to the culture and design of the company.[xvi]

In a recent Pew Research Center survey of Facebook users ages 18 and older, just over half (54%) say they have adjusted their privacy settings in the past 12 months, around four-in-ten (42%) say they have taken a break from checking the platform for a period of several weeks or more, and around a quarter (26%) say they have deleted the Facebook app from their cellphone. All told, some 74% of Facebook users say they have taken at least one of these three actions in the past year. There are, however, age differences in the share of Facebook users who have recently taken some of these actions. Most notably, 44% of younger users (those ages 18 to 29) say they have deleted the Facebook app from their phone in the past year, nearly four times the share of users ages 65 and older (12%) who have done so.[xvii]

As the head of one of the “big tech companies”, with the immense power to shape the news for more than two billion people worldwide, CEO Mark Zuckerberg needed to rethink his goals. After the platform’s unintended role as a “propaganda weapon for Russia” in the 2016 US presidential election, he declared his goal for 2018 to be to “fix Facebook”. Though he did not spell out a course of action, he emphasized enforcing policies that could prevent misuse of the product. These changes were not publicly visible at first, but after the huge Cambridge Analytica scandal they became much more evident: the company went through the biggest executive shakeup in its fifteen-year history.

Under its earlier organisational structure, in which the acquired apps Instagram and WhatsApp functioned independently, Facebook was a mess. These independently operating arms lacked coordination, turning the company into a tangle of overlapping products, and the resulting redundancy bred miscommunication.

Today, the company has reorganised its product and engineering efforts into three broad areas: the company’s family of apps, new platforms and infrastructure, and central product services. These changes are intended to provide better transparency and improved communication. Along with new leaders for each application, executives were given new responsibilities, including a new effort to incorporate blockchain technology.

The “Family of Apps” group now oversees Facebook, Instagram, WhatsApp and Messenger, which together reach about 5 billion users monthly. The “New Platforms and Infrastructure” group tackles Facebook’s growing frontier features, including its AR, VR and artificial intelligence efforts; the company’s upcoming blockchain work is also part of this team. The intersection of these applications, shared product features such as ads, security and growth, is handled by the third division, “Central Product Services”. The reorganisation mostly shuffled existing teams rather than bringing new people into the company or pushing people out, and the new roles are intended to open lines of communication among executives without hurting the speed Facebook is known for.

Clearly, these changes bring the independently functioning arms together and point their efforts cohesively in one direction. Though it may not be obvious on the face of it, this informally signals a reduction in autonomy for each application. The prospect of such interference might spook future acquisition candidates, but given the overlapping functionality of these applications, such as Stories, the arrangement makes better sense.

Earlier, the assumption was that because each app operated in a different domain with its own set of competitors, each needed a different strategy for market growth. With the growing number of overlapping features, such as Stories, this no longer holds true: it took until late 2017 for Facebook to realize it should synchronize Stories across Instagram, Facebook and Messenger so users could post once to their audiences everywhere. The new organisational structure makes a coherent strategy possible. Instead of reinventing the wheel every time, the expertise of executives skilled in these overlapping functionalities can be shared across multiple applications. For instance, having achieved dominance over Snapchat in photo sharing, Instagram is now putting its effort into enhancing its feed in hopes of ramping up monetization, and Adam Mosseri, a long-time member of Mark Zuckerberg’s inner circle, brings his News Feed experience to Instagram. Similarly, the reorganisation could keep Facebook from haphazardly tripping over itself as it tries to seize on emerging trends.


[i] Newcomb, Alyssa. “A timeline of Facebook’s privacy issues — and its responses”, 2018, https://www.nbcnews.com/tech/social-media/timeline-facebook-s-privacy-issues-its-responses-n859651.

[ii] Weisbaum, Herb. “Trust in Facebook has dropped by 66 percent since the Cambridge Analytica scandal”, 2018, https://www.nbcnews.com/business/consumer/trust-facebook-has-dropped-51-percent-cambridge-analytica-scandal-n867011.

[iii] Meredith, Sam. “Facebook-Cambridge Analytica: A timeline of the data hijacking scandal”, 2018, https://www.cnbc.com/2018/04/10/facebook-cambridge-analytica-a-timeline-of-the-data-hijacking-scandal.html.

[iv] Aleksandr Kogan, “The Link Between Cambridge Analytica and Facebook”, 2018, https://www.cbsnews.com/news/aleksandr-kogan-the-link-between-cambridge-analytica-and-facebook-60-minutes/.

[v] Nancy. “Facebook Organizational Structure: Check the Big Figure”, https://www.orgcharting.com/facebook-organizational-structure/.

[vi] Lombardo, Jessica. “Facebook Inc.’s Organizational Structure (Analysis)”, 8 Sept. 2018, http://panmore.com/facebook-inc-organizational-structure-analysis.

[vii] Lassman, David. “Note on Culture”, 16 Jul. 2017.

[viii] Lassman, David. “Note on Culture”, 16 Jul. 2017.

[ix] Haggerty, James F. “Commentary: How Facebook’s Response Ignited the Cambridge Analytica Scandal”, 2018, http://fortune.com/2018/03/27/facebook-cambridge-analytica-data-scandal-crisis-investigation/.

[x] Urbelis, Alexander. “Facebook faces major repercussions if it continues to resist government scrutiny”, 2018, https://www.cnn.com/2018/12/07/opinions/facebook-resist-government-scrutiny-urbelis/index.html.

[xi] Chang, Ailsa. “FTC Investigating Whether Facebook Violated Consent Decree”, 2018, https://www.npr.org/2018/03/27/597390569/ftc-investigating-whether-facebook-violated-consent-decree.

[xii] Granville, Kevin. “Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens”, 2018, https://www.nytimes.com/2018/03/19/technology/facebook-cambridge-analytica-explained.html.

[xiii] Shen, Lucinda. “Why Facebook Suddenly Shed $35 Billion in Value”, 2018, http://fortune.com/2018/03/19/facebook-stock-share-price-cambridge-analytica-donald-trump.

[xiv] Mankowska, Asha. “Why A Strong Brand Authority Will Transform Your Business Into a 7-Figure Revenue Source”, 7 Jul. 2017, https://www.forbes.com/sites/forbescoachescouncil/2017/07/07/why-a-strong-brand-authority-will-transform-your-business-into-a-7-figure-revenue-source/#62f42dd745a7.

[xv] Fioravante, Vanessa. “4 lessons from Facebook’s data crisis”, 2018, https://www.prdaily.com/Main/Articles/4_lessons_from_Facebooks_data_crisis_24255.aspx.

[xvi] Solon, Olivia and Edward Helmore. “Mark Zuckerberg apologizes for Facebook’s ‘mistakes’ over Cambridge Analytica”, 2018, https://www.theguardian.com/technology/2018/mar/21/mark-zuckerberg-response-facebook-cambridge-analytica.

[xvii] Perrin, Andrew. “Americans are changing their relationship with Facebook”, 2018, http://www.pewresearch.org/fact-tank/2018/09/05/americans-are-changing-their-relationship-with-facebook/.

 
