A Checkered History with the Truth – Audio Article

A Checkered History with the Truth.

This is part four of a multi-part series, The Facebook Dossier, researched, reported, and compiled by Acreto IoT Security.

As a stone-faced Mark Zuckerberg sat before Congress in April 2018, the ill-informed and unprepared senators nevertheless managed, however blunderingly, to confirm what most of the world suspected about Facebook’s privacy practices: the organization knows a lot – too much by many accounts.

And while candid talk of the social media company’s reach was something to behold, the shocker was that Facebook had yet again covertly collected information on its users – this time on 87 million mostly American users and friends in their networks – and provided it to a third party, Cambridge Analytica, a firm most people hadn’t heard of before last year. Zuckerberg confirmed this in subsequent testimony before the European Parliament six weeks later.

Facebook’s collaboration with Cambridge Analytica only came to light after intense scrutiny from multiple US intelligence agencies studying Russian election interference. Eventually, Christopher Wylie, one of Cambridge Analytica’s founders, turned whistleblower on the organization’s questionable activities.

Cambridge Analytica was founded in 2013 in large part by Steve Bannon, the then alt-right editor of Breitbart.com, with the financial backing of Robert Mercer, a conservative hedge fund billionaire who also funded Breitbart. Bannon, who later became Donald Trump’s alt-right consigliere, co-founded the analytics firm to build his grand vision: a powerhouse Big Data platform that could sway the minds of the electorate.

Mercer funded the political psyop platform, which in 2014 alone was used in more than 40 political races. As Cambridge’s Director of Research, Wylie came up with the technical concept behind Cambridge Analytica and helped develop it into a vast, deeply intrusive analytics platform that used an extraordinary amount of private data from Facebook to target conservatives during the 2016 US presidential election. According to Wylie, Bannon oversaw the Facebook data grab. Wylie also accused Cambridge Analytica of misusing data and, more specifically, of “cheating” to surreptitiously persuade British citizens to support Brexit.

In fact, Cambridge Analytica and its parent company, SCL Elections, used a suite of political psyop tools in more than 200 elections around the world. The vast majority of targets were third-world and underdeveloped countries, many without the resources or knowledge to defend themselves. These campaigns were in preparation for the firms’ biggest effort to date: the 2016 US presidential elections.

SCL ramped up by entering into an agreement with Global Science Research, GSR, owned by Aleksandr Kogan, an academic based in the United Kingdom, to harvest Facebook data to influence the 2016 elections.

GSR began pulling profile data from Facebook, and, according to Wylie, Facebook not only knew it but collaborated in the effort. Facebook even hired one of GSR’s research directors as a researcher in November 2015.

“Facebook could see it was happening,” said Wylie. “Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic use. So they were like, ‘Fine’.”

In Wylie’s 2018 testimony to the US Senate Judiciary Committee, he stated, “Facebook was first notified of Cambridge Analytica’s harvesting scheme in 2015. It did not warn users then, and it only took action to warn affected users three weeks after the Guardian, New York Times, and Channel 4 made the story public.”

Later on, when Facebook acknowledged that the data had been taken, Wylie received a letter asking him to check a box promising that GSR would not sell or share the data. He did.

When confronted with the mounting evidence, Zuckerberg finally admitted to Facebook’s role in the scandal.

The Cambridge Analytica incident has not been Facebook’s only breach of trust. In December 2007, Facebook weathered one of its first privacy scandals over Beacon, Facebook’s targeted advertising platform. Facebook was doing more than tracking user habits from its own app. It was overreaching to:

  • collect information from participating third-party sites,

  • enroll users by default, requiring them to opt out rather than opt in, and

  • collect information from users’ friends and families.

In the face of scathing criticism from MoveOn.org, which gathered over 70,000 signatures opposing Beacon and Facebook’s privacy practices, Zuckerberg issued this apology – one of many mea culpas he and Facebook would make in the coming years. He said,

We’ve made a lot of mistakes building this feature, but we’ve made even more with how we’ve handled them. We simply did a bad job with this release, and I apologize for it. . . . Instead of acting quickly, we took too long to decide on the right solution. I’m not proud of the way we’ve handled this situation and I know we can do better.

In 2009, Facebook launched a raft of privacy tools that seemed designed to confuse users into revealing more, not less, personal information. The Federal Trade Commission (FTC) was so alarmed that it began investigating Facebook’s privacy practices.

In 2010, Facebook and its advertisers used a privacy loophole to gather names and other identifying information from users. Facebook’s response was, once again, to apologize and to reiterate the company’s values regarding privacy in a Washington Post piece, which stated:

  • You have control over how your information is shared.
  • We do not share your personal information with people or services you don’t want.
  • We do not give advertisers access to your personal information.
  • We do not and never will sell any of your information to anyone.
  • We will always keep Facebook a free service for everyone.

In 2011, the hammer dropped—again. This time, the FTC charged Facebook with deceiving users, specifically by telling them they could keep their information private, all the while publicizing and monetizing the same information. The FTC also ordered the company to take significant steps to reverse the damage and transform its privacy practices.

An eight-count complaint drafted by the FTC listed specific incidents where Facebook used a combination of vague language and outright dishonesty to break its privacy promises to Facebook users, their friends, and even strangers. Here are the incidents:

  • Information that users had set to private, such as Friends lists, was suddenly made public. Users were never warned ahead of time.
  • Facebook stated that the third-party apps users installed would access only enough user information to operate. On the contrary, third-party apps had access to almost all Facebook user data.
  • Facebook said users could restrict their data sharing to a limited group, the “friends only” option, for example, but it still shared that information with third-party applications those friends had used.
  • Facebook claimed its Verified Apps program would certify that participating apps were secure. However, it did not certify that those apps were secure.
  • Facebook claimed that when users deleted or deactivated their accounts, their photos and videos would no longer be accessible. In reality, photos and videos from deleted or deactivated accounts could still be accessed.
  • Facebook said it was complying with the EU Safe Harbor Framework, which dictates how data is transferred between the United States and the European Union. However, it was not complying with the EU’s framework.
  • And perhaps the broadest, simplest, and most troubling charge – the one that explains why the FTC stepped in and why Facebook and its privacy practices have become so mistrusted – is that the platform promised users it would not share their personal information with advertisers. But it did.

In a blog post addressing Facebook’s privacy practices, Zuckerberg responded in part:

Facebook has always been committed to being transparent about the information you have stored with us – and we have led the internet in building tools to give people the ability to see and control what they share. But we can also always do better. I’m committed to making Facebook the leader in transparency and control around privacy.

A few years later, Zuckerberg was sitting before Congress trying to explain how the social media company was complicit in the most significant election incident in US history.

Given Facebook’s checkered past on privacy issues, it’s tough to measure the long-term damage to Internet users who may or may not have Facebook accounts. Facebook was built from the ground up by its founder and the company’s only CEO, Mark Zuckerberg. Less than a week after its launch, the trouble began. Three Harvard students claimed that Zuckerberg stole their idea and misled them. While claiming he was helping them create HarvardConnection.com, an early social network, Zuckerberg was instead collecting their ideas and developing his own platform, which became Facebook. After a noisy lawsuit, Zuckerberg settled, giving up Facebook shares that were estimated to be worth $125 million to $150 million.

Even before Facebook, Zuckerberg’s FaceMash.com site allowed users to vote on who they thought was best looking. Students featured on the site were incensed that they were pitted against each other and that their photos were being used without permission – a theme that continues to haunt Zuckerberg and his creation, Facebook.

During Zuckerberg’s most recent testimony, pundits commented that his contempt for Congress and the European Parliament was clear and that he toyed with them. The frustration over the lack of substantive and meaningful testimony became the butt of late-night comedians’ jokes. Zuckerberg offered only generic information about Facebook’s privacy practices and app numbers during his testimony, opting to leave anything impactful for a written follow-up. The ensuing 200-plus pages of answers to post-hearing questions were carefully curated by Facebook’s legal and PR teams.

To date, the social giant has not taken any substantive action to address the privacy and integrity issues that have repeatedly plagued it throughout its short life. Zuckerberg’s own investors went as far as to call him a “19th-century dictator” and are demanding that he step down from his role as Facebook’s chairman. The outcry is getting louder, and many complain that with every iteration of the company being caught with its proverbial hand in the user-privacy cookie jar, there has been the same formulaic apology: “You deserve better, and we will do better.” However, the record shows little to no substantive change. Meanwhile, Zuckerberg, like a quarterback, stands accused of taking a knee and waiting for the clock to run out on public scrutiny.

Next up in The Facebook Dossier: listen to “Data Never Stops for the Facebook Empire”.

Learn more or read online by visiting our website, Acreto.io, or find us on Twitter at @acretoio. And if you haven’t done so, sign up for the Acreto Crypto-n-IoT podcast. You can get it from Apple, Google, or your favorite podcast app.

 

About Acreto IoT Security
Acreto IoT Security delivers advanced security for Crypto-IoT Ecosystems, from the cloud. IoTs are slated to grow to 50 Billion by 2021 and are on track to be the biggest consumers of Blockchain and Crypto technologies. Acreto’s Ecosystem security protects Crypto / Blockchain and Clouds as well as all purpose-built IoTs that are unable to defend themselves in-the-wild. The Acreto platform offers simplicity and agility, and is guaranteed to protect IoTs for their entire 8-20 year lifespan. The company is founded and led by an experienced management team, with multiple successful cloud security innovations. Learn more by visiting Acreto IoT Security on the web at acreto.io or on Twitter @acretoio.
