Here is why you should read more into Facebook’s rejection of documentary The Social Dilemma

In a seven-part report, Facebook calls the Netflix film by Jeff Orlowski a “conspiracy documentary” which “buries the substance in sensationalism.” We dissect the report for you:

October 05, 2020 04:29 pm | Updated December 05, 2021 08:54 am IST - Hyderabad

An illustration of the Facebook logo as a maze



Watched The Social Dilemma? Perhaps it has shaken up your household’s perspective on how your offline lives are mediated by the ones you lead online.

The Jeff Orlowski-directed documentary, which struck a chord with many netizens upon its global release in September, has already been bookmarked as a favourite among anti-trust regulators who are more than eager to see the downfall of the world’s largest social media entities.

The film, serving as a ‘burn book’ of the big Internet companies, features first-person accounts from several ‘whistleblowers’ who have worked at Facebook, Twitter, Google and more, all of whom left these organisations over ethical concerns. In Facebook’s case, much of the information came directly from Justin Rosenstein, co-creator of the Facebook ‘like’ button.

Facebook CEO and founder Mark Zuckerberg; Justin Rosenstein co-creator Facebook ‘like’ button


As October 2020 kicked off, Facebook published a rebuttal to The Social Dilemma; it addresses the platform’s notoriety for bypassing users’ privacy, the rapid spread of misinformation, fake news and hate speech, the advancement of political polarisation, and threats against the integrity of elections.

Facebook’s report, ‘What The Social Dilemma Gets Wrong’, a seven-part breakdown of the arguments made in the film, was posted on its official website and attributed to no particular spokesperson. “The film’s creators do not include insights from those currently working at the companies or any experts that take a different view of the narrative put forward by the film. They don’t acknowledge — critically or otherwise — the efforts already taken by companies to address many of the issues they raise. Instead, they rely on commentary from those who haven’t been on the inside for many years,” it states, adding that the film is “distorted” in its approach.

Mental health and fear

Facebook claims it is “not incentivised to build features that increase time-spent on our products. Instead, we want to make sure we offer value to people, not just drive usage.”

The platform refers to a 2018 change to News Feed wherein it adjusted the ranking in users’ timelines “to prioritise meaningful social interactions and deprioritise things like viral videos. The change led to a decrease of 50 [million] hours a day worth of time spent on Facebook.” The company points out it has been actively working with mental health organisations to further understand the effects social media has on users. For example, in April 2020, Facebook released Quiet Time, a digital well-being feature that lets users set aside certain time slots on the platform, though few know of this feature.

One of the more striking points Facebook, to its credit, acknowledges is that it leverages algorithmic power just as Netflix does, “to determine who it thinks should watch The Social Dilemma film.”

Facebook seemingly waves off the concern around algorithms. Yes, algorithms are the norm, but it is worth being aware of how the concept has evolved. Algorithms started out as a way to rank search results according to the data a user shared; they have since developed into one of Internet companies’ favourite surveillance tools. On this point, Facebook states, “portraying algorithms as ‘mad’ may make good fodder for conspiracy documentaries, but the reality is a lot less entertaining.” However, many who have watched The Social Dilemma would not necessarily categorise it as ‘entertainment’ but rather as a reality check.

We cannot speak of algorithms without speaking of advertising. Facebook claims they are “funded by advertising so that it remains free for people... We don’t sell your information to anyone.” The platform insists that they “provide advertisers with reports about the kinds of people who are seeing their ads and how their ads are performing, but [they] don’t share information that personally identifies you unless you give [them] permission.”

Facebook-owned Instagram, which turns 10 on October 6, 2020, has one of the most contentious News Feeds among social networking platforms; it has evolved from simply featuring posts from a user’s circle in chronological order to interspersing algorithm-driven sponsored and recommended posts across the basic News Feed and Explore pages. Sadly, Instagram still decides what you need to see.

Facebook Chairman and CEO Mark Zuckerberg arrives to testify before the House Financial Services Committee on ‘An Examination of Facebook and Its Impact on the Financial Services and Housing Sectors’ in Washington, DC on October 23, 2019.


The Social Dilemma makes strong references to the Cambridge Analytica data scandal of 2018, which led to Facebook CEO and founder Mark Zuckerberg sitting through a gruelling Senate hearing that same year. Zuckerberg was asked whether Facebook would still have access to a user’s information should they delete their information and account from the platform. He responded that Facebook would not be able to access any information or content a user shared in the past, though some third-party apps may still have access to some of this data. Interestingly, even when users delete their accounts, it can take up to 90 days for Facebook to remove content such as photos and updates stored in backup systems.

Facebook reiterates they do not want users’ personal data and that they support regulation, elaborating, “We have policies that prohibit businesses from sending us sensitive data about people, including users’ health information or social security numbers, through business tools like the Facebook Pixel (a snippet of JavaScript code that lets a business track visitor activity on its website) and the SDK (a software development kit used in mobile apps, which includes tracking and traffic analytics of user behaviour and ad engagement).”
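To make the Pixel concept concrete: a pixel-style tracker is essentially an image beacon. The sketch below is purely illustrative and is not Facebook’s actual Pixel code; the collector URL, function name and parameters are all invented for the example. The idea is simply that event data rides along in the URL of a tiny image request.

```javascript
// Minimal, hypothetical sketch of a pixel-style tracker (NOT Facebook's
// actual Pixel code). The page requests a tiny image from a collector
// server, and the event data travels in the URL's query string.
function buildBeaconUrl(baseUrl, eventName, params) {
  // Encode the event name and any extra parameters as a query string.
  const query = new URLSearchParams({ ev: eventName, ...params }).toString();
  return `${baseUrl}?${query}`;
}

// In a browser, firing the beacon would look like:
//   const img = new Image();
//   img.src = buildBeaconUrl("https://collector.example.com/tr",
//                            "PageView", { page: "/home" });

console.log(buildBeaconUrl("https://collector.example.com/tr", "PageView", { page: "/home" }));
```

Because the request looks like an ordinary image load, it works on any site that embeds the snippet, which is what makes this pattern both convenient for advertisers and hard for users to notice.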

Social media and politics

Polarisation and populism existed long before social media. This point is something of a grey area simply because polarisation is still a fairly fluid term, one that can be used in both macro (platform integrity and diversity) and micro (smaller-scale partisanship) senses.

Notably, The Social Dilemma addresses the scope of radicalisation through social media; in the film’s dramatised parallel storyline, a teen is exposed to a vague form of it through YouTube and Facebook.

“The overwhelming majority of the content that people see on Facebook is not polarising or even political — it’s everyday content from people’s friends and family,” Facebook states. “We reduce the amount of content that could drive polarisation, including links to clickbait headlines or misinformation.”

In May 2020, The Wall Street Journal published an exposé, based on internal documents and interviews with current and former employees, on how Facebook actually encourages divisiveness across its users. The article ‘Facebook Executives Shut Down Efforts to Make the Site Less Divisive’ states, ‘“Our algorithms exploit the human brain’s attraction to divisiveness,’ read a slide from a 2018 presentation. ‘If left unchecked,’ it warned, Facebook would feed users ‘more and more divisive content to gain user attention and increase time on the platform.’” Facebook, not taking kindly to this article, published a report on their investments into reducing platform-specific polarisation, as they have done in this rebuttal to The Social Dilemma.

Dozens of cardboard cut-outs of Facebook CEO Mark Zuckerberg sit outside of the U.S. Capitol Building as part of an Avaaz.org protest in Washington, U.S., April 10, 2018. REUTERS/Leah Millis/File Photo


The aforementioned makes for a natural segue to elections and misinformation. “We’ve acknowledged that we made mistakes in 2016. Yet the film leaves out what we have done since 2016 to build strong defences to stop people from using Facebook to interfere in elections,” says the Facebook report, referring to the roughly 3,000 Russian-backed ads it identified and turned over to Congress.

Regarding the upcoming US Presidential election, Facebook explains, “We have policies prohibiting voter suppression and in the US, between March and May this year alone, we removed more than 1,00,000 pieces of Facebook and Instagram content for violating our voter interference policies,” and have also “updated our policies to counter attempts by a candidate or campaign to prematurely declare victory or delegitimise the election by questioning official results.”

Facebook states they do not benefit from misinformation, adding, “We don’t want hate speech on our platform and work to remove it... We know our systems aren’t perfect and there are things that we miss.” The company adds they removed over 22 million pieces of hate speech in the second quarter of 2020, over 94% of which they found before someone reported it. They say this is an increase from a quarter earlier, when they removed 9.6 million posts, over 88% of which they found before someone reported it to the platform.


The Social Dilemma has stirred an online uprising around not just privacy but also mental health and the power held by these companies. If the film’s mission was to further fracture our trust in online platforms, it has succeeded with a great many — and Facebook knows it. To be fair, many have questioned the film’s inherent purpose: ‘do we boycott these platforms altogether (is that even possible?)’, and ‘what can we do about it now, when erasing a digital footprint is near impossible?’

Facebook’s report aims to serve as a reminder of its positive (albeit slow) change after years of ongoing scandal and its regular statements of ‘we are working to change this’. Despite policy changes and international hearings, The Social Dilemma reminds us that we perhaps need to reverse-engineer the situation, and start at home.
