Facebook shares sink by 5% after whistleblower claims

Facebook shares plummet by 5 percent to the lowest in six months after whistleblower claims: GoFundMe set up for Frances Haugen raises $15,000 to help her fight the social media giant’s ‘army of lawyers’

  • Frances Haugen went public on CBS 60 Minutes on Sunday night 
  • She has filed eight complaints against Facebook with the SEC and will testify before Congress on Tuesday 
  • She says the company repeatedly put profits above moral responsibility
  • Among her complaints is that the company reactivated harmful algorithms which encouraged hate speech and misinformation after the election 
  • She is being represented by an organization called Whistleblower Aid which set up a GoFundMe account for her 
  • The page aims to raise $50,000 and has already raised $15,000 to fight Facebook’s ‘army of lawyers’ if necessary 
  • Facebook shares slid by 5 percent on Monday afternoon  

Facebook’s shares fell by 5 percent on Monday after whistleblower Frances Haugen went public with claims that the company puts profits above morals, a day before her scheduled testimony in front of Congress.  

Haugen went public on Sunday in an episode of CBS 60 Minutes, describing how she filed complaints against Facebook for putting profits above morals by failing to stop the spread of misinformation online, protect young people or prevent the January 6 riot.  

She gave the information to The Wall Street Journal anonymously before speaking out on Sunday night ahead of her scheduled testimony to Congress on Tuesday. 

On Monday morning, shares of the social media giant opened at $335 – $8 less than Friday’s close. 

They continued to fall, reaching $323 at around 1pm – the lowest since May. 

As Facebook shares sank in value, a GoFundMe page that was set up for Haugen drew in thousands. 

The page was created by Whistleblower Aid – an organization founded by whistleblower John Napier Tye – which says it helped her through the process of speaking out against the company. 

The page has a $50,000 goal and has already raised $16,000 to help Haugen combat Facebook’s ‘army of lawyers’. 

Haugen says that the social media giant knew its platform was causing damage but ignored the warning signs because it wanted to focus on profit instead. 

Haugen claimed Facebook turned off ‘safeguards’ designed to stop the proliferation of misinformation and rabble-rousing after Joe Biden beat Donald Trump in the November 2020 presidential election. 

Those safeguards saw political content given a lower priority on users’ news feeds in the run-up to the poll – only for executives to reverse course upon realizing the change was turning users off.

Haugen, who is due to testify before Congress on Tuesday about Facebook’s alleged impact on its younger users, also claimed that decision directly contributed to the violence at the US Capitol. 

‘As soon as the election was over they turned them back off, or they changed the settings back to what they were before to prioritize growth over safety. And that really feels like a betrayal of democracy to me,’ Haugen stated.

Haugen, whose leaks formed The Wall Street Journal’s ‘Facebook Files’ series, also said that Facebook’s algorithms – mathematical formulae that help decide which information is most visible on users’ feeds – favored hateful content. 

She claimed that a 2018 change to the algorithm prioritized divisive posts which made Facebook users argue, because such content was found to boost user engagement.

That in turn helped bosses sell more online adverts that have seen the social media giant’s value creep close to $1 trillion. 

Haugen said: ‘You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.’

The executive, who worked at Google and Pinterest before joining Facebook in 2019, said the scales fell from her eyes when the firm dissolved the civic integrity unit she had been working in following the 2020 election.

She explained: ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’ 

‘The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world,’ Haugen added. 

‘There were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.’ 

After realizing she could no longer trust her company to protect the public, Haugen secretly copied tens of thousands of pages of internal Facebook research, which she claims is evidence that ‘the company is lying to the public about making significant progress against hate, violence and misinformation.’ 

‘We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world,’ the complaint reads. 

Haugen claimed that Facebook’s ‘evidence of harm’ extended to its Instagram app, citing a study in which teen girls said the social network worsened thoughts of suicide and eating disorders.

‘What’s super tragic is Facebook’s own research says, as these young women begin to consume this — this eating disorder content, they get more and more depressed. And it actually makes them use the app more,’ Haugen explained.

‘And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.’ 

‘No one at Facebook is malevolent,’ Haugen said. Mark Zuckerberg ‘has never set out to make a hateful platform,’ she added.

However, she says the company has decided the balance sheet is more important than ethics.

Haugen said the social network proved it could make a positive change when it altered content policies for several weeks around the 2020 election, deprioritizing political content in its News Feed algorithm.

But she claims that the company swiftly reverted to its old models when it realized that engagement with adverts had plummeted. 

‘Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, and [Facebook] will make less money,’ Haugen said.

Haugen’s lawyers filed at least eight complaints with the Securities and Exchange Commission outlining her findings and comparing them with the company’s public statements.

The SEC did not confirm to 60 Minutes whether it plans to take action against Facebook. DailyMail.com has also reached out to the agency for comment. 

Facebook, however, released a statement in response to the allegations: ‘Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. 

FACEBOOK’S EMAIL TO STAFF IN FULL:

OUR POSITION ON POLARIZATION AND ELECTIONS

You will have seen the series of articles about us published in the Wall Street Journal in recent days, and the public interest it has provoked. This Sunday night, the ex-employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the piece is likely to assert that we contribute to polarization in the United States, and suggest that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of January 6th in the Capitol.

I know some of you – especially those of you in the US – are going to get questions from friends and family about these things so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.

Facebook and Polarization

People are understandably anxious about the divisions in society and looking for answers and ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn’t supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.

The rise of polarization has been the subject of swathes of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.

The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.

Specifically, we expect the reporting to suggest that a change to Facebook’s News Feed ranking algorithm was responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI) – so that you would see more content from friends, family and groups you are part of in your News Feed. This change was heavily driven by internal and external research that showed that meaningful engagement with friends and family on our platform was better for people’s wellbeing, and we further refined and improved it over time as we do with all ranking metrics. Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views we disagree with – that’s life – and the change meant you are more likely to come across their posts too. Even so, we’ve developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.

But the simple fact remains that changes to algorithmic ranking systems on one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking whatsoever, including private messaging apps like iMessage and WhatsApp.

Elections and Democracy

There’s perhaps no other topic that we’ve been more vocal about as a company than on our work to dramatically change the way we approach elections. Starting in 2017, we began building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.

Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts — identifying almost all of them before anyone flagged them to us. And, from March to Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.

Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so-called ‘break glass’ measures – and spoke publicly about them – before and after Election Day to respond to specific and unusual signals we were seeing on our platform and to keep potentially violating content from spreading before our content reviewers could assess it against our policies.

These measures were not without trade-offs – they’re blunt instruments designed to deal with specific crisis scenarios. It’s like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we know we impacted significant amounts of content that did not violate our rules to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted may relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also impacted a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We wouldn’t take this kind of crude, catch-all measure in normal circumstances, but these weren’t normal circumstances.

We only rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions. We left some of them on for a longer period of time through February this year and others, like not recommending civic, political or new Groups, we have decided to retain permanently.

Fighting Hate Groups and other Dangerous Organizations

I want to be absolutely clear: we work to limit, not expand hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization, in fact, just the opposite. We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms. And we remove content that praises or supports hate groups, terrorist organizations and criminal groups.

We’ve been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. But our work to crack down on these hate groups was years in the making. We took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal Group, and removed references to Stop the Steal in the run up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content violating our policies regarding terrorism and more than 19 million pieces of content violating our policies around organized hate. We designated the Proud Boys as a hate organization in 2018 and we continue to remove praise, support, and representation of them. Between August last year and January 12 this year, we identified nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of Pages, groups, events, Facebook profiles and Instagram accounts associated with these groups.

This work will never be complete. There will always be new threats and new problems to address, in the US and around the world. That’s why we remain vigilant and alert – and will always have to.

That is also why the suggestion that is sometimes made that the violent insurrection on January 6 would not have occurred if it was not for social media is so misleading. To be clear, the responsibility for those events rests squarely with the perpetrators of the violence, and those in politics and elsewhere who actively encouraged them. Mature democracies in which social media use is widespread hold elections all the time – for instance Germany’s election last week – without the disfiguring presence of violence. We actively share with Law Enforcement material that we can find on our services related to these traumatic events. But reducing the complex reasons for polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.

We will continue to face scrutiny – some of it fair and some of it unfair. We’ll continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That’s what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We aren’t perfect and we don’t have all the answers. That’s why we do the sort of research that has been the subject of these stories in the first place. And we’ll keep looking for ways to respond to the feedback we hear from our users, including testing ways to make sure political content doesn’t take over their News Feeds.

But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.

‘We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.’

Facebook head of global affairs Nick Clegg, appearing on CNN Sunday morning, also called the allegations that the social media giant is responsible for the Capitol riot ‘ludicrous.’ 

‘The responsibility for the violence on January the 6th and the insurrection on that day lies squarely with the people who inflicted the violence and those who encouraged them, including then-President Trump and candidly many other people in the media who were encouraging the assertion that the election was stolen,’ he said. 

Meanwhile, a congressional panel will hear Haugen’s testimony on Tuesday.  

Sen. Richard Blumenthal (D-Conn.), who is a member of the panel, told the Washington Post that the SEC should take Haugen’s allegations that Facebook may have misled investors ‘very seriously’. 

‘Facebook certainly misled and deceived the public, and so their investors may well have been deceived as well,’ Blumenthal said. 

Lawmakers will also investigate whether Facebook’s products are harmful to children and whether the social media company undermined its safety efforts by disbanding its civic integrity team, as Haugen has alleged.

The social media giant confirmed that Antigone Davis, its global head of safety, would also testify before the Senate Commerce Committee’s consumer protection panel. 

Haugen’s allegations have caused a headache for Facebook in recent weeks.

Some of the secrets contained in the trove of tens of thousands of pages of internal company documents she copied were previously leaked to the Wall Street Journal for a series of reports dubbed the ‘Facebook Files’, including damning revelations that the company knew Instagram was toxic to young girls’ body image. 

With more damaging allegations headed for the company Sunday, Clegg warned employees: ‘We will continue to face scrutiny.’ 

According to Clegg’s email, the whistleblower will accuse her former employer of relaxing its emergency ‘break glass’ measures put in place in the lead-up to the election ‘too soon.’

Haugen claimed this played a role in enabling rioters to storm the Capitol on January 6, a riot that left five dead.

The relaxation of safeguards, including limits on live video, allowed prospective rioters to gather on the platform and use it to plot the insurrection, she claimed.

Clegg pushed back at this suggestion, insisting that the so-called ‘break glass’ safeguards were only rolled back when data showed it was safe to do so.

Some such measures were kept in place until February, he wrote, and some are now permanent features.  

‘We only rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions,’ Clegg wrote. 

‘We left some of them on for a longer period of time through February this year and others, like not recommending civic, political or new Groups, we have decided to retain permanently.’

Clegg listed several safeguards which have been put in place in recent years and reeled off a list of success stories of handling misinformation around the election and shutting down groups focused on overturning the results. 

‘In 2020 alone, we removed more than 5 billion fake accounts — identifying almost all of them before anyone flagged them to us,’ he wrote.

‘And, from March to Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.’

Clegg admitted such policies were not ideal and resulted in many people and posts being impacted by the heavy-handed approach.

But, he said, an ‘extreme step’ was necessary because ‘these weren’t normal circumstances.’ 

‘It’s like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood,’ he said.

‘We wouldn’t take this kind of crude, catch-all measure in normal circumstances, but these weren’t normal circumstances.’

He wrote that the company had removed millions of pages and groups belonging to hate groups and dangerous organizations such as the Proud Boys and QAnon conspiracy theorists, as well as content pushing #StopTheSteal election fraud claims.  

The email also pushed back at an accusation that Facebook benefits from the divisiveness created on its platform.  

‘We do not profit from polarization, in fact, just the opposite,’ he wrote.

‘We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms.’

The VP called any suggestion that the blame for the Capitol riot lies with Big Tech ‘so misleading’ and said the blame should rest with the rioters themselves and the people who incited them. 

‘The suggestion that is sometimes made that the violent insurrection on January 6 would not have occurred if it was not for social media is so misleading,’ he wrote. 

‘To be clear, the responsibility for those events rests squarely with the perpetrators of the violence, and those in politics and elsewhere who actively encouraged them.’

The lengthy email to staff ended by urging the workforce to ‘hold our heads up high’ and ‘be proud’ of their work.  
