
Leaked Documents Show Facebook Put Profit Before Public Good

November 8, 2021 – A trove of documents leaked from inside Facebook shows that the social media giant’s own internal research identified a host of public health and other problems on the platform, but the company did little about them.

The files were leaked by a whistleblower, former Facebook employee Frances Haugen, who shared tens of thousands of documents with the Securities and Exchange Commission, Congress, and a consortium of news organizations. She has since testified before the Senate Commerce Subcommittee on Consumer Protection and before European lawmakers.

Amplification of Anti-Vaccine and Other Misinformation

President Joe Biden caused a stir in July when he said that, because of widespread misinformation about the COVID-19 vaccine, social media platforms like Facebook are “killing people.” “I mean, they really are. Look, the only pandemic we have is among the unvaccinated,” he said. “And they’re killing people.”

While he later walked back the statement, the leaked documents suggest he wasn’t necessarily wrong.

According to news reports, in March – as the White House was preparing a $1.5 billion campaign against vaccine misinformation – some Facebook employees believed they had found a way to counter those lies on the platform while also prioritizing legitimate sources such as the World Health Organization.

“Given these results, I guess we hope to launch as soon as possible,” wrote one employee.

But Facebook ignored some of the suggestions, and executives dragged their feet in implementing others. Another proposal, to limit anti-vaccine comments, was also ignored.

“Why don’t they delete the comments? Because engagement is all that matters,” Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog group, told The Associated Press. “It attracts attention, and attention equals eyeballs, and eyeballs equal ad revenue.”

Facebook’s algorithms, which determine what content you see in your feed, also help spread misinformation.

“It’s not like the anti-vax contingent was created by Facebook,” says Dean Schillinger, MD, director of the Health Communications Research Program at the University of California, San Francisco. But the algorithm said, ‘OK, let’s find some people with certain political beliefs and link them to anti-vaccine content,’ amplifying the misinformation, he says. “It’s definitely something new.”

If that weren’t enough, it appears Facebook may have misled Congress about the company’s understanding of how COVID misinformation spread across the platform. In July, two senior House Democrats wrote to Facebook CEO Mark Zuckerberg asking for details on how many users had seen COVID misinformation and how much money the company was making from those posts.

“At this time, we have nothing to share in response to the questions you have raised, other than what Mark has said publicly,” the company said in response.

But the leaked documents show that by that point, Facebook researchers had conducted several studies on COVID misinformation and produced major internal reports. Employees were able to calculate the number of views racked up by widely shared misinformation. Yet the company did not acknowledge this to Congress.

Keeping this knowledge secret was a huge missed opportunity to ensure that science-backed information reaches the general public, says Sherry Pagoto, PhD, director of the UConn Center for Mobile Health and Social Media.

“We know how misinformation spreads, so how can we think more about spreading good information?” she says. “They have all kinds of data on the characteristics of messages that travel far. How can we use what they know about health communication to develop a plan?”

In an emailed statement, a spokesperson for Meta (amid the uproar, Facebook announced a new corporate name) said, “There is no silver bullet to tackling misinformation, which is why we take a holistic approach, including removing over 20 million pieces of content that violate our COVID misinformation policies, permanently banning thousands of repeat offenders from our services, connecting over 2 billion people to reliable information about COVID-19 and vaccines, and partnering with independent fact-checkers.”

Ignoring Instagram’s Effect on Mental Health of Vulnerable Adolescents

Tackling misinformation isn’t the only way Facebook and its subsidiaries could have acted to protect public health. The company was also aware of its negative impact on the mental health of young people, but publicly denied it.

Instagram, which is owned by Facebook, is extremely popular among teenage girls. But the photo sharing app repeatedly exposes them to images of idealized bodies and faces, which can lead to negative self-comparisons and pressure to look perfect.

Eating disorder content is also widely available on the platform. For years, social science and mental health researchers have investigated the effect of social media on mental health, particularly in adolescents. Studies have found links between Instagram use and depression, anxiety, low self-esteem, and eating disorders.

The Facebook Papers revealed what Instagram researchers called a “deep dive” into adolescent mental health. And there were serious issues: Internal research showed the platform made body image issues worse for 1 in 3 teenage girls, and 14% of teens said Instagram made them feel worse about themselves. Data linked use of the app to anxiety and depression. And among teens who reported suicidal thoughts, 6% of US users and 13% of UK users traced that impulse directly to Instagram.

Jean Twenge, PhD, author of iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy, and Completely Unprepared for Adulthood, has been studying the effects of social media on young people for nearly a decade.

“I was not surprised that Facebook found that social media could have important links to depression and self-harm. Academic research has shown this for years,” she says. “I was surprised at the depth of their research into the mindset of teenage girls using Instagram. Their research really built on what we already knew.”

As with Facebook’s misinformation findings, the company publicly downplayed Instagram’s negative effects – including in comments to Congress – and did little to adjust the experience of teenage users on the app.

“I think, given what they knew about Instagram and mental health, making changes to the platform definitely would have been the right thing to do,” Twenge says.

In the email, the Meta spokesperson said, “Our research does not conclude that Instagram is inherently bad for teens. While some teens told us Instagram made them feel worse when they were struggling with issues like loneliness, anxiety, and sadness, more teens told us Instagram helped them feel better when they were dealing with those same issues.”

A Responsibility for the Public Good?

While Facebook users may be surprised to learn how the company routinely puts profits before the health of its customers, those who study public health are anything but.

“This is by no means a problem unique to social media platforms,” says Schillinger. “Companies frequently pursue policies that encourage the public to participate in activities, to buy or consume products, to engage in behaviors that are unhealthy for themselves, for others, or for the planet. … Do you think Facebook acts any differently from any other business in this space?”

That’s where the potential for regulation comes in. Haugen, the whistleblower, has called for it, as have many lawmakers in the wake of her revelations.

“Large organizations that have influence and access to large numbers of people should be responsible for the well-being of that population, as a matter of principle,” says sociologist Damon Centola, PhD, author of Change: How to Make Big Things Happen.

He compares the explosion of social media to the history of television, which has been regulated in many ways for decades.

“I think it provides us with a parallel between social media and the media’s ability to influence people,” he says. “It seems to me that organizations can’t get away with saying they won’t consider public welfare.”

What makes the so-called Facebook Papers most damning, some experts say, is the company’s defense: that its internal research was conducted only for product development and therefore doesn’t prove anything.

That argument ignores the many peer-reviewed articles published in reputable journals that reinforce the internal findings. Taken together, the two bodies of research leave little room for doubt, and little doubt that something needs to change.

“Think of it as environmental pollution,” says Centola. “Companies may know they’re polluting, but they can also say it doesn’t really matter, that it isn’t causing any harm. But then you get documentation that says no, it has huge effects. That’s when it really matters.”

Social Media as a Force for Good

But there is one potential upside to the Facebook Papers, experts say: It’s clear the company knows a great deal about how to deliver messages effectively. With enough pressure, Facebook and other social media platforms could now begin to put that knowledge to positive use.

“Facebook should develop close collaborations with reputable entities to create content that is both true and promotes public health, while being engaging and algorithm-friendly,” says Schillinger. “If we can use the platform, the reach, and the [artificial intelligence] Facebook has for health promotion content, the sky is the limit.”

And efforts like this may be on the horizon.

“We are focused on building new features to help people who are struggling with negative social comparison or negative body image,” the Meta spokesperson wrote in the email. “We also continue to look for opportunities to work with more partners to publish independent studies in this area, and we are working on ways to allow external researchers more access to our data in a way that respects people’s privacy.”

That’s not to say Facebook will willingly put public health ahead of the company’s need to make money without being forced to by regulation.

“I think Facebook wants to improve its platform for users. But its first interest will always be having as many users as possible spend as much time as possible on the platform,” says Twenge. “Those two desires are often at odds.”
