Facebook responds to the ex-employee's report and denies the accusations


Lena Pietsch, Facebook's director of policy communications, responded to the 60 Minutes report "The Facebook Whistleblower", in which a former employee revealed details about the social network's algorithm and the spread of disinformation on the platform.

Here are her statements on behalf of Facebook: "Every day our teams must balance protecting the right of billions of people to speak out with the need to keep our platform a safe and positive place. We continue to make significant improvements to address the spread of misinformation and harmful content. To suggest that we encourage bad content and do nothing is simply untrue."

On claims that internal research has shown the company is not doing enough to eradicate hate, disinformation and conspiracy theories:

"We have invested heavily in people and technology to keep our platform safe, and we have o the fight against disinformation and the sharing of authoritative information a priority. If any research had identified an exact solution to these complex challenges, the technology industry, governments and society would have solved them long ago. We have a great deal of experience in using our research - as well as external research and collaboration with experts and organizations - to inform users about changes to our apps. "

Regarding the claim that incentives within Facebook are misaligned, and that the desire to generate more engagement and profit on the platform outweighs safety in some cases, she said:

"Hosting hateful content is bad for our community, bad for advertisers, and ultimately bad for our business. Our interest is to provide a safe and secure experience. positive for the billions of people who use Facebook. That's why we have invested so much in security. "

Regarding the claim that the "Meaningful Social Interactions" change in 2018 amplified controversial and hateful content, she said:

"The goal of 'Meaningful Social Interactions' change is in the name itself: to improve people's experience by prioritizing posts that attract interactions, especially conversations, between family and friends - which research shows to be good for people's well-being - and deprivate public content. Research also shows that polarization of extreme content has grown negatively. The United States for decades, long before platforms like Facebook existed, and which is decreasing in other countries where Internet and Facebook use has increased. We have our role to play and will continue to make consistent changes with the goal of making people's experience more meaningful, but blaming Facebook ignores the root causes of these problems and all related research. "

As for the claim that security measures that were put in place and then rolled back made Facebook less safe ahead of January 6, the representative said:

"We spent more than two years in prepare for the 2020 election with massive investments, more than 40 teams across the company, and over 35,000 people working on security. In gradually introducing and then regulating further emergency measures before, during and after the elections, we have taken into consideration specific signals that have appeared on the platform and information deriving from our ongoing and regular engagement with law enforcement. When these signals changed, so did the measurements. It is wrong to say that these measures were the reason for January 6th - the measures we needed remained in place until February, and some such as not recommending new civic or political groups remain in place to this day. These choices were all part of a much bigger strategy to protect the elections on our platform - and we are proud of that work. "

Here is an additional statement on Facebook's response to the dangerous organizations that appeared on the platform before the January 6 Capitol riot:

"We've banned hundreds of militarized social movements, deleted tens of thousands of QAnon pages, groups and accounts from our apps, and removed the #StopTheSteal group. This adds to our removal and repeated disruption of various hate groups, including Proud Boys, which we banned in 2018. Ultimately, the blame lies with those who broke the law, and the leaders who incited them. Facebook has taken extraordinary measures to address malicious content and we will continue to do our part. We also worked aggressively with law enforcement, both before January 6 and in the days and weeks following, with the aim of ensuring that evidence linking those responsible for January 6 to their crimes is available to prosecutors. "

Regarding Instagram's decision to pause the launch of a version for younger users, the representative said: "While we are convinced of the value this experience would provide to families, we have decided to pause this project to give ourselves time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the importance of this project for young teens online. The reality is that kids are already online, and we believe that developing age-appropriate experiences designed specifically for them is far better than what is currently available."

You can find the ex-employee's statements here.
