End-to-end encryption is fundamental to human rights

After years of struggles and clashes that have engulfed tech companies over end-to-end encryption, Meta recently added a new tool to its arsenal that could help the social media giant resist pressure from the US government, which is intent on modifying or weakening the plan to extend end-to-end encryption to all of the company's private communications services.

On Monday, April 4, Meta published a report on the human rights impacts of end-to-end encryption, produced by Business for Social Responsibility (BSR), a non-profit focused on corporate social responsibility. In addition to commissioning BSR's independent report, Meta also published a response document. In the study, which took more than two years to complete, BSR concludes that end-to-end encryption has a fundamental and overwhelmingly positive effect on the protection of human rights, while also examining how criminal activity and violent extremism can find a safe haven on encrypted platforms. The report also lists recommendations on how to mitigate these negative impacts.

Since 2019, Meta has planned to integrate end-to-end encryption across all its messaging platforms. This security measure, designed to prevent the company itself from accessing its users' communications, has long been in place on WhatsApp, one of Meta's platforms, but the plan would also extend the technology to Facebook Messenger and Instagram direct messages. Meta said the delay in applying end-to-end encryption to its other services largely has to do with technical complications and interoperability issues, but the company also cited criticism of the plan from the governments of the United States and other countries, which fear that adding the feature would make it more difficult for society and law enforcement to counter a range of threats, from child abuse and the distribution of child sexual abuse material to coordinated disinformation campaigns, hate speech, terrorism, and violent extremism. The US government, and the FBI in particular, has long argued that encryption that protects user data also protects suspects at the center of criminal investigations, thereby endangering citizens and national security.
The difficulties in combining privacy and legality

"I am pleased that the BSR report highlights the crucial role of encryption for the protection of human rights," explains Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory who was not involved in the study. "While it is true that unwanted conduct occurs in encrypted contexts, most people are not criminals, and everyone needs privacy and security. Weakening encryption is not the answer."
For Meta and privacy advocates around the world, the question is how to develop mechanisms that stop digital abuse before it takes place, flagging potentially suspicious behavior without accessing users' communications, and that allow users to effectively report behavior that could constitute abuse. The most recent efforts to strike that balance have been met with intense criticism from privacy and encryption advocates.

Last August, for example, Apple announced that it wanted to introduce a feature capable of locally scanning data on users' devices in search of child sexual abuse material. In this way, the company reasoned, Apple would not need to access the data directly to verify the possible presence of illicit material. Researchers, however, raised a number of concerns, pointing out that such a mechanism would be susceptible to manipulation and abuse, and that it risked missing its target by producing false positives and false negatives. Within a month, Apple backtracked, saying it needed time to reassess the project.

The report's recommendations

In its report, BSR says it does not endorse client-side scanning mechanisms, highlighting how the approach leads onto an unsustainable slippery slope. BSR instead recommends that Meta pursue other strategies, such as creating safe and responsive reporting channels for users and analyzing unencrypted metadata, to detect potentially problematic activity without having to scan or access communications directly.


“Contrary to common belief, you can really do a lot even without access to messages," explains Lindsey Andersen, associate director for human rights at BSR. "What is essential to understand is that encryption is not just any technology, but a very important means of promoting human rights, and it plays a unique role in that. I am not sure there is anything else that has such clear human rights benefits.”

The BSR report includes forty-five recommendations, thirty-four of which Meta has pledged to implement. The company said it will partially apply four more and is conducting further research on six of the remaining recommendations. Meta declined to adopt one recommendation: studying a technique known as homomorphic encryption as a means of developing more secure client-side scanning. According to the company, the recommendation is not worth pursuing because it would not be technically feasible.

Meta's plans

In response to the Russian invasion of Ukraine, in early March Meta extended end-to-end encryption to direct messages on Instagram in Ukraine and Russia. On Monday, April 11, the company told sportsgaming.win US that it will not apply the technology to all of its messaging services in 2022, but plans to move forward in 2023.

"From the point From a human rights point of view there are tensions, but you don't have to choose - explains Gail Kent, director of global policies at Messenger - It's something we hope to be able to show in our products: you don't have to choose between privacy and security, yes they can have both. From our dialogue with users, we know they expect us to deliver both. On Messenger or Instagram direct messages, users expect to find a safe space where they can communicate freely without unwanted interactions. "
After decades of debate that has yielded no results, the problem will certainly not be solved by a report. But the fact that the largest social media company on the planet is pushing and investing to find a solution certainly doesn't hurt.

This article originally appeared on sportsgaming.win US.






