EFF rejects Google: FLoC is a terrible idea

EFF rejects Google

According to the Electronic Frontier Foundation, the technology Google has devised to move past third-party cookies is extremely dangerous, "a terrible idea". In short, the organization sees the proposed patch as worse than the hole it is meant to cover. The alarm is therefore being raised before it is too late, before FLoC (Federated Learning of Cohorts) becomes reality.

A few hours after Google published a post defending the assumptions behind its idea, describing it as more ambitious and more protective than what the competition would like to offer, the EFF rejected the Mountain View company's theorem, underlining the fear that a misleading narrative is being imposed on public opinion: if the past was represented by cookies, the future does not necessarily have to be filled with alternatives to those same cookies. In short, a cookie-free model should also be contemplated, and the EFF would like Google to fight for that ideal.

EFF: Google, stop

The feeling is that the ideal of privacy protection and the pragmatism of the market are clashing precisely on this front, and we all know which side tends to win. At the same time, the EFF's accusations have the merit of bringing the discussion back onto fair tracks, allowing everyone to form their own idea of what is truly possible.

The first problem raised by the EFF relates to privacy: although Google considers it impossible to identify a single person, since each cohort includes a high number of users, in reality there would still be room to reach far more fine-grained identification (fingerprinting) by analyzing the behavior of an individual browser across its activities. Google has already expressed its full commitment to preventing this, but to date there is no solid basis for ruling the possibility out, and this therefore entails an objective risk.
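The fingerprinting concern can be made concrete with a back-of-the-envelope entropy calculation. The sketch below is illustrative only: the population size, cohort size, and fingerprint entropy figures are assumptions for the sake of the example, not FLoC's actual parameters.

```python
import math

# Hypothetical figures for illustration; real cohort sizes and
# browser fingerprint entropy vary.
population = 100_000_000      # browsers in the ad ecosystem (assumed)
cohort_size = 10_000          # users sharing one cohort ID (assumed)

# Bits of identifying information revealed by the cohort ID alone
cohort_bits = math.log2(population / cohort_size)

# Bits already available from conventional fingerprinting signals
# (user agent, screen size, fonts, ...) -- an assumed figure
fingerprint_bits = 18.0

# Bits needed to single out one browser in the whole population
bits_to_identify = math.log2(population)

combined = cohort_bits + fingerprint_bits
print(f"cohort ID contributes ~{cohort_bits:.1f} bits")
print(f"combined: ~{combined:.1f} of {bits_to_identify:.1f} bits needed")
```

Under these assumed numbers, the cohort ID alone contributes roughly 13 bits, and combined with ordinary fingerprinting signals the total exceeds the ~27 bits needed to single out one browser, which is the EFF's point: a "large" cohort still narrows the anonymity set dramatically.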

Another serious problem raised by the EFF concerns the predatory strategies that cohort-based targeting could enable, acting through the exclusion of specific categories of people from specific ads and thus opening dangerous cracks at the edge of the privacy abyss. Grouping based on tastes could mean groupings indirectly linked to age, ethnicity, sexuality and so on: entrusting all this to an algorithm could have repercussions that are difficult to delineate a priori.
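The exclusion mechanism the EFF warns about can be sketched in a few lines. Everything below is hypothetical: the cohort IDs, the correlation figures, and the `eligible_cohorts` helper are invented for illustration and are not part of any real ad platform API.

```python
# Proxy-discrimination sketch: even without demographic data, an
# advertiser could exclude cohorts that merely CORRELATE with a
# sensitive attribute. All data here is invented.

# Invented statistics: cohort ID -> share of members inferred (from
# browsing behavior alone) to belong to some sensitive group
cohort_stats = {
    1001: 0.05,
    1002: 0.62,
    1003: 0.10,
    1004: 0.81,
}

def eligible_cohorts(stats, exclusion_threshold=0.5):
    """Return the cohorts an advertiser would still target after
    dropping any cohort too strongly correlated with the attribute."""
    return sorted(c for c, share in stats.items() if share < exclusion_threshold)

print(eligible_cohorts(cohort_stats))  # cohorts 1002 and 1004 are shut out
```

No protected attribute is ever named in the filter, yet the people in cohorts 1002 and 1004 are excluded anyway; this is why the EFF argues that grouping by tastes can amount to grouping by age, ethnicity or sexuality.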

What Google sees as a step forward from yesterday, the EFF sees as a step backward from tomorrow. The two views clearly cannot be aligned, but from this confrontation at least some reassuring elements may emerge that can define a possible balance.

Source: EFF

EFF Director Cindy Cohn On Warrantless Surveillance, Encryption And Financial Privacy With Bitcoin

SAN FRANCISCO, CA - September 19: One of the custom paintings in a stairwell of the Electronic Frontier Foundation offices, September 19, 2013 in San Francisco, CA, USA. (Photo by Peter DaSilva for The Washington Post via Getty Images)

Recently, there’s been a lot of activity around encryption, bitcoin and the connection between financial and personal liberty and digital rights and tools. As protest movements emerge around the world, there are also moves to create backdoors into encryption and to weaken the same technologies that are underpinning and supporting meaningful dissent from Hong Kong to Nigeria.

Electronic Frontier Foundation (EFF) director Cindy Cohn has been at the forefront of the fight for digital rights, both as a lawyer and as an advocate on important constitutional cases. Here are her thoughts on what it means to lead a digital rights advocacy group at a volatile time when digital rights are under threat.

Question 1: What were your challenge areas in 2020, and what are some of the immediate priorities that have come up during our current time?

We set three challenge areas for 2020, and while I would say we've done a lot on them, there have also been intervening events.

Our three challenge areas for 2020 were, first, the rise of public-private partnerships between police and private companies for surveillance purposes, and second, the need to really articulate the role of the public-interest Internet, especially in Europe, to try to make sure that rules that get passed because Europe is mad at Facebook don't have bad effects on Facebook's competitors, on little businesses, or on things like the Internet Archive and Wikimedia that are very much public-interest pieces of Internet infrastructure.

And then the third [area] was to talk about the problems of content moderation and how the content moderation strategies of big tech are troubling and also causing these collateral effects like attacks on end-to-end encryption.

Those were the three things we started out with. Of course, the two big things that intervened and one small thing are, well, one big thing is COVID, of course — and so we spent a lot of time on working on how to think about the kinds of tracking applications that are coming out on COVID — soon we’ll be talking about immunity passports [...], how to think about those things and how to weigh the tradeoffs.

The second big thing that happened was the big racial justice movement, police violence against people of color and the response, which has led us to really refresh and push out a lot of the work we do around protecting yourself at protests, as well as to raise concerns about the use of facial recognition technologies and their impact on political protest. And then the third thing is the election in the United States.

Question 2: What is going on with end-to-end encryption? Should there be backdoors?

There’s probably a much more erudite version of this on the EFF website. As for the general framing, I try to frame it for people who may not be in the thick of it, because it can feel really technical.

Imagine a world in which the local police come around, knock on your door and say: “There’s crime around here, and some of it is really serious crime, so what we want you to do is make sure your door is not locked, because if you’re a criminal, we want to be able to come in and catch you.” Most people would get that that’s a really insane way to go about law enforcement. First of all, if there’s crime about, you want to be more secure, not less secure. And why are the cops treating me like I’m a potential defendant as opposed to the person who needs to be protected?

These two things would come up for me immediately and I think they would for most people. Law enforcement isn’t doing their job right if the way that they’re trying to do their job is to make me less secure in order to make their job easier.


I’ve been involved in trying to protect end-to-end encryption for the 30 years I’ve been involved with Internet policy, a time that predates even the World Wide Web: first to free it from government control with the Bernstein case in the 90s, and now to defend it. I can give you 20 other reasons as well, but I think those are the two biggest ones in terms of how to talk to people who aren’t deep in this debate about it.

Question 3: How does the EFF think about bitcoin and cryptocurrencies?

We do a lot of work to support cryptocurrencies; we’re past the period where we were nervous about the regulatory state. [...] We do a lot of work to support financial privacy, and for us as a civil liberties group, bitcoin and cryptocurrencies really fit into that framing: financial privacy and why it matters. My colleague Rainey Reitman does a lot of work on this, and we spend a lot of time talking to regulators about why financial privacy matters and why they should respect it.

Question 4: Talk to us a bit about your work on Jewel v. NSA as chief counsel. What’s currently going on with mass warrantless surveillance?

The Jewel case had an argument in the Ninth Circuit Court of Appeals on November 2nd, the day before the elections. Where it sits as a legal matter is that the government claims secrecy, and that because of this secrecy the case should be dismissed; some version of this has been where we’ve been since we launched the case in 2008. We’ll see if the Ninth Circuit buys it; there are several other decisions that have come out of the Ninth Circuit in the last couple of years that make us think they will reject the government’s position. But that really just gets us to the starting gate of the case, so we’ll have further to go. So that’s where that is, and we’re waiting to see what the panel is going to do.

In the overall fight about NSA surveillance [...], the bigger thing is that the NSA has abandoned two of the three big programs that it had that the [EFF] sued over. One of them is the mass collection of telephone records, which they didn’t really abandon but Congress made them stop. They do something else that honestly doesn’t seem to collect fewer records, so there’s still an ongoing fight about that.

The underlying authority for that mass collection actually got extended to March 2020, and then it was never renewed, so the underlying legal authority for the mass telephone collection has expired. We don’t think that has limited them too much, because of the way these programs work: if they launched an investigation while they still had the authority, they get to keep pursuing it; they usually just cannot start a new one. Because it’s all secret, we don’t know what loopholes are there, but on the surface, that’s what it’s supposed to look like.

The other one is the mass metadata collection, which the [NSA] stopped a long time ago actually, because it didn’t work — and Congress was breathing down their necks about the fact that it didn’t work so they stopped that one.

But the tapping into the Internet backbone, which to me is really the bigger of the programs in terms of our security and the risk of mischief, is still going on as far as we can tell. They’ve had to limit it and they’ve had to slowly narrow it, but the core of the program, where they’re sitting on the Internet backbone watching all of the traffic that goes by with a secret list to pull off what they want, is still going on.

Question 5: If people are interested in preserving their digital rights, what can people do?

The first thing I would say is that while there are some individual things people can do, if you reduce this to a problem about your individual choices, you’re going to fail. You’re going to be overwhelmed. We have to stand up for policy choices and legal choices that make these tools available to us. Yes, people can use things like Signal and Tor (I’m on the board of directors for the Tor Project), they can use DuckDuckGo rather than Google for more private searches, and there is a suite of tools you can use, such as Mastodon rather than Facebook for your social media.

There’s a suite of alternative tools that people can use, but they’re all very small and weak compared to how big they ought to be, and I think we have to stand up for legal and policy solutions that make these tools more widely available to us. Supporting end-to-end encryption against these kinds of attacks is one of the big things we do and a place where we put a lot of our effort. But also, more generally: how do we move away from the surveillance business model as well as the surveillance state? Both of them are growing right now, and both require more than technical solutions. Technical solutions are important, but we also have to have policy and legal solutions; tech can’t do this by itself.

People can support organizations like us that do this work — there’s a ton of them and it’s a movement. If EFF is the one you want, we have cool swag — but there are organizations all around the world, big and small that are working in the digital rights movement — it’s not hard to find them, we have a bunch of them in our Electronic Frontier Alliance. In addition to supporting the tools by using the tools and building the tools, you need to support the laws and policies that protect the tools.
