EFF's Deeplinks Blog: Noteworthy news from around the internet

In the Internet Age, Copyright Law Does Far More Than Antitrust to Shape Competition

7 hours 1 min ago

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

There has been a notable, and long overdue, flurry of antitrust actions targeting Big Tech, launched by users, entrepreneurs, and governments alike. And in the US and abroad, policymakers are working to revamp our antitrust laws so they can be more effective at promoting user choice.

These are positive developments, but this renewed focus on antitrust risks losing sight of another powerful legal lever: copyright. Because there’s copyrighted software in every digital device and online service we use, and because the internet is essentially a giant machine for copying digital data, copyright law is a major force that shapes technology and how we use it. That gives copyright law an enormous role in enabling or impeding competition.

The Digital Millennium Copyright Act (DMCA) is a case in point. It contains two main sections that have been controversial since they went into effect in 2000. The "anti-circumvention" provisions (sections 1201 et seq. of the Copyright Act) bar circumvention of access controls and technical protection measures. The "safe harbor" provisions (section 512) protect service providers who meet certain conditions from monetary damages for the infringing activities of their users and other third parties on the net.

Congress ostensibly passed Section 1201 to discourage would-be infringers from defeating DRM and other access controls and copy restrictions on creative works. In practice, it’s done little to deter infringement – after all, large-scale infringement already invites massive legal penalties. Instead, Section 1201 has been used to block competition and innovation in everything from printer cartridges to garage door openers, videogame console accessories, and computer maintenance services. It’s been used to threaten hobbyists who wanted to make their devices and games work better. And the problem only gets worse as software shows up in more and more places, from phones to cars to refrigerators to farm equipment. If that software is locked up behind DRM, interoperating with it so you can offer add-on services may require circumvention. As a result, manufacturers get complete control over their products, long after they are purchased, and can even shut down secondary markets (as Lexmark did for printer ink, and Microsoft tried to do for Xbox memory cards).

On the other hand, Section 512’s “safe harbors” are essential to internet innovation, because they protect service providers from monetary liability based on their users’ infringing activities. To receive these protections, service providers must comply with the conditions set forth in Section 512, including “notice and takedown” procedures that give copyright holders a quick and easy way to disable access to allegedly infringing content. Without these protections, the risk of potential copyright liability would prevent many online intermediaries—from platforms to small community websites to newspapers and ISPs—from hosting and transmitting user-generated content. Without the DMCA, much of Big Tech wouldn’t exist today – but it is equally true that if we took it away now, new competitors would never emerge to challenge today’s giants. Instead, the largest tech companies would strike lucrative deals with major entertainment companies and other large copyright holders, and everyone else who hosted or transmitted third-party content would just have to shoulder the risk of massive and unpredictable financial penalties—a risk that would deter investment.

There is a final legal wrinkle: filtering mandates. The DMCA’s hair-trigger takedown process did not satisfy many rightsholders, so large platforms, particularly Google, also adopted filtering mechanisms and other automated processes to take down content automatically, or prevent it from being uploaded in the first place. In the EU, those mechanisms are becoming mandatory, thanks to a new copyright law that conditions DMCA-like safe harbors on preventing users from uploading infringing content. Its proponents insisted that filters aren't required, but in practice that’s the only way service providers will be able to comply. That’s created a problem in the EU – as the Advocate General of the EU Court of Justice acknowledged last year, automated blocking necessarily interferes with the human right to free expression.

But filtering mandates create yet another problem: they are expensive. Google has famously spent more than $100 million on developing its Content ID service – a cost few others could bear. If the price of hosting or transmitting content is building and maintaining a copyright filter, investors will find better ways to spend their money, and the current tech giants will stay comfortably entrenched.
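Filtering is also harder than it sounds. A toy filter that blocks only exact copies is trivial to build but useless in practice; the expense lies in matching transformed copies. The sketch below is purely illustrative (Content ID's actual matching uses proprietary perceptual fingerprinting, nothing like this) and only shows why the naive approach fails:

```python
import hashlib

# Toy upload filter: exact-match blocking against known hashes of
# allegedly infringing files. This is NOT how Content ID works; real
# systems must match re-encoded, cropped, or pitch-shifted copies,
# which is where the engineering cost lies.
BLOCKLIST = {hashlib.sha256(b"infringing-clip-bytes").hexdigest()}

def allow_upload(data: bytes) -> bool:
    """Reject an upload whose exact bytes match a blocklisted work."""
    return hashlib.sha256(data).hexdigest() not in BLOCKLIST
```

Note that changing a single byte of the file defeats exact matching entirely, which is why production filters need far more expensive similarity matching, and why a filtering mandate effectively prices out smaller hosts.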

If we want to create space for New Tech to challenge Big Tech, antitrust law can’t be the only solution. We need balanced copyright policies as well, in the U.S. and around the world. That’s why we fought to stop the EU’s mandate and continue to fight to address the inevitable harms of implementation. It’s why we are working hard to stop the current push to mandate filters in the U.S. as well. We also need the courts to do their part. To that end, EFF just this month asked a federal appeals court to block enforcement of the copyright rules in Section 1201 that violate the First Amendment and criminalize speech about technology. We have also filed amicus briefs in numerous cases where companies are using copyright to shut out competition. And we’ll keep fighting, in courts, legislatures, agencies, and the public sphere, to make sure copyright serves innovation rather than thwarting it.

Fact-Checking, COVID-19 Misinformation, and the British Medical Journal

10 hours 47 min ago

Throughout the COVID-19 pandemic, authoritative research and publications have been critical in gaining better knowledge of the virus and how to combat it. However, unlike previous pandemics, this one has been further exacerbated by a massive wave of misinformation and disinformation spreading across traditional and online social media.

The increasing volume of misinformation and urgent calls for better moderation have made processes like fact-checking—the practice that aims to assess the accuracy of reporting—integral to the way social media companies deal with the dissemination of content. But a valid question persists: who should check facts? This is particularly pertinent when one considers how such checks can shape perceptions, encourage biases, and undermine longstanding, authoritative voices. Social media fact-checks currently come in different shapes and sizes; for instance, Facebook outsources the role to third-party organizations that label misinformation, while Twitter’s internal practices determine which posts will be flagged as misleading, disputed, or unverified.

That Facebook relies on external fact-checkers is not in and of itself a problem – there is something appealing about Facebook relying on outside experts and not being the sole arbiter of truth. But Facebook vests a lot of authority in its fact-checkers and then mostly steps out of the way of any disputes that may arise around their decisions. This raises concerns about Facebook fulfilling its obligation to provide its users with adequate notice and appeals procedures when their content is moderated by its fact-checkers.

According to Facebook, its fact-checkers may assign one of four labels to a post: “False,” “Partly False,” “Altered,” or “Missing Context.” The label is accompanied by a link to the fact-checker and a more detailed explanation of that decision. Each label triggers a different action from Facebook. Content rated either “False” or “Altered” is subject to a dramatic reduction in distribution and gets the strongest warning labels. Content rated “Partly False” also gets reduced distribution, but to a lesser degree than “False” or “Altered.” Content rated “Missing Context” is not typically subject to distribution reduction; rather, Facebook surfaces more information from its fact-checking partners. But under its current temporary policy, Facebook will reduce distribution of posts about COVID-19 or vaccines marked as “Missing Context” by its fact-checkers.

As a result, these fact-checkers exert significant control over many users' posts and how they may be shared.

A recent incident demonstrates some of the problems with this system.

In November 2021, the British Medical Journal (BMJ) published a story about a whistleblower’s allegations of poor practices at three clinical trial sites run by Ventavia, one of the companies contracted by Pfizer to carry out its COVID-19 vaccine trials. After publication, BMJ’s readers began reporting a variety of problems, including being unable to share the article and being prompted by Facebook that people who repeatedly share “false information” might have their posts removed from Facebook’s News Feed.

BMJ’s article was fact-checked by Lead Stories, one of the ten fact-checking companies contracted by Facebook in the United States. After BMJ contacted Lead Stories to inquire about the flagging and removal of the post, the company maintained that the “Missing Context” label it had assigned the BMJ article was valid. In response to this, BMJ wrote an open letter to Mark Zuckerberg about Lead Stories’ fact-check, requesting that Facebook allow its readers to share the article undisturbed. Instead of hearing from Facebook, however, BMJ received a response to its open letter from Lead Stories.

Turns out, Facebook outsources not just fact-checking but also communication. According to Facebook, “publishers may reach out directly to third-party fact-checking organisations if they have corrected the rated content or they believe the fact-checker’s rating is inaccurate.” Facebook then goes on to note that “these appeals take place independently of Facebook.” Facebook apparently has no role at all once one of its fact-checkers labels a post.

This was the first mistake. Although Facebook may properly outsource its fact-checking, it’s not acceptable to outsource its appeals process or the responsibility for follow-up communications. When Facebook vests fact-checkers with the power to label its users' posts, Facebook remains responsible for those actions and their effects on its users' speech. Facebook cannot merely step aside and force its users to debate the fact-checkers. Facebook must provide, maintain, and administer its own appeals process.

But more about this in a while; now, back to the story:

According to Lead Stories’ response, the reasons for the “Missing Context” label boiled down to two points: first, that the headline and other substantive parts of the publication overstated the jeopardy and unfairly discredited the data collected from the Pfizer trials; and second, that the whistleblower’s credibility was in doubt, since in some other instances he had appeared not to express unreserved support for COVID vaccines on social media. Lead Stories claims it was further influenced by the fact that the article was being widely shared as part of a larger campaign to discredit vaccines and their efficacy.

What happens next is interesting. The “appeals” process, as it were, played out in public. Lead Stories responded to BMJ’s open letter in a series of articles published on its site. And Lead Stories further used Twitter to defend its decision and criticize both BMJ and the investigative journalist who wrote the article.

What does this all tell us about Facebook’s fact-checking and its implications for the restriction of legitimate, timely speech and expression on the platform? It tells us that users with legitimate questions about being fact-checked will not get much help from Facebook itself, even if, like BMJ, they are a well-established and well-regarded scholarly journal.

It is unacceptable that users who feel ill-served by Facebook must navigate a whole new and complex system run by a party they never chose to deal with. Indeed, since 2019, Facebook has endorsed the Santa Clara Principles, which, among other things, require companies to ensure a clear and easily accessible appeals process. This means that “users should be able to sufficiently access support channels that provide information about the actioning decision and available appeals processes once the initial actioning decision is made.” Does Lead Stories offer such an appeals process? Has it signed on to the Santa Clara Principles? Does Facebook require its outside fact-checkers to offer robust notice and appeals processes? Has Facebook even encouraged them to?

Given the current state of misinformation, there is really no question that fact-checking can help navigate the often-overwhelming world of content moderation. At the same time, fact-checking should not mean that users must be exposed to a whole new ecosystem, consisting of new actors, with new processes and new rules. Facebook and other technology companies cannot encourage processes that detach the checking of facts from the overall content moderation process. Instead, they must take on the task of creating systems that users can trust and depend on. Unfortunately, the current system created by Facebook fails to achieve that.

Copyright Shouldn’t Stand in the Way of Your Right to Repair

Tue, 01/18/2022 - 15:11

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

If you bought it, you own it and you can do what you want with it. That should be the end of the story—whether we’re talking about a car, a tractor, a smartphone, a computer, or really anything you buy.

Yet product manufacturers have chipped away for years at the very idea of ownership, using the growing presence of software on devices to make nonsense arguments about why your tinkering with the things you own violates their copyright. It’s gotten so bad that there’s a booming market for 40-year-old tractors that don’t rely on software. We’ve worked for years with advocates at the Repair Coalition, iFixit, U.S. PIRG, and countless others to get lawmakers to make it crystal clear that people have the right to tinker with their own stuff.

It’s working. The wind is at our backs right now. In just the past two years, the right to repair has won at the ballot box in Massachusetts, received a supportive directive from the Biden Administration, and made some gains at the Library of Congress to expand repair permissions.

Those wins have now built a lot of momentum for taking this fight to statehouses like never before. Advocates have gotten lawmakers in ten states to commit to or introduce bills affirming the right to repair. Some of these bills are general right-to-repair bills, while others focus on specific products such as cars or agricultural equipment. These efforts reach all corners of the country—from Massachusetts to Hawaii, from Florida to Washington. And it’s only January. As more states reach their deadlines to introduce new bills, EFF will be working to support those efforts and get our members involved in as many states as we can. Stay tuned for ways you can help at the state level throughout the year.

Change isn’t only coming in the form of possible legislation; pressure from consumers and activists has moved the needle in other ways. Even companies that have historically been the strongest opponents of right-to-repair legislation have made changes that acknowledge how important it is to their customers. Shareholder activism has changed policy at Microsoft to be friendlier to the right to repair. Apple, which has been hugely critical of right-to-repair legislation in the past, announced a “Self Service Repair” program that makes genuine Apple parts and tools for a handful of products available for do-it-yourself repairs. We’ll be watching to make sure these companies live up to their promises.

At the heart of the matter, the right to repair your own things is pure common sense. Copyright shouldn’t dictate where you can take your cracked smartphone for repairs. It shouldn’t stop a mechanic or a medic from accessing a manual they need to fix vital equipment. It should never force a farmer to put time-sensitive work on hold while waiting for an authorized repair provider. Copyright has been used for too long to chip away at the very idea of ownership. It’s time for state policymakers to join the growing number of people who know that makes no sense at all.

Podcast Episode: How Private is Your Bank Account?

Tue, 01/18/2022 - 03:41
Podcast Episode 108

Your friends, your medical concerns, your political ideology—financial transactions tell the story of your life in intimate detail. But U.S. law has failed to protect this sensitive data from prying eyes. Join EFF’s Cindy Cohn and Danny O’Brien as they talk to Marta Belcher, one of the leading lawyers working on issues of financial censorship and financial privacy, about why we need better protections for our financial lives—and the important role courts must play in getting things right.

Click below to listen to the episode now, or choose your podcast player.


When the Supreme Court considered the issue of financial privacy under the Bank Secrecy Act in the 1970s, we were living in a really different time. Online shopping, Apple Pay, and tools like PayPal and Venmo didn’t exist yet. But even as our financial lives have become increasingly complex, digital, and detailed, the Supreme Court hasn’t revisited its approach to our rights. Instead, it has allowed this information to be handed over by default to the government, ensnaring hundreds of millions of non-suspect people instead of carefully targeting a few suspects. Marta thinks it’s time to revisit this situation.

Marta offers a deep dive into financial surveillance and censorship. In this episode, you’ll learn about: 

  • The concept of the third party doctrine, a court-created idea that law enforcement doesn’t need to get a warrant to access metadata shared with third parties (such as companies that manage communications and banking services);
  • How financial surveillance can have a chilling effect on activist communities, including pro-democracy activists fighting against authoritarian regimes in Hong Kong and elsewhere;
  • How, under the Bank Secrecy Act, banks share sensitive details about their customers with the government by default, without any request from law enforcement to prompt it;
  • Why the Bank Secrecy Act as it’s currently interpreted violates the Fourth Amendment; 
  • The potential role of blockchain technologies to import some of the privacy-protective features of cash into the digital world;
  • How one recent case missed an opportunity to better protect the data of cryptocurrency users;
  • How financial surveillance is a precursor to financial censorship, in which banking services are restricted for people who haven’t violated the law. 

Belcher serves as general counsel of Protocol Labs, chair of the Filecoin Foundation, and special counsel to the Electronic Frontier Foundation. She was previously an attorney focusing on blockchain and emerging technologies at Ropes & Gray in San Francisco. She has spoken about blockchain law around the world, including presenting at the World Economic Forum, testifying before the New York State Senate, speaking in the European Parliament, and testifying before the United States Congress. You can find Marta on Twitter @MartaBelcher.

If you have any feedback on this episode, please email us. You can find a copy of this episode on the Internet Archive.

Below, you’ll find legal resources—including links to important cases, books, and briefs discussed in the podcast—as well as a full transcript of the audio.


Financial Surveillance:

Payment Processors and Censorship:


Third-Party Doctrine:



Marta: When you're going about your life and you're engaging in financial transactions, all of that data is really exposed. Our financial transactions really paint an intimate portrait of our lives. They expose our religious beliefs, our family status, our medical history, our location. And these are things that I think are very sensitive, and that should have full Fourth Amendment protection. These are things that ought to be private.

Cindy: That’s Marta Belcher. One of the lawyers pioneering privacy and user freedom in the emerging world of blockchain technologies. She’s here to explain why financial privacy is vital for everyone and how the digitization of our financial lives has begun to erode that privacy and with it the protections that activists and organizers and all the rest of us need all around the world. 

Danny: Marta will also explain the ins and outs of important legal cases that have undermined our financial privacy. 

Cindy:  I'm Cindy Cohn. And I'm the Executive Director of the Electronic Frontier Foundation.

Danny: And I'm Danny O'Brien. Welcome to How to Fix the Internet, a podcast of the Electronic Frontier Foundation.

Cindy: So we are delighted to have Marta with us today to talk about financial surveillance. Marta and EFF go back a long way. You were an intern with us way back when you were in law school, but since then you've blazed a trail that's been just so fun for us to watch. You recently testified before Congress on financial privacy and the practical uses of cryptocurrency.

And before that you testified before the New York legislature. 

So we're official about it: Marta is the general counsel of Protocol Labs, and she serves as the chair of the board of the Filecoin Foundation.

Danny: Where I should add you recently hired me. I don't know whether that's still in your good books there, Cindy.

Cindy: We're working our way to forgiving Marta for that. But all along the way, you've been one of EFF's official advisors in this space as our special counsel, and at each step of the way we've relied on your wisdom, and honestly, on the feeling that you were just living a little further into the future than the rest of us. So Marta, thank you for coming on the podcast.

Marta: Oh, my gosh. Thank you so much for having me. I am so excited to be here and to get to talk to some of my favorite people on the planet.

Cindy: Oh, it's just a love fest all around. Let's talk about financial surveillance. What kind of information about how we spend our money and conduct our financial business is out there, and what's happening to it?

Marta: In the financial system, financial transactions that go through certain intermediaries, like banks, are often turned over to the government by default. So when there’s been a financial transaction over a certain amount, for example, financial institutions will immediately turn that information over to the government, regardless of whether the government has specifically requested it and without the government having to get a warrant for it. There are also requirements that businesses receiving even cash, so not just electronic payments, over a certain amount must by default file a form with the United States government saying, in effect, “I received a transaction in cash over X amount, and here is the identity of the person who handed me that cash.”
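The default-reporting regime Marta describes can be sketched as a simple rule: at or above a threshold (the commonly cited figure, echoed later in this episode, is $10,000), a report is filed automatically, with no law-enforcement request involved. The field names and structure here are illustrative, not any real bank's system:

```python
# Sketch of default, warrantless reporting as described above: any
# transaction at or above the threshold generates a government report
# as a side effect, with no request from law enforcement.
# Threshold and field names are illustrative.
REPORTING_THRESHOLD_USD = 10_000

def report_required(amount_usd: float) -> bool:
    """Does this transaction trigger an automatic report?"""
    return amount_usd >= REPORTING_THRESHOLD_USD

def process_transaction(customer_id: str, amount_usd: float,
                        filed_reports: list) -> None:
    """Complete the transaction; file a report by default if required.

    Note what is absent: no warrant, no subpoena, no suspicion of any
    crime. The report is generated purely by the amount.
    """
    if report_required(amount_usd):
        filed_reports.append({"customer": customer_id, "amount": amount_usd})
```

The point of the sketch is the inversion Cindy describes next: the government receives the data first, by default, rather than having to ask for it.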

Cindy: So when you talk about financial transactions, can you make that real for us? What are the kinds of things that the US government is getting access to?  

Marta: The thing that we're talking about here is people's financial transactions, which include, for example, transactions that they're doing via their bank. It includes transactions done, for example, via cryptocurrency. And that's things like making purchases, sending money back and forth, and buying things, particularly if you're buying them electronically, but also if you're buying them with cash. So there's a wide range of financial transactions that are subject to government surveillance.

Cindy: How did the United States get to this place where we treat financial transactions as if they're not vitally private to people?

Marta: This is really one of the things that I find so frustrating about working on policy around financial surveillance is that for whatever reason, we seem to have gotten to a place where everyone accepts that financial surveillance in the banking system is totally normal.

Cindy: What should people know about the bank secrecy act and how it plays into this whole story? 

Marta: I think the important thing for folks to know about the Bank Secrecy Act is that it effectively imposes reporting requirements on banks, so that certain financial transactions are turned over to the government without a warrant, en masse, by default. So the issue here is that instead of law enforcement having to get a warrant in order to obtain particular financial information, not only do they have the ability to just go to financial institutions and get that information, but it actually gets turned over to them by default.

Cindy: So a warrant means you have to go in one by one and get information about a particular crime or a particular person. A subpoena lets you go a little more broadly, without probable cause or a judge's sign-off. And what I'm hearing is that the Bank Secrecy Act actually flips that on its head: it starts out with the government getting the information, rather than having to go through any hoops at all. Is that a fair summary?

Marta: Exactly. Exactly. And that is exactly why I think it's pretty shocking. In my view, this clearly violates the Fourth Amendment, and I really find it shocking that our society today sees this as totally normal and acceptable: that somehow financial surveillance is different than other surveillance.

Cindy: Part of the problem here is the Supreme court precedent. There is a decision from decades ago about this. Can you talk a little about that?

Marta: So there was a challenge to the Bank Secrecy Act in the 1970s. And unfortunately, the Supreme Court at the time held that because of a thing called the third-party doctrine, as it existed at the time, the Bank Secrecy Act requirements, that, for example, banks turn over information about their customers by default, without a warrant, didn't violate the Fourth Amendment. And that is the result of the 1976 Supreme Court case, US v. Miller.

Cindy: If you look at the Miller case, and some similar cases, they really existed at a simpler time. Much of people's banking and financial activity wasn't available to their bank at all, because things happened in cash. I feel the same way about the financial side as you do about people's email or other communications: when things get digitized, there's much more information available. And so the default rule, which might've been okay in a simpler time, makes less and less sense in a more complicated one. Can you talk a little bit about the real-world consequences of all this surveillance?

Marta: You know, you can imagine why a court in 1976 would say, okay, if you're turning over, details about people's financial transactions from, you know, from a bank to the government, the amount that you can learn about a person is pretty limited in 1976. Right. And of course, if you fast forward to today, you know, people's financial transactions really paint a detailed picture of their lives, right? It paints a picture of who they are interacting with, who they're associating with, what their religion is, what their location is. 

Danny: How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in the public understanding of science, enriching people's lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

Cindy: It seems like part of this problem originates in the courts and their interpretation of the Bank Secrecy Act. So let's drill down a little bit, lawyer to lawyer, with lots of non-lawyers listening. The core thing here is something we call the third-party doctrine, which provides that your Fourth Amendment rights end when a third party has your data. This is something EFF has worked to end for a very long time. And for those of you who have listened for a while, it was the topic of episode 3 of How to Fix the Internet, with our friend Jumana Musa. So tell me how the third-party doctrine applies in the context of financial records.

Marta: Yeah, absolutely. So, going back even further, the Fourth Amendment really requires that law enforcement obtain a warrant supported by probable cause before conducting a search or seizure. So why is it that in the financial system, law enforcement can engage in mass surveillance of bank customers without a warrant? The answer is the third-party doctrine, which is the idea that people don't have a reasonable expectation of privacy in the data that they share with a third party, like a bank. And that is what the Supreme Court was relying on in 1976 in US v. Miller when it held that the Bank Secrecy Act didn't violate the Fourth Amendment.

Cindy: The Miller case is interesting because it was about the cops trying to figure out whether somebody was illegally distilling whiskey. And if you look at the case, they actually probably could have gotten a warrant; they knew a lot about this guy. The thing that we're arguing for is the difference between the cops having free-range access to people's financial information and the cops having to get a probable-cause warrant. In most of these situations, if you drill down, the police could have easily made their case to a judge, and just didn't want to. A lot of this stuff around financial privacy really comes down to making the cops' jobs as easy as possible. But the entire point of civil liberties is to make the cops' job harder, so that we have a zone of privacy that we can live in.

Are we safer because they have to automatically report any transaction of $10,000 or more, when we know that the vast majority of those are going to be perfectly innocent? Or would we be safer if we made the cops actually do the work of establishing probable cause and identifying suspects through all the other ways in which you can investigate? And of course, there's an important issue about all the other innocent people who are sideswiped along the way.

Marta: Yeah, I think that's really well put. And the thing that I would expand on is that, fundamentally, if the thing we are optimizing for is solving every single crime, we could live in a society where people have cameras following them around at all times, even in their homes, and everything is recorded, right? So there's a spectrum. And the way the Constitution balances civil liberties against the interest in enforcing the laws is the Fourth Amendment, which says that if there is probable cause, law enforcement can go show it and get a warrant in order to obtain information. So it's really the difference between: does law enforcement need to get that warrant, which is what the Fourth Amendment requires, or do they have access to that information by default, even when there is no probable cause? Do they have the ability to look at people's transactions even when there's no reason to believe those transactions are in any way associated with crime? That's really, fundamentally, the issue from a civil liberties perspective.

Cindy: I guess that leads to the obvious question. Do you think this is all constitutional?

Marta: I absolutely do not think that the Bank Secrecy Act, as it's applied today, is constitutional. And, you know, unfortunately the Supreme Court disagreed with me on that, but that was back in 1976. I really do think the Court would come to a different decision if it faced that challenge again, for a variety of reasons: partly the extent to which the surveillance under the Bank Secrecy Act has expanded, but more importantly, and as a testament to EFF's work in the decades since that Miller decision, the Supreme Court has issued strong pro-privacy opinions in multiple cases. They've been chipping away at the third-party doctrine in the context of the digital world. For example, the Supreme Court held in Carpenter v. US that law enforcement must have a warrant in order to obtain location information from a cell phone company. And I think that goes to show that the information that could be gleaned from bank data in the 1970s is just a complete world away from the picture of a person's life that can be painted with access to digital financial transactions today.

Danny: There seems to be a sort of global spread in this assumption that financial data is fair game for any country. Are there any sort of examples that really bring home just what it means to have the local state be able to peer directly into your day-to-day transactions?

Marta: Last year, during the Hong Kong protests, there were these really powerful pictures showing long lines at the subway stations, as pro-democracy protesters waited to purchase their tickets with cash because they didn't want their electronic purchases to place them at the scene of the protest. For me, that really underscores how important the ability to engage in anonymous transactions is for civil liberties, and it underscores that a cashless society, or a society where all transactions are tracked, is really a surveillance society.

Danny: Do you think that cryptocurrency really addresses some of these privacy issues?

Marta: I think the most important thing about cryptocurrency is that it takes the civil-liberties-enhancing benefits of cash and imports them into the online world. That, for me, is the most important thing about the technology. And because of that ability to transact anonymously, cryptocurrency has become a target of regulators and lawmakers trying to expand this surveillance to the cryptocurrency space. But for me, the fact that cryptocurrencies can enable anonymous transactions is a feature, not a bug.

Cindy: Now, we know cryptocurrency transactions can enable anonymity, but so far, anyway, that's not really what we're seeing. Can you talk a little bit about how the law has interacted with it so far? Specifically, I'm thinking about the Gratkowski case.

Marta: So first of all, it's important to say that not all cryptocurrency transactions are anonymous; many of them are actually pseudonymous. The Bitcoin blockchain, for example, is a publicly viewable ledger of all transactions. So you can actually go see that user 123 sent one Bitcoin to user 456, right? And if you are able to figure out that Marta is user 123 and Cindy is user 456, then anyone in the world can see that I have sent one Bitcoin to Cindy. What happens is you have these choke points, such as cryptocurrency exchanges, which do identity checks. In the Gratkowski case, law enforcement had basically done exactly that: they had gone to an exchange and said, we want to know who user 123 is, and based on that, they were able to arrest this person. Now, as we've been discussing, they could have gone and gotten a warrant, but instead they just asked the exchange, and the exchange just handed over that information. So the defendant, Gratkowski, challenged that under the Fourth Amendment, and it went up to the Fifth Circuit Court of Appeals. Unfortunately, the Fifth Circuit held that, because of the third-party doctrine and US v. Miller, law enforcement did not need a warrant to get that information from the exchange. I think that was the wrong decision, and that the court really missed an opportunity to follow the Supreme Court's lead in recognizing that there are stronger privacy protections for digital data held by third parties.
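Marta's point about pseudonymity can be sketched in a few lines of Python. Everything here is invented for illustration (the addresses, names, and amounts are not real data): the ledger is public, and a single answer from an exchange's identity records is enough to collapse the pseudonyms.

```python
# Toy model of a pseudonymous public ledger, like the Bitcoin blockchain:
# anyone in the world can read these transfers between addresses.
public_ledger = [
    {"from": "addr_123", "to": "addr_456", "amount_btc": 1.0},
    {"from": "addr_456", "to": "addr_789", "amount_btc": 0.5},
]

# An exchange that performed identity checks holds this mapping privately.
# Once law enforcement obtains it, pseudonymity evaporates.
exchange_kyc_records = {
    "addr_123": "Marta",
    "addr_456": "Cindy",
}

def deanonymize(ledger, kyc):
    """Replace pseudonymous addresses with real names wherever known."""
    return [
        {
            "from": kyc.get(tx["from"], tx["from"]),
            "to": kyc.get(tx["to"], tx["to"]),
            "amount_btc": tx["amount_btc"],
        }
        for tx in ledger
    ]

for tx in deanonymize(public_ledger, exchange_kyc_records):
    print(f'{tx["from"]} -> {tx["to"]}: {tx["amount_btc"]} BTC')
```

The point of the sketch is that no cryptography is broken anywhere: the ledger was always public, and the only missing piece was the address-to-identity mapping held at the exchange choke point.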

Cindy: So let's go to my favorite part, which is: how do we fix all of this? I think you've pointed to some ways, but let's talk about them. What does the world look like if we get this right?

Marta: I think there are a couple of things. The big one is that there's really no reason we need to take the financial surveillance of the traditional banking system and extend it out to cryptocurrency. We could really utilize this technology that enables people to make anonymous transactions, and use it to enhance civil liberties. The thing that I hope will happen is that there will be a Fourth Amendment challenge to the Bank Secrecy Act as it is currently applied, and that the Supreme Court would come out differently today and would basically decide that if the government wants to get this detailed financial information about bank customers, they do have to go get a warrant in order to do it.

Cindy: So, in our future world, your transactions are your own. You get to buy what you want, whether that's buying a ticket to attend a protest, opening a bank account to start your opposition work against a dictator, or simply buying something without the government looking over your shoulder. You're free to do all of that. And if the government thinks you're doing something wrong, they have to go to a judge and get a warrant to get access to your information.

Marta: Right now, one of the other issues in this space, beyond financial surveillance, is the amount of censorship by financial intermediaries. We've seen Visa, MasterCard, PayPal, and other financial intermediaries repeatedly cut off access to financial services for all sorts of legal websites and legal speech, merely on their own moral whims. Adult booksellers, social networks, and whistleblower websites have all suffered from this kind of financial censorship.

Cindy: I think the other piece of this is that it's not only the companies engaging in somewhat moralistic decisions about who gets to do transactions and who doesn't. We have this thing that we call jawboning, right, which is a newly emerging term for politicians leaning on platforms to cut some people off, or limit what people can do, because the politician wants to make political points. And this, I would say, tends to come up a lot around election time. So the other thing we get in this world is not only that corporations don't feel a push to be moralistic about who gets to do financial transactions, but also that they aren't vulnerable to pressure from governments, US and otherwise, to do it for them: outsourcing the censorship that a politician can't do directly to a private company.

Marta: I think it's a huge vulnerability that the way electronic payments work really makes these payment systems a choke point for controlling online content. And we have seen, as you said, instances of government officials actually pushing for financial services to cut off particular websites and particular speech. Luckily, thanks in part to EFF submitting an amicus brief in at least one of those cases, the Backpage v. Dart Seventh Circuit case, there have been findings that doing so would violate the First Amendment.

Danny: So I started out thinking of this solution, this future, as being kind of the same as what we have now, but in the world of cash, right? Cash is reasonably protected. But it seems like part of the solution would actually be broader than that. It would give us more choice and more alternatives in a digital world: you would not just be dealing with the limitations of cash, you would also be able to escape the limitations of credit card companies, and you would be able to pick and choose who to transact with based on what you want to do, rather than what the credit card companies want you to do. Is that right?

Marta: I think there's a really interesting question for advocacy organizations in the financial space: are we going to draw the line at saying, well, whatever restrictions there are on cash, that's okay, and we can extend those to other types of technologies as well? Or are we going to take a stronger stance and say, actually, we think all of these types of reporting requirements, including those that apply to cash, are violations of the Fourth Amendment, and they should not be extended to new technologies?

Cindy: So Marta, what values are we trying to preserve and support in this new world where we get it all right?

Marta: Fundamentally, this is about civil liberties, and about people's ability to go about their lives without government surveillance. We may not think about money when we think about, for example, exercising our First Amendment rights and engaging in politics. But in reality, all of these things do involve financial transactions, whether that's political expenses or things that reveal your religion, your sexual associations, or who you associate with. All of these things can be revealed by your financial transactions. And it's very important to be able to live in a world where you can engage in those transactions privately, without them being surveilled by the government by default.

Cindy: Thank you so much, Marta, for taking this time and taking us through this tour of financial privacy. It's a tremendously important issue, and sometimes it gets buried underneath a lot of the hype around cryptocurrency.

Where can people find you? I understand that you have your own podcast and that our own Rainey Reitman was recently a guest. 

Marta: That’s right, the Filecoin Foundation has a podcast. It’s called The Future Rules, and not only has Rainey been a guest, Danny has been a guest too. You can definitely listen to that podcast.

Cindy: Wonderful, thanks again for taking the time to talk to us. 

Cindy: Wow. That was so fun and so interesting. Marta really opened my eyes even a bit more about how critical financial privacy is to real privacy. And I was especially struck by the image of the protesters in Hong Kong buying their tickets with cash, because, as we all know, you can be tracked based on what you buy, and at that particular time, being at a protest in Hong Kong was tremendously dangerous.

Danny: And of course we have this background in fighting communications surveillance, and all the same arguments apply, right? Tracking money lets you see everything about someone, and everyone uses money, in the same way that everyone has to communicate. It's not just criminals who use money. Take that reporting of every transaction over $10,000: when the police talk about it, you go, oh yeah, $10,000, that's probably drugs. But of course it's actually houses, cars, monthly transactions. I'm not a rich person, and $10,000 is still something I run into occasionally. And everything else, every credit card transaction, goes to the government and gets stored in databases forever and ever. It's really turned around my thoughts on this, actually.

Cindy: The thing that she really drove home is that this kind of financial surveillance, especially under the Bank Secrecy Act, is mass surveillance, right? It's surveilling everybody first and figuring out what you need second. There are literally hundreds of millions of innocent people caught up in this dragnet for the few people that they want to catch. And I honestly don't know that the case has been made that they couldn't catch those people any other way, most specifically by getting a probable cause warrant, which, as she told us over and over again, is the thing that would switch this around from a situation that's a problem for a lot of people to a reasonable law enforcement strategy.

Danny: It’s this classic problem of mass surveillance, right? You're trying to find a thousand to ten thousand people who are doing a bad thing, and you're looking through millions of people's records in order to do that. The other thing, I think, is that, just as we say about communications surveillance, surveillance leads to censorship; or at least, you can't censor without surveillance. If you can't see what websites people are visiting, then you can't censor them. And it's the same thing here, right? If people have to share all their financial information with third parties, pretty soon those third parties are going to have pressure put on them, or will just decide themselves, that they don't want some kinds of business. Like you said, is it jawboning? Is that the phrase?

Cindy: Yeah, that's the phrase. I only just heard about it.

Danny: So jawboning is where Congresspeople put pressure on credit card providers, like Visa and MasterCard, to throw sex workers, or other people that some people don't like, off the financial system. And you can only do that with this level of surveillance.

Cindy: Thinking about this, cryptocurrency and blockchain are really just technologies. But what's become so clear is that we've got some legal case law that was written, or adopted, in the 1970s, and it's really getting in the way here, in a context where it's not appropriately applied. So this is one where we think about, and we do this a lot, Larry Lessig's four forces: we've got code, we've got law, we've got social norms, and we've got markets. We've got code potentially giving us some really good things, and we need to get the law out of the way.

Danny: Right. And on the norms side, I think we're beginning to rethink just how intrusive all of this surveillance is. That might be a little uphill work, right? Because people are just used to thinking of it this way. But if we're going to build a better future, we have to start putting our own civil liberties first.

Danny: And thanks to Nat Keefe and Reed Mathis of Beat Mower for making the music for this podcast. Additional music is used under a Creative Commons license from CCMixter. You can see the credits and links to the music in our episode notes. Please visit our website, where you’ll find more episodes, learn about these issues, and can donate to become a member of EFF, as well as lots more. Members are the only reason we can do this work. Plus, you can get cool stuff like an EFF hat, an EFF hoodie, or an EFF camera cover for your laptop.

How to Fix the Internet is supported by the Alfred P. Sloan Foundation’s program in public understanding of science and technology. I'm Danny O'Brien.

Cindy: and I'm Cindy Cohn. 


Welcome to the Public Domain, Winnie-the-Pooh

Mon, 01/17/2022 - 11:26

In 2019, for the first time in 20 years, U.S. copyright law allowed formerly copyrighted works to join the public domain. Works in the public domain are no longer under copyright, and anyone can republish or use those works in whatever way they want. The public domain is the default home of all creative endeavors because culture isn’t owned by any single person or corporation—it’s shared.

This year, the public domain opened up to include works from 1926 and a whopping 400,000 sound recordings. Of course, the real fun is that the third Hercule Poirot novel by Agatha Christie, Ernest Hemingway’s The Sun Also Rises, and the original books of Winnie-the-Pooh and Bambi are now free for anyone to use.

In particular, the popular images of Winnie-the-Pooh and Bambi have been dominated by one rightsholder’s vision for a long time: Disney. And while Disney’s versions of those stories remain under copyright, their exclusive hold on two cornerstones of childhood has come to an end. This is a good thing—it lets those stories be reinterpreted and repurposed by people with different takes. We can all decide whether the Disney versions are the actual best ones or were simply the only ones.

Public domain works can be used for such lofty goals. Or they can simply be used for fun, allowing anyone to participate in a worldwide sport of joy. With so many more uses suddenly available to so many more people, we get a flood of works and get to choose which ones we love most. And, of course, we can try our hand at joining in.

Last year, The Great Gatsby was at the center of a flurry of internet jokes when it entered the public domain. Archive of Our Own, the award-winning fanfiction archive, suddenly found itself home to very lightly altered versions of F. Scott Fitzgerald’s famous work. Some replaced the characters in the original with those from other works, putting them in dialogue with each other. One absolute internet genius replaced every use of “Gatsby” with “Gritty,” swapping a memetic capitalist played by Leonardo DiCaprio in a recent film adaptation for a memetic anti-capitalist puppet hockey mascot.

When people compete to top each other for the most creative, weird, or just funny use of a public domain work, we all win.

It’s Copyright Week 2022: Ten Years Later, How Has SOPA/PIPA Shaped Online Copyright Enforcement?

Mon, 01/17/2022 - 11:20

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

Ten years ago, a diverse coalition of internet users, non-profit groups, and internet companies defeated the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA), bills that would have forced internet companies to blacklist and block websites accused of hosting copyright-infringing content. These were bills that would have made censorship very easy, all in the name of copyright enforcement. This collective action showed the world that the few major companies that control film, music, and television can’t dictate internet policy for their own benefit.

We celebrate Copyright Week every year on the anniversary of the internet blackout that finally got the message across: Team Internet will always stand up for itself.

While SOPA and PIPA were ultimately defeated, their spirits live on. They live on in legislation like the CASE Act and the EU Copyright Directive. They live on in the use of copyright filters on major platforms, which exist because the largest entertainment companies insist on them. They live on every time you can’t fix a device you paid for and rightfully own. They live on in the licensing agreements that prevent us from owning digital goods.

We continue to fight for a version of copyright policy that doesn’t seek to control users. That doesn’t serve only a few multibillion-dollar corporations, but rather the millions of people online who are independent artists. That contributes to the growth, not stagnation, of culture.

Each year, we pick five issues in copyright to highlight and advocate a set of principles around. This year’s issues are:

  • Monday: The Public Domain
    The public domain is our cultural commons and a crucial resource for innovation and access to knowledge. Copyright should strive to promote, and not diminish, a robust, accessible public domain.
  • Tuesday: Device and Digital Ownership
    Copyright should not be used to control knowledge, creativity, or the ability to tinker with or repair your own devices. Copyright should encourage more people to share, make, or repair things.
  • Wednesday: Copyright and Competition
    Copyright policy should encourage more people to create and seek to keep barriers to entry low, rather than concentrate power in only a few players.
  • Thursday: Free Expression and Fair Use
    Copyright policy should encourage creativity, not hamper it. Fair use makes it possible for us to comment, criticize, and rework our common culture.
  • Friday: Copyright Enforcement as a Tool of Censorship
    Freedom of expression is a fundamental human right essential to a functioning democracy. Copyright should encourage more speech, not act as a legal cudgel to silence it.

Every day this week, we’ll be sharing links to blog posts and actions on these topics at and at #CopyrightWeek on Twitter.

As we say every year, if you too stand behind these principles, please join us by supporting them, sharing them, and telling your lawmakers you want to see copyright law reflect them.

EFF Asks Appeals Court to Rule DMCA Anti-Circumvention Provisions Violate First Amendment

Thu, 01/13/2022 - 14:55
Lawsuit Filed on Behalf of Computer Scientist and Security Researcher Seeks to Bar Enforcement of Section 1201 Provisions

Washington D.C.—The Electronic Frontier Foundation (EFF) asked a federal appeals court to block enforcement of onerous copyright rules that violate the First Amendment and criminalize certain speech about technology, preventing researchers, tech innovators, filmmakers, educators, and others from creating and sharing their work.

EFF, with co-counsel Wilson Sonsini Goodrich & Rosati, asked the U.S. Court of Appeals for the District of Columbia to reverse a district court decision in Green v. DOJ, a lawsuit we filed in 2016 challenging the anti-circumvention and anti-trafficking provisions of the Digital Millennium Copyright Act (DMCA) on behalf of security researcher Matt Green and technologist Andrew “bunnie” Huang. Both are pursuing projects highly beneficial to the public and perfectly lawful except for DMCA’s anti-speech provisions.

These provisions—contained in Section 1201 of the DMCA—make it unlawful for people to get around the software that restricts access to lawfully-purchased copyrighted material, such as films, songs, and the computer code that controls vehicles, devices, and appliances. This ban applies even where people want to make noninfringing fair uses of the materials they are accessing. The only way to challenge the ban is to go through an arduous, cumbersome process, held every three years, to petition the Library of Congress for an exemption.

While enacted to combat music and movie piracy, Section 1201 has long served to restrict people’s ability to access, use, and even speak out about copyrighted materials—including the software that is increasingly embedded in everyday things. Our rights to tinker with or repair the devices we own are under threat from the law, which makes it a crime to create or share tools that could, for example, allow people to convert their videos so they can play on multiple platforms, or to conduct independent security research to find dangerous flaws in vehicles or medical devices.

Green, a computer security researcher at Johns Hopkins University, works to make Apple messaging and financial transactions systems more secure by uncovering software vulnerabilities, an endeavor that requires finding and exploiting weaknesses in code. Green seeks to publish a book about his work but fears that it could invite criminal charges under Section 1201.

Meanwhile Huang, a prominent computer scientist and inventor, and his company Alphamax LLC, are developing devices for editing digital video streams that would enable people to make innovative uses of their paid video content, such as captioning a presidential debate with a running Twitter comment field or enabling remixes of high-definition video. But using or offering this technology could also run afoul of Section 1201.

Ruling on the government’s motion to dismiss the lawsuit, a federal judge said Green and Huang could proceed with claims that 1201 violated their First Amendment rights to pursue their projects but dismissed the claim that the section was itself unconstitutional. The court also refused to issue an injunction preventing the government from enforcing 1201.

“Section 1201 makes it a federal crime for our clients, and others like them, to exercise their right to free expression by engaging in research, creating software, and publishing their work,” said EFF Senior Staff Attorney Kit Walsh. “This creates a censorship regime under the guise of copyright law that cannot be squared with the First Amendment.”

For the filing:

For more about this case:

Contact: Corynne McSherry, Legal Director; Kit Walsh, Senior Staff Attorney

EFF Threat Lab’s “apkeep” APK Downloader, Now More Capable and Available in More Places

Thu, 01/13/2022 - 14:21

In September, we introduced EFF Threat Lab’s very own APK Downloader, apkeep. It is a tool that allows us to make the job of tracking state-sponsored malware and combatting the stalkerware of abusive partners easier. Since that time, we’ve added some additional functionality that we’d like to share.


In addition to the ability to download Android packages from the Google Play Store and APKPure, we’ve added support for downloading from the free and open source app repository F-Droid. Packages downloaded from F-Droid are checked against the repository maintainers’ signing key, just like in the F-Droid app itself. The package index is also cached, which makes it easy to run multiple subsequent requests for downloads.


You can now download specific versions of apps from either the apk-pure app store, which mirrors the Google Play Store, or from f-droid. To try it, issue the following command to see which versions are available (the app ID here is just an example; substitute the package you want):

apkeep -l -a com.example.app -d apk-pure

Once you’ve picked a desired version, download it by appending the version to the app ID:

apkeep -a com.example.app@1.2.3 -d apk-pure .

Keep in mind not all versions will be retained by these download sources, so only recent versions may be available.

Additional Platform Support

On initial launch, we supported only 6 platforms:

  • GNU/Linux x86_64, i686, aarch64, and armv7
  • Android aarch64 and armv7

We have been quickly building our platform support to bring the current tally to 9:

  • GNU/Linux x86_64, i686, aarch64, and armv7
  • Android x86_64, i686, aarch64 and armv7
  • Windows x86_64

and we plan to continue to build out to more platforms in the future.

Termux Repositories

The Android terminal application Termux now makes it easy to install apkeep. We have added our package to their repository, so that Termux users now only need to issue a simple command to install the latest version:

pkg install apkeep

Future Plans

In addition to continuing to build out to additional platforms, we would also like to add more Android markets to download from, such as the Amazon Appstore. Have any suggestions for features or new platforms you’d like to see supported? Let us know by opening an issue on our GitHub page!

Special Thanks

We would like to thank the F-Droid and Termux communities for their assistance in this build-out, and thank our users for their feedback and support.

San Francisco Police Illegally Used Surveillance Cameras at the George Floyd Protests. The Courts Must Stop Them

Thu, 01/13/2022 - 12:51

By Hope Williams, Nathan Sheard, and Nestor Reyes

The authors are community activists who helped organize and participated in protests against police violence in San Francisco after the murder of George Floyd. A hearing in their lawsuit against the San Francisco Police Department over surveillance of Union Square protests is scheduled for Friday. This article was first published in the San Francisco Standard.

A year and a half ago, the San Francisco Police Department illegally spied on us and thousands of other Bay Area residents as we marched against racist police violence and the murder of George Floyd. Aided by the Electronic Frontier Foundation (EFF) and the ACLU of Northern California, we have taken the SFPD to court.

Our lawsuit defends our right to organize protests against police violence without fear of illegal police surveillance. After the police murdered George Floyd, we coordinated mass actions and legal support and spent our days leading the community in chants, marches and protests demanding an end to policing systems that stalk and kill Black and Brown people with impunity.

Our voice is more important than ever as the mayor and Chris Larsen, the billionaire tech executive funding camera networks across San Francisco, push a false narrative about our lawsuit and the law that the SFPD violated. 

In 2019, the city passed a landmark ordinance that bans the SFPD and other city agencies from using facial recognition and requires them to get approval from the Board of Supervisors for other surveillance technologies. This transparent process sets up guardrails, allows for public input and empowers communities to say “no” to more police surveillance on our streets. 

But the police refuse to play by the rules. EFF uncovered documents showing that the SFPD violated the 2019 law and illegally tapped into a network of more than 300 video cameras in the Union Square area to surveil us and our fellow protesters. Additional documents and testimony in our case revealed that an SFPD officer repeatedly viewed the live camera feed, which directly contradicts the SFPD’s prior statements to the public and the city’s Board of Supervisors that “the feed was not monitored.”

Larsen has also backpedaled. Referencing the network, he previously claimed that “the police can’t monitor it live.” Now, Larsen is advocating for live surveillance and criticizing us for defending our right under city law to be free from unfettered police spying. He even suggests that we are to blame for recent high-profile retail thefts at San Francisco’s luxury stores. 

As Black and Latinx activists, we are outraged—but not surprised—by rich and powerful people supporting illegal police surveillance. They are not the ones targeted by the police and won’t pay the price if the city rolls back hard-won civil rights protections. 

Secret surveillance will not protect the public. What will actually make us safer is to shift funding away from the police and toward housing, healthcare, violence interruption programs and other services necessary for racial justice in the Bay Area. Strong and well-resourced communities are far more likely to be safe than they would be with ever-increasing surveillance.

As members of communities that are already overpoliced and underserved, we know that surveillance is a trigger that sets our most violent and unjust systems in motion. Before the police kill a Black person, deport an immigrant, or imprison a young adult for a crime driven by poverty, chances are the police surveilled them first.

That is why we support democratic control over police spying and oppose the surveillance infrastructure that Larsen is building in our communities. We joined organizations like the Harvey Milk LGBTQ Democratic Club in a successful campaign against Larsen’s plan to fund more than 125 cameras in San Francisco’s Castro neighborhood. And we made the decision to join forces with the EFF and the ACLU to defend our rights in court after we found out the SFPD spied on us and our movement.

Tomorrow, we will be in court to put a stop to the SFPD’s illegal spying and evasion of democratic oversight. We won’t let the police or their rich and powerful supporters intimidate activists into silence or undermine our social movements.

Related Cases: Williams v. San Francisco

Nearly 130 Public Interest Organizations and Experts Urge the United Nations to Include Human Rights Safeguards in Proposed UN Cybercrime Treaty

Thu, 01/13/2022 - 10:35

EFF and Human Rights Watch, along with nearly 130 organizations and academics working in 56 countries, regions, or globally, urged members of the Ad Hoc Committee responsible for drafting a potential United Nations Cybercrime Treaty to ensure human rights protections are embedded in the final product. The first session of the Ad Hoc Committee will begin on January 17th.

The proposed treaty will likely deal with cybercrime, international cooperation, and access to potential digital evidence by law enforcement authorities, as well as human rights and procedural safeguards. UN member states have already written opinions discussing the scope of the treaty, and their proposals vary widely. In a letter to the committee chair, EFF and Human Rights Watch along with partners across the world asked that members include human rights considerations at every step in the drafting process. We also recommended that cross-border investigative powers include strong human rights safeguards, and that global civil society be provided opportunities to participate robustly in the development and drafting of any potential convention.

Failing to prioritize human rights and procedural safeguards in criminal investigations can have dire consequences. As many countries have already abused their existing cybercrime laws to undermine human rights and freedoms and punish peaceful dissent, we have grave concerns that this Convention might become a powerful weapon for oppression. We also worry that cross-border investigative powers without strong human rights safeguards will sweep away progress on protecting people’s privacy rights, creating a race to the bottom among jurisdictions with the weakest human rights protections.

We hope the Member States participating in the development and drafting of the treaty will recognize the urgency of the risks we mention, commit to include civil society in their upcoming discussions, and take our recommendations to heart.

Drafting of the letter was spearheaded by EFF, Human Rights Watch, AccessNow, ARTICLE19, Association for Progressive Communications, CIPPIC, European Digital Rights, Privacy International, Derechos Digitales, Data Privacy Brazil Research Association, European Center For Not-For-Profit Law, IT-Pol – Denmark, SafeNet South East Asia, Fundación Karisma, Red en Defensa de los Derechos Digitales, OpenNet Korea, among many others.

The letter is available in English and Spanish, and will be available in other UN languages in due course.

The full text of the letter and list of signatories are below:

December 22, 2021

H.E. Ms Faouzia Boumaiza Mebarki
Ad Hoc Committee to Elaborate a Comprehensive International Convention on Countering the Use of Information and Communication Technologies for Criminal Purposes

Your Excellency,

We, the undersigned organizations and academics, work to protect and advance human rights, online and offline. Efforts to address cybercrime are of concern to us, both because cybercrime poses a threat to human rights and livelihoods, and because cybercrime laws, policies, and initiatives are currently being used to undermine people’s rights. We therefore ask that the process through which the Ad Hoc Committee does its work includes robust civil society participation throughout all stages of the development and drafting of a convention, and that any proposed convention include human rights safeguards applicable to both its substantive and procedural provisions.


The proposal to elaborate a comprehensive “international convention on countering the use of information and communications technologies for criminal purposes” is being put forward at the same time that UN human rights mechanisms are raising alarms about the abuse of cybercrime laws around the world. In his 2019 report, the UN special rapporteur on the rights to freedom of peaceful assembly and of association, Clément Nyaletsossi Voule, observed, “A surge in legislation and policies aimed at combating cybercrime has also opened the door to punishing and surveilling activists and protesters in many countries around the world.” In 2019 and once again this year, the UN General Assembly expressed grave concerns that cybercrime legislation is being misused to target human rights defenders or hinder their work and endanger their safety in a manner contrary to international law. This follows years of reporting from non-governmental organizations on the human rights abuses stemming from overbroad cybercrime laws.

When the convention was first proposed, over 40 leading digital rights and human rights organizations and experts, including many signatories of this letter, urged delegations to vote against the resolution, warning that the proposed convention poses a threat to human rights.

In advance of the first session of the Ad Hoc Committee, we reiterate these concerns. If a UN convention on cybercrime is to proceed, the goal should be to combat the use of information and communications technologies for criminal purposes without endangering the fundamental rights of those it seeks to protect, so people can freely enjoy and exercise their rights, online and offline. Any proposed convention should incorporate clear and robust human rights safeguards. A convention without such safeguards, or one that dilutes States’ human rights obligations, would place individuals at risk and make our digital presence even more insecure, threatening fundamental human rights.

As the Ad Hoc Committee commences its work drafting the convention in the coming months, it is vitally important to apply a human rights-based approach to ensure that the proposed text is not used as a tool to stifle freedom of expression, infringe on privacy and data protection, or endanger individuals and communities at risk.  

The important work of combating cybercrime should be consistent with States’ human rights obligations set forth in the Universal Declaration of Human Rights (UDHR), the International Covenant on Civil and Political Rights (ICCPR), and other international human rights instruments and standards. In other words, efforts to combat cybercrime should also protect, not undermine, human rights. We remind States that the same rights that individuals have offline should also be protected online.

Scope of Substantive Criminal Provisions

There is no consensus on how to tackle cybercrime at the global level or a common understanding or definition of what constitutes cybercrime. From a human rights perspective, it is essential to keep the scope of any convention on cybercrime narrow. Just because a crime might involve technology does not mean it needs to be included in the proposed convention. For example, expansive cybercrime laws often simply add penalties due to the use of a computer or device in the commission of an existing offense. The laws are especially problematic when they include content-related crimes. Vaguely worded cybercrime laws purporting to combat misinformation and online support for or glorification of terrorism and extremism can be misused to imprison bloggers or block entire platforms in a given country. As such, they fail to comply with international freedom of expression standards. Such laws put journalists, activists, researchers, LGBTQ communities, and dissenters in danger, and can have a chilling effect on society more broadly.

Even laws that focus more narrowly on cyber-enabled crimes are used to undermine rights. Laws criminalizing unauthorized access to computer networks or systems have been used to target digital security researchers, whistleblowers, activists, and journalists. Too often, security researchers, who help keep everyone safe, are caught up in vague cybercrime laws and face criminal charges for identifying flaws in security systems. Some States have also interpreted unauthorized access laws so broadly as to effectively criminalize any and all whistleblowing; under these interpretations, any disclosure of information in violation of a corporate or government policy could be treated as “cybercrime.” Any potential convention should explicitly include a malicious intent standard, should not transform corporate or government computer use policies into criminal liability, should provide a clearly articulated and expansive public interest defense, and should include clear provisions that allow security researchers to do their work without fear of prosecution.

Human Rights and Procedural Safeguards

Our private and personal information, once locked in a desk drawer, now resides on our digital devices and in the cloud. Police around the world are using an increasingly intrusive set of investigative tools to access digital evidence. Frequently, their investigations cross borders without proper safeguards and bypass the protections in mutual legal assistance treaties. In many contexts, no judicial oversight is involved, and the role of independent data protection regulators is undermined. National laws, including cybercrime legislation, are often inadequate to protect against disproportionate or unnecessary surveillance.

Any potential convention should detail robust procedural and human rights safeguards that govern criminal investigations pursued under such a convention. It should ensure that any interference with the right to privacy complies with the principles of legality, necessity, and proportionality, including by requiring independent judicial authorization of surveillance measures. It should also not forbid States from adopting additional safeguards that limit law enforcement uses of personal data, as such a prohibition would undermine privacy and data protection. Any potential convention should also reaffirm the need for States to adopt and enforce “strong, robust and comprehensive privacy legislation, including on data privacy, that complies with international human rights law in terms of safeguards, oversight and remedies to effectively protect the right to privacy.”

There is a real risk that, in an attempt to entice all States to sign a proposed UN cybercrime convention, bad human rights practices will be accommodated, resulting in a race to the bottom. Therefore, it is essential that any potential convention explicitly reinforces procedural safeguards to protect human rights and resists shortcuts around mutual assistance agreements.

Meaningful Participation

Going forward, we ask the Ad Hoc Committee to actively include civil society organizations in consultations—including those dealing with digital security and groups assisting vulnerable communities and individuals—which did not happen when this process began in 2019 or in the time since.

Accordingly, we request that the Committee:

  • Accredit interested technological and academic experts and nongovernmental groups, including those with relevant expertise in human rights but that do not have consultative status with the Economic and Social Council of the UN, in a timely and transparent manner, and allow participating groups to register multiple representatives to accommodate the remote participation across different time zones.
  • Ensure that modalities for participation recognize the diversity of non-governmental stakeholders, giving each stakeholder group adequate speaking time, since civil society, the private sector, and academia can have divergent views and interests.
  • Ensure effective participation by accredited participants, including the opportunity to receive timely access to documents, provide interpretation services, speak at the Committee’s sessions (in-person and remotely), and submit written opinions and recommendations.
  • Maintain an up-to-date, dedicated webpage with relevant information, such as practical information (details on accreditation, time/location, and remote participation), organizational documents (i.e., agendas, discussions documents, etc.), statements and other interventions by States and other stakeholders, background documents, working documents and draft outputs, and meeting reports.

Countering cybercrime should not come at the expense of the fundamental rights and dignity of those whose lives this proposed Convention will touch. States should ensure that any proposed cybercrime convention is in line with their human rights obligations, and they should oppose any proposed convention that is inconsistent with those obligations.

We would be highly appreciative if you could kindly circulate the present letter to the Ad Hoc Committee Members and publish it on the website of the Ad Hoc Committee.


  1. Access Now – International
  2. Alternative ASEAN Network on Burma (ALTSEAN) – Burma
  3. Alternatives – Canada
  4. Alternative Informatics Association – Turkey
  5. AqualtuneLab – Brazil
  6. ArmSec Foundation – Armenia
  7. ARTICLE 19 – International
  8. Asociación por los Derechos Civiles (ADC) – Argentina
  9. Asociación Trinidad / Radio Viva – Trinidad
  10. Asociatia Pentru Tehnologie si Internet (ApTI) – Romania
  11. Association for Progressive Communications (APC) – International
  12. Associação Mundial de Rádios Comunitárias (Amarc Brasil) – Brazil
  13. ASEAN Parliamentarians for Human Rights (APHR) – Southeast Asia
  14. Bangladesh NGOs Network for Radio and Communication (BNNRC) – Bangladesh
  15. BlueLink Information Network  – Bulgaria
  16. Brazilian Institute of Public Law – Brazil
  17. Cambodian Center for Human Rights (CCHR)  – Cambodia
  18. Cambodian Institute for Democracy – Cambodia
  19. Cambodia Journalists Alliance Association – Cambodia
  20. Casa de Cultura Digital de Porto Alegre – Brazil
  21. Centre for Democracy and Rule of Law – Ukraine
  22. Centre for Free Expression – Canada
  23. Centre for Multilateral Affairs – Uganda
  24. Center for Democracy & Technology – United States
  25. Civil Society Europe
  26. Coalition Direitos na Rede – Brazil
  27. Collaboration on International ICT Policy for East and Southern Africa (CIPESA) – Africa
  28. CyberHUB-AM – Armenia
  29. Data Privacy Brazil Research Association – Brazil
  30. Dataskydd – Sweden
  31. Derechos Digitales – Latin America
  32. Defending Rights & Dissent – United States
  33. Digital Citizens – Romania
  34. DigitalReach – Southeast Asia
  35. Digital Security Lab – Ukraine
  36. Državljan D / Citizen D – Slovenia
  37. Electronic Frontier Foundation (EFF) – International
  38. Electronic Privacy Information Center (EPIC) – United States
  39. Elektronisk Forpost Norge – Norway
  40. for digital rights – Austria
  41. European Center For Not-For-Profit Law (ECNL) Stichting – Europe
  42. European Civic Forum – Europe
  43. European Digital Rights (EDRi) – Europe
  44. eQuality Project – Canada
  45. Fantsuam Foundation – Nigeria
  46. Free Speech Coalition – United States
  47. Foundation for Media Alternatives (FMA) – Philippines
  48. Fundación Acceso – Central America
  49. Fundación Ciudadanía y Desarrollo de Ecuador
  50. Fundación CONSTRUIR – Bolivia
  51. Fundación Karisma – Colombia
  52. Fundación OpenlabEC – Ecuador
  53. Fundamedios – Ecuador
  54. Garoa Hacker Clube – Brazil
  55. Global Partners Digital – United Kingdom
  56. GreenNet – United Kingdom
  57. GreatFire – China
  58. Hiperderecho – Peru
  59. Homo Digitalis – Greece
  60. Human Rights in China – China 
  61. Human Rights Defenders Network – Sierra Leone
  62. Human Rights Watch – International
  63. Igarapé Institute – Brazil
  64. IFEX – International
  65. Institute for Policy Research and Advocacy (ELSAM) – Indonesia
  66. The Influencer Platform – Ukraine
  67. INSM Network for Digital Rights – Iraq
  68. Internews Ukraine
  69. Instituto Beta: Internet & Democracia (IBIDEM) – Brazil
  70. Instituto Brasileiro de Defesa do Consumidor (IDEC) – Brazil
  71. Instituto Educadigital – Brazil
  72. Instituto Nupef – Brazil
  73. Instituto de Pesquisa em Direito e Tecnologia do Recife (IP.rec) – Brazil
  74. Instituto de Referência em Internet e Sociedade (IRIS) – Brazil
  75. Instituto Panameño de Derecho y Nuevas Tecnologías (IPANDETEC) – Panama
  76. Instituto para la Sociedad de la Información y la Cuarta Revolución Industrial – Peru
  77. International Commission of Jurists – International
  78. The International Federation for Human Rights (FIDH)
  79. IT-Pol – Denmark
  80. JCA-NET – Japan
  81. KICTANet – Kenya
  82. Korean Progressive Network Jinbonet – South Korea
  83. Laboratorio de Datos y Sociedad (Datysoc) – Uruguay 
  84. Laboratório de Políticas Públicas e Internet (LAPIN) – Brazil
  85. Latin American Network of Surveillance, Technology and Society Studies (LAVITS)
  86. Lawyers Hub Africa
  87. Legal Initiatives for Vietnam
  88. Ligue des droits de l’Homme (LDH) – France
  89. Masaar – Technology and Law Community – Egypt
  90. Manushya Foundation – Thailand 
  91. MINBYUN Lawyers for a Democratic Society – Korea
  92. Open Culture Foundation – Taiwan
  93. Open Media  – Canada
  94. Open Net Association – Korea
  95. OpenNet Africa – Uganda
  96. Panoptykon Foundation – Poland
  97. Paradigm Initiative – Nigeria
  98. Privacy International – International
  99. Radio Viva – Paraguay
  100. Red en Defensa de los Derechos Digitales (R3D) – Mexico
  101. Regional Center for Rights and Liberties  – Egypt
  102. Research ICT Africa 
  103. Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC) – Canada
  104. Share Foundation – Serbia
  105. Social Media Exchange (SMEX) – Lebanon, Arab Region
  106. SocialTIC – Mexico
  107. Southeast Asia Freedom of Expression Network (SAFEnet) – Southeast Asia
  108. Supporters for the Health and Rights of Workers in the Semiconductor Industry (SHARPS) – South Korea
  109. Surveillance Technology Oversight Project (STOP) – United States
  110. Tecnología, Investigación y Comunidad (TEDIC) – Paraguay
  111. Thai Netizen Network  – Thailand
  112. Unwanted Witness – Uganda
  113. Vrijschrift – Netherlands 
  114. West African Human Rights Defenders Network – Togo
  115. World Movement for Democracy – International
  116. 7amleh – The Arab Center for the Advancement of Social Media – Arab Region

Individual Experts and Academics

  1. Jacqueline Abreu, University of São Paulo
  2. Chan-Mo Chung, Professor, Inha University School of Law
  3. Danilo Doneda, Brazilian Institute of Public Law
  4. David Kaye, Clinical Professor of Law, UC Irvine School of Law, former UN Special Rapporteur on Freedom of Opinion and Expression (2014-2020)
  5. Wolfgang Kleinwächter, Professor Emeritus, University of Aarhus; Member, Global Commission on the Stability of Cyberspace
  6. Douwe Korff, Emeritus Professor of International Law, London Metropolitan University
  7. Fabiano Menke, Federal University of Rio Grande do Sul
  8. Kyung-Sin Park, Professor, Korea University School of Law
  9. Christopher Parsons, Senior Research Associate, Citizen Lab, Munk School of Global Affairs & Public Policy at the University of Toronto
  10. Marietje Schaake, Stanford Cyber Policy Center
  11. Valerie Steeves, J.D., Ph.D., Full Professor, Department of Criminology University of Ottawa

*List of signatories as of January 13, 2022