Melon Farmers Original Version

Internet Encryption in the EU


Encryption is legal for the moment but the authorities are seeking to end this


 

EU snoopers foiled...

European Parliament votes against an EU Commission proposal for mass scanning of all internet communication


Link Here 16th November 2023

On 14th November, Members of the European Parliament's Civil Liberties committee voted against attempts from EU Home Affairs officials to roll out mass scanning of private and encrypted messages across Europe. It was a clear-cut vote, with a significant majority of MEPs supporting the proposed position.

A political deal struck by the Parliament's seven political groups at the end of October meant that this outcome was expected. Nevertheless, this is an important and welcome milestone, as Parliamentarians demand that EU laws are based in objective evidence, scientific reality and with respect for human rights law.

This vote signals major improvements compared to the Commission's original draft law (dubbed 'Chat Control'), which has courted controversy. The process around the legislation has faced allegations of conflicts of interest and illegal advert micro-targeting, and rulings of "maladministration". The proposal has also been widely criticised for failing to meet EU requirements of proportionality -- with lawyers for the EU member states making the unprecedented critique that the proposal likely violates the essence of the right to privacy.

In particular, the vote shows the strong political will of the Parliament to remove the most dangerous parts of this law -- mass scanning, undermining digital security and mandating widespread age verification. Parliamentarians have recognised that no matter how important the aim of a law, it must be pursued using only lawful and legitimate measures.

At the same time, there are parts of their position which still concern us, and which would need to be addressed if any final law were to be acceptable from a digital rights point of view. Coupled with mass surveillance plans from the Council of member states and attempts from the Commission to manipulate the process, we remain sceptical about the chances of a good final outcome.

Civil liberties MEPs also voted for this position to become the official position of the European Parliament. On 20th November, the rest of the house will be notified about the intention to permit negotiators to move forward without an additional vote. Only after that point will the position voted on today be confirmed as the European Parliament's mandate for the CSA Regulation.

Mass scanning (detection orders)

The European Parliament's position firmly rejects the premise that in order to search for child sexual abuse material (CSAM), all people's messages may be scanned (Articles 7-11). Instead, MEPs require that specific suspicion must be required -- a similar principle to warrants. This is a vital change which would resolve one of the most notorious parts of the law. The position also introduces judicial oversight of hash lists (Article 44.3), which we welcome. However, it unfortunately does not distinguish between basic hashing (which is generally seen as more robust) and perceptual hashing (which is less reliable).
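The distinction between basic (cryptographic) hashing and perceptual hashing matters here. The following is a minimal illustrative sketch, not anything from the proposal itself: it uses Python's standard `hashlib` for the cryptographic case and a toy "average hash" standing in for a real perceptual hash. The image data and all function names are invented for illustration.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Cryptographic hash: any change to the input flips roughly half the output bits."""
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels):
    """Toy perceptual (average) hash: bit i is 1 if pixel i is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two perceptual hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "image" and a slightly brightened re-encoding of it.
img = [[10, 200, 30, 220],
       [15, 210, 25, 230],
       [12, 205, 28, 225],
       [11, 215, 27, 235]]
brighter = [[min(p + 5, 255) for p in row] for row in img]

data = bytes(p for row in img for p in row)
data2 = bytes(p for row in brighter for p in row)

# Cryptographic hashing only matches byte-identical files -- hence "more robust":
print(sha256_hex(data) == sha256_hex(data2))               # False

# Perceptual hashing tolerates small changes (useful for near-duplicates,
# but this fuzziness is also why it is considered less reliable):
print(hamming(average_hash(img), average_hash(brighter)))  # 0
```

A real deployment would use a genuine perceptual hash (e.g. over DCT coefficients) rather than this toy, but the trade-off is the same: exact matching versus fuzzy matching with a false-positive risk.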

At the same time, the wording also needs improvement to ensure legal certainty. The Parliament position rightly confirms that scanning must be "targeted and specified and limited to individual users, [or] a specific group of users" (Article 7.1). This means that there must be "reasonable grounds of suspicion a link [...] with child sexual abuse material" (Articles 7.1. and 7.2.(a)). However, despite attempts in Recital (21) to interpret the "specific group of users" narrowly, we are concerned that the phrasing "as subscribers to a specific channel of communications" (Article 7.1.) is too broad and too open to interpretation. The concept of "an indirect link" is also ambiguous in the context of private messages, and should be deleted or clarified.

The Parliament's position deletes solicitation (grooming) detection from the scope of detection orders, recognising the unreliability of such tools. However, the fact that solicitation remains in the scope of risk assessment (Articles 3 and 4) still poses a risk of incentivising overly-restrictive measures.

End-to-end encryption

The European Parliament's position states that end-to-end encrypted private message services -- like WhatsApp, Signal or ProtonMail -- are not subject to scanning technologies (Articles 7.1 and 10.3). This is a strong and clear protection to stop encrypted message services from being weakened in a way that could harm everyone that relies on them -- a key demand of civil society and technologists.

Several other provisions throughout the text, such as a horizontal protection of encrypted services (Article 1.3a and Recital 9a), give further confirmation of the Parliament's will to protect one of the only ways we all have to keep our digital information safe.

There is a potential (likely unintended) loophole in the Parliament's position on end-to-end encryption, however, which must be addressed in future negotiations. This is the fact that whilst encrypted 'interpersonal communications services' (private messages) are protected, there is no explicit protection for other kinds of encrypted services ('hosting services').

It would therefore be important to amend Article 1.3a. to ensure that hosting providers, such as providers of personal cloud backups, cannot be required to circumvent the security and confidentiality of their services with methods designed to access encrypted information, and to amend Article 7.1. so that it is not limited to interpersonal communications.

Age verification & other risk mitigation measures

The European Parliament's position is mixed when it comes to age verification and other risk mitigation measures. EDRi has been clear that mandatory age verification at EU level would be very risky -- and we are glad to see that these concerns have been acted upon. The European Parliament's position protects people's anonymity online by removing mandatory age verification for private message services and app stores, and adds a series of strong safeguards for its optional use (Article 4.3.a.(a)-(k)). This is a positive and important set of measures.

On the other hand, we are disappointed that the Parliament's position makes age verification mandatory for porn platforms (Article 4a.) -- a step that is not coherent with the overall intention of the law. What's more, the cumulative nature of the risk mitigation measures for services directly targeting children in the Parliament's position (Article 4.1.(aa)) needs further attention.

This is because no exception is given for cases where the measures might not be right for a particular service; platforms or services could instead decide to exclude young people altogether in order to avoid these requirements.

We recommend that there should not be mandatory age verification for porn platforms, and that risk mitigation measures should oblige providers to achieve a specific outcome, rather than creating overly-detailed (and sometimes misguided) service design requirements. We also warn that the overall CSA Regulation framework should not incentivise the use of age verification tools.

Voluntary scanning

The European Parliament's position does not include a permanent voluntary scanning regime, despite some MEPs calling for such an addition. This is an important legal point: if co-legislators agree that targeted scanning measures are a necessary and proportionate limitation on people's fundamental human rights, then they cannot leave such measures to the discretion of private entities. The Parliament's position does, however, extend the currently-in-force interim derogation by nine months (Article 88.2).

 

 

Not enough!...

Children's campaigners claim that EU proposals for responding to child abuse don't go far enough and call for all internet communications to be open to snooping regardless of the safety of internet users from hackers, fraudsters and thieves


Link Here 6th March 2023
The European Commission proposed new EU rules to prevent and combat child sexual abuse (CSA) in May 2022. Complementing existing frameworks to fight online CSA, the EU proposal would introduce a new, harmonised European structure for assessing and mitigating the spread of child sexual abuse material (CSAM) online.

The thrust of the proposal is to react in a unified way, either to CSAM detected, or else to systems identified most at risk of being used to disseminate such material.

However as is always the case with campaigners, this is never enough. The campaigners basically want everybody's communications to be open to snooping and surveillance without the slightest consideration for people's safety from hackers, identity thieves, scammers, blackmailers and fraudsters.

The European Commission wrote:

The Commission is proposing new EU legislation to prevent and combat child sexual abuse online. With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a 64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires. Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem does not only exist on one platform.

To effectively address the misuse of online services for the purposes of child sexual abuse, clear rules are needed, with robust conditions and safeguards. The proposed rules will oblige providers to detect, report and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.

A new independent EU Centre on Child Sexual Abuse (EU Centre) will facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analysing reports from providers to identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action and by providing support to victims.

The new rules will help rescue children from further abuse, prevent material from reappearing online, and bring offenders to justice. Those rules will include:

  • Mandatory risk assessment and risk mitigation measures: Providers of hosting or interpersonal communication services will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.

  • Targeted detection obligations, based on a detection order: Member States will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new child sexual abuse material or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.

  • Strong safeguards on detection: Companies having received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.

  • Clear reporting obligations: Providers that have detected online child sexual abuse will have to report it to the EU Centre.

  • Effective removal: National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.

  • Reducing exposure to grooming: The rules require app stores to ensure that children cannot download apps that may expose them to a high risk of solicitation of children.

  • Solid oversight mechanisms and judicial redress: Detection orders will be issued by courts or independent national authorities. To minimise the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in Court.
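The detection scheme the rules describe -- providers checking content only against indicators verified and supplied by the EU Centre -- can be sketched roughly as a hash-list lookup. This is a minimal illustrative sketch under that assumption; the indicator list, function names, and sample byte strings are all hypothetical, not taken from the proposal.

```python
import hashlib

# Hypothetical indicator list, standing in for the verified hashes that the
# EU Centre would provide to companies under a detection order.
VERIFIED_INDICATORS = {
    hashlib.sha256(b"known-illegal-sample").hexdigest(),
}

def check_upload(content: bytes):
    """Return a report dict if the content matches a verified indicator,
    otherwise None. Only exact (hash-identical) content is ever flagged,
    so non-matching uploads reveal nothing about their contents."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in VERIFIED_INDICATORS:
        return {"indicator": digest, "action": "report to EU Centre"}
    return None

print(check_upload(b"known-illegal-sample"))  # match -> report dict
print(check_upload(b"holiday photos"))        # no match -> None
```

The privacy debate in the surrounding text turns on where this check runs: on a server it applies only to content the provider already holds, whereas on an end-to-end encrypted service it would have to run on the user's own device, which is what critics describe as undermining encryption.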

The new EU Centre will support:

  • Online service providers, in particular in complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to child sexual abuse online, by providing indicators to detect child sexual abuse and receiving the reports from the providers;

  • National law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law enforcement. This will help rescue children from situations of abuse and bring perpetrators to justice.

  • Member States, by serving as a knowledge hub for best practices on prevention and assistance to victims, fostering an evidence-based approach.

  • Victims, by helping them to take down the materials depicting their abuse.

Next steps

It is now for the European Parliament and the Council to agree on the proposal. Once adopted, the new Regulation will replace the current interim Regulation. Feedback from members of the public on the proposals is open for a minimum of 8 weeks.

According to child campaigners:

On 8 February 2023, the European Parliament's Committee on the Internal Market and Consumer Protection (IMCO) published its draft report on the European Commission's proposal to prevent and combat child sexual abuse. The draft report seeks a vastly reduced scope for the Regulation. It prioritises the anonymity of perpetrators of abuse over the rights of victims and survivors of sexual abuse and seeks to reverse progress made in keeping children safe as they navigate or are harmed in digital environments that were not built with their safety in mind.

The letter also criticises the removal of age verification and claims that the technology can meet high privacy standards, explaining that the new legislation adds additional safeguards to already effective measures to prevent the spread of this material online.

And of course the campaigners demand that technology companies allow the surveillance of all messages via backdoors to encryption or perhaps just to ban encryption.

See the letter from the likes of the NSPCC. See article from iwf.org.uk

 

 

EU is endangering journalists...

The Committee to Protect Journalists expresses concern about a proposal to ban secure encrypted messaging


Link Here 10th November 2020

The Committee to Protect Journalists expressed concern after the Council of the European Union proposed a draft resolution last week calling for national authorities across the EU to have access to encrypted messages as part of criminal investigations into terrorism and organized crime. Journalists rely on encryption to evade surveillance and protect their sources, CPJ has found.

End-to-end encryption prevents authorities, company employees, and hackers from viewing the content of private digital messages, but the resolution proposes unspecified technical solutions to undermine those protections, according to rights groups European Digital Rights and Access Now. The groups said the resolution was drafted without input from privacy experts or journalists.

"EU institutions must immediately retract all plans to undermine encryption, which is vital to press freedom and the free flow of information," said Tom Gibson, EU Representative for the Committee to Protect Journalists. "Encryption offers essential protection for journalists who routinely communicate and share files electronically. If journalists cannot communicate safely with colleagues and sources, they cannot protect the anonymity of their sources."

The resolution was proposed by Germany, which holds the current presidency of the Council of the European Union, and could serve as a basis for further negotiations with other EU institutions in 2021.

 

 

Everybody deserves strong encryption...

Even the EU Commission!


Link Here 23rd February 2020
The European Commission has told its staff to start using Signal, an end-to-end-encrypted messaging app, in a push to increase the security of its communications.

The instruction appeared on internal messaging boards in early February, notifying employees that Signal has been selected as the recommended application for public instant messaging.

The app is favored by privacy activists because of its end-to-end encryption and open-source technology. Bart Preneel, cryptography expert at the University of Leuven, explained:

It's like Facebook's WhatsApp and Apple's iMessage but it's based on an encryption protocol that's very innovative. Because it's open-source, you can check what's happening under the hood.

Promoting the app, however, could antagonize the law enforcement community, and it underlines the hypocrisy of officials in Brussels, Washington and other capitals who have been putting strong pressure on Facebook and Apple to give government agencies access to encrypted messages; if the companies refuse, legal requirements could be introduced that force firms to do just that.

American, British and Australian officials published an open letter to Facebook CEO Mark Zuckerberg in October, asking that he call off plans to encrypt the company's messaging service. Dutch Minister for Justice and Security Ferd Grappehaus told POLITICO last April that the EU needs to look into legislation allowing governments to access encrypted data.

 

 

Petition: Encryption is under threat in Europe!...

Tell the EU Council: Protect our rights to privacy and security!


Link Here 1st December 2016
The Council of the EU could undermine encryption as soon as December. It has been asking delegates from all EU countries to detail their national legislative position on encryption.

We've been down this road before. We know that encryption is critical to our right to privacy and to our own digital security. We need to come together once again and demand that our representatives protect these rights -- not undermine them in secret. Act now to tell the Council of the EU to defend strong encryption!

Dear Slovak Presidency and Delegates to the Council of the EU:

According to the Presidency of the Council of the European Union, the Justice and Home Affairs Ministers will meet in December to discuss the issue of encryption. At that discussion, we urge you to protect our security, our economy, and our governments by supporting the development and use of secure communications tools and technologies and rejecting calls for policies that would prevent or undermine the use of strong encryption.

Encryption tools, technologies, and services are essential to protect against harm and to shield our digital infrastructure and personal communications from unauthorized access. The ability to freely develop and use encryption provides the cornerstone for today's EU economy. Economic growth in the digital age is powered by the ability to trust and authenticate our interactions and communication and conduct business securely both within and across borders.

As the United Nations Special Rapporteur for freedom of expression has noted, encryption and anonymity, and the security concepts behind them, provide the privacy and security necessary for the exercise of the right to freedom of opinion and expression in the digital age.

Recently, hundreds of organizations, companies, and individuals from more than 50 countries came together to make a global declaration in support of strong encryption. We stand with people from all over the world asking you not to break the encryption we rely upon.

Sign the petition from act.accessnow.org


