| |
Poland president vetoes EU internet censorship law from being implemented in Poland
|
|
|
 | 12th
January 2026
|
|
| See article from
politico.eu |
Poland's nationalist President Karol Nawrocki has vetoed the Polish bill implementing the EU's censorship law, the Digital Services Act, which seeks to force big platforms such as Elon Musk's X, Facebook and Instagram to censor content, in what critics see as a form of Orwellian censorship against
conservatives and right-wingers. The presidential veto stops national regulators in Warsaw from implementing the DSA. Nawrocki argued that while the bill's stated aim of protecting citizens was legitimate, it would grant excessive
power to government officials over online content, resulting in administrative censorship. Nawrocki said in a statement: I want this to be stated clearly: a situation in which what is allowed on the internet is decided by
an official subordinate to the government resembles the construction of the Ministry of Truth from George Orwell's novel 1984.
|
| |
EU lobby group proposes to censor 'disinformation' via ICANN's powers held over worldwide domain name controls
|
|
|
 |
10th April 2024
|
|
| See article from reclaimthenet.org
|
EU DisinfoLab, a censorship lobby group that regularly makes policy recommendations to the EU and member states, is now pushing for a security structure created by ICANN (the Internet Corporation for Assigned Names and Numbers) to be utilized to censor what
it deems disinformation. Attempting to use ICANN directly would be highly controversial. Given its importance to internet infrastructure -- ICANN manages domain names globally -- and the fact that content control is not among its tasks (DisinfoLab
says ICANN refuses to do it), this would represent a huge departure from the organization's role as we understand it today. But now DisinfoLab proposes to turn the structure ICANN created to counter legitimate security threats into a tool to police the
internet for content that somebody decides to treat as disinformation. It would require a minimal amount of diligence and cooperation from registries, a blog post said, to accept ICANN-style reports and revoke a site's domain name. |
| |
New EU internet censorship laws have come into force for the largest social media giants
|
|
|
 |
25th August 2023
|
|
| See article from bbc.co.uk |
About 20 internet giants now have to comply with new EU internet censorship rules. Under the EU Digital Services Act (DSA) rule-breakers can face big fines of 6% of turnover and potentially suspension of the service. The EU commission has named the
very large online platforms that will form the first tranche of internet companies subjected to the new censorship regime. Those are sites with over 45 million EU users: Alibaba, AliExpress, Amazon Store, the Apple App Store,
Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, X (formerly Twitter), Wikipedia, YouTube and Zalando. Search engines Google and Bing will also be subject to the rules. These websites will now have to assess potential risks they may cause, report that assessment and put in place measures to deal with the problem. This includes risks related to:
- illegal content
- rights, such as freedom of expression, media freedom, discrimination, consumer protection and children's rights
- public security and threats to electoral processes
- gender-based violence, public health, 'wrong think', age restrictions, and mental and physical 'wellbeing'.
Targeted advertising based on profiling children is no longer permitted. The companies must also share with regulators details of how their algorithms work. This could include those which decide what adverts users see, or which posts appear in their
feed. And they are required to have systems for sharing data with independent researchers. Although the law is targeted at the EU, some of the companies have already made changes that will also affect users in the UK.
- Starting in July, TikTok stopped users in Europe aged 13-17 from being shown personalised advertising based on their online activity.
- Since February, Meta apps including Facebook and Instagram have stopped showing users aged 13-17 worldwide
advertising based on their activity on the apps.
- In Europe Facebook and Instagram gave users the option to view Stories and Reels only from people they follow, ranked in chronological order.
- In the UK and Europe Snapchat is also
restricting personalised ads for users aged 13-17. It is also creating a library of adverts shown in the EU.
Retailers Zalando and Amazon have mounted legal action to contest their designation as very large online platforms. Amazon argues it is not the largest retailer in any of the EU countries where it operates. Smaller tech services will be
brought under the new censorship regime next year. |
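The figures the DSA hangs on -- the 45 million EU-user benchmark for designation, and fines capped at 6% of turnover -- can be illustrated with a small sketch. This is purely illustrative: the function names, constants' names and the example platform's numbers are hypothetical, not drawn from the regulation's text.

```python
VLOP_USER_THRESHOLD = 45_000_000  # EU monthly active users (per the article)
MAX_FINE_RATE = 0.06              # DSA cap: 6% of global annual turnover

def is_vlop(monthly_active_eu_users: int) -> bool:
    """Would a platform meet the EC's 'very large online platform' benchmark?"""
    return monthly_active_eu_users >= VLOP_USER_THRESHOLD

def max_dsa_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a DSA fine for a given global annual turnover."""
    return annual_turnover_eur * MAX_FINE_RATE

# Hypothetical platform: 50 million EU users, EUR 10 billion turnover.
print(is_vlop(50_000_000))        # True: would face designation
print(round(max_dsa_fine(10e9)))  # up to EUR 600 million in fines
```

The growth point made later in this digest follows directly: an adult platform crossing the 45 million line would flip `is_vlop` to True and fall under the same regime.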
| |
New EU internet censorship laws look likely to block or restrict Google Search from linking to adult websites
|
|
|
 |
28th April 2023
|
|
| See article from xbiz.com
|
The European Commission has officially identified 19 major platforms and search engines to be targeted for compliance under its new internet censorship law, the Digital Services Act (DSA). Under the new rules, Very Large providers will be required to
assess and mitigate the risk of 'misuse' of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards. The EU Commission officially designated 17 Very Large Online Platforms (VLOPs)
and two Very Large Online Search Engines (VLOSEs), each of which, according to the EC, reaches at least 45 million monthly active users. The VLOPs are: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps,
Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and German retailer Zalando. The two VLOSEs are Bing and Google Search. Following their designation, an EC statement explained, these companies will
now have to comply, within four months, with the full set of new censorship rules under the DSA. Under the subheading Strong protection of minors, the EC listed the following directives:
- Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors;
- Targeted advertising based on profiling towards children is no longer permitted;
- Special risk assessments including for
negative effects on mental health will have to be provided to the Commission four months after designation and made public at the latest a year later;
- Platforms will have to redesign their services, including their interfaces, recommender
systems, terms and conditions, to mitigate these risks.
According to industry attorney Corey Silverstein of Silverstein Legal, the impact of the new designations and consequent obligations could be substantial because many of the platforms that have been designated as VLOPs and VLOSEs are frequently utilized
by the adult entertainment industry. Assuming these platforms decide to comply with the DSA, Silverstein told XBIZ, there may be major changes coming to what these platforms allow on their services within the EU. This could end up leading to
major content moderation and outright blocking of adult content in the EU, including blocking websites that display adult entertainment from being listed in search results. It is also noted that as the larger adult platforms continue to grow,
some may pass the EC's benchmark of having 45 million monthly active users, and therefore face the potential for future designation under the DSA, which could have even more direct impact on their users and creators. |
| |
The European Parliament ratifies the latest EU internet censorship law
|
|
|
 |
6th July 2022
|
|
| See article from techxplore.com |
The European Parliament has ratified the latest laws to extend internet censorship in the EU. MEPs approved the final versions of the Digital Markets Act, focused on ending monopolistic practices of tech giants, and the Digital Services Act
(DSA), which toughens scrutiny and the consequences for platforms when they host banned content. The DSA will target a wide range of internet actors and aims to ensure real consequences for companies that fail to censor supposed hate speech,
information the authorities don't like, and child sexual abuse images. Danish MEP Christel Schaldemose commented: The digital world has developed a bit like a western movie, there were no real rules of the game,
but now there is a new sheriff in town. The DSA passed easily with 539 votes in favor, 54 against and 30 abstentions. Both laws now require the final approval by the EU's 27 member states, which should be a formality. Now the big
question is over enforcement with worries that the European Commission lacks the means to give sharp teeth to its new powers. |
| |
|
|
|
 |
4th May 2022
|
|
|
The Digital Services Act will be the envy of autocrats the world over. By Andrew Tettenborn See article from
spiked-online.com |
| |
Don't hold ordinary social media users responsible for other users responses
|
|
|
 | 27th April 2022
|
|
| See CC article from eff.org
|
Courts and legislatures around the globe are hotly debating to what degree online intermediaries--the chain of entities that facilitate or support speech on the internet--are liable for the content they help publish. One thing they should not be
doing is holding social media users legally responsible for comments posted by others to their social media feeds, EFF and Media Defence told the European Court of Human Rights (ECtHR). Before the court is the case Sanchez v.
France , in which a politician argued that his right to freedom of expression was violated when he was subjected to a criminal fine for not promptly deleting hateful comments posted on the "wall" of his Facebook account by others. The
ECtHR's Chamber, a judicial body that hears most of its cases, found there was no violation of freedom of expression, extending its rules for online intermediaries to social media users. The politician is seeking review of this decision by ECtHR's Grand
Chamber, which only hears its most serious cases. EFF and Media Defence, in an amicus brief submitted to the Grand Chamber, asked it to revisit the Chamber's expansive interpretation of how intermediary liability rules should
apply to social media users. Imposing liability on them for third-party content will discourage social media users, especially journalists, human rights defenders, civil society actors, and political figures, from using social media platforms, as they
are often targeted by governments seeking to suppress speech. Subjecting these users to liability would make them vulnerable to coordinated attacks on their sites and pages meant to trigger liability and removal of speech, we told the court.
Further, ECtHR's current case law does not support and should not apply to social media users who act as intermediaries, we said. The ECtHR laid out its intermediary liability rules in Delfi A.S. v. Estonia , which concerned
the failure of a commercial news media organization to monitor and promptly delete "clearly unlawful" comments online. The ECtHR rules consider whether the third-party commenters can be identified, and whether they have any control over their
comments once they submit them. In stark contrast, Sanchez concerns the liability of an individual internet user engaged in non-commercial activity. The politician was charged with incitement to hatred or violence against a
group of people or an individual on account of their religion based on comments others posted on his Facebook wall. The people who posted the comments were convicted of the same criminal offence, and one of them later deleted the allegedly unlawful
comments. What's more, the decision about what online content is "clearly unlawful" is not always straightforward, and generally courts are best placed to assess the lawfulness of the online content. While social media
users may be held responsible for failing or refusing to comply with a court order compelling them to remove or block information, they should not be required to monitor content on their accounts to avoid liability, nor should they be held liable simply
when they get notified of allegedly unlawful speech on their social media feeds by any method other than a court order. Imposing liability on an individual user, without a court order, to remove the allegedly unlawful content in question will be
disproportionate, we argued. Finally, the Grand Chamber should decide whether imposing criminal liability for third party content violates the right to freedom of expression, given the peculiar circumstances in this case. Both the
applicant and the commenters were convicted of the same offence a decade ago. EFF and Media Defence asked the Grand Chamber to assess the quality of the decades-old laws--one dating back to 1881--under which the politician was convicted, saying criminal
laws should be adapted to meet new circumstances, but these changes must be precise and unambiguous to enable someone to foresee what conduct would violate the law. Subjecting social media users to criminal responsibility for
third-party content will lead to over-censorship and prior restraint. The Grand Chamber should limit online intermediary liability, and not chill social media users' right to free expression and access to information online. You can read our
amicus brief here: https://www.eff.org/document/sanchez-v-france-eff-media-defence-ecthr-brief
|
| |
The EU is moving towards the conclusion of its new internet censorship law, the Digital Services Act
|
|
|
 | 22nd April 2022
|
|
| See article from nytimes.com |
The European Union is nearing a conclusion to internet censorship legislation that would force Facebook, YouTube and other internet services to censor 'misinformation', disclose how their services' algorithms work and stop targeting online ads based on a
person's ethnicity, religion or sexual orientation. The law, called the Digital Services Act, is intended to force platforms to more aggressively police content deemed unacceptable or risk billions of dollars in fines. Tech companies would be
compelled to set up new policies and procedures to remove flagged hate speech, terrorist propaganda and other material defined as illegal by countries within the European Union. The Digital Services Act is part of a one-two punch by the European
Union to address the societal and economic effects of the tech giants. Last month, the 27-nation bloc agreed to a different sweeping law, the Digital Markets Act, to counter what regulators see as anticompetitive behavior by the biggest tech firms,
including their grip over app stores, online advertising and internet shopping. The new law is ambitious but the EU is noted for producing crap legislation that doesn't work in practice. Lack of enforcement of the European Union's data privacy law,
the General Data Protection Regulation, or G.D.P.R., has cast a shadow over the new law. Like the Digital Services Act and Digital Markets Act, G.D.P.R. was hailed as landmark legislation. But since it took effect in 2018, there has been little action
against Facebook, Google and others over their data-collection practices. Many have sidestepped the rules by bombarding users with consent windows on their websites. |
| |
The EU demands that internet platforms take down terrorist content within an hour of being told
|
|
|
 | 2nd May 2021
|
|
| See
article from europarl.europa.eu |
The EU has dreamed up another impossible-to-comply-with piece of internet legislation that places onerous, if not impossible, requirements on small internet businesses, which will have to relocate user forums and the like onto the platforms of the US
internet giants that are able to deal with the ludicrously short timescales demanded by the EU. The EU describes its latest attack on business in a press release: A new law to address the dissemination of terrorist
content online was approved by the EU Parliament: The new regulation will target content such as texts, images, sound recordings or videos, including live transmissions, that incite, solicit or contribute to terrorist offences,
provide instructions for such offences or solicit people to participate in a terrorist group. In line with the definitions of offences included in the Directive on combating terrorism , it will also cover material that provides guidance on how to make
and use explosives, firearms and other weapons for terrorist purposes. Terrorist content must be removed within one hour Hosting service providers will have to remove or disable access to flagged
terrorist content in all member states within one hour of receiving a removal order from the competent authority. Member states will adopt rules on penalties, the degree of which will take into account the nature of the breach and the size of company
responsible. Protection of educational, artistic, research and journalistic material Content uploaded for educational, journalistic, artistic or research purposes, or used for awareness-raising
purposes, will not be considered terrorist content under these new rules. No general obligation to monitor or filter content Internet platforms will not have a general obligation to monitor or filter
content. However, when competent national authorities have established a hosting service provider is exposed to terrorist content, the company will have to take specific measures to prevent its propagation. It will then be up to the service provider to
decide what specific measures to take to prevent this from happening, and there will be no obligation to use automated tools. Companies should publish annual transparency reports on what action they have taken to stop the dissemination of terrorist
content. Next steps The Regulation will enter into force on the twentieth day following publication in the Official Journal. It will start applying 12 months after its entry into force.
|
| |
Aggressive new EU terrorism internet censorship law will require onerous and expensive self censorship by all websites
|
|
|
 |
18th April 2021
|
|
| See article from laquadrature.net |
An upcoming European law pretexts fighting terrorism to silence the whole Internet In September 2018, under French and German influence, the European Commission put forward a proposal for a Regulation of the European
Parliament and of the Council on preventing the dissemination of terrorist content online . The text was adopted in December 2018 by the EU Council and adopted (with some changes) by the EU Parliament in April 2019. After
negotiations in trilogue (between the three institutions), this text is now back in the Parliament for a final vote . This new regulation will force every actor of the Web's ecosystem (video or blog platforms, online media,
small forums or large social networks) to block in under an hour any content reported as "terrorist" by the police (without a judge's prior authorisation), and therefore to be on call 24/7. While some
"exceptions" have been provided in the text, they are purely hypothetical and will not protect our freedoms in practice:
The one hour deadline is unrealistic and only big economic platforms will be capable of complying with such strict obligations. With the threat of heavy fines, and because most of them will not be able to comply with the removal
orders in time, it will force Web actors to proactively censor any potentially illegal content upstream, using automated tools of mass surveillance developed by Google and Facebook. Such a power given to the police can easily lead to
the censorship of political opponents and social movements. The text allows an authority from any Member State to order removal in another Member State. Such cross-border removal orders are not only unrealistic but can only
worsen the danger of mass political censorship.
The European Parliament must reject this text
|
| |
The EU Commission president introduces the next round of internet censorship law
|
|
|
 |
2nd December 2020
|
|
| See article from bbc.co.uk |
Ursula von der Leyen, president of the European Commission, has introduced a new swathe of internet regulation. She said the commission would be rewriting the rulebook for our digital market with stricter rules for online content, from selling unsafe
products to posting 'hate speech'. Von der Leyen told the online Web Summit: No-one expects all digital platforms to check all the user content that they host. This would be a threat to everyone's freedom to speak
their mind. ...But... if illegal content is notified by the competent national authorities, it must be taken down. More pressure
The Digital Services Act will replace the EU's 2000 e-commerce directive. Due to come into
force on Wednesday, 2 December, it has now been delayed until next week. Likely to put more pressure on social-media platforms to take down and block unlawful content more quickly, the new rules will almost certainly be contested by companies such
as Google and Facebook, which now face far stricter censorship both in Europe and the US, following claims about the supposed spread of 'fake news' and 'hate speech'. |
| |
The EU is pushing for an agreement by Christmas for a new rapid internet take down law applying to terrorist content
|
|
|
 |
15th November 2020
|
|
| See article from bbc.co.uk |
EU ministers are discussing a new censorship law this year obliging internet firms to remove what is deemed to be extremist propaganda within an hour of it being reported. The EU has been discussing such a regulation for more than a year, but the
recent terror attacks in France and Austria have given it new urgency. Interior ministers said the text must be agreed soon with the EU Commission and European Parliament. Both German Interior Minister Horst Seehofer and EU Home Affairs
Commissioner Ylva Johansson called for an agreement by Christmas on the new regulation on terrorist content online (TCO). |
| |
The EU's next round of strangulation of European internet businesses via red tape and censorship
|
|
|
 |
23rd October 2020
|
|
| See Creative Commons article from eff.org by
Christoph Schmon |
The European Union has taken the first step towards a significant overhaul of its core platform regulation, the e-Commerce Directive .
In order to inspire the European Commission, which is currently preparing a proposal for a Digital Services Act Package , the EU Parliament has voted on three related Reports ( IMCO , JURI , and LIBE reports), which address the legal
responsibilities of platforms regarding user content, include measures to keep users safe online, and set out special rules for very large platforms that dominate users' lives.
EFF's Footprint Is Clear
Ahead of the votes, together with our allies , we argued to preserve what works for a free Internet and innovation, such as to retain the E-Commerce directive's approach of limiting platforms' liability over user content and banning Member States from imposing obligations to track and monitor users' content.
We also stressed that it is time to fix what is broken: to imagine a version of the Internet where users have a right to remain anonymous, enjoy substantial procedural rights in the context of content moderation, can have more control over how they interact with content, and have a true choice over the services they use through interoperability obligations .
It's a great first step in the right direction that all three EU Parliament reports have considered EFF suggestions. There is an overall agreement that platform intermediaries have a pivotal role to play in ensuring the
availability of content and the development of the Internet. Platforms should not be held responsible for ideas, images, videos, or speech that users post or share online. They should not be forced to monitor and censor users' content and
communication--for example, using upload filters. The Reports also make a strong call to preserve users' privacy online and to address the problem of targeted advertising. Another important aspect of what made the E-Commerce Directive a success is the
"country of origin" principle. It states that within the European Union, companies must adhere to the law of their domicile rather than that of the recipient of the service. There is no appetite from the side of the Parliament to change this
principle. Even better, the reports echo EFF's call to stop ignoring the walled gardens big platforms have become. Large Internet companies should no longer nudge users to stay on a platform that disregards their privacy or
jeopardizes their security, but enable users to communicate with friends across platform boundaries. Unfair trading, preferential display of platforms' own downstream services and transparency of how users' data are collected and shared: the EU
Parliament seeks to tackle these and other issues that have become the new "normal" for users when browsing the Internet and communicating with their friends. The reports also echo EFF's concerns about automated content moderation, which is
incapable of understanding context. In the future, users should receive meaningful information about algorithmic decision-making and learn if terms of service change. Also, the EU Parliament supports procedural justice for users who see their content
removed or their accounts disabled. Concerns Remain The focus on fundamental rights protection and user control is a good starting point for the ongoing reform of Internet legislation in Europe.
However, there are also a number of pitfalls and risks. There is a suggestion that platforms should report illegal content to enforcement authorities and there are open questions about public electronic identity systems. Also, the general focus on
consumer shopping issues, such as liability provision for online marketplaces, may clash with digital rights principles: the Commission itself acknowledged in a recent internal document that "speech can also be reflected in goods, such as books,
clothing items or symbols, and restrictive measures on the sale of such artefacts can affect freedom of expression." Then, the general idea to also include digital services providers established outside the EU could turn out to be a problem to the
extent that platforms are held responsible to remove illegal content. Recent cases ( Glawischnig-Piesczek v Facebook ) have demonstrated the perils of worldwide content takedown orders. It's Your Turn Now @EU_Commission
The EU Commission is expected to present a legislative package on 2 December. During the public consultation process, we urged the Commission to protect freedom of expression and to give control to users rather than the big platforms.
We are hopeful that the EU will work on a free and interoperable Internet and not follow the footsteps of harmful Internet bills such as the German law NetzDG or the French Avia Bill, which EFF helped to strike down . It's time to make it right. To
preserve what works and to fix what is broken.
|
| |
EU arms up against US internet giants
|
|
|
 | 12th October 2020
|
|
| See article from politico.eu
|
The European Commission is beefing up its weapons to take on Big Tech. Under Commission Executive Vice President Margrethe Vestager, the commission is planning to merge two major legislative initiatives on competition into a single text. One is
the so-called New Competition Tool, a market investigation tool that would allow competition enforcers to act more swiftly and forcefully. The other is a part of the Digital Services Act , a new set of rules due to be unveiled in December for companies
like Google, Apple and Amazon. Combined, the new powers would be known as the Digital Markets Act. The act will include a list of do's and don'ts for so-called gatekeeping platforms -- or those who are indispensable for other companies to reach
consumers online -- to curb what it sees as anti-competitive behavior. |
| |
EU plans for extending censorship laws to US messaging services falters
|
|
|
 | 26th
November 2019
|
|
| See
article from reuters.com
|
The European Commission is struggling to agree how to extend internet censorship and control to US messaging apps such as Facebook's WhatsApp and Microsoft's Skype. These services are run from the US and it is not so easy for European police to obtain,
say, tracking or user information as it is from more traditional telecoms services. The Commission has been angling towards applying the rules controlling national telecoms companies to these US 'OTT' messaging services. Extended ePrivacy regulation
was the chosen vehicle for new censorship laws. But now it is reported that the EU countries have yet to find agreement on such issues as tracking users' online activities, provisions on detecting and deleting child pornography and of course how
to further the EU's silly game of trying to see how many times a day EU internet users are willing to click consent boxes without reading reams of terms and conditions. EU ambassadors meeting in Brussels on Friday again reached an impasse, EU
officials said. Tech companies and some EU countries have criticized the ePrivacy proposal for being too restrictive, putting them at loggerheads with privacy activists who back the plan. No doubt the censorship plans will be resuming soon.
|
| |
|
|
|
 | 23rd August 2019
|
|
|
Top EU Court is to Decide on case threatening safe harbour protections underpinning the legality of European websites hosting user content See
article from torrentfreak.com |
| |
|
|
|
 | 20th August
2019
|
|
|
EU planning to grab total control of internet regulations. By David Spence See article from vpncompare.co.uk |
| |
|
|
|
| 17th August 2019
|
|
|
Soon online speech will be regulated by Brussels. By Andrew Tettenborn See article from spiked-online.com
|
| |
European Court of Justice moves towards a position requiring the international internet to follow EU censorship rulings
|
|
|
 | 8th
June 2019
|
|
| 6th June 2019. See
article from techdirt.com |
TechDirt comments: The idea of an open global internet keeps taking a beating -- and the worst offender is not, say, China or Russia, but rather the EU. We've already discussed things like the EU Copyright Directive and the Terrorist Content
Regulation , but it seems like every day there's something new and more ridiculous -- and the latest may be coming from the Court of Justice of the EU (CJEU). The CJEU's Advocate General has issued a recommendation (but not the final verdict) in a new
case that would be hugely problematic for the idea of a global open internet that isn't weighted down with censorship. The case at hand involved someone on Facebook posting a link to an article about an Austrian politician, Eva
Glawischnig-Piesczek, accusing her of being a lousy traitor of the people, a corrupt oaf and a member of a fascist party. An Austrian court ordered Facebook to remove the content, which it complied with by removing access to anyone in Austria. The
original demand was also that Facebook be required to prevent equivalent content from appearing as well. On appeal, a court denied Facebook's request that it only had to comply in Austria, and also said that such equivalent content could only be limited
to cases where someone then alerted Facebook to the equivalent content being posted (and, thus, not a general monitoring requirement). The case was then escalated to the CJEU and then, basically, everything goes off the rails. See
detailed legal findings discussed by techdirt.com
Offsite Comment: Showing how Little the EU Understands About the Web
8th June 2019. See article from forbes.com by
Kalev Leetaru As governments around the world seek greater influence over the Web, the European Union has emerged as a model of legislative intervention, with efforts from GDPR to the Right to be Forgotten to new efforts to allow
EU lawmakers to censor international criticism of themselves. GDPR has backfired spectacularly, stripping away the EU's previous privacy protections and largely exempting the most dangerous and privacy-invading activities it was touted to address. Yet it
is the EU's efforts to project its censorship powers globally that present the greatest risk to the future of the Web and demonstrate just how little the EU actually understands about how the internet works. |
| |
European Parliament removes requirement for internet companies to pre-censor user posts for terrorist content but approves a one-hour deadline for content removal when asked by national authorities
|
|
|
 | 18th April 2019
|
|
| See article from bbc.com |
The European Parliament has approved a draft version of a new EU internet censorship law targeting terrorist content. In particular, the MEPs approved the imposition of a one-hour deadline to remove content marked for censorship by various national authorities. However, the MEPs did not approve a key section of the law requiring internet companies to pre-process and censor terrorist content prior to upload. A European Commission official told the BBC that the changes made to the text by parliament rendered the law ineffective. The Commission will now try to restore the pre-censorship requirement with the new parliament once it is elected. The law would affect social media platforms including Facebook, Twitter and YouTube, which could face fines of up to 4% of their annual global turnover. What does the law say? In amendments, the European Parliament said websites would not be forced to monitor the information they transmit or store, nor to actively seek facts indicating illegal activity. It said that, the first time a removal order is issued, the competent authority should give the website information on the procedures and deadlines at least 12 hours before the one-hour deadline applies. In February, German MEP Julia Reda of the European Pirate Party said the legislation risked the surrender of our fundamental freedoms [and] undermines our liberal democracy. Ms Reda welcomed the changes brought by the European Parliament but said the one-hour deadline was unworkable for platforms run by individuals or small providers. |
| |
Europe's proposed regulation on online extremism endangers freedom of expression. A statement by Index on Censorship
|
|
|
 | 16th January 2019
|
|
| See article from
indexoncensorship.org |
Index on Censorship shares the widespread concerns about the proposed EU regulation on preventing the dissemination of terrorist content online. The regulation would endanger freedom of expression and would create huge practical challenges for companies
and member states. Jodie Ginsberg, CEO of Index, said We urge members of the European Parliament and representatives of EU member states to consider if the regulation is needed at all. It risks creating far more problems than it solves. At a minimum the
regulation should be completely revised. Following the recent agreement by the European Council on a draft position for the proposed regulation on preventing the dissemination of terrorist content online, which adopted the initial
draft presented by the European Commission with some changes, the Global Network Initiative (GNI) is concerned about the potential unintended effects of the proposal and would therefore like to put forward a number of issues we urge the European
Parliament to address as it considers it further. GNI members recognize and appreciate the European Union (EU) and member states' legitimate roles in providing security, and share the aim of tackling the dissemination of terrorist
content online. However, we believe that, as drafted, this proposal could unintentionally undermine that shared objective by putting too much emphasis on technical measures to remove content, while simultaneously making it more difficult to challenge
terrorist rhetoric with counter-narratives. In addition, the regulation as drafted may place significant pressure on a range of information and communications technology (ICT) companies to monitor users' activities and remove content in ways that pose
risks for users' freedom of expression and privacy. We respectfully ask that EU officials, Parliamentarians, and member states take the time necessary to understand these and other significant risks that have been identified, by consulting openly and in
good faith with affected companies, civil society, and other experts. ...Read the full
article from indexoncensorship.org
|
| |
The EU Commissioner for 'justice' and gender equality labels Facebook and Twitter as 'channels of dirt' And then whinges when UK newspapers refer to 'EU dirty rats'
|
|
|
 | 29th
September 2018
|
|
| 22nd September 2018. See article from theverge.com
|
Vera Jourova is the European Commissioner for justice, consumers and gender equality. Once she opened a Facebook account. It did not go well. Jourova said at a news conference: For a short time, I had a Facebook account.
It was a channel of dirt. I didn't expect such an influx of hatred. I decided to cancel the account because I realised there will be less hatred in Europe after I do this.
Jourova's words carry more weight than most. She has a policy
beef with Facebook, and also the means to enforce it. Jourova says Facebook's terms of service are misleading, and has called upon the company to clarify them. In a post Thursday on that other channel of dirt, Twitter.com, she said:
I want #Facebook to be extremely clear to its users about how their service operates and makes money. Not many people know that Facebook has made available their data to third parties or that for instance it holds full copyright about
any picture or content you put on it. Jourova says European authorities could sanction Facebook next year if it doesn't like what it hears from the company soon. I was quite clear that we cannot negotiate forever, she said at the news
conference. We need to see the result. Update: Dishing the dirt 25th September 2018. See
article from theguardian.com Vera Jourova, the European Commissioner for justice, consumers and gender equality, has condemned a series of hard-hitting front pages in the British press after a recent Sun headline described Europe's leaders as 'EU Dirty Rats'. Jourova bad-mouthed the media again in a press release, saying:
Media can build the culture of dialogue or sow divisions, spread disinformation and encourage exclusion. The Brexit debate is the best example of that. Do you remember the front page of a popular British daily
calling the judges the 'enemy of the people'? Or just last week, the EU leaders were called 'Dirty Rats' on another front page. Fundamental rights must be a part of public discourse in the media. They have to belong to the media.
Media are also instrumental in holding politicians to account and in defining the limits of what is 'unacceptable' in a society.
Offsite Comment: Now the EU wants to turn off the Sun 29th September 2018. See article from spiked-online.com by Mick Hume
They dream of stopping populism by curbing press freedom. The European Commission has come up with a new way to prevent people backing Brexit -- not by winning the argument, but by curbing press
freedom . They want to stop the British press encouraging hatred of EU leaders and judges, and impose a European approach of smart regulation to control the views expressed by the tabloids and their supposedly non-smart readers.
...Read the full
article from spiked-online.com |
| |
The European Commission publishes its proposal for massive fines for internet companies that don't implement censorship orders within the hour
|
|
|
 |
15th September 2018
|
|
| See article from money.cnn.com
|
Tech companies that fail to remove terrorist content quickly could soon face massive fines. The European Commission proposed new rules on Wednesday that would require internet platforms to remove illegal terror content within an hour of it being flagged
by national authorities. Firms could be fined up to 4% of global annual revenue if they repeatedly fail to comply. Facebook (FB), Twitter (TWTR) and YouTube owner Google (GOOGL) had already agreed to work with the European Union on a voluntary basis
to tackle the problem. But the Commission said that progress has not been sufficient. A penalty of 4% of annual revenue for 2017 would translate to $4.4 billion for Google parent Alphabet and $1.6 billion for Facebook. The proposal is the
latest in a series of European efforts to control the activities of tech companies. The terror content proposal needs to be approved by the European Parliament and EU member states before becoming law. |
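The penalty arithmetic above is easy to check. A minimal sketch; the revenue figures are our own assumptions (approximate 2017 global revenues consistent with the article's quoted fines, not figures from the article itself):

```python
# Sketch of the proposed penalty: up to 4% of global annual revenue.
# Revenue figures below are assumed approximations for 2017.
FINE_RATE = 0.04

revenue_2017_usd = {
    "Alphabet": 110.9e9,  # assumed approx. 2017 global revenue
    "Facebook": 40.7e9,   # assumed approx. 2017 global revenue
}

for company, revenue in revenue_2017_usd.items():
    max_fine = revenue * FINE_RATE
    print(f"{company}: maximum fine ${max_fine / 1e9:.1f} billion")
```

These assumed revenues reproduce the article's $4.4 billion (Alphabet) and $1.6 billion (Facebook) figures.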
| |
European Commission outlines its plans for direct and immediate censorship control of the internet
|
|
|
 |
21st August 2018
|
|
| See article from
dailymail.co.uk |
Internet companies will have to delete content claimed to be extremist on their platforms within an hour or face being fined, under new censorship plans by the European Commission. The proposals will be set out in draft regulation due to be published
next month, according to The Financial Times. Julian King, the EU's commissioner for security, told the newspaper that Brussels had not seen enough progress when it came to the sites clamping down on terror-related material. Under the
rules, which would have to be agreed by a majority of EU member states, the platforms would have an hour to remove the material, a senior official told the newspaper. The rules would apply to all websites, regardless of their size. King told the
FT: The difference in size and resources means platforms have differing capabilities to act against terrorist content and their policies for doing so are not always transparent. All this leads
to such content continuing to proliferate across the internet, reappearing once deleted and spreading from platform to platform.
Of course the stringent requirements are totally impractical for small companies, and so will no doubt further strengthen the monopolies of US companies with massive workforces. And of course a one-hour turnaround gives absolutely no one time to even consider whether the censorship requests are fair or reasonable, and so translates into a tool for direct state censorship of the internet. |
| |
TorrentFreak suggests that the disgraceful EU law to allow censorship machines to control the internet is just to help US Big Media get more money out of US Big Internet
|
|
|
 |
28th June 2018
|
|
| See article from torrentfreak.com
|
|
| What is the mysterious hold that US Big Music has over Euro politicians?
|
Article 13, the proposed EU legislation that aims to restrict safe harbors for online platforms, was crafted to end the so-called "Value Gap" on YouTube. Music piracy was traditionally viewed as an
easy to identify problem, one that takes place on illegal sites or via largely uncontrollable peer-to-peer networks. In recent years, however, the lines have been blurred. Sites like YouTube allow anyone to upload potentially
infringing content which is then made available to the public. Under the safe harbor provisions of US and EU law, this remains legal -- provided YouTube takes content down when told to do so. It complies constantly but there's always more to do.
This means that in addition to being one of the greatest legal platforms ever created, YouTube is also a goldmine of unlicensed content, something unacceptable to the music industry. They argue that the
existence of this pirate material devalues the licensed content on the platform. As a result, YouTube maintains a favorable bargaining position with the labels and the best licensing deal in the industry. The difference between
YouTube's rates and those the industry would actually like is now known as the "Value Gap" and it has become one of the hottest topics in recent years.
In fact, it is so controversial that new copyright legislation, currently weaving its way through the corridors of power in the EU Parliament, is specifically designed to address it. If passed, Article 13
will require platforms like YouTube to pre-filter uploads to detect potential infringement. Indeed, the legislation may as well have been named the YouTube Act, since it's the platform that provoked this entire debate and the whole Value Gap dispute.
With that in mind, it's of interest to consider the words of YouTube's global head of music Lyor Cohen this week. In an interview with
MusicWeek, Cohen pledges that his company's new music service, YouTube Music,
will not only match the rates the industry achieves from Apple Music and Spotify, but the company's ad-supported free tier viewers will soon be delivering more cash to the labels too. "Of course [rights holders are] going to get more money,"
he told Music Week. If YouTube lives up to its pledge, a level playing field will not only be welcomed by the music industry but also YouTube competitors such as Spotify, who currently offer a free tier on less favorable terms.
While there's still plenty of room for YouTube to maneuver, peace breaking out with the labels may be coming a little too late for those deeply concerned about the implications of Article 13. YouTube's
business model and its reluctance to pay full market rate for music is what started the whole Article 13 movement in the first place and with the Legal Affairs Committee of the Parliament (JURI)
adopting the proposals last week, time is running out to have them overturned.
Behind the scenes, however, the labels and their associates are going flat out to ensure that Article 13 passes, whether YouTube decides to "play fair" or not. Their language suggests that force is the best negotiating tactic with the
distribution giant. Yesterday, UK Music CEO Michael Dugher led a delegation to the EU Parliament in support of Article 13. He was joined by deputy Labour leader Tom Watson and representatives from the BPI, PRS, and Music
Publishers Association, who urged MEPs to support the changes. |
| |
European Parliament committee passed vote to hand over censorship of the internet to US corporate giants
|
|
|
 | 20th June 2018
|
|
| See article from bit-tech.net
|
The European Parliament's Committee on Legal Affairs (JURI) has officially approved Articles 11 and 13 of a Digital Single Market (DSM) copyright proposal, mandating censorship machines and a link tax. Articles 11 and 13 of the Directive of the
European Parliament and of the Council on Copyright in the Digital Single Market have been the subject of considerable campaigning from pro-copyleft groups including the Open Rights Group and Electronic Frontier Foundation of late. Article 11, as
per the final version of the proposal, discusses the implementation of a link tax - the requirement that any site citing third-party materials do so in a way that adheres to the exemptions and restrictions of a total of 28 separate copyright laws or pays
for a licence to use and link to the material; Article 13, meanwhile, requires any site which allows users to post text, sound, program code, still or moving images, or any other work which can be copyrighted to automatically scan all such uploads
against a database of copyright works - a database which they will be required to pay to access. Neither Article 11 nor Article 13 will become official legislation until passed by the entire European Parliament in a plenary vote. There's no definite
timetable for when such a vote might take place, but it would likely happen sometime between December of this year and the first half of 2019. |
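To make concrete what "scanning uploads against a database of copyright works" means in practice, here is a minimal sketch. Real systems such as YouTube's Content ID use perceptual fingerprints that survive re-encoding; this toy version uses exact SHA-256 hashes purely to illustrate the mechanism, and all names (`claims_db`, `register_claim`, `check_upload`) are invented for illustration:

```python
import hashlib

claims_db = {}  # fingerprint -> claimed owner

def fingerprint(data: bytes) -> str:
    # Toy fingerprint: an exact hash; real filters use fuzzy matching.
    return hashlib.sha256(data).hexdigest()

def register_claim(work: bytes, owner: str) -> None:
    # Nothing here verifies that the claimant actually owns the work --
    # exactly the abuse scenario critics of Article 13 point to.
    claims_db[fingerprint(work)] = owner

def check_upload(upload: bytes):
    owner = claims_db.get(fingerprint(upload))
    return ("blocked", owner) if owner else ("allowed", None)

register_claim(b"hit single master recording", "BigLabel")
print(check_upload(b"hit single master recording"))  # blocked
print(check_upload(b"my holiday video"))             # allowed
```

Note that nothing in `register_claim` checks ownership: anyone able to file claims in bulk could have legitimate uploads blocked, which is the attack described in the entries below.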
| |
In two days, an EU committee will vote to crown Google and Facebook permanent lords of internet censorship
|
|
|
 |
19th June 2018
|
|
| See article from boingboing.net CC by Cory Doctorow |
On June 20, the EU's legislative committee will vote on the new Copyright directive , and decide whether it will include the controversial "Article 13" (automated
censorship of anything an algorithm identifies as a copyright violation) and "Article 11" (no linking to news stories without paid permission from the site). These proposals will make starting new internet companies
effectively impossible -- Google, Facebook, Twitter, Apple, and the other US giants will be able to negotiate favourable rates and build out the infrastructure to comply with these proposals, but no one else will. The EU's regional tech success stories
-- say Seznam.cz, a successful Czech search competitor to Google -- don't have $60-100,000,000 lying around to build out their filters, and lack the leverage to extract favorable linking
licenses from news sites. If Articles 11 and 13 pass, American companies will be in charge of Europe's conversations, deciding which photos and tweets and videos can be seen by the public, and who may speak.
The MEP Julia Reda has written up the state of play on the vote, and it's very bad. Both left- and right-wing parties
have backed this proposal, including (incredibly) the French Front National, whose YouTube channel was just deleted by a copyright filter of the sort
they're about to vote to universalise.
So far, the focus in the debate has been on the intended consequences of the proposals: the idea that a certain amount of free expression and competition must be sacrificed to enable rightsholders to force Google and Facebook to
share their profits. But the unintended -- and utterly foreseeable -- consequences are even more important. Article 11's link tax allows news sites to decide who gets to link to them, meaning that they can exclude their critics.
With election cycles dominated by hoaxes and fake news, the right of a news publisher to decide who gets to criticise it is carte blanche to lie and spin. Article 13's copyright filters are even more vulnerable to attack: the
proposals contain no penalties for false claims of copyright ownership, but they do mandate that the filters must accept copyright claims in bulk, allowing rightsholders to upload millions of works at once in order to claim their copyright
and prevent anyone from posting them. That opens the doors to all kinds of attacks. The obvious one is that trolls might sow mischief by uploading millions of works they don't hold the copyright to, in order to prevent others from
quoting them: the works of Shakespeare, say, or everything ever posted to Wikipedia, or my novels, or your family photos. More insidious is the possibility of targeted strikes during crisis: stock-market manipulators could use
bots to claim copyright over news about a company, suppressing its sharing on social media; political actors could suppress key articles during referendums or elections; corrupt governments could use arms-length trolls to falsely claim ownership of
footage of human rights abuses. It's asymmetric warfare: falsely claiming a copyright will be easy (because the rightsholders who want this system will not tolerate jumping through hoops to make their claims) and instant (because
rightsholders won't tolerate delays when their new releases are being shared online at their moment of peak popularity). Removing a false claim of copyright will require that a human at an internet giant looks at it, sleuths out the truth of the
ownership of the work, and adjusts the database -- for millions of works at once. Bots will be able to pollute the copyright databases much faster than humans could possibly clear them. I spoke with Wired UK's KG Orphanides about
this, and their excellent article on the proposal is the best explanation I've seen of the uses of these copyright filters to create
unstoppable disinformation campaigns. Doctorow highlighted the potential for unanticipated abuse of any automated copyright filtering system to make false copyright claims, engage in targeted harassment and even
silence public discourse at sensitive times. "Because the directive does not provide penalties for abuse -- and because rightsholders will not tolerate delays between claiming copyright over a work and suppressing its public
display -- it will be trivial to claim copyright over key works at key moments or use bots to claim copyrights on whole corpuses. The nature of automated systems, particularly if powerful rightsholders insist that they default to
initially blocking potentially copyrighted material and then releasing it if a complaint is made, would make it easy for griefers to use copyright claims over, for example, relevant Wikipedia articles on the eve of a Greek debt-default referendum or,
more generally, public domain content such as the entirety of Wikipedia or the complete works of Shakespeare. "Making these claims will be MUCH easier than sorting them out -- bots can use cloud providers all over the world
to file claims, while companies like Automattic (WordPress) or Twitter, or even projects like Wikipedia, would have to marshall vast armies to sort through the claims and remove the bad ones -- and if they get it wrong and remove a legit copyright claim,
they face unbelievable copyright liability."
|
| |
The UN's free speech rapporteur condemns the EU's censorship machines that will violate human rights
|
|
|
 | 17th June 2018
|
|
| See
article from techdirt.com |
David Kaye, the UN's Special Rapporteur on freedom of expression has now chimed in with a very thorough report, highlighting how Article 13 of the Directive -- the part about mandatory copyright filters -- would be a disaster for free speech and would
violate the UN's Declaration on Human Rights, and in particular Article 19 which says: Everyone has the right to freedom of opinion and expression; the right includes freedom to hold opinions without interference and to
seek, receive and impart information and ideas through any media regardless of frontiers.
As Kaye's report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.
Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed
versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking
effective and proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave
considerable leeway for interpretation. The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression
should be provided by law. Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of
upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions.
Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching,
criticism, satire and parody.
Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing
algorithms at the problem -- especially when a website may face legal liability for getting it wrong. The designation of such mechanisms as the main avenue to address users' complaints effectively delegates content
blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content -- particularly in the context of fair use and other
fact-sensitive exceptions to copyright -- may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and
expedited judicial process are available as less invasive means for protecting the aims of copyright law. In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism
established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I am concerned that such delegation would violate the State's obligation to provide access to an effective remedy for
violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and
impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer
route of claiming that the upload was a terms of service violation -- this outcome may deprive users of even the remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content
restrictions, may often fail to address financial and other harms associated with the blocking of time-sensitive content.
He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be
in serious trouble: I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is
based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial
resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although
Article 13(5)'s criteria for effective and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that
nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could
be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.
|
| |
Vint Cerf, Tim Berners-Lee, and Dozens of Other Computing Experts Oppose Article 13 of the EU's new internet censorship law
|
|
|
 |
13th June 2018
|
|
| See article from eff.org See
joint letter that was released today [pdf] |
As Europe's latest copyright proposal heads to a critical vote on June 20-21, more than 70 Internet and
computing luminaries have spoken out against a dangerous provision, Article 13, that would require Internet platforms to automatically filter uploaded content. The group, which includes Internet pioneer Vint Cerf, the inventor of the World Wide Web Tim
Berners-Lee, Wikipedia co-founder Jimmy Wales, co-founder of the Mozilla Project Mitchell Baker, Internet Archive founder Brewster Kahle, cryptography expert Bruce Schneier, and net neutrality expert Tim Wu, wrote in a joint letter that was released today : By requiring Internet platforms to perform automatic filtering of all of the content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet, from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users.
The prospects for the elimination of Article 13 have continued to worsen. Until late last month, there was hope that the Member States (represented by the Council of the European Union) would find a compromise. Instead, their
final negotiating mandate doubled down on it. The last hope for defeating the proposal now lies with the European Parliament. On June 20-21 the Legal Affairs (JURI) Committee will vote on the proposal. If it votes against upload
filtering, the fight can continue in the Parliament's subsequent negotiations with the Council and the European Commission. If not, then automatic filtering of all uploaded content may become a mandatory requirement for all user content platforms that
serve European users. Although this will pose little impediment to the largest platforms such as YouTube, which already uses its Content ID system to filter content, the law will create an expensive barrier to entry for smaller platforms and startups, which may choose to establish or move their operations overseas in order to avoid the European law.
For those platforms that do establish upload filtering, users will find that their contributions -- including video, audio, text, and even source code -- will be monitored and potentially blocked if the automated system detects what it believes to be a copyright infringement. Inevitably, mistakes will happen. There is no
way for an automated system to reliably determine when the use of a copyright work falls within a copyright limitation or exception under European law, such as quotation or parody. Moreover, because these exceptions are not
consistent across Europe, and because there is no broad fair use right as in the United States, many harmless uses of copyright works in memes, mashups, and remixes probably are technically infringing even if no reasonable copyright owner would
object. If an automated system monitors and filters out these technical infringements, then the permissible scope of freedom of expression in Europe will be radically curtailed, even without the need for any substantive changes in copyright law.
The upload filtering proposal stems from a misunderstanding about the purpose of copyright. Copyright isn't designed to compensate creators for each and every use of their works. It is meant to incentivize creators as part of an effort to promote the public interest in innovation and expression. But that public interest isn't served unless there are limitations on copyright that allow new generations to build on and comment on the previous contributions. Those limitations are both legal, like fair dealing, and practical, like the zone of tolerance for harmless uses. Automated upload
filtering will undermine both. The authors of today's letter write: We support the consideration of measures that would improve the ability for creators to receive fair remuneration for the use
of their works online. But we cannot support Article 13, which would mandate Internet platforms to embed an automated infrastructure for monitoring and censorship deep into their networks. For the sake of the Internet's future, we urge you to vote for
the deletion of this proposal.
What began as a bad idea offered up to copyright lobbyists as a solution to an imaginary "value gap" has now become an outright crisis for the future of the Internet as we know it. Indeed, if
those who created and sustain the operation of the Internet recognize the scale of this threat, we should all be sitting up and taking notice. If you live in Europe or have European friends or family, now could be your last
opportunity to avert the upload filter. Please take action by clicking the button below, which will take you to a campaign website where you can phone, email, or Tweet at your representatives, urging them to stop this threat to the global Internet before
it's too late. Take Action at saveyourinternet.eu
|
| |
TorrentFreak explains the grave threat to internet users and European small businesses
|
|
|
 | 6th June 2018
|
|
| See article from torrentfreak.com See
also saveyourinternet.eu |
The EU's plans to modernize copyright law in Europe are moving ahead. With a crucial vote coming up later this month, protests from various opponents are on the rise as well. They warn that the proposed plans will result in Internet filters which
threaten people's ability to freely share content online. According to Pirate Party MEP Julia Reda, these filters will hurt not only regular Internet users, but also creators and businesses. In September 2016, the European Commission
published its proposal for a modernized copyright law. Among other things, it proposed measures to require online services to do more to fight piracy. Specifically, Article 13 of the proposed Copyright Directive will require
online services to track down and delete pirated content, in collaboration with rightsholders. The Commission stressed that the changes are needed to support copyright holders. However, many legal scholars, digital activists,
politicians, and members of the public worry that they will violate the rights of regular Internet users. Last month the EU Council finalized the latest version of the proposal. This means that the matter now goes to the Legal
Affairs Committee of the Parliament (JURI), which must decide how to move ahead. This vote is expected to take place in two weeks. Although the term filter is commonly used to describe Article 13, it is not directly mentioned in
the text itself. According to Pirate Party Member of Parliament (MEP) Julia Reda, the filter keyword is avoided in the proposal to prevent a possible violation of EU law and the Charter of Fundamental Rights. However, the
outcome is essentially the same. In short, the relevant text states that online services are liable for any uploaded content unless they take effective and proportionate action to prevent copyright infringements, identified by
copyright holders. That also includes preventing these files from being reuploaded. The latter implies some form of hash filtering and continuous monitoring of all user uploads. Several companies, including Google Drive, Dropbox,
and YouTube already have these types of filters, but many others don't. A main point of critique is that the automated upload checks will lead to overblocking, as they are often ill-equipped to deal with issues such as fair use.
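The hash filtering and re-upload prevention described above can be sketched in miniature. This is an illustrative toy, not any platform's actual system; the `UploadFilter` class and its method names are invented for the example:

```python
import hashlib

class UploadFilter:
    """Toy hash-based re-upload filter (illustrative only).

    Rightsholder-identified works are fingerprinted with SHA-256; any later
    upload whose bytes produce a matching digest is rejected before storage.
    """

    def __init__(self):
        self.blocked_digests = set()

    def register_infringing(self, data: bytes) -> None:
        # The rightsholder supplies a copy of the work; only its digest is kept.
        self.blocked_digests.add(hashlib.sha256(data).hexdigest())

    def allow_upload(self, data: bytes) -> bool:
        # Every user upload is hashed and checked before it is accepted.
        return hashlib.sha256(data).hexdigest() not in self.blocked_digests

f = UploadFilter()
f.register_infringing(b"identified work")
assert not f.allow_upload(b"identified work")          # exact copy is blocked
assert f.allow_upload(b"identified work, re-encoded")  # any change evades an exact hash
```

Note that an exact-hash filter is trivially evaded by re-encoding a file, which is why deployed systems rely on fuzzy perceptual matching instead, and it is precisely that fuzziness that produces the overblocking and fair-use failures criticised here.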
The proposal would require platforms to filter all uploads by their users for potential copyright infringements -- not just YouTube and Facebook, but also services like WordPress, TripAdvisor, or even Tinder. We know from
experience that these algorithmic filters regularly make mistakes and lead to the mass deletion of legal uploads, Julia Reda tells TF. Especially small independent creators frequently see their content taken down because others
wrongfully claim copyright on their works. There are no safeguards in the proposal against such cases of copyfraud. Besides affecting uploads of regular Internet users and smaller creators, many businesses will also be hit. They
will have to make sure that they can detect and prevent infringing material from being shared on their systems. This will give larger American Internet giants, who already have these filters in place, a competitive edge over
smaller players and new startups, the Pirate Party MEP argues. It will make those Internet giants even stronger, because they will be the only ones able to develop and sell the filtering technologies necessary to comply with the
law. A true lose-lose situation for European Internet users, authors and businesses, Reda tells us. Based on the considerable protests in recent days, the current proposal is still seen as a clear threat by many.
In fact, the saveyourinternet campaign, backed by prominent organizations such as Creative Commons, EFF, and Open Media, is ramping up again. They urge the
European public to reach out to their Members of Parliament before it's too late. Should Article 13 of the Copyright Directive proposal be adopted, it will impose widespread censorship of all the content you share online. The
European Parliament is the only one that can step in and Save your Internet, they write. The full Article 13 text includes some language to limit its scope. The nature and size of online services must be taken into account, for
example. This means that a small and legitimate niche service with a few dozen users might not be directly liable if it operates without these anti-piracy measures. Similarly, non-profit organizations will not be required to
comply with the proposed legislation, although there are calls from some member states to change this. In addition to Article 13, there is also considerable pushback from the public against Article 11, which is regularly referred
to as the link tax. At the moment, several organizations are planning a protest day next week, hoping to mobilize the public to speak out. A week later, following the JURI vote, it will be judgment day. If
they pass the Committee, the plans will progress towards the final vote on copyright reform next spring. This also means that they'll become much harder to stop or change. That has been done before, such as with ACTA, but achieving that type of momentum
will be a tough challenge.
|
| |
The EU Security Commissioner threatens censorship laws if social media companies don't censor themselves voluntarily
|
|
|
 |
24th April 2018
|
|
| See article from theguardian.com
|
Brussels may threaten social media companies with censorship laws unless they move urgently to tackle supposed 'fake news' and Cambridge Analytica-style data abuse. The EU security commissioner, Julian King, said short-term, concrete plans needed to
be in place before the elections, when voters in 27 EU member states will elect MEPs. Under King's ideas, social media companies would sign a voluntary code of conduct to prevent the misuse of platforms to pump out misleading information. The code would include a pledge for greater transparency, so users would be made aware why their Facebook or Twitter feed was presenting them with certain adverts or stories. Another proposal is for political adverts to be accompanied with information about who paid for them.
|
| |
|
|
|
 | 27th November 2017
|
|
|
Detailed discussion of the EU proposed internet censorship law requiring internet companies to pre-censor user posts See article
from cyberleagle.com |
| |
The European Union enacts new regulation enabling the blocking of websites without judicial oversight
|
|
|
 | 23rd November 2017
|
|
| 16th November 2017. See
article from bleepingcomputer.com |
The European Union voted on November 14 to pass the new internet censorship regulation, nominally in the name of consumer protection. But of course censorship often hides behind consumer protection, eg the UK's upcoming internet porn ban is enacted in
the name of protecting under-18 internet consumers. The new EU-wide law gives extra power to national consumer protection agencies, but also contains a vaguely worded clause that grants them the power to block and take down websites without
judicial oversight. Member of the European Parliament Julia Reda said in a speech in the European Parliament Plenary during a last-ditch effort to amend the law: The new law establishes overreaching Internet
blocking measures that are neither proportionate nor suitable for the goal of protecting consumers and come without mandatory judicial oversight,
According to the new rules, national consumer protection authorities can order any
unspecified third party to block access to websites without requiring judicial authorization, Reda added later in the day on her blog. This new law is an EU regulation and not a directive, meaning it's obligatory for all EU states. The new
law proposal started out with good intentions, but sometime in the spring of 2017, the proposed regulation received a series of amendments that watered down some consumer protections but kept intact the provisions that ensured national consumer
protection agencies can go after and block or take down websites. Presumably multinational companies had been lobbying for new weapons in their battle against copyright infringement. For instance, the new law gives national consumer protection
agencies the legal power to inquire and obtain information about domain owners from registrars and Internet Service Providers. Besides the website blocking clause, authorities will also be able to request information from banks to detect the
identity of the responsible trader, to freeze assets, and to carry out mystery shopping to check geographical discrimination or after-sales conditions. Comment: European Law Claims to Protect Consumers... By Blocking the Web
23rd November 2017 See article from eff.org
Last week the European Parliament passed a new Consumer Protection Regulation [PDF] that allows national
consumer authorities to order ISPs, web hosts and domain registries to block or delete websites... all without a court order. The websites targeted are those that allegedly infringe European consumer law. But European consumer law has some perplexing
provisions that have drawn ridicule, including a prohibition on children blowing up balloons unsupervised and a ban on excessively curvy bananas. Because of these, the range of websites that could be censored is both vast and uncertain.
The Consumer Protection Regulation provides in Article 8(3)(e) that consumer protection authorities must have the power: where no other effective means are available to bring about the cessation or
the prohibition of the infringement including by requesting a third party or other public authority to implement such measures, in order to prevent the risk of serious harm to the collective interests of consumers:
to remove content or restrict access to an online interface or to order the explicit display of a warning to consumers when accessing the online interface; to order a hosting service provider to
remove, disable or restrict the access to an online interface; or where appropriate, order domain registries or registrars to delete a fully qualified domain name and allow the competent authority concerned to register it;
The risks of unelected public authorities being given the power to block websites was powerfully demonstrated in 2014, when the Australian company regulator ASIC
accidentally blocked 250,000 websites in an attempt to block just a handful of sites alleged to be
defrauding Australian consumers. This likelihood of unlawful overblocking is just one of the reasons that the United Nations Special Rapporteur for Freedom of Expression and Opinion has underlined how web blocking often
contravenes international human rights law. In a 2011 report [PDF], then Special Rapporteur Frank La Rue set out how extremely
limited are the circumstances in which blocking of websites can be justified, noting that where: the specific conditions that justify blocking are not established in law, or are provided by law but in an overly broad
and vague manner, [this] risks content being blocked arbitrarily and excessively. ... [E]ven where justification is provided, blocking measures constitute an unnecessary or disproportionate means to achieve the purported aim, as they are often not
sufficiently targeted and render a wide range of content inaccessible beyond that which has been deemed illegal. Lastly, content is frequently blocked without the intervention of or possibility for review by a judicial or independent body.
This describes exactly what the new Consumer Protection Regulation will do. It hands over a power that should only be exercised, if at all, under the careful scrutiny of a judge in the most serious of cases, and allows it
to be wielded at the whim of an unelected consumer protection agency. As explained by Member of the European Parliament (MEP) Julia Reda, who voted against
the legislation, it sets the stage for the construction of a censorship infrastructure that could be misused for purposes that we cannot even anticipate, ranging from copyright enforcement through to censorship of political protest.
Regrettably, the Regulation is now law, and is required to be enforced by all European states. It is both ironic and tragic that a law intended to protect consumers actually poses such a dire threat to their right to freedom of
expression. |
| |
56 European human rights groups call on the EU to abandon its disgraceful law proposal requiring the pre-censorship of content as it is being uploaded to the internet
|
|
|
 | 17th
October 2017
|
|
| See article from indexoncensorship.org
|
Article 13: Monitoring and filtering of internet content is unacceptable. Index on Censorship joined with 56 other NGOs to call for the deletion of Article 13 from the proposal on the Digital Single Market, which includes obligations on internet
companies that would be impossible to respect without the imposition of excessive restrictions on citizens' fundamental rights. Dear President Juncker, Dear President Tajani, Dear President Tusk, Dear Prime Minister
Ratas, Dear Prime Minister Borissov, Dear Ministers, Dear MEP Voss, MEP Boni The undersigned stakeholders represent fundamental rights organisations. Fundamental rights, justice and the rule of
law are intrinsically linked and constitute core values on which the EU is founded. Any attempt to disregard these values undermines the mutual trust between member states required for the EU to function. Any such attempt would also undermine the
commitments made by the European Union and national governments to their citizens. Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to
respect without the imposition of excessive restrictions on citizens' fundamental rights. Article 13 introduces new obligations on internet service providers that share and store user-generated content, such as video or
photo-sharing platforms or even creative writing websites, including obligations to filter uploads to their services. Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block
EU citizens' communications if they are to have any chance of staying in business. Article 13 contradicts existing rules and the case law of the Court of Justice. The Directive on Electronic Commerce (2000/31/EC)
regulates the liability for those internet companies that host content on behalf of their users. According to the existing rules, there is an obligation to remove any content that breaches copyright rules, once this has been notified to the provider.
Article 13 would force these companies to actively monitor their users' content, which contradicts the 'no general obligation to monitor' rules in the Electronic Commerce Directive. The requirement to install a system for filtering
electronic communications has twice been rejected by the Court of Justice, in the cases Scarlet Extended (C-70/10) and Netlog/Sabam (C-360/10). Therefore, a legislative provision that requires internet companies to install a filtering system would
almost certainly be rejected by the Court of Justice because it would contravene the requirement that a fair balance be struck between the right to intellectual property on the one hand, and the freedom to conduct business and the right to freedom of
expression, such as to receive or impart information, on the other. In particular, the requirement to filter content in this way would violate the freedom of expression set out in Article 11 of the Charter of Fundamental
Rights. If internet companies are required to apply filtering mechanisms in order to avoid possible liability, they will. This will lead to excessive filtering and deletion of content and limit the freedom to impart information on the one hand, and
the freedom to receive information on the other. If EU legislation conflicts with the Charter of Fundamental Rights, national constitutional courts are likely to be tempted to disapply it and we can expect such a rule to be
annulled by the Court of Justice. This is what happened with the Data Retention Directive (2006/24/EC), when EU legislators ignored compatibility problems with the Charter of Fundamental Rights. In 2014, the Court of Justice declared the Data
Retention Directive invalid because it violated the Charter. Taking into consideration these arguments, we ask the relevant policy-makers to delete Article 13. European Digital Rights (EDRi) Access Info
ActiveWatch Article 19 Associação D3 -- Defesa dos Direitos Digitais Associação Nacional para o Software Livre (ANSOL) Association for Progressive Communications (APC) Association for Technology and Internet (ApTI) Association
of the Defence of Human Rights in Romania (APADOR) Associazione Antigone Bangladesh NGOs Network for Radio and Communication (BNNRC) Bits of Freedom (BoF) BlueLink Foundation Bulgarian Helsinki Committee Center for Democracy &
Technology (CDT) Centre for Peace Studies Centrum Cyfrowe Coalizione Italiana Libertà e Diritti Civili (CILD) Code for Croatia COMMUNIA Culture Action Europe Electronic Frontier Foundation (EFF) epicenter.works Estonian Human Rights Centre
Freedom of the Press Foundation Frënn vun der Ënn Helsinki Foundation for Human Rights Hermes Center for Transparency and Digital Human Rights Human Rights Monitoring Institute Human Rights Watch Human Rights Without Frontiers
Hungarian Civil Liberties Union Index on Censorship International Partnership for Human Rights (IPHR) International Service for Human Rights (ISHR) Internautas JUMEN Justice & Peace La Quadrature du Net Media
Development Centre Miklos Haraszti (Former OSCE Media Representative) Modern Poland Foundation Netherlands Helsinki Committee One World Platform Open Observatory of Network Interference (OONI) Open Rights Group (ORG) OpenMedia
Panoptykon Plataforma en Defensa de la Libertad de Información (PDLI) Reporters without Borders (RSF) Rights International Spain South East Europe Media Organisation (SEEMO) South East European Network for Professionalization of
Media (SEENPM) Statewatch The Right to Know Coalition of Nova Scotia (RTKNS) Xnet
|
| |
|
|
|
 | 5th July 2016
|
|
|
The Internet Referral Unit has now been politely asking for online terrorism content to be removed for a year See
article from arstechnica.com |
| |
European Parliament considers EU wide internet website blocking
|
|
|
 |
23rd June 2016
|
|
| See article from
arstechnica.co.uk
|
The European Parliament is currently considering EU wide website blocking powers. The latest draft of the directive on combating terrorism contains proposals on blocking websites that promote or incite terror attacks. Member states may take all
necessary measures to remove or to block access to webpages publicly inciting to commit terrorist offences, says text submitted by German MEP and rapporteur Monika Hohlmeier. Digital rights activists have argued that it leaves the door wide
open to over-blocking and censorship as safeguards defending proportionality and fundamental rights can be skipped if governments opt for voluntary schemes implemented by ISPs. Amendments have been proposed that would require any take down
or Web blocking to be subject to full judicial oversight and rubber stamping. Last week, Estonian MEP Marju Lauristin told Ars she was very disappointed with the text, saying it was jeopardising freedom of expression as enshrined in the
Charter of Fundamental Rights of EU. The measure will be up for a vote by the civil liberties committee on 27th June. |
| |
Tony Blair appointed to top role in organisation campaigning for a new Europe-wide blasphemy law disguised behind Orwellian doublespeak about tolerance
|
|
|
 |
7th June 2015
|
|
| |
The misleadingly named European Council on Tolerance and Reconciliation (ECTR) is a campaign group backed by European Jewish leaders, and a gaggle of former EU heads of state and government. It calls for pan-European legislation outlawing antisemitism
and criticism of religion, coining the phrase 'group libel' to mirror the Muslim phrase 'defamation of religion'. The group recently published a document proposing to outlaw antisemitism as well as criminalising a host of other activities that the
group deems to be violating fundamental rights on religious, cultural, ethnic and gender grounds. The group cleverly heads the list with some justifiable prohibitions, female genital mutilation, forced marriage, polygamy, but then slips in extensive
censorship and blasphemy items, eg criminalising xenophobia, and creating a new crime of group libel, ie public defamation of ethnic, cultural or religious groups. The proposed legislation would also curb freedom of expression on grounds
of a bizarre definition of 'tolerance'. The document twists the meaning of tolerance to try and justify the end to the right of freedom of expression: Tolerance is a two-way street. Members of a group who wish to
benefit from tolerance must show it to society at large, as well as to members of other groups and to dissidents or other members of their own group. There is no need to be tolerant to the intolerant. This is especially important
as far as freedom of expression is concerned: that freedom must not be abused to defame other groups.
But the document goes much further, calling for the criminalisation of overt approval of a totalitarian ideology, xenophobia or
antisemitism. The group has now appointed Tony Blair as chairman. Comment: Tony Blair's plans to tackle extremism will stifle free speech See
article from indexoncensorship.org
Index on Censorship considers Tony Blair's proposals on hate speech to be dangerous and divisive. Blair has defended plans to lower the barriers on what constitutes incitement to violence and make Holocaust denial illegal. Jodie Ginsberg, CEO of
Index on Censorship, said: These suggestions, far from protecting people, are likely to have the opposite effect, driving extremist views underground where they can fester and grow. Instead, we should be protecting free
expression, including speech that may be considered offensive or hateful, in order to expose and challenge those views. Individuals should always be protected from incitement to violence and that protection already exists in law,
as do stringent laws on hate speech. Further legislation is not needed.
Comment: NSS criticises Tony Blair's plans to entrench religion in public life across Europe See
article from
secularism.org.uk
The National Secular Society (NSS) has criticised Blair ahead of his appointment as chair of the ECTR as ill thought out and counter-productive. The former Prime Minister has defended proposals lowering the barriers to what
constitutes incitement to violence and pan-European plans to make Holocaust denial illegal and to entrench state funding for religious institutions into law. The NSS is adamant that measures such as 'group libel' would be
counter-productive, have a massive chilling effect on free speech and would be likely to restrict the open debate necessary to resolve problems. Keith Porteous Wood, NSS executive director, said: Britain already
has draconian legislation on religious insults -- a possible seven year jail term with a low prosecution threshold. Politicians have already called for the outlawing of Islamophobia, playing into the hands of those intent on closing down honest debate
about and within Islam. There is no need for more laws, and the ones we already have fail to adequately protect freedom of expression. A robust civil society with a deep commitment to free expression is our best hope for
challenging and countering bigoted narratives and misguided views. Driving extremist views underground will only allow them to fester and allow their proponents to present themselves as martyrs. Outlawing Holocaust denial
completely undermines the West's defence of freedom of speech at home and abroad and removes our moral authority to propound freedom of expression abroad. No one has the right in a plural society not to be offended and ideas should not be proscribed but
people should be defended from incitement to violence. A European-wide Holocaust denial law would be exhibit A in every response from dictators abroad - and Islamists at home - when we criticise their appalling human rights
records or challenge their rhetoric and beliefs.
The NSS has also accused Blair of being confused over the role of religion. For Mr Blair to dismiss those intent on justifying violence in
the name of religion as abusing religion and using it as a mask reveals that his enthusiasm for religion has once more led him to misunderstand one of the roots of this problem. While few would suggest that extremists' interpretations of their faith are
mainstream in today's society, it is naive and counterproductive to deny the role that such interpretations play in their religio-political motivations.
Comment: BBC to be forced to report the news
under a narrow set of acceptable values. See article from ukip.org
Tony Blair's new role as chairman of the European Council on Tolerance and Reconciliation (ECTR) is in fact supporting an organisation that is a danger to free speech. Paul Nuttall, UKIP Deputy Leader and MEP for the North West, said the ECTR wants
public broadcasting companies like the BBC to be forced under legal statute to report the news under a narrow set of acceptable values. He explained: Tony Blair is joining an organisation that explicitly wants
to see legislative control of news output. The ECTR sent a framework statute to members of the European Parliament with the intention of it becoming law that frankly caused great concern. It included dictatorial powers to demand
that 'public broadcasting (television and radio) stations will devote a prescribed percentage of their programmes to promoting a climate of tolerance'. It also called for private and public media to be controlled by a Media Complaints Commission driven
by a narrow set of acceptable values. The ECTR also called for certain new 'thought' crimes to be regarded as aggravated criminal offences, such as the 'overt approval of a totalitarian ideology, xenophobia'. This is very
dangerous stuff and is utterly against the great tradition of free speech in this country. Do we really want our news reports to be dictated by a political organisation led by Blair? Even worse is that Mr Blair's organisation also
proposes re-education programmes, which brings to mind the 1930s. It proposes young people 'convicted of committing crimes listed will be required to undergo a rehabilitation programme designed to instil in them a culture of tolerance'. It's very
worrying that in championing the ECTR, Mr Blair appears to want to enforce an Orwellian-style 'Ministry of Information' regime upon the population without taking it to the ballot box.
Offsite Comment: Tony Blair
has just joined the crew of reckless muzzlers 7th June 2015. See article from
theguardian.com by Nick Cohen
Moves by Blair, Cameron and co to end tolerance of intolerance will create a country unable to be honest with itself. |
| |
Jewish leaders in a disgraceful call for censorship and a European blasphemy law, subtly hiding it behind a ban on reprehensible cultural practices
|
|
|
 |
28th January 2015
|
|
| See article from
theguardian.com |
European Jewish leaders, backed by former EU heads of state and government, are calling for pan-European legislation outlawing antisemitism and criticism of religion. A panel of four international Jewish leaders backed by the misleadingly named
European Council on Tolerance and Reconciliation (ECTR) has spent three years drafting a 12-page document on 'tolerance'. They are lobbying to have it converted into law in the 28 countries of the EU. The proposal would outlaw
antisemitism as well as criminalising a host of other activities that the group deems to be violating fundamental rights on religious, cultural, ethnic and gender grounds. The group heads the list with some justifiable prohibitions, female
genital mutilation, forced marriage, polygamy, but then slips in extensive censorship and blasphemy items, eg criminalising xenophobia, and creating a new crime of group libel, ie public defamation of ethnic, cultural or religious groups. Then to
try and generate a little support, the group extends the list to include women's and gay rights. The proposed legislation would also curb freedom of expression on grounds of a bizarre definition of 'tolerance'. The document twists the meaning of
tolerance to try and justify the end to the right of freedom of expression: Tolerance is a two-way street. Members of a group who wish to benefit from tolerance must show it to society at large, as well as to members
of other groups and to dissidents or other members of their own group. There is no need to be tolerant to the intolerant. This is especially important as far as freedom of expression is concerned: that freedom must not be abused
to defame other groups.
But the document goes much further, calling for the criminalisation of overt approval of a totalitarian ideology, xenophobia or antisemitism. Education in tolerance should be mandatory from
primary school to university, and for the military and the police, while public broadcasting must devote a prescribed percentage of its programmes to promoting a climate of 'tolerance'. The panel was chaired by Yoram Dinstein, a war
crimes expert, professor and former president of Tel Aviv university. The drafters are currently touring the parliaments of Europe trying to drum up support. |
| |
Legal advice to the European Court of Justice confirms the legality of ISPs being ordered to block copyright infringing websites
|
|
|
 | 27th November 2013
|
|
| See article from
torrentfreak.com |
In legal advice to the EU Court of Justice, Advocate General Pedro Cruz Villalon has announced that EU law allows for ISPs to be ordered to block their customers from accessing known copyright infringing sites. The opinion, which relates to a dispute
between a pair of movie companies and an Austrian ISP over the now-defunct site Kino.to, is not legally binding. However, the advice of the Advocate General is usually followed in such cases. The current dispute involves Austrian ISP UPC Telekabel
Wien and movie companies Constantin Film Verleih and Wega Filmproduktionsgesellschaft. The film companies complained that the ISP was providing its subscribers with access to Kino.to which enabled them to access their copyrighted material without
permission. Interim injunctions were granted in the movie companies' favor which required the ISP to block the site. However, the Austrian Supreme Court later issued a request to the Court of Justice to clarify whether a provider that provides
Internet access to users of an illegal website is to be regarded as an intermediary, in the same way that the host of an illegal site might be. In his opinion, Advocate General Pedro Cruz Villalon said that the ISP of a user accessing a website
said to be infringing copyright should also be regarded as an intermediary whose services are used by a third party, such as the operator of an infringing website. This means that the ISP of an infringing site user can be subjected to a blocking
injunction, as long as it contains specifics on the technicalities. |
29th April 2011 | | |
EU proposal to create a Great Firewall of Europe
| See article from
telegraph.co.uk
|
Broadband providers have voiced alarm over an EU proposal to create a Great Firewall of Europe by blocking illicit web material at the borders of the bloc. The proposal emerged from an obscure meeting of the Council of the European
Union's Law Enforcement Work Party (LEWP), a forum for cooperation on issues such as counter terrorism, customs and fraud. The minutes from the meeting state: The Presidency of the LEWP presented its
intention to propose concrete measures towards creating a single secure European cyberspace with a certain virtual Schengen border and virtual access points whereby the Internet Service Providers (ISP) would block illicit contents on the
basis of the EU black-list. Delegations were also informed that a conference on cyber-crime would be held in Budapest on 12-13 April 2011.
Malcolm Hutty, head of public affairs at LINX, a cooperative of British ISPs,
said the plan appeared ill thought-out and confused: We take the view that network-level filtering of the type proposed has been proven ineffective. Broadband providers say that illegal content should be removed at the source
by cooperation between police and web hosting firms, because network blocking can easily be circumvented.
|
15th January 2011 | | |
| Euro ISPs unimpressed by EU-proposed mandate of ISP website blocking
| See
article from theregister.co.uk See also
Blocking sites leads to less policing of criminal content from
pcpro.co.uk
|
The European Commission has drafted new laws to force ISPs to block child porn. The measure will be voted on by the European Parliament next month. The technical solutions envisaged are broadly based on arrangements in the UK, where all major ISPs block
access to child abuse websites named on a list maintained by the Internet Watch Foundation (IWF). If the laws are passed as proposed, the UK government will get powers to force the small ISPs that do not use the IWF blocklist – which serve less
than 2% of British internet users – to fall into line. Last year the Home Office abandoned a pledge to enforce 100% compliance. Although voluntary, the British system is not without controversy, and EuroISPA, the European ISP trade
association, is lobbying MEPs to reject the move to enforce it across the bloc. Malcolm Hutty, the President of EuroISPA, said: In order to make the Directive on child sexual exploitation as strong as possible, emphasis must
be placed on making swift notice and takedown of child sexual abuse material focused and effective. Blocking, as an inefficient measure, should be avoided. Law enforcement authorities' procedures for rapid communication to internet hosting providers of
such illegal material must be reviewed and bottlenecks eliminated.
|
4th June 2009 | | |
EU poised to appoint telecoms regulatory body
| Based on
article from mobiletoday.co.uk
|
The EU is poised to appoint a super-regulatory body that will bring together all 27 national regulators, including Ofcom in the UK, and enforce wide-ranging reforms to the industry.
The establishment of the Body of European Regulators for
Electronic Communications (BEREC) would bring national regulators together in an attempt to further integrate the European market and become the main advisory body to the Commission, the body that proposes legislation.
The creation of a European
telecoms regulator was pushed by EU commissioner Viviane Reding, who continues to campaign for lower data roaming rates around Europe.
Malcolm Harbour, West Midlands MEP and vice president of the European Parliament's science and technology unit,
was involved in proposals for the package and told Mobile that aside from issues about internet access, the rest of the reforms had already been agreed on in theory.
|
| |