Censor Watch

Latest


UK adult businesses to be crucified from Easter 2019...

DCMS Minister Margot James informs a parliamentary committee of the schedule for the age verification internet porn censorship regime


Link Here 15th November 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
Age Verification and adult internet censorship was discussed by the Commons Science and Technology Committee on 13th November 2018.

Carol Monaghan Committee Member: The Digital Economy Act made it compulsory for commercial pornography sites to undertake age verification, but implementation has been subject to ongoing delays. When do we expect it to go live?

Margot James MP, Minister for Digital and the Creative Industries: We can expect it to be in force by Easter next year. I make that timetable in the knowledge that we have laid the necessary secondary legislation before Parliament. I am hopeful of getting a slot to debate it before Christmas, before the end of the year. We have always said that we will permit the industry three months to get up to speed with the practicalities and delivering the age verification that it will be required to deliver by law. We have also had to set up the regulator--well, not to set it up, but to establish with the British Board of Film Classification, which has been the regulator, exactly how it will work. It has had to consult on the methods of age verification, so it has taken longer than I would have liked, but I would balance that with a confidence that we have got it right.

Carol Monaghan: Are you confident that the commercial pornography companies are going to engage fully and will implement the law as you hope?

Margot James: I am certainly confident on the majority of large commercial pornography websites and platforms being compliant with the law. They have engaged well with the BBFC and the Department, and want to be on the right side of the law. I have confidence, but I am wary of being 100% confident, because there are always smaller and more underground platforms and sites that will seek ways around the law. At least, that is usually the case. We will be on the lookout for that, and so will the BBFC. But the vast majority of organisations have indicated that they are keen to comply with the legislation.

Carol Monaghan: One concern that we all have is that children can stumble across pornography. We know that on social media platforms, where children are often active, up to a third of their content can be pornographic, but they fall outside the age verification regulation because it is only a third and not the majority. Is that likely to undermine the law? Ultimately the law, as it stands, is there to safeguard our children.

Margot James: I acknowledge that that is a weakness in the legislative solution. I do not think that for many mainstream social media platforms as much as a third of their content is pornographic, but it is well known that certain social media platforms that many people use regularly have pornography freely available. We have decided to start with the commercial operations while we bring in the age verification techniques that have not been widely used to date. But we will keep a watching brief on how effective those age verification procedures turn out to be with commercial providers and will keep a close eye on how social media platforms develop in terms of the extent of pornographic material, particularly if they are platforms that appeal to children--not all are. You point to a legitimate weakness, on which we have a close eye.

 

 

Suffocating European livelihoods at the behest of big business...

Julia Reda outlines amendments to censorship machines and link tax as the upcoming internet censorship law gets discussed by the real bosses of the EU


Link Here 15th November 2018
Full story: Copyright in the EU...Copyright law for Europe

The closed-door trilogue efforts to finalise the EU Copyright Directive continue. The Presidency of the Council, currently held by Austria, has now circulated among the EU member state governments a new proposal for a compromise between the differing drafts currently on the table for the controversial Articles 11 and 13.

Under this latest proposal, both upload filters and the link tax would be here to stay -- with some changes for the better, and others for the worse.

Upload filters/Censorship machines

Let's recall: In its final position, the European Parliament had tried its utmost to avoid specifically mentioning upload filters, in order to avoid the massive public criticism of that measure. The text they ended up with, however, was even worse: It would make online platforms inescapably liable for any and all copyright infringement by their users, no matter what action they take. Not even the strictest upload filter in the world could possibly hope to catch 100% of unlicensed content.

This is what prompted YouTube's latest lobbying efforts in favour of upload filters and against the EP's proposal of inescapable liability. Many have mistaken this as lobbying against Article 13 as a whole -- it is not. In Monday's Financial Times, YouTube spelled out that they would be quite happy with a law that forces everyone else to build (or, presumably, license from them) what they already have in place: Upload filters like Content ID.

In this latest draft, the Council Presidency sides with YouTube, going back to rather explicitly prescribing upload filters. The Council proposes two alternative options on how to phrase that requirement, but they match in effect:

Platforms are liable for all copyright infringements committed by their users, EXCEPT if they

  • cooperate with rightholders

  • by implementing effective and proportionate steps to prevent works they've been informed about from ever going online

  • in determining which steps those are, suitable and effective technologies must be taken into account

Under this text, wherever upload filters are possible, they must be implemented: all your uploads will require prior approval by error-prone copyright bots.
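To make concrete what such an upload filter involves, here is a minimal, hypothetical Python sketch of a pre-publication check: every upload is fingerprinted and compared against a blocklist of works that rightsholders have notified, and a match keeps the content from ever going online. The plain file-hash matching and the names used are illustrative assumptions only; real systems such as Content ID rely on perceptual audio and video fingerprints and are far more elaborate, which is why the bots are called error-prone.

    import hashlib
    from pathlib import Path

    # Hypothetical blocklist of fingerprints supplied by rightsholders
    # ("works they've been informed about"). A real filter would hold
    # perceptual fingerprints, not plain file hashes.
    NOTIFIED_FINGERPRINTS = set()

    def fingerprint(path: Path) -> str:
        # A crude stand-in for a perceptual fingerprint: hash the raw bytes.
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def allow_upload(path: Path) -> bool:
        # Prior approval: the upload is only published if it does not match
        # any notified work; otherwise it is blocked before going online.
        return fingerprint(path) not in NOTIFIED_FINGERPRINTS

Even this trivial sketch illustrates the article's point: a re-encoded or slightly edited copy would produce a different fingerprint and slip through, while a licensed or lawfully quoted use that happened to match would be blocked.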

On the good side, the Council Presidency seems open to adopting the Parliament's exception for platforms run by small and micro businesses. It also takes on board the EP's better-worded exception for open source code sharing platforms like GitHub.

On the bad side, Council rejects Parliament's efforts for a stronger complaint mechanism requiring reviews by humans and an independent conflict resolution body. Instead, it takes on board the EP's insistence that licenses taken out by a platform don't even necessarily have to cover uses of these works by the users of that platform. So, for example, even if YouTube takes out a license to show a movie trailer, that license could still prevent you as an individual YouTuber from using that trailer in your own uploads.

Article 11 Link tax

On the link tax, the Council is mostly sticking to its position: It wants the requirement to license even short snippets of news articles to last for one year after an article's publication, rather than five, as the Parliament proposed.

In a positive development, the Council Presidency adopts the EP's clarification that at least the facts included in news articles as such should not be protected. So a journalist would be allowed to report on what they read in another news article, in their own words.

Council fails to clearly exclude hyperlinks -- even those that aren't accompanied by snippets from the article. It's not uncommon for the URLs of news articles themselves to include the article's headline. While the Council wants to exclude insubstantial parts of articles from requiring a license, it's not certain that headlines count as insubstantial. (The Council's clause allowing acts of hyperlinking when they do not constitute communication to the public would not apply to such cases, since reproducing the headline would in fact constitute such a communication to the public.)

The Council continues to want the right to only apply to EU-based news sources -- which could in effect mean fewer links and listings in search engines, social networks and aggregators for European sites, putting them at a global disadvantage.

However, it also proposes spelling out that news sites may give out free licenses if they so choose -- contrary to the Parliament, which stated that listing an article in a search engine should not be considered sufficient payment for reproducing snippets from it.

 

 

Er...it's easy, just claim it transgresses 'community guidelines'...

Facebook will train up French censors in the art of taking down content deemed harmful


Link Here 15th November 2018
Full story: Facebook Censorship...Facebook quick to censor

The French President, Emmanuel Macron, has announced a plan to effectively embed French state censors with Facebook to learn more about how to better censor the platform. He announced a six-month partnership with Facebook aimed at figuring out how the European country should police hate speech on the social network.

As part of the cooperation both sides plan to meet regularly between now and May, when the European election is due to be held. They will focus on how the French government and Facebook can work together to censor content deemed 'harmful'. Facebook explained:

It's a pilot program of a more structured engagement with the French government so that both sides can better understand the other's challenges in dealing with the issue of hate speech online. The program will allow a team of regulators, chosen by the Elysee, to familiarize [itself] with the tools and processes set up by Facebook to fight against hate speech. The working group will not be based in one location but will travel to different Facebook facilities around the world, with likely visits to Dublin and California. The purpose of this program is to enable regulators to better understand Facebook's tools and policies to combat hate speech and, for Facebook, to better understand the needs of regulators.

 

 

Fireworks in the House...

The Lords discuss when age verification internet censorship will start


Link Here 13th November 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust

Pornographic Websites: Age Verification - Question

House of Lords on 5th November 2018.

Baroness Benjamin Liberal Democrat

To ask Her Majesty's Government what will be the commencement date for their plans to ensure that age-verification to prevent children accessing pornographic websites is implemented by the British Board of Film Classification.

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My Lords, we are now in the final stages of the process, and we have laid the BBFC's draft guidance and the Online Pornography (Commercial Basis) Regulations before Parliament for approval. We will ensure that there is a sufficient period following parliamentary approval for the public and the industry to prepare for age verification. Once parliamentary proceedings have concluded, we will set a date by which commercial pornography websites will need to be compliant, following an implementation window. We expect that this date will be early in the new year.

Baroness Benjamin

I thank the Minister for his Answer. I cannot wait for that date to happen, but does he share my disgust and horror that social media companies such as Twitter state that their minimum age for membership is 13 yet make no attempt to restrict some of the most gross forms of pornography being exchanged via their platforms? Unfortunately, the Digital Economy Act does not affect these companies because they are not predominantly commercial porn publishers. Does he agree that the BBFC needs to develop mechanisms to evaluate the effectiveness of the legislation for restricting children's access to pornography via social media sites and put a stop to this unacceptable behaviour?

Lord Ashton of Hyde

My Lords, I agree that there are areas of concern on social media sites. As the noble Baroness rightly says, they are not covered by the Digital Economy Act. We had many hours of discussion about that in this House. However, she will be aware that we are producing an online harms White Paper in the winter in which some of these issues will be considered. If necessary, legislation will be brought forward to address these, and not only these but other harms too. I agree that the BBFC should find out about the effectiveness of the limited amount that age verification can do; it will commission research on that. Also, the Digital Economy Act itself made sure that the Secretary of State must review its effectiveness within 12 to 18 months.

Lord Griffiths of Burry Port Opposition Whip (Lords), Shadow Spokesperson (Digital, Culture, Media and Sport), Shadow Spokesperson (Wales)

My Lords, once again I find this issue raising a dynamic that we became familiar with in the only too recent past. The Government are to be congratulated on getting the Act on to the statute book and, indeed, on taking measures to identify a regulator as well as to indicate that secondary legislation will be brought forward to implement a number of the provisions of the Act. My worry is that, under one section of the Digital Economy Act, financial penalties can be imposed on those who infringe this need; the Government seem to have decided not to bring that provision into force at this time. I believe I can anticipate the Minister's answer but--in view of the little drama we had last week over fixed-odds betting machines--we would not want the Government, having won our applause in this way, to slip back into putting things off or modifying things away from the position that we had all agreed we wanted.

Lord Ashton of Hyde

My Lords, I completely understand where the noble Lord is coming from but what he said is not quite right. The Digital Economy Act included a power that the Government could bring enforcement with financial penalties through a regulator. However, they decided--and this House decided--not to use that for the time being. For the moment, the regulator will act in a different way. But later on, if necessary, the Secretary of State could exercise that power. On timing and FOBTs, we thought carefully--as noble Lords can imagine--before we said that we expect the date will be early in the new year.

Lord Addington Liberal Democrat

My Lords, does the Minister agree that good health and sex education might be a way to counter some of the damaging effects? Can the Government make sure that is in place as soon as possible, so that this strange fantasy world is made slightly more real?

Lord Ashton of Hyde

The noble Lord is of course right that age verification itself is not the only answer. It does not cover every possibility of getting on to a pornography site. However, it is the first attempt of its kind in the world, which is why not only we but many other countries are looking at it. I agree that sex education in schools is very important and I believe it is being brought into the national curriculum already.

The Earl of Erroll Crossbench

Why is there so much wriggle room in section 6 of the guidance from the DCMS to the AV regulator? The ISP blocking probably will not work, because everyone will just get out of it. If we bring this into disrepute then the good guys, who would like to comply, probably will not; they will not be able to do so economically. All that was covered in British Standard PAS 1296, which was developed over three years. It seems to have been totally ignored by the DCMS. You have spent an awful lot of time getting there, but you have not got there.

Lord Ashton of Hyde

One of the reasons this has taken so long is that it is complicated. We in the DCMS, and many others, not least in this House, have spent a long time discussing the best way of achieving this. I am not immediately familiar with exactly what section 6 says, but when the statutory instrument comes before this House--it is an affirmative one to be discussed--I will have the answer ready for the noble Earl.

Lord West of Spithead Labour

My Lords, does the Minister not agree that the possession of a biometric card by the population would make the implementation of things such as this very much easier?

Lord Ashton of Hyde

In some ways it would, but there are problems with people who either do not want to or cannot have biometric cards.

 

 

Rainbow 6 Siege...

Games developer Ubisoft announces that it will remove sex and gambling references worldwide so as to comply with Chinese censorship requirements


Link Here 13th November 2018
In order to prepare Rainbow 6 Siege for expansion into China, Ubisoft announced that it will be making some global censor cuts to the game's visuals to remove gore and references to sex and gambling.

In a blog post, Ubisoft explained:

A Single, Global Version

We want to explain why these changes are coming to the global version of the game, as opposed to branching and maintaining two parallel builds. We want to streamline our production time to increase efficiency.

By maintaining a single build, we are able to reduce the duplication of work on the development side. This will allow us to be more agile as a development team, and address issues more quickly.

Ubisoft provided examples of their censorship:

  • Icons featuring knives become fists
  • Icons featuring skulls are replaced
  • Skulls in artwork are fleshed out into faces
  • Images of slot machines are removed
  • Blood spatters are removed from a Chinese landscape painting
  • Strip club neon nudity is removed

 

 

Offsite Article: The Potential Unintended Consequences of Article 13...


Link Here 13th November 2018
Full story: Copyright in the EU...Copyright law for Europe
Susan Wojcicki, CEO of YouTube, explains how the EU's copyright rewrite will destroy the livelihood of a huge number of Europeans

See article from youtube-creators.googleblog.com

 

 

Overlord...

Cut in Australia for an MA15+ rating


Link Here 11th November 2018
Overlord is a 2018 USA action war horror by Julius Avery.
Starring Wyatt Russell, Bokeem Woodbine and Iain De Caestecker.

On the eve of D-Day, American paratroopers are dropped behind enemy lines to carry out a mission crucial to the invasion's success. But as they approach their target, they begin to realize there is more going on in this Nazi-occupied village than a simple military operation. They find themselves fighting against supernatural forces, part of a Nazi experiment.

Overlord has been cut in Australia for MA15+ cinema and home video release. MA15+ is something like a 15A in UK terms.

The film was originally rated R18+ uncut for: High impact violence; Strong impact themes; Moderate impact language. However, distributors Paramount preferred a lower rating and a month later submitted a cut version. This was duly rated MA15+ for: Strong impact themes, violence; Moderate impact language.

For comparison:

  • UK: Passed 18 uncut for strong bloody violence, gory images
  • US: Rated R uncut for strong bloody violence, disturbing images, language, and brief sexual content.

 

 

Potent regulation...

HappyDown cocktails censured for lack of clarity about alcoholic content


Link Here 11th November 2018
Full story: UK Drinks Censor...Portman Group play PC censor for drinks

A complaint about HappyDown sparkling cocktails has been upheld by the Independent Complaints Panel for failing to clearly communicate their alcoholic content.

The complainant, a member of the public, believed that the cartoon imagery used on the cans could appeal to children. The Panel did not believe that it did appeal to children but did raise concerns that the cues describing it as alcoholic were not immediately obvious. The Panel concluded that the alcoholic nature of the drink was not clearly communicated and accordingly found the product in breach of Code rule 3.1.

HappyDown's producer, Tipple Brands Limited, will work with the Advisory Service to address the issues raised.

John Timothy, Secretary to the Independent Complaints Panel, commented, Alcoholic content needs to be conveyed clearly. Producers need to ask themselves if there is any other messaging or design on their product which could undermine this clarity.

 

 

Avoiding tears...

Bible Society is miffed that its Remembrance Day advert is banned by cinemas, unsurprisingly preferring to avoid the violence, threat and intimidation associated with religion


Link Here 10th November 2018
Cinemas have rejected a Bible Society advert speaking of the comfort some first world war soldiers found in the Bible. The three-minute film, titled Wipe Every Tear, explains that all British soldiers were given a Bible as part of their kit and that this was a source of hope to many.

Empire Cinemas explained that they do not take adverts from any religious groups.

The three-minute film opens with footage of soldiers in trenches. A caption explains All British soldiers were given a Bible as part of their kit. Captions continue: To many it was a source of hope. For eternal peace. The film then moves to clips of contemporary people, often in their workplace, reciting Revelation 21: 1-7. These include a farmer, a fisherman, a hairdresser, a soldier, and a chef. The concluding captions state: The Bible. Still giving peace and hope today.

The film was intended to be shown on 125 screens at 14 venues across the country in the run-up to the armistice centenary this weekend. The Bible Society is reported to have reached agreement with cinema advertising company Pearl and Dean for the distribution of the film. Pearl and Dean later emailed to say that Empire Cinemas had vetoed the film because they do not accept religious or political advertisements.

 

 

Offsite Article: Promises! Promises!...


Link Here 10th November 2018
The History Of Nudity In R-Rated Films. By Dirk Libbey

See article from cinemablend.com

 

 

Offsite Article: Reverse motion...


Link Here 10th November 2018
As 6 major Hollywood studios become 5, a little speculation on how the MPAA will cope with its shrinking budget

See article from torrentfreak.com

 

 

BBFC: Age verification we don't trust...

Analysis of BBFC's Post-Consultation Guidance by the Open Rights Group


Link Here 8th November 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
Following the conclusion of their consultation period, the BBFC have issued new age verification guidance that has been laid before Parliament.

Summary

The new code has some important improvements, notably the introduction of a voluntary scheme for privacy, close to or based on a GDPR Code of Conduct. This is a good idea, but should not be put in place as a voluntary arrangement. Companies may not want the attention of a regulator, or may simply wish to apply lower or different standards, and ignore it. It is unclear why, if the government now recognises that privacy protections like this are needed, the government would also leave the requirements as voluntary.

We are also concerned that the voluntary scheme may not be up and running before the AV requirement is put in place. Given that 25 million UK adults are expected to sign up to these products within a few months of launch, this would be very unhelpful.

Parliament should now:

  • Ask the government why the privacy scheme is to be voluntary, if the risks of relying on general data protection law are now recognised;
  • Ask for assurance from BBFC that the voluntary scheme will cover all of the major operators; and
  • Ask for assurance from BBFC and DCMS that the voluntary privacy scheme will be up and running before obliging operators to put Age Verification measures in place.

The draft code can be found here.

Lack of Enforceability of Guidance

The Digital Economy Act does not allow the BBFC to judge age verification tools by any standard other than whether or not they sufficiently verify age. We asked that the BBFC persuade the DCMS that statutory requirements for privacy and security were required for age verification tools.

The BBFC have clearly acknowledged privacy and security concerns with age verification in their response. However, they indicate that they have been working with the ICO and DCMS to create a voluntary certification scheme for age verification providers:

"This voluntary certification scheme will mean that age-verification providers may choose to be independently audited by a third party and then certified by the Age-verification Regulator. The third party's audit will include an assessment of an age-verification solution's compliance with strict privacy and data security requirements."

The lack of a requirement for additional and specific privacy regulation in the Digital Economy Act is the cause for this voluntary approach.

While a voluntary scheme like the one above is likely to be of some assistance in promoting better standards among age verification providers, the "strict privacy and data security requirements" which the voluntary scheme mentions are not a statutory requirement, leaving some consumers at greater risk than others.

Sensitive Personal Data

The data handled by age verification systems is sensitive personal data. Age verification services must directly identify users in order to accurately verify age. Users will be viewing pornographic content, and the data about what specific content a user views is highly personal and sensitive. This has potentially disastrous consequences for individuals and families if the data is lost, leaked, or stolen.

Following a hack affecting Ashley Madison -- a dating website for extramarital affairs -- a number of the site's users were driven to suicide as a result of the public exposure of their sexual activities and interests.

For the purposes of GDPR, data handled by age verification systems falls under the criteria for sensitive personal data, as it amounts to "data concerning a natural person's sex life or sexual orientation".

Scheduling Concerns

It is of critical importance that any accreditation scheme for age verification providers, or GDPR code of conduct if one is established, is in place and functional before enforcement of the age verification provisions in the Digital Economy Act commences. All of the major providers who are expected to dominate the age verification market should undergo their audit under the scheme before consumers will be expected to use the tool. This is especially true when considering the fact that MindGeek have indicated their expectation that 20-25 million UK adults will sign up to their tool within the first few months of operation. A voluntary accreditation scheme that begins enforcement after all these people have already signed up would be unhelpful.

Consumers should be empowered to make informed decisions about the age verification tools that they choose from the very first day of enforcement. No delays are acceptable if users are expected to rely upon the scheme to inform themselves about the safety of their data. If this cannot be achieved prior to the start of expected enforcement of the DE Act's provisions, then the planned date for enforcement should be moved back to allow for the accreditation to be completed.

Issues with Lack of Consumer Choice

It is of vital importance that consumers, if they must verify their age, are given a choice of age verification providers when visiting a site. This enables users to choose which provider they trust with their highly sensitive age verification data and prevents one actor from dominating the market and thereby promoting detrimental practices with data. The BBFC also acknowledge the importance of this in their guidance, noting in 3.8:

"Although not a requirement under section 14(1) the BBFC recommends that online commercial pornography services offer a choice of age-verification methods for the end-user".

This does not go far enough to acknowledge the potential issues that may arise in a fragmented market where pornographic sites are free to offer only a single tool if they desire.

Without a statutory requirement for sites to offer all appropriate and available tools for age verification and log in purposes, it is likely that a market will be established in which one or two tools dominate. Smaller sites will then be forced to adopt these dominant tools as well, to avoid friction with consumers who would otherwise be required to sign up to a new provider.

This kind of market for age verification tools will provide little room for a smaller provider with a greater commitment to privacy or security to survive and robs users of the ability to choose who they trust with their data.

We already called for it to be made a statutory requirement that pornographic sites must offer a choice of providers to consumers who must age verify; however, this suggestion has not been taken up.

We note that the BBFC has been working with the ICO and DCMS to produce a voluntary code of conduct. Perhaps a potential alternative solution would be to ensure that a site is only considered compliant if it offers users a number of tools which have been accredited under the additional privacy and security requirements of the voluntary scheme.

GDPR Codes of Conduct

A GDPR "Code of Conduct" is a mechanism for providing guidelines to organisations who process data in particular ways, and allows them to demonstrate compliance with the requirements of the GDPR.

A code of conduct is voluntary, but compliance is continually monitored by an appropriate body that is accredited by a supervisory authority. In this case, the "accredited body" would likely be the BBFC, and the "supervisory authority" would be the ICO. The code of conduct allows for certifications, seals and marks which indicate clearly to consumers that a service or product complies with the code.

Codes of conduct are expected to provide more specific guidance on exactly how data may be processed or stored. In the case of age verification data, the code could contain stipulations on:

  • Appropriate pseudonymisation of stored data;
  • Data and metadata retention periods;
  • Data minimisation recommendations;
  • Appropriate security measures for data storage;
  • Security breach notification procedures;
  • Re-use of data for other purposes.
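
As an illustration of what the pseudonymisation and data minimisation stipulations above could look like in code, the following is a minimal, hypothetical Python sketch of what a privacy-conscious age verification provider might retain after a successful check: a salted hash of the user identifier rather than the identity document itself, the bare verification outcome, and an expiry date after which the record must be destroyed. The field names, the hashing scheme and the 90-day retention period are assumptions made for illustration, not requirements drawn from the GDPR or from the BBFC's scheme.

    import hashlib
    import secrets
    from dataclasses import dataclass
    from datetime import date, timedelta

    RETENTION_DAYS = 90  # hypothetical retention period a code of conduct might set

    @dataclass
    class AgeVerificationRecord:
        # What a data-minimising provider might keep after a successful check.
        user_ref: str   # salted hash of the user identifier, not the identity itself
        over_18: bool   # the verification outcome -- the only fact actually needed
        expires: date   # after this date the record must be destroyed

    def pseudonymise(user_id: str, salt: bytes) -> str:
        # Derive a stable pseudonym so the identity document never needs storing.
        return hashlib.sha256(salt + user_id.encode("utf-8")).hexdigest()

    def record_verification(user_id: str, over_18: bool, salt: bytes) -> AgeVerificationRecord:
        return AgeVerificationRecord(
            user_ref=pseudonymise(user_id, salt),
            over_18=over_18,
            expires=date.today() + timedelta(days=RETENTION_DAYS),
        )

    # Example usage with a per-provider secret salt (again, an assumption for this sketch).
    salt = secrets.token_bytes(16)
    record = record_verification("passport:GB1234567", over_18=True, salt=salt)

Nothing in such a record links the pseudonym to any record of content viewed -- a separation that several of the risks discussed below depend on.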

The BBFC's proposed "voluntary standard" regime appears to be similar to a GDPR code of conduct, though it remains to be seen how specific the stipulations in the BBFC's standard are. A code of conduct would also involve being entered into the ICO's public register of UK approved codes of conduct, and the EDPB's public register for all codes of conduct in the EU.

Similarly, GDPR Recital 99 notes that "relevant stakeholders, including data subjects" should be consulted during the drafting period of a code of conduct - a requirement which is not in place for the BBFC's voluntary scheme.

It is possible that the BBFC have opted to create this voluntary scheme for age verification providers rather than use a code of conduct, because they felt they may not meet the GDPR requirements to be considered an appropriate body to monitor compliance. Compliance must be monitored by a body that has demonstrated:

  • Their expertise in relation to the subject-matter;
  • They have established procedures to assess the ability of data processors to apply the code of conduct;
  • They have the ability to deal with complaints about infringements; and
  • Their tasks do not amount to a conflict of interest.

Parties Involved in the Code of Conduct Process

As noted by GDPR Recital 99, a consultation should be a public process which involves stakeholders and data subjects, and their responses should be taken into account during the drafting period:

"When drawing up a code of conduct, or when amending or extending such a code, associations and other bodies representing categories of controllers or processors should consult relevant stakeholders, including data subjects where feasible , and have regard to submissions received and views expressed in response to such consultations."

The code of conduct must be approved by a relevant supervisory authority (in this case the ICO).

An accredited body (BBFC) that establishes a code of conduct and monitors compliance is able to establish its own structures and procedures under GDPR Article 41 to handle complaints regarding infringements of the code, or regarding the way it has been implemented. BBFC would be liable for failures to regulate the code properly under Article 41(4); [1] however, DCMS appear to have accepted the principle that the government would need to protect BBFC from such liabilities. [2]

GDPR Codes of Conduct and Risk Management

Below is a table of risks created by age verification which we identified during the consultation process. For each risk, we have considered whether a GDPR code of conduct may help to mitigate the effects of it.

For each risk, the verdict on whether a code of conduct (CoC) is an appropriate mitigation is given as Yes, No or Partially, followed by the details.

  • Risk: User identity may be correlated with viewed content. CoC appropriate? Partially. This risk can never be entirely mitigated if AV is to go ahead, but a CoC could contain very strict restrictions on what identifying data could be stored after a successful age verification.

  • Risk: Identity may be associated to an IP address, location or device. CoC appropriate? No. It would be very difficult for a CoC to mitigate this risk as the only safe mitigation would be not to collect user identity information.

  • Risk: An age verification provider could track users across all the websites its tool is offered on. CoC appropriate? Yes. Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.

  • Risk: Users may be incentivised to consent to further processing of their data in exchange for rewards (content, discounts etc.). CoC appropriate? Yes. Age verification tools could be expressly forbidden from offering anything in exchange for user consent.

  • Risk: Leaked data creates major risks for identified individuals and cannot be revoked or adequately compensated for. CoC appropriate? Partially. A CoC can never fully mitigate this risk if any data is being collected, but it could contain strict prohibitions on storing certain information and specify retention periods after which data must be destroyed, which may mitigate the impacts of a data breach.

  • Risk: Users may be exposed through access via shared computers if viewing history is stored alongside age verification data. CoC appropriate? Yes. A CoC could specify that any accounts for pornographic websites which may track viewed content must be strictly separate and not in any visible way linked to a user's age verification account or data that confirms their identity.

  • Risk: Age verification systems are likely to trade off security for convenience (no 2FA, auto-login, etc.). CoC appropriate? Yes. A CoC could stipulate that login cookies that "remember" a returning user must only persist for a short time period, and should recommend or enforce two-factor authentication.

  • Risk: The need to re-login to age verification services to access pornography in "private browsing" mode may lead people to avoid using this feature and generate much more data which is then stored. CoC appropriate? No. A CoC cannot fix this issue. Private browsing by nature will not store any login cookies or other objects and will require the user to re-authenticate with age verification providers every time they wish to view adult content.

  • Risk: Users may turn to alternative tools to avoid age verification, which carry their own security risks (especially "free" VPN services or peer-to-peer networks). CoC appropriate? No. Many UK adults, although over 18, will be uncomfortable with the need to submit identity documents to verify their age and will seek alternative means to access content. It is unlikely that many of these individuals will be persuaded by an accreditation under a GDPR code.

  • Risk: Age verification login details may be traded and shared among teenagers or younger children, which could lead to bullying or "outing" if such details are linked to viewed content. CoC appropriate? Yes. Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.

  • Risk: Child abusers could use their access to age verified content as an adult as leverage to create and exploit relationships with children and teenagers seeking access to such content (grooming). CoC appropriate? No. This risk will exist as long as age verification is providing a successful barrier to accessing such content for under-18s who wish to do so.

  • Risk: The sensitivity of content dealt with by age verification services means that users who fall victim to phishing scams or fraud have a lower propensity to report it to the relevant authorities. CoC appropriate? Partially. A CoC or education campaign may help consumers identify trustworthy services, but it cannot fix the core issue, which is that users are being socialised into it being "normal" to input their identity details into websites in exchange for pornography. Phishing scams resulting from age verification will appear and will be common, and the sensitivity of the content involved is a disincentive to reporting it.

  • Risk: The use of credit cards as an age verification mechanism creates an opportunity for fraudulent sites to engage in credit card theft. CoC appropriate? No. Phishing and fraud will be common. A code of conduct which lists compliant sites and tools externally on the ICO website may be useful, but a phishing site may simply pretend to be another (compliant) tool, or rely on the fact that users are unlikely to check with the ICO every time they wish to view pornographic content.

  • Risk: The rush to get age verification tools to market means they may take significant shortcuts when it comes to privacy and security. CoC appropriate? Yes. A CoC could assist in solving this issue if tools are given time to be assessed for compliance before the age verification regime commences.

  • Risk: A single age verification provider may come to dominate the market, leaving users little choice but to accept whatever terms the provider offers. CoC appropriate? Partially. Practically, a CoC could mitigate some of the effects of an age verification tool monopoly if the dominant tool is accredited under the Code. However, this relies on users being empowered to demand compliance with a CoC, and it is possible that users will instead be left with a "take it or leave it" situation where the dominant tool is not CoC accredited.

  • Risk: Allowing pornography "monopolies" such as MindGeek to operate age verification tools is a conflict of interest. CoC appropriate? Partially. As the BBFC note in their consultation response, it would not be reasonable to prohibit a pornographic content provider from running an age verification service as it would prevent any site from running its own tool. However, under a CoC it is possible that a degree of separation could be enforced that requires age verification tools to adhere to strict rules about the use of data, which could mitigate the effects of a large pornographic content provider attempting to collect as much user data as possible for their own business purposes.
 

[1] "Infringements of the following provisions shall, in accordance with paragraph 2, be subject to administrative fines up to 10 000 000 EUR, or in the case of an undertaking, up to 2 % of the total worldwide annual turnover of the preceding financial year, whichever is higher: the obligations of the monitoring body pursuant to Article 41(4)."

[2] "contingent liability will provide indemnity to the British Board of Film Classification (BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography."

 

 

A modern swear box...

It's probably not a good idea to leave much money in a Skype or Xbox Live account as Microsoft can now seize it if they catch you using a vaguely offensive word


Link Here 8th November 2018
Full story: Microsoft Censorship Rules...For Microsoft services, XBoX, Skype, OneDrive…
Microsoft has just inflicted a new 'code of conduct' that prohibits customers from communicating nudity, bestiality, pornography, offensive language, graphic violence and criminal activity, whilst allowing Microsoft to steal the money in your account.

If users are found to have shared, or be in possession of, these types of content, Microsoft can suspend or ban the particular user and remove funds or balance on the associated account.

It also appears that Microsoft reserves the right to view user content to investigate violations to these terms. This means it has access to your message history and shared files (including on OneDrive, another Microsoft property) if it thinks you've been sharing prohibited material.

Unsurprisingly, few users are happy that Microsoft is willing to delve through their personal data.

Microsoft has not made it clear if it will automatically detect and censor prohibited content or if it will rely on a reporting system. On top of that, Microsoft hasn't clearly defined its vague terms. Nobody is clear on what the limit on offensive language is.

 

 

Creeping about your life...

Facebook friend suggestion: Ms Tress who visits your husband upstairs at your house for an hour every Thursday afternoon whilst you are at work


Link Here 8th November 2018
Full story: Facebook Privacy...Facebook criticised for discouraging privacy
Facebook has filed a patent that describes a method of using the devices of Facebook app users to identify various wireless signals from the devices of other users.

It explains how Facebook could use those signals to measure exactly how close the two devices are to one another and for how long, and analyses that data to infer whether it is likely that the two users have met. The patent also suggests the app could record how often devices are close to one another, the duration and time of meetings, and can even use the phone's gyroscope and accelerometer to analyse movement patterns, for example whether the two users may be going for a jog, smooching or catching a bus together.

Facebook's algorithm would use this data to analyse how likely it is that the two users have met, even if they're not friends on Facebook and have no other connections to one another. This might be based on the pattern of inferred meetings, such as whether the two devices are close to one another for an hour every Thursday, and an algorithm would determine whether the two users meeting was sufficiently significant to recommend them to each other and/or friends of friends.
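
As a rough illustration of the kind of inference described, here is a minimal, hypothetical Python sketch: given logged co-location events between two devices (start time and duration), it looks for a recurring pattern -- such as an hour together at the same weekday and hour across several weeks -- and flags the pair as likely to have met. The thresholds and the scoring are invented for illustration and are not taken from Facebook's patent.

    from collections import Counter
    from datetime import datetime, timedelta
    from typing import Iterable, Tuple

    # A co-location event: when two devices were observed close together, and for how long.
    Event = Tuple[datetime, timedelta]

    def likely_met(events: Iterable[Event],
                   min_duration: timedelta = timedelta(minutes=30),
                   min_recurrences: int = 3) -> bool:
        # Bucket sufficiently long meetings by (weekday, hour) so that a pattern
        # like "an hour every Thursday afternoon" shows up as a repeated slot.
        slots = Counter()
        for start, duration in events:
            if duration >= min_duration:
                slots[(start.weekday(), start.hour)] += 1
        return any(count >= min_recurrences for count in slots.values())

    # Example: an hour of proximity every Thursday at 14:00 for four weeks.
    meetings = [(datetime(2018, 11, 1, 14) + timedelta(weeks=i), timedelta(hours=1))
                for i in range(4)]
    print(likely_met(meetings))  # True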

I don't suppose that Facebook can claim this patent though as police and the security services have no doubt been using this technique for years.

 

 

Rarely challenged until now...

Privacy International challenges major data brokers over GDPR privacy rules


Link Here 8th November 2018
Privacy International has filed complaints against seven companies -- data brokers (Acxiom, Oracle), ad-tech companies (Criteo, Quantcast, Tapad), and credit referencing agencies (Equifax, Experian) -- with data protection authorities in France, Ireland, and the UK.

It's been more than five months since the EU's General Data Protection Regulation (GDPR) came into effect. Fundamentally, the GDPR strengthens rights of individuals with regard to the protection of their data, imposes more stringent obligations on those processing personal data, and provides for stronger regulatory enforcement powers -- in theory.

In practice, the real test for GDPR will be in its enforcement.

Nowhere is this more evident than for the data broker and ad-tech industries that are premised on exploiting people's data. Despite exploiting the data of millions of people, these industries are on the whole non-consumer facing and therefore rarely have their practices challenged.

 

 

Hell to pay...

Satanic temple sues Netflix with a copyright claim over a statue of Baphomet


Link Here 8th November 2018
The Satanic Temple in Salem, Massachusetts is suing Netflix and producers Warner Brothers over a statue of the goat-headed deity Baphomet that appears in the TV series Chilling Adventures of Sabrina.

The temple is claiming that Netflix and Warners are violating the copyright and trademark of the temple's own Baphomet statue, which it built several years ago.

Historically, the androgynous deity has been depicted with a goat's head on a female body, but The Satanic Temple created this statue with Baphomet having a male chest, an idea that was picked up by Netflix.

The Temple is seeking damages of at least $50 million for copyright infringement, trademark violation and injury to business reputation. In the Sabrina storyline, the use of the statue as the central focal point of the school associated with evil, cannibalism and possibly murder is injurious to TST's business, the Temple says in its suit.

 

 

Offsite Article: The police chiefs vs the thoughtpolice...


Link Here 8th November 2018
Why police should stay out of hate incidents. By Fraser Myers

See article from spiked-online.com

 
