As Policy Director, David Miles is the principal adviser on policy and public affairs to the Chief Executive. He is responsible for coordinating the BBFC's policy work and leading its public affairs effort. The role also involves managing the BBFC's research, communications and education programmes.
David Miles, BBFC Policy Director, said: "The BBFC is an intelligent and innovative organisation with a growing remit online, as well as an important legacy as a British institution and one of the most respected film and video regulators in the world. I am very pleased to join the BBFC as its Policy Director and look forward to working with all BBFC staff to ensure the BBFC's Classification Guidelines continue to adapt to shifting public opinion and the BBFC provides the best possible, transparent and accessible guidance for anyone making a film, DVD/Blu-ray or VOD viewing decision for themselves or on behalf of children.

I also look forward to the opportunity to work on the BBFC's proposed role as the age verification regulator for pornography online, a significant and vital step in reducing children's exposure to online pornography available in the UK, and a role I believe the BBFC is well equipped to fulfil."
David joined the BBFC as a consultant in February 2017, before his appointment as Policy Director in June 2017. Prior to this, David held a wide range of executive leadership roles in the technology and charitable sectors, including at IBM and the Family Online Safety Institute (FOSI). He is currently a member of UNICEF's Expert Panel for the Global Fund to End Violence against Children, and is a former Executive Board member of the UK Council for Child Internet Safety (UKCCIS) and chair of several of its key working groups. David is a Freeman of the City of London and a member of the Worshipful Company of Information Technologists (WCIT), one of the Livery Companies of the City of London. The Company received its Royal Charter in 2010.
a. A virtual tour link for a bathroom installation on www.hdsbuilders.co.uk, seen on 21 March 2017, featured an image of a naked woman showering.
b. A still image from the virtual tour, showing the naked woman, with the option to click on the tour, was seen on the home page of www.wetroomswales.co.uk on 15 May 2017.
A complainant challenged whether the image of the naked woman in ads (a) and (b) was offensive and unsuitable for display in an untargeted medium.
HDS Builders said people did not shower wearing clothing and therefore the image of the naked woman showering was appropriate for a virtual tour of a bathroom installation. They appreciated that some people might not find the image acceptable, but
no intimate body parts were visible and they did not believe it was indecent.
ASA Assessment: Complaint upheld
The ASA acknowledged that someone using a shower would be naked, but considered that it was not essential to use such an image in order to explain how a shower worked or to highlight a bathroom installation. Although the image had some relevance to a bathroom and shower, we nonetheless considered it was likely to be seen as sexist and to demean women by using their physical features for no other reason than to draw attention to the advertising.
The woman was fully nude, shown full length side on, with her bottom sticking out, her back arched and with some of her breast visible under her folded arms. In light of the nudity, we considered the pose was provocative and could be seen to be
sexually suggestive with the tone further enhanced in the virtual tour in ad (a) because it was possible to freeze the image, zoom in and out and change the angle.
We considered that, because the websites were for a builder, consumers would not expect to see a naked woman either on the home page of ad (b) or at the start of the virtual tour in ads (a) and (b), and the image had the potential to be seen by
many people who were likely to find it offensive.
We therefore concluded that the ads were inappropriately targeted and, because of the amount of nudity and the woman's sexually provocative pose, the image was likely to cause serious offence.
The ads must not appear again in their current form. We told HDS Builders not to use similar images in its advertising in future.
There are references to a review of counter-terrorism and a Commission for Countering Extremism, which will include Internet-related policies. Although details are lacking, these may contain threats to privacy and free speech.

The government has opted for a "Digital Charter". This is not a Bill, but some other kind of policy intervention: perhaps something companies will voluntarily agree to, or a statement of government preferences. It addresses both unwanted and illegal content or activity online, and the protection of vulnerable people. The work of the CTIRU and the IWF is mentioned as an example of efforts to remove illegal or extremist content.
At this point, it is hard to know exactly what harms will emerge, but pushing enforcement into the hands of private companies is problematic. It means that decisions never involve courts and are not fully transparent and legally accountable.
There will be a review of counter-terrorism powers. The review includes "working with online companies to reduce and restrict the availability of extremist material online".
This appears to be a watered-down version of the Conservative manifesto commitment to place greater responsibility on companies to take down extremist material from their platforms. Google and Facebook have already issued public statements about how they intend to improve the removal of extremist material from their platforms.
A Commission for Countering Extremism will look at the topic of countering extremism, likely including on the Internet.
This appears to be a measure to generate ideas and thinking, which could be a positive approach if it involves considering different options, rather than pressing ahead with policies in order to be seen to be doing something. The quality of the Commission will therefore depend on its ability to take a wide range of evidence and assimilate it impartially; it faces a significant challenge in ensuring that fundamental rights are respected in any policies it proposes.
Data Protection Bill
A new Data Protection Bill "will fulfil a manifesto commitment to ensure the UK has a data protection regime that is fit for the 21st century". This will replace the Data Protection Act 1998, which is in any case being superseded as a result of the new General Data Protection Regulation (GDPR), passed by the European Parliament last year. Regulations apply directly, so the GDPR does not need to be 'implemented' in UK law before Brexit.
We welcome that (at least parts of) the GDPR will be implemented in primary legislation with a full debate in Parliament. It is not clear whether the text of the GDPR will be brought into this Bill, or whether the Bill merely supplements it.
This appears to be a bill to implement at least some of the 'derogations' (options) in the GDPR, plus the new rules for law enforcement agencies that came in with the new law enforcement-related Directive and have to be applied by EU member states.
The bulk of the important rights are in the GDPR, and cannot be tampered with before Brexit. We welcome the chance to debate the choices, and especially to press for the right of privacy groups to bring complaints directly.
Facebook is launching a UK initiative to train and fund local organisations it hopes will combat extremism and hate speech. The UK Online Civil
Courage Initiative's initial partners include Imams Online and the Jo Cox Foundation.
Facebook said: "The recent terror attacks in London and Manchester - like violence anywhere - are absolutely heartbreaking. No-one should have to live in fear of terrorism - and we all have a part to play in stopping violent extremism from spreading. We know we have more to do - but through our platform, our partners and our community we will continue to learn to keep violence and extremism off Facebook."
Last week Facebook outlined its technical measures to remove terrorist-related content from its site. The company told the BBC it was using artificial intelligence to spot images, videos and text related to terrorism, as well as clusters of fake accounts.
Facebook explained that it was aiming to detect terrorist content immediately as it is posted and before other Facebook users see it. If someone tries to upload a terrorist photo or video, the systems check whether it matches previously identified extremist content, to stop it going up in the first place.
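The matching step described above can be sketched in a few lines. This is a minimal illustration, not Facebook's actual system: production platforms use perceptual hashes that survive re-encoding and cropping, whereas this sketch uses a plain SHA-256 over the raw bytes, which only catches exact re-uploads. All names and data here are invented.

```python
import hashlib

# Hypothetical database of hashes of media previously removed as
# extremist content. Real systems store perceptual hashes; a plain
# cryptographic hash only matches byte-identical re-uploads.
known_extremist_hashes = {
    hashlib.sha256(b"previously-removed-propaganda-video").hexdigest(),
}

def should_block_upload(media_bytes: bytes) -> bool:
    """Return True if the upload matches previously removed content."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest in known_extremist_hashes

# An exact re-upload of known content is caught before publication,
# while novel content passes through to other review systems.
assert should_block_upload(b"previously-removed-propaganda-video")
assert not should_block_upload(b"holiday photo")
```

The point of hashing rather than comparing raw files is that the check is a constant-time set lookup at upload time, however large the database of removed content grows.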
A second area of experimentation uses AI to understand text that might be advocating terrorism. This involves analysing text previously removed for praising or supporting a group such as IS, and trying to work out text-based signals that such content may be terrorist propaganda.
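A toy version of that text-signal idea: learn word frequencies from posts previously removed, then score new text by vocabulary overlap. The training examples below are invented placeholders, and a production system would use a trained classifier rather than simple word matching.

```python
from collections import Counter

# Invented stand-ins for text previously removed for praising a
# banned group; in practice this corpus would be large and curated.
removed_posts = [
    "join the fighters and support the cause",
    "praise the fighters who support the group",
]
# Vocabulary observed in removed posts acts as the "signal" words.
signal_words = Counter(w for post in removed_posts for w in post.split())

def propaganda_score(text: str) -> float:
    """Fraction of a post's words that appeared in removed content."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in signal_words)
    return hits / len(words)
```

A post whose vocabulary heavily overlaps the removed corpus scores near 1.0 and would be queued for review; unrelated text scores near 0. This also hints at the false-positive problem the article raises later: a news report quoting propaganda shares its vocabulary.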
The company says it is also using algorithms to detect clusters of accounts or images relating to support for terrorism. This will involve looking for signals such as whether an account is friends with a high number of accounts that have been
disabled for supporting terrorism. The company also says it is working on ways to keep pace with repeat offenders who create accounts just to post terrorist material and look for ways of circumventing existing systems and controls.
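The disabled-friends signal described above can be illustrated with a small sketch. The friend graph, function names and threshold here are all hypothetical, invented for illustration only.

```python
# Hypothetical friend graph: account -> set of friends, plus the set
# of accounts already disabled for supporting terrorism.
friends = {
    "alice": {"bob", "carol", "dave"},
    "bob": {"alice"},
}
disabled_for_terrorism = {"carol", "dave"}

def disabled_friend_ratio(account: str) -> float:
    """Share of an account's friends that have been disabled."""
    fs = friends.get(account, set())
    if not fs:
        return 0.0
    return len(fs & disabled_for_terrorism) / len(fs)

def flag_for_review(account: str, threshold: float = 0.5) -> bool:
    # Accounts over the threshold are queued for human review,
    # not automatically removed.
    return disabled_friend_ratio(account) >= threshold
```

In this invented graph, "alice" (two of three friends disabled) would be flagged while "bob" would not. The signal is cheap to compute per account, which is why graph features like this suit large-scale screening ahead of human review.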
Facebook has previously announced it is adding 3,000 employees to review content flagged by users. But it also says that already more than half of the accounts that it removes for supporting terrorism are ones that it finds itself. Facebook
says it has also grown its team of specialists so that it now has 150 people working on counter-terrorism specifically, including academic experts on counterterrorism, former prosecutors, former law enforcement agents and analysts, and engineers.
One of the major challenges in automating the process is the risk of taking down material relating to terrorism but not actually supporting it - such as news articles referring to an IS propaganda video that might feature its text or images. An
image relating to terrorism - such as an IS member waving a flag - can be used to glorify an act in one context or be used as part of a counter-extremism campaign in another.
Sony have been regularly 'sanitizing' their movies by cutting down the violence and strong language so as to make
them suitable for children. These versions are targeted at airlines and daytime TV but earlier this month Sony decided to make these sanitised versions available to download at home, choosing 24 titles:
50 First Dates, Battle Of The Year, Big Daddy, Captain Phillips, Crouching Tiger Hidden Dragon, Easy A, Elysium, Ghostbusters, Ghostbusters II, Goosebumps, Grown Ups, Grown Ups 2, Hancock, Inferno, Moneyball, Pixels, Spider-Man, Spider-Man 2, Spider-Man 3, The Amazing Spider-Man, The Amazing Spider-Man 2, Step Brothers, Talladega Nights: The Ballad of Ricky Bobby, White House Down
The censorship cuts are typically very extensive. For example, the clean version of Will Ferrell comedy Step Brothers - originally given an R rating for crude and sexual content, according to Sony - has had 23 instances of violence taken out, along with 152 of bad language and 91 of sexual content.
The Drew Barrymore and Adam Sandler romcom 50 First Dates was rated PG-13 for crude sexual humour and drug references. Its clean version has 10 violent moments taken out, along with 34 uses of bad language and 34 instances of sexual content.
Matt Damon sci-fi film Elysium, which also had an R rating, for bloody violence, has had 18 violent moments taken out, along with 63 uses of bad language and one instance of sexual content.
Horror comedy Goosebumps was a PG when it came out - so could be described as family-friendly already. But its clean version had four fewer incidences of violence, with five uses of bad language and five examples of nudity taken out too.
But now they've had to backtrack after filmmakers complained about the vandalisation of their works. After an outcry, the president of Sony Pictures Home Entertainment, Man Jit Singh, said their directors were of "paramount importance" and that Sony wanted to respect those relationships to the utmost:
"We believed we had obtained approvals from the film-makers involved, for use of their previously supervised television versions as a value-added extra on sales of the full version. But if any of them are unhappy or have reconsidered, we will discontinue it for their films."
Seth Rogen was one of the first to react when news of Clean Version emerged. He pleaded, adding a swear word for emphasis: "please don't do this to our movies."
The Directors Guild of America (DGA) has said that "the hard-fought-for rights that protect a director's work and vision are at the very heart of our craft and a thriving film industry."