Speakers: David Austin and Amelia Erratt, British Board of Film Classification; Peter Violaris, Yoti; and Alistair Graham, Age Checked.
24 April 2019
All-Party Parliamentary Group on a Fit and Healthy Childhood
‘Preventing child access to online pornography’
Chair: Baroness Floella Benjamin
Chair’s Opening Remarks:
Good evening everybody and welcome to this, the 43rd meeting of the All-Party Parliamentary Group on a Fit and Healthy Childhood. We are one of the most successful all-party groups here in Parliament, and this evening we are looking at preventing child access to online pornography.
I’ve been saying this for many, many years. Before our daughter was born, 30 years ago, Keith and I tried to make a TV programme for children about relationships, and we were told that children did not want to know about relationships. Just imagine how much damage has been done to our children since then.
I remember 8 years ago asking my party to adopt a motion about how to stop children easily accessing online pornography, and there was a big uproar because people felt it would be an infringement of civil liberties. After I had spoken and there was silence, I realised that we had a real problem because people didn’t understand. So it’s been a battle getting to where we are now.
The online world is a wonderful world – it’s captivating and exciting, but it’s also challenging, especially for children. Childhood lasts a lifetime, and what children see at a very young age will stay with them for ever. I’ve got so much evidence of this happening: the 4-year-old daughter of a friend was raped by a 10-year-old boy who said, “I’m going to rape you, and you’re going to like it”. Where the hell did he hear these words? Where did he see all this?
The explosion of social media has brought many benefits, but along with those benefits has come some very serious damage, especially to our children. Pornography is one of the aspects that has warped children’s minds – about relationships, sex, who they are, what they do with their bodies, how sacred their bodies are. We have to try to help them navigate life in the right direction, and this is one way, I feel. So we warmly welcome the Digital Economy Act of 2017, under which online commercial pornography must be placed behind robust age-verification barriers. This sets an international precedent in child protection online. Last February, the British Board of Film Classification was designated as the regulator for the new system. To me, they are the best people to be looking after our children and protecting them in the online world.
We are therefore delighted to welcome them and potential operators to speak to us tonight. What progress has been made? When will the new system become operational? Will all pornography be covered?
David Austin, CEO, British Board of Film Classification
On 15 July, we will begin our role as the Age-verification Regulator under Part 3 of the Digital Economy Act. This ground-breaking new law requires commercial pornography services, wherever they are based, to place their content behind robust age-verification barriers. So children in the UK won’t normally be exposed to adult content online. Pornographic services must also ensure they do not carry any extreme pornography, which is illegal to possess.
The Digital Economy Act is world-leading. It places the UK at the forefront of child protection online. Government and Parliament should be proud of this law. I am proud the BBFC has been entrusted with the task of enforcing it. It marks a commitment to making Britain the safest place in the world to be online. The world is watching to see how we get on.
The BBFC is uniquely placed to take on this new function. First, because of our expertise in classifying pornographic material but also because of our longstanding experience of online regulation. For example, through our work with the UK’s MNOs (Three, O2, EE, Vodafone) under which hundreds of millions of websites are filtered according to trusted BBFC standards. We also work with video-on-demand platforms such as Netflix and Amazon Prime which carry trusted BBFC age ratings and content advice. Protecting children from potentially harmful content and helping families choose content that’s right for them are core missions of the BBFC. So the role of Age-verification Regulator is a natural fit for us.
We’ll use the expertise and resources that we already have to fulfil our new duties. From 15 July, expert BBFC Compliance Officers will actively investigate commercial pornographic services to ensure they meet the requirements of the legislation. This includes websites which offer pornographic content for free, but which generate revenue in other ways, e.g. through advertising. We will concentrate on those websites that are most frequently visited, particularly by children as well as those that child protection charities and members of the public highlight to us.
Pornography is at the moment just one click away for children in the UK. I was up in Glasgow when you (Baroness Benjamin) gave your speech eight years ago and I do remember the silence in the room after it. Things have moved on a long way since then.
Pornography is clearly affecting the way young people understand healthy relationships, sex and consent. Government figures, when they launched a consultation on the Digital Economy Act, found that around 1.4 million UK children visit a pornographic website in a single month. NSPCC research found that children are as likely to stumble across pornography by accident as to search for it deliberately. This is plainly unacceptable. In the physical world we do not allow children to enter a sex shop where they might see pornographic content, so it is only common sense to have equivalent protections online.
Equally importantly, the new law is looking to change norms. We are the first country to say that we as a society say it’s not okay for children to have free, unfettered access to online pornography. Changing attitudes and behaviours around children’s exposure to online pornography is not the work of a moment. But the Digital Economy Act is a big step along that road.
And we are confident that it will be a success. We expect a high level of compliance from the adult industry and we’ve been talking to the biggest global pornography companies for the last two years and we’re confident that they will comply with the new law. We have significant enforcement powers to encourage compliance and to deal with sites that are non-compliant.
There is broad public support for age-verification. A YouGov survey conducted on behalf of the BBFC found that 88% of parents with children aged 7-17 agree there should be robust age-verification controls in place to stop children seeing pornography online. The BBFC has also engaged with a number of children’s charities, and we know that they too welcome this bold new step towards a safer internet for young people.
A word of caution. You touched on social media, and we can come back to that. Age-verification is not a silver bullet. A determined, tech-savvy 17-year-old will find ways around the new controls. But I strongly believe that what will become a thing of the past is what the NSPCC and the Children’s Commissioner highlighted: children stumbling across online pornography.
So it’s not a silver bullet, and it needs to work hand in hand with education and other methods.
We at the BBFC are under a legal obligation to report to government on the effectiveness of age-verification a year after entry into force, detailing what has worked well and what hasn’t worked so well. Where it’s right to do so, we will recommend additional or alternative means of achieving the child protection goals of the legislation. The Secretary of State in turn is legally obliged to report back to Parliament on the impact of the Act. You may recall that just before Christmas there were parliamentary debates in both houses on social media and extreme pornography. Those are clearly issues that parliamentarians have shown us they are interested in looking at.
We shouldn’t see the legislation on its own: it’s part of a wider picture. The Government published its white paper on online harms and they’re looking for improved behaviour by the social media companies, so this is part of the wider efforts by the Government to make the internet a safer place for children. We’re really pleased and looking forward to playing our part in that process.
Amelia Erratt, Head of Age-verification at the BBFC
Some of the statistics that underpin this legislation include:
- 60% of young people were 14 or younger when they first saw pornography online
- 62% of young people first saw pornography online when they weren’t expecting to, or because they were shown it by someone else
So these figures show that young people really are accessing pornography, and it’s often by accident. As David said, 88% of adults agree that there should be age-verification controls in place to stop children from accessing online pornography – a statistic from our YouGov survey of over 6,000 adults. In our role as the Age-verification Regulator we are required to carry out research to help measure the effectiveness of age verification, and this is something we will report back on in our annual report.
Before the law comes into force we’re conducting benchmark research. The YouGov survey is part of that and we have also engaged an agency who will do research with children as well.
Another key statistic is that 75% of girls aged 13-21 agree that all pornography sites should carry AV. This tells us that children themselves want AV on online pornography, and that they think online pornography can have a detrimental impact on their understanding of relationships and issues of consent.
So section 14(1) of the Digital Economy Act is the premise of age verification: it says that online commercial pornographic services must have controls in place to ensure that children cannot normally access their material. The words “not normally accessible” are really key for us, because one of the main criticisms we face is that AV can’t solve everything and that there will always be circumvention. The words “not normally accessible” mean that the legislation itself recognises this. We say that AV is not a silver bullet, but that it works hand-in-hand with education, and that it will go a long way towards stopping the 62% of children who see pornography online by accident.
During the passage of the bill, Parliament decided that the BBFC can intervene on extreme pornographic material, which is defined quite narrowly in the Criminal Justice and Immigration Act 2008. We’ll be reporting back to Parliament on the instances where we find it.
So why is the BBFC involved? Well, we’re experts in classifying material – it’s something we’ve been doing for years. We have experience in implementing and interpreting legislation such as the VRA (Video Recordings Act 1984). We have an understanding of the ways people access content online through our work with mobile network operators and music videos. But most of all, the Digital Economy Act is child protection legislation, and child protection is one of our guiding principles.
How will we regulate age verification for online pornography? Under the Act, we have to produce guidance on the arrangements, and we set out criteria that we will assess against. The crux is that:
- the data must not be known by another person
- users must be logged out by default
- controls should be in place to prevent access by non-human operators
Before we set these criteria (which are much more detailed – you can see them all on our website), we assessed the current landscape of age verification, and we were surprised at how some of the traditional means of AV were not particularly robust, so we came up with the new standards to improve that.
To ensure continued innovation in the area, we took a principles-based approach instead of listing approved solutions.
Our enforcement powers for non-compliant sites include:
- notifying ASPs (ancillary service providers) who assist or enable access to a non-compliant website, e.g. social media, search engines and ad networks
- notifying PSPs (payment service providers) e.g. Visa and Mastercard to request that they withdraw services
- instructing ISPs (internet service providers) to block access to the sites
We can use these powers in any order and we will determine case by case which will be the most effective.
As the regulator we have to take a proportionate approach, and this will include investigating services that have the highest levels of traffic but it also involves making provisional contact with a non-compliant website before taking enforcement action.
Throughout the process we’ve been engaging with many different stakeholders including children’s charities but also the various providers and the AV Industry, to make sure they are all aware of the regulations and our guidance.
We have recently launched our age verification certificate, a voluntary certification scheme separate from our statutory role. To get a certificate, the provider registers with us for a third-party audit, which assesses them against a standard of data protection and security controls. This has been developed in cooperation with the Department for Digital, Culture, Media and Sport (DCMS) and the Information Commissioner’s Office (ICO). The motivation came from the fact that, while we expect a high level of compliance from the adult industry, a lot of stories in the press have focused on the dangers of AV and the threat to personal data. We often say that in order for AV to work we need two things: 1) adult websites need to carry age verification and 2) consumers need to use age verification. So really this age verification certificate is there to provide consumer confidence, as certified providers will display a symbol on their site.
Peter Violaris, Yoti
We are one of the age verification operators. I’ll give you a quick background on Yoti and then I’ll explain our two AV solutions.
We’re a digital identity company based in London, with 250 employees, and we aim to be the world’s trusted identity provider. We currently partner with many different services and providers for a number of different identity and AV purposes.
From our perspective, we think that successful implementation of this law has to do two things: 1) it has to protect children and 2) it has to ensure that adults aren’t too adversely affected by the AV requirements. To achieve those two aims there are three things we need to consider:
- the AV process has to be robust enough to stop most children getting round it easily
- it has to ensure total privacy and anonymity of users online, both technical security and process security
- it needs to be as frictionless as possible
We welcome the BBFC’s certification scheme which should give consumers and websites confidence that that second aim is being met by AV providers, that privacy can be ensured. As for the first and third of the aims, well, people will vote with their feet if there is too much friction and the BBFC will enforce if it’s too easy to get round.
We have opened ourselves up to wider scrutiny, and we hold numerous other voluntary certifications and security affiliations. We take ethics, transparency and privacy extremely seriously.
We know that it’s possible to provide online AV services in a totally anonymous way, so I’ll explain our two solutions, which are very different from each other.
The first uses the free Yoti app, which is downloaded by a user. Once the account is set up (it takes a few minutes), the app can be used over and over again across a range of different industries, including retail, airports, government services and, of course, online pornography. The way it works is that a user downloads the app, registers a mobile number to receive a code and then proves their age. They can do this by scanning a driving licence or passport (which are verified). The data is encrypted and fragmented across different databases, and the private key to put the data back together is held by the user on their mobile phone, not by Yoti. This means that Yoti has no knowledge of individuals and no way of obtaining knowledge about individuals. Further, when users use the app to prove their age, they do so by presenting a QR code to the website. The user is then asked to confirm that they are happy to share the fact that they are over 18 with the website, and after that access is granted. Yoti will know the identity of the website but will not know who the user is.
The only data we allow to be shared with adult websites is the fact that the user is over 18. The website can’t learn any other information about the user from Yoti. So it’s an incredibly secure, anonymous and private way of proving age, and once it’s set up, it can be used in a matter of seconds.
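The data-minimisation flow Peter describes can be sketched in miniature (a toy model, not Yoti’s actual implementation; the class and field names here are invented for illustration):

```python
import secrets


class AgeAttributeWallet:
    """Toy model of a client-held age attribute.

    The verified date-of-birth check happens once, at registration;
    afterwards the wallet can only release a boolean "over 18" claim.
    In this model the key material stays on the user's device, so the
    provider holds nothing that links a user to any particular share.
    """

    def __init__(self, verified_over_18: bool):
        # Models the private key held only on the user's phone.
        self._device_key = secrets.token_bytes(32)
        self._over_18 = verified_over_18

    def share_with_site(self, site_id: str) -> dict:
        # The relying site learns ONLY the boolean claim plus a
        # one-time reference; no name, date of birth or document data.
        return {
            "claim": {"over_18": self._over_18},
            "nonce": secrets.token_hex(16),  # fresh per share, unlinkable
        }


wallet = AgeAttributeWallet(verified_over_18=True)
share = wallet.share_with_site("example-adult-site")
print(share["claim"])  # {'over_18': True}
```

The point of the sketch is what is absent: the payload handed to the site contains no identity fields at all, which is the property the third-party audits are meant to certify.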
We also understand that not everyone has a driving licence or passport, so we’ve teamed up with “Citizen Card”, one of the pass scheme providers. The Yoti Citizen Card can be bought for £5 and that can be used to prove age using the Yoti app.
So that’s our first solution. The second solution is very different, and we call it “Yoti Age Estimation”. Using machine learning, we can estimate age just by looking at a face. The way it would work is that you would be on a browser and your camera takes a photo of your face which is then sent to the Yoti servers for age estimation. Once used, the image is deleted and the verdict is sent back to the website being accessed. The message will be simply that the user is – or isn’t – over 18. It’s very simple, secure, private and anonymous.
We’ve been working very hard at this technology and we know that there are biases, but we are very proud of what we’ve done and we’ve published our testing results (by age, gender and skin tone) online. For the key bracket of ages 16-30, our system has a mean error of 2½ to 3 years, which is better than most humans, with no bias on skin tone.
Regarding social media, we have teamed up with Yubo, a social media platform that is very popular with British and French teenagers. They have more than 20 million active users. They’ve been using our age estimation technology and our app, firstly, to make sure that no under-13s can access their services and, secondly, to stop adults lying about their age in order to impersonate younger users.
Anyone close to the threshold for age verification (on either side) will be asked to prove their age in a different way, e.g. through the Yoti app, although there are other ways too.
So when you read that social media organisations say it’s not possible to implement age verification solutions, they are lying.
So those are our two solutions and I’ll be happy to answer questions later.
Alistair Graham, Age Checked
The reason that I’m here is that I had the experience of seeing my teenage nephew watching YouTube videos on the television showing 8-year-old boys bullying a young girl. I thought, this is not right! He has a very responsible mum, but she didn’t even know it was possible to get YouTube on TV, and he was watching content from an organisation that can do a billion page views a day but which can’t do decent age verification. So for me it started off as a thought experiment, and I eventually set up the company that I am CEO of.
We have been doing age verification in other sectors (alcohol, vapes, knives, plastic surgery) for two years. The thing that will make the biggest difference globally for age verification is the Digital Economy Act, and we’ve been engaging with the adult industry for some years now about the measures that are going to come into place.
It’s worth mentioning that I’m also the current chair of the Age Verification Providers’ Association, which is an organisation representing 7 companies all of which are fully behind the BBFC and the Government as regards this legislation. In particular we support the idea of pushing back against this idea that it’s too difficult and too risky. As a trade body we’ve come together to refute these claims.
The journey that all of the providers have gone through started primarily with the Digital Policy Alliance, a cross-bench briefing organisation, through its Age Verification and Internet Safety Group. Part of the work that we did helped formulate some of the discussions in Parliament, especially about what was technically feasible. The principles we support are:
- age verification should only be used for age verification
- it should be secure
- no extra information should be collected
All the organisations agree to a “data minimisation” approach. At Age Checked, like Yoti, we don’t store data. We might see personal data as part of an age-verification method, but it’s transitory and is removed after use. We don’t have data to misuse even if we wanted to.
Experience in the different sectors has shown us that among the general public there is no resistance to age verification in other sectors, e.g. alcohol. But the adult sector is different, and we need a balance: in order to protect minors we need measures that are very clearly not going to be data-gathering measures. That is a massive concern that we see with the adult users of these sites: they are most concerned about the protection of their personal data. So many of the conversations we have aren’t about child protection but rather about data security. That’s why, in principle, the Age Verification Providers’ Association is behind the voluntary scheme of the BBFC.
The other part is seamlessness, which is for the merchant side. The adult industry moves an insane amount of traffic – we have clients who have 5 billion page views per month. Any friction in a process means loss of business, so the methods we are creating for the adult industry are somewhat different from those for other sectors like alcohol. The process has to be as seamless as possible. Working with the adult industry you need a layer that recognises existing customers who have already been verified, so that customers can move freely from site to site without problems. Any friction in doing so would be too much for the industry to bear, so AV providers have been developing systems where customers are checked on the way in but are then able to move around seamlessly thereafter. That’s a key part of any provider’s solution.
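The “verify once, roam freely” layer Alistair describes could in principle work like a signed token that any participating site can check without re-running verification. A minimal sketch, assuming a symmetric HMAC key shared between the AV provider and its sites (hypothetical; a real deployment would more likely use asymmetric signatures so sites cannot mint tokens):

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared secret between the AV provider and participating sites.
PROVIDER_KEY = b"demo-provider-signing-key"


def issue_age_token(ttl_seconds: int = 3600) -> str:
    """Issued once, after the customer passes age verification."""
    payload = json.dumps({"over_18": True, "exp": int(time.time()) + ttl_seconds})
    sig = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(f"{payload}|{sig}".encode()).decode()


def check_age_token(token: str) -> bool:
    """Any participating site can accept the token without re-verifying."""
    payload, sig = base64.urlsafe_b64decode(token).decode().rsplit("|", 1)
    expected = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):  # reject tampered tokens
        return False
    data = json.loads(payload)
    return data["over_18"] and data["exp"] > time.time()


token = issue_age_token()
print(check_age_token(token))  # True
```

The token carries only an over-18 flag and an expiry, so the roaming layer adds no identity data: the check on each subsequent site is a signature comparison rather than a fresh document check.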
The other layer is what we would see as standard age verification which would be familiar from other sectors (e.g. alcohol purchase) where the customer would be asked to provide age verification in a number of ways, e.g. driving licence, credit card, mobile phone etc. So users would come to our site before entering the adult site, and it would feel a bit like a payment gateway, and in that environment we see users prepared to use personal information to secure age verification, because it is very obvious that it is not the adult website doing the processing. We ran a test of this last month and we were seeing as many people using the driving licence solution as the mobile phone solution. We’ve seen a massive growth in the last two years in people completing the process. At the beginning, when it wasn’t really a familiar process, we were lucky to get 1 in 100 going through age verification but now it’s more like 1 in 4. That’s quite reassuring when faced with objections that “no one will do it”.
On the methods, as a little bit of background it’s worth mentioning that organisations like ours will sometimes share methods as well, so we’re not out and out competitors. We’re discussing using the Yoti method ourselves, and in the Association we are all trying to make it work in our own different ways as seamlessly and as successfully as possible.
Questions and Comments:
Baroness Benjamin: What do you say to people who worry about the security of their data? What percentage of safety would you be able to give a sceptical person who doesn’t want to age verify? We know that there are some clever people out there who can get to data and exploit it.
Alistair Graham: There are two parts to this. We knew that we would be challenged, and so we made a decision never to hold the data in the first place. So rather than having to argue that data is secure and protected, we argue that we don’t hold the data. There is nothing to be hacked. We don’t have it. That is the technological standpoint, but I think in order to gain people’s trust, that is where the BBFC scheme comes in, as we have a third-party auditor to validate and certify the fact that the system operator does not hold the data.
So it’s really important that the news gets out about what these schemes are doing in terms of privacy. Personally I would like a “they don’t hold your data” badge but I know it’s not that simple. I think it is the third parties that reassure the sceptics more than the operators.
Peter Violaris: I totally agree with that. Third-party auditing is the only way to guarantee security. We put ourselves through three audits every year, including one by a top-four auditing firm, and we put ourselves through various different penetration tests. The whole Yoti system is open to be hacked on a “HackerOne” programme, which encourages “white hat” hackers – people who find vulnerabilities in systems in exchange for rewards. It’s very hard to explain that to the press. I don’t know how much more we can say to generate confidence in users, but the BBFC certification is a good first step and ultimately it’s about earning trust.
Sascha Colgan, Consultant GP: One of the things I find most alarming is that on platforms such as YouTube, you can find corrupted versions of children’s programmes such as Sesame Street and Peppa Pig where there are undertones of violence etc. I wondered with your age verification whether those kinds of things would be targeted because on the face of it these programmes appear to be innocent. It’s a big concern of mine.
David Austin: As part of a new initiative with our Dutch colleagues we have come up with a tool to allow members of the public to age rate videos on YouTube and any video sharing platform. This is a simple and easy questionnaire, and in the UK it produces a traffic-light rating system showing the age suitability of the material. One of the features of this tool is that you can report anything in the video that you think is beyond the pale, and that will go straight to the compliance team of the relevant platform. So having a green/amber/red traffic light before you even look at the video is, we think, very important. We trialled the tool successfully in Italy and we’re looking for other partners to work with now. We’ve shown the tool to Google and Facebook, and although they are not biting our hands off to take it, it is ready to be used. New European legislation (the Audiovisual Media Services Directive), which is likely to be fully enforced in the UK regardless of Brexit, is likely to bring companies like Google into scope. This means that they will need to introduce controls that show that they take child protection seriously. It’s a problem, I agree, but we have what we think is a really good solution.
Mihalis Papamichail, Barnardo’s: Question for Peter regarding the first solution that you mentioned. I’m assuming that there’s nothing in place to stop, for example, a 15-year-old from using the passport or driving licence of another person?
Peter Violaris: We use a mixture of automatic face-matching and human reviewers. The automatic face-matching is the same as that used at passport control points when people come into the country, and the human reviewers have all passed the test set by the University of Greenwich, which means that they have a top-flight ability to recognise people. If there’s a chip visible we can read the data from the chip and get a very accurate face recognition match from that, but if it’s an older passport we have to take an image of the passport and run the comparison against that, and because that sometimes doesn’t give a very good score we also have the manual face checkers.
Baroness Benjamin: How can you reassure people that once you have that image, it’s not connected to an adult website?
Peter Violaris: It’s hard to get across to the general public, but the way it works is that when you do a share of your 18+ with an adult website, even Yoti doesn’t know that it’s you who has shared the data. It’s built in such a way that it’s physically impossible for us to know that because it’s encrypted and the private key to decrypt it is stored on the device of the user. Yoti has no access. Alistair mentioned the PAS 1296 standard which we have been certified and audited against. That’s the kind of thing that will reassure the public but you need to think about how you are going to get that message across because at the moment, people don’t believe you.
Keith Taylor: I’m Floella’s husband and we’ve been working on this for a very long time, and I think your buzz phrase ought to be “we don’t trust ourselves, so we don’t keep your data”. If it’s impossible for you to get hold of that data then you’ve got to find a way to get that across to people. Time and again we’ve heard people say that they would not give their data to a porn site, and somehow we have to get the protection of data over to people.
Also the age assessment thing fascinates me. How accurate is it and how easy would it be for example for a boy to use his elder brother to do the face assessment? Or even wilder, some make-up?
Peter Violaris: We are totally transparent and we publish all of our face recognition test results. At about the age of 18 the mean average error rate is about 2½ years, so if you set the threshold at 25 or 30, you will get very few 17-year-olds getting through that. You obviously wouldn’t set the threshold at 18 because then a lot of 17-year-olds would get through.
In terms of spoofing the system, well, if your older brother is happy to sit there in front of your laptop and pretend to be you, there is nothing any age verification system can do about it. As BBFC has said many times, you cannot stop a determined 16-year-old getting access to porn. In terms of make-up, we’ve done lots of tests with make-up and beards. Beards are harder because if a 16-year-old puts on a fake beard he is going to look older. So that’s another reason for setting the threshold higher.
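Peter’s threshold logic can be illustrated with a simple model. Assuming the estimator’s error is roughly normal with a standard deviation of about 3 years (an assumption made here for illustration; the figure quoted in the session is a mean error of 2½ to 3 years), raising the threshold from 18 to 25 cuts the share of 17-year-olds who would slip through from over a third to well under 1%:

```python
from math import erf, sqrt


def pass_probability(true_age: float, threshold: float, sd: float = 3.0) -> float:
    """P(estimated age >= threshold), modelling the estimator's error as
    normal with standard deviation `sd` years (an illustrative assumption).
    Uses the normal CDF via erf: P(Z >= z) = 0.5 * (1 - erf(z / sqrt(2)))."""
    z = (threshold - true_age) / sd
    return 0.5 * (1 - erf(z / sqrt(2)))


# A 17-year-old against a threshold of 18 versus a threshold of 25:
print(round(pass_probability(17, 18), 3))   # ~0.369: far too leaky
print(round(pass_probability(17, 25), 4))   # ~0.0038: very few get through
```

This is why the threshold is set well above 18: anyone estimated below it (including most genuine 19-year-olds) simply falls back to a document-based method rather than being refused outright.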
Baroness Benjamin: We’ve heard tales from Barnardo’s about young people who have been sexually exploited by adults who have forced them to watch porn with the adults, so obviously that’s something we can’t safeguard against, and we just have to accept that. But we can stop young children under the age of 18 being able to freely access online pornography. I just want to play Devil’s Advocate to be sure we’ve got responses to the questions people raise and that we’ve thought of everything.
Estelle Mackay, Public Health Nutritionist: I’m interested in this idea of “stumbling across” things. As a parent you do your best to protect your children. When the protection comes into force in July what are your plans to go into schools and to let parents know that you are doing this work? Are you planning to use celebrities? How are you going to communicate to the average parent who hasn’t got a clue what’s going on?
Amelia Erratt: We will be doing a lot more public engagement than we have been, and we’ll be a lot more proactive over the next three months as the law comes into force. In terms of reaching parents and schools, we have a charities working group through which we will be working with most of the big children’s charities in the UK to use their networks to get the message out.
Baroness Benjamin: You should be part of the debate that’s happening about how to train teachers to deliver relationships and sex education to children. The organisation responsible is looking for partners to work with and I will send you the details because the Government is giving £60m to make sure that this is rolled out in the right way.
Question: As a user, can I appeal the decision if I fail age verification? For example if I’m a 19-year-old and you think I’m 16?
Peter Violaris: There are going to be false positives, especially for 19-year-olds, which is why we aren’t setting the threshold at 18. The threshold will be set between 25 and 30, which means that most 19-year-olds will fail age estimation, but there are very many other methods for them to use.
Amelia Errat: One other thing that we’re doing is establishing a benchmark for how children access pornography at the moment and then in the coming months and years we will look at that repeatedly to see how we’re doing and what changes might need to be made to the legislation.
Baroness Benjamin: There’s a debate about online harms next Tuesday in the Lords and I’d be grateful to you – and Barnardo’s too – if you would send me anything that might be useful for that debate. Porn for a lot of people is victimless, but they don’t see what we see.
Sascha Colgan: One of the interesting things that I find as a female GP is the number of young women who come for an examination and apologise for not having shaved their body hair. I believe that is an impact of pornography. I get it from men too – the norms around body image are changing.
Baroness Benjamin: A lot of young girls are telling me that their boyfriends are forcing them to watch porn.
Sascha Colgan: The uniform morphology of female and male bodies in pornography is causing young people to feel that that is what they should aspire to. It’s terrible.
Murray Perkins, BBFC: We’ve talked about the harm to children, and we know that a whole generation of children has grown up with an ease of access that previous generations haven’t had. Come July 15th we are confident that we will get a high level of compliance and that we are going to see a really significant difference – in particular to children stumbling across pornography, but it will also reduce children being shown porn, e.g. at school. The normalisation of pornography has always been there for some children, and as adults we say that it’s not healthy at a young age, but as adults we’ve also never done anything about it – up till now. We’ve been sending a mixed message: “you shouldn’t be watching it”, but on the other hand, it’s freely available. One measure of success, a few years down the line, will be a change in normative behaviour and social attitudes to pornography. We are saying that it’s not a silver bullet – and some children will get through – but over time we are finally sending a message, as adults, that it’s not ok.
Baroness Benjamin: Years and years ago people were against having seat belts, and others were against the smoking ban, but now they’re accepted. I think this is going to be the same.
Keith Taylor: Estonia is one of the most technologically advanced countries. When you are born you receive a number which stays with you for life. People against the idea of identity cards go on about freedom, but we are already identified by all kinds of things – passports, driving licences, Tesco club cards – and the real answer is for everyone to be given an ID at birth. That’s what they do in Estonia.
Murray Perkins, BBFC: I think the essence of that – and Alistair has spoken about this very well – is that what we have seen in the age verification industry is phenomenal innovation in a very short space of time in order to address the requirements. There will come a time, and it won’t take long, when people will take age verification for granted and its absence in this context will be seen as shocking.
Baroness Benjamin: You’ve hit the nail on the head that the rest of the world is looking at us. We are leading the way in this. What final message would you all like to leave with us?
Alistair Graham: A call for help. When putting our case I often feel that we are constantly fighting people (e.g. the press) who say it won’t work or it’ll be too hard. So any advice on how to find people who can help us explain these messages in a positive way would be very helpful.
Baroness Benjamin: Do you remember Jeremy Paxman mocking 3D printing? Now it’s totally accepted.
Peter Violaris: Another cry for help. What would help is if the Home Office would allow digital age verification to be used in other areas, e.g. alcohol sales and the right to work or rent, where you currently have to prove your identity with physical documents because you’re not allowed to use digital ID. Digital ID is banned there for no good reason, and the Home Office doesn’t want to address it. It’s great to see the Digital Economy Act coming through, but for it to be totally successful there needs to be a joined-up approach.
Baroness Benjamin: Getting the Digital Economy Act was hard, but Play School helped me with that because a lot of the ministers are ex-viewers and they knew they could trust me.
Amelia Errat: Thank you for listening and I hope the brief was helpful for those of you who haven’t heard it before. We will be engaging more in the coming months to get our message across. We have a website: http://www.ageverificationregulator.com which holds information, FAQs etc.
David Austin: Thank you for your support, Floella, and to other parliamentarians like you who have supported our work, not just in this area but over the years as well. And I’d like to thank Barnardo’s, who are part of our charities working group, for helping us get the message out. AV is going to become the norm, like seat belts and the smoking ban, and one of the reasons that we’re very confident that the adult industry is going to comply is that they see this as the norm. Other countries will definitely be following our example and learning from us.
After some further discussion, questions and comments, the meeting closed at 5.30 pm.