Once again the issue of police use of Automatic Facial Recognition (AFR) technology has come into the public spotlight. This week it was part of an inquiry at the House of Commons Science and Technology Select Committee – to which I submitted written evidence.
Many of the themes emerging from that Committee reflect those that have been around for some time. We do of course await the outcome of the legal proceedings brought by Liberty and Big Brother Watch against South Wales Police and the Metropolitan Police respectively over their use of AFR. Those deliberations will provide focus regarding the legitimacy and lawfulness of the use of such technology.
In the context of regulation, it appears to me that broader and more creative thought needs to be applied to the prospect of society’s increasing use of artificial intelligence linked to surveillance cameras.
The first question I ask – is it reasonably foreseeable that the use of such integrated technology will continue to be an ever-growing phenomenon? The answer is surely yes. The scope for the private sector to provide enhanced services to its consumers utilising facial recognition (shopper identity) or other predictive algorithms is immense. The momentum of the pound pushes forward this debate in the absence, arguably, of a robust regulatory or legislative regime wrapped around it.
Next – is it likely that police/public partnerships will emerge that combine this utility with a security application? Yes – it is already happening. Many will recall my intervention at a large shopping centre in the North West of England and a local police force in October 2018. There is a world of difference between an enhanced shopping experience and focused state surveillance, particularly when conducted on a mass scale. This is where society needs to be careful and where government needs to ensure robust legislation that is fit for purpose.
I have often stated that the public do not expect an analogue police force in a digital age (but they have every right to expect clear, transparent and common-sense laws and rules to govern police conduct and use of those technologies – as indeed do the police themselves). The recommendations I made to the Committee will (should, if appropriately considered) go a long way towards assuaging public concerns about the use of this technology by indicating a pragmatic way forward – a clear legislative framework, recognisable operating and technical standards, and a recognition that the actual conduct of surveillance itself is a far wider-reaching consideration than simply data acquisition (whether it be conducted overtly or covertly) – all supported by appropriately clear and intrusive oversight.
The establishment of the Home Office Law Enforcement Facial Images and New Biometric Modalities Oversight and Advisory Board has been a step in the right direction, but more is required. The Home Secretary’s Surveillance Camera Code of Practice, issued by virtue of the Protection of Freedoms Act 2012, clearly sets out that it regulates the use of overt surveillance camera technologies in public places in England and Wales, now and in the future.
The police and the public will understandably seek recourse to it as appropriate, and therefore have every right to expect it to provide clear and relevant regulatory guidance to those who need it. It was published in 2013. Things were different then. I have advised Government consistently since 2016 that it needs to change and evolve. Government committed, within its Biometrics Strategy (June 2018), to review the Surveillance Camera Code of Practice.
I have had conversations with the Home Office on the review, but progress has been glacial. If public reassurance is to be gained, this needs to progress quickly. The Code sets standards and principles as well as signposting the broader legislative considerations which apply. It is, after all, operated under the Home Secretary’s responsibilities – and it is precisely where more effort is required.