While an estimated global TV audience of around 300 million looked on, seeking out well-known faces in the crowd, the use of live facial recognition cameras by the police made the event in London the largest public deployment of Artificial Intelligence (AI) driven policing tech in British history, possibly even in world history. I am certainly not aware of any on a larger scale.
It is now widely known that it is the de facto policy of the Government to put this kind of AI-driven facial recognition at the very heart of British policing. Its use at the Coronation may be a significant step along that road.
It is not my job as Biometrics and Surveillance Camera Commissioner to decide whether or not facial recognition should become a cornerstone of policing in this country, though I have certainly advocated for its accountable and proportionate deployment in appropriate circumstances. How the opportunities it offers should be defined and regulated are questions for parliament to decide on behalf of the citizen and for the courts where its use is challenged.
It is, however, part of my job to try to ensure that where the police do use facial recognition capability, its use complies with the very limited rules that already apply to public space surveillance by the police. It is also part of my role to draw attention to the fact that oversight and regulation in this increasingly important area of public life are incomplete, inconsistent and incoherent.
While the emergence of ChatGPT may have been the first time that many people really became aware of the power and potential of modern AI, the technology has actually been with us in many fields for some time now. But its use by the police has a particular significance – legally and societally – and live facial recognition of the kind deployed at the coronation is probably policing’s greatest technology challenge right now.
The AIs in play here have made it possible to explore and exploit the vast pools of unstructured data generated by digital camera systems. Working at incredible speed, they can now sift through previously overwhelming masses of source material, not only picking out human faces and comparing them against reference images (such as a custody image taken by police when they arrested someone), but also recognising events, like a goal being scored in a football match, a car crash, or a fight. Some systems can, with varying degrees of success, identify the gender, ethnicity, age and even emotional state of individuals caught on camera.
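The matching step described above – comparing a detected face against a reference image such as a custody photograph – typically works by reducing each face to a numeric “embedding” vector and measuring how close two vectors are. Here is a minimal sketch of that comparison; the function names, the use of cosine similarity and the 0.6 threshold are illustrative assumptions, not details of any deployed police system:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_candidate_match(probe: np.ndarray, reference: np.ndarray,
                       threshold: float = 0.6) -> bool:
    """Flag a possible match when similarity clears the threshold.

    Real systems tune this threshold to trade off false matches against
    missed matches; 0.6 here is purely illustrative.
    """
    return cosine_similarity(probe, reference) >= threshold

# Toy example: a probe embedding close to the reference is flagged;
# an unrelated one is not.
reference = np.array([0.9, 0.1, 0.4])
probe_same = np.array([0.85, 0.15, 0.38])
probe_other = np.array([-0.5, 0.8, -0.2])

print(is_candidate_match(probe_same, reference))   # True
print(is_candidate_match(probe_other, reference))  # False
```

The operationally contested questions – which watchlists feed the reference images, what threshold is used, and what happens after a candidate match – all sit outside this simple comparison step.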
It is undoubtedly powerful and newly intrusive technology. Some people are so worried about its impact on our human rights, and on our privacy in particular, that they want an outright ban on its use, not just by the police but also by commercial organisations. Realistically, that ship sailed some time ago; this AI technology is already here and already in use.
I am convinced that modern facial recognition, and other AI-driven biometric surveillance technologies in the pipeline, are potentially too useful an advance in the fight against crime and terrorism for us to turn our noses up at. And while the many legal issues have yet to be defined let alone tested, some victims and their loved ones will not forgive us for eschewing legitimate tools that could have changed the outcome of events which devastated their worlds. Organised crime groups and terrorists already exploit this technology.
So, the question is no longer whether the police should use it, but how we can ensure that they use it well. How do we strike the right balance between the need to protect privacy and other rights, and the imperative to fight crime effectively? What sort of safeguards are needed so that the public can feel confident that when the police do deploy facial recognition, they will do so in line with a set of sensible rules that parliament has agreed on behalf of the citizen?
The Government’s Surveillance Camera Code is the only legal instrument to address the police use of live facial recognition directly. Approved by Parliament last year, the amended Code’s purpose is “to enable operators of surveillance camera systems to make legitimate use of available technology in a way that the public would rightly expect and to a standard that maintains public trust and confidence.”
It is perhaps unsurprising then that the police, I and others are wondering why the Data Protection and Digital Information Bill is in the process of scrapping this enabling Code – which has public expectation and trust at its heart – and not replacing it with anything.
The division of opinion over the police use of facial recognition in the UK possibly mirrors that of the coronation itself, but there is one aspect on which there is almost complete agreement: the urgent need for better oversight and regulation.
If the coronation heralds the start of a new era for our evolving relationship with the monarchy, maybe the deployment of AI-driven facial recognition at the event ought to mark the beginning of a fresh approach to our relationship with state surveillance that will increasingly rely on this powerful technology.
We may not be that far away from using an AI like ChatGPT to draft warrants or witness statements, yet we have the scantest regulatory framework under which those using the technology will be held to account. Personally, I favour a principles-based approach over a rigid, statutory one so that it will be flexible enough to accommodate the new, related technologies that are likely to follow. Demonising the technology is irrational and banning its use in every case is inviting operational disaster.
Some of our most challenging crime areas illustrate what a footrace between law and technology looks like, and the law is always hopelessly outrun. But, whatever the final answer, the first steps must be to understand public attitudes and the place of intrusive AI-enabled surveillance by the state in the future.
The learning tool is designed for people working in both the public and private sectors, from a range of fields and areas of expertise. Its content is regularly updated by the Institute’s expert groups and covers a wide range of topics, biometric modalities and standpoints on the technologies used in biometric systems.
The course, developed by the Biometrics Institute, is based on its Good Practice Framework, a risk and decision-making management tool. It gives learners insight into the complex decision-making processes involved in implementing biometrics, and into how those decisions affect the biometric user.
The online format allows participants the flexibility to learn at their own pace and on their own schedule. Learners are taught through real-world examples, exploring the strengths and weaknesses of existing biometric systems.
“Over the past 21 years we have been providing better education on biometrics. We are therefore thrilled to offer this user-friendly programme to professionals wanting to expand their knowledge and skills in biometrics,” says Isabelle Moeller, Chief Executive Officer of the Biometrics Institute. “With the rapid growth and adoption of biometrics in various industries, it is more important than ever for users, suppliers and others considering the implementation of biometrics to have a strong understanding of this technology. Our Biometrics Essentials learning tool is designed to provide everyone with the knowledge and skills they need to mitigate risk in this field.”
The Biometrics and Surveillance Camera Commissioner Professor Sampson has stated that, “One of the key elements in ensuring accountability and legitimacy in the use of biometrics by the State is understanding – the better people’s understanding of what’s possible (technologically), what’s permissible (legally) and what’s supported/expected (societally) the closer we will get to balancing these three appropriately. Having learning tools such as this can make a big contribution to that outcome.”
The Biometrics Institute is an independent and impartial international membership organisation for biometric users and other interested parties. It has been educating users and suppliers in the responsible, ethical and effective use of biometrics for over two decades. The learning tool is a reminder of the importance of applying the Three Laws of Biometrics in biometric applications, and is complemented by other good practice guidance material the Institute provides.
The course has been tested by novice, intermediate and seasoned professionals and is free for all to take, not just existing members. It takes approximately one hour, and users are awarded a certificate upon completion.
More information about the Biometrics Essentials learning tool is available here.
And it’s even more important now that the government has said that some leading providers of the surveillance cameras and systems watching our public spaces aren’t to be trusted. The announcement on 24 November is the first public acknowledgement by the government of what security professionals have known for some time: that these companies can’t be trusted with sensitive surveillance functions. They have also been directly linked with human rights atrocities perpetrated by the police. Both of these aspects raise significant trust issues. Some local authorities and even government departments stepped up to the ethical leadership plate – now the UK government has directed the removal of their surveillance systems, initially from ‘sensitive’ government sites.
So what’s next? I have likened the situation to ‘digital asbestos’.
Local authorities and police forces in England and Wales have already been in contact with my office asking what constitutes a ‘sensitive’ site in their context. The government hasn’t defined ‘sensitive’ and of course surveillance extends far beyond fixed geographical locations, so we must decide that for ourselves. Here are some questions that might assist in framing the issues.
To be clear, this isn’t a ‘police’ problem and it’s not a ‘local authority’ problem; it’s not just a procurement or an ethical one: it’s a democratic one. And it’s international. As such it extends far beyond my limited statutory functions for the government’s Surveillance Camera Code of Practice. Which is why I met the Security Minister to discuss whether the further risks and requirements from surveillance sensitivities might be taken forward by his recently announced Task Force. My letter can be found here. The meeting was brisk and businesslike, and I left believing that things will begin to happen.
Following almost two years of meetings with politicians, Peers, partners; with chief police officers, chief executive officers, experts in surveillance, cyber risks and ethics; with journalists, lawyers, academics and human rights practitioners, all under the auspices of the National Surveillance Camera Strategy, I’m pleased to say that the issue has now moved on. Since the Written Ministerial Statement we’re no longer debating whether certain surveillance companies can be trusted. We’re now asking ourselves what can be done to ensure that only companies which have demonstrated their trustworthiness are working across any of our sensitive democratic infrastructure.
There’s a Chinese proverb which says “even a journey of a thousand miles begins with one small step”. This is that small step - and there’s a long way to go.
The survey covers all facial-recognition-enabled systems, drone-mounted camera systems, helicopter- or aeroplane-mounted systems, Body Worn Video (cameras on police uniforms), ANPR (automated number plate recognition) systems and any other surveillance camera systems in public places that fall within the definition of section 29(6) of the Protection of Freedoms Act 2012.
The survey asks about the capabilities of systems, whether they use equipment from non-UK suppliers about which there have been ethical or security concerns, what due diligence they have undertaken to ensure they are working with trusted partners, and how their systems comply with the Home Secretary’s Surveillance Camera Code which they have a legal duty to observe.
About facial recognition in particular, the survey asks forces whether they currently use facial recognition technology and, if so, whether it’s live (real-time) or retrospective, and whether it is initiated by officers using cameras on their mobile phones or some other kind of system. If none is currently in use, the survey asks whether the force intends to start using facial recognition technology in the future.
There is little doubt that the police use of surveillance camera systems in the public sphere has been increasing in recent years. This survey will provide an important snapshot of what kinds of overt surveillance camera systems police are using, what they are being used for, and the extent to which facial recognition technology is now being used. It should also tell us whether police forces are complying with the new Surveillance Camera Code as they should be. It will be very interesting to see how much things have changed since similar surveys were conducted in 2017 and 2019 by my predecessor in the role of Surveillance Camera Commissioner.
The government’s revised Surveillance Camera Code of Practice came into force in January this year and emphasises the importance of the legitimate use of technology ‘to a standard that maintains public trust and confidence’.
The government today revealed that it had scrapped plans to move several key functions of the Biometrics and Surveillance Camera Commissioner’s role into the hands of the powerful ICO data regulator.
The plan to move oversight of police use of biometrics appeared late last year in a major government consultation called Data: A New Direction.
Several respondents, including Professor Sampson, told the government that, for various reasons, it was a bad idea. See OBSCC formal response to Data: A New Direction.
Today, in its formal response to that consultation (see section 5.8), the government said:
In the light of this feedback and wider engagement, including with the current Biometrics and Surveillance Camera Commissioner and law enforcement partners, the government… has decided not to transfer these functions to the ICO…
Professor Sampson said:
It’s a sensible decision, as far as it goes. But the government’s response needs detail on what they plan to do now with these particularly important functions. I won’t be in a position to offer any meaningful observations until there are some specifics about what will come next in terms of providing strong, principled, and independent oversight in these important areas.
We now have an opportunity to come up with something really good, not only in relation to DNA and fingerprints, but also in relation to other existing and emerging biometric technology such as live facial recognition.
We are talking about technologies that, it seems to me and many others, are going to play larger and larger roles in all our lives. We need a way of keeping in step with fast-paced change in these areas in order to provide the public with the reassurance they need that this tech will be used lawfully, responsibly and according to a set of clear bright line principles that will ensure the circumstances of their use are dictated by what society agrees is acceptable and not just what technology makes possible.
The government response says it will explore whether the existing Investigatory Powers Commissioner can instead take on some of the Biometrics Commissioner’s functions.
Professor Sampson said:
As Biometrics Commissioner I independently oversee the use of investigatory powers involving biometric material, ensuring they are used in accordance with the law and in the public interest. This description is almost identical to that of the Investigatory Powers Commissioner, and it makes far more sense for any transfer to go in that direction – though that was not an option in the original consultation questions, and it is not yet clear who would take the 100+ decisions per month on National Security Determinations.
The government’s policy in this area continues to be one of ‘incremental steps’ to simplify regulation in this arena.
Professor Sampson continued:
If Parliament decides to move the functions, the next necessary step in simplification will be to have one definition of biometrics. At the moment ‘biometrics’ in policing only covers the traditional fingerprints and DNA while schools have a wider but less regulated definition. ‘Next generation biometrics’ such as facial recognition, iris, vascular patterns, hormones and gait are as much ‘biometrics’ as our fingerprints, and are – as we heard at the event in London on ‘Legitimacy of facial recognition in policing and law enforcement’ just this week - a matter of growing public concern.
In addition, almost all the capability in this area is privately owned, requiring our private sector technology partners to demonstrate that they can be trusted in respect of their security arrangements and their ethical values. Not only would this simplify things, it would also bring the UK into line with many other countries with whom we share biometrics for law enforcement and national security purposes.
The government’s consultation response also said it would seek to remove “duplication” between the ICO and Surveillance Camera Commissioner aspect of Professor Sampson’s dual role.
The Professor said:
The commissioner’s functions flow directly from the Secretary of State’s duty to publish a Code of Practice for the surveillance of public space. If that duty were to be transferred to the Information Commissioner, it would follow that responsibility for compliance might sit with them, although many of the public’s concerns about the expansion in state surveillance are not data protection issues at all.
The acid test for any framework for the police use of biometric and overt surveillance technology will be how far it allows us to know that their technical capabilities (what is possible) are only being used for legitimate, authorised purposes (what is permissible) and in a way that the affected community is prepared to support (what is acceptable).
From my perspective as Biometrics and Surveillance Camera Commissioner the issue is very simple. The use of biometric surveillance by the state is a matter of increasing sensitivity and significant public concern - not just here but globally.
As almost all of the technological capability for biometric surveillance is privately owned, the only way we will be able to harness the legitimate uses of that technology in the future is in trusted partnership with trusted private sector partners.
The people we trust - the police, fire and rescue, local authorities, the government itself - need to be able to trust their technology partners, both in terms of security and our ethical and professional values.
And the publicly available evidence tells me that some of these companies - notably Hikvision and Dahua - simply cannot be trusted, partly because of concerns about the role they and their technology are believed to have played in perpetuating the appalling treatment we’ve heard about here and also because of those companies’ absolute refusal to engage with even the most cursory level of public accountability in response to those concerns.
This is not about interfering in another country’s domestic affairs; this is about the legitimate expectations in ours.
If the people we trust can’t trust these companies, we have no business giving them public money and no defence to the obvious risks we’re creating by doing so. For those public bodies that have entered into surveillance contracts with these companies I’m very interested to understand how they conducted their due diligence and on what evidence they believe they have created a partnership that can be trusted by the communities they serve.
In his final report as HM Chief Inspector of Constabulary and Fire & Rescue Services, Sir Tom Winsor says: “What is needed is a material intensification of a partnership with the private sector that is soundly and enduringly based on trust and common interest.” Nowhere is that truer than in the context of biometric surveillance.
Face recognition technology is a contemporary development that combines surveillance cameras with biometric face recognition capability. Although the technology is still in its infancy, there are already calls for wider deployment, especially in relation to identifying known serious offenders. Inevitably there have been disputes about the effectiveness of the technology, its social acceptability and implications for civil liberties. These discourses point to a need for a public debate on the future of the technology and this event is designed to give members of the public an opportunity to question experts and have their say.
To get a better understanding of how facial recognition technology in policing and law enforcement is seen, my office, in conjunction with Professor William Webster (Centre for Research into Information, Surveillance and Privacy), has organised an event for expert witnesses to provide evidence on whether there is a legitimate role for facial recognition in policing and law enforcement. The event will take place before a live audience, and attendees will be able to ask questions to test the strength of the different arguments being put forward.
The event will be organised around a series of short expert statements, which will be debated with the audience and the other expert panellists. There will be lots of opportunities for audience participation, either through the Question and Answer session, social media, or the online polling that will take place during the event.
Experts who will be participating in the event include:
- Silkie Carlo, Director, Big Brother Watch (BBW)
- Jeremy Vaughan, Chief Constable, South Wales Police
- Isabelle Moeller, Chief Executive, Biometrics Institute
- Roger Baldwin, Advisory Council Member, Biometrics Institute
- Gary Pugh, Forensic Science Regulator
- Anne Russell, Group Manager, Information Commissioner’s Office (ICO)
- Dr Joe Purshouse, Senior Lecturer, University of Sheffield
- Professor William Webster, Director of CRISP, University of Stirling
Doors open at 5pm, with refreshments available. Expert witness statements start at 6pm with the event finished by 9:15pm.
Admission to the event is free. All audience members must register via Eventbrite to attend. Each audience member can reserve up to three tickets. Final joining details, including location, will be emailed to audience members a couple of days prior to the event.
The event will be recorded and will be available online. Audience members will be asked to consent to this recording as part of the ticketing process.
NEW Livestream option: Contact CRISP@stir.ac.uk if you would like to get access to the MSTeams livestream.
It has been a highly interesting first year, with not one but two public consultations being considered in parliament – first the long-awaited review of the Surveillance Camera Code of Practice, and second the Department for Culture, Media and Sport’s consultation Data: a new direction, which contained a couple of surprise questions right at the end!
Within my presentation, however, I decided to focus on four main topics. The first was that biometric surveillance is not just data protection. Here, I talked about the harrowing case of David Fuller, who was convicted of murder and of videoing his abuse of his victims. However, while his behaviour went against everything we see as right (as pointed out by the trial judge), it didn’t breach any data protection rules, because those rules only protect the dignity of the living. The same was true of the egregious conduct of police officers who took pictures of two murder victims. To me these dreadful cases highlight how data protection laws have not kept up with the potential intrusiveness of surveillance, and why ‘data compliance’ is not always the same as acceptable surveillance behaviour.
Secondly, I highlighted the direct link between the atrocities being committed in Xinjiang, China and the camps where Uyghurs have been detained and observed at every moment by CCTV surveillance. I have been closely following the public and parliamentary disquiet arising from this persecution. Many public authorities have bought and installed surveillance systems from the companies linked to the human rights abuses in those camps – largely, it seems to me, on the basis that the cameras offer “value for money”. I therefore asked the audience to consider this question: you wouldn’t employ an individual surveillance operator who had designed, built and worked in one of these appalling places, so why would you employ a company that designed, built and operated them? I cannot see how the procurement of these systems meets the human rights obligations of public bodies, nor how it would meet the National Decision Model for policing, which claims to put ethics at the heart of every decision.
Thirdly, I referred to a news story from last year titled “terminally ill dad arrested by 6 police officers for mooning at a speed camera”, and used it to highlight the relationship between the police and the citizen in relation to surveillance. In this story, the man in question was filmed by his wife and carer being wrestled to the ground and handcuffed by six police officers. The incident prompted the appearance of a ‘Banksy’ mural in the town and is a prime example of how we are moving from a world in which public bodies watch the citizen to one in which the citizen films everything – some image captures we may not welcome, and some we already rely on to do our jobs.
Finally, I drew the audience’s attention to an interactive doll that was produced for the English-speaking market. Equipped with speech recognition systems, AI-based learning features and operating as an IoT (internet of things) device this doll was designed to have a child at one end and the internet at the other and is thankfully no longer on the market! My warning based on the menace of mixing these technologies was simple: do not connect dolls to the internet…
I was joined by an abundance of great speakers throughout the day including police forces, local authorities and industry experts, as well as the Forensic Science Regulator Gary Pugh and a brief walk-on for stalwart supporter Tony Gleason from the PCMA. Furthermore, Professor William Webster, the lead for the Civil Engagement strand of the National Surveillance Camera Strategy (NSCS), did a tremendous job of leading the day and introducing all the speakers, alongside giving his own talk on The Governance of eMerging Tech: Issues for Surveillance Camera Provision.
For me, a personal highlight was getting back out and about and carrying out my own facial recognition, seeing so many people for the first time in real life. Delivering speeches from behind my computer screen was still the norm many months into my tenure as Commissioner, so I am pleased to be entering my next year in person, engaging with biometrics and surveillance colleagues.
It may sound like it came from a Christmas cracker but the question of how many commissioners are needed in the area of surveillance camera regulation is a key part of the government’s review of data reform and is an important one.
At the moment the law requires the appointment of a Surveillance Camera Commissioner to oversee the use of surveillance camera systems by police and local authorities in public space. The role is independent of government (or anyone else) and the Commissioner must publish an annual report which goes to parliament via the Home Secretary. My first report has just been published. The same law also requires there to be a commissioner to oversee the retention and use of biometric material (essentially DNA and fingerprints) by the police in the UK. While each statutory role is separate and has been held by different people in the past, I was appointed to cover both at the same time, reducing the number of commissioners and increasing the consistency of approach. Incidentally, I have just published my report as Biometrics Commissioner, which can be found here.
Annual Report Content
My surveillance camera report covers the period 2020-21, when my predecessor, Tony Porter, was in post, and focuses on key activity and developments in the fast-evolving area of surveillance. The report covers issues such as the use of facial recognition technology and the judgment of the Court of Appeal in the case brought against South Wales Police. It also covers activity that has been undertaken in furthering the National Surveillance Camera Strategy, including body worn devices, automated number plate recognition and the long-awaited revision of the Code of Practice.

The report sets out the Commissioner’s role and functions, which the government has proposed transferring to the Information Commissioner’s Office (ICO) along with those of the Biometrics Commissioner, and on which it is seeking views. My detailed response can be found here. For those who want to skip to the conclusion: I don’t think it’s wise or workable, and I set out the evidential basis for that view in the formal response.

In short, while there has always been a significant element of ‘data protection’ engaged in the area of camera surveillance, there are many respects in which the commissioner’s functions are wider than upholding the individual rights of the data subject. If people decide not to attend a public protest, for example, because they believe their photographs will be collected by the police, or fear their vehicle number plates will be stored for future monitoring, this would represent a significant impact on their fundamental rights in a mature democratic society but has little to do with data protection. Our data laws also have limits on which aspects of our lives they cover. For example, the hideous offending of a recently convicted murderer did not, so far as I can see, breach any “data protection principles”, even though he recorded his repellent abuse of human bodies in a hospital mortuary, because our data laws only protect the privacy and dignity of the living.
While no official ‘surveillance’ was involved, there have since been calls to introduce mandatory CCTV in hospital mortuary areas as a result – were this to happen the limits of the laws that protect the ethical use of intrusive surveillance beyond data compliance would need to be reviewed.
The Changing Context of Surveillance
The figures from the last decade show a huge increase in the presence of public cameras. When measured in cameras-to-people, London was recently ranked the 3rd most surveilled city on Earth (having an estimated 691,000 cameras for 9,425,622 people = 73.31 cameras per 1,000 population); in cameras per square mile, that’s 1,138.48 cameras making London the 2nd most surveilled city in the world. Add in mobile camera platforms such as drones and wearable devices and it gets more speculative - and when privately owned and operated cameras are factored in, it becomes anyone’s guess (literally). This increase in camera use has been matched by an increase in the public attention it has drawn. Although much harder to quantify, public concern still counts and while it does not lend itself to worries-per-1,000 parents or miles-by-anxious-motorist it is as real and present as the cameras in our streets, bus lanes, workplaces and schools.
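As a sanity check, the density figures quoted above follow from simple arithmetic. The camera count and population are as quoted in the text; the land area used below is implied by the per-square-mile figure rather than stated:

```python
cameras = 691_000
population = 9_425_622

# Cameras per 1,000 residents, as quoted in the text
per_1000 = cameras / population * 1000
print(round(per_1000, 2))  # 73.31

# The quoted density of 1,138.48 cameras per square mile implies a land
# area of roughly 607 square miles for the London figure used here
per_sq_mile = 1138.48
implied_area = cameras / per_sq_mile
print(round(implied_area))  # 607
```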
We are getting more used to surveillance and are installing our own personal systems more readily and cheaply than ever before. But when it’s done by the State with all its apparatus of enforcement some feel surveillance monitoring to be highly questionable - especially as the Government doesn’t yet follow its own Surveillance Camera Code; when expanded into schools and other areas of young people’s lives the surveillance sensitivities and risks are amplified. The law has long recognised the specific vulnerability of children and young people when you capture and retain their biometrics - in fact this is the reason behind the Protection of Freedoms Act that created my roles and functions almost 10 years ago. We need to be able to have confidence in the whole ecosystem of surveillance and be sure that what is technologically possible is only being done in a way that is both legally permissible and societally acceptable. That’s the ‘lightbulb’ that needs changing here. How many commissioners it will take is a matter of ongoing deliberation and I’d encourage answers via email to our team.
The camera that unbelievably mistook a woman's jumper for a car was a whimsical aside from the floor, but it raised some profound questions for the surveillance community at the conference. The programme for this return to 'old school' conferencing was energetic and interactive after more than a year of Teaming, Zooming and Webexing. It covered a range of more conventional topical subjects around AI and surveillance, with exhibitors and sponsors showcasing what can now be done with the state-of-the-surveillance-art.
The principal surveillance question that refuses to go away was, naturally, facial recognition (plainly not fitted to the traffic camera), and the conference coincided with significant public interest in the use of the technology in schools. I've said many times that, in my view, there's nothing intrinsically wrong with facial recognition technology – live, retrospective or otherwise – and in the context of law enforcement it has a legitimate role to play like any other tactical option. We've been using facial recognition for a while now in our private lives, and in other settings it offers some wonderful opportunities to enrich people's lives, from ensuring the safety of care home residents with dementia to helping visitors navigate museums and receive a commentary on the exhibit they're looking at. The exam question this week was the proper place for live facial recognition within a school setting. It's an interesting one, particularly as I have no direct responsibility for this aspect of intrusive surveillance, but the answer remains the same. As ever, it's the use and the user that are key, and we know from elsewhere in the world that this type of surveillance technology can be used in some very sinister ways under the banner of 'education'.
The conference considered the regulation of public space surveillance more widely. In England and Wales, if the police or local authorities want to use surveillance cameras in public space, they have a legal duty to follow the government's Code of Practice. That's because Parliament recognised the difference between your decision to sign up to new technology for your own personal convenience and the State's use of it without your express – or even implied – agreement. The Code says that if there's a less intrusive way of achieving the same objective, the police or local authority should use it. The same rule would apply in schools: if you can achieve largely the same result by using, say, a PIN or QR code, you should.
There are two real challenges with biometric surveillance technology. The first is that it's so useful that, once you install it for one function, you'll probably want to expand it into other obvious areas. This is why your mobile phone has become the digital Swiss Army knife. Organisationally, if you've accepted the case for facial recognition as an appropriate tool in one transactional area – say paying for school meals – there will be other equally compelling use cases involving the same considerations and the same arguments: paying for other things like school events or equipment, or the never-ending fundraising. It's difficult to resist those arguments if you've already accepted them once; it's even harder to resist them when you've invested good money in the technology and it can be expanded readily and cheaply. Of course, once you've installed the new system, you no longer need the old analogue ways of doing things – tills, cashiers, desk sergeants – and so you strip them out. This means there is no 'less intrusive' alternative, because you got rid of it, which (artificially) strengthens the case for moving to the new technology in any future setting. This combination of 'function creep' and self-fulfilling proportionality is not so much a problem in your own home, but it is a little more significant when it's your school or local authority or the police. Even in the school scenario there is an inequality of bargaining power in any consideration of whether to 'opt in' at the start. What does 'opting out' really look like? Is it a once-and-for-all choice, or will it come round again in light of experience with the new technology? You can reverse out of many consumer decisions if you change your mind; can you in this case? What are the implications for your opted-out child standing in the dinner queue with their peers?
It’s easy to see how tasks such as registration and attendance or safeguarding needs in schools would benefit from facial recognition capabilities but ‘function creep’ would mean you inevitably arrive at a point where the camera in the dining area can recognise who's playing Joseph in the nativity play. With the police your bargaining position may feel even less balanced. You also have to factor in the tech going wrong – sometimes in ways you hadn’t predicted.
The second challenge with biometric surveillance is that it’s so discussible. People tend to have a view on what’s acceptable in the use of intrusive technology for different purposes and when it’s done by the State with all its apparatus of enforcement some feel surveillance monitoring to be highly questionable - especially as the Government doesn’t yet follow its own Surveillance Camera Code; when children are involved the sensitivities and risks are amplified. The law has also recognised the specific vulnerability of children and young people when you capture and retain their biometrics – in fact this is the reason behind the Protection of Freedoms Act that created my roles and functions almost 10 years ago.
In a cautionary tale of how viral narratives can take some recovering from, even when disproven beyond all doubt, Professor Martin Innes told the Parable of the Zombie Raccoons and how fake-news panic had created not one but two social media frenzies in the US. His lessons for us on how a great story will not always be quieted by proven facts began to make me doubt the camera-and-jumper story.
But the facts from the last decade show a huge increase in public surveillance. When measured in cameras-to-people, London was recently ranked the 3rd most surveilled city on Earth (an estimated 691,000 cameras for 9,425,622 people, or 73.31 cameras per 1,000 population); at 1,138.48 cameras per square mile, it is the 2nd most surveilled city in the world. Add in mobile camera platforms such as drones and wearable devices and it gets more speculative – and when privately owned and operated cameras are factored in, it's anyone's guess (literally). This increase in surveillance has been matched by the increase in public attention it has drawn. Though much harder to quantify, public concern still counts. It doesn't lend itself to worries-per-1,000 parents or miles-by-anxious-motorist, but it is as real and present as the cameras in our streets, bus lanes, workplaces and schools.
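Those headline figures are easy to sanity-check. A minimal sketch, using only the estimated camera count, population and per-square-mile density quoted above (the implied land area of roughly 607 square miles is an inference, not a figure from the rankings themselves):

```python
# Back-of-the-envelope check of the quoted London surveillance figures.
cameras = 691_000        # estimated public cameras in London
population = 9_425_622   # estimated population
density = 1_138.48       # quoted cameras per square mile

# Cameras per 1,000 residents
per_thousand = cameras / population * 1_000
print(f"{per_thousand:.2f} cameras per 1,000 people")  # ≈ 73.31

# The quoted density implies a land area close to Greater London's
# roughly 607 square miles, so the two figures are mutually consistent.
implied_area = cameras / density
print(f"implied area: {implied_area:.0f} square miles")  # ≈ 607
```

The point of the check is simply that the two rankings rest on the same single camera estimate, divided by population in one case and by area in the other.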
Conference attendees also heard about the noise that doorbells are currently making and how their inbuilt cameras often point at more than just the owner's property. Last week's county court judgment against a householder who was being sued by his neighbour for interfering with her privacy prompted some further, highly topical surveillance discussion. The expansion in privately owned cameras has brought with it a corresponding increase, not just in the sharing of stories but in the sharing of images, with the police and other bodies, either of the citizen's own volition or in response to the now ubiquitous appeals for dashcam, GoPro, doorbell or other captures. When it needed a fallible human with a finite attention span to monitor and analyse it, all this newly enabled, aggregated surveillance data had limited practical use: there is simply too much of it. But advances in video analytics and in systems for combining, categorising and editing these datasets now allow very significant uses of this new surveillance capability. Again, that's not a bad thing in itself: every day there are significant investigations of heinous criminality that have been greatly helped by this pooled technology, and there will be many more ahead.
As the conference also heard, the rush towards omniveillance means shifting from on-premises solutions to ‘the cloud’ (a brilliantly conceived fluffy euphemism for putting your data and faith in someone else’s computer). This suggests a future where real-time biometric surveillance allows the State to crowdsource video data from companies and public services (like schools) adding in CCTV feeds and AI capabilities. These technological advances will coalesce to allow commercial businesses and householders to ‘plug’ their cameras into police and local authority networks offering the power for total public surveillance.
From the debates and presentations we are, it seems to me, already building a dependency on aggregated surveillance imagery in high-harm areas such as terrorism and serious organised crime – our public services' reliance on it may soon mean that CCTV forms part of the country's Critical National Infrastructure, or at least our critical local infrastructure. And like most of the established critical national infrastructure, it's largely in private ownership. When the Surveillance Camera Code of Practice was first published, the BSIA put the ratio of private to public cameras at 70:1; it is a reasonable hypothesis that this relative imbalance will have increased exponentially since. Security and intelligence journalist Philip Ingram led us through a fascinating presentation about the critical role that public space surveillance played in the investigation into the Novichok poisoning of UK citizens in Salisbury by Russian agents, and spoke of Chinese cameras being installed around the country with 'hidden' latent functionality such as the ability to read number plates (and, presumably, our clothing). Against this background, the risks highlighted by the Foreign Secretary over the weekend about dependency within parts of our critical national infrastructure begin to move closer to our high streets, our hospitals and our schools.
If we attain it – out of necessity or inadvertence – 'total public surveillance' will be delivered by the commercial sector operating under managed contracts with public services such as the police and local authorities, augmented by citizen-generated data feeds. But to what standards, and at what cost? The risks from cyber attack and other security-related issues are central to the professional manufacture, installation and operation of surveillance systems and are directly addressed in the Surveillance Camera Code. Companies are expected to conduct and report the results of technical penetration tests, but what about the ethics of operators and suppliers? How many carry out 'pen testing' of their ethical and corporate social responsibility arrangements at all, let alone with the same degree of transparency that they apply to their technical systems?
Closing with a panel session covering a range of challenging issues – technological, ethical and biometric – the conference left me better informed and wanting to know more. I'm very grateful for having been invited to take part in this excellent event, and I'm still smiling at the baffled motorist in Surrey who felt he'd been 'stitched up' by the camera that mistook a jumper for his number plate.