If you are a regular reader of my blog you will know that I began writing a series of blogs in February looking back over the six years that I’ve been Commissioner. That was because my commission was coming to an end in March. Well, it’s April and I’m still here: my commission was extended to June at the eleventh hour, and a new person will be taking up the reins when I depart. I started this series of blogs when there was a large degree of uncertainty about what was happening for me both professionally and personally. I could not have envisaged what was to unfold in the weeks that followed.
Surveillance and COVID-19
We are in the midst of a pandemic the like of which none of us has ever seen. Around 4 billion people around the world are on lockdown in an effort to limit the spread of the COVID-19 virus, which has claimed the lives of over 160,000 people.
We have seen surveillance come to the forefront of how nations enforce social distancing measures and monitor the spread of the virus in an effort to save lives: mobile data tracking apps that record personal contact with others, CCTV networks equipped with facial recognition, permission schemes to go outside and drones to enforce social isolation regimes. If new systems (such as data tracking apps) are to be introduced they must surely pass these most basic tests: provide accurate information about infection or immunity, have clear technical capabilities to support that requirement, and not infringe unfairly on human rights. Are these not similar criteria to those used to justify the use of facial recognition technology? Are we not coming across the same issues, just via different surveillance platforms?
In more authoritarian nations than ours facial recognition is tracking people who should be at home and thermal imaging cameras on public transport are assessing individuals’ temperatures to identify those who might have the virus so the authorities can intervene.
That sort of thing doesn’t happen here though, does it? Well, to my knowledge it hasn’t. My office has had some enquiries from police forces about using drones and automatic number plate recognition to enforce the social distancing measures put in place by the Government. We have also had some organisations ask about using thermal imaging to identify people with high temperatures entering their buildings. I must say that first and foremost these conversations have focussed on how we protect the public and keep people safe whilst protecting their fundamental human rights and freedoms, not on the introduction of mass surveillance without oversight.
Apply the code
The police and local authorities must comply with the 12 guiding principles in the Surveillance Camera Code of Practice. Before deploying any surveillance camera system they must establish a legitimate aim and a pressing need. That is principle one of the Code, and it states that a legitimate aim might include the protection of health. Once a legitimate aim and pressing need are established, the organisation must comply with all the principles in the Code, which cover a range of areas such as civil engagement, data protection, security measures, standards and so on. If the tests outlined above cannot be met, how can an organisation justify the legitimate aim and pressing need objective?
So what? you might be thinking. The “what” is this: if the Code is followed, video surveillance systems will only be used for a specified purpose, proportionately and effectively. For example, in the case of thermal imaging it might be proportionate in the unique times we are in to use this technology as a condition of entry to a building where individuals have given consent, whereas mass use without individuals’ knowledge seems disproportionate and would require much stronger justification. Equally, the technology needs to be accurate enough to identify those you want it to, i.e. those with raised temperatures, so the ‘kit’ would need to meet some minimum technical requirements. Lastly, people can have higher than normal temperatures for a host of reasons, and how that is managed would have to be factored in. There are a number of considerations and processes to go through, such as carrying out a data protection impact assessment and using my self-assessment tool; following the Code should help organisations deal with some of these issues. In short, if this technology is not good enough to ensure public trust and support, it probably shouldn’t be used.
Proper guidance, oversight and accountability
But as regular readers will know, the Code is now almost seven years old. It was issued when the iPhone 5 was all the rage; we are now on the cusp of the iPhone 12. It is nearly two years since the Home Office announced in its Biometrics Strategy that the Code would be reviewed and updated, and there is still no news as to when that updated Code will be issued. So you have police forces, local authorities and others deploying technology that wasn’t available in 2013 but working to a Code that hasn’t moved with the pace of technology.
My role is also advisory. I provide advice to those who operate surveillance camera systems. I do not have powers of inspection like my colleague the Investigatory Powers Commissioner, nor am I able to enforce sanctions like the Information Commissioner. So I have no way of ensuring compliance myself by means of inspection, or of rectifying non-compliance by way of sanction. In 2013 that was good policy, but things have moved on significantly. Facial recognition is arguably as intrusive as some covert surveillance techniques, yet other than complying with the Code and with data protection legislation there is not the same independent oversight and accountability as with covert surveillance: specifically, authorisation and inspection. Where particularly intrusive overt surveillance is deployed, should it be subject to the same scrutiny afforded to covert surveillance? It is all surveillance after all, and surveillance is an investigatory power, not simply a data protection issue.
Whilst we may see authorities using surveillance cameras to deal with infractions of social distancing or to identify those who may have COVID-19, this must be done within the boundaries of the law.
I see an emerging argument for primary legislation to regulate surveillance by tracking and digital contact tracing applications. Such legislation should impose clear parameters, access controls and time limitations. Some commentators are quite rightly raising alarm bells about some of the surveillance techniques being deployed or considered. If an organisation has deployed what is deemed overt technology in line with the requirements of the Code and other legislation, on what basis is it held to independent account? Some argue that there is none. I argue that it is complex, but that a more robust form of oversight for all forms of overt surveillance is justified.
That is why, as I exit office, I continue my call for Government to commission an independent review of the laws governing overt state surveillance, in a manner similar to that conducted by Sir David Anderson QC in “A Question of Trust”. I would propose that all manner of state surveillance regulation be placed within a single regulatory body with judicial leadership (essentially moving my function, or parts of it, under the leadership of IPCO, or aligning it more closely with that body).
When life does get back to normal, how do we ensure that more intrusive surveillance measures don’t linger? How do we ensure oversight and accountability of those that may linger on, or that were in use before March 2020? What will we have learnt, and how will we translate that into good policy that supports organisations deploying cameras and, most importantly, supports the communities those cameras are watching? Is the answer independent oversight and authorisation of the use of particularly intrusive overt surveillance camera technology? A comprehensive review of overt state surveillance could help guide us on these matters.