Regulars to my blog will know that my regulatory role in the context of surveillance camera systems extends to the overt use of automated facial recognition (AFR) technology by the police and local authorities.
Over the past few years South Wales Police and the Metropolitan Police Service (MPS) have been trialling the use of AFR, not without some controversy. The legitimacy of these trials is being challenged in the courts by the civil liberties groups Big Brother Watch and Liberty. More trials are planned, and it’s important they are conducted in accordance with the Surveillance Camera Code of Practice, to which relevant authorities (police forces and local authorities) must have due regard when operating surveillance camera systems overtly in public places.
Detective Superintendent Bernie Galopin of the MPS has stated in a recent press release that: “The Met is currently developing the use of live facial recognition technology and we have committed to ten trials during the coming months. At the end of the year, there will be a full, independent evaluation.”
My team and I have been supporting police forces in their legitimate use of AFR. I believe it has the potential to be an effective policing tool when used lawfully and ethically, helping to identify persons of interest in large crowds and assisting in the search for vulnerable missing persons. Equally, it has the potential to encroach on citizens’ privacy rights as enshrined in Article 8 of the European Convention on Human Rights, and indeed on a number of other fundamental freedoms. There are clearly both ethical and regulatory considerations at play here, and a balance to be struck between keeping citizens safe and not illegitimately intruding on their freedoms.
You may have seen that last week the London Policing Ethics Panel published its first interim report on the MPS use of live facial recognition. This is an important matter for me, and the initial recommendations made are ones which I fully support.
Elsewhere, the Home Office’s recently published Biometrics Strategy has led to the formation of a strategy board through which regulators and the Home Office will develop the use and governance of AFR by law enforcement agencies. This is good news, but there is still more work to be done. Greater emphasis needs to be placed on civic engagement to energise debate and engender public trust.
However, the use of AFR goes beyond law enforcement agencies, and there are growing concerns in society about how this technology may be used in other situations by private companies, such as shopping centres and pubs.
Will the technology be used to identify known shoplifters or barred patrons? How is a ‘watch-list’ compiled, and what happens to the data collected? Is there any human intervention before decisions are made that adversely affect people? Who regulates the standards of the equipment used? Could the technology be biased, or used disproportionately? Ultimately, what safeguards are in place to balance the need for security against our basic right to privacy?
Clearly the retail and business communities are not regulated and accountable to the same extent that agents of the State are when conducting surveillance of citizens, yet the intrusion caused by these systems is arguably equal in terms of its impact on people’s privacy.
So, there needs to be accountability and governance, and any use of AFR needs to be justified, necessary and proportionate. The technology is still in its infancy, but its capabilities are rapidly evolving. What can we do now to prepare for tomorrow?