Police are using live facial recognition (LFR) to scan the faces of people attending the British Grand Prix at Silverstone this weekend.

Northamptonshire police were deploying the technology on Saturday and Sunday to provide “an extra layer of security” at the Formula One race, which 450,000 people were expected to attend, the force said.

Its use at Silverstone marks the first time LFR has been deployed by a force outside south Wales and London.

The government has faced calls to ban police from using LFR in all public spaces after a study of its deployment by those two forces concluded last autumn that they had failed to incorporate “many of the known practices for the safe and ethical use of large-scale data systems”.

Northamptonshire police said LFR, which scans faces with a camera to match biometrics against those held on a police-generated watchlist, would be used for the “sole purpose of fighting crime and protecting people” attending the event, and that locations where it was in operation would be clearly marked with signs.

The force said the watchlist included suspects wanted for offences or with an outstanding arrest warrant, those who posed a risk of harm to themselves or others, and vulnerable missing people.

Any images that triggered an alert would be deleted after use, or within 24 hours at the latest, while the images and biometric data of people who did not trigger an alert would be automatically deleted from the system, the police said.

Authoritarian regimes including China have used LFR as part of their suite of repressive tools, and critics have expressed concerns that the technology is racially biased. In 2018, a researcher at the Massachusetts Institute of Technology’s Media Lab in the US concluded that software supplied by three companies made mistakes in 21% to 35% of cases involving darker-skinned women. By contrast, the error rate for lighter-skinned men was less than 1%.

In a ruling in 2020, a court found that South Wales police had failed to properly investigate whether the software exhibited any race or gender bias.

However, police say there has been a “substantial improvement” in the accuracy of the technology, with research commissioned by the Met suggesting the rate of false matches was now roughly one in 6,000 people scanned.

The human rights groups Liberty, Big Brother Watch and Amnesty have all described the technology as “oppressive”. Prof Pete Fussey, a surveillance expert at the University of Essex who was hired by the Met to audit its previous LFR trials, has described it as “a powerful and intrusive technology that has real implications for the rights of individuals”. In May, he said “the Orwellian concerns of people, the ability of the state to watch every move, is very real”.

Earlier this month, the European court of human rights described the technology as highly intrusive, and said in a ruling that using facial recognition technology to identify and arrest participants of peaceful protests could have “a chilling effect in regard of the rights to freedom of expression and assembly”.

At last year’s event at Silverstone, five Just Stop Oil protesters stormed the Wellington Straight, the fastest point of the Northamptonshire track, and sat down during the opening lap.

The protest group recently disrupted play at Wimbledon and has targeted other high-profile sports events, including the Ashes and the Premiership rugby final.

In May, when South Wales police used LFR to monitor people during a Beyoncé concert in Cardiff, critics expressed concerns that the technology could modify people’s behaviour and affect democratic processes, such as protests.

Det Supt Richard Tompkins, of Northamptonshire police, said: “We have a robust policing plan in place which will see a large police presence in and around the circuit and wider venue as part of a multilayered security operation, including the use of ANPR [automatic number plate recognition] and LFR technology.

“Our priority will always be to protect the public while relentlessly pursuing those people who are determined to cause harm in our communities, and it is therefore important we embrace and use new technology to help us achieve this.”
