
Move Over China, London Is Getting in on Facial Recognition Cameras

Publish date: March 11, 2020
Facial recognition technology has been rolled out across London by the city’s Metropolitan Police Service. The cameras are situated in key areas of the city and will scan the faces of passers-by, notifying police of any matches with wanted criminals. It is a surprising move, considering that the European Union wants to temporarily ban this technology for up to five years in order to safeguard individual rights.

According to the Met, which serves more than eight million people across 32 boroughs throughout London, the deployment “will be intelligence-led and deployed to specific locations in London. This will help tackle serious crime, including serious violence, gun and knife crime, child sexual exploitation and help protect the vulnerable.”

Posting on social media, the Met went on to assure the public that any images which did not trigger an alert would be deleted immediately. On top of that, it remains solely up to officers whether or not to stop someone based on a facial recognition match.

The Met insists that this is not an attempt to replace traditional policing with technology, but rather another tool in its wider effort to keep Londoners safe.

Assistant Commissioner Nick Ephgrave said: “This is an important development for the Met and one which is vital in assisting us in bearing down on violence. As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard. Prior to deployment, we will be engaging with our partners and communities at a local level.

“We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point. Similar technology is already widely used across the UK, in the private sector. Ours has been trialed by our technology teams for use in an operational policing environment.

“Every day, our police officers are briefed about suspects they should look out for; LFR improves the effectiveness of this tactic. Similarly, if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this.”

Facial Recognition Cameras in London

The facial recognition cameras in London are made by the Japanese company NEC and will be focused on small, targeted areas in order to scan passers-by. It is a standalone system, with the cameras clearly signposted.

However, not everyone is on board with this decision, with many people arguing that facial recognition technology is not only ineffectual but in some cases unlawful. Just last year, a report from the University of Essex revealed that the Met’s facial recognition trials had an inaccuracy rate of 81 percent, while a similar deployment in South Wales flagged as many as 2,300 innocent people as potential criminals over the past year.

The move has been met with instant criticism from civil liberties NGO Liberty, with advocacy director Clare Collier stating,

“This is a dangerous, oppressive and completely unjustified move by the Met. Facial recognition technology gives the State unprecedented power to track and monitor any one of us, destroying our privacy and our free expression.

Rolling out a mass surveillance tool that has been rejected by democracies and embraced by oppressive regimes is a dangerous and sinister step. It pushes us towards a surveillance state in which our freedom to live our lives free from State interference no longer exists.”

The decision to implement facial recognition cameras went ahead even after an October investigation into the effects of live facial recognition by the UK’s Information Commissioner’s Office.

“Moving too quickly to deploy technologies that can be overly invasive in people’s lawful daily lives risks damaging trust not only in the technology but in the fundamental model of policing by consent,” Elizabeth Denham, the UK’s Information Commissioner, said at the time.

This is an issue that has also been documented by federal researchers in the United States, where some cities have banned the use of facial recognition cameras.

Asian, Black and Native American people were the most likely to be misidentified, according to the National Institute of Standards and Technology, with another report finding that Black women are more likely to be falsely identified than any other group.

“Even government scientists are now confirming that this surveillance technology is flawed and biased,” said Jay Stanley, a senior policy analyst at the American Civil Liberties Union. “One false match can lead to missed flights, lengthy interrogations, watchlist placements, tense police encounters, false arrests, or worse.”

Facial Recognition Technology Pros

One of the biggest advantages of this type of technology is security and safety. Law enforcement agencies often use this technology to locate fugitives, serious criminals, missing children as well as seniors. For example, there was an incident just last year in New York where police were able to detain an accused rapist by using facial recognition technology.

The apprehension was made within 24 hours of the suspect threatening a woman with rape at knifepoint. Facial recognition technology is also used by business owners who work and live in cities where police don’t have time to fight petty crime.

Airports are also increasingly using facial recognition technology at security checkpoints, and it is predicted that it will be used on 97 percent of travelers in the United States by 2023. A key advantage of facial recognition is that, unlike fingerprinting, no physical contact is required. It offers a seamless verification experience, and there is no token such as a key or ID card that can be stolen or lost.

Facial recognition technology is used on social media platforms for tagging friends and family. It is also used to complete payments or authorize app downloads on Google and Apple devices.

Although it isn’t impossible, fooling facial recognition is fairly difficult, meaning that it can also be used as a tool for preventing fraud.

Facial Recognition Technology Cons

One of the biggest concerns about facial recognition technology is its implications for data privacy. Facial recognition data is usually stored on servers that are generally accessible via the cloud, and like any other computer system, it is vulnerable to hackers. Although these systems are progressively getting better at preventing identity theft, if a data breach occurs, the information can easily fall into the wrong hands.

Companies using such technology can also hand over your data for research purposes, without your consent, in order to turn a profit.

There is also the above-mentioned problem of racial bias, with reports suggesting that these tools aren’t as effective at identifying people of color and women.

“Facial recognition software learns from the data that it is trained with, and because of the profiles of the people developing this technology, there is a bias towards using Caucasian male subjects. This leads to low identification rates of females and anyone with darker skin, with one trial misidentifying darker-skinned women for men 31% of the time.

The US Government Accountability Office, in March 2017, found that these technologies were 15% less accurate for women and ethnic minorities. When this technology is being used in high stakes scenarios, such as identifying criminal suspects, it is not difficult to see how this could perpetuate an existing racial bias within the law.”
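
The disparity described above ultimately comes down to error rates that differ between demographic groups. Purely as an illustration (the group labels and numbers below are invented, not taken from any real deployment or from the studies cited in this article), the following Python sketch shows roughly how per-group false-alert rates might be tallied when auditing such a system:

```python
from collections import defaultdict

# Purely illustrative, invented records: (demographic group, whether the
# system's alert turned out to be a genuine match).
alerts = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)        # alerts raised per group
false_alerts = defaultdict(int)  # alerts that were wrong per group

for group, is_true_match in alerts:
    totals[group] += 1
    if not is_true_match:
        false_alerts[group] += 1

# If these rates diverge sharply between groups, the system exhibits the
# kind of bias the researchers quoted above describe.
for group in sorted(totals):
    rate = false_alerts[group] / totals[group]
    print(f"{group}: false-alert rate {rate:.0%} ({false_alerts[group]}/{totals[group]})")
```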

Some advocacy groups maintain that one of the disadvantages of facial recognition technology lies in its unreliability. Many factors can throw off these systems, including low illumination, poor image or video quality, and slight changes in camera angle or personal appearance, any of which can lead to errors such as false positives.

“Campaigners, including Liberty UK and Big Brother Watch, have stressed that mass surveillance of innocent people in public violates three articles of the Human Rights Convention: Article 10, the Right to Freedom of Expression; Article 11, the Right to Freedom of Assembly and Association; and Article 8, the Right to a Private Life.

There is a real worry that the indiscriminate use of facial recognition technology in the public realm stifles non-conformist modes of appearance and expression. Facial recognition technologies and their use have normalized pervasive surveillance practices in public spaces and, in doing so, have undermined several inalienable rights.”

Silkie Carlo, the director of the London-based privacy campaign group Big Brother Watch, also voiced her concerns.

“This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the U.K. This is a breath-taking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary,” Carlo added. “This move instantly stains the new government’s human rights record and we urge an immediate reconsideration.”

China is waging its own war on terror, but rather than relying on strikes by elite military units, its targets are domestic minority populations seen as threatening the government’s authoritarian rule.

China has more facial recognition cameras than any other country in the world, and they are increasingly hard to avoid. A facial recognition company was contracted by the Chinese police to monitor the GPS coordinates of almost 2.6 million people in the Xinjiang region, which is under lockdown.

According to the Guardian,

‘Many of the detainees had been arrested for having supposedly committed religious and political transgressions through social media apps on their smartphones, which Uighurs are required to produce at checkpoints around Xinjiang. Although there was often no real evidence of a crime according to any legal standard, the digital footprint of unauthorized Islamic practice, or even a connection to someone who had committed one of these vague violations, was enough to land Uighurs in a detention center. The mere fact of having a family member abroad, or of traveling outside China, as Alim had, often resulted in detention.

Most Uighurs in the detention centers are on their way to serving long prison sentences, or to indefinite captivity in a growing network of internment camps, which the Chinese state has described as facilities for “transformation through education”.’

The technology has been used in northwestern China in what the government calls a fight against Islamic radicalization and Uighur separatist movements. More than a million Muslims have been detained in camps across the region over the past few years. In the regional capital, Urumqi, there are checkpoints with facial scanners at hotels, gas stations, and even banks. Information such as names, addresses, ID numbers, photos, and employer details was attached to logs of location data.

“These camps, which function as medium-security prisons and, in some cases, forced-labor factories, attempt to train Uighurs to disavow their Islamic identity and embrace the secular principles of the Chinese state. They forbid the use of the Uighur language and instead offer drills in Mandarin, the language of China’s Han majority. Only a handful of detainees who are not Chinese citizens have been fully released from this “re-education” system.”

Conclusion

London has recently been introduced to facial recognition technology by the Met. Although there are some positive sides to the software, such as protecting the public, there are also many arguments as to why it shouldn’t be used. This is why the European Union is considering banning the use of this technology, at least temporarily.

As seen in the example of how facial recognition cameras are used in China, there are some uses of this technology that are not in the best interest of the public. Ultimately, the decision should not rest with those who want to use facial recognition technology, but with a body that represents the public interest, such as Parliament.

As long as there is no statutory law containing a clear and binding code of practice, London is not yet ready for mass deployment of facial recognition technology. It is up to the government to resist the temptation to undermine civil liberties in the name of safety and security.
