How Has Big Tech Influenced Modern Policing?
Fresh issues surrounding facial recognition technology and litigation continue to make headlines following Congress's proposal last June to regulate facial recognition tech.
The bill never made it to a vote. In response, tech industry heavyweights imposed their own restrictions on sales of facial recognition technology to law enforcement until Congress enacts adequate regulation.
A Booming Market
The facial recognition tech market is growing rapidly; a recent market research report suggests the industry will reach a value of $7 billion in the United States by 2024. Facial recognition technology uses biometrics to map individual facial features from videos and security photos, creating a mathematical representation that is then compared against a database of known faces to identify people.
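The matching step described above can be illustrated with a minimal sketch. This is not how any particular vendor's system works: the four-number "faceprints", the names, and the 0.95 threshold are all illustrative assumptions. Real systems extract high-dimensional embeddings with deep neural networks, but the final comparison is conceptually similar, scoring a probe vector against each enrolled vector and accepting the best match only if it clears a confidence threshold.

```python
import math

# Hypothetical pre-computed "faceprints": fixed-length feature vectors a
# recognition system might extract from enrollment photos. Real embeddings
# have hundreds of dimensions; four are used here for readability.
KNOWN_FACES = {
    "alice": [0.12, 0.80, 0.31, 0.45],
    "bob":   [0.90, 0.10, 0.55, 0.20],
}

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.95):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_name, best_score = None, -1.0
    for name, vector in database.items():
        score = cosine_similarity(probe, vector)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

probe = [0.13, 0.79, 0.30, 0.46]   # feature vector from a new photo
print(identify(probe, KNOWN_FACES))  # prints "alice"
```

The threshold choice matters: set it too low and the system confidently "identifies" people it has never seen, which is exactly the failure mode behind the misidentification findings discussed below.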
A Flawed System Requiring Government Regulation
One year on, Congress has yet to regulate facial recognition technology, which, civil liberties groups argue, erodes public privacy and reinforces racial biases used to marginalize ethnic minority communities. Just this year, Amnesty International launched a global campaign calling on police and all other government agencies to ban the use, development, production, and sale of facial recognition technologies for mass surveillance.
Law enforcement's use of advanced technology such as facial recognition software to identify suspects is problematic because facial analysis systems remain deeply inaccurate. Research from the Massachusetts Institute of Technology (MIT) into the performance of three commercial gender-classification systems found error rates of up to 34% for dark-skinned women, nearly 49 times the rate for white men.
Amazon's own facial surveillance tool, "Rekognition", has drawn direct backlash after an ACLU test found the software incorrectly matched 28 members of Congress with mugshots.
Algorithmic bias in facial surveillance is a dangerous use of technology, which is why people are pressuring the government to limit how and when such technology can be used. Notable researchers such as Joy Buolamwini, founder of the Algorithmic Justice League, have made waves in this area and are featured in the Netflix documentary Coded Bias.
Last June, IBM announced it would no longer offer its facial recognition software for "mass surveillance or racial profiling". This was followed by Amazon implementing a one-year ban on police use of its facial recognition technology; the suspension of its Rekognition software was accompanied by a statement expressing the hope that "this one-year moratorium might give Congress enough time to implement appropriate rules."
In the days following Amazon's action, Microsoft also announced it would stop selling its facial recognition software to police and government departments until federal law is implemented to regulate its use.
Helping or Hindering?
Weeks later, Senator Edward Markey, Senator Jeff Merkley, Congresswoman Pramila Jayapal, and Congresswoman Ayanna Pressley introduced the Facial Recognition and Biometric Technology Moratorium Act, which sought to prevent federal use of biometric surveillance systems.
With concern over privacy and racial bias issues surrounding facial recognition software, its use in states and cities across America is being brought into question.
The deputy director of the activist group Fight for the Future believes that facial recognition technology has no place in society:
“It’s deeply invasive, and from our perspective, the potential harm to society and human liberties far outweigh the potential benefits.”
As it currently stands, tech companies that manufacture facial recognition technology wield near total control over who gets to use it and how.
Is your startup pioneering new AI technologies? Check out Floww's startup resources to help you find funding.
Header Image via Yahoo! Finance from Elaine Thompson / Associated Press.