By: Jessica Farmer
Senior Associate Editor, American Journal of Trial Advocacy
Facial recognition software is the new trend in technology. Taylor Swift used the software at a concert to watch for stalkers. The technology was deployed to protect Swift, but attendees did not know that watching a video on a kiosk meant their faces were being scanned. Several companies have developed their own versions of facial recognition software and have marketed the technology to both the government and private companies.
One of these private companies, Blink Identity, has already teamed up with Ticketmaster and Live Nation to bring the technology to more concerts and events. Blink Identity wants to use facial recognition to give attendees user-friendly access to concerts and an interactive experience. The hope is that the software will reduce fraud and scalping and one day replace the need for tickets.
The facial recognition industry is expected to grow to $10 billion by 2023. Facial recognition software identifies a person from an image or video, and that identification can then be run against a database to find out more about the person. Use of the software by private businesses is growing, and Tokyo is working to bring it to its international airport. Stores use facial recognition to identify shoplifters, but the software also tracks shoppers’ buying habits, improving stores’ ability to market to customers.
On the governmental side, facial recognition technology has been in use longer than in private business. One agency has explained how it uses the technology. Investigators take photos from case files or retailers and upload them into the software, which pulls up matches within seconds and ranks the images by percentage matched; a match is never 100 percent. The investigator can then run the matched photos through databases to locate a person and make an arrest. It is important to note that the police department does not use the technology to establish probable cause; it uses the technology only to generate leads in an investigation.
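The ranking workflow described above, in which an uploaded photo returns candidate matches scored by percentage and never reaches a certain identification, can be sketched in simplified form. Everything below (the case identifiers, the embedding vectors, and the scoring method) is invented for illustration and does not reflect any actual agency system.

```python
import math

# Hypothetical gallery of enrolled face "embeddings" (feature vectors).
# Real systems derive such vectors from images with a neural network;
# these numbers are made up purely for illustration.
GALLERY = {
    "case_file_001": [0.12, 0.85, 0.40],
    "case_file_002": [0.90, 0.10, 0.35],
    "case_file_003": [0.15, 0.80, 0.45],
}

def cosine_similarity(a, b):
    """Similarity between two vectors, ranging from -1.0 to 1.0."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_matches(probe, gallery):
    """Return (case_id, percent_match) pairs, best match first.

    No score is ever exactly 100 percent -- the output is a ranked
    lead for investigators, not a definitive identification.
    """
    scored = [(case_id, cosine_similarity(probe, vec) * 100)
              for case_id, vec in gallery.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

probe_vector = [0.14, 0.82, 0.43]  # embedding of the uploaded photo
for case_id, pct in rank_matches(probe_vector, GALLERY):
    print(f"{case_id}: {pct:.1f}% match")
```

The key point the sketch illustrates is that the system outputs a ranked list of probabilities, which is why the article stresses that police treat a match as a lead rather than probable cause.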
A related concern is how government use of facial recognition software is regulated. Because the technology is so new to the market, oversight is limited, and few regulations or laws are in place to deter abuse when the government is involved. Government agencies face no restrictions on the databases they can search with facial recognition software, which can easily lead to abuse. Agencies have also refused to reveal to defense attorneys and privacy advocates how they use the technology. The downside is that police can make incorrect arrests because of errors in the technology; the positive is that facial recognition software has helped police arrest people who would otherwise have gotten away.
Multiple arrests have been made across the country using facial recognition technology that would not have happened otherwise. Examples include “a serial robber in Indiana, a rapist in Pennsylvania, a car thief in Maine, robbery suspects in South Carolina, a sock thief in New York City, and shoplifters in Washington County, Oregon.” Police departments value the technology because it produces investigative leads they never had access to before.
Despite the positives, a huge problem remains: facial recognition software is not accurate. That lack of accuracy led San Francisco to ban the use of facial recognition software by police and city agencies. The bill’s author, Aaron Peskin, stated, “[t]his is really about saying we can have security without being a security state.” The bill is supported by civil liberties groups concerned that facial recognition software is being used to track political activists at rallies and protests. In addition, facial recognition software has not reached even a 50% accuracy rate, so the potential for an increase in wrongful arrests is a major concern to American citizens.
The errors that facial recognition technology still makes in recognizing faces are especially detrimental to women and people of color. The Gender Shades study reported “[there is] a 34 percent error rate in identifying darker-skinned women,” and an MIT study backed this up with an “error rate of more than 31 percent when identifying darker-toned women.” The technology could easily be abused while its accuracy for women and minorities remains this low. Government agencies would need to improve this aspect of the technology to produce more accurate investigations and to ease the concerns of civil rights activists.
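The disparity these studies describe comes from disaggregating error rates by demographic group rather than reporting a single overall accuracy figure. A minimal sketch of that computation follows; the group labels and results are invented and do not reproduce the Gender Shades or MIT data.

```python
from collections import defaultdict

# Invented evaluation records: (demographic_group, correctly_identified).
# These values are illustrative only, not any study's actual data.
records = [
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True), ("lighter-skinned men", False),
    ("darker-skinned women", False), ("darker-skinned women", True),
    ("darker-skinned women", False), ("darker-skinned women", True),
]

def error_rates_by_group(results):
    """Percent of misidentifications within each demographic group."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {group: 100 * errors[group] / totals[group] for group in totals}

for group, rate in error_rates_by_group(records).items():
    print(f"{group}: {rate:.0f}% error rate")
```

In this toy data the overall error rate is 37.5 percent, but splitting it by group shows one group failing at twice the rate of the other, which is exactly the kind of disparity a single headline accuracy number hides.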
Another concern is that widespread use of facial recognition software could infringe on privacy because the software captures images of people’s faces without their permission or knowledge. Facial recognition also causes alarm because it could be turned into a surveillance system aimed at the public, dropping America into the world of Orwell’s 1984. Even though companies like Blink Identity want to introduce the technology in a privacy-protective way, the definition of privacy is left up to the company.
Where does consent fit when facial recognition technology is in use? A bill has been introduced in the United States Senate to prevent companies from using facial recognition software without a person’s consent. Facial recognition software is a useful invention for society, but it needs regulation to protect citizens’ privacy. Faces are more personal to consumers than other information, so companies need to ask permission before using consumers’ faces.
To make the technology more accurate, companies need to feed as many pictures as possible into facial recognition software. The problem is that companies are not asking permission before downloading pictures from the internet to use in the software. For example, IBM downloaded photos to run through facial recognition software to study the technology’s inaccuracy but did not ask permission to use the photos for the project. IBM also made it difficult for consumers and photographers to have their photos removed from the project.
In conclusion, the government and private businesses need more regulation when employing facial recognition technology. Consumers need to give permission to government agencies and businesses to use their faces in improving the technology. Additionally, the government needs to put in place protective safeguards for using facial recognition software to prevent the technology from being inappropriately used as evidence. The technology can be useful in aiding an investigation, but it should not be the sole source used in making an arrest. Consumers should retain the right to give permission for their faces to be used in the software no matter who the end user is.
 See Scott Carlson, Facial Recognition Scanning Goes Mainstream, A.B.A. J. (February 22, 2019, 7:00 AM), http://www.abajournal.com/web/article/facial-recognition-scanning-mainstream.
 See Carlson, supra note 1.
 See Jon Schuppe, How Facial Recognition Became a Routine Policing Tool in America, NBC News (May 11, 2019, 3:19 AM), https://www.nbcnews.com/news/us-news/how-facial-recognition-became-a-routine-policing-tool-in-america-n1004251.
 See Schuppe, supra note 12.
 See Schuppe, supra note 12.
 See Debra Cassens Weiss, San Francisco Supervisors Vote to Ban Facial Recognition Technology by City Agencies, A.B.A. J. (May 15, 2019, 9:20 AM), http://www.abajournal.com/news/article/san-francisco-supervisors-vote-to-ban-facial-recognition-surveillance-technology-in-the-city.
 See Carlson, supra note 1.
 Id.; George Orwell, 1984 (1949).
 See Carlson, supra note 1.
 Makena Kelly, New Facial Recognition Bill Would Require Consent Before Companies Could Share Data, The Verge (Mar. 14, 2019, 5:40 PM), https://www.theverge.com/2019/3/14/18266249/facial-recognition-bill-data-share-consent-senate-commercial-facial-recognition-privacy-act.
 See Olivia Solon, Facial Recognition’s ‘Dirty Little Secret’: Millions of Online Photos Scraped Without Consent, NBC News (Mar. 12, 2019), https://www.nbcnews.com/tech/internet/facial-recognition-s-dirty-little-secret-millions-online-photos-scraped-n981921.
 See Charlotte Jee, People’s Online Photos Are Being Used Without Consent to Train Facial Recognition AI, MIT Tech. Rev. (Mar. 13, 2019), https://www.technologyreview.com/f/613118/peoples-online-photos-are-being-used-without-consent-to-train-face-recognition/.
 See Solon, supra note 41.