<h1>Can You See Me Now? Facial Surveillance Is a Civil Liberties Issue</h1>
<p>November 13, 2019</p>
<p>Technology frequently progresses faster than legal institutions can keep up. Facial surveillance – the use by police and other entities of technology that can recognize people and identify them by their faces – is one such area. Facial recognition technologies are, generally speaking, not far removed from what unlocks newer versions of the <a href="https://www.lifewire.com/face-id-4151714">iPhone</a>. In the context of government and law enforcement use, facial recognition or surveillance <a href="https://www.aclu.org/issues/privacy-technology/surveillance-technologies/face-recognition-technology">refers to</a> practices like capturing information about individuals' whereabouts and activities without their consent, and often without their knowledge. Police departments and other law enforcement bodies <a href="http://www.tomdispatch.com/post/176142/tomgram%3A_harwood_and_stanley%2C_policing_the_dystopia/">claim</a> that these emerging technologies will help departments engage more effectively with communities and do their work better.</p>
<p>However, despite the comforting rhetoric, the <a href="http://www.tomdispatch.com/post/176142/tomgram%3A_harwood_and_stanley%2C_policing_the_dystopia/">use of facial recognition technology for law enforcement purposes</a> amounts to a significant infringement of people's privacy through nonconsensual surveillance. This unprecedented government power to track people going about their daily business, without cause, is "<a href="https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html">incompatible with a healthy democracy</a>." Much like other tools used by law enforcement, these technologies, if left unchecked, will harm rather than help communities. In particular, they will reinforce the crisis of racial bias in the criminal legal system.</p>
<p>There are a number of problems with these technologies. The primary concern, shared by the <a href="https://www.aclu.org/issues/privacy-technology/surveillance-technologies/face-recognition-technology">ACLU</a> and other activists, is that they enable "general, suspicionless surveillance systems." Such systems would allow a government body not only to track and monitor people's activities, but to act on that information to quell dissent and otherwise keep tabs on political dissidents. The exact implications of facial recognition systems are unknown, especially given how secretive federal agencies are about their programs.</p>
<p>Furthermore, it is unclear how the government could adequately protect the data it gathers from breaches and hackers – we have already seen hackers steal an enormous trove of travelers' photos after breaching <a href="https://www.theatlantic.com/technology/archive/2019/06/travelers-images-stolen-attack-cbp/591403/">Customs and Border Protection's</a> security systems. Facial recognition and surveillance technology would be concerning even if it operated effectively – but there is mounting evidence that it operates far from perfectly. The ACLU of Massachusetts recently <a href="https://www.aclum.org/en/news/facial-recognition-technology-falsely-identifies-famous-athletes">ran a test</a> using Amazon's Rekognition, a widely available facial recognition platform, and found that the software incorrectly matched 27 well-known Massachusetts professional athletes to images of other individuals in a mugshot database. The national ACLU ran a similar test in 2018 using the same Amazon system, which falsely matched <a href="https://www.nytimes.com/2018/07/26/technology/amazon-aclu-facial-recognition-congress.html">28 members of Congress</a> to images from a mugshot database. These false matches fell disproportionately on African American and Latinx members of Congress. Across platforms, these algorithms struggle to accurately identify <a href="https://www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/">people of color and women</a>.</p>
<p>It is one thing for these mismatches to occur in a controlled test, where those identified are high-profile figures and the identifications can easily be shown to be mistaken. It is quite another for law enforcement agencies to use these programs to track and surveil ordinary people. These concerns are more than luddite speculation: states are already trying – unsuccessfully – to use facial identification in law enforcement, with the disastrous result of officers relying on imprecise and biased identifications to make arrests, as in the Florida case of <a href="https://www.aclu.org/blog/privacy-technology/surveillance-technologies/florida-using-facial-recognition-convict-people/">Willie Lynch</a>. Even if the software worked perfectly – that is, did not produce the deeply concerning misidentifications seen in the ACLU tests – the privacy and civil liberties concerns should be enough to give lawmakers pause about the unregulated use of facial recognition technology.</p>
<p>Despite these concerns, there is no federal legislation governing the use of facial recognition and surveillance. In fact, only a handful of local governments have acted to check the use of facial recognition technologies for surveillance. In May, the <a href="https://www.sfchronicle.com/politics/article/San-Francisco-bans-city-use-of-facial-recognition-13845370.php?psid=4T9Xy">San Francisco</a> Board of Supervisors voted to ban the use of facial recognition technology by city police and local government departments. The ban was sent to the mayor and is pending final approval.</p>
<p>Just across the bay, <a href="https://www.sfchronicle.com/bayarea/article/Oakland-bans-use-of-facial-recognition-14101253.php">Oakland</a> followed suit in July. Both cities cited concerns about bias in the technology in justifying their bans, noting the especially high risk of misidentification for African American and Latinx individuals. In September, a group of community members and activists <a href="https://www.aclu.org/press-releases/civil-rights-coalition-urges-detroit-board-police-commissioners-reject-detroit-police">formally urged</a> the Detroit Board of Police Commissioners to reject the local police department's proposed facial recognition policy. The board <a href="https://www.freep.com/story/news/local/michigan/detroit/2019/09/19/detroit-police-facial-recognition-policy-approved/2374839001/">approved the policy</a> over these objections, with the caveats that "the police department cannot use facial recognition software on live or recorded video, and it cannot use it to assess a person's immigration status." Across the country, <a href="https://www.bostonmagazine.com/news/2019/06/28/somerville-facial-recognition-technology-ban/">Somerville, MA</a> banned the use of facial recognition technology "in police investigations and municipal surveillance programs" in June of this year.</p>
<p>Over the summer, the Mayor of Cambridge proposed a <a href="https://www.wbur.org/news/2019/08/01/face-surveillance-photos-ordinance-privacy">similar ordinance</a>, which has not yet been adopted. There are currently bills in the Massachusetts <a href="https://malegislature.gov/Bills/191/H1538">House</a> and <a href="https://malegislature.gov/Bills/191/S1385">Senate</a> that would institute a Commonwealth-wide ban on the technology, on the ground that the "Massachusetts General Court finds that government use of face recognition poses <a href="https://malegislature.gov/Bills/191/S1385">unique and significant civil rights and civil liberties</a> threats to the residents of the Commonwealth of Massachusetts."</p>
<p>On <a href="https://www.aclum.org/en/news/we-are-suing-fbi-find-out-how-they-use-face-surveillance-technology">October 31</a>, the national ACLU and the ACLU of Massachusetts filed a <a href="https://www.aclum.org/sites/default/files/field_documents/20191031_aclu_doj_complaint.pdf">federal suit</a> against the Department of Justice, the Drug Enforcement Administration, and the Federal Bureau of Investigation, "challenging the secrecy shrouding federal law enforcement use of face recognition surveillance technology."</p>
<p>The suit follows the ACLU's public records requests to these agencies seeking information on their use of facial recognition surveillance technology – requests that were acknowledged but never answered. Given the serious civil liberties issues raised by the government's use of this data, it is imperative that more information about these programs be made public.</p>
<p>The use of facial surveillance technologies puts our privacy rights at risk. Without regulation, police departments and other law enforcement bodies will be able to use systems of questionable-at-best accuracy to surveil communities without notice and perhaps without cause. Even if these technologies operated effectively, they should still give us pause. Concerns about facial recognition and surveillance programs are not mere technophobia – they are well-founded worries about the misuse of government data and the risk of that data being breached. As these technologies stand now, their high error rates are likely to disproportionately harm communities of color, which are already more likely to suffer the injustices of the criminal legal system. Given these harmful effects, which threaten our civil liberties, these technologies can no longer go unchecked.</p>