Facial recognition technology: witnesses testified before the House Oversight and Reform Committee, which is working on legislation to address the emerging technology.

Good morning, everyone. Without objection, the chair is authorized to declare a recess of the committee at any time. I would now like to recognize myself to give an opening statement. Today the committee is holding a hearing on a critical ongoing issue: facial recognition technology. It's clear that despite the private sector's expanded use of this technology, it's just not ready for primetime. During this hearing, we'll examine the private sector's development, use and sale of the technology, as well as its partnerships with government entities using this technology. We learned from our first hearing on May 22 of 2019 that the use of facial recognition technology can severely impact Americans' civil rights and liberties, including the right to privacy and equal protection under the law. We learned during our second hearing on June 4 how federal, state and local government entities use this technology on a wide scale, yet provide very little transparency on how and why it's being used, or on security measures to protect sensitive data. Despite these concerns, we see facial recognition technology more and more in our everyday lives. The technology is being used in schools, grocery stores, airports, malls, theme parks and stadiums; on our phones and social media platforms; in doorbell camera footage; and even in hiring decisions. And it's used by law enforcement. This technology is completely unregulated at the federal level, resulting in some questionable applications. In December 2019, the National Institute of Standards and Technology issued a new report finding that commercial facial recognition algorithms misidentified racial minorities, women, children, and elderly individuals at substantially higher rates. I look forward to discussing those findings with the director of NIST's Information Technology Laboratory, who is with us today. I also look forward to hearing from our expert panel from academia, industry, and the advocacy community on recommended actions policymakers should take to address the potential harms raised by these findings.

Our examination of facial recognition technology is a bipartisan effort, and I appreciate Ranking Member Jordan's tireless and ongoing work on this issue. We have a responsibility not only to encourage innovation, but to protect the privacy and safety of American consumers. That means educating our fellow members and the American people on the different uses of the technology, and distinguishing between verification, identification, and surveillance uses. That also means exploring what protections are currently in place to protect civil rights, privacy and data security and to prevent misidentification, as well as providing recommendations for future legislation and regulation. In that vein, I would like to announce today that our committee is committed to introducing facial recognition legislation in the very near future, and our hope is that we do that in a truly bipartisan way. We have had several conversations, and I look forward to working together toward that goal. I now recognize the distinguished Ranking Member, Mr. Jordan, for an opening statement.

Thank you, Madam Chair. We appreciate your willingness to work with us on this legislation; we have a bill we want to talk about as well. Facial recognition is a powerful new technology that's being widely used by both government agencies and private sector companies. Its sales have experienced 20 percent year-to-year growth, and the market is expected to be valued at $8.9 billion by 2022. Increasingly, federal, state and local government entities are using it, but with little accountability. With this technology the government can capture faces in public places and identify individuals, which allows the tracking of our movements, patterns, and behavior. All of this is currently happening without legislation balancing legitimate government functions with American civil liberties. That must change.
While this hearing is about commercial uses of facial recognition, I want to be clear that I have no desire to constrain it in the private sector. I appreciate the great promise that this technology holds for making our lives better. It has improved data security, led to greater efficiency in identity verification, prevented identity theft and protected consumers. The urgent issue, the urgent issue we must tackle is reining in the government's unchecked use of this technology when it impairs our freedoms and liberties. Our late chairman became concerned about government use of facial recognition technology after it was used to surveil protests in his district related to the death of Freddie Gray. He saw this as a deep encroachment on his constituents' freedom of speech and association, and I couldn't agree more. This issue transcends politics. It doesn't matter if it's a Trump rally or a Bernie Sanders rally: the idea of American citizens being tracked and cataloged for merely showing their faces in public is deeply troubling. It is imperative that Congress understands the effects of this technology on our constitutional liberties. The invasiveness of facial recognition technology has led a number of localities to ban its use, barring their government agencies from buying or using digital facial recognition for any purpose. This threatens to create a patchwork of laws and may hinder the development of the technology. Unfortunately, this is not an issue we should leave to the courts. It presents novel questions that are best answered by congressional policymaking, which can establish a national consensus. The unique government-wide focus of this committee allows us to develop legislation to address facial recognition technology here at the federal level. We know that a number of federal agencies possess facial recognition technology and use it without guidance from Congress, despite its serious implications for our First and Fourth Amendment rights. At a bare minimum, we must understand how and when federal agencies are using this technology and for what purpose. Currently we do not know even this basic information.
Our committee has jurisdiction over the entire federal government's use of this technology, so we must start by pursuing policies that address this fundamental information gap. It is our intention as well to introduce legislation. We will work with both sides here, trying to work together on a bill that will provide transparency and accountability with respect to the federal government's purchase and use of this technology. I want to thank you, Madam Chairwoman, and I look forward to hearing our witnesses today. Thank you all for being here.

Before we get to the witnesses, I would like to make a unanimous consent request. I would like to insert into the record a report from the ACLU which found that Amazon's Rekognition technology misidentified 28 members of Congress as other individuals who had been arrested for crimes, including John Lewis, a national legend and national civil rights leader. So I would like to place that into the record. I would also like to mention that three members of this committee, Mr. Gomez, Mr. Clay, and one other member, were misidentified, along with 11 Republican members of Congress, which showed that this technology is not ready for primetime. I would now like to recognize my colleague, Mr. Gomez, who has personal experience with this issue, for an opening statement.

Thank you, Madam Chair. First, this is the third hearing the committee is holding on this issue, and up until two years ago it wasn't even on my radar. Then the ACLU conducted the test which falsely matched my identity with someone who had committed a crime. All of a sudden my ears perked up, but I had no doubt I was misidentified more because of the color of my skin than anything else. So as I started to learn and do research on this issue, my concerns only grew. I found out that it's being used in so many different ways, not only in law enforcement at the federal and local levels; it's also being used when it comes to apartment buildings, to doorbells, to shoppers, to a variety of things. But at the same time, this technology is fundamentally flawed.
Think about who gets pulled over by the police. In certain areas it's not a big deal; in other areas it could mean death, if people think you're a violent felon. So we need to start taking this seriously. This probably doesn't rank in the top three issues of any voter in the United States, but as the technology continues to be used and continues to have issues, there will be more and more people who are misidentified, and more and more people questioning whether their liberties and their freedoms are starting to be eroded through no fault of their own, just because some algorithm misidentified them as somebody who committed a crime in the past. This is something we need to raise an alarm about, and that's what these hearings are doing in a bipartisan way: making sure the American public doesn't stumble into the dark and, all of a sudden, our freedoms are a little bit less, our liberties are a little bit less. We need to start having these important discussions, in a bipartisan way, to figure out how and what the federal government can do. What can Congress do? What is our responsibility? With that, I appreciate the chair's commitment to legislation. I also appreciate the ranking member's commitment to legislation, because I know this issue is a tough one, and it can be handled in a bipartisan way. With that I yield back.

I now recognize Mr. Meadows of North Carolina for an opening statement. Thank you, Madam Chair, and to both of you, thank you for your leadership on this important issue. There are a few things that I would highlight. Certainly we know Mr. Gomez, and there is certainly no background that he could be accused of being involved with, so I want to stress that his character is of the utmost integrity to us on this side of the aisle. I say that because one of the issues we need to focus on, and this is very important to me, is where conservatives and progressives come together: on defending our civil liberties and that right to privacy. I agree with the chairwoman and ranking member and Mr. Gomez, and others with whom we've had conversations about addressing this issue. But to focus only on the false positives is a problem for us. I can tell you, technology is moving so fast that the false positives will be eliminated within months. I'm here to say that if we only focus on getting it right with facial recognition, we've missed the whole argument, because the technology is moving at warp speed. My concern is not that they will improperly identify Mr. Gomez; my concern is that they will properly identify Mr. Gomez and use it in a wrong manner. So to the witnesses that are here today, what I would ask all of you is: how can we put a safeguard on this, to make sure that it's not a fishing expedition at the cost of our civil liberties? That's essentially what we're talking about. We're talking about scanning facial features, and even if they got it a hundred percent right, how should that be used? How should we ultimately allow government to be involved in that? So I'm extremely concerned that as we look at this issue, we have to come together in a bipartisan way to figure this out. I think it would be headlines in the New York Times and the Washington Post if you saw members of both parties coming to an agreement on how we are to address this issue. I am committed to doing that. Madam Chair, I was fully committed with your predecessor; he and I both agreed the very first time this was brought up that we had to do something, and I know the ranking member shares that, so I'm fully engaged. Let's make sure that we get something done, and get it done quickly. If we start focusing, again, on just the accuracy, then they are going to fix it so that it's accurate. But what standards should there be for accuracy? Should it be a hundred percent? Should it be 95? I think when Mr. Gomez was misidentified, the threshold had been brought down to 80 percent. You get a lot of false positives when that happens. We need to help set the standards and make sure that government is not using this in an improper fashion, and with that I yield back.

I thank the gentleman for his statement. I would now like to introduce our witnesses.
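The point about match thresholds can be made concrete with a small sketch. The similarity scores below are invented for illustration, not output from any real system; the point is only the arithmetic: lowering the match threshold toward 80 percent lets more non-matching comparisons clear the bar, so the false positive rate rises.

```python
# Hypothetical similarity scores (0 to 1) for comparisons between
# pairs of DIFFERENT people ("impostor" comparisons). A false
# positive is an impostor comparison whose score clears the threshold.
impostor_scores = [0.62, 0.71, 0.78, 0.81, 0.84, 0.55, 0.69, 0.88, 0.74, 0.79]

def false_positive_rate(scores, threshold):
    """Fraction of impostor comparisons wrongly accepted as matches."""
    false_positives = sum(1 for s in scores if s >= threshold)
    return false_positives / len(scores)

# A strict threshold admits no impostors here; 0.80 admits three of ten.
print(false_positive_rate(impostor_scores, 0.95))  # 0.0
print(false_positive_rate(impostor_scores, 0.80))  # 0.3
```

Operators choose the threshold, which is why the same algorithm can behave very differently from one deployment to the next.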
We are privileged to have a rich panel of expert witnesses today. Brenda Leong is Senior Counsel and Director of AI and Ethics at the Future of Privacy Forum. Dr. Romine is Director of the Information Technology Laboratory at the National Institute of Standards and Technology. We are also joined by Meredith Whittaker, Daniel Castro, and Jake Parker, Senior Director of Government Relations at the Security Industry Association. If you would all rise and raise your right hands, I'll begin by swearing you in. [Witnesses sworn in.] Let the record show that the witnesses all answered in the affirmative. Thank you, and please be seated. These microphones are very sensitive, so please speak directly into them. Without objection, your written testimony will be made part of our record. With that, Ms. Leong, you're now recognized for five minutes.

Thank you very much for the opportunity to testify on the commercial use of facial recognition technology. The Future of Privacy Forum is a nonprofit organization that supports the responsible use of emerging technologies. We believe that the power of information is a net benefit to society, and that it can be appropriately managed to control risks, serving as a catalyst for leadership and scholarship and advancing principled data practices in support of individuals and groups. Biometric systems, such as those supporting facial recognition technology, have the potential to enhance consumer services and improve security, but they must be designed, implemented and maintained with full awareness of the challenges they present. Today my testimony focuses on establishing the importance of technical accuracy in discussing face-image-based systems, on the benefits and harms to individuals and groups, and on recommending express opt-in consent as the default for any commercial use of identification or verification systems. Understanding the specifics of how a technology works is critical for effectively assessing the relevant risks. Not every camera-based system is a facial recognition system. A facial recognition system creates unique templates stored in an enrolled database. These databases are then used to verify a person in a one-to-one comparison, or to identify a person in a one-to-many search. If a match is found, that person is identified with greater or lesser certainty depending on the system in use, the thresholds and settings in place, and the operator's expertise. Thus recognition systems involve matching two images; without additional processing they do not impute other characteristics to the person or image. There is a great deal of confusion about this in the media, particularly in contrast to facial characterization or detection software, which attempts to analyze a single image and impute characteristics to that image, including gender and race. These characterization systems may or may not link data to specific individuals, but they carry their own significant risks. Accuracy requirements and capabilities for recognition and characterization systems vary with context. The level of certainty acceptable for verifying an individual's identity when unlocking a mobile device is below the standard that should be required for verifying that an individual is included on a terrorist watch list. In addition, quality varies widely among suppliers based on image detection, diversity of training data sets, and testing methodologies, and vendors differ in their ability to meet these goals. For example, the most recent NIST report highlights accuracy outcomes that were a hundred times worse for certain groups in some systems, while the most accurate systems achieved consistent results across demographic groups, with variations that were undetectable. The real harms arising from inaccurate recognition and characterization systems cannot be ignored. Individuals already use facial recognition to open their phones, access bank accounts, or organize their photos. Organizational benefits include more secure facility access, hospitality functions and personalized experiences, and new uses are being imagined all the time. But the potential harms are real. In addition to inaccuracy, concerns about real-time surveillance societies have led individuals and policymakers to significant reservations.
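The distinction drawn here between one-to-one verification and one-to-many identification can be sketched in a few lines. This is a toy illustration under stated assumptions, not any vendor's system: real systems derive templates from face images with trained models, and the names, vectors, and 0.8 threshold below are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face templates (feature vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify(probe, enrolled_template, threshold=0.8):
    """One-to-one: does the probe match this single enrolled identity?"""
    return cosine_similarity(probe, enrolled_template) >= threshold

def identify(probe, database, threshold=0.8):
    """One-to-many: search the whole enrolled database and return the
    best match above the threshold, or None if nobody clears it."""
    best_name, best_score = None, threshold
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy enrolled database of 3-dimensional "templates" (illustrative only).
db = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.9, 0.3]}
probe = [0.88, 0.15, 0.25]          # close to alice's template
print(verify(probe, db["alice"]))   # one-to-one check: True
print(identify(probe, db))          # one-to-many search: alice
```

Note how the threshold parameter appears in both paths: the "greater or lesser certainty" in the testimony corresponds directly to where that bar is set.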
The decision by some municipalities to legislatively ban the use of facial recognition systems by government agencies reflects these heightened concerns. The ethical considerations of whether and how to use facial recognition systems exceed privacy considerations, and the regulatory challenges are complex. Even relatively straightforward legal liability questions prove difficult when many parties bear some share of responsibility. When considering the scope of industries hoping to use this technology, from educational and financial institutions to retail, the potential impacts on individuals are mind-boggling. Like many technologies, facial recognition applications offer benefits and generate risks depending on context. Tracking online preferences and personalizing consumer experiences are features some people value but others strongly oppose. Matching consent options closely to the appropriate context is essential. FPF prefers a comprehensive privacy bill applicable to all sensitive data, including biometric data, but we recognize Congress may choose to consider technology-specific bills. We provide a useful model in requiring that the default for commercial identification or verification systems be opt-in; that is, express affirmative consent prior to enrollment. Exceptions should be few, narrow, and clearly defined, and further restrictions should be tiered based on the scope and severity of potential harm. Thank you for your attention and commitment to finding a responsible regulatory approach to the use of facial recognition technology.

Thank you. The chair now recognizes Dr. Romine for five minutes. Chairwoman Maloney, Ranking Member Jordan, members of the committee: I am the Director of the Information Technology Laboratory at the Department of Commerce's National Institute of Standards and Technology, known as NIST. Thank you for the opportunity to appear before you today to discuss NIST's role in standards and testing for facial recognition technology. In the area of biometrics, NIST has been working with the public and private sectors since the 1960s.
Biometrics provides a way to verify the identity of humans based on one or more physical or behavioral characteristics. Face recognition compares an individual's facial features to available images for verification or identification purposes. NIST's work improves the accuracy, quality, usability and interoperability of these technologies, and helps ensure they are used appropriately. We provide state-of-the-art benchmarks and guidance to industry and to U.S. government agencies that depend upon biometrics technologies. The NIST Face Recognition Vendor Test program, or FRVT, provides technical guidance, scientific support and recommendations on the utilization of face recognition technologies to various U.S. government and law enforcement agencies, including the FBI, DHS, CBP and IARPA. NIST Interagency Report 8280, released in December 2019, quantified the accuracy of face recognition algorithms for demographic groups defined by sex, age, and race or country of birth, for both one-to-one verification algorithms and one-to-many identification search algorithms. NIST found empirical evidence for demographic differentials in the majority of the algorithms it evaluated. The report distinguishes between false positive and false negative errors, and notes that the impacts of errors are application dependent. We conducted tests to quantify demographic differences for 189 face recognition algorithms from 99 developers, using four collections of photographs with 18.27 million images of 8.49 million people. These images came from operational databases provided by the State Department, the Department of Homeland Security, and the FBI. I'll first address one-to-one verification applications. False positive differentials are much larger than those related to false negatives, and they exist across many of the algorithms tested. False positives might present a security concern to the system owner, as they may allow access to impostors. False positive rates are higher in women than in men. Regarding race, we measured higher false positive rates in Asian and African American faces relative to those of Caucasians; there are also higher rates of false positives in Indian, Native American and Pacific Islander faces.
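The demographic differentials described here come down to computing a false positive rate separately for each group and comparing them. A minimal sketch with invented counts, just to show the arithmetic such reports rest on (real FRVT evaluations use millions of operational images, and the group names and numbers below are hypothetical):

```python
from collections import defaultdict

# Hypothetical impostor-comparison results: (demographic_group, wrongly_matched?)
# Counts are invented for illustration.
comparisons = (
    [("group_a", True)] * 2 + [("group_a", False)] * 998 +
    [("group_b", True)] * 40 + [("group_b", False)] * 960
)

def fpr_by_group(results):
    """False positive rate per group: false matches / impostor comparisons."""
    totals, false_matches = defaultdict(int), defaultdict(int)
    for group, matched in results:
        totals[group] += 1
        false_matches[group] += matched  # True counts as 1, False as 0
    return {g: false_matches[g] / totals[g] for g in totals}

rates = fpr_by_group(comparisons)
print(rates["group_a"], rates["group_b"])          # 0.002 0.04
print(round(rates["group_b"] / rates["group_a"]))  # 20 -> a "20x" differential
```

A "100 times worse" finding is a statement about exactly this kind of ratio between per-group rates.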
These higher false positive rates appeared in algorithms developed in many countries, including those in Europe and the United States. A notable exception was for some algorithms developed in Asian countries: there was no such dramatic difference in false positives in one-to-one matching between Asian and Caucasian faces for algorithms developed in Asia. While the study did not explore cause and effect, one possible connection, and an area for research, is the relationship between an algorithm's performance and the data used to train the algorithm itself. I'll now comment on one-to-many search algorithms. Here again, performance is algorithm dependent. False positives in one-to-many searches are particularly important because the consequences could include false accusations. For most algorithms, the NIST study measured higher false positive rates in women, in African Americans, and particularly in African American women. However, the study found that many algorithms gave similar false positive rates across these specific demographics, and some of the most accurate algorithms fell into this group. This underscores one overall message of the report: different algorithms perform differently. Indeed, all of our reports note wide variations in recognition accuracy across algorithms, and an important result from the demographic study is that demographic effects are smaller with more accurate algorithms. NIST is proud of the positive impact it has had over the last 60 years on the evolution of biometrics capabilities. With its extensive experience and broad expertise, both in its laboratories and in successful collaborations with the private sector and other government agencies, NIST is actively pursuing the standards and measurement research necessary to deploy interoperable, secure, reliable and usable identity management systems. Thank you for the opportunity to testify on NIST's work in identity management, and I will be happy to answer any questions that you have.

Chairwoman Maloney, Ranking Member Jordan and members of the committee, thank you for inviting me to speak today. My name is Meredith Whittaker. I am co-founder of the AI Now Institute at New York University, the first university research institute dedicated to studying the social implications of artificial intelligence and algorithmic technologies. I also worked at Google for over a decade. Facial recognition technology does not work as advertised. Research shows what tech companies won't tell you: that facial recognition is often biased and error-prone, and that the populations already facing societal discrimination bear the brunt of its failures. As Dr. Romine mentioned, the NIST audit confirmed that some systems were 100 times less accurate for black and Asian people than for white people. But this isn't facial recognition's only problem, and accuracy will not make it safe. Facial recognition relies on the mass collection of our biometric data. It allows government and private-sector actors to persistently track where we go, what we do and who we associate with. Over half of Americans are in a law enforcement facial recognition database, and businesses are increasingly using the technology to surveil and control workers and the public. It is replacing time clocks at job sites, keys for housing units, and security and safety systems at stadiums, and much more. And we've seen real-life consequences. A facial recognition authentication system used by a major ride-hailing company failed to recognize transgender drivers, locking them out of their accounts and livelihoods. Facial recognition analysis is also being used to make judgments about people's personality, their feelings and their worth, based on the appearance of their face. This set of capabilities raises urgent concerns, especially since the claim that you can automatically detect interior character based on facial expression is not supported by scientific consensus and echoes pseudosciences discredited in the past. Most facial recognition systems in use are developed by private companies, who license them to governments and businesses.
The commercial nature of these systems prevents meaningful oversight and accountability, hiding them behind legal claims of trade secrecy. This means that researchers, lawmakers and the public struggle to answer critical questions about where, how, and with what consequences this technology is being used. This is especially troubling, since facial recognition is usually deployed by those who already have power, say, employers, landlords, or the police, to surveil, control and in some cases oppress those who don't. In Brooklyn, tenants pushed back against their landlord's plans to replace key entry with facial recognition, raising questions about biometric data collection and the very real possibility that invasive surveillance could be abused to harass and evict tenants, many of whom were black and Latino women and children. To address the harms of this technology, many have turned to assessment and auditing. These are a wonderful step in the right direction, but they are not enough to ensure that facial recognition is safe. Relying on deployment criteria risks allowing companies to assert that their technology is safe and fair without accounting for how it will be used or for the concerns of the communities who will live with it. If such standards are positioned as the sole check on these systems, they could function to mask harm instead of prevent it. Outside of healthcare, it's difficult to think of an industry where we permit companies to treat the public as experimental subjects, deploying untested, unverified technology that's been proven to violate civil rights and to amplify bias and discrimination. Facial recognition poses an existential threat to democracy, and it fundamentally shifts the balance of power between those using it and the populations on whom it's applied. Congress is abdicating its responsibility if it continues to allow this technology to go unregulated. As a first step, lawmakers must act rapidly to halt the use of facial recognition in sensitive domains by both government and commercial actors.
Whether you care about the over-policing of communities of color, or gender equity, or the constitutional rights to due process and free association, the secretive and unchecked deployment of flawed facial recognition systems is an issue you cannot ignore. Facial recognition is not ready for primetime. Congress has a window to act, and the time is now. Thank you.

The chair now recognizes Mr. Castro for five minutes. Thank you. Chairwoman Maloney, Ranking Member Jordan and members of the committee, thank you for the invitation to testify today. There are many positive uses of facial recognition technology emerging in the private sector. Airlines are using it to help travelers get through the airport faster, saving people time and hassle. Banks are using it to improve security, helping to reduce financial fraud. Hospitals are using it to verify that the right patient receives the right treatment, preventing medical errors. There's even an app that says it uses facial recognition on dogs and cats to help find lost pets. Americans are increasingly familiar with commercial uses of the technology because it's now a standard feature on the latest phones, and it's also been integrated into household products like security cameras and door locks. This is one reason why a survey last year found the majority of Americans disagreed with strictly limiting the use of facial recognition if it would mean airports couldn't use the technology to speed up security lines, and nearly half disagreed if it would mean stores couldn't use it to stop shoplifting. I've also seen claims that facial recognition technology is inaccurate and invasive. If that were true I would be worried, too, but it isn't. Here are the facts. First, there are many different facial recognition systems on the market. Some are much better than others, including in their accuracy across race, gender and age; the most accurate algorithms show no bias. Algorithms continue to get measurably better every year, and they can outperform the average human. Second, many of the leading companies in the industry responsible for developing and deploying facial recognition have voluntarily adopted robust privacy and transparency guidelines.
These include voluntary standards for digital signs and multi-stakeholder guidelines developed for the broader technology community. While the private sector has made significant progress, Congress also has an important role, and I would like to suggest seven key steps. First, pass comprehensive privacy legislation that preempts state laws and establishes basic data rights. While it may be appropriate to require opt-in consent for certain sensitive uses, such as in healthcare or education, it may not always be feasible; for example, you probably shouldn't need to get sex offenders to agree to be enrolled in a system. So opt-in shouldn't be required across the board. Legislation should also be technology neutral: it shouldn't treat facial recognition differently than other types of biometrics. In addition, a federal law should not establish a private right of action, because that would significantly raise costs for businesses, and those costs would eventually be passed on to consumers. Second, Congress should expand the evaluation of commercial facial recognition systems to more real-world commercial uses, including quality-based systems and infrared systems. NIST also should continue to report performance metrics on race, gender and age, and should develop a diverse image data set for training and testing purposes. Third, Congress should direct the development of performance standards for any facial recognition system the government procures, including for accuracy and error rates. This will ensure federal agencies don't waste tax dollars on ineffective systems or systems with significant disparities. Fourth, Congress should fund the deployment of facial recognition systems in government, for example using it to improve security in federal buildings and for government workers. Fifth, Congress should continue to support federal funding for research to improve the accuracy of facial recognition as part of the government's overall commitment to invest in artificial intelligence. One of the key areas of fundamental AI research is computer vision, and the U.S. should continue to invest in this technology, especially as China makes gains in this field. Sixth, Congress should consider legislation to establish a warrant requirement for authorities to track people's movements, including when they use geolocation data from facial recognition systems. Finally, Congress should continue to provide oversight, ensuring that any police surveillance of political protest is justified and conducted with appropriate safeguards; this should include scrutinizing racial disparities in the use of force among communities of color. Congress should also require the Department of Justice to develop best practices for how state and local authorities use facial recognition. This guidance should include recommendations on how policies are publicly disclosed, when law enforcement will use the technology, what sources of images will be used, and what the data retention policies will be. We should always consider the impact of new technologies and ensure there are safeguards in place to protect society's best interests. In the case of facial recognition technology, there are many unambiguously positive opportunities to use the technology, such as allowing people who are blind or who suffer from face blindness to identify others. Rather than imposing bans or moratoriums, Congress should support positive uses of the technology while limiting the potential for misuse and abuse. Thank you again; I look forward to answering any questions.

My name is Jake Parker. The Security Industry Association is a trade association whose members provide a broad range of security products while employing thousands of innovators in the U.S. and around the globe. Our members include many of the leading developers of facial recognition technology and of products that incorporate it. It is because of the experience our members have that we are pleased to be here today to talk to you about how this technology can be used consistent with our values. We firmly believe all technology products, including facial recognition, should only be used for lawful, ethical and nondiscriminatory purposes.
That way we can have faith that it brings value to our everyday lives. Facial recognition offers tremendous benefits. It can be used to allow individuals to prove their identity securely and conveniently to enter a venue or board an airplane. Companies are using the technology to improve the physical security of their property and employees against the threat of violence, theft or other harm. Government agencies have used facial recognition to improve homeland security and investigations, and to rescue trafficking victims. It has been used in almost 40,000 cases in North America, identifying 9,000 missing children and over 12,000 traffickers. A law enforcement officer in California last year saw a social media post about a missing child; after using facial recognition, the child was located and recovered. In another success story, investigators used facial recognition along with human review to identify a suspect within an hour, and the chief detective was quoted as saying that to not use this technology would be negligent. Our members see transparency as the foundation that should govern the use of facial recognition technology. It should be clear when and under what circumstances the technology is used, as well as the processes governing the collection, processing and storage of related data. We support sensible safeguards that promote transparency and accountability as the most effective way to ensure the responsible use of the technology without unreasonably restricting tools that have become essential to public safety. We do not support moratoriums or bans on the use of this important technology. As the committee works on these proposals, we encourage you to bring private sector developers into the conversation to present real-world views on how the technology works and how it can best be managed. Congress should provide the resources needed to support the expansion of these efforts. As we think about regulation, we believe any efforts make the most sense in the context of a national data privacy policy.
Efforts in this area should include biometric information; this is the right approach. We encourage our members to play an active role in providing users with the tools they need to use this technology responsibly. SIA is developing a set of principles on the technology. This comes on the heels of recent studies generating a lot of controversy. NIST has been working on this for decades, allowing the government to rigorously test algorithms and post the results. The accuracy is approaching that of automated fingerprint comparison, which is used as the gold standard. The most significant takeaway is that it confirms facial recognition technology performs far better across racial groups than has been widely reported before. According to this data, only four out of 116 algorithms tested had false match rates of more than 1 percent for any demographic. We are committed to providing technology that all users can be comfortable with, and to transparency in policy surrounding the deployment of the technology. Thanks for the opportunity to appear before you today. We look forward to working with you. Chairwoman Maloney: I would like to ask you about the study you mentioned in your testimony. I would like to ask unanimous consent to place the study in the record; without objection. We all know commercial facial recognition technology continues to expand in the public and private sector, but your new study found facial recognition software misidentified persons of color, women, children, and elderly individuals at a much higher rate. You evaluated 189 algorithms from 99 developers. Your analysis found false positives were more likely to occur with people of color, is that correct? That is correct for the largest collection of the algorithms. Chairwoman Maloney: Your report also found that women, elderly individuals, and children were more likely to be misidentified by the algorithms. Is that correct? That is correct for most algorithms.
Chairwoman Maloney: Studies used to be done all on men. When you were doing the studies, were you doing them on men's faces, or were you also using women's faces? We had a substantial set of images we could pull from. We were able to represent a broad cross-section of demographics. Chairwoman Maloney: Did these disparities in false positives occur broadly across the algorithms you tested? They did occur broadly for most of the algorithms we tested. Chairwoman Maloney: Your study states that across demographics, false-positive rates can differ by a factor of 10 to beyond 100 times. These are staggering numbers, wouldn't you say? How much higher was the error rate when the algorithms were used to identify persons of color as compared to a white individual? As we state in the report, for a subset of the algorithms the error rates can be significantly higher, from 10 to 100 times the misidentification rate for Caucasian faces. Chairwoman Maloney: What was the difference in the misidentification rate for women? Similar rates, 10 to 100 times. I will get back to you on the exact number, but it is a substantial difference. Chairwoman Maloney: What about Black women? Black women have higher rates on some of the same algorithms we are discussing than either Black faces broadly speaking or women broadly speaking. Black women had differentials even higher than those two other demographics. Chairwoman Maloney: What were they? Substantially higher, on the order of 10 to 100 times. Chairwoman Maloney: Misidentification, as we all know, can have very serious consequences for people when they are falsely identified. It can prevent them from entering a plane or a country; it can lead to someone being falsely accused, detained, or even jailed. I am deeply concerned that facial recognition technology has demonstrated racial, gender, and age bias. We should not rush to deploy it unless we understand the potential risks and mitigate them.
Your study provides us with valuable insight into the current limitations of this technology. I appreciate the work that you have done, and all of your colleagues on the panel today who have increased our understanding. I would now recognize the Ranking Member. No, I am going to recognize the gentlelady from North Carolina, Mrs. Foxx, who is now recognized for questions. Mr. Parker, how competitive is the facial recognition market? It is extremely competitive. Because of the advances in technology over the last couple of years, the dramatic increase in accuracy in the last three to five years, combined with advances in imaging technology, the products have become more affordable, and therefore there has been more interest from consumers and more entry into the market from competitors. To what extent do the companies compete on accuracy, and how could a consumer know more about the accuracy rates of facial recognition? They do compete on accuracy. The NIST testing program plays a really helpful role in providing a benchmark of accuracy. The companies are competing with each other to get the best scores on those tests, and they make the results available to their customers. There is an important distinction as well: in this testing, you have static data sets that are already there, whereas those are not necessarily the same type of images you see in a deployed system. Other types of testing need to be done in a fully deployed system to really determine what the accuracy is. What private sector best practices exist for securing facial images and the associated data, such as face templates and match results, in these facial recognition technology systems? As I mentioned earlier, SIA is developing a set of best-use practices based on the fact that many of our members have produced best practices that they work with their customers to implement. These would accomplish privacy goals.
I have a couple of examples, but one of the most significant here is that many of these products already have built into them the ability to comply with data privacy laws in Europe, such as the GDPR. This has to do with encrypting photos, encrypting any kind of personal information associated with them, securing channels of communication between the server and the device, as well as procedures for looking at someone's information, being able to delete it if requested, and telling someone what information is in the system. Could you succinctly summarize some of the best practices that exist for protecting the personally identifiable information that is incorporated into these systems? Is it too complicated a system to explain here? Is there something we can have to read? I will be happy to provide some more detailed material to study, but certainly one of the most important things is encryption of data in case there is a data breach. It is important to point out that the face template is what the system uses to make a comparison between two photos. It is basically like a digital version of your fingerprint. By itself, if that data is compromised, it is not useful to anyone; it can only be read by the proprietary software that created it. I have been reading a lot recently about the difference between Europe and the U.S. in developing these kinds of techniques. A number of state and international policies are impacting how information is collected and how privacy concerns are addressed. How have commercial entities conformed to these new legal structures? What we are seeing is that companies are adapting here and already building features into products in anticipation, both because it is good practice and because many of the laws require it, and we anticipate a similar framework in this country at some point. They are being proactive in building some of those things in. Thank you. I yield back. Chairwoman Maloney: I now recognize the gentlewoman from the District of Columbia. Ms.
Norton is now recognized for questions. We are playing catch-up. The way to demonstrate that most readily is what the cell phone companies are already doing with this technology. Private industry appears to be way ahead of where the Congress and the federal government are. The interesting thing is, they are responding to consumers, and it appears consumers may already be embracing facial recognition in their own devices, because as they compete with one another, almost all of them have incorporated facial recognition in their latest mobile products. If one doesn't, the other is going to do it; all of them are already doing it. You can unlock your cell phone by scanning your face. The public thinks this is, and I suppose they are right, a real convenience instead of keying in numbers, and they have gotten accustomed to cameras. I remember when cameras were first introduced in the streets and people said that is terrible. Of course, there is no right to privacy in the streets, but in my cell phone there is a lot of private information. According to recent reports, this technology is not foolproof. That is my concern. A simple photograph can fool it in some circumstances. Unauthorized individuals can get into your cell phone and any sensitive information you have there; people store things like their email, banking, the rest of it. Do you see problems that are already there, with companies now integrating facial recognition in devices like this? It looks like the public sees convenience, and I don't hear anybody protesting it. Would you comment? Thank you very much. I think that is an important question. We do see use cases across many applications, and with phones being the most personalized thing people have, that makes a good example of the variations in place in the market of the different ways facial recognition technology is being used. For example, in your phone, I'm going to use Apple as the example.
This is my best understanding. Apple takes a representative picture of your face using both infrared technology and 3D imaging, to prevent things like using a photo or using another person, and it takes it at a level of detail that stands up to about a one in 10 million error rate, a pretty substantive level for something that is in fact part of a two-factor process: you have to have the phone, know whose phone it is, have their face, and match whatever standard there is. Facial recognition that can identify someone off a video feed is a different level of application, and it certainly should be considered and regulated in a very different way than that. I do think we see those things being used in different ways already, and some of those have started to have some blowback on them in things like the criminal justice system. That is where it has gotten people's attention: where are the places where we need to draw those lines and say it should not be used here, maybe not at all, and if it is, it should be used in limited and regulated ways? Does the average consumer have any way to confirm, should they have any way to confirm, whether these cell phone manufacturers are in fact storing their biometric or other data on their servers? What should we do about that? As to consumer knowledge, the average consumer does not, and indeed many researchers and many lawmakers don't, because this technology, as I wrote in my written testimony, is hidden behind trade secrecy. This is a corporate technology that is not open for scrutiny and auditing by external experts. I think it is notable that while 189 algorithms were reviewed in the latest report, Amazon refused to submit their Rekognition algorithm. They said they could not modify it to meet the standards, but they are a multibillion-dollar company managing other pretty incredible feats.
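The "face template" described earlier in this hearing, the data a system uses to compare two photos and which the industry witness likened to a digital fingerprint, can be sketched abstractly: a template is typically a fixed-length numeric vector (an embedding), and matching is a distance comparison against a tuned threshold. This is a generic illustration, not Apple's or any vendor's actual scheme; the vectors and the threshold value here are made up for demonstration.

```python
import math

# Hypothetical face templates: fixed-length embedding vectors produced by a
# (proprietary) model. The raw numbers by themselves reveal nothing about
# the face and are only meaningful to matching software.
template_a = [0.12, -0.48, 0.33, 0.90]
template_b = [0.10, -0.45, 0.35, 0.88]   # same person, slightly different photo
template_c = [-0.70, 0.22, -0.15, 0.41]  # a different person

def euclidean(u, v):
    """Distance between two templates; smaller means more similar."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

MATCH_THRESHOLD = 0.2  # made-up value; real systems tune this per algorithm

def is_match(u, v):
    """Declare a match when the templates are close enough."""
    return euclidean(u, v) < MATCH_THRESHOLD

print(is_match(template_a, template_b))  # True  (distances ~0.05)
print(is_match(template_a, template_c))  # False (distances ~1.28)
```

Choosing the threshold is exactly the false-positive/false-negative trade-off the witnesses discuss: loosening it catches more true matches but also admits more lookalikes.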
What we see here is that it is at the facial recognition companies' discretion what they do and don't release. Oftentimes, they release numbers not validated, or not possible to validate, by the general public. We are left in a position where we have to trust these companies, but we don't have many options to say no or to scrutinize the claims they make. Chairwoman Maloney: Thank you. The gentleman from Louisiana is now recognized for questions. Thank you. I would like to ask unanimous consent that the written testimony of Chief James Craig be entered into the record. I would also like to recognize and thank our esteemed colleague for his opening statement about freedoms and liberties, resisting and curtailing the manifestation of Big Brother, and reducing and controlling the size, scope, and powers of government. I want you to know, good sir, the party welcomes your transition. [laughter] Madam Speaker, facial recognition technology is emerging technology. Of course, it is produced by private entities; law enforcement does not produce its own technology. It is common and it is here. As the weeks and months move forward, it should be no surprise to us that the effective percentages of identification using a new technology will increase as time moves forward. And there is more coming. There is total person recognition technology coming. It measures the specific physical features of individuals: their gait, the length of their arms, etc. What we should seek is a means by which to make sure that Big Brother is not coming. I have a background in law enforcement, and recognition technology has become manifested in many ways. You have license plate readers being used from sea to shining sea. There are readers in police units that drive around and read license plates. If we keep an eye out for a particular vehicle of a particular color, that is human recognition.
If we see that vehicle, we have a license plate reader reading every plate we pass. It tells us if the plate is expired, or flags the driver's license associated with that vehicle if a person is wanted. If the guy that walks up to the building and gets in that vehicle appears to be a suspect we have identified or have a warrant for, there is going to be some interaction. This is a technology that has evolved and become manifest over the last 20 years, and it has gotten very effective. Recognition technology is completely common. We use digital records from crime scenes: pictures, the best we can get from surveillance video, surveillance cameras at the business, or whatever was available to us. We would pass these images on, and the odds are pretty good somebody would recognize that guy. This is the beginning of an investigation that helps law enforcement cultivate a person of interest for us to speak to. There are two things we stand against, and this is where the Ranking Member and I have had discussions at length. Both of us stand against running live images of free Americans, as they travel to and fro across America and at businesses, through some database such that suddenly the police show up to interview the guy. But we are already using digital images to the best of our ability to solve crimes, and every American should recognize that this is an important tool. The written statement that I asked be submitted has several examples of the use of this technology. I have three specific questions which time will not allow. We have had several hearings about it, and I appreciate the majority party's focus on it. I hope we can come together with effective legislation that both allows the technology to move forward and protects the freedoms and privacy of the American citizens we serve. I yield. Chairwoman Maloney: Thank you. I now recognize the gentleman from Massachusetts for questions.
I want to thank you and the Ranking Member for collaborating on this hearing and approaching it in the right way, I think. First of all, I want to thank the witnesses for your testimony. It is very helpful. As I understand it, and I am a little skeptical, they tell me that with the facial recognition you use on the iPhone, the way Apple says they handle this is that it stays in the phone and does not go to a server, at this point. I sort of question whether they will keep it that way in the future. I think there is probably a greater danger once they get facial recognition right: it is what happens when they have all this data out there, whether it is law enforcement or private firms. We had a massive data breach at Suprema, a big biometrics collector, with I think 27 million people in that breach. Then Customs and Border Patrol: 100,000 people that they identified, along with license plates, were breached. The concern is, once this information is collected, it is not secure. That is a major problem for all of us. I want to ask some specific questions about TikTok, which was purchased by a Chinese company. In the last 90 days, one billion people have downloaded it in the U.S. and Europe. It is owned by, I'm sorry, it is located in Beijing. Under Chinese law, a recent national security law in China, they have to cooperate with the Chinese government. We already see it happening. You don't see much about the protests in Hong Kong in the app; they are already exercising censorship on TikTok. It would have to cooperate with China. That is a national security concern for us. It is under review. The other situation is Apple and the iPhone: in our efforts because of the Pensacola shootings, we are trying to get Apple to open up the iPhone so we can get that information.
If you step back, it is sort of that we are worried about China doing what we are doing with Apple: we are trying to get access to that data just like China can get all that data from TikTok. How do we resolve the dilemma? Is there a way we can protect our citizens and others who share that data or have their identity captured, their facial recognition captured? How do we resolve that so we use it to the benefit of society? I think the bottom line really is balancing the understanding of the risks associated with the policy decisions that are made. Those policy decisions are outside of NIST's purview, but with regard to the debate on access to Apple devices and encryption, we know that in the government and broadly speaking, there are differing views on that in the security discipline. Let me ask Ms. Whittaker the same question. I think the short answer is that we do not have the answer to that question. We have not done the research needed to affirmatively answer that we can protect people's privacy in a complex geopolitical context. I think we need more of that research, and we need clear regulations that ensure these systems are safe. I think we need to unabashedly support encryption, where consumers have control over the data and third parties don't. That is the way consumers control the information and keep it out of the hands of the government. I have exhausted my time. Thank you for your courtesy. Chairwoman Maloney: Thank you so much. The gentleman from Texas is now recognized for questions. Thank you for your work on this topic. This is an extremely important topic. It is a mixture of both. In some cases, especially with federal agencies, they developed their own systems over time, but increasingly it is commercial solutions. What has been the industry response to the NIST report? From my perspective, the industry has been involved from the outset. They have been very supportive of the efforts we have undertaken over the last 20 years.
So it's generally a very positive thing. The industry feels challenged to do better. I think it depends on the industry: those who were participating will evaluate it. The testing excludes Amazon, because Amazon is a cloud-based system, and Apple, because they are an infrared system; we need to include those as well. Mr. Castro, Mr. Parker, you both mention it has been improving dramatically year by year. Would you say we are weeks, months, years, decades away from getting this technology to the next step? If you look at the best performing algorithms right now, they are at the level of accuracy we would want. There are error rates of .01 percent. That is incredibly small. When we are talking about orders of magnitude between error rates, if you have something 10 times worse, that is still a .1 percent error rate. At that rate, that is one out of 10,000 versus one out of a thousand; these are very small numbers. As Mr. Castro said, we are reaching that point now. There are some reasons why the industry has really focused on false-negative type error rates and reducing those over time. That rate is down to extremely low levels now; it is 20 times better now than it was five years ago. As a result of the demographic effects report, we are now looking at some of the false-positive rates and trying to make those more uniform: homogeneous rates that are mostly the same across different demographic groups. There is important context to consider these numbers in. One was mentioned already: the total relative scale, 100 times .01 percent. In some cases, it matters more than others. With law enforcement investigations, and this report addresses false positives, they are looking at a set number of candidates that meet a criterion. In the case of New York City, they actually looked through hundreds of photos that were potential matches. There is that human element; the technology functions as a tool to enhance their job. It is up to a human to decide if there is an actual match.
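The relative-scale arithmetic in this exchange, a .01 percent error rate versus an algorithm "10 times worse," can be made concrete with a quick calculation. The rates come from the testimony above; the one-million-comparison screening volume is a hypothetical example for illustration.

```python
# Illustrative arithmetic for the error rates discussed in the testimony:
# a 0.01% false-positive rate versus an algorithm "10 times worse" (0.1%).

def expected_false_positives(error_rate: float, comparisons: int) -> float:
    """Expected number of false positives at a given rate and volume."""
    return error_rate * comparisons

base_rate = 0.0001           # 0.01%, cited for the best-performing algorithms
worse_rate = base_rate * 10  # ten times worse is still only 0.1%

# One error in how many comparisons?
print(round(1 / base_rate))   # one out of 10,000
print(round(1 / worse_rate))  # one out of 1,000

# At scale, though, the absolute difference grows: across a million
# comparisons the 10x-worse rate yields roughly 900 additional false matches.
print(round(expected_false_positives(base_rate, 1_000_000)))
print(round(expected_false_positives(worse_rate, 1_000_000)))
```

Both rates look tiny in isolation; the point the hearing keeps circling is that at deployment scale even a "small" multiplier produces a meaningfully larger pool of misidentified people for a human reviewer to sort through.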
With false negatives, you want to make sure you're not missing anyone in your dataset. How do we get this right, from our perspective? Sometimes we step in as the federal government to fix the problem and end up creating an environment that prohibits the technological advancements or the natural market forces that work to get us to that solution. Sometimes it makes us take a step back. What is the right approach? Facial recognition is just one of many advanced technologies. It is important to recognize that the issues we have are not to do with the technology; they have to do with how it is used. I think we need to focus on the concerns we have and tailor restrictions to them. That is a more sensible approach. We have seen a proposal in the Senate that would do something like that. I yield back. Chairwoman Maloney: I now recognize Ms. Kelly for questions. Thank you for holding this hearing. We talked previously about bias in facial recognition and artificial intelligence generally, but the part three report on demographics provides useful data on the development of facial recognition programs. I have raised concerns about bias and unfair algorithms and the dangers of allowing these biases to perpetuate. The findings of the part three report are troubling, but not particularly surprising: women and individuals of African and Asian descent had higher false-positive rates. In your testimony, I was hoping you could clarify the statement that policymakers and the public should not take facial recognition as always accurate or always error-prone. We should be pushing to have these technologies get as close to always accurate as possible. Why should we not strive to think about this technology as always accurate, and how long will we have to wait for this technology to reach close to always accurate for all demographic groups? Thank you for the question. I don't know how long it will be. I can't predict the future.
The statement refers to the fact that among the characteristics you have to include in any discussion, you have to know the algorithm you're using. As my testimony stated, whether there is substantial bias or a demographic effect varies across the different algorithms, with the most accurate in the one-to-many category showing the least. You have to know the algorithm you're using and the context. Automatically identifying your aunt in a family photo is one thing; compare that to the identification of a suspect, where there are some very serious concerns about ensuring you get that right. You have to know the context in which you are using the algorithm, the algorithm you're using, and the overall system. We test the mathematical algorithms; we don't have the capacity to test systems that are deployed in the field, and those have implications as well. Can you discuss the benefit of auditing facial recognition systems for bias? From our perspective, whether it is policymakers or government entities or private sector entities that want to use facial recognition, the most important thing is to understand and have accurate, unbiased data so that appropriate decisions are made with regard to whether to regulate or not, and what kinds of regulations might be needed in what context. If you are in a procurement situation procuring a system, you want to know the performance of the system and the algorithm it depends on. Those are the things that we think are appropriate from an auditing capability perspective. We don't view the testing we do as an audit so much as providing policymakers, government, and the private sector with information. I know you talked a little bit about auditing; I would like you to answer as well. I think auditing is absolutely important, but we need to understand how we are measuring these systems. In my written testimony, I give an example of one of the most famous facial recognition measurement systems.
It is a data set we measure these systems against: Labeled Faces in the Wild. In short, it features photos of mainly men and mainly white people. The way the industry assessed accuracy was the ability to recognize white men. That gives us a sense of why we are seeing these pervasive biases across these systems. If those standards don't ask questions about what data will be used in the system, what the deployment environment will be, and how the systems will be used, then they miss the concerns at issue here. I want to give the witness a chance before my time runs out. It is critical. The standards being used matter. One of the regulatory options is to have requirements that say government uses have to be evaluated, or have been ranked, by some external objective test that has clear transparency into what the standards were, how it was measured, and how it was done. Thank you. I yield back. Chairwoman Maloney: The gentleman from Texas is now recognized for questions. Facial recognition is extremely important and viable for our government, for agencies like Border Patrol and law enforcement. At the same time, there is also no question that this technology allows for any individual to be identified in public spaces, be it through the private sector or government entities. Therein lies the potential problem and grave concern for many people: both the private sector and government should bear responsibility for individual privacy and data security. I am not exactly sure where this question is best directed, so any of you, jump in here. Let's start with the private sector, companies that are using facial recognition technology. How are they addressing this issue, the whole question of privacy? In other words, within the private sector, are best practices being set forth by any of the stakeholders? We have identified a number of companies that have put out principles around privacy.
Notably Amazon, Microsoft, and Google: they have all had public statements where they identify what specifically they are doing around facial recognition and how they want to protect privacy, what they are doing in terms of development of the technology, and what they are doing with developer agreements, what developers have to agree to in order to use the technologies. What are the other principles, the guidelines? Things around transparency, consent, data protection, notification, a whole set of issues. This matches the guidelines we have seen come out of other forums as well. A concern brought up is that people are being identified without their consent. So what are the safeguards? It is one thing to have policies and things written down and another thing to implement them and protect the public, to protect the individuals who have not consented to this type of technology. So how will these facial recognition products, as they are deployed, inform individuals that they are potentially being exposed without their knowledge? A number of the recommendations are around how you actually communicate to individuals under what circumstances the technology is used. Part of the source of confusion, I think, in some areas is that there are many different types of systems out there. Some are just doing facial analysis, for example if you walk by an advertising sign. Without consent? Without consent. They are just tracking the general demographics of who has seen the ad; they are not tracking anyone's identity. For that type of purpose, they are not going to be obtaining consent, but if they are going to be targeting people based on identity, they will require consent, so you have to have signed up. Let's go to the Atlanta airport, which right now is a pilot airport for some facial recognition technology. You have the busiest airport in the world, thousands of people walking around all over the place. If this technology is implemented, there is no way to get consent from everyone walking around. They have the ability to opt out.
You don't have to go through that if you're going to the international terminal. How does a person opt out? You simply say you don't want to use the self-serve kiosk, and you can go up to an agent. So you are saying that technology would be used just in the security lines? No, for boarding, screening, and check-in. In those areas, Delta has said travelers have the ability to opt out, and they allow consumers to do that. Do you know of any case where the government in particular is using this type of technology without the knowledge, without the consent, of an individual, where it actually violated the Fourth Amendment? I don't know of that; I don't think we have documentation of that. I do think that is why we need a search warrant requirement, so we know when those requests are made. I would agree. Therein lies the potential problem with all of this. We see the value of the technology, but somehow we have got to land the plane in a safe zone that does not violate people's rights. I appreciate you being here. Chairwoman Maloney: The gentlelady from Michigan is now recognized for questions. Thank you so much, Madam Chair. This year, I introduced H.R. 153 with my colleague regarding the need for the development of guidelines for the ethical development of AI, for transparency of systems and processes, and for ensuring that the technology helps to empower women and underrepresented or marginalized populations. Right now, we have the Wild West when it comes to AI. Artificial intelligence is not the only emerging technology that requires the development of ethical guidelines; the same discussion must be carried over to the use of facial recognition. A member earlier introduced a statement from the Detroit Police Department. I represent a majority-minority district in the city of Detroit; it's one of my cities. Approximately 67 percent of my constituents are minorities, meaning the vast majority of my constituents have a higher likelihood of being misidentified by a system that was intended to increase security and reduce crime.
Last month, NIST released the study testing facial recognition, part three, evaluating facial recognition algorithms provided by the industry to assess accuracy across demographic groups. The report showed there are higher rates of inaccuracy for minorities compared to Caucasians. When algorithms are developed and trained through a biased process, they are going to give you a biased result. What can we do? First of all, there should not be any American citizen who is under surveillance without it being posted and identified, with a place to contact the company to ask, what are you using my image for? We in America have the right to know if we are under surveillance and what you are doing with it. Another thing: any release of data that you gather should be required to go through some type of process for the release of that data. You can't just put up a camera, gather information, and then sell it. We had the conversation about the Ring doorbell. It is helping to catch criminals, but if you are going to give the information from Ring to the local police department, there should be some formal process of informing the public so they know that is happening. I am also concerned about the movement of this technology. Some places have just said, we are not going to use it, but we know this technology is here and moving forward. Instead of just saying don't use it, we need to, as Congress, be very proactive in setting ethical standards, and set the expectation for our public that if my image is being used, I know about it and I have a right to ask: what are my rights? That is something I feel strongly about. My question: with so many variations of accuracy in the algorithms, what can we do that will ensure we take out these biases? Where the algorithms have not been validated, what can we do as a Congress to ensure we are stopping this? Thank you for the question.
I think when we talk about this industry racing forward, we have had an industry that has raced forward selling these technologies, marketing these technologies, making claims to accuracy that end up not being totally accurate for everyone. We have not seen validation race forward. We have not built public understanding and new mechanisms for real consent. I think we need to pause the technology and let the rest of it catch up, so that we don't allow corporate interests and corporate technology to race ahead and be built into our core infrastructure without having put the safeguards in place. The police chief in Detroit, in the statement submitted for the record, made a promise that there will never be a trial in court based solely on facial recognition. There should be something that does not allow for someone to be prosecuted based solely on it, given the fact that we know this data is not accurate and it has biases. That is something I think we as a Congress should do. Thank you. My time is expired. Chairwoman Maloney: Thank you. You raise a lot of very good points. The gentleman from Ohio is now recognized for questions. Facial recognition is wrong sometimes, isn't it? And it is disproportionately wrong for people of color, is that right? And this all happens in a country, the U.S., where we now have close to 50 million surveillance or security cameras across the nation. Is that right? You can say yes. I think a number of witnesses talked earlier about context. The context of opening your phone is different than your apartment complex having a camera there. But it seems to me the real context concern is what happens with the government and how the government may use the technology. We know facial recognition was used by Baltimore police to monitor protesters after the death of Freddie Gray a few years ago in the city of Baltimore, which is scary in and of itself. Then of course there were the five bullet points.
I appreciate what you are doing with the institute that you co-founded, but point number five, that facial recognition poses an existential threat to democracy and liberty, is my main concern: how government may use this to harm my First Amendment and Fourth Amendment liberties. You have to think about context even in a broader sense. We have to evaluate it in light of what we have seen the federal government do in just the last several years. Do you know how many times the FBI misled the court in the summer of 2016 when they sought a warrant to spy on an American citizen? I don't remember the exact number. Seventeen times they misled a court with no advocate looking out for the rights of the citizens who were going to lose their liberty. 17 times they misled the court. Then we found out it was worse. They spied on four Americans having to do with the campaign. It is not just how facial recognition could be used by the government; we already know how it has been. It was used in Baltimore to do surveillance on protesters. The FBI went after four American citizens associated with a presidential campaign and misled the court in the initial application 17 times. Of course, that is after what happened a decade ago. A decade ago, the IRS targeted people for their political beliefs. There was no facial recognition; they just did it. They asked them questions like: do you pray at your meetings, who is your guest at your meetings? When we talk about why we are nervous about this, context is critical. The context that is most critical and most concerning to Republicans and Democrats on this committee, and frankly all kinds of people around the country taking time to look into this a little bit, is how the government will use it and potentially violate their most basic liberties. That is what we are out to get at. You said in your testimony, bullet point number five, it is time to halt the use of facial recognition in sensitive social and political contexts. Can you elaborate on that?
Are you looking for a flat-out moratorium on government use, expanding it, stopping it? What would you recommend? Thank you for that question and the statement. I would recommend that. I would also recommend that the communities on whom this is going to be used have a say on when it is halted and how, and that is the point. Are they comfortable with the use? Do they understand the potential harms to themselves and their communities? Have they been given the information they need to decide that? Are you talking about any private sector context? The reference would be an apartment complex where you can enter with your face versus a key or something. Or are you talking about something else? Elaborate on that. Absolutely. I am talking about both. Baltimore PD was using private sector technology. They were scanning Instagram photos through a service that gave them feeds from the protests. They were matching the photos against their facial recognition algorithms to identify people with warrants whom they could potentially harass. There is an interlocking relationship between the private sector, who are essentially the only ones with the resources to build and maintain these systems at scale, and the government use of these systems. There are two levels of obscurity: there is the law enforcement and military exemption, where we do not get information about the use of these technologies, and then there is corporate secrecy. These stack to create total obscurity for the people who are bearing the costs of these rights-violating technologies. Thank you. My time is expired. Chairwoman Maloney: Thank you. The gentleman from California is now recognized for questions. First, every time I listen to a discussion on facial recognition, more and more questions emerge. It is amazing. I would like to thank my colleagues on both sides of the aisle. I know folks think Democrats don't care about freedoms and liberties, but we do. Not only in the public space, but in the bedroom and in one's body.
This is the way I approach this issue, from a very personal perspective. I have made my concerns about this technology pretty clear: the danger it poses to communities of color when used by law enforcement, and racial bias in AI. As I was looking into it, Amazon continued to come up, because they are one of the most aggressive marketers of this new technology, and they do it under a shroud of secrecy. I want to be clear: I know this technology is not going anywhere. It is hard to put limits on technology, especially using the law. I have seen this time and time again coming from California. If they can move quickly, they will outrun the government in putting any kind of limitations in place. We will react and we will start putting some limitations on it. I know that it is tough, but there are a lot of questions. One of the things I have been trying to figure out: what companies, what agencies, what federal authorities are using it? How are they using it? Who sold it to them? And is there any evaluator who has evaluated its accuracy? When this technology makes a mistake, the consequences can be severe. In identification applications such as visa or passport fraud detection or surveillance, a false positive match to another individual could lead to a false accusation, detention, or deportation. Facial recognition technology not only makes mistakes, but the mistakes are more likely to occur when the individual identified is a racial minority. Is that correct? For most algorithms we tested, that is correct. Did your study find that these disparities were limited to a few developers or was it more widespread? It was mostly widespread, but there were some developers whose accuracy was sufficiently high that the demographic effects were minimal. I don't know if Ms. Whittaker answered this question, but has Amazon ever submitted their technology for review?
They have not, but we have had ongoing discussions with them about how we can come to a point where they are submitting the algorithm. It is an active situation. How long has it been ongoing? Some months at least. This is in the context of them trying to put out a blog post regarding their principles. It was in response to a letter that myself and Senator Markey sent to them. You would think that it would be more than just a blog post. You would think it would be something more serious that rises to the level of our concerns. I want to ask each of you: can you each discuss the implications of the newly released report on the use of facial recognition software? What are the potential harms of using biased systems? I think the benefit of the report is that it discloses the bias that is present in many of the algorithms being used and gives consumers and businesses good information on which to make their choices. Let me just make the point that even though a large number of algorithms were tested, those are not equally spread across the market as far as representing market share. The vast majority of the market right now, at the high end particularly, as well as high-end commercial uses like the NFL, sports stadiums, venues, amusement parks, things like that, already overwhelmingly employ the algorithms at the top end. It is not an evenly distributed problem, and that is part of the problem. That will be my end, but I will let you answer. Thank you. Absolutely, I think it is important to emphasize that accurate facial recognition can also be harmful. Bias is one set of problems, but this goes beyond that. In any place where facial recognition is being used with social consequences, we will see harm from these racially and gender-biased disparate impacts. I think we can look at the case of Willie Lynch in Florida, who was identified solely based on a low-confidence facial recognition match of a cell phone photo taken by an officer.
He is now serving eight years based on that photo and was denied access to that evidence during his trial. We are seeing high-stakes uses that really compromise life and liberty here from the use of these biased algorithms. In response to the question of where are they being used, and which algorithms are being used: we do not have public documentation of that information. We do not have a way to audit that. We do not have a way to audit whether results in the laboratory represent the performance in different contexts, like amusement parks, or stadiums, or wherever else. There is a big gap in the auditing standards, although the audits we have right now have shown extremely concerning results. With that, I yield back, Madam Chair. Thank you. The gentlelady from West Virginia, Mrs. Miller, is now recognized for questions. Thank you. As technology evolves, it is important that we are on top of it. I saw firsthand how they were using facial recognition when I was in China, as a form of payment. I was also exposed to several concerning uses of facial recognition technology. As a committee, it is our responsibility to make sure that anything that is done in the United States is done thoughtfully and prioritizes safety and individual security. Mr. Parker, when I am at a busy airport, I am really glad that I have Clear to get through. Even though we have TSA, when you're in a hurry, it is nice that you can use facial recognition and go forward. Can you elaborate on some examples of beneficial uses for consumers and businesses? Sure. I will stick to the private sector uses, but also security- and safety-related ones. One really important one is protecting people against identity theft and fraud. Here's how that works. Someone walks into a bank and asks to open a line of credit using a fake driver's license with the customer's real information. As part of the process, the bank tells them they have to have their photo taken.
A comparison is made and they determine it may not be the person they say they are. They say, I will have to talk to my management. By that time, the person who was going to commit fraud is probably long gone, right? That is a pretty useful case for the technology that people don't think about. For our industry, facial recognition is also able to provide additional security for facility access control. It is typically used to augment other credentials, such as keys or cards, which can be shared, stolen, or lost. These systems provide additional convenience for registered users, for entry into an office building for commercial offices during rush times. Another example is the technology being used to reduce organized retail crime and theft, which has skyrocketed in recent years. Do you think that the mainstream media outlets have given an honest portrayal of how this technology is utilized and the reality of its capabilities? I am sorry, I don't think so. This is a complex issue we are talking about here. I think it tends to get oversimplified and mischaracterized. If we go back to what I said earlier, I think what is causing some concern is how the technology is used. It is not the technology itself. There are other technologies that could be used in similar ways. You then think more constructively about what the rules should be for the use of different types of technology. I have a very good friend in West Virginia by the name of Chuck Romine. If we scanned both of you, you would not look anything alike. During a committee hearing on July 11, in your testimony, you discussed accuracy rates across multiple demographics and how inaccurate results are diminishing. Now that you have published the report, is that still accurate? In what other areas is this technology improving? I hope my statement in July was that the most accurate algorithms are exhibiting diminishing demographic effects.
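The demographic differentials discussed in this exchange can be illustrated with a small sketch. This is not NIST's actual methodology or code; the similarity scores, group names, and threshold below are all invented for illustration. It computes a false match rate (FMR), the fraction of different-person comparisons wrongly accepted, for each group at one shared decision threshold:

```python
# Hypothetical illustration of measuring a demographic differential:
# the false match rate (FMR) per group at a single fixed threshold.
# All scores, group names, and the threshold are invented.

def false_match_rate(impostor_scores, threshold):
    """Fraction of impostor (different-person) comparisons that the
    system would wrongly accept at the given threshold."""
    if not impostor_scores:
        return 0.0
    false_matches = sum(1 for s in impostor_scores if s >= threshold)
    return false_matches / len(impostor_scores)

# Invented similarity scores for different-person pairs, by demographic group.
impostor_scores_by_group = {
    "group_a": [0.12, 0.33, 0.41, 0.55, 0.72, 0.18, 0.29, 0.61, 0.44, 0.09],
    "group_b": [0.52, 0.63, 0.71, 0.35, 0.82, 0.58, 0.49, 0.91, 0.64, 0.27],
}

THRESHOLD = 0.60  # accept a match when similarity >= 0.60

for group, scores in impostor_scores_by_group.items():
    print(f"{group}: FMR = {false_match_rate(scores, THRESHOLD):.2f}")
```

A demographic differential shows up as a higher FMR for one group at the same threshold, which is the kind of effect the NIST report quantified across algorithms.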
We certainly do believe that the report we released just last month confirms that. I also stated that anytime the overall performance of the system improves, the effects on different demographics decrease as well. Is that still something that is true to this day? That is correct. Good. Knowing that accuracy rates have improved, can you further explain the role of performance rates and why they are important for the end users of these technologies? Absolutely. It is essential that in the selection of a system, you understand the algorithm that the system uses, and select for an accuracy that is sufficiently robust to provide you the minimized risk for the application. In some cases, the application may have very limited risk and the algorithm may not be as important. In some cases, the risk may be severe, such as access to critical infrastructure. If there is face recognition being used for that, you want to have an algorithm basis for your system that is high-performing. Can you speak to whether you are researching techniques that exist to mitigate performance differences among the demographics, and what emerging research and standards you are interested in supporting? Sure. One of the things that we do want to do is point policymakers and consumers to ways in which these things can be mitigated. One of the mitigations can be the determination of an appropriate threshold: for any algorithm that is used, you set an appropriate threshold. Having a fingerprint or an iris scan or some other type of biometric involved would help to reduce error substantially more. Thank you. Ms. Pressley is recognized for questions. The use of facial recognition technology continues to grow at a breathtaking pace and has now permeated nearly every aspect of our daily lives.
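Dr. Romine's point above, that adding a second biometric factor such as a fingerprint or iris scan reduces error substantially, follows from simple probability when the two factors' errors are independent and both must match. The rates below are hypothetical, not measured values from any real system:

```python
# Rough sketch (not NIST methodology) of multi-factor biometric fusion:
# if BOTH factors must match and their errors are independent, the
# combined false accept rate is roughly the product of the individual
# rates. All numbers are illustrative only.

def combined_false_accept(fmr_face, fmr_second):
    """False accept rate under an AND rule with independent errors."""
    return fmr_face * fmr_second

face_fmr = 1e-3   # hypothetical: face alone wrongly accepts 1 in 1,000
iris_fmr = 1e-4   # hypothetical second factor: 1 in 10,000

both = combined_false_accept(face_fmr, iris_fmr)
print(f"face alone: about 1 in {1 / face_fmr:,.0f}")
print(f"face + second factor: about 1 in {1 / both:,.0f}")
```

The independence assumption is the key caveat: correlated failure modes (for example, poor image quality affecting both sensors) make the real combined rate worse than the product.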
Many families are unaware that their faces are being mined when they walk through the mall, even as they drop their children off at school. In response, some municipalities, including within the Massachusetts seventh congressional district, which I represent, have stepped up to the plate to protect their residents from this technology. We know that the logical end of surveillance is often over-policing and the criminalization of vulnerable and marginalized communities. That is also why I worked with my colleagues on legislation to protect those living in public housing from this technology. More recently, school districts have begun to deploy facial analytics in schools. How widespread is the use of this technology on children in schools? We are seeing facial recognition systems being implemented more and more in schools. I think the actual number is still very small in terms of the percentage penetration of the number of schools in this country, but it is certainly growing. There is really just no good justification for a facial recognition system in a K-12 school. They are mainly being used in security applications. They do not adequately address these issues in any meaningful way, and it is not the best use of funds or the best way to heighten security around schools in response to these threats. The other part of your question concerns facial characterization programs, which I think are being used more in an educational context. What is the response rate of students to certain teachers or types of teaching, and things like that? That is based on very questionable data at this point, and I think it definitely qualifies in the not-ready-for-prime-time category, in the sense that we are seeing it very quickly applied in many use cases where the science and research is not there to back it up.
It is particularly concerning when you are talking about children in schools, not only because they are essentially a captive population, but because the labels or decisions that might be made about those children based on that data might be very difficult to later challenge or in any way reduce the effects on that particular child. There are also security and privacy concerns. Your study found that the error rate of facial analytics software actually increased when identifying children. Is that correct? For most algorithms, that is correct. Why was that? We don't know the cause and effect exactly. There is speculation that because children's faces have less life experience, they are less feature-rich. We don't know that for sure, because with the neural networks used, it is difficult to make a determination of the reason. Got it. Many of you have mentioned ways in which these image databases can be vulnerable to hacking or manipulation. When children's images are stored in databases, are there security concerns that may arise? Security for minors is always a concern. This technology is clearly biased, inaccurate, and more dangerous when used in schools where black and brown students are over-policed and disciplined compared to their peers for the same minor infractions. In Massachusetts, black girls are six times more likely to be suspended from school for the same infractions as their white peers. They don't need facial recognition technology that can misidentify them. Last fall, I introduced an act which would urge schools to abandon over-policing and surveillance and to instead invest resources in access to counselors and mental health professionals, resources that will really keep kids safe. In my home state of Massachusetts, a broad coalition of educators, civil rights and children's rights activists are leading the fight in saying no to the deployment of facial recognition technology in our schools. I am grateful for their activism and solidarity on this issue.
I would like to include for the record a letter on the desk from the NAACP, the ACLU of Massachusetts, and many others urging our state to reject this surveillance and policing in our schools. Thank you. And I yield. Thank you, and the gentleman from North Dakota, Mr. Armstrong, is now recognized for questions. Thank you. I think there are a couple of things that we should talk about for a second, because I think they are important. I am going to go to the Fourth Amendment and the criminal context and how this can be deployed there. This is not the first time we have seen a crisis in the Fourth Amendment happen. It happened with telephoto lenses, distance microphones, GPS trackers, drones, and now we are at facial recognition. To be fair, the Fourth Amendment has survived pretty well. Biometric information has a different connotation, which we will get to in a second. I agree with Ranking Member Jordan that we should not let the courts decide; the courts will take a constitutional view of privacy. With these types of issues, I will be the first to admit that my facial recognition did not work on my phone over Christmas. You know what I did? I drove immediately to the cell phone store and got a new one. The Carpenter case is a pretty good example of how at least the U.S. Supreme Court is willing to change how they view privacy in the digital age. Part of our job as Congress is to ensure that we write a law and write regulations that ensure we can maintain those types of privacy standards. One of the reasons biometrics are a little different is because there is one unique thing in a criminal case that is really relevant to facial recognition, and that is that identity cannot be suppressed. I can suppress marijuana, a gun, a dead body, but I cannot suppress identity. We need to apply a statutory exclusionary rule; otherwise, any regulations we pass do not truly matter in a courtroom. We have to figure out a way to have meaningful human review in these cases.
There has to be an underlying offense or a crime. It is really important. I think it is important to recognize that not all populations are the same. There is a big difference between using facial recognition in a prison setting, and even quite frankly in a TSA or border setting, than there is for a law enforcement officer walking down the street with a body camera. We have to continue to have those conversations. I also want to point out that one of the things we have to do when we are dealing with these types of things in the law enforcement scenario, and I don't care what the law enforcement use is, is figure out a way to account for false positives. The reason I say that is, in North Dakota, we have highway patrolmen who have drug dogs. Not all of them, but some of them. If you are speeding down the highway going 75 in a 55 and get pulled over, and that highway patrolman happens to have a drug dog in his car, and he walks that dog around your car and that dog alerts, and they search your car and don't find any drugs, and they let you leave, that data never shows up in that dog's training records. It never shows up. When you are talking about the accuracy of a drug dog, when you're talking about the accuracy of finding a missing girl or any of those issues, we cannot wait until that situation arises. If there is a missing girl on the Mall out here, I will be the first one standing at the top of the Capitol steps saying, use whatever technology, grab everybody you can, let's find this little girl. I agree with that. But you cannot have meaningful regulation unless you have meaningful enforcement. One of the concerns I have when deploying this technology in a law enforcement setting is that it is very difficult, by the nature of how that works, to deal with those false positives. My question, when we are talking about the missing girl, is: how many people were stopped?
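The false-positive accounting Mr. Armstrong is asking for ("how many people were stopped?") can be made concrete with a simple base-rate calculation. Every number below is hypothetical, chosen only to show how a seemingly accurate system, scanned across a large crowd, can generate far more false alerts than true ones:

```python
# Base-rate sketch of the false-positive problem in crowd screening.
# All rates and counts are hypothetical.

def screening_outcomes(crowd_size, num_targets, true_positive_rate,
                       false_positive_rate):
    """Expected true and false alerts when scanning a crowd for a
    small watchlist of targets."""
    innocents = crowd_size - num_targets
    true_alerts = num_targets * true_positive_rate
    false_alerts = innocents * false_positive_rate
    return true_alerts, false_alerts

# 100,000 people pass a camera; 10 are on the watchlist; the system
# catches 99% of targets but also flags 1% of everyone else.
true_alerts, false_alerts = screening_outcomes(100_000, 10, 0.99, 0.01)
print(f"expected true alerts:  {true_alerts:.1f}")   # ~9.9
print(f"expected false alerts: {false_alerts:.1f}")  # ~999.9
share = true_alerts / (true_alerts + false_alerts)
print(f"share of alerts that are correct: {share:.1%}")
```

With these invented numbers, roughly 100 innocent people are flagged for every real target found, which is exactly why he argues the stops that find nothing must be recorded.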
We have to be able to have a situation in place where we can hold people accountable. The only way I can think of to do that is to use it in controlled populations and perfect it. The problem with a prison population is you have a static population. I think when we move forward with this, particularly in a law enforcement and criminal setting, we have to recognize the fact that you cannot suppress identity. It is a lot different than other technologies. If your number is 90 and you stop somebody at 60, and it still happens to be that person, under the current criminal framework, I can make that motion, and the judge will rule in my favor and say too bad, still arrested. With that, I yield back. The gentleman from Michigan is now recognized for questions. Thank you, Madam Chair. I think many of you probably already know I am particularly disturbed by the aspect of facial recognition technology being used by landlords and property owners to monitor tenants, especially in public housing units. In Detroit, for example, the city's public housing authority recently installed security cameras on these public housing units. That is something we believe will encroach on people's privacy and civil liberties. These are people's homes. I don't think being poor or being working-class means somehow you deserve less civil liberties or less privacy. What are the privacy concerns associated with enabling facial recognition software to monitor public housing units? If you live in a low-income community, are your civil liberties or privacy lessened? Thank you for the question. Of course not; at least, hopefully not. What is the problem they are trying to solve by putting this into a housing complex, any housing complex? What is the landlord or owner trying to gain: is it convenience, some level of convenience, some level of security? What is he trying to gain from it? With that in mind, what are the risks to the occupants?
In my opinion, that would be a commercial use. Even if it was installed, it would be only for those residents who chose to opt in and enroll and use it as their way in and out of the building. Those who opt out should not be included in it. From a civil liberties point of view, if this was being used in some way, the other laws about disparate impact or protected classes do not go out the window just because you use a new technology. They still need to be applied. These technologies raise challenging questions. These new technologies, they are for-profit, right? Yes, the companies developing them. They are for-profit and testing these technologies on people. I heard my good colleague from Massachusetts talk about them installing it in schools. They are using this, and I have a police chief that says this is magically going to make crime disappear. If you look, my residents don't feel less safe. They actually don't like Project Green Light. It takes away people's human dignity when you are being policed and surveilled in that way. Now, they are trying to say we are going to use facial technology as, what do they call it, key fobs. They now want to control access to people's homes using facial recognition technology in place of key fobs. One of the consequences of that is misidentification. We talked about how the gentleman could not even activate his phone. I am really worried that my people, my residents, are being used as a testing ground for this type of technology. Do you have any comments in regards to that? The algorithm testing that we do is to provide information to people who will make determinations of what is and what is not an appropriate use. That includes this committee, any potential regulation or lack of regulation, and any deployment that is made in the private sector or otherwise.
I am really proud to be co-leading, with Congresswomen Pressley and Clarke, the No Biometric Barriers to Housing Act, which would ban facial recognition technology in federally funded housing buildings and properties. We should be very careful. I think Congressman Mark Meadows is right. I hear my colleagues on both sides saying we have got to fix the algorithms. We should not be in the business of fixing for-profit technology industries. They call them tools. They give them all these great names, but they are processes put in place of human contact, of police officers on the street. I increasingly talk about this with the police chief and others, and all they can say is, well, we did this and we were able to do that. Like my colleague said, how many people did you have to go through? I watched while they matched a suspect against 170-plus people. I watched as they took a male suspect and matched him with a female. I watched the kind of misleading of the public that says, well, you must not care about victims. I actually do care about victims. How about the victims you are misidentifying? Thank you, and I really do appreciate all of your leadership on this. Thank you so much, Chairwoman, for doing yet a third hearing on this and continuing this critical issue that I know was important to Chairman Cummings. Thank you very much. The gentleman from Kentucky is now recognized for questions. Thank you. I ask that you bear with me. I am battling laryngitis, and laryngitis with a bad accent does not spell success. There is bipartisan concern today for facial recognition technology as we move forward. My question is for the doctor, with respect to the National Institute of Standards and Technology's testing. What is its role in establishing government-wide policy? The only role that we have with respect to government-wide policy is providing the scientific underpinning to make sound decisions.
As a neutral, unbiased agency, we are able to conduct the testing and provide scientific data that can be used by policymakers to make policy. How does a technical standard differ from a policy standard? Certainly, technical standards can be used by policymakers. In this case, a determination of policy predicated on the verification of algorithms based on their performance characteristics would be one example of that. From a policy perspective of what to do or what not to do with face recognition technology, that is something we would support with scientific data, but not with policy proclamations. Let me ask you this. Is this the right agency to develop government-wide policy? I don't think so, sir. What is your agency's role in developing accuracy standards for facial recognition technology? Our role is in evaluating the accuracy and, in particular, determining the appropriate measurements to make. These measurements did not exist. We worked with the community to develop a set of technical standards for not just the measurement itself, but how to measure these things, including the reporting of false positives and false negatives and the very detailed definition of what those constitute. Thank you. Mr. Parker, I understand that the security industry supports the U.S. Chamber of Commerce's recently released facial recognition policy principles. What are the principles and why do you support them? Thank you for the question. I think the chamber put a lot of really great work into developing this framework. It mirrors some of the work that was done earlier, when a multi-stakeholder process was convened that included other parties from the commercial sector about what appropriate commercial use looks like. Some of the principles have to do with transparency, obviously the main one. As we were discussing earlier, what should be done as far as consent? I think that will cover most cases. Can you describe those principles?
And how do those principles balance the need for civil liberty while also promoting the need for industry innovation? We are primarily talking about data privacy. It is different from the civil liberties concerns surrounding government use, primarily. Let me follow up. What does the path ahead look like for these principles? I think that the debate going on in Congress right now about establishing a national framework for data privacy is a really important one. How to set rules for use of the technology in the commercial setting fits within that framework. You know, we have had the GDPR in Europe. The United States is establishing its own framework. It could be a real problem for our economy if we don't establish standardized rules. The gentleman from Virginia is now recognized for questions. I thank the chair, and thank you all so much. We will have to really grapple with: what are the parameters of protecting privacy and controlling the use of this technology? One of the traps I hope we on my side of the aisle particularly don't fall into is continuously citing the false IDs. The nature of technology is that it will improve. What happens when it becomes 95 percent accurate? Then what? I would certainly argue that, irrespective of its accuracy, there are intrinsic concerns with this technology and its use. Maybe we have to look at things like opt in and opt out, where you actually require the consent of anybody whose face is at issue to be able to be transferred to another party, whether you are government or non-government. Mr. Parker, you were talking about primarily being concerned about how government uses facial recognition technology. Any reason to believe the private sector might also generate some concerns? Sure.
That is why we need to establish best practices about how it is used, particularly in any applications where there is any kind of serious consequence for errors. Errors? Let me give you an example. IBM got one million photos from a photo-hosting site called Flickr. It sent the link to that database, one million faces, to Chinese universities. That was not the government doing it; it was a private entity. It was not about accuracy, it was about an entire data set going to a foreign adversary who has a track record of actually using this technology to suppress and persecute minorities. For example, Uighurs. We know they are doing that. Might you have any concern about a company like IBM engaging in that kind of behavior and transferring an entire data set to Chinese universities with close ties, obviously, to the Chinese government? Yes, certainly. I think it is reflected in U.S. government policy, which established a restriction on exports to a number of Chinese companies, particularly those developing this technology we are talking about. Ms. Whitaker, your views about that? I think that highlights one of the issues that trying to implement consent raises, which is that those photos are already on Flickr. Those are photos somebody may have put up on a very different internet. These data sets are being used to train systems that may be erroneous and may violate civil liberties. Where we ask for consent, and how consent could work, given that we have a 20-year history of clicking through consent notifications without reading them, as a matter of habit, to get to the core technological infrastructures of our lives, remains a big open question. I think we need to be able to answer that. I think we could agree, could we not, that whether I clicked through on Flickr or any other entity I allowed to have access to it, I never contemplated having that photo transferred to a foreign government or to a university with close ties to a foreign government.
Or to have a corporation use it to train a system that they sell to law enforcement in ways that target our community. There are a lot of things that we did not consent to. It seems to me, this being the third hearing where we have all expressed concern, that we have got some work to do in figuring out the rules of engagement here and how we protect the fundamental privacy rights of citizens, unless we want to go down the road of expanding and transforming the whole definition of the zone of privacy. That is a very different debate. It seems to me that we cannot simply concede that the technology will drive the terms of reference for privacy. Thank you, madam chairman. Thank you. The gentleman from Wisconsin is now recognized for questions. Ok. Anybody else can jump in if they want, I guess. First of all, I would like to thank Mr. Connolly for his comments. The false inference is a major problem here, but getting false information is not, I think, the biggest concern. I think the biggest concern is that it becomes more and more, that it is better and better at the ill uses that it is used for. I think sometimes, the less information the government has, the better. Go ahead. Absolutely. I want to preface my answer by saying that I am an expert on artificial intelligence and I understand the tech industry very well. I am not a China expert. It is very clear that these technologies are being used in China to implement social control and the targeting of ethnic minorities. You have networks of facial recognition systems that are designed to recognize individuals as they go about their daily lives, and issue things like tickets if they jaywalk, if they are recognized by a facial recognition system. People who attend religious ceremonies in China, could it be used there? Absolutely. You are just seeing it deployed in a different context. I attended a rally last night for President Trump.
Do you think it is possible that any facial recognition technology was being used there? The capacity for it certainly exists. We are just not told when it is used and where. Would it surprise you if it is being used there? No. Ok. There is the concern I have. We have a government that has weighed in against certain people. The ranking member pointed out that the IRS in the past has shown a strong bias against conservatives. We use the power of government against conservatives. We had a major presidential candidate a while ago who said he wants to take people's guns. Would it surprise you if facial recognition technology is being used to develop a database of people going to a gun show? Facial recognition is being used to develop many different kinds of databases. Ok. That is kind of concerning. To me, that is the major concern, that our country will work its way towards China. A while back, we had a presidential candidate hostilely question a prospective judge because they were part of the Knights of Columbus, which is kind of scary. Could you see the day where we use this technology to identify people attending a Catholic church, which apparently seems to bother some people? That's the same principle as the Baltimore Police Department using it to see who attends a Freddie Gray rally and targeting them. It is already being used in that capacity. If you set up a Catholic church in China, do you think the red Chinese government would use facial recognition technology to know who was a member of that church? Identifying you in China if you show up at a Knights of Columbus meeting? Anybody else want to comment on what is going on in China? I think it is a model for extraordinary authoritarian control. Their technology is announced as state policy.
Here, this is primarily corporate technology being secretly threaded through core infrastructures without that kind of acknowledgment. Is Amazon a big player here? Absolutely. They have expressed strong political opinions. They certainly hire many lobbyists. Ok. Thank you for giving me an extra few seconds. Thank you. The gentlelady from New York. Thank you, Chairwoman Maloney. Thank you again for holding a third hearing on something that is so important and is such an emerging technological issue. We have heard a lot about the risk of harm to everyday people posed by facial recognition. I think it is important for people to really understand how widespread this is. You made a very important point just now that this is a potential tool of authoritarian regimes, correct? Absolutely. That authoritarianism or concentration of power could be done by the state, as we see in China, but it also could be executed by mass corporations, as we see in the United States. Correct? Yes. Could you remind us of some of the most common ways that companies collect facial recognition data? Absolutely. Some scrape it from sites like Flickr. Some use Wikipedia. They collect it through massive network and market reach. Facebook is a good example of that. If you ever posted a photo of yourself to Facebook, that can be used? Absolutely, by Facebook and others. Could using a Snapchat or Instagram filter help hone an algorithm for facial recognition? Absolutely. Can surveillance camera footage that you do not even know is being taken of you be used for facial recognition? Yes. Cameras are currently being designed for that. People think, I am going to put on a cute filter, and do not realize that data is being collected by a corporation or the state, depending on what country you are in, in order to track you, potentially to surveil you, for the rest of your lives? Correct. What can a consumer or constituent like mine do if they have been harmed by a company's improper collection?
We were talking about how facial recognition often has the highest error rates for black and brown Americans. One of the worst implications of this is that a computer algorithm can tell a black person that they have likely committed a crime when they are innocent. How can a consumer or a constituent really have any sort of recourse against a company or an agency if they have been misidentified? Right now, there are very few ways. Litigation can be brought against a company, but one, you have to know the data was collected; two, you have to know it was misused; and three, you have to have the resources. Let's say you walk into a technology store, or, as the technology spreads, you walk into a store in the mall, and because the error rates are higher for black and brown folks, you get misidentified as a criminal. You walk out, and let's say an officer stops you and says, we think you have been accused of a crime. You have no idea that facial recognition may have been responsible for you being mistakenly accused of a crime. Is that correct? That is correct. We have evidence that it is often not disclosed, which also compounds our broken criminal justice system. The Willie Lynch case in Florida is a case in point. What we are seeing here is that these technologies are not only automating injustices, but also automating biases that compound the lack of diversity in Silicon Valley as well. Absolutely. These companies do not reflect the general population, and the decisions and choices they make are in the interest of a small few. This is some real-life Black Mirror stuff that we are seeing here. I think it is really important that everybody understand what is happening. This is happening secretly as well, correct? Yes. Thank you, and that's my time. The gentleman from Pennsylvania is now recognized for five minutes. Thank you, madam chair.
I just want to say that we all represent many people that are probably not familiar with the commercial and government use of facial recognition technology. There is a lot of technology out there. I am grateful for the witnesses being here to help shed a little bit of light on the topic of facial recognition technology. When we look at whether there is a proper approach toward regulating the use of facial recognition technology, you need to balance personal privacy with whatever appropriate use there may be as a tool to make government or law enforcement capabilities more effective in what they do. The reason I say this is, several years ago, something happened in the legal community, the so-called CSI effect, where television shows exaggerated the prevalence of DNA and forensic evidence and the ease of its processing in criminal cases. Defense attorneys then used the public's new perception of this evidence to claim the lack of enough forensic evidence meant that the police did not do their due diligence. Today, many television shows and movies reference facial recognition technology as part of their storytelling. There are a lot of concerns here. I have concerns with the Fourth Amendment and all of the rights that we have. I guess, Mr. Parker, could you just maybe explain to what extent you think current pop culture is filled with an exaggerated or distorted view of how prevalent the use is, or whether there is an appropriate use of facial recognition technology? First of all, I do think it has, I mean, if you look at the portrayal of the technology in the media, it is far beyond what we can do right now. That is one thing to consider. The other thing is what we mentioned earlier about what is happening in China. Unfortunately, their government by policy is using technology, not just this one but many others, to persecute certain groups. Obviously, that is a horrible example of how technology can be misused. I think also the capability is different.
I am not an expert on China either. To use facial recognition systems, there has to be a database with people enrolled in it. I suspect there is a large database like that over there. I can speak on behalf of our members. We have no interest in helping the government at any level here do massive surveillance of citizens engaged in lawful activity. We have no interest in that. That is not the case right now as a system, and I have not seen evidence that that is what is intended, but certainly, that is not a place we want to go. You mentioned that technology can be a great tool, and it can. Phones can keep us very well connected and do things. It can be a great hindrance and distraction, too. A lot of people now bully using social media and so on, so that can happen with anything. It is a matter of how effectively we regulate it and make sure it does not get used inappropriately. Do you think we could be looking at a possible CSI effect in terms of facial recognition being used by law enforcement? I think that is a risk. The key here is to have really locked-down and thorough use policies and constraints. I think there are many uses in both the private and public sector where that is being done correctly. There are other cases we know less about because there is less transparency. A part of that is some accountability measures that ensure use of those systems is auditable. Ok. I appreciate that, because this is a very sensitive issue, and I do appreciate the opportunity of having these hearings so that more people are aware of what's happening. Thank you, and I yield back. Thank you. I recognize the gentleman from New Mexico for questions. Thank you, Mr. Chair. Thank you so much for being here today. We appreciate your time and effort in this hearing. I recently read that some employers have begun using facial recognition technology to help decide whom to hire. At certain companies, such as Hilton and Unilever, job applicants can record videos.
An algorithm then ranks the applicant against other applicants based on a so-called employability score. Job applicants who look and sound most like the current high performers at the company receive the highest scores. I have two questions for you. Is it true that the use of facial recognition and characterization technology in job application processes may contribute to biases in hiring practices? If yes, can you please elaborate? It is absolutely true. The scenario you described so well is a scenario in which you create a biased feedback loop, in which the people already hired and promoted become the models for what a good employee looks like. If you look at the executive suite at Goldman Sachs, you see a lot of white men. If that becomes the model for what a successful worker looks like, and that is used to judge whether my face looks successful enough to get a job interview at Goldman Sachs, we are going to see a kind of confirmation bias in which people are excluded from opportunity because they happen not to look like the people who have already been hired. Thank you so much for that. Would you agree that granting higher employability scores to candidates who look and sound like high-ranking employees may lead to less diversity in hiring? I would agree. I would also say that that methodology is not backed by scientific consensus. Thank you. Do you envision any privacy concerns that may arise when employers collect and use the data generated from video job interviews? Yes. Thank you for the question. That is absolutely a concern, since the individuals may not be aware of what data is being collected, especially if some of those systems are being used in an in-person interview. The person may or may not be aware of that, or whether that is part of the decision-making process for their application. Thank you so much. I, like many of my colleagues, have expressed concern over the use of this technology.
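The biased feedback loop the witness describes can be illustrated with a deliberately simplified sketch. Everything here is hypothetical, not any vendor's actual system: the two numeric features stand in for whatever an interview-video scorer extracts.

```python
# Toy "employability scorer" that, like the systems described above, is
# trained only on past hires: it rewards resemblance to the average past
# hire, not job performance.

def centroid(rows):
    """Per-feature mean of the past hires."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def employability(candidate, past_hires):
    """Higher score = closer to the average past hire."""
    c = centroid(past_hires)
    return -sum((a - b) ** 2 for a, b in zip(candidate, c)) ** 0.5

# A homogeneous hiring history (hypothetical feature vectors)...
past_hires = [[0.9, 0.8], [1.0, 0.7], [0.8, 0.9]]

look_alike = [0.9, 0.8]   # resembles the people already hired
outsider = [0.2, 0.1]     # equally qualified, but looks and sounds different

# The look-alike always outscores the outsider, so each new hire added to
# the training data resembles the last ones: the loop confirms itself.
```

This is the confirmation bias named in the testimony: the model cannot distinguish "looks like past hires" from "will perform well," so it encodes whatever skew the hiring history already had.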
I am concerned that facial recognition technology disenfranchises individuals who do not have access to the internet or video devices. I am worried that relying on algorithms to predict high-ranking employees will only inhibit the hiring of a more diverse workforce. Your testimony today highlighted many of these risks. Commercial face recognition algorithms misidentify racial minorities and women at substantially higher rates than white males. We must develop legislation to ensure we get the best of the benefits of this technology while minimizing the risks of bias in employment decisions. I yield back. That concludes our hearing. We have no other witnesses. I am recognizing the ranking member and others on this side of the aisle for five minutes, and then we will close with five minutes. Rep. Jordan: The broad outline of what we are trying to do legislatively, sort of as a start, working with the chair and members of the majority, is really first just an assessment. I'm talking again largely about what government is doing, what the federal government is doing. So the first thing we would like to ask: we want to know which agencies are using this, how they're using it, and to what extent is it happening? And some of you testified, Ms. Whitaker, we don't know that. We don't know to what extent the FBI or the IRS or any other agency is using it. We found out a few years ago the IRS was using Stingray technology, which was like, what does the IRS need that for? So the first part of what we hope will be legislation that we can have broad support on, that the chairman and both the Republicans and Democrats can support, is: tell us what's going on now. And then second, while we're trying to figure that out, while the study and the accountability work are happening, let's not expand it. Let's just start there.
Tell us what you're doing, and don't do anything while we're trying to figure out what you're doing. And then once we get that information, we can move from there. That is what I hope we can start with, madam chair, and frankly what we've been working on now for a year, staffs for both the majority and the minority. So I hope, and I see a number of you nodding your heads, that that's someplace you all would be supportive of us going as a committee and as a Congress, just to figure out what's going on. With that I yield to my colleague from North Dakota, if that's ok, madam chair, for the remainder of our time. CSI was my favorite show when I practiced criminal defense. If this body passed a law that shut off everybody's facial recognition on their iPhones tomorrow, I think we would have a whole different kind of perspective on this from our citizens. Identifying people quickly and easily has so many positive law enforcement and safety applications that I think it would be irresponsible to disregard this technology completely. And I think the private sector as well. My intent in asking these questions is not to demonize law enforcement. They will use whatever tools are available to them, and they should. And I think we should also recognize that there are very responsible large corporations that want to get this right. And they don't want to get it right just for the bottom line, although that is helpful. They have corporate cultures as well. And more importantly, there are those of them arguing for a federal regulatory framework. Our job is to get it right. Our job is to ensure that we have responsible regulation that protects the privacy of all Americans. But part of doing that is recognizing that this technology is here. And in some way, shape or form it will continue to be here. And there are a tremendous amount of positive applications. But there are dangers. And there are significant dangers.
And for every reason why there's a positive application for identifying people quickly, there is an invasion of the privacy of everybody who is in that particular space. So we're going to work with it. We're going to continue to use it. It is providing tremendous consumer convenience. There are lots of different applications. But we have to be cognizant of the fact that this is a little different than a lot of other things. Because identity is something that can never go away once it's been identified. And the right to free association and the right to do those things is fundamental to the American population. And anything that has a chilling effect on that has to be studied very, very closely. And I agree with Mr. Jordan that we need to know how this is being used. And I also agree with Mr. Connolly: technology will advance. Human reviews will exist. Things will happen. This will get better and better all the time. I don't want any false positives. And I don't want any false positives based on race, age or gender. But my number one concern is not only those false positives. It's the actual positives. Where they're doing it and how they're doing it and why they're doing it. And we have to understand that while this technology has a tremendous benefit to a lot of people, it poses real, significant and unique dangers to the fundamental basics of First Amendment rights and Fourth Amendment rights, and we have to continue to work on it. And I should also say this isn't the first time government has been behind the eight ball on these issues. We were so far behind on online piracy, we are so far behind on data collection, data sharing and those types of issues, and one of the dangers we run into with that is that by the time we get around to dealing with some of these issues, society has come to accept them. And how the next generation views privacy in a public setting is completely different than how my generation and generations above us viewed privacy in a public setting.
And the world is evolving with technology. And this is going to be a part of it going forward. So I appreciate everybody on both sides of this issue. And I appreciate the fact that we had this hearing today. With that I yield back. I thank all of the panelists and all of my colleagues today for participating in this very important hearing. We have another member, Mr. DeSaulnier, who is on his way, and he has been misidentified. He's a member of the committee but is at another committee. He's rushing back to share his experiences with us. And I want to allow him to give the information that he has on this issue personally. But I do want to say that one of the things that came out of the hearing is that this technology really is not ready for prime time. It can be used in a positive way, but as many witnesses pointed out, and Ms. Whitaker even showed a case, allegedly, where a person was innocent yet put into jail based on false information about his identity, which certainly needs to be investigated, it can also severely impact the civil rights and liberties of individuals. At this point, I'd like to recognize my colleague from the great state of California so he can finish his questions and his statement, because he was misidentified. He was one of the 28 that the American Civil Liberties Union showed were misidentified. So I recognize my colleague now. Thank you, madam chair. I did have a constituent at a town hall say that in my case it was actually a step up from being a member of Congress to being a criminal. But, you know, I was quite offended on behalf of all of us that somebody would, well, I really want to thank the chair and the ranking member for having this meeting. It's really important, being from the Bay Area, having had a relationship with a lot of these tech companies and having that relationship strained recently.
And the benefit that this technology could give us, but the over-marketing of the benefit and the lack of social responsibility, as Mr. Gomez said. In the past, I had a privacy bill in the legislature that was killed, and it basically came from a district attorney in Northern California who told me about a serial rapist who was getting his victim information from third-party data that he was paying for, and we provided an opt-out. It was killed fairly dramatically in the first committee in the Assembly after I was able to get it out of the Senate. I tried to get Mr. Gomez to help me in those days. So in that context, if I had a dime for every time one of these companies told me, when I asked a reasonable question, that I was inhibiting innovation, I would be a wealthy person. And I appreciate the work you do. But in the context of facial recognition, what is a meaningful sort of protection? I've said this to this committee before, that Justice Brandeis famously said Americans have a right to be left alone. How are you left alone in this kind of surveillance economy? So with facial recognition it is important to get it right, in my personal experience, but also with the overlay of all the other data accumulation. So how do we get there? Ms. Leong, first of all, what's the danger in allowing companies, when we've seen Facebook absorb a 5 billion dollar penalty quite consciously, and it is referred to by some of my former friends in the tech community in the Bay Area as being led by a culture of self-righteous sociopaths, where they think that it's all right to take advantage of people. And they get reinforced by the money they make without thinking of the social consequences. Given they were willing to absorb a 5 billion dollar hit by ignoring the settlement that they agreed to, and this kind of culture, what's the danger in allowing companies like Facebook to have access to not just facial templates but the interaction with all the other data they're collecting? Thank you very much for the question.
I think that demonstrates greatly the comment that was made earlier about the interrelationship between public and private uses of this technology and how those sometimes can feed off each other in beneficial or not-so-beneficial ways. And your earlier comment about the nature of our surveillance economy is, I think, the underlying question: what is it that we want to accept and live with in our country based on our values, and then how does technology enable that? I was not asked to show my identification to come into this building today, even though in most buildings in Washington I would have to show it to go to a meeting; I would have to give my ID. But for a government building, I was checked for a physical threat with a scanner but didn't have to identify myself to come in. I hope that wouldn't change because now my face could be collected, or I could be identified off a video feed; I still have the right to come into this place of government without that. And I think that demonstrates that we need to focus on what the things are that we are protecting, discussed so clearly here today in terms of our values and freedoms and liberties, and then how we don't let the technology, because it's here, because it can do certain things, or because it's even convenient that it does certain things, impinge on those in ways that we don't think through carefully and are not ready to accept as compromises. How are Americans allowed to be left alone in this environment? What does affirmative consent look like? Well, in a commercial setting or a commercial context, companies should not be using facial recognition technology unless the person has said they want to use it for the benefit or convenience that it provides. So if I want to use it as a member of a limited-membership retail establishment, or I want to use it to get VIP privileges at a hotel, or to expedite my check-in at a conference, I can choose to do that.
But I would know that I was doing it, and I would have to enroll in that system consciously. It's not something that could happen to me without my awareness. Ok. And who owns the data when you look at this? We've had hearings where car companies say they own the diagnostics and the GPS, and all these private sectors say they own it. Shouldn't we own that? Ownership of data is a very complicated topic and way to look at it, because it isn't something that should necessarily be able to be sold, which is really the nature of property. But in terms of the rights to who gets to use it, yes, that should be very clearly spelled out. If I've agreed to a certain amount of service in return for enrolling in a facial recognition system, I have a reasonable expectation not to have that data scraped or used for some other undisclosed purposes that I'm not aware of. Thank you, madam chair, thank you for indulging my schedule. Thank you so much. I'm so glad you could get back. And just in closing, very briefly, I think this hearing showed that this is in wide-scale use; we don't even have a sense of how widely it's being used, yet there is very little transparency about how or why it's being used, and what security measures are put in place to protect the American people from that use and their own privacy concerns. We also have the dual challenge of not only encouraging and promoting innovation, but also protecting the privacy and safety of the American consumer. I was very much interested in the passion on both sides of the aisle to work on this and bring some accountability and reason to it. And I believe that legislation should be bipartisan. I firmly believe the best legislation is always bipartisan. And I hope to work in a very committed way with my colleagues on this side of the aisle and the other side of the aisle to come up with common-sense facial recognition legislation. I would now like to recognize for closing Mr.
Gomez, who was also misidentified and has personal experience with this. So thank you, and thank you very, very much to all of our panelists. Thank you, madam chair. First, I just want to thank all the panelists for being here. For all the questions that we asked, we have twice as many that we didn't even have a chance to ask. I want people to walk away understanding that this is a technology that's not going away. It's just going to get further and further integrated into our lives, through the private sector and through government. Now we have to figure out what that means. At the same time, I don't want people to think that false positives are not a big deal. Because for the people who are falsely identified as a particular person, and it changes their life, it's a big deal to them. So when people downplay it, saying it's getting better and it's not that big of a deal: to that one person who goes to jail, or one person who gets pulled over, or one person who maybe doesn't make it to work on time and loses their job, with a devastating effect on their lives, it matters to them, and it should matter to all of us. It's not one or the other. I do believe this will get better and better, and we have to put the parameters on the use of that technology. But there are still a lot of questions that we have to address. Ms. Whitaker described it correctly, because when I started looking into this issue, I did run into that brick wall of national security claims, plus the corporate sector saying, you know, it's proprietary, this information, when it comes to our technology, and we're not going to tell you what it says, how accurate it is, who we are selling it to, and who's using it. That wall must come down. And that is a goal I think we share across the political spectrum. How do we make sure that the wall comes down in a responsible way that keeps innovation going, keeps people safe, but respects their liberties and their freedom?
So with that, I yield back, madam chair. Thank you so much for this important hearing. Thank you. And I would like to thank all of our witnesses. Without objection, all members will have five legislative days within which to submit additional written questions for the witnesses to the chair, which will be forwarded to the witnesses for their response. I ask the witnesses to please respond as promptly as you can. This hearing is adjourned. And thank you.
As Hamilton put it, only the Senate, with, quote, confidence enough in its own situation, end quote, can preserve, unawed and uninfluenced, the necessary impartiality between an individual accused and the representatives of the people, his accusers. So, madam president, the House's hour is over. The Senate's time is at hand. It's time for this proud body to honor our founding purpose. Here's what Alexander Hamilton warned of in Federalist 65. He said the greatest danger is that a decision in an impeachment trial will be regulated more by the comparative strength of the parties than by the real demonstrations of innocence or guilt. Alexander Hamilton, even before the days when political parties were as strong as they are today, wanted us to come together. The leader wants to do things on his own, without any democratic input. But fortunately, we have the right to demand votes and to work as hard as we can for a fair trial. A full trial, a trial with witnesses, a trial with documents. The founders anticipated that impeachment trials would always be buffeted by the winds of politics. But they gave the power to the Senate anyway, because they believed the chamber was the only place where impartial justice of the president could truly be sought. In the coming days, these eventful and important coming days, each of us, each of us, will face a choice about whether to begin this trial in a search for the truth or in the service of the president's desire to cover up and rush things through.

Announcer: For the third time in history, a president is on trial in the U.S. Senate. Watch live Tuesday when the trial resumes at 1:00 Eastern on C-SPAN2.

C-SPAN's Washington Journal, live every day, with news and policy issues that impact you. Coming up Friday morning, Ethics and Public Policy Center's Henry Olsen discusses President Trump's Senate impeachment trial and its potential impact on campaign 2020. Then, All In Together cofounder and C.E.O.
Lauren Leader talks about women in politics and her group's campaign to train women for civic engagement. Watch C-SPAN's Washington Journal, live at 7:00 Eastern Friday morning. Join the discussion.

Announcer: Next, Democratic Legislative Campaign Committee president Jessica Post. She discusses the states the committee will be targeting in the 2020 election. She talked about the committee's "flip everything" campaign, targeting states like Pennsylvania, Arizona, and Texas.

Thanks for coming out. I'm the national press secretary, and excited to

