The committee will come to order. Good morning, everyone. Without objection, the chair is authorized to declare a recess of the committee at any time. With that, I would like to recognize myself to give an opening statement.

Today the committee is holding our third hearing this Congress on a critical issue: facial recognition technology. It is clear that, despite the private sector's expanded use of the technology, it is just not ready for prime time. During this hearing we will examine the private sector's development, use, and sale of the technology, as well as its partnerships with government entities using this technology. We learned from our first hearing, on May 22, 2019, that the use of facial recognition technology can severely impact Americans' civil rights and liberties, including the right to privacy, free speech, and equal protection under the law. We learned during our second hearing, on June 4, how federal, state, and local government entities use this technology on a wide scale, yet provide very little transparency on how and why it is being used, or on security measures to protect sensitive data.

Despite these concerns, we see facial recognition technology being used more and more in our everyday lives. The technology is being used in schools, grocery stores, airports, theme parks, and stadiums, on our phones and social media platforms, in doorbell camera footage, even in hiring decisions, and it is used by law enforcement. This technology is completely unregulated at the federal level, resulting in some questionable and even dangerous applications. In December 2019, the National Institute of Standards and Technology issued a new report finding that commercial facial recognition algorithms misidentified racial minorities, women, children, and elderly individuals at substantially higher rates. I look forward to discussing this study with Dr. Romine, the director of NIST's Information Technology Laboratory, who is joining us today.
I also look forward to hearing from our expert panel, hailing from academia, industry, and the advocacy community, on recommended actions that policymakers should take to address potential consumer harm based on these findings. Our examination of facial recognition technology is a bipartisan effort. I applaud Ranking Member Jordan's leadership and ongoing advocacy on this issue. We have a responsibility not only to encourage innovation but also to protect the privacy and safety of American consumers. That means educating our fellow members and the American people on the different uses of the technology, and on the distinction between localized, subject-specific identification and surveillance uses. That also means exploring what protections are currently in place to protect civil rights, consumer privacy, and data security and to prevent misidentifications, as well as providing recommendations for future legislation and regulation. In that vein, I would like to announce today that our committee is committed to introducing common sense facial recognition legislation in the very near future, and my hope is that we can do that in a truly bipartisan way. We have had several conversations, and I look forward to working together toward that goal. I now recognize the distinguished ranking member, Mr. Jordan, for his opening statement.

Thank you, madam chair. I appreciate your willingness to work with us on this legislation. We have a bill that we will talk about as well. Facial recognition is a powerful new technology being widely used by both government agencies and private sector companies, in a market expected to be valued at $8.9 billion by 2022. Its use is increasing among federal, state, and local government entities and private industry, with little oversight and no accountability. With this technology, the government can capture faces in public places and identify individuals, which allows the tracking of our movements, patterns, and behavior. All of this is currently happening without legislation balancing legitimate government functions with American civil liberties.
That must change. While this hearing is about government uses of facial recognition, I want to be clear that I have no desire to unduly constrain it in the private sector. I appreciate the great promise that this technology holds for making our lives better. It has improved data security, led to greater efficiency in identity verification, and helped prevent identity theft, protecting consumers. The urgent issue we must tackle is reining in the government's unchecked use of this technology when it impairs our freedoms and liberties.

Our late chairman became concerned about government use of facial recognition technology after it was used to surveil protests in his district related to the death of Freddie Gray. Residents saw this as a deep encroachment on their freedom of speech and association, and I could not agree more. This issue transcends politics. It does not matter if it is a Trump rally or a Bernie Sanders rally; the idea of American citizens being tracked and cataloged for merely showing their faces in public is deeply troubling. It is imperative that Congress understand the effects of this technology on our constitutional liberties.

The invasiveness of facial recognition technology has prompted a number of localities to ban their government agencies from buying or using facial recognition for any purpose. This threatens to create a patchwork of laws and may hinder the use of the technology. Unfortunately, this is not an issue we should leave to the courts. It presents novel questions that are best answered by congressional policymaking, which can establish a national consensus. The unique government-wide focus of this committee allows us to craft legislation to address facial recognition technology here at the federal level. We know that a number of federal agencies possess facial recognition technology and use it without guidance from Congress, despite its serious implications for our First and Fourth Amendment rights. At a bare minimum, we must understand how and when federal agencies are using this type of technology and for what purpose. Currently we do not know even this basic information.
Our committee has jurisdiction over the entire federal government's use of this technology, and we must start by pursuing policy that addresses this fundamental information gap. It is our intention as well to introduce legislation. We will work with both sides here, trying to work together on a bill that will provide transparency and accountability with respect to the federal government's purchase and use of this technology. I want to thank you, madam chairwoman, and I look forward to hearing from our witnesses today. Thank you for being here.

Before we get to the witnesses, I would like to make a unanimous consent request. I would like to insert into the record a report from the ACLU which found that Amazon's Rekognition technology misidentified 28 members of Congress, including 11 Republican members, as other individuals who had been arrested for crimes. Those misidentified included John Lewis, a national legend and national civil rights leader. So I would like to place that into the record, and I would also like to mention that three members of this committee, including Mr. Gomez and Mr. Clay, were among those misidentified, which indicates that this technology is not ready for prime time. I would now like to recognize my colleague, Mr. Gomez, who has personal experience with this issue, for an opening statement.

Thank you, madam chair. First, this is the third hearing the committee is holding on this issue, and up until two years ago it was not even on my radar, until the ACLU conducted the test which falsely matched my identity with someone who had committed a crime. Then all of a sudden my ears perked up, and I had no doubt I was misidentified because of the color of my skin more than anything else. So as I started to learn and do research on this issue, my concerns only grew. I found out that it is being used in so many different ways, not only in law enforcement at the federal and local levels; it is also being used when it comes to apartment buildings, to doorbells, to shoppers, to a variety of things. But at the same time, this technology is fundamentally flawed.
Think about who gets pulled over by the police. In certain areas it is not a big deal; in other areas it could mean death, if people think you are a violent felon. So we need to start taking this seriously. It probably does not rank in the top three issues of any voter in the United States, but as this technology continues to be used and continues to have issues, there will be more and more people who are misidentified, and more and more people questioning whether their liberties and their freedoms are being taken away through no fault of their own, just because some algorithm misidentified them as somebody who committed a crime in the past. This is something we need to raise an alarm about, and that is what these hearings are doing, in a bipartisan way, to make sure that the American public does not stumble into the dark and, all of a sudden, our freedoms are a little bit less and our liberties are a little bit less. We must start having these important discussions, in a bipartisan way, to figure out how and what the federal government can do. What can Congress do? What is our responsibility? With that, I appreciate the chair's commitment to legislation. I also appreciate the ranking member's commitment to legislation, because I know this issue is a tough one, and it can be handled in a bipartisan way. With that, I yield back.

I now recognize Mr. Meadows of North Carolina for an opening statement.

Thank you, madam chair. To both of you, thank you for your leadership on this important issue. There are a few things that I would highlight. Certainly we know Mr. Gomez, and there is certainly no background that he could be accused of being involved with, so I want to stress that his character is of the utmost standing to us on this side of the aisle. I say that because one of the issues we need to focus on, and this is very important to me, is where conservatives and progressives come together: on defending our civil liberties and the right to privacy. I agree with the chairwoman and the ranking member and Mr. Gomez, and others with whom we have had conversations about addressing this issue, but to focus only on the false positives is a problem.

I can tell you, technology is moving so fast that the false positives will be eliminated within months. I am here to say that if we only focus on getting it right with facial recognition, we have missed the whole argument, because the technology is moving at warp speed. My concern is not that they improperly identify Mr. Gomez; my concern is that they will properly identify Mr. Gomez and use it in a wrong manner. So for the witnesses that are here today, what I would ask all of you is this: how can we put a safeguard on this, to make sure that it is not a fishing expedition at the cost of our civil liberties? That is essentially what we are talking about. We are talking about scanning facial features, and even if they got it a hundred percent right, how should that be used? How should we ultimately allow government to be involved in that?

So I am extremely concerned that as we look at this issue, we have to come together in a bipartisan way to figure it out. I think it would make headlines in the New York Times and the Washington Post if you saw members of both parties coming to an agreement on how to address this issue. I am committed to doing that. Madam chair, I was fully committed with your predecessor; he and I both agreed the very first time this was brought up that we had to do something, and I know the ranking member shares that, so I am fully engaged. Let us make sure that we get something done, and get it done quickly. If we keep focusing, again, on just the accuracy, then they are going to fix that, and it will be accurate. But what standards should there be? Should the accuracy be a hundred percent? Should it be 95? I believe when Mr. Gomez was misidentified, the threshold had been brought down to 80 percent. You get a lot of false positives when that happens, but we need to help set the standards and make sure that government is not using this in an improper fashion. With that, I yield back.

I thank the gentleman for his statement. I would now like to introduce our witnesses.
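The threshold point raised here reflects a general property of score-based matching: the lower the similarity score a system requires before declaring a match, the more false positives it produces. The sketch below illustrates the trade-off with invented, randomly generated similarity scores; the distributions and the 0.80/0.95/0.99 thresholds are assumptions for illustration, not figures from the hearing or from any real system.

```python
import random

random.seed(7)

# Synthetic similarity scores on a 0-1 scale: genuine pairs (same person)
# tend to score high; impostor pairs (different people) score lower but overlap.
genuine = [random.gauss(0.92, 0.05) for _ in range(1000)]
impostor = [random.gauss(0.70, 0.10) for _ in range(1000)]

def false_positive_rate(threshold):
    """Fraction of impostor pairs the system wrongly accepts as matches."""
    return sum(s >= threshold for s in impostor) / len(impostor)

for t in (0.99, 0.95, 0.80):
    print(f"threshold {t:.2f}: false positive rate {false_positive_rate(t):.1%}")
```

Running the loop shows the false positive rate climbing sharply as the threshold drops, which is why the threshold an operator chooses matters as much as the underlying algorithm.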
We are privileged to have a rich panel of expert witnesses today. Brenda Leong is senior counsel and director of AI and ethics at the Future of Privacy Forum. Dr. Charles Romine is director of the Information Technology Laboratory at the National Institute of Standards and Technology. Meredith Whittaker is co-founder of the AI Now Institute. Daniel Castro and Jake Parker, senior director of government relations at the Security Industry Association, complete the panel. If you would all rise and raise your right hand, I will begin by swearing you in.

[Witnesses sworn in.]

Let the record show that the witnesses all answered in the affirmative. Thank you, and please be seated. The microphones are very, very sensitive, so please speak directly into them. Without objection, your written testimony will be made part of our record. With that, Ms. Leong, you are now recognized for five minutes.

Thank you very much for considering this issue and for the opportunity to testify regarding the commercial use of facial recognition technology. The Future of Privacy Forum is a nonprofit organization that supports the responsible use of emerging technologies. We believe that the power of information is a net benefit to society, and that it can be appropriately managed to control risks, serving as a catalyst for leadership and scholarship and advancing principled data practices in support of individuals and groups. Biometric systems, such as those based on facial recognition technology, have the potential to enhance consumer services and improve security, but they must be designed and implemented with full awareness of the challenges they present. Today my testimony focuses on establishing the importance of technical accuracy in discussing face image based systems, weighing the benefits and harms to individuals and groups, and recommending express consent as the default for any commercial use of identification or verification systems. Understanding the specifics of how a technology works is critical for effectively evaluating the relevant risks. Not every camera-based system is a facial recognition system. A facial recognition system creates unique templates that are stored in a database.
Once individuals are enrolled, these databases are then used either to compare one image against one other in a one-to-one comparison, or to identify a person in a one-to-many search. If a match is found, that person is identified with greater or lesser certainty depending on the system in use, the thresholds and settings in place, and the operator's expertise. Thus recognition systems involve matching two images; without additional processing, they do not impute other characteristics to the person or image. There is a great deal of confusion in the media on this point, particularly in contrast to facial characterization or detection software, which attempts to analyze a single image and impute characteristics to that image, including gender and race. These characterization systems may or may not link data to specific individuals, but they carry their own significant risks.

Accuracy requirements and capabilities for recognition and characterization systems vary with context. The level of certainty acceptable for verifying an individual's identity when unlocking a mobile device is below the standard that should be required for verifying that an individual is included on a terrorist watch list. In addition, quality varies widely among suppliers, based on detection capability, diversity of training data sets, and testing methodologies, and suppliers' results reflect their ability to meet these goals. For example, the most recent NIST report highlights accuracy outcomes that were a hundred times worse for certain groups, while the best systems achieved results across demographic groups with variations that were undetectable.

However, the real harms arising from inaccurate recognition and characterization systems cannot be ignored. Individuals already use facial recognition to open their phones, access bank accounts, or organize their photos. Organizational benefits include more secure facility access, hospitality functions, and personalized experiences, and new uses are being imagined all the time. But the potential harms are real. In addition to inaccuracy, concerns about real-time surveillance societies have led individuals and policymakers to significant reservations.
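The distinction drawn above between one-to-one comparison and one-to-many search can be made concrete with a small sketch. The "templates" below are made-up toy vectors standing in for the embeddings a real system would extract from face images, and the 0.95 threshold is an arbitrary assumption; real systems use high-dimensional templates and operator-tuned thresholds.

```python
import math

# Hypothetical enrolled face "templates": short toy vectors standing in for
# the embeddings a real system would extract from enrollment photos.
enrolled = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
    "carol": [0.4, 0.4, 0.9],
}

def cosine(a, b):
    """Similarity between two templates (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify(probe, claimed_id, threshold=0.95):
    """One-to-one: does the probe match the single claimed identity?"""
    return cosine(probe, enrolled[claimed_id]) >= threshold

def identify(probe, threshold=0.95):
    """One-to-many: search every enrolled template; return best match or None."""
    best_id, best_score = max(
        ((pid, cosine(probe, tpl)) for pid, tpl in enrolled.items()),
        key=lambda item: item[1],
    )
    return best_id if best_score >= threshold else None

probe = [0.88, 0.12, 0.31]     # a new capture of "alice"
print(verify(probe, "alice"))  # → True
print(identify(probe))         # → alice
```

The same template database serves both operations; what changes is the question being asked, which is why, as the testimony notes, the acceptable level of certainty depends on the context in which each operation is used.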
The decision by some municipalities to legislatively ban the use of facial recognition systems by government agencies reflects these heightened concerns. The ethical considerations of whether and how to use facial recognition systems exceed privacy considerations, and the regulatory challenges are complex. Even relatively straightforward legal liability questions prove difficult when many parties bear some share of responsibility. When considering the scope of industries hoping to use this technology, from educational and financial institutions to retail, the potential impacts on individuals are mind boggling. Like many technologies, facial recognition applications offer benefits and generate risks depending on context. Tracking online preferences and personalizing consumer experiences are features some people value but others strongly oppose. Matching consent options closely to the appropriate context is therefore essential. FPF prefers a comprehensive privacy bill that applies to all sensitive data, including biometric data, but we recognize Congress may choose to consider technology-specific bills. We provide a useful model in requiring the default for commercial identification or verification systems to be opt-in, that is, express affirmative consent prior to enrollment. Exceptions should be few, narrow, and clearly defined, and further restrictions should be scaled based on the scope and severity of potential harm. Thank you for your attention and for your commitment to finding a responsible regulatory approach to the use of facial recognition technology.

Thank you. The chair now recognizes Dr. Romine for five minutes.

Chairwoman Maloney, Ranking Member Jordan, and members of the committee, I am the director of the Information Technology Laboratory at the Department of Commerce's National Institute of Standards and Technology, known as NIST. Thank you for the opportunity to appear before you today to discuss NIST's role in standards and testing for facial recognition technology. In the area of biometrics, NIST has been working with the public and private sectors since the 1960s.
Biometric technologies provide a way to verify the identity of humans based on one or more physical or behavioral characteristics. Face recognition compares an individual's facial features to available images for verification or identification purposes. NIST's work improves the accuracy, quality, usability, and interoperability of identity management systems and ensures that they can be relied upon in operational settings. NIST provides state-of-the-art benchmarks and guidance to industry and to U.S. government agencies that depend upon biometrics technologies. The NIST Face Recognition Vendor Test program, or FRVT, provides technical guidance, scientific support, and recommendations on the utilization of facial recognition technologies to various U.S. government and law enforcement agencies, including the FBI, DHS, CBP, and IARPA.

NIST Interagency Report 8280, released in December 2019, quantified the accuracy of face recognition algorithms for demographic groups defined by sex, age, and race or country of birth, for both one-to-one verification algorithms and one-to-many identification search algorithms. NIST found empirical evidence for demographic differentials in the majority of the algorithms it evaluated. The report distinguishes between false positive and false negative errors, and notes that the impacts of errors are application dependent. NIST conducted tests to quantify demographic differences for 189 face recognition algorithms from 99 developers, using four collections of photographs comprising millions of images of 8.49 million people. These images came from operational databases provided by the State Department, the Department of Homeland Security, and the FBI.

I will first address one-to-one verification applications. False positive differentials are much larger than those related to false negatives and exist across many of the algorithms tested. False positives might present a security concern to the system owner, as they may allow access to impostors. False positive rates are higher in women than in men. Regarding race, we measured higher false positive rates in Asian and African-American faces relative to those of Caucasians; there are also higher rates of false positives in Indian, Native American, and Pacific Islander faces.
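The false positive differentials described in this testimony come from partitioning impostor comparisons by demographic group and measuring each group's false match rate separately. A minimal sketch of that bookkeeping, using invented counts (the group names and error rates below are illustrative assumptions only, not NIST's data):

```python
from collections import defaultdict

# Hypothetical trial log: (group, is_same_person, system_declared_match).
# 1,000 impostor comparisons per group, with made-up false-match counts.
trials = (
    [("group_a", False, True)] * 10 + [("group_a", False, False)] * 990
    + [("group_b", False, True)] * 100 + [("group_b", False, False)] * 900
)

def false_positive_rates(trials):
    """Per-group false positive rate over impostor (different-person) trials."""
    false_matches = defaultdict(int)
    impostor_trials = defaultdict(int)
    for group, same_person, declared_match in trials:
        if not same_person:          # only impostor pairs can yield false positives
            impostor_trials[group] += 1
            if declared_match:
                false_matches[group] += 1
    return {g: false_matches[g] / impostor_trials[g] for g in impostor_trials}

rates = false_positive_rates(trials)
print(rates)  # group_b's rate is 10x group_a's in this invented example
```

Applying this kind of per-group measurement across many algorithms is what makes differentials of 10 to 100 times visible even when an algorithm's overall error rate looks small.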
This was true for algorithms developed in many countries, including those in Europe and the United States. A notable exception was for some algorithms developed in Asian countries: there was no such dramatic difference in false positives in one-to-one matching between Asian and Caucasian faces for algorithms developed in Asia. While the study did not explore the relationship between cause and effect, one possible connection, and an area for further research, is the relationship between an algorithm's performance and the data used to train the algorithm itself.

I will now comment on one-to-many search algorithms. Here again, the impact is algorithm dependent. False positives in one-to-many searches are particularly important because the consequences could include false accusations. For the algorithms the NIST study measured, higher false positive rates occurred in women, in African-Americans, and particularly in African-American women. However, the study found that many algorithms gave similar false positive rates across these specific demographics, and some of the most accurate algorithms fell into this group. This underscores one overall message of the report: different algorithms perform differently. Indeed, all of our reports note wide variations in recognition accuracy across algorithms, and an important result from the demographic study is that demographic effects are smaller with more accurate algorithms.

NIST is proud of the positive impact it has had over the last 60 years on the evolution of biometrics capabilities. With its extensive experience and broad expertise, both in its laboratories and in successful collaborations with the private sector and other government agencies, NIST is actively pursuing the standards and measurement research necessary to deploy interoperable, secure, reliable, and usable identity management systems. Thank you for the opportunity to testify on NIST's activities in identity management, and I will be happy to answer any questions that you have.

Chairwoman Maloney, Ranking Member Jordan, and members of the committee, thank you for inviting me to speak today. My name is Meredith Whittaker. I am a co-founder of the AI
Now Institute at New York University. We are the first university research institute dedicated to studying the social implications of artificial intelligence and algorithmic technologies. I also worked at Google for over a decade.

Facial recognition technology does not work as advertised. Research shows what tech companies won't tell you: facial recognition is often biased and error-prone, and the populations already facing societal discrimination bear the brunt of facial recognition's failures. As Dr. Romine mentioned, the NIST audit confirmed that some systems were 100 times less accurate for black and Asian people than for white people. But this isn't facial recognition's only problem, and accuracy will not make it safe. Facial recognition relies on the mass collection of our biometric data. It allows government and private actors to persistently track where we go, what we do, and who we associate with. Over half of Americans are in a law enforcement facial recognition database, and businesses are increasingly using it to surveil and control workers and the public. It is replacing time clocks at job sites, keys for housing units, and safety and security systems at stadiums, and much more. And we've seen real-life consequences. The facial recognition authentication system used by one ride-hail company repeatedly failed to recognize transgender drivers, locking them out of their accounts and livelihoods. Facial recognition analysis is also being used to make judgments about people's personality, their feelings, and their worth, based on the appearance of their face. This set of capabilities raises urgent concerns, especially since the claim that you can automatically detect interior character based on facial appearance is not supported by scientific consensus and has been debunked in the past. Most facial recognition systems in use are developed by private companies, who license them to governments and businesses.
The commercial nature of these systems prevents meaningful oversight and accountability, hiding them behind legal claims of trade secrecy. This means that researchers, lawmakers, and the public struggle to answer critical questions about where, how, and with what consequences this technology is being used. This is especially troubling since facial recognition is usually deployed by those who already have power, say, employers, landlords, or the police, to surveil, control, and in some cases oppress those who don't. In Brooklyn, tenants pushed back against their landlord's plans to replace key entry with facial recognition, raising questions about biometric data collection and the very real possibility that invasive surveillance could be abused to harass and evict tenants, many of whom were black and Latino women and children.

To address the harms of this technology, many have turned to assessment and auditing. These are a wonderful step in the right direction, but they are not sufficient to ensure that facial recognition is safe. Auditing against deployment criteria risks allowing companies to assert their technology is safe and fair without accounting for how it will be used or for the concerns of the communities who will live with it. If such standards are positioned as the sole check on these systems, they could function to mask harm instead of preventing it. Outside of healthcare, it's difficult to think of an industry where we permit companies to treat the public as experimental subjects, deploying untested, unverified technology that has been proven to violate civil rights and to amplify bias and discrimination. Facial recognition poses an existential threat to democracy and fundamentally shifts the balance of power between those using it and the populations on whom it's applied. Congress is abdicating its responsibility if it continues to allow this technology to go unregulated. As a first step, lawmakers must act rapidly to halt the use of facial recognition in sensitive domains by both government and commercial actors.
Whether you care about the over-policing of communities of color, or gender equity, or the constitutional rights to due process and free association, the secretive and unchecked deployment of flawed facial recognition systems is an issue you cannot ignore. Facial recognition is not ready for prime time. Congress has a window to act, and the time is now. Thank you.

The chair now recognizes Mr. Castro for five minutes.

Thank you. Chairwoman Maloney, Ranking Member Jordan, and members of the committee, thank you for the invitation to testify today. There are many positive uses of facial recognition technology emerging in the private sector. Airlines are using it to help travelers get through the airport faster, saving people time and hassle. Businesses are using it to improve security, helping to reduce financial fraud. Hospitals are using it to verify that the right patient receives the right treatment, preventing medical errors. There is even an app that says it uses facial recognition on dogs and cats to help find lost pets. Americans are increasingly familiar with commercial uses of the technology, because it is now a standard feature on the latest phones, and it has also been integrated into household products like security cameras and door locks. This is one reason why a survey last year found the majority of Americans disagreed with strictly limiting the use of facial recognition if it would mean airports can't use the technology to speed up security lines, and nearly half disagreed if it would mean stores can't use it to stop shoplifting.

I've also seen claims that facial recognition technology is inaccurate and invasive. If that were true, I would be worried too, but it isn't. Here are the facts. First, there are many different facial recognition systems on the market. Some are much better than others, including in their accuracy across race, gender, and age. The most accurate algorithms show no bias, they continue to get measurably better every year, and they can outperform the average human. Second, many of the leading companies in the industry responsible for developing and deploying facial recognition have voluntarily adopted robust privacy and transparency guidelines.
These include voluntary standards for digital signs and multi-stakeholder guidelines developed for the broader technology community. While the private sector has made significant progress, Congress also has an important role, and I would like to suggest seven key steps.

First, pass comprehensive privacy legislation that preempts state laws and establishes basic data rights. While it may be appropriate to require opt-in consent for certain sensitive uses, such as in healthcare or education, it may not always be feasible. For example, you probably couldn't get sex offenders to agree to be enrolled in it. So opt-in shouldn't be required across the board. Legislation should also be technology neutral: it shouldn't treat facial recognition differently than other types of biometrics. In addition, a federal law should not establish a private right of action, because that would significantly raise costs for businesses, and these costs would eventually be passed on to consumers.

Second, Congress should expand NIST's evaluation of commercial facial recognition systems to more real-world commercial uses, including quality-based systems and infrared systems. NIST should also continue to report performance metrics on race, gender, and age, and should develop diverse image data sets for training and testing purposes.

Third, Congress should direct performance standards for any facial recognition system that the government procures, including for accuracy and error rates. This will ensure federal agencies don't waste tax dollars on ineffective systems or systems with significant disparities.

Fourth, Congress should fund deployment of facial recognition systems in government, for example using it to improve security in federal buildings and for government workers.

Fifth, Congress should continue to support federal funding for research to improve the accuracy of facial recognition systems as part of the government's overall commitment to invest in artificial intelligence. One of the key areas of fundamental AI research is computer vision, and the U.S.
should continue to invest in this technology, especially as China makes gains in this field.

Sixth, Congress should consider legislation to establish a warrant requirement for authorities to track people's movements, including when they use geolocation data from facial recognition systems.

Finally, Congress should continue to provide oversight, ensuring that any police surveillance of political protest is justified and conducted with appropriate safeguards. That should include scrutinizing racial disparities in the use of force among communities of color. Congress should also require the Department of Justice to develop best practices for how state and local authorities use facial recognition. This guidance should include recommendations on how use of the technology is publicly disclosed, when law enforcement will use it, what sources of images will be used, and what the data retention policies will be.

We should always consider the impact of new technologies and ensure there are safeguards in place to protect society's best interests. In the case of facial recognition technology, there are many unambiguously beneficial opportunities to use the technology, such as allowing people who are blind or who suffer from face blindness to identify others. Rather than imposing bans or moratoriums, Congress should support positive uses of the technology while limiting the potential for misuse and abuse. Thank you again. I look forward to answering any questions.

My name is Jake Parker. I am with the Security Industry Association, a trade association whose members provide a broad range of security products while employing thousands of innovators in the U.S. and around the globe. Our members include many of the leading developers of facial recognition technology and of products that incorporate it. It is because of the experience our members have that we are pleased to be here today to talk to you about how this technology can be used consistent with our values. We firmly believe all technology products, including facial recognition, should only be used for lawful, ethical, and nondiscriminatory purposes.
That way, we can have faith that it brings value to our everyday lives. Facial recognition offers tremendous benefits. It can be used to allow individuals to prove their identity securely and conveniently to enter a venue or board an airplane. Companies are using the technology to improve the physical security of their property and employees against the threat of violence, theft, or other harm. Government agencies have used facial recognition to improve homeland security and to aid investigations, including rescuing trafficking victims. It has been used in almost 40,000 cases in North America, identifying 9,000 missing children and over 12,000 traffickers. A law enforcement officer in California last year saw a social media post about a missing child; after using facial recognition, the child was located and recovered. In another success story, investigators used facial recognition along with human review to identify a suspect within an hour. The chief detective was quoted as saying that to not use this technology would be negligent.

Our members see transparency as the foundation of policies that govern the use of facial recognition technology. It should be clear when and under what circumstances the technology is used, as well as the processes governing the collection, processing, and storage of related data. We support sensible safeguards that promote transparency and accountability as the most effective way to ensure the responsible use of the technology without unreasonably restricting tools that have become essential to public safety. SIA does not support moratoriums or blanket bans on this technology. As the committee works on the proposals mentioned earlier for accountability, we encourage policymakers to consult private sector developers on their real-world views of how the technology can best be deployed. We should sustain the improvement of biometric technologies and provide Congress with what it needs to support the expansion of these efforts, and address any concerns with commercial use through a comprehensive data privacy policy.
Many such policies include biometric information, and the approach needs to be tech neutral; that is the right approach. In the meantime we encourage our members to take an active role in using this technology responsibly. In order to make this possible there is a set of best use policies. Regarding the study that generated a fair amount of news, NIST has worked for decades to test biometric technology and post the results, showing that facial recognition accuracy is approaching that of fingerprints, which is the Gold Standard for identification. The most important takeaway is that it confirms facial Recognition Technology performs far better across racial groups than before, using the mug shot database. We are beginning to provide technology so that all users can be comfortable with it and with the transparency and privacy policies. On behalf of SIA, thank you for the opportunity to appear before you, and I look forward to working with you. Thank you, doctor. I would like to ask you about the study that you released last month and that you mentioned in your testimony, and I'd like to ask unanimous consent to place that study in the record, without objection. We all know that commercial facial Recognition Technology continues to expand in both the public and private sectors, but your new study found that facial recognition software misidentified persons of color, women, children and elderly individuals at a much higher rate. In your study, you evaluated 189 algorithms from 99 developers. Your analysis found that false positives were more likely to occur with people of color, is that correct? It is correct for the largest collection of the algorithms, that's correct. And your report also found that women, elderly individuals and children were more likely to be misidentified by the algorithms, is that correct? That's correct for most algorithms. Now, in Womens Health they used to do all the studies on men. When you were doing these studies, were you doing them on men's faces as a pattern or using women's faces?
No, we had a substantial set of images that we could pull from, so we were able to represent a broad cross-section of demographics. Did these disparities in false positives occur broadly across the algorithms? They did occur broadly throughout most of the algorithms. And your study states that across demographics, error rates "vary by factors of 10 to beyond 100 times," in quotes. These are staggering numbers, wouldn't you say? How much higher was the error rate when the algorithms were used to identify persons of color as compared to white individuals? As we state in the report, the error rates for some of the algorithms can be significantly higher, from 10 to 100 times the error rates of identification for caucasian faces, for a subset of the algorithms. And what was the difference in the misidentification rate for women? Similar rates, 10 to 100? I'd have to get back to you. What about black women, is that higher? Black women have a higher rate on some algorithms, on the same algorithms that we were discussing, than either black faces broadly speaking or women broadly speaking, and higher than other demographics. What was it? Substantially higher, on the order of 10 to 100. And misidentification, as we all know, can have serious consequences for people when they're falsely identified. It can prevent them from boarding a plane or entering the country. It could lead to someone being falsely accused or detained or even jailed. So I'm deeply concerned that facial recognition technologies have demonstrated racial, gender and age bias. Facial Recognition Technology has benefits, to be sure, but we should not rush to deploy it until we understand the potential risks and mitigate them. Your study provides us with valuable insight into the current limitations of this technology, and I appreciate the work that you and all of your colleagues on the panel today have done to increase our understanding. I would now recognize the Ranking Member.
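The 10-to-100-times differential discussed above is a multiplier on an underlying false-match rate, not an absolute percentage. As a rough illustration of the arithmetic, assuming a hypothetical round-number baseline rate (these figures are chosen for the sketch and are not taken from the NIST report):

```python
# Hypothetical illustration of demographic false-match differentials.
# The baseline rate and the factors are made-up round numbers.
def scaled_fmr(baseline_fmr: float, differential: float) -> float:
    """Apply a demographic differential factor to a baseline false-match rate."""
    return baseline_fmr * differential

baseline = 0.0001  # assume 1 false match per 10,000 comparisons for the baseline group
for factor in (10, 100):
    fmr = scaled_fmr(baseline, factor)
    print(f"{factor}x differential: about {fmr * 10_000:.0f} false matches per 10,000 comparisons")
```

The point of the sketch is that a multiplier which sounds modest against a tiny baseline still changes the error count by one or two orders of magnitude.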
I am going to recognize the gentlelady from North Carolina, Mrs. Foxx. She is now recognized for questions. Thank you, madam chairman. How competitive is the facial Recognition Technology market? It's extremely competitive because of the advances in technology over the last couple of years. The dramatic increase in accuracy in the last three to five years, combined with advances in Imaging Technology, has really made the products more affordable, and therefore there has been more interest from consumers and more entry into the market from competitors. To what extent do the Companies Compete on accuracy, and how could a consumer know more about the accuracy rates of facial recognition? They do compete on accuracy, and the NIST program plays a really helpful role here in providing a useful benchmark for measurement of accuracy, so the companies are competing with each other to get the best scores on those tests. Companies also do their own private testing and make those results available to customers, and that's important because NIST has the data sets, the photo sets, that they're already using there, whereas those aren't necessarily the same type of images in a deployed system, so other types of testing need to be done on a fully deployed system to really determine what its accuracy is. What private sector best practices exist for securing facial images and the associated data, such as face print templates and match results, in these facial Recognition Technology systems? As I mentioned earlier, SIA is developing a set of best use practices, but that's based on the fact that many of our members have produced best practices, which they work with their customers to implement, that would accomplish privacy goals.
I have a couple of examples, but one of the most significant is that many of these products already have built into them the ability to comply with privacy laws in europe, the GDPR, and so this has to do with encrypting photos, encrypting any kind of personal information associated with them, securing channels of communication between the server and the device, as well as procedures for looking up someone's information, being able to delete it if requested, and being able to tell someone what information is in the system. Could you summarize succinctly some of the best practices that exist for protecting the personally identifiable information that's incorporated into it? Is it too complicated a system to explain here? Is there something we could have to read? I'm sure I'd be happy to provide more details later. Certainly one of the most important things is encryption of the data in case there's a data breach. It's important to point out, as we discussed, that a face template is what the system uses to make a comparison between two photos. By itself, it is basically like the digital version of your fingerprint, turned into a number in a fingerprint system. By itself, if that data is compromised, it's not useful to anyone, because it takes proprietary software to read it. I've read about the difference between europe and the u.s. in developing these recently. A number of state and International Policies are impacting how information is collected. For example, illinois, washington, and in europe the GDPR directly address privacy information. How have commercial entities conformed to these new legal structures? What we're seeing is that vendors are adapting here and building these protections into products in anticipation, first of all because it's just good practice, many of the things that GDPR requires, and also because a similar framework may arrive here in this country at some point, so they are being proactive in building some of those things in.
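The "template as a number" idea described above can be sketched in a few lines: a system turns a photo into a vector of numbers (an embedding) and declares a match when two vectors are close enough under a similarity threshold. The vectors and threshold below are made-up toy values; real systems use proprietary models with hundreds of dimensions, which is why a leaked template is useless without the matching software.

```python
# Minimal sketch of template matching: compare two embedding vectors
# with cosine similarity. All values here are hypothetical toy numbers.
import math

def cosine_similarity(a: list, b: list) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(template_a: list, template_b: list, threshold: float = 0.9) -> bool:
    """Declare a match when the two templates are sufficiently similar."""
    return cosine_similarity(template_a, template_b) >= threshold

enrolled = [0.12, 0.98, 0.33, 0.05]  # template stored at enrollment
probe = [0.11, 0.97, 0.35, 0.06]     # template computed from a new photo
print(is_match(enrolled, probe))     # nearly identical vectors score as a match
```

Note that the threshold choice is exactly where the false-match versus false-non-match tradeoff discussed elsewhere in the hearing lives: lowering it catches more true matches but admits more false ones.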
Thank you, madam chairman, I yield back. We shall now recognize the gentlewoman from the district of columbia. Ms. Norton is now recognized for questions. Thank you, madam chair. This is an important hearing, but I must say, I think we're playing catch-up, and the way to demonstrate that most readily is what the Cell Phone Companies are already doing with this technology. Private industry appears to be way ahead of where the congress or the federal government is, and the interesting thing is, they're responding to consumers, and it appears that consumers may already be embracing facial recognition in their own devices, because as they compete with one another, almost all of them have incorporated facial recognition into their latest mobile products, and if one does it, the others are going to do it; apple and samsung and the rest of them already do it. You can unlock your cell phone by scanning your face. Now, the public thinks, and I suppose they're right, that this is a real convenience instead of having to log in those numbers, and they have grown accustomed, frankly, to cameras. I remember when cameras were first introduced into the streets and people said that's terrible, and of course there's no right to privacy once you go into the street, but talking about my cell phone, there's a lot of private information in there. And according to recent reports, this technology is not foolproof. That's my concern. For example, a simple photograph can fool it in some instances. Unauthorized individuals could get into your cell phone and any Sensitive Information that you happen to have in there, and a lot of people store things like, of course, their email there, but banking and the rest of it. Do you see problems that are already there with companies now integrating facial technology into Consumer Devices like this? Are we too far behind to do anything about it?
Because it looks like the public sees convenience, and I don't hear anybody protesting it. Would you comment? Thank you very much. I think that's an excellent question, since we do see the use cases advancing quickly in many applications, as you say, with phones being one of the most personalized devices that people have. I think they make a good example of some of the variations that are in place in the market, of the different ways that facial Recognition Technology is being used. For example, your phone, and I'm going to use apple as the example, and this is my best understanding, I don't work for or speak for apple, takes a representative picture of your face using both infrared technology and 3d imaging in order to prevent things like a photo or another person being used, and it takes it at a level of detail that stands up to about a one in 10 million error rate, which is a pretty substantive level for something that is, in fact, part of a two-factor process: you have to have the phone, know whose phone it is, have that person's face, and match whatever that standard is. So that's actually a pretty robust standard for the level of risk involved in something that may hold a lot of personal data but represents one level of concern for people being violated. A facial recognition system that identifies somebody off a video feed as a suspect in a crime would be an entirely different level of risk, an entirely different level of implication for that person, and certainly should be considered and potentially regulated in a very different way than that. So yes, I do think we see those things being used in different ways already. Some of those have already started to have blowback on them in things like the criminal Justice System, and that I think has gotten people's attention: where are the places that need to draw those lines, and should it be used in such cases at all, or should it be used in very limited and regulated ways? Miss whitaker, can I ask you about the average consumer?
Does the average consumer have any way to confirm, should they have any way to confirm, that these cell phone manufacturers are, in fact, storing their biometric or other data on their servers? What should we do about that? Consumer knowledge? The average consumer does not, and indeed, many researchers and many lawmakers don't, because this technology, as I wrote about in my written testimony, is hidden behind trade secrecy. This is a Corporate Technology that's not open for scrutiny and auditing by external experts. I think it's notable that while NIST reviewed 189 algorithms for their latest reports, amazon refused to submit their algorithm to NIST. They claim they couldn't modify it to meet the standards, but they're a multibillion Dollar Company, and one could imagine they could manage it. Whatever the reason, we've seen that it's at the facial recognition companies' discretion what they do or don't release; the accuracy numbers they release oftentimes aren't validated, and it's not possible for the general public to validate them, so we're left in a position where we have to trust these companies, but we don't have many options to say no or to scrutinize the claims they make. Thank you, madam chair. Thank you. The gentleman from louisiana, Mr. Higgins, is now recognized for questions. Thank you, madam chair. I'd like to ask unanimous consent that the statement of chief james craig of the Detroit Police department, his written testimony, be entered into the record. The chief has had a tremendous amount of success using facial Recognition Technology. Without objection. And I'd also like to recognize and thank our esteemed colleague, representative gomez, whose Opening Statement stood upon principles of protecting freedoms and liberties, resisting and curtailing manifestations of big brother, and reducing and controlling the size and scope of federal powers; the Republican Party welcomes your transition.
[laughter] Madam speaker, facial Recognition Technology is an emerging technology, and of course it's produced by private entities. Law enforcement doesn't produce its own technologies. It's coming, and it's here. It will get better as the weeks and months move forward. It should be no surprise to us that the effective percentages of identifications using a new technology will increase as time moves forward. There's more coming. There's total person Recognition Technology coming, which measures specific physical features of individuals: their gait, the length of their arms, et cetera. This technology is coming. Now, what we should seek is a means by which to make sure that big brother is not coming. I have a background in Law Enforcement, and Recognition Technology has become manifest in many ways. You have license plate readers being used from sea to shining sea. These are readers in police units that drive around and read license plates. If we're looking for a suspect vehicle and we have an eye out for a particular vehicle, a particular color, that's human recognition when we see that vehicle and read the license plate, but we also have license plate readers reading every plate we pass. If a registration is expired, or the driver's license associated with that registered vehicle belongs to a person who is wanted, we'll keep an eye on that vehicle, and if the guy who walks out of the building and gets in that vehicle appears to be the suspect we've identified and have a warrant for, then there's going to be some interaction there. This is a technology that has evolved and become manifest over the last 15 or 20 years, and it's gotten very effective. Part of facial Recognition Technology is completely common: we used digital records from crime scenes and images, frozen from the best picture we could get from a crime scene video, from surveillance cameras at the business or whatever was available to us.
We would pass these images on, have the shifts watch these images, and odds were pretty good that somebody would recognize that guy. But this is the beginning: recognition is the beginning of an investigation; it helps Law Enforcement cultivate the person of interest for us to speak to. There are two things we stand against, and this is where the Ranking Member and I have had discussions at length. Both of us stand against live-streaming the images of free americans as they travel and enter businesses or go to and fro across america through some database, and all of a sudden the police show up to interview that guy. But in solving a crime, we're already using digital images to the best of our ability, and every american should recognize that this is an important tool. The chief's written statement, which I've asked be submitted and the chairwoman graciously accepted, has several examples of incredible success using this technology. I'm going to have a question I'll submit in writing, if I may, madam chair, for Mr. Parker, Dr. Romine, and Miss Whitaker. I have three questions that time will not allow. This is an important topic, and we've had several hearings about it. I thank the majority party for its focus on this, and I hope that we can come together with effective legislation that both allows the technology to move forward as a Law Enforcement tool and protects the freedoms and privacy of the american citizens we serve. I yield. Thank you. The gentleman from massachusetts, Mr. Lynch, is now recognized for questions. Thank you very much, madam chair, and I want to thank you and the Ranking Member for collaborating on this hearing and approaching it in the right way, I think. First of all, I want to thank the witnesses for your testimony. I think it's very helpful.
As I understand it, and I'm not sure, I'm a little skeptical, they tell me that with the facial recognition used on the iPhone, at least the way apple says they handle this, the indicia of identity stays on the phone and doesn't go up to a server at this point. But I sort of question whether they'll keep that approach in the future. I think there's probably a greater danger that they will get facial recognition right, you know, and it's not the misses that I'm concerned about right now, although that has to stop. It's what happens when they have all of this data out there, whether it's Law Enforcement or private firms. We had a massive data breach at Suprema, which is a big biometrics collector: 100 Million People, I think, no, I'm sorry, 27 Million People in that breach. And then customs and Border Patrol, 100,000 people that they identified along with license plates, that was breached. So the concern is that once this information is collected, it is not secure, and that's a major problem for all of us. I want to ask some specific questions about TikTok. TikTok is a Company Purchased by a Chinese Company; it's an app that children love, a billion people have downloaded it in the u.s. and europe, and it's owned by, I'm sorry, it's located in beijing, and under chinese law, the recent National Security law in china, they have to cooperate with the Chinese Government, and we already see it happening. If you look on TikTok, you don't see much about the protests in hong kong. They're already exercising censorship on TikTok. So TikTok would have to cooperate with china, and that's a National Security concern for us. CFIUS is looking at it; it's under review. The other situation is the apple iPhone and our efforts, because of the pensacola shootings, to get apple to open up the iPhone so we can get that information.
If you step back, it's sort of this: what we're worried about china doing, we're doing with apple. We're trying to get access to that data, just like china can get all of that data from TikTok. How do we resolve that dilemma? Is there a way, Dr. Romine, that we can protect our citizens and others who share their data, or have their identity captured, their facial recognition captured? How do we resolve that so that we use it to the benefit of society? Thank you for the question. I think the bottom line really is balancing the understanding of the risks associated with the policy decisions that are made. Those policy decisions are outside of NIST's purview, but with regard to the debate on access to apple and encryption, we know that in the government and broadly speaking, there are two... Okay, if it's not in your discipline, let me ask Miss Whitaker the same question. Thank you for the question. The short answer is that we don't have the answer to that question. We have not done the research that is needed to affirmatively answer that, yes, we can protect people's privacy and their liberty when these technologies are deployed at wide scale in a complex geopolitical context. I think we need more of that research, and we need clear regulations that ensure that these are safe. Anything else to add? On encryption, when you have encryption, consumers have access to the data and third parties don't. If we back that, that's the way you give consumers control of the situation, with the government on the other side. All right. I've exhausted my time. Madam chair, thank you for your courtesy. Thank you so much. The gentleman from texas, Mr. Mcleod, is now recognized for questions. Hello, thank you all again for being here and for your work on this topic. This is an extremely important topic; obviously we're going through the birth pains of development of this new technology. Mr.
Parker, with the government's use of facial Recognition Technology, are they using technologies that are primarily developed by the government or by commercial entities? I believe it's a mixture of both. In some cases, especially with federal agencies, they've developed their own systems over time, but I think increasingly it's leading to commercial solutions. Commercial solutions. And Dr. Romine, maybe you can help with that: what has been the industry's response to the NIST report? From my perspective, industry has been involved from the outset. They've been very supportive of the efforts that we've undertaken in FRVT over the last 20 years. So it's generally been a very positive thing. The industry feels challenged to do better. And I think it depends on the industry. Those who are participating are valuing it. As noted, it excludes amazon, for example, because amazon is a cloud-based system, and it excludes apple, and we need those as well. Okay, and Mr. Castro and Mr. Parker, you both mentioned it's improving dramatically year by year. Would you say that we're weeks, months, years, decades away from getting this technology to an acceptable level? I think if you look at the best performing algorithms right now, they are at that level of acceptance that we would want. Their error rates are 0.01%, and if you have something that's 10 times worse, that's still 0.1%, right? And 0.1% is one out of a thousand. These are very small numbers. All right. So as Mr. Castro said, we're reaching that point now. I think there are some reasons why the industry has really focused on false negative type error rates and reducing them over time, and I think that's down to extremely low levels now. And this is documented: it's 20 times better now than it was five years ago.
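Whether a rate like that counts as "very small" depends on how many comparisons are made: in a one-to-many search, the per-comparison rate is effectively multiplied by the gallery size. A quick sketch of that arithmetic, using a hypothetical gallery size and the 0.01% versus 0.1% figures mentioned above:

```python
# Hypothetical scale illustration: small per-comparison error rates can
# still yield many false candidates against a large gallery.
def expected_false_matches(gallery_size: int, fmr: float) -> float:
    """Expected number of false candidates for one probe search."""
    return gallery_size * fmr

gallery = 1_000_000  # assumed gallery size, chosen for the sketch
for fmr in (0.0001, 0.001):  # 0.01% versus "10 times worse" at 0.1%
    n = expected_false_matches(gallery, fmr)
    print(f"FMR {fmr:.2%}: about {n:,.0f} false candidates per search")
```

This is one reason the testimony distinguishes contexts: an unlock on a single enrolled face and a search against a million-photo database face very different effective error exposures at the same per-comparison rate.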
And given the results of the demographic effects study, we're looking now at some of the false positive rates and trying to make those more uniform, achieving homogeneous rates, ones that are mostly the same across different demographic groups. I think there's important context to consider these in: one, which was mentioned already, is the total relative scale, since 100 times a very small number is still a small number, and the other is what the consequences of errors can be, because in some cases it can matter more than in others. With Law Enforcement investigations, NIST says in the report, false positive differentials from the algorithm are immaterial, and the reason is the way that Law Enforcement uses the technology: they're looking at a set number of potential candidates that meet a criterion, usually around 50. In the case of the new york city incident I mentioned before, they looked through hundreds of photos that were potential matches. There was a Human Element there; the Technology Functions as a tool to enhance their job, and it's still up to a human to decide whether it's an actual match. So in that case, the false negative error rate is much more important, because you want to make sure that you're not missing someone who is actually in your data set. Now, could you speak to how we get this right from the perspective of where we sit? Because sometimes, with advancements in technology or anything else, we step in as the federal government to fix a problem and actually end up creating an environment that prohibits the Technological Advancements or the natural market forces that work to get us to that solution. Sometimes we actually make ourselves take a step back. What's the right approach here? I think, and this relates to what congressman higgins said earlier, facial recognition is just one of many advanced technologies.
It's important to recognize that the issues we have in talking about this don't really have to do with the technology; they have to do with how it's used. So I think we need to focus on addressing the concerns that we have through narrowly tailored restrictions if warranted, and I think that's the more sensible approach. I think we have seen a proposal in the senate that would do Something Like that. Thank you. I yield back. I now recognize the gentlewoman from illinois for questions. Thank you, madam chair and Ranking Member, for holding this hearing and continuing to make this an important issue for our committee. We've talked previously about bias in facial recognition and Artificial Intelligence generally, but the recent part three report on demographic effects provides useful data on the development of commercial facial recognition programs. As chair of the tech accountability caucus, I've raised concerns about biased and unfair algorithms and the dangers of allowing these biases to perpetuate. The results of the part three report were not particularly surprising, as has been discussed, with women and individuals of african and asian descent having higher false positive rates than caucasian men. Director romine, in your testimony I was hoping to clarify the statement that policy makers and the public should not think of facial recognition as either always accurate or always error-prone. In my opinion, as policy makers we should be pushing to have these technologies get as close to always accurate as possible. Why should we not describe this technology as not always accurate, and how long will we have to wait for this technology to reach always accurate for the demographic groups? Thank you for the question. I don't know how long it will be.
I can't predict the future, and the statement refers to the fact that there are characteristics you have to include in any discussion. First, you have to know the algorithm that you're using: as my testimony stated, while many of the algorithms that we tested exhibit substantial bias or demographic effects across demographics, the most accurate ones do not, in the one-to-many category. So you have to know the algorithm you're using, and you also have to know the context. The ability to automatically identify aunt muriel in a family photo doesn't carry a very high risk if you get that wrong. Compare that to the identification of a suspect, where there are some very serious concerns about ensuring that you get it right. You have to know the context in which you're using the algorithm and the overall system. We test algorithms; we don't have the capacity to test systems that are deployed in the field, and those have implications as well. And while I have you, can you discuss the benefits of auditing facial Recognition Systems for bias? From our perspective, whether it's policy makers or Government Entities or private sector entities that want to use face recognition, the most important thing is to have the accurate, unbiased data that we can provide, so that appropriate decisions are made with regard to whether to regulate or not, what kinds of regulations might be needed, and in what context. If you're in a procurement situation, procuring a system, you want to know the performance of that system and of the algorithms it depends on. Those are the things that we think are appropriate from an auditing perspective. We don't view the testing that we do as an audit so much as providing policy makers, government, and the private sector with actionable information. Miss Whitaker, I'd like you to weigh in as well.
Auditing is really important, but we need to understand how we're measuring these systems. In my written testimony, I gave the example of one of the most famous benchmark data sets that recognition systems were measured against, Labeled Faces in the Wild, in short, photos of mainly men and mainly white people. So the way the industry assessed accuracy was by the ability to recognize white men, and that gives us a sense of why we're seeing these pervasive racial and demographic biases across these systems. So the standards we choose to measure ourselves by matter greatly, and those standards don't ask questions about what data these systems will use in a deployment environment or how these systems will be used; they don't ask the questions we're concerned about, like, will they be abused? I just want to give Ms. Lein a chance. And as Ms. Whitaker said, how it's used by the company matters. One of the regulatory options is to have requirements that say government use or purchase of systems has to be NIST-evaluated or ranked by some external objective tester, with transparency into how it was measured and what was done. Thank you. I yield back. The gentleman from the great state of georgia is recognized for questions. Thank you, madam chair. There's no question this technology of facial recognition is extremely important and viable for our government, notably in places like Border Patrol and Law Enforcement. At the same time there's also no question that this Technology Allows any individual to be identified in public spaces, be it through the private sector or Government Entities, and therein lies a potential problem and grave concern for many people: whether we're dealing with the private sector or government, who should bear the responsibility for individual privacy and Data Security? And so, I'm not sure exactly where this question is best directed, be it Mr. Parker or Mr. Castro, so any of you, jump in here.
Let's start with the private sector: companies that are using facial Recognition Technology and addressing this issue of civil liberty, the whole question of privacy. In other words, are there any within the private sector who are setting forth best practices? Any of the stakeholders? You can start with that. Yes, we have identified a number of companies that have put out principles around privacy. I can name some specifically: microsoft, amazon, google. They all have public statements where they identify what specifically they're doing around facial recognition, how they want to protect privacy, how they're doing in terms of the technology, and what they require in the developer agreements for anyone using their technology. What are the principles? What are the guidelines? Things around transparency, consent, data protection, and notification; they go through a whole set of issues, and these match the consensus-based guidelines that come out of other forums as well. So we have a big concern, people have brought it up, that people are being identified without their consent. So what are the safeguards? It's one thing to have policies written down, and another to have them protect the public and the individuals who have not consented to this type of technology. So how will these facial recognition products, as they develop, inform individuals that they're being exposed, potentially without their knowledge? So, the recommendations address how you actually communicate to individuals under what circumstances the technology is used, and part of the source of confusion, I think, in some areas, is that there are many different types of systems out there. Some are just doing facial analysis, for example, in the Digital Signage industry. Without consent? What they're doing is tracking the general demographics of who is seeing the ads, not tracking anyone's identity.
They've said for that type of purpose they're not going to be obtaining consent. But they've said if they're going to target people, for example, based on your identity, they will require consent, so people have to sign up. Let's go to the Atlanta airport, which right now is a pilot airport for some facial recognition technology. All right. Yeah, the busiest airport in the world; you have thousands of people walking around all over the place. When this technology is implemented, there's no way to get consent from everyone walking around. For the Atlanta airport specifically, they have the ability to opt out, so you don't have to go through that if you're going through the terminal. The international terminal. How does a person opt out? You simply say you don't want to use the self-serve kiosk, and you go to the agent and show your passport. So you're saying that technology in airports will be used just in security lines? No, it's used for boarding and screening and for bag check, for a variety of purposes. In each of those areas, Delta said they have the ability to opt out and allow the consumers to do that. Do you know of any case where government in particular used this type of technology without the knowledge, without the consent of an individual, where it actually violated the Fourth Amendment? I don't know that. I don't think we have documentation of that. I do think that's why we need a search warrant requirement, so we know where that determination is made. Therein lies the problem with this. We see the technology; somehow we've got to land the plane in a safe zone that does not violate people's rights, and I appreciate you being here. Thank you, madam chair. Thank you.

The gentlelady from Michigan, Ms. Lawrence, is now recognized for questions. Thank you so much, madam chair. This year I introduced H.Res. 153 with my colleague Mr. Khanna, with regard to guidelines on the ethical development of AI.
Transparency of AI systems and processes, and what implications result from them, helping to empower women and underrepresented or marginalized populations. Right now we have the wild, wild west when it comes to AI. Artificial intelligence isn't the only emerging technology that requires the development of ethical guidelines. The same discussion must be carried over to the use of facial recognition. There was a member who introduced a statement from the Detroit Police Department. I represent a majority-minority district; the city of Detroit is one of my cities, and approximately 67 percent of my constituents are minorities, meaning the vast majority of my constituents have a higher likelihood of being misidentified by a system that was intended to increase security and reduce crime. Last month, NIST released the Facial Recognition Vendor Test Part 3 study, which looked at algorithms provided by the industry to evaluate accuracy across demographic groups. The report found higher rates of inaccuracy for minorities compared to Caucasians. Ms. Whittaker, you stated it: when algorithms are developed using a biased process, it is going to give you a biased result. And one of the things, and we asked the question initially, what can we do? First of all, there should not be any American citizen who is under surveillance where it's not identified and posted in a place where they can contact that company to say, what are you using my image for? We in America have the right to know if we're under surveillance and what you are doing with it. Another thing: any release of data that you're gathering should be required to go through some type of process for the release of that data. I can't just put a camera up, gather facial information, and sell it. We're having this conversation about the Ring doorbell.
We know that it is helping to catch criminals, but if you're going to give the information from Ring to the local police department, there should be some formal process of disclosure and inclusion to the public, so they know that that's happening. I'm very concerned about the movement of this technology. Some places have just said we're not going to use it, but we know this technology is here and it's moving forward. Instead of just saying don't use it, we need to be, as Congress, very proactive in setting ethical standards, and have an expectation that our public can say: if my image is being used, I know, and I have a right to know what my rights are. That's something I feel strongly about. Ms. Whittaker, in your opinion, with so many variations of accuracy in the technology, what can we do to take out these biases? We know that these algorithms have biases. What can we do as a Congress to ensure that we're stopping them? Thank you for the question. I think, you know, when we talk about this technology racing forward, we have had an industry that raced forward, selling these technologies, making these technologies, making claims to accuracy that end up not being accurate for everyone. What we have not seen is validation race forward. We have not seen understanding and new mechanisms for real consent, not just notice. I think we need to pause the technology and let the rest of it catch up, so that we don't allow corporate interests and corporate technology to race ahead and be built into our core infrastructure without having put the safeguards in place. Now, the police chief in Detroit submitted a statement for the record, and I said this to him face to face, and he made a promise that there will never be a trial in court based solely on facial recognition.
There should be something in our civil rights law and justice system that does not allow a person to be prosecuted based solely on facial recognition, given that we know this data is not accurate and has biases, and that's something I think we as a Congress should do. Thank you, and my time has expired. Thank you. You've raised a lot of very good points.

The gentleman from Ohio, Mr. Jordan, is now recognized for questions. Thank you, madam chair. Ms. Whittaker, it's wrong sometimes, isn't it? And it's disproportionately wrong for people of color, is that right? And this all happens, my understanding is, in a country, in the United States, where we now have close to 50 million surveillance or security cameras across the nation, is that right? You can say yes; you can't just nod. Okay. And we talked earlier about context. I think a number of witnesses have talked about context, and the context of opening your phone is different than your apartment complex having a camera there. We're talking about the private sector, but it seems to me the real context concern is what's happening with the government, as a number of my colleagues point out, and how the government may use this technology. We know the American Bar Association said facial recognition was used by Baltimore police to monitor protesters after the death of Freddie Gray a few years ago in the city of Baltimore, which is scary in and of itself. And then you had five bullet points, and I appreciate what you're doing with the institute that you cofounded, but in point number five you said this: facial recognition poses an existential threat to democracy and liberty. That's my main concern, how government may use this to harm our First Amendment and Fourth Amendment liberties, and so you've got to think about context in an even broader sense. I think we have to evaluate it in light of what we have seen the federal government do in just the last several years.
You know how many times the FBI lied to the FISA court in the summer of 2016 when they sought a warrant to spy on a fellow American citizen; you remember Mr. Horowitz's report from last month, Ms. Whittaker? I do, though I don't remember the exact number. Seventeen times. Seventeen times they misled a court, where they go to the court and there's no advocate looking out for the rights of the citizen who's going to lose their liberty, who's going to be surveilled, and seventeen times they misled the court. And we found out it was worse than we thought: they didn't spy on one American, they spied on four Americans associated with a presidential campaign. That has probably never happened in American history. So when we talk about context, it's not just how facial recognition can be used by the government. We know it has been; it was used in Baltimore to surveil protesters. And you view it in a broader context where the FBI went after four American citizens associated with a presidential campaign, and we know they misled the courts in the initial application and through seventeen times. And of course, that's after what happened a decade ago, when the IRS targeted people for their political beliefs. There was no facial recognition technology there; they just did it, went out and targeted groups, asked them questions: do you pray at your meetings, who was a guest at your meetings, before they could get tax-exempt status. So this is the context. When we talk about why we're nervous about this, context is critical, and the context that is most critical and most concerning, I think, to Republicans and Democrats on this committee, and frankly to all kinds of people around the country who have taken time to look into this a little bit, is how the government will use it and potentially violate their most basic liberties. And that's what we're out to get at. And you said in your testimony, in bullet point number five:
It's time to halt the use of facial recognition in sensitive social and political contexts. Can you elaborate on that? When you say you're looking for a flat-out moratorium, expanding it, stopping it, what would you recommend, Ms. Whittaker? Thank you for that question and that statement. I would recommend that. I would also recommend that the communities on whom this is going to be used have a say in where it's halted and where it may be deployed. Are the people on whom it's used comfortable with its use? Do they have the information they need to assess the potential harm to themselves and their communities? Have they been given the information they need to do that? Are you talking in a private sector context, like, I think, the reference would be an apartment complex, where you can enter with your face versus a key or some kind of fob, or are you talking... explain to me. Elaborate on that if you could. Yeah, absolutely. I'm talking about both, and I think the Baltimore PD example is instructive here, because the Baltimore PD was using private sector technologies. They were scanning Instagram photos through a service called Geofeedia that gave them feeds from the Freddie Gray protests. They then were matching those photos against a privately developed facial recognition algorithm to identify people with warrants, whom they could then potentially harass. So there's an interlocking relationship, as I say in my written testimony, between the private sector, who are essentially the only ones with the resources to build and maintain these systems at scale, and the government use of these systems.
So there are two levels of obscurity. There's the law enforcement exemption and the military exemption, where we don't get information on the government's use of the technology, and there's corporate secrecy, and these interlock to create total obscurity for the people who are bearing the costs of these technologies. Thank you; my time has expired, madam chair. Thank you.

The gentleman from California, Mr. Gomez, is now recognized for questions. First, every time I listen to a discussion on facial recognition, more and more questions emerge. It's amazing. I'd like to thank my colleagues on both sides of the aisle. I know folks think that Democrats don't care about liberties or freedoms, but we do. We also care about not only the public space, but the bedroom and one's own body; that's the way I approach this issue, from a very personal perspective. I've made my concerns about this technology pretty clear, you know, the dangers it poses to people of color, and racial bias in artificial intelligence, and as I was looking into it, Amazon continued to come up, because they're one of the most aggressive marketers of this new technology, and they do it under a shroud of secrecy. I want to be clear: I know that this technology isn't going anywhere. It's hard to put limits on technology, especially using the law, and I've seen this time and time again, coming from California, where you have large companies. They understand that the wheels of government turn slowly, so if they can just move quickly, they will outpace, outrun the government in putting any kind of limitations in place. You've seen this with some scooter companies who dumped thousands of scooters on the streets with no regulations, and all of a sudden it forces the government to react. But we will react, and we'll start putting some limitations on it. I know that it is tough, but there are a lot of questions.
One of the things I've been trying to figure out is what agencies, what companies, what federal authorities are using it, how they are using it, who sold it to them, and whether there's a third-party validator that has evaluated its accuracy, because when this technology does make a mistake, the consequences can be severe. According to the study, in applications such as visa or passport fraud detection or surveillance, a false positive match to another individual could lead to detention or deportation. Dr. Romine, the recent study found that facial recognition technology not only makes mistakes, but that they occur more often when the individual is a racial minority, a woman, or a child, is that correct? For most algorithms, that's correct. Did you find these problems were limited to just a few developers, or was the bias in accuracy more widespread? It was mostly widespread, but there were some developers whose accuracy was sufficiently high that the demographic effects were minimal. Question: has Amazon ever submitted their technology for review? They have not, but we have had ongoing discussions with them about how we can come to an agreement on their submitting the algorithm. It is an active situation. How long has that been ongoing? Some months at least. This is in the context of them trying to put out a blog post, and that blog post regarding the principles you are referring to was then in response to a letter that I and Senator Markey sent to them. You would think it would be more than just a blog post, that it would be something more serious that rises to the level of our concerns. But with that, I want to ask each of you: can each of you discuss the implications of the newly released report on the use of facial recognition software? What are the potential harms of using biased systems?
I think the benefit of the report is that it discloses the bias that is present in many of the algorithms being used, and it gives consumers, both individuals and businesses who might be selecting these algorithms for use, good information on which to make their choices. I want to make the point that even though a large number of algorithms were tested, those are not equally spread across the market in terms of market share. The vast majority of the market right now at the high end, particularly government contracts at the federal, state, and local levels, as well as high-end commercial uses like the NFL or sports stadiums or venues or amusement parks and things like that, overwhelmingly already employs the algorithms that are at the top end of this spectrum and that had very low error rates. It's not an evenly distributed problem, and that's part of the problem: understanding where the algorithms that are causing the most harm are being used, and by whom. Ms. Whittaker, I will let you answer. Thank you. Actually, I think it's important to emphasize, as Mr. Jordan did, that accurate facial recognition can also be harmful. So bias is one set of problems, but the harms go beyond that. Where facial recognition is being used with social consequences, we will see harm from these racially and gender-biased disparate impacts. We can look at the case of Willie Lynch in Florida, who was identified solely based on a low-confidence facial recognition match to a cell phone photo taken by an officer. He's now serving eight years based on that match, and he struggled, and was eventually denied, in trying to get that evidence released during his trial. We are seeing high stakes that really compromise life and liberty here from the use of these biased algorithms. In response to the question of where they are being used, and which algorithms are being used: we don't have public documentation of that information.
We don't have a way to audit that, and we don't have a way to audit whether NIST's results in a laboratory represent the performance in different contexts, like amusement parks or stadiums or wherever else. There's a big gap in the auditing standards, though the audits we have had have shown extremely concerning results. With that, I yield back. Thank you.

The gentlewoman from West Virginia, Ms. Miller, is now recognized for questions. Thank you, Chairwoman Maloney and Ranking Member Jordan. As technology evolves, it is important that we stay on top of it. I saw firsthand how they were using facial recognition as a form of payment when I was in China. I was also exposed to several concerning uses of facial recognition technology. As a committee, it is our responsibility to make sure that anything that is done in the United States is done thoughtfully and prioritizes safety and individual security. Mr. Parker, when I'm at a busy airport, I have CLEAR to get through. Even though we have TSA, when in a hurry it's really nice that you can use facial recognition and go forward. Can you elaborate on some examples of beneficial uses for consumers and businesses? Sure, and I'll focus on private sector uses, but also security- and safety-related ones. One important one is protecting people against identity theft and fraud. You may not think about that, but here's how it works. Someone walks into a bank and asks to open up a line of credit, using a fake driver's license with the customer's real information; the teller tells them they will have their photo taken. A comparison is made, and it may be determined they are not who they say they are. They go talk to management; by that time, the person intending to commit fraud is probably long gone. That's a really useful application of the technology that people don't think about. Also, from our industry's perspective, facial recognition is able to provide additional security for facility access control. It's typically used to augment other credentials such as keys or cards, but those things can be shared, stolen, or simply lost.
Biometric entry systems provide additional convenience to registered users, for example, for entry into an office building or commercial offices during rush times. Another example is reducing organized retail crime and theft, which has skyrocketed in recent years, hurting American businesses, consumers, and taxpayers alike. Do you think the mainstream media outlets have given an honest portrayal of how this technology is utilized and the reality of its capabilities? I'm sorry, no, I don't think so. This is a complex issue, as we've been talking about, and it tends to get oversimplified and mischaracterized. Going back to what I said earlier, I think the issue causing some concern is how the technology is used, not the technology itself. There are other technologies that could be used in similar ways, and we need to think more constructively about what the rules should be for the uses of these different types of technology. Thank you. Dr. Romine, I have a very good friend in West Virginia by the name of Chuck Romine, and his son is Dr. David Romine, but if we scanned both of you, you would not look anything alike. During a Homeland Security hearing on July 11, in your testimony, you discussed accuracy rates across multiple demographics and said inaccurate results are diminishing. Now that you have published the report, is that still accurate? In what other areas is this technology improving? I hope my statement in July was that the most accurate algorithms are exhibiting diminishing demographic effects, and we do believe the report released just last month confirms that. You also stated that any time the overall performance of a system improves, the effects on different demographics decrease. Is that something that is still true to this day? That is correct.
Knowing that accuracy rates improved between 2014 and 2018, can you further explain the role of performance rates and why they are important for the end users of these technologies? Absolutely. It's essential that in the selection of a system you understand the algorithm that the system uses, and select for an accuracy that is sufficiently robust to provide you with minimized risk for the application. In some cases the application may have very limited risk, and the algorithm may not be as important. But in other cases the risk may be severe, such as identification of suspects, for example, or access to critical infrastructure, if facial recognition is being used for that; there you want the algorithm at the basis of your system to be high-performance. Could you speak to what research and techniques exist to mitigate performance differences among demographics, and what emerging research and standards NIST is interested in supporting? Thank you for the question. Although we didn't specify many of the mitigations that we would expect people to adopt today, one of the things that we do want to do is point policymakers and consumers to ways in which these things can be mitigated. One mitigation can be the determination of an appropriate threshold: to ensure that for any algorithm you use, you set a threshold appropriate to the use case. Another is the possible use of a separate biometric. In addition to the face, having a fingerprint or an iris scan or some other type of biometric involved would help to reduce the error substantially. Thank you. I yield back my time.

The gentlewoman from Massachusetts, Ms. Pressley, is recognized for questions. Thank you, madam chair. The use of facial recognition technology continues to grow at a breathtaking pace and has now seeped into nearly every aspect of our daily lives.
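The two mitigations Dr. Romine describes above, a match threshold chosen to fit the use case's risk and a second biometric alongside the face, can be sketched in code. This is an illustrative sketch only: the use-case names, threshold values, and similarity scores are hypothetical, not NIST guidance or any vendor's API.

```python
# Sketch of threshold-per-use-case matching plus a second biometric factor.
# All names and numbers below are hypothetical illustrations.

# Mitigation 1: pick the match threshold to suit the use case's risk.
THRESHOLDS = {
    "photo_tagging": 0.70,           # low risk: favor convenience
    "building_access": 0.90,         # higher risk: favor fewer false accepts
    "suspect_identification": 0.99,  # severe risk: demand very high confidence
}

def face_match(similarity: float, use_case: str) -> bool:
    """Accept a face match only if it clears the use case's threshold."""
    return similarity >= THRESHOLDS[use_case]

# Mitigation 2: require a second biometric (e.g., fingerprint), so a single
# marginal face score is never sufficient on its own.
def multi_factor_match(face_sim: float, fingerprint_sim: float, use_case: str) -> bool:
    return face_match(face_sim, use_case) and fingerprint_sim >= 0.95

# The same 0.92 face similarity clears the building-access threshold...
print(face_match(0.92, "building_access"))         # True
# ...but is rejected for the higher-stakes suspect-identification use case.
print(face_match(0.92, "suspect_identification"))  # False
```

In a real deployment the thresholds would be calibrated against measured false match and false non-match rates for each use case rather than set by hand.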
Many families are unaware their faces are being mined as they walk through the mall, down the aisles of the grocery store, into their homes or apartment complexes, and even as they drop their children off at school. In response, several municipalities, including some within the Massachusetts Seventh Congressional District, which I represent, have stepped up to the plate to protect their residents from this technology. We know the logical end of surveillance is often overpolicing and the criminalization of vulnerable and marginalized communities. That is also why my colleagues have worked on legislation to protect those living in public housing from this technology. More recently, school districts have begun to deploy facial analytics in school buildings and in summer camps, collecting data on teachers, parents, and students alike. How widespread is the use of this technology on children in schools? We are seeing facial recognition systems being implemented more and more in schools. I think the actual number is still very small in terms of the percentage of schools in this country, but it is certainly growing. There is really just no good justification for a facial recognition system in a K-12 school. They are mainly being used in security applications, but they do not adequately address those threats in any meaningful way, and they are not the best use of funds or the best way to heighten security around schools in response to those threats. The other part of your question concerns facial characterization programs, which I think are being used more in an educational context: what is the response rate of students to certain teachers or types of teaching, and things like that. That is based on very questionable data at this point, and I think it definitely qualifies in the not-ready-for-prime-time category, in the sense that we are seeing it very quickly applied in many use cases where the science and research are not there to back it up.
It is particularly concerning when you're talking about children in schools, not only because they are essentially a captive population, but because the labels or decisions that might be made about those children based on that data might be very difficult to later challenge or in any way mitigate for that particular child. There are also security and privacy concerns. Your study found that the error rate of facial analytics software actually increased when identifying children; is that correct? For most algorithms, that is correct. Why was that? We don't know the cause and effect exactly. There is speculation that children's faces, having less life experience, are less feature-rich. We don't know that for sure, because with the neural networks being used, it is difficult to make a determination of the reason. Got it. Many of you have mentioned ways in which these image databases can be vulnerable to hacking or manipulation. When children's images are stored in databases, are there security concerns that may arise? Security for minors is always a concern. This technology is clearly biased, inaccurate, and more dangerous when used in schools, where black and brown students are overpoliced and disciplined more than their peers for the same minor infractions. In Massachusetts, black girls are six times more likely to be suspended from school for the same infractions as their white peers. They don't need facial recognition technology that can misidentify them. Last fall, I introduced an act which would urge schools to abandon overpolicing and surveillance and to instead invest resources in access to counselors and mental health professionals, resources that will really keep kids safe. In my home state of Massachusetts, a broad coalition of educators, civil rights and children's rights activists is leading the fight in saying no to the deployment of facial recognition technology in our schools. I am grateful for their activism and solidarity on this issue.
I would like to include for the record a letter from the NAACP, the ACLU of Massachusetts, and many others urging our state to reject this surveillance and policing in our schools. Thank you. And I yield. Thank you.

The gentleman from North Dakota, Mr. Armstrong, is now recognized for questions. Thank you. I think there are a couple of things we should talk about for a second, because I think they are important. I am going to go to the Fourth Amendment and the criminal context and how this can be deployed there. This is not the first time we have seen a crisis in the Fourth Amendment happen. It happened with telephoto lenses, distance microphones, GPS trackers, drones, and now we are at facial recognition. To be fair, the Fourth Amendment has survived pretty well. Biometric information has a different connotation, which we will get to in a second. I agree with Ranking Member Jordan that we should not just let the courts decide; the courts will take a constitutional view of privacy. With these types of issues, I will be the first to admit that facial recognition did not work on my phone over Christmas. You know what I did? I drove immediately to the cell phone store and got a new one. The Carpenter case is a pretty good example of how at least the U.S. Supreme Court is willing to change how it views privacy in the digital age. Part of our job as Congress is to ensure that we write laws and regulations that maintain those types of privacy standards. One of the reasons biometrics are a little different is that there is one unique thing in a criminal case that is really relevant to facial recognition, and that is that identity cannot be suppressed. I can suppress marijuana, a gun, a dead body, but I cannot suppress identity. We need to apply a statutory exclusionary rule; otherwise, any regulations we pass do not truly matter in a courtroom. We have to figure out a way to have meaningful human review in these cases.
There has to be an underlying offense or a crime. That is really important. I think it is important to recognize that not all populations are the same. There is a big difference between using facial recognition in a prison setting, and even quite frankly in a TSA or border setting, than there is for a law enforcement officer walking down the street with a body camera. We have to continue to have those conversations. I also want to point out one of the things we have to do when we are dealing with these types of things in the law enforcement scenario, and I don't care what law enforcement says: figure out a way to account for false positives. The reason I say that is, in North Dakota, we have highway patrolmen who have drug dogs. Not all of them, but some of them. If you are speeding down the highway going 75 in a 55 and get pulled over, and that highway patrolman happens to have a drug dog in his car, and he walks that dog around your car and the dog alerts, and they search your car and don't find any drugs, and they let you leave, that data never shows up in that dog's training records. It never shows up. When you are talking about the accuracy of a drug dog, or the accuracy of finding a missing girl, or any of those issues, we cannot wait until that situation arises. If there is a missing girl on the Mall out here, I will be the first one standing at the top of the Capitol steps saying, use whatever technology, grab everybody you can, let's find this little girl. I agree with that. But you cannot have meaningful regulation unless you have meaningful enforcement. One of the concerns I have with deploying this technology in a law enforcement setting is that, by the nature of how it works, it is very difficult to deal with those false positives. My question, when we are talking about the missing girl, is: how many people were stopped?
We have to be able to have a situation in place where we can hold people accountable. The only way I can think of to do that is to use it in prison populations and perfect it. The problem with a prison population is that you have a static population. I think when we move forward with this, particularly in a law enforcement and criminal setting, we have to recognize the fact that you cannot suppress identity. It is a lot different than other technologies. If your threshold is 90 and you stop somebody at 60, and it still happens to be that person, under the current criminal framework I can make that motion, and the judge will rule against it and say, too bad, still arrested. With that, I yield back.

The gentleman from Michigan is now recognized for questions. Thank you, madam chair. I think many of you probably already know I am particularly disturbed by the prospect of facial recognition technology being used by landlords and property owners to monitor tenants, especially in public housing units. In Detroit, for example, the city's public housing authority recently installed security cameras on public housing units. That is something we believe will encroach on people's privacy and civil liberties. These are people's homes. I don't think being poor or being working class means you somehow deserve fewer civil liberties or less privacy. What are the privacy concerns associated with enabling facial recognition software to monitor public housing units? If you live in a low-income community, are your civil liberties or privacy lessened? Thank you for the question. Of course not; at least, hopefully not. What is the problem they are trying to solve by putting this into a housing complex, any housing complex? What is the landlord or owner trying to gain: is it convenience, some level of convenience, some level of security? What is he trying to gain from it? And with that in mind, what are the risks to the occupants?
In my opinion, that would be a commercial use. Even if it was installed, it would be only for those residents who chose to opt in and enroll and use it as their way in and out of the building. For those that opt out, they should not be included in that. From a civil liberties point of view, if this was being used in some way, the other laws about disparate impact or protected classes do not go out the window just because you use a new technology. They still need to be applied. These raise challenging questions.

These new technologies, they are for-profit, right?

Yes, the companies developing them.

They are for-profit and testing these technologies on people. I hear my good colleague from Massachusetts talk about them installing it in schools. They are using this, and I have a police chief that says this is magically going to make crime disappear. If you look, my residents don't feel less safe. They actually don't like this Green Light. It takes away people's human dignity when you are being policed and surveilled in that way. Now they are trying to use facial technology as, what do they call it, key fobs. They now want to control access to people's homes using facial recognition technology on key fobs. One of the consequences of that is misidentification. We talked about how he could not even activate his phone. I am really worried that my residents are being used as a testing ground for this type of technology. Do you have any comments in regards to that?

The algorithm testing that we do is to provide information to people who will make determinations of what is and what is not an appropriate use, whether that is this committee, any potential regulation or lack of regulation, and any deployment that is made in the private sector or otherwise.
I am really proud to be co-leading with Congresswomen Pressley and Clarke the No Biometric Barriers to Housing Act, which would completely ban facial recognition technology in federally funded housing buildings and properties. We should be very careful. I think Congressman Mark Meadows is right. I hear my colleagues on both sides saying we have got to fix the algorithms. We should not be in the business of fixing for-profit technology industries. They call them tools. They give them all these great names, but they are processes in place of human contact, of police officers on the street. I increasingly talk about this with the police chief and others, and all they can say is, well, we did this and we were able to do that. Like my colleague said, how many people did you have to go through? I watched while they matched a suspect with 170-plus people. I watched as they took a male suspect and matched him with a female. I watched the misleading of the public, the saying, well, you must not care about victims. I actually do care about victims. How about the victims you are misidentifying?

Thank you, and I really do appreciate all of your leadership on this. Thank you so much, chairwoman, for doing yet a third hearing on this and continuing this critical issue that I know was important to Chairman Cummings. Thank you very much.

The gentleman from Kentucky is now recognized for questions.

Thank you. I ask that you bear with me. I am battling laryngitis, and laryngitis with a bad accent does not spell success. There is bipartisan concern today for facial recognition technology as we move forward. My question is for the doctor, with respect to the National Institute of Standards and Technology's testing. What is its role in establishing government-wide policy?

The only role that we have with respect to government-wide policy is providing the scientific underpinning to make sound decisions.
As a neutral, unbiased agency, we are able to conduct the testing and provide scientific data that can be used by policymakers to make sound policy.

How does a technical standard differ from a policy standard?

Certainly, technical standards can be used by policymakers. In this case, a determination of policy that was predicated on the certification of algorithms based on their performance characteristics would be one example of that. From a policy perspective of what to do or what not to do with face recognition technology, that is something we would support with scientific data, but not with policy proclamations.

Let me ask you this. Is this the right agency to develop government-wide policy?

I don't think so, sir.

What is the agency's role in developing accuracy standards for facial recognition technology?

Our role is in evaluating the accuracy and, in particular, the appropriate measurements to make. These measurements did not exist. We worked with the community to develop a set of technical standards for not just the measurement itself, but how to measure these things, including the reporting of false positives and false negatives and the very detailed definition of what those constitute.

Thank you. Mr. Parker, I understand that the security industry supports the U.S. Chamber of Commerce's recently released facial recognition policy principles. What are the principles and why do you support them?

Thank you for the question. I think the chamber put a lot of really great work into developing this framework. It mirrors some of the work that was done earlier. A multistakeholder process was convened that included other parties from the commercial sector about what appropriate commercial use looks like. Some of the principles have to do with transparency, obviously the main one. We were discussing earlier what should be done as far as consent. I think that will cover most cases.

Can you describe those principles?
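The two error rates the NIST testimony above refers to, false positives and false negatives, are conventionally reported as a false match rate and a false non-match rate at a chosen score threshold. The sketch below is a minimal illustration of those definitions, not NIST's actual evaluation code, and the comparison scores are made up.

```python
# Minimal sketch of the two error rates discussed in testimony:
# false match rate (FMR) and false non-match rate (FNMR) at a threshold.
impostor_scores = [0.12, 0.35, 0.48, 0.71, 0.22]   # different-person comparisons
genuine_scores  = [0.55, 0.83, 0.91, 0.64, 0.49]   # same-person comparisons

def fmr(scores, threshold):
    # fraction of different-person comparisons wrongly accepted as matches
    return sum(s >= threshold for s in scores) / len(scores)

def fnmr(scores, threshold):
    # fraction of same-person comparisons wrongly rejected as non-matches
    return sum(s < threshold for s in scores) / len(scores)

t = 0.5
print(fmr(impostor_scores, t))   # 0.2 -- the one impostor score (0.71) passes
print(fnmr(genuine_scores, t))   # 0.2 -- the one genuine score (0.49) fails
```

Raising the threshold trades false matches for false non-matches and vice versa, which is why a deployment's threshold choice, not just the algorithm, determines who gets wrongly flagged.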
How do those principles balance the need for civil liberties while also promoting industry innovation?

We are primarily talking about data privacy. It is different from the civil liberties concerns surrounding government use, primarily.

Let me follow up. What does the path ahead look like for these principles?

I think that the debate going on in Congress right now about establishing a national framework for data privacy is a really important one. How to set rules for use of the technology in the commercial setting belongs within that framework. We have had the GDPR in Europe. The United States is establishing its own framework. It could be a real problem for our economy if we don't establish standardized rules.

The gentleman from Virginia is now recognized for questions.

I thank the chair, and thank you all so much. We will have to really grapple with, what are the parameters of protecting privacy and controlling the use of this technology? One of the traps I hope we don't fall into, on my side of the aisle particularly, is continuously citing the false IDs. The technology's nature is that it will improve. What happens when it becomes 95 percent accurate? Then what? I would certainly argue that, irrespective of its accuracy, there are intrinsic concerns with this technology and its use. Maybe we have to look at things like opt-in and opt-out, where you actually require the consent of anybody whose face is at issue before it can be transferred to another party, whether you are government or non-government. Mr. Parker, you were talking about primarily being concerned about how government uses facial recognition technology. Any reason to believe the private sector might also generate some concerns?

Sure.
That is why we need to establish best practices about how it is used, particularly in any applications where there is any kind of serious consequence for errors.

Errors? Let me give you an example. IBM got one million photos from a photo hosting site called Flickr. It sent the link to that database, one million faces, to Chinese universities. That was not the government doing it; it was a private entity. It was not about accuracy; it was about an entire data set going to a foreign adversary who has a track record of actually using this technology to suppress and persecute minorities, for example, the Uighurs. We know they are doing that. Might you have any concern about a company like IBM engaging in that kind of behavior and transferring an entire data set to Chinese universities with close ties, obviously, to the Chinese government?

Yes, certainly. I think it is reflected in U.S. government policy, which established a restriction on exports to a number of Chinese companies, particularly those developing this technology we are talking about.

Ms. Whittaker, your views about that?

I think that highlights one of the issues that trying to implement consent raises, which is that those photos were already on Flickr. Those are photos somebody may have put up on a very different internet. These data sets are being used to train systems that may be erroneous and may violate civil liberties. Where we ask for consent, and how consent could work, given that we have a 20-year history where we have clicked through consent notifications without reading them, as a matter of habit, to get to the core technological infrastructures of our lives, remains a big open question. I think we need to be able to answer that.

I think we could agree, could we not, that whether I clicked on Flickr or allowed any other entity to have access to a photo, I never contemplated having that photo transferred to a foreign government or to a university with close ties to a foreign government.
Or to have a corporation use it to train a system that they sell to law enforcement in ways that target our community. There are a lot of things that we did not consent to.

It seems to me, this being the third hearing, where we have all expressed concern, that we have got some work to do in figuring out the rules of engagement here and how we protect fundamental privacy rights of citizens, unless we want to go down the road of expanding and transforming the whole definition of the zone of privacy. That is a very different debate. It seems to me that we cannot simply concede that the technology will drive the terms of reference for privacy. Thank you, madam chairman.

Thank you. The gentleman from Wisconsin is now recognized for questions.

Ok. Anybody else can jump in if they want, I guess. First of all, I would like to thank Mr. Connolly for his comments. The false inference is a major problem here, but getting false information is not, I think, the biggest concern. I think the biggest concern is that the technology gets better and better at the ill uses it is put to. I think sometimes, the less information the government has, the better. Go ahead.

Absolutely. I want to preface my answer by saying that I am an expert on artificial intelligence and I understand the tech industry very well; I am not a China expert. It is very clear that these technologies are being used in China to implement social control and the targeting of ethnic minorities. You have networks of facial recognition systems that are designed to recognize individuals as they go about their daily lives, and issue things like tickets if they jaywalk and are recognized by a facial recognition system.

People who attend religious ceremonies in China, could it be used there?

Absolutely. You are just seeing it deployed in a different context.

I attended a rally last night for President Trump.
Do you think it is possible that any facial recognition technology was being used there?

I don't know that it was, but the capacity for it certainly exists. We are just not told when it is used and where.

Would it surprise you if it were being used there?

No.

Ok. There is the concern I have. We have a government that has weighed in against certain people. The Ranking Member pointed out that the IRS in the past has shown a strong bias against conservatives. We have used the power of government against conservatives. We had a major presidential candidate a while ago who said he wants to take people's guns. Would it surprise you if facial recognition technology were being used to develop a database of people going to a gun show?

Facial recognition is being used to develop many different kinds of databases.

Ok. Kind of concerning there. To me, that is the major concern, that our country will work its way towards China. A while back, we had a presidential candidate hostilely question a prospective judge because they were part of the Knights of Columbus, which is kind of scary. Could you see the day where we use this technology to identify people attending a Catholic church, which apparently seems to bother some people?

That is the same principle as the Baltimore Police Department using it to see who attends a Freddie Gray rally and targeting them. It is already being used in that capacity.

If you set up a Catholic church in China, do you think the Chinese government would use facial recognition technology to know who was a member of that church? Identifying, in China, whether you show up at a Knights of Columbus meeting?

Anybody else want to comment on what is going on in China?

I think it is a model for extraordinary authoritarian control. Their technology is announced as state policy.
Here, this is primarily corporate technology being secretly threaded through core infrastructures without that kind of acknowledgment.

Is Amazon a big player here?

Absolutely. They have expressed strong political opinions. They certainly hire many lobbyists.

Ok. Thank you for giving me an extra few seconds.

Thank you. The gentlelady from New York is now recognized.

Thank you, Chairwoman Maloney. Thank you again for holding a third hearing on something that is so important and is such an emerging technological issue. We have heard a lot about the risk of harm to everyday people posed by facial recognition. I think it is important for people to really understand how widespread this is. You made a very important point just now that this is a potential tool of authoritarian regimes, correct?

Absolutely.

And that authoritarianism, or concentration of power, could be exercised by the state, as we see in China, but it could also be exercised by massive corporations, as we see in the United States. Correct?

Yes.

Could you remind us of some of the most common ways that companies collect facial recognition data?

Absolutely. Some scrape it from sites like Flickr. Some use Wikipedia. They collect it through massive network and market reach. Facebook is a good example of that.

If you ever posted a photo of yourself to Facebook, that can be used?

Absolutely, by Facebook and others.

Could using a Snapchat or Instagram filter help hone an algorithm for facial recognition?

Absolutely.

Can surveillance camera footage that you do not even know is being taken of you be used for facial recognition?

Yes. Currently, cameras are being designed for that. People think, I am going to put on a cute filter, and do not realize that data is being collected by a corporation or the state, depending on what country you are in, in order to track you, potentially to surveil you, for the rest of your lives?

Correct.

What can a consumer or constituent like mine do if they have been harmed by a company's improper collection?
We were talking about how facial recognition oftentimes has the highest error rates for black and brown Americans. One of the worst implications of this is that a computer algorithm can tell a black person that they have likely committed a crime when they are innocent. How can a consumer or a constituent really have any sort of recourse against a company or an agency if they have been misidentified?

Right now, there are very few ways. Litigation can be brought against a company, but one, you have to know your data was collected; two, you have to know it was misused; and three, you have to have the resources.

Let's say, as the technology spreads, you walk into a store in the mall, and because the error rates are higher for black and brown folks, you get misidentified as a criminal. You walk out, and let's say an officer stops you and says, we think you have been accused of a crime. You would have no idea that facial recognition may have been responsible for you being mistakenly accused of a crime. Is that correct?

That is correct. We have evidence that it is often not disclosed, which also compounds on our broken criminal justice system. The Willie Lynch case in Florida is a case in point. What we are seeing here is that these technologies are almost automating injustices, but also automating biases that compound on the lack of diversity in Silicon Valley as well. These companies absolutely do not reflect the general population, and the decisions and choices they make are in the interest of a small few.

This is some real-life Black Mirror stuff that we are seeing here. I think it is really important that everybody understand what is happening. This is happening secretly as well, correct?

Yes.

Thank you, and that's my time.

The gentleman from Pennsylvania is now recognized for five minutes.

Thank you, madam chair.
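The scale of the misidentification problem described in this exchange comes down to simple arithmetic: even a small false match rate produces many wrongly flagged people when an entire crowd is scanned, and a demographic group with a higher rate bears proportionally more of those wrongful flags. The figures below are hypothetical, chosen only to illustrate the multiplication; the tenfold disparity is of the same order as differences NIST reported between demographic groups.

```python
# Hypothetical arithmetic: why even "small" error rates matter at crowd scale,
# and why unequal error rates mean unequal wrongful stops.
crowd = 100_000          # innocent people scanned against a watchlist
fmr_group_a = 0.001      # assumed false match rate for one demographic group
fmr_group_b = 0.01       # an assumed 10x higher rate for another group

false_matches_a = crowd * fmr_group_a
false_matches_b = crowd * fmr_group_b

print(round(false_matches_a))   # 100  innocent people flagged in group A
print(round(false_matches_b))   # 1000 innocent people flagged in group B
```

The point is not the specific numbers but the structure: the harm scales with how many people are scanned, and any per-group disparity in the error rate becomes a per-group disparity in who gets wrongly stopped.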
I just want to say that we all represent many people who are probably not familiar with the commercial and government use of facial recognition technology. There is a lot of technology out there. I am grateful for the witnesses being here to help shed a little bit of light on the topic of facial recognition technology. When we look at whether there is a proper approach toward regulating the use of facial recognition technology, you need to balance personal privacy with whatever appropriate use there may be as a tool to make government or law enforcement capabilities more effective.

The reason I say this is, several years ago, something happened in the legal community called the CSI effect, where television shows exaggerated the prevalence of DNA and forensic evidence and the ease of its processing in criminal cases. Defense attorneys then used the public's new perception of this evidence to claim that the lack of enough forensic evidence meant the police did not do their due diligence. Today, many television shows and movies reference facial recognition technology as part of their storytelling. There are a lot of concerns here. I have concerns with the Fourth Amendment and all of the rights that we have. Mr. Parker, could you maybe explain to what extent you think current pop culture is filled with an exaggerated or distorted view of how prevalent the use is, or whether there is an appropriate use of facial recognition technology?

First of all, I do think, if you look at the portrayal of the technology in the media, it is far beyond what we can do right now. That is one thing to consider. The other thing is what we mentioned earlier about what is happening in China. Unfortunately, their government by policy is using technology, not just this one but many others, to persecute certain groups. Obviously, that is a horrible example of how technology can be misused. I think also the capability is different.
I am not an expert on China either. To use facial recognition systems, there has to be a database with people enrolled in it. I suspect there is a large database like that over there. I can speak on behalf of our members: we have no interest in helping the government at any level here do massive surveillance of citizens engaged in lawful activity. We have no interest in that. That is not the case right now as a system, and I have not seen evidence that that is what is intended, but certainly, that is not a place we want to go. You mentioned that technology can be a great tool, and it can. Phones can keep us very well connected and do things. They can be a great hindrance and distraction, too. A lot of people now bully using social media and so on, so that can happen with anything. It is a matter of how effectively we regulate it and make sure it does not get used inappropriately.

Do you think we could be looking at a possible CSI effect in terms of facial recognition being used by law enforcement?

I think that is a risk. The key here is to have really locked-down and thorough use policies and constraints. I think there are many uses in both the private and public sector where that is being done correctly. There are other cases we know less about because there is less transparency. A part of that is accountability measures that ensure use of those systems is auditable.

Ok. I appreciate that, because this is a very sensitive issue, and I do appreciate the opportunity of having these hearings so that more people are aware of what's happening. Thank you, and I yield back.

Thank you. I recognize the gentleman from New Mexico for questions.

Thank you, Mr. Chair. Thank you all so much for being here today. We appreciate your time and effort in this hearing. I recently read that some employers have begun using facial recognition technology to help decide whom to hire. At certain companies, such as Hilton and Unilever, job applicants can record videos.
The algorithm then ranked the applicant against other applicants based on a so-called employability score. Job applicants who look and sound most like the current high performers at the company received the highest scores. I have two questions for you. Is it true that the use of facial recognition and characterization technology in job application processes may contribute to biases in hiring practices? If yes, can you please elaborate?

It is absolutely true. The scenario you described so well is a scenario in which you create a biased feedback loop, in which the people already hired and promoted become the models for what a good employee looks like. If you look at the executive suite at Goldman Sachs, you see a lot of white men. If that becomes the model for what a successful worker looks like, and that is used to judge whether my face looks successful enough to get a job interview at Goldman Sachs, we are going to see a kind of confirmation bias in which people are excluded from opportunity because they happen not to look like the people who have already been hired.

Thank you so much for that. Do you agree that granting higher employability scores to candidates who look and sound like high-ranking employees may lead to less diversity in hiring?

I would agree. I would also say that that methodology is not backed by scientific consensus.

Thank you. Do you envision any privacy concerns that may arise when employers collect and use the data generated from video job interviews?

Yes, thank you for the question. That is absolutely a concern, since the individuals may not be aware of what data is being collected, especially if some of those systems are being used in an in-person interview. The person may or may not be aware of whether that is part of the decision-making process for their application.

Thank you so much. I, like many of my colleagues, have expressed concern over the use of this technology.
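The biased feedback loop described in this testimony can be reduced to a toy sketch: if an "employability score" is just similarity to the people already hired, a homogeneous workforce guarantees that dissimilar applicants score low regardless of ability. Everything here (the single `trait` field, the scoring rule) is a hypothetical stand-in for the opaque commercial systems discussed, not how any vendor's product actually works.

```python
# Toy sketch of the hiring feedback loop: scoring applicants by similarity to
# incumbent "high performers" reproduces whatever the current workforce looks like.
incumbents = [{"trait": "A"}, {"trait": "A"}, {"trait": "A"}]  # homogeneous hires

def employability_score(applicant, models):
    # fraction of incumbent "models" the applicant resembles -- a stand-in for
    # the similarity scoring described in testimony
    matches = sum(applicant["trait"] == m["trait"] for m in models)
    return matches / len(models)

print(employability_score({"trait": "A"}, incumbents))  # 1.0 -- resembles incumbents
print(employability_score({"trait": "B"}, incumbents))  # 0.0 -- excluded regardless of ability
```

If the top-scoring applicants are then hired and added to `incumbents`, the next round's model set is even more homogeneous, which is exactly the confirmation-bias loop the witness describes.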
I am concerned that facial recognition technology disenfranchises individuals who do not have access to internet or video devices. I am worried that relying on algorithms to predict high-ranking employees will only inhibit the hiring of a more diverse workforce. Your testimony today highlighted many of these risks. Commercial face recognition algorithms misidentify racial minorities and women at substantially higher rates than white males. We must develop legislation to ensure we get the best of the benefits of this technology while minimizing the risks of bias in employment decisions. I yield back.

That concludes our hearing. We have no other witnesses. I am recognizing the Ranking Member and others on this side of the aisle for five minutes, and then we will close with five minutes.

Rep. Jordan: The broad outlines of what we are trying to do legislatively, sort of as a start, and we are working with the chair and members of the majority, is really first just an assessment. I am talking again largely about what government is doing, what the federal government is doing. So the first thing we would like to ask: we want to know which agencies are using this, how they are using it, and to what extent it is happening. And some of you testified, Ms. Whittaker, we don't know that. We don't know to what extent the F.B.I. or the I.R.S. or any other agency is using it. We found out a few years ago the I.R.S. was using Stingray technology, which was like, what does the I.R.S. need that for? So the first part of what we hope will be legislation that we can have broad support on, that the chairman and both the Republicans and Democrats can support, is: tell us what's going on now. And then second, while we are trying to figure that out, while the study and the accountability work is happening, let's not expand it. Let's just start there.
Tell us what you're doing, and don't do anything new while we are trying to figure out what you're doing. And then once we get that information, we can move from there. That is what I hope we can start with, madam chair, and frankly what we've been working on now for a year, staffs for both the majority and the minority. So I hope, and I see a number of you nodding your heads, that that's someplace you all would be supportive of us going as a committee and as a Congress, just to figure out what's going on. With that I yield to my colleague from North Dakota, if that's ok, madam chair, for the remainder of our time.

CSI was my favorite show when I practiced criminal defense. And if this body passed a law that shut off everybody's facial recognition on their iPhones tomorrow, I think we would have a whole different type of perspective on this from our citizens. So I think identifying people quickly and easily has real positive law enforcement and safety applications, and I think it would be irresponsible to disregard this technology completely. My intent in asking this question is not to demonize law enforcement. They would use whatever tools are available to them, and they should. I do think we should also recognize that there are very responsible large corporations that want to get this right, and they don't want to get it right just for the bottom line, although that is helpful. They have corporate cultures as well. More importantly, there are those of them arguing for a federal regulatory framework. We have to be cognizant that this is different than a lot of other things. Identity never goes away. Mass surveillance of the American population has a chilling effect on freedom of association. I disagree with Mr. Jordan and I disagree with Mr. Connolly. Things will happen. I don't want a false positive. I don't want any false positives.
My number one concern is not only who is using this, but where they are doing it and how they are doing it. That matters tremendously to a lot of people. And this will not be the last time we sound off on these issues; data sharing and related issues will still be with us by the time we get around to addressing them. I really appreciate that we had this hearing.

I thank the panelists and my colleagues for participating in this hearing. We have another member, Mr. Bilirakis, on his way. He has been misidentified. He is a member of the committee but is serving on another committee, and is rushing here to share his experiences with us, and I want to allow him to give the information he has on this issue personally. One of the things that came out of the hearing is that this technology is not ready for prime time. It can be used in a positive way, but as many witnesses pointed out, Ms. Throckmorton described a case where a person was innocent yet was put into jail based on false information. It needs to be used in positive ways, but it can also severely impact the civil rights and liberties of individuals. At this point I would like to recognize my colleague from the great state of California for his statement, because he was misidentified. He was one of 28 members whom the American Civil Liberties Union showed were misidentified.

We have a relationship with a lot of tech companies, and that relationship has been strained: the benefit this technology could give us, set against the over-marketing and lack of social responsibility. I had a privacy bill in the legislature that was killed, and it came from a District Attorney. A District Attorney told me about a serial rapist who used a third party he was paying to get victims' information, and the bill provided an opt-out. It was killed dramatically in the first committee after I was able to get it out of the Senate, and Mr. Gomez helped me. In that context, if I had a dime for every time one of these companies did that, I would be a wealthy person.
I appreciate the work you do, but in the context of facial recognition, what is a meaningful reflection of what Justice Brandeis said, that Americans have a right to be left alone? How are you left alone in this kind of surveillance economy? With facial recognition, it is important to get it right, but there is also the overlay of all the data accumulation. What is the danger in allowing companies like Facebook to simply absorb a penalty when they quite consciously, and I refer to my former friends in the Bay Area as comfortable and self-righteous, think it is all right to take advantage of people without thinking of the social consequences, given that they were willing to absorb the $5 billion settlement they agreed to? In this culture, what is the danger in allowing companies like Facebook to have access not just to facial templates but to the interaction with all the other data they have collected?

Thank you for the question. That demonstrates the interrelationship between public and private uses of the technology, in beneficial or not-so-beneficial ways, and your earlier comment spoke to the nature of surveillance technology: what do we want to accept and live with in our country based on our values, and how does technology enable that? I was not asked to show my identification here today, even though in most buildings I would have to show it; to show up and go to a meeting I would have to show my ID. I was checked for a physical security threat with the scanner, but I was not required to identify myself. Yet I could be collectively, passively identified by the video feeds I saw in this seat of government. That demonstrates we need to focus on what these things are and discuss them clearly in terms of values, freedoms, and liberties, and not let the technology, just because it is here and it is convenient to do certain things, impinge on those in ways where we are not ready to accept the compromises. How do Americans feel about being left alone in this environment?
In a commercial context, businesses should not be using facial recognition technology unless a person says they want to use it for the benefit of the convenience it provides. If I want to use it as a member of a retail establishment, to get the VIP privileges at a hotel, or to expedite my check-in at a conference, I can choose to do that, but I would know I was doing it. I would have to enroll in the system consciously. It couldn't happen without my awareness.

Who owns the data when we look at this? Car companies claim they own the diagnostics, and all these private sectors say they own it. Shouldn't we own it?

Ownership is a very complicated way to look at data, because it isn't something that should necessarily be sold, which is the nature of property. But in terms of rights, who gets to use it should be clearly spelled out. If I agree to a certain service in return for enrolling in a facial recognition system, I have a reasonable expectation not to have the data scraped for other undisclosed purposes.

Thank you for indulging the schedule. I am so glad you could get back. I think this hearing showed that this technology is in wide-scale use. We don't have a sense of how widely it is used, but there is very little transparency about how or why it is being used and what security measures are in place to protect the American people from that use, along with their own privacy concerns. We also have the dual challenge of not only encouraging and promoting innovation but also protecting the privacy and safety of the American consumer. I was very much interested in the passion on both sides of the aisle to work on this and get some accountability. It should be bipartisan. I firmly believe the best legislation is always bipartisan, and I hope to work in a committed way with my colleagues on the other side of the aisle to come up with facial recognition legislation. I would now like to recognize for closing Mr. Gomez, who was also misidentified and has personal experience.
Thank you, and thank you very much to all our panelists for being here. For all the questions we asked, we have twice as many that we didn't have a chance to ask. I want people to walk away understanding that this is a technology that is not going away; it will get further integrated into our lives through the private sector and government. Now we have to figure out what that means. At the same time, I don't want people to think false positives are not a big deal. For the people who are falsely identified as a particular person, and it changes their life, it is a big deal. When people downplay it, saying the technology is getting better and it's not that big a deal, remember the one person who goes to jail, who gets pulled over, the one person who maybe doesn't make it to work on time and loses their job. It has a ripple effect of devastation on their lives; it matters to them, and it should matter to all of us. It is not one or the other. The technology will get better and better, and we have to put parameters on its use, but there are still a lot of questions. Ms. Whittaker described it correctly: when I started looking into this issue, I ran into the brick wall of national security claims, plus the corporate sector saying the information is proprietary when it comes to the technology. That wall must come down, no matter where you are on the political spectrum. How do we make sure the wall comes down in a responsible way that keeps innovation going? Thank you for this important hearing.

Members have five legislative days to submit written questions for the witnesses to the chair, which will be forwarded to the witnesses for their response. This hearing is adjourned. Thank you. So glad you got back.

[inaudible conversations]

House Majority Leader Steny Hoyer and Congressman John Sarbanes had a discussion about the tenth anniversary of the Citizens United v. FEC ruling.
Coverage begins at noon Eastern on C-SPAN2. For the third time in history, a president is on trial in the U.S. Senate. Watch Tuesday on C-SPAN2 as the Senate begins the trial with a vote on rules: the Senate impeachment trial of Donald Trump, with live, unfiltered coverage on C-SPAN2, on demand at c-span.org/impeachment, and on the free C-SPAN Radio app. Iowa Governor Kim Reynolds delivered the State of the State address at the state capitol in Des Moines earlier this week. She talks about budget ri
