...the quality of information, or introducing statistical noise, so that the biometric data is less usable. A technique called template protection can ensure that one system's biometric information is encrypted such that it cannot be read by another system; for example, someone's image obtained from a system at a doctor's or psychiatrist's office cannot be linked to a workplace identity verification system. Authorities are already working to improve privacy protection technologies. The America COMPETES Act contains a number of provisions that will future-proof the government's definitions and standards for biometric identification systems and invest in privacy-enhancing technologies. I look forward to hearing from our witnesses as biometrics become more prevalent in our daily lives. The timing of our discussion today is notable. The Supreme Court recently substantially weakened the right to privacy in overturning Roe v. Wade. Third parties may try to access biometric information to collect the bounties now being offered by some states to enforce their new laws. I would observe that some of our witnesses' testimony came late for this hearing, and I apologize to the other members of the subcommittee that we didn't have the usual amount of time we would like to have had to prepare. The chair will now recognize the Ranking Member for a statement.

Rep. Obernolte: Good morning, everyone. I am excited about our hearing this morning on the benefits and risks of biometric technologies and exploring research opportunities in these technologies. I am hoping this hearing turns into a productive discussion. I was reflecting this morning on the fact that biometric technology has changed the way we live our lives. This morning, I used facial recognition to open my phone. I used the fingerprint reader to open my computer. My car recognized my face to set the seat settings, and it used facial recognition to make sure I was paying attention to the road. It's amazing to think this was once the stuff of science fiction, and now we take it completely for granted. Biometrics bring a lot of benefits to our daily lives, and we're going to make sure that we are able to continue to enjoy those benefits while protecting the privacy of the people who rely on biometrics. I am particularly glad that the doctor from the National Institute of Standards and Technology is here today to talk about the work they are doing in this space. NIST has been working in biometrics technology for over 60 years. They have had an incredible role to play in developing standards for biometrics, and I am hoping that, in the same way they helped the FBI establish standards for fingerprint technology in the 1960s, they will take a leadership role in establishing standards at the national and international level for biometrics today. The standards will be critical to enabling the exchange of biometric data between agencies and their systems, as well as providing guidance for how those biometric systems are tested, how their performance is measured, and how assurances are made that data is shared securely and privacy is protected. That's important because biometrics are no different than any other advanced technology: they have beneficial uses, but also misuse that can harm individuals and our society, in this case by compromising the privacy of individuals or the security of their information.
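To make the template-protection idea from the chair's opening statement concrete, here is a minimal sketch. It assumes a perfectly repeatable feature vector; real biometric readings are noisy, so deployed schemes pair a keyed transform with error-tolerant machinery such as fuzzy extractors, which this toy omits. The feature bytes and keys below are illustrative, not drawn from the hearing.

```python
import hashlib
import hmac

# Toy sketch of template protection: each system derives its stored
# template from the raw biometric feature using its own secret key, so
# templates held by two systems cannot be cross-linked. Real biometric
# features are noisy, so deployed schemes use fuzzy extractors or
# cancelable transforms rather than an exact keyed hash; the exact-match
# token here is a simplification for illustration only.

def protected_template(feature_bytes: bytes, system_key: bytes) -> bytes:
    """Derive a system-specific, non-invertible template."""
    return hmac.new(system_key, feature_bytes, hashlib.sha256).digest()

feature = b"stable-biometric-feature-vector"      # hypothetical extracted feature

clinic_key = b"key-held-only-by-the-clinic"       # illustrative secrets
workplace_key = b"key-held-only-by-the-employer"

clinic_template = protected_template(feature, clinic_key)
work_template = protected_template(feature, workplace_key)

# Same person, but the two stored templates do not match, so the clinic
# record cannot be joined to the workplace record without both keys.
assert clinic_template != work_template

# Within one system, re-deriving the template from a fresh (identical)
# reading still matches, so verification continues to work.
assert protected_template(feature, clinic_key) == clinic_template
```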
Rep. Obernolte: As policymakers, we need to be acutely aware not only of the benefits that biometrics bring to our society but also of the risks associated with the technology, especially, in my opinion, when it comes to covert collection and the issue of individual consent to have information stored and used. I think as policymakers we have to balance that awareness against the potential benefits that biometrics bring to society. You could easily imagine us taking a draconian approach to regulating biometrics that effectively prevents the development and use of biometrics, which would lose all the benefits we enjoy from biometrics. I'm not just talking about unlocking our phones or setting seats in our cars; biometrics have many helpful applications. In Ukraine, the Defense Ministry is using facial recognition technology to recognize Russian assailants. If we were to take an overly heavy-handed approach to regulating biometrics, we would lose out on lifesaving applications. Before serving in Congress, I was a member of the California State Legislature, and I served on the Committee on Privacy and Consumer Protection in the early days of facial recognition, before the risks and benefits of the technology were well understood. We saw a lot of misguided proposals that could have effectively banned the use of facial recognition technology altogether. It's a lot easier for us to push legislation to outlaw a technology entirely than it is to do the diligence of balancing the benefits and risks. A better understanding of the technology and carefully developed safeguards and standards will help us develop biometrics in a way that provides safety for people's privacy without stifling the innovation that is going to lead to future breakthroughs and benefits to society. I'm looking forward to learning about our witnesses' work today and to hearing from them. Thank you for convening the hearing. I look forward to the discussion, and I yield back.

Rep. Foster: I am very much envious of the car you must be driving with all of those features. I wager you aren't driving around in a 10-year-old Ford Focus. That technology is amazing. If there are other members who wish to submit additional opening statements, your statements will be added to the record at this point. I would now like to introduce our witnesses. Ms. Candice Wright is Director of Science, Technology Assessment, and Analytics at the U.S. Government Accountability Office. She oversees work on federally funded research, intellectual property protection and management, and federal efforts to help commercialize innovative technologies and advance economic competitiveness. She has led reviews on a wide variety of policy issues, including federal contracting, risks to the defense supplier base, military sales, and homeland security. After her is Dr. Charles H. Romine, Director of the Information Technology Laboratory at the National Institute of Standards and Technology. ITL is one of six research laboratories within NIST. He oversees a research program that cultivates trust in information technology by developing and disseminating standards, measurements, and testing for the interoperability, security, usability, and reliability of information systems. Our final witness is Dr. Arun Ross, professor in the Department of Computer Science and Engineering at Michigan State University and site director of the NSF Center for Identification Technology Research. His expertise is in biometrics, computer vision, and machine learning.
He has advocated for the responsible use of biometrics in multiple forums, including the NATO Advanced Research Workshop on Identity and Security.

Rep. Foster: Each witness will have five minutes for your spoken testimony. Your written testimony will be included in the record of the hearing. When you have completed your spoken testimony, we will begin with questions. Each member will have five minutes to question the panel, and we may have two rounds of questions. We will start with Ms. Wright.

Ms. Wright: Thank you. Thank you for the opportunity to discuss GAO's work on federal agency use of biometric technology, particularly facial recognition. The technology is used to compare facial images from a photo or video for identification and verification. As it has continued to advance, its use has expanded in the commercial and government sectors. Today I will share highlights from our work on how agencies are using facial recognition and mitigating privacy risks. Last year, we reported on agencies' use of the technology. Eighteen agencies reported using it, and the most common use was unlocking agency-issued smartphones. Other uses included law enforcement generating leads for criminal investigations, as well as monitoring or controlling access to a building or facility by identifying someone on a watchlist attempting to gain access; this can greatly reduce the burden on security personnel of having to memorize faces in order to recognize them. Multiple agencies reported accessing systems owned by commercial vendors, such as Clearview AI, to identify victims and perpetrators in child abuse cases. Agencies are also investing in research and development to understand applications of the technology. Examples include DHS-sponsored challenges to develop systems, and NSF grants to research methods to prevent identifying an individual from facial images used in research. There are concerns about the accuracy of the technology, data security risks, transparency in usage, and the protection of privacy and civil liberties. Some agencies did not have complete information on what non-federal systems were being used by their employees; multiple agencies that polled their employees found that they were using non-government systems. Using facial recognition systems without assessing the privacy implications can put agencies at risk of running afoul of privacy-related guidance. They are also at risk that data sets could be compromised in a data breach. Unlike a password, which can be changed, a breach involving data derived from a face may have more serious consequences. We recommended that agencies assess the risks of such systems, and agencies are in various stages of implementing our recommendations. We found that TSA had incorporated privacy protections in its pilot program to test the use of the technology for traveler identity verification at airport security checkpoints. CBP's privacy notices to inform the public of its use were not always correct or complete, and CBP did not conduct audits of its airline and airport partners to ensure compliance. Fully implementing our recommendations will be an important step toward protecting travelers' information. Facial recognition technology is not going away, and demand for it will likely continue to grow. As agencies continue to find utility in the technology, balancing its use, mission requirements, and privacy protection will be of continuing importance. This concludes my remarks. I will be happy to answer any questions you may have.

Rep. Foster: Next is Dr. Romine.
Dr. Romine: I am Charles H. Romine, director of the Information Technology Laboratory at the National Institute of Standards and Technology. Thank you for the opportunity to testify today on behalf of NIST and our efforts to evaluate the privacy implications of biometrics technology. NIST is home to five Nobel Prize winners. The mission of NIST is to promote innovation and competitiveness by advancing science, standards, and technology in ways that enhance security and improve our quality of life. In the Information Technology Laboratory, NIST conducts fundamental and applied research, advances standards to understand and measure technology, and develops tools to evaluate such measurements. Technology standards, and the foundational research that enables their development and use, are critical to advancing trust in, and promoting interoperability between, digital products and services. Critically, developed through robust collaboration with stakeholders across government, industry, international bodies, and academia, they can provide increased assurance, enabling more secure, private, and rights-preserving technologies. Since its establishment nearly a decade ago, the mission of NIST's privacy engineering program has been to support the development of trustworthy information systems by applying measurement science and systems engineering principles to the creation of frameworks, risk models, guidance, tools, and standards that protect privacy and civil liberties. The ability to conduct thorough privacy risk assessments is essential for organizations selecting effective mitigation measures, including appropriate privacy-enhancing technologies. Modeled after the highly successful Cybersecurity Framework, the Privacy Framework is another voluntary tool developed in collaboration with stakeholders through a public and transparent process. It is intended to support organizations' decision-making in product and service design and deployment while minimizing adverse consequences for individuals' privacy and for society as a whole. Since the 1980s, NIST has coordinated the use of a standard data format for the interchange of fingerprint, facial, and other biometric information in law enforcement applications. The standard is used globally by law enforcement, homeland security, defense, intelligence agencies, and other identity management systems to ensure biometric information interchanges are interoperable and maintain system integrity. Since 2002, NIST has supported the development of international standards for biometrics in civil applications, including ID cards and passports. Biometrics used as authenticators to protect sensitive data present different degrees of privacy risk, and organizations need the means to distinguish between those degrees of risk and to implement appropriate mitigation measures. The Privacy Framework provides the structure for organizations to consider which privacy-protective outcomes are suitable to their use cases. The research on privacy-enhancing technologies that NIST conducts, and the guidelines and standards that it publishes, help organizations in this effort. Privacy plays a critical role in safeguarding fundamental values such as autonomy and dignity, as well as people's rights and liberties. NIST has prioritized research and the creation of frameworks, guidance, tools, and standards that protect privacy. In addition to maintaining the Privacy Framework, NIST provides cybersecurity guidelines as well as the Risk Management Framework. I look forward to your questions.

Rep. Foster: Now, Dr. Ross.

Dr. Ross: I am grateful for the invitation to testify today.
I consider it a great privilege and honor to engage with those who serve our nation. Biometrics is a valuable technology, but it is necessary to ensure that the privacy of individuals is not unduly compromised when their data are used in certain applications. The purpose of my testimony is to communicate some of the ways in which the privacy of individuals' biometric data can be enhanced, thereby facilitating the responsible use of this powerful technology. Firstly, the benefits of biometrics. The need to reliably determine the identity of a person is critical in a vast number of applications, ranging from personal smartphones to border security, from self-driving vehicles to tracking child vaccinations and preventing human trafficking, and biometrics is increasingly being used in many such applications. Many smartphones employ automated face or fingerprint recognition for unlocking and for payment authentication purposes. Use of this technology is being driven by significant improvements in the recognition accuracy of these systems over the past decade; the phenomenal rise of the paradigm of deep learning based on neural networks has fundamentally changed the landscape of facial recognition and biometrics. This brings me to my second point, the privacy concerns associated with the technology. Images of an individual can be linked across different applications using biometric technology, thereby creating a comprehensive profile of the individual and, in some cases, unintentionally divulging the person's identity where privacy was expected. Rapid advances in machine learning and AI have led to the development of attribute classifiers that can automatically extract information pertaining to demographics from images; this can potentially breach the privacy of individuals. A number of data sets have been curated for research purposes by scraping publicly available face images from the web, and legitimate concerns have been expressed about using these images without consent. An anonymous face image can be linked to one or more face images in a curated data set, thereby potentially revealing the identity of the anonymous face. How can biometric technology be responsibly developed and deployed while keeping privacy in mind? Firstly, by utilizing encryption to ensure that the original data is never revealed and that all computations take place in the encrypted domain. Secondly, by employing a paradigm where the biometric data of an individual is intentionally distorted using a mathematical function; the distorted data can still be successfully used for biometric recognition purposes, and this preempts the possibility of linking an individual's data across applications. Thirdly, by perturbing face images in such a way that sensitive attributes cannot be reliably extracted from them. Fourthly, by making it more difficult for images to be scraped from public websites and social media profiles. Fifthly, by deploying privacy-preserving cameras where the acquired images are not interpretable by a human; such cameras have been used in public spaces to ensure the acquired images are not usable for previously unspecified purposes. I must note that researchers in biometrics are becoming increasingly aware of the privacy and ethical implications of the technology they are developing. Recognition accuracy is no longer the only metric used to evaluate the overall performance of a biometric system; metrics related to security and privacy are also increasingly being considered.
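The second technique Dr. Ross lists, intentional distortion of biometric data by a revocable mathematical function, is the cancelable-biometrics idea he elaborates later in the hearing. Below is a minimal sketch under simplifying assumptions: features are fixed-length vectors, the "distortion" is a secret random projection (one classic choice, since it approximately preserves distances), and the dimensions and threshold are arbitrary illustrative values, not anything from the testimony.

```python
import numpy as np

# Sketch of a cancelable biometric template: a secret, revocable transform
# is applied to the feature vector, matching happens in the transformed
# domain, and a compromised template is "canceled" by choosing a new
# transform. This is an illustration, not a production scheme.

DIM, PROJ_DIM = 128, 64

def make_transform(seed: int) -> np.ndarray:
    """Generate a secret random projection matrix from a revocable seed."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((PROJ_DIM, DIM)) / np.sqrt(PROJ_DIM)

def enroll(feature: np.ndarray, transform: np.ndarray) -> np.ndarray:
    """Store only the distorted template, never the raw feature."""
    return transform @ feature

def matches(probe, template, transform, tol=1.0):
    return np.linalg.norm(enroll(probe, transform) - template) < tol

rng = np.random.default_rng(0)
finger = rng.standard_normal(DIM)                 # hypothetical raw feature
noisy_probe = finger + 0.01 * rng.standard_normal(DIM)

t_old = make_transform(seed=42)
stored = enroll(finger, t_old)
assert matches(noisy_probe, stored, t_old)        # genuine user still matches

# Template leaked? Revoke it: re-enroll under a new seed. The old and new
# templates are unlinkable, and the leaked one no longer verifies.
t_new = make_transform(seed=43)
stored_new = enroll(finger, t_new)
assert not matches(noisy_probe, stored, t_new)
```

Using a different seed per application also gives the unlinkability across applications that Dr. Ross describes: two databases hold differently transformed templates of the same person that cannot be joined.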
Dr. Ross: This shift in the culture is remarkable and bodes well for the future of the technology. Thank you, and I welcome any questions.

Rep. Foster: We will begin our first round of questions. First, on the prospects for secure and privacy-preserving digital ID. While we are all aware of concerning aspects of biometric technologies, it's important to recognize that there are valuable uses for these technologies that can improve our lives and security. Privacy protections must evolve along with capabilities so we can reap the benefits safely. With our Improving Digital Identity Act, a bipartisan group of colleagues and I have called upon federal agencies to modernize and harmonize our nation's digital identity infrastructure, in large part by leveraging the biometric databases that individual states already have in place as part of their programs to support REAL ID, and additionally by using standards to make sure these tools are interoperable and can be used to present that identity both online and offline in a privacy-preserving way. How could biometric technologies increase our privacy by making our identities more secure against theft and fraud?

Dr. Romine: I appreciate the concern that you and the Ranking Member have on this issue. Regarding the guidelines we have put in place for privacy-enhancing technologies broadly speaking: we have investments in our privacy engineering program related to understanding how we can develop new technology that enhances privacy protections across many different aspects of technology. Coupled with the guidance that we are updating today on identity management and appropriate protections for identity management technologies, I think there are certainly going to be opportunities to improve the protection of biometric information across the board through some of these updated guidelines. I look forward to discussing that with you and your staff.

Rep. Foster: Any broad implementation of identification techniques would require broad implementation of privacy-protective measures. How could these methods be strengthened, and are they ready for prime time, things like homomorphic encryption? I am told there is still a privacy budget that you have to enforce; you can't interrogate a homomorphically encrypted database repeatedly without at some point revealing the underlying data. There must be limits to these. Have we understood and hit those limits, or is there a lot of work yet to be done to understand how effective it can be to exchange information between trusted entities without revealing everything?

Dr. Romine: A very short time ago, homomorphic encryption was a theoretical idea whose performance was so unbelievably slow that it was not practical. Since then, enormous strides have been made in improving the performance. I will say that these privacy-enhancing technologies, particularly those using cryptography as a protection mechanism, have enormous potential, but there is still a lot more work to be done to make them significantly practical. As you point out, there are situations in which, even with a database obscured through encryption, if you provide enough queries and have a machine learning back end to take a look at the responses, you can begin to infer some information. We are still in the process of understanding the specific capabilities that encryption technologies such as homomorphic encryption can provide in support of that.

Rep. Foster: Dr. Ross, do you have any comments, particularly on the idea that you can cancel your fingerprints, in some sense? Does that really work yet?
Dr. Ross: Thank you for your question. Cancelable biometrics has been proposed as one way to preserve the security and privacy of biometric data while retaining the ability to cancel one's biometric template. The way it works is as follows. Let's say you have a fingerprint image. You subject it to some distortion using a mathematical function, and the distorted image is used for matching purposes. If that particular image is compromised, you simply change the mathematical function: you cancel the original template and generate a new fingerprint template based on the revised mathematical function. In principle, this allows us to store not the person's original fingerprint but only the distorted, or transformed, version of the fingerprint. That is the cancelable property, which is really imparted by the transformation function.

Rep. Foster: I don't want to exceed my time, which has expired, but we should be able to get to a second round. I now recognize our Ranking Member for five minutes.

Rep. Obernolte: I have been reflecting on the fact that when we talk about privacy, it is a non-binary ethical problem. You can't say that data is completely private or not private. We are dealing with a strange continuum where we have to weigh the amount of privacy we are willing to give up against the potential benefit we expect from giving up that privacy. It's a complicated thing, and I would like to organize my questions around it, because I think solving that problem is going to be key to establishing a regulatory framework for what will be expected when we ask companies to protect privacy. I'm happy to see GAO participating in this hearing; it sends a powerful statement to those we intend to regulate that we start with ourselves in government, because we interact with a lot of data from different users, and we should be experimenting on ourselves in solving this problem before we expect others to solve it. I found your testimony very compelling. I was very alarmed when I read that 13 out of 14 agencies you surveyed did not have complete information about their own use of facial recognition technology; then I realized most of those cases were people using facial recognition to unlock their own smartphones. It made me think about the fact that maybe there is a difference between privacy when it comes to our own data, when I'm using my face to unlock my phone, and privacy when we are using other people's data, especially when we have a large amount of data. When we do these surveys in the future, do you think we need to make a distinction between those different kinds of uses?

Ms. Wright: I think that's important, but in the cases where we found agencies didn't know what their employees were using, it was the use of non-federal systems to conduct facial image searches, such as for law enforcement purposes. They didn't have a good sense of what was happening in their regional and local offices. That's why we think it's important for agencies to have a good understanding of what systems are being used and for what purposes, and to make sure that, by accounting for that, they have the necessary tools to ensure they are balancing the potential privacy risks associated with using the systems.

Rep. Obernolte: All of these things, when you are using a commercial source for this kind of technology, have to go through procurement, right? Would procurement be a fruitful avenue to look at in terms of informing this flow of information?
Ms. Wright: There are a couple of different scenarios: one in which agencies might have been accessing state and local systems or commercial systems through a test or trial, and others where they have an acquisition or procurement in place. We have ongoing work right now looking at law enforcement use and the kinds of mechanisms agencies are using in acquiring systems from commercial vendors. I think that information will be telling for us in determining what privacy requirements are being put in place when agencies acquire services from these commercial vendors.

Rep. Obernolte: Dr. Romine, I found it interesting in your written testimony when you were talking about the Privacy Framework that it's not a static thing. Could you talk a little bit about how you evaluate the fact that this has to be dynamic? Part of it has to be based on use: if you're using facial recognition for verification, that is different than identification, and users' expectations of privacy are going to be different. How do you approach that ethical conundrum?

Dr. Romine: That's exactly right. The context of use is critical to understanding the level of risk associated with privacy considerations. One of the things the Privacy Framework is intended to do is give organizations the ability to establish privacy risk principles as part of their overall risk management for the enterprise. You talk about reputational risk, financial risk, and human capital risk; privacy has often not been included in that. We are giving organizations the tools to understand that data gathered for one purpose, when it is translated to a different purpose, in the case of biometrics, can have a completely different risk profile associated with it. The risk isn't inherent in the data; it is in the context in which those data are being used. Our tools allow for a deeper understanding, on the part of the organization, of that context issue.

Rep. Obernolte: If we get another round, I'm going to ask you about that, because it goes right into what we were talking about with the framework.

Rep. Foster: We will now recognize Rep. Bice for five minutes.

Rep. Bice: I have a couple of questions I want to touch on. This topic of conversation has come up in Oklahoma a couple of times on the state side. Ms. Wright, you testified that most agencies accessing non-federal facial recognition technology don't track its use or the privacy risks related to that access. Is there any federal law that requires agencies to track this information?

Ms. Wright: There is a broad privacy framework. You have the Privacy Act, which calls for agencies to limit their collection as well as the disclosure and use of personal information in government systems; a photo would be considered an example of personal information. You also have the E-Government Act, which includes provisions for agencies to conduct privacy impact assessments when they are using such systems, and to use those assessments to analyze how the information is collected, stored, shared, and managed in federal systems. It should be noted that these privacy requirements apply to any systems being operated by contractors on behalf of federal agencies.

Rep. Bice: We haven't even talked about the contractor piece. I want to circle back around to your comment about these assessments. Do you think agencies are doing the assessments, and if so, are the outcomes published so that other agencies can understand the risks or the breadth of what they are utilizing within their agencies?

Ms. Wright: We have seen a mix in how agencies are approaching the privacy impact assessments.
As I mentioned, when you have agencies using systems, and employees using systems the agencies aren't even aware of, there is the possibility that the risks have not been assessed, and that's an important thing for agencies to keep in mind as they continue to use facial recognition systems.

Rep. Bice: Would it be helpful for Congress to look at requiring these assessments to be done on a periodic basis for agencies that are utilizing these types of biometrics?

Ms. Wright: The E-Government Act calls for agencies to do that, but the extent to which they are doing it varies. That is work we can talk about if there are oversight opportunities, looking at the extent to which agencies are using privacy impact assessments, especially in the realm of biometrics.

Rep. Bice: What do you think some of the potential adverse consequences might be of agencies failing to track this information, either themselves or through third-party systems?

Ms. Wright: A couple of things come to mind. Are they using systems that have reliable data, with quality images that will affect the matching results that come back and the extent to which those results can be trusted? You can see the potential for a high mismatch error rate, which in a law enforcement example might mean tracking down a lead that is not fruitful, or missing an opportunity. That is one piece of it. The other piece, thinking about this from a privacy perspective: how are the images being collected, how are they being used, and does the individual have any say? Did they provide consent to their data being captured? There are a number of different risks associated. Then there is the issue of data security: are these systems secure? We have had cybersecurity on the high-risk list within the federal government for many years, and you can imagine this opens the door to potentially greater security breaches.

Rep. Bice: Sitting on the cyber subcommittee, I think you are exactly right. We talk about this from a data privacy perspective, but we also need to recognize there is a huge potential for cybersecurity challenges when you're collecting these types of biometrics and storing them through a third party, which in some cases can be more of an issue, but certainly also when agencies are storing that information themselves. My time is almost expired. I yield back.

Rep. Foster: I believe we will have time for a second round of questions. I now recognize myself for five minutes. It is abundantly clear that the U.S. taxpayer has suffered greatly from identity fraud: IRS refund fraud, unemployment benefit fraud during COVID, you name it. Has anyone, to your knowledge, inside GAO or elsewhere, netted out the total loss to the federal government from identity fraud that might be prevented by using state-of-the-art identity-proofing mechanisms?

Ms. Wright: That is not something that came up in the course of the recent work we have done. I am not aware of such an estimate, but I am happy to take that back and follow up with you.

Rep. Foster: I think we will be asking you what the scope of such a survey would be. There appear to be little bits and pieces of documentation of the enormous losses the taxpayer suffers from this, and trying to get that balance right could be an important outcome.

Ms. Wright: I am happy to do that.

Rep. Foster: One of the tough things we will face as a government is sharing data with other governments, whether biometric databases or regulating crypto, where you will need to have a uniquely identified crypto driver's license, if you will, very much like setting up a passport system.
It is something where you have to identify that someone is operating multiple identities in multiple jurisdictions. Dr. Ross, are you familiar with the state of the art and what might be useful there? Are there investments we can make toward more research that would allow you to ask very sensitive questions of big databases owned by other states or governments?

Dr. Ross: Certainly, and I think one concept that can be harnessed, but has to be further researched, is the notion of differential privacy. It would mean that within a certain jurisdiction you are able to do certain identity assessments using biometrics, with specific use cases, specific purposes, in which identities can be matched, while in other cases the identities cannot be matched. By defining the policies, one could then use the principles we alluded to earlier, including homomorphic encryption and differential privacy, to ensure that that kind of functionality can be performed. However, I must note that this research is still in its infancy in the context of biometrics, and more collaboration and investment are certainly needed to assess the suitability of these techniques in operational environments.

Rep. Foster: Dr. Romine, when you are involved in international standards setting, which is part of NIST's mission, do you get the feeling that the United States is leading the way, or are there peers around the world that are as sophisticated technologically in biometrics and in privacy-preserving methods?

Dr. Romine: In the work we are doing in the international standards arena surrounding identity management, we certainly believe we are leading in that space. There are other like-minded countries, partners with us that value democratic ideals, and we strive to work closely with them; they have very strong technical capabilities in these areas as well.

Rep. Foster: I have been struck that in some European nations you have a right to know when any government accesses your data, at least outside of a criminal investigation. Are these things that can be guaranteed, or is that an unsolvable problem, if you understand my question? I dream of some technology that would allow you, with cryptographic certainty, to know that someone has touched your data.

Dr. Romine: It is certainly theoretically possible to use cryptography to address that concern. I wouldn't call it foolproof, necessarily. The history of advancing technologies is colored with many different advances and risks; the risks are addressed by new technologies, which in turn create additional risks. The goal for us is to ensure the trustworthiness of the underlying systems, and cryptography can be important there.

Rep. Foster: Dr. Ross, did you have any thoughts on the feasibility of that as a long-term goal?

Dr. Ross: Yes, and it's an excellent question. One thing it entails is keeping a ledger of interactions between humans and the data being stored. For example, blockchain principles have been used to keep track of certain transactions that have occurred, and those ledgers are immutable. I believe some of these principles can be leveraged in the field of biometrics, but I must maintain that more research is needed, and more investment is needed. The technology is available; it then has to be incorporated into the context of biometrics.

Rep. Foster: My time has expired.

Rep. Obernolte: Dr. Romine, we were having a discussion about the continuum of privacy and how that works ethically with our efforts to regulate it.
In your written testimony, you talked about the idea that privacy can be violated when the scope of how biometric data is used differs from the expectations of those who provided it. That is ethically complex, too, because sometimes there are societally beneficial uses. One we have been talking about is using Clearview AI to halt sex trafficking. The people who are kept safe from sex trafficking didn't give permission for the use of their data in that context, but if you ask them whether it's okay, they say yes. How do you navigate that minefield?

Dr. Romine: That's a tricky question. When an organization has acquired biometric data for whatever purpose, those data are now assets in its control, and sometimes the pressure to use those assets in ways that were not originally intended is enormous: the sense that we could do this, instead of asking whether we should do this with those data. That is one of the reasons we always have to stress the importance of context of use. In some cases, a new context of use may be enormously beneficial and perhaps not even controversial; in other cases, it could be extremely damaging. This is the difference between cybersecurity and privacy: a security breach does not have to take place for privacy harms to occur. Using biometric data in ways that were not intended, and that perhaps violate the expectations of those who provided the data, can create those privacy events.

Rep. Obernolte: I completely agree. I want to ask a question about that of Dr. Ross. You were talking about privacy violations that can occur when facial recognition is used to infer racial, sexual, or health characteristics that were not provided by the person. How do you navigate that in an ethical sense? When I post a picture of myself on Facebook and one of my friends looks at it and says, he looks unwell, I can't then point my finger and say, that's a privacy violation, I didn't intend for you to infer anything about my health; they would just roll their eyes. It's understood my picture is out there and those inferences can be made by anybody who sees it. Why do we make a distinction between that when a human does it and when a machine does it?

Dr. Ross: We are really distinguishing between human analytics and machine-based analytics. A machine could have billions of images; you can run software over those billions of images and then make assessments in the aggregate, without user consent. It is the ability to do this repeatedly, over massive amounts of data, and then use the aggregate to perform additional activities that were never indicated to the user; that is where the problem lies. If the user were to give consent, saying these images can be used for further analytics, I believe using the machine would be productive in some cases, but in other cases, as you point out, there might be a violation of privacy. I think it comes down to user consent, and to the fact that you can do this en masse: how do we do it in a manner where the person is aware of how their data is being used, and that does not unwittingly glean additional pieces of information that might violate their privacy?

Rep. Obernolte: I somewhat agree, although the distinction is not the amount of data that is processed.
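As an aside on the machine-scale, aggregate analytics being discussed: Dr. Ross earlier pointed to differential privacy as one principle for allowing aggregate use of data while limiting inference about any single person, and Chairman Foster mentioned the associated "privacy budget." Here is a minimal sketch of the core mechanism, a noisy count; the gallery data and the epsilon value are illustrative assumptions, not anything prescribed by the witnesses.

```python
import math
import random

# Minimal differential-privacy sketch: release an aggregate count with
# calibrated Laplace noise. One person's presence changes a true count by
# at most 1 (the "sensitivity"), so noise with scale sensitivity/epsilon
# makes the published number nearly as likely with or without any single
# individual, limiting what mass analytics can infer about one person.

def laplace(scale: float) -> float:
    """Sample Laplace(0, scale) via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace(1.0 / epsilon)   # sensitivity of a count is 1

# Illustrative use: estimate how many faces in a hypothetical gallery are
# judged over 40, without the release pinning down any single enrollee.
# Epsilon here is arbitrary; each query spends privacy budget, which is
# why repeated queries eventually erode the protection, as noted earlier.
gallery = [{"id": i, "age_estimate": random.randint(18, 80)} for i in range(10_000)]
print(dp_count(gallery, lambda r: r["age_estimate"] > 40, epsilon=0.5))
```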
Rep. Obernolte: Another question before I run out of time. You talk in your testimony about privacy by design, which I think is an elegant concept, but consider me a skeptic: if you are using an algorithm that distorts images in a way that sex or ethnicity cannot be read, won't we run into the same problem we did with cryptography, where cryptographic algorithms developed 10 years ago don't work anymore because the computers are more powerful? As this technology gets better, aren't those algorithms eventually not going to work anymore either?

Dr. Ross: An excellent point, very insightful. This is where more mathematics is needed as we develop and apply biometric technology: understanding what the privacy leakage, the information leakage, is. What is lacking is privacy metrics, and privacy is a moving target. If technology cannot deduce some attribute from a face image today, it might be able to do so tomorrow; what is deemed private today may no longer be private tomorrow. That is where the concern lies. As the technology evolves, these techniques must be revisited; they are not static in time, they are dynamic in time. As technology advances, the policies must evolve, and the metrics used for evaluation must evolve as well. In short, I completely agree with your statement. Some of the problems in cryptography can manifest themselves in these other techniques, but it is not unsolvable. With adequate technology development, especially employing mathematical transformations, I believe solutions can be found.

Rep. Obernolte: Thank you for that. I yield back.

Rep. Foster: I think there may be time for an additional round.

Rep. Perlmutter: I am just catching up to all of you, and I will never be able to catch up to Jay or Bill on this subject, but Stephanie I can at least talk about. I want to thank the panel. There was a word that you used, Dr. Ross: immutable. Then you got into the conversation with Mr. Obernolte about the fact that technology may make some of what we are trying to do today, in terms of privacy and cybersecurity, outdated tomorrow. It reminded me of a great Oklahoman, Will Rogers: the only things that are immutable are death and taxes. I guess my question is, and I'm really just a science fiction person when it comes to this, thinking of Minority Report with Tom Cruise, and you may have all, in a way, addressed this: every place he goes they know him already, and eventually he has to have his eye taken out because of it. I bought an iPad holder from a company called WeatherTech the other day. We were in the store for something else, and I thought it looked good and bought the thing. All of a sudden I am getting iPad holder ads like crazy, and I didn't even look online; I just bought the darn thing. I feel like I've got either big business looking over my shoulder or big government looking over my shoulder. I am making more of a statement than asking a question, but I will start with you, Ms. Wright. Dr. Romine was talking about privacy versus cybersecurity. What can we do in Congress to ensure ourselves a little more privacy?

Ms. Wright: I think a key factor is how we hold agencies accountable for the information they are collecting. The purpose for which the information is being used, and how it is being stored, shared, and destroyed, are fundamental things to start with when we think about privacy.
We also need to really think about what applications or use cases should be permitted or restricted, because then you start to get a handle on where the concerns are with respect to privacy. At the end of the day it is about trade-offs. While there might be convenience factors and some security benefits as well, there is also the issue of privacy and being able to protect your personal information, and that is where the tension lies.

Rep. Perlmutter: There is also tension between the kind of privacy we may want from state or local governments and the kind of privacy we may want from private enterprise. The thing I ran into, it was a spontaneous purchase of this iPad holder, and all of a sudden I am getting ads about it. So you have two really sizable entities out there looking over your shoulder, and I think we in Congress need to think about both of those when we are thinking about privacy and cybersecurity. Gentlemen, does anybody have a comment on my general proposition here? It is not science-based, but it is personal-based.

Dr. Ross: I would be happy to share some comments. The issue you are describing is actually very important: namely, the exchange of biometric data. Biometric data collected for one purpose can be transmitted to another agency, to another entity, which might use it for a different purpose. One way to prevent this, even before it happens, is to ensure that when we store the biometric data in one entity, it is suitably encrypted, and when it is used in a different entity, it is encrypted or transformed differently. Then the two sets of data cannot be linked, because they have been transformed differently. I think that becomes important. On the flip side, it might prevent one agency from legitimately communicating with another agency, because the biometric data cannot be linked. This is where use-case-specific policies must be instituted: there are certain situations where linking is acceptable and other situations, like the one you described, where it is not. This is where technology development must be augmented with legislative instruments so that the data is used in a manner appropriate to different use cases.

Rep. Perlmutter: Thank you, my time has expired.

Rep. Foster: We will recognize Representative Bice for five minutes.

Rep. Bice: On part of that, I recognize the connections; it is remarketing. Your email is likely tied to your credit card in some way, or you may have entered your email address when you checked out, and your email is tied to social media. When they realized you purchased that, they started marketing all kinds of things to you. That has been going on for some time, but for a lot of folks it is concerning; they wonder, how did they know, how did they get this information? That is big data at its finest. In Oklahoma this last session we passed the Computer Data Privacy Act. The bill allows for the option for personal rights to be returned to the individual, along with the option for cancellation of the information in a private company's database. To me this seems like it could be a solution for privately collected biometric data. But, to any of the witnesses here: what do you think are the most concerning aspects of developing biometric technology?

Dr. Ross: I would be happy to offer some comments, if you don't mind, on both parts of your excellent question. One of the most obvious concerns about biometrics is the ability to link different data sets.
That clearly constitutes a problem in some cases; in other cases it is an advantage. Once again, as the technology improves, as the recognition accuracy numbers improve, this kind of linking can be done with more certainty, because the errors are decreasing. This is where policy for regulating the use of the technology becomes important: in some use cases it is essential to have that functionality, in others it may not be required. Secondly, in response to your first comment, again an excellent one: when a user in a private enterprise offers their image, it would be nice if they could say for what purposes it can be used. If it is a face image, they could say, you can use this for biometric recognition, but it should not be used for assessing age or health cues. The moment they specify that, the data should be transformed in a manner that facilitates only that functionality, prior to storing it in a database. This gives some degree of control to the user, because the user is now able to specify what kind of information can be gleaned and what kind of information should not be gleaned. I think that is one important area in which more investment is needed; many techniques have been proposed, but they have not been fully evaluated. There is tremendous opportunity if we were to invest on this front. Excellent questions, and thank you for hearing me.

Rep. Bice: Does anyone else care to comment on that particular aspect?

Dr. Romine: I would be happy to weigh in. Some challenges involve the ability, as Dr. Ross said, to glean certain types of information, and some of the potential societal harms or inequities that may result. Back to the Ranking Member's question about his Facebook page and his friend seeing his image and saying, you don't look very good: imagine if it were an insurance company saying, you don't look very good, and taking steps as a result of that assessment. Those are the societal harms we need to be wary of.

Rep. Bice: I think this is a really great point. The use of biometrics is incredibly important, and we need to develop systems and controls that allow individuals to have some sort of say in how their information is utilized. Thank you for your time today, witnesses, and Chairman, I yield back.

Rep. Foster: We will embark on a final round of questions. I will recognize myself for five minutes. Dr. Ross, you seem to be coming close to describing something that resembles a licensing regime for collecting biometric data. Say someone wanted to put a facial recognition camera in front of their nightclub to find people who have repeatedly shown up at the nightclub and caused violence. That sounds like a legitimate thing, but if they start transferring that information around, a bunch of issues come up. Are there standards, and this might also be a question for Dr. Romine, for how you would license the collecting of the data and the transferring of the data, so that ultimately, if you are holding biometric data on someone, you would have to be able to demonstrate a chain of custody showing that you obtained it only through a set of licensed distributors of data, with customer consent at each point? Have people gone that far? Has any country gone in that direction?

Dr. Ross: Thank you for your question, Chairman Foster. I will address the first question, and perhaps my distinguished colleague will answer the second part.
On the first part, there is research being conducted in which privacy is moved closer to the sensor than to the data. Once the data is acquired, it is available; you can encrypt it, you can transform it, but someone has access to the data. What if we move the privacy aspect to the camera itself, in such a way that the camera is designed to extract or acquire only specific aspects of the scene? That becomes very important, because nowhere would a digital version of the full scene be available. Transforming images even prior to storing them, at the sensor level, might be one way in which the scenario you described can be handled, because the data will no longer lend itself to being processed by a different organization or entity; the data was already processed at the time it was acquired by the camera. That could be one technological solution, but as I mentioned earlier, these things have to be evaluated. Much more research, investment, and evaluation are needed to substantiate these principles.

Rep. Foster: Will this ultimately require, for some purposes, basically a government back door? For example, you might have cameras looking at elevators to make sure you are opening and closing the elevators as fast as possible, where you only really have to detect the presence of a human; then all of a sudden you find a massive crime has been committed, and the government might want to go through a trusted court system and say, bypass the obfuscation, I want to see the person's face.

Dr. Ross: The same data can be stored in different formats, under different transformations, so that it can be used for some purposes and not for others. I think technology can be applied to transform the data into different formats, but the individual formats should be guided by policy as to who should and who should not access them.

Rep. Foster: Dr. Romine, do you have any comments about when you engage with some of your foreign colleagues on this? Do they face a very different set of attitudes than in the United States?

Dr. Romine: Certainly that is true. For example, as you know, the GDPR in Europe envisions a different way of approaching protections for privacy than we currently have in the United States. That said, one of the reasons the Privacy Framework we have developed is regulation-agnostic, and even technology-agnostic, is that we want it to be adaptable and usable around the globe, and to provide assurance that if you follow these guidelines, you have evidence to support that you are complying with whatever regulatory regime you happen to be under at any given time.

Rep. Foster: Thank you. I will recognize Representative Obernolte for five minutes.

Rep. Obernolte: A couple of interesting things have come up, like how we safeguard privacy from a 30,000-foot view. Some things could work, and some things probably won't. Dr. Ross, you were mentioning disclosure. I used to think that was a great idea, and then I started looking at end-user license agreements for software. There are pages and pages that people scroll through before clicking agree at the end. What good would it do to add another paragraph that says, here's how we will use the facial data that you give us? There was an episode of South Park a few years ago, a parody in which one of the characters had inadvertently given Apple the right to do experimentation on him. His friends were like, you clicked on that and signed it without reading it?
Who does that? The answer is: everyone does that. So I think the answer is maybe control over who has access to the data. If I give my data to Apple for use for a certain purpose, the idea that Apple should not give that data to someone else to use for a very different purpose is, I think, closer to the mark. But we won't find a real regulatory solution to the problem without looking at the things we are trying to prevent, what attorneys call the parade of horribles. Someone asked about that earlier. We are entering an era in which anonymity is a lot less than it used to be, and that will be true regardless of what approach we as a government take to privacy. Can you walk us through the worst things that can happen if we fail at this? Those are the ones we have to try to prevent.

Dr. Romine: Fair enough. I will say that figuring out what the worst things are might take some time, but some of the things I have alluded to include organizations making decisions based on inferences from biometric data that disadvantage certain groups over others.

Rep. Obernolte: Let me stop you there. We have had that problem; there are ethics issues around AI algorithms that we are dealing with. I think the solution is to focus on the fact that that behavior is already illegal. If I'm going to kill someone, it is equally illegal for me to kill them with a knife or a gun; the tool doesn't matter, the act matters. Why is that different in the case of privacy?

Dr. Romine: I don't think it is so much different as it is a consequence of the lack of privacy, or a privacy compromise: the compromise of privacy, a privacy event, would lead to that activity. There are other things I can imagine. There are aggregate, societal decisions that might be predicated on aggregate data that violate privacy considerations; policies may be instituted against certain populations as a result of certain issues related to privacy or biometrics. In all of these cases, what we have discerned is that there is no purely technological solution to the privacy problem, and no purely policy-based solution to the privacy problem. It is providing and improving privacy protections, and matching those with appropriate policy, that can prevent some of these tragedies.

Rep. Obernolte: I agree with you, but I definitely think that in crafting policies we need to ask ourselves: what problem are we trying to solve, what are we trying to avoid? Merely focusing on anonymity, I think, is a fool's errand; we have less anonymity, and there's not much we can do about that. As for the parade of horribles: the government has powers that other entities don't. If you want the parade of horribles, look at what China does with the personal data it holds on people. That is the top of my list, but I don't think we will get there from a policy framework standpoint without thinking about the problem we are trying to solve. It is a discussion I'm sure we will continue to have over the next few years. I yield back.

Rep. Foster: We will now recognize our lawyer in residence for five minutes.

Rep. Perlmutter: I think Mr. Obernolte is focusing on the question of the day. I remember serving in the state senate twenty-plus years ago.
We were just trying to have an internet within the Colorado legislature, and something came up about Social Security numbers, whether we should release them, all that stuff, for privacy purposes. I was being cavalier, and I said, there is no such thing as privacy. To your point, there is no such thing as anonymity, and that has only grown in the last 30 years. The question, from a policy perspective, is this: technologically, we can address things. As Ms. Wright says, you give up some things to get some things. You can make it tougher for a cybercriminal or someone else to use your data, but you give up some efficiency or ease of use in the process. The Supreme Court has made several decisions, none of which I like, and the one I like the least is the reversal of Roe v. Wade, but they basically say that under the United States Constitution there is no such thing as a right to privacy. And I don't know. I want to feel secure that when I buy something spontaneously, that doesn't alert everyone under the sun; or that when I walk by a grocery store or gas station, that doesn't send out word to the neighborhood, let's send him x, let's get him. This is for everyone, including my two colleagues. To Jay's question: what are we trying to solve? What do we want? Do we want to create a right to privacy where the Supreme Court says there isn't such a thing? We can legislatively say something like that. How far do we want to take it? Then, for the technologists: help us put that into play, knowing that technology will evolve and change, and that things we thought were in place will be replaced. That is Ed Perlmutter thinking based on Jay Obernolte's line of questioning. Ms. Wright, as the director at the agency that thinks about this stuff: from a technology standpoint, we can do some things if you give us clear direction. I think Bill is trying to do that with some of his digital legislation, and Jay has some proposals too. Dr. Foster, I will turn it back to you, and you can do with my two minutes whatever you wish.

Rep. Foster: That is interesting. You know, I will ask a question. So much of this will have to do with our cell phones. Dr. Romine, is there good coordination and communication with the manufacturers of the cell phones? There is incredible AI built into the next generation of smartphones, but not all of it runs in the secure enclave, where you have some assurance of trusted computing. Are you having thoughtful interactions, or do you get the sense that they are just trying to set up a walled garden and keep everyone's privacy information under their control?

Dr. Romine: We work with a very large cross-section of technology companies, including cell phone manufacturers and providers. On further reflection on Ranking Member Obernolte's question about significant harms: one of the significant harms I can imagine involves cell phone tracking or face recognition, cameras, street cameras and so on. Someone trying to access safe and reliable medical services, whether psychiatric or something else, and suddenly that becomes, in effect, public record; someone has now been outed by biometric information because it tracked them trying to obtain those services. This is another very serious potential issue. But yes, we are in discussion with cell phone manufacturers and other advanced technology firms all the time.

Rep. Foster: Thank you again. We could go all afternoon on this.
I suppose I have to close the hearing now, but before we bring it to a close, I want to thank our witnesses for testifying before the committee. It is really valuable for us in Congress, as we struggle with all of the policy issues around biometrics and privacy, to have access to real, quality experts, so that we can understand the technological reality and feasibility of things and don't generate legislation based on wishful thinking rather than technical reality. The record will remain open for two weeks for additional statements from members or additional questions the committee may ask of the witnesses. The witnesses are now excused, and the hearing is now adjourned.